JP4515615B2 - Image display device - Google Patents

Image display device

Info

Publication number
JP4515615B2
Authority
JP
Japan
Prior art keywords
image
straight line
cross
cut
curve
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
JP2000280058A
Other languages
Japanese (ja)
Other versions
JP2002092590A (en)
JP2002092590A5 (en)
Inventor
拡樹 谷口
良洋 後藤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Healthcare Manufacturing Ltd
Original Assignee
Hitachi Medical Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Medical Corp filed Critical Hitachi Medical Corp
Priority to JP2000280058A priority Critical patent/JP4515615B2/en
Publication of JP2002092590A publication Critical patent/JP2002092590A/en
Publication of JP2002092590A5 publication Critical patent/JP2002092590A5/ja
Application granted granted Critical
Publication of JP4515615B2 publication Critical patent/JP4515615B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Links

Images

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Processing Or Creating Images (AREA)

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to an image display device, and more particularly to an image display device that displays a cross-sectional image of a tubular observation target such as a blood vessel, an intestine, or a trachea.
[0002]
[Prior art]
Conventionally, an image display device has been proposed in which an observation target such as a blood vessel or an intestine is cut lengthwise along a curved surface (cut curved surface) that follows the observation target, so that the inside of the observation target can be observed (Japanese Patent Application Laid-Open No. 11-318884).
[0003]
[Problems to be solved by the invention]
The image display apparatus described in Japanese Patent Application Laid-Open No. 11-318884 defines the cut curved surface along the observation target as a surface passing through a plurality of viewpoints set inside the observation target along it.
However, when a blood vessel branches into several vessels, the cut curved surface that cuts each vessel lengthwise becomes a three-dimensional curved surface, and Japanese Patent Application Laid-Open No. 11-318884 discloses no specific means for constructing a cross-sectional image cut by such a three-dimensional cut curved surface.
[0004]
The present invention has been made in view of these circumstances, and its object is to provide an image display device capable of displaying a cross-sectional image obtained by cutting a tubular observation target, such as a branched blood vessel or trachea or an intricately meandering intestine, lengthwise along a three-dimensional curved surface that follows the observation target.
[0005]
[Means for Solving the Problems]
In order to achieve the above object, the present invention comprises: means for obtaining a path passing through the approximate center of a tubular observation target contained in a three-dimensional original image; means for setting a plurality of cut planes that cut the observation target at predetermined intervals; means for obtaining the intersections of the path with the cut planes; means for obtaining, for each cut plane, a straight line or a curve passing through the intersections on that cut plane; means for stacking the point sequences on those straight lines or curves along the path to construct a three-dimensional cut curved surface; means for creating a cross-sectional image of the observation target cut by the cut curved surface; and means for displaying the cross-sectional image.
[0006]
That is, the path passing through the approximate center of the observation target contained in the three-dimensional original image can be obtained from the viewpoint path used in the central projection method, in which the viewpoint can be set inside the observation target (Japanese Patent Laid-Open Nos. 7-210704 and 8-16813), or from a thin line obtained by extracting the observation target with a region growing method or the like, binarizing it, and thinning the result. The intersections of this path with the cut planes that cut the observation target at predetermined intervals are then obtained. Each cut plane is a plane with an arbitrary inclination that crosses the path.
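As a concrete illustration of the second way of obtaining such a path, the sketch below derives centerline voxels from a CT volume by thresholding (standing in for the region growing step) followed by thinning. The threshold value and the use of scikit-image's skeletonize, which accepts 3-D arrays in recent releases, are assumptions for illustration rather than the patent's own implementation.

```python
import numpy as np
from skimage.morphology import skeletonize

def centerline_voxels(volume: np.ndarray, threshold: float) -> np.ndarray:
    """Return (z, y, x) voxel coordinates of a thinned centerline."""
    binary = volume > threshold        # crude stand-in for region growing
    skeleton = skeletonize(binary)     # thinning of the binarized target
    return np.argwhere(skeleton)       # points lying near the tube center
```

The resulting voxel set still has to be ordered along the tube to serve as the path; alternatively, the viewpoint sequence of the central projection method can be used directly, as noted above.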
[0007]
A straight line or a curve passing through the intersections obtained in this way is then determined on each cut plane. When one or two intersections exist on a cut plane, the line passing through them is a straight line; when three or more intersections exist, the line passing through them is a curve obtained by interpolation and extrapolation with an approximating polynomial. When there is only one intersection, the direction of the straight line is determined using the information of the straight lines or curves on the preceding and following cut planes, for example so that it runs in approximately the same direction as those lines. The straight lines or curves obtained for the individual cut planes are then stacked to construct a three-dimensional cut curved surface, and a cross-sectional image of the original image, containing the observation target cut by this cut curved surface, is displayed.
[0008]
DETAILED DESCRIPTION OF THE INVENTION
Hereinafter, preferred embodiments of an image display device according to the present invention will be described in detail with reference to the accompanying drawings.
[0009]
FIG. 1 shows a display example produced by the image display apparatus according to the present invention. In the figure, 1 denotes a CRT monitor, 2 a mouse, 3 a pseudo three-dimensional image (endoscopic image) of the inside of a blood vessel constructed by the central projection method, 4 a longitudinal cross-sectional image of a tubular observation target such as a blood vessel, intestine, or trachea, and 6 a transverse image of the observation target 12.
[0010]
A method of constructing the cross-sectional image 4 will now be described.
FIG. 2 shows an embodiment of the method of constructing the cross-sectional image. In FIG. 2(A), 10 denotes a three-dimensional original image in which a plurality of CT images (CT1, CT2, CT3, ...) obtained by an X-ray CT apparatus or the like are stacked, and 12 denotes an observation target contained in the original image 10. Reference numerals 5a to 5g denote viewpoints sequentially set inside the observation target 12 by the central projection method.
[0011]
The viewpoints 5a to 5g are set sequentially, for example by operating a cursor (not shown) on the screen with the mouse while viewing the endoscopic image 3 of FIG. 1, or by automatically and sequentially updating the viewing direction so that it points toward the position farthest from the current viewpoint. Details of the central projection method and the viewpoint update method are described in Japanese Patent Laid-Open Nos. 7-210704 and 8-16813.
[0012]
The curved surface (cut surface) containing the viewpoints 5a to 5g set in this way contains a plurality of straight lines 5aL to 5gL passing through the viewpoints 5a to 5g. In the embodiment shown in FIG. 2, the straight lines 5aL to 5gL are parallel to the coordinate axis y, that is, to the stacking direction of the CT images (CT1, CT2, CT3, ...).
[0013]
To construct the longitudinal cross-sectional image 4 (FIG. 1) of the observation target 12, the following procedure is used. The straight line 5aL passing through the viewpoint 5a and parallel to the y-axis is obtained, and the CT values at the points on the straight line 5aL are obtained from the CT images (CT1, CT2, CT3, ...); CT values between adjacent CT images are obtained by interpolation. The CT values on the straight line 5aL obtained in this way are stored in the memory 14 (FIG. 2(B)). The CT values on the straight lines 5bL to 5gL passing through the other viewpoints 5b to 5g are stored in the memory 14 in the same way. When the spacing between the straight lines (viewpoints) is large, additional straight lines are obtained by interpolation at a spacing that yields the desired image quality, for example between the straight lines 5cL and 5dL, and the CT values on these interpolated lines are also stored in the memory 14.
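A minimal sketch of this line sampling is given below: CT values are read along a straight line through the stacked volume with linear interpolation between pixels and between slices. The (z, y, x) volume layout and the use of SciPy's map_coordinates are assumptions for illustration.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def sample_line(volume: np.ndarray, p0, p1, n_samples: int) -> np.ndarray:
    """Interpolated CT values at n_samples points between p0 and p1 (z, y, x)."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    t = np.linspace(0.0, 1.0, n_samples)
    pts = p0[None, :] + t[:, None] * (p1 - p0)[None, :]  # points on the line
    # order=1 gives trilinear interpolation across the stacked CT images
    return map_coordinates(volume, pts.T, order=1)
```

Each column of the memory 14 in FIG. 2(B) would then correspond to one such sampled line.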
[0014]
Further, as shown in FIG. 2(A), a plurality of straight lines 5aL1, 5aL2, 5aL3, ... passing through the viewpoint 5a are set, rotated from the y-axis by predetermined angles, and a plurality of straight lines are set in the same way for each of the other viewpoints 5b to 5g. Storing in the memory the CT values on all lines having the same angle as the straight line 5aL1 constructs the image data of the first frame; doing the same for the lines having the angles of 5aL2, 5aL3, ... constructs image data cut at any angle about the path connecting the viewpoints as an axis.
[0015]
FIG. 3 shows a processing flow of the developed image construction method, which will be described in order.
[0016]
FIG. 4 shows a method of moving the viewpoint to the center position of the observation region (corresponding to step S11 in FIG. 3). First, based on the plurality of viewpoints obtained by the central projection method, a cross section 7 orthogonal to the direction vector of the viewpoint is set (FIG. 4(a)), and a density profile of the cross section 7 is created (FIG. 4(b)). Next, a second-difference method is used to separate the density values inside and outside the observation target. The second-difference operator is used to extract the region boundary, which is taken to be the zero-cross point at which the polarity of the second-difference value changes. A Laplacian is applied as the operator; since the Laplacian is sensitive to random noise, the smoothing Laplacian operator of FIG. 4(a) is applied. The width of the operator is not limited to the illustrated value; it can be changed, and an empirically chosen value may be used depending on the target region. This processing yields the region boundaries, and the midpoint between the two region boundaries becomes the center position on that cross section. By obtaining the center position on the cross section in the same way while the cutting angle is varied, the viewpoint can be moved three-dimensionally to the center position of the observation target.
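The sketch below illustrates this boundary search on a one-dimensional density profile: smooth the profile, take the discrete second difference, and report the positions where its sign flips. Gaussian smoothing with an arbitrary width stands in for the smoothing Laplacian operator of FIG. 4(a); both the smoothing choice and the width are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def zero_cross_boundaries(profile: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Indices where the smoothed second difference changes sign."""
    smoothed = gaussian_filter1d(profile.astype(float), sigma)
    second = np.diff(smoothed, n=2)                    # discrete second difference
    flips = np.signbit(second[:-1]) != np.signbit(second[1:])
    return np.where(flips)[0] + 1                      # approximate profile positions

# The corrected viewpoint on this profile is the midpoint of the two
# boundaries that bracket the current viewpoint position.
```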
[0017]
FIG. 4(c) shows a second method of moving the viewpoint to the center position of the observation region. In this method, when an endoscopic image is created by the central projection method, rays are scanned radially from the initially obtained viewpoint, and the positions at which the rays leave the observation target region (the positions of the black dots) are used to estimate the center of the observation target by circle approximation or ellipse approximation; this estimated center becomes the new viewpoint. The circle or ellipse approximation is estimated by least-squares fitting. For example, let the sample point data be
{(X1, Y1), (X2, Y2), ..., (Xn, Yn)}.
Then the objective evaluation functions for fitting a circle or an ellipse to these points are, respectively, given by
[0018]
[Expression 1]
Figure 0004515615
The circle or ellipse function shown below is determined from the parameters obtained by this fit, and the center coordinates can then be estimated.
[0019]
[Expression 2]
Figure 0004515615
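As a concrete illustration of the circle case, the sketch below performs the least-squares fit in the algebraic form that minimizes (x^2 + y^2 + a*x + b*y + c)^2; the patent only states that a least-squares approximation is used, so this particular objective is an assumption and need not match Expressions 1 and 2 exactly.

```python
import numpy as np

def fit_circle_center(x: np.ndarray, y: np.ndarray):
    """Estimate the circle center from boundary sample points (least squares)."""
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, _), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return -a / 2.0, -b / 2.0    # center of x^2 + y^2 + a*x + b*y + c = 0
```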
FIG. 5 shows an embodiment of a developed image construction method in which rays are scanned from the center of the observation target 12 toward its outside and a shaded cylindrical projection surface is unrolled to display a pseudo three-dimensional image.
[0020]
As shown in FIG. 5(a), the inner surface of a cylinder orthogonal to each of the vectors ni (the vector n1 connecting the viewpoints 5a and 5b, the vector n2 connecting the viewpoints 5b and 5c, and so on) is used as the projection surface, and the three-dimensional original image 10 is shaded and projected from each viewpoint onto this projection surface. For example, the projection lines from the viewpoint 5a are orthogonal to the vector n1 and are projected radially at constant angular steps onto the projection positions a to z. Any shading method, such as the surface method, volume rendering, or the depth method, may be used.
That is, a curve passing through the viewpoints 5a to 5g is treated as a virtual line light source, and the image is shaded and projected onto a cylindrical projection surface centered on this line light source.
[0021]
The image information projected onto the cylindrical projection surface in this way is unrolled into straight lines and stored in the memory 14 as shown in FIG. 5(b). The development start position (angle) a is the development position (cut line) designated with the mouse from outside the observation target, as shown in FIG. 6(a).
The image information stored in the memory 14 thus allows a pseudo three-dimensional image of the cut-open observation target 12 to be displayed. Note that this method is effective only when the viewpoint set by the observation target center position setting means lies inside the observation target.
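A minimal sketch of this unrolling is given below: for each viewpoint, rays are cast radially in the plane orthogonal to the local path vector ni, and the sampled values fill one row of the developed image in memory. The fixed ray radius, the number of angles, and the nearest-neighbour sampling of raw values (instead of surface, volume-rendering, or depth shading) are simplifying assumptions.

```python
import numpy as np

def perpendicular_basis(n: np.ndarray):
    """Two unit vectors spanning the plane orthogonal to n."""
    n = n / np.linalg.norm(n)
    helper = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(n, helper)
    u /= np.linalg.norm(u)
    return u, np.cross(n, u)

def unroll(volume: np.ndarray, viewpoints, radius: float, n_angles: int = 360,
           start_angle: float = 0.0) -> np.ndarray:
    """Developed image: one row per viewpoint, one column per scan angle a..z."""
    rows = []
    for i in range(len(viewpoints) - 1):
        p = np.asarray(viewpoints[i], float)
        u, v = perpendicular_basis(np.asarray(viewpoints[i + 1], float) - p)
        row = []
        for k in range(n_angles):
            theta = start_angle + 2.0 * np.pi * k / n_angles
            q = p + radius * (np.cos(theta) * u + np.sin(theta) * v)
            z, y, x = np.clip(np.round(q).astype(int), 0, np.array(volume.shape) - 1)
            row.append(volume[z, y, x])        # nearest-neighbour sample
        rows.append(row)
    return np.asarray(rows)                    # contents of the memory 14
```

The start_angle parameter plays the role of the development start position (cut line) a.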
[0022]
On the other hand, when the viewpoint is set outside the observation target, the viewpoint is reset, as shown in FIG. 6(b), to a point that lies on the line connecting the position set by the development position setting means with the center position of the observation target and that lies inside the observation target. That is, with the viewpoint set by the observation target center position setting means as the center, the viewpoint initially obtained by the central projection method or the like is rotated to the position set by the development position setting means. This processing is performed for all viewpoints, and the viewpoint group is reset (step S13 in FIG. 3).
[0023]
When the viewpoint set by the observation target center position setting means or the development position setting means is taken as the development center point and projection is performed radially toward the outside of the observation target at the scanning angles a to z as in FIG. 5(a), the resulting image differs depending on whether or not the scanning angle step is kept constant (step S14 in FIG. 4). When scanning and projecting at equal angular steps (step S15), the spatial resolution of the resulting developed image varies with the distance from the development center point. That is, the information represented by each pixel of the constructed developed image corresponds to position information whose inter-pixel distance differs from place to place. Therefore, an isotropic scale is displayed as a mesh, as in FIG. 7, to supplement the developed image having anisotropic spatial resolution (step S16). Since the developed image is an image unrolled and stretched onto a plane, it satisfies the demand for intuitive diagnosis of the form.
[0024]
FIG. 8 shows a scanning method for reconstructing a developed image with an isotropic scale. First, the radius is obtained by circle or ellipse approximation. Taking a one-degree step relative to the length of the approximated radius, the next scanning angle is set from the average distance obtained when scanning over a certain angular range at equal steps. After moving by that angle and scanning, the process of scanning again at equal steps from that position and setting the next angle is repeated until the entire circumference of the observation target has been scanned (step S17). Because this developed image has an isotropic scale, it is an effective display for measurement. It is therefore desirable to provide diagnostic images in a form that displays both, so that the advantages of the anisotropic and isotropic developed images can be exploited.
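The idea can be reduced to keeping the arc length between successive wall samples roughly constant: for a locally estimated radius r and a target wall spacing s, the angular step is about s / r. The sketch below uses this simplified relation; the step size and the radius estimate are assumptions, and the patent's averaging over a scanned angular range is omitted.

```python
import math

def isotropic_angles(local_radius: float, arc_step: float = 1.0):
    """Scan angles spaced by a roughly constant arc length on the wall."""
    angles, theta = [], 0.0
    while theta < 2.0 * math.pi:
        angles.append(theta)
        theta += arc_step / max(local_radius, 1e-6)   # delta_theta ~ s / r
    return angles
```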
[0025]
FIG. 9 shows the processing flow of an embodiment of an image construction method that cuts the original image with a curved surface passing simultaneously through a plurality of viewpoint paths and constructs a cross-sectional image of the original image containing the cross-sectional image of the observation target; FIG. 10 shows cross-sectional images and related images constructed by this processing.
First, as shown in FIG. 9, viewpoint paths (routes) are obtained by the central projection method (step S20), and the viewpoints are set at the center of the observation target (step S21). Next, a reference main route is set from among these viewpoint routes (step S22). As in FIG. 10, the routes may also be set on an image that has been shaded by the surface method, volume rendering, or the depth method after the region of the observation target has been extracted by a region growing method or the like and its core line information obtained by thinning processing or the like.
[0026]
Thereafter, a curved surface passing through the plurality of routes is reconstructed. FIGS. 11(a) and 11(b) show the image construction method. First, planes that cut the observation target with an arbitrary inclination (transverse images 1, 2, and 3) are set in advance, and the route position information on each of the transverse images 1, 2, and 3 (that is, the intersections of the cut planes with the routes) is obtained (step S23).
[0027]
As shown in the figure, when a route is added, the number of intersections with the routes differs depending on the position of the transverse image, and a cutting line or cutting curve passing through these intersections must be obtained. The point sequence that cuts the transverse image is calculated as a straight line when there are one or two intersections, and as a curve obtained by interpolation and extrapolation with an approximating polynomial when there are three or more intersections (step S24). When there is only one intersection, the direction of the straight line is not determined by that point alone; in this case, the direction is chosen to be approximately the same as that of the cutting lines or cutting curves on the preceding and following transverse images.
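A minimal sketch of step S24 for one transverse image is given below: one or two intersections yield a straight cut line, and three or more yield a polynomial cutting curve. The polynomial degree, the sampling density, and the restriction of the sampled range to the span of the intersections (the patent also extrapolates) are illustrative assumptions, and the single-intersection case, which needs the neighbouring slices, is left out.

```python
import numpy as np

def cutting_points(intersections, n_points: int = 256) -> np.ndarray:
    """Point sequence of the cut on one transverse image; input is (k, 2) (x, y)."""
    pts = np.asarray(intersections, float)
    if len(pts) >= 3:
        order = np.argsort(pts[:, 0])
        deg = min(len(pts) - 1, 3)
        coeffs = np.polyfit(pts[order, 0], pts[order, 1], deg)
        x = np.linspace(pts[:, 0].min(), pts[:, 0].max(), n_points)
        return np.column_stack([x, np.polyval(coeffs, x)])   # cutting curve
    if len(pts) == 2:
        t = np.linspace(0.0, 1.0, n_points)[:, None]
        return pts[0] + t * (pts[1] - pts[0])                # straight cut line
    raise ValueError("a single intersection needs the neighbouring slices")
```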
[0028]
A cross-sectional image is constructed by stacking the point sequences on the cutting lines or cutting curves obtained for the individual transverse images as described above (step S25).
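Continuing the sketch above, step S25 can be illustrated by sampling each transverse image along its cutting point sequence and stacking the resulting rows. Bilinear sampling with SciPy, a (slice, y, x) volume layout, and the simplification that the transverse images coincide with the stored slices are assumptions.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def stack_cut_lines(volume: np.ndarray, cut_points_per_slice) -> np.ndarray:
    """cut_points_per_slice[i]: (n, 2) array of (x, y) points on slice i."""
    rows = []
    for i, pts in enumerate(cut_points_per_slice):
        coords = np.vstack([pts[:, 1], pts[:, 0]])     # (row, col) order
        rows.append(map_coordinates(volume[i], coords, order=1))
    return np.vstack(rows)                             # the cross-sectional image
```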
[0029]
FIG. 11(c) illustrates how the direction vector for stacking the point sequences and the center line of the cut image are obtained. Let O (= αn) be the point of the initial center line on the transverse image βn, let A and B be points on the next transverse image βn+1, and let C (= αn+1) be the point of its center line; then the vector OC is the direction vector for stacking the point sequences. Since the vector OC corresponds to the position at which the perpendicular dropped from the point O meets the curve, geometrically the relation
[0030]
[Equation 3]
Figure 0004515615
is obtained. This processing is performed for all points, and the direction vector of the center line and the center line point giving the shortest distance are obtained.
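A minimal sketch of this shortest-distance construction is given below: the perpendicular foot of O on the next slice's center curve is found by checking every segment of the curve, represented here as a polyline, which is an assumption about its representation.

```python
import numpy as np

def foot_on_polyline(o: np.ndarray, curve: np.ndarray) -> np.ndarray:
    """Closest point to o on the polyline given by curve, an (m, dim) array."""
    if len(curve) == 1:
        return curve[0]
    best, best_d2 = curve[0], np.inf
    for a, b in zip(curve[:-1], curve[1:]):
        ab = b - a
        t = np.clip(np.dot(o - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        c = a + t * ab                         # perpendicular foot on the segment
        d2 = np.dot(o - c, o - c)
        if d2 < best_d2:
            best, best_d2 = c, d2
    return best

# direction = foot_on_polyline(O, next_centerline) - O   # the stacking vector OC
```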
[0031]
FIG. 12 shows an embodiment of an image construction method that constructs a curved surface cutting an observation target that meanders in a complicated manner. As shown in the figure, for an observation target 12 that meanders in a complicated manner, the same transverse image 6 may cross a single viewpoint path 5 more than once. In this case as well, as in FIG. 11, the point sequence that cuts the arbitrarily set transverse image 6 is calculated as a cutting line or a cutting curve according to the number of intersections on the transverse image 6. When there is only one intersection, the point sequence to be cut is calculated using the information of the cutting lines or cutting curves formed by the viewpoint groups on the preceding and following transverse images, and the cut-surface image is constructed by stacking these point sequences.
[0032]
FIG. 13 is a block diagram showing a hardware configuration example of the image display apparatus according to the present invention.
As shown in the figure, this image display apparatus mainly comprises a magnetic disk 50, a main memory 52, a central processing unit (CPU) 54, a display memory 56, the CRT monitor 1, a keyboard 58, the mouse 2, and a mouse controller 60 for inputting various operation commands, position commands, and menu selection commands, and a common bus 62 connecting these components.
[0033]
The magnetic disk 50 stores the three-dimensional original image 10 in which a plurality of CT images (CT1, CT2, CT3, ...) are stacked, the image construction program, and the like; the main memory 52 stores the control program of the apparatus and provides an area for arithmetic processing.
[0034]
The CPU 54 reads the three-dimensional original image 10 and the various programs, constructs the cross-sectional images, pseudo three-dimensional images, and the like according to the present invention using the main memory 52, sends image data representing the constructed images to the display memory 56, and displays them on the CRT monitor 1.
[0035]
[Effects of the Invention]
As described above, according to the image display apparatus of the present invention, an observation target with branches, such as a blood vessel or trachea, or an observation target that meanders in a complicated manner, such as the intestine, can be displayed as a cross-sectional image cut lengthwise along a three-dimensional curved surface that follows the observation target, providing more effective diagnostic information at the preoperative stage leading up to surgical planning.
[Brief description of the drawings]
FIG. 1 is a diagram showing a display example, including a cross-sectional image, displayed by the image display apparatus according to the present invention.
FIG. 2 is a diagram showing an embodiment of a cross-sectional image construction method.
FIG. 3 is a processing flow of a method for constructing a developed image of the observation target.
FIG. 4 is a diagram showing a method for calculating an observation target center.
FIG. 5 is a diagram showing an embodiment of a developed image construction method.
FIG. 6 is a diagram showing a cut line and development center point setting method.
FIG. 7 is a diagram showing a developed image having an anisotropic scale.
FIG. 8 is a diagram illustrating a method for constructing a developed image having an isotropic scale.
FIG. 9 is a processing flow of a method for reconstructing a curved surface that simultaneously passes through a plurality of viewpoint paths.
FIG. 10 is a diagram used for explaining a method of reconstructing a cross-sectional image by the processing flow.
FIG. 11 is a diagram showing a method for reconstructing an image by curved surfaces passing through a plurality of viewpoint paths.
FIG. 12 is a diagram showing an image construction method for constructing a curved surface for cutting an observation target that meanders in a complicated manner.
FIG. 13 is a block diagram showing a hardware configuration example of an image display apparatus according to the present invention.
[Explanation of symbols]
1 ... CRT monitor, 2 ... mouse, 3 ... endoscopic image, 4 ... cross-sectional image, 5aL1 to 5aL3 ... planes constituting the cross-sectional image, 6 ... transverse image, 7 ... cross section, 10 ... three-dimensional original image, 12 ... observation target, 14 ... memory, 16 ... cut line, 18 ... development center coordinates, 50 ... magnetic disk, 52 ... main memory, 54 ... central processing unit (CPU), 56 ... display memory

Claims (3)

1. An image display device comprising: means for obtaining a path passing through the approximate center of a tubular observation target contained in a three-dimensional original image; means for setting a plurality of cut planes that cut the observation target at predetermined intervals; means for obtaining the intersections of the path with the cut planes; means for obtaining, for each cut plane, a straight line or a curve passing through the intersections on that cut plane; means for stacking the point sequences on the straight lines or curves along the path to construct a three-dimensional cut curved surface; means for creating a cross-sectional image of the observation target cut by the cut curved surface; and means for displaying the cross-sectional image.
2. The image display device according to claim 1, wherein the means for obtaining the straight line or the curve obtains a straight line when there are one or two intersections on the cut plane, and obtains a curve when there are three or more intersections.
3. The image display device according to claim 1 or 2, wherein the means for obtaining the straight line or the curve, when there is only one intersection on the cut plane, obtains a straight line using information on the straight lines or curves on the preceding and following cross sections.
JP2000280058A 2000-09-14 2000-09-14 Image display device Expired - Lifetime JP4515615B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2000280058A JP4515615B2 (en) 2000-09-14 2000-09-14 Image display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2000280058A JP4515615B2 (en) 2000-09-14 2000-09-14 Image display device

Publications (3)

Publication Number Publication Date
JP2002092590A JP2002092590A (en) 2002-03-29
JP2002092590A5 JP2002092590A5 (en) 2007-10-11
JP4515615B2 true JP4515615B2 (en) 2010-08-04

Family

ID=18764962

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2000280058A Expired - Lifetime JP4515615B2 (en) 2000-09-14 2000-09-14 Image display device

Country Status (1)

Country Link
JP (1) JP4515615B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101390762B (en) * 2007-09-21 2013-05-01 株式会社东芝 Device for getting ultrasonic image

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4564233B2 (en) * 2003-01-31 2010-10-20 株式会社東芝 Image processing apparatus, image data processing method, and program
EP1687778A1 (en) * 2003-11-14 2006-08-09 Philips Intellectual Property & Standards GmbH Method and apparatus for visualisation of a tubular structure
JP4526114B2 (en) * 2004-05-21 2010-08-18 株式会社日立メディコ Luminal organ resection simulation method
JP2006081640A (en) * 2004-09-15 2006-03-30 Ge Medical Systems Global Technology Co Llc Ultrasonic imaging device, image processor and program
WO2006118100A1 (en) * 2005-04-28 2006-11-09 Hitachi Medical Corporation Image display device and program
JP2009544394A (en) * 2006-07-25 2009-12-17 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Curved multi-slice display method and apparatus
JP4671204B2 (en) * 2008-07-09 2011-04-13 ザイオソフト株式会社 Medical image display control apparatus and medical image display control program
JP4541434B2 (en) 2008-07-14 2010-09-08 ザイオソフト株式会社 Image processing apparatus and image processing program
JP2015114290A (en) * 2013-12-13 2015-06-22 オムロン株式会社 Time correction device, measuring apparatus, and time correction method
JP6671482B2 (en) * 2016-08-31 2020-03-25 富士フイルム株式会社 CPR image generation apparatus, method and program
US11176666B2 (en) * 2018-11-09 2021-11-16 Vida Diagnostics, Inc. Cut-surface display of tubular structures

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0981770A (en) * 1995-09-08 1997-03-28 Hitachi Medical Corp Projection image constituting method, artificial three-dimensional image constituting method, and projection image display device
JPH09237352A (en) * 1996-03-01 1997-09-09 Hitachi Medical Corp Three-dimensional picture constituting method and device therefor
JPH11318884A (en) * 1998-03-09 1999-11-24 Hitachi Medical Corp Image display device
JP2000242766A (en) * 1999-02-18 2000-09-08 Hitachi Medical Corp Image display device

Also Published As

Publication number Publication date
JP2002092590A (en) 2002-03-29

Similar Documents

Publication Publication Date Title
JP5384473B2 (en) Image display device and image display method
JP4200546B2 (en) Image display device
EP2420188B1 (en) Diagnosis support apparatus, diagnosis support method, and storage medium storing diagnosis support program
JP4891541B2 (en) Vascular stenosis rate analysis system
JP5224451B2 (en) Projection image creation apparatus, method and program
JP5191989B2 (en) Medical image display apparatus and medical image display method
JP4515615B2 (en) Image display device
JP2007509649A (en) Local path automatic planning method and apparatus for virtual colonoscopy
JP2007509649A6 (en) Local path automatic planning method and apparatus for virtual colonoscopy
JP6353827B2 (en) Image processing device
JP2012024517A (en) Diagnosis assisting apparatus, diagnosis assisting program, and diagnosis assisting method
US9530238B2 (en) Image processing apparatus, method and program utilizing an opacity curve for endoscopic images
JP4909792B2 (en) Image interpretation support apparatus, method, and program
JP5624336B2 (en) Medical image processing apparatus and medical image processing program
JP3632862B2 (en) Three-dimensional image display method and apparatus
JPH1176228A (en) Three-dimensional image construction apparatus
JP2009165718A (en) Medical image display
JP4738236B2 (en) Image display device
JP5631584B2 (en) Medical image processing apparatus, medical image processing program, and medical image diagnostic apparatus
JP2008067915A (en) Medical picture display
JP2006068350A (en) Medical image display method, medical image displaying device and program for medical image displaying
JP5683831B2 (en) Medical image processing apparatus and medical image processing program
JP4609967B2 (en) Image display device
JP3749322B2 (en) Distance measuring device that measures distance on pseudo 3D image
JPH11219448A (en) Method for displaying image

Legal Events

Date Code Title Description
A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20070823

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20070823

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100208

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100401

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20100506

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20100513

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

Ref document number: 4515615

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130521

Year of fee payment: 3

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313111

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

EXPY Cancellation because of completion of term