JP5822756B2 - Image quality evaluation method and apparatus for ultrasonic imaging apparatus - Google Patents
- Publication number
- JP5822756B2 (application JP2012040887A)
- Authority
- JP
- Japan
- Prior art keywords
- index
- sensory
- quantitative
- unit
- conversion rule
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Ultra Sonic Diagnosis Equipment (AREA)
Description
The present invention relates to an image quality evaluation apparatus and method for an ultrasonic imaging apparatus.
Compared with other medical imaging modalities such as MRI and CT, ultrasound imaging is said to be a modality whose human factor cannot be reduced to zero: the quality of the acquired images depends on the operator's technique, and lesions are easily overlooked by anyone not accustomed to reading the images. Even when a manufacturer of ultrasonic imaging apparatus, in the final stage of productization, adjusts imaging parameters such as the transmission frequency, filter coefficients, and gamma curve so that good image quality is obtained stably, the image quality must be evaluated with sensory indices such as "glare" and "smoothness of connection." Although there is precedent for machine processing of quantitative indices such as "luminance distribution" (Patent Document 1), there is no report of automatically computing sensory indices. A method for quantifying image features has also been reported (Non-Patent Document 1), but no report applies it to ultrasound images or extends it in the time direction so that the image quality of moving images can be evaluated.
However, such image quality could be evaluated only by people skilled in image quality evaluation. The speed of evaluation was therefore rate-limited by the number of skilled evaluators, and evaluating every combination of the vast number of imaging parameters to find the truly optimal settings was impossible in a realistic amount of time.
The apparatus comprises a quantitative index-sensory index conversion rule creation unit, a quantitative index-sensory index conversion rule application unit, and a setting parameter value optimization unit. The conversion rule creation unit comprises a teacher data (image, setting parameter values, sensory indices) input unit, a rule-creation quantitative index calculation unit, a quantitative index-sensory index conversion rule learning unit, and a conversion rule output unit. The conversion rule application unit comprises a conversion rule holding unit, an evaluated data (image, setting parameter values) input unit, a rule-application quantitative index calculation unit, and a conversion rule execution unit. The setting parameter value optimization unit comprises a target sensory index value setting unit, a setting parameter optimal value calculation unit, and an optimization result display unit. After the conversion rule creation unit has been given, at least once, teacher data consisting of images, setting parameter values, and sensory indices and has created the quantitative index-sensory index conversion rule, sensory indices can be computed for evaluated data consisting of images and setting parameter values, and the optimal setting parameter values are calculated.
Because the image quality evaluation knowledge of skilled evaluators is made explicit, the image quality of an ultrasound image can be evaluated without involving a skilled evaluator.
Furthermore, all combinations of the vast number of imaging parameters can be evaluated, and the truly optimal imaging parameter settings can be found in a realistic amount of time.
A first embodiment is described with reference to FIGS. 1 to 7. FIG. 1 is a conceptual diagram of the ultrasonic image quality evaluation apparatus of the present invention. Reference numeral 1 denotes input means such as a keyboard or touch panel; 2 denotes computation means such as the CPU of a personal computer or of the ultrasonic imaging apparatus; 3 denotes storage means such as the RAM or hard disk of a personal computer or of the ultrasonic imaging apparatus; and 4 denotes display means such as the display of a personal computer or of the ultrasonic imaging apparatus.
Reference numeral 100 denotes the quantitative index-sensory index conversion rule creation unit of the ultrasonic image quality evaluation apparatus of the present invention, 200 the quantitative index-sensory index conversion rule application unit, and 300 the setting parameter value optimization unit. Within these, 101 is the teacher data input unit, 102 the rule-creation quantitative index calculation unit, 103 the quantitative index-sensory index conversion rule learning unit, and 104 the conversion rule output unit; 201 is the conversion rule holding unit, 202 the evaluated data (image, setting parameter values) input unit, 203 the rule-application quantitative index calculation unit, and 204 the conversion rule execution unit; 301 is the target sensory index value setting unit, 302 the setting parameter optimal value calculation unit, and 303 the optimization result display unit.
FIG. 2 is a flowchart of the ultrasonic image quality evaluation method of the present invention. When processing starts, the conversion rule creation unit 100 creates the quantitative index-sensory index conversion rule (S100); the conversion rule application unit 200 applies the rule to setting parameters set over a sufficient range at sufficiently dense intervals and computes sensory indices (S200); and the setting parameter value optimization unit 300 displays on the display means, as the optimal setting parameter values, those that yield the sensory index values closest to the target among the outputs of the application unit 200 (S300).
More specifically, first the teacher data input unit 101 inputs teacher data, an example of which is described later with reference to FIG. 3 (S101); the rule-creation quantitative index calculation unit 102 computes, for each data item, quantitative indices such as those exemplified later with reference to FIG. 4, by a procedure exemplified later with reference to FIG. 5 (S102); the conversion rule learning unit 103 learns a conversion rule of the form exemplified later with reference to FIG. 6, by machine learning, manual setting, or both (S103); and the conversion rule output unit 104 outputs the conversion rule to the conversion rule holding unit 201 (S104). The processing from S101 to S103 need only be executed at least once.
Subsequently, when it becomes necessary to optimize the setting parameters, the following processing is executed. The conversion rule holding unit 201 stores in the storage means 3 the conversion rule output by the conversion rule output unit 104 (S201); the evaluated data input unit 202 inputs evaluated data, an example of which is described later with reference to FIG. 3 (S202); the rule-application quantitative index calculation unit 203 computes for the evaluated data the same quantitative indices as the rule-creation quantitative index calculation unit 102 (S203); and the conversion rule execution unit 204 converts the quantitative indices computed in S203 into sensory indices using the conversion rule held in the holding unit 201 (S204), yielding the correspondence between setting parameter values and sensory indices for setting parameter values set over a sufficient range at sufficiently dense intervals. Finally, the target sensory index value setting unit 301 sets the target sensory index values, for example by a person operating the input means 1 or by ideal values held in the storage means 3 (S301); the setting parameter optimal value calculation unit 302 calculates, from that correspondence, the setting parameter values that yield the values closest to the target sensory index values (S302); and the optimization result display unit 303 displays the calculated setting parameter values on the display means 4, as exemplified later with reference to FIG. 7 (S303).
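The flow from S200 to S302 amounts to an exhaustive search: the learned conversion rule is evaluated over a dense grid of setting parameter values, and the setting whose predicted sensory indices lie closest to the target is reported. The following is a minimal sketch under stated assumptions; `predict_sensory`, the parameter names, and the squared-distance closeness measure are illustrative, not taken from the patent.

```python
import itertools

def optimize_settings(predict_sensory, param_grid, target):
    """Apply the learned quantitative-to-sensory conversion over a dense grid
    of setting parameter values (S200) and return the setting whose predicted
    sensory indices are closest to the target (S302).

    predict_sensory(params) -> dict of sensory index values (assumed interface).
    param_grid: dict mapping parameter name -> list of candidate values.
    target: dict mapping sensory index name -> desired value.
    """
    best, best_dist = None, float("inf")
    keys = sorted(param_grid)
    for values in itertools.product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        f = predict_sensory(params)
        # squared distance to the target sensory index values
        dist = sum((f[k] - target[k]) ** 2 for k in target)
        if dist < best_dist:
            best, best_dist = params, dist
    return best, best_dist
```

With a toy rule in which "glare" falls as the tap count grows, the grid point matching the target exactly is returned with distance zero.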
FIG. 3 shows an example of the teacher data input by the teacher data input unit 101 in the ultrasonic image quality evaluation apparatus of the present invention. The teacher data is a set of data items covering a sufficient range of the setting parameters, each item consisting of a still image or moving image captured by the ultrasonic imaging apparatus, the setting parameter values at the time of imaging, and sensory indices.
In this example, the setting parameters are the numbers of filter taps in the spatial and frame directions and the gamma curve used in video-signal filtering, but setting parameters of front-end processing, such as the phasing frequency and phasing filter, may also be included. Also in this example, the sensory indices are distance resolution, contrast resolution, glare, smoothness of connection, and temporal smoothness, but other indices may be included. The evaluated data input by the evaluated data input unit 202 has the same format as the corresponding part of the teacher data, consisting of a still image or moving image captured by the ultrasonic imaging apparatus and the setting parameter values at the time of imaging.
FIG. 4 shows an example of the quantitative indices computed by the rule-creation quantitative index calculation unit 102 and the rule-application quantitative index calculation unit 203. These quantitative indices consist of spatial and temporal indices and are computed for the still images or moving images exemplified in FIG. 3. FIG. 4(a) is an example in which a single-valued quantitative index is computed for one still image or one continuous moving image; FIG. 4(b) is an example in which one still image or one continuous moving image is divided spatially into multiple ROIs and a single-valued quantitative index is computed per ROI. Setting ROIs divided spatially, temporally, or both, as in FIG. 4(b), and computing a quantitative index per ROI is useful for image quality evaluation when the still image or moving image contains regions whose properties differ spatially or temporally.
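The per-ROI computation of FIG. 4(b) can be sketched as follows; the even grid split and the choice of per-ROI index function are illustrative assumptions, not specified by the patent.

```python
import numpy as np

def roi_indices(image, rows, cols, index_fn):
    """Divide an image into a rows x cols grid of ROIs (cf. FIG. 4(b)) and
    compute one quantitative index per ROI with index_fn (e.g. np.var)."""
    H, W = image.shape
    out = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            roi = image[r * H // rows:(r + 1) * H // rows,
                        c * W // cols:(c + 1) * W // cols]
            out[r, c] = index_fn(roi)
    return out
```

The same idea extends to the temporal direction by slicing a (frames, rows, cols) array along its first axis as well.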
FIG. 5 illustrates an example of the procedure by which the rule-creation quantitative index calculation unit 102 and the rule-application quantitative index calculation unit 203 compute the quantitative indices. FIG. 5(a) is a conceptual diagram of a moving image, representing a 4-pixel-by-4-pixel, 4-gray-level image over two frames.
The units 102 and 203 first compute, for example according to the rule expressed by (Equation 1), the co-occurrence frequency matrix P, as in (Equation 2), of the gray levels of pixels separated by distances dist_x, dist_y, dist_t along the two spatial directions and the one temporal direction, and then compute from P the statistics defined by (Equation 3) as the quantitative indices. The definitions of the statistics in (Equation 3) are not limited to those given in this example.
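Since (Equation 1) to (Equation 3) are not reproduced here, the following sketch assumes a Haralick-style gray-level co-occurrence matrix extended by one temporal offset, with energy, contrast, and homogeneity standing in for the statistics of (Equation 3); the exact statistics in the patent may differ.

```python
import numpy as np

def st_cooccurrence(video, levels, dt, dy, dx):
    """Co-occurrence frequency matrix P of the gray levels of pixel pairs
    separated by (dt, dy, dx) in a (frames, rows, cols) integer video
    (cf. Equation 2), plus stand-in statistics (cf. Equation 3)."""
    T, H, W = video.shape
    a = video[:T - dt, :H - dy, :W - dx]     # reference pixels
    b = video[dt:, dy:, dx:]                 # their offset partners
    P = np.zeros((levels, levels))
    np.add.at(P, (a.ravel(), b.ravel()), 1)  # count gray-level co-occurrences
    P /= P.sum()                             # normalize to frequencies
    i, j = np.indices(P.shape)
    stats = {
        "energy": float(np.sum(P ** 2)),
        "contrast": float(np.sum((i - j) ** 2 * P)),
        "homogeneity": float(np.sum(P / (1.0 + np.abs(i - j)))),
    }
    return P, stats
```

For the 4-by-4-pixel, 2-frame, 4-gray-level example of FIG. 5(a), `st_cooccurrence(video, levels=4, dt=1, dy=0, dx=0)` pairs each pixel with the same pixel one frame later, capturing temporal texture.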
FIG. 6 shows an example of the quantitative index-sensory index conversion rule that the conversion rule learning unit 103 learns by machine learning, manual setting, or both. This example is a neural network with one hidden layer of M units. The inputs are the quantitative indices Fi (i = 1, 2, ..., N) and the outputs are the sensory indices fj (j = 1, 2, ..., n). The conversion rule is expressed by the input-unit thresholds Θi (i = 1, 2, ..., N), the output-unit thresholds θj (j = 1, 2, ..., n), the coefficients Cim (i = 1, 2, ..., N; m = 1, 2, ..., M) connecting the input units to the hidden-layer units, and the coefficients cjm (j = 1, 2, ..., n; m = 1, 2, ..., M) connecting the hidden-layer units to the output units.
These values are determined by machine learning such as backpropagation. Although this example expresses the rule with a neural network and learns it by backpropagation, the rule representation and learning method are not limited to these.
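A minimal sketch of such a one-hidden-layer network trained by backpropagation on squared error follows. The network sizes, learning rate, and toy teacher mapping are illustrative assumptions, and for simplicity the thresholds here act on the hidden and output units (the patent's FIG. 6 attaches Θi to the input units).

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

N, M, n = 3, 8, 2                   # quantitative indices, hidden units, sensory indices
C = rng.normal(0.0, 0.5, (M, N))    # coefficients Cim: input -> hidden
Theta = np.zeros(M)                 # hidden-unit thresholds
c = rng.normal(0.0, 0.5, (n, M))    # coefficients cjm: hidden -> output
theta = np.zeros(n)                 # output-unit thresholds

def forward(F):
    h = sigmoid(C @ F - Theta)
    return sigmoid(c @ h - theta), h

# toy teacher data: sensory indices as a fixed function of quantitative indices
F_train = rng.random((100, N))
f_train = np.stack([F_train.mean(axis=1), F_train[:, 0] * F_train[:, 1]], axis=1)

def mse():
    return float(np.mean([(forward(F)[0] - t) ** 2
                          for F, t in zip(F_train, f_train)]))

before = mse()
lr = 0.5
for _ in range(300):                # per-sample backpropagation
    for F, t in zip(F_train, f_train):
        f, h = forward(F)
        d_out = (f - t) * f * (1 - f)        # output-layer error signal
        d_hid = (c.T @ d_out) * h * (1 - h)  # hidden-layer error signal
        c -= lr * np.outer(d_out, h); theta += lr * d_out
        C -= lr * np.outer(d_hid, F); Theta += lr * d_hid
after = mse()
```

After training, the learned weights and thresholds constitute the conversion rule that the holding unit 201 would store.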
FIG. 7 shows an example of the optimization result that the optimization result display unit 303 displays on the display means 4. In addition to the optimal setting parameter values, supplementary information such as the sensory index values set as the target and the optimized image may also be shown.
According to the above configuration and method, making the image quality evaluation knowledge of skilled evaluators explicit as the quantitative index-sensory index conversion rule removes the rate limit imposed by human factors: once the conversion rule creation unit has been run and the conversion rule created, image quality can be evaluated mechanically, anytime and anywhere, in a short time. Furthermore, all combinations of the vast number of imaging parameters can be evaluated, and the truly optimal imaging parameter settings can be found in a realistic amount of time.
In particular, when the images contained in the teacher data or the evaluated data have N gray levels, the rule-creation and rule-application quantitative index calculation units can reduce the gray levels of those images to a value of N or less before computing the quantitative indices. This allows the ROIs to be set smaller and the index computation to reflect finer structure.
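This gray-level reduction can be sketched as uniform requantization; the patent does not specify how levels are merged, so the proportional scaling rule below is an assumption. A smaller co-occurrence matrix stays well populated even when the ROI contains few pixels.

```python
import numpy as np

def requantize(image, n_in, n_out):
    """Map an integer image with n_in gray levels onto n_out <= n_in levels
    by uniform binning, prior to computing co-occurrence-based indices."""
    return (image.astype(np.int64) * n_out // n_in).clip(0, n_out - 1)
```

For example, reducing 256 levels to 4 maps levels 0-63 to 0, 64-127 to 1, 128-191 to 2, and 192-255 to 3.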
Further, according to the above configuration and method, differences in image quality evaluation knowledge among users such as physicians and technologists can be made explicit and compared as conversion rules, making it possible to customize the setting parameters for each user.
Further, by setting in the target sensory index setting unit the optimal image quality, which differs by imaged region and clinical purpose, the optimal setting parameters can be determined for each imaged region and clinical purpose.
1: input means; 2: computation means; 3: storage means; 4: display means; 100: quantitative index-sensory index conversion rule creation unit; 200: quantitative index-sensory index conversion rule application unit; 300: setting parameter value optimization unit; 101: teacher data input unit; 102: rule-creation quantitative index calculation unit; 103: quantitative index-sensory index conversion rule learning unit; 104: quantitative index-sensory index conversion rule output unit; 201: quantitative index-sensory index conversion rule holding unit; 202: evaluated data input unit; 203: rule-application quantitative index calculation unit; 204: quantitative index-sensory index conversion rule execution unit; 301: target sensory index value setting unit; 302: setting parameter optimal value calculation unit; 303: optimization result display unit
Claims (7)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012040887A JP5822756B2 (en) | 2012-02-28 | 2012-02-28 | Image quality evaluation method and apparatus for ultrasonic imaging apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012040887A JP5822756B2 (en) | 2012-02-28 | 2012-02-28 | Image quality evaluation method and apparatus for ultrasonic imaging apparatus |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2013176409A JP2013176409A (en) | 2013-09-09 |
JP5822756B2 true JP5822756B2 (en) | 2015-11-24 |
Family
ID=49268747
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2012040887A Expired - Fee Related JP5822756B2 (en) | 2012-02-28 | 2012-02-28 | Image quality evaluation method and apparatus for ultrasonic imaging apparatus |
Country Status (1)
Country | Link |
---|---|
JP (1) | JP5822756B2 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102656543B1 (en) * | 2018-03-16 | 2024-04-12 | 삼성메디슨 주식회사 | Medical imaging apparatus, method for controlling the same, and computer program |
US20190374165A1 (en) * | 2018-06-07 | 2019-12-12 | Canon Medical Systems Corporation | Image processing apparatus and method |
JP2020049062A (en) * | 2018-09-28 | 2020-04-02 | ゼネラル・エレクトリック・カンパニイ | Ultrasound image display apparatus |
JP7446736B2 (en) | 2019-08-09 | 2024-03-11 | キヤノンメディカルシステムズ株式会社 | Medical data processing equipment and medical image diagnostic equipment |
KR102545714B1 (en) * | 2020-10-07 | 2023-06-21 | 주식회사 오큐라이트 | Apparatus and method of predicting visualization parameter of medical image based on artificial neural network |
JP7225345B1 (en) | 2021-10-18 | 2023-02-20 | ジーイー・プレシジョン・ヘルスケア・エルエルシー | ULTRASOUND DIAGNOSTIC DEVICE AND DISPLAY METHOD |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4795470B2 (en) * | 2010-02-03 | 2011-10-19 | キヤノン株式会社 | Image processing apparatus and program |
- 2012-02-28: JP application JP2012040887A filed; granted as JP5822756B2; status: Expired - Fee Related
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A621 | Written request for application examination |
Free format text: JAPANESE INTERMEDIATE CODE: A621 Effective date: 20140807 |
|
A977 | Report on retrieval |
Free format text: JAPANESE INTERMEDIATE CODE: A971007 Effective date: 20150519 |
|
A131 | Notification of reasons for refusal |
Free format text: JAPANESE INTERMEDIATE CODE: A131 Effective date: 20150616 |
|
A521 | Written amendment |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20150817 |
|
TRDD | Decision of grant or rejection written | ||
A01 | Written decision to grant a patent or to grant a registration (utility model) |
Free format text: JAPANESE INTERMEDIATE CODE: A01 Effective date: 20150908 |
|
A61 | First payment of annual fees (during grant procedure) |
Free format text: JAPANESE INTERMEDIATE CODE: A61 Effective date: 20151006 |
|
R151 | Written notification of patent or utility model registration |
Ref document number: 5822756 Country of ref document: JP Free format text: JAPANESE INTERMEDIATE CODE: R151 |
|
LAPS | Cancellation because of no payment of annual fees |