WO2017056779A1 - Ultrasonic tissue detection device, ultrasonic tissue detection method, and ultrasonic tissue detection program - Google Patents

Ultrasonic tissue detection device, ultrasonic tissue detection method, and ultrasonic tissue detection program

Info

Publication number
WO2017056779A1
WO2017056779A1 (PCT/JP2016/074329, JP2016074329W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
transverse
boundary
echo
cross
Prior art date
Application number
PCT/JP2016/074329
Other languages
English (en)
Japanese (ja)
Inventor
竜雄 新井
Original Assignee
古野電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 古野電気株式会社 filed Critical 古野電気株式会社
Priority to CN201680056473.0A priority Critical patent/CN108135578B/zh
Priority to JP2017543010A priority patent/JP6535097B2/ja
Publication of WO2017056779A1 publication Critical patent/WO2017056779A1/fr

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 - Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13 - Tomography
    • A61B 8/14 - Echo-tomography

Definitions

  • the present invention relates to a technique for detecting a measurement target site, such as muscle tissue, from an echo image captured by transmitting and receiving ultrasonic waves in a body part such as the abdomen of a human body.
  • an internal structure such as the abdomen may be imaged using an ultrasonic tissue detection device.
  • the ultrasonic tissue detection device generates an echo image of the inside of the human body by transmitting ultrasonic waves from the surface of the human body and receiving the ultrasonic waves reflected inside the body.
  • the abdomen of the human body has a structure in which a plurality of tissues overlap in the order of the epidermis, subcutaneous tissue, adipose tissue, muscle tissue, and viscera from the body surface to the inside of the body. Furthermore, tissues such as blood vessels and diaphragms are also present inside each tissue. Therefore, in the ultrasonic tissue detection apparatus, the ultrasonic wave transmitted into the body is reflected at the boundary between the tissues, and an echo image in which a part of the boundary between the various tissues is captured as a linear or dotted image is obtained.
  • the operator of the ultrasonic tissue detection apparatus visually inspects the many images on the echo image and estimates the boundary of the internal tissue to be measured on the body surface side and its boundary on the inside of the body. The operator then adjusts the positions of two cursors displayed on the operation screen so that each matches a boundary of the internal tissue on the echo image. The interval between the cursors on the operation screen thereby corresponds to the thickness of the internal tissue, and the operator has conventionally grasped that thickness by reading a scale or a numerical value indicating it on the operation screen.
  • the technique disclosed in Patent Document 1 detects the boundary of an internal tissue from among the various images on an echo image obtained with ultrasonic waves, based on the length of each image and the direction in which it extends.
  • the technique disclosed in Patent Document 2 searches and tracks a linear image from the vicinity of a position on an echo image designated by an operator, and detects the linear image as a boundary of internal tissue.
  • each of the above-described conventional techniques detects a boundary regardless of the type of internal tissue, and does not detect a specific internal tissue such as muscle tissue or the boundaries of that internal tissue. Even with these conventional techniques, the operator therefore has to determine the boundary of the target internal tissue visually, and it has not been possible to automatically detect and display a specific internal tissue and its boundaries or to automatically measure its thickness.
  • an object of the present invention is to provide an ultrasonic tissue detection device that automatically detects a specific internal tissue and its boundaries, with at least a certain degree of accuracy, from an echo image of a subject such as the abdomen captured using ultrasonic waves.
  • An ultrasonic tissue detection apparatus according to the present invention includes: an image acquisition unit that acquires an echo image based on echoes, from inside the body, of ultrasonic waves transmitted into the body from the surface of a subject including a measurement target site; a transverse image detection unit that detects a plurality of transverse images extending in a direction intersecting the transmission direction of the ultrasonic waves; and a boundary estimation unit that selects, from the plurality of transverse images detected by the transverse image detection unit, two transverse images corresponding to the measurement target site of the subject based on a feature amount of the transverse images, and estimates the two selected transverse images to be the boundaries of the measurement target site.
  • in this configuration, the boundary estimation unit selects the two transverse images corresponding to the measurement target site of the subject based on the feature amounts of the transverse images. Each boundary of the measurement target site can therefore be detected automatically in accordance with the specific measurement target site (for example, an internal tissue including muscle tissue).
  • a boundary of a specific internal tissue can be automatically detected with an accuracy of a certain level or more from an echo image of a subject such as an abdomen imaged using ultrasonic waves.
  • FIG. 1 is a configuration diagram of an ultrasonic tissue detection apparatus according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing a processing flow of the ultrasonic tissue detection apparatus according to the embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a cross-sectional structure of an abdomen corresponding to an imaging target and an echo image obtained by imaging the abdomen.
  • FIG. 4 is a diagram showing a processing flow of the transverse image detection processing according to the embodiment of the present invention.
  • FIG. 5 is a diagram illustrating an image converted by the transverse image detection process according to the embodiment of the invention.
  • FIG. 6 is a diagram showing a processing flow of the boundary detection processing according to the embodiment of the present invention.
  • FIG. 7 is a diagram schematically showing the image of FIG. 5.
  • FIG. 8 is a diagram illustrating a processing example by the boundary processing unit according to the embodiment of the present invention.
  • FIG. 1 is a configuration diagram of an ultrasonic tissue detection apparatus according to the first embodiment of the present invention.
  • the probe 2 has a substantially columnar shape, for example, and is configured so that an operator can hold and move the probe 2.
  • a cable is connected to the upper end of the probe 2, and the probe 2 is connected to the interface 10 of the image processing apparatus 11 via the cable.
  • Probe 2 receives a transmission signal from the image processing device 11.
  • the lower end surface of the probe 2 is configured as an ultrasonic transmission/reception surface, and ultrasonic waves are transmitted from this surface when a transmission signal is input. Accordingly, when the probe 2 receives the transmission signal from the image processing apparatus 11 while the operator presses its lower end surface against the subject (in this embodiment, the abdomen 101 of the human body), ultrasonic waves are transmitted into the body of the abdomen 101. The probe 2 further receives the ultrasonic echoes reflected inside the body of the abdomen 101 and outputs a reception signal corresponding to the reception level of the ultrasonic waves to the image processing apparatus 11.
  • the image processing apparatus 11 includes a transmission / reception processing unit 3, an image display unit 8, a control unit 9, and an interface (I / F) 10.
  • the control unit 9 includes an image acquisition unit 4, a transverse image detection unit 5, a boundary estimation unit 6, and a boundary processing unit 7.
  • the control unit 9 includes a CPU (computer) and a storage unit.
  • the image acquisition unit 4, the transverse image detection unit 5, the boundary estimation unit 6, and the boundary processing unit 7 are implemented as software when the CPU executes an ultrasonic tissue detection program installed in the storage unit.
  • FIG. 2 is a flowchart illustrating a schematic processing flow of the image processing apparatus 11.
  • the transmission / reception processing unit 3 generates a transmission signal obtained by shaping a signal having a frequency in the ultrasonic region into a pulse waveform, and outputs it to the probe 2 via the interface 10 (FIG. 2: S101). Thereby, the probe 2 is driven and ultrasonic waves are transmitted from the probe 2 to the abdomen 101.
  • the transmission / reception processing unit 3 receives the reception signal output from the probe 2 and performs processing such as analog-digital conversion on the reception signal (FIG. 2: S102).
  • the transmission/reception processing unit 3 repeats this processing at regular time intervals, repeatedly outputting transmission signals and receiving the corresponding reception signals.
  • the image acquisition unit 4 receives a reception signal that has been processed by the transmission / reception processing unit 3 such as analog-digital conversion.
  • the image acquisition unit 4 generates a first image (echo image) 21 obtained by imaging an echo inside the body of the abdomen 101 based on the received reception signal (FIG. 2: S103, image acquisition step).
  • the first image 21 is obtained by setting the luminance corresponding to the received signal intensity of the echo to the pixel corresponding to the position where the echo received by the probe 2 is reflected by the abdomen 101.
  • FIG. 3A is a diagram illustrating a schematic structure of the abdomen 101.
  • FIG. 3B is a diagram illustrating the first image 21 obtained from the abdomen 101.
  • the abdomen 101 has a structure in which the epidermis, subcutaneous tissue, adipose tissue, muscle tissue, and viscera are arranged in order from the body surface side to the inside of the body.
  • the ultrasonic waves transmitted from the probe 2 to the abdomen 101 are reflected by the epidermis, subcutaneous tissue, the boundary between the subcutaneous tissue and the fat tissue, the boundary between the fat tissue and the muscle tissue, the boundary between the muscle tissue and the internal organs, and the like.
  • as a result, in the first image 21, a plurality of high-luminance (white) linear images appear at the positions where the ultrasonic waves are reflected, arranged from the epidermis side toward the inside of the body.
  • the transverse image detector 5 shown in FIG. 1 performs an image conversion process on the first image 21 and detects a plurality of transverse images that appear on the first image 21 (FIG. 2: S104, Transverse image detection step).
  • a transverse image is defined as a linear image on the first image 21 shown in FIG. 3(B) that extends in a direction intersecting the transmission direction of the ultrasonic waves, that is, intersecting the direction from the epidermis toward the inside of the body (the downward direction in the drawing), and that crosses the first image 21.
  • the boundary estimation unit 6 compares the plurality of transverse images detected by the transverse image detection unit 5 with one another, and estimates which of them correspond to boundaries of an internal tissue (FIG. 2: S105, Boundary estimation step).
  • specifically, the boundary estimation unit 6 selects, based on feature amounts of the transverse images (for example, their positional relationship and echo intensity, as described below), the two transverse images corresponding to the measurement target site of the subject (in this embodiment, an internal tissue including muscle tissue), and sets the two selected transverse images as the boundaries of that internal tissue. Based on the transverse images estimated by the boundary estimation unit 6 to be the boundaries of the internal tissue, the boundary processing unit 7 then performs predetermined processing such as highlighting the transverse images and displaying the interval between them, that is, highlighting the internal tissue or displaying its thickness (FIG. 2: S106, predetermined processing step).
  • the image display unit 8 includes, for example, a display that displays an echo image or the like, and performs highlighting and thickness display of the transverse image that has been subjected to predetermined processing by the boundary processing unit 7.
  • in this way, the ultrasonic tissue detection device 1 can automatically estimate the boundaries of an internal tissue such as muscle tissue from an image of the abdomen or the like captured using ultrasonic waves.
  • FIG. 4 is a diagram illustrating a detailed processing flow of the image conversion processing in the cross-sectional image detection unit 5.
  • FIG. 5 is a diagram illustrating an image that has been subjected to the image conversion processing in the transverse image detection unit 5.
  • FIG. 7 is a diagram schematically illustrating an image that has been subjected to the image conversion processing in the cross-sectional image detection unit 5.
  • the transverse image detection unit 5 first divides the first image 21 into a second image 22 on the subcutaneous tissue side and a third image 23 on the muscle tissue side (FIG. 4: S111). For example, the transverse image detection unit 5 first removes the extracorporeal part from the first image 21. It then sums, for each depth, the luminances of the pixels lined up in the transverse direction of the first image 21 from which the extracorporeal part has been removed, and generates a luminance profile in which the summed luminances are arranged along the depth direction. Using the position closest to the epidermis among the positions where this luminance profile is locally minimal as a division position, it divides the first image 21 into the second image 22 on the subcutaneous tissue side and the third image 23 on the muscle tissue side.
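  • As a rough illustration of this splitting step, the following is a minimal sketch, not the patent's implementation: it assumes the echo image is a 2-D NumPy array indexed as [depth, transverse] with the epidermis at row 0, and the function name is invented for this example.

```python
import numpy as np

def split_at_shallowest_minimum(echo_image: np.ndarray):
    """Split an echo image (rows = depth, columns = transverse direction)
    into a subcutaneous-tissue-side part and a muscle-tissue-side part at
    the luminance-profile minimum closest to the epidermis (row 0)."""
    # Luminance profile: summed brightness of each row, arranged along depth.
    profile = echo_image.sum(axis=1)

    # Local minima of the profile (interior rows only).
    is_min = (profile[1:-1] < profile[:-2]) & (profile[1:-1] < profile[2:])
    minima = np.where(is_min)[0] + 1
    if minima.size == 0:
        raise ValueError("no local minimum found in luminance profile")

    split_row = int(minima[0])               # minimum closest to the epidermis
    second_image = echo_image[:split_row]    # subcutaneous-tissue side
    third_image = echo_image[split_row:]     # muscle-tissue side
    return second_image, third_image, split_row
```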
  • the transverse image detector 5 detects the first transverse image 31 corresponding to the boundary of the subcutaneous tissue from the second image 22 on the subcutaneous tissue side (FIG. 4: S112, FIG. 5 (A), FIG. 7 (A)).
  • the transverse image detector 5 first applies the Dijkstra method to the second image 22 on the subcutaneous tissue side, and detects the first transverse image 31 having the strongest echo in this region.
  • in outline, the Dijkstra method used here is a kind of optimization algorithm that generates a cost map in which a cost axis, where higher luminance is converted into lower cost, is added to the depth axis and the transverse axis of the image, and that then searches this cost map for the lowest-cost (shortest) path running along the transverse axis.
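  • A minimal sketch of this kind of lowest-cost path search is shown below. It is not the patent's implementation: instead of a full Dijkstra search it uses column-by-column dynamic programming restricted to steps between neighbouring rows (which yields the same lowest-cost left-to-right path under that restriction), the cost mapping is a simple inversion of the luminance, and all names are illustrative.

```python
import numpy as np

def lowest_cost_transverse_path(echo_image: np.ndarray) -> np.ndarray:
    """Return, for each column, the row index of a lowest-cost path that
    crosses the image from its left edge to its right edge.

    Higher luminance is mapped to lower cost, so the path tends to follow
    the brightest (strongest-echo) transverse line in the image."""
    cost = echo_image.max() - echo_image.astype(float)  # bright pixels are cheap
    n_rows, n_cols = cost.shape

    total = np.full((n_rows, n_cols), np.inf)
    parent = np.zeros((n_rows, n_cols), dtype=int)
    total[:, 0] = cost[:, 0]

    # Dynamic programming over columns; each step may stay on the same row
    # or move to an adjacent row (a restricted form of shortest-path search).
    for c in range(1, n_cols):
        for r in range(n_rows):
            lo, hi = max(0, r - 1), min(n_rows, r + 2)
            prev = total[lo:hi, c - 1]
            k = int(np.argmin(prev))
            total[r, c] = cost[r, c] + prev[k]
            parent[r, c] = lo + k

    # Trace the cheapest path back from the last column.
    path = np.empty(n_cols, dtype=int)
    path[-1] = int(np.argmin(total[:, -1]))
    for c in range(n_cols - 1, 0, -1):
        path[c - 1] = parent[path[c], c]
    return path
```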
  • the edge conversion processing is a kind of processing algorithm that, within a region where high luminance is distributed along the depth direction, extracts only the end portion of that region in the depth direction (see FIG. 5(B)). By detecting the transverse image from an image subjected to this edge conversion processing, the influence of unnecessary echoes around the transverse image is removed, and the boundary of the internal tissue can be estimated with higher accuracy.
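  • The sketch below shows one plausible form of such an edge conversion, assuming the same [depth, transverse] NumPy layout as above; the threshold parameter and the choice of keeping the deeper end of each bright run are assumptions made for the illustration, not details taken from the patent.

```python
import numpy as np

def edge_convert(echo_image: np.ndarray, threshold: float) -> np.ndarray:
    """Collapse each depth-direction run of high-luminance pixels to its
    deepest pixel, so that a thick bright band is reduced to a thin edge."""
    bright = echo_image >= threshold
    # A pixel is an end point in the depth direction if it is bright and the
    # pixel just below it (next row) is not bright, or it is in the last row.
    below_not_bright = np.ones_like(bright)
    below_not_bright[:-1, :] = ~bright[1:, :]
    edges = bright & below_not_bright
    return np.where(edges, echo_image, 0)
```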
  • the transverse image detector 5 detects the second transverse image 32 from the third image 23 on the muscle tissue side (see FIG. 4: S113, FIG. 5 (A), FIG. 7 (A)).
  • the Dijkstra method is used to detect the second transverse image 32 having the strongest echo in this region.
  • the Dijkstra method may be applied after performing edge conversion processing on the third image 23 on the muscle tissue side.
  • the transverse image detection unit 5 further divides the third image 23 on the muscle tissue side, with the previously detected second transverse image 32 as the boundary, into a fourth image 24 on the body surface side (see FIG. 5(C) and FIG. 7(B)) and a fifth image 25 on the inside of the body (FIG. 4: S114).
  • the transverse image detector 5 detects the third transverse image 33 from the fourth image 24 on the body surface side (see FIG. 4: S115, FIG. 5C, FIG. 7B). Also here, for example, the third transverse image 33 having the strongest echo in this region is detected using the Dijkstra method. Alternatively, the Dijkstra method may be applied after performing edge conversion processing on the fourth image 24 on the body surface side.
  • the transverse image detector 5 detects the fourth transverse image 34 from the fifth image 25 inside the body (see FIG. 4: S116, FIG. 5D, FIG. 7B). Again, for example, the fourth transverse image 34 having the strongest echo in this region is detected using the Dijkstra method. Alternatively, the Dijkstra method may be applied after performing edge conversion processing on the fifth image 25 inside the body.
  • the transverse image detector 5 detects the first to fourth transverse images 31 to 34.
  • the first transverse image 31 is a transverse image corresponding to the boundary of the subcutaneous tissue. Any two of the first to fourth transverse images 31 to 34 are transverse images corresponding to the boundary on the body surface side of the muscle tissue and the boundary inside the body of the muscle tissue. Accordingly, the transverse images corresponding to the two boundaries of the muscle tissue are estimated from the first to fourth transverse images 31 to 34 detected by the transverse image detector 5 as follows.
  • FIG. 6 is a diagram illustrating a detailed processing flow of the boundary estimation process in the boundary estimation unit 6.
  • the boundary estimation unit 6 compares the four transverse images detected by the transverse image detection unit 5 with one another, and performs processing to estimate which boundary of which internal tissue each of them corresponds to.
  • first, the boundary estimation unit 6 regards the second transverse image 32 (see FIGS. 5(A), 7(A), and 7(B)) previously detected by the transverse image detection unit 5 as the first boundary of the muscle tissue (FIG. 6: S121).
  • this is because, in the region on the inner side of the subcutaneous tissue in the first image 21, that is, in the third image 23 (see FIGS. 5(A) and 7(A)), the echo usually becomes strongest at either the body-surface-side boundary of the muscle tissue or its boundary on the inside of the body.
  • next, the boundary estimation unit 6 detects a first feature amount from each of the third transverse image 33 (see FIGS. 5(C) and 7(B)) and the fourth transverse image 34 (see FIGS. 5(D) and 7(B)) (FIG. 6: S122).
  • the first feature amount relates to, for example, the strength of echo in the transverse image.
  • for example, the boundary estimation unit 6 may calculate the total or average luminance of the pixels on the transverse images 33 and 34 as the first feature amount related to echo strength. In this case, since unclear echoes are likely to occur near the edges of the image, it is preferable to compute this value from pixels extracted over about 70% of the image width; the echo strength can then be detected with higher accuracy. This processing may also be repeated while the 70%-wide extraction window is shifted along the width direction of the image.
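  • A minimal sketch of such an echo-strength feature follows, reusing the path representation from the earlier sketches (one row index per column); the 70% figure follows the text above, and everything else is an illustrative assumption.

```python
import numpy as np

def echo_strength(echo_image: np.ndarray, path_rows: np.ndarray,
                  center_fraction: float = 0.7) -> float:
    """Average luminance along a detected transverse path, evaluated only
    over the central portion of the image width to avoid the unclear
    echoes that tend to appear near the left and right edges."""
    n_cols = echo_image.shape[1]
    margin = int(round(n_cols * (1.0 - center_fraction) / 2.0))
    cols = np.arange(margin, n_cols - margin)
    return float(echo_image[path_rows[cols], cols].mean())
```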
  • the boundary estimation unit 6 detects the second feature amount from each of the third transverse image 33 and the fourth transverse image 34 (FIG. 6: S123).
  • the second feature amount relates to the height of linearity in the transverse image, for example.
  • for example, the boundary estimation unit 6 obtains an approximate straight line along each of the transverse images 33 and 34, and calculates, as the second feature amount, the reciprocal of the sum of squared deviations (errors) of the pixels on each transverse image with respect to its approximate straight line.
  • here too, this value is preferably computed from pixels extracted over about 70% of the image width, so that the degree of linearity can be detected with higher accuracy; the processing may be repeated while the extraction window is shifted along the width direction of the image.
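  • A corresponding sketch of the linearity feature, under the same assumptions as above (the small constant added to the denominator only guards against division by zero and is not from the patent):

```python
import numpy as np

def linearity(path_rows: np.ndarray, center_fraction: float = 0.7) -> float:
    """Reciprocal of the sum of squared deviations of a transverse path
    from its own least-squares straight-line fit (larger means straighter),
    evaluated over the central portion of the image width."""
    n_cols = path_rows.size
    margin = int(round(n_cols * (1.0 - center_fraction) / 2.0))
    cols = np.arange(margin, n_cols - margin)
    rows = path_rows[cols].astype(float)

    slope, intercept = np.polyfit(cols, rows, 1)   # approximate straight line
    residuals = rows - (slope * cols + intercept)
    return 1.0 / (float(np.sum(residuals ** 2)) + 1e-9)
```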
  • the boundary estimation unit 6 then calculates an evaluation point for the third transverse image 33 and an evaluation point for the fourth transverse image 34, using an evaluation function whose terms are the first feature amount and the second feature amount obtained above (FIG. 6: S124).
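  • One plausible shape for such an evaluation function, combining the two hypothetical feature helpers sketched above with arbitrary placeholder weights (the weighting is not specified in the text):

```python
def evaluation_point(echo_image, path_rows,
                     w_strength: float = 1.0, w_linearity: float = 1.0) -> float:
    """Evaluation function whose terms are the echo-strength and linearity
    feature amounts; the weights are illustrative placeholders."""
    return (w_strength * echo_strength(echo_image, path_rows)
            + w_linearity * linearity(path_rows))

# Example use: score the third and fourth transverse images and compare.
# score_33 = evaluation_point(first_image, path_33)
# score_34 = evaluation_point(first_image, path_34)
```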
  • when the comparison of these evaluation points favors the fourth transverse image 34, the boundary estimation unit 6 regards the fourth transverse image 34 as the second boundary of the muscle tissue (FIG. 6: S125).
  • otherwise, the boundary estimation unit 6 performs a detection determination for the second boundary of the muscle tissue using the third transverse image 33 and the first transverse image 31 (see FIGS. 5(A) and 7(B)). This is because the thickness of the fat layer lying on the body surface side of the muscle tissue in the abdomen 101 varies greatly between individuals, and when there is almost no fat tissue between the subcutaneous tissue and the muscle tissue, the first transverse image 31 may almost coincide with the second boundary of the muscle tissue.
  • the boundary estimation unit 6 first detects a third feature amount from the first transverse image 31 and the third transverse image 33 (FIG. 6: S126).
  • the third feature amount relates to the interval between the first transverse image 31 and the third transverse image 33.
  • when this interval is smaller than a threshold, that is, when the interval between the first transverse image 31 and the third transverse image 33 is extremely narrow, there is a high risk that the third transverse image 33 is a transverse image in which no actual boundary has been captured, and the boundary estimation unit 6 therefore regards the first transverse image 31 as the second boundary of the muscle tissue (FIG. 6: S127).
  • otherwise, the boundary estimation unit 6 detects a fourth feature amount from each of the first transverse image 31 and the third transverse image 33 (FIG. 6: S128).
  • the fourth feature amount relates to, for example, the strength of echo in the transverse image.
  • the boundary estimation unit 6 regards, of the first transverse image 31 and the third transverse image 33, the one with the larger fourth feature amount as the second boundary of the muscle tissue (FIG. 6: S129).
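  • The branch described in S126 to S129 might look roughly like the following, again reusing the hypothetical echo_strength helper from above; the gap measure (mean absolute row difference) and the threshold value are assumptions made for the sketch.

```python
import numpy as np

def choose_second_boundary(echo_image: np.ndarray,
                           path_31: np.ndarray, path_33: np.ndarray,
                           min_gap_px: int = 10) -> np.ndarray:
    """Decide whether the first or the third transverse image should be
    taken as the second boundary of the muscle tissue."""
    # Third feature amount: interval between the two transverse images.
    gap = float(np.mean(np.abs(path_33.astype(int) - path_31.astype(int))))
    if gap < min_gap_px:
        # The two images are extremely close: the third one is likely not a
        # real boundary, so fall back to the first transverse image.
        return path_31
    # Fourth feature amount: echo strength; the stronger echo wins.
    if echo_strength(echo_image, path_31) > echo_strength(echo_image, path_33):
        return path_31
    return path_33
```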
  • in this way, the boundary estimation unit 6 regards the second transverse image 32 as the first boundary of the muscle tissue, and regards one of the first, third, and fourth transverse images 31, 33, and 34 as the second boundary of the muscle tissue.
  • the boundary processing unit 7 performs predetermined processing, such as highlighting and thickness measurement or display, on the transverse image determined by the boundary estimation unit 6 to be the first boundary of the muscle tissue and on the transverse image determined to be the second boundary. Since the portion excluding the subcutaneous tissue and the muscle tissue corresponds to fat tissue, the boundary processing unit 7 may also perform predetermined processing on the fat tissue, such as highlighting it and measuring or displaying its thickness.
  • for example, when the fourth transverse image 34 is regarded as the second boundary, the boundary processing unit 7 causes the image display unit 8 to visually display the portion between the second transverse image 32 and the fourth transverse image 34 as muscle tissue. Likewise, when the third transverse image 33 is regarded as the second boundary, the boundary processing unit 7 causes the image display unit 8 to visually display the portion between the second transverse image 32 and the third transverse image 33 as muscle tissue.
  • when measuring and displaying the thickness of an internal tissue such as muscle tissue, the boundary processing unit 7 causes the image display unit 8 to display display objects such as measurement bars indicating the transverse images 31 to 34 and arrows or numerical values indicating the measurement range or the distance between the transverse images.
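  • For the thickness value itself, a conversion along these lines is presumably sufficient; the mm_per_pixel depth resolution is an assumed parameter of the sketch, not a value given in the text.

```python
import numpy as np

def thickness_mm(path_upper: np.ndarray, path_lower: np.ndarray,
                 mm_per_pixel: float) -> float:
    """Mean depth-direction distance between two boundary paths, converted
    to millimetres using the image's depth resolution."""
    gap_px = np.mean(np.abs(path_lower.astype(int) - path_upper.astype(int)))
    return float(gap_px) * mm_per_pixel
```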
  • as described above, the ultrasonic tissue detection apparatus 1 uses ultrasonic waves to measure a site such as the abdomen 101 in which no bone tissue is detected, and the boundary estimation unit 6 selects, from the plurality of transverse images 31 to 34 detected by the transverse image detection unit 5, two transverse images corresponding to the measurement target site of the subject based on the feature amounts of the transverse images 31 to 34.
  • the boundaries of the measurement target site can therefore be detected automatically in accordance with the specific measurement target site, for example an internal tissue such as muscle tissue or fat tissue. Consequently, even when an operator with a low level of proficiency performs the measurement, the thickness of muscle tissue or the like can be detected with at least a certain level of accuracy.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The present invention aims to provide an ultrasonic tissue detection device for automatically detecting a boundary of a specific internal tissue, with at least a certain accuracy, from an echo image of an abdomen or the like captured using ultrasonic waves. An ultrasonic tissue detection device comprises: an image acquisition unit (4) for acquiring an echo image based on an echo, from inside a body, of ultrasonic waves transmitted into the body from the surface of a subject including a measurement target region; a transverse image detection unit (5) for detecting a plurality of transverse images extending in a direction intersecting the transmission direction of the ultrasonic waves; and a boundary estimation unit (6) for selecting, from the plurality of transverse images detected by the transverse image detection unit (5), two transverse images corresponding to the measurement target region of the subject on the basis of a feature value of the transverse images, and using the two selected transverse images as the boundaries of the measurement target region.
PCT/JP2016/074329 2015-09-29 2016-08-22 Ultrasonic tissue detection device, ultrasonic tissue detection method, and ultrasonic tissue detection program WO2017056779A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201680056473.0A CN108135578B (zh) 2015-09-29 2016-08-22 超声波组织检测装置、超声波组织检测方法及超声波组织检测程序
JP2017543010A JP6535097B2 (ja) 2015-09-29 2016-08-22 超音波組織検出装置、超音波組織検出方法、および、超音波組織検出プログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015191427 2015-09-29
JP2015-191427 2015-09-29

Publications (1)

Publication Number Publication Date
WO2017056779A1 true WO2017056779A1 (fr) 2017-04-06

Family

ID=58427460

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/074329 WO2017056779A1 (fr) 2015-09-29 2016-08-22 Ultrasonic tissue detection device, ultrasonic tissue detection method, and ultrasonic tissue detection program

Country Status (3)

Country Link
JP (1) JP6535097B2 (fr)
CN (1) CN108135578B (fr)
WO (1) WO2017056779A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018198792A (ja) * 2017-05-26 2018-12-20 株式会社グローバルヘルス 情報処理装置

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7032533B2 (ja) * 2018-07-13 2022-03-08 古野電気株式会社 超音波撮像装置、超音波撮像システム、超音波撮像方法および超音波撮像プログラム

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014501593A (ja) * 2011-01-05 2014-01-23 コーニンクレッカ フィリップス エヌ ヴェ 体の実際の組織層境界を決定するためのデバイス及び方法
JP2015104477A (ja) * 2013-11-29 2015-06-08 セイコーエプソン株式会社 超音波測定装置および超音波測定方法

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4116122B2 (ja) * 1997-11-28 2008-07-09 株式会社東芝 超音波診断装置及び超音波画像処理装置
US6561980B1 (en) * 2000-05-23 2003-05-13 Alpha Intervention Technology, Inc Automatic segmentation of prostate, rectum and urethra in ultrasound imaging
US6482161B1 (en) * 2000-06-29 2002-11-19 Acuson Corporation Medical diagnostic ultrasound system and method for vessel structure analysis
CN101422378B (zh) * 2003-05-08 2012-07-18 株式会社日立医药 超声诊断设备
JP4262517B2 (ja) * 2003-05-16 2009-05-13 オリンパス株式会社 超音波画像処理装置
JP4299189B2 (ja) * 2004-05-27 2009-07-22 アロカ株式会社 超音波診断装置及び画像処理方法
ATE481652T1 (de) * 2004-11-25 2010-10-15 Tomtec Imaging Syst Gmbh Ultraschallverfahren und -gerät zur detektion von objektbewegungen
US8144956B2 (en) * 2006-03-20 2012-03-27 Koninklijke Philips Electronics N.V. Ultrasonic diagnosis by quantification of myocardial performance
JP4797194B2 (ja) * 2006-05-09 2011-10-19 独立行政法人産業技術総合研究所 超音波断層画像による生体組織評価システム
US8947629B2 (en) * 2007-05-04 2015-02-03 Asml Netherlands B.V. Cleaning device, a lithographic apparatus and a lithographic apparatus cleaning method
CN101744639A (zh) * 2008-12-19 2010-06-23 Ge医疗***环球技术有限公司 超声成像方法及设备
JP5691720B2 (ja) * 2011-03-25 2015-04-01 コニカミノルタ株式会社 超音波診断装置
JP5351925B2 (ja) * 2011-04-05 2013-11-27 株式会社日立製作所 スチールコードを含む移送機構用長尺部材の点検装置及び点検方法
CN103156637B (zh) * 2011-12-12 2017-06-20 Ge医疗***环球技术有限公司 超声体积图像数据处理方法和设备
JP2015016144A (ja) * 2013-07-11 2015-01-29 セイコーエプソン株式会社 超音波測定装置、超音波画像装置及び超音波測定方法
JP2015080570A (ja) * 2013-10-22 2015-04-27 セイコーエプソン株式会社 超音波測定装置および超音波測定方法

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014501593A (ja) * 2011-01-05 2014-01-23 コーニンクレッカ フィリップス エヌ ヴェ 体の実際の組織層境界を決定するためのデバイス及び方法
JP2015104477A (ja) * 2013-11-29 2015-06-08 セイコーエプソン株式会社 超音波測定装置および超音波測定方法

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018198792A (ja) * 2017-05-26 2018-12-20 株式会社グローバルヘルス 情報処理装置
JP2022016620A (ja) * 2017-05-26 2022-01-21 株式会社グローバルヘルス 情報処理装置

Also Published As

Publication number Publication date
JPWO2017056779A1 (ja) 2018-08-09
CN108135578B (zh) 2021-01-12
CN108135578A (zh) 2018-06-08
JP6535097B2 (ja) 2019-06-26

Similar Documents

Publication Publication Date Title
EP3554380B1 (fr) Positionnement d'une sonde cible pour imagerie pulmonaire par ultrasons
JP6490809B2 (ja) 超音波診断装置、及び画像処理方法
US20120065499A1 (en) Medical image diagnosis device and region-of-interest setting method therefore
JP6389521B2 (ja) 頸動脈狭窄の自動スクリーニングのための非撮像型2次元アレイプローブ及びシステム
JP5735914B2 (ja) 超音波診断装置とその関心領域設定方法
JP7336443B2 (ja) 超音波撮像システム、装置、方法及び記憶媒体
EP2924656B1 (fr) Appareil et procédé de génération d'image de diagnostic
US20170124701A1 (en) System and method for measuring artery thickness using ultrasound imaging
JPWO2013105197A1 (ja) 超音波診断装置、および、血管特定方法
US20140371593A1 (en) Ultrasound diagnostic device and method for controlling ultrasound diagnostic device
JP2015500120A (ja) 処理装置及び記憶媒体
WO2015011585A1 (fr) Sonde matricielle en deux dimensions sans imagerie et système de classification de sténose de la carotide
US10736608B2 (en) Ultrasound diagnostic device and ultrasound image processing method
WO2015011599A1 (fr) Procédé pour l'alignement spatial de sous-volumes de données ultrasonores d'un vaisseau sanguin
EP2910192A1 (fr) Appareil de mesure à ultrasons et procédé de mesure par ultrasons
WO2017056779A1 (fr) Dispositif de détection de tissu par ultrasons, procédé de détection de tissu par ultrasons, et programme de détection de tissu par ultrasons
US20140276062A1 (en) Ultrasound diagnostic device and method for controlling ultrasound diagnostic device
KR20130095160A (ko) 초음파 장치 및 초음파 영상 생성 방법
US20140249417A1 (en) Ultrasound diagnostic device and ultrasound diagnostic device control method
JP4322305B1 (ja) 内臓脂肪肥満検査装置およびプログラム
US20200305844A1 (en) Ultrasonic diagnostic apparatus, tracing method, and program
JP2018011634A5 (fr)
US20130116560A1 (en) Ultrasound temperature mapping system and method
EP4382052A1 (fr) Détermination d'un profil d'écoulement dans une artère sur la base de données d'imagerie ultrasonore
JP2008220389A (ja) 超音波診断装置、超音波診断支援装置、及び超音波診断支援プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16850955

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017543010

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16850955

Country of ref document: EP

Kind code of ref document: A1