WO2023089845A1 - Ultrasonic imaging device, ultrasonic imaging system, ultrasonic imaging method, and ultrasonic imaging program - Google Patents

Ultrasonic imaging device, ultrasonic imaging system, ultrasonic imaging method, and ultrasonic imaging program

Info

Publication number
WO2023089845A1
Authority
WO
WIPO (PCT)
Prior art keywords
probe
ultrasonic
subject
pressure
image
Prior art date
Application number
PCT/JP2022/015562
Other languages
English (en)
Japanese (ja)
Inventor
健介 井芹
勝利 前原
Original Assignee
古野電気株式会社 (Furuno Electric Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 古野電気株式会社 (Furuno Electric Co., Ltd.)
Priority to CN202280076768.XA (CN118265489A)
Publication of WO2023089845A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13 Tomography
    • A61B 8/14 Echo-tomography

Definitions

  • the present invention relates to an ultrasonic imaging apparatus, an ultrasonic imaging system, an ultrasonic imaging method, and an ultrasonic imaging program for imaging the inside of a subject using ultrasonic waves.
  • The quadriceps femoris is a muscle group of the thigh and controls movements such as pulling up the thigh and extending the knee joint. Since the muscle mass of the quadriceps femoris decreases significantly with aging, this decrease is a factor in walking difficulties and falls in the elderly. Therefore, by ascertaining the muscle mass of the quadriceps femoris, the risk of walking difficulties and falls in elderly people can be assessed. For this purpose, for example, a CT (Computed Tomography) apparatus or an MRI (Magnetic Resonance Imaging) apparatus is used.
  • Japanese Patent Laid-Open No. 2002-200000 discloses a technique for imaging cross-sections of the thigh, upper arm, abdomen, and the like of a human body, which is a subject, using a probe that transmits and receives ultrasonic waves.
  • In this technique, the operator continuously performs imaging while moving the probe along the cross section around the region to be imaged, keeping the probe at an appropriate angle with respect to the surface of the human body.
  • In this way, a panoramic composite image (hereinafter also simply referred to as a "composite image") that captures a wide cross section of the region to be imaged is obtained.
  • Ultrasound images captured with the probe include errors caused by pressing the probe against the subject. For example, if the probe is not held at an appropriate angle with respect to the surface of the human body, which is the subject, and is not pressed evenly against the surface of the subject, the captured ultrasound image will include errors due to deformation of the tissue at the surface of the subject. If the ultrasound image contains such errors, the muscle thickness and muscle cross-sectional area measured based on the ultrasound image also contain errors, making it impossible to obtain accurate measurement values.
  • Patent Literature 2 discloses a technique in which an operator adjusts the posture of a probe by detecting pressure of an ultrasonic sensor against a subject using force sensors provided at four corners of the probe.
  • When ultrasound images are continuously captured around the thigh, upper arm, abdomen, or the like of the human body, which is the subject, while moving the probe along the surface of the subject in order to obtain a panoramic composite image of the region to be imaged, the tissue at the surface of the subject is continuously deformed along the direction in which the probe moves.
  • In other words, the degree of tissue deformation due to the pressure applied by the probe changes moment by moment according to the movement of the probe along the surface of the subject.
  • An object of the present invention is to provide an ultrasonic imaging apparatus that quantifies an operator's manipulation of the probe when moving the probe along the surface of the subject.
  • An ultrasonic imaging apparatus according to the present invention includes: an ultrasonic wave receiving unit that receives, through a probe placed on the surface of a subject, a signal related to ultrasonic waves transmitted from the probe into the subject and reflected inside the subject; an image generation unit that generates ultrasound images based on the received ultrasonic signal; and a pressure estimation unit that estimates a pressure index, which is an index indicating the magnitude of the contact pressure of the probe against the subject, based on the displacement of the positions of corresponding feature points between two or more ultrasound images that are acquired at a plurality of probe positions on the surface of the subject and whose imaging regions at least partially overlap.
  • An ultrasonic imaging system according to the present invention includes: a probe that transmits ultrasonic waves from the surface of a subject toward the inside and receives the ultrasonic waves reflected inside the subject; and the ultrasonic imaging apparatus according to the present invention.
  • An ultrasonic imaging method according to the present invention includes: an ultrasonic wave receiving step of receiving, through a probe placed on the surface of a subject, a signal related to ultrasonic waves transmitted from the probe into the subject and reflected inside the subject; an image generating step of generating ultrasound images based on the received ultrasonic signal; and a pressure estimating step of estimating a pressure index, which is an index indicating the magnitude of the contact pressure of the probe against the subject, based on the displacement of the positions of corresponding feature points between two or more ultrasound images whose imaging regions at least partially overlap.
  • An ultrasonic imaging program according to the present invention causes a computer to operate as: an ultrasonic wave receiving unit that receives, through a probe placed on the surface of a subject, a signal related to ultrasonic waves transmitted from the probe into the subject and reflected inside the subject; an image generation unit that generates ultrasound images based on the received ultrasonic signal; and a pressure estimation unit that estimates a pressure index, which is an index indicating the magnitude of the contact pressure of the probe against the subject, based on the displacement of the positions of corresponding feature points between the two or more ultrasound images.
  • According to the present invention, there is provided an ultrasonic imaging apparatus that quantifies the operator's manipulation of the probe when moving the probe along the surface of the subject.
  • FIG. 1 is a schematic cross-sectional view for explaining deformation of thigh soft tissue caused by pressing the probe against the thigh.
  • FIG. 2 is a schematic cross-sectional view for explaining deformation of thigh soft tissue caused by pressing the probe against the thigh.
  • FIG. 3 is a schematic cross-sectional view for explaining deformation of thigh soft tissue caused by pressing the probe against the thigh.
  • FIG. 4 is a schematic diagram showing the configuration of an ultrasound imaging system according to one embodiment.
  • FIG. 5 is a block diagram showing the configuration of an ultrasonic imaging apparatus according to one embodiment.
  • FIG. 6 is a flow chart showing the processing procedure of an ultrasonic imaging method according to one embodiment.
  • FIG. 7 is a schematic diagram for explaining a plurality of pairs of ultrasound images that are consecutive in time series.
  • FIG. 8 is a schematic diagram for explaining the feature point matching processing performed when estimating the pressure index.
  • FIG. 9 is a flowchart showing the detailed processing procedure of step S3 shown in FIG. 6.
  • FIG. 10 is a flowchart showing the detailed processing procedure of step S6 shown in FIG. 6.
  • FIG. 11 is an example of a panoramic composite image of a cross section of the thigh when the pressure index is good.
  • FIG. 12 is an example of a panoramic composite image of a cross section of the thigh when the pressure index is not good.
  • FIG. 13 is a graph showing verification results according to Example 1.
  • FIG. 14 is a graph showing comparison results according to Example 2.
  • FIG. 15 is a graph showing comparison results according to Example 2.
  • FIG. 16 is a graph showing comparison results according to Example 2.
  • FIGS. 1 to 3 are schematic cross-sectional views for explaining the deformation of femoral soft tissue caused by pressing the probe against the femoral region.
  • With reference to FIGS. 1 to 3, the deformation of the subject that occurs when an ultrasonic image of the subject is acquired using the probe, and an index indicating the magnitude of the contact pressure of the probe against the subject (hereinafter also simply referred to as a pressure index), will be described.
  • When the probe 2 is pressed against the soft tissue 93 of the subject 9, each feature point in the soft tissue 93 moves to a different position depending on the degree of pressing of the soft tissue 93 by the probe 2 and on the positional relationship between the probe 2 and the feature point.
  • In the drawings, reference numeral 94 indicates a feature point before the soft tissue 93 is deformed, and reference numeral 95 indicates the same feature point after it has moved due to the deformation of the soft tissue 93.
  • Reference numeral 96 indicates a displacement vector (hereinafter, the displacement vector is also referred to as a movement vector) representing the movement of the feature point caused by the deformation of the soft tissue 93. Reference numeral 96X denotes the horizontal (X-direction) component of the displacement vector 96, and reference numeral 96Y denotes the vertical (Y-direction) component of the displacement vector 96.
  • the movement of feature points in soft tissue 93 that occurs when probe 2 is pressed against soft tissue 93 will be considered.
  • the lateral displacement 96X is smaller nearer the centerline 97 of the probe 2 and larger nearer both ends of the probe 2 . That is, the lateral displacement distribution varies with distance from centerline 97 .
  • the longitudinal displacement 96Y increases as it approaches the centerline 97 of the probe 2 and decreases as it approaches both ends of the probe 2 . Further, even in the vertical displacement 96Y, the magnitude of the displacement varies between the shallow region and the deep region.
  • For example, suppose that two feature points 94 are observed at positions symmetrical about the center line 97 of the probe 2 in the soft tissue 93, as shown in FIG. 2.
  • Before deformation, the two feature points 94 are at the same distance from the center line 97. Due to the deformation of the soft tissue 93 caused by the pressing force of the probe 2, in the illustrated example the left and right feature points 94 each shift by 0.5 on the X coordinate scale, to the left and to the right respectively, and move to the positions of the feature points 95. Since the two feature points 94 are symmetrical with respect to the center line 97, the magnitude of the displacement from the feature point 94 to the feature point 95 is also the same on the left and right sides.
  • Next, suppose that the state at time T1 shown in FIG. 3 is reached by moving the probe 2 laterally (to the right in the drawing: the positive direction of the X axis) from the state at time T0 shown in FIG. 2.
  • In the ultrasound image, the feature points 94 and 95 and the femur 91 also move laterally (to the left in the drawing: the negative direction of the X axis) by the amount corresponding to the lateral movement of the probe 2.
  • the lateral displacement distribution varies with distance from centerline 97 .
  • The two feature points 94 and 95 located on the left side of the center line 97 in the drawing move away from the center line 97, so their lateral displacement increases; the displacements 94a and 95a of these two left-side feature points are larger than the displacement at time T0, when the probe 2 had not yet been moved laterally. This corresponds to the leftward shift amount shown in the drawing.
  • Conversely, the two feature points 94 and 95 located on the right side of the center line 97 in the drawing move toward the center line 97, so their lateral displacement decreases; the displacements 94b and 95b of these two right-side feature points are smaller than the displacement at time T0, when the probe 2 had not yet been moved laterally. This also corresponds to the leftward shift amount shown in the drawing.
  • As described above, the amount of deviation of the feature points has different characteristics in the X-axis direction and the Y-axis direction.
  • In the X-axis direction, each feature point deviates in one direction in proportion to the degree of deformation due to pressing, and the directions of the deviations are opposite on the left and right sides of the feature points (the left and right sides of the center line 97).
  • In contrast, the amount of deviation in the Y-axis direction is highly random and acts as a noise component in the pressure index.
  • Furthermore, the degree of deformation due to pressing differs between a shallow region and a deep region within the subject 9.
  • In a shallow region, the degree of deformation is large and the amount of deviation in the X-axis direction is large.
  • In a deep region, the degree of deformation is small and the amount of deviation in the X-axis direction is small. In either region, as the degree of deformation due to pressing increases, the amount of deviation also increases.
  • In other words, the extent to which a feature point moves in the soft tissue 93, that is, the magnitude of the displacement vector of the feature point, changes according to the extent to which the probe 2 is pressed against the subject 9.
  • In the present invention, an index (pressure index) indicating the magnitude of the contact pressure of the probe against the subject is used to quantify the operator's manipulation of the probe.
  • The estimation of the pressure index takes advantage of the fact that, with respect to the direction in which the probe 2 is moved, the magnitude of the displacement vector of a feature point differs between the shallow region and the deep region within the soft tissue 93.
  • A displacement vector representing the movement of a feature point, that is, the displacement of the position of the feature point, is calculated by matching the positions of corresponding feature points between two or more ultrasound images whose imaging regions at least partially overlap.
  • FIG. 4 is a schematic diagram showing the configuration of the ultrasonic imaging system 1 according to one embodiment.
  • An ultrasound imaging system 1 includes a probe 2 and an ultrasound imaging device 3 .
  • the subject uses the ultrasonic imaging system 1 to visualize and confirm the state of his/her own muscles. That is, in this embodiment, the operator of the ultrasonic imaging system 1 is the subject himself/herself.
  • the probe 2 is a device that transmits ultrasonic waves from the surface of the subject 9 toward the inside of the subject 9 and receives the ultrasonic waves reflected inside the subject 9 .
  • the probe 2 is configured so that it can be held and moved by an operator.
  • the lower end of the probe 2 is provided with an ultrasonic transmission/reception surface on which a plurality of ultrasonic transducers are arranged in a row.
  • the subject 9 is the thigh of a human body, but the body part included in the subject 9 is not particularly limited.
  • The probe 2 operates in two drive schemes: a linear scan mode for acquiring a fragment image (first fragment image 41) by linear scanning, and a sector scan mode for acquiring a fragment image (second fragment image 42) by sector scanning, which has a wider imaging range than linear scanning.
  • A fragment image is an ultrasound image obtained by a single imaging operation in the linear scan mode or the sector scan mode, and is equivalent to an image obtained by an ultrasonic diagnostic apparatus (ultrasonic imaging apparatus) of a general configuration.
  • When acquiring the panoramic composite image 47 of the cross section of the subject 9, the operator brings the ultrasonic transmission/reception surface of the probe 2 into contact with the subject 9 and moves the probe 2 along the surface of the subject 9 (scans around the thigh with the probe 2). During this time, the probe 2 intermittently transmits ultrasonic waves from the ultrasonic transmission/reception surface toward the inside of the subject 9 while switching the scan mode between the linear scan mode and the sector scan mode at a predetermined cycle, and receives the ultrasonic waves reflected inside the subject 9 at the ultrasonic transmission/reception surface. As a result, the probe 2 outputs electrical signals (echo signals) representing the received ultrasonic waves in each of the linear scan mode and the sector scan mode.
  • An angle sensor is attached to the probe 2, and information on the tilt angle of the probe 2 (for example, the tilt angle of the probe 2 from the vertical direction) is transmitted to the ultrasonic imaging apparatus 3 together with the echo signals.
  • The ultrasonic imaging apparatus 3 is connected to the probe 2 wirelessly, for example via WiFi (registered trademark).
  • the ultrasonic imaging apparatus 3 is configured by, for example, a tablet terminal, and estimates the index of pressure applied to the subject 9 by the probe 2 based on echo signals received from the probe 2 .
  • The ultrasonic imaging apparatus 3 also generates a plurality of fragment images (a plurality of first fragment images 41 and a plurality of second fragment images 42) for each of the linear scan mode and the sector scan mode, based on the echo signals.
  • While the probe 2 is moved along the surface of the subject 9, the ultrasonic imaging apparatus 3 continuously generates pairs of a first fragment image 41 and a second fragment image 42 in time series, thereby generating a plurality of pairs of ultrasound images that are consecutive in time series.
  • The ultrasonic imaging apparatus 3 compares two or more time-series fragment images whose imaging regions at least partially overlap, and thereby calculates the displacement of the positions of corresponding feature points between these consecutive fragment images.
  • the ultrasonic imaging apparatus 3 estimates a pressing index of the probe 2 to the subject 9 based on the calculated displacement of the position.
  • Specifically, the ultrasonic imaging apparatus 3 calculates the displacement of the positions of corresponding feature points between the plurality of first fragment images 41, 41, and the displacement of the positions of corresponding feature points between the plurality of second fragment images 42, 42.
  • the ultrasonic imaging apparatus 3 estimates a pressing index of the probe 2 to the subject 9 based on the calculated displacement of the position of the feature point.
  • The ultrasonic imaging apparatus 3 further has a function of displaying a panoramic composite image 47 of the cross section obtained by synthesizing these fragment images.
  • the ultrasonic imaging device 3 is not particularly limited as long as it can display an image, and can be configured with a general-purpose personal computer, smartphone, or the like. Also, the method of connecting the probe 2 and the ultrasonic imaging apparatus 3 is not particularly limited, and a wired connection may be used.
  • FIG. 5 is a block diagram showing the configuration of the ultrasonic imaging apparatus 3 according to one embodiment.
  • The ultrasound imaging apparatus 3 has a hardware configuration including a display 31, an input device 32, an auxiliary storage device 33, a communication interface unit (I/F unit) 34, an output interface unit (I/F unit) 36, and a speaker 37.
  • the display 31 can be composed of, for example, a liquid crystal display, a plasma display, an organic EL display, or the like. Note that the display 31 may be configured as a device separate from the ultrasonic imaging device 3 .
  • the input device 32 is a touch panel provided on the surface of the display 31. An operator can perform an input operation on the image displayed on the display 31 via the input device 32 .
  • The auxiliary storage device 33 is a non-volatile storage device that stores an operating system (OS), various control programs, and data generated by those programs, and can be composed of, for example, an SSD (Solid State Drive) or the like.
  • An ultrasonic imaging program P is stored in the auxiliary storage device 33 .
  • The ultrasonic imaging program P may be installed in the ultrasonic imaging apparatus 3 via a network such as the Internet. Alternatively, the ultrasonic imaging program P can be installed in the ultrasonic imaging apparatus 3 by causing the ultrasonic imaging apparatus 3 to read a computer-readable, non-transitory tangible recording medium, such as a memory card, on which the ultrasonic imaging program P is recorded.
  • the communication interface unit 34 transmits and receives data to and from an external device, and in this embodiment, demodulates signals received from the probe 2 and modulates control signals to be transmitted to the probe 2.
  • the output interface unit 36 outputs various data generated by arithmetic processing of the ultrasonic imaging apparatus 3 to the display 31 and the speaker 37 .
  • The output interface unit 36 displays images on the display 31 by developing the generated image data in the VRAM.
  • the output interface unit 36 outputs a sound corresponding to the determination result from the speaker 37 based on the determination result data regarding the pressure indicator generated by the pressure determination unit 355 .
  • The ultrasonic imaging apparatus 3 further includes, as other hardware components, a processor such as a CPU that performs data processing, and a memory (main storage device) that the processor uses as a work area for data processing.
  • the ultrasonic imaging apparatus 3 also includes a signal processing unit 35 as a software configuration.
  • the signal processing unit 35 is a functional block realized by executing the ultrasonic imaging program P by the processor.
  • The signal processing unit 35 has a function of processing the echo signals received from the probe 2, estimating the pressure index of the probe 2 against the subject 9, and determining whether the estimated pressure index is good. The signal processing unit 35 also processes the echo signals received from the probe 2 to generate a composite image 47 of the cross section of the subject 9 so that the operator, the subject, a doctor, imaging staff, or the like can understand the state of the subject 9.
  • The signal processing unit 35 includes an ultrasonic wave receiving unit 351, a first fragment image generation unit 352, a second fragment image generation unit 353, a pressure estimation unit 354, a pressure determination unit 355, and a cross-sectional image synthesis unit 356.
  • the signal processing unit 35 may be implemented in hardware by a logic circuit formed on an integrated circuit.
  • the ultrasonic wave receiving unit 351 generates a transmission signal by giving a delay to a signal having a frequency in the ultrasonic range, and outputs it to a control device (not shown) built in the probe 2 .
  • the controller drives the probe 2 based on the received transmission signal.
  • the ultrasonic wave receiving unit 351 can control the driving method and beam shape of the probe 2 by controlling the delay.
  • a received signal is input from the probe 2 to the ultrasonic wave receiving section 351 .
  • The ultrasonic wave receiving unit 351 performs processing such as analog-to-digital conversion on the input received signal, and outputs the processed received signal to the first fragment image generation unit 352 when the probe is driven in the linear scan mode, and to the second fragment image generation unit 353 when it is driven in the sector scan mode.
  • While the probe 2 is moved along the surface of the subject 9, the ultrasonic wave receiving unit 351 repeatedly outputs the transmission signal at regular time intervals for each of the linear scan mode and the sector scan mode, and each time the transmission signal is output, acquires the received signal of the ultrasonic waves received by the probe 2.
  • the function of the ultrasonic wave receiving unit 351 may be provided in the control device that controls the probe 2 .
  • In that case, the control device may be connected to the ultrasonic imaging apparatus 3, or ultrasound images may be stored in the control device and transferred to the ultrasonic imaging apparatus 3 via a recording medium.
  • Each of the first fragment image generation unit 352 and the second fragment image generation unit 353 performs image conversion processing corresponding to the drive scheme of the probe 2 on the received signal output by the ultrasonic wave receiving unit 351, thereby generating a fragment image that partially captures the imaging target.
  • The first fragment image generation unit 352 generates the first fragment images 41 in the linear scan mode, and the second fragment image generation unit 353 generates the second fragment images 42 in the sector scan mode.
  • While the probe 2 is moved along the surface of the subject 9, the first fragment image generation unit 352 and the second fragment image generation unit 353 each generate, based on the received signals repeatedly input from the ultrasonic wave receiving unit 351, a plurality of fragment images that capture the cross section of the subject 9 from various directions, together with the angle information (information on the tilt angle) of the probe 2 with respect to the surface of the subject 9 at the time each fragment image was acquired.
  • In this way, a pair of fragment images consisting of a first fragment image 41 in the linear scan mode and a second fragment image 42 in the sector scan mode is generated, and while the probe 2 is continuously moved along the surface of the subject 9, a plurality of such pairs of fragment images are generated, together with the information on the tilt angle of the probe 2, for each tilt angle of the probe 2.
  • the number of fragment image pairs generated varies depending on the transmission/reception time and transmission/reception cycle of the ultrasonic waves by the probe 2 .
  • one fragmentary image pair of the first fragmentary image 41 and the second fragmentary image 42 is generated every approximately 125 msec.
  • The pressure estimation unit 354 compares a plurality of ultrasound images acquired at a plurality of probe positions on the surface of the subject 9, thereby calculating the displacement of the positions of corresponding feature points between the plurality of ultrasound images, and estimates the index of the pressure of the probe 2 against the subject 9 based on the calculated displacement.
  • In this embodiment, the first fragment image 41 acquired in the linear scan mode is used as the ultrasound image of the shallow region within the subject 9, and the second fragment image 42 acquired in the sector scan mode is used as the ultrasound image of the deep region within the subject 9.
  • the pressure estimating unit 354 calculates displacement vectors representing displacements of the positions of a plurality of feature points for each of shallow and deep regions in the ultrasonic image of the subject 9 .
  • Shallow region and deep region refer to regions within the object 9 along the transmission direction of the ultrasound beam. In the ultrasound image, the shallow and deep regions may be partially overlapped.
  • the pressure estimation unit 354 estimates the pressure index based on the difference between the displacement vector acquired for the shallow region and the displacement vector acquired for the deep region. At least a part of the imaging regions of the plurality of ultrasound images to be compared overlap each other, and corresponding feature points between the plurality of ultrasound images are detected by, for example, feature point matching.
  • the pressure determination unit 355 determines whether the estimated pressure index is good or not, and notifies whether the pressure index is good or not based on the determination result in different modes. In this embodiment, the pressure determination unit 355 determines whether the pressure index is good based on a predetermined threshold. The determination result is notified to the operator through the speaker 37, for example, as sound intensity. Alternatively, the determination result is displayed on the display 31, for example, as character information.
  • The pressure determination unit 355 repeatedly performs the determination of the pressure index while the probe 2 is moved along the surface of the subject 9, and can notify in different manners, in real time, whether or not the pressure index is good. As a determination over the entire imaging, the pressure determination unit 355 can also notify in different manners whether or not the pressure index is good after the probe 2 has been moved along the surface of the subject 9, based on a plurality of pressure indices obtained during the movement (that is, during the measurement). When performing the determination over the entire imaging, the pressure determination unit 355 can determine whether or not the pressure index is good based on the ratio between the sum of the plurality of pressure indices obtained during the movement and the sum of the magnitudes of the plurality of displacement vectors obtained during the movement. Note that the ratio of two values means a division in which one of the two values is the numerator and the other is the denominator.
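As a rough illustration of the two determination modes just described (real-time and whole-imaging), the following sketch assumes hypothetical function names, threshold values, and sign conventions; none of these are specified in this document.

```python
# A rough sketch; function names, thresholds and sign conventions are assumptions.

def is_pressure_ok_realtime(cx: float, first_threshold: float = 2.0) -> bool:
    """Real-time check of a single pressure index c_x against a preset threshold."""
    return abs(cx) <= first_threshold


def overall_pressure_index(cx_values, b_magnitudes) -> float:
    """Whole-imaging index: sum of the pressure indices obtained during the probe
    movement divided by the sum of the magnitudes of the displacement vectors."""
    return sum(cx_values) / sum(b_magnitudes)


def is_pressure_ok_overall(cx_values, b_magnitudes, second_threshold: float = 0.1) -> bool:
    """Comprehensive check over the entire imaging against a second threshold."""
    return abs(overall_pressure_index(cx_values, b_magnitudes)) <= second_threshold
```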
  • The cross-sectional image synthesis unit 356 unevenly synthesizes the plurality of first fragment images 41 generated by the first fragment image generation unit 352 and the plurality of second fragment images 42 generated by the second fragment image generation unit 353.
  • Here, "cross section" or "transverse section" is a concept that includes not only a section of the full circumference but also a partial section.
  • Specifically, the cross-sectional image synthesis unit 356 performs, for each tilt angle of the probe 2, a process of unevenly synthesizing the first fragment image 41 acquired in the linear scan mode and the second fragment image 42 acquired in the sector scan mode, thereby generating a plurality of intermediate composite images. For example, by replacing the region of the second fragment image 42 acquired in the sector scan mode that corresponds to the first fragment image 41 with the first fragment image 41 acquired in the linear scan mode, the first fragment image 41 is partially superimposed on and combined with the second fragment image 42 to generate an intermediate composite image. The information on the tilt angle of the probe 2 is associated with each intermediate composite image.
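The region-replacement step described above could look roughly like the following numpy sketch; the coordinates of the region corresponding to the first fragment image are an assumption and would in practice follow from the probe geometry.

```python
import numpy as np

def make_intermediate_image(linear_img: np.ndarray,
                            sector_img: np.ndarray,
                            top_left: tuple) -> np.ndarray:
    """Superimpose the linear-scan fragment (first fragment image) onto the
    sector-scan fragment (second fragment image) by replacing the corresponding
    region, as described above. top_left is the (row, col) of that region in the
    sector image; in practice it would follow from the probe geometry."""
    out = sector_img.copy()
    r, c = top_left
    h, w = linear_img.shape[:2]
    out[r:r + h, c:c + w] = linear_img  # replace the overlapping region
    return out
```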
  • the composite image 47 of the cross section of the subject 9 generated by the cross-sectional image synthesizing unit 356 is input to the output interface unit 36 .
  • the output interface unit 36 displays the synthesized image 47 on the display 31 by developing the data of the synthesized image 47 in the VRAM.
  • FIG. 6 is a flow chart showing a processing procedure of an ultrasonic imaging method according to one embodiment.
  • In step S1, ultrasonic waves are transmitted from the probe 2 placed on the surface of the subject 9 into the subject 9, and signals related to the ultrasonic waves reflected inside the subject 9 are received through the probe 2.
  • the ultrasonic wave receiving unit 351 drives the probe 2 in linear scan mode, and the probe 2 transmits ultrasonic waves from the surface of the subject 9 toward the inside of the subject 9 in linear scan mode.
  • the probe 2 receives the ultrasonic waves reflected inside the subject 9, and the probe 2 outputs an echo signal corresponding to the linear scan mode.
  • Next, the ultrasonic wave receiving unit 351 drives the probe 2 in the sector scan mode, and the probe 2 outputs an echo signal corresponding to the sector scan mode.
  • In step S2, an ultrasound image is generated for each of the shallow region and the deep region within the subject 9 based on the received ultrasonic signal.
  • As described above, the first fragment image 41 is used as the ultrasound image of the shallow region within the subject 9, and the second fragment image 42 is used as the ultrasound image of the deep region within the subject 9.
  • the ultrasound reception unit 351 performs processing such as analog-to-digital conversion on the input reception signal, and outputs the processed reception signal to the first fragment image generation unit 352 .
  • the first fragment image generator 352 generates the first fragment image 41 in linear scan mode.
  • the first fragment image generator 352 generates the first fragment image 41 each time the probe 2 outputs an echo signal.
  • the second fragment image generator 353 generates the second fragment image 42 in sector scan mode each time the probe 2 outputs an echo signal.
  • FIG. 7 is a schematic diagram for explaining a plurality of pairs of ultrasonic images that are consecutive in time series.
  • steps S1 and S2 are repeatedly executed while the probe 2 is moved along the surface of the subject 9 from time T1 to time TN .
  • As a result, the ultrasonic imaging apparatus 3 continuously generates pairs of a first fragment image 41 and a second fragment image 42 in time series, and generates a plurality of pairs P1, P2, ..., PN of ultrasound images that are consecutive in time series.
  • That is, a pair P1 of a first fragment image 41 and a second fragment image 42 is generated at time T1, a pair P2 of a first fragment image 41 and a second fragment image 42 is generated at time T2, and a pair PN of a first fragment image 41 and a second fragment image 42 is generated at time TN.
  • Among the plurality of pairs P1, P2, ..., PN, the imaging regions of ultrasound images that are consecutive in time series at least partially overlap each other.
  • For example, between the pairs of ultrasound images P1 and P2 acquired at times T1 and T2, at least a part of the imaging region overlaps between the first fragment images 41, 41, and at least a part of the imaging region overlaps between the second fragment images 42, 42.
  • In step S3, the index of the pressure applied to the subject 9 by the probe 2 is estimated.
  • The pressure estimation unit 354 matches the positions of corresponding feature points between two or more ultrasound images by comparing two or more ultrasound images whose imaging regions at least partially overlap.
  • the pressure estimation unit 354 calculates the positional displacement of the feature points between the plurality of ultrasound images by matching the positions of the feature points, and estimates the pressure index based on the positional displacement of the feature points.
  • the matching of the positions of the feature points is performed for each of shallow regions and deep regions within the subject 9 .
  • a displacement vector representing the displacement of the position of the feature point is calculated for each of the shallow region and the deep region.
  • the pressure index is estimated based on the difference between the displacement vector in the shallow region and the displacement vector in the deep region.
  • FIG. 8 is a schematic diagram for explaining feature point matching processing performed when estimating a pressure index.
  • In FIG. 8, (a) is a diagram for explaining the displacement vector B(bx, by) calculated for the deep region, (b) is a diagram for explaining the displacement vector A(ax, ay) calculated for the shallow region, and (c) is a diagram for explaining the difference C(cx, cy) between the two displacement vectors, which is used for estimating the pressure index.
  • FIG. 9 is a flowchart showing the detailed processing procedure of step S3 shown in FIG. 6. Step S3 has steps S31 to S33.
  • In step S31, a displacement vector B(bx, by) is calculated for the deep region.
  • Specifically, a plurality of displacement vectors B1(b1x, b1y), B2(b2x, b2y), B3(b3x, b3y) between corresponding feature points are obtained by feature point matching between the two second fragment images 42, 42 whose imaging regions overlap.
  • Although the number nB of feature points is set to 3 here for simplicity of explanation, the number nB of feature points is not limited to 3.
  • In FIG. 8, reference numeral 52a denotes a feature point in the fragment image 42 of the ultrasound image pair P1 at time T1, and reference numeral 52b denotes the corresponding feature point in the fragment image 42 of the ultrasound image pair P2 at time T2.
  • the displacement vector B1 is expressed as a vector indicating movement from the feature point 52a to the feature point 52b.
  • a known algorithm such as an ORB (Oriented FAST and Rotated Brief) algorithm can be used for extracting feature point pairs between the two ultrasound images 42 , 42 .
  • The displacement vector B(bx, by) is calculated, for example, by averaging the elements of the three displacement vectors B1, B2, and B3. That is, bx is calculated as (b1x + b2x + b3x)/3, and by is calculated as (b1y + b2y + b3y)/3.
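A minimal OpenCV sketch of how the displacement vector for step S31 could be computed; the ORB algorithm is named in this document, but the brute-force Hamming matcher, the simple averaging, and the function name are illustrative choices.

```python
import cv2
import numpy as np

def mean_displacement(img_prev: np.ndarray, img_curr: np.ndarray) -> np.ndarray:
    """Mean displacement vector of matched feature points between two overlapping
    ultrasound fragment images (e.g. the second fragment images 42, 42), using
    ORB keypoints and brute-force Hamming matching."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    if des1 is None or des2 is None:
        return np.zeros(2)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    if not matches:
        return np.zeros(2)
    # Displacement of each matched feature point: current position minus previous position.
    disp = [np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt) for m in matches]
    return np.mean(disp, axis=0)  # element-wise average, like (B1 + B2 + B3) / 3 above
```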
  • In step S32, a displacement vector A(ax, ay) is calculated for the shallow region.
  • The displacement vector A(ax, ay) is also calculated in the same manner as the displacement vector B(bx, by).
  • Although the number nA of feature points is set to 3 here for simplicity of explanation, the number nA of feature points is not limited to 3.
  • In FIG. 8, reference numeral 51a denotes a feature point in the fragment image 41 of the ultrasound image pair P1 at time T1, and reference numeral 51b denotes the corresponding feature point in the fragment image 41 of the ultrasound image pair P2 at time T2.
  • the displacement vector A1 is expressed as a vector indicating movement from the feature point 51a to the feature point 51b.
  • The displacement vector A(ax, ay) is calculated, for example, by averaging the elements of the three displacement vectors A1, A2, and A3.
  • For the feature point matching in the shallow region, template matching is performed using the displacement vector B(bx, by) calculated between the two ultrasound images 42, 42 in step S31 as an initial value. In this way, a plurality of displacement vectors A1(a1x, a1y), A2(a2x, a2y), A3(a3x, a3y) between corresponding feature points of the fragment images 41, 41 of the pair can be obtained. The degree of similarity in the template matching can be evaluated using, for example, NCC (Normalized Cross-Correlation), ZNCC (Zero-mean Normalized Cross-Correlation), or SSD (Sum of Squared Differences).
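The template matching just described could be sketched as follows; the patch size, the search margin, the use of cv2.TM_CCOEFF_NORMED (a ZNCC-like score), and the omission of image-boundary checks are illustrative assumptions.

```python
import cv2
import numpy as np

def template_match_displacement(img_prev: np.ndarray, img_curr: np.ndarray,
                                point: tuple, init: np.ndarray,
                                patch: int = 16, margin: int = 8) -> np.ndarray:
    """Displacement of one shallow-region feature point by template matching.
    A patch around `point` in the previous image is searched for in the current
    image, inside a window centred on the position predicted by the initial value
    `init` (e.g. the deep-region displacement vector B). Boundary checks omitted."""
    x, y = point
    tmpl = img_prev[y - patch:y + patch, x - patch:x + patch]
    px, py = int(round(x + init[0])), int(round(y + init[1]))   # predicted position
    win = img_curr[py - patch - margin:py + patch + margin,
                   px - patch - margin:px + patch + margin]
    res = cv2.matchTemplate(win, tmpl, cv2.TM_CCOEFF_NORMED)    # ZNCC-like score
    _, _, _, max_loc = cv2.minMaxLoc(res)                       # best match in the window
    best_x = px - margin + max_loc[0]                           # centre of the best match
    best_y = py - margin + max_loc[1]
    return np.array([best_x - x, best_y - y], dtype=float)
```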
  • In step S33, the pressure index is estimated based on the displacement vector difference C(cx, cy).
  • The displacement vector difference C(cx, cy) is calculated as B(bx, by) - A(ax, ay); that is, cx is calculated as (bx - ax) and cy as (by - ay).
  • In this embodiment, cx, the X-direction component of the difference C, is used as the pressure index.
  • This is because the displacement vectors of the feature points calculated by feature point matching mainly contain the displacement deviation in the X direction.
  • Here, the X-direction component means a component in the direction perpendicular to the transmission direction of the ultrasonic beam transmitted from the probe 2, that is, a component in the azimuth direction.
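Putting steps S31 to S33 together, a minimal sketch of the pressure index calculation (illustrative only):

```python
import numpy as np

def pressure_index(vec_deep: np.ndarray, vec_shallow: np.ndarray) -> float:
    """Pressure index c_x: X (azimuth) component of C = B - A, where B is the
    deep-region displacement vector and A is the shallow-region one."""
    c = vec_deep - vec_shallow
    return float(c[0])
```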
  • In step S4, the determination result regarding the pressure index is notified by sound.
  • the magnitude of the estimated pressure indicator (c x ) is determined based on a predetermined threshold value (first threshold value) of the pressure indicator.
  • the pressure determination unit 355 determines whether or not the pressure index (c x ) estimated in step S3 is good, based on the first threshold value of the predetermined pressure index.
  • the first threshold is set in advance and stored in memory.
  • The pressure determination unit 355 transmits the determination result data to the output interface unit 36, and a sound corresponding to the determination result is output from the speaker 37. For example, when the pressure index is good, no sound is output from the speaker 37; when the pressure index is not good, a sound is output from the speaker 37.
  • At the time of step S4, the scanning by the probe 2 from time T1 to time TN has not yet been completed, so the determination result notified by sound by the pressure determination unit 355 in step S4 can be said to be a real-time determination result.
  • the determination result is notified by sound, but the sound may be replaced by voice, or the determination result may be notified by other forms such as light or vibration instead of sound.
  • the ultrasonic imaging apparatus 3 can be appropriately provided with a light-emitting device such as an LED and a vibration generator such as a vibrator in addition to the speaker 37 .
  • Steps S1 to S4 are repeated until scanning by the probe 2 ends (Yes in step S5).
  • In this way, an index (pressure index) indicating the magnitude of the contact pressure of the probe against the subject is quantified.
  • the ultrasonic imaging apparatus 3 determines whether or not the operator has performed a good probing technique, and notifies the operator of the determination result.
  • When the scanning by the probe 2 from time T1 to time TN ends (Yes in step S5), a plurality of pairs P1, P2, ..., PN of ultrasound images have been generated. In steps S6 to S8, processing using these pairs P1, P2, ..., PN of ultrasound images is performed.
  • In step S6, the cross-sectional image synthesis unit 356 unevenly synthesizes the plurality of first fragment images 41 and the plurality of second fragment images 42 to generate a composite image 47 of the cross section of the subject 9.
  • In step S7, the panoramic composite image 47 of the cross section of the subject 9 is displayed on the display 31.
  • FIG. 10 is a flowchart showing the detailed processing procedure of step S6 shown in FIG. Step S6 has steps S61 to S63.
  • In step S61, the cross-sectional image synthesis unit 356 generates an intermediate composite image for each tilt angle of the probe 2.
  • the cross-sectional image synthesizing unit 356 partially superimposes the first fragmentary image 41 on the second fragmentary image 42 and synthesizes them to generate an intermediate synthetic image.
  • Step S61 is repeated until the generation of intermediate composite images is completed for all probe tilt angles (Yes in step S62). Thereby, a plurality of intermediate composite images are generated for each tilt angle of the probe 2 .
  • In step S63, the cross-sectional image synthesis unit 356 rotates the plurality of intermediate composite images based on the tilt angle of the probe 2 and synthesizes them.
  • As a result, the panoramic composite image 47 of the cross section of the subject 9 is generated.
  • An example of a panorama composite image 47 of a cross-section of the thigh produced by an ultrasound imaging method according to one embodiment is illustrated in FIGS. 11 and 12 .
  • FIG. 11 is an example of a panorama synthesized image when the pressure index is good
  • FIG. 12 is an example of a panorama synthesized image when the pressure index is not good.
  • In step S8, the determination result regarding the pressure index is displayed as text.
  • the magnitude of the pressure indicator is determined based on a predetermined threshold value (second threshold value) of the pressure indicator.
  • the pressure determination unit 355 calculates the average magnitude of the pressure index obtained at each time during the scanning by the probe 2 from time T1 to time TN .
  • the average magnitude of the pressure indices at each time is calculated by dividing the sum total of a plurality of pressure indices obtained during movement of the probe 2 by the sum total of the magnitudes of the displacement vectors.
  • The sum of the plurality of pressure indices obtained during the movement is calculated by adding the X components of the displacement vector differences C(cx, cy) throughout the measurement, as shown in Equation (1).
  • The sum of the magnitudes of the displacement vectors is calculated by summing, over the entire measurement, the magnitude of the displacement vector B calculated in the alignment of the second fragment images 42 in the sector scan mode, which serves as the basis of the feature point matching.
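Equation (1) itself is not reproduced in this text; based on the description above, it and the averaged pressure index presumably take roughly the following form (a reconstruction, not a verbatim quote of the original equation):

```latex
% Reconstruction based on the surrounding description; not a verbatim quote of Equation (1).
\begin{align}
  S_c &= \sum_{k} c_{x,k}
    &&\text{(Equation (1): total of the X components of the differences } C_k\text{)}\\
  \bar{c} &= \frac{S_c}{\sum_{k} \lVert B_k \rVert}
    &&\text{(averaged pressure index compared with the second threshold)}
\end{align}
```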
  • the pressure determination unit 355 determines whether or not the calculated average value of the magnitudes of the pressure indicators is good based on the second threshold value of the predetermined pressure indicators.
  • the pressure determination unit 355 transmits data of the determination result to the output interface unit 36, and characters corresponding to the determination result are displayed on the display 31.
  • For example, if the pressure index is good, the characters "Good" are displayed as indicated by reference numeral 51 in FIG. 11, and if the pressure index is not good, the characters "Error" are displayed as indicated by reference numeral 51 in FIG. 12.
  • In the examples shown in FIGS. 11 and 12, the score of the pressure index, indicated by reference numeral 52, is displayed in addition to the text of the determination result of the pressure index, indicated by reference numeral 51. The score of the pressure index can be obtained by scoring the estimated magnitude of the pressure index based on, for example, a score table prepared in advance.
  • At the time of step S8, the scanning by the probe 2 performed from time T1 to time TN has been completed, so the determination result displayed as text by the pressure determination unit 355 in step S8 can be said to be a comprehensive determination result over the entire imaging.
  • As described above, in the present embodiment, the displacement of the positions of corresponding feature points between a plurality of ultrasound images is calculated, the index of the pressure of the probe 2 against the subject 9 is estimated based on the calculated displacement, it is determined whether or not the estimated pressure index is good, and whether or not the pressure index is good is notified in different manners based on the determination result.
  • Determination of the pressure index can be performed in real time or after the end of imaging. Notification as to whether or not the pressure index is appropriate may be made by sound, voice, light, vibration, or the like, or by text or display.
  • Since the pressure index is estimated by matching feature points between a plurality of images, that is, by image processing, the pressure can be quantified while suppressing an increase in hardware such as dedicated pressure sensors.
  • In the above embodiment, the pressure estimation unit 354 compares a plurality of ultrasound images acquired at a plurality of probe positions on the surface of the subject 9, calculates the displacement of the positions of corresponding feature points between the plurality of ultrasound images, and estimates the pressure index of the probe 2 against the subject 9 based on the calculated displacement; however, the use of the calculated displacement is not limited to this.
  • the pressure estimation unit 354 can further have the function of a speed estimation unit, and the pressure determination unit 355 can further have the function of a speed determination unit.
  • The speed estimation unit operates in the same manner as the pressure estimation unit 354, and the operations of the speed estimation unit and the pressure estimation unit 354 can be shared up to the process of calculating the displacement of the positions of corresponding feature points. That is, the speed estimation unit compares a plurality of ultrasound images acquired at a plurality of probe positions on the surface of the subject 9, thereby calculating the displacement of the positions of corresponding feature points between the plurality of ultrasound images, and estimates an index relating to the moving speed of the probe 2 (hereinafter also referred to as a speed index) based on the ratio between the calculated displacement and the measurement time.
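As a rough illustration of how the shared displacement computation could feed a speed index (the unit conversion and parameter names are assumptions):

```python
import numpy as np

def speed_index(displacement_px: np.ndarray, frame_interval_s: float,
                px_per_mm: float) -> float:
    """Index of the probe moving speed: displacement of matched feature points
    divided by the measurement (frame) interval. The mm/s unit is an assumption."""
    return float(np.linalg.norm(displacement_px) / px_per_mm / frame_interval_s)
```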
  • the speed determination section operates in the same manner as the pressure determination section 355 . That is, the speed determination unit determines whether or not the estimated speed index is good based on a predetermined speed index threshold value (third threshold value), and determines whether or not the speed index is good based on the determination result. Notify in different ways. Similar to the pressure determination unit 355, the determination may be performed in real time while scanning is performed by the probe 2 from time T1 to time TN , or may be performed during the scanning by the probe 2 from time T1 to time TN . may be performed as a comprehensive determination through the entire imaging by the probe 2 after the is completed.
  • If the speed index is good, the speed determination unit displays, for example, the characters "Good" as indicated by reference numeral 53 in FIGS. 11 and 12; if it is not good, for example, the characters "Error" are displayed. In the examples shown in FIGS. 11 and 12, the score of the speed index is also displayed, as indicated by reference numeral 54, in the same manner as for the pressure index.
  • In the above embodiment, the displacement vector B(bx, by) is calculated by, for example, averaging the elements of the three displacement vectors B1, B2, B3; however, the method of calculating the displacement vector B(bx, by) is not limited to this. For example, instead of such a simple arithmetic average, the displacement vector B(bx, by) may be calculated by applying a weight according to the depth within the second fragment image 42 to each of the displacement vectors B1, B2, B3.
  • For example, the weighting can be such that the contribution of the displacement vector B1, which lies at a shallow position in the second fragment image 42, is increased, and the contribution of the deep-lying displacement vector B3 is reduced. The same applies to the calculation of the displacement vector A(ax, ay) for the first fragment image 41.
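A small sketch of such depth-weighted averaging; the inverse-depth weighting is an illustrative choice, since the exact weights are not specified here.

```python
import numpy as np

def weighted_displacement(vectors: np.ndarray, depths_mm: np.ndarray) -> np.ndarray:
    """Depth-weighted mean of per-feature-point displacement vectors.
    vectors: (n, 2) array of (dx, dy); depths_mm: (n,) depths of the feature points.
    Shallower points get larger weights (inverse-depth weighting, an assumption)."""
    weights = 1.0 / np.maximum(depths_mm, 1e-6)
    return np.average(vectors, axis=0, weights=weights)
```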
  • In the above embodiment, the first fragment image 41 acquired in the linear scan mode is used as the ultrasound image of the shallow region within the subject 9, and the second fragment image 42 acquired in the sector scan mode is used as the ultrasound image of the deep region within the subject 9; however, the combination of ultrasound images used when the pressure estimation unit 354 estimates the pressure index is not limited to this.
  • For example, the second fragment image 42 acquired in the sector scan mode may be used for the shallow region within the subject 9, and the first fragment image 41 acquired in the linear scan mode may be used for the deep region within the subject 9.
  • Alternatively, for both the shallow region and the deep region within the subject 9, the first fragment image 41 acquired in the linear scan mode may be used, or the second fragment image 42 acquired in the sector scan mode may be used.
  • Although the ORB (Oriented FAST and Rotated Brief) algorithm is used to extract feature point pairs in the above embodiment, the algorithm used to extract feature point pairs is not limited to this.
  • Various other algorithms for extracting feature points can also be used, such as SIFT (Scale-invariant feature transform), SURF (Speeded Up Robust Features), AKAZE (Accelerated-KAZE), and the like.
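For instance, with OpenCV the detector used in the earlier sketch could be swapped out roughly as follows (SIFT returns float descriptors and needs an L2 matcher instead of Hamming; SURF is only available in contrib builds, so it is left out here):

```python
import cv2
import numpy as np

def detect_features(image: np.ndarray, kind: str = "AKAZE"):
    """Alternative keypoint detectors/descriptors. AKAZE and ORB give binary
    descriptors (match with cv2.NORM_HAMMING); SIFT gives float descriptors
    (match with cv2.NORM_L2)."""
    if kind == "AKAZE":
        detector = cv2.AKAZE_create()
    elif kind == "SIFT":
        detector = cv2.SIFT_create()
    else:
        detector = cv2.ORB_create()
    return detector.detectAndCompute(image, None)
```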
  • In the above embodiment, the cross-sectional image synthesis unit 356 performs, for each tilt angle of the probe 2, a process of unevenly synthesizing the first fragment image 41 and the second fragment image 42 to generate a plurality of intermediate composite images, and generates the panoramic composite image 47 by rotating these intermediate composite images based on the tilt angle of the probe 2 and synthesizing them; however, the manner in which the panoramic composite image 47 is generated is not limited to this.
  • For example, the cross-sectional image synthesis unit 356 may generate a first intermediate composite image by rotating a plurality of first fragment images acquired in the linear scan mode based on the tilt angle of the probe 2 and synthesizing them, and may generate a second intermediate composite image by rotating a plurality of second fragment images generated for each tilt angle of the probe 2 based on the tilt angle of the probe 2 and synthesizing them. The first intermediate composite image and the second intermediate composite image are then weighted and synthesized to generate a panoramic composite image 47 in which the first fragment images and the second fragment images are unevenly synthesized.
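A minimal sketch of such weighted synthesis of the two intermediate composite images; the fixed weight favouring the linear-scan image is an assumption.

```python
import cv2
import numpy as np

def blend_intermediates(first_intermediate: np.ndarray,
                        second_intermediate: np.ndarray,
                        w_first: float = 0.7) -> np.ndarray:
    """Weighted synthesis of the two intermediate composite images (same size and
    dtype assumed); emphasising the linear-scan image is an illustrative choice."""
    return cv2.addWeighted(first_intermediate, w_first,
                           second_intermediate, 1.0 - w_first, 0.0)
```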
  • the pressure determination unit 355 determines whether or not the pressure indicator is good based on a predetermined threshold, but the determination of the pressure indicator is not limited to using the threshold.
  • the pressure determination unit 355 may determine whether or not the pressure index is good, for example, using artificial intelligence such as machine learning or deep learning.
  • the threshold values (first threshold value, second threshold value, and third threshold value) used by the pressure determination unit 355 and the speed determination unit for determination may be set in advance and stored in a memory. Alternatively, an operator input via the input device 32 may be used.
  • In the above embodiment, the pressure estimation unit 354 compares two or more ultrasound images whose imaging regions at least partially overlap, calculates the displacement of the positions of feature points between the two or more ultrasound images, and estimates the pressure index based on the calculated displacement; however, the manner in which the pressure estimation unit 354 estimates the pressure index is not limited to this.
  • the pressure estimator 354 may estimate the pressure index by using artificial intelligence such as machine learning or deep learning.
  • In this case, the pressure estimation unit 354 can be configured as a trained learning model. That is, the pressure estimation unit 354 can be a learning model that is trained on ultrasound images and on the pressure indices obtained by comparing the displacements of feature points between those ultrasound images, and that outputs an estimated value of the pressure index when an ultrasound image is input. The signal processing unit 35 can include a learning unit 359 that trains the pressure estimation unit 354 on the ultrasound images and the pressure indices obtained by comparing the displacements of feature points between the ultrasound images.
  • In the above embodiment, in order to obtain fragment images corresponding to a plurality of mutually different positions on the surface of the subject 9, the probe 2 is moved along the surface of the subject 9 and ultrasonic waves are intermittently transmitted from the probe 2; however, the manner in which the fragment images are obtained is not limited to this.
  • a plurality of probes 2 may be arranged on the subject 9 and ultrasonic waves may be transmitted simultaneously from each probe 2 .
  • Although the probe 2 operates in both the linear scan mode and the sector scan mode in the above embodiment, the drive scheme of the probe 2 is not limited to this.
  • the probe 2 may operate in convex scan mode instead of sector scan mode. That is, the probe 2 may operate in both the linear scan mode and the convex scan mode.
  • An ultrasonic image obtained by the linear scan mode is belt-shaped, and an ultrasonic image obtained by the sector scan mode or convex scan mode is fan-shaped or convex.
  • In Example 1, ultrasound images were continuously acquired while pressing the probe against the surface of the subject and moving the probe along the surface of the subject.
  • Ultrasound images were acquired in linear scan mode.
  • The reduction ratio of the cross-sectional area in the ultrasound images was evaluated with reference to a cross-sectional image captured by an MRI apparatus.
  • Fig. 13 shows a graph of the verification results.
  • In FIG. 13, (a) shows results for the difference in the amount of misalignment in the X-axis direction, and (b) shows results for the difference in the amount of misalignment in the Y-axis direction.
  • the misalignment amount means the amount of movement of the feature points when the feature points are matched between temporally adjacent ultrasound images.
  • the difference in the amount of displacement means the difference between the amount of displacement in a shallow region and the amount of displacement in a deep region within the object.
  • The larger the value on the horizontal axis, the greater the deformation due to pressing; the smaller the ratio on the vertical axis, the greater the deformation due to pressing and hence the smaller the cross-sectional area.
  • It was confirmed that the difference in the amount of misalignment in the X-axis direction (the X-axis component of the amount of movement of the feature points) is correlated with the area reduction ratio of the ultrasonic image. This showed that, for the misalignment amount used as the pressure index, it is effective to use the X-axis component and to exclude the Y-axis component.
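  • The calculation of this difference can be sketched as follows (an assumed data layout for illustration only, not the verification code itself): matched feature-point pairs are split into a shallow group and a deep group by depth, and the mean X-axis movement of the two groups is compared.

```python
# Illustrative sketch only (assumed data layout): compute the "difference in
# the amount of misalignment in the X-axis direction" as the difference
# between the mean X-axis movement of feature points in a shallow region and
# that of feature points in a deep region of the subject.
import numpy as np

def x_misalignment_difference(matched_pairs: np.ndarray, depth_split: float) -> float:
    """matched_pairs: array of shape (N, 4) holding (x_a, y_a, x_b, y_b) for
    each feature point matched between temporally adjacent frames, where the
    Y coordinate corresponds to depth. depth_split: depth separating the
    shallow region from the deep region."""
    dx = matched_pairs[:, 2] - matched_pairs[:, 0]   # X-axis movement per point
    depth = matched_pairs[:, 1]                      # depth of each point in frame A
    shallow = dx[depth < depth_split]
    deep = dx[depth >= depth_split]
    if shallow.size == 0 or deep.size == 0:
        return 0.0
    return float(np.mean(shallow) - np.mean(deep))
```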
  • the pressure index was calculated by the following two calculation methods.
  • the first pressure index was calculated by dividing the total amount of misalignment in the X-axis direction by the number of frames (the number of ultrasonic images).
  • the second pressure index was calculated by dividing the total amount of misalignment in the X-axis direction by the total vector.
  • The total vector means the sum of the magnitudes of the displacement vectors (displacement vector B in the description of the above embodiment) calculated in the alignment on which the feature point matching is based while the probe is moved along the surface of the subject.
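  • A minimal sketch of the two calculations (illustrative only; the variable names and the treatment of the frame count are assumptions, not the disclosed implementation) is given below; `dx_per_frame` can be taken as the per-frame X-axis misalignment values such as those produced by the earlier sketch.

```python
# Illustrative sketch only (assumed variable names): the two pressure-index
# calculations described above. dx_per_frame holds the X-axis misalignment
# obtained for each pair of temporally adjacent frames, and
# displacement_magnitudes holds the magnitudes of the displacement vectors
# (displacement vector B) computed during alignment.
from typing import Sequence

def first_pressure_index(dx_per_frame: Sequence[float]) -> float:
    """Total X-axis misalignment divided by the number of frames
    (here taken as the number of adjacent-frame comparisons)."""
    if not dx_per_frame:
        return 0.0
    return sum(abs(dx) for dx in dx_per_frame) / len(dx_per_frame)

def second_pressure_index(dx_per_frame: Sequence[float],
                          displacement_magnitudes: Sequence[float]) -> float:
    """Total X-axis misalignment divided by the total of the displacement
    vector magnitudes (the "total vector")."""
    total_travel = sum(displacement_magnitudes)
    if total_travel == 0:
        return 0.0
    return sum(abs(dx) for dx in dx_per_frame) / total_travel
```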
  • Feature point matching was performed under four conditions, (I) to (IV), in which the number of skipped images was varied in order to simulate changes in the moving speed of the probe; the greater the number of skipped images, the faster the simulated moving speed of the probe.
  • As the number of skipped images was gradually increased from condition (II) to condition (IV), the value of the first pressure index increased monotonically.
  • the second pressure index showed a substantially constant value throughout the conditions (I) to (IV), although there were some variations in the values depending on the operator.
  • These results showed that the first pressure index depends on the moving speed of the probe, whereas the second pressure index is less affected by the moving speed of the probe; it is therefore appropriate to use the second pressure index, whose value varies less.
  • The present invention can be applied to both medical and non-medical applications, and is particularly suitable for use by subjects who are not medical professionals to visualize and routinely check the condition of their own muscles.
  • Reference signs: Ultrasonic imaging system; 2 Probe; 3 Ultrasonic imaging device; 9 Subject; 31 Display; 32 Input device; 33 Auxiliary storage device; 34 Communication interface unit; 35 Signal processing unit; 36 Output interface unit; 37 Speaker; 41 First fragment image; 42 Second fragment image; 47 Panoramic composite image; 351 Ultrasonic wave receiving unit; 352 First fragment image generating unit (image generating unit); 353 Second fragment image generating unit (image generating unit); 354 Pressure estimation unit; 355 Pressure determination unit; 356 Cross-sectional image synthesis unit; 359 Learning unit

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The problem addressed by the present invention is to provide an ultrasonic imaging apparatus that quantifies the operator's handling of a probe when the probe is moved over the surface of a subject. The solution according to the invention is an ultrasonic imaging apparatus comprising: an ultrasonic wave receiving unit 351 that receives, via a probe placed on a surface of the subject, signals relating to ultrasonic waves transmitted from the probe into the subject and reflected inside the subject; image generating units 352 and 353 that generate ultrasonic images based on the signals of the received ultrasonic waves; and a pressure estimation unit 354 that estimates a pressure index, which is an index representing the magnitude of the contact force applied to the subject by the probe, based on the displacements of the positions of corresponding feature points in at least two ultrasonic images that are acquired at a plurality of probe positions on the surface of the subject and partially overlap each other.
PCT/JP2022/015562 2021-11-19 2022-03-29 Dispositif d'imagerie ultrasonore, système d'imagerie ultrasonore, procédé d'imagerie ultrasonore, et programme d'imagerie ultrasonore WO2023089845A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202280076768.XA CN118265489A (zh) 2021-11-19 2022-03-29 超声波拍摄装置、超声波拍摄***、超声波拍摄方法、以及超声波拍摄程序

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-189007 2021-11-19
JP2021189007 2021-11-19

Publications (1)

Publication Number Publication Date
WO2023089845A1 true WO2023089845A1 (fr) 2023-05-25

Family

ID=86396562

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/015562 WO2023089845A1 (fr) 2021-11-19 2022-03-29 Dispositif d'imagerie ultrasonore, système d'imagerie ultrasonore, procédé d'imagerie ultrasonore, et programme d'imagerie ultrasonore

Country Status (3)

Country Link
JP (1) JP2023075904A (fr)
CN (1) CN118265489A (fr)
WO (1) WO2023089845A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010017585A (ja) * 2004-06-09 2010-01-28 Hitachi Medical Corp Method of operating ultrasonic diagnostic apparatus, and ultrasonic diagnostic apparatus
US20170281094A1 (en) * 2016-04-05 2017-10-05 The Board Of Trustees Of The University Of Illinois Information Based Machine Learning Approach to Elasticity Imaging
WO2020039796A1 (fr) * 2018-08-22 2020-02-27 古野電気株式会社 Ultrasonic analysis device, ultrasonic analysis method, and ultrasonic analysis program

Also Published As

Publication number Publication date
JP2023075904A (ja) 2023-05-31
CN118265489A (zh) 2024-06-28

Similar Documents

Publication Publication Date Title
CN104220005B (zh) 超声波摄像装置以及超声波摄像方法
US8538103B2 (en) Medical image processing device, medical image processing method, medical image diagnostic apparatus, operation method of medical image diagnostic apparatus, and medical image display method
JP4997225B2 (ja) 前立腺の機械的イメージングをリアルタイムで行うための二重アレイ式トランスデューサ・プローブ
US20140114194A1 (en) Ultrasound diagnosis apparatus and ultrasound probe controlling method
JP5368615B1 (ja) 超音波診断システム
WO2019114034A1 (fr) Procédé et appareil pour acquérir un paramètre biomécanique en fonction d'un myogramme d'élasticité ultrasonore
KR101656127B1 (ko) 계측 장치 및 그 제어 프로그램
CN112998748A (zh) 用于超声弹性成像的应变自动测量和应变比计算的方法和***
JP2016112285A (ja) 超音波診断装置
WO2023089845A1 (fr) Dispositif d'imagerie ultrasonore, système d'imagerie ultrasonore, procédé d'imagerie ultrasonore, et programme d'imagerie ultrasonore
JP6518116B2 (ja) 超音波診断システム
US11430120B2 (en) Ultrasound image evaluation apparatus, ultrasound image evaluation method, and computer-readable non-transitory recording medium storing ultrasound image evaluation program
TWI681755B (zh) 脊椎側彎量測系統與方法
US20170251998A1 (en) Ultrasonic diagnostic device
WO2017183466A1 (fr) Dispositif de diagnostic à ultrasons
JP7294996B2 (ja) 超音波診断装置及び表示方法
JP7322774B2 (ja) 超音波診断装置、超音波診断装置の制御方法、及び超音波診断装置の制御プログラム
JP6356528B2 (ja) 超音波診断装置
JP6937731B2 (ja) 超音波診断装置および超音波診断装置の制御方法
JP6885908B2 (ja) 超音波診断装置および超音波診断装置の制御方法
WO2022239400A1 (fr) Dispositif d'imagerie ultrasonore, procédé d'imagerie ultrasonore, système d'imagerie ultrasonore et programme d'imagerie ultrasonore
Hampson et al. Towards robust 3D registration of non-invasive tactile elasticity images of breast tissue for cost-effective cancer screening
JP7273708B2 (ja) 超音波画像処理装置
JP7008590B2 (ja) 超音波画像処理装置
KR20140070044A (ko) 바이오 피드백이 가능한 관절 움직임의 평가 방법 및 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22895134

Country of ref document: EP

Kind code of ref document: A1