CN113729777A - Ultrasonic diagnostic apparatus and image processing apparatus - Google Patents


Info

Publication number
CN113729777A
Authority
CN
China
Prior art keywords
image
ultrasonic
measurement
rectangle
diagnostic apparatus
Prior art date
Legal status
Pending
Application number
CN202110576727.4A
Other languages
Chinese (zh)
Inventor
天城星奈
今村智久
渡边正毅
高田优子
Current Assignee
Canon Medical Systems Corp
Original Assignee
Canon Medical Systems Corp
Priority date
Filing date
Publication date
Application filed by Canon Medical Systems Corp
Publication of CN113729777A

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0866: Detecting organic movements or changes involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
    • A61B 8/44: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444: Constructional features related to the probe
    • A61B 8/46: Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/467: Devices characterised by special input means
    • A61B 8/48: Diagnostic techniques
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/488: Diagnostic techniques involving Doppler signals
    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5215: Devices involving processing of medical diagnostic data

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Gynecology & Obstetrics (AREA)
  • Pregnancy & Childbirth (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The present embodiment relates to an ultrasonic diagnostic apparatus and an image processing apparatus that improve the accuracy of measurement using an ultrasonic image. The ultrasonic diagnostic apparatus of the embodiment includes a detection unit, an estimation unit, and an output unit. The detection unit detects, from the ultrasonic image, a first rectangle that covers the measurement site. The estimation unit estimates a measurement point for measuring the measurement site based on position information of the first rectangle. The output unit outputs information on the measurement point.

Description

Ultrasonic diagnostic apparatus and image processing apparatus
Reference to related applications
This application claims the benefit of priority from Japanese Patent Application No. 2020-93551, filed May 28, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments disclosed in the present specification and the drawings relate to an ultrasonic diagnostic apparatus and an image processing apparatus.
Background
Ultrasound images are used to check the development of a fetus. For example, an ultrasonic diagnostic apparatus can use an ultrasonic image to measure fetal parameters such as the biparietal diameter (BPD), head circumference (HC), abdominal circumference (AC), femur length (FL), and humerus length (HL).
Here, when measuring parameters such as BPD, HC, AC, FL, and HL, a known method estimates the measurement points for measuring the measurement site by luminance-based image processing. However, with this method, when the ultrasonic image contains a portion brighter than the outline of the actual measurement site, the estimated measurement points may be pulled toward that brighter portion.
Disclosure of Invention
One of the problems to be solved by the embodiments disclosed in the present specification and drawings is to improve the accuracy of measurement using an ultrasonic image. The problems to be solved by these embodiments are not limited to this, however; problems corresponding to the effects of the configurations described in the embodiments below may also be regarded as other problems to be solved.
The ultrasonic diagnostic apparatus of the present embodiment includes a detection unit, an estimation unit, and an output unit. The detection unit detects, from the ultrasonic image, a first rectangle that covers the measurement site. The estimation unit estimates a measurement point for measuring the measurement site based on position information of the first rectangle. The output unit outputs information on the measurement point.
Effects of the invention
According to the ultrasonic diagnostic apparatus of the embodiment, the accuracy of measurement using an ultrasonic image is improved.
Drawings
Fig. 1 is a block diagram showing an example of the configuration of the ultrasonic diagnostic apparatus according to the present embodiment.
Fig. 2 is a diagram for explaining a problem when a measurement site is measured.
Fig. 3 is a diagram for explaining the processing of the image processing circuit of the ultrasonic diagnostic apparatus according to the present embodiment.
Fig. 4 is a diagram for explaining the processing of the image processing circuit of the ultrasonic diagnostic apparatus according to the present embodiment.
Fig. 5 is a diagram for explaining the processing of the image processing circuit of the ultrasonic diagnostic apparatus according to the present embodiment.
Fig. 6 is a diagram for explaining the processing of the image processing circuit of the ultrasonic diagnostic apparatus according to the present embodiment.
Fig. 7 is a diagram for explaining the processing of the image processing circuit of the ultrasonic diagnostic apparatus according to the present embodiment.
Fig. 8 is a flowchart showing the procedure of processing by the ultrasonic diagnostic apparatus according to the present embodiment.
Fig. 9 is a diagram for explaining the processing of the image processing circuit of the ultrasonic diagnostic apparatus according to the present embodiment.
Fig. 10 is a diagram for explaining the processing of the image processing circuit of the ultrasonic diagnostic apparatus according to the present embodiment.
Fig. 11 is a diagram for explaining the processing of the image processing circuit of the ultrasonic diagnostic apparatus according to the present embodiment.
Fig. 12 is a diagram for explaining the processing of the image processing circuit of the ultrasonic diagnostic apparatus according to the present embodiment.
Fig. 13 is a block diagram showing an example of the configuration of the learned model generation device according to the present embodiment.
Detailed Description
The ultrasonic diagnostic apparatus and the image processing apparatus according to the embodiments will be described below with reference to the attached drawings. The embodiments are not limited to the following embodiments. In principle, the contents described in one embodiment are also applicable to other embodiments.
Fig. 1 is a block diagram showing an example of the configuration of an ultrasonic diagnostic apparatus 1 according to the present embodiment. As shown in fig. 1, the ultrasonic diagnostic apparatus 1 includes an apparatus main body 100, an ultrasonic probe 101, an input device 102, and a display 103. The ultrasonic probe 101, the input device 102, and the display 103 are connected to the apparatus main body 100.
The ultrasonic probe 101 transmits and receives ultrasonic waves (ultrasonic scanning). For example, the ultrasonic probe 101 is brought into contact with the body surface of the subject P (for example, the abdomen of a pregnant woman) and transmits and receives ultrasonic waves to and from a region including at least a part of the fetus in the uterus. The ultrasonic probe 101 has a plurality of piezoelectric transducers. These transducers are piezoelectric elements that convert between an electric signal (pulse voltage) and mechanical vibration (acoustic vibration), and generate ultrasonic waves based on a drive signal (electric signal) supplied from the apparatus main body 100. The generated ultrasonic waves are reflected at surfaces of acoustic impedance mismatch within the subject P and are received by the piezoelectric transducers as reflected wave signals (electric signals) that include components scattered by scatterers in the tissue. The ultrasonic probe 101 transmits the reflected wave signals received by the piezoelectric transducers to the apparatus main body 100.
In the present embodiment, the ultrasonic probe 101 may be of any type, such as a 1D array probe having a plurality of piezoelectric transducers arranged one-dimensionally in a predetermined direction, a 2D array probe in which the transducers are arranged two-dimensionally in a lattice, or a mechanical 4D probe that scans a three-dimensional region by mechanically oscillating a one-dimensional transducer array.
The input device 102 includes a mouse, a keyboard, buttons, panel switches, a touch command screen, a foot switch, a wheel switch, a trackball, a joystick, and the like. It receives various setting requests from the operator of the ultrasonic diagnostic apparatus 1 and transfers them to the apparatus main body 100.
The display 103 displays a GUI (Graphical User Interface) through which the operator of the ultrasonic diagnostic apparatus 1 inputs various setting requests using the input device 102, as well as ultrasonic image data generated in the apparatus main body 100 and the like. The display 103 also displays various messages notifying the operator of the processing status of the apparatus main body 100. The display 103 is an example of a "display unit".
The apparatus main body 100 is an apparatus that generates ultrasonic image data based on a reflected wave signal received by the ultrasonic probe 101. The ultrasound image data generated by the apparatus main body 100 may be two-dimensional ultrasound image data generated based on a two-dimensional reflected wave signal or may be three-dimensional ultrasound image data generated based on a three-dimensional reflected wave signal.
As shown in fig. 1, the apparatus main body 100 includes, for example, a transmission/reception circuit 110, a B-mode processing circuit 120, a Doppler processing circuit 130, an image processing circuit 140, an image memory 150, a storage circuit 160, and a control circuit 170. These components are communicably connected to one another. The apparatus main body 100 is also connected to a hospital network 2.
The transmission/reception circuit 110 controls transmission of ultrasonic waves by the ultrasonic probe 101. For example, based on an instruction from the control circuit 170, the transmission/reception circuit 110 applies the above-described drive signal (drive pulse) to the ultrasonic probe 101 at timings that give each transducer a predetermined transmission delay. The ultrasonic waves are thereby focused into a beam and transmitted from the ultrasonic probe 101.
The transmission/reception circuit 110 also controls reception of the reflected wave signals by the ultrasonic probe 101. As described above, a reflected wave signal is the ultrasonic wave transmitted from the ultrasonic probe 101 after reflection by the internal tissue of the subject P. For example, based on an instruction from the control circuit 170, the transmission/reception circuit 110 sums the reflected wave signals received by the ultrasonic probe 101 after giving each a predetermined delay time, which emphasizes the reflection component from the direction corresponding to the reception directivity. The transmission/reception circuit 110 then converts the summed reflected wave signal into an in-phase signal (I signal) and a quadrature signal (Q signal) in the baseband, and transmits the I and Q signals (hereinafter, IQ signal) as reflected wave data to the B-mode processing circuit 120 and the Doppler processing circuit 130. Alternatively, the transmission/reception circuit 110 may convert the summed reflected wave signal into an RF (Radio Frequency) signal and transmit that instead. Both the IQ signal and the RF signal are signals containing phase information (reflected wave data).
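As an illustrative aside (not part of the disclosed embodiment), the conversion of a summed reflected wave signal into baseband I and Q signals can be sketched as quadrature demodulation: mixing with the carrier frequency and low-pass filtering. The filter order and all signal parameters below are arbitrary assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def rf_to_iq(rf_line, fs, f0):
    """Quadrature-demodulate one RF scan line to baseband IQ data.

    rf_line: sampled RF echo signal; fs: sampling frequency [Hz];
    f0: transmit center frequency [Hz]. Returns a complex array I + jQ.
    """
    t = np.arange(len(rf_line)) / fs
    # Mix down by the carrier: shifts the f0 component of the echo to DC
    mixed = rf_line * np.exp(-2j * np.pi * f0 * t)
    # Zero-phase low-pass filter to remove the 2*f0 mixing product
    b, a = butter(4, f0 / (fs / 2.0))
    i = filtfilt(b, a, mixed.real)
    q = filtfilt(b, a, mixed.imag)
    return 2.0 * (i + 1j * q)
```

For a pure tone at the carrier frequency, the magnitude of the returned IQ signal recovers the echo amplitude.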
The B-mode processing circuit 120 performs various signal processing on the reflected wave data generated by the transmission/reception circuit 110 from the reflected wave signals. It performs logarithmic amplification, envelope detection processing, and the like on the reflected wave data, and generates data (B-mode data) in which the signal intensity at each sampling point (observation point) is represented by brightness. The B-mode processing circuit 120 transmits the generated B-mode data to the image processing circuit 140.
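The envelope detection and logarithmic compression just described can be sketched generically as follows; this is an illustration (envelope via the analytic signal, then log compression to 8-bit gray levels), not the circuit's actual implementation, and the 60 dB dynamic range is an assumed value.

```python
import numpy as np
from scipy.signal import hilbert

def b_mode_line(rf_line, dynamic_range_db=60.0):
    """Envelope detection plus log compression for one scan line."""
    envelope = np.abs(hilbert(rf_line))   # envelope via the analytic signal
    envelope /= envelope.max()            # normalize to the peak echo
    db = 20.0 * np.log10(np.maximum(envelope, 1e-12))
    # Map [-dynamic_range_db, 0] dB onto 8-bit gray levels [0, 255]
    db = np.clip(db, -dynamic_range_db, 0.0)
    return ((db + dynamic_range_db) / dynamic_range_db * 255.0).astype(np.uint8)
```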
The B-mode processing circuit 120 can also perform signal processing for harmonic imaging, which images harmonic components. Known harmonic imaging methods are Contrast Harmonic Imaging (CHI) and Tissue Harmonic Imaging (THI). Known scanning methods for contrast and tissue harmonic imaging include Amplitude Modulation (AM), Phase Modulation (PM), also called the "pulse subtraction" or "pulse inversion" method, and AMPM, which combines AM and PM to obtain the effects of both.
The Doppler processing circuit 130 generates, as Doppler data, data obtained by extracting, at each sampling point in the scanning area, motion information of a moving object based on the Doppler effect from the reflected wave data generated by the transmission/reception circuit 110. Here, the motion information of the moving object is information such as its average velocity, variance value, and power value, and the moving object is, for example, blood flow, tissue such as the heart wall, or a contrast agent. The Doppler processing circuit 130 transmits the generated Doppler data to the image processing circuit 140.
For example, when the moving object is blood flow, the motion information is blood flow information such as the average velocity, variance value, and power of the blood flow. Blood flow information is obtained, for example, by the color Doppler method.
In the color Doppler method, ultrasonic waves are first transmitted and received a plurality of times on the same scanning line. Then, using an MTI (Moving Target Indicator) filter, the data sequence of reflected wave data at the same position (same sampling point) is filtered so that a specific frequency band passes and other bands are attenuated; that is, signal components (clutter) from stationary or slowly moving tissue are suppressed. A blood flow signal relating to the blood flow is thereby extracted from the data sequence. The color Doppler method then estimates blood flow information such as the average velocity, variance value, and power of the blood flow from the extracted blood flow signal, and generates the estimated blood flow information as Doppler data.
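The chain described above, MTI wall filtering followed by blood-flow estimation, can be sketched with the widely used lag-1 autocorrelation (Kasai) estimator. The mean-subtraction wall filter here is the simplest possible MTI filter and stands in for whatever filter the apparatus actually uses; the sound speed default is a common soft-tissue assumption.

```python
import numpy as np

def color_doppler(iq_ensemble, prf, f0, c=1540.0):
    """Estimate mean blood-flow velocity from a slow-time IQ ensemble.

    iq_ensemble: complex array, shape (n_pulses, n_depths), from repeated
        transmits along the same scanning line.
    prf: pulse repetition frequency [Hz]; f0: center frequency [Hz];
    c: assumed sound speed [m/s]. Returns (velocity [m/s], power) per depth.
    """
    # Simplest MTI wall filter: subtract the slow-time mean to suppress
    # echoes (clutter) from stationary tissue
    filtered = iq_ensemble - iq_ensemble.mean(axis=0, keepdims=True)
    # Lag-1 autocorrelation along slow time (Kasai autocorrelator)
    r1 = np.sum(filtered[1:] * np.conj(filtered[:-1]), axis=0)
    # Mean Doppler frequency from the phase of R(1), converted to velocity
    f_d = np.angle(r1) * prf / (2.0 * np.pi)
    velocity = f_d * c / (2.0 * f0)
    power = np.mean(np.abs(filtered) ** 2, axis=0)   # Doppler power
    return velocity, power
```

With a synthetic ensemble containing a constant clutter component plus a rotating phasor, the constant clutter is removed exactly and the known Doppler shift is recovered.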
The image processing circuit 140 performs generation of image data (ultrasonic image data), various image processing on the image data, and the like. For example, it generates two-dimensional B-mode image data, in which the reflected wave intensity is represented by luminance, from the two-dimensional B-mode data generated by the B-mode processing circuit 120. It also generates two-dimensional Doppler image data visualizing blood flow information from the two-dimensional Doppler data generated by the Doppler processing circuit 130. The two-dimensional Doppler image data is velocity image data representing the average velocity of the blood flow, variance image data representing its variance value, power image data representing its power, or a combination of these. As Doppler image data, the image processing circuit 140 generates color Doppler image data in which blood flow information such as average velocity, variance value, and power is displayed in color, or grayscale Doppler image data displaying one item of blood flow information.
Here, the image processing circuit 140 generally converts (scan-converts) the scanning-line signal sequence of the ultrasonic scan into a scanning-line signal sequence of a video format typified by television, and generates ultrasonic image data for display. Specifically, the image processing circuit 140 performs coordinate conversion according to the ultrasonic scanning method of the ultrasonic probe 101 to generate the display image data. Besides scan conversion, the image processing circuit 140 performs various image processing, such as smoothing (regenerating an image of averaged luminance using a plurality of post-scan-conversion image frames) and edge enhancement (applying a differential filter within the image). The image processing circuit 140 also composites character information of various parameters, scales, body marks, and the like onto the ultrasonic image data.
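Scan conversion of a sector scan can be sketched as a polar-to-Cartesian resampling. Real systems interpolate between samples; this illustrative version does a crude nearest-sample lookup, and the grid sizes and sector geometry are assumptions.

```python
import numpy as np

def scan_convert(beam_data, depths, angles, nx=400, nz=400):
    """Crude scan conversion of sector-scan data to a Cartesian image.

    beam_data: (n_depths, n_angles) samples along each scanning line
    depths: sample depths [m] (increasing); angles: steering angles [rad]
        measured from the vertical axis (increasing).
    """
    x = np.linspace(depths[-1] * np.sin(angles[0]),
                    depths[-1] * np.sin(angles[-1]), nx)
    z = np.linspace(0.0, depths[-1], nz)
    xx, zz = np.meshgrid(x, z)
    r = np.hypot(xx, zz)              # radius of each output pixel
    th = np.arctan2(xx, zz)           # angle from the vertical axis
    # Nearest-sample lookup (real systems interpolate here)
    ri = np.clip(np.searchsorted(depths, r), 0, len(depths) - 1)
    ti = np.clip(np.searchsorted(angles, th), 0, len(angles) - 1)
    img = beam_data[ri, ti]
    # Blank everything outside the scanned sector
    img[(r > depths[-1]) | (th < angles[0]) | (th > angles[-1])] = 0
    return img
```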
That is, B-mode data and Doppler data are ultrasound image data before scan conversion, and the data generated by the image processing circuit 140 is ultrasound image data for display after scan conversion. B-mode data and Doppler data are also called raw data. The image processing circuit 140 generates two-dimensional ultrasonic image data for display from the two-dimensional ultrasonic image data before scan conversion.
The image processing circuit 140 also generates three-dimensional B-mode image data by performing coordinate conversion on the three-dimensional B-mode data generated by the B-mode processing circuit 120, and three-dimensional Doppler image data by performing coordinate conversion on the three-dimensional Doppler data generated by the Doppler processing circuit 130.
To display volume image data on the display 103, the image processing circuit 140 performs rendering processing on the volume image data to generate various kinds of two-dimensional image data. Rendering performed by the image processing circuit 140 includes, for example, Multi-Planar Reconstruction (MPR), which generates MPR image data from volume image data; Volume Rendering (VR), which generates two-dimensional image data reflecting three-dimensional information; and Surface Rendering (SR), which generates two-dimensional image data extracting only the surface information of a three-dimensional image.
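Of the rendering methods named above, volume rendering can be sketched as front-to-back alpha compositing along one axis of the volume. The linear opacity mapping below is an arbitrary assumption, far simpler than a clinical VR implementation with transfer functions and shading.

```python
import numpy as np

def volume_render(volume, axis=0, opacity_scale=0.05):
    """Minimal front-to-back alpha compositing along one volume axis."""
    vol = np.moveaxis(volume.astype(float), axis, 0)
    vol = vol / max(vol.max(), 1e-12)        # normalize voxel intensities
    out = np.zeros(vol.shape[1:])
    transmittance = np.ones(vol.shape[1:])
    for slab in vol:                          # march front to back
        alpha = np.clip(slab * opacity_scale, 0.0, 1.0)
        out += transmittance * alpha * slab   # accumulate composited value
        transmittance *= (1.0 - alpha)        # light absorbed so far
    return out
```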
The image processing circuit 140 stores the generated image data and the image data obtained by performing various image processing in the image memory 150. The image processing circuit 140 may also generate information indicating the display position of each image data, various information for assisting the operation of the ultrasonic diagnostic apparatus 1, and additional information related to diagnosis such as patient information, together with the image data, and store the information in the image memory 150.
The image processing circuit 140 of the present embodiment executes an image generation function 141, a detection function 142, an estimation function 143, and a display control function 144. The detection function 142 is an example of a detection unit. The estimation function 143 is an example of an estimation unit. The display control function 144 is an example of an output unit.
Here, the processing functions executed by the components of the image processing circuit 140 shown in fig. 1, namely the image generation function 141, the detection function 142, the estimation function 143, and the display control function 144, are recorded in the storage circuit 160 as computer-executable programs. The image processing circuit 140 is a processor that reads each program from the storage circuit 160 and executes it to realize the function corresponding to that program. In other words, the image processing circuit 140 in the state of having read the programs has the functions shown within the image processing circuit 140 in fig. 1. The processing contents of the image generation function 141, the detection function 142, the estimation function 143, and the display control function 144 are described later.
In fig. 1, the processing functions of the image generation function 141, the detection function 142, the estimation function 143, and the display control function 144 are described as being implemented in the single image processing circuit 140; however, a processing circuit may instead be configured by combining a plurality of independent processors, with each processor executing a program to realize its function.
The term "processor" used above means, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an Application-Specific Integrated Circuit (ASIC), or a programmable logic device (for example, a Simple Programmable Logic Device (SPLD), a Complex Programmable Logic Device (CPLD), or a Field-Programmable Gate Array (FPGA)). When the processor is, for example, a CPU, it realizes its functions by reading out and executing programs stored in the storage circuit 160. When the processor is, for example, an ASIC, the functions are directly built into its circuitry instead of storing programs in the storage circuit 160. Each processor of the present embodiment is not limited to a single circuit; a plurality of independent circuits may be combined into one processor that realizes the functions. Moreover, a plurality of the components in fig. 1 may be integrated into one processor to realize their functions.
The image memory 150 and the storage circuit 160 are, for example, semiconductor memory elements such as RAM (Random Access Memory) and flash memory, or storage devices such as hard disks and optical disks.
The image memory 150 stores, as ultrasound image data, image data such as the B-mode image data and Doppler image data generated by the image processing circuit 140. It may also store, as ultrasound image data, the B-mode data generated by the B-mode processing circuit 120 and the Doppler data generated by the Doppler processing circuit 130. The ultrasound image data stored in the image memory 150 can, for example, be called up by the operator after diagnosis and turned into ultrasound image data for display via the image processing circuit 140.
The storage circuit 160 stores various data such as control programs for ultrasound transmission/reception, image processing, and display processing; diagnostic information (for example, patient IDs and doctors' findings); diagnostic protocols; and various body marks. Data stored in the storage circuit 160 can be transferred to external devices via an interface unit (not shown). External devices are, for example, a PC (Personal Computer) used by a doctor for image diagnosis, storage media such as CDs and DVDs, and printers. If the storage circuit 160 is accessible to the ultrasonic diagnostic apparatus 1 over the network 2, it need not be built into the ultrasonic diagnostic apparatus 1.
In addition, the storage circuit 160 stores a learned model 400. The learned model 400 is generated by learning from ultrasonic image data obtained in past ultrasonic scans together with training data indicating the region including the measurement site in that image data. Examples of the learning include machine learning and AI (Artificial Intelligence) learning based on neural networks. Through this learning, the learned model 400 acquires the following function: taking ultrasonic image data as input, it outputs data indicating the result of detecting the region that includes the measurement site in the ultrasonic image data. Here, the measurement site includes the head, abdomen, thigh, and upper arm of the fetus. The use and generation of the learned model 400 are described later.
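The input/output contract of the learned model 400 (ultrasonic image in, region covering the measurement site out) can be illustrated with a toy stub. The threshold-based detector below is a placeholder for the trained network, not a description of it; function name and bounding-box convention are assumptions.

```python
import numpy as np

def detect_site_bbox(image, threshold):
    """Toy stand-in for the learned model's contract: given an ultrasound
    image, return (x, y, w, h) of a rectangle covering the bright structure.
    The real model 400 is a trained network; this threshold-based stub only
    illustrates the interface, not the method."""
    ys, xs = np.nonzero(image > threshold)
    if len(xs) == 0:
        return None   # no measurement site detected
    x, y = xs.min(), ys.min()
    return (int(x), int(y), int(xs.max() - x + 1), int(ys.max() - y + 1))
```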
The control circuit 170 controls the entire processing of the ultrasonic diagnostic apparatus 1. Specifically, the control circuit 170 controls the processing of the transmission/reception circuit 110, the B-mode processing circuit 120, the doppler processing circuit 130, the image processing circuit 140, and the like based on various setting requests input from the operator via the input device 102, various control programs read from the storage circuit 160, and various data.
The transmission/reception circuit 110, the B-mode processing circuit 120, the Doppler processing circuit 130, the image processing circuit 140, and the control circuit 170 built into the apparatus main body 100 may be configured as hardware such as processors (a CPU (Central Processing Unit), an MPU (Micro Processing Unit), integrated circuits, or the like), but may also be configured as programs modularized in software.
The ultrasonic diagnostic apparatus 1 configured as described above is used, for example, to check the development of a fetus in the womb of a pregnant woman: the ultrasonic probe 101 transmits and receives ultrasonic waves (ultrasonic scanning) to and from a region including a part of the fetus (the measurement site), and the image processing circuit 140 generates an ultrasonic image depicting the region including the measurement site based on the scanning result. For example, instead of manual caliper measurement, the ultrasonic diagnostic apparatus 1 automatically detects measurement sites such as the fetal head and abdomen from the ultrasonic image and thereby estimates measurement points for measuring fetal parameters such as the biparietal diameter (BPD), head circumference (HC), abdominal circumference (AC), femur length (FL), and humerus length (HL). Furthermore, using the parameters measured at these measurement points, the ultrasonic diagnostic apparatus 1 can calculate the estimated fetal weight (EFW).
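The patent does not specify how EFW is computed from the measured parameters. Purely as an illustration, one widely cited Hadlock regression (an assumption here, not the apparatus's formula) combines HC, AC, and FL:

```python
def estimated_fetal_weight(hc_cm, ac_cm, fl_cm):
    """Estimated fetal weight in grams, via one Hadlock regression.
    Illustrative only: the patent does not state which formula is used."""
    log10_efw = (1.326 - 0.00326 * ac_cm * fl_cm
                 + 0.0107 * hc_cm + 0.0438 * ac_cm + 0.158 * fl_cm)
    return 10.0 ** log10_efw
```

For plausible near-term measurements (HC 33 cm, AC 33 cm, FL 7 cm) this yields a weight around 3 kg, in the expected range.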
The entire configuration of the ultrasonic diagnostic apparatus 1 according to the present embodiment is described above. With this configuration, the ultrasonic diagnostic apparatus 1 according to the present embodiment can improve the accuracy of measurement using an ultrasonic image.
For example, when measuring parameters such as BPD, HC, AC, FL, and HL, a known method estimates the measurement points for measuring the measurement site by luminance-based image processing. Specifically, when BPD, HC, or AC is measured from an ultrasound image, the outline of the measurement site such as the head or abdomen is approximated by an ellipse: a high-luminance region is detected in the image based on luminance values, an ellipse is fitted to it, and the measurement points are taken as the endpoints of the ellipse's major and minor axes. Similarly, when FL or HL is measured, a rod-shaped region corresponding to the femur or humerus is detected in the ultrasound image based on luminance values, and the measurement points are taken as the two endpoints of that region. However, this method has the following problem.
Fig. 2 is a diagram for explaining a problem that arises when a measurement site is measured. For example, when HC is measured from an ultrasonic image, as shown in fig. 2, an ellipse HC100 approximating the outline of the measurement site is detected using luminance, and measurement points P101 to P103 of the measurement site are estimated. However, when the ultrasonic image contains a portion brighter than the outline of the actual measurement site indicated by the ellipse HC100, the measurement points may be estimated incorrectly under the influence of that brighter portion. For example, as shown in fig. 2, when a portion brighter than the contour indicated by the ellipse HC100 exists and the ellipse HC200 is detected so as to include that portion, the point P201, which lies outside the measurement site, is estimated as a measurement point. In this way, when a measurement site is detected based on luminance, a high-luminance portion outside the measurement site may prevent the correct measurement points from being estimated, and parameters such as BPD, HC, AC, FL, and HL cannot always be measured with high accuracy.
Therefore, in the ultrasonic diagnostic apparatus 1 according to the present embodiment, the detection function 142 detects the first rectangle covering the measurement region from the ultrasonic image. The inference function 143 infers a measurement point for measuring the measurement site based on the position information of the first rectangle. The display control function 144 outputs information related to the measurement point.
Hereinafter, each function of the image generation function 141, the detection function 142, the estimation function 143, and the display control function 144 will be described with reference to fig. 3 to 8.
Fig. 3 to 7 are diagrams for explaining the processing of the image processing circuit 140 of the ultrasonic diagnostic apparatus 1 according to the present embodiment. Fig. 3 to 7 illustrate, as an example, a case where the ultrasonic diagnostic apparatus 1 measures the Abdominal Circumference (AC) of a fetus using an ultrasonic image. In this case, the measurement site is the abdomen of the fetus. Fig. 8 is a flowchart showing the procedure of the processing in the ultrasonic diagnostic apparatus 1 according to the present embodiment. The flowchart of fig. 8 explains the operation (image processing method) of the entire ultrasonic diagnostic apparatus 1, and the following description indicates which step in the flowchart each component performs.
Step S101 in fig. 8 is a step performed by the ultrasonic probe 101. In step S101, the ultrasound probe 101 performs ultrasound scanning. Specifically, the ultrasonic probe 101 contacts the body surface of the subject P (the abdomen of the pregnant woman), performs ultrasonic scanning on a region including a part of the fetus (measurement site) in the uterus of the pregnant woman, and collects reflected wave signals of the region as a result of the ultrasonic scanning.
Step S102 in fig. 8 is a step in which the image processing circuit 140 calls a program corresponding to the image generation function 141 from the storage circuit 160 and executes the program. In step S102, the image generation function 141 generates an ultrasound image.
Specifically, the image generating function 141 generates an ultrasonic image 200 as shown in fig. 3 based on the reflected wave signal obtained by the ultrasonic probe 101. The ultrasonic image 200 is ultrasonic image data obtained by imaging a region including a measurement site. Here, the image generating function 141 may generate the ultrasonic image 200 by generating B-mode image data using the B-mode data generated by the B-mode processing circuit 120, or may generate the ultrasonic image 200 using ultrasonic image data stored in the image memory 150.
Step S103 in fig. 8 is a step in which the image processing circuit 140 calls a program corresponding to the detection function 142 from the storage circuit 160 and executes the program. In step S103, the detection function 142 detects a first bounding box (bounding box) enclosing the measurement site from the ultrasonic image 200.
Specifically, first, the detection function 142 detects the measurement site from the ultrasonic image 200 using the learned model 400 read out from the storage circuit 160. The learned model 400 receives ultrasonic image data as input data and outputs, as output data, the result of detecting the measurement site in the ultrasonic image data. For example, the learned model 400 outputs, as output data, a rectangle circumscribing an ellipse that approximates the outline of the abdomen serving as the measurement site. This rectangle is also called a bounding box (circumscribed rectangle). The detection function 142 inputs the ultrasonic image 200 into the learned model 400 and detects the first bounding box 210 shown in fig. 4, a bounding box that encloses the abdomen of the fetus and represents the position and size of the abdomen, whose outline is approximated by an ellipse. The first bounding box 210 is an example of the "first rectangle".
The detection function 142 detects a predetermined structure from the ultrasound image 200 using the learned model 400. The learned model 400 receives input of ultrasonic image data as input data, and outputs output data, which is a result of detection of a predetermined structure in the ultrasonic image data. When the measurement site is the abdomen of the fetus, the predetermined structure is the spine of the fetus. For example, as shown in fig. 4, the detection function 142 inputs the ultrasonic image 200 to the learned model 400 to detect the spine 202 and outputs the detected spine as output data.
Here, the inference function 143 infers provisional measurement points for measuring the abdomen of the fetus based on the position information of the first bounding box 210. For example, as shown in fig. 4, the inference function 143 infers, as provisional measurement points, the points 211 to 213 at which an ellipse approximating the contour of the abdomen of the fetus is inscribed in the first bounding box 210. The measurement points 211 to 213 are the end points of the major axis and the minor axis of the inscribed ellipse, i.e., the midpoints of the long sides and the short sides of the first bounding box 210.
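The geometry just described can be sketched as follows. This is a minimal illustration, not the patent's actual implementation; the bounding-box representation as an `(x_min, y_min, x_max, y_max)` tuple is an assumption for illustration. The axis end points of the inscribed ellipse are simply the midpoints of the four sides of the box.

```python
# Sketch (illustrative assumption): provisional measurement points inferred
# from an axis-aligned bounding box given as (x_min, y_min, x_max, y_max).
def axis_endpoints(box):
    """Return the end points of the major/minor axes of the ellipse
    inscribed in the bounding box, i.e. the midpoints of its four sides."""
    x0, y0, x1, y1 = box
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    # Midpoints of the left, right, top, and bottom sides.
    return [(x0, cy), (x1, cy), (cx, y0), (cx, y1)]

points = axis_endpoints((10.0, 20.0, 50.0, 60.0))
```

The returned points play the role of the provisional measurement points 211 to 213 (plus the fourth axis end point on the opposite side of the box).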
Step S104 in fig. 8 is a step in which the image processing circuit 140 calls a program corresponding to the detection function 142 from the storage circuit 160 and executes the program. In step S104, detection function 142 detects the orientation of the measurement site based on the position information of first boundary frame 210 and the position information of the predetermined structure.
Specifically, first, as shown in fig. 4, the detection function 142 detects, in the ultrasound image 200, the line segment L210 connecting the center C210 of the first bounding box 210 and the center of the spine 202 of the fetus, which is the predetermined structure. Next, the detection function 142 detects the direction of the line segment L210 as the orientation of the abdomen of the fetus, i.e., the measurement site. In the example shown in fig. 4, the direction of the line segment L210 is inclined clockwise or counterclockwise by the angle θ with respect to the vertical or horizontal direction of the screen on which the ultrasonic image 200 is displayed. The vertical and horizontal directions of the screen correspond to the column direction and the row direction of the pixels in the ultrasonic image 200, respectively. For example, as shown in fig. 4, the direction of the line segment L210 is inclined counterclockwise by the angle θ with respect to the vertical direction of the screen.
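The angle θ between the line segment L210 and the vertical screen direction can be computed as sketched below. This is an illustrative assumption, not the patent's stated formula; image coordinates with the y axis growing downward (row direction) and the angle measured from the vertical (column) direction are assumed.

```python
import math

# Sketch (illustrative assumption): orientation of the measurement site as
# the angle, in degrees, of the segment from the bounding-box center to the
# center of the reference structure (e.g. the spine), measured from the
# vertical screen direction on a y-down image.
def orientation_angle(box_center, structure_center):
    dx = structure_center[0] - box_center[0]
    dy = structure_center[1] - box_center[1]
    # atan2(dx, dy) is 0 when the structure lies straight "down" the column
    # direction; positive values correspond to a counterclockwise tilt.
    return math.degrees(math.atan2(dx, dy))

theta = orientation_angle((100.0, 100.0), (100.0, 150.0))  # structure directly below
```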
Step S105 in fig. 8 is a step in which the image processing circuit 140 calls a program corresponding to the detection function 142 from the storage circuit 160 and executes the program. In step S105, the detection function 142 performs a process of determining whether or not the measurement site detected in step S104 is tilted.
As described above, the direction of the line segment L210 detected by the detection function 142 can be regarded as the orientation of the abdomen of the fetus, i.e., the measurement site. For example, the abdomen depicted in the ultrasound image 200 can be regarded as tilted counterclockwise by the angle θ with respect to the vertical direction of the screen on which the ultrasound image 200 is displayed. In this case, in the determination processing in step S105, the detection function 142 determines that the measurement site detected in step S104 is tilted (step S105; yes).
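The tilt determination of step S105 can be sketched as follows. The patent treats the site as not tilted when the segment direction coincides with the vertical or horizontal screen direction; the small tolerance used here is an illustrative assumption.

```python
# Sketch (illustrative assumption): step S105 tilt determination. The site is
# considered not tilted when theta is within tol_deg of a vertical or
# horizontal screen direction; the tolerance value is assumed.
def is_tilted(theta_deg, tol_deg=1.0):
    r = abs(theta_deg) % 90.0          # distance past the nearest axis
    return min(r, 90.0 - r) > tol_deg  # 0 means exactly vertical/horizontal

tilted_a = is_tilted(17.0)   # inclined by 17 degrees
tilted_b = is_tilted(90.0)   # aligned with the horizontal direction
```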
Step S106 in fig. 8 is a step in which the image processing circuit 140 calls a program corresponding to the detection function 142 from the storage circuit 160 and executes the program. In step S106, the detection function 142 performs rotation processing on the ultrasonic image 200 based on the orientation of the measurement site detected in step S104. Specifically, the detection function 142 performs the rotation processing so that the direction of the line segment L210 becomes the vertical or horizontal direction of the screen on which the ultrasound image 200 is displayed.
By performing the rotation processing, it becomes more likely that the front-back direction of the abdomen of the fetus, i.e., the measurement site, coincides with the vertical or horizontal direction of the screen, and that its left-right direction coincides with the horizontal or vertical direction of the screen. That is, the rotation processing makes it more likely that the major axis or the minor axis of the ellipse detected as the outline of the abdomen is aligned with the vertical or horizontal direction of the screen, and hence that one side of the bounding box is aligned with the vertical or horizontal direction of the screen.
For example, as shown in fig. 5, the detection function 142 performs the rotation processing so that the direction of the line segment L210 becomes the vertical direction of the screen. That is, since the direction of the line segment L210 is inclined counterclockwise by the angle θ with respect to the vertical direction of the screen, the detection function 142 rotates the ultrasound image 200 clockwise by the angle θ.
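The rotation can be sketched at the coordinate level as follows; the same transform, applied with the opposite angle, is what maps the drawn marks back onto the unrotated image before display (step S109). The y-down screen coordinate convention and rotation about the image center are illustrative assumptions.

```python
import math

# Sketch (illustrative assumption): rotate a point clockwise on screen by
# theta degrees about a center, with image coordinates whose y axis points
# down. Applying the transform with -theta undoes the rotation.
def rotate_point(p, center, theta_deg):
    t = math.radians(theta_deg)
    x, y = p[0] - center[0], p[1] - center[1]
    # On y-down axes this matrix produces a clockwise rotation on screen.
    xr = x * math.cos(t) - y * math.sin(t)
    yr = x * math.sin(t) + y * math.cos(t)
    return (xr + center[0], yr + center[1])

rotated = rotate_point((10.0, 0.0), (0.0, 0.0), 90.0)
```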
Step S107 in fig. 8 is a step in which the image processing circuit 140 calls a program corresponding to the detection function 142 from the storage circuit 160 and executes the program. In step S107, the detection function 142 detects a second bounding box enclosing the measurement site from the ultrasonic image 200 after the rotation process performed in step S106.
Specifically, the detection function 142 inputs the rotated ultrasonic image 200 to the learned model 400 and detects the second bounding box 220 shown in fig. 5 as a bounding box that encloses the abdomen of the fetus and represents the position and size of the abdomen, whose outline is approximated by an ellipse. In this case, the bounding box is changed from the first bounding box 210 to the second bounding box 220. The second bounding box 220 is an example of the "second rectangle".
Step S108 in fig. 8 is a step in which the image processing circuit 140 calls a program corresponding to the inference function 143 from the storage circuit 160 and executes the program. In step S108, the inference function 143 infers a measurement point for measuring the abdomen of the fetus based on the position information of the second bounding box 220.
Specifically, as shown in fig. 5, the estimation function 143 estimates, as the measurement points, the points 221 to 223 at which the ellipse 224 approximating the contour of the abdomen of the fetus is inscribed in the second bounding box 220. The points 221 to 223 are the end points of the major axis and the minor axis of the ellipse 224, i.e., the midpoints of the long sides and the short sides of the second bounding box 220. In this case, the inference function 143 updates the measurement points for measuring the abdomen of the fetus from the measurement points 211 to 213 based on the position information of the first bounding box 210 to the measurement points 221 to 223 based on the position information of the second bounding box 220.
In the determination processing in step S105, when the direction of the line segment L210 detected by the detection function 142 coincides with the vertical or horizontal direction of the screen on which the ultrasound image 200 is displayed, the abdomen depicted in the ultrasound image 200 can be regarded as not tilted. In this case, the detection function 142 determines that the measurement site detected in step S104 is not tilted (step S105; no), and step S108 is executed without executing steps S106 and S107. That is, the detection function 142 does not perform the rotation processing on the ultrasonic image 200, and in step S108 the estimation function 143 sets the measurement points 211 to 213 based on the position information of the first bounding box 210 as the measurement points for measuring the abdomen of the fetus.
Step S109 in fig. 8 is a step in which the image processing circuit 140 calls a program corresponding to the display control function 144 from the storage circuit 160 and executes the program. In step S109, the display control function 144 outputs information related to the measurement points 221 to 223 via the screen.
For example, the display control function 144 causes the display 103 to display, as the information on the measurement points 221 to 223, an image in which marks indicating the positions of the measurement points 221 to 223 are drawn on the ultrasonic image 200 before the rotation processing. Specifically, first, as shown in fig. 5, the display control function 144 draws the marks indicating the positions of the measurement points 221 to 223 on the rotated ultrasonic image 200. Here, the ultrasonic image 200 in which these marks are drawn is the image obtained by rotating the ultrasonic image 200 shown in fig. 4 clockwise by the angle θ. Therefore, as shown in fig. 6, the display control function 144 rotates the ultrasonic image 200 in which the marks are drawn counterclockwise by the angle θ. As a result, as shown in fig. 7, the display 103 displays an image in which the marks indicating the positions of the measurement points 221 to 223 are drawn on the ultrasonic image 200 before the rotation processing. On the display 103, the ellipse 224, whose major-axis and minor-axis end points are the measurement points 221 to 223, is displayed superimposed on this image.
Here, when displaying the image shown in fig. 7 on the display 103 as the information on the measurement points 221 to 223, the display control function 144 may notify the operator that the measurement points 221 to 223 have been estimated. For example, a lamp such as an LED (Light Emitting Diode) may be provided in the input device 102 or the display 103; the display control function 144 may turn on the lamp to notify the operator that the measurement points 221 to 223 have been estimated, and then display the image shown in fig. 7 on the display 103. Alternatively, the display control function 144 may display on the display 103 a message indicating that the measurement points 221 to 223 have been estimated, and then display the image shown in fig. 7. Alternatively, the display 103 may have a speaker (not shown), and the display control function 144 may output a predetermined sound such as a beep and then display the image shown in fig. 7 on the display 103.
The mark displayed on the image shown in fig. 7 can be changed by the operator. For example, the operator can move the marker displayed on the image by operating the input device 102.
Thus, the estimation function 143 can measure the Abdominal Circumference (AC) of the fetus by using the ultrasonic image 200 and the measurement points 221 to 223. For example, the estimation function 143 measures, as the AC, the circumference of the ellipse 224 defined by the measurement points 221 to 223 in the ultrasonic image 200.
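The patent does not state how the circumference of the ellipse is computed; as an illustrative assumption, the sketch below uses Ramanujan's well-known approximation for the perimeter of an ellipse, with the semi-axes derived from the axis end points (the measurement points).

```python
import math

# Sketch (illustrative assumption): AC as the perimeter of the fitted
# ellipse, using Ramanujan's approximation with semi-axes a and b.
def ellipse_perimeter(a, b):
    h = ((a - b) / (a + b)) ** 2
    return math.pi * (a + b) * (1.0 + 3.0 * h / (10.0 + math.sqrt(4.0 - 3.0 * h)))

def abdominal_circumference(major_pts, minor_pts):
    """AC from the end-point pairs of the ellipse's major and minor axes."""
    a = math.dist(*major_pts) / 2.0
    b = math.dist(*minor_pts) / 2.0
    return ellipse_perimeter(a, b)

ac = abdominal_circumference(((0.0, 0.0), (100.0, 0.0)),
                             ((50.0, -30.0), (50.0, 30.0)))
```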
As described above, in the ultrasonic diagnostic apparatus 1 according to the present embodiment, the detection function 142 detects the first bounding box 210 enclosing the measurement site from the ultrasonic image 200, and the estimation function 143 estimates the measurement points for measuring the measurement site based on the position information of the first bounding box 210. Specifically, the detection function 142 detects the predetermined structure from the ultrasound image 200 together with the first bounding box 210. Next, the detection function 142 performs the rotation processing on the ultrasound image 200 based on the position information of the predetermined structure and the position information of the first bounding box 210, and detects the second bounding box 220 enclosing the measurement site from the rotated ultrasound image 200. Then, the estimation function 143 updates the measurement points for measuring the measurement site from the measurement points 211 to 213 based on the position information of the first bounding box 210 to the measurement points 221 to 223 based on the position information of the second bounding box 220. Finally, the display control function 144 outputs information on the measurement points 221 to 223.
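The steps summarized above (S103 to S108) can be tied together in a compact sketch. Everything model-related is hypothetical here: `detect_box_and_structure` is a stub standing in for the learned model 400, and `rotate_image` is a stub for the rotation processing; only the geometry follows the description.

```python
import math

def detect_box_and_structure(image):
    # Stub for the learned model 400: returns a bounding box and the
    # center of the reference structure (e.g. the spine). Hypothetical.
    return (20.0, 10.0, 80.0, 90.0), (50.0, 80.0)

def rotate_image(image, angle_deg):
    # Stub for the rotation processing of step S106. Hypothetical.
    return image

def side_midpoints(box):
    x0, y0, x1, y1 = box
    cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
    return [(x0, cy), (x1, cy), (cx, y0), (cx, y1)]

def infer_measurement_points(image, tol_deg=1.0):
    box, structure = detect_box_and_structure(image)      # step S103
    cx, cy = (box[0] + box[2]) / 2.0, (box[1] + box[3]) / 2.0
    theta = math.degrees(math.atan2(structure[0] - cx,
                                    structure[1] - cy))   # step S104
    if abs(theta) > tol_deg:                              # step S105
        image = rotate_image(image, -theta)               # step S106
        box, _ = detect_box_and_structure(image)          # step S107
    return side_midpoints(box)                            # step S108

points = infer_measurement_points(None)
```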
Thus, in the ultrasonic diagnostic apparatus 1 according to the present embodiment, the measurement points are estimated by detecting a bounding box that encloses the entire measurement site, instead of fitting an ellipse to the outline of the measurement site based on high-luminance regions, so the accuracy of measurement using the ultrasonic image 200 is improved. In addition, since the measurement points are estimated from the entire measurement site by using the bounding box, measurement points with few distinctive features are easier to estimate than when each point is estimated individually, and the measurement points can be estimated efficiently.
In site detection, and not only in detection using the learned model 400, a site can generally be detected at a more accurate position when the site to be detected is not tilted. In the present embodiment, the measurement site is detected by a bounding box, a predetermined structure from which the posture of the measurement site can be estimated is also detected, and the image to be processed is rotated so that one side of the bounding box enclosing the measurement site becomes vertical or horizontal. Then, by performing the bounding-box detection again on the rotated image, the site can be detected at a more accurate position, and as a result, the estimation accuracy of the measurement points can be improved.
In the above-described embodiment, the case of measuring the Abdominal Circumference (AC) of the fetus has been described; however, the present embodiment is not limited thereto and is also applicable to measuring parameters such as the biparietal diameter (BPD), Head Circumference (HC), Femoral Length (FL), and Humeral Length (HL) of the fetus.
First, a case of measuring parameters such as BPD and HC will be described. Fig. 9 is a diagram for explaining the processing of the image processing circuit of the ultrasonic diagnostic apparatus according to the present embodiment, and is an explanatory diagram in the case of measuring parameters such as BPD and HC.
For example, after steps S101 and S102 are executed, in step S103 the detection function 142 detects a first bounding box enclosing the measurement site from the ultrasonic image. Specifically, the detection function 142 inputs the ultrasonic image 500 into the learned model 400 and detects the first bounding box 510 shown in fig. 9 as a bounding box that encloses the head of the fetus and represents the position and size of the head, whose outline is approximated by an ellipse. The detection function 142 also inputs the ultrasonic image 500 into the learned model 400 to detect the predetermined structure. For example, when the measurement site is the head of the fetus, the predetermined structure is the transparent compartment 502 (cavum septi pellucidi) of the fetus. Here, the inference function 143 infers provisional measurement points for measuring the head of the fetus based on the position information of the first bounding box 510. For example, as shown in fig. 9, the inference function 143 infers, as provisional measurement points, the points 511 to 514 at which an ellipse approximating the contour of the head of the fetus is inscribed in the first bounding box 510.
In step S104, the detection function 142 detects the orientation of the measurement site based on the position information of the first bounding box 510 and the position information of the predetermined structure. Specifically, first, as shown in fig. 9, the detection function 142 detects, in the ultrasound image 500, the line segment L510 connecting the center C510 of the first bounding box 510 and the center of the transparent compartment 502 of the fetus, which is the predetermined structure. Next, the detection function 142 detects the direction of the line segment L510 as the orientation of the head of the fetus, i.e., the measurement site. In this example, the direction of the line segment L510 coincides with the horizontal direction of the screen on which the ultrasound image 500 is displayed.
In step S105, the detection function 142 performs a process of determining whether or not the measurement site detected in step S104 is tilted.
In the determination processing in step S105, when the direction of the line segment L510 detected by the detection function 142 coincides with the vertical or horizontal direction of the screen on which the ultrasound image 500 is displayed, the head depicted in the ultrasound image 500 can be regarded as not tilted. In this case, the detection function 142 determines that the measurement site detected in step S104 is not tilted (step S105; no), and executes the processing of step S108 without executing the processing of steps S106 and S107. That is, the detection function 142 does not perform the rotation processing on the ultrasonic image 500, and in step S108 the estimation function 143 sets the measurement points 511 to 514 based on the position information of the first bounding box 510 as the measurement points for measuring the head of the fetus.
In step S109, the display control function 144 causes the display 103 to display, as information on the measurement points 511 to 514, an image in which marks indicating the positions of the measurement points 511 to 514 are drawn on the ultrasonic image 500.
Thus, the estimation function 143 can measure HC by using the ultrasonic image 500 and the measurement points 511 to 514. For example, the inference function 143 measures the perimeter of the ellipse formed by the measurement points 511 to 514 in the ultrasound image 500 as HC.
The estimation function 143 can measure the BPD by using the ultrasonic image 500 and the measurement points 513 and 514. For example, the estimation function 143 measures the length of a line segment connecting the measurement point 513 and the measurement point 514 as the BPD in the ultrasound image 500.
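The BPD measurement described above reduces to the length of one segment. The sketch below assumes image coordinates for the measurement points; the conversion from pixels to physical units via a per-pixel spacing is an illustrative assumption (the patent does not specify the unit handling).

```python
import math

# Sketch (illustrative assumption): BPD as the length of the segment joining
# measurement points 513 and 514, scaled by an assumed per-pixel spacing.
def bpd(p513, p514, mm_per_pixel=0.25):
    return math.dist(p513, p514) * mm_per_pixel

bpd_mm = bpd((120.0, 80.0), (120.0, 280.0))  # points 200 pixels apart
```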
In the determination processing in step S105, if detection function 142 determines that the measurement site detected in step S104 is inclined (step S105; yes), the following processing in steps S106 to S109 is executed.
First, in step S106, the detection function 142 performs the rotation processing on the ultrasonic image 500 based on the orientation of the measurement site detected in step S104. Specifically, the detection function 142 performs the rotation processing so that the direction of the line segment L510 becomes the vertical or horizontal direction of the screen on which the ultrasound image 500 is displayed. Next, in step S107, the detection function 142 detects a second bounding box enclosing the measurement site from the ultrasonic image 500 rotated in step S106. Specifically, the detection function 142 inputs the rotated ultrasonic image 500 to the learned model 400 and detects the second bounding box as a bounding box that encloses the head of the fetus and represents the position and size of the head, whose outline is approximated by an ellipse.
Next, in step S108, the inference function 143 infers measurement points for measuring the head of the fetus based on the position information of the second bounding box. Specifically, the inference function 143 infers, as the measurement points, the points at which an ellipse approximating the contour of the head of the fetus is inscribed in the second bounding box. In this case, the inference function 143 updates the measurement points for measuring the head of the fetus from the measurement points 511 to 514 based on the position information of the first bounding box 510 to the measurement points based on the position information of the second bounding box. Then, in step S109, the display control function 144 causes the display 103 to display, as information on the measurement points, an image in which marks indicating the updated positions of the measurement points are drawn on the ultrasound image 500.
In step S103, the case where the measurement site is the head of the fetus and the predetermined structure is the transparent compartment 502 of the fetus has been exemplified, but the present invention is not limited thereto.
For example, as shown in fig. 10, when the measurement site is the head of the fetus, the predetermined structure may be the quadruple pool 503 (quadrigeminal cistern) of the fetus. In this case, in step S104, first, as shown in fig. 10, the detection function 142 detects, in the ultrasound image 500, the line segment L510 connecting the center C510 of the first bounding box 510 and the center of the quadruple pool 503 of the fetus, which is the predetermined structure. Next, the detection function 142 detects the direction of the line segment L510 as the orientation of the head of the fetus, i.e., the measurement site. Thereafter, the processing from step S105 onward is performed. For example, when the measurement site is not tilted (step S105; no), the ultrasonic image 500 is not rotated, the measurement points 511 to 514 are estimated from the position information of the first bounding box 510 (step S108), and information on the measurement points 511 to 514 is displayed on the display 103 (step S109). On the other hand, when the measurement site is tilted (step S105; yes), the rotation processing is performed on the ultrasonic image 500 (step S106), and a second bounding box enclosing the measurement site is detected from the rotated ultrasonic image 500 (step S107). Then, the measurement points are estimated from the position information of the second bounding box (step S108), and information on the measurement points is displayed on the display 103 (step S109). In the example shown in fig. 10, even when the predetermined structure is the quadruple pool 503 of the fetus, the orientation of the head of the fetus can be detected in the same manner as when the predetermined structure is the transparent compartment 502.
For example, as shown in fig. 11, when the measurement site is the head of the fetus, the predetermined structures may be both the transparent compartment 502 and the quadruple pool 503 of the fetus. In this case, in step S104, first, as shown in fig. 11, the detection function 142 detects, in the ultrasound image 500, the line segment L510 connecting the center C510 of the first bounding box 510 and the centers of the transparent compartment 502 and the quadruple pool 503 of the fetus, which are the predetermined structures. Next, the detection function 142 detects the direction of the line segment L510 as the orientation of the head of the fetus, i.e., the measurement site. Thereafter, the processing from step S105 onward is performed. For example, when the measurement site is not tilted (step S105; no), the ultrasonic image 500 is not rotated, the measurement points 511 to 514 are estimated from the position information of the first bounding box 510 (step S108), and information on the measurement points 511 to 514 is displayed on the display 103 (step S109). On the other hand, when the measurement site is tilted (step S105; yes), the rotation processing is performed on the ultrasonic image 500 (step S106), and a second bounding box enclosing the measurement site is detected from the rotated ultrasonic image 500 (step S107). Then, the measurement points are estimated from the position information of the second bounding box (step S108), and information on the measurement points is displayed on the display 103 (step S109). In the example shown in fig. 11, since the predetermined structures are both the transparent compartment 502 and the quadruple pool 503 of the fetus, the orientation of the head of the fetus can be detected more accurately than when the predetermined structure is only one of them.
Next, a case of measuring FL will be explained. Fig. 12 is a diagram for explaining the processing of the image processing circuit of the ultrasonic diagnostic apparatus according to the present embodiment, and is an explanatory diagram in the case of FL measurement.
For example, after steps S101 and S102 are executed, in step S103 the detection function 142 detects a first bounding box enclosing the measurement site from the ultrasonic image. Specifically, the learned model 400 receives the ultrasonic image data and outputs, as output data, a bounding box circumscribing a rod-like region corresponding to the femur, which is the measurement site. The detection function 142 inputs the ultrasonic image 600 to the learned model 400 and detects the first bounding box 610 shown in fig. 12 as a bounding box that encloses the femur of the fetus and represents the position and size of the femur, approximated by a rod-like region. Here, when the measurement site is the femur or the humerus, the detection function 142 does not perform steps S104 to S107.
In step S108, the inference function 143 infers measurement points for measuring the femur of the fetus based on the position information of the first bounding box 610. For example, as shown in fig. 12, the inference function 143 infers, as the measurement points, the points 611 and 612 at which the rod-like region corresponding to the femur of the fetus is inscribed in the first bounding box 610. The measurement points 611 and 612 correspond to the two ends of the femur.
In step S109, the display control function 144 causes the display 103 to display an image in which marks indicating the positions of the measurement points 611 and 612 are drawn on the ultrasound image 600 as information on the measurement points 611 and 612.
Thus, the estimation function 143 can measure the FL by using the ultrasonic image 600 and the measurement points 611 and 612. For example, the estimation function 143 measures the length of a line segment connecting the measurement point 611 and the measurement point 612 as FL in the ultrasound image 600.
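The length measurement in this step reduces to the Euclidean distance between the two estimated points, scaled by the image's pixel spacing. A small sketch (the function name, sample coordinates, and pixel-spacing value are illustrative, not from the patent):

```python
import math

def measure_length_mm(p1, p2, mm_per_pixel):
    """Distance between two measurement points given in pixel
    coordinates, converted to millimetres via the pixel spacing."""
    dist_px = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return dist_px * mm_per_pixel

# e.g. femur-end points inferred from the bounding box:
fl_mm = measure_length_mm((120, 340), (420, 380), mm_per_pixel=0.1)
```

The same computation applies to HL: only the two measurement points differ.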
Similarly, when the HL is measured, the estimation function 143 measures the length of a line segment connecting two measurement points in the ultrasound image as the HL by executing steps S101 to S103, S108, and S109, for example.
Here, a process of generating the learned model 400 by machine learning will be described.
The learned model 400 is generated by another device (hereinafter, referred to as a learned model generating device) different from the ultrasonic diagnostic apparatus 1, for example, and stored in the ultrasonic diagnostic apparatus 1 (for example, stored in the storage circuit 160). Here, the learned model generating device may be realized by the ultrasonic diagnostic apparatus 1, and for example, the learned model 400 may be generated by the ultrasonic diagnostic apparatus 1 (for example, the estimating function 143) and stored in the storage circuit 160. The following description will be given taking as an example a case where the learned model 400 is generated by the learned model generating apparatus and stored in the ultrasonic diagnostic apparatus 1.
Fig. 13 is a block diagram showing an example of the configuration of the learned model generation device 300 according to the present embodiment. As shown in fig. 13, the learned model generation apparatus 300 includes an input device 302, a display 303, an image processing circuit 340, and a storage circuit 360. The learned model generation apparatus 300 is used to generate the learned model 400.
The input device 302 includes a mouse, a keyboard, buttons, a panel switch, a touch panel, a foot switch, a wheel, a trackball, a joystick, and the like, receives various setting requests from the operator of the learned model generation apparatus 300, and transmits the received various setting requests to the image processing circuit 340. The display 303 is a monitor to be referred to by the operator of the learned model generation apparatus 300.
The image processing circuit 340 controls the entire processing of the learned model generation apparatus 300. For example, as shown in fig. 13, the image processing circuit 340 executes a learning data acquisition function 341 and a learned model generation function 342.
The storage circuit 360 is, for example, a semiconductor memory element such as a RAM or a flash memory, or a storage device such as a hard disk or an optical disk. The processing functions executed by the learning data acquisition function 341 and the learned model generation function 342, which are components of the image processing circuit 340, are recorded in the storage circuit 360 as computer-executable programs, for example. The image processing circuit 340 is a processor that reads each program from the storage circuit 360 and executes it to realize the corresponding function. In addition, the storage circuit 360 stores a learning program implementing a neural network algorithm.
In the process of generating the learned model 400, first, the learning data acquisition function 341 of the learned model generation apparatus 300 acquires a plurality of data sets obtained from past ultrasonic scans. Each data set includes ultrasound image data collected when an ultrasonic scan of the measurement site was performed in the past, together with learning data indicating the region including the measurement site in that ultrasound image data. Further, since a neural network learns patterns incrementally from examples, the data sets used for learning need not all come from the same subject.
The learned model generating function 342 of the learned model generating apparatus 300 learns the pattern of the learning data from the plurality of data sets using the learning program read from the storage circuit 360. In doing so, the learned model generating function 342 generates the learned model 400 with the following function: it receives input data (ultrasonic image data) and outputs output data indicating the result of detecting the region including the measurement site in that ultrasonic image data. The generated learned model 400 is stored in the ultrasonic diagnostic apparatus 1 (for example, in the storage circuit 160), where it can be read by, for example, the detection function 142 of the ultrasonic diagnostic apparatus 1.
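The patent does not specify the network architecture or training procedure, but the supervised setup it describes (images in, bounding-box coordinates out, learned from labeled past scans) can be sketched with a deliberately simple stand-in: a linear box regressor trained by gradient descent on mean-squared error over synthetic data. Everything here (feature size, learning rate, iteration count, the linear model itself) is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: per-image feature vectors paired with
# bounding-box labels (cx, cy, w, h). Real data would be past
# ultrasound scans annotated with the measurement-site region.
X = rng.normal(size=(64, 16))      # 64 "images", 16 features each
true_W = rng.normal(size=(16, 4))
Y = X @ true_W                     # box labels (cx, cy, w, h)

# Minimal stand-in for the learning program: fit W by gradient
# descent on the mean-squared box-regression error.
W = np.zeros((16, 4))
lr = 0.1
for _ in range(2000):
    pred = X @ W
    grad = X.T @ (pred - Y) / len(X)
    W -= lr * grad

def predict_box(features):
    """The 'learned model': maps image features to a bounding box."""
    return features @ W
```

After training, `predict_box` plays the role of the learned model 400 at inference time: new image data in, detected region out.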
In the processing using the learned model 400, the detection function 142 of the ultrasonic diagnostic apparatus 1 supplies ultrasonic image data (for example, the ultrasound image 200) as input data. The detection function 142 then detects the region including the measurement site in the ultrasound image using the learned model 400 read from the storage circuit 160, and obtains output data indicating the result of the detection.
In the present embodiment, the case of using the learned model 400 has been described, but the learned model 400 need not be used. For example, after a measurement site such as the abdomen is detected by image processing based on brightness, a bounding box enclosing the measurement site may be detected. In this case, the measurement site may be detected as a region surrounded by an arbitrary curve, rather than being fitted with a shape such as an ellipse as in the present embodiment. Likewise, the predetermined structure may be detected by brightness-based image processing without using the learned model 400.
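A brightness-based alternative to the learned model can be sketched as thresholding the image and taking the bounding box of the bright pixels. This is a crude illustration of the idea, not the patent's method; the threshold, the toy image, and the function name are all assumptions:

```python
import numpy as np

def detect_bright_region_box(image, threshold):
    """Bounding box (x0, y0, x1, y1) of all pixels brighter than the
    threshold -- a simple stand-in for brightness-based detection of
    a measurement site, with no learned model involved."""
    ys, xs = np.nonzero(image > threshold)
    if xs.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

# Toy "ultrasound image": a bright elliptical blob on a dark field.
h, w = 100, 100
yy, xx = np.mgrid[0:h, 0:w]
image = np.where(((xx - 60) / 20.0) ** 2 + ((yy - 40) / 10.0) ** 2 < 1.0,
                 200, 30).astype(np.uint8)
box = detect_bright_region_box(image, threshold=128)
```

A practical detector would add smoothing and connected-component filtering so that speckle noise does not inflate the box, but the bounding-box output has the same shape either way.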
(other embodiments)
The embodiments are not limited to those described above. For example, the processing of the image processing circuit 140 may be performed by a workstation provided separately from the ultrasonic diagnostic apparatus 1. In this case, the workstation has processing circuitry similar to the image processing circuit 140 and executes the above-described processing.
The components of each device shown in the embodiments are functional and conceptual, and need not be physically configured as illustrated. That is, the specific form in which the devices are distributed or integrated is not limited to the illustration; all or part of them may be functionally or physically distributed or integrated in arbitrary units according to various loads, usage conditions, and the like. All or any part of the processing functions performed by each device may be realized by a CPU and a program analyzed and executed by the CPU, or may be realized as wired-logic hardware.
The display method described in the above embodiment can be realized by executing an image processing program prepared in advance on a computer such as a personal computer or a workstation. The image processing program can be distributed via a network such as the Internet. The image processing program may also be recorded on a non-transitory computer-readable recording medium such as a hard disk, a flexible disk (FD), a CD-ROM, an MO, or a DVD, and read from the recording medium and executed by the computer.
According to at least one embodiment described above, the accuracy of measurement using an ultrasonic image can be improved.
Although several embodiments have been described, these embodiments are provided as examples and are not intended to limit the scope of the invention. These embodiments can be implemented in other various manners, and various omissions, substitutions, changes, and combinations of the embodiments can be made without departing from the scope of the invention. These embodiments and modifications thereof are included in the scope and gist of the invention, and are also included in the scope and equivalents of the invention described in the claims.

Claims (11)

1. An ultrasonic diagnostic apparatus is provided with:
a detection unit that detects, from an ultrasonic image, a first rectangle enclosing a measurement site;
an estimation unit configured to estimate a measurement point for measuring the measurement site based on position information of the first rectangle; and
an output unit that outputs information on the measurement point.
2. The ultrasonic diagnostic apparatus according to claim 1,
the detection part is used for detecting the position of the object,
detecting a predetermined structure from the ultrasonic image together with the first rectangle,
performing rotation processing on the ultrasonic image based on the position information of the predetermined structure and the position information of the first rectangle,
detecting a second rectangle enclosing the measurement region from the ultrasonic image subjected to the rotation processing,
the estimation unit updates, as the measurement point, a measurement point based on the position information of the first rectangle to a measurement point based on the position information of the second rectangle,
the output unit outputs information on the measurement point based on the position information of the second rectangle.
3. The ultrasonic diagnostic apparatus according to claim 2,
the detection unit performs the rotation processing so that a direction of a line segment connecting the predetermined structure and the center of the first rectangle is a vertical direction or a horizontal direction of a screen on which the ultrasonic image is displayed.
4. The ultrasonic diagnostic apparatus according to claim 3,
the detection unit does not perform the rotation processing when the direction of the line segment is a vertical direction or a horizontal direction of a screen on which the ultrasonic image is displayed.
5. The ultrasonic diagnostic device according to any one of claims 2 to 4,
the detection unit, when an ellipse is fitted to an outline of the measurement site, performs the rotation processing so that a major axis or a minor axis of the ellipse is aligned with a vertical direction or a horizontal direction of a screen on which the ultrasonic image is displayed.
6. The ultrasonic diagnostic device according to any one of claims 2 to 4,
the detection unit, when a rectangle is fitted to an outline of the measurement site, performs the rotation processing so that one side of the rectangle is aligned with a vertical direction or a horizontal direction of a screen on which the ultrasonic image is displayed.
7. The ultrasonic diagnostic apparatus according to claim 1,
the output unit causes a display unit to display an image in which a mark indicating the position of the measurement point is drawn on the ultrasonic image.
8. The ultrasonic diagnostic apparatus according to claim 2,
the output unit displays, on a display unit, an image in which a mark indicating the position of the measurement point is drawn on the ultrasonic image before the rotation processing.
9. The ultrasonic diagnostic apparatus according to claim 2,
in the case where the measurement site is an abdomen of a fetus, the predetermined structure is a spine of the fetus.
10. The ultrasonic diagnostic apparatus according to claim 2,
in the case where the measurement site is a head of a fetus, the predetermined structure is at least one of a transparent compartment and a quadrigeminal cistern of the fetus.
11. An image processing apparatus includes:
a detection unit that detects, from an ultrasonic image, a first rectangle enclosing a measurement site;
an estimation unit configured to estimate a measurement point for measuring the measurement site based on position information of the first rectangle; and
an output unit that outputs information on the measurement point.
CN202110576727.4A 2020-05-28 2021-05-26 Ultrasonic diagnostic apparatus and image processing apparatus Pending CN113729777A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-093551 2020-05-28
JP2020093551A JP2021186208A (en) 2020-05-28 2020-05-28 Ultrasonic diagnostic apparatus and image processing device

Publications (1)

Publication Number Publication Date
CN113729777A true CN113729777A (en) 2021-12-03

Family

ID=78728336

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110576727.4A Pending CN113729777A (en) 2020-05-28 2021-05-26 Ultrasonic diagnostic apparatus and image processing apparatus

Country Status (2)

Country Link
JP (1) JP2021186208A (en)
CN (1) CN113729777A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002008008A (en) * 2000-06-27 2002-01-11 Fuji Photo Film Co Ltd Medical image output method and device therefor
CN102551748A (en) * 2010-09-29 2012-07-11 富士胶片株式会社 An image processing apparatus, a radiographic image system, and an image processing method and program
JP2020039645A (en) * 2018-09-11 2020-03-19 株式会社日立製作所 Ultrasonic diagnostic apparatus and display method


Also Published As

Publication number Publication date
JP2021186208A (en) 2021-12-13

Similar Documents

Publication Publication Date Title
US6290648B1 (en) Ultrasonic diagnostic apparatus
JP5620666B2 (en) Ultrasonic diagnostic equipment, ultrasonic image processing equipment
JP5002181B2 (en) Ultrasonic diagnostic apparatus and ultrasonic diagnostic apparatus control method
JP5461845B2 (en) Ultrasonic diagnostic apparatus and control program for ultrasonic diagnostic apparatus
CN102415902B (en) Ultrasonic diagnostic apparatus and ultrasonic image processng apparatus
JP7258568B2 (en) ULTRASOUND DIAGNOSTIC DEVICE, IMAGE PROCESSING DEVICE, AND IMAGE PROCESSING PROGRAM
EP1685799B1 (en) Ultrasonic diagnostic apparatus and ultrasonic image acquiring method
US10368841B2 (en) Ultrasound diagnostic apparatus
JP2006218210A (en) Ultrasonic diagnostic apparatus, ultrasonic image generating program and ultrasonic image generating method
JP5689591B2 (en) Ultrasonic diagnostic apparatus and ultrasonic image processing program
JP2020531086A (en) An ultrasound system that extracts an image plane from volume data using touch interaction with an image
JP2010148828A (en) Ultrasonic diagnostic device and control program of ultrasonic diagnostic device
JP3936450B2 (en) Projection image generation apparatus and medical image apparatus
US8858442B2 (en) Ultrasonic diagnostic apparatus and ultrasonic image processing apparatus
JP7171228B2 (en) Ultrasound diagnostic equipment and medical information processing program
JP5606025B2 (en) Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and ultrasonic image processing program
JP5196994B2 (en) Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and ultrasonic image processing program
JP5202916B2 (en) Ultrasound image diagnostic apparatus and control program thereof
JP7171291B2 (en) Ultrasound diagnostic equipment and image processing program
CN113729777A (en) Ultrasonic diagnostic apparatus and image processing apparatus
JP5060141B2 (en) Ultrasonic diagnostic equipment
JP7237512B2 (en) ultrasound diagnostic equipment
JP2024053759A (en) Ultrasound image processing device, ultrasound diagnostic device, ultrasound image processing method, and program
JP2024018636A (en) Medical image processing device, medical image processing method, and medical image processing program
JP6976869B2 (en) Ultrasonic diagnostic equipment and its control program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination