CN110584714A - Ultrasonic fusion imaging method, ultrasonic device, and storage medium - Google Patents


Info

Publication number
CN110584714A
CN110584714A (application CN201911009845.6A)
Authority
CN
China
Prior art keywords
scanning
ultrasonic
area
detection
detection part
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911009845.6A
Other languages
Chinese (zh)
Inventor
邹建宇
莫若理
Current Assignee
Wuxi Chison Medical Technologies Co Ltd
Original Assignee
Wuxi Chison Medical Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuxi Chison Medical Technologies Co Ltd filed Critical Wuxi Chison Medical Technologies Co Ltd
Priority to CN201911009845.6A priority Critical patent/CN110584714A/en
Publication of CN110584714A publication Critical patent/CN110584714A/en
Pending legal-status Critical Current


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0833 ... involving detecting or locating foreign bodies or organic structures
    • A61B 8/085 ... for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis
    • A61B 8/5207 ... involving processing of raw data to produce diagnostic data, e.g. for generating an image
    • A61B 8/5215 ... involving processing of medical diagnostic data
    • A61B 8/5238 ... for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B 8/5261 ... combining images from different diagnostic modalities, e.g. ultrasound and X-ray
    • A61B 2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/40 Animals

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Vascular Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The invention relates to the technical field of fusion imaging, and in particular discloses an ultrasonic fusion imaging method, an ultrasonic device, and a storage medium. The ultrasonic fusion imaging method comprises the following steps: judging whether the detection part can be completely displayed in a single-frame ultrasonic image; if the detection part cannot be completely displayed in a single-frame ultrasonic image, acquiring a reference scanning area of the detection part on the body surface of the detection object; acquiring posture information and scanning path information of the ultrasonic probe as it scans the detection part; generating a scanning coverage area of the ultrasonic probe according to the posture information and the scanning path information; and displaying the reference scanning area and the scanning coverage area in fused form, so that after the ultrasonic probe finishes scanning the detection part it can be judged whether any region was missed. By fusing the reference scanning area and the scanning coverage area into one display, the invention provides an intuitive and clear basis for judging whether any part of the detection part was missed.

Description

Ultrasonic fusion imaging method, ultrasonic device, and storage medium
Technical Field
The present invention relates to the field of fusion imaging technologies, and in particular, to an ultrasound fusion imaging method, an ultrasound device, and a storage medium.
Background
Currently, when an operator performs an ultrasonic examination on a patient, it cannot be determined whether the detection part of the detection object has been completely scanned by the ultrasonic probe; the examination mainly depends on the operator's clinical experience. For operators with insufficient clinical experience, missed detection easily occurs, and if a lesion happens to lie in the missed region, diagnosis and treatment are delayed. Herein, the detection object generally refers to a patient who needs an ultrasonic examination; in a broad sense it may also include various animals.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art and to provide an ultrasonic fusion imaging method that dynamically displays the area already scanned by ultrasound and, through a clear visual contrast, provides a basis for judging whether any region was missed. The technical scheme adopted by the invention is as follows:
an ultrasound fusion imaging method comprising:
judging whether the detection part can be completely displayed in a single-frame ultrasonic image;
if the detection part cannot be completely displayed in a single-frame ultrasonic image, acquiring a reference scanning area of the detection part on the body surface of the detection object;
acquiring posture information and scanning path information of the ultrasonic probe as it scans the detection part;
generating a scanning coverage area of the ultrasonic probe according to the posture information and the scanning path information;
and fusing and displaying the reference scanning area and the scanning coverage area, so that after the ultrasonic probe finishes scanning the detection part it can be judged whether any region was missed.
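The final judgment in the steps above reduces to comparing the reference scanning area with the scanning coverage area. The sketch below illustrates this with a set-based grid representation; the data shapes and all names here are illustrative assumptions, not part of the patent.

```python
# Minimal sketch of the claimed control flow; the set-based area
# representation and the helper names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ScanResult:
    reference_area: set   # body-surface cells that should be scanned
    covered_area: set     # cells actually swept by the probe

def missed_cells(result: ScanResult) -> set:
    """Cells of the reference scanning area the probe never covered."""
    return result.reference_area - result.covered_area

# Toy example: a 2x2 reference grid of which one cell was missed.
result = ScanResult(reference_area={(0, 0), (0, 1), (1, 0), (1, 1)},
                    covered_area={(0, 0), (0, 1), (1, 0)})
print(missed_cells(result))  # {(1, 1)} -> warn the operator
```

An empty result set would correspond to the "no missed detection" criterion described below.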
Further, when the detection part is located inside the body of the detection object, the method comprises:
determining a reference distribution region of the detection part inside the body of the detection object;
projecting the reference distribution region onto the body surface of the detection object along projection directions within a set angle range to form projection regions;
and determining the projection region with the largest area as the reference scanning area of the detection part on the body surface of the detection object.
Further, the step of determining a reference distribution region of the detection part inside the body of the detection object comprises:
determining the distribution region of the detection part inside the body through CT scanning or MRI scanning; or
determining the distribution region of the detection part inside the body according to a preset reference model.
Further, when the detection part is located on the body surface of the detection object, the method comprises:
acquiring, through a camera, a video image at least comprising the detection part;
and identifying, from the video image through an edge detection algorithm, the reference scanning area of the detection part on the body surface of the detection object.
Further, the step of acquiring the posture information and scanning path information of the ultrasonic probe as it scans the detection part comprises:
analyzing a video image, acquired by a camera and at least comprising the ultrasonic probe and the detection object, to obtain the posture information and scanning path information of the ultrasonic probe; or
obtaining the posture information and scanning path information of the ultrasonic probe by receiving the sensing signal of a magnetic sensor arranged on the ultrasonic probe and applying a magnetic-sensor positioning method; or
calculating the posture information and scanning path information of the ultrasonic probe from the output signal of an inertial measurement unit arranged on the ultrasonic probe.
Further, fusing and displaying the reference scanning area and the scanning coverage area to judge, after the ultrasonic probe finishes scanning the detection part, whether any region was missed comprises:
displaying the scanning coverage area on the reference scanning area according to the posture information and scanning path information of the ultrasonic probe;
and, if the reference scanning area is completely covered by the scanning coverage area, providing the criterion that no detection was missed.
Or, further, fusing and displaying the reference scanning area and the scanning coverage area to judge, after the ultrasonic probe finishes scanning the detection part, whether any region was missed comprises:
displaying the scanning coverage area on the reference scanning area according to the posture information and scanning path information of the ultrasonic probe;
if the reference scanning area is completely covered by the scanning coverage area, reconstructing the plurality of ultrasonic images obtained by the ultrasonic probe's scanning to generate a panoramic ultrasonic image corresponding to the detection part;
matching the panoramic ultrasonic image against a pre-stored model of the detection part;
if the matching fails, extending the reference scanning area by a preset distance in the direction of the failed region to generate an updated reference scanning area, and repeating until the matching succeeds;
and, once the matching succeeds, providing the criterion that no detection was missed.
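The extend-until-match loop of this variant can be sketched as follows. The interval representation of the reference area, the round limit, and the stand-in matching and extension functions are assumptions for illustration only.

```python
def verify_and_extend(reference_area, scan_and_match, extend, max_rounds=10):
    """Repeat: scan, build the panoramic image, match it against the stored
    model; on failure, extend the reference area toward the failing side."""
    for _ in range(max_rounds):
        matched, fail_direction = scan_and_match(reference_area)
        if matched:
            return reference_area, True   # criterion: no missed detection
        reference_area = extend(reference_area, fail_direction)
    return reference_area, False

# Toy stand-ins: the area is an interval (lo, hi); "matching" the model
# requires hi >= 10; each failed round extends hi by a preset distance 3.
def toy_match(area):
    return (area[1] >= 10, "right")

def toy_extend(area, direction):
    return (area[0], area[1] + 3)

area, ok = verify_and_extend((0, 5), toy_match, toy_extend)
print(area, ok)  # (0, 11) True
```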
The present invention also provides an ultrasound device comprising:
a memory storing a computer program;
a processor for executing the computer program to carry out the steps of any one of the ultrasound fusion imaging methods described above.
The present invention also provides a computer storage medium in which a computer program is stored; when executed by a processor, the computer program implements the steps of any one of the ultrasound fusion imaging methods described above.
The invention has the advantages that:
the ultrasonic fusion imaging method can perform fusion display on the reference scanning area and the scanning coverage area generated by scanning the ultrasonic probe when the detection part cannot be displayed in a single-frame ultrasonic image, and further judge whether the detection is missed or not after the ultrasonic probe finishes scanning the detection part.
Furthermore, the invention can clearly and intuitively compare the reference scanning area and the scanning coverage area through the differential display to prompt whether the missed inspection exists.
Furthermore, the invention can reconstruct a plurality of ultrasonic images scanned by the ultrasonic probe to generate a panoramic ultrasonic image, and then detect whether missing detection occurs or not by the panoramic ultrasonic image and a prestored detection part model, thereby improving the accuracy.
Drawings
Fig. 1 is a flow chart of an ultrasonic fusion imaging method of the invention.
Fig. 2 is a flowchart of acquiring the reference scanning area when the detection part is located inside the body of the detection object according to the present invention.
Fig. 3 is a flowchart of acquiring the reference scanning area when the detection part is located on the body surface of the detection object according to the present invention.
Fig. 4 is a flowchart of one embodiment of the present invention for fusing the reference scanning area and the scanning coverage area and judging whether detection was missed.
Fig. 5 is a flowchart of another embodiment of the present invention for fusing the reference scanning area and the scanning coverage area and judging whether detection was missed.
Fig. 6 is a schematic structural view of an ultrasonic apparatus of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the following detailed description and the accompanying drawings, in which like elements in different embodiments share like reference numbers. In the following description, numerous details are set forth to provide a better understanding of the present application. However, those skilled in the art will readily recognize that some of these features may, in different instances, be omitted or replaced by other elements, materials, or methods. In some instances, operations related to the present application are not shown or described in detail in order to avoid obscuring the core of the application; detailed description of such operations is unnecessary, as they can be fully understood from the specification and the general knowledge in the art. Furthermore, the features, operations, or characteristics described in the specification may be combined in any suitable manner to form various embodiments, and the steps or actions in the method descriptions may be reordered in ways apparent to those skilled in the art. The various sequences in the specification and drawings therefore serve only to describe particular embodiments and do not imply a required order unless such an order is explicitly stated.
Fig. 1 is a flow chart of an ultrasonic fusion imaging method of the invention. As shown in FIG. 1, in one aspect the present invention provides an ultrasound fusion imaging method comprising:
Step S100: judging whether the detection part can be completely displayed in a single-frame ultrasonic image.
It is to be understood that the term "operator" as used herein refers to a medical professional: a doctor, nurse, medical technician, medical imaging specialist, and so on. Before scanning, the operator first selects the type of ultrasound probe, e.g. a linear-array, convex-array, or area-array probe. Different probe types can examine different parts, and after the operator selects the probe type, the part to be examined can be selected to load preset parameters. The operator can enter the relevant information about the part of the detection object to be examined through an input unit on the ultrasonic device, so that the device knows the detection part; the input unit may be a keyboard, trackball, mouse, touch pad, or the like, or a combination thereof, and may also be a voice recognition input unit, a gesture recognition input unit, and so on. Once the ultrasonic device has the detection-part information, it can look up a pre-established mapping table that records whether each part can be completely displayed, and thereby determine whether the detection part fits in a single-frame ultrasonic image.
For example, when examining the carotid artery, the carotid artery can be completely displayed in a single-frame ultrasound image; but when screening for heart or breast lesions, a single scan of the ultrasound probe at one position cannot display the complete heart or breast, and the operator must move the probe many times to cover the whole detection part. If the operator relies on clinical experience alone, missed detection easily occurs.
Step S200: if the detection part cannot be completely displayed in a single-frame ultrasonic image, acquiring a reference scanning area of the detection part on the body surface of the detection object.
The detection part may be located inside the body of the detection object, such as the heart, liver, or kidneys, or on its body surface, such as the breast. If the detection part is inside the body, a reference scanning area on the body surface must be acquired, so that the operator can scan that area with the ultrasonic probe and thereby obtain ultrasonic images of the internal detection part.
Step S300: acquiring posture information and scanning path information of the ultrasonic probe as it scans the detection part.
It should be understood that, on the one hand, the distribution area of the detection part is not directly visible, so an operator performing the examination may fail to scan all of it; on the other hand, after scanning, the operator cannot tell which portions of the detection part were actually covered. Both lead to missed detection. The invention therefore acquires the posture information and scanning path information of the probe in real time during the scan, so as to determine the area the probe has covered and prevent missed detection caused by manual error.
In one implementation, acquiring the posture information and scanning path information of the ultrasonic probe comprises one of the following:
analyzing a video image, acquired by a camera and at least comprising the ultrasonic probe and the detection object, to obtain the posture information and scanning path information of the probe; or
receiving the sensing signal of a magnetic sensor arranged on the ultrasonic probe and applying a magnetic-sensor positioning method. Here the sensor on the probe is a magnetic receiver, and a magnetic transmitter must also be provided: the transmitter establishes a world coordinate system by emitting a low-frequency magnetic field, and the receiver on the probe resolves the relative distance and angle into the probe's posture information and scanning path information; or
receiving and resolving the output signal of an inertial measurement unit arranged on the ultrasonic probe. An inertial measurement unit measures the three-axis attitude angles and acceleration of an object and typically comprises three single-axis gyroscopes and three single-axis accelerometers; from its output signal, the posture information and scanning path information of the probe can be calculated.
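As a toy illustration of resolving pose and path from inertial output, the sketch below integrates a single yaw-rate gyro and a body-frame speed into a 2-D track. Real IMU fusion (three gyro axes, three accelerometers, drift correction) is far more involved, and the sample values here are invented.

```python
import math

def integrate_imu(samples, dt):
    """Dead-reckon a 2-D probe track from (yaw_rate [rad/s], speed [m/s])
    samples taken every dt seconds."""
    yaw, x, y = 0.0, 0.0, 0.0
    path = [(x, y)]
    for yaw_rate, speed in samples:
        yaw += yaw_rate * dt                 # integrate angular rate
        x += speed * math.cos(yaw) * dt      # advance along current heading
        y += speed * math.sin(yaw) * dt
        path.append((x, y))
    return yaw, path

# Constant 90 deg/s turn for 1 s at zero speed: final yaw is about pi/2.
yaw, path = integrate_imu([(math.pi / 2, 0.0)] * 100, dt=0.01)
print(round(yaw, 6))
```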
Step S400: generating a scanning coverage area of the ultrasonic probe according to the posture information and the scanning path information.
The scanning coverage area is generated from the acquired posture and path information of the ultrasonic probe. It should be understood that the ultrasonic device pre-stores the scanning cross-section and its size for each probe type, i.e. the cross-section with which the probe contacts the detection part; a linear-array probe, for example, has a rectangular scanning cross-section. The scanning coverage area, i.e. the area actually swept by the probe, is generated from the scanning cross-section, its size, and the probe's posture and path information.
Step S500: fusing and displaying the reference scanning area and the scanning coverage area, and judging, after the ultrasonic probe finishes scanning the detection part, whether any region was missed.
To dynamically show the operator the area already scanned and to indicate possible missed detection through a clear contrast, the invention fuses the reference scanning area and the scanning coverage area into one display, overlaying them in visually distinct styles, so that after the probe finishes scanning the detection part it can be judged whether any region was missed.
The ultrasonic fusion imaging method can perform fusion display on the reference scanning area and the scanning coverage area generated by scanning the ultrasonic probe when the detection part cannot be displayed in a single-frame ultrasonic image, and further judge whether the detection is missed or not after the ultrasonic probe finishes scanning the detection part.
In another embodiment, as shown in Fig. 2, if the detection part is located inside the body of the detection object, the reference scanning area on the body surface is obtained by the following steps.
Step S210: determining a reference distribution region of the detection part inside the body of the detection object.
The distribution region of the detection part inside the body is determined through CT scanning or MRI scanning, or according to a preset reference model. For example, ultrasound diagnosis is often combined with CT or MRI for comprehensive diagnosis, in which case the reference distribution region can be determined from the CT or MRI image. Alternatively, a reference model can be established in advance and queried. For example, for human subjects, reference models indexed by factors such as sex, age, weight, and height are built beforehand, each recording the reference distribution region of the corresponding detection part; the region for a given detection object can then be retrieved from the object's height, weight, and so on.
Step S220: projecting the reference distribution region of the detection part onto the body surface of the detection object along projection directions within a set angle range, forming projection regions.
It is to be understood that the operator holds the ultrasonic probe against the body surface of the detection object to transmit and receive ultrasonic waves: excited by a transmission pulse, the probe transmits ultrasonic waves into the detection object (organs, tissues, blood vessels, etc. of a human or animal body), receives, after a delay, the ultrasonic echo carrying information about the target tissue reflected from the target region, and converts the echo back into an electric signal to obtain an ultrasonic image. The reference scanning area must therefore be established on the body surface of the detection object; scanning that area then yields ultrasonic images of the detection part. The invention projects the internal reference distribution region onto the body surface along projection directions within a set angle range to form projection regions; projection directions in different angle ranges produce different regions. Preferably, parallel projection is used, and the set angle range may be a deflection of 5 degrees about the normal direction from the ultrasonic probe to the detection part.
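This projection step can be illustrated as follows: parallel-project the 3-D cells of the internal region onto the body-surface plane along each candidate direction (tilts of -5, 0, and +5 degrees from the normal, echoing the 5-degree deflection mentioned above) and keep the largest projection. The flat z = 0 surface and the voxel model are simplifying assumptions.

```python
import math

def project_area(points, angle_deg):
    """Parallel-project 3-D cells onto the z = 0 plane along a direction
    tilted angle_deg from the surface normal; return the projected cells."""
    t = math.tan(math.radians(angle_deg))
    return {(round(x + t * z), round(y)) for x, y, z in points}

def best_reference_area(points, angles=(-5, 0, 5)):
    """Keep the candidate projection with the largest area."""
    return max((project_area(points, a) for a in angles), key=len)

# A deep, two-cell-wide 'organ': tilted projections spread it out more.
organ = [(x, 0, z) for z in range(0, 41, 10) for x in (0, 1)]
print(len(best_reference_area(organ)))
```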
Step S230: determining the projection region with the largest area as the reference scanning area of the detection part on the body surface of the detection object.
It should be understood that different detection parts project onto different positions of the body surface; for example, when the ultrasonic probe scans the kidneys, the corresponding reference projection region lies on the side of the body, where the body surface has a certain curvature.
In another embodiment, as shown in Fig. 3, when the detection part is located on the body surface of the detection object, the reference scanning area is obtained by:
Step S240: acquiring, through a camera, a video image at least comprising the detection part;
Step S250: identifying, from the video image through an edge detection algorithm, the reference scanning area of the detection part on the body surface of the detection object.
In some embodiments, if the detection part is located on the body surface of the detection object, a video image at least including the detection part may be acquired by a camera and processed by image recognition, or the detection part in the video image may be labeled manually, to obtain the reference scanning area of the detection part on the body surface. The edge detection algorithm specifically comprises the following steps: first, smoothing the video image with a Gaussian filter to remove noise;
calculating the gradient strength and direction of each pixel in the video image;
applying non-maximum suppression to eliminate spurious responses introduced by edge detection, specifically: comparing the gradient strength of the current pixel with that of the two pixels along the positive and negative gradient directions; if the gradient strength of the current pixel is the largest of the three, the pixel is retained as an edge point; otherwise it is suppressed;
determining real and potential edges by double-threshold detection;
after non-maximum suppression, the remaining pixels represent the actual edges in the video image more accurately. However, some edge pixels caused by noise and color variation remain. To remove these spurious responses, edge pixels with weak gradient values must be filtered out while edge pixels with high gradient values are retained, which is achieved by selecting a high and a low threshold. If the gradient value of an edge pixel is above the high threshold, it is marked as a strong edge pixel; if it is between the low and high thresholds, it is marked as a weak edge pixel; if it is below the low threshold, it is suppressed. The choice of thresholds depends on the content of the given input image.
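The double-threshold step can be sketched directly (the surrounding Canny stages, i.e. Gaussian smoothing, gradient computation, non-maximum suppression, and hysteresis linking of weak to strong edges, are omitted here, and the gradient values are invented):

```python
import numpy as np

def classify_edges(grad, low, high):
    """Label gradient magnitudes: 2 = strong edge (>= high threshold),
    1 = weak edge (between thresholds), 0 = suppressed (< low threshold)."""
    labels = np.zeros(grad.shape, dtype=np.uint8)
    labels[grad >= high] = 2
    labels[(grad >= low) & (grad < high)] = 1
    return labels

grad = np.array([[0.10, 0.40, 0.90],
                 [0.05, 0.50, 0.20]])
print(classify_edges(grad, low=0.3, high=0.8))
# [[0 1 2]
#  [0 1 0]]
```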
In order to facilitate the operator to dynamically display the area which has been scanned by the ultrasonic wave, and prompt whether the ultrasonic probe has missed detection or not in a clear contrast mode. In an embodiment, as shown in fig. 4, the present invention integrates and displays a reference scanning area and a scanning coverage area, and is used to determine whether there is a missing inspection after an ultrasonic probe completes scanning a detection site, specifically including:
step S510, respectively displaying scanning coverage areas on the reference scanning areas according to the posture information and the scanning path information of the ultrasonic probe;
if the ultrasonic device is only provided with one display unit, the obtained reference scanning area is displayed in one display area on a display of the ultrasonic device; for example, when the display of the ultrasound device displays an ultrasound image, one display area at the lower left corner is used for displaying the acquired reference scanning area;
tracking the posture and the scanning path of the ultrasonic probe in real time, and fusing and displaying the scanning coverage area of the ultrasonic probe in the corresponding reference scanning area in real time, for example, differently displaying the scanning coverage area in the reference scanning area by coloring or highlighting; after the ultrasonic probe executes the ultrasonic scanning, an operator observes whether all reference scanning areas are colored or highlighted;
in step S520, if the reference scanning area is completely covered by the scanning coverage area, it is determined that no missed detection exists.
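The completeness check in step S520 amounts to a subset test between two regions. A minimal sketch, assuming both areas are represented as boolean pixel masks on the fused display grid (the function and argument names are illustrative):

```python
import numpy as np

def no_missed_detection(reference_mask, coverage_mask):
    """Return True when the reference scanning area is completely
    covered by the scanning coverage area.

    Both arguments are boolean masks over the same display grid.
    """
    reference_mask = np.asarray(reference_mask, dtype=bool)
    coverage_mask = np.asarray(coverage_mask, dtype=bool)
    # every reference pixel must also be a covered pixel
    return bool(np.all(coverage_mask[reference_mask]))
```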
In another embodiment, as shown in fig. 5, the present invention fuses and displays the reference scanning area and the scanning coverage area to determine whether there is a missed detection after the ultrasonic probe completes scanning of the detection site, specifically including:
s530, displaying the scanning coverage area on the reference scanning area according to the posture information and the scanning path information of the ultrasonic probe;
if the ultrasonic device is only provided with one display unit, the obtained reference scanning area is displayed in one display area on a display of the ultrasonic device; for example, when the display of the ultrasound device displays an ultrasound image, a display area at the lower left corner is used for displaying the acquired reference scanning area; tracking the posture and the scanning path of the ultrasonic probe in real time, and fusing and displaying the scanning coverage area of the ultrasonic probe within the corresponding reference scanning area in real time, for example, distinguishing the scanning coverage area within the reference scanning area by coloring or highlighting; after the ultrasonic probe completes the ultrasonic scanning, the operator observes whether the entire reference scanning area has been colored or highlighted;
s540, if the reference scanning area is completely covered by the scanning coverage area, reconstructing the plurality of ultrasound images obtained by the scanning of the ultrasonic probe to generate a panoramic ultrasound image corresponding to the detection site;
after the ultrasonic probe finishes scanning the detection site, the plurality of ultrasound images obtained by the scanning are reconstructed, and the reconstructed panoramic ultrasound image can display the entire detection site in a single frame.
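A naive version of this panoramic reconstruction can be sketched as follows, assuming the per-frame offsets have already been recovered from the probe's scanning path (the function names and the simple averaging of overlaps are illustrative; a real reconstruction would also register and blend the frames):

```python
import numpy as np

def stitch_frames(frames, offsets, height, width):
    """Paste each ultrasound frame into a large canvas at the offset
    recovered from the probe's scanning path, averaging overlaps."""
    canvas = np.zeros((height, width), dtype=float)
    counts = np.zeros((height, width), dtype=float)
    for frame, (y, x) in zip(frames, offsets):
        h, w = frame.shape
        canvas[y:y+h, x:x+w] += frame
        counts[y:y+h, x:x+w] += 1
    counts[counts == 0] = 1  # avoid division by zero outside coverage
    return canvas / counts
```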
S550, matching the panoramic ultrasonic image with a pre-stored detection part model;
it should be understood that if an error occurs in the acquired reference scanning area, the ultrasonic probe may ultimately miss a region. To improve the accuracy of the reference scanning area, after the reference scanning area is completely covered by the scanning coverage area, the panoramic ultrasound image is matched with the pre-stored detection part model to determine whether the panoramic ultrasound image of the acquired detection site deviates from the detection part model.
S5510, identifying the category of the panoramic ultrasound image through a trained classification neural network model, the classification neural network model being obtained by training a convolutional neural network on a plurality of labeled ultrasound images;
the classification neural network model comprises convolutional layers, max-pooling layers, activation function layers, batch normalization layers, and the like, so as to classify the detection site shown in the panoramic ultrasound image, such as the liver, the kidney, and so on;
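The layer types named above (convolution, max pooling, activation) can be sketched in NumPy as below; batch normalization is omitted, and a real classification network would be built with a deep-learning framework rather than these illustrative helpers:

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D cross-correlation: one building block of the
    classification network described above."""
    kh, kw = kernel.shape
    h = img.shape[0] - kh + 1
    w = img.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(img[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling over size x size blocks."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h*size, :w*size].reshape(h, size, w, size).max(axis=(1, 3))

def relu(x):
    """ReLU activation (the model text mentions a generic activation layer)."""
    return np.maximum(x, 0)
```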
step 5520, retrieving a corresponding detection site model from a pre-stored model library according to the category of the panoramic ultrasound image;
the model library pre-stored in the ultrasonic device stores detection part models for different detection sites; after the panoramic ultrasound image is automatically identified by the classification neural network model, the corresponding detection part model can be automatically retrieved from the model library.
Step 5530, matching the panoramic ultrasound image with the detection part model through the trained matching neural network model.
In an embodiment, the matching neural network model includes an input layer, a plurality of convolutional layers, and an activation function layer, where the activation function layer uses the tanh function as the activation function. The matching neural network model evaluates the overall similarity between the panoramic ultrasound image and the appearance feature data of the detection part model; if the matching degree is greater than a preset matching degree threshold, the matching succeeds, otherwise the matching fails.
In another embodiment, the matching neural network model of the invention is a twin (Siamese) neural network model, and the similarity between the panoramic ultrasound image and the detection part model is calculated through the twin neural network model. The twin neural network model has two input layers, through which the detection part model and the panoramic ultrasound image are input into a first neural network and a second neural network respectively; the two networks map their inputs into a common feature space, and the similarity between the panoramic ultrasound image and the detection part model is calculated through a loss function.
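A minimal sketch of the twin-network similarity computation, using a single shared tanh layer in place of each convolutional branch (the embedding, the weight matrix `W`, and the 1/(1+distance) similarity mapping are illustrative assumptions, not details from the text):

```python
import numpy as np

def embed(x, W):
    """Shared-weight embedding used by both branches of the twin
    network; a single linear layer with tanh stands in for the
    convolutional branches described above."""
    return np.tanh(W @ x)

def twin_similarity(image_vec, model_vec, W):
    """Map both inputs into the same feature space with shared
    weights and convert their Euclidean distance into a similarity
    score in (0, 1]; 1.0 means identical embeddings."""
    a = embed(image_vec, W)
    b = embed(model_vec, W)
    d = np.linalg.norm(a - b)
    return 1.0 / (1.0 + d)
```

The score could then be compared against the preset matching-degree threshold mentioned above.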
S560, if the matching fails, extending the reference scanning area by a preset distance along the direction of the region that failed to match, so as to generate an updated reference scanning area, until the matching succeeds;
s570, if the matching succeeds, it is determined that no missed detection exists.
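Step S560's extension of the reference scanning area can be sketched as a directional dilation of a boolean mask; the shift-and-OR implementation and all names here are illustrative assumptions:

```python
import numpy as np

def extend_reference_area(mask, direction, distance):
    """Grow a boolean reference-scanning-area mask by `distance`
    pixels toward `direction` ('up', 'down', 'left', 'right'),
    as step S560 extends the area toward the region that failed
    to match; growth past the image border is clipped."""
    shifts = {'up': (-1, 0), 'down': (1, 0), 'left': (0, -1), 'right': (0, 1)}
    dy, dx = shifts[direction]
    out = np.asarray(mask, dtype=bool).copy()
    step = out
    for _ in range(distance):
        step = np.roll(step, (dy, dx), axis=(0, 1))
        # clear wrap-around at the opposite border
        if dy == -1: step[-1, :] = False
        if dy == 1:  step[0, :] = False
        if dx == -1: step[:, -1] = False
        if dx == 1:  step[:, 0] = False
        out |= step
    return out
```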
As another embodiment of the present invention, a storage medium is provided, in which program instructions of a computer program are stored; the program instructions are loaded and executed by a processor to implement the steps of the ultrasound fusion imaging method described above.
As another embodiment of the present invention, there is provided an ultrasonic apparatus including:
a memory storing program instructions of a computer program;
a processor for loading and executing the program instructions in the memory to implement the steps of the ultrasound fusion imaging method described above;
The memory may include a volatile memory, such as a random-access memory (RAM); the memory may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory may also comprise a combination of the above types of memory.
The processor may be a Central Processing Unit (CPU), a Network Processor (NP), or a combination of a CPU and an NP. The processor may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a Programmable Logic Device (PLD), or a combination thereof. The PLD may be a Complex Programmable Logic Device (CPLD), a field-programmable gate array (FPGA), a General Array Logic (GAL), or any combination thereof.
In another embodiment the ultrasound device of the present invention further comprises:
the input unit is coupled with the processor and is at least used for inputting information related to the detection site of the detection object;
the camera is coupled with the processor and is used for acquiring video images of the detected object including the detected part and/or tracking and shooting the video images of the ultrasonic probe;
the ultrasonic probe posture and scanning path tracking acquisition unit is coupled with the processor and used for acquiring the posture information and the scanning path information of the ultrasonic probe; the ultrasonic probe posture and scanning path tracking acquisition unit can comprise a magnetic sensor or an inertial measurement unit which is arranged on the ultrasonic probe;
the display unit is coupled with the processor and is used for displaying the ultrasound image and for fusing and displaying the reference scanning area and the scanning coverage area; there may be a single display unit that both displays the ultrasound image and fuses and displays the reference scanning area and the scanning coverage area, or two independent display units, one displaying the ultrasound image and the other fusing and displaying the reference scanning area and the scanning coverage area.
When the detection site cannot be displayed in a single-frame ultrasound image, the ultrasonic device can fuse and display the reference scanning area and the scanning coverage area generated by the scanning of the ultrasonic probe, so as to determine whether a missed detection exists after the ultrasonic probe completes scanning of the detection site.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to examples, those skilled in the art should understand that modifications or equivalent substitutions may be made to these technical solutions without departing from their spirit and scope, and such modifications shall be covered by the claims of the present invention.

Claims (10)

1. An ultrasound fusion imaging method, comprising:
judging whether the detection part can be completely displayed in the single-frame ultrasonic image or not;
if the detection part can not be completely displayed in the single-frame ultrasonic image, acquiring a reference scanning area of the detection part on the body surface of the detection object;
acquiring attitude information and scanning path information of a scanning detection part of an ultrasonic probe;
generating a scanning coverage area of the ultrasonic probe according to the attitude information and the scanning path information;
and fusing and displaying the reference scanning area and the scanning coverage area, and judging whether the ultrasonic probe has missing detection after the ultrasonic probe finishes scanning the detection part.
2. The ultrasonic fusion imaging method of claim 1 wherein, if the examination site is located within an examination subject,
determining a reference distribution area of a detection part in a detection object body;
projecting a reference distribution region of the detection part in the detection object body to the body surface of the detection object along a projection direction in a set angle range to form a projection region;
and determining the projection area with the largest area as a reference scanning area of the detection part on the body surface of the detection object.
3. The ultrasound fusion imaging method of claim 2 wherein the step of determining a reference distribution area of the examination site within the examination subject comprises:
determining the distribution area of the detection part in the detection object body through CT scanning or MRI scanning; or
And determining the distribution area of the detection part in the detection object body according to a preset reference model.
4. The ultrasonic fusion imaging method of claim 1 wherein, when the examination region is located on a body surface of an examination object,
acquiring a video image at least comprising the detection part through a camera;
and identifying a reference scanning area of the detection part at the body surface of the detection object from the video image through an edge detection algorithm.
5. The ultrasound fusion imaging method of claim 1 wherein the step of acquiring pose information and scan path information of an ultrasound probe scanning a detection site comprises:
analyzing and obtaining the attitude information and the scanning path information of the ultrasonic probe from a video image which is acquired by a camera and at least comprises the ultrasonic probe and a detection object; or
Acquiring attitude information and scanning path information of the ultrasonic probe by receiving a sensing signal of a magnetic sensor arranged on the ultrasonic probe and by a magnetic sensor positioning method; or
And calculating to obtain the attitude information and the scanning path information of the ultrasonic probe by receiving the output signal of the inertia measurement unit arranged on the ultrasonic probe.
6. The ultrasound fusion imaging method of claim 1, wherein the fusion displaying the reference scanned region and the scanned coverage region for determining whether there is a missing inspection after the ultrasound probe completes the scanning of the inspection site comprises:
the scanning coverage area is displayed on the reference scanning area according to the posture information and the scanning path information of the ultrasonic probe;
if the reference scanned area is completely covered by the scanned coverage area, a criterion of no missing detection is provided.
7. The ultrasound fusion imaging method of claim 1, wherein the fusion displaying the reference scanned region and the scanned coverage region for determining whether there is a missing inspection after the ultrasound probe completes the scanning of the inspection site comprises:
the scanning coverage area is displayed on the reference scanning area according to the posture information and the scanning path information of the ultrasonic probe;
if the reference scanning area is completely covered by the scanning coverage area, reconstructing a plurality of ultrasonic images obtained by scanning the ultrasonic probe to generate a wide-scene ultrasonic image corresponding to the detection part;
matching the panoramic ultrasonic image with a pre-stored detection part model;
if the matching fails, extending the reference scanning area along the direction of the matching failure area by a preset distance to generate an updated reference scanning area until the matching is successful;
and if the matching is successful, providing a criterion without missing detection.
8. The ultrasound fusion imaging method of claim 7 wherein said matching the panoramic ultrasound image with a pre-stored model of the examination site comprises:
identifying the category of the panoramic ultrasonic image through a trained classification neural network model, wherein the classification neural network model is used for training and determining a plurality of marked ultrasonic images through a convolution neural network;
calling a corresponding detection part model from a pre-stored model library according to the category of the panoramic ultrasonic image;
and matching the panoramic ultrasonic image and the detection part model through the trained matching neural network model.
9. An ultrasound device, comprising:
a memory storing a computer program;
a processor for executing the computer program to carry out the steps of the ultrasound fusion imaging method as claimed in any one of claims 1 to 8.
10. A computer storage medium, wherein
the computer storage medium has stored therein a computer program which, when executed by a processor, is adapted to carry out the steps of the ultrasound fusion imaging method as claimed in any one of claims 1 to 8.
CN201911009845.6A 2019-10-23 2019-10-23 Ultrasonic fusion imaging method, ultrasonic device, and storage medium Pending CN110584714A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911009845.6A CN110584714A (en) 2019-10-23 2019-10-23 Ultrasonic fusion imaging method, ultrasonic device, and storage medium


Publications (1)

Publication Number Publication Date
CN110584714A true CN110584714A (en) 2019-12-20

Family

ID=68851498

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911009845.6A Pending CN110584714A (en) 2019-10-23 2019-10-23 Ultrasonic fusion imaging method, ultrasonic device, and storage medium

Country Status (1)

Country Link
CN (1) CN110584714A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111657997A (en) * 2020-06-23 2020-09-15 无锡祥生医疗科技股份有限公司 Ultrasonic auxiliary guiding method, device and storage medium
CN111816281A (en) * 2020-06-23 2020-10-23 无锡祥生医疗科技股份有限公司 Ultrasonic image inquiry unit
CN111950388A (en) * 2020-07-22 2020-11-17 上海市同仁医院 Vulnerable plaque tracking and identifying system and method
CN112155596A (en) * 2020-10-10 2021-01-01 达闼机器人有限公司 Ultrasonic diagnostic apparatus, method of generating ultrasonic image, and storage medium
CN112215843A (en) * 2019-12-31 2021-01-12 无锡祥生医疗科技股份有限公司 Ultrasonic intelligent imaging navigation method and device, ultrasonic equipment and storage medium
CN112336381A (en) * 2020-11-07 2021-02-09 吉林大学 Echocardiogram end systole/diastole frame automatic identification method based on deep learning
CN112807025A (en) * 2021-02-08 2021-05-18 威朋(苏州)医疗器械有限公司 Ultrasonic scanning guiding method, device, system, computer equipment and storage medium
CN113116377A (en) * 2019-12-31 2021-07-16 无锡祥生医疗科技股份有限公司 Ultrasonic imaging navigation method, ultrasonic device and storage medium
CN113116386A (en) * 2019-12-31 2021-07-16 无锡祥生医疗科技股份有限公司 Ultrasound imaging guidance method, ultrasound apparatus, and storage medium
CN113616235A (en) * 2020-05-07 2021-11-09 中移(成都)信息通信科技有限公司 Ultrasonic detection method, device, system, equipment, storage medium and ultrasonic probe
CN113786215A (en) * 2021-09-09 2021-12-14 江苏霆升科技有限公司 Electrocardiogram mapping method based on electromechanical ultrasonic imaging
WO2022073410A1 (en) * 2020-10-10 2022-04-14 达闼机器人有限公司 Ultrasonic diagnostic device, ultrasonic probe, image generation method and storage medium
CN114842239A (en) * 2022-04-02 2022-08-02 北京医准智能科技有限公司 Breast lesion attribute prediction method and device based on ultrasonic video
CN117094976A (en) * 2023-08-23 2023-11-21 脉得智能科技(无锡)有限公司 Focus missing detection judging method, device and electronic equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101002681A (en) * 2006-01-19 2007-07-25 株式会社东芝 Ultrasonic probe track display device and method, ultrasonic wave diagnosis device and method
CN105007825A (en) * 2013-02-20 2015-10-28 株式会社东芝 Ultrasonic diagnostic device and medical image processing device
CN108230300A (en) * 2016-12-09 2018-06-29 三星电子株式会社 For handling the device and method of ultrasonoscopy
CN108634985A (en) * 2018-05-08 2018-10-12 广州尚医网信息技术有限公司 Ultrasonic-B probe non-blind area checking method and system
CN109152566A (en) * 2016-05-23 2019-01-04 皇家飞利浦有限公司 Correct deformation caused by the probe in ultrasonic fusion of imaging system
WO2019069898A1 (en) * 2017-10-02 2019-04-11 株式会社Lily MedTech Medical imaging apparatus
CN110087550A (en) * 2017-04-28 2019-08-02 深圳迈瑞生物医疗电子股份有限公司 A kind of ultrasound pattern display method, equipment and storage medium


Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113116386A (en) * 2019-12-31 2021-07-16 无锡祥生医疗科技股份有限公司 Ultrasound imaging guidance method, ultrasound apparatus, and storage medium
CN112215843B (en) * 2019-12-31 2021-06-11 无锡祥生医疗科技股份有限公司 Ultrasonic intelligent imaging navigation method and device, ultrasonic equipment and storage medium
CN113116377A (en) * 2019-12-31 2021-07-16 无锡祥生医疗科技股份有限公司 Ultrasonic imaging navigation method, ultrasonic device and storage medium
CN112288742B (en) * 2019-12-31 2021-11-19 无锡祥生医疗科技股份有限公司 Navigation method and device for ultrasonic probe, storage medium and electronic equipment
CN112215843A (en) * 2019-12-31 2021-01-12 无锡祥生医疗科技股份有限公司 Ultrasonic intelligent imaging navigation method and device, ultrasonic equipment and storage medium
CN112288742A (en) * 2019-12-31 2021-01-29 无锡祥生医疗科技股份有限公司 Navigation method and device for ultrasonic probe, storage medium and electronic equipment
CN113616235A (en) * 2020-05-07 2021-11-09 中移(成都)信息通信科技有限公司 Ultrasonic detection method, device, system, equipment, storage medium and ultrasonic probe
CN113616235B (en) * 2020-05-07 2024-01-19 中移(成都)信息通信科技有限公司 Ultrasonic detection method, device, system, equipment, storage medium and ultrasonic probe
CN111816281A (en) * 2020-06-23 2020-10-23 无锡祥生医疗科技股份有限公司 Ultrasonic image inquiry unit
CN111816281B (en) * 2020-06-23 2024-05-14 无锡祥生医疗科技股份有限公司 Ultrasonic image inquiry device
CN111657997A (en) * 2020-06-23 2020-09-15 无锡祥生医疗科技股份有限公司 Ultrasonic auxiliary guiding method, device and storage medium
CN111950388A (en) * 2020-07-22 2020-11-17 上海市同仁医院 Vulnerable plaque tracking and identifying system and method
CN111950388B (en) * 2020-07-22 2024-04-05 上海市同仁医院 Vulnerable plaque tracking and identifying system and method
CN112155596A (en) * 2020-10-10 2021-01-01 达闼机器人有限公司 Ultrasonic diagnostic apparatus, method of generating ultrasonic image, and storage medium
WO2022073410A1 (en) * 2020-10-10 2022-04-14 达闼机器人有限公司 Ultrasonic diagnostic device, ultrasonic probe, image generation method and storage medium
WO2022073413A1 (en) * 2020-10-10 2022-04-14 达闼机器人有限公司 Ultrasonic diagnostic device, ultrasonic image generation method and storage medium
CN112336381A (en) * 2020-11-07 2021-02-09 吉林大学 Echocardiogram end systole/diastole frame automatic identification method based on deep learning
CN112807025A (en) * 2021-02-08 2021-05-18 威朋(苏州)医疗器械有限公司 Ultrasonic scanning guiding method, device, system, computer equipment and storage medium
CN113786215A (en) * 2021-09-09 2021-12-14 江苏霆升科技有限公司 Electrocardiogram mapping method based on electromechanical ultrasonic imaging
CN114842239B (en) * 2022-04-02 2022-12-23 北京医准智能科技有限公司 Breast lesion attribute prediction method and device based on ultrasonic video
CN114842239A (en) * 2022-04-02 2022-08-02 北京医准智能科技有限公司 Breast lesion attribute prediction method and device based on ultrasonic video
CN117094976A (en) * 2023-08-23 2023-11-21 脉得智能科技(无锡)有限公司 Focus missing detection judging method, device and electronic equipment
CN117094976B (en) * 2023-08-23 2024-03-01 脉得智能科技(无锡)有限公司 Focus missing detection judging method, device and electronic equipment

Similar Documents

Publication Publication Date Title
CN110584714A (en) Ultrasonic fusion imaging method, ultrasonic device, and storage medium
JP7407790B2 (en) Ultrasound system with artificial neural network for guided liver imaging
KR101922180B1 (en) Ultrasonic image processing apparatus and method for processing of ultrasonic image
JP6467041B2 (en) Ultrasonic diagnostic apparatus and image processing method
CN102056547B (en) Medical image processing device and method for processing medical image
KR101565311B1 (en) 3 automated detection of planes from three-dimensional echocardiographic data
JP6598508B2 (en) Ultrasonic diagnostic device and its program
US6381350B1 (en) Intravascular ultrasonic analysis using active contour method and system
TWI473598B (en) Breast ultrasound image scanning and diagnostic assistance system
CN111629670B (en) Echo window artifact classification and visual indicator for ultrasound systems
JP6490809B2 (en) Ultrasonic diagnostic apparatus and image processing method
JP2022031825A (en) Image-based diagnostic systems
US20140334706A1 (en) Ultrasound diagnostic apparatus and contour extraction method
JP2020511250A (en) Volume rendered ultrasound image
EP4017371A1 (en) Ultrasound guidance dynamic mode switching
CN106687048A (en) Medical imaging apparatus
KR20190085342A (en) Method for controlling ultrasound imaging apparatus and ultrasound imaging aparatus thereof
CN113129342A (en) Multi-modal fusion imaging method, device and storage medium
US20200305837A1 (en) System and method for guided ultrasound imaging
US20200170624A1 (en) Diagnostic apparatus and diagnostic method
KR20210093049A (en) Ultrasonic diagnostic apparatus and operating method for the same
CN113662579A (en) Ultrasonic diagnostic apparatus, medical image processing apparatus and method, and storage medium
CN113116384A (en) Ultrasonic scanning guidance method, ultrasonic device and storage medium
CN114391878B (en) Ultrasonic imaging equipment
EP4311499A1 (en) Ultrasound image acquisition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20191220