CN112842394A - Ultrasonic imaging system, ultrasonic imaging method and storage medium - Google Patents


Info

Publication number
CN112842394A
CN112842394A (application number CN202011345372.XA)
Authority
CN
China
Prior art keywords
detection information
thyroid
confidence
image
calculating
Prior art date
Legal status
Pending
Application number
CN202011345372.XA
Other languages
Chinese (zh)
Inventor
安兴
乔佳新
张凯伦
陈志新
张晟
Current Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd filed Critical Shenzhen Mindray Bio Medical Electronics Co Ltd
Publication of CN112842394A


Classifications

    • A HUMAN NECESSITIES; A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/0825 Detecting organic movements or changes, e.g. tumours, cysts, swellings, for diagnosis of the breast, e.g. mammography
    • A61B 8/085 Detecting organic movements or changes, involving detecting or locating foreign bodies or organic structures, for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B 8/5215 Devices using data or image processing specially adapted for diagnosis, involving processing of medical diagnostic data

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Vascular Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

An ultrasound imaging system, an ultrasound imaging method, and a computer storage medium are disclosed. In the system, a probe transmits ultrasonic waves to a thyroid or breast to be examined and receives the ultrasonic echoes to obtain ultrasonic echo signals; a processor processes the ultrasonic echo signals to obtain an ultrasound image of the thyroid or breast, obtains detection information of the thyroid or breast from the ultrasound image, the detection information of the thyroid comprising TI-RADS detection information and that of the breast comprising BI-RADS detection information, and calculates a confidence level for the detection information; and a display displays the detection information together with its confidence level. Presenting the confidence level helps the physician judge how trustworthy the machine-derived detection information is, avoiding misdiagnosis caused by blindly trusting it.

Description

Ultrasonic imaging system, ultrasonic imaging method and storage medium
Technical Field
The embodiments of the present application relate to the field of ultrasound imaging, and in particular to an ultrasound imaging system, an ultrasound imaging method, and a storage medium.
Background
Technological progress has driven the development of intelligent medical diagnosis, which in turn has greatly improved physicians' working efficiency and diagnostic rate. However, intelligent diagnosis products still leave room for improvement: their results are not one hundred percent correct, yet existing products and schemes do not convey how reliable a given result is. This is particularly problematic in breast and thyroid ultrasound, where clinical presentations are numerous and varied, and inexperienced or junior physicians may blindly trust the machine's analysis, compromising the correctness of the diagnosis.
Disclosure of Invention
An ultrasound imaging system, an ultrasound imaging method, and a computer storage medium are provided.
In a first aspect, the present application provides an ultrasound imaging system comprising:
a probe that transmits ultrasonic waves to a thyroid or breast to be examined and receives the ultrasonic echoes to obtain ultrasonic echo signals;
a processor that processes the ultrasonic echo signals to obtain an ultrasound image of the thyroid or breast to be examined, obtains detection information of the thyroid or breast from the ultrasound image, the detection information of the thyroid comprising TI-RADS detection information and the detection information of the breast comprising BI-RADS detection information, and calculates a confidence level for the detection information; and
a display that displays the detection information and the confidence level of the detection information.
In a second aspect, the present application provides an ultrasound imaging system comprising:
a processor that acquires an ultrasound image of a thyroid or breast to be examined, obtains detection information of the thyroid or breast from the ultrasound image, the detection information comprising TI-RADS detection information of the thyroid or BI-RADS detection information of the breast, and calculates a confidence level for the detection information; and
a display that displays the detection information and the confidence level of the detection information.
In a third aspect, the present application provides an ultrasound imaging method comprising:
transmitting ultrasonic waves to a thyroid or breast to be examined and receiving the ultrasonic echoes to obtain ultrasonic echo signals;
processing the ultrasonic echo signals to obtain an ultrasound image of the thyroid or breast to be examined;
obtaining detection information of the thyroid or breast from the ultrasound image, the detection information comprising TI-RADS detection information of the thyroid or BI-RADS detection information of the breast;
calculating a confidence level for the detection information; and
displaying the detection information and the confidence level of the detection information.
In a fourth aspect, the present application provides a method of ultrasound imaging comprising:
acquiring an ultrasound image of a thyroid or breast to be examined;
obtaining detection information of the thyroid or breast from the ultrasound image, the detection information of the thyroid comprising TI-RADS detection information and the detection information of the breast comprising BI-RADS detection information;
calculating a confidence level for the detection information; and
displaying the detection information and the confidence level of the detection information.
In a fifth aspect, the present application provides an ultrasound imaging system comprising:
a processor that acquires an ultrasound image of a tissue to be examined, obtains detection information from the ultrasound image of the tissue, and calculates a confidence level for the detection information; and
a display that displays the detection information and the confidence level of the detection information.
In a sixth aspect, the present application provides a computer storage medium storing a computer program for an ultrasound imaging apparatus; when executed by a processor, the computer program implements the method of the third or fourth aspect.
In the embodiments of the present application, detection information is obtained from an ultrasound image of the tissue to be examined and a confidence level is calculated for that detection information. This helps the physician judge how trustworthy the machine-derived detection information is, avoiding misdiagnosis caused by blindly trusting it.
Drawings
FIG. 1 is a schematic block diagram of one embodiment of an ultrasound imaging system;
FIG. 2 is a schematic flow chart diagram of one embodiment of an ultrasound imaging system workflow;
FIG. 3 is a schematic flow chart diagram of one embodiment of an ultrasound imaging system workflow;
FIG. 4 is a schematic flow chart diagram of one embodiment of a partial workflow of an ultrasound imaging system;
FIG. 5 is a schematic flow chart diagram of one embodiment of a partial workflow of an ultrasound imaging system;
FIG. 6 is a schematic flow chart diagram of one embodiment of a partial workflow of an ultrasound imaging system;
FIG. 7 is a schematic flow chart diagram of one embodiment of a partial workflow of an ultrasound imaging system;
FIG. 8 is a schematic flow chart diagram of one embodiment of an ultrasound imaging system workflow;
FIG. 9 is a schematic flow chart diagram of one embodiment of a partial workflow of an ultrasound imaging system;
FIG. 10 is a schematic flow chart diagram of one embodiment of a partial workflow of an ultrasound imaging system;
FIG. 11 is a schematic flow chart diagram of one embodiment of an ultrasound imaging system workflow;
FIG. 12 is a schematic flow chart diagram of one embodiment of an ultrasound imaging system workflow;
FIG. 13 is a schematic block diagram of one embodiment of a display interface for an ultrasound imaging system workflow;
FIG. 14 is a schematic block diagram of one embodiment of a display interface for an ultrasound imaging system workflow;
FIG. 15 is a schematic flow chart diagram of one embodiment of an ultrasound imaging system workflow.
Detailed Description
Fig. 1 is a schematic structural block diagram of an ultrasound imaging system in an embodiment of the present application. The ultrasound imaging system 10 may include a probe 100, transmit circuitry 101, a transmit/receive selection switch 102, receive circuitry 103, beamforming circuitry 104, a processor 105, and a display 106. The transmit circuitry 101 excites the probe 100 to transmit ultrasonic waves to the tissue to be examined; the receive circuitry 103 receives, through the probe 100, the ultrasonic echoes returned from the tissue, thereby obtaining ultrasonic echo signals/data. The echo signals/data undergo beamforming in the beamforming circuitry 104 and are then sent to the processor 105, which processes them to obtain an ultrasound image of the tissue to be examined. The ultrasound images obtained by the processor 105 may be stored in the memory 107 and displayed on the display 106.
In an embodiment of the present application, the display 106 of the ultrasound imaging system 10 may be a touch screen, a liquid crystal display, or the like, or may be an independent display device such as a liquid crystal display, a television, or the like, which is independent from the ultrasound imaging system 10, or may be a display screen on an electronic device such as a mobile phone, a tablet computer, or the like.
In practical applications, the processor 105 may be at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a central processing unit (CPU), a controller, a microcontroller, and a microprocessor, so that the processor 105 can perform the corresponding steps of the ultrasound imaging method in the embodiments of the present application.
The memory 107 may be a volatile memory, such as a random-access memory (RAM); a non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); or a combination of the above types of memory. It provides instructions and data to the processor.
In one embodiment, the tissue to be examined is a thyroid or breast. The transmit circuitry 101 excites the probe 100 to transmit ultrasonic waves to the thyroid or breast; the receive circuitry 103 receives, through the probe 100, the ultrasonic echoes returned from the thyroid or breast to obtain ultrasonic echo signals; and the processor 105 processes the ultrasonic echo signals to obtain an ultrasound image of the thyroid or breast. The processor 105 is further configured to obtain detection information of the thyroid or breast from the ultrasound image and to calculate a confidence level for the detection information. Optionally, the processor 105 may instead obtain an ultrasound image for subsequent processing by retrieving one stored in the memory 107 or by receiving one transmitted from another device. The display 106 displays the detection information and its confidence level and, optionally, may also display the ultrasound image of the thyroid or breast.
Various embodiments of the ultrasound imaging system 10 are described in detail below in connection with the workflow of the ultrasound imaging system 10.
As shown in FIG. 2, in one embodiment, an ultrasound imaging system 10 includes:
In step 11, the probe 100 transmits ultrasonic waves to the thyroid or breast to be examined and receives the ultrasonic echoes to obtain ultrasonic echo signals.
In step 12, the processor processes the ultrasonic echo signals to obtain an ultrasound image of the thyroid or breast.
The acquired ultrasound image of the thyroid or breast to be detected may be at least one of a B-mode ultrasound image, a C-mode ultrasound image, an M-mode ultrasound image, an elastic ultrasound image, and the like.
In step 13, the processor obtains detection information of the thyroid or breast from the ultrasound image.
The detection information of a thyroid under examination comprises TI-RADS detection information, and the detection information of a breast under examination comprises BI-RADS detection information. Accordingly, the processor obtains TI-RADS detection information from an ultrasound image of the thyroid, or BI-RADS detection information from an ultrasound image of the breast. TI-RADS (Thyroid Imaging Reporting and Data System) evaluates thyroid lesions from different features in a thyroid image and assigns a comprehensive grade; the TI-RADS detection information in this embodiment may include both the evaluation of the individual image features and the comprehensive grade. BI-RADS (Breast Imaging Reporting and Data System) likewise evaluates breast lesions from different features in a breast image and assigns a comprehensive grade. It should be emphasized that TI-RADS or BI-RADS here may be any version of the respective standard and is not limited to a particular one.
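As a purely illustrative sketch of the comprehensive grading step: the application does not fix a version of the standard, and the point values and TR thresholds below are assumptions taken from the ACR TI-RADS 2017 chart, not from the patent.

```python
# Hypothetical sketch of TI-RADS comprehensive grading. Point values and TR
# thresholds follow the ACR TI-RADS 2017 chart (an assumption; the
# application allows any version of the standard).

def tirads_level(points: int) -> str:
    """Map a total feature-point score to a TR grade."""
    if points == 0:
        return "TR1"   # benign
    if points <= 2:
        return "TR2"   # not suspicious
    if points == 3:
        return "TR3"   # mildly suspicious
    if points <= 6:
        return "TR4"   # moderately suspicious
    return "TR5"       # highly suspicious

# Illustrative feature scores for one nodule:
features = {"composition": 2, "echo": 1, "morphology": 0,
            "edge": 0, "hyperecho": 0}
print(tirads_level(sum(features.values())))  # TR3
```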
In one embodiment, the detection information may include at least two feature items. For example, as shown in fig. 13, the TI-RADS detection information may include feature items such as composition, echo, morphology, edge, and hyperecho, together with the TI-RADS grade. The processor 105 may obtain the detection information from the ultrasound image of the thyroid or breast; in particular, it may obtain each of the at least two feature items of the detection information separately from the image. For example, as shown in fig. 13, the detection information obtained from a thyroid ultrasound image may include the following feature item information: the thyroid nodule is solid in composition, isoechoic, wider than tall (transverse diameter > longitudinal diameter), smooth-edged, and free of hyperechoic foci, and the TI-RADS grade is TR1.
For convenience of description, "detection information" may refer to the detection information as a whole or to at least one of its feature items.
In step 14, the processor calculates a confidence level for the detection information.
The detection information obtained from an ultrasound image of the thyroid or breast may deviate from the truth under the influence of various factors, so the physician needs to refer to its confidence level when making a comprehensive judgment. After obtaining the detection information from the ultrasound image, the processor 105 can therefore calculate a confidence level for it, indicating how trustworthy the obtained detection information is and helping the physician reach an accurate judgment.
In one embodiment, the confidence level of the detection information may be an overall confidence level over at least two feature items. For example, the confidence of the TI-RADS detection information may be the total confidence over the composition, echo, morphology, edge, hyperecho, and TI-RADS grade feature items; over the composition, echo, morphology, edge, and hyperecho feature items; or over any other combination of at least two feature items.
In another embodiment, the confidence level of the detection information may be the confidence of at least one individual feature item, with one confidence value per feature item. For example, it may be at least one of the confidence of the composition feature item, the echo feature item, the morphology feature item, the edge feature item, the hyperecho feature item, and the TI-RADS grade feature item.
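The application does not specify how a total confidence over several feature items is computed; as a hedged sketch, two natural aggregation rules (mean and worst-case minimum) might look like:

```python
# Hypothetical aggregation of per-feature-item confidences into one total
# confidence. The choice of rule (mean vs. min) is an illustrative
# assumption, not the patent's method.

def overall_confidence(per_feature: dict, rule: str = "mean") -> float:
    values = list(per_feature.values())
    if rule == "mean":
        return sum(values) / len(values)
    if rule == "min":  # conservative: limited by the least reliable item
        return min(values)
    raise ValueError(f"unknown rule: {rule}")

conf = {"composition": 0.9, "echo": 0.6, "morphology": 0.8,
        "edge": 0.85, "hyperecho": 0.7}
print(round(overall_confidence(conf), 2))    # 0.77
print(overall_confidence(conf, rule="min"))  # 0.6
```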
In one embodiment, the confidence level of the detection information may be a specific value, expressed on a 10-point scale, on a 100-point scale, as a percentage, or the like; it may also be a qualitative level, such as high confidence, medium confidence, or low confidence.
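A minimal sketch of mapping a specific confidence value to the qualitative levels named above; the 80% and 50% thresholds are illustrative assumptions, not values from the application:

```python
# Hypothetical mapping from a numeric confidence (percentage) to the
# qualitative levels; the thresholds are assumptions.

def qualitative(confidence_pct: float) -> str:
    if confidence_pct >= 80:
        return "high confidence"
    if confidence_pct >= 50:
        return "medium confidence"
    return "low confidence"

print(qualitative(92))  # high confidence
print(qualitative(10))  # low confidence
```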
In step 15, the display 106 displays the detection information and the confidence level of the detection information.
Optionally, the display 106 may simultaneously display the ultrasound image of the thyroid or breast. In one embodiment, the detection information and/or its confidence level may be displayed only after the user selects a specific mode.
In one embodiment, the display 106 displays at least two feature items of the detection information together with their total confidence level. For example, the display 106 may show the composition, echo, morphology, edge, and hyperecho feature item information along with the total confidence of all of them, or of any at least two of them. The confidence may be displayed below or to the right of the corresponding feature items, or at any other position convenient for viewing.
In one embodiment, the display 106 displays at least one feature item of the detection information and, in its vicinity, the confidence corresponding to that feature item. For example, the display 106 may show the composition feature item with its confidence nearby, below or to the right of the feature item information, or at any other position convenient for viewing.
In one embodiment, the confidence level is displayed as a degree icon: an icon that conveys the magnitude of the confidence through a visual change in its shape, color, state, or the like. For example, the degree icon may be at least one of an icon of variable count, an icon of variable proportion, an icon of variable color, an icon of variable shape, an icon with a pointer, a numeric icon, and a text icon. A variable-count icon represents the confidence by the number of icons shown; a variable-proportion icon by the proportion of a particular highlighted portion; a variable-color icon by its color; a variable-shape icon by its shape; an icon with a pointer by the value the pointer indicates; a numeric icon by a number; and a text icon by words. It should be emphasized that displayed words and numerals themselves also count as text or numeric icons.
For example, as shown in fig. 13, the confidence of each feature item is expressed by the number of posture icons following it: three posture icons after the "solid composition" feature item indicate high confidence in that item, while a single posture icon after the "isoechoic" feature item indicates low confidence. Illustratively, the posture icon can additionally indicate the current scanning position of the ultrasound image.
Illustratively, as shown in fig. 14, the confidence is represented by the size of the lit portion of a circle following the corresponding feature item. For example, if the confidence of the "solid composition" feature item is 70%, the lit portion occupies 70% of the circle; if the confidence of the "no hyperechoic foci" feature item is 10%, the lit portion occupies 10% of the circle. Optionally, the specific confidence value may be displayed in the middle of the circular icon. Optionally, the lit portion may take different colors depending on the confidence value: for the "solid composition" item, the lit portion occupies 70% of the circle, the corresponding confidence is medium, and the lit portion is yellow; for the "no hyperechoic foci" item, the lit portion occupies 10% of the circle, the corresponding confidence is low, and the lit portion is red.
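A hedged sketch of the circumference icon logic: the lit fraction of the circle equals the confidence percentage, and the color follows the qualitative level. The color thresholds are assumptions chosen to be consistent with the yellow and red examples in the text.

```python
# Hypothetical rendering model for the circumference icon: lit fraction of
# the circle = confidence percentage; color = qualitative level. The
# threshold values are assumptions.

def circumference_icon(confidence_pct: float) -> dict:
    if confidence_pct >= 80:
        color = "green"   # high confidence
    elif confidence_pct >= 50:
        color = "yellow"  # medium confidence
    else:
        color = "red"     # low confidence
    return {"lit_fraction": confidence_pct / 100.0, "color": color}

print(circumference_icon(70))  # {'lit_fraction': 0.7, 'color': 'yellow'}
print(circumference_icon(10))  # {'lit_fraction': 0.1, 'color': 'red'}
```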
In other embodiments, variations of the icons may be combined in other ways to more intuitively indicate the amount of confidence.
In one embodiment, the degree icon may be combined with the detection information itself: the icon displays the detection information while expressing its confidence through a visual change in shape, color, or state. For example, the degree icon may be a variable-state detection information icon, which displays the detection information and expresses the confidence through the icon's state. A detection information icon is any icon that displays detection information; it should be emphasized that the displayed detection information characters themselves are also a detection information icon. The icon's state includes its color, shape, size, and so on, and differs according to the confidence of the detection information. Taking a variable-color detection information icon as an example, the degree icon may be a circular icon with the detection information displayed inside: when the confidence of the detection information is high, the circle is green; when the confidence is medium and within the required range, the circle is yellow; when the confidence is low, the circle is red. Optionally, a variable-color detection information icon may also express the confidence through the background color of the detection information text, or through the color of the text itself.
In one embodiment, the detection information may be displayed in a list or by an indication icon, i.e. a visual icon that indicates the detection information. For example, the indication icon for the composition feature item may resemble a fuel-gauge icon whose scale marks read cystic, spongiform, mixed cystic-solid, and solid; the pointer pointing at "solid" visually conveys the composition of the thyroid nodule in the ultrasound image. Optionally, a degree icon representing the confidence may be combined with the indication icon representing the detection information: for example, the fuel-gauge-style icon indicates that the composition is solid and, the confidence of that result being high, is displayed in green.
As shown in fig. 3, in one embodiment, step 14 (the processor 105 calculating the confidence level of the detection information) may include:
In step 24, the processor 105 calculates the validity of the ultrasound image.
The validity of the ultrasound image indicates whether (or to what degree) the image is a valid basis for obtaining the detection information: whether it is too bright or too dark, whether it is blurred, whether its resolution is high enough, and so on. Like the confidence of the detection information, the validity can be a specific value, expressed on a 10-point scale, on a 100-point scale, or as a percentage; it can also be a qualitative level, such as valid or invalid.
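A minimal sketch of such validity heuristics, assuming a grayscale frame stored as a list of pixel rows; the brightness and resolution checks and their thresholds are illustrative assumptions, not the patent's actual criteria.

```python
# Hypothetical validity heuristics for a grayscale frame (rows of 0-255
# pixel values). The checks and thresholds are assumptions.

def image_validity(img: list, min_side: int = 128) -> float:
    """Return a validity score in [0, 1]."""
    pixels = [p for row in img for p in row]
    mean = sum(pixels) / len(pixels)
    score = 1.0
    if mean < 40 or mean > 215:                # too dark or too bright
        score -= 0.5
    if min(len(img), len(img[0])) < min_side:  # resolution too low
        score -= 0.5
    return max(score, 0.0)

frame = [[128] * 256 for _ in range(256)]      # well-exposed test frame
print(image_validity(frame))  # 1.0
```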
In step 25, the processor 105 calculates the confidence level of the detection information from the validity of the ultrasound image.
The higher the validity of the ultrasound image, the more reliable the detection information obtained from it, and hence the higher its confidence. High validity is in turn determined by factors such as moderate brightness, absence of blur, and sufficient resolution, so the validity can be evaluated from at least one of these angles. The criterion is not limited to these, however; the validity may also be obtained, for example, by feeding the ultrasound image into an artificial intelligence model. When the validity is a specific value, a functional relationship (or other correspondence) between image validity and detection-information confidence can be established, so that the confidence is computed from the validity. When the validity is a qualitative level, it can first be assigned a value and then mapped to a confidence through such a functional relationship or correspondence; alternatively, a direct correspondence between qualitative validity levels and confidence levels can be established.
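A hedged sketch of the two cases just described; the clamped linear form and the table entries are assumptions, since the application only requires that some functional relationship or correspondence exist.

```python
# Hypothetical versions of the two mappings: a functional relationship for
# a numeric validity, and a direct correspondence for a qualitative one.

def confidence_from_validity(validity: float) -> float:
    """Numeric case: the simplest functional relationship, clamped to [0, 1]."""
    return max(0.0, min(1.0, validity))

QUALITATIVE_MAP = {  # qualitative case: direct correspondence
    "valid": "high confidence",
    "partially valid": "medium confidence",
    "invalid": "low confidence",
}

print(confidence_from_validity(0.8))  # 0.8
print(QUALITATIVE_MAP["invalid"])     # low confidence
```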
As shown in fig. 4, in one embodiment, 24 the processor 105 calculating the validity of the ultrasound image may include:
the processor 105 calculates the sharpness of the ultrasound image based on the ultrasound image. When the processor 105 acquires the detection information, the ultrasonic image is an important basis for acquiring the detection information, and if the definition of the ultrasonic image is high, the validity of the ultrasonic image is correspondingly high; if the ultrasound image has low definition, the effectiveness of the ultrasound image is correspondingly low. Similar to the validity of the ultrasonic image, the definition of the ultrasonic image can be a specific value, and the expression form of the definition can be embodied in the form of a ten-degree score, a percentile score or a percentile; but may also be a qualitative determination of clearness, blurring, or the like.
In step 32, the processor 105 calculates the validity of the ultrasound image from the sharpness of the ultrasound image.
For example, when the ultrasound image clarity is a specific value, a functional relationship or other corresponding relationship between the image clarity and the validity of the ultrasound image can be established to calculate the validity of the ultrasound image through the image clarity. When the definition of the ultrasonic image is a qualitative standard, the definition of the ultrasonic image can be assigned, so that the validity of the ultrasonic image can be calculated through the definition of the image by establishing a functional relation or other corresponding relations between the definition of the image and the validity of the ultrasonic image; the corresponding relation between the image definition and the effectiveness of the ultrasonic image can be directly established, so that the effectiveness of the ultrasonic image can be calculated through the image definition.
The image sharpness may be calculated from dimensions such as whether the ultrasound image is too bright or too dark, or whether the resolution of the ultrasound image is high enough.
As shown in fig. 5, in one embodiment, the processor 105 calculating the sharpness of the ultrasound image based on the ultrasound image (step 31) may include:
In step 41, the processor 105 determines the effective area from the ultrasound image.
The effective area may be the ultrasound image area associated with acquisition of the detection information. For example, for the thyroid, the effective area may be the image region of the thyroid in the ultrasound image, or another ultrasound image region relevant to detection-information acquisition, such as the image region of a thyroid nodule.
In step 42, the processor 105 detects gradient information of the effective area.
In step 43, the processor 105 calculates the sharpness of the ultrasound image based on the gradient information. Generally, the higher the gradient values, the richer the edge information of the picture, and the sharper the image. For example, a functional relationship or other correspondence between the gradient information of the effective area and the image sharpness may be established. The image sharpness may be calculated from the gradient information, for example, by a Brenner gradient function, a Tenengrad gradient function, a Laplacian gradient function, or the like.
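Two of the gradient functions named above can be sketched in a few lines of NumPy; the function names and the toy 8x8 regions in the usage are illustrative, and in practice the input would be the segmented effective area:

```python
import numpy as np

def brenner_sharpness(region):
    """Brenner gradient: sum of squared horizontal differences taken two
    pixels apart; larger values indicate richer edge information."""
    region = np.asarray(region, dtype=np.float64)
    diff = region[:, 2:] - region[:, :-2]
    return float(np.sum(diff ** 2))

def laplacian_sharpness(region):
    """Variance of a 4-neighbour Laplacian response; a sharp image has
    strong edge responses and therefore a large variance."""
    region = np.asarray(region, dtype=np.float64)
    lap = (region[1:-1, :-2] + region[1:-1, 2:] +
           region[:-2, 1:-1] + region[2:, 1:-1] - 4.0 * region[1:-1, 1:-1])
    return float(lap.var())
```

A flat (featureless) region scores near zero under both measures, while a region containing an edge scores higher, matching the rule that higher gradient values indicate a sharper image.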
As shown in fig. 6, in one embodiment, the processor 105 calculating the sharpness of the ultrasound image based on the ultrasound image (step 31) may include:
In step 51, a first artificial intelligence model is trained by inputting two classes of thyroid or breast ultrasound images, one with clear effective areas and one with blurred effective areas. Illustratively, the first artificial intelligence model can classify the effective area of an ultrasound image as clear or blurred, and for an input ultrasound image to be detected, the first artificial intelligence model can output a clear or blurred classification result. The first artificial intelligence model can also grade the degree of clarity of the effective area, for example as clear, relatively clear, relatively blurred, or blurred, so that it outputs the clarity grade of the input ultrasound image to be detected.
In step 52, the processor 105 inputs the ultrasound image of the thyroid or breast to be detected into the first artificial intelligence model and obtains the sharpness result output by the first artificial intelligence model. Inputting the ultrasound image to be detected into the first artificial intelligence model yields a clear or blurred classification result, or a clarity grade, output by the model, and optionally a probability value for the classification result. For example, the image sharpness may be calculated by setting the clear classification result to 1 and the blurred classification result to 0, and then defining the image sharpness as a function of the classification result. Optionally, a probability value of the classification result output by the first artificial intelligence model is obtained, and a functional relationship or other correspondence between the classification result, its probability value, and the sharpness of the ultrasound image may be established, so as to calculate the image sharpness.
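A minimal sketch of folding the classification result and its probability value into one sharpness score, assuming a binary clear/blurred classifier (the label strings and the particular combining rule are illustrative):

```python
def clarity_from_classification(label, probability):
    """Combine the model's clear/blurred label and its probability value
    into one clarity score in [0, 1]: a confident 'clear' scores near 1,
    a confident 'blurred' scores near 0."""
    if label == "clear":
        return probability
    return 1.0 - probability
```

With this rule, an uncertain result (probability near 0.5) lands mid-scale regardless of label, which is a reasonable default when the classifier itself is unsure.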
As shown in fig. 7, in one embodiment, the processor 105 calculating the validity of the ultrasound image (step 24) may include:
In step 121, the processor 105 determines the effective area from the ultrasound image.
In step 122, the processor 105 calculates a gray-level mean of the effective area.
For example, the gray values over the whole effective area of the ultrasound image may be averaged to obtain an overall gray-level mean of the effective area; alternatively, the effective area may be divided into a plurality of regions and a gray-level mean calculated for each region separately.
In step 123, the processor 105 calculates the validity of the ultrasound image according to the gray-level mean.
For example, it can be judged whether the gray-level mean of the ultrasound image is within a threshold range. The gray-value range of a normal ultrasound image is 0 to 255, and the acceptable gray-level mean of the effective area may be set to between 40 and 200; that is, when the gray-level mean of the effective area is below 40 the ultrasound image may be considered too dark, and when it exceeds 200 the image may be considered too bright. It should be emphasized that the threshold range can be adjusted according to clinical requirements and is not limited to 40-200. Illustratively, the validity of the ultrasound image may be assigned according to whether, and by how much, the gray-level mean falls outside the threshold range. For example, with the preferred range set to 40-200: when the gray-level mean of the effective area is between 40 and 200, the validity of the ultrasound image is assigned 10 points; when it is between 35-40 or 200-205, the validity is assigned 9 points; when it is between 30-35 or 205-210, the validity is assigned 8 points; and so on. It should be emphasized that the above is only an exemplary method of calculating the validity of the ultrasound image from the gray-level mean; other correspondences or functions between the gray-level mean and the validity of the ultrasound image may also be established.
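The banded assignment above can be sketched directly; the band widths and point values follow the example in the text, but remain illustrative and adjustable to clinical requirements:

```python
def validity_from_gray_mean(gray_mean,
                            bands=((40, 200, 10), (35, 205, 9), (30, 210, 8))):
    """Score validity by which band the effective-area gray mean falls in.
    Bands are checked narrowest first, so a mean of 38 scores 9, not 10."""
    for lo, hi, score in bands:
        if lo <= gray_mean <= hi:
            return score
    return 0  # far outside every band: image far too dark or too bright
```

Extending the `bands` tuple continues the "and so on" of the text; replacing the lookup with a smooth function of the distance from the 40-200 band would be an equally valid correspondence.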
In one embodiment, the processor 105 calculating the validity of the ultrasound image (step 24) may include: the processor 105 detects the gray level of the ultrasound image and determines the validity of the ultrasound image from the gray level.
Detecting the gray level of the ultrasound image may mean detecting the gray level of the whole ultrasound image, or the effective area may first be determined in the ultrasound image and the gray level detected within it. Similarly, the validity of the ultrasound image may be determined from the gray level of the whole image or from the gray level within the effective area. Determining the validity of the ultrasound image from its gray level includes, but is not limited to, at least one of: whether the gray-level mean is within the threshold range, whether the gray level of the ultrasound image is uniform, and whether the gray-level extrema of the ultrasound image satisfy an extremum criterion. For whether the gray-level mean is within the threshold range, refer to the description above, which is not repeated here. For whether the gray level is uniform, a gray-level histogram of the ultrasound image can be drawn and judged for whether its distribution is uniform, so that the image's gray levels are not concentrated in one band in a way that affects the validity of the ultrasound image.
If the gray level of the ultrasound image meets the gray-level standard, for example the gray-level mean is appropriate and the image is uniform, the ultrasound image can display the morphology of the thyroid or breast more accurately and its validity is high; conversely, if the gray level does not meet the standard, the validity of the ultrasound image is low. Therefore, the validity of the ultrasound image can be determined from its gray level. For example, gray-level standards for a valid ultrasound image may be set, such as a standard for the gray-level mean, a standard for gray-level uniformity, and a standard for the gray-level extrema; the deviation between the image's gray level and these standards may then be calculated, and a functional relationship or other correspondence between the deviation and image validity established, so that the validity of the ultrasound image is determined from the relationship between its gray level and the standards. Of course, the deviation may be evaluated from a single angle, such as the gray-uniformity dimension, or from multiple dimensions, such as the gray-level mean, extrema, and uniformity, with the overall deviation obtained by integrating them.
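The gray-uniformity dimension can be quantified, for example, by the normalized entropy of the gray-level histogram; using entropy as the uniformity measure and the bin count of 32 are assumptions for this sketch, not requirements of the embodiment:

```python
import numpy as np

def histogram_uniformity(region, bins=32):
    """Normalized entropy of the gray histogram over [0, 255]:
    returns 1.0 when gray levels are spread evenly across the range,
    and near 0.0 when they are concentrated in one narrow band."""
    hist, _ = np.histogram(np.asarray(region), bins=bins, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins before taking the log
    entropy = -np.sum(p * np.log2(p))
    return float(entropy / np.log2(bins))
```

The resulting score could serve as one dimension of the gray-level deviation that the text describes integrating with the mean and extremum dimensions.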
In one embodiment, the processor 105 calculating the validity of the ultrasound image (step 24) may include: the processor 105 detects whether speckles, snowflake patterns, or reticulations are present in the ultrasound image to determine the validity of the ultrasound image.
Detecting whether speckles, snowflake patterns, or reticulations exist may be performed on the whole ultrasound image, or the effective area may first be determined and only the image within it detected. It can be understood that speckles, snowflake patterns, or reticulations in the ultrasound image may cover key structures of the thyroid or breast and affect the validity of the image. Therefore, a functional relationship or other correspondence between the presence of such defects and image validity can be established; for example, the larger the extent of speckles, snowflake patterns, or reticulations appearing in the ultrasound image, the lower the validity of the image; conversely, the smaller their extent, the higher the validity; and when none are present, the image is most valid in the image-defect evaluation dimension. Furthermore, the three kinds of image defects may be given different weights according to how strongly each interferes with identifying the thyroid or breast in the image, so that the validity of the ultrasound image is determined from which defects are detected and their weights.
For the detection of speckles, snowflake patterns, or reticulations in the ultrasound image, it can be detected whether the texture of the ultrasound sectional image meets a preset image-texture standard. For example, an image-texture detection model may be trained in advance, and the ultrasound sectional image input into the model to obtain a detection result of whether the texture meets the preset standard, where the image texture includes whether the image has speckles, snowflake patterns, or reticulations.
In one embodiment, the processor 105 calculating the validity of the ultrasound image (step 24) may include: the processor 105 detects the effective-area ratio of the ultrasound image and determines the validity of the ultrasound image from that ratio.
The effective area of the ultrasound image may be the area relevant to detection-information acquisition. For example, for the thyroid, the effective area may be the region of the ultrasound image containing the thyroid image, or another region relevant to detection-information acquisition, such as the image region of a thyroid nodule. Detecting the effective-area ratio mainly ensures that the effective area occupies an appropriate proportion of the whole ultrasound sectional image; for example, the ratio should not be too small, e.g., it should be greater than 1/2. Illustratively, a specific detection method is to acquire the effective area by threshold segmentation or another image-processing means, calculate the ratio of the effective area to the whole image area, and judge whether the ratio meets a preset requirement. The size or proportion of the effective area is related to parameters such as the ultrasound scanning depth or the magnification/reduction factor. In one embodiment, whether the ultrasound scanning depth meets a standard, for example whether it is within a threshold range, may be detected to judge whether the effective-area ratio of the ultrasound image is appropriate.
It can be understood that if the effective-area ratio of the ultrasound image is too small, the ultrasound image can hardly reflect the morphology of the thyroid or breast accurately, which is unfavorable for obtaining the detection information from it. Therefore, the validity of the ultrasound image can be determined from the effective-area ratio; for example, the ratio may be calculated and a functional relationship or other correspondence between the effective-area ratio and the validity of the image established, so that the validity of the ultrasound image is determined from the ratio.
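The threshold-segmentation method described above can be sketched as follows; the background threshold of 10 gray levels is an assumption, and the 1/2 requirement follows the example in the text:

```python
import numpy as np

def effective_area_ratio(image, background_threshold=10):
    """Crude threshold segmentation: pixels brighter than the threshold
    are counted as imaged content rather than black border/background."""
    image = np.asarray(image)
    return float((image > background_threshold).sum()) / image.size

def ratio_meets_requirement(image, min_ratio=0.5):
    """Apply the preset requirement that the effective area exceed
    half of the whole image."""
    return effective_area_ratio(image) > min_ratio
```

A real system would likely segment the fan- or rectangle-shaped imaging region explicitly, but the ratio computation and threshold test would take this same form.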
In one embodiment, the processor 105 calculating the validity of the ultrasound image (step 24) may include: the processor 105 detects the probe, the probe parameters, and/or the imaging parameters, and determines the validity of the ultrasound image from the correspondence between the probe, probe parameters, and/or imaging parameters and the thyroid or breast to be detected included in the ultrasound image.
When performing ultrasonic detection on a patient, different probes, probe parameters, and imaging parameters need to be selected for different detection sites to achieve the best imaging effect at each site. For example, superficial organs such as the thyroid and breast use high-frequency linear probes, while abdominal organs use low-frequency convex probes. In actual operation, however, a user may, through inexperience or negligence, mistakenly use an abdominal ultrasound probe with its corresponding probe parameters and abdominal imaging parameters during thyroid or breast ultrasound imaging, so that a high-quality ultrasound image of the thyroid or breast cannot be obtained and the validity of the ultrasound image is affected. Alternatively, the user may mistakenly use imaging parameters suited to the breast during thyroid ultrasound imaging, which likewise prevents obtaining a high-quality thyroid ultrasound image and affects the validity of the ultrasound image.
The processor 105 may identify the tissue type included in the ultrasound image and compare it with the probe, probe parameters, and imaging parameters used to scan the image: when the tissue type corresponds to the probe, probe parameters, and imaging parameters used, the validity of the ultrasound image is determined to be high; when it does not correspond, the validity is determined to be low. The tissue type may be compared with all three of the probe, probe parameters, and imaging parameters, or with only one or two of them, to determine the correspondence and hence the validity of the image. Further, a functional relationship or other correspondence may be established between the validity of the image and the correspondence of the probe type, probe parameters, and/or imaging parameters with the thyroid or breast to be detected included in the ultrasound image, so that the validity of the ultrasound image is determined through that correspondence.
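A minimal sketch of the correspondence check, assuming a lookup table from tissue type to suitable probe type (the table entries and qualitative high/low outputs are illustrative):

```python
# Assumed suitability table: which probe type best images which tissue,
# per the text's examples (superficial organs -> linear, abdomen -> convex).
SUITABLE_PROBE = {"thyroid": "linear", "breast": "linear", "abdomen": "convex"}

def validity_from_probe_match(identified_tissue, probe_in_use):
    """High validity when the tissue identified in the image corresponds
    to the probe actually used for the scan, low otherwise."""
    if SUITABLE_PROBE.get(identified_tissue) == probe_in_use:
        return "high"
    return "low"
```

The same table-lookup pattern extends naturally to probe parameters and imaging parameters, or to checking only one or two of the three factors.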
As shown in fig. 8, in one embodiment, the processor 105 calculating the confidence of the detection information (step 14) may include:
In step 64, the processor 105 identifies a target structure of the thyroid or breast in the ultrasound image. The target structures of the thyroid or breast include tissue structures in the ultrasound image related to acquisition of the detection information. Identification of the target structures in the ultrasound image may be realized by image recognition, machine learning, and the like. Illustratively, target structures of the thyroid may include thyroid tissue, the carotid artery, the trachea, and so on; target structures of the breast include its layered structure, comprising the skin layer, subcutaneous fat layer, glandular tissue layer, pectoral muscle layer, rib layer, and the like.
In step 65, the processor 105 calculates the confidence of the detection information from the recognition result of the target structure. It can be understood that if the image includes the target structures of the thyroid or breast to be detected, calculating the detection information of the thyroid or breast from the image is more reliable; if the image does not include the target structures at all, or includes only part of them, the calculation is less reliable. In one embodiment, similar to calculating the confidence of the detection information from the sharpness of the ultrasound image, a functional relationship or other correspondence may be established between the confidence of the detection information and whether, and how completely, the target structures are included, so that the confidence is calculated from the target structures of the thyroid or breast to be detected in the image.
As shown in fig. 9, in an embodiment, taking the thyroid ultrasound image as an example, the processor 105 identifying a target structure of the thyroid or breast in the ultrasound image (step 64) may include:
In step 71, the processor 105 identifies thyroid tissue in the ultrasound image of the thyroid to be detected. The thyroid tissue in the ultrasound image can be identified through image recognition, machine learning, and the like.
In step 72, the processor 105 identifies the positional relationship between the thyroid tissue and other target structures. In one embodiment, when scanning the thyroid, left-lobe, right-lobe, and isthmus scans are often performed, and the resulting ultrasound images differ somewhat. If an image obtained by scanning the left lobe of the thyroid were analyzed as a right-lobe image when calculating the detection information, the confidence of the resulting detection information would not be high. In one embodiment, the scanning position of the thyroid can be determined by identifying the positional relationship between the thyroid tissue and other target structures in the thyroid ultrasound image, and it can be judged whether this position conforms to the preset scanning position. For example: if the carotid artery is on the left side of the thyroid, the image is a right-lobe scan of the thyroid; if the carotid artery is on the right side of the thyroid, the image is a left-lobe scan; and if thyroid tissue appears on both the left and right sides of the image with the tracheal structure in the middle, the image is an isthmus scan. If the scanning position of the thyroid, as judged from the positions of the other target structures and the thyroid tissue in the image, does not conform to or deviates from the preset position, the confidence of the detection information is lower; if it conforms to the preset position, the confidence of the detection information is higher.
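The position rules just described are simple enough to state as code; the boolean cue names and the 0.9/0.3 confidence values below are illustrative assumptions:

```python
def infer_thyroid_scan_position(carotid_side=None, trachea_centered=False,
                                thyroid_on_both_sides=False):
    """Rule-based reading of the position cues: the carotid side reveals
    which lobe was scanned; thyroid tissue on both sides of the image with
    a central trachea indicates an isthmus scan."""
    if thyroid_on_both_sides and trachea_centered:
        return "isthmus"
    if carotid_side == "left":
        return "right lobe"
    if carotid_side == "right":
        return "left lobe"
    return "unknown"

def position_confidence(inferred, preset, match=0.9, mismatch=0.3):
    """Illustrative mapping: conforming to the preset scan position yields
    a higher confidence than deviating from it."""
    return match if inferred == preset else mismatch
```

In a full system the cues themselves (carotid side, trachea position) would come from the target-structure recognition of step 72.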
Continuing with the thyroid example, the positional relationship between the thyroid tissue and other target structures is not limited to the position of the other target structures relative to the thyroid tissue; it may also include the positions of the thyroid tissue and other target structures within the ultrasound image, the image area they occupy, and so on. In another embodiment, taking a thyroid ultrasound image as an example, when scanning the thyroid, a standard sectional view of the thyroid needs to be acquired in order to obtain the detection information; according to the standard section, specific other target structures need to appear in the image, and the thyroid tissue needs to occupy a certain proportion of the image. If, based on the positions of the other target structures and the image size of the thyroid tissue, the ultrasound image of the thyroid to be detected is judged to meet the preset conditions of the standard section, the confidence of the detection information is higher; if it does not meet them, the confidence of the detection information is lower.
Specifically, similar to calculating the confidence of the detection information according to the effectiveness of the ultrasound image, a functional relationship or other corresponding relationship between the confidence of the detection information and the above conformity degree may be established, so as to calculate the confidence of the detection information according to the conformity degree.
As shown in fig. 10, in one embodiment, the processor 105 identifying a target structure of the thyroid or breast in the ultrasound image (step 64) may include:
In step 81, the processor 105 trains at least one second artificial intelligence model by inputting ultrasound images of the thyroid or breast containing the target structures.
Taking the thyroid as an example, some thyroid ultrasound images are acquired in advance, the thyroid tissue in them is marked, and the images are input into at least one second artificial intelligence model to train it. Further, other target structures in the thyroid ultrasound images may also be marked, such as the carotid artery and the trachea. Similarly, the at least one second artificial intelligence model may also be trained by inputting ultrasound images of the breast containing the target structures, e.g., breast ultrasound images with the layered structure marked. A single second artificial intelligence model may be trained to identify all target structures of the thyroid or breast; alternatively, a plurality of second artificial intelligence models may be trained, each corresponding to one target structure.
In step 82, the processor 105 inputs the ultrasound image of the thyroid or breast to be detected into the at least one second artificial intelligence model and obtains the recognition result of the target structures of the thyroid or breast in the ultrasound image output by the at least one second artificial intelligence model.
Illustratively, continuing with the thyroid example, the ultrasound image of the thyroid to be detected is input into the at least one second artificial intelligence model, which outputs a recognition result indicating whether thyroid tissue is included; further, recognition results for the position of the thyroid tissue and the positions of other target structures may also be output, and optionally a probability value for the judgment of whether the target structures of the thyroid are included.
In one embodiment, similar to calculating the confidence of the detection information from the validity of the ultrasound image, a functional relationship or other correspondence between the confidence of the detection information and the recognition result output by the at least one second artificial intelligence model may be established, so that the confidence is calculated from the output recognition result. Illustratively, for the thyroid ultrasound image, the confidence of the detection information may also be calculated by additionally taking into account the recognition result of the positional relationship between the thyroid tissue and other target structures; further, the confidence may be calculated comprehensively by combining the output result of whether the target structures of the thyroid are contained with the probability value of that result. Similarly to the thyroid, the target structures of breast ultrasound images, the positional relationships between them, or the probability values of the output results can be identified through the second artificial intelligence model, so that the confidence of the detection information is calculated.
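One simple correspondence of this kind scores the fraction of expected target structures actually recognized, optionally weighting each by the model's probability value; the structure names and the averaging rule are illustrative assumptions:

```python
def confidence_from_structures(recognized, expected, probabilities=None):
    """Fraction of expected target structures actually recognized, each
    recognition optionally weighted by the model's probability value for it.
    Returns a confidence in [0, 1]."""
    if not expected:
        return 0.0
    if probabilities is None:
        probabilities = {}
    total = sum(probabilities.get(s, 1.0) for s in recognized if s in expected)
    return total / len(expected)
```

For example, recognizing two of the three expected thyroid structures with full certainty yields a confidence of 2/3, and lower per-structure probability values reduce it further.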
As shown in fig. 11, in an embodiment, the processor 105 obtaining the detection information of the thyroid or breast to be detected from the ultrasound image (step 13) may include:
In step 93, the processor 105 trains at least one third artificial intelligence model by inputting ultrasound images of the thyroid or breast corresponding to the detection information.
When the detection information includes a plurality of feature items, a single third artificial intelligence model may be trained: ultrasound images corresponding to the plurality of feature items are input into it, and that one model handles detection of all the feature items. Alternatively, a plurality of third artificial intelligence models may be trained, each receiving ultrasound images corresponding to one feature item, so that the models respectively handle detection of the individual feature items.
In step 94, the processor 105 inputs the ultrasound image of the thyroid or breast to be detected into the at least one third artificial intelligence model and obtains the detection information output by the at least one third artificial intelligence model.
When the detection information includes a plurality of feature items: if a single third artificial intelligence model has been trained to detect all of them, inputting the ultrasound image of the thyroid or breast to be detected into that model yields the plurality of feature items of the detection information; if a plurality of third artificial intelligence models have been trained to detect the feature items respectively, inputting the ultrasound image of the thyroid or breast to be detected into those models yields the plurality of feature items of the detection information of the thyroid or breast to be detected.
In one embodiment, the processor 105 calculating the confidence of the detection information (step 14) may include:
In step 95, the confidence of the detection information is calculated based on the performance of the at least one third artificial intelligence model. The performance of a third artificial intelligence model may be the probability value it outputs alongside the detection information; a functional relationship or other correspondence between that probability value and the confidence of the detection information may be established, so that the confidence is calculated from the probability of the model's detection result. The performance may also be other factors affecting the confidence of the detection information output by the model, and the confidence may be calculated by assigning values to them or establishing a correspondence between them and the confidence. When the at least one third artificial intelligence model is a single model, its performance is the performance of that model; when it comprises a plurality of models, the performance may be a statistical value over the individual models' performances, such as the average, median, maximum, or minimum.
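Collapsing several models' probability values into one statistic, as listed above, can be sketched with the standard library (the reducer names are illustrative):

```python
import statistics

def combined_model_performance(probabilities, method="mean"):
    """Collapse per-model probability values into one statistic; the
    embodiment allows the average, median, maximum, or minimum."""
    reducers = {"mean": statistics.mean, "median": statistics.median,
                "max": max, "min": min}
    return reducers[method](probabilities)
```

Choosing `min` gives the most conservative confidence (limited by the weakest model), while `mean` smooths over individual outliers.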
In one embodiment, the processor 105 calculating the confidence level of the detection information may include: calculating index confidences of the detection information through at least two indexes; and obtaining the confidence of the detection information through a weighted calculation of the at least two index confidences. The indexes may include, but are not limited to, the above-mentioned image validity, the target structure of the thyroid or breast in the ultrasound image, the performance of the artificial intelligence model, and the like; any index from which a confidence of the detection information can be calculated may serve as an index confidence. In one embodiment, an index confidence of the detection information can be directly used as the confidence of the detection information; in another embodiment, the confidence of the detection information may be obtained by weighting at least two index confidences, wherein the weights of the weighted calculation can be adjusted according to clinical requirements or the performance of the device. Because the weighted combination of at least two index confidences takes multiple influencing factors into consideration, the resulting confidence of the detection information is more accurate.
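For illustration only, the weighted calculation can be sketched as a weighted average over named index confidences. The index names and the weight values are assumptions for the example; the embodiment leaves the weights to be tuned to clinical requirements or device performance:

```python
# Hypothetical sketch: combining at least two index confidences (image
# validity, target-structure recognition, model performance) into one
# confidence of the detection information by a weighted average.

def weighted_confidence(index_confidences, weights):
    """Weighted average of at least two index confidences, each in [0, 1]."""
    if len(index_confidences) < 2:
        raise ValueError("at least two index confidences are required")
    total = sum(weights[k] for k in index_confidences)
    return sum(index_confidences[k] * weights[k] for k in index_confidences) / total

# Illustrative index confidences and weights (assumed values):
conf = weighted_confidence(
    {"image_validity": 0.9, "target_structure": 0.8, "model_performance": 0.7},
    {"image_validity": 0.2, "target_structure": 0.3, "model_performance": 0.5},
)
```

Normalizing by the total weight keeps the result inside [0, 1] even when the weights themselves do not sum to one.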
As shown in fig. 15, in one embodiment, an ultrasound imaging system includes:
141 the processor 105 acquires an ultrasound image of the thyroid or breast to be examined. The ultrasound image of the thyroid or breast to be measured acquired by the processor 105 may be acquired by real-time ultrasound scanning, may be acquired by reading an ultrasound image stored in the memory 107 in advance, or may be remotely transmitted to the processor 105 through another device.
142 the processor 105 obtains the detection information of the thyroid or the breast to be detected according to the ultrasound image, wherein the detection information comprises TI-RADS detection information of the thyroid or BI-RADS detection information of the breast.
143 the processor 105 calculates a confidence level of the detected information.
144 the display 106 displays the detection information and the confidence level of the detection information.
Where the workflow of this ultrasound imaging system is similar to that of the other embodiments, refer to the above description; it is not repeated here.
In one embodiment, an ultrasound imaging method is provided, comprising:
transmitting ultrasonic waves to the thyroid or the mammary gland to be detected and receiving ultrasonic echoes to obtain ultrasonic echo signals;
processing the ultrasonic echo signal to obtain an ultrasonic image of the thyroid or the breast to be detected;
acquiring detection information of the thyroid or the mammary gland to be detected according to the ultrasonic image, wherein the detection information comprises TI-RADS detection information of the thyroid or BI-RADS detection information of the mammary gland;
calculating the confidence of the detection information;
and displaying the detection information and the confidence of the detection information.
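For illustration only, the data flow of the steps above can be sketched as a small pipeline: acquire the image, obtain the detection information, compute its confidence, and display both. Every callable here stands in for a stage the method describes; none of the names are real APIs:

```python
# Hypothetical sketch of the claimed method's data flow. The stage
# implementations (beamforming, TI-RADS/BI-RADS detection, confidence
# calculation, display) are injected as callables for the example.

def ultrasound_imaging_method(echo_signal, process_echo, detect,
                              confidence_of, display):
    """Acquire, detect, score, display, mirroring the claimed steps."""
    image = process_echo(echo_signal)   # process the echo into an ultrasound image
    info = detect(image)                # TI-RADS / BI-RADS detection information
    conf = confidence_of(info, image)   # confidence of the detection information
    display(info, conf)                 # show both to the physician
    return info, conf
```

Keeping the confidence calculation as its own stage is what lets the system swap between validity-based, structure-based, and model-performance-based indexes without touching the rest of the pipeline.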
The ultrasound imaging system disclosed above can perform this ultrasound imaging method; the details are not repeated here to avoid repetition.
In one embodiment, an ultrasound imaging method is provided, comprising:
acquiring an ultrasonic image of a thyroid or a breast to be detected;
obtaining detection information of the thyroid or the mammary gland to be detected according to the ultrasonic image, wherein the detection information of the thyroid comprises TI-RADS detection information, and the detection information of the mammary gland to be detected comprises BI-RADS detection information;
calculating the confidence of the detection information;
and displaying the detection information and the confidence of the detection information.
The ultrasound imaging system disclosed above can perform this ultrasound imaging method; the details are not repeated here to avoid repetition.
As shown in fig. 12, in an embodiment, the tissue to be examined is not limited to the thyroid or breast; for any other tissue for which there is a clinical need to obtain detection information, the physician may likewise be guided in diagnosis by calculating the confidence of the detection information. In one embodiment, an ultrasound imaging device may comprise:
111 the processor 105 acquires an ultrasound image of the tissue to be measured;
112, the processor 105 acquires detection information according to the ultrasonic image of the tissue to be detected; the detection information comprises various information which is related to the tissue to be detected and can assist medical staff in analyzing the pathological condition of the tissue to be detected.
113 the processor 105 calculating a confidence level of the detected information;
114 the display 106 displays the detection information and the confidence level of the detection information.
In addition, an embodiment of the invention also provides a computer storage medium on which a computer program is stored. When the computer program is executed by a computer or a processor, the steps of calculating and displaying the detection information and the confidence of the detection information for an ultrasound image of the tissue to be examined, as shown in one or more of the foregoing fig. 2 to fig. 12, can be realized. For example, the computer storage medium is a computer-readable storage medium.
In one embodiment, the computer program instructions, when executed by a computer or processor, cause the computer or processor to perform the steps of: acquiring an ultrasonic image of a thyroid or a breast to be detected; acquiring detection information according to an ultrasonic image of a thyroid or a breast to be detected, wherein the detection information comprises the following steps: acquiring TI-RADS detection information according to an ultrasonic image of a thyroid to be detected, or acquiring BI-RADS detection information according to an ultrasonic image of a mammary gland to be detected; calculating the confidence of the detection information; and displaying the detection information and the confidence of the detection information.
The computer storage medium may include, for example, a memory card of a smart phone, a storage component of a tablet computer, a hard disk of a personal computer, a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM), a portable compact disc read only memory (CD-ROM), a USB memory, or any combination of the above storage media. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the foregoing illustrative embodiments are merely exemplary and are not intended to limit the scope of the invention thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the method of the present invention should not be construed to reflect the intent: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some of the modules in an item analysis apparatus according to embodiments of the present invention. The present invention may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second, third, etcetera does not indicate any ordering. These words may be interpreted as names.
The above description is only for the specific embodiment of the present invention or the description thereof, and the protection scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (27)

1. An ultrasound imaging system, comprising:
the probe transmits ultrasonic waves to the thyroid or the breast to be detected and receives ultrasonic echoes to obtain ultrasonic echo signals;
the processor processes the ultrasonic echo signal to obtain an ultrasonic image of the thyroid or the breast to be detected; obtaining detection information of the thyroid or the mammary gland to be detected according to the ultrasonic image, wherein the detection information of the thyroid comprises TI-RADS detection information, and the detection information of the mammary gland to be detected comprises BI-RADS detection information; calculating the confidence of the detection information;
a display that displays the detection information and a confidence of the detection information.
2. The system of claim 1, wherein the detection information includes at least two feature item information, and wherein calculating a confidence level for the detection information includes: and calculating the total confidence of the at least two characteristic item information.
3. The system of claim 1, wherein the detection information includes at least one feature item information; wherein calculating the confidence level of the detection information comprises: calculating a confidence level of the at least one feature item information of the detection information.
4. The system of claim 3, wherein said displaying said detection information and a confidence level of said detection information comprises,
displaying at least one feature item information of the detection information, and displaying a confidence corresponding to the feature item information in the vicinity of the at least one feature item information.
5. The system of any of claims 1-4, wherein the displaying the detection information and the confidence level of the detection information comprises: and displaying the confidence level through a degree icon, wherein the degree icon represents the degree of the confidence level through the change of the icon.
6. The system of claim 5, wherein the degree icon comprises at least one of the following icon types:
a variable-number icon, representing the degree of the confidence by the number of the icons;
a variable-proportion icon, representing the degree of the confidence by the proportion of a specific part of the icon;
a variable-color icon, representing the degree of the confidence by the color of the icon;
a variable-shape icon, representing the degree of the confidence by the shape of the icon;
an icon with a pointer, representing the degree of the confidence by the character to which the pointer points;
a digital icon, representing the degree of the confidence numerically; and
a text icon, representing the degree of the confidence through text.
7. The system of claim 5, wherein the degree icon comprises an icon displayed in conjunction with the detection information, wherein the icon displayed in conjunction with the detection information comprises:
the variable-state detection information icon displays the detection information through the detection information icon, and the degree of confidence is expressed through the state of the detection information icon.
8. The system of any one of claims 1 to 7, wherein the displaying the detection information and the confidence level of the detection information comprises: and displaying the detection information through a list or displaying the detection information through an indication icon.
9. The system of any one of claims 1 to 8, wherein the processor-implemented calculating the confidence level of the detection information comprises:
calculating the validity of the ultrasonic image;
and calculating the confidence of the detection information according to the validity of the ultrasonic image.
10. The system of claim 9, wherein the calculating the validity of the ultrasound image performed by the processor comprises:
calculating a sharpness of the ultrasound image based on the ultrasound image;
and calculating the validity of the ultrasonic image through the definition of the ultrasonic image.
11. The system of claim 10, wherein the processor executing the calculating a sharpness of the ultrasound image based on the ultrasound image comprises:
determining an effective area from the ultrasonic image;
detecting gradient information of the effective area;
and calculating the definition of the ultrasonic image according to the gradient information.
12. The system of claim 10, wherein the processor executing the calculating a sharpness of the ultrasound image based on the ultrasound image comprises:
training a first artificial intelligent model by inputting two types of thyroid or mammary gland ultrasonic images with clear effective areas and fuzzy effective areas;
and inputting the ultrasonic image of the thyroid or the breast to be detected into the first artificial intelligent model to obtain a result of the definition of the ultrasonic image output by the first artificial intelligent model.
13. The system of claim 9, wherein the processor-implemented calculating the validity of the ultrasound image comprises:
determining an effective area from the ultrasonic image;
calculating the gray average value of the effective area;
and calculating the effectiveness of the ultrasonic image according to the gray average value.
14. The system of claim 9, wherein the processor-implemented calculating the validity of the ultrasound image comprises:
detecting the gray level of the ultrasonic image, and determining the effectiveness of the ultrasonic image according to the gray level of the ultrasonic image; or,
detecting the existence of spots, snowflakes or webbing in the ultrasonic image, thereby determining the validity of the ultrasonic image; or,
detecting the effective area ratio of the ultrasonic image, and determining the effectiveness of the ultrasonic image according to the effective area ratio of the ultrasonic image; or,
detecting the used probe, probe parameters and/or imaging parameters, and determining the validity of the ultrasonic image according to the corresponding relation between the type of the probe, the probe parameters and/or the imaging parameters and the thyroid or mammary gland to be detected included in the ultrasonic image.
15. The system of any one of claims 1 to 8, wherein the processor-implemented calculating the confidence level of the detection information comprises:
identifying a target structure of a thyroid or breast in the ultrasound image;
and calculating the confidence of the detection information according to the recognition result of the target structure.
16. The system of claim 15, wherein the target structure of the thyroid comprises: thyroid tissue, carotid artery, trachea.
17. The system of claim 16, wherein the processor-implemented identifying a target structure of a thyroid gland in the ultrasound image comprises:
identifying thyroid tissue in an ultrasonic image of a thyroid to be detected;
identifying a positional relationship of the thyroid tissue to other target structures.
18. The system of claim 15, wherein the target structure of the breast comprises a layered structure comprising at least one of a skin layer, a subcutaneous fat layer, a glandular tissue layer, a pectoral layer, a costal layer.
19. The system of any one of claims 15 to 18, wherein the processor-implemented identifying a target structure of a thyroid or breast in the ultrasound image comprises,
training at least one second artificial intelligence model by inputting an ultrasound image of the thyroid or breast containing the target structure;
and inputting the ultrasonic image of the thyroid or the breast to be detected into the at least one second artificial intelligence model, and obtaining the identification result of the target structure of the thyroid or the breast in the ultrasonic image output by the at least one second artificial intelligence model.
20. The system of any one of claims 1 to 8, wherein the processor executing obtaining detection information of the thyroid or breast to be detected from the ultrasound image comprises,
training at least one third artificial intelligence model by inputting an ultrasonic image of the thyroid or the breast corresponding to the detection information;
and inputting the ultrasonic image of the thyroid or the breast to be detected into the at least one third artificial intelligence model to obtain the detection information output by the at least one third artificial intelligence model.
21. The system of claim 20, wherein the processor-implemented calculating the confidence level of the detection information comprises,
and calculating the confidence of the detection information according to the performance of the at least one third artificial intelligence model.
22. The system of any one of claims 1 to 21, wherein said calculating a confidence level of said detection information comprises,
calculating index confidence of the detection information through at least two indexes;
the indicators include: validity of the ultrasound image, a target structure of a thyroid or breast in the ultrasound image, or performance of an artificial intelligence model;
and weighting and calculating the confidence degrees of at least two indexes to obtain the confidence degree of the detection information.
23. An ultrasound imaging system, comprising:
the processor acquires an ultrasonic image of the thyroid or the breast to be detected; obtaining detection information of the thyroid or the breast to be detected according to the ultrasonic image, wherein the detection information comprises TI-RADS detection information of the thyroid or BI-RADS detection information of the breast; calculating the confidence of the detection information;
a display that displays the detection information and a confidence of the detection information.
24. An ultrasound imaging method, comprising:
transmitting ultrasonic waves to the thyroid or the mammary gland to be detected and receiving ultrasonic echoes to obtain ultrasonic echo signals;
processing the ultrasonic echo signal to obtain an ultrasonic image of the thyroid or the breast to be detected;
obtaining detection information of the thyroid or the breast to be detected according to the ultrasonic image, wherein the detection information comprises TI-RADS detection information of the thyroid or BI-RADS detection information of the breast;
calculating the confidence of the detection information;
and displaying the detection information and the confidence of the detection information.
25. An ultrasound imaging method, comprising:
acquiring an ultrasonic image of the thyroid or the breast to be detected;
obtaining detection information of the thyroid or the mammary gland to be detected according to the ultrasonic image, wherein the detection information of the thyroid comprises TI-RADS detection information, and the detection information of the mammary gland to be detected comprises BI-RADS detection information;
calculating the confidence of the detection information;
and displaying the detection information and the confidence of the detection information.
26. An ultrasound imaging system, comprising,
the processor acquires an ultrasonic image of the tissue to be detected; acquires detection information according to the ultrasonic image of the tissue to be detected; and calculates the confidence of the detection information;
a display that displays the detection information and a confidence of the detection information.
27. A computer storage medium, on which a computer program is stored for application in an ultrasound imaging apparatus, which computer program, when being executed by a processor, carries out the method as set forth in claim 24 or 25.
CN202011345372.XA 2019-11-27 2020-11-26 Ultrasonic imaging system, ultrasonic imaging method and storage medium Pending CN112842394A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911183438 2019-11-27
CN2019111834387 2019-11-27

Publications (1)

Publication Number Publication Date
CN112842394A true CN112842394A (en) 2021-05-28

Family

ID=75996534

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011345372.XA Pending CN112842394A (en) 2019-11-27 2020-11-26 Ultrasonic imaging system, ultrasonic imaging method and storage medium

Country Status (1)

Country Link
CN (1) CN112842394A (en)


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113450325A (en) * 2021-06-28 2021-09-28 什维新智医疗科技(上海)有限公司 Thyroid nodule benign and malignant recognition device
CN113450325B (en) * 2021-06-28 2022-09-09 什维新智医疗科技(上海)有限公司 Thyroid nodule benign and malignant recognition device
CN114469175A (en) * 2021-12-21 2022-05-13 上海深至信息科技有限公司 Method and device for judging integrity of thyroid gland scanning
CN114469175B (en) * 2021-12-21 2024-04-05 上海深至信息科技有限公司 Thyroid gland scanning integrity judging method and device
CN118014976A (en) * 2024-03-06 2024-05-10 中国医学科学院阜外医院 Ultrasonic inspection system capable of automatically identifying image features and identification method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination