CN112801114B - Method and device for determining projection position information of breast image - Google Patents


Info

Publication number
CN112801114B
Authority
CN
China
Prior art keywords: projection position, position information, image, breast
Prior art date
Legal status
Active
Application number
CN202110075957.2A
Other languages
Chinese (zh)
Other versions
CN112801114A
Inventors
Shi Lei (石磊)
Cheng Gen (程根)
Shi Jing (史晶)
Current Assignee
Hangzhou Yitu Healthcare Technology Co ltd
Original Assignee
Hangzhou Yitu Healthcare Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Yitu Healthcare Technology Co ltd filed Critical Hangzhou Yitu Healthcare Technology Co ltd
Priority to CN202110075957.2A priority Critical patent/CN112801114B/en
Publication of CN112801114A publication Critical patent/CN112801114A/en
Application granted granted Critical
Publication of CN112801114B publication Critical patent/CN112801114B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/5866 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/20 ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10116 X-ray image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30068 Mammography; Breast

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Public Health (AREA)
  • Library & Information Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Databases & Information Systems (AREA)
  • Epidemiology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention discloses a method for determining projection position information of a breast image, comprising the following steps: acquiring first projection position information contained in the label information of each breast image in a group of breast images; acquiring, through an image recognition model, second projection position information of each breast image and a confidence level of the second projection position information; and, when the first and second projection position information of a breast image are inconsistent and the confidence level falls within a preset range, taking the second projection position information as the projection position information of the breast image. With this method, the projection position information of a breast image is automatically recognized by a trained deep-learning image recognition model, so that projection position information can be added to breast images that lack it, and wrong projection position information can be found and corrected.

Description

Method and device for determining projection position information of breast image
Technical Field
The invention relates to the field of medical technology, and in particular to a method and a device for determining projection position information of a breast image, and to a breast image display method.
Background
Mammography (breast X-ray examination) is a common breast examination method. When breast images are taken, one image is captured at each of the MLO (mediolateral oblique) and CC (craniocaudal) projection positions of the left and right breasts. Fig. 1 is a schematic view of a set of breast images from a mammography examination. Referring to fig. 1, the four images are the left MLO view, right MLO view, left CC view, and right CC view. Conventional PACS vendors match images according to the image tags written by the imaging device; such tags generally record left/right breast information, projection position information, device manufacturer information, and so on. In individual cases, however, the tag information may be mislabeled, or not labeled at all, owing to operator error during image capture. Wrong tag information can affect the doctor's subsequent judgment of lesions and may even cause misdiagnosis.
Disclosure of Invention
In order to solve the problem, noted in the background art, that a breast image may lack projection position information or carry inaccurate projection position information, the invention provides a method that automatically generates projection position information and corrects wrong projection position information by means of deep-learning image recognition.
In order to achieve the above object, the present invention provides a method for determining projection position information of a breast image, comprising:
acquiring first projection position information contained in each breast image label information in a group of breast images;
acquiring second projection position information of each breast image and confidence level of the second projection position information through an image identification model;
when the first projection position information and the second projection position information of a breast image are inconsistent and the confidence level falls within a preset range, taking the second projection position information as the projection position information of the breast image.
Optionally, the method further comprises:
when the first projection position information contained in the breast image label information cannot be acquired, the second projection position information is used as the projection position information of the breast image.
Optionally, the projection position information includes a CC projection position and an MLO projection position, and the method further includes:
and when more than two images in the group of breast images are determined to be CC projection positions (or more than two to be MLO projection positions), selecting the two breast images with the highest confidence as the images of that projection position.
Optionally, the method further comprises:
and determining left and right breast information of the breast image according to the characteristic information in the breast image.
Optionally, the feature information includes:
calcifications, tumors, and structural distortions.
Optionally, the method further comprises:
acquiring equipment information which is contained in the mammary gland image tag information and used for generating a mammary gland image;
when the equipment information contains preset information, the first projection position information is used as the projection position information of the breast image.
Optionally, the device information includes one of a device manufacturer and a device model.
The invention also provides a device for determining the projection position information of the mammary gland image, which comprises:
the first information acquisition unit is used for acquiring first projection position information contained in each breast image label information in a group of breast images;
the second information acquisition unit is used for acquiring, through the image recognition model, second projection position information of the breast image and the confidence level thereof;
and the projection position information acquisition unit is used for taking the second projection position information as the projection position information of the breast image when the first projection position information and the second projection position information of the breast image are inconsistent and the confidence degree belongs to a preset range.
Optionally, in the apparatus,
the first information acquisition unit is further used for acquiring equipment information which is contained in the mammary gland image tag information and is used for generating a mammary gland image;
the projection position information acquisition unit is further used for taking the first projection position information as projection position information of the breast image when the equipment information contains preset information.
The invention also provides a breast image display method, which comprises the following steps:
and displaying the breast image in response to a first operation, wherein displaying the breast image includes displaying the projection position information of the breast image and the way in which that projection position information was obtained.
According to the method and the device for determining the projection position information of the breast image, the projection position information of the breast image is automatically recognized through the trained image recognition model, the projection position information can be added to the breast image without the projection position information, and the wrong projection position information can be found and corrected.
Further, the method and the device for determining the projection position information of a breast image provided by the invention take into account that the image recognition model may generalize poorly to breast images produced by devices of certain manufacturers or models. Device information for which generalization is poor is preset, and when a breast image was produced by a device of such a designated manufacturer or model, use of the image recognition model is prohibited, thereby preventing wrong projection position information from misleading a doctor into an incorrect diagnosis.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic illustration of a set of breast images in a mammography examination;
FIG. 2 is a flowchart of a method for determining projection position information of a breast image according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of determining depth of focus in a CC-site breast image according to an embodiment of the present invention;
FIG. 4 is a schematic illustration of determining lesion depth in an MLO-site breast image according to an embodiment of the invention;
fig. 5 is a schematic structural diagram of an apparatus for determining projection position information of a breast image according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail below with reference to the accompanying drawings, and it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Fig. 2 is a flow chart of a method of determining projection position information of a breast image according to an embodiment of the present invention. As shown in fig. 2, the method for determining the projection position information of the breast image includes:
s101, first projection position information contained in each breast image label information in a group of breast images is obtained.
S102, obtaining second projection position information and confidence coefficient of each breast image through an image recognition model.
S103, when the first projection position information and the second projection position information of the breast image are inconsistent and the confidence value belongs to a preset range, the second projection position information is used as the projection position information of the breast image.
S101, first projection position information contained in the label information of each breast image in a group of breast images is obtained. Taking a breast molybdenum-target (mammography) image as an example, breast images are standard DICOM images (Digital Imaging and Communications in Medicine, the international standard for medical images and related information), and the attribute information of a breast image is recorded in the DICOM file under corresponding tags. The attribute information includes the projection position information entered by the doctor when the image was taken. Specifically, after the doctor helps the patient into position, the doctor selects the projection position on the operation interface or with the operation buttons of the mammography device, and the device adds the projection position information to the attribute information of the image. On some devices, the attribute information may also be rendered directly in the image, for example in a small font in the upper-left corner or another location that does not obscure breast tissue; in that case, the projection position information can be obtained from the breast image by optical character recognition (OCR).
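As a rough illustration (not code from the patent), the tag-derived "first projection position information" could be read from the DICOM View Position attribute, which libraries such as pydicom expose as a keyword attribute on the dataset. The helper below is a hypothetical sketch: it accepts any object exposing a `ViewPosition` attribute, so a stand-in object is used here in place of a real dataset.

```python
from types import SimpleNamespace

def first_projection_info(ds):
    """Return 'CC', 'MLO', or None if the View Position tag is absent/empty."""
    view = getattr(ds, "ViewPosition", None)  # pydicom keyword for (0018,5101)
    if not view:
        return None  # the "missing label" case described in the text
    view = str(view).strip().upper()
    return view if view in ("CC", "MLO") else None

# With pydicom the dataset would come from pydicom.dcmread(path);
# a SimpleNamespace stands in for it here.
print(first_projection_info(SimpleNamespace(ViewPosition="MLO")))  # MLO
print(first_projection_info(SimpleNamespace()))                    # None
```

OCR fallback for tags burned into the pixels (the last case above) is deliberately omitted; it would sit behind the same interface.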
S102, second projection position information of each breast image and its confidence level are obtained through an image recognition model. The image recognition model is trained on a large number of breast images together with projection position annotations produced by professional readers. The trained model outputs the projection position information of a breast image to be recognized and the confidence level corresponding to that information; the higher the confidence, the more certain the model is of its output. The confidence level may be expressed as a value in [0, 1]. For example, when the model's output indicates whether the projection position of a breast image is the MLO position, the closer the value is to 1, the more likely the image is an MLO projection position, and the closer the value is to 0, the more likely it is a CC projection position.
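The single-score convention just described can be sketched as follows (a hypothetical helper, not from the patent): the model emits one value in [0, 1] for the MLO class, and the predicted view plus its confidence are derived from it.

```python
def interpret_output(p_mlo):
    """Map a model's MLO-vs-CC score in [0, 1] to (predicted view, confidence)."""
    if p_mlo >= 0.5:
        return "MLO", p_mlo        # close to 1: confidently MLO
    return "CC", 1.0 - p_mlo       # close to 0: confidently CC

# e.g. a score of 0.02 means the image is a CC view with confidence 0.98
print(interpret_output(0.02))
```

This mirrors the examples later in the description, where a CC-position confidence of 0.02 is read as an MLO-position confidence of 0.98.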
S103, when the first projection position information and the second projection position information of the breast image are inconsistent and the confidence degree belongs to a preset range, the second projection position information is used as the projection position information of the breast image.
In most cases, the first and second projection position information of a breast image should be identical, but in a few cases they are not. For example, the first projection position information of a certain breast image may be the CC projection position while the second is the MLO projection position. In this embodiment, considering that the probability of operator error is low, the second projection position information is taken as the projection position information of the breast image only when the output of the image recognition model is sufficiently certain, for example when the confidence of the second projection position information is greater than or equal to 0.8.
The method further comprises preprocessing the breast image, where the preprocessing includes binarizing the breast image, that is, segmenting the breast edge. This reduces irrelevant detail in the breast image and improves the recognition accuracy of the image recognition model.
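The patent does not specify a binarization algorithm; as one plausible sketch, a global threshold (mean-based here, where Otsu's method would be a common alternative) separates breast tissue from the dark background:

```python
import numpy as np

def binarize_breast(image, threshold=None):
    """Return a 0/1 mask of the breast image; pixels above threshold are 1."""
    img = np.asarray(image, dtype=float)
    if threshold is None:
        threshold = img.mean()  # illustrative default, not from the patent
    return (img > threshold).astype(np.uint8)

mask = binarize_breast(np.array([[0.0, 200.0], [180.0, 5.0]]))
print(mask)
```

The resulting mask (the breast outline) is what the image recognition model would then consume.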
The method for determining the projection position information of the breast image further comprises the following steps: when the first projection position information contained in the breast image label information cannot be acquired, the second projection position information is used as the projection position information of the breast image. In some breast images, the projection position information may be missing, and at this time, the problem that the first projection position information and the second projection position information are inconsistent as described above does not exist, and in this case, the second projection position information is directly used as the projection position information of the breast image.
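Combining the disagreement rule of S103 with the missing-label fallback just described, the selection logic can be sketched as follows (the 0.8 threshold is the example value given in this embodiment; the function name is illustrative):

```python
def resolve_projection(first_info, second_info, confidence, threshold=0.8):
    """Return the projection position to record for one breast image."""
    # No tag-derived information: fall back to the model's prediction.
    if first_info is None:
        return second_info
    # Disagreement with a sufficiently confident model: trust the model.
    if first_info != second_info and confidence >= threshold:
        return second_info
    # Otherwise keep the tag-derived information.
    return first_info

print(resolve_projection("CC", "MLO", 0.98))  # MLO
print(resolve_projection("CC", "MLO", 0.60))  # CC
```

Note that when the model's confidence is below the threshold, the tag wins even on disagreement, matching the assumption that operator error is rare.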
Further, in the method for determining the projection position information of a breast image according to the embodiment of the invention, the projection position information comprises CC projection positions and MLO projection positions, and when more than two images in the group of breast images are determined to be CC projection positions (or more than two to be MLO projection positions), the two breast images with the highest confidence are selected as the images of that projection position.
In mammography, depending on the patient and the condition, sometimes only one breast is imaged, but in most cases both breasts are imaged. When both breasts are imaged there are four breast images: two CC-position images and two MLO-position images.
However, in some cases, although the first projection position information acquired in S101 indicates two CC-position and two MLO-position breast images, after the second projection position information and its confidence are acquired in S102 and the judgment of S103 is made, three CC-position or three MLO-position breast images may result. This clearly violates the requirement that a mammography examination output two CC-position and two MLO-position images, so it must be determined which two are the correct CC (or MLO) projection positions.
The solution in this case is described in detail below by means of specific examples.
After the first group of four breast images is received, the first projection position information contained in the label information of each breast image in the group is acquired: the label information of the first and second breast images indicates CC projection positions, and that of the third and fourth indicates MLO projection positions. The second projection position information and the corresponding confidence values are then acquired through the image recognition model: the CC-position confidence of the first breast image is 0.98, that of the second is 0.02, that of the third is 0.97, and that of the fourth is 0.03. The model thus considers the CC-position confidence of the second breast image to be only 0.02, i.e., its MLO-position confidence is 0.98, and this value is within the preset range (with 0.8 as the threshold, 0.98 > 0.8), so the result is sufficiently certain and the MLO projection position is taken as the projection position of the second breast image. Likewise, the CC projection position is taken as the projection position of the third breast image.
After the second group of four breast images is received, the first projection position information contained in the label information of each breast image in the group is acquired: the label information of the first and second breast images indicates CC projection positions, and that of the third and fourth indicates MLO projection positions. The second projection position information and the corresponding confidence values are then acquired through the image recognition model: the CC-position confidence of the first breast image is 0.98, that of the second is 0.99, that of the third is 0.97, and that of the fourth is 0.03. The model thus considers the third breast image to be a CC projection position, and the confidence value is within the preset range, so the output is sufficiently certain. However, since a mammography examination cannot output three CC-position breast images, the MLO projection position is taken as the projection position of the third breast image (among the three, the CC-position confidence of the third image is the lowest).
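Both examples above reduce to ranking the group by the model's CC score: the two images most likely to be CC views are labeled CC and the other two MLO, which enforces the two-per-view constraint. A hypothetical sketch for a group of four images:

```python
def assign_views(cc_scores):
    """cc_scores: one CC-view confidence per image (MLO confidence = 1 - score).
    Returns a view label per image, forcing exactly two CC and the rest MLO."""
    top_two = set(sorted(range(len(cc_scores)),
                         key=lambda i: cc_scores[i], reverse=True)[:2])
    return ["CC" if i in top_two else "MLO" for i in range(len(cc_scores))]

print(assign_views([0.98, 0.02, 0.97, 0.03]))  # first group in the text
print(assign_views([0.98, 0.99, 0.97, 0.03]))  # second group in the text
```

On the second group this picks images one and two as CC and demotes the third (lowest CC confidence of the three) to MLO, matching the resolution described above.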
The method for determining the projection position information of the breast image, provided by the embodiment of the invention, further comprises the following steps: and determining left and right breast information of the breast image according to the characteristic information in the breast image.
Among breast images, the contour shapes of images taken at different projection positions differ, so the projection positions of different breast images can be recognized by a trained image recognition model. The left and right breast images of the same projection position, however, show no consistent difference in contour shape; any differences that do exist are specific to individual cases rather than general, so it is difficult to distinguish the left and right breasts of the same projection position by training an image recognition model (it is difficult even for experienced doctors). In this embodiment, determining the left and right breast information of a breast image according to the feature information in the breast image refers to the following: on the premise that at least one image at the CC or MLO projection position already carries left/right breast information, the corresponding breast image at the other projection position is determined according to the feature information in the image of the known projection position.
To this end, in this embodiment, the lesions contained in each breast image of the group are first identified by a lesion recognition model, and the lesion information is used as the feature information, which includes: calcifications, tumors, and structural distortions. The feature information is obtained with a trained lesion recognition model, which is prior art and is not described in detail here.
After the feature information contained in each breast image is acquired, the similarity between features can be computed in different ways. For example, it can be calculated from distance measures between the features in the breast images, such as the Euclidean distance or the Mahalanobis distance; it can also be measured by the cosine of the angle between feature vectors, by information entropy, and so on. When similarity is determined by a distance measure, the larger the distance between the features, the smaller their similarity, i.e., the less similar they are.
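Any of the measures named above would serve; for instance, Euclidean distance (dissimilarity grows with distance) and cosine similarity over simple feature vectors can be written in a few lines (a generic sketch, not code from the patent):

```python
import math

def euclidean(a, b):
    """Euclidean distance between two feature vectors: larger = less similar."""
    return math.dist(a, b)

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors: 1 = identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

print(euclidean((0.0, 0.0), (3.0, 4.0)))        # 5.0
print(cosine_similarity((1.0, 0.0), (0.0, 1.0)))
```

Mahalanobis distance would additionally require a covariance estimate over the feature population, so it is omitted from this sketch.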
The procedure of determining the left and right breast information of a breast image after acquiring the feature information is described in detail below through a specific example. Assume that after the third group of four breast images is received, the first projection position information and the left/right breast information contained in the label information of each image in the group are acquired. The label information of the first breast image indicates the right breast but contains no projection position information; the label information of the second, third, and fourth breast images contains neither left/right breast information nor projection position information. The projection position of each breast image is determined by the method described above: the first and second breast images are confirmed to be CC projection positions, and the third and fourth to be MLO projection positions. Since the first breast image is known to be of the right breast, the second breast image, the other CC-position image, can be judged to be of the left breast. The first breast image is then compared with the third and fourth breast images respectively, judging the similarity of the features contained in the first image against those in the third and fourth images.
The feature similarity comparison is performed by comparing distances between the features in each breast image. Such a distance may include the depth of a lesion, for example a depth determined from the positions of the lesion and the nipple. Fig. 3 is a schematic diagram of determining lesion depth in a CC-position breast image according to an embodiment of the present invention. Fig. 4 is a schematic diagram of determining lesion depth in an MLO-position breast image according to an embodiment of the invention. In the right-breast CC-position image shown in fig. 3, or the MLO-position image shown in fig. 4: with the nipple as the center, draw an arc of a preset radius; the straight line through the points where this arc intersects the breast edge is the first straight line. The straight line through the nipple and perpendicular to the first straight line is the second straight line. The projection of the lesion onto the second straight line is then determined, and the distance between this projection and the nipple is taken as the depth of the lesion. Since a lesion depth determined in this way from the lesion and nipple positions remains essentially unchanged across projection positions, the same lesion can be identified accurately and quickly in images of different projection positions.
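The construction of figs. 3 and 4 can be written directly with 2-D vectors. This sketch assumes the two arc/edge intersection points are already known and that pixel coordinates are treated as Cartesian:

```python
import numpy as np

def lesion_depth(nipple, edge_a, edge_b, lesion):
    """Depth = distance from the nipple to the lesion's projection on the
    line through the nipple perpendicular to the chord edge_a-edge_b."""
    nipple, lesion = np.asarray(nipple, float), np.asarray(lesion, float)
    first_dir = np.asarray(edge_b, float) - np.asarray(edge_a, float)
    first_dir /= np.linalg.norm(first_dir)                # first straight line
    second_dir = np.array([-first_dir[1], first_dir[0]])  # perpendicular, unit
    # Signed projection of the nipple-to-lesion vector onto the second line.
    return abs(float((lesion - nipple) @ second_dir))

# Nipple at origin, chord vertical through x=1, lesion 3 units deep.
print(lesion_depth((0, 0), (1, -1), (1, 1), (3, 2)))  # 3.0
```

Because the projection onto the unit perpendicular gives the distance directly, the explicit foot-of-perpendicular point never needs to be computed.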
In this embodiment, the lesion in the first breast image is the first lesion, the lesion in the third breast image is the third lesion, and the lesion in the fourth breast image is the fourth lesion. The similarity between the first and third lesions may be determined from their depths: for example, if the depth of the first lesion is depth 1 and the depth of the third lesion is depth 2, and distance is used as the measure, the first similarity difference between them is |depth 1 − depth 2|. In the same way, the depths of the first and fourth lesions give a second similarity difference between the first and fourth lesions. The two similarity differences are then compared; the smaller the similarity difference, the higher the similarity between the two lesions. When the first similarity difference is smaller than the second, the third breast image is determined to be the right MLO projection position image; when the first similarity difference is larger than the second, the fourth breast image is determined to be the right MLO projection position image.
In order to improve the matching precision, other information in the lesion information may be further considered, such as the sign of the lesion and the size of the lesion. The similarity between the sign of the first lesion and the signs of the third and fourth lesions may be determined, and combined with the depth similarity to determine the similarity between the first lesion and the third and fourth lesions. Illustratively, if the depth of the first lesion is depth 1, the depth of the third lesion is depth 2, the sign of the first lesion is sign 1, and the sign of the third lesion is sign 2, then the first similarity difference between the first lesion and the third lesion is: first similarity difference = |depth 1 - depth 2| + |sign 1 - sign 2|. In the same way, a second similarity difference between the first lesion and the fourth lesion is obtained and compared with the first similarity difference. When the first similarity difference is smaller than the second similarity difference, the third breast image is taken as the right MLO projection position image; when the first similarity difference is larger than the second similarity difference, the fourth breast image is taken as the right MLO projection position image.
In other embodiments, the similarity between the size of the first lesion and the sizes of the third and fourth lesions may be determined, and combined with the depth similarity to determine the similarity between the first lesion and the third and fourth lesions. For example, if the depth of the first lesion is depth 1, the depth of the third lesion is depth 2, the size of the first lesion is size 1, and the size of the third lesion is size 2, then the first similarity difference between the first lesion and the third lesion is: first similarity difference = |depth 1 - depth 2| + |size 1 - size 2|. In the same way, a second similarity difference between the first lesion and the fourth lesion is obtained and compared with the first similarity difference. When the first similarity difference is smaller than the second similarity difference, the third breast image is taken as the right MLO projection position image; when the first similarity difference is larger than the second similarity difference, the fourth breast image is taken as the right MLO projection position image.
In further embodiments, the similarity between the sign of the first lesion and the signs of the third and fourth lesions, and the similarity between the size of the first lesion and the sizes of the third and fourth lesions, may both be determined; together with the depth similarity, these determine the similarity between the first lesion and the third and fourth lesions. Illustratively, if the depth of the first lesion is depth 1, the depth of the third lesion is depth 2, the sign of the first lesion is sign 1, the sign of the third lesion is sign 2, the size of the first lesion is size 1, and the size of the third lesion is size 2, then the first similarity difference between the first lesion and the third lesion is: first similarity difference = |depth 1 - depth 2| + |sign 1 - sign 2| + |size 1 - size 2|. In the same way, a second similarity difference between the first lesion and the fourth lesion is obtained and compared with the first similarity difference. When the first similarity difference is smaller than the second similarity difference, the third breast image is taken as the right MLO projection position image; when the first similarity difference is larger than the second similarity difference, the fourth breast image is taken as the right MLO projection position image.
In the above implementations, coefficients corresponding to the different parameter types may also be applied when calculating the similarity. Taking the similarity formula of the third implementation as an example, the similarity difference between the first lesion and the third or fourth lesion may be: similarity difference = a|depth 1 - depth 2| + b|sign 1 - sign 2| + c|size 1 - size 2|, where the coefficients a, b and c may be determined according to practical experience.
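The weighted formula can be sketched as follows. This is an illustration only: the dictionary keys, the scalar treatment of the sign term (the text later refines it into a vector distance), and the default coefficients are assumptions.

```python
def similarity_difference(lesion1, lesion2, a=1.0, b=1.0, c=1.0):
    """Weighted similarity difference between two lesions, combining the
    depth, sign and size terms; a, b and c are empirically chosen
    coefficients as described in the text."""
    return (a * abs(lesion1["depth"] - lesion2["depth"])
            + b * abs(lesion1["sign"] - lesion2["sign"])
            + c * abs(lesion1["size"] - lesion2["size"]))
```

A smaller value indicates a more similar pair of lesions, so the candidate image minimizing this quantity is chosen.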
In this embodiment, the sign of a lesion may include at least one category; for example, the sign includes one or a combination of the following: calcification, tumor, and structural distortion. When determining the similarity between the sign of the first lesion and the signs of the third and fourth lesions, a first vector may be determined from the confidence that the first lesion belongs to each category; a second vector is determined from the confidence that the third or fourth lesion belongs to each category; and the distance between the first vector and the second vector is obtained and taken as the similarity between the sign of the first lesion and the sign of the third or fourth lesion.
In the implementation process, a breast image may be input into a calcification detection model, a tumor detection model and a structural distortion detection model respectively, to determine the vector corresponding to a given lesion in the image. For example, if the first breast image is input into the three models and the confidence that the first lesion is a calcification is 0.9, the confidence that it is a tumor is 0.1, and the confidence that it is a structural distortion is 0, then the first vector corresponding to the first lesion is (0.9, 0.1, 0). Similarly, the third or fourth breast image is input into the three models to determine the second vector corresponding to the third or fourth lesion, for example (0.8, 0.1, 0). The distance between the first vector and the second vector is then the similarity between the sign of the first lesion and the sign of the third or fourth lesion. The distance between the first vector and the second vector may be determined according to the L2 norm, or in other manners.
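The L2 distance between sign vectors can be computed as below, using the example vectors from the text; the function name is ours.

```python
import math

def sign_distance(v1, v2):
    """L2 (Euclidean) distance between two lesions' confidence vectors.
    Each component is the confidence that the lesion belongs to one sign
    category (here: calcification, tumor, structural distortion)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(v1, v2)))

# Example from the text: first vector (0.9, 0.1, 0), second vector
# (0.8, 0.1, 0); the distance is 0.1, a small value indicating the two
# lesions' signs are very similar.
```

Other distances (e.g. L1) could be substituted without changing the matching logic.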
In addition, considering that the shape of a lesion may be approximately elliptical, the size of the lesion may be determined according to the long diameter, the short diameter, or the average diameter of the lesion.
The method for determining projection position information of a breast image in this embodiment further includes:
Acquiring, from the breast image tag information, the device information of the device that generated the breast image.
When the equipment information contains preset information, the first projection position information is used as the projection position information of the breast image.
The breast image generated by an X-ray examination device is a standard DICOM image, in which the attribute information of the image is recorded as corresponding tag information, such as the manufacturer of the X-ray examination device, the device model, the projection position information, and the left/right breast information; the manufacturer, projection position information and so on can therefore be obtained from the tag information. For example, a breast image from Fujifilm is identified by the value "FUJIFILM Corporation" of the "Manufacturer" tag; then, by referring to Fujifilm's breast image tag rules, the left/right breast information, image type information and projection position information "R MAMMOGRAPHY, CC" are determined from the "Acquisition Device Processing Description" tag (R indicates a right breast image, MAMMOGRAPHY indicates a molybdenum-target image, and CC indicates a CC projection position image).
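A minimal sketch of parsing such tag values is shown below. It assumes the tags have already been extracted into a plain dictionary (in practice a DICOM library such as pydicom would read them from the file), and the "R MAMMOGRAPHY, CC" value format follows only the single example in the text; other vendors' tag rules would differ.

```python
def parse_fuji_tags(tags):
    """Parse laterality, image type and projection view from a tag dict,
    following the Fujifilm example in the text.  Returns None when the
    manufacturer does not match, since the value format is vendor-specific."""
    if tags.get("Manufacturer") != "FUJIFILM Corporation":
        return None
    desc = tags["Acquisition Device Processing Description"]
    # e.g. "R MAMMOGRAPHY, CC" -> laterality "R", rest "MAMMOGRAPHY, CC"
    laterality, rest = desc.split(" ", 1)
    image_type, view = [part.strip() for part in rest.split(",")]
    return {"laterality": laterality, "type": image_type, "view": view}
```

The returned `view` field ("CC" or "MLO") is the first projection position information of the method.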
In practice, verification on a large amount of data has shown that the image recognition model generalizes poorly on breast images generated by the X-ray examination devices of certain manufacturers, and the error rate of the model's results on such images is relatively high.
The device information is one of device manufacturer and device model. That is, when the breast image comes from an X-ray examination device produced by a specified manufacturer, or from an X-ray examination device of a specified model, the first projection position information is directly adopted as the projection position information of the breast image.
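The overall decision rule of the method can be sketched as follows. The function name and the concrete confidence range are our assumptions (the patent only says the confidence must belong to "a preset range"); here a high-confidence range is used as a placeholder.

```python
def decide_projection(first_info, second_info, confidence,
                      device, trusted_devices,
                      conf_range=(0.9, 1.0)):
    """Choose the projection position information for a breast image.

    - If the device (manufacturer or model) is in the preset trusted list,
      the tag-derived first projection position information is used directly.
    - Otherwise, when the tag and the image recognition model disagree and
      the model's confidence falls within the preset range, the model's
      second projection position information is preferred.
    """
    if device in trusted_devices:
        return first_info
    if first_info != second_info and conf_range[0] <= confidence <= conf_range[1]:
        return second_info
    return first_info
```

For example, a disagreeing high-confidence model prediction overrides the tag only when the image does not come from a trusted device.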
The embodiment of the invention also provides a device for determining the projection position information of a breast image, configured to execute the above method for determining the projection position information of a breast image, comprising:
a first information acquiring unit 101, configured to acquire first projection position information included in each breast image tag information in a set of breast images.
a second information acquisition unit 102, configured to acquire, through the image recognition model, the second projection position information of the breast image and its confidence;
and a projection position information acquisition unit 103, configured to take the second projection position information as the projection position information of the breast image when the first projection position information and the second projection position information of the breast image are inconsistent and the confidence belongs to a preset range.
In this embodiment:
the first information acquisition unit 101 is further configured to acquire, from the breast image tag information, the device information of the device that generated the breast image;
the projection position information acquisition unit 103 is further configured to take the first projection position information as the projection position information of the breast image when the device information contains preset information.
The embodiment of the invention also provides a breast image display method, which comprises the following steps: in response to a first operation, displaying a breast image, including displaying the projection position information of the breast image and the acquisition route of that projection position information.
The first operation may be an operation command, input by an operator to the breast image display device by clicking a mouse, touching a touch pad, or the like, for displaying the breast image. Displaying the projection position information of the breast image means that a schematic label of the CC or MLO projection position is displayed around or within the breast image, at a position that does not affect observation of the breast. Displaying the acquisition route of the projection position information means that, at a similar position, it is indicated whether the projection position was acquired according to the first projection position information or the second projection position information; alternatively, only one of the two routes is marked, and an unmarked image is taken by default to have been acquired via the other route.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (6)

1. A method of determining projection location information for a breast image, comprising:
acquiring first projection position information contained in each breast image label information in a group of breast images;
acquiring second projection position information of each breast image and the confidence of the second projection position information through an image recognition model;
when the first projection position information and the second projection position information of the breast image are inconsistent and the confidence coefficient belongs to a preset range, the second projection position information is used as the projection position information of the breast image;
the method further comprises the steps of: acquiring equipment information which is contained in the mammary gland image tag information and used for generating a mammary gland image; when the equipment information contains preset information, taking the first projection position information as the projection position information of the breast image; the equipment information comprises one of equipment manufacturer and equipment model; the preset information is X-ray inspection equipment produced by a specified manufacturer or X-ray inspection equipment of a specified model.
2. The method as recited in claim 1, further comprising:
when the first projection position information contained in the breast image label information cannot be acquired, the second projection position information is used as the projection position information of the breast image.
3. The method of claim 1, wherein the projection position information comprises a CC projection position and an MLO projection position, the method further comprising:
in the projection position information of the group of breast images, when there are more than two breast images of the CC projection position or of the MLO projection position, selecting the two breast images with the highest confidence as the images of that projection position.
4. The method as recited in claim 1, further comprising:
and determining left and right breast information of the breast image according to the characteristic information in the breast image.
5. The method of claim 4, wherein,
the characteristic information includes one or more of calcifications, bumps, and structural distortions.
6. An apparatus for determining projection location information for a breast image, comprising:
the first information acquisition unit is used for acquiring first projection position information contained in each breast image label information in a group of breast images; acquiring equipment information which is contained in the mammary gland image tag information and used for generating a mammary gland image; when the equipment information contains preset information, taking the first projection position information as the projection position information of the breast image; the equipment information comprises one of equipment manufacturer and equipment model; the preset information is X-ray inspection equipment produced by a specified manufacturer or X-ray inspection equipment of a specified model;
the second information acquisition unit is used for acquiring second projection position information and confidence coefficient thereof in the breast image through the image recognition model;
and the projection position information acquisition unit is used for taking the second projection position information as the projection position information of the breast image when the first projection position information and the second projection position information of the breast image are inconsistent and the confidence degree belongs to a preset range.
CN202110075957.2A 2021-01-20 2021-01-20 Method and device for determining projection position information of breast image Active CN112801114B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110075957.2A CN112801114B (en) 2021-01-20 2021-01-20 Method and device for determining projection position information of breast image

Publications (2)

Publication Number Publication Date
CN112801114A CN112801114A (en) 2021-05-14
CN112801114B true CN112801114B (en) 2024-03-08

Family

ID=75810809

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110075957.2A Active CN112801114B (en) 2021-01-20 2021-01-20 Method and device for determining projection position information of breast image

Country Status (1)

Country Link
CN (1) CN112801114B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109345515A (en) * 2018-09-17 2019-02-15 代黎明 Sample label confidence calculations method, apparatus, equipment and model training method
CN111353549A (en) * 2020-03-10 2020-06-30 创新奇智(重庆)科技有限公司 Image tag verification method and device, electronic device and storage medium
CN111414946A (en) * 2020-03-12 2020-07-14 腾讯科技(深圳)有限公司 Artificial intelligence-based medical image noise data identification method and related device
CN111430014A (en) * 2020-03-31 2020-07-17 杭州依图医疗技术有限公司 Display method, interaction method and storage medium of glandular medical image
CN112115913A (en) * 2020-09-28 2020-12-22 杭州海康威视数字技术股份有限公司 Image processing method, device and equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11657497B2 (en) * 2019-03-26 2023-05-23 The Johns Hopkins University Method and apparatus for registration of different mammography image views

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant