CN112102333B - Ultrasonic region segmentation method and system for B-ultrasonic DICOM (digital imaging and communications in medicine) image - Google Patents


Info

Publication number
CN112102333B
CN112102333B
Authority
CN
China
Prior art keywords
points
pixel values
column
dicom image
maximum value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010907810.0A
Other languages
Chinese (zh)
Other versions
CN112102333A (en)
Inventor
付超
徐车
侯冰冰
吴子健
占倩珊
薛旻
常文军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei University of Technology
Original Assignee
Hefei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei University of Technology
Priority to CN202010907810.0A
Publication of CN112102333A
Application granted
Publication of CN112102333B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/20ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Epidemiology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an ultrasound region segmentation method and system for B-mode ultrasound DICOM (Digital Imaging and Communications in Medicine) images, and relates to the technical field of medical image segmentation. The method reads a B-mode ultrasound DICOM image and acquires the dimensions of its two-dimensional pixel matrix, namely the number of rows M and the number of columns N; a boundary control variable U is then set; the two-dimensional pixel matrix of the DICOM image is traversed from the U-th column on the left to the central N/2-th column, from the central N/2-th column to the (N-U)-th column, from the U-th row at the top to the central M/2-th row, and from the central M/2-th row to the (M-U)-th row, finally yielding coordinates (L, R, T, P) that represent the ultrasound region, from which the ultrasound region is then acquired. The method effectively solves the problem that a large number of useless black areas surround the ultrasound region, thereby reducing their influence on the training of artificial-intelligence-assisted disease diagnosis models.

Description

Ultrasonic region segmentation method and system for B-ultrasonic DICOM (digital imaging and communications in medicine) image
Technical Field
The invention relates to the technical field of medical image segmentation, in particular to an ultrasonic region segmentation method and system for a B-mode ultrasound DICOM image.
Background
With the development of ultrasonic imaging technology, ultrasonic diagnosis has gradually become the most common disease examination means in hospitals; B-mode ultrasonic examination (B-mode ultrasound for short) is the most widely used and most influential of these examinations. B-mode ultrasound can effectively examine superficial small organs such as the thyroid, mammary gland, prostate and carotid artery, and produces DICOM image data, including the patient's basic information and the related medical images, according to the DICOM protocol; doctors read and diagnose these images with a professional DICOM reader. With the development of artificial intelligence, more and more researchers are working on artificial-intelligence-assisted disease diagnosis models based on DICOM image data, and to meet the training requirements of such models the ultrasound region in the B-mode ultrasound DICOM image data first needs to be effectively extracted.
To identify the ultrasound region in B-mode ultrasound DICOM image data, the most common current practice is to read the tag information in the DICOM data. For example, a third-party image processing tool (such as the pydicom package in PYTHON) is used to read the four region coordinates in the DICOM image: (0018,6018) Region Location Min X0, (0018,601A) Region Location Min Y0, (0018,601C) Region Location Max X1 and (0018,601E) Region Location Max Y1, so as to determine the position of the ultrasound region in the DICOM image and acquire the ultrasound region automatically.
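For illustration only, a minimal sketch of this conventional tag-based reading is given below; it assumes the DICOM file actually carries the Sequence of Ultrasound Regions (0018,6011) with the four Region Location elements and a single-frame, single-channel pixel array, and the file name is a placeholder rather than data from this patent.

```python
import pydicom

ds = pydicom.dcmread("example_bus.dcm")            # hypothetical file name
region = ds.SequenceOfUltrasoundRegions[0]         # first ultrasound region item, (0018,6011)
x0 = region.RegionLocationMinX0                    # (0018,6018)
y0 = region.RegionLocationMinY0                    # (0018,601A)
x1 = region.RegionLocationMaxX1                    # (0018,601C)
y1 = region.RegionLocationMaxY1                    # (0018,601E)

# Crop the declared region from the pixel matrix (single-channel case).
ultrasound_area = ds.pixel_array[y0:y1, x0:x1]
```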
This technical method of automatically reading the ultrasound region according to the coordinates is usually only suitable for relatively regular B-mode ultrasound DICOM images, that is, its validity is guaranteed only when the ultrasound region completely coincides with the region determined by the coordinates. In actual B-mode ultrasound diagnosis, however, there is a great deal of irregular DICOM image data in which the ultrasound region is far smaller than the coordinate region; when such irregular images are processed by the coordinate-based reading method, a large number of useless black areas surround the extracted ultrasound region, which affects the training of artificial-intelligence-assisted disease diagnosis models.
Disclosure of Invention
Technical problem to be solved
Aiming at the defects of the prior art, the invention provides an ultrasound region segmentation method and system for B-mode ultrasound DICOM images, which solve the problem that a large number of useless black regions surround the extracted ultrasound region.
(II) technical scheme
In order to achieve the above purpose, the invention is realized by the following technical scheme:
in a first aspect, a method for ultrasound region segmentation for B-mode ultrasound DICOM images is provided, which includes:
reading the B-mode ultrasound DICOM image;
acquiring the dimensions of the two-dimensional pixel matrix of the DICOM image, namely the number of rows M and the number of columns N, based on the B-mode ultrasound DICOM image;
setting a boundary control variable U;
traversing the two-dimensional pixel matrix of the DICOM image from the U-th column on the left to the central N/2-th column; for the selected L-th column, taking the midpoint of the column as the starting point, taking n points upwards and n points downwards, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value L when this maximum is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the central N/2-th column to the (N-U)-th column; for the selected R-th column, taking the midpoint of the column as the starting point, taking n points upwards and n points downwards, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value R when this maximum is E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the U-th row at the top to the central M/2-th row; for the selected T-th row, taking the midpoint of the row as the starting point, taking n points to the left and n points to the right, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value T when this maximum is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the central M/2-th row to the (M-U)-th row; for the selected P-th row, taking the midpoint of the row as the starting point, taking n points to the left and n points to the right, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value P when this maximum is E for the first time;
acquiring the ultrasound region based on the recorded (L, R, T, P) as the ultrasound region boundary coordinates;
U is a natural number and satisfies 0 ≤ U ≤ N/2 and 0 ≤ U ≤ M/2;
E is the preset pixel value of the black area;
n is a positive integer;
L is a positive integer and satisfies U < L < N/2;
R is a positive integer and satisfies N/2 < R < N-U;
T is a positive integer and satisfies U < T < M/2;
P is a positive integer and satisfies M/2 < P < M-U.
Preferably, recording the value L when the maximum of the 2n specific pixel values is not E for the first time specifically includes: recording the value L if, traversing from the (L-U+1)-th column to the L-th column, the maximum of the 2n specific pixel values on each of these columns is E, but the maximum of the 2n specific pixel values on the (L+1)-th column is not E;
recording the value R when the maximum of the 2n specific pixel values is E for the first time specifically includes: recording the value R if, traversing from the R-th column to the (R+U-1)-th column, the maximum of the 2n specific pixel values on each of these columns is E, but the maximum of the 2n specific pixel values on the (R-1)-th column is not E;
recording the value T when the maximum of the 2n specific pixel values is not E for the first time specifically includes: recording the value T if, traversing from the (T-U+1)-th row to the T-th row, the maximum of the 2n specific pixel values on each of these rows is E, but the maximum of the 2n specific pixel values on the (T+1)-th row is not E;
recording the value P when the maximum of the 2n specific pixel values is E for the first time specifically includes: recording the value P if, traversing from the P-th row to the (P+U-1)-th row, the maximum of the 2n specific pixel values on each of these rows is E, but the maximum of the 2n specific pixel values on the (P-1)-th row is not E.
Preferably, the acquiring a two-dimensional pixel matrix dimension of a DICOM image based on the B-mode ultrasound DICOM image, that is, the number of image rows M and the number of image columns N, specifically includes:
acquiring a pixel matrix of the whole DICOM image based on the B-mode ultrasound DICOM image;
acquiring a two-dimensional pixel matrix of the DICOM image based on the pixel matrix of the whole DICOM image;
and obtaining the dimensions of the two-dimensional pixel matrix of the DICOM image, namely the number of rows M and the number of columns N, from the two-dimensional pixel matrix of the DICOM image.
Preferably, the acquiring a two-dimensional pixel matrix of a DICOM image based on the pixel matrix of the whole DICOM image includes:
if the original DICOM image is a single-channel image, directly extracting two-dimensional pixel matrix information;
if the original DICOM image is a multi-channel image, extracting two-dimensional pixel matrix information on any layer of the original DICOM image.
In a second aspect, the present invention further provides an ultrasound region segmentation system for B-mode ultrasound DICOM images, the system comprising:
the data acquisition module is used for reading the B-mode ultrasound DICOM image data;
a matrix dimension acquisition module, configured to acquire dimensions of a two-dimensional pixel matrix of the B-mode ultrasound DICOM image, that is, an image row number M and a column number N;
the boundary control variable setting module is used for setting a boundary control variable U;
the boundary coordinate determination module is used for traversing the two-dimensional pixel matrix of the DICOM image to obtain a left boundary coordinate L, a right boundary coordinate R, an upper boundary coordinate T and a lower boundary coordinate P which represent the boundary of an ultrasonic region;
the ultrasonic region acquisition module is used for acquiring an ultrasonic region in the DICOM image according to the left boundary coordinate L, the right boundary coordinate R, the upper boundary coordinate T and the lower boundary coordinate P;
the data acquisition module, the matrix dimension acquisition module, the boundary coordinate determination module and the ultrasonic region acquisition module are sequentially connected, and the boundary control variable setting module is connected with the matrix dimension acquisition module;
The traversing of the two-dimensional pixel matrix of the DICOM image to obtain the left boundary coordinate L, the right boundary coordinate R, the upper boundary coordinate T and the lower boundary coordinate P representing the boundary of the ultrasound region specifically includes:
traversing the two-dimensional pixel matrix of the DICOM image from the U-th column on the left to the central N/2-th column; for the selected L-th column, taking the midpoint of the column as the starting point, taking n points upwards and n points downwards, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value L when this maximum is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the central N/2-th column to the (N-U)-th column; for the selected R-th column, taking the midpoint of the column as the starting point, taking n points upwards and n points downwards, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value R when this maximum is E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the U-th row at the top to the central M/2-th row; for the selected T-th row, taking the midpoint of the row as the starting point, taking n points to the left and n points to the right, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value T when this maximum is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the central M/2-th row to the (M-U)-th row; for the selected P-th row, taking the midpoint of the row as the starting point, taking n points to the left and n points to the right, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value P when this maximum is E for the first time;
U is a natural number and satisfies 0 ≤ U ≤ N/2 and 0 ≤ U ≤ M/2;
E is the preset pixel value of the black area;
n is a positive integer;
L is a positive integer and satisfies U < L < N/2;
R is a positive integer and satisfies N/2 < R < N-U;
T is a positive integer and satisfies U < T < M/2;
P is a positive integer and satisfies M/2 < P < M-U.
Preferably, recording the value L when the maximum of the 2n specific pixel values is not E for the first time specifically includes: recording the value L if, traversing from the (L-U+1)-th column to the L-th column, the maximum of the 2n specific pixel values on each of these columns is E, but the maximum of the 2n specific pixel values on the (L+1)-th column is not E;
recording the value R when the maximum of the 2n specific pixel values is E for the first time specifically includes: recording the value R if, traversing from the R-th column to the (R+U-1)-th column, the maximum of the 2n specific pixel values on each of these columns is E, but the maximum of the 2n specific pixel values on the (R-1)-th column is not E;
recording the value T when the maximum of the 2n specific pixel values is not E for the first time specifically includes: recording the value T if, traversing from the (T-U+1)-th row to the T-th row, the maximum of the 2n specific pixel values on each of these rows is E, but the maximum of the 2n specific pixel values on the (T+1)-th row is not E;
recording the value P when the maximum of the 2n specific pixel values is E for the first time specifically includes: recording the value P if, traversing from the P-th row to the (P+U-1)-th row, the maximum of the 2n specific pixel values on each of these rows is E, but the maximum of the 2n specific pixel values on the (P-1)-th row is not E.
Preferably, the acquiring a two-dimensional pixel matrix dimension of a DICOM image based on the B-mode ultrasound DICOM image, that is, the number of image rows M and the number of image columns N, specifically includes:
acquiring a pixel matrix of the whole DICOM image based on the B-mode ultrasound DICOM image;
acquiring a two-dimensional pixel matrix of the DICOM image based on the pixel matrix of the whole DICOM image;
and obtaining the dimensions of the two-dimensional pixel matrix of the DICOM image, namely the number of rows M and the number of columns N, from the two-dimensional pixel matrix of the DICOM image.
Preferably, the acquiring a two-dimensional pixel matrix of a DICOM image based on the pixel matrix of the whole DICOM image includes:
if the original DICOM image is a single-channel image, directly extracting two-dimensional pixel matrix information;
if the original DICOM image is a multi-channel image, extracting two-dimensional pixel matrix information on any layer of the original DICOM image.
(III) advantageous effects
The invention provides an ultrasonic region segmentation method and system for B-ultrasonic DICOM images, which have the following beneficial effects compared with the prior art:
After obtaining the dimensions of the two-dimensional pixel matrix of the DICOM image, namely the number of rows M and the number of columns N, the invention sets a boundary control variable U and then traverses the two-dimensional pixel matrix from the U-th column on the left to the central N/2-th column, from the central N/2-th column to the (N-U)-th column, from the U-th row at the top to the central M/2-th row, and from the central M/2-th row to the (M-U)-th row, so as to determine the values L, R, T and P representing the left, right, upper and lower boundaries of the ultrasound region; the ultrasound region is then acquired automatically according to the coordinates (L, R, T, P). Because of this traversal, the method can handle not only the ultrasound region in regular B-mode ultrasound DICOM images but also effectively read the ultrasound region in irregular B-mode ultrasound DICOM images, and the obtained coordinates (L, R, T, P) represent exactly the boundary of the ultrasound region, which avoids the problem of a large number of useless black regions around the extracted ultrasound region and thereby reduces their influence on the training of artificial-intelligence-assisted disease diagnosis models.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a flow chart of the ultrasound region segmentation method for B-mode ultrasound DICOM images according to the present invention;
FIG. 2 is a diagram of the ultrasound region segmentation system for B-mode ultrasound DICOM images according to the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below in a clear and complete manner. It is obvious that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from the given embodiments without creative effort shall fall within the protection scope of the present invention.
The embodiments of the present application provide an ultrasound region segmentation method and system for B-mode ultrasound DICOM images, which solve the prior-art problem that a large number of useless black regions surround the ultrasound region when it is extracted.
In order to solve the technical problems, the general idea of the embodiment of the application is as follows:
according to the embodiment of the invention, after the dimensionality of the DICOM image two-dimensional pixel matrix, namely the number of lines and columns of the image, is obtained, the boundary control variable is set, then the DICOM image two-dimensional pixel matrix is traversed to respectively determine the values L, R, T and P representing the left, right, upper and lower boundaries of the ultrasonic region, the ultrasonic region is automatically obtained according to the coordinates (L, R, T and P), and the obtained coordinates (L, R, T and P) just represent the boundary of the ultrasonic region, so that the problem that a large number of black useless regions exist around the ultrasonic region when the ultrasonic region is obtained according to a traditional coordinate automatic reading technical method is avoided, and the influence on the training process of an artificial intelligent disease auxiliary diagnosis model is reduced.
The embodiment of the invention first provides an ultrasound region segmentation method for B-mode ultrasound DICOM images, which comprises the following steps:
reading the B-mode ultrasound DICOM image;
acquiring the dimensions of the two-dimensional pixel matrix of the DICOM image, namely the number of rows M and the number of columns N, based on the B-mode ultrasound DICOM image;
setting a boundary control variable U;
traversing the two-dimensional pixel matrix of the DICOM image from the U-th column on the left to the central N/2-th column; for the selected L-th column, taking the midpoint of the column as the starting point, taking n points upwards and n points downwards, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value L when this maximum is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the central N/2-th column to the (N-U)-th column; for the selected R-th column, taking the midpoint of the column as the starting point, taking n points upwards and n points downwards, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value R when this maximum is E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the U-th row at the top to the central M/2-th row; for the selected T-th row, taking the midpoint of the row as the starting point, taking n points to the left and n points to the right, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value T when this maximum is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the central M/2-th row to the (M-U)-th row; for the selected P-th row, taking the midpoint of the row as the starting point, taking n points to the left and n points to the right, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value P when this maximum is E for the first time;
acquiring the ultrasound region based on the recorded (L, R, T, P) as the ultrasound region boundary coordinates;
U is a natural number and satisfies 0 ≤ U ≤ N/2 and 0 ≤ U ≤ M/2;
E is the preset pixel value of the black area;
n is a positive integer;
L is a positive integer and satisfies U < L < N/2;
R is a positive integer and satisfies N/2 < R < N-U;
T is a positive integer and satisfies U < T < M/2;
P is a positive integer and satisfies M/2 < P < M-U.
It can be seen that, after obtaining the dimensions of the two-dimensional pixel matrix of the DICOM image, namely the number of rows M and the number of columns N, the invention sets the boundary control variable U, then traverses the two-dimensional pixel matrix from the U-th column on the left to the central N/2-th column, from the central N/2-th column to the (N-U)-th column, from the U-th row at the top to the central M/2-th row, and from the central M/2-th row to the (M-U)-th row, so as to determine the values L, R, T and P representing the left, right, upper and lower boundaries of the ultrasound region, and then automatically acquires the ultrasound region according to the coordinates (L, R, T, P).
In the method of the embodiment of the present invention, when determining and recording the values L, R, T and P representing the boundary coordinates of the ultrasound region, the two-dimensional pixel matrix of the DICOM image is processed as follows. First, traversing from the U-th column on the left to the central N/2-th column, for the L-th column (U < L < N/2) n points (n being a positive integer) are taken upwards and n points downwards from the midpoint of the column and their specific pixel values are obtained; it is judged whether the maximum of these 2n pixel values is E, and if, traversing from the (L-U+1)-th column to the L-th column, the maximum of the 2n pixel values on each column is E, but the maximum of the 2n pixel values on the (L+1)-th column is not E, the value L is recorded. Next, traversing from the central N/2-th column to the (N-U)-th column, for the R-th column (N/2 < R < N-U) n points are taken upwards and n points downwards from the midpoint of the column and their specific pixel values are obtained; if, traversing from the R-th column to the (R+U-1)-th column, the maximum of the 2n pixel values on each column is E, but the maximum of the 2n pixel values on the (R-1)-th column is not E, the value R is recorded. Next, traversing from the U-th row at the top to the central M/2-th row, for the T-th row (U < T < M/2) n points are taken to the left and n points to the right from the midpoint of the row and their specific pixel values are obtained; if, traversing from the (T-U+1)-th row to the T-th row, the maximum of the 2n pixel values on each row is E, but the maximum of the 2n pixel values on the (T+1)-th row is not E, the value T is recorded. Finally, traversing from the central M/2-th row to the (M-U)-th row, for the P-th row (M/2 < P < M-U) n points are taken to the left and n points to the right from the midpoint of the row and their specific pixel values are obtained; if, traversing from the P-th row to the (P+U-1)-th row, the maximum of the 2n pixel values on each row is E, but the maximum of the 2n pixel values on the (P-1)-th row is not E, the value P is recorded. By thus traversing the rows and columns of the two-dimensional pixel matrix of the DICOM image, the ultrasound region in a regular B-mode ultrasound DICOM image can be processed and the ultrasound region in an irregular B-mode ultrasound DICOM image can be read effectively, the obtained coordinates (L, R, T, P) represent the boundary of the ultrasound region more accurately, and the problem of a large number of useless black regions around the extracted ultrasound region is avoided.
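As a non-authoritative illustration of this traversal, the following PYTHON sketch implements the basic first-occurrence criterion described above; the preferred embodiment's additional confirmation over U consecutive columns or rows is only noted in the comments, and the function name and default parameters are assumptions for illustration, not part of the patent.

```python
import numpy as np

def find_boundaries(pixel_matrix: np.ndarray, U: int = 10, n: int = 100, E: int = 0):
    """Return (L, R, T, P) for an M x N two-dimensional pixel matrix.

    Assumes a non-black ultrasound region exists inside the scanned ranges;
    E is the preset pixel value of the black area for the chosen machine.
    """
    M, N = pixel_matrix.shape

    def column_is_black(c):
        mid = M // 2                                          # midpoint of the column
        window = pixel_matrix[max(mid - n, 0):mid + n, c]     # n points up, n points down
        return window.max() == E

    def row_is_black(r):
        mid = N // 2                                          # midpoint of the row
        window = pixel_matrix[r, max(mid - n, 0):mid + n]     # n points left, n points right
        return window.max() == E

    # L: first column, scanning from U towards N/2, whose 2n-point maximum is not E.
    L = next(c for c in range(U, N // 2) if not column_is_black(c))
    # R: first column, scanning from N/2 towards N-U, whose 2n-point maximum is E.
    R = next(c for c in range(N // 2, N - U) if column_is_black(c))
    # T: first row, scanning from U towards M/2, whose 2n-point maximum is not E.
    T = next(r for r in range(U, M // 2) if not row_is_black(r))
    # P: first row, scanning from M/2 towards M-U, whose 2n-point maximum is E.
    P = next(r for r in range(M // 2, M - U) if row_is_black(r))
    # The preferred embodiment would additionally verify U consecutive black
    # columns/rows next to each candidate before recording it.
    return L, R, T, P
```

A call such as find_boundaries(pixel_matrix, U=10, n=100, E=0) would correspond to the parameter choice discussed further below.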
Owing to production processes and manufacturing errors, each B-mode ultrasound machine performs differently, and machines differ in how they render image pixel values. For example, the pixel value of a region that appears black is not necessarily equal to the theoretical value 0 (in theory the pixel or grey value has 256 levels from 0 to 255; a region that appears white has a larger pixel value close or equal to 255, and a region that appears black has a smaller pixel value close or equal to 0). The preset pixel value E that actually represents the black region may therefore differ with the chosen B-mode ultrasound machine; specifically, E may be 0, 1, 2 or another value for different machines, but once the machine is determined, E is a definite value. Hence, in the specific traversal process, the preset pixel value E is determined according to the B-mode ultrasound machine actually used.
In practice, in the above method of the embodiment of the present invention, a computer programming language (e.g. the PYTHON language) may be used to read the B-mode ultrasound DICOM image, process it, and finally obtain the dimensions of its two-dimensional pixel matrix. The specific process of reading and processing the B-mode ultrasound DICOM image with PYTHON is as follows: first, using the SimpleITK and nibabel toolkits provided for the PYTHON language, the B-mode ultrasound DICOM image is read with the SimpleITK.ReadImage() command and converted to another format with the SimpleITK.WriteImage() command; then the pixel matrix of the whole DICOM image is obtained from the B-mode ultrasound DICOM image; then the two-dimensional pixel matrix of the DICOM image is obtained from the pixel matrix of the whole image; and finally the dimensions of the two-dimensional pixel matrix, namely the number of rows M and the number of columns N, are obtained from the two-dimensional pixel matrix. Of course, the computer programming language used in this process is not limited to PYTHON; JAVA or other computer programming languages that achieve the same functions may also be used.
In addition, when acquiring the two-dimensional pixel matrix of the DICOM image from the pixel matrix of the whole DICOM image, some original DICOM images are single-channel images and some are multi-channel images. In an embodiment of the present invention, if the original DICOM image is a single-channel image the two-dimensional pixel matrix information is extracted directly, and if the original DICOM image is a multi-channel image the two-dimensional pixel matrix information of any one of its layers is extracted.
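For concreteness, a minimal sketch of the reading, format-conversion and dimension-extraction steps is given below; it assumes SimpleITK and nibabel are installed, uses a NIfTI file as the intermediate format, and the file names and the choice of the first layer for multi-channel data are illustrative assumptions only, with axis order possibly needing adjustment for a particular scanner's output.

```python
import numpy as np
import SimpleITK as sitk
import nibabel as nib

image = sitk.ReadImage("example_bus.dcm")              # read the B-mode ultrasound DICOM image
sitk.WriteImage(image, "example_bus.nii.gz")            # format conversion
volume = np.asarray(nib.load("example_bus.nii.gz").dataobj)

# Keep a single two-dimensional pixel matrix: squeeze singleton axes, use the data
# directly for a single-channel image, or any one layer for a multi-channel image.
pixel_matrix = np.squeeze(volume)
if pixel_matrix.ndim == 3:
    pixel_matrix = pixel_matrix[..., 0]

M, N = pixel_matrix.shape                                # number of rows M and columns N
```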
In an embodiment of the present invention, through observation and analysis of a large amount of B-mode ultrasound DICOM image data, the inventors found that when the boundary control variable U and the number of points n taken in each direction during traversal are chosen as U = 10 and n = 100, obtaining the values L, R, T and P representing the boundary coordinates of the ultrasound region is the most efficient and gives the best result.
The following describes an implementation process of an embodiment of the present invention in detail, taking the boundary control variable U = 10 and the number of selected points n = 100 in the traversal process as an example.
Fig. 1 is a flowchart of an ultrasound region segmentation method for B-mode ultrasound DICOM images, and referring to fig. 1, the process of the ultrasound region segmentation method for B-mode ultrasound DICOM images includes:
First, using the SimpleITK and nibabel toolkits provided for the PYTHON language, the B-mode ultrasound DICOM image is read with the SimpleITK.ReadImage() command, the image is converted to another format with the SimpleITK.WriteImage() command, and the pixel matrix of the whole DICOM image is obtained from the B-mode ultrasound DICOM image. Then the nibabel.load() command is used to obtain the two-dimensional pixel matrix of the DICOM image; in this process, if the original DICOM image is a multi-channel image, the pixel matrix information of any one of its layers is extracted, and if the original DICOM image is a single-channel image this step is omitted.
The two-dimensional pixel matrix dimensionality of the DICOM image, namely the number M of image rows and the number N of image columns, is obtained based on the two-dimensional pixel matrix of the DICOM image.
A boundary control variable U is set; here U = 10.
The number of points n taken in each direction during traversal is set to n = 100, and the columns and rows of the two-dimensional pixel matrix of the DICOM image are traversed as follows:
first, traversing from the 10th column on the left to the central N/2-th column, for the L-th column (10 < L < N/2) 100 points are taken upwards and 100 points downwards from the midpoint of the column and their specific pixel values are obtained; it is judged whether the maximum of these 200 pixel values is E, and if, traversing from the (L-10+1)-th column to the L-th column, the maximum of the 200 pixel values on each column is E, but the maximum of the 200 pixel values on the (L+1)-th column is not E, the value L is recorded;
next, traversing from the central N/2-th column to the (N-10)-th column, for the R-th column (N/2 < R < N-10) 100 points are taken upwards and 100 points downwards from the midpoint of the column and their specific pixel values are obtained; it is judged whether the maximum of these 200 pixel values is E, and if, traversing from the R-th column to the (R+10-1)-th column, the maximum of the 200 pixel values on each column is E, but the maximum of the 200 pixel values on the (R-1)-th column is not E, the value R is recorded;
next, traversing from the 10th row at the top to the central M/2-th row, for the T-th row (10 < T < M/2) 100 points are taken to the left and 100 points to the right from the midpoint of the row and their specific pixel values are obtained; it is judged whether the maximum of these 200 pixel values is E, and if, traversing from the (T-10+1)-th row to the T-th row, the maximum of the 200 pixel values on each row is E, but the maximum of the 200 pixel values on the (T+1)-th row is not E, the value T is recorded;
finally, traversing from the central M/2-th row to the (M-10)-th row, for the P-th row (M/2 < P < M-10) 100 points are taken to the left and 100 points to the right from the midpoint of the row and their specific pixel values are obtained; it is judged whether the maximum of these 200 pixel values is E, and if, traversing from the P-th row to the (P+10-1)-th row, the maximum of the 200 pixel values on each row is E, but the maximum of the 200 pixel values on the (P-1)-th row is not E, the value P is recorded.
Based on the values L, R, T and P obtained by the above traversal, the boundary coordinates (L, R, T, P) of the ultrasound region are formed, and the ultrasound region is thus acquired automatically.
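As a brief usage sketch (assuming pixel_matrix and the recorded values L, R, T and P from the traversal above, and that rows T..P and columns L..R delimit the ultrasound region), the region could then be cropped as follows:

```python
# Illustrative crop of the ultrasound region from the two-dimensional pixel matrix;
# inclusive boundaries are assumed, hence the +1 on the upper indices.
ultrasound_region = pixel_matrix[T:P + 1, L:R + 1]
```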
Thus, the process of ultrasonic region segmentation for the B-mode ultrasound DICOM image in the embodiment of the invention is completed.
In addition, corresponding to the above method, an embodiment of the present invention further provides an ultrasound region segmentation system for B-mode ultrasound DICOM images, which is shown in Fig. 2 and includes:
the data acquisition module is used for reading the B-mode ultrasound DICOM image data;
a matrix dimension obtaining module, configured to obtain dimensions of a two-dimensional pixel matrix of the B-mode ultrasound DICOM image, that is, a number M of image rows and a number N of columns;
the boundary control variable setting module is used for setting a boundary control variable U;
a boundary coordinate determination module, configured to traverse a two-dimensional pixel matrix of the DICOM image to obtain a left boundary coordinate L, a right boundary coordinate R, an upper boundary coordinate T, and a lower boundary coordinate P representing a boundary of an ultrasound region;
an ultrasonic region acquisition module, configured to acquire an ultrasonic region in the DICOM image according to the left boundary coordinate L, the right boundary coordinate R, the upper boundary coordinate T, and the lower boundary coordinate P;
the data acquisition module, the matrix dimension acquisition module, the boundary coordinate determination module and the ultrasonic region acquisition module are sequentially connected, and the boundary control variable setting module is connected with the matrix dimension acquisition module;
The traversing of the two-dimensional pixel matrix of the DICOM image to obtain the left boundary coordinate L, the right boundary coordinate R, the upper boundary coordinate T and the lower boundary coordinate P representing the boundary of the ultrasound region specifically includes:
traversing the two-dimensional pixel matrix of the DICOM image from the U-th column on the left to the central N/2-th column; for the selected L-th column, taking the midpoint of the column as the starting point, taking n points upwards and n points downwards, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value L when this maximum is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the central N/2-th column to the (N-U)-th column; for the selected R-th column, taking the midpoint of the column as the starting point, taking n points upwards and n points downwards, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value R when this maximum is E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the U-th row at the top to the central M/2-th row; for the selected T-th row, taking the midpoint of the row as the starting point, taking n points to the left and n points to the right, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value T when this maximum is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the central M/2-th row to the (M-U)-th row; for the selected P-th row, taking the midpoint of the row as the starting point, taking n points to the left and n points to the right, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value P when this maximum is E for the first time;
U is a natural number and satisfies 0 ≤ U ≤ N/2 and 0 ≤ U ≤ M/2; E is the preset pixel value of the black area; n is a positive integer; L is a positive integer and satisfies U < L < N/2; R is a positive integer and satisfies N/2 < R < N-U; T is a positive integer and satisfies U < T < M/2; P is a positive integer and satisfies M/2 < P < M-U.
Each step of the ultrasound region acquisition in the method of the embodiment of the present invention corresponds to a step performed by the system of the embodiment of the present invention, and every step the system performs during ultrasound region acquisition is included in the method, so repeated descriptions are omitted here.
The ultrasound region segmentation system for B-mode ultrasound DICOM images according to the present invention is further described in detail below with reference to another specific embodiment.
The system of the embodiment comprises a data acquisition module, a matrix dimension acquisition module, a boundary control variable setting module, a boundary coordinate determination module and an ultrasonic region acquisition module.
The data acquisition module is used for reading the B-mode ultrasound DICOM image data. In this embodiment, the B-mode ultrasound DICOM image is read with the SimpleITK.ReadImage() command and the nibabel toolkit provided for the PYTHON language, and the image is then converted to another format with the SimpleITK.WriteImage() command.
the matrix dimension acquisition module is used for acquiring the dimensions of a two-dimensional pixel matrix of the B-mode ultrasound DICOM image, namely the number of image rows M and the number of image columns N. Based on the B-mode ultrasound DICOM image data read by the data acquisition module, a nibabel.load () command is used for acquiring a two-dimensional pixel matrix of the DICOM image, and then the two-dimensional pixel matrix dimensionality of the DICOM image, namely the image line number M and the line number N, is acquired based on the two-dimensional pixel matrix of the DICOM image. In the process of obtaining the two-dimensional pixel matrix of the DICOM image, if the original DICOM image is a multi-channel image, the pixel matrix information on any layer of the original DICOM image is extracted, and if the original DICOM image is a single-channel image, the step is omitted.
The boundary control variable setting module is used for setting the boundary control variable U. The variable U is a natural number satisfying 0 ≤ U ≤ N/2 and 0 ≤ U ≤ M/2; in practice, extensive test data show that U = 10 greatly improves the test efficiency and gives the best result.
The boundary coordinate determination module is used for traversing the two-dimensional pixel matrix of the DICOM image to obtain the left boundary coordinate L, the right boundary coordinate R, the upper boundary coordinate T and the lower boundary coordinate P representing the boundary of the ultrasound region. A traversal algorithm is applied to the obtained two-dimensional pixel matrix of the DICOM image, traversing from the U-th column on the left to the central N/2-th column, from the central N/2-th column to the (N-U)-th column, from the U-th row at the top to the central M/2-th row, and from the central M/2-th row to the (M-U)-th row, so as to determine the values L, R, T and P representing the left, right, upper and lower boundaries of the ultrasound region and form the coordinates (L, R, T, P).
The ultrasound region acquisition module is used for acquiring the ultrasound region in the DICOM image according to the left boundary coordinate L, the right boundary coordinate R, the upper boundary coordinate T and the lower boundary coordinate P; that is, the required ultrasound region is read automatically from the coordinates (L, R, T, P).
In summary, compared with the prior art, the ultrasound region segmentation method and system for B-mode ultrasound DICOM images provided by the embodiments of the present invention have the following beneficial effects:
according to the method, after the dimensionalities of the DICOM image two-dimensional pixel matrix, namely the number M and the number N of image lines, are obtained, the boundary control variable U is set, then the two-dimensional pixel matrix of the DICOM image is traversed from the U-th column on the left to the N/2 column in the center, from the N/2 column in the middle to the N-U column and from the U-th row on the upper side to the M/2 row in the center, and from the M/2 row in the middle to the M-U row, values L, R, T and P representing the left, right, upper and lower boundaries of an ultrasonic region are determined respectively, then the ultrasonic region is automatically obtained according to coordinates (L, R, T and P).
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (6)

1. An ultrasonic region segmentation method for B-mode ultrasound DICOM images is characterized by comprising the following steps:
reading the B-mode ultrasound DICOM image;
acquiring the dimensions of the two-dimensional pixel matrix of the DICOM image, namely the number of rows M and the number of columns N, based on the B-mode ultrasound DICOM image;
setting a boundary control variable U;
traversing the two-dimensional pixel matrix of the DICOM image from the U-th column on the left to the central N/2-th column; for the selected L-th column, taking the midpoint of the column as the starting point, taking n points upwards and n points downwards, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value L when this maximum is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the central N/2-th column to the (N-U)-th column; for the selected R-th column, taking the midpoint of the column as the starting point, taking n points upwards and n points downwards, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value R when this maximum is E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the U-th row at the top to the central M/2-th row; for the selected T-th row, taking the midpoint of the row as the starting point, taking n points to the left and n points to the right, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value T when this maximum is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the central M/2-th row to the (M-U)-th row; for the selected P-th row, taking the midpoint of the row as the starting point, taking n points to the left and n points to the right, obtaining the specific pixel values of all 2n points, judging whether the maximum of these 2n pixel values is E, and recording the value P when this maximum is E for the first time;
acquiring an ultrasound region based on the recorded (L, R, T, P) as ultrasound region boundary coordinates;
U is a natural number and satisfies 0 ≤ U ≤ N/2 and 0 ≤ U ≤ M/2;
E is a preset pixel value of the black area;
n is a positive integer;
L is a positive integer and satisfies U < L < N/2;
R is a positive integer and satisfies N/2 < R < N-U;
T is a positive integer and satisfies U < T < M/2;
P is a positive integer and satisfies M/2 < P < M-U;
when the maximum of the 2n specific pixel values is not E for the first time, recording the value L specifically includes: recording the value L if, traversing from the (L-U+1)-th column to the L-th column, the maximum of the 2n specific pixel values on each of these columns is E, but the maximum of the 2n specific pixel values on the (L+1)-th column is not E;
when the maximum of the 2n specific pixel values is E for the first time, recording the value R specifically includes: recording the value R if, traversing from the R-th column to the (R+U-1)-th column, the maximum of the 2n specific pixel values on each of these columns is E, but the maximum of the 2n specific pixel values on the (R-1)-th column is not E;
when the maximum of the 2n specific pixel values is not E for the first time, recording the value T specifically includes: recording the value T if, traversing from the (T-U+1)-th row to the T-th row, the maximum of the 2n specific pixel values on each of these rows is E, but the maximum of the 2n specific pixel values on the (T+1)-th row is not E;
when the maximum of the 2n specific pixel values is E for the first time, recording the value P specifically includes: recording the value P if, traversing from the P-th row to the (P+U-1)-th row, the maximum of the 2n specific pixel values on each of these rows is E, but the maximum of the 2n specific pixel values on the (P-1)-th row is not E.
2. The method of claim 1, wherein said acquiring the two-dimensional pixel matrix dimensions of the DICOM image based on the B-mode DICOM image, i.e. the number of image rows M and the number of image columns N, comprises:
acquiring a pixel matrix of the whole DICOM image based on the B-mode ultrasound DICOM image;
acquiring a two-dimensional pixel matrix of the DICOM image based on the pixel matrix of the whole DICOM image;
and obtaining the dimensions of the two-dimensional pixel matrix of the DICOM image, namely the number of rows M and the number of columns N, from the two-dimensional pixel matrix of the DICOM image.
3. The method of claim 2, wherein said obtaining a two-dimensional pixel matrix for a DICOM image based on a pixel matrix for the entire DICOM image comprises:
if the original DICOM image is a single-channel image, directly extracting two-dimensional pixel matrix information;
if the original DICOM image is a multi-channel image, extracting two-dimensional pixel matrix information on any layer of the original DICOM image.
4. An ultrasound region segmentation system for B-mode ultrasound DICOM images, the system comprising:
the data acquisition module is used for reading the B-mode ultrasound DICOM image data;
a matrix dimension obtaining module, configured to obtain dimensions of a two-dimensional pixel matrix of the B-mode ultrasound DICOM image, that is, a number M of image rows and a number N of columns;
the boundary control variable setting module is used for setting a boundary control variable U;
the boundary coordinate determination module is used for traversing the two-dimensional pixel matrix of the DICOM image to obtain a left boundary coordinate L, a right boundary coordinate R, an upper boundary coordinate T and a lower boundary coordinate P which represent the boundary of an ultrasonic region;
the ultrasonic region acquisition module is used for acquiring an ultrasonic region in the DICOM image according to the left boundary coordinate L, the right boundary coordinate R, the upper boundary coordinate T and the lower boundary coordinate P;
the data acquisition module, the matrix dimension acquisition module, the boundary coordinate determination module and the ultrasonic region acquisition module are sequentially connected, and the boundary control variable setting module is connected with the matrix dimension acquisition module;
wherein traversing the two-dimensional pixel matrix of the DICOM image to obtain the left boundary coordinate L, the right boundary coordinate R, the upper boundary coordinate T and the lower boundary coordinate P representing the boundary of the ultrasonic region specifically comprises:
traversing the two-dimensional pixel matrix of the DICOM image from the U-th column on the left to the N/2-th column at the center, selecting the L-th column, taking the midpoint of each column as a starting point, taking n points upwards and n points downwards respectively, obtaining the specific pixel values of all 2n points, judging whether the maximum value of the specific pixel values of the 2n points is E, and recording the L value when the maximum value of the specific pixel values of the 2n points is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the N/2-th column in the middle to the (N-U)-th column, selecting the R-th column, taking the midpoint of each column as a starting point, taking n points upwards and n points downwards respectively, obtaining the specific pixel values of all 2n points, judging whether the maximum value of the specific pixel values of the 2n points is E, and recording the R value when the maximum value of the specific pixel values of the 2n points is E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the U-th row at the top to the M/2-th row, selecting the T-th row, taking the midpoint of each row as a starting point, taking n points to the left and n points to the right respectively, obtaining the specific pixel values of all 2n points, judging whether the maximum value of the specific pixel values of the 2n points is E, and recording the T value when the maximum value of the specific pixel values of the 2n points is not E for the first time;
traversing the two-dimensional pixel matrix of the DICOM image from the M/2-th row in the middle to the (M-U)-th row, selecting the P-th row, taking the midpoint of each row as a starting point, taking n points to the left and n points to the right respectively, obtaining the specific pixel values of all 2n points, judging whether the maximum value of the specific pixel values of the 2n points is E, and recording the P value when the maximum value of the specific pixel values of the 2n points is E for the first time;
U is a natural number satisfying 0 ≤ U ≤ N/2 and 0 ≤ U ≤ M/2;
E is a preset pixel value of the black area;
n is a positive integer;
L is a positive integer satisfying U < L < N/2;
R is a positive integer satisfying N/2 < R < N-U;
T is a positive integer satisfying U < T < M/2;
P is a positive integer satisfying M/2 < P < M-U;
recording an L value when the maximum value of the specific pixel values of the 2n points is not E for the first time specifically comprises: if, traversing from column L-U+1 to column L, the maximum value of the specific pixel values of the 2n points on each column is E, but the maximum value of the specific pixel values of the 2n points on column L+1 is not E, the L value is recorded;
recording an R value when the maximum value of the specific pixel values of the 2n points is E for the first time specifically comprises: if, traversing from column R to column R+U-1, the maximum value of the specific pixel values of the 2n points on each column is E, but the maximum value of the specific pixel values of the 2n points on column R-1 is not E, the R value is recorded;
recording a T value when the maximum value of the specific pixel values of the 2n points is not E for the first time specifically comprises: if, traversing from row T-U+1 to row T, the maximum value of the specific pixel values of the 2n points on each row is E, but the maximum value of the specific pixel values of the 2n points on row T+1 is not E, the T value is recorded;
recording a P value when the maximum value of the specific pixel values of the 2n points is E for the first time specifically comprises: if, traversing from row P to row P+U-1, the maximum value of the specific pixel values of the 2n points on each row is E, but the maximum value of the specific pixel values of the 2n points on row P-1 is not E, the P value is recorded.
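Purely as an illustration of the ultrasonic region acquisition module in claim 4, the recorded coordinates can be used to cut the region out of the two-dimensional matrix. This is a sketch under the assumption that the region is taken as the rectangular sub-matrix bounded by (L, R, T, P); the function name and the 0-based slicing convention are assumptions, not requirements of the claims.

import numpy as np

def extract_ultrasound_region(img: np.ndarray, L: int, R: int, T: int, P: int) -> np.ndarray:
    """Sketch: return the ultrasound region bounded by the left, right,
    upper and lower boundary coordinates (L, R, T, P). The claims use
    1-based inclusive coordinates; 0-based NumPy slicing is assumed here."""
    return img[T:P + 1, L:R + 1]        # rows T..P, columns L..R

Combined with the earlier sketches, the module chain of claim 4 would read the DICOM file, determine (L, R, T, P) by the four traversals, and then crop the ultrasound region with a function of this kind.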
5. The system of claim 4, wherein acquiring the two-dimensional pixel matrix dimensions of the DICOM image based on the B-mode ultrasound DICOM image, i.e., the number of image rows M and the number of image columns N, comprises:
acquiring a pixel matrix of the whole DICOM image based on the B-mode ultrasound DICOM image;
acquiring a two-dimensional pixel matrix of the DICOM image based on the pixel matrix of the whole DICOM image;
and obtaining the dimensions of the two-dimensional pixel matrix of the DICOM image, namely the number of image rows M and the number of image columns N, from the two-dimensional pixel matrix of the DICOM image.
6. The system of claim 5, wherein acquiring a two-dimensional pixel matrix of a DICOM image based on the pixel matrix of the entire DICOM image comprises:
if the original DICOM image is a single-channel image, directly extracting two-dimensional pixel matrix information;
if the original DICOM image is a multi-channel image, extracting two-dimensional pixel matrix information on any layer of the original DICOM image.
CN202010907810.0A 2020-09-02 2020-09-02 Ultrasonic region segmentation method and system for B-ultrasonic DICOM (digital imaging and communications in medicine) image Active CN112102333B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010907810.0A CN112102333B (en) 2020-09-02 2020-09-02 Ultrasonic region segmentation method and system for B-ultrasonic DICOM (digital imaging and communications in medicine) image

Publications (2)

Publication Number Publication Date
CN112102333A CN112102333A (en) 2020-12-18
CN112102333B true CN112102333B (en) 2022-11-04

Family

ID=73757162

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010907810.0A Active CN112102333B (en) 2020-09-02 2020-09-02 Ultrasonic region segmentation method and system for B-ultrasonic DICOM (digital imaging and communications in medicine) image

Country Status (1)

Country Link
CN (1) CN112102333B (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010027476A1 (en) * 2008-09-03 2010-03-11 Rutgers, The State University Of New Jersey System and method for accurate and rapid identification of diseased regions on biological images with applications to disease diagnosis and prognosis
CN103400365A (en) * 2013-06-26 2013-11-20 成都金盘电子科大多媒体技术有限公司 Automatic segmentation method for lung-area CT (Computed Tomography) sequence
CN104637044A (en) * 2013-11-07 2015-05-20 中国科学院深圳先进技术研究院 Ultrasonic image extracting system for calcified plaque and sound shadow thereof
CN104899849A (en) * 2014-01-21 2015-09-09 武汉联影医疗科技有限公司 Multi-target interactive image segmentation method and device
CN105389813A (en) * 2015-10-30 2016-03-09 上海联影医疗科技有限公司 Medical image organ recognition method and segmentation method
CN105701777A (en) * 2016-01-08 2016-06-22 北京航空航天大学 Helical tomotherapy image quality improvement method
CN106997596A (en) * 2017-04-01 2017-08-01 太原理工大学 A kind of Lung neoplasm dividing method of the LBF movable contour models based on comentropy and joint vector
CN107346546A (en) * 2017-07-17 2017-11-14 京东方科技集团股份有限公司 A kind of image processing method and device
CN108257120A (en) * 2018-01-09 2018-07-06 东北大学 A kind of extraction method of the three-dimensional liver bounding box based on CT images
CN109461163A (en) * 2018-07-20 2019-03-12 河南师范大学 A kind of edge detection extraction algorithm for magnetic resonance standard water mould
CN109345585A (en) * 2018-10-26 2019-02-15 强联智创(北京)科技有限公司 A kind of measurement method and system of the Morphologic Parameters of intracranial aneurysm image
CN109829902A (en) * 2019-01-23 2019-05-31 电子科技大学 A kind of lung CT image tubercle screening technique based on generalized S-transform and Teager attribute
CN111166332A (en) * 2020-03-04 2020-05-19 南京鼓楼医院 Radiotherapy target region delineation method based on magnetic resonance spectrum and magnetic resonance image
CN111598895A (en) * 2020-04-14 2020-08-28 苏州复元医疗科技有限公司 Method for measuring lung function index based on diagnostic image and machine learning

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Automatic body segmentation for accelerated rendering of digitally reconstructed radiograph images; O. Dorgham et al.; Informatics in Medicine Unlocked; 20200613; Vol. 20; 1-17 *
Kidney segmentation and three-dimensional reconstruction from CT images and their application in kidney stone puncture surgery; Miao Chunfa; China Master's Theses Full-text Database, Medicine & Health Sciences; 20190715; Vol. 2019 (No. 7); E067-69 *
Design and research of a pulmonary nodule CAD *** incorporating three-dimensional information of DICOM CT sequences; Liu Shaofang; China Master's Theses Full-text Database, Information Science & Technology; 20140115; Vol. 2014 (No. 1); I138-1848 *

Also Published As

Publication number Publication date
CN112102333A (en) 2020-12-18

Similar Documents

Publication Publication Date Title
CN108764286B (en) Classification and identification method of feature points in blood vessel image based on transfer learning
CN109241967B (en) Thyroid ultrasound image automatic identification system based on deep neural network, computer equipment and storage medium
CN112070119B (en) Ultrasonic section image quality control method, device and computer equipment
CN111462049B (en) Automatic lesion area form labeling method in mammary gland ultrasonic radiography video
CN107993221B (en) Automatic identification method for vulnerable plaque of cardiovascular Optical Coherence Tomography (OCT) image
US20230293079A1 (en) Electrocardiogram image processing method and device, medium, and electrocardiograph
CN102265308A (en) System for monitoring medical abnormalities and method of operation thereof
CN112381164B (en) Ultrasound image classification method and device based on multi-branch attention mechanism
CN113284149B (en) COVID-19 chest CT image identification method and device and electronic equipment
CN110838114B (en) Pulmonary nodule detection method, device and computer storage medium
CN112309566A (en) Remote automatic diagnosis system and method for intelligent image recognition and intelligent medical reasoning
CN116386902B (en) Artificial intelligent auxiliary pathological diagnosis system for colorectal cancer based on deep learning
CN111861989A (en) Method, system, terminal and storage medium for detecting midline of brain
CN112614133A (en) Three-dimensional pulmonary nodule detection model training method and device without anchor point frame
CN112508884A (en) Comprehensive detection device and method for cancerous region
CN112614573A (en) Deep learning model training method and device based on pathological image labeling tool
CN114494215A (en) Transformer-based thyroid nodule detection method
CN112102333B (en) Ultrasonic region segmentation method and system for B-ultrasonic DICOM (digital imaging and communications in medicine) image
CN113298773A (en) Heart view identification and left ventricle detection device and system based on deep learning
CN114010227A (en) Right ventricle characteristic information identification method and device
CN111062935B (en) Mammary gland tumor detection method, storage medium and terminal equipment
CN112734886A (en) Biological model die-threading detection method and device, electronic equipment and storage medium
CN113313685B (en) Renal tubular atrophy region identification method and system based on deep learning
Greenstein Earning stripes in medical machine learning
CN116012283B (en) Full-automatic ultrasonic image measurement method, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant