CN111105400A - Valve image processing method, valve image processing device and electronic equipment


Info

Publication number
CN111105400A
Authority
CN
China
Prior art keywords
valve
image
contour
map
level set
Prior art date
Legal status
Granted
Application number
CN201911326686.2A
Other languages
Chinese (zh)
Other versions
CN111105400B (en)
Inventor
胡盛寿
聂宇
储庆
Current Assignee
Fuwai Hospital of CAMS and PUMC
Original Assignee
Fuwai Hospital of CAMS and PUMC
Priority date
Filing date
Publication date
Application filed by Fuwai Hospital of CAMS and PUMC filed Critical Fuwai Hospital of CAMS and PUMC
Priority to CN201911326686.2A
Publication of CN111105400A
Application granted
Publication of CN111105400B
Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30048Heart; Cardiac
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30101Blood vessel; Artery; Vein; Vascular
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Quality & Reliability (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Image Analysis (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)

Abstract

Disclosed are a valve image processing method, a valve image processing device and an electronic apparatus. The valve image processing method comprises the following steps: acquiring a first valve grayscale map and a first lumen map of a previous frame and a second valve grayscale map and a second lumen map of a current frame; obtaining a first level set function image of the previous frame based on the first valve grayscale map; obtaining a second level set function image and a mask image of the current frame from the first valve grayscale map, the first level set function image, the first lumen map, the second valve grayscale map and the second lumen map through a predictor; obtaining a valve region contour map from the second valve grayscale map, the second level set function image and the mask image through a distance regularization level set evolution algorithm; and determining valve information based on the valve region contour map. In this way, accurate information of the valve portion can be obtained.

Description

Valve image processing method, valve image processing device and electronic equipment
Technical Field
The present application relates to the field of image processing technology, and more particularly, to a valve image processing method, a valve image processing apparatus, and an electronic device.
Background
The valve is mainly present in the region of the arterial lumen. It is connected with the surrounding white tissue region and often has texture and gray levels similar to those of the white tissue region, and the shape and position of the valve change at every moment, which makes segmenting and tracking the valve portion very difficult, as shown in fig. 1. Fig. 1 illustrates a schematic diagram of a process of continuous evolution of a valve.
Specifically, fig. 1 shows several consecutive grayscale frames of heart valve images, i.e., frame number 737 through frame number 741. In addition, an image representing the difference and change between two frames can be obtained by a difference calculation between them, i.e., by subtracting the two grayscale maps.
However, in both the grayscale map and the difference image in fig. 1, the valve part information cannot be accurately obtained.
Accordingly, it is desirable to provide an improved valve image processing scheme to obtain accurate information of valve portions.
Disclosure of Invention
The present application is proposed to solve the above-mentioned technical problems. Embodiments of the present application provide a valve image processing method, a valve image processing apparatus, and an electronic device, which predict a valve contour of a current frame from a valve contour of a previous frame through a predictor, and obtain a valve region contour map of the current frame through a distance regularization level set evolution algorithm, thereby obtaining accurate information of a valve portion.
According to an aspect of the present application, there is provided a valve image processing method including: acquiring a first valve grayscale map and a first lumen map of a previous frame and a second valve grayscale map and a second lumen map of a current frame; obtaining a first level set function image of the previous frame based on the first valve grayscale map; obtaining, by a predictor, a second level set function image and a mask image of the current frame from the first valve grayscale map, the first level set function image, the first lumen map, the second valve grayscale map, and the second lumen map; obtaining a valve region contour map from the second valve grayscale map, the second level set function image and the mask image by a distance regularization level set evolution algorithm; and determining valve information based on the valve region contour map.
In the above valve image processing method, obtaining a first level set function image of the previous frame based on the first valve grayscale map comprises: and performing artificial labeling on the first valve gray scale image to obtain the first level set function image.
In the above valve image processing method, obtaining, by a predictor, a second level set function image and a mask image of the current frame from the first valve grayscale map, the first level set function image, the first lumen map, the second valve grayscale map, and the second lumen map includes: determining a first valve contour of the previous frame based on the first valve grayscale map, the first level set function image, and the first lumen map; determining an expansion normal vector of each edge pixel point of the first valve contour; outwardly expanding the first valve contour by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and a mask image of the current frame; obtaining the second level set function image based on the second valve contour and the second valve grayscale map; and obtaining the mask image based on the second valve contour and the second lumen map.
In the above valve image processing method, determining an expansion normal vector of each edge pixel point of the first valve contour includes: calculating a first gradient value and a second gradient value of each edge pixel point of the first valve contour in the x direction and the y direction; and dividing a product of the first gradient value and the second gradient value by a square root of a sum of squares of the first gradient value and the second gradient value and a predetermined constant to obtain the expansion normal vector.
In the above valve image processing method, the step of outwardly expanding the first valve contour by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and a mask image of the current frame further includes: in response to a first valve contour edge point of the previous frame being within a lumen region of the current frame, searching along a normal vector of the first valve contour edge point; determining a search path distance in response to searching for a surrounding white tissue region; and determining the minimum value of the search path distance and the expansion limit threshold value as the expansion coefficient corresponding to the first valve contour edge point.
In the above valve image processing method, the step of outwardly expanding the first valve contour by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and a mask image of the current frame further includes: in response to a second valve contour edge point of the previous frame being within a white tissue region surrounding the valve contour of the current frame, searching along a normal vector of the second valve contour edge point; in response to searching for a surrounding white tissue region, determining a width of the white tissue region; determining a predetermined coefficient of expansion that is less than a predetermined threshold; and determining the minimum value of the preset expansion coefficient, the width of the white tissue area and the expansion limit threshold value as the expansion coefficient corresponding to the second valve contour edge point.
In the valve image processing method, the determining the valve information based on the valve region contour map includes: calculating an inward normal vector of a valve contour edge based on the valve region contour map; constructing a rectangular frame along the inward normal vector; removing the part of the white tissue in the rectangular frame to obtain an optimized valve area contour map; and determining valve information based on the optimized valve region contour map.
In the above valve image processing method, calculating an inward normal vector of a valve contour edge based on the valve region contour map includes: calculating a first gradient value and a second gradient value of each edge pixel point of the first valve contour in the x direction and the y direction; dividing a product of the first gradient value and the second gradient value by a square root of a sum of squares of the first gradient value and the second gradient value and a predetermined constant to obtain an expanded normal vector; and calculating a negative value of the expanded normal vector as the inward normal vector.
In the above valve image processing method, constructing a rectangular frame along the inward normal vector includes: taking each edge pixel point as a central point of one side of the rectangular frame; and constructing a rectangular frame of a predetermined height and a predetermined width along the inward normal vector.
According to another aspect of the present application, there is provided a valve image processing apparatus including: a valve and lumen image acquisition unit for acquiring a first valve grayscale map and a first lumen map of a previous frame and a second valve grayscale map and a second lumen map of a current frame; a level set function image acquisition unit for acquiring a first level set function image of the previous frame based on the first valve grayscale map; an image prediction unit for obtaining a second level set function image and a mask image of the current frame from the first valve grayscale map, the first level set function image, the first lumen map, the second valve grayscale map, and the second lumen map through a predictor; a region contour obtaining unit, configured to obtain a valve region contour map from the second valve grayscale map, the second level set function image, and the mask image through a distance regularization level set evolution algorithm; and a valve information determination unit for determining valve information based on the valve region contour map.
In the above valve image processing apparatus, the level set function image acquiring unit is configured to: and performing artificial labeling on the first valve gray scale image to obtain the first level set function image.
In the above valve image processing apparatus, the image prediction unit is configured to: determining a first valve contour of the previous frame based on the first valve grayscale map, the first level set function image, and the first lumen map; determining an expansion normal vector of each edge pixel point of the first valve contour; outwardly expanding the first valve contour by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and a mask image of the current frame; obtaining the second level set function image based on the second valve contour and the second valve grayscale map; and obtaining the mask image based on the second valve contour and the second lumen map.
In the above valve image processing apparatus, the determining, by the image prediction unit, the expansion normal vector for each edge pixel point of the first valve contour includes: calculating a first gradient value and a second gradient value of each edge pixel point of the first valve contour in the x direction and the y direction; and dividing a product of the first gradient value and the second gradient value by a square root of a sum of squares of the first gradient value and the second gradient value and a predetermined constant to obtain the expansion normal vector.
In the above valve image processing apparatus, the image prediction unit performing outward expansion on the first valve contour by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and a mask image of the current frame further includes: in response to a first valve contour edge point of the previous frame being within a lumen region of the current frame, searching along a normal vector of the first valve contour edge point; determining a search path distance in response to searching for a surrounding white tissue region; and determining the minimum value of the search path distance and the expansion limit threshold value as the expansion coefficient corresponding to the first valve contour edge point.
In the above valve image processing apparatus, the image prediction unit performing outward expansion on the first valve contour by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and a mask image of the current frame further includes: in response to a second valve contour edge point of the previous frame being within a white tissue region surrounding the valve contour of the current frame, searching along a normal vector of the second valve contour edge point; in response to searching for a surrounding white tissue region, determining a width of the white tissue region; determining a predetermined coefficient of expansion that is less than a predetermined threshold; and determining the minimum value of the preset expansion coefficient, the width of the white tissue area and the expansion limit threshold value as the expansion coefficient corresponding to the second valve contour edge point.
In the above valve image processing apparatus, the valve information determination unit is configured to: calculating an inward normal vector of a valve contour edge based on the valve region contour map; constructing a rectangular frame along the inward normal vector; removing the part of the white tissue in the rectangular frame to obtain an optimized valve area contour map; and determining valve information based on the optimized valve region contour map.
In the above valve image processing device, the calculating, by the valve information determination unit, an inward normal vector of the valve contour edge based on the valve region contour map includes: calculating a first gradient value and a second gradient value of each edge pixel point of the first valve contour in the x direction and the y direction; dividing a product of the first gradient value and the second gradient value by a square root of a sum of squares of the first gradient value and the second gradient value and a predetermined constant to obtain an expanded normal vector; and calculating a negative value of the expanded normal vector as the inward normal vector.
In the above valve image processing device, the valve information determination unit constructing a rectangular frame along the inward normal vector includes: taking each edge pixel point as a central point of one side of the rectangular frame; and constructing a rectangular frame of a predetermined height and a predetermined width along the inward normal vector.
According to still another aspect of the present application, there is provided an electronic apparatus including: a processor; and a memory having stored therein computer program instructions which, when executed by the processor, cause the processor to perform the valve image processing method as described above.
According to yet another aspect of the present application, there is provided a computer readable medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the valve image processing method as described above.
The valve image processing method, the valve image processing device and the electronic equipment provided by the embodiment of the application can predict the valve contour of the current frame from the valve contour of the previous frame through the predictor, and obtain the valve area contour map of the current frame through the distance regularization level set evolution algorithm, so that the accurate information of the valve part is obtained.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in more detail embodiments of the present application with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 illustrates a schematic diagram of a process of continuous evolution of a valve.
Fig. 2 illustrates a schematic flow diagram of a valve image processing method according to an embodiment of the present application.
Fig. 3 illustrates a schematic diagram of the principle of a predictor in a valve image processing method according to an embodiment of the present application.
Fig. 4 illustrates a schematic diagram of the principle of edge optimization in a valve image processing method according to an embodiment of the present application.
Fig. 5 illustrates a schematic block diagram of a valve image processing apparatus according to an embodiment of the present application.
FIG. 6 illustrates a block diagram of an electronic device in accordance with an embodiment of the present application.
Detailed Description
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein.
Exemplary method
Fig. 2 illustrates a schematic flow diagram of a valve image processing method according to an embodiment of the present application.
As shown in fig. 2, a valve image processing method according to an embodiment of the present application includes the following steps.
Step S110, a first valve grayscale map and a first lumen map of a previous frame and a second valve grayscale map and a second lumen map of a current frame are obtained. Here, the first and second valve grayscale maps are the original grayscale images of the valve, and the first and second lumen maps show the arterial lumen.
In the embodiment of the present application, the previous frame and the current frame may be two consecutive frames, for example, the previous frame is an Nth frame, and the current frame is an (N+1)th frame. Of course, it will be understood by those skilled in the art that the previous frame and the current frame may be separated by several frames if the change in the shape and position of the valve is small.
Step S120, obtaining a first level set function image of the previous frame based on the first valve grayscale map. Here, the first Level Set Function (LSF) image is used to determine an outline of the valve portion. In an embodiment of the application, the first level set function image may be obtained by determining a valve contour for the first valve grayscale map.
In one example, the first level set function image may be obtained by artificially labeling a valve contour for the first valve gray scale map. Of course, it will be understood by those skilled in the art that the first level set function image may also be obtained from the first valve gray scale map by other image processing methods, such as contour recognition algorithms and the like.
Therefore, in the valve image processing method according to an embodiment of the present application, obtaining the first level set function image of the previous frame based on the first valve grayscale map includes: and performing artificial labeling on the first valve gray scale image to obtain the first level set function image.
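For illustration only, the following sketch shows one way such a level set function image could be represented in code, using a binary-step initialization (negative inside the labeled contour, positive outside, consistent with the sign convention used for the predicted frame later in this description). The function name lsf_from_labeled_mask and the constant c0 are assumptions, not part of the disclosure.

```python
import numpy as np

def lsf_from_labeled_mask(valve_mask, c0=2.0):
    """valve_mask: boolean array, True inside the manually labeled valve contour.
    Returns a binary-step LSF image: -c0 inside the contour, +c0 outside."""
    lsf = np.full(valve_mask.shape, c0, dtype=np.float64)
    lsf[valve_mask] = -c0
    return lsf
```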
Step S130, obtaining a second level set function image and a mask image of the current frame from the first valve grayscale map, the first level set function image, the first lumen map, the second valve grayscale map, and the second lumen map through a predictor. In the present embodiment, in order to obtain the contour of the valve portion quickly and accurately, it is necessary to determine in advance a reasonable and effective region in which the contour lies. However, as described above, since the shape and position of the valve change at every moment, the valve contour of the current frame needs to be predicted from the valve contour of the previous frame by a prediction mechanism.
The specifics of the predictor according to embodiments of the present application will be described in further detail below.
Step S140, obtaining a valve region contour map from the second valve grayscale map, the second level set function image and the mask image through a distance regularization level set evolution algorithm. In the embodiment of the present application, the Distance Regularized Level Set Evolution (DRLSE) algorithm obtains the whole valve contour (including the valve portion and the artery lumen) through multiple iterations, but the DRLSE algorithm requires the region where the contour is located to be specified.
Therefore, in the embodiment of the present application, after the region where the valve contour of the current frame lies is predicted from the previous frame by the predictor, the DRLSE algorithm can be used to obtain the valve region contour map from the valve grayscale map, the LSF image and the mask image. Here, the mask image is obtained by labeling the lumen map of the corresponding frame, with the artery lumen serving as the mask.
The details of the DRLSE algorithm according to the embodiments of the present application will be described in further detail below.
Step S150, determining valve information based on the valve region contour map. That is, by obtaining the contour map of the valve region (including the valve portion and the artery lumen) and combining it with the black portions in the lumen map, valve information such as the shape and position of the valve, the contour of the artery lumen, and the like can be determined.
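As a minimal illustration of this step, the sketch below intersects the contour interior with the non-black portion of the lumen map to isolate the valve pixels and compute a simple position estimate; the labeling convention (0 for the lumen, non-zero elsewhere) and the function name are assumptions rather than details from the disclosure.

```python
import numpy as np

def valve_pixels_from_contour(region_contour_map, lumen_map):
    """region_contour_map: binary map, 1 inside the contour obtained by the DRLSE
    step (valve portion plus the enclosed lumen). lumen_map: 0 (black) in the
    lumen and non-zero elsewhere. The valve pixels are taken as the contour
    interior minus the black lumen portion; a centroid gives a position estimate."""
    valve = (region_contour_map == 1) & (lumen_map != 0)
    ys, xs = np.nonzero(valve)
    centroid = (xs.mean(), ys.mean()) if ys.size else None  # (x, y) position
    return valve, centroid
```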
Next, a specific mechanism of the predictor according to the embodiment of the present application will be described in detail.
In the embodiment of the present application, the predictor predicts the valve contour of the current frame according to the valve contour of the previous frame and the gray-scale map and the lumen map of the current frame, so as to obtain the corresponding LSF image and mask image.
Because there are differences between the valve contours of the previous frame and the current frame, each pixel point of the valve edge collapses inward or expands outward, but the collapse or expansion is not drastic. Therefore, in order to predict the valve region of the current frame, the valve contour obtained from the previous frame needs to be expanded outward to some extent along the normal vector of the edge, as shown in fig. 3.
Fig. 3 illustrates a schematic diagram of the principle of a predictor in a valve image processing method according to an embodiment of the present application. As shown in fig. 3, p1 and p2 represent edge points for two different cases, where p1 represents the case in which the valve contour in the current frame expands significantly outward, so that the valve contour edge point from the previous frame is located in the lumen region of the valve contour of the current frame, and p2 represents the case in which the valve contour in the current frame expands little outward or collapses inward, so that the valve contour edge point from the previous frame is still located in the white tissue region surrounding the valve contour of the current frame.
In order for the DRLSE algorithm to obtain a more accurate valve contour for the current frame, the predicted valve contour should be larger than the actual valve contour, with all of its edge points located in the surrounding white tissue region. Therefore, the valve contour obtained from the previous frame needs to be expanded outward along the normal vector of the edge, as shown in the following formula:
p′ = p + α·n(p)
wherein n(p) is the normal vector of the pixel point p at the edge of the valve contour, and α is the expansion coefficient.
In this way, the valve contour obtained from the previous frame is expanded outward to a certain extent along the normal vector of the edge to obtain the valve contour of the current frame; then, for example, a circumscribed rectangle of the contour is calculated and its edges are expanded to obtain the predicted valve region. All the pixel values in the predicted valve region that belong to the interior of the valve contour are set to negative numbers, and all the pixel values outside the valve contour are set to positive numbers, so as to obtain the corresponding LSF image of the current frame. In addition, relative to the lumen map, the valve region is set to 0 for the lumen part and 1 for the white tissue part, so as to obtain the labeled mask image.
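A minimal sketch of this prediction step is given below, assuming the edge points, outward normals and per-point expansion coefficients have already been computed as described in the following paragraphs. It uses OpenCV's fillPoly to fill the expanded contour and omits the circumscribed-rectangle enlargement; all names (predict_lsf_and_mask, prev_contour_pts, alphas) are illustrative, not from the disclosure.

```python
import numpy as np
import cv2

def predict_lsf_and_mask(prev_contour_pts, normals, alphas, lumen_map, c0=2.0):
    """prev_contour_pts: (N, 2) array of (x, y) edge pixels of the previous-frame
    valve contour, ordered along the contour. normals: (N, 2) outward unit
    normals; alphas: (N,) per-point expansion coefficients. lumen_map:
    current-frame lumen map, 0 in the lumen and 1 in white tissue.
    Returns the predicted (lsf, mask) images for the current frame."""
    expanded = np.round(prev_contour_pts + alphas[:, None] * normals).astype(np.int32)

    # Fill the expanded contour to get the predicted valve region.
    region = np.zeros(lumen_map.shape, dtype=np.uint8)
    cv2.fillPoly(region, [expanded.reshape(-1, 1, 2)], 1)

    # LSF image: negative inside the predicted contour, positive outside.
    lsf = np.where(region == 1, -c0, c0).astype(np.float64)

    # Mask image: 0 where the predicted region overlaps the lumen,
    # 1 where it overlaps white tissue (1 outside the region as well).
    mask = np.where((region == 1) & (lumen_map == 0), 0, 1).astype(np.uint8)
    return lsf, mask
```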
Therefore, in the valve image processing method according to an embodiment of the present application, obtaining, by a predictor, a second level set function image and a mask image of the current frame from the first valve grayscale map, the first level set function image, the first lumen map, the second valve grayscale map, and the second lumen map includes: determining a first valve contour of the previous frame based on the first valve grayscale map, the first level set function image, and the first lumen map; determining an expansion normal vector of each edge pixel point of the first valve contour; outwardly expanding the first valve contour by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and a mask image of the current frame; obtaining the second level set function image based on the second valve contour and the second valve grayscale map; and obtaining the mask image based on the second valve contour and the second lumen map.
Specifically, in this embodiment of the present application, the normal vector of the pixel point at the edge of the valve contour is obtained by the following formula:
n(p) = ( ∂I_in/∂x (p), ∂I_in/∂y (p) ) / sqrt( (∂I_in/∂x (p))² + (∂I_in/∂y (p))² + ε )
wherein I_in represents the lumen map, p represents a valve edge pixel point, p_x represents the x-axis coordinate of the valve edge pixel point, p_y represents the y-axis coordinate of the valve edge pixel point, ∂I_in/∂x (p) represents the gradient value of the valve edge pixel point p along the x-axis, ∂I_in/∂y (p) represents the gradient value of the valve edge pixel point p along the y-axis, and ε is a small constant used to prevent the denominator from being 0.
That is, in the valve image processing method according to the embodiment of the present application, determining the expansion normal vector of each edge pixel point of the first valve contour includes: calculating a first gradient value and a second gradient value of each edge pixel point of the first valve contour in the x direction and the y direction; and dividing a product of the first gradient value and the second gradient value by a square root of a sum of squares of the first gradient value and the second gradient value and a predetermined constant to obtain the expansion normal vector.
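A possible vectorized form of this computation is sketched below. The (x, y) point convention, the placement of the constant ε inside the square root, and the function name expansion_normals are assumptions made for the sketch.

```python
import numpy as np

def expansion_normals(lumen_map, edge_pts, eps=1e-6):
    """edge_pts: (N, 2) integer (x, y) coordinates of valve-contour edge pixels.
    Returns (N, 2) approximately unit normal vectors (nx, ny), obtained by
    normalizing the lumen-map gradient at each edge pixel."""
    gy, gx = np.gradient(lumen_map.astype(np.float64))  # derivatives along y and x
    x, y = edge_pts[:, 0], edge_pts[:, 1]
    gx_p, gy_p = gx[y, x], gy[y, x]
    denom = np.sqrt(gx_p ** 2 + gy_p ** 2 + eps)  # eps keeps the denominator non-zero
    return np.stack([gx_p / denom, gy_p / denom], axis=1)
```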
In addition, considering that collapse or expansion does not occur drastically, an expansion limit threshold is set, denoted α_T. If the valve contour edge point obtained from the previous frame is located in the lumen region of the valve contour of the current frame, as shown by point p1 in fig. 3, a search is performed along the normal vector v1 of this edge point p1 until the surrounding white tissue region is reached. Supposing the search path distance is d_1, the expansion coefficient of the edge point p1 is finally:
α_p1 = min(d_1, α_T)
On the other hand, if the valve contour edge point obtained from the previous frame is still located in the white tissue region surrounding the valve contour of the current frame, as shown by point p2 in fig. 3, a small expansion coefficient is set, denoted α_s. For example, α_s takes a value of at least 1 pixel and is usually set to 1, which means that the contour is expanded outward by 1 pixel point each time. At the same time, a search is performed along the normal vector v2 of the edge point p2 to obtain the width of the white tissue region, denoted d_2. The expansion coefficient of the edge point p2 is finally:
α_p2 = min(α_s, α_T, d_2)
that is, in the valve image processing method according to the embodiment of the present application, the expanding the first valve contour outward by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and a mask image of the current frame further includes: in response to a first valve contour edge point of the previous frame being within a lumen region of the current frame, searching along a normal vector of the first valve contour edge point; determining a search path distance in response to searching for a surrounding white tissue region; and determining the minimum value of the search path distance and the expansion limit threshold value as the expansion coefficient corresponding to the first valve contour edge point.
Also, in the valve image processing method according to an embodiment of the present application, the expanding the first valve contour outward by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and a mask image of the current frame further includes: in response to a second valve contour edge point of the previous frame being within a white tissue region surrounding the valve contour of the current frame, searching along a normal vector of the second valve contour edge point; in response to searching for a surrounding white tissue region, determining a width of the white tissue region; determining a predetermined coefficient of expansion that is less than a predetermined threshold; and determining the minimum value of the preset expansion coefficient, the width of the white tissue area and the expansion limit threshold value as the expansion coefficient corresponding to the second valve contour edge point.
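The selection of the per-point expansion coefficient for the two cases p1 and p2 could be implemented along the following lines; the pixel-by-pixel march along the normal, the bounds handling, and the names (expansion_coefficient, alpha_T, alpha_s) are assumptions for illustration only.

```python
import numpy as np

def expansion_coefficient(point, normal, lumen_map, alpha_T, alpha_s=1):
    """point: (x, y) edge pixel of the previous-frame valve contour.
    normal: outward unit normal (nx, ny) for that pixel.
    lumen_map: current-frame lumen map, 0 in the lumen and 1 in white tissue.
    alpha_T: expansion limit threshold; alpha_s: small fixed expansion (case p2)."""
    h, w = lumen_map.shape
    x, y = point
    max_steps = int(alpha_T) + 1

    def march(start_in_lumen):
        # Step pixel by pixel along the normal until the region type changes.
        for step in range(1, max_steps + 1):
            xx = int(round(x + step * normal[0]))
            yy = int(round(y + step * normal[1]))
            if not (0 <= yy < h and 0 <= xx < w):
                return step
            in_lumen = lumen_map[yy, xx] == 0
            if in_lumen != start_in_lumen:
                return step
        return max_steps

    if lumen_map[y, x] == 0:
        # Case p1: the edge point falls inside the current-frame lumen;
        # d1 is the path length until white tissue is reached.
        d1 = march(start_in_lumen=True)
        return min(d1, alpha_T)
    # Case p2: the edge point is still inside white tissue;
    # d2 approximates the width of the white tissue band along the normal.
    d2 = march(start_in_lumen=False)
    return min(alpha_s, alpha_T, d2)
```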
Next, the details of the DRLSE algorithm according to the embodiment of the present application will be described in further detail.
In the embodiment of the present application, the DRLSE algorithm can be performed iteratively, so as to obtain a more accurate valve contour region.
The DRLSE algorithm is a variational level set formulation constructed from a distance regularization term and an external energy term, and its evolution equation is as follows:
∂φ/∂t = μ·div( d_p(|∇φ|)·∇φ ) + λ·δ(φ)·div( g·∇φ/|∇φ| ) + α·g·δ(φ)
where μ, λ and α are constants representing the weight of each term, div(·) represents the divergence operator, δ represents the Dirac function, g is the image edge indicator function, and d_p(·) is the function that determines the direction of level set diffusion.
In order to prevent the DRLSE algorithm from eroding the level set contour into the lumen contour during the iterative process, which would cause valve contour evolution errors, the label mask is multiplied on the right-hand side of the above formula, that is:
∂φ/∂t = mask · [ μ·div( d_p(|∇φ|)·∇φ ) + λ·δ(φ)·div( g·∇φ/|∇φ| ) + α·g·δ(φ) ]
By solving the above formula over a number of iterations, an accurate valve contour can be obtained.
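For illustration, the sketch below performs one masked evolution step in the spirit of the formula above. It uses a simplified single-well regularizer (so the distance-regularization term reduces to the Laplacian of φ) rather than the full double-well DRLSE potential, and the parameter values are placeholders rather than values from the disclosure.

```python
import numpy as np

def drlse_step(phi, g, mask, mu=0.2, lam=5.0, alpha=-3.0, eps=1.5, dt=1.0):
    """One masked evolution step. phi: current LSF image; g: edge indicator
    computed from the valve grayscale map; mask: label mask (0 in the lumen,
    1 in white tissue) multiplied onto the right-hand side."""
    def grad(f):
        fy, fx = np.gradient(f)   # derivatives along y (axis 0) and x (axis 1)
        return fx, fy

    def div(fx, fy):
        _, dxx = np.gradient(fx)  # d(fx)/dx
        dyy, _ = np.gradient(fy)  # d(fy)/dy
        return dxx + dyy

    phix, phiy = grad(phi)
    norm = np.sqrt(phix ** 2 + phiy ** 2) + 1e-10
    nx, ny = phix / norm, phiy / norm

    # Smoothed Dirac delta, non-zero only near the zero level set.
    dirac = np.where(np.abs(phi) <= eps,
                     (1.0 / (2.0 * eps)) * (1.0 + np.cos(np.pi * phi / eps)), 0.0)

    regular_term = div(phix, phiy)            # mu term (Laplacian of phi)
    edge_term = dirac * div(g * nx, g * ny)   # lambda term (edge-weighted curvature)
    area_term = dirac * g                     # alpha term (area/expansion force)

    dphi = mu * regular_term + lam * edge_term + alpha * area_term
    return phi + dt * mask * dphi             # the mask stops evolution inside the lumen
```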
since the DRLSE algorithm reduces diffusion at the edge, the obtained valve contour edge is not fine, and there may be a fine white tissue region, so in the embodiment of the present application, some correction is further performed on the valve contour edge.
Specifically, an inward normal vector of the valve contour edge is first calculated, then a rectangular frame is constructed along the inward normal vector, and all parts belonging to white tissue within the rectangular frame are removed (the corresponding mark in the lumen map is 0). Fig. 4 illustrates a schematic diagram of the principle of edge optimization in a valve image processing method according to an embodiment of the present application. As shown in fig. 4, where the valve area is shown as a grayscale image on the left and the corresponding lumen map on the right, there are thin white areas around the edge points M1 and M2.
That is, in the valve image processing method according to the embodiment of the present application, determining the valve information based on the valve region contour map includes: calculating an inward normal vector of a valve contour edge based on the valve region contour map; constructing a rectangular frame along the inward normal vector; removing the part of the white tissue in the rectangular frame to obtain an optimized valve area contour map; and determining valve information based on the optimized valve region contour map.
In order to better remove the tiny white areas around the edge points, the outward normal vector n(p) of each edge point is first calculated according to the formula given above, and its opposite direction is taken as the inward normal vector of the edge point, denoted:
n_in(p) = −n(p)
That is, in the valve image processing method according to the embodiment of the present application, calculating the inward normal vector of the valve contour edge based on the valve region contour map includes: calculating a first gradient value and a second gradient value of each edge pixel point of the first valve contour in the x direction and the y direction; dividing a product of the first gradient value and the second gradient value by a square root of a sum of squares of the first gradient value and the second gradient value and a predetermined constant to obtain an expanded normal vector; and calculating a negative value of the expanded normal vector as the inward normal vector.
Then, each edge pixel point is taken as the center point of one side of the rectangle, and a rectangular frame with a width of w and a height of h is constructed along the inward normal vector. Finally, all pixel points in the rectangular frame are traversed, and all pixel points belonging to white tissue are removed, thereby obtaining a new, finer valve contour edge.
That is, in the valve image processing method according to the embodiment of the present application, constructing a rectangular frame along the inward normal vector includes: taking each edge pixel point as a central point of one side of the rectangular frame; and constructing a rectangular frame of a predetermined height and a predetermined width along the inward normal vector.
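A sketch of this edge correction is given below. The box dimensions w and h, the (x, y) point convention, and the way white-tissue pixels are dropped from a binary valve-region map are assumptions made for illustration.

```python
import numpy as np

def refine_contour_edge(valve_region, lumen_map, edge_pts, inward_normals, w=3, h=5):
    """valve_region: binary map (1 inside the contour obtained by the DRLSE step).
    lumen_map: 0 in the lumen and 1 in white tissue. edge_pts: (N, 2) (x, y)
    contour edge pixels; inward_normals: (N, 2) inward unit normals (nx, ny).
    For every edge pixel, a w-by-h box is swept into the valve along the inward
    normal, and white-tissue pixels inside the box are dropped from the region."""
    refined = valve_region.copy()
    H, W = lumen_map.shape
    for (x, y), (nx, ny) in zip(edge_pts, inward_normals):
        px, py = -ny, nx  # direction perpendicular to the normal spans the box width
        for i in range(h):                          # along the inward normal (height)
            for j in range(-(w // 2), w // 2 + 1):  # across the box (width)
                xx = int(round(x + i * nx + j * px))
                yy = int(round(y + i * ny + j * py))
                if 0 <= yy < H and 0 <= xx < W and lumen_map[yy, xx] == 1:
                    refined[yy, xx] = 0  # remove a white-tissue pixel from the region
    return refined
```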
Exemplary devices
Fig. 5 illustrates a schematic block diagram of a valve image processing apparatus according to an embodiment of the present application.
As shown in fig. 5, the valve image processing apparatus 200 according to the embodiment of the present application includes: a valve and lumen image acquisition unit 210 for acquiring a first valve grayscale map and a first lumen map of a previous frame and a second valve grayscale map and a second lumen map of a current frame; a level set function image acquisition unit 220 for obtaining a first level set function image of the previous frame based on the first valve grayscale map; an image prediction unit 230, configured to obtain, through a predictor, a second level set function image and a mask image of the current frame from the first valve grayscale map, the first level set function image, the first lumen map, the second valve grayscale map, and the second lumen map; a region contour obtaining unit 240, configured to obtain a valve region contour map from the second valve grayscale map, the second level set function image, and the mask image through a distance regularization level set evolution algorithm; and a valve information determination unit 250 for determining valve information based on the valve region contour map.
In one example, in the valve image processing apparatus 200 described above, the level set function image obtaining unit 220 is configured to: and performing artificial labeling on the first valve gray scale image to obtain the first level set function image.
In one example, in the valve image processing apparatus 200, the image prediction unit 230 is configured to: determining a first valve contour of the previous frame based on the first valve grayscale map, the first level set function image, and the first lumen map; determining an expansion normal vector of each edge pixel point of the first valve contour; outwardly expanding the first valve contour by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and a mask image of the current frame; obtaining the second level set function image based on the second valve contour and the second valve grayscale map; and obtaining the mask image based on the second valve contour and the second lumen map.
In one example, in the above valve image processing apparatus 200, the determining, by the image prediction unit 230, the expansion normal vector of each edge pixel point of the first valve contour includes: calculating a first gradient value and a second gradient value of each edge pixel point of the first valve contour in the x direction and the y direction; and dividing a product of the first gradient value and the second gradient value by a square root of a sum of squares of the first gradient value and the second gradient value and a predetermined constant to obtain the expansion normal vector.
In one example, in the valve image processing apparatus 200, the outward expansion of the first valve contour by the product of the expansion normal vector and the expansion coefficient by the image prediction unit 230 to obtain the second valve contour and the mask image of the current frame further comprises: in response to a first valve contour edge point of the previous frame being within a lumen region of the current frame, searching along a normal vector of the first valve contour edge point; determining a search path distance in response to searching for a surrounding white tissue region; and determining the minimum value of the search path distance and the expansion limit threshold value as the expansion coefficient corresponding to the first valve contour edge point.
In one example, in the valve image processing apparatus 200, the outward expansion of the first valve contour by the product of the expansion normal vector and the expansion coefficient by the image prediction unit 230 to obtain the second valve contour and the mask image of the current frame further comprises: in response to a second valve contour edge point of the previous frame being within a white tissue region surrounding the valve contour of the current frame, searching along a normal vector of the second valve contour edge point; in response to searching for a surrounding white tissue region, determining a width of the white tissue region; determining a predetermined coefficient of expansion that is less than a predetermined threshold; and determining the minimum value of the preset expansion coefficient, the width of the white tissue area and the expansion limit threshold value as the expansion coefficient corresponding to the second valve contour edge point.
In one example, in the valve image processing apparatus 200 described above, the valve information determination unit 250 is configured to: calculating an inward normal vector of a valve contour edge based on the valve region contour map; constructing a rectangular frame along the inward normal vector; removing the part of the white tissue in the rectangular frame to obtain an optimized valve area contour map; and determining valve information based on the optimized valve region contour map.
In one example, in the valve image processing device 200 described above, the calculating, by the valve information determination unit 250, an inward normal vector of the valve contour edge based on the valve region contour map includes: calculating a first gradient value and a second gradient value of each edge pixel point of the first valve contour in the x direction and the y direction; dividing a product of the first gradient value and the second gradient value by a square root of a sum of squares of the first gradient value and the second gradient value and a predetermined constant to obtain an expanded normal vector; and calculating a negative value of the expanded normal vector as the inward normal vector.
In one example, in the valve image processing device 200 described above, the valve information determination unit 250 constructing a rectangular frame along the inward normal vector includes: taking each edge pixel point as a central point of one side of the rectangular frame; and constructing a rectangular frame of a predetermined height and a predetermined width along the inward normal vector.
Here, it will be understood by those skilled in the art that the specific functions and operations of the respective units and modules in the above-described valve image processing apparatus 200 have been described in detail in the above description of the valve image processing method with reference to fig. 2 to 4, and thus, a repetitive description thereof will be omitted.
As described above, the valve image processing apparatus 200 according to the embodiment of the present application may be implemented in various terminal devices, such as a server for processing a valve image, and the like. In one example, the valve image processing apparatus 200 according to the embodiment of the present application may be integrated into a terminal device as one software module and/or hardware module. For example, the valve image processing apparatus 200 may be a software module in the operating system of the terminal device, or may be an application developed for the terminal device; of course, the valve image processing device 200 may also be one of many hardware modules of the terminal device.
Alternatively, in another example, the valve image processing apparatus 200 and the terminal device may be separate devices, and the valve image processing apparatus 200 may be connected to the terminal device through a wired and/or wireless network and transmit the interactive information according to an agreed data format.
Exemplary electronic device
Next, an electronic apparatus according to an embodiment of the present application is described with reference to fig. 6.
FIG. 6 illustrates a block diagram of an electronic device in accordance with an embodiment of the present application.
As shown in fig. 6, the electronic device 10 includes one or more processors 11 and memory 12.
The processor 11 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 10 to perform desired functions.
Memory 12 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer readable storage medium and executed by the processor 11 to implement the valve image processing methods of the various embodiments of the present application described above and/or other desired functions. Various contents such as a raw gray scale map, an LSF map, an intra-cavity map, etc. may also be stored in the computer-readable storage medium.
In one example, the electronic device 10 may further include: an input device 13 and an output device 14, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
The input device 13 may include, for example, a keyboard, a mouse, and the like.
The output device 14 can output various information including the obtained valve information and the like to the outside. The output devices 14 may include, for example, a display, speakers, a printer, and a communication network and its connected remote output devices, among others.
Of course, for simplicity, only some of the components of the electronic device 10 relevant to the present application are shown in fig. 6, and components such as buses, input/output interfaces, and the like are omitted. In addition, the electronic device 10 may include any other suitable components depending on the particular application.
Exemplary computer program product and computer-readable storage Medium
In addition to the above-described methods and apparatus, embodiments of the present application may also be a computer program product comprising computer program instructions which, when executed by a processor, cause the processor to perform the steps in the valve image processing method according to various embodiments of the present application described in the "exemplary methods" section above of this specification.
The computer program product may be written with program code for performing the operations of embodiments of the present application in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the first user computing device, partly on the first user device, as a stand-alone software package, partly on the first user computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the steps in the valve image processing method according to various embodiments of the present application described in the "exemplary methods" section above in the present specification.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing describes the general principles of the present application in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present application are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the foregoing disclosure is not intended to be exhaustive or to limit the disclosure to the precise details disclosed.
The block diagrams of devices, apparatuses, and systems referred to in this application are only given as illustrative examples and are not intended to require or imply that the connections, arrangements, and configurations must be made in the manner shown in the block diagrams. These devices, apparatuses, and systems may be connected, arranged, and configured in any manner, as will be appreciated by those skilled in the art. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The words "or" and "and" as used herein mean, and are used interchangeably with, the word "and/or," unless the context clearly dictates otherwise. The word "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to".
It should also be noted that in the devices, apparatuses, and methods of the present application, the components or steps may be decomposed and/or recombined. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (11)

1. A valve image processing method, comprising:
acquiring a first valve grayscale map and a first lumen map of a previous frame and a second valve grayscale map and a second lumen map of a current frame;
obtaining a first level set function image of the previous frame based on the first valve grayscale map;
obtaining, by a predictor, a second level set function image and a mask image of the current frame from the first valve grayscale map, the first level set function image, the first lumen map, the second valve grayscale map, and the second lumen map;
obtaining a valve region contour map from the second valve grayscale map, the second level set function image and the mask image by a distance regularization level set evolution algorithm; and
determining valve information based on the valve region contour map.
2. The valve image processing method of claim 1, wherein obtaining a first level set function image of the previous frame based on the first valve grayscale map comprises:
and performing artificial labeling on the first valve gray scale image to obtain the first level set function image.
3. The valve image processing method of claim 1, wherein obtaining, by a predictor, a second level set function image and a mask image of the current frame from the first valve grayscale map, the first level set function image, the first lumen map, the second valve grayscale map, and the second lumen map comprises:
determining a first valve contour of the previous frame based on the first valve grayscale map, the first level set function image, and the first lumen map;
determining an expansion normal vector of each edge pixel point of the first valve contour;
outwardly expanding the first valve contour by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and a mask image of the current frame;
obtaining the second level set function image based on the second valve contour and the second valve grayscale map; and
obtaining the mask image based on the second valve contour and the second lumen map.
4. The valve image processing method of claim 3, wherein determining an expansion normal vector for each edge pixel point of the first valve contour comprises:
calculating a first gradient value and a second gradient value of each edge pixel point of the first valve contour in the x direction and the y direction; and
dividing a product of the first gradient value and the second gradient value by a square root of a sum of squares of the first gradient value and the second gradient value and a predetermined constant to obtain the expansion normal vector.
5. The valve image processing method of claim 4, wherein outwardly expanding the first valve contour by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and mask image of the current frame further comprises:
in response to a first valve contour edge point of the previous frame being within a lumen region of the current frame, searching along a normal vector of the first valve contour edge point;
in response to the search reaching a surrounding white tissue region, determining a search path distance; and
determining the minimum of the search path distance and an expansion limit threshold as the expansion coefficient corresponding to the first valve contour edge point.
6. The valve image processing method of claim 4, wherein outwardly expanding the first valve contour by a product of the expansion normal vector and an expansion coefficient to obtain a second valve contour and mask image of the current frame further comprises:
in response to a second valve contour edge point of the previous frame being within a white tissue region surrounding the valve contour of the current frame, searching along a normal vector of the second valve contour edge point;
determining, based on the search, a width of the white tissue region;
determining a predetermined expansion coefficient that is less than a predetermined threshold; and
determining the minimum of the predetermined expansion coefficient, the width of the white tissue region, and the expansion limit threshold as the expansion coefficient corresponding to the second valve contour edge point.
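Claims 5 and 6 pick a per-point expansion coefficient by marching from a previous-frame contour point along its outward normal: a point inside the current lumen uses the distance to the surrounding white tissue capped by an expansion limit threshold (claim 5), while a point already in the white tissue compares a small predetermined step, the tissue width, and the limit (claim 6). The sketch below combines both cases; the values of `limit`, `small_step`, and `max_march`, the zero fallback when neither case applies, and the (row, col) coordinate convention are illustrative assumptions.

```python
import numpy as np

def expansion_coefficient(point, normal, lumen, white_tissue,
                          limit=15, small_step=3, max_march=50):
    """Expansion coefficient for one previous-frame contour edge point.

    point : (row, col) of the edge pixel; normal : (nx, ny) outward unit normal.
    lumen : boolean map of the current frame's lumen region.
    white_tissue : boolean map of the white tissue surrounding the valve.
    """
    r0, c0 = point
    nx, ny = normal

    def sample(dist):
        # Pixel reached after marching `dist` steps along the outward normal.
        r = int(round(r0 + dist * ny))
        c = int(round(c0 + dist * nx))
        if 0 <= r < lumen.shape[0] and 0 <= c < lumen.shape[1]:
            return r, c
        return None

    if lumen[r0, c0]:
        # Claim 5: the point lies inside the current lumen; march outward until
        # the surrounding white tissue is reached, then cap by the limit.
        for d in range(1, max_march):
            rc = sample(d)
            if rc is None:
                break
            if white_tissue[rc]:
                return min(d, limit)
        return limit

    if white_tissue[r0, c0]:
        # Claim 6: the point already lies in the surrounding white tissue; march
        # until the tissue band ends to estimate its width.
        width = 0
        for d in range(1, max_march):
            rc = sample(d)
            if rc is None or not white_tissue[rc]:
                break
            width = d
        return min(small_step, width, limit)

    return 0  # neither case applies; leave this point unexpanded (assumption)
```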
7. The valve image processing method of claim 1, wherein determining valve information based on the valve region contour map comprises:
calculating an inward normal vector of a valve contour edge based on the valve region contour map;
constructing a rectangular box along the inward normal vector;
removing the white tissue portion within the rectangular box to obtain an optimized valve region contour map; and
determining valve information based on the optimized valve region contour map.
8. The valve image processing method of claim 7, wherein computing an inward normal vector for a valve contour edge based on the valve region contour map comprises:
calculating a first gradient value and a second gradient value of each edge pixel point of the first valve contour in the x direction and the y direction;
dividing the product of the first gradient value and the second gradient value by the square root of the sum of the squares of the first gradient value and the second gradient value plus a predetermined constant, to obtain an expansion normal vector; and
taking the negative of the expansion normal vector as the inward normal vector.
9. The valve image processing method of claim 8, wherein constructing a rectangular box along the inward normal vector comprises:
taking each edge pixel point as the center point of one side of the rectangular box; and
constructing the rectangular box with a predetermined height and a predetermined width along the inward normal vector.
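Claims 7 to 9 refine the contour map by placing, at every contour edge pixel, a rectangular box oriented along the inward normal (the negated expansion normal of claim 8), with the edge pixel at the midpoint of one side, and removing any white tissue caught inside the box. A minimal sketch under these assumptions follows; the box dimensions `box_h` and `box_w` and the pixel-by-pixel scan are illustrative choices, since the claims only call them a predetermined height and width.

```python
import numpy as np

def trim_white_tissue(contour_points, inward_normals, white_tissue, region,
                      box_h=10, box_w=6):
    """Remove white tissue caught inside the valve region contour.

    contour_points : list of (row, col) contour edge pixels.
    inward_normals : matching list of (nx, ny) inward unit normals, i.e. the
                     negated expansion normals of claim 8.
    white_tissue : boolean map of white tissue pixels.
    region : boolean valve region to optimize; a modified copy is returned.
    box_h, box_w : the 'predetermined height and width' of claim 9
                   (illustrative values).
    """
    optimized = region.copy()
    rows, cols = region.shape
    for (r0, c0), (nx, ny) in zip(contour_points, inward_normals):
        tx, ty = -ny, nx                       # unit tangent along the box's base side
        for h in range(box_h):                 # march into the region along the inward normal
            for w in range(-(box_w // 2), box_w // 2 + 1):
                r = int(round(r0 + h * ny + w * ty))
                c = int(round(c0 + h * nx + w * tx))
                if 0 <= r < rows and 0 <= c < cols and white_tissue[r, c]:
                    optimized[r, c] = False    # drop the white tissue pixel from the valve region
    return optimized
```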
10. A valve image processing device, comprising:
a valve and lumen image acquisition unit for acquiring a first valve grayscale map and a first lumen map of a previous frame, and a second valve grayscale map and a second lumen map of a current frame;
a level set function image acquisition unit for acquiring a first level set function image of the previous frame based on the first valve grayscale map;
an image prediction unit for obtaining a second level set function image and a mask image of the current frame from the first valve grayscale map, the first level set function image, the first lumen map, the second valve grayscale map, and the second lumen map through a predictor;
a region contour obtaining unit for obtaining a valve region contour map from the second valve grayscale map, the second level set function image, and the mask image through a distance regularized level set evolution algorithm; and
a valve information determination unit for determining valve information based on the valve region contour map.
11. An electronic device, comprising:
a processor; and
a memory having stored therein computer program instructions which, when executed by the processor, cause the processor to perform the valve image processing method of any one of claims 1 to 9.
CN201911326686.2A 2019-12-20 2019-12-20 Valve image processing method, valve image processing device and electronic equipment Active CN111105400B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911326686.2A CN111105400B (en) 2019-12-20 2019-12-20 Valve image processing method, valve image processing device and electronic equipment


Publications (2)

Publication Number Publication Date
CN111105400A 2020-05-05
CN111105400B (en) 2023-09-19

Family

ID=70422446

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911326686.2A Active CN111105400B (en) 2019-12-20 2019-12-20 Valve image processing method, valve image processing device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111105400B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060115161A1 (en) * 2004-11-30 2006-06-01 Samsung Electronics Co., Ltd. Face detection method and apparatus using level-set method
CN103593848A (en) * 2013-11-25 2014-02-19 深圳市恩普电子技术有限公司 Ultrasonic endocardium tracing method
CN107481252A (en) * 2017-08-24 2017-12-15 上海术理智能科技有限公司 Dividing method, device, medium and the electronic equipment of medical image


Also Published As

Publication number Publication date
CN111105400B (en) 2023-09-19

Similar Documents

Publication Publication Date Title
WO2018108129A1 (en) Method and apparatus for use in identifying object type, and electronic device
CN110443357B (en) Convolutional neural network calculation optimization method and device, computer equipment and medium
CN106952338B (en) Three-dimensional reconstruction method and system based on deep learning and readable storage medium
US9449384B2 (en) Method for registering deformable images using random Markov fields
US8000527B2 (en) Interactive image segmentation by precomputation
US20140064558A1 (en) Object tracking apparatus and method and camera
CN109255382B (en) Neural network system, method and device for picture matching positioning
CN110533046B (en) Image instance segmentation method and device, computer readable storage medium and electronic equipment
JP6597914B2 (en) Image processing apparatus, image processing method, and program
CN111444807B (en) Target detection method, device, electronic equipment and computer readable medium
US20130142420A1 (en) Image recognition information attaching apparatus, image recognition information attaching method, and non-transitory computer readable medium
CN116091414A (en) Cardiovascular image recognition method and system based on deep learning
US7480079B2 (en) System and method for sequential kernel density approximation through mode propagation
CN111626379A (en) X-ray image detection method for pneumonia
CN113643311B (en) Image segmentation method and device with robust boundary errors
JP7369288B2 (en) Image processing method, image processing command generation method and device
US11244460B2 (en) Digital image boundary detection
CN111105400B (en) Valve image processing method, valve image processing device and electronic equipment
CN116109907A (en) Target detection method, target detection device, electronic equipment and storage medium
CN113554124B (en) Image recognition method and device, computer-readable storage medium and electronic device
CN111383245A (en) Video detection method, video detection device and electronic equipment
US20200097522A1 (en) Parameter estimation apparatus, parameter estimation method, and computer-readable recording medium
KR101524944B1 (en) Multi-region image segmentation method and apparatus
CN110751672B (en) Method and apparatus for implementing multi-scale optical flow pixel transform using dilution convolution
US20190251703A1 (en) Method of angle detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant