CN115670501A - Anatomical M-mode imaging method and ultrasonic equipment - Google Patents

Anatomical M-mode imaging method and ultrasonic equipment

Publication number
CN115670501A
Authority
CN
China
Prior art keywords
frame rate
picture
image
frame
map
Prior art date
Legal status
Pending
Application number
CN202110832092.XA
Other languages
Chinese (zh)
Inventor
陈永丽
付传卿
马克涛
刘振忠
Current Assignee
Qingdao Hisense Medical Equipment Co Ltd
Original Assignee
Qingdao Hisense Medical Equipment Co Ltd
Priority date
Filing date
Publication date
Application filed by Qingdao Hisense Medical Equipment Co Ltd
Priority to CN202110832092.XA
Publication of CN115670501A

Landscapes

  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The application relates to the technical field of ultrasonic imaging, and discloses an anatomical M-mode imaging method and an ultrasonic device, which are used to solve the problem in the related art that the anatomical M-mode image is prone to mosaic artifacts when the B-mode frame rate is low. In the application, in order to improve the quality of the anatomical M-mode image and reduce mosaic artifacts, a method is provided that can effectively increase the B-image frame rate while ensuring B-image quality, so that even on ultrasound equipment whose native B-image frame rate is low, the quality of the basic data used for anatomical M-mode imaging can be improved; that is, a high B-image frame rate is obtained without sacrificing image quality.

Description

Anatomical M-mode imaging method and ultrasonic equipment
Technical Field
The application relates to the technical field of ultrasonic image processing, and in particular to an anatomical M-mode imaging method and an ultrasonic device.
Background
On the ultrasound image display interface, an operator can enable the anatomical M-mode function to obtain an anatomical M-mode image. Anatomical M-mode is similar to conventional M-mode imaging: both use gray values to express the intensity of the ultrasonic echo signal over time and depth. However, anatomical M-mode differs fundamentally from conventional M-mode: the M-line of conventional M-mode is a real scan line in the ultrasound scan control sequence, whereas the M-line of anatomical M-mode is a one-dimensional sampling line taken from a B-mode ultrasound (Brightness mode) image.
B-mode ultrasound, commonly called B-ultrasound, shows changes in the physical characteristics of organs and tissues through gray-scale variations from white through gray to black, and produces a two-dimensional brightness-modulated (Brightness mode) image, i.e., a black-white-gray image. In this application, the B image refers to the image obtained by B-mode ultrasound.
The anatomical M-mode image is affected by the frame rate of the B image: when the B-mode frame rate is low, the anatomical M-mode image is prone to mosaic artifacts, which interferes with the doctor's judgment and diagnosis of the tissue structure.
Disclosure of Invention
The embodiments of the application provide an anatomical M-mode imaging method and an ultrasonic device, which are used to solve the problem in the related art that the anatomical M-mode image is prone to mosaic artifacts when the B-mode frame rate is low.
In a first aspect, the present application provides a method of anatomical M-mode imaging, the method comprising:
based on the trace speed of the M-line and the image quality requirement, adjusting B-image imaging parameters with the aim of increasing the B-image frame rate, and scanning to obtain a B image at a first frame rate;
performing inter-frame interpolation on the B image at the first frame rate to obtain a B image at a second frame rate; and
obtaining an anatomical M-mode image based on the B image at the second frame rate.
Optionally, the adjusting B-image imaging parameters with the aim of increasing the B-image frame rate based on the trace speed of the M-line and the image quality requirement, and scanning to obtain the B image at the first frame rate, includes:
obtaining the B image at the first frame rate by reducing the scan line density based on the trace speed of the M-line and the image quality requirement; or
obtaining the B image at the first frame rate by reducing the scanning range based on the trace speed of the M-line and the image quality requirement; or
reducing the scan line density until a preset condition A is met, and then reducing the scanning range, to obtain the B image at the first frame rate.
Optionally, before the inter-frame interpolation is performed on the B image at the first frame rate to obtain the B image at the second frame rate, the method further includes:
determining the number of B-image frames to be inserted between two adjacent frames of the B image at the first frame rate, where the number of B-image frames to be inserted is positively correlated with the trace speed of the M-line and negatively correlated with the first frame rate;
and the performing inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate includes:
performing inter-frame interpolation on the B image at the first frame rate based on the number of B-image frames to be inserted, to obtain the B image at the second frame rate.
Optionally, the number of B-picture frames to be inserted is determined based on the following formula:
N = (M_Speed - fps_new) / fps_new
where N denotes the number of B-image frames to be inserted, M_Speed denotes the trace speed of the M-line, and fps_new denotes the first frame rate.
Optionally, the performing inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate includes:
for the k-th image and the (k+1)-th image, obtaining an intermediate image between the k-th image and the (k+1)-th image by weighted summation, where k is a positive integer.
Optionally, the sum of the weight of the k-th image and the weight of the (k+1)-th image is 1, and the method includes:
determining the weight of the k-th image based on the following formula:
α = f(A, B, d, L, C)  [the explicit formula is given as an image in the original publication]
wherein:
A: used to control the attenuation curve between frames;
B: a constant used to control the flatness between two frames; the faster the motion speed of the part under examination, the smaller the value of B;
d: represents the difference between the pixel value of the current frame and the pixel value of the previous frame;
L: represents the correlation level of the two frames;
C: represents a constant whose value is less than 1.
Optionally, the obtaining an anatomical M-mode image based on the B image at the second frame rate includes:
acquiring the pixel coordinates of the M-line from the B image at the second frame rate;
performing scan conversion on the pixel coordinates to obtain the rectangular-coordinate-system coordinates of the M-line; and
obtaining and displaying the anatomical M-mode image of the M-line using the rectangular-coordinate-system coordinates.
In a second aspect, the present application also provides an ultrasound device comprising: a processor, a memory, a display unit, and a probe;
a probe for emitting an ultrasonic beam;
a display unit for displaying an anatomical M-mode image;
the memory is configured to store data required for ultrasound imaging;
a processor, respectively connected with the probe and the display unit, configured to perform:
based on the trace speed of the M-line and the image quality requirement, adjusting B-image imaging parameters with the aim of increasing the B-image frame rate, and scanning to obtain a B image at a first frame rate;
performing inter-frame interpolation on the B image at the first frame rate to obtain a B image at a second frame rate; and
obtaining an anatomical M-mode image based on the B image at the second frame rate.
Optionally, for the adjusting B-image imaging parameters with the aim of increasing the B-image frame rate based on the trace speed of the M-line and the image quality requirement, and scanning to obtain the B image at the first frame rate, the processor is configured to perform:
obtaining the B image at the first frame rate by reducing the scan line density based on the trace speed of the M-line and the image quality requirement; or
obtaining the B image at the first frame rate by reducing the scanning range based on the trace speed of the M-line and the image quality requirement; or
reducing the scan line density until a preset condition A is met, and then reducing the scanning range, to obtain the B image at the first frame rate.
Optionally, before performing the inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate, the processor is further configured to:
determine the number of B-image frames to be inserted between two adjacent frames of the B image at the first frame rate, where the number of B-image frames to be inserted is positively correlated with the trace speed of the M-line and negatively correlated with the first frame rate;
and the performing inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate includes:
performing inter-frame interpolation on the B image at the first frame rate based on the number of B-image frames to be inserted, to obtain the B image at the second frame rate.
Optionally, the number of B-picture frames to be inserted is determined based on the following formula:
N = (M_Speed - fps_new) / fps_new
where N denotes the number of B-image frames to be inserted, M_Speed denotes the trace speed of the M-line, and fps_new denotes the first frame rate.
Optionally, the performing inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate includes:
for the k-th image and the (k+1)-th image, obtaining an intermediate image between the k-th image and the (k+1)-th image by weighted summation, where k is a positive integer.
Optionally, the sum of the weight of the k-th image and the weight of the (k+1)-th image is 1, and the weight of the k-th image is determined based on the following formula:
α = f(A, B, d, L, C)  [the explicit formula is given as an image in the original publication]
wherein:
A: used to control the attenuation curve between frames;
B: a constant used to control the flatness between two frames; the faster the motion speed of the part under examination, the smaller the value of B;
d: represents the difference between the pixel value of the current frame and the pixel value of the previous frame;
L: represents the correlation level of the two frames;
C: represents a constant whose value is less than 1.
Optionally, the obtaining an anatomical M-mode image based on the B image at the second frame rate includes:
acquiring the pixel coordinates of the M-line from the B image at the second frame rate;
performing scan conversion on the pixel coordinates to obtain the rectangular-coordinate-system coordinates of the M-line; and
obtaining and displaying the anatomical M-mode image of the M-line using the rectangular-coordinate-system coordinates.
In a third aspect, an embodiment of the present application further provides a computer-readable storage medium, where the instructions stored therein, when executed by a processor of an electronic device, enable the electronic device to perform any one of the methods provided in the first aspect of the present application.
In a fourth aspect, an embodiment of the present application provides a computer program product comprising a computer program that, when executed by a processor, implements any one of the methods provided in the first aspect of the present application.
The technical solutions provided by the embodiments of the present application have at least the following beneficial effects: in order to improve the quality of the anatomical M-mode image and reduce mosaic artifacts, a method is provided that can effectively increase the B-image frame rate while ensuring B-image quality, so that even on ultrasound equipment whose native B-image frame rate is low, the quality of the basic data used for anatomical M-mode imaging can be improved; that is, a high B-image frame rate is obtained without sacrificing image quality.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings used in the embodiments are briefly described below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
fig. 1 is a schematic diagram of the framework of an ultrasound apparatus provided in an embodiment of the present application;
fig. 2 is a schematic view of an ultrasound apparatus provided in an embodiment of the present application implementing anatomical M-mode imaging;
FIG. 3 is a schematic diagram of a user interface provided in an embodiment of the present application;
FIG. 4 is a schematic diagram of another user interface provided in an embodiment of the present application;
FIG. 5 is a schematic flow chart of an anatomical M-mode imaging method according to an embodiment of the present application;
fig. 6 is a schematic diagram illustrating a process of inter-frame interpolation according to an embodiment of the present application;
FIG. 7 is another schematic flow chart diagram of an anatomical M-mode imaging method according to an embodiment of the present application;
fig. 8-9 are comparative illustrations provided in accordance with an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
Hereinafter, some terms in the embodiments of the present application are explained to facilitate understanding by those skilled in the art.
(1) In the embodiments of the present application, the term "plurality" means two or more, and other terms are similar thereto.
(2) "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
(3) An ultrasound device, in the embodiments of the present application, refers to an ultrasound imaging device that supports B-mode and anatomical M-mode. The user can select the anatomical M-mode on the display interface through a system menu or control keys. In this mode, the ultrasound device can generate and present an anatomical M-mode image based on the B image. The anatomical M-mode image describes how the intensity of the echo signal along the depth direction varies with time; for example, the motion of the myocardium can be observed through the anatomical M-mode image, which helps the doctor diagnose diseases.
In view of the problem in the related art that the anatomical M-mode image is prone to mosaic artifacts when the B-mode frame rate is low, which affects the doctor's judgment and diagnosis of the tissue structure, the embodiments of the present application provide an anatomical M-mode imaging method and an ultrasound device.
In the related art, the frame rate of the anatomical M-mode image is increased by interpolating between pairs of M-lines. If the interpolation between two anatomical M-lines is performed before scan conversion, the biggest problem is that tissue motion information is missed: because the B-image frame rate is low, two lines cannot accurately estimate the complete tissue motion. Moreover, the coordinates obtained when converting between the rectangular coordinate system and the polar coordinate system during scan conversion are usually not integers, so interpolation between adjacent M-lines is required during the conversion; since a point after scan conversion does not correspond exactly to a single point before scan conversion and usually has to be interpolated from several points, interpolating between only two adjacent M-lines cannot fully describe the motion trajectory, introduces a certain error, and tends to produce mosaic artifacts. If the M-line interpolation is performed only after scan conversion, the accuracy of each point on the M-line cannot be guaranteed either, and mosaic artifacts also tend to appear.
In the embodiments of the present application, in order to improve the quality of the anatomical M-mode image and reduce mosaic artifacts, a method is provided that can effectively increase the B-image frame rate while ensuring B-image quality. Even on ultrasound equipment whose native B-image frame rate is low, the quality of the basic data used for anatomical M-mode imaging can be improved; that is, a high B-image frame rate is obtained without sacrificing image quality, so that the underlying B-image data can express the tissue motion well and the resulting anatomical M-mode image has better quality and fewer mosaic artifacts.
The ultrasound apparatus and the anatomical M-mode imaging method provided by the embodiments of the present application are explained below with reference to the drawings.
Referring to fig. 1, a block diagram of an ultrasound apparatus provided in an embodiment of the present application is shown.
It should be understood that the ultrasound device 100 shown in fig. 1 is merely an example, and that the ultrasound device 100 may have more or fewer components than shown in fig. 1, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
A block diagram of a hardware configuration of an ultrasound apparatus 100 according to an exemplary embodiment is exemplarily shown in fig. 1.
As shown in fig. 1, the ultrasound apparatus 100 may include, for example: a processor 110, a memory 120, a display unit 130, and a probe 140; wherein,
a probe 140 for emitting an ultrasonic beam;
a display unit 130 for displaying an anatomical M-mode image;
the memory 120 is configured to store data required for ultrasound imaging, which may include software programs, application interface data, and the like;
a processor 110, respectively connected to the probe 140 and the display unit 130, configured to perform:
based on the trace speed of the M-line and the image quality requirement, adjusting B-image imaging parameters with the aim of increasing the B-image frame rate, and scanning to obtain a B image at a first frame rate;
performing inter-frame interpolation on the B image at the first frame rate to obtain a B image at a second frame rate;
and obtaining an anatomical M-mode image based on the B image at the second frame rate.
Optionally, for the adjusting B-image imaging parameters with the aim of increasing the B-image frame rate based on the trace speed of the M-line and the image quality requirement, and scanning to obtain the B image at the first frame rate, the processor is configured to perform:
obtaining the B image at the first frame rate by reducing the scan line density based on the trace speed of the M-line and the image quality requirement; or
obtaining the B image at the first frame rate by reducing the scanning range based on the trace speed of the M-line and the image quality requirement; or
reducing the scan line density until a preset condition A is met, and then reducing the scanning range, to obtain the B image at the first frame rate.
Optionally, before performing inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate, the processor is further configured to:
determine the number of B-image frames to be inserted between two adjacent frames of the B image at the first frame rate, where the number of B-image frames to be inserted is positively correlated with the trace speed of the M-line and negatively correlated with the first frame rate;
and the performing inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate includes:
performing inter-frame interpolation on the B image at the first frame rate based on the number of B-image frames to be inserted, to obtain the B image at the second frame rate.
Optionally, the number of B-image frames to be inserted is determined based on the following formula:
N = (M_Speed - fps_new) / fps_new
where N denotes the number of B-image frames to be inserted, M_Speed denotes the trace speed of the M-line, and fps_new denotes the first frame rate.
Optionally, the performing inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate includes:
for the k-th image and the (k+1)-th image, obtaining an intermediate image between the k-th image and the (k+1)-th image by weighted summation, where k is a positive integer.
Optionally, the sum of the weight of the k-th image and the weight of the (k+1)-th image is 1, and the weight of the k-th image is determined based on the following formula:
α = f(A, B, d, L, C)  [the explicit formula is given as an image in the original publication]
wherein:
A: used to control the attenuation curve between frames;
B: a constant used to control the flatness between two frames; the faster the motion speed of the part under examination, the smaller the value of B;
d: represents the difference between the pixel value of the current frame and the pixel value of the previous frame;
L: represents the correlation level of the two frames;
C: represents a constant whose value is less than 1.
Optionally, the obtaining an anatomical M-mode image based on the B image at the second frame rate includes:
acquiring the pixel coordinates of the M-line from the B image at the second frame rate;
performing scan conversion on the pixel coordinates to obtain the rectangular-coordinate-system coordinates of the M-line; and
obtaining and displaying the anatomical M-mode image of the M-line using the rectangular-coordinate-system coordinates.
Fig. 2 is a schematic diagram of the application principle according to an embodiment of the present application. This part can be implemented by some of the modules or functional components of the ultrasound apparatus shown in fig. 1; the following description covers only the main components, and other components, such as the memory, the controller, and the control circuits, are not described again here.
As shown in fig. 2, the application environment may include a user interface 210, a display unit 220 for displaying the user interface, and a processor 230.
The display unit 220 may include a display panel 221 and a backlight assembly 222. The display panel 221 is configured to display the ultrasound image; the backlight assembly 222 is located at the back of the display panel 221 and may include a plurality of backlight partitions (not shown), each of which can emit light to illuminate the display panel 221.
The processor 230 may be configured to control the backlight brightness of the backlight partitions in the backlight assembly 222, and to control the probe to transmit a wide beam and receive echo signals.
The processor 230 may include a focusing processing unit 231, a beam synthesis unit 232, and a spectrum generation unit 233. The focusing processing unit 231 may be configured to perform focusing processing on the current frame of the ultrasound image, the focusing processing including: taking the ultrasonic puncture needle in the current frame of the ultrasound image as the focusing position of the wide beam, transmitting the wide beam to the target detection region according to the emission coefficient of the ultrasonic puncture needle, and receiving the echo signal fed back by the needle body of the ultrasonic puncture needle. The beam synthesis unit 232 is configured to perform beam synthesis on the echo signals fed back by the ultrasonic puncture needle in the target detection region after the focusing processing of the ultrasonic puncture needle is completed, so as to obtain scanning information. The spectrum generation unit 233 is configured to perform Doppler imaging based on the scanning information of each ultrasonic puncture needle.
In the embodiment of the present application, in order to obtain an anatomical M-mode image, the user can set the number and the trajectory of the M-lines through the operation interface of the ultrasound device. In principle, any number of M-lines can be provided; usually 1 to 3 lines are enough to meet medical diagnosis requirements. In the embodiment of the present application, the anatomical M-mode imaging quality is improved by increasing the B-image frame rate, so there is no limitation on the shape of the M-line, and in the operation interface shown in fig. 3 the user can set an M-line of arbitrary shape. The shape of the M-line is determined by the configured M-line track points; as shown in fig. 3, the user can set the number of track points on the M-line (the white circles in fig. 3 indicate track points) and adjust their positions to obtain an M-line with the desired trajectory.
In addition, the user can select the trace speed of the M-line on the operation interface. The trace speed describes the moving speed of the M-mode spectral line, for example, its moving speed along the time axis.
The trace speed can be divided into several gears. As shown in fig. 4, the trace speed includes a low gear, a middle gear, and a high gear; the higher the gear, the higher the trace speed. The higher the trace speed, the higher the B-image frame rate required to describe the tissue motion trajectory well. It should be noted that all the user interfaces in the embodiments of the present application are only used to illustrate the embodiments and are not limiting.
As shown in fig. 5, in step 501 of the embodiment of the present application, based on the trace speed of the M-line and the image quality requirement, the B-image imaging parameters are adjusted with the aim of increasing the B-image frame rate, and a B image at a first frame rate is obtained by scanning.
The first frame rate is higher than the frame rate before the B-image imaging parameters are adjusted, but it is often still lower than the frame rate that matches the trace speed.
In the present application, the B-image frame rate is increased by optimizing the B-image imaging parameters while the image quality requirement is ensured, and the B image at the first frame rate is obtained by actual scanning, so it can accurately express the tissue motion.
Since the frame rate can be increased when the number of scan lines in each B-image frame is small, it is desirable to increase the frame rate by reducing the number of scan lines per frame. In addition, how many scan lines are needed in a frame also depends on the scanning range. Therefore, in implementation, the optimized B-image imaging parameters may include the scan line density and/or the scanning range; a reasonable reduction of the line density can maintain the image quality while increasing the B-image frame rate. Implementations of increasing the B-image frame rate based on optimizing the B-image imaging parameters include:
(1) Increasing the B-image frame rate by reducing the scan line density. This can be implemented as: obtaining the B image at the first frame rate by reducing the scan line density based on the trace speed of the M-line and the image quality requirement. Specifically, a lower limit of the scan line density that satisfies the image quality requirement can be determined based on the trace speed of the M-line, and the B-image frame rate is then increased by gradually reducing the scan line density while ensuring that the reduced scan line density remains above that lower limit.
The scan line density can be expressed as shown in equation (1):
D = 1 / (fps_new × t × R)    (1)
In equation (1), D represents the scan line density, fps_new represents the first frame rate, t represents the scanning detection time of each scan line, and R represents the scanning range.
The first frame rate is related to the trace speed and is limited by the capability of the device, and it is often lower than the frame rate that matches the trace speed. In practice, in order to ensure image quality, a lower limit of the scan line density may be configured; when the scan line density determined from the first frame rate would fall below this lower limit, the lower limit is used as the final scan line density, and the frame rate is then further increased by inter-frame interpolation.
(2) Increasing the B-image frame rate by reducing the scanning range. This can be implemented as: obtaining the B image at the first frame rate by reducing the scanning range based on the trace speed of the M-line and the image quality requirement.
In other words, on the premise that the lesion needed for medical diagnosis can still be scanned, the B-image frame rate is increased by adjusting the scanning range. For example, the original scanning range may be a first region that contains a large amount of tissue that does not need to be observed, with the lesion occupying only a small portion of the area; the scanning range can then be reduced to a second region that contains the lesion area and a small amount of surrounding tissue. Reducing the scanning range in this way increases the frame rate.
The scanning range can be expressed as shown in equation (2):
R = 1 / (fps_new × t × D)    (2)
In equation (2), D represents the scan line density, fps_new represents the first frame rate, t represents the scanning detection time of each scan line, and R represents the scanning range.
In practice, the user may select a smaller scanning range according to his or her own needs before entering the anatomical M-mode.
(3) Increasing the B-image frame rate by reducing both the scan line density and the scanning range. This can be implemented as: preferentially reducing the scan line density to a certain extent, for example until it reaches the scan line density lower limit, and then reducing the scanning range to further increase the B-image frame rate.
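As a minimal illustration of the relationship captured by equations (1) and (2), the following Python sketch (function names, parameter names, and example values are illustrative assumptions, not taken from the disclosure) computes the frame rate implied by a given scan line density and scanning range, and clamps the density at a configured lower limit as in approaches (1) and (3), leaving any remaining frame-rate gap to the inter-frame interpolation of step 502:

```python
def achievable_frame_rate(line_density: float, scan_range: float, t_line: float) -> float:
    """Frame rate implied by equation (1): fps = 1 / (D * t * R)."""
    return 1.0 / (line_density * t_line * scan_range)


def choose_line_density(target_fps: float, scan_range: float, t_line: float,
                        density_lower_limit: float) -> float:
    """Pick the scan line density that would reach target_fps, but never go below
    the quality-preserving lower limit; any remaining frame-rate shortfall is
    left to inter-frame interpolation."""
    density = 1.0 / (target_fps * t_line * scan_range)
    return max(density, density_lower_limit)


# Hypothetical example values (not from the patent):
t_line = 200e-6      # 200 microseconds of scanning/detection time per scan line
scan_range = 90.0    # scanning range, e.g. a sector angle in degrees
density = choose_line_density(target_fps=60.0, scan_range=scan_range,
                              t_line=t_line, density_lower_limit=1.2)
print(density, achievable_frame_rate(density, scan_range, t_line))
# -> 1.2 lines/degree (clamped at the lower limit), about 46.3 fps achievable by scanning
```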
After the B-image imaging parameters have been optimized to increase the B-image frame rate, in order to increase it further, in step 502 of the embodiment of the present application, inter-frame interpolation may be performed on the B image at the first frame rate to obtain a B image at a second frame rate. In this way, the tissue motion trajectory between two adjacent B-image frames can be obtained through inter-frame interpolation.
In implementation, in order to ensure the quality of the interpolated images, the number of B-image frames to be inserted between two adjacent frames of the B image at the first frame rate may be determined first, and inter-frame interpolation is then performed on the B image at the first frame rate based on that number, to obtain the B image at the second frame rate.
In order to ensure the B-image quality, in the embodiment of the present application the number of frames to be inserted is positively correlated with the trace speed of the M-line and negatively correlated with the first frame rate.
For example, the number of B-image frames to be inserted can be determined according to equation (3):
N = (M_Speed - fps_new) / fps_new    (3)
where N represents the number of B-image frames to be inserted, M_Speed represents the trace speed of the M-line, and fps_new represents the first frame rate.
Equation (3) aims to make the frame rate match the trace speed as closely as possible.
The numerator alone, M_Speed - fps_new, can be read directly as the number of frames that need to be inserted per unit time.
In this way, based on the first frame rate and the trace speed of the M-line, the number of B-image frames to be inserted between two adjacent B-image frames can be determined reasonably: inserting too many B images would waste processing resources, while inserting too few would leave gaps in the tissue motion trajectory. Equation (3) balances these considerations and yields an appropriate number of frames to insert.
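A minimal sketch of this frame-count calculation, assuming the form of equation (3) shown above and rounding up to an integer (the rounding behaviour and the function name are illustrative assumptions):

```python
import math


def frames_to_insert(m_speed: float, fps_new: float) -> int:
    """Number of B-image frames to insert between two adjacent scanned frames,
    following equation (3): N = (M_Speed - fps_new) / fps_new.
    Rounded up so the interpolated frame rate is not below the trace speed."""
    if m_speed <= fps_new:
        return 0  # the scanned frame rate already matches the trace speed
    return math.ceil((m_speed - fps_new) / fps_new)


# Hypothetical example: the trace speed calls for ~100 fps, scanning yields 30 fps.
n = frames_to_insert(m_speed=100.0, fps_new=30.0)   # -> 3 frames per gap
print(n, 30.0 * (n + 1))                            # effective second frame rate: 120.0
```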
When performing the inter-frame interpolation, for the k-th image and the (k+1)-th image, an intermediate image between them can be obtained by weighted summation, where k is a positive integer. The interpolation can be expressed as shown in equation (4):
I(n) = α·I(k) + (1 - α)·I(k+1)    (4)
wherein:
I(k) represents the image data of the k-th B image, and I(k+1) represents the image data of the (k+1)-th B image;
n represents the n-th frame inserted between the two frames. One possible implementation is shown in fig. 6: assume that A1 and A2 are scanned B images, i.e., images in the B image at the first frame rate, and that three images B1, B2 and B3 need to be inserted between A1 and A2 in that order. The B1 image is obtained with A1 and A2 as I(k) and I(k+1); the B2 image is obtained with B1 and A2 as I(k) and I(k+1); and so on, the B3 image is obtained with B2 and A2 as I(k) and I(k+1). Of course, in other embodiments, the intermediate B images may also be obtained from the same two images by adjusting their weights.
α is the frame weight coefficient, used to control the weighting of the current frame and the previous frame.
The weighted summation is computationally simple, which saves processing resources while still producing intermediate B images that meet the requirements.
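The cascaded weighted summation described for fig. 6 can be sketched as follows in Python/NumPy (a fixed α of 0.5 is used here only as a placeholder; the patent determines α per frame from equation (5)):

```python
import numpy as np


def insert_frames(a1: np.ndarray, a2: np.ndarray, n: int, alpha: float = 0.5) -> list:
    """Insert n intermediate B images between scanned frames a1 and a2.
    Each new frame is a weighted sum, equation (4), of the previously generated
    frame (I(k)) and the next scanned frame a2 (I(k+1))."""
    frames = []
    prev = a1
    for _ in range(n):
        inter = alpha * prev + (1.0 - alpha) * a2   # I(n) = α·I(k) + (1 - α)·I(k+1)
        frames.append(inter)
        prev = inter   # cascade: B1 from (A1, A2), B2 from (B1, A2), B3 from (B2, A2)
    return frames


# Hypothetical 4x4 "B images" with pixel values in 0..255
a1 = np.zeros((4, 4), dtype=np.float32)
a2 = np.full((4, 4), 200.0, dtype=np.float32)
for f in insert_frames(a1, a2, n=3):
    print(float(f[0, 0]))   # 100.0, 150.0, 175.0: gradual transition toward A2
```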
In addition, in order to further improve the quality of the interpolated intermediate B images, in the embodiment of the present application the weight of the k-th image is determined during interpolation based on the following equation (5):
α = f(A, B, d, L, C), with 0 ≤ d ≤ 255    (5)  [the explicit formula is given as an image in the original publication]
In equation (5):
A: used to control the attenuation curve between B-image frames. The larger the value of A, the faster the whole curve decays; conversely, the smaller the value, the slower the decay. The setting of A is related to the examined part: the faster the part moves, the larger the value of A required, and the value of A can be obtained from experimental tests.
B: a constant used to control the flatness between two frames, i.e., how quickly the transition between the two frames changes. If the motion is fast, the next frame needs to differ more from the previous frame, so B should be set to a small value; otherwise a larger value should be used. That is, the faster the motion of the examined part, the smaller the value of B; since B is related to the examined part, the B values for different parts can be determined experimentally.
d: represents the difference between the pixel value of the current frame and the pixel value of the previous frame.
L: represents the correlation level of the two frames. The correlation level can be understood as a gear associated with the frame; different positions are assigned different gears, the gears are integers selected from 0 to M, and the higher the gear, the larger the coefficient α.
C: represents a constant with a value less than 1; 0.5 can usually be used.
This weight-determination scheme takes the correlation between frames into account and incorporates the notion of motion, so the determined weight is more reasonable and the interpolated intermediate images are more realistic.
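Because equation (5) is published only as an image, the sketch below uses an assumed exponential form, α = min(1, C + L·C·exp(-A·d/B)), chosen purely to illustrate how the described parameters could interact (larger A gives faster decay with the pixel difference d, smaller B gives a quicker frame-to-frame transition, a higher correlation level L gives a larger α, and C is a constant below 1); it is not the patent's actual formula:

```python
import math


def frame_weight(d: float, a: float, b: float, level: int, c: float = 0.5) -> float:
    """Weight α of the k-th (previous) image.

    NOTE: the functional form here is an illustrative assumption; the patent's
    equation (5) is only available as an image. The behaviour merely follows the
    stated properties: larger A -> faster decay, smaller B -> quicker transition,
    higher level L -> larger α, constant C < 1 (typically 0.5), 0 <= d <= 255.
    """
    d = max(0.0, min(255.0, d))
    alpha = c + level * c * math.exp(-a * d / b)
    return min(1.0, alpha)


# Hypothetical parameters for a fast-moving region (values are examples only)
print(frame_weight(d=40.0, a=0.5, b=20.0, level=1))    # ~0.68
print(frame_weight(d=200.0, a=0.5, b=20.0, level=1))   # ~0.50: a large change favours the next frame
```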
After the inter-frame interpolation is performed to obtain the B image at the second frame rate, in step 503 an anatomical M-mode image can be obtained based on the B image at the second frame rate.
For each M-line, this can be implemented through the following steps:
In step SC1, the pixel coordinates of the M-line are acquired from the B image at the second frame rate; these pixel coordinates are points in the B image, so their values are in the polar coordinate system.
In step SC2, scan conversion is performed on the pixel coordinates to obtain the rectangular-coordinate-system coordinates of the M-line.
In step SC3, the anatomical M-mode image of the M-line is obtained and displayed using the rectangular-coordinate-system coordinates.
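A minimal sketch of steps SC1 to SC3 under an assumed sector-scan geometry (all helper names, indexing conventions, and parameter values are illustrative assumptions rather than details from the disclosure):

```python
import numpy as np


def sample_m_line(b_frame: np.ndarray, m_line_pixels: list) -> np.ndarray:
    """SC1: read the gray values along the user-defined M-line from one B image.
    b_frame is indexed by (sample-along-beam, beam-index), i.e. polar samples."""
    return np.array([b_frame[r, th] for (r, th) in m_line_pixels], dtype=np.float32)


def scan_convert(m_line_pixels: list, r0: float, dr: float,
                 theta0: float, dtheta: float) -> np.ndarray:
    """SC2: convert each polar sample (range index, beam index) of the M-line
    to rectangular (x, y) coordinates for display."""
    xy = []
    for r_idx, th_idx in m_line_pixels:
        r = r0 + r_idx * dr
        theta = theta0 + th_idx * dtheta
        xy.append((r * np.sin(theta), r * np.cos(theta)))
    return np.array(xy)


def build_anatomical_m_image(b_frames: list, m_line_pixels: list) -> np.ndarray:
    """SC3: stack the sampled M-line of every (interpolated) B frame along the
    time axis to form the anatomical M-mode image (position along M-line x time)."""
    columns = [sample_m_line(f, m_line_pixels) for f in b_frames]
    return np.stack(columns, axis=1)


# Hypothetical data: 10 B frames of 64 samples x 32 beams; M-line along beam 16.
b_frames = [np.random.rand(64, 32).astype(np.float32) for _ in range(10)]
m_line = [(r, 16) for r in range(64)]
coords = scan_convert(m_line, r0=0.0, dr=0.5e-3, theta0=-0.6, dtheta=0.04)
m_image = build_anatomical_m_image(b_frames, m_line)
print(coords.shape, m_image.shape)  # (64, 2) display coordinates, (64, 10) depth x time
```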
In addition, in the embodiments of the present application, the temporal resolution and/or the spatial resolution of the anatomical M-mode image may be further improved by interpolating the M-line based on the B image.
Fig. 7 is a schematic flow chart of an anatomical M-mode imaging method according to an embodiment of the present application, including the following steps:
In step 701, the scan line density is reduced, then the scanning range is reduced, and scanning is performed to obtain a B image at the first frame rate.
In step 702, the number of B-image frames to be inserted between two adjacent B-image frames is determined based on the B image at the first frame rate and the trace speed of the M-line.
In step 703, the weight between two adjacent B-image frames is determined, and B images are then inserted between the two adjacent frames by weighted summation, to obtain the B image at the second frame rate.
In step 704, the pixel coordinates of the M-line are acquired from the B image at the second frame rate.
In step 705, scan conversion is performed on the pixel coordinates to obtain the rectangular-coordinate-system coordinates of the M-line.
In step 706, the anatomical M-mode image of the M-line is obtained and displayed using the rectangular-coordinate-system coordinates.
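Putting steps 701 to 706 together, the following self-contained sketch mirrors the overall flow under the same assumptions as the earlier examples (a fixed α placeholder instead of equation (5), the reconstructed equation (3) for the frame count, and a toy M-line); it is an outline, not the patent's implementation:

```python
import math
import numpy as np


def anatomical_m_mode_pipeline(scanned: list, m_line: list,
                               m_speed: float, fps_first: float,
                               alpha: float = 0.5) -> np.ndarray:
    """Steps 701-706 in one flow: interpolate the scanned B images up toward the
    trace speed, then sample the M-line from every frame and stack over time."""
    # Step 702: frames to insert between each adjacent pair (equation (3), rounded up).
    n = 0 if m_speed <= fps_first else math.ceil((m_speed - fps_first) / fps_first)
    # Step 703: cascaded weighted-sum interpolation (equation (4)).
    dense = []
    for a1, a2 in zip(scanned[:-1], scanned[1:]):
        dense.append(a1)
        prev = a1
        for _ in range(n):
            prev = alpha * prev + (1.0 - alpha) * a2
            dense.append(prev)
    dense.append(scanned[-1])
    # Steps 704-706: sample the M-line per frame; each frame becomes one time column.
    cols = [np.array([f[r, th] for (r, th) in m_line], dtype=np.float32) for f in dense]
    return np.stack(cols, axis=1)


# Hypothetical toy data: 5 scanned B frames (64 samples x 32 beams), M-line on beam 16.
scanned = [np.random.rand(64, 32).astype(np.float32) for _ in range(5)]
m_line = [(r, 16) for r in range(64)]
print(anatomical_m_mode_pipeline(scanned, m_line, m_speed=100.0, fps_first=30.0).shape)
# -> (64, 17): 5 scanned frames plus 4 gaps x 3 inserted frames, stacked along time
```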
Fig. 8 shows an anatomical M-mode image obtained before the scheme of the embodiment of the present application is applied; as can be seen from fig. 8, the image is relatively blurred and the mosaic artifacts degrade the image quality. Fig. 9 shows the effect after the scheme provided by the embodiment of the present application is adopted; the M-lines in fig. 9 are clear, and the mosaic artifacts are significantly reduced.
Although the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the operations shown must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable image scaling apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable image scaling apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable image scaling apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. An anatomical M-mode imaging method, characterized in that the method comprises:
based on the trace speed of the M-line and the image quality requirement, adjusting B-image imaging parameters with the aim of increasing the B-image frame rate, and scanning to obtain a B image at a first frame rate;
performing inter-frame interpolation on the B image at the first frame rate to obtain a B image at a second frame rate; and
obtaining an anatomical M-mode image based on the B image at the second frame rate.
2. The method according to claim 1, wherein the adjusting B-image imaging parameters with the aim of increasing the B-image frame rate based on the trace speed of the M-line and the image quality requirement, and scanning to obtain the B image at the first frame rate, comprises:
obtaining the B image at the first frame rate by reducing the scan line density based on the trace speed of the M-line and the image quality requirement; or
obtaining the B image at the first frame rate by reducing the scanning range based on the trace speed of the M-line and the image quality requirement; or
reducing the scan line density until a preset condition A is met, and then reducing the scanning range, to obtain the B image at the first frame rate.
3. The method according to claim 1, wherein before the inter-frame interpolation is performed on the B image at the first frame rate to obtain the B image at the second frame rate, the method further comprises:
determining the number of B-image frames to be inserted between two adjacent frames of the B image at the first frame rate, wherein the number of B-image frames to be inserted is positively correlated with the trace speed of the M-line and negatively correlated with the first frame rate;
and the performing inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate comprises:
performing inter-frame interpolation on the B image at the first frame rate based on the number of B-image frames to be inserted, to obtain the B image at the second frame rate.
4. The method according to claim 3, wherein the number of B-image frames to be inserted is determined based on the following formula:
N = (M_Speed - fps_new) / fps_new
wherein N denotes the number of B-image frames to be inserted, M_Speed denotes the trace speed of the M-line, and fps_new denotes the first frame rate.
5. The method according to claim 1, wherein the performing inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate comprises:
for the k-th image and the (k+1)-th image, obtaining an intermediate image between the k-th image and the (k+1)-th image by weighted summation, wherein k is a positive integer.
6. The method according to claim 5, wherein the sum of the weight of the k-th image and the weight of the (k+1)-th image is 1, and the method comprises:
determining the weight of the k-th image based on the following formula:
α = f(A, B, d, L, C)  [the explicit formula is given as an image in the original publication]
wherein:
A: used to control the attenuation curve between frames;
B: a constant used to control the flatness between two frames; the faster the motion speed of the part under examination, the smaller the value of B;
d: represents the difference between the pixel value of the current frame and the pixel value of the previous frame;
L: represents the correlation level of the two frames;
C: represents a constant whose value is less than 1.
7. The method according to any one of claims 1-6, wherein the obtaining an anatomical M-mode image based on the B image at the second frame rate comprises:
acquiring the pixel coordinates of the M-line from the B image at the second frame rate;
performing scan conversion on the pixel coordinates to obtain the rectangular-coordinate-system coordinates of the M-line; and
obtaining and displaying the anatomical M-mode image of the M-line using the rectangular-coordinate-system coordinates.
8. An ultrasound device, comprising: a processor, a memory, a display unit, and a probe;
the probe is configured to emit an ultrasonic beam;
the display unit is configured to display an anatomical M-mode image;
the memory is configured to store data required for ultrasound imaging; and
the processor, respectively connected with the probe and the display unit, is configured to perform:
based on the trace speed of the M-line and the image quality requirement, adjusting B-image imaging parameters with the aim of increasing the B-image frame rate, and scanning to obtain a B image at a first frame rate;
performing inter-frame interpolation on the B image at the first frame rate to obtain a B image at a second frame rate; and
obtaining an anatomical M-mode image based on the B image at the second frame rate.
9. The ultrasound device according to claim 8, wherein, for the adjusting B-image imaging parameters with the aim of increasing the B-image frame rate based on the trace speed of the M-line and the image quality requirement, and scanning to obtain the B image at the first frame rate, the processor is configured to perform:
obtaining the B image at the first frame rate by reducing the scan line density based on the trace speed of the M-line and the image quality requirement; or
obtaining the B image at the first frame rate by reducing the scanning range based on the trace speed of the M-line and the image quality requirement; or
reducing the scan line density until a preset condition A is met, and then reducing the scanning range, to obtain the B image at the first frame rate.
10. The ultrasound device according to claim 8 or 9, wherein before performing the inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate, the processor is further configured to:
determine the number of B-image frames to be inserted between two adjacent frames of the B image at the first frame rate, wherein the number of B-image frames to be inserted is positively correlated with the trace speed of the M-line and negatively correlated with the first frame rate;
and the performing inter-frame interpolation on the B image at the first frame rate to obtain the B image at the second frame rate comprises:
performing inter-frame interpolation on the B image at the first frame rate based on the number of B-image frames to be inserted, to obtain the B image at the second frame rate.
CN202110832092.XA 2021-07-22 2021-07-22 Anatomical M-mode imaging method and ultrasonic equipment Pending CN115670501A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110832092.XA CN115670501A (en) 2021-07-22 2021-07-22 Anatomical M-mode imaging method and ultrasonic equipment


Publications (1)

Publication Number Publication Date
CN115670501A 2023-02-03

Family

ID=85045021

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110832092.XA Pending CN115670501A (en) 2021-07-22 2021-07-22 Anatomical M-mode imaging method and ultrasonic equipment

Country Status (1)

Country Link
CN (1) CN115670501A (en)

Similar Documents

Publication Publication Date Title
US10813595B2 (en) Fully automated image optimization based on automated organ recognition
CN101523237B (en) 3d ultrasonic color flow imaging with grayscale invert
JP6147489B2 (en) Ultrasonic imaging system
JP7078487B2 (en) Ultrasound diagnostic equipment and ultrasonic image processing method
KR20160110239A (en) Continuously oriented enhanced ultrasound imaging of a sub-volume
JP2020531074A (en) Ultrasound system with deep learning network for image artifact identification and removal
CN102596048B (en) Ultrasonographic device, ultrasonic image processing device, medical image diagnostic device, and medical image processing device
JP5600285B2 (en) Ultrasonic image processing device
JP2008259605A (en) Ultrasonic diagnostic equipment
US10166004B2 (en) Ultrasound contrast imaging method and apparatus
US20200334818A1 (en) Medical image processing apparatus
CN101375803B (en) Ultrasonic diagnosis device
KR20060110466A (en) Method and system for estimating motion from continuous images
JP2022110089A (en) System and method for adaptively setting dynamic range for displaying ultrasonic image
EP0414261A2 (en) Ultrasonic diagnosing apparatus
JP2012050818A (en) Ultrasonic system and method for providing color doppler mode image
JPH10286256A (en) Ultrasonic diagnostic device and filter
CN115670501A (en) Anatomical M-mode imaging method and ultrasonic equipment
JP6745237B2 (en) Ultrasonic diagnostic equipment
CN110322420A (en) Image processing method and image processing system for electronic endoscope system
CN103654868B (en) The formation method of ultrasonic diagnostic equipment and system
US20210338207A1 (en) Image analyzing apparatus
JP2019005574A (en) Medical image processing apparatus and medical image processing program
JP2022015544A (en) Ultrasonic diagnostic apparatus and ultrasonic signal processing method
US20240061109A1 (en) Ultrasound diagnosis apparatus, image processing apparatus, and computer program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination