WO2018195946A1 - Ultrasound image display method, device, and storage medium - Google Patents
Ultrasound image display method, device, and storage medium
- Publication number
- WO2018195946A1 (PCT/CN2017/082485)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- organ
- image
- map
- data
- planar
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
Definitions
- the present invention relates to an ultrasonic image display method and apparatus.
- Ultrasonic instruments are generally used by doctors to observe the internal tissue structure of the human body.
- the doctor places the ultrasound probe on the surface of the skin corresponding to the human body part, and an ultrasound image of the part can be obtained.
- Ultrasound has become one of the main aids to diagnosis because of its safety, convenience, non-invasiveness, and low cost.
- three-dimensional ultrasound has been widely used in clinical practice.
- the reason is that three-dimensional ultrasound can capture an entire tissue or organ in a single scan, after which the doctor can adjust the cut surface as clinically needed through post-processing such as rotation and translation, which greatly reduces the doctor's scan time and also facilitates data storage.
- however, three-dimensional space is relatively abstract, and existing ultrasound equipment lacks a clear orientation indication, which makes it difficult for many doctors to associate a three-dimensional spatial position with the actual organ; this limits the clinical use of three-dimensional ultrasound to a certain extent.
- the present invention mainly provides an ultrasonic image display method and apparatus that indicate the position of a slice image within the real organ by displaying a planar image mark on an organ map, so that the doctor can associate the slice image with the actual organ and visually grasp its slice position, which helps the doctor diagnose from the slice image.
- an ultrasound image display method including:
- a planar image marker is displayed at a corresponding location on the organ map at the location of the slice.
- the planar image marker comprises a planar region and/or a line segment, wherein the planar region is enclosed by a line segment and/or a curved segment.
- the organ map is a three-dimensional map or a two-dimensional plan.
- the ultrasound image display method further includes:
- a planar image mark is displayed at a corresponding position on the organ map at the current slice position.
- the organ map and the planar image indicia are distinguished by setting transparency and/or color on the organ map.
- when a plurality of planar image marks are included, different planar image marks are distinguished by setting transparency and/or color.
- the method for displaying an ultrasound image further includes: receiving a user instruction to change the position of a slice image, and when the position of the slice image changes, updating the position on the organ map of the planar image mark associated with that slice image.
- the importing the organ model data corresponding to the target tissue includes:
- the organ model data of the organ type is imported according to the acquired organ type.
- the method further includes:
- an ultrasound image display device including:
- a transmit/receive control circuit configured to control the probe to scan the target tissue to obtain three-dimensional volume data
- a data processor configured to generate one or more slice images from the three-dimensional volume data, import organ model data corresponding to the target tissue, and generate an organ map according to the organ model data;
- the three-dimensional volume data is matched with the organ model data to obtain a correspondence relationship; according to the correspondence relationship, a slice position of at least one of the one or more slice images relative to the organ model data is obtained, and a planar image mark is generated on the organ map at the position corresponding to the slice position;
- a display for displaying the one or more slice images generated by the data processor, displaying the organ map, and displaying the planar image mark at the position on the organ map corresponding to the slice position.
- the planar image marker comprises a planar region and/or a line segment, wherein the planar region is surrounded by a line segment and/or a curved segment.
- the organ map is a three-dimensional map or a two-dimensional plan.
- the data processor is further configured to acquire an instruction to activate a slice image, and obtain a current slice position of the current slice image relative to the organ model data according to the corresponding relationship;
- the display is further configured to display a planar image mark at a corresponding position of the current slice position on the organ map.
- the data processor is further configured to distinguish the organ map and the planar image marker by setting a transparency and/or a color.
- the data processor is further configured to distinguish different planar image marks by setting transparency and/or color.
- the ultrasonic image display device further includes an input unit configured to receive a user instruction to change the position of a slice image; when the position of the slice image changes, the data processor updates the position on the organ map of the planar image mark associated with that slice image, and the updated mark is shown by the display.
- the data processor is further configured to automatically identify the target tissue according to the three-dimensional volume data.
- the organ model data of the organ type is imported according to the automatically recognized organ type.
- the data processor obtains an ultrasound image according to the three-dimensional volume data, receives a user instruction to adjust the ultrasound image or the organ map, and adjusts the display orientation of the organ map or the ultrasound image accordingly, wherein the ultrasound image comprises a two-dimensional ultrasound image or a three-dimensional ultrasound image.
- the display is further configured to provide a selection or input interface for the user to select or input an organ type, and the data processor imports the organ model data of the organ type according to the organ type input by the user.
- an embodiment provides a storage medium storing a program executable by the processor to implement the ultrasound image display method disclosed in any of the above embodiments.
- the three-dimensional volume data is matched with the organ model data to obtain a correspondence relationship; according to the correspondence relationship, a slice position of at least one of the one or more slice images relative to the organ model data is obtained; and the planar image mark is displayed at the position on the organ map corresponding to the slice position.
- in this way, the position in the real organ of a displayed slice of the three-dimensional volume data is indicated by a planar image mark on an organ map, which helps the doctor understand where the slice image of the three-dimensional volume data lies in the actual organ.
- FIG. 1 is a flow chart of an ultrasonic image display method according to an embodiment of the present invention.
- FIG. 2 is a flowchart of importing organ model data corresponding to the target tissue, in an ultrasound image display method according to an embodiment of the present invention;
- FIG. 3 is a partial flow chart of an ultrasonic image display method according to another embodiment of the present invention.
- FIG. 4 is a schematic structural diagram of an ultrasonic image display device according to an embodiment of the present invention.
- FIG. 5 is a schematic diagram of an effect in an embodiment of the present invention.
- FIG. 6 is a diagram showing the change of a planar image mark after a positional movement of a slice image according to an embodiment of the present invention.
- FIG. 7 is a schematic diagram of the effect when the organ map is a two-dimensional plan in an embodiment of the present invention, wherein FIG. 7(a) is a section image and FIG. 7(b) is the organ map with a planar image mark on it.
- FIG. 8 is a schematic diagram of the effect of displaying a plurality of planar image marks on the organ map in an embodiment of the present invention.
- the terms "connected" and "connection" as used in this application include direct and indirect connections unless otherwise stated.
- a slice image is an image of the plane in which a certain slice lies within three-dimensional volume data. Like a two-dimensional ultrasound image, a slice of three-dimensional volume data is displayed as a cross-sectional image; the resulting lack of spatial information, combined with the complexity of operating the ultrasound system, makes it difficult for the doctor to map a slice image of the three-dimensional volume data onto the actual organ.
- the present invention further obtains the slice position of the displayed three-dimensional volume data relative to the organ model data, and indicates the position of the displayed slice in the real organ by displaying a planar image mark on an organ map, which helps the doctor understand where the slice image of the three-dimensional volume data lies in the actual organ.
- an embodiment of the present invention provides an ultrasonic image display method, including steps S10 to S70.
- Step S10 The control probe scans the target tissue to obtain three-dimensional volume data.
- the ultrasound probes mentioned in this article can be either 1D probes or area array probes.
- the three-dimensional volume data may be volume data obtained directly by an area array probe, three-dimensional volume data reconstructed from multiple frames of two-dimensional ultrasound images obtained by other types of probes, or data obtained by STIC (spatio-temporal image correlation).
- Step S20 Importing organ model data corresponding to the target tissue.
- step S20 includes step S21 and step S23.
- Step S21 Obtain an organ type corresponding to the target tissue.
- organ types can include the fetal heart, fetal brain, adult heart, liver, pelvic floor, and endometrium.
- the organ type can be determined manually, for example by providing an interface for the user to select the organ type, or automatically, by identifying the organ type of the target tissue from the acquired three-dimensional volume data, for example with machine learning methods.
- Specifically, to classify three-dimensional volume data, a large amount of target data (called positive samples) and non-target data (called negative samples) is collected, and a machine learning algorithm is designed to learn features that distinguish positive from negative samples.
- step S21, acquiring an organ type corresponding to the target tissue, includes at least one of the following methods: (1) providing a selection or input interface for the user to select or input an organ type.
- Step S23 Importing organ model data of the organ type according to the acquired organ type.
- Step S30 Matching the foregoing three-dimensional volume data with the organ model data to obtain a correspondence relationship.
- the three-dimensional volume data can be matched with the organ model data in the next step.
- the positions of some key anatomical structures in the organ model data can be obtained in advance. Since the purpose of displaying the corresponding position on the organ model is to help the doctor understand the position of the slice image of the three-dimensional volume data in the actual organ, the slice position displayed on the organ model does not need to be highly precise.
- it may be assumed that the three-dimensional volume data and the organ model data are related by a linear transformation, which may be a rigid body transformation or a non-rigid transformation.
- the transformation between the two can be computed by least-squares estimation, data fitting, RANSAC, and the like. Assuming a rigid body transformation between the three-dimensional volume data and the organ model data, the transformation can be represented by a matrix. If the organ model data is a three-dimensional model, determining the positions of at least three points in space suffices to establish the transformation between the two; if it is a two-dimensional model, the positions of at least two points suffice.
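As a concrete illustration of the rigid-body case, the least-squares rotation and translation between corresponding landmark points can be estimated with the Kabsch algorithm; the sketch below is one possible implementation under that assumption (function and variable names are ours, not the patent's).

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ~= R @ src + t
    (Kabsch algorithm). src and dst are (N, 3) arrays of corresponding
    landmark positions; as the text notes, N >= 3 for a 3-D model."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    src_c = src - src.mean(axis=0)              # centre both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                         # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Recover a known 30-degree rotation about z plus a translation.
rng = np.random.default_rng(0)
src = rng.random((4, 3))
c, s = np.cos(np.pi / 6), np.sin(np.pi / 6)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
dst = src @ R_true.T + t_true
R, t = estimate_rigid_transform(src, dst)
assert np.allclose(R, R_true) and np.allclose(t, t_true)
```

A RANSAC wrapper would repeat this estimate on random landmark subsets and keep the transform with the most inliers, which is what the text's mention of RANSAC suggests for noisy anatomical landmarks.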
- to establish the matching relationship, or correspondence, between the three-dimensional volume data and the organ model data, the key is to acquire the spatial positions of certain specific anatomical structures in the three-dimensional volume data; anatomical structures that are easy to locate and identify can be selected according to the organ the volume data corresponds to.
- in the fetal brain, for example, the anatomical features of the skull halo, the sagittal plane, the cerebellum, and the transparent septum (cavum septi pellucidi) are relatively distinct and easy to identify; using them, the matching relationship and correspondence between the three-dimensional volume data and the brain organ model data can be established.
- in the fetal heart, the anatomical features of the aorta, the four-chamber view, and the gastric bubble are relatively distinct and easy to identify; using them, the matching relationship and correspondence between the three-dimensional volume data of the fetal heart and the fetal heart model data can be established.
- the corresponding algorithm can be designed according to the characteristics of the anatomical structure.
- for example, the gastric bubble in fetal heart volume image data is usually a low-echo or anechoic ellipsoidal structure, and such a target can be segmented by image segmentation.
- the three-dimensional volume data may first be binarized and some necessary morphological operations performed to obtain several candidate regions; each candidate region is then judged by shape and other criteria to determine the probability that it is the gastric bubble, and the region with the highest probability is selected as the gastric bubble region.
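A toy sketch of that pipeline with `scipy.ndimage` is shown below; the threshold value and the shape score are illustrative choices of ours, not the patent's actual criteria.

```python
import numpy as np
from scipy import ndimage

def find_anechoic_candidate(volume, threshold):
    """Binarize to keep low-echo voxels, clean up with a morphological
    opening, label connected candidate regions, then score each region by
    size and by how well it fills its bounding box (a crude roundness
    cue); return the mask of the best-scoring region."""
    mask = volume < threshold                        # low-echo voxels
    mask = ndimage.binary_opening(mask, iterations=1)
    labels, n = ndimage.label(mask)
    best_label, best_score = 0, -1.0
    for lab, sl in enumerate(ndimage.find_objects(labels), start=1):
        region = labels[sl] == lab
        score = region.mean() * region.sum()         # compact and large
        if score > best_score:
            best_label, best_score = lab, score
    return labels == best_label

# Synthetic volume: bright background, one anechoic ellipsoid, one speck.
vol = np.full((40, 40, 40), 200.0)
z, y, x = np.ogrid[:40, :40, :40]
ellipsoid = ((z - 20) / 10.0) ** 2 + ((y - 20) / 7.0) ** 2 \
          + ((x - 20) / 5.0) ** 2 <= 1
vol[ellipsoid] = 10.0                 # the "gastric bubble" stand-in
vol[2:4, 2:4, 2:4] = 10.0             # a small spurious dark speck
sac = find_anechoic_candidate(vol, threshold=50.0)
assert sac[20, 20, 20]                # ellipsoid centre recovered
assert not sac[2, 2, 2]               # speck removed by the opening
```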
- template matching can also be used to detect some key anatomical structures in the three-dimensional volume data.
- for example, the transparent septum in the brain is shaped like a crescent, and data of this structure can be collected in advance to build a template. During detection, all possible regions in the volume data are traversed, each is matched against the template with a similarity measure, and the region with the highest similarity is selected as the target region.
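A minimal sketch of that traversal, using normalized cross-correlation as the similarity measure (one common choice; the patent does not fix a specific measure):

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def match_template(volume, template):
    """Brute-force scan: score every template-sized window and return the
    corner of the best match. Illustrative only; a real system would use
    an FFT-based correlation for speed."""
    tz, ty, tx = template.shape
    Z, Y, X = volume.shape
    best, best_pos = -2.0, (0, 0, 0)
    for z in range(Z - tz + 1):
        for y in range(Y - ty + 1):
            for x in range(X - tx + 1):
                s = ncc(volume[z:z+tz, y:y+ty, x:x+tx], template)
                if s > best:
                    best, best_pos = s, (z, y, x)
    return best_pos, best

# Embed the template in random data and recover its location.
rng = np.random.default_rng(1)
vol = rng.random((12, 12, 12))
tmpl = rng.random((4, 4, 4))
vol[5:9, 3:7, 6:10] = tmpl
pos, score = match_template(vol, tmpl)
assert pos == (5, 3, 6)
assert score > 0.99
```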
- a machine learning method can be used to detect some key anatomical structures in the three-dimensional body data.
- for the cerebellum in the brain, for example, a certain number of cerebellum images (called positive samples) and non-cerebellum images (called negative samples) can be collected in advance; a machine learning algorithm is then designed to automatically learn features that distinguish positive from negative samples. Using these features, all possible regions in the three-dimensional volume data are examined, the probability that each region is a positive sample is judged, and the region with the highest probability is selected as the target region.
- Commonly used machine learning algorithms include the Adaboost algorithm, support vector machines (SVM), neural network algorithms, deep learning algorithms, and so on.
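The positive/negative-sample scheme can be sketched with scikit-learn, taking the SVM from the list above; the feature vectors here are synthetic stand-ins (real features would be computed from the volume data), and the decision score is used as a proxy for the region probability.

```python
import numpy as np
from sklearn.svm import SVC  # SVM, one of the algorithms listed above

rng = np.random.default_rng(2)
# Synthetic stand-ins for features of target ("positive") and
# non-target ("negative") patches.
pos = rng.normal(loc=2.0, scale=0.5, size=(100, 8))
neg = rng.normal(loc=0.0, scale=0.5, size=(100, 8))
X = np.vstack([pos, neg])
y = np.array([1] * 100 + [0] * 100)

clf = SVC(kernel="rbf").fit(X, y)

# Score candidate regions; the highest-scoring one is taken as the target.
candidates = np.vstack([rng.normal(0.1, 0.5, size=8),
                        rng.normal(1.9, 0.5, size=8),
                        rng.normal(-0.2, 0.5, size=8)])
scores = clf.decision_function(candidates)   # larger = more "positive"
best = int(np.argmax(scores))
assert best == 1   # the candidate drawn near the positive class wins
```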
- Step S40: Display one or more slice images of the foregoing three-dimensional volume data. For example, a single slice image may be displayed, or multiple different slice images may be displayed on one display.
- Step S50: Obtain the slice position of at least one of the one or more slice images relative to the organ model data according to the matching correspondence between the three-dimensional volume data and the organ model data.
- Step S60 Display the aforementioned organ model data to obtain an organ map.
- the aforementioned organ map is a three-dimensional map or a two-dimensional plan.
- Step S70 displaying a planar image mark on the aforementioned organ map at a corresponding position of the aforementioned slice position.
- the planar image indicia may comprise planar regions and/or line segments, wherein the planar regions are enclosed by line segments and/or curved segments.
- the aforementioned organ map and planar image markers can be distinguished by setting the transparency and/or color.
- different planar image marks can be distinguished by setting transparency and/or color.
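Steps S50 to S70 can be summarized in a small sketch: once the correspondence is known (assumed rigid here, as discussed above), the position of the planar image mark on the organ map is obtained by transforming the slice plane's corner points into organ-model coordinates. All names are illustrative.

```python
import numpy as np

def slice_marker_polygon(corners_vol, R, t):
    """Map the corner points of a slice plane, given in volume-data
    coordinates, into organ-model coordinates using the rigid
    correspondence (R, t) obtained from matching; the returned polygon
    outlines where the planar image mark sits on the organ map."""
    return np.asarray(corners_vol, float) @ R.T + t

# A square slice at depth z = 5 in the volume, with a correspondence
# that is a pure shift (identity rotation) for simplicity.
corners = np.array([[0, 0, 5], [10, 0, 5],
                    [10, 10, 5], [0, 10, 5]], float)
R = np.eye(3)
t = np.array([2.0, 0.0, -1.0])
marker = slice_marker_polygon(corners, R, t)
assert np.allclose(marker[0], [2.0, 0.0, 4.0])
assert np.allclose(marker[2], [12.0, 10.0, 4.0])
```

For a two-dimensional organ map, the transformed polygon would additionally be projected onto the map plane before drawing.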
- the ultrasonic image display method further includes: activating the slice image, and obtaining a current slice position of the activated slice image relative to the organ model data according to a matching correspondence between the three-dimensional volume data and the organ model data, A planar image mark is displayed at a corresponding position on the organ map at the current slice position.
- the selected slice image may be activated by an instruction issued with a mouse, a keyboard, or the like. Taking a mouse as an example, a slice image is activated when it captures the mouse cursor on the display screen, so that by moving the cursor the user can show, on the organ model, the slice position of whichever slice image needs to be understood, which helps in better understanding the current slice image. Therefore, referring to FIG. 3, in an embodiment, the ultrasonic image display method may further include steps S71 to S77.
- Step S71 Acquire the position of the current cursor.
- Step S73: Determine the current slice image corresponding to the position of the current cursor, thereby activating the current slice image; alternatively, the user may activate a desired slice by means of a button.
- Step S75 Obtain a current slice position of the current slice image relative to the organ model data according to the foregoing correspondence relationship.
- Step S77: Display a planar image mark at the position on the aforementioned organ map corresponding to the current slice position.
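Steps S71 and S73 amount to a hit test of the cursor against the displayed slice panes; a toy version under assumed names is:

```python
def slice_under_cursor(cursor, panes):
    """Return the id of the displayed slice image whose screen rectangle
    contains the cursor ("capturing" the cursor activates that slice).
    `panes` maps a slice id to its rectangle (x0, y0, x1, y1); the ids
    and layout here are illustrative, not the patent's."""
    cx, cy = cursor
    for slice_id, (x0, y0, x1, y1) in panes.items():
        if x0 <= cx < x1 and y0 <= cy < y1:
            return slice_id
    return None  # cursor is not over any slice image

# A 2x2 layout: three slice panes plus the organ map in the lower right.
panes = {"A": (0, 0, 100, 100), "B": (100, 0, 200, 100),
         "C": (0, 100, 100, 200), "organ_map": (100, 100, 200, 200)}
assert slice_under_cursor((150, 50), panes) == "B"
assert slice_under_cursor((250, 50), panes) is None
```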
- the user can also change the slice position of a slice image by rotating and translating the displayed slice image, so that the slice image changes accordingly; at this time the slice position shown on the organ model must also change, that is, the planar image mark corresponding to the slice image on the organ map is updated, so that the user can know the actual position of the slice image in the organ. Therefore, in an embodiment, the ultrasonic image display method may further include: receiving a user instruction to change the position of a slice image, and when the position of the slice image changes, updating the position on the organ map of the planar image mark corresponding to that slice image.
- the method may further include the following steps: obtaining an ultrasound image according to the foregoing three-dimensional volume data, where the ultrasound image includes a two-dimensional ultrasound image or a three-dimensional ultrasound image; and then, based on a received user instruction to adjust the ultrasound image, adjusting the display orientation of the organ map accordingly. For example, when the user observes the three-dimensional ultrasound image displayed on the interface and adjusts its orientation, the organ map also changes its display orientation along with the user's adjustment of the three-dimensional ultrasound image.
- conversely, the orientation of the ultrasound image can be adjusted according to an instruction applied to the organ map: the system automatically associates the organ map with the three-dimensional ultrasound image so that, for example, through three-dimensional image rendering and image segmentation processing algorithms, the fetal face is gradually rotated to the orientation facing the user, thereby linking the orientation adjustment of the ultrasound image and the organ map and facilitating image adjustment and browsing.
- the present invention also provides an ultrasonic image display apparatus including a probe 110, a transmitting/receiving control circuit 120, a data processor 130, and a display 140.
- the probe 110 includes at least one array element for transmitting ultrasonic waves according to an excitation electric signal output from the transmission/reception control circuit 120, or converting the received ultrasonic waves into electrical signals.
- each element can be used to transmit ultrasound to the target tissue, as well as to receive ultrasound echoes that are returned by the tissue.
- the transmit sequence and the receive sequence can be used to control which array elements transmit ultrasonic waves and which array elements receive them, or to control the time slots in which array elements transmit ultrasonic waves or receive ultrasonic echoes.
- the array elements participating in ultrasonic transmission may be excited by electrical signals so as to transmit ultrasonic waves simultaneously, or they may be excited by a plurality of electrical signals separated by a certain time interval, so as to transmit successive ultrasonic waves separated by that interval.
- the transmit/receive control circuit 120 is for controlling the probe 110 to transmit an ultrasonic beam to the target tissue, and on the other hand for controlling the probe 110 to receive the ultrasonic echo of the ultrasonic beam reflected by the tissue.
- the transmit/receive control circuit 120 is configured to generate a transmit sequence and a receive sequence; the transmit sequence controls some or all of the plurality of array elements to transmit ultrasonic waves to the target tissue, and the transmit sequence parameters include the number of array elements used for transmission and the ultrasonic emission parameters (such as amplitude, frequency, number of waves, emission interval, emission angle, wave pattern, etc.).
- the receive sequence controls some or all of the plurality of array elements to receive the ultrasonic echoes returned by the tissue, and the receive sequence parameters include the number of array elements used for reception and the echo receiving parameters (e.g., receiving angle, depth, etc.).
- the ultrasonic parameters in the transmit sequence and the echo parameters in the receive sequence differ depending on the purpose of the ultrasonic echoes or on the image to be generated from them.
- the transmit/receive control circuit 120 is configured to control the probe 110 to scan the target tissue to obtain three-dimensional volume data.
- the data processor 130 is configured to generate one or more slice images from the foregoing three-dimensional volume data; import organ model data corresponding to the target tissue and generate an organ map according to the organ model data; match the three-dimensional volume data with the organ model data to obtain a correspondence relationship; obtain, according to the correspondence relationship, a slice position of at least one of the one or more slice images relative to the organ model data; and generate a planar image mark on the organ map at the position corresponding to the slice position.
- the data processor 130 can import the organ model data corresponding to the target tissue in several ways. In a first mode, the data processor 130 automatically recognizes the organ type corresponding to the target tissue from the three-dimensional volume data and imports the organ model data of the automatically recognized organ type. In a second mode, the data processor 130 imports the organ model data of the organ type input by the user, where the display 140 provides a selection or input interface for the user to select or input the organ type.
- the display 140 is configured to display the one or more slice images generated by the data processor, display the aforementioned organ map, and display the aforementioned planar image mark at the position on the organ map corresponding to the slice position.
- the displayed organ map is a three dimensional view or a two dimensional plan view.
- the planar image indicia comprises planar regions and/or line segments, wherein the planar regions are surrounded by line segments and/or curved segments.
- the aforementioned organ map and planar image markers can be distinguished by setting the transparency and/or color.
- when a plurality of planar image marks are included, for example when slice positions of multiple slice images need to be displayed on the organ model, different planar image marks can be distinguished by setting transparency and/or color.
- the slice image whose position is shown may be determined by activating it; thus, in an embodiment, the data processor 130 is further configured to acquire an instruction to activate a slice image and to obtain, according to the correspondence relationship, the current slice position of the activated slice image relative to the organ model data; correspondingly, the display 140 displays the planar image mark at the position on the organ map corresponding to the current slice position.
- the data processor 130 acquires the instruction for activating a slice image, which may be issued by moving a mouse or pressing a button; with a mouse, the activated slice image may be determined by capturing the mouse cursor on the display screen.
- the data processor 130 is further configured to acquire the current cursor position, determine the current slice image corresponding to that position, and obtain, according to the foregoing correspondence, the current slice position of the current slice image relative to the organ model data; correspondingly, the display shows a planar image mark at the position on the organ map corresponding to the current slice position.
- the ultrasonic image display apparatus further includes an input unit (not shown in the drawings) for receiving a user instruction to change the position of a slice image; when the position of the slice image changes, the data processor 130 updates the position on the organ map of the planar image mark associated with that slice image, and the updated mark is displayed through the display 140.
- the data processor may further obtain an ultrasound image according to the foregoing three-dimensional volume data, where the ultrasound image includes a two-dimensional ultrasound image or a three-dimensional ultrasound image, and then, based on a received user instruction to adjust the ultrasound image, adjust the display orientation of the organ map accordingly. For example, when the user observes the three-dimensional ultrasound image displayed on the interface and adjusts its orientation, the organ map also changes its display orientation along with the adjustment.
- conversely, the orientation of the ultrasound image may be adjusted according to an instruction applied to the organ map: the system automatically associates the organ map with the three-dimensional ultrasound image so that, for example, through three-dimensional image rendering and image segmentation processing algorithms, the fetal face is gradually rotated to the display orientation facing the user, thereby realizing the linkage of the orientation adjustment of the ultrasound image and the organ map.
- in an embodiment of the present invention, three slice images can be displayed simultaneously; taking the fetal heart as the target tissue, referring to FIG. 5, the screen interface displays three orthogonal slice images, and the slice position of the slice image 31 in the upper left corner is shown in the organ map 32 in the lower right corner, namely the position marked by the planar image mark 33. By marking the slice position in the organ map 32 with the planar image mark 33, the user (e.g., a doctor) can easily know the slice position of the slice image 31 in the real organ.
- to learn the slice position of another slice image, the method of an embodiment of the present invention can be used, that is, activating that slice image with the cursor or a button; for example, when the cursor moves onto the slice image in the upper right corner, the organ map 32 displays the planar image mark associated with that slice image, so that the user can easily understand its slice position in the real organ.
- the user may also change the slice position of a slice image by rotating and translating the displayed slice image, so that the slice image changes accordingly, and the slice position shown on the organ model changes as well.
- in FIG. 6, the slice image 43 is a four-chamber view, and its associated planar image mark on the organ map is 41.
- after the position change, the slice image 43 becomes the slice image 44, that is, the left ventricle exits the slice plane.
- correspondingly, the planar image mark associated with the slice image in the organ map also changes, from the planar image mark 41 to the planar image mark 42, reflecting the changed slice position.
- FIGS. 5 and 6 both show the organ map as a three-dimensional view; in other embodiments, the displayed organ map may also be a two-dimensional plan view.
- Figure 7(a) is a cerebellar section image, and Figure 7(b) is the organ map of the brain, a two-dimensional plan view.
- the diagonal line across the brain in Figure 7(b) is the planar image mark, on the organ map, of the cerebellar section image on the left, indicating that the cerebellar section is perpendicular to the organ map on the right.
- FIGS. 5 to 7 all show the effect of displaying one planar image mark on the organ map; in other embodiments, a plurality of planar image marks can also be displayed on the organ map.
- in FIG. 8, four slice images are shown: the cerebellar section, the thalamic section, the median sagittal plane, and the lateral ventricle section.
- four planar image marks are shown in the organ map in the lower right corner, indicating the slice positions of the four slice images in the real organ.
- those skilled in the art will understand that all or part of the functions of the methods in the foregoing embodiments may be implemented by hardware or by a computer program.
- when implemented by a computer program, the program may be stored in a computer-readable storage medium.
- the storage medium may include a read-only memory, a random access memory, a magnetic disk, an optical disk, a hard disk, and the like; the program is executed by a computer to realize the above functions.
- for example, the program may be stored in the memory of a device, and when the program in the memory is executed by a processor, all or part of the above functions can be realized.
- the program may also be stored in a storage medium such as a server, another computer, a magnetic disk, an optical disk, a flash disk, or a removable hard disk, and downloaded or copied into the memory of the local device, or used to update the system of the local device.
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
- Image Processing (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780079229.0A CN110087550B (zh) | 2017-04-28 | 2017-04-28 | Ultrasonic image display method, device, and storage medium |
PCT/CN2017/082485 WO2018195946A1 (zh) | 2017-04-28 | 2017-04-28 | Ultrasonic image display method, device, and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/082485 WO2018195946A1 (zh) | 2017-04-28 | 2017-04-28 | Ultrasonic image display method, device, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018195946A1 (zh) | 2018-11-01 |
Family
ID=63919343
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/082485 WO2018195946A1 (zh) | 2017-04-28 | 2017-04-28 | 一种超声图像显示方法、设备及存储介质 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN110087550B (zh) |
WO (1) | WO2018195946A1 (zh) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110652317A (zh) * | 2019-09-24 | 2020-01-07 | 深圳度影医疗科技有限公司 | Automatic positioning method for standard planes in prenatal fetal ultrasound volume images |
CN110960262A (zh) * | 2019-12-31 | 2020-04-07 | 上海杏脉信息科技有限公司 | Ultrasound scanning system, method, and medium |
CN111248941A (zh) * | 2018-11-30 | 2020-06-09 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasound image display method, system, and device |
CN111768379A (zh) * | 2020-06-29 | 2020-10-13 | 深圳度影医疗科技有限公司 | Standard plane detection method for three-dimensional ultrasound images of the uterus |
WO2021099171A1 (en) * | 2019-11-22 | 2021-05-27 | Koninklijke Philips N.V. | Systems and methods for imaging screening |
US20230181163A1 (en) * | 2021-12-09 | 2023-06-15 | GE Precision Healthcare LLC | System and Method for Automatic Association and Display of Video Loop Subject Matter for Enhanced Identification |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112568933B (zh) * | 2019-09-29 | 2022-11-22 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasound imaging method, device, and storage medium |
CN110584714A (zh) * | 2019-10-23 | 2019-12-20 | 无锡祥生医疗科技股份有限公司 | Ultrasound fusion imaging method, ultrasound apparatus, and storage medium |
CN114209354A (zh) * | 2021-12-20 | 2022-03-22 | 深圳开立生物医疗科技股份有限公司 | Ultrasound image display method, apparatus, device, and readable storage medium |
CN116503913A (zh) * | 2023-06-25 | 2023-07-28 | 浙江华诺康科技有限公司 | Medical image recognition method, apparatus, system, and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050101855A1 (en) * | 2003-09-08 | 2005-05-12 | Vanderbilt University | Apparatus and methods of brain shift compensation and applications of the same |
CN103295455A (zh) * | 2013-06-19 | 2013-09-11 | 北京理工大学 | Ultrasound training system based on CT image simulation and positioning |
CN104757994A (zh) * | 2014-01-08 | 2015-07-08 | 三星麦迪森株式会社 | Ultrasound diagnosis apparatus and method of operating the same |
CN106256326A (zh) * | 2015-06-19 | 2016-12-28 | 通用电气公司 | System and method for generating computed tomography slice images |
CN106572827A (zh) * | 2014-07-02 | 2017-04-19 | 柯惠有限合伙公司 | Smart display |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10261673A1 (de) * | 2002-12-31 | 2004-07-15 | Riener, Robert, Dr.-Ing. | Interactive teaching and learning device |
DE102009006147A1 (de) * | 2008-12-23 | 2010-06-24 | Siemens Aktiengesellschaft | Model generator for cardiological diseases |
CN201516047U (zh) * | 2009-06-05 | 2010-06-30 | 中国人民解放军第三军医大学第一附属医院 | Virtual liver ultrasound imaging device |
CN102525662B (zh) * | 2012-02-28 | 2013-09-04 | 中国科学院深圳先进技术研究院 | Three-dimensional visualization surgical navigation system for tissues and organs |
JP5785214B2 (ja) * | 2013-05-08 | 2015-09-24 | 富士フイルム株式会社 | Mold, surgery support set, surgery support apparatus, surgery support method, and surgery support program |
CN109954196B (zh) * | 2013-08-15 | 2021-11-09 | 直观外科手术操作公司 | Graphical user interface for catheter positioning and insertion |
US10231704B2 (en) * | 2013-12-20 | 2019-03-19 | Raghu Raghavan | Method for acquiring ultrasonic data |
US10105107B2 (en) * | 2015-01-08 | 2018-10-23 | St. Jude Medical International Holding S.À R.L. | Medical system having combined and synergized data output from multiple independent inputs |
CN105632310B (zh) * | 2016-01-25 | 2019-02-19 | 新乡医学院 | Human anatomy teaching system |
CN105761304B (zh) * | 2016-02-02 | 2018-07-20 | 飞依诺科技(苏州)有限公司 | Three-dimensional organ model construction method and apparatus |
2017
- 2017-04-28 WO PCT/CN2017/082485 patent/WO2018195946A1/zh active Application Filing
- 2017-04-28 CN CN201780079229.0A patent/CN110087550B/zh active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050101855A1 (en) * | 2003-09-08 | 2005-05-12 | Vanderbilt University | Apparatus and methods of brain shift compensation and applications of the same |
CN103295455A (zh) * | 2013-06-19 | 2013-09-11 | 北京理工大学 | 基于ct影像模拟与定位的超声培训*** |
CN104757994A (zh) * | 2014-01-08 | 2015-07-08 | 三星麦迪森株式会社 | 超声诊断设备和操作该超声诊断设备的方法 |
CN106572827A (zh) * | 2014-07-02 | 2017-04-19 | 柯惠有限合伙公司 | 智能显示器 |
CN106256326A (zh) * | 2015-06-19 | 2016-12-28 | 通用电气公司 | 计算机断层扫描切片图像的生成***及方法 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111248941A (zh) * | 2018-11-30 | 2020-06-09 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasound image display method, system, and device |
CN110652317A (zh) * | 2019-09-24 | 2020-01-07 | 深圳度影医疗科技有限公司 | Automatic positioning method for standard planes in prenatal fetal ultrasound volume images |
CN110652317B (zh) * | 2019-09-24 | 2020-12-29 | 深圳度影医疗科技有限公司 | Automatic positioning method for standard planes in prenatal fetal ultrasound volume images |
WO2021099171A1 (en) * | 2019-11-22 | 2021-05-27 | Koninklijke Philips N.V. | Systems and methods for imaging screening |
CN110960262A (zh) * | 2019-12-31 | 2020-04-07 | 上海杏脉信息科技有限公司 | Ultrasound scanning system, method, and medium |
CN110960262B (zh) * | 2019-12-31 | 2022-06-24 | 上海杏脉信息科技有限公司 | Ultrasound scanning system, method, and medium |
CN111768379A (zh) * | 2020-06-29 | 2020-10-13 | 深圳度影医疗科技有限公司 | Standard plane detection method for three-dimensional ultrasound images of the uterus |
US20230181163A1 (en) * | 2021-12-09 | 2023-06-15 | GE Precision Healthcare LLC | System and Method for Automatic Association and Display of Video Loop Subject Matter for Enhanced Identification |
Also Published As
Publication number | Publication date |
---|---|
CN110087550B (zh) | 2022-06-17 |
CN110087550A (zh) | 2019-08-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110087550B (zh) | Ultrasonic image display method, device, and storage medium | |
US10251627B2 (en) | Elastography measurement system and method | |
US20110201935A1 (en) | 3-d ultrasound imaging | |
JP6430498B2 (ja) | System and method for mapping of ultrasound shear wave elastography measurements |
JP6097452B2 (ja) | Ultrasound imaging system and ultrasound imaging method |
JP6018411B2 (ja) | Ultrasound imaging system for image-guided procedures |
CN106137249A (zh) | Registration for multimodal medical imaging fusion under narrow-field-of-view conditions |
WO2018205274A1 (zh) | Ultrasound device, and display transformation method and system for its three-dimensional ultrasound images |
US11931201B2 (en) | Device and method for obtaining anatomical measurements from an ultrasound image | |
JP7267928B2 (ja) | Volume-rendered ultrasound image |
CN115811961A (zh) | Three-dimensional display method and ultrasound imaging system |
CN107106128A (zh) | Ultrasound imaging apparatus and method for segmenting an anatomical target |
JP7427002B2 (ja) | System and method for frame indexing and image review |
JP2012502682A (ja) | Three-dimensional ultrasound imaging using volume data processing |
EP1952359A1 (en) | System and method for generating for display two-dimensional echocardiography views from a three-dimensional image | |
CN109069110A (zh) | Ultrasound imaging system with simplified 3D imaging controls |
WO2018195874A1 (zh) | Fetal heart ultrasound detection method and ultrasound imaging system |
WO2022099705A1 (zh) | Ultrasound imaging method and ultrasound imaging system for first-trimester fetuses |
JP6501796B2 (ja) | Acquisition-orientation-dependent features for model-based segmentation of ultrasound images |
JP2013141515A (ja) | Medical imaging apparatus and medical image construction method |
US20210383564A1 (en) | Ultrasound image acquisition method, system and computer storage medium | |
JP7261870B2 (ja) | System and method for tracking a tool in an ultrasound image |
CN112672696A (zh) | System and method for tracking a tool in an ultrasound image |
EP2807977B1 (en) | Ultrasound diagnosis method and aparatus using three-dimensional volume data | |
WO2022134049A1 (zh) | Ultrasound imaging method and ultrasound imaging system for the fetal skull |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17907025 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17907025 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12/05/2020) |