CN116211349A - Ultrasonic imaging method, ultrasonic imaging device and medium for face part of fetus - Google Patents

Ultrasonic imaging method, ultrasonic imaging device and medium for face part of fetus

Info

Publication number
CN116211349A
CN116211349A (application CN202111464965.2A)
Authority
CN
China
Prior art keywords
dimensional
fetus
distance
face
eyeball
Prior art date
Legal status
Pending
Application number
CN202111464965.2A
Other languages
Chinese (zh)
Inventor
梁天柱
喻爱辉
林穆清
邹耀贤
Current Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee
Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd filed Critical Shenzhen Mindray Bio Medical Electronics Co Ltd
Priority to CN202111464965.2A priority Critical patent/CN116211349A/en
Publication of CN116211349A publication Critical patent/CN116211349A/en
Pending legal-status Critical Current


Classifications

    • A: HUMAN NECESSITIES
        • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
                • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
                    • A61B 8/08: Detecting organic movements or changes, e.g. tumours, cysts, swellings
                        • A61B 8/0866: involving foetal diagnosis; pre-natal or peri-natal diagnosis of the baby
                    • A61B 8/48: Diagnostic techniques
                        • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
                    • A61B 8/52: Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
                        • A61B 8/5207: involving processing of raw data to produce diagnostic data, e.g. for generating an image
                        • A61B 8/5215: involving processing of medical diagnostic data
    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 15/00: 3D [Three Dimensional] image rendering
                    • G06T 15/08: Volume rendering
                    • G06T 15/10: Geometric effects
                        • G06T 15/20: Perspective computation
                            • G06T 15/205: Image-based rendering
                • G06T 2207/00: Indexing scheme for image analysis or image enhancement
                    • G06T 2207/10: Image acquisition modality
                        • G06T 2207/10132: Ultrasound image
                    • G06T 2207/30: Subject of image; Context of image processing
                        • G06T 2207/30004: Biomedical image processing
                            • G06T 2207/30041: Eye; Retina; Ophthalmic
                        • G06T 2207/30196: Human being; Person
                            • G06T 2207/30201: Face

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Pregnancy & Childbirth (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

The present disclosure relates to an ultrasound imaging method, an ultrasound imaging apparatus, and a medium for the fetal face. The method comprises: acquiring three-dimensional volume data of the fetal face; determining a three-dimensional eyeball region and a three-dimensional facial region of the fetus based on the acquired volume data; generating a VR image of the face together with the eyeballs based on the three-dimensional eyeball region and the three-dimensional facial region; determining interocular-distance-related parameters, including the intraocular distance and the extraocular distance, based on the three-dimensional eyeball region; and presenting the determined parameters in association with the VR image of the face together with the eyeballs. In this way, the three-dimensional eyeball region (and the three-dimensional facial region) of the fetus can be determined automatically and accurately by exploiting the spatial correlation of voxels, and accurate interocular-distance parameters can be derived and presented in association with the three-dimensional eyeballs (e.g., their VR image). The three-dimensional VR image of the face can also be displayed for comparison, helping a physician obtain comprehensive facial-malformation information for efficient screening.

Description

Ultrasonic imaging method, ultrasonic imaging device and medium for face part of fetus
Technical Field
The present disclosure relates to a medical imaging method, a medical imaging apparatus, and a medium, and more particularly, to an ultrasound imaging method, an ultrasound imaging apparatus, and a medium.
Background
Owing to its safety, convenience, absence of radiation, and low cost, ultrasound examination is widely used in clinical practice and has become one of the main auxiliary means for physicians to diagnose disease. Prenatal ultrasound is the principal imaging examination in prenatal care, providing important imaging evidence for measuring fetal growth and development and for screening structural abnormalities. It is one of the examinations routinely performed in the first, second, and third trimesters.
Abnormalities of fetal interocular distance and ocular morphology are ultrasonic manifestations of many congenital defects, such as chromosomal abnormalities. Three-dimensional ultrasound imaging can help medical staff screen for fetal eye malformations quickly and intuitively.
Ocular malformations are common fetal defects. Fetal interocular distance is one of the ultrasonic indices used to assess congenital conditions such as chromosomal abnormalities and holoprosencephaly. An abnormally narrow interocular distance is closely associated with holoprosencephaly, of which about 86% of cases can be detected by prenatal ultrasound; other chromosomal syndromes may also exhibit this feature. An excessively wide interocular distance is a major feature of chromosomal abnormalities, median facial cleft syndrome, and frontal encephalocele or meningocele, among others.
In most existing screening approaches for fetal eye malformations, the physician performs manual measurements while viewing the ultrasound image; the screening result is heavily influenced by the physician's experience and technique, and the physician's workload is heavy.
Disclosure of Invention
Accordingly, there is a need for an ultrasound imaging method, apparatus and medium for the fetal face that can automatically and accurately determine the three-dimensional eyeball region of the fetus (and the three-dimensional facial region of the fetus) directly from three-dimensional volume data of the fetal face, taking the spatial correlation of voxels into account, and derive accurate interocular-distance-related parameters for presentation in association with the three-dimensional eyeballs (e.g., a VR image thereof). On demand, a three-dimensional VR image of the fetal face can be presented for comparison alongside the three-dimensional eyeballs, so that the physician obtains comprehensive facial-malformation information, including the interocular-distance-related parameters and three-dimensional anatomical detail of the face, facilitating efficient screening for a wider range of fetal facial malformations.
According to a first aspect of the present disclosure, there is provided an ultrasound imaging method for the fetal face. The method may include: acquiring three-dimensional volume data of the fetal face; determining, by a processor, a three-dimensional eyeball region and a three-dimensional facial region of the fetus based on the acquired volume data; generating, by the processor, a VR image of the face together with the eyeballs based on the three-dimensional eyeball region and the three-dimensional facial region; determining, by the processor, interocular-distance-related parameters, including the intraocular distance and the extraocular distance, based on the three-dimensional eyeball region; and presenting, by the processor, the determined parameters in association with the VR image of the face together with the eyeballs.
According to a second aspect of the present disclosure, there is provided an ultrasound imaging method for the fetal face. The method may include: acquiring three-dimensional volume data of the fetal face; determining, by a processor, a three-dimensional eyeball region of the fetus based on the acquired volume data; determining, by the processor, interocular-distance-related parameters, including the intraocular distance and the extraocular distance, based on the three-dimensional eyeball region; and presenting, by the processor, the determined parameters in association with the three-dimensional eyeball region of the fetus.
According to a third aspect of the present disclosure, there is provided an ultrasound imaging method for the fetal face, comprising: acquiring three-dimensional volume data of the fetal face; determining, by a processor, a three-dimensional eyeball region of the fetus based on the acquired volume data; determining, by the processor, interocular-distance-related parameters, including the intraocular distance and the extraocular distance, based on the three-dimensional eyeball region; extracting, by the processor, a section plane through the centers of both eyeballs based on the three-dimensional eyeball region; and presenting, by the processor, the determined parameters in association with the extracted section plane through the eyeball centers.
According to a fourth aspect of the present disclosure, an ultrasound imaging device for a fetal face is provided. The ultrasound imaging apparatus includes a processor configured to perform an ultrasound imaging method of a fetal face section according to various embodiments of the present disclosure.
According to a fifth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon computer executable instructions which, when executed by a processor, implement an ultrasound imaging method of a fetal face section according to various embodiments of the present disclosure.
With the ultrasound imaging method, apparatus and medium for the fetal face according to various embodiments of the present disclosure, the three-dimensional eyeball region of the fetus (and the three-dimensional facial region of the fetus) can be determined automatically and accurately directly from three-dimensional volume data of the fetal face, taking the spatial correlation of voxels into account, and accurate interocular-distance-related parameters can be derived and presented in association with the three-dimensional eyeballs (e.g., a VR image thereof). On demand, a three-dimensional VR image of the fetal face can be presented for comparison alongside the three-dimensional eyeballs, so that the physician obtains comprehensive facial-malformation information, including the interocular-distance-related parameters and three-dimensional anatomical detail of the face, facilitating efficient screening for a wider range of fetal facial malformations.
Drawings
Features, advantages, and technical and industrial significance of exemplary embodiments of the present invention will be described below with reference to the accompanying drawings, in which like numerals denote like elements, and in which:
FIG. 1 (a) shows a construction diagram of an ultrasound imaging system of a fetal face part in accordance with an embodiment of the present disclosure;
fig. 1 (b) shows a configuration diagram of an example of an ultrasonic imaging apparatus of a fetal face part according to an embodiment of the present disclosure;
fig. 2 shows a flowchart of a first example of an ultrasound imaging method of a fetal face part in accordance with an embodiment of the present disclosure;
FIG. 3 shows a graphical representation of an interface presented on a display using the ultrasound imaging method of the first example;
fig. 4 shows a flowchart of a second example of an ultrasound imaging method of a fetal face part in accordance with an embodiment of the present disclosure;
FIG. 5 shows a graphical representation of an interface presented on a display using the ultrasound imaging method of the second example;
fig. 6 shows a flowchart of a third example of an ultrasound imaging method of a fetal face part in accordance with an embodiment of the present disclosure; and
fig. 7 shows a graphical representation of an interface presented on a display using the ultrasound imaging method of the third example.
Detailed Description
Hereinafter, embodiments of the present invention will be described; however, the present invention is not limited to these embodiments, and not all components of an embodiment are essential.
Fig. 1 (a) shows a configuration diagram of a three-dimensional ultrasound imaging system of a fetal face section according to an embodiment of the present disclosure. As shown in fig. 1 (a), an ultrasound imaging system 100 may include a probe 101, a transmit circuit 102 for exciting the probe 101 to transmit ultrasound waves to a pregnant woman under examination, a receive circuit 103 for controlling the probe 101 to receive ultrasound echo signals returned from the pregnant woman under examination, and a processor 104.
Various types of probes 101 may be employed, such as, but not limited to, an ultrasonic volume probe, an area-array probe, or a conventional ultrasonic array probe (such as a linear-array or convex-array probe). The physician may move the probe 101 to select a suitable position and angle; the transmit circuit 102 sends a group of delayed, focused pulses to the probe 101, and the probe 101 transmits ultrasound along a 2D scan plane into the pregnant woman under examination (i.e., toward the fetal face). The reflected ultrasound is received by the receiving circuit 103 and converted into electrical signals for processing by the processor 104.
In some embodiments, the processor 104 may include one or more general-purpose processing devices, such as a microprocessor, a Central Processing Unit (CPU), or a Graphics Processing Unit (GPU). More specifically, the processor may be a Complex Instruction Set Computing (CISC) microprocessor, a Reduced Instruction Set Computing (RISC) microprocessor, a Very Long Instruction Word (VLIW) microprocessor, a processor running other instruction sets, or a processor running a combination of instruction sets. The processor may also be one or more special-purpose processing devices or circuits, such as an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), or a system on a chip (SoC).
In the ultrasound imaging system 100, the processor 104 may be configured to perform beamforming, three-dimensional reconstruction, post-processing, etc., to obtain three-dimensional volume data of the fetal face; the various processes may be implemented with dedicated circuit modules or software modules. Specifically, the probe 101 transmits an ultrasound waveform along a 2D scan plane toward the pregnant woman under examination (i.e., toward the fetal face). After the receiving circuit 103 receives the reflected waveform and converts it into electrical signals, the processor 104 applies the corresponding delays and a weighted summation to the signals obtained over multiple transmit/receive cycles, thereby performing beam synthesis. Further, the probe 101 may transmit and receive ultrasound over a series of scan planes; integrating the resulting signals according to their three-dimensional spatial relationship scans the fetal face in three-dimensional space and allows reconstruction of a 3D image. After post-processing of the reconstructed 3D image information, such as one or more of denoising, smoothing and enhancement, the result is the three-dimensional volume data of the fetal face. This volume data, as processed by the processor 104, may be presented on the display 106, which may be an LCD, CRT or LED display.
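The delay-and-weighted-summation beam synthesis described above can be sketched roughly as follows. This is a minimal, hypothetical delay-and-sum illustration in NumPy, not the patent's implementation; the element geometry, sampling rate, sound speed and uniform apodization are invented placeholders.

```python
# Hypothetical delay-and-sum beamforming sketch (not the patent's method).
import numpy as np

def delay_and_sum(rf, elem_x, fs, c, focus):
    """Beamform one focal point from per-element RF echo traces.

    rf      : (n_elements, n_samples) received echo traces
    elem_x  : (n_elements,) lateral element positions [m]
    fs      : sampling rate [Hz]
    c       : speed of sound [m/s]
    focus   : (x, z) focal point [m]
    """
    x, z = focus
    # Round-trip path: transmit to focus (approximated by depth z)
    # plus return path from focus to each receiving element.
    dist = z + np.sqrt((elem_x - x) ** 2 + z ** 2)
    idx = np.round(dist / c * fs).astype(int)      # per-element delay in samples
    idx = np.clip(idx, 0, rf.shape[1] - 1)
    samples = rf[np.arange(rf.shape[0]), idx]      # delayed sample per element
    return samples.mean()                          # uniform-weight summation
```

In practice this is repeated over a grid of focal points per scan plane, and the per-element weights (apodization) are usually non-uniform.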
In addition to the above processing, the processor 104 may also perform automatic detection and analysis of the facial and eyeball regions of the fetus; for example, but not by way of limitation, it may perform the ultrasound imaging methods of the fetal face according to various embodiments of the present disclosure. In this way, the three-dimensional ultrasound imaging system itself may also serve as an ultrasound imaging device for automatic detection and analysis of the fetal facial and eyeball regions, although this is merely an example.
In some embodiments, the ultrasound imaging device for automatically detecting and analyzing the facial and eyeball regions of the fetus may instead be implemented as a device 100' separate from, but in communication with, the three-dimensional ultrasound imaging system. Note that the term "ultrasound imaging device" in this disclosure is not limited to devices that contain an ultrasound probe and transmit/receive ultrasound to form an image; it also covers devices that detect and analyze images derived from ultrasound, such as image workstations and remote image-analysis platforms. For example, the device 100' may be a computer tailored for image-data acquisition and processing tasks, or a server placed in the cloud.
As shown in fig. 1 (b), an apparatus 100' for fetal facial part according to embodiments of the present disclosure may include a processor 104', the processor 104' may be configured to perform an ultrasound imaging method for fetal facial part according to various embodiments of the present disclosure. The apparatus 100 'may include a communication interface 102' to obtain three-dimensional volume data of the face of the fetus, such as from a three-dimensional ultrasound imaging system as shown in fig. 1 (a), from an image database, from a PACS system, etc., which are not described in detail herein.
In some embodiments, the communication interface 102' may include a network adapter, a cable connector, a serial connector, a USB connector, a parallel connector, a high-speed data transmission adapter such as fiber optic, USB 3.0 or Thunderbolt, a wireless network adapter such as a WiFi adapter, or a telecommunications (4G, LTE, 5G, etc.) adapter. The device 100' may be connected to a network through the communication interface 102'. The network may provide the functionality of a Local Area Network (LAN), a wireless network, a cloud computing environment (e.g., software as a service, platform as a service, infrastructure as a service), a client-server arrangement, a Wide Area Network (WAN), etc.
The hardware configuration of the processor 104' may refer to the hardware configuration of the processor 104 in fig. 1 (a), and is not described herein.
In some embodiments, the apparatus 100' may additionally include at least one of an input/output 105' and a display 106 '. Wherein the input/output 105 'may be configured to allow the apparatus 100' to receive and/or transmit data. Input/output 105 'may include one or more digital and/or analog communication devices that allow apparatus 100' to communicate with a user or other machine and device. For example, input/output 105' may include a keyboard and mouse that allow a user to provide input.
As shown in fig. 1 (b), the apparatus 100' may include memory such as Read Only Memory (ROM), flash memory, Random Access Memory (RAM), Dynamic Random Access Memory (DRAM) such as Synchronous DRAM (SDRAM) or Rambus DRAM, or static memory (e.g., flash memory, static random access memory), on which computer-executable instructions are stored in any format. In some embodiments, the memory may store computer-executable instructions of one or more image processing programs (such as fetal face and eye detection and analysis programs) that, when executed by the processor 104', implement the methods of ultrasound imaging of the fetal face according to various embodiments of the present disclosure. In particular, the computer program instructions may be accessed by the processor 104', read from ROM or any other suitable memory location, and loaded into RAM for execution by the processor 104'.
Fig. 2 shows a flowchart of a first example of an ultrasound imaging method for the fetal face according to an embodiment of the present disclosure. The method may begin at step 201 with the acquisition of three-dimensional volume data of the fetal face. At step 202, a three-dimensional eyeball region and a three-dimensional facial region of the fetus may be determined by the processor based on the acquired volume data; that is, both regions are detected automatically. Compared with detecting the eyeball and facial regions from a two-dimensional section of the fetal face, detection based on the three-dimensional volume data can take full account of the correlation of voxels in three-dimensional space, in particular correlations that are missing from any single two-dimensional section, so the three-dimensional eyeball and facial regions can be determined more accurately. The automatically detected three-dimensional eyeball region reflects the spatial extent of the eyeballs more fully and can accommodate the various ways an eyeball deviates from an ideal sphere, which in turn yields more accurate ocular parameters, such as the interocular-distance-related parameters discussed below.
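The passage above does not specify the detection algorithm, so the following is an illustrative sketch only: one simple way to exploit the spatial correlation of voxels is thresholding followed by 3D connected-component analysis, keeping the two largest low-echo components as candidate eyeball regions (eyeballs appear hypoechoic, i.e. dark, in ultrasound volume data). The function names and thresholds are invented placeholders.

```python
# Illustrative sketch (assumed approach, not the patent's algorithm):
# threshold + 6-connected 3D component labeling via BFS.
import numpy as np
from collections import deque

def label_components(mask):
    """Label 6-connected components in a boolean 3D array."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    nbrs = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1
        labels[seed] = current
        q = deque([seed])
        while q:
            x, y, z = q.popleft()
            for dx, dy, dz in nbrs:
                p = (x + dx, y + dy, z + dz)
                if all(0 <= p[i] < mask.shape[i] for i in range(3)) \
                        and mask[p] and not labels[p]:
                    labels[p] = current
                    q.append(p)
    return labels, current

def candidate_eyeballs(volume, low_echo_thresh=0.2, min_voxels=20):
    """Keep the two largest low-echo 3D components as candidate eyeballs."""
    mask = volume < low_echo_thresh              # eyeballs are hypoechoic
    labels, n = label_components(mask)
    sizes = [(labels == i).sum() for i in range(1, n + 1)]
    order = sorted(range(n), key=lambda i: sizes[i], reverse=True)
    return labels, [i + 1 for i in order[:2] if sizes[i] >= min_voxels]
```

A production system would additionally restrict the search to the expected orbital region and check the components for roughly spherical shape.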
At step 203, a VR image of the face together with the eyeballs may be generated by the processor based on the three-dimensional eyeball region and the three-dimensional facial region of the fetus. Such a VR image lets the user examine the fetal face and eyeballs immersively from arbitrary three-dimensional viewing angles, providing richer information for diagnosis. Restricting the rendering to the face and eyeballs satisfies the user's observation needs while, compared with an unselective whole-body VR image of the fetus, significantly reducing the computational load, so that switching the viewing angle remains smooth.
At step 204, interocular-distance-related parameters, including the intraocular distance and the extraocular distance, may be determined by the processor based on the three-dimensional eyeball region of the fetus. Because these parameters are computed automatically and directly from the three-dimensional eyeball region, the spatial extent of the eyeballs is taken into account more comprehensively, the user need not manually or semi-manually select an eyeball section, the user's workload is reduced, and significant parameter errors caused by an improperly chosen section are avoided.
Next, at step 205, the determined interocular-distance-related parameters may be presented by the processor in association with the VR image of the face together with the eyeballs. This facilitates joint analysis of the VR image (immersive image information) and the determined interocular-distance parameters (anatomical measurements), improving the reliability of the user's fetal anomaly screening. For example, a wide face and a low nasal bridge combined with a wide interocular distance and small palpebral fissures may point specifically toward Down syndrome. As another example, an abnormally narrow interocular distance may point toward holoprosencephaly. As yet another example, some malformations, such as micrognathia, are not clearly reflected in the interocular-distance parameters but can be judged accurately by reference to the face.
With this ultrasound imaging method for the fetal face, the three-dimensional eyeball region and the three-dimensional facial region of the fetus can be determined automatically, a VR image of the face together with the eyeballs can be presented, and accurate interocular-distance-related parameters can be determined. The user can inspect the VR image immersively and smoothly while analyzing the interocular-distance parameters alongside it. This noticeably streamlines the workflow of fetal facial anomaly screening, improves working efficiency, stabilizes the obtained interocular-distance parameters, improves the reliability of the user's screening for the various fetal anomalies reflected in those parameters, in the face, or in both, and thereby promotes wider adoption of interocular-distance and morphology screening.
Fig. 3 shows a graphical representation of an interface presented on a display using the ultrasound imaging method of the first example. As shown in fig. 3, the intraocular distance 302b and the extraocular distance 302a may be presented in association with a VR image 301 of the face together with the eyeballs, providing a useful reference for the user when screening for facial anomalies. In some embodiments, the ratio of the intraocular distance 302b to the extraocular distance 302a, i.e., the interocular distance ratio 302e, may also be presented alongside the two distances as a further interocular-distance-related parameter. The user can then judge whether the fetal face is abnormal more accurately and efficiently by considering both the absolute values of the two distances and their relative ratio.
The intraocular distance 302b and the extraocular distance 302a may be determined in various ways based on the three-dimensional eyeball region of the fetus. In some embodiments, a pair of medial boundary points and a pair of lateral boundary points of the two eyeballs may be determined from the three-dimensional eyeball region: the distance between the paired medial boundary points is taken as the intraocular distance, and the distance between the paired lateral boundary points as the extraocular distance.
That is, the intraocular distance is calculated from the paired medial boundary points, and the extraocular distance from the paired lateral boundary points.
The medial and lateral boundary points may themselves be obtained in various ways.
For example, the three-dimensional eyeball regions of both eyes may be traversed, computing the distance between boundary points on the respective boundaries; the two closest boundary points are taken as the medial boundary points and the two farthest as the lateral boundary points. This approach applies no idealization to the three-dimensional eyeball region (for example, it does not model the eyeball as a sphere), so the resulting intraocular and extraocular distances are more accurate, but the extensive sampling and pairwise comparison impose a higher computational load.
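The brute-force traversal just described can be sketched as follows. The helper name is hypothetical, and coordinates are in voxel units; a real system would scale by the physical voxel spacing before reporting distances.

```python
# Brute-force variant: closest boundary pair -> intraocular distance,
# farthest boundary pair -> extraocular distance.
import itertools
import math

def interocular_distances(boundary_left, boundary_right):
    """boundary_left/right: iterables of (x, y, z) boundary voxels
    of the left and right eyeball regions."""
    pairs = itertools.product(boundary_left, boundary_right)
    dists = [(math.dist(a, b), a, b) for a, b in pairs]
    intra = min(dists)   # closest pair  -> paired medial boundary points
    extra = max(dists)   # farthest pair -> paired lateral boundary points
    return intra[0], extra[0]
```

The quadratic pairwise comparison is exactly the computational load the passage above notes; in practice the boundaries would be subsampled or pruned first.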
For another example, the center of each eyeball may be determined based on the three-dimensional eyeball area of the fetus, and the boundary points at which the line connecting the two centers intersects the boundaries of the three-dimensional eyeball area may be taken as the paired inner boundary points and the paired outer boundary points. This approach exploits the near-spherical geometry of the three-dimensional eyeball area, significantly reducing the computational load while yielding inner and outer interocular distances of acceptable accuracy.
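A minimal sketch of the center-line idea, assuming near-spherical regions given as voxel-coordinate arrays; it approximates the intersections with the boundary by projecting each region's voxels onto the center-to-center axis (names, shapes and this approximation are assumptions, not the patent's exact procedure).

```python
import numpy as np

def centerline_distances(left_vox, right_vox):
    """left_vox, right_vox: (N, 3) voxel coordinates of each eyeball region."""
    c_l = left_vox.mean(axis=0)                     # eyeball centers as centroids
    c_r = right_vox.mean(axis=0)
    axis = (c_r - c_l) / np.linalg.norm(c_r - c_l)  # unit vector, left -> right
    p_l = left_vox @ axis                           # signed projections onto axis
    p_r = right_vox @ axis
    # Extreme projections stand in for the boundary points on the center line.
    inner = p_r.min() - p_l.max()   # gap between the two inner extremes
    outer = p_r.max() - p_l.min()   # span between the two outer extremes
    return inner, outer

# Toy regions: two segments on the x axis.
left = np.array([[-3.0, 0, 0], [-1.0, 0, 0]])
right = np.array([[1.0, 0, 0], [3.0, 0, 0]])
d_inner, d_outer = centerline_distances(left, right)
```

Compared with the exhaustive pairwise search, this runs in O(N + M) over the region voxels rather than O(N·M) over boundary-point pairs.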
The determined interocular-distance-related parameters may be visualized in association with the VR rendering of the face together with the eyeballs in various ways. Specifically, as shown in fig. 3, a first connecting line 302c between the paired inner boundary points of the two eyes and a second connecting line 302d between the paired outer boundary points may be drawn on the VR rendering. The inner interocular distance 302b may be presented in association with the first connecting line 302c, and the outer interocular distance 302a in association with the second connecting line 302d. Through the two connecting lines, the inner and outer interocular distances are intuitively tied to the corresponding positions in the VR image, so that a doctor reading the distances can readily locate the corresponding structures in the VR image and thus assess morphological abnormalities more efficiently and accurately. The association between each distance and its connecting line may be presented in various ways: for example, via leader lines as shown in fig. 3, or via matching colors. In fig. 3, the interocular-distance-related parameters are displayed in a blank region of the interface, so that their presentation does not occlude anatomical details in the VR rendering of the face and eyes.
This is merely an example; the interocular-distance-related parameters may also be presented in a floating window. For instance, the window may initially appear immediately adjacent to the corresponding anatomy so that the physician can intuitively associate the anatomy with the parameters, and may afterwards be moved freely elsewhere (e.g., into a blank region) so that anatomical details can be viewed without occlusion, enabling a more accurate and efficient judgment of whether the face of the fetus is abnormal. In some embodiments, the various interocular-distance-related parameters are presented close together, so that the physician can compare them at a glance without shifting gaze, reducing attention fatigue and thereby avoiding fatigue-induced errors of judgment.
Fig. 4 shows a flowchart of a second example of an ultrasound imaging method for the face of a fetus according to an embodiment of the present disclosure. The method may begin at step 401 with acquiring three-dimensional volume data of the fetal face. In step 402, a three-dimensional eyeball area of the fetus may be determined based on the acquired three-dimensional volume data. Similar to step 202, the three-dimensional eyeball area is extracted directly from the three-dimensional volume data of the face rather than via a two-dimensional section of the face (whether or not that section is derived from the volume data), so that the spatial relationships among voxels can be taken into account comprehensively and the eyeball area can be determined more accurately. By reducing the target region from the eyeball area plus the facial area to the eyeball area alone, the analysis and computation load on the three-dimensional volume data is significantly reduced; correspondingly, the subsequent rendering and presentation demand fewer computing resources, so that extraction and rendering of the fetal eyeball area can run smoothly even on ultrasound imaging devices with modest computing performance.
At step 403, interocular-distance-related parameters, including the inner interocular distance and the outer interocular distance, may be determined based on the three-dimensional eyeball area of the fetus. Their definitions and calculation methods according to various embodiments of the present disclosure apply equally here and are not repeated.
At step 404, the determined interocular-distance-related parameters may be presented in association with the three-dimensional eyeball area of the fetus. In some embodiments, similar to step 203, a VR processing unit may be used to generate a VR rendering of the eyeballs from the three-dimensional eyeball area; other 3D rendering modes may also be adopted. Presenting the parameters in association with only the three-dimensional eyeball area significantly reduces the computational load of rendering, while still meeting, to a useful extent, the physician's visualization needs for fetal facial anomaly screening. Moreover, when screening for facial abnormalities, and for ocular abnormalities in particular, a physician sometimes prefers to concentrate on the eyeball area without interference from other anatomical structures, which this presentation mode supports. Note that presenting the parameters in association with the three-dimensional eyeball area may mean presenting only that area together with the parameters, but is not limited thereto: the eyeball area may also be presented with an auxiliary rendering of the 3D face, the parameters being associated with the presented eyeball area.
The methods of determining the interocular-distance-related parameters and of presenting them in association with the VR rendering of the face together with the eyeballs, according to various embodiments of the present disclosure, apply equally here and are not repeated.
Fig. 5 shows a graphical representation of an interface presented on a display using the ultrasound imaging method of the second example. As shown in fig. 5, the face may be omitted, with only the three-dimensional eyeball area 502 of the fetus, i.e., the pair of eyeballs, presented. Presenting the eyeball area alone allows the doctor to freely change its viewing position and angle and to zoom in and out, so as to concentrate on the anatomical details of the three-dimensional eyeball area in three-dimensional space.
As shown in fig. 5, the inner interocular distance 502b and the outer interocular distance 502a may be presented in association with the three-dimensional eyeball area 502, providing a useful reference for the user when screening for facial abnormalities. In some embodiments, the ratio of the inner interocular distance 502b to the outer interocular distance 502a, i.e., the interocular distance ratio 502e, may also be presented together with the inner interocular distance 502b and the outer interocular distance 502a as an interocular-distance-related parameter for the user's reference when screening for facial abnormalities. In this way, by comparing both the absolute values of the inner and outer interocular distances and their relative ratio, the user can judge more accurately and efficiently whether the face of the fetus is abnormal.
Similar to fig. 3, the determined interocular-distance-related parameters may be visualized in association with the three-dimensional eyeball area 502 in various ways. Specifically, as shown in fig. 5, a first connecting line 502c between the paired inner boundary points of the two eyes and a second connecting line 502d between the paired outer boundary points may be drawn on the three-dimensional eyeball area 502. The inner interocular distance 502b may be presented in association with the first connecting line 502c, and the outer interocular distance 502a in association with the second connecting line 502d. Through the two connecting lines, the inner and outer interocular distances are intuitively tied to the corresponding portions of the three-dimensional eyeball area 502, so that a doctor reading the distances can readily locate the corresponding presentation in the eyeball area 502 (rendered, for example but not exclusively, as a VR image) and thus assess morphological abnormalities more efficiently and accurately. The association between each distance and its connecting line may be presented in various ways: for example, via leader lines as shown in fig. 5, or via matching colors. In fig. 5, the interocular-distance-related parameters are displayed in a blank region of the interface, so that their presentation does not occlude anatomical details in the three-dimensional eyeball area 502.
This is merely an example; the interocular-distance-related parameters may also be presented in a floating window. For instance, the window may initially appear immediately adjacent to the corresponding anatomy so that the physician can intuitively associate the anatomy with the parameters, and may afterwards be moved freely elsewhere (e.g., into a blank region) so that anatomical details can be viewed without occlusion, enabling a more accurate and efficient judgment of whether the face of the fetus is abnormal. In some embodiments, the various interocular-distance-related parameters are presented close together, so that the physician can compare them at a glance without shifting gaze, reducing attention fatigue and thereby avoiding fatigue-induced errors of judgment.
Fig. 6 shows a flowchart of a third example of an ultrasound imaging method for the face of a fetus according to an embodiment of the present disclosure. In step 601, three-dimensional volume data of the fetal face may be acquired. At step 602, a three-dimensional eyeball area of the fetus may be determined based on the acquired volume data. At step 603, interocular-distance-related parameters, including the inner interocular distance and the outer interocular distance, may be determined based on the three-dimensional eyeball area. Steps 601, 602 and 603 are similar to steps 401, 402 and 403 of fig. 4, respectively; the related embodiments and descriptions of the present disclosure (such as, but not limited to, those given in connection with fig. 4), including the definitions and calculation methods of the interocular-distance-related parameters, apply equally here and are not repeated.
At step 604, a cross-section passing through the centers of both eyeballs may be extracted based on the three-dimensional eyeball area of the fetus.
In step 605, the determined interocular-distance-related parameters may be presented in association with the extracted cross-section through the centers of the eyeballs. As shown in fig. 7, the cross-section may be displayed with the eyeball areas identified on it (e.g., framed with anchor boxes), and the outer interocular distance and the inner interocular distance presented in association with the eyeball areas. In some embodiments, the ratio of the inner to the outer interocular distance may also be presented alongside them. Extracting the cross-section through the centers of both eyeballs and presenting the interocular-distance-related parameters in association with it lets the doctor examine anatomical details directly on the section. Compared with 3D rendering of the eyeball area, with or without the face, this further reduces the computational workload; in particular, clear anatomical details can be presented even on ultrasound imaging devices with modest computing performance, avoiding the loss of detail caused by stuttering under heavy 3D computation. Moreover, compared with other, offset sections, the section through the eyeball centers corresponds more closely to the true inner and outer interocular distances and offers richer anatomical detail.
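One possible way to resample such a section from the volume is sketched below, assuming both eyeball centers are already known; the plane is spanned by the center-to-center direction and an orthogonalised "up" vector, and all names, the `up` default and the grid size are illustrative choices, not mandated by the method.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def slice_through_centers(volume, c_left, c_right, up=(0, 0, 1),
                          size=64, spacing=1.0):
    """Resample a size x size section through both eyeball centers."""
    c_left, c_right = np.asarray(c_left, float), np.asarray(c_right, float)
    mid = (c_left + c_right) / 2.0
    u = c_right - c_left
    u /= np.linalg.norm(u)                  # first in-plane axis: center line
    v = np.asarray(up, float)
    v = v - (v @ u) * u                     # Gram-Schmidt: make v orthogonal to u
    v /= np.linalg.norm(v)                  # second in-plane axis
    # Grid of 3D sample positions centred on the midpoint between the eyes.
    idx = (np.arange(size) - size // 2) * spacing
    gu, gv = np.meshgrid(idx, idx, indexing='ij')
    pts = mid + gu[..., None] * u + gv[..., None] * v   # (size, size, 3)
    coords = pts.transpose(2, 0, 1)                     # (3, size, size)
    # Trilinear interpolation of the volume at the plane's sample points.
    return map_coordinates(volume, coords, order=1, mode='nearest')

# Toy check: a volume whose value equals the y coordinate.
vol = np.fromfunction(lambda x, y, z: y, (10, 10, 10))
sec = slice_through_centers(vol, (5, 3, 5), (5, 7, 5), size=5)
```

Here the section's first axis follows the center line, so on the toy volume each row reads back the y coordinate of its sample points.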
Note that the associated presentation of the determined interocular-distance-related parameters with the extracted cross-section through the eyeball centers may be combined with the methods, according to various embodiments of the present disclosure, for presenting the parameters in association with the VR rendering of the face together with the eyeballs; details are not repeated here. In particular, presenting the parameters in association with the extracted cross-section may include: drawing, on the section, a first connecting line between the paired inner boundary points of the two eyes and a second connecting line between the paired outer boundary points; and presenting the inner interocular distance in association with the first connecting line and the outer interocular distance in association with the second.
A detailed description is given below of how the three-dimensional eyeball area and the three-dimensional facial area of the fetus may be determined based on the acquired three-dimensional volume data of the fetal face. This determination may be achieved by any one, or a combination, of the following approaches. For example, image features may be extracted from the three-dimensional volume data of the face, and a trained regression model may be used to determine the three-dimensional eyeball area and/or the three-dimensional facial area from the extracted features. As another example, a trained segmentation model may be applied to the three-dimensional volume data of the face to determine the three-dimensional eyeball area and/or the three-dimensional facial area. As yet another example, the three-dimensional eyeball area and/or the three-dimensional facial area may be determined by matching representative data of the three-dimensional volume data of the face against a reference template of such representative data.
The above methods are described below taking the determination of the three-dimensional eyeball area of the fetus from the acquired three-dimensional volume data of the face as an example. It should be understood, however, that once the method for determining the eyeball area is known, it can likewise be applied to determining the facial area of the fetus; this is not described separately.
Regression method
Image features may be extracted based on the three-dimensional volume data of the face of the fetus, and a trained regression model may be utilized to determine a three-dimensional eyeball area of the fetus based on the extracted image features.
In some embodiments, conventional image processing or deep learning methods may be used to extract image features from the three-dimensional volume data of the fetal face; the position and orientation of the eyeball area are then regressed from the extracted features using a traditional machine learning method or a deep learning method. Regression here means learning an optimal mapping function from the image features of the facial volume data to the three-dimensional eyeball area, such that the error between the eyeball-area position obtained through the mapping function and the actual eyeball-area position is minimized.
Various methods may be employed to extract image features from the three-dimensional volume data of the fetal face, such as, but not limited to, conventional image processing methods and deep learning methods. Conventional image processing includes extracting image features, such as SIFT features, gradient features, texture features such as LBP, PCA, LDA, Haar features, and HOG and LOG features, as well as extracting image edges, e.g., edge extraction with the Canny operator. The deep learning approach includes training a network on one or more specific tasks, such as regressing the position of the eyeball area or identifying key anatomical structures and/or key points of the fetal face, and then taking the output of one or more nodes of the trained network as the extracted image features.
In some embodiments, the regression of the position of the three-dimensional eyeball area may use either a traditional machine learning method or a deep learning method. An ultrasound database of fetal faces may be built that relates the image features of the three-dimensional volume data to the position of the three-dimensional eyeball area; each entry may comprise the volume data of a fetal face and/or its image features, together with the position of the eyeball area. Training the regression model amounts to finding an optimal mapping function from the image features of the facial volume data to the eyeball-area position, minimizing the error between the position predicted through the mapping function and the actual position. With this mapping function, the position of the fetal eyeball area can be predicted from the image features of the face. Among regression methods, traditional machine learning includes support vector machines (SVM), least squares, logistic regression, and the like, with the mapping function being, e.g., a linear, polynomial or logistic function. In the deep learning approach, a deep neural network serves as the mapping function, e.g., a convolutional neural network (CNN), a multi-layer perceptron (MLP) or a recurrent neural network (RNN).
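The mapping-function idea can be illustrated with ordinary least squares, one of the traditional regression methods listed above; the feature matrix and target positions below are synthetic stand-ins, not real ultrasound data or the patent's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cases, n_feat = 200, 16
X = rng.normal(size=(n_cases, n_feat))   # image features, one row per volume
true_W = rng.normal(size=(n_feat, 3))    # hidden feature -> position mapping
Y = X @ true_W                           # eyeball-area positions (x, y, z)

# "Training": find the mapping W minimizing ||X W - Y||^2 (least squares),
# i.e., the error between predicted and actual eyeball-area positions.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# "Prediction": position of the eyeball area for a new volume's features.
x_new = rng.normal(size=(1, n_feat))
pred = x_new @ W
```

On this noiseless toy data the learned mapping recovers the hidden one exactly; with real features, the same least-squares fit would be replaced or supplemented by SVM, logistic regression or a neural network, as the text lists.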
Segmentation method
A three-dimensional eyeball area of the fetus may be determined using the trained segmentation model based on the three-dimensional volume data of the face of the fetus.
In some embodiments, the segmentation method may be a deep-learning-based method, or a machine learning method combined with conventional image processing, and so on. First, a database of ultrasound images may need to be constructed, in each of which the boundary extent of the eyeball anatomy within the three-dimensional volume data is accurately annotated.
The deep-learning-based image segmentation approach performs feature learning and eyeball-boundary learning on the constructed database with stacked convolution and deconvolution layers. For example, for an input image, the deep learning network may directly generate an image mask of the same size whose voxel values indicate whether each voxel belongs to the three-dimensional eyeball area, thereby giving the specific boundary extent of the eyeball anatomy. In some embodiments, the segmentation network may be an FCN (fully convolutional network), U-Net, SegNet, DeepLab, Mask R-CNN, and so on.
The machine-learning-based segmentation approach first pre-segments the image with conventional image processing methods such as threshold segmentation, snakes (active contours), level sets, GraphCut, ASM or AAM, yielding a set of candidate eyeball-boundary regions in the ultrasound image. Features are then extracted from each candidate region, either conventional features such as PCA, LDA, HOG, Haar or LBP, or features extracted by a neural network. Finally, the extracted features are matched against features extracted from the annotated eyeball boundaries in the database, and a classifier such as KNN, SVM, random forest or a neural network decides whether the current candidate region contains the key anatomical structure.
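The pre-segmentation step can be sketched, for instance, with simple thresholding plus connected-component labeling; the threshold, toy volume and the rule of keeping the two largest components as eyeball candidates are illustrative assumptions, not the patent's specified pipeline.

```python
import numpy as np
from scipy import ndimage

def candidate_eyeballs(volume, threshold):
    """Threshold pre-segmentation; return the two largest 3D components."""
    mask = volume > threshold                 # simple threshold segmentation
    labels, n = ndimage.label(mask)           # 3D connected components
    if n < 2:
        return None
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    largest_two = np.argsort(sizes)[-2:] + 1  # label ids of the two biggest blobs
    return [labels == lab for lab in largest_two]

# Toy volume with two bright blobs standing in for the eyeballs.
vol = np.zeros((20, 20, 20))
vol[4:8, 4:8, 4:8] = 1.0
vol[12:17, 12:17, 12:17] = 1.0
regions = candidate_eyeballs(vol, 0.5)
```

In the full method described above, each such candidate region would then be passed to feature extraction and a classifier to confirm it contains the eyeball anatomy.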
Matching method
The three-dimensional eyeball area of the fetus can be determined by matching the representative data of the three-dimensional volume data of the face portion of the fetus with a reference template of the representative data.
In some embodiments, the representative data of the three-dimensional volume data of the fetal face part may include at least one of the three-dimensional volume data of the fetal face part itself, data of a two-dimensional profile of the fetal face part, and data of a critical part of the fetal face part.
In some embodiments, the two-dimensional section of the fetal face is the section passing through the centers of both eyeballs. For template matching, this through-center section is preferred, as it is best suited to measuring the interocular distances.
Additionally or alternatively, the key part of the fetal face may include key anatomical structures and/or key points of the fetal face. Compared with template matching of the whole face or the whole volume, matching on key anatomical structures and/or key points can significantly reduce the computational load while still yielding a reasonably accurate three-dimensional eyeball area.
In this embodiment, the reference templates of the representative data may include templates of the three-dimensional volume data of the fetal face, templates of two-dimensional sections of the fetal face, reference data of key anatomical structures of the fetal face, reference data of key points of the fetal face, and the like. The template of the two-dimensional section may be chosen as the section best suited to measuring the interocular distances, such as, but not limited to, the section passing through the centers of both eyeballs. Matching the representative data of the acquired volume data against the reference template may include matching against any of the templates/reference data above.
Take as an illustration the case where the acquired three-dimensional volume data of the fetal face is matched against a reference template of such volume data. An optimal three-dimensional spatial transformation may be sought that maximizes the similarity (or minimizes the difference) between the acquired volume data and the reference template. Alternatively, image features (such as gradient features, LBP texture features, Haar features, HOG/LOG features and the like) may first be extracted from both the acquired volume data and the reference template, and the optimal spatial transformation sought that maximizes the similarity (or minimizes the difference) between the two feature sets. After matching, the position of the fetal eyeball area within the acquired volume data can be obtained from its position in the reference template, for example through the inverse of the spatial transformation.
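A translation-only sketch of this template-matching idea on toy volumes: exhaustively search offsets for the best correlation score, then map the template's known eyeball position into the acquired volume through that offset (the inverse of the found transformation). A full implementation would also search rotation and scale; all names here are illustrative.

```python
import numpy as np

def match_translation(volume, template):
    """Exhaustive search over translations; return the best (z, y, x) offset."""
    vs, ts = volume.shape, template.shape
    best, best_off = -np.inf, (0, 0, 0)
    for z in range(vs[0] - ts[0] + 1):
        for y in range(vs[1] - ts[1] + 1):
            for x in range(vs[2] - ts[2] + 1):
                patch = volume[z:z+ts[0], y:y+ts[1], x:x+ts[2]]
                score = float((patch * template).sum())  # correlation score
                if score > best:
                    best, best_off = score, (z, y, x)
    return best_off

# Reference template with a known eyeball position; the acquired volume
# contains a translated copy of the template's bright voxel.
template = np.zeros((4, 4, 4)); template[1, 2, 2] = 1.0
eyeball_in_template = np.array([1, 2, 2])
vol = np.zeros((10, 10, 10)); vol[6, 7, 3] = 1.0
offset = np.array(match_translation(vol, template))
eyeball_in_volume = offset + eyeball_in_template  # apply the inverse transform
```

The exhaustive search is O(volume x template) in voxels; in practice FFT-based correlation or a feature-space search, as the text suggests, would be used instead.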
In some embodiments, the acquired three-dimensional volume data of the fetal face may instead be matched against a reference template of a two-dimensional section of the fetal face. An optimal two-dimensional section is sought within the acquired volume data such that its similarity to the reference section template is highest (or their difference smallest), either directly or in terms of image features (such as gradient features, LBP texture features, Haar features, HOG/LOG features and the like) extracted from each. After matching, the position of the fetal eyeball area within the acquired volume data can be obtained from its position in the reference section template.
In some embodiments, the acquired three-dimensional volume data of the fetal face may also be matched against reference data of key anatomical structures of the fetal face (such as, but not limited to, the nose, the bridge of the nose, the chin or the eyeballs). An optimal image block is sought within the acquired volume data such that its similarity to the reference data of the key anatomical structure is highest (or their difference smallest), either directly or in terms of extracted image features. Specifically, this matching may also involve detecting candidate regions of the key anatomical structures in the acquired volume data with a target detection method such as Faster R-CNN, Mask R-CNN, SSD, YOLO, RetinaNet, EfficientNet, CornerNet, CenterNet or FCOS, and then matching the candidate regions against the reference data.
The matching may consist in searching for the optimal candidate key anatomical structure, i.e., the candidate whose similarity to the reference data of the key anatomical structure is highest (or whose difference is smallest); alternatively, image features may be extracted from both the candidate structures and the reference data, and the candidate chosen whose features are most similar to (or least different from) those of the reference data. In some embodiments, the optimal candidate structure and an optimal spatial transformation may be sought jointly, such that the candidate, under the transformation, deviates least from the spatial location of the reference data. After matching, the position of the eyeball area can be determined from its position relative to the key anatomical structures and taken as the detection result for the fetal eyeball area; the eyeball position may also be obtained from the resulting optimal spatial transformation and the position of the eyeball area in the reference template of the volume data.
In some embodiments, the acquired three-dimensional volume data of the fetal face may also be matched against reference data of key points of the eyeball area of the fetal face. An optimal point is sought within the acquired volume data such that the image features in its neighborhood are most similar to (or least different from) those of the reference key points. Alternatively, candidate key points of the eyeball area may first be detected in the acquired volume data with a feature-point extraction method (e.g., SIFT), a corner detection method (e.g., Harris) or a neural network that predicts point coordinates or the regions where points lie, and these candidates then matched against the reference key points. The matching may consist in searching for the optimal candidate key point, i.e., the one whose neighborhood features are most similar to (or least different from) those of the reference key point; it may also consist in jointly searching for the optimal candidate key point and an optimal spatial transformation, such that the candidate, under the transformation, deviates least from the spatial position of the reference key point.
After matching is completed, the position of a region relative to the key points can be determined according to the positions of the key points of the eyeball region of the fetal face, and taken as the detection result of the eyeball region of the face; the position corresponding to the eyeball region in the three-dimensional volume data of the face may also be obtained by applying the obtained optimal spatial transformation (e.g., its inverse) to the position of the eyeball region in the reference template of the three-dimensional volume data of the face.
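The joint search for candidate key points and an optimal spatial transformation can be illustrated with a rigid least-squares alignment (the Kabsch algorithm) — one common way, though not one mandated by this disclosure, to find the transformation that minimizes the spatial offset between candidate key points and reference key points:

```python
import numpy as np

def kabsch_align(candidate_pts, reference_pts):
    """Least-squares rigid transform (R, t) mapping candidate key points
    onto corresponding reference key points, plus the residual spatial
    offset that the matching seeks to minimize. Points are Nx3 arrays
    with row-wise correspondence assumed."""
    P = np.asarray(candidate_pts, dtype=float)
    Q = np.asarray(reference_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (P.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    aligned = (R @ P.T).T + t
    residual = float(np.sqrt(((aligned - Q) ** 2).sum(axis=1)).mean())
    return R, t, residual
```

Among several candidate key-point sets, the one with the smallest residual would be the optimal candidate; the inverse of the recovered transform can then map template positions (such as the eyeball region) back into the acquired volume.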
Furthermore, although illustrative embodiments are described herein, the scope includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of schemes across various embodiments), adaptations, or alterations based on the present disclosure. Elements in the claims are to be construed broadly based on the language used in the claims and are not limited to examples described in the specification or during prosecution of the application, which examples are to be construed as non-exclusive. Furthermore, the steps of the disclosed methods may be modified in any manner, including by reordering steps or inserting or deleting steps. It is intended, therefore, that the description be regarded as examples only, with a true scope being indicated by the following claims and their full range of equivalents.
The above description is intended to be illustrative and not restrictive. For example, the above-described examples (or one or more aspects thereof) may be used in combination with each other. Other embodiments may be used by those of ordinary skill in the art in view of the above description. Moreover, in the foregoing detailed description, various features may be grouped together to simplify the present disclosure. This should not be interpreted as intending that an unclaimed disclosed feature is essential to any claim. Rather, the inventive subject matter may lie in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with each other in various combinations or permutations. The scope of the invention should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.

Claims (26)

1. A method of ultrasound imaging of a fetal face, comprising:
acquiring three-dimensional volume data of the face of the fetus;
determining, by a processor, a three-dimensional eyeball area of the fetus and a three-dimensional face area of the fetus based on the acquired three-dimensional volume data of the face of the fetus;
generating, by the processor, a VR map of the face together with the eyeballs based on the three-dimensional eyeball area of the fetus and the three-dimensional face area of the fetus;
determining, by the processor, binocular distance-related parameters including an intra-binocular distance and an extra-binocular distance based on the three-dimensional eyeball area of the fetus; and
presenting, by the processor, the determined binocular distance-related parameters in association with the VR map of the face together with the eyeballs.
2. The ultrasound imaging method of claim 1, wherein determining the binocular distance-related parameters including the intra-binocular distance and the extra-binocular distance based on the three-dimensional eyeball area of the fetus specifically comprises:
determining a pair of inner boundary points and a pair of outer boundary points of the two eyeballs based on the three-dimensional eyeball area of the fetus;
determining the distance between the paired inner boundary points of the two eyeballs as the intra-binocular distance; and
determining the distance between the paired outer boundary points of the two eyeballs as the extra-binocular distance.
3. The ultrasound imaging method of claim 2, wherein determining a pair of inner boundary points and a pair of outer boundary points of the two eyeballs based on the three-dimensional eyeball area of the fetus specifically comprises:
determining the center of each eyeball based on the three-dimensional eyeball area of the fetus; and
determining the boundary points at which the line connecting the centers of the two eyeballs intersects the boundaries of the three-dimensional eyeball areas as the paired inner boundary points and the paired outer boundary points; or
determining the two boundary points of the three-dimensional eyeball areas that are closest to each other as the paired inner boundary points, and the two boundary points that are farthest from each other as the paired outer boundary points.
4. The ultrasound imaging method of claim 1, wherein the binocular distance-related parameters further comprise a ratio of the intra-binocular distance to the extra-binocular distance.
5. The ultrasound imaging method of claim 1, wherein presenting the determined binocular distance-related parameters in association with the VR map of the face together with the eyeballs specifically comprises:
presenting, on the VR map of the face together with the eyeballs, a first line connecting the paired inner boundary points of the two eyeballs and a second line connecting the paired outer boundary points of the two eyeballs; and
presenting the intra-binocular distance in association with the first line and the extra-binocular distance in association with the second line.
6. The ultrasound imaging method of claim 1, wherein determining the three-dimensional eyeball area of the fetus and the three-dimensional face area of the fetus based on the acquired three-dimensional volume data of the face of the fetus is achieved by any one or a combination of the following:
extracting image features from the three-dimensional volume data of the face of the fetus, and determining the three-dimensional eyeball area of the fetus and/or the three-dimensional face area of the fetus using a trained regression model based on the extracted image features; and/or
determining the three-dimensional eyeball area of the fetus and/or the three-dimensional face area of the fetus using a trained segmentation model based on the three-dimensional volume data of the face of the fetus; and/or
determining the three-dimensional eyeball area of the fetus and/or the three-dimensional face area of the fetus by matching representative data of the three-dimensional volume data of the face of the fetus with a reference template of the representative data.
7. The ultrasound imaging method of claim 6, wherein the representative data of the three-dimensional volume data of the fetal face includes at least one of the three-dimensional volume data of the fetal face itself, data of a two-dimensional section of the fetal face, and data of a key part of the fetal face.
8. The ultrasound imaging method of claim 7, wherein the two-dimensional section of the fetal face is a two-dimensional section passing through the centers of the two eyeballs; and/or the key part of the fetal face comprises key anatomical structures and/or key points of the fetal face.
9. A method of ultrasound imaging of a fetal face, comprising:
acquiring three-dimensional volume data of the face of the fetus;
determining, by a processor, a three-dimensional eyeball area of the fetus based on the acquired three-dimensional volume data of the face of the fetus;
determining, by the processor, binocular distance-related parameters including an intra-binocular distance and an extra-binocular distance based on the three-dimensional eyeball area of the fetus; and
presenting, by the processor, the determined binocular distance-related parameters in association with the three-dimensional eyeball area of the fetus.
10. The ultrasound imaging method of claim 9, wherein determining the binocular distance-related parameters including the intra-binocular distance and the extra-binocular distance based on the three-dimensional eyeball area of the fetus specifically comprises:
determining a pair of inner boundary points and a pair of outer boundary points of the two eyeballs based on the three-dimensional eyeball area of the fetus;
determining the distance between the paired inner boundary points of the two eyeballs as the intra-binocular distance; and
determining the distance between the paired outer boundary points of the two eyeballs as the extra-binocular distance.
11. The ultrasound imaging method of claim 10, wherein determining a pair of inner boundary points and a pair of outer boundary points of the two eyeballs based on the three-dimensional eyeball area of the fetus specifically comprises:
determining the center of each eyeball based on the three-dimensional eyeball area of the fetus; and
determining the boundary points at which the line connecting the centers of the two eyeballs intersects the boundaries of the three-dimensional eyeball areas as the paired inner boundary points and the paired outer boundary points; or
determining the two boundary points of the three-dimensional eyeball areas that are closest to each other as the paired inner boundary points, and the two boundary points that are farthest from each other as the paired outer boundary points.
12. The ultrasound imaging method of claim 9, wherein the binocular distance-related parameters further comprise a ratio of the intra-binocular distance to the extra-binocular distance.
13. The ultrasound imaging method of claim 9, wherein presenting the determined binocular distance-related parameters in association with the three-dimensional eyeball area of the fetus specifically comprises:
presenting a first line connecting the paired inner boundary points of the two eyeballs and a second line connecting the paired outer boundary points of the two eyeballs; and
presenting the intra-binocular distance in association with the first line and the extra-binocular distance in association with the second line.
14. The ultrasound imaging method of claim 9, wherein determining the three-dimensional eyeball area of the fetus based on the acquired three-dimensional volume data of the face of the fetus is achieved by any one or a combination of the following:
extracting image features from the three-dimensional volume data of the face of the fetus, and determining the three-dimensional eyeball area of the fetus using a trained regression model based on the extracted image features; and/or
determining the three-dimensional eyeball area of the fetus using a trained segmentation model based on the three-dimensional volume data of the face of the fetus; and/or
determining the three-dimensional eyeball area of the fetus by matching representative data of the three-dimensional volume data of the face of the fetus with a reference template of the representative data.
15. The ultrasound imaging method of claim 14, wherein the representative data of the three-dimensional volume data of the fetal face includes at least one of the three-dimensional volume data of the fetal face itself, data of a two-dimensional section of the fetal face, and data of a key part of the fetal face.
16. The ultrasound imaging method of claim 15, wherein the two-dimensional section of the fetal face is a two-dimensional section passing through the centers of the two eyeballs; and/or the key part of the fetal face comprises key anatomical structures and/or key points of the fetal face.
17. A method of ultrasound imaging of a fetal face, comprising:
acquiring three-dimensional volume data of the face of the fetus;
determining, by a processor, a three-dimensional eyeball area of the fetus based on the acquired three-dimensional volume data of the face of the fetus;
determining, by the processor, binocular distance-related parameters including an intra-binocular distance and an extra-binocular distance based on the three-dimensional eyeball area of the fetus;
extracting, by the processor, a section passing through the centers of the two eyeballs based on the three-dimensional eyeball area of the fetus; and
presenting, by the processor, the determined binocular distance-related parameters in association with the extracted section passing through the centers of the two eyeballs.
18. The ultrasound imaging method of claim 17, wherein determining the binocular distance-related parameters including the intra-binocular distance and the extra-binocular distance based on the three-dimensional eyeball area of the fetus specifically comprises:
determining a pair of inner boundary points and a pair of outer boundary points of the two eyeballs based on the three-dimensional eyeball area of the fetus;
determining the distance between the paired inner boundary points of the two eyeballs as the intra-binocular distance; and
determining the distance between the paired outer boundary points of the two eyeballs as the extra-binocular distance.
19. The ultrasound imaging method of claim 18, wherein determining a pair of inner boundary points and a pair of outer boundary points of the two eyeballs based on the three-dimensional eyeball area of the fetus specifically comprises:
determining the center of each eyeball based on the three-dimensional eyeball area of the fetus; and
determining the boundary points at which the line connecting the centers of the two eyeballs intersects the boundaries of the three-dimensional eyeball areas as the paired inner boundary points and the paired outer boundary points; or
determining the two boundary points of the three-dimensional eyeball areas that are closest to each other as the paired inner boundary points, and the two boundary points that are farthest from each other as the paired outer boundary points.
20. The ultrasound imaging method of claim 17, wherein the binocular distance-related parameters further comprise a ratio of the intra-binocular distance to the extra-binocular distance.
21. The ultrasound imaging method of claim 17, wherein presenting the determined binocular distance-related parameters in association with the extracted section passing through the centers of the two eyeballs specifically comprises:
presenting, on the section passing through the centers of the two eyeballs, a first line connecting the paired inner boundary points of the two eyeballs and a second line connecting the paired outer boundary points of the two eyeballs; and
presenting the intra-binocular distance in association with the first line and the extra-binocular distance in association with the second line.
22. The ultrasound imaging method of claim 17, wherein determining the three-dimensional eyeball area of the fetus based on the acquired three-dimensional volume data of the face of the fetus is achieved by any one or a combination of the following:
extracting image features from the three-dimensional volume data of the face of the fetus, and determining the three-dimensional eyeball area of the fetus using a trained regression model based on the extracted image features; and/or
determining the three-dimensional eyeball area of the fetus using a trained segmentation model based on the three-dimensional volume data of the face of the fetus; and/or
determining the three-dimensional eyeball area of the fetus by matching representative data of the three-dimensional volume data of the face of the fetus with a reference template of the representative data.
23. The ultrasound imaging method of claim 22, wherein the representative data of the three-dimensional volume data of the fetal face includes at least one of the three-dimensional volume data of the fetal face itself, data of a two-dimensional section of the fetal face, and data of a key part of the fetal face.
24. The ultrasound imaging method of claim 23, wherein the two-dimensional section of the fetal face is a two-dimensional section passing through the centers of the two eyeballs; and/or the key part of the fetal face comprises key anatomical structures and/or key points of the fetal face.
25. An ultrasound imaging apparatus for a fetal face, comprising a processor configured to perform the ultrasound imaging method of a fetal face of any of claims 1-24.
26. A computer-readable storage medium having stored thereon computer-executable instructions which, when executed by a processor, implement the ultrasound imaging method of a fetal face according to any of claims 1-24.
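As an illustrative, non-authoritative sketch of the distance computation recited in claims 2-4 (using the closest-pair/farthest-pair variant of claim 3), assuming the boundary points of each segmented eyeball are available as hypothetical coordinate lists:

```python
import numpy as np

def binocular_distances(boundary_left, boundary_right):
    """Compute the intra-binocular distance (closest left/right boundary
    pair), the extra-binocular distance (farthest pair), and their ratio,
    from the boundary points of the two segmented eyeballs."""
    L = np.asarray(boundary_left, dtype=float)
    R = np.asarray(boundary_right, dtype=float)
    # Pairwise distances between every left and every right boundary point.
    d = np.linalg.norm(L[:, None, :] - R[None, :, :], axis=-1)
    inner = float(d.min())   # closest pair  -> intra-binocular distance
    outer = float(d.max())   # farthest pair -> extra-binocular distance
    return inner, outer, inner / outer
```

For two eyeballs spanning x = [-3, -1] and x = [1, 3] on a common axis, this yields an intra-binocular distance of 2 and an extra-binocular distance of 6.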
CN202111464965.2A 2021-12-03 2021-12-03 Ultrasonic imaging method, ultrasonic imaging device and medium for face part of fetus Pending CN116211349A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111464965.2A CN116211349A (en) 2021-12-03 2021-12-03 Ultrasonic imaging method, ultrasonic imaging device and medium for face part of fetus


Publications (1)

Publication Number Publication Date
CN116211349A true CN116211349A (en) 2023-06-06

Family

ID=86587770

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111464965.2A Pending CN116211349A (en) 2021-12-03 2021-12-03 Ultrasonic imaging method, ultrasonic imaging device and medium for face part of fetus

Country Status (1)

Country Link
CN (1) CN116211349A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116687442A (en) * 2023-08-08 2023-09-05 汕头市超声仪器研究所股份有限公司 Fetal face imaging method based on three-dimensional volume data


Similar Documents

Publication Publication Date Title
JP6467041B2 (en) Ultrasonic diagnostic apparatus and image processing method
Namburete et al. Learning-based prediction of gestational age from ultrasound images of the fetal brain
US9430825B2 (en) Image processing apparatus, control method, and computer readable storage medium for analyzing retina layers of an eye
JP4909378B2 (en) Image processing apparatus, control method therefor, and computer program
US8699766B2 (en) Method and apparatus for extracting and measuring object of interest from an image
CN110279433A (en) A kind of fetus head circumference automatic and accurate measurement method based on convolutional neural networks
CN111374712B (en) Ultrasonic imaging method and ultrasonic imaging equipment
Ribeiro et al. Handling inter-annotator agreement for automated skin lesion segmentation
CN112164043A (en) Method and system for splicing multiple fundus images
JP2012061337A (en) Image forming apparatus, method for controlling the same, and computer program
CN116211349A (en) Ultrasonic imaging method, ultrasonic imaging device and medium for face part of fetus
Liu et al. Automated classification and measurement of fetal ultrasound images with attention feature pyramid network
CN116171131A (en) Ultrasonic imaging method and ultrasonic imaging system for early pregnancy fetus
Aji et al. Automatic measurement of fetal head circumference from 2-dimensional ultrasound
US20230115927A1 (en) Systems and methods for plaque identification, plaque composition analysis, and plaque stability detection
Droste et al. Discovering salient anatomical landmarks by predicting human gaze
Feng et al. Automatic fetal weight estimation using 3d ultrasonography
CN115813433A (en) Follicle measuring method based on two-dimensional ultrasonic imaging and ultrasonic imaging system
CN113792740A (en) Arteriovenous segmentation method, system, equipment and medium for fundus color photography
CN117495763A (en) Fetal facial image processing method, processing device, ultrasonic imaging system and medium
RU120799U1 (en) INTEREST AREA SEARCH SYSTEM IN THREE-DIMENSIONAL MEDICAL IMAGES
Nabila et al. Automated Cerebral Lateral Ventricle Ratio Measurement From 2-Dimensional Fetal Ultrasound Image to Predict Ventriculomegaly
CN117982169A (en) Method for determining endometrium thickness and ultrasonic equipment
CN113658152B (en) Cerebral stroke risk prediction device, cerebral stroke risk prediction method, computer device and storage medium
KR102393390B1 (en) Target data prediction method using correlation information based on multi medical image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination