CN107424162A - Image segmentation method and system - Google Patents

Image segmentation method and system

Info

Publication number
CN107424162A
CN107424162A
Authority
CN
China
Prior art keywords
model
image
edge
point
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710311908.8A
Other languages
Chinese (zh)
Other versions
CN107424162B (en)
Inventor
郭延恩
王晓东
沈建华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN201710311908.8A priority Critical patent/CN107424162B/en
Priority to US15/710,815 priority patent/US10482604B2/en
Publication of CN107424162A publication Critical patent/CN107424162A/en
Application granted granted Critical
Publication of CN107424162B publication Critical patent/CN107424162B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003 Reconstruction from projections, e.g. tomography
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G06T2207/10088 Magnetic resonance imaging [MRI]
    • G06T2207/10104 Positron emission tomography [PET]
    • G06T2207/10108 Single photon emission computed tomography [SPECT]
    • G06T2207/10116 X-ray image
    • G06T2207/10132 Ultrasound image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20076 Probabilistic image processing
    • G06T2207/20081 Training; Learning
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The present invention provides an image segmentation method, including: obtaining image data; reconstructing an image based on the image data, wherein the image includes one or more first edges; obtaining a model, wherein the model includes one or more second edges corresponding to the one or more first edges; matching the model with the reconstructed image; and adjusting the one or more second edges of the model according to the one or more first edges.

Description

Image segmentation method and system
【Technical field】
The present invention relates to an image processing method, and more particularly to a probability-based image matching and segmentation method and system.
【Background technology】
With improving living standards and longer life expectancy, cardiovascular disease has become the leading cause of death in humans, so early diagnosis of cardiovascular disease can effectively reduce mortality. Understanding the radiological imaging of cardiac structure and the associated performance data is an important prerequisite for correctly diagnosing heart disease. Advances in CT technology have markedly improved temporal resolution and reduced heartbeat artifacts, showing good application potential for depicting fine cardiac structures.
Image segmentation is a key technology in image analysis and plays an increasingly important role in medical imaging. Segmentation is an indispensable means of extracting quantitative information about specific tissues from an image, and it is also a preprocessing step and prerequisite for visualization. Segmented images are widely used in many settings, such as quantitative analysis of tissue volume, diagnosis, localization of pathological tissue, study of anatomical structure, treatment planning, partial-volume correction of functional imaging data, and computer-guided surgery.
After a CT image is reconstructed, the heart chambers in the image need to be located and identified, which requires detection of the cardiac edges. Deformable models are a common approach in the field of heart chamber segmentation. A heart chamber model is obtained by averaging the image data corresponding to multiple sets of clinical heart chamber models, and a matched image is obtained by matching the model to the image.
【Summary of the invention】
An image processing method includes: obtaining image data; reconstructing an image based on the image data, wherein the image includes one or more first edges; obtaining a model, wherein the model includes one or more second edges corresponding to the one or more first edges; matching the model with the reconstructed image; and adjusting one or more second edges of the model according to the one or more first edges.
Further, adjusting the one or more second edges of the model includes: determining a reference point on a second edge; determining a target point corresponding to the reference point; and adjusting the second edge of the model according to the target point.
Further, determining the target point corresponding to the reference point includes: determining the normal of the reference point; obtaining a step size and a search range; determining one or more candidate points along the normal according to the step size and the search range; obtaining a first classifier; determining, according to the first classifier, the probability that each of the one or more candidate points corresponds to the first edge; and determining the target point based on the probabilities that the one or more candidate points correspond to the first edge.
Further, determining the normal of the reference point includes: determining one or more polygonal meshes adjacent to the reference point; determining one or more normals corresponding to the one or more polygonal meshes; and determining the normal of the reference point according to the one or more normals.
Further, adjusting the second edge of the model includes: performing a similarity transformation on the second edge; determining control points of the model, generating association factors for the control points of the model according to the relationship between the control points and the one or more second edges of the model, and performing an affine transformation on the second edge according to the association factors; or fine-tuning the second edge based on an energy function.
Further, the image data include a brain image, a skull image, a chest image, a cardiac image, a breast image, an abdominal image, a kidney image, a liver image, a pelvic image, a perineal image, a limb image, a spine image, or a bone image.
Further, an image processing system includes: a memory configured to store data and instructions; and a processor in communication with the memory, wherein, when executing the instructions in the memory, the processor is configured to: obtain image data; reconstruct an image based on the image data, wherein the image includes one or more first edges; obtain a model, wherein the model includes one or more second edges corresponding to the one or more first edges; match the model with the reconstructed image; and adjust one or more second edges of the model according to the one or more first edges.
Further, the processor is further configured to: determine a reference point on a second edge; determine a target point corresponding to the reference point; and adjust the second edge of the model according to the target point.
Further, the processor is further configured to: determine the normal of the reference point; obtain a step size and a search range; determine one or more candidate points along the normal according to the step size and the search range; obtain a first classifier; determine, according to the first classifier, the probability that each of the one or more candidate points corresponds to the first edge; and determine the target point based on the probabilities that the one or more candidate points correspond to the first edge.
Further, adjusting the second edge of the model includes: performing a similarity transformation on the second edge; determining control points of the model, generating association factors for the control points of the model according to the relationship between the control points and the one or more second edges of the model, and performing an affine transformation on the second edge according to the association factors; or fine-tuning the second edge based on an energy function.
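The candidate-point search in the preceding paragraphs can be illustrated with a short sketch. This is not the patent's implementation: the `edge_probability` callable stands in for the first classifier, and the step size and search range are arbitrary example values.

```python
import numpy as np

def find_target_point(reference_point, normal, step, search_range, edge_probability):
    """Pick the point along the reference point's normal most likely to lie on a first edge.

    reference_point : (3,) point on a second edge of the model
    normal          : (3,) unit normal at the reference point
    step            : spacing between candidate points along the normal
    search_range    : maximum distance searched on either side of the reference point
    edge_probability: callable mapping a 3D point to an edge probability (stand-in classifier)
    """
    offsets = np.arange(-search_range, search_range + step, step)
    candidates = reference_point + offsets[:, None] * normal        # candidate points on the normal
    probabilities = np.array([edge_probability(p) for p in candidates])
    return candidates[np.argmax(probabilities)]                     # highest-probability candidate

# Toy usage: a dummy probability function peaked 2 units outside the reference point.
reference = np.array([10.0, 20.0, 30.0])
normal = np.array([0.0, 0.0, 1.0])
target = find_target_point(reference, normal, step=0.5, search_range=5.0,
                           edge_probability=lambda p: np.exp(-(p[2] - 32.0) ** 2))
```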
【Brief description of the drawings】
Fig. 1 is a schematic diagram of an application scenario of an exemplary control and processing system according to some embodiments of the present application;
Fig. 2 is a schematic diagram of an exemplary system configuration of a processing device according to some embodiments of the present application;
Fig. 3 is a schematic diagram of an exemplary mobile device for implementing some particular systems of the present application according to some embodiments of the present application;
Fig. 4 is a schematic diagram of an exemplary processing device according to some embodiments of the present application;
Fig. 5 is an exemplary flowchart of operating the processing device according to some embodiments of the present application;
Fig. 6 is a schematic diagram of an exemplary model construction module according to some embodiments of the present application;
Fig. 7 is an exemplary flowchart of constructing an average model according to some embodiments of the present application;
Fig. 8 is a schematic diagram of an exemplary training module according to some embodiments of the present application;
Fig. 9 is an exemplary flowchart of training a classifier according to some embodiments of the present application;
Fig. 10 is a structural diagram of an exemplary model matching module according to some embodiments of the present application;
Fig. 11 is an exemplary flowchart of matching the average model with the reconstructed image according to some embodiments of the present application;
Fig. 12 is a structural diagram of an exemplary adjustment module according to some embodiments of the present application;
Fig. 13 is an exemplary flowchart of adjusting the model according to some embodiments of the present application;
Fig. 14 is an exemplary flowchart of determining a target point according to some embodiments of the present application;
Fig. 15 is an exemplary flowchart of determining the normal of an edge point of the average model according to some embodiments of the present application;
Fig. 16 is an exemplary flowchart of transforming the edge points of the average model according to some embodiments of the present application;
Fig. 17 is a schematic diagram of the edge sharpness of an image of the present application;
Fig. 18 is an embodiment of image edge classification training of the present application;
Fig. 19 is an embodiment of model mesh classification of the present application;
Fig. 20 is an embodiment of model mesh division of the present application;
Fig. 21 is an embodiment of a mesh model based on association factors of the present application;
Fig. 22 is an exemplary image edge map based on sharpness classification;
Fig. 23 is an exemplary model map based on sharpness classification;
Fig. 24 is an embodiment of an image probability map obtained according to the classifier;
Fig. 25 is an exemplary diagram of the average mesh model matched with the image after a Hough transform;
Fig. 26 is an exemplary diagram of an image chamber segmentation result after accurate matching and adjustment;
Fig. 27A is an image segmentation map obtained without association factors; and
Fig. 27B is an image segmentation map obtained with association factors.
【Embodiment】
Many details are set forth in the following description in order to provide a full understanding of the present invention. The present invention can, however, be implemented in many ways other than those described here, and those skilled in the art can make similar generalizations without departing from the spirit of the present invention; the present invention is therefore not limited to the specific embodiments disclosed below.
In addition, the present invention is described in detail with the aid of schematic diagrams. When embodiments of the present invention are described in detail, the schematic diagrams are, for convenience of explanation, merely examples and should not limit the protection scope of the present invention.
Fig. 1 is a schematic diagram of an application scenario of an exemplary control and processing system according to some embodiments of the present application. As shown in Fig. 1, the control and processing system 100 includes an imaging device 110, a database 120, and a processing device 130.
The imaging device 110 can generate an image by scanning a target object. The image may be any of various medical images, for example, a head image, chest image, abdominal image, pelvic image, perineal image, limb image, or spine image. A head image may include a brain image, a skull image, etc. A chest image may include a whole-chest image, a cardiac image, a breast image, etc. An abdominal image may include a whole-abdomen image, a kidney image, a liver image, etc. A cardiac image may include, but is not limited to, a comprehensive digitized cardiogram, a digitized cardiac tomographic X-ray image, a cardiac phase-contrast image, a computed radiography (CR) image, a multi-modality image, etc. The medical image may be a two-dimensional image or a three-dimensional image. The format of the medical image may include JPEG, TIFF, GIF, FPX, etc. The medical image may be stored in the database 120, or transmitted to the processing device 130 for image processing. The present application is illustrated with a cardiac image as an example, but those skilled in the art will understand that the methods of the present application can be applied to other images.
The database 120 can store images and/or image-related information. The images and image-related information may be provided by the imaging device 110 and the processing device 130, or may be obtained from outside the system 100, for example, from user input or from a network. The image-related information may include algorithms for processing images, samples, models, parameters, real-time data generated during processing, etc. The database 120 may be a hierarchical database, a network database, or a relational database. The database 120 may be a local database or a remote database. The database 120, or another storage device in the system, may digitize information and then store it with a storage device that works in an electrical, optical, or magnetic manner. In some embodiments, the database 120 or other storage devices in the system may store information using electrical energy, for example, random access memory (RAM) or read-only memory (ROM). Random access memory may include, but is not limited to, one or more combinations of decatrons, selectrons, delay-line memory, Williams tubes, dynamic RAM (DRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), etc. Read-only memory may include, but is not limited to, one or more combinations of magnetic bubble memory, twistor memory, thin-film memory, magnetic plated wire memory, magnetic-core memory, drum memory, optical disc drives, hard disks, magnetic tape, non-volatile RAM (NVRAM), phase-change memory, magnetoresistive RAM, ferroelectric RAM, non-volatile SRAM, programmable ROM, mask ROM, floating-gate RAM, nano-RAM, racetrack memory, resistive RAM, programmable metallization cells, etc. In some embodiments, the database 120 or other storage devices in the system may store information using magnetic energy, such as hard disks, floppy disks, magnetic tape, core memory, bubble memory, USB flash drives, flash memory, etc. In some embodiments, the database 120 or other storage devices in the system may store information optically, such as CDs, DVDs, etc. In some embodiments, the database 120 may store information magneto-optically, such as magneto-optical discs. The access mode of the database 120 or other storage devices in the system may be one or more combinations of random access, serial access, read-only access, etc. The database 120 or other storage devices in the system may be non-permanent or permanent memory. The storage devices listed above are only examples, and the storage devices that the database 120 can use are not limited to these.
The database 120 may be a part of the processing device 130 or a part of the imaging device 110, or may exist independently of the processing device 130 and the imaging device 110. In some embodiments, the database 120 may be connected with other modules or devices in the control and processing system 100 through a network 150. The connection may be wired, wireless, or a combination of both.
The processing device 130 can obtain image data from the imaging device 110 or from the database 120. The processing device 130 can apply various kinds of processing to the acquired images, including gray-level histogram processing, normalization, geometric transformation, spatial transformation, image smoothing, image enhancement, image segmentation, image transformation, image restoration, image compression, image feature extraction, etc. The processing device 130 can store the processed image data in the database 120, or transfer it to devices outside the control and processing system 100.
In some embodiments, the processing device 130 may include one or more processors, memories, etc. For example, the processing device 130 may include one or more combinations of a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a micro-controller unit, a processor, a microprocessor, an Advanced RISC Machines (ARM) processor, etc.
In some embodiments, the control and processing system 100 may also include a terminal device 140. The terminal device can exchange information with the imaging device 110, the database 120, and the processing device 130. For example, the terminal device 140 can obtain processed image data from the processing device 130. In some embodiments, the terminal device 140 can obtain image data from the imaging device 110 and transfer the image data to the processing device 130 for image processing. The terminal device 140 may include one or more input devices, a control panel, etc. For example, the input device may include a keyboard, a touch screen, a mouse, a voice input device, a scanning device, an information recognition device (such as a human eye recognition system, a fingerprint recognition system, a brain monitoring system, etc.), a remote controller, etc.
The control and processing system 100 can be connected to a network 150. The network 150 may be a wireless network, a mobile network, a wired network, or another type of connection. The wireless network may include a WLAN, Wi-Fi, WiMax, etc. The mobile network may include 2G, 3G, 4G signals, etc. The wired network may include a local area network (LAN), a wide area network (WAN), a proprietary network, etc.
The database 120 and the processing device 130 in the control and processing system 100 can execute operation instructions through a cloud computing platform. The cloud computing platform may include a storage-oriented cloud platform for data storage, a computation-oriented cloud platform for data processing, or a comprehensive cloud computing platform that handles both computation and data storage. For example, some of the image data generated by the control and processing system 100 can be computed or stored by the cloud computing platform.
It should be noted that the above description of the control and processing system 100 is provided only for convenience of description and does not limit the present application to the scope of the cited embodiments.
Fig. 2 is a schematic diagram of an exemplary system configuration of a processing device according to some embodiments of the present application. As shown in Fig. 2, the processing device 130 may include a data bus 210, a processor 220, a read-only memory (ROM) 230, a random access memory (RAM) 240, a communication port 250, an input/output port 260, a hard disk 270, and a display 280 connected to the input/output port 260. The connections between the hardware components in the processing device 130 may be wired, wireless, or a combination of both. Any of the hardware may be local, remote, or a combination of both.
The data bus 210 can be used to transfer data. In some embodiments, the hardware components in the processing device 130 can exchange data through the data bus 210. For example, the processor 220 can send data through the data bus 210 to other hardware such as the memory or the input/output port 260. The data may be real data, or instruction codes, status information, or control information. In some embodiments, the data bus 210 may be an Industry Standard Architecture (ISA) bus, an Extended Industry Standard Architecture (EISA) bus, a Video Electronics Standards Association (VESA) bus, a Peripheral Component Interconnect (PCI) bus, etc.
The processor 220 can be used for logical operations, data processing, and instruction generation. In some embodiments, the processor 220 can obtain data/instructions from internal memory, which may include read-only memory (ROM), random access memory (RAM), cache memory (not shown in the figure), etc. In some embodiments, the processor 220 may include multiple sub-processors, which can be used to implement different functions of the system.
The read-only memory 230 is used for the power-on self-test of the processing device 130, the initialization of the functional modules of the processing device 130, the drivers of the basic input/output system of the processing device 130, etc. In some embodiments, the read-only memory may include programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), etc. The random access memory 240 is used to store the operating system, various application programs, data, etc. In some embodiments, the random access memory 240 may include static RAM (SRAM), dynamic RAM (DRAM), etc.
The communication port 250 is used to connect the operating system with an external network and to enable communication between them. In some embodiments, the communication port 250 may include an FTP port, an HTTP port, a DNS port, etc. The input/output port 260 is used to exchange data, information, and control between external devices or circuits and the processor 220. In some embodiments, the input/output port 260 may include a USB port, a PCI port, an IDE port, etc.
The hard disk 270 is used to store information and data generated by the processing device 130 or received from outside the processing device 130. In some embodiments, the hard disk 270 may include a mechanical hard disk drive (HDD), a solid-state drive (SSD), a hybrid hard drive (HHD), etc. The display 280 is used to present the information and data generated by the processing device 130 to the user. In some embodiments, the display 280 may include a physical display, such as a display with speakers, an LCD display, an LED display, an OLED display, an electronic ink display (E-Ink), etc.
Fig. 3 is a schematic diagram of an exemplary mobile device for implementing some particular systems of the present application according to some embodiments of the present application. As shown in Fig. 3, the mobile device 350 may include a terminal device 150. In some embodiments, a user can receive or send information related to the control and processing system 100 through the mobile device 350. The mobile device 350 may include one or more of a smart phone, a personal digital assistant (PDA), a tablet computer, a handheld device, smart glasses, a smart watch, a wearable device, a virtual reality device, an augmented-display device, etc. In some embodiments, the mobile device 350 may include a communication platform 352, a display 354, one or more central processing units (CPUs) 356, one or more graphics processing units (GPUs) 358, one or more input/output devices 360, an internal memory 362, and a storage module 368. Furthermore, the mobile device 350 may also include a system bus, a controller, etc. As shown in Fig. 3, the CPU can load a mobile operating system (for example, iOS, Android, Windows Phone, etc.) 364 and one or more applications 366 from the storage module 368 into the internal memory 362. The one or more applications 366 may include a web page or other mobile application software (app) for receiving and transmitting information related to the control and processing system 100. The user can obtain or provide information through the input/output devices 360, and the information can further be transferred to devices and/or system units in the control and processing system 100.
In embodiments of the present application, a computer hardware platform may be used as the hardware platform of one or more elements (for example, the control and processing system 100 and the other components within it), implementing the various modules, units, and their functions. The hardware elements, operating systems, and programming languages are conventional in nature, and those skilled in the art can adapt these technologies to the model-based edge segmentation of cardiac images described here. A computer with a user interface can be used as a personal computer (PC) or another type of workstation or terminal device, and a properly programmed computer can also be used as a server. Since those skilled in the art should be familiar with the structure, programming, and general operation of the computer equipment used in the present application, no further specific explanations are provided for the other figures.
Fig. 4 is a schematic diagram of an exemplary processing device according to some embodiments of the present application. The processing device 130 may include an acquisition module 410, an image reconstruction module 420, a storage module 430, a model construction module 440, a training module 450, a matching module 460, and an adjustment module 470. The connections between the modules in the processing device 130 may be wired, wireless, or a combination of both. Any module may be local, remote, or a combination of both.
The storage module 430 can be used to store image data or information; its function can be implemented by one or more combinations of the hard disk 270, the read-only memory 230, the random access memory 240, etc., in Fig. 2. The storage module 430 can store information from other modules in the processing device 130 or from modules or devices outside the processing device 130. The information stored by the storage module 430 may include scan data of the imaging device 110, control commands or parameter information input by the user, intermediate or partial data generated by the processing components of the processing device 130, etc. In some embodiments, the storage module 430 can send the stored information to the processing components for image processing. In some embodiments, the storage module 430 can store information generated by the processing components, such as real-time computed data. The storage module 430 may include, but is not limited to, common storage devices such as solid-state drives, mechanical hard disks, USB flash memory, SD memory cards, optical discs, random access memory (RAM), read-only memory (ROM), etc. The storage module 430 may be a storage device internal to the system, or an external or off-system storage device, such as storage on a cloud storage server.
The acquisition module 410 can be used to obtain image data collected by the imaging device 110, data stored in the database 120, or data from outside the control and processing system 100; its function can be implemented by the processor 220 in Fig. 2. The image data may include image data collected by the imaging device 110, algorithms for processing the images, samples, models, parameters, real-time data generated during processing, etc. In some embodiments, the acquisition module 410 can send the acquired image data or information to the image reconstruction module 420 for processing. In some embodiments, the acquisition module 410 can send acquired information such as image processing algorithms and parameters to the model construction module 440. In some embodiments, the acquisition module 410 can send the acquired image data or information to the storage module 430 for storage. In some embodiments, the acquisition module 410 can send acquired samples, parameters, models, real-time data, and other information to the training module 450, the matching module 460, or the adjustment module 470. In some embodiments, the acquisition module 410 can receive data acquisition instructions from the processor 220 and perform the corresponding data acquisition operations. In some embodiments, the acquisition module 410 can preprocess the image data or information after acquiring it.
The image reconstruction module 420 can be used to construct a medical image; its function can be implemented by the processor 220 in Fig. 2. In some embodiments, the image reconstruction module 420 can obtain image data or information from the acquisition module 410 or the storage module 430, and construct the medical image according to the image data or information. The medical image may be a three-dimensional medical image of a human body. The image data may include scan data from different times, different positions, and different angles. According to the scan data, the image reconstruction module 420 can calculate features or states of the corresponding parts of the human body, such as the tissue density of the corresponding part and the absorption of radiation by the tissue of the corresponding part, so as to construct the three-dimensional medical image. Furthermore, the three-dimensional medical image can be displayed by the display 280 or stored by the storage module 430. In some embodiments, the three-dimensional medical image can also be sent, as the reconstructed image to be processed, to the model construction module 440 for further processing.
The model construction module 440 can be used to establish a three-dimensional average model of a target object. In some embodiments, the target object may be a heart, and the three-dimensional average model may be a three-dimensional average mesh model of the heart chambers constructed from multiple sets of reference models. In some embodiments, the model construction module 440 can obtain a reference model of at least one heart chamber and information related to the reference model through the acquisition module 410, the storage module 430, or user input. The information related to the reference model may include the size of the image, the pixels, the spatial positions of the pixels, etc. In some embodiments, the model construction module 440 can preprocess the reference models, for example by registration, according to the acquired reference models of at least one heart chamber and the related information, so that the orientations, scales, etc., of all reference models are consistent. On the preprocessed images, the chamber edges can be marked manually or automatically by the processor, the heart reference model can be divided into several heart sub-chambers, and an average mesh model of the heart chambers can be built from the edge point data of each chamber. The model construction module 440 can send the constructed average mesh model of the heart chambers to the storage module 430 for storage, or send it to the training module 450 or the matching module 460 for further operations. In some embodiments, the model construction module 440 can also determine the relationships among the chambers on the average model according to the data of the multiple sets of reference models. For example, the model construction module 440 can build an association factor matrix, which can represent the influence of each chamber on one or more edge data points. Building the association factor matrix can improve the separation of chamber boundaries. The model construction module 440 can send the constructed association factor matrix to the storage module 430 for storage, or send it to the matching module 460 or the adjustment module 470 for use in computations.
The training module 450 can be used to train classifiers. The training module 450 can assign possible edge points to different chamber categories. For example, the training module 450 can assign data points within a certain range of the edges of the reference models to six chamber categories: left ventricle, left atrium, right ventricle, right atrium, left myocardium, or aorta. As another example, based on the degree of variation at the chamber edges, the training module 450 can assign data points within a certain range of the edges of the reference models to ten categories: left ventricle edge, left atrium sharp edge, left atrium non-sharp edge, right ventricle sharp edge, right ventricle non-sharp edge, right atrium sharp edge, right atrium non-sharp edge, aorta edge, left myocardium sharp edge, and left myocardium non-sharp edge. In some embodiments, the training module 450 can obtain the reference model of at least one heart chamber and information related to the reference model through the storage module 430, the model construction module 440, or user input. The information related to the reference model may include the edge point data of each chamber in the reference model. In some embodiments, the training module 450 can divide the points near a chamber edge into positive samples and negative samples according to the distance between those points and the chamber edge. In some embodiments, the positive samples may include data points within a certain threshold distance of the chamber edge, and the negative samples may include data points farther from the edge and data points at other random positions in space. In some embodiments, the training module 450 can train on the points near the chamber edges of the reference models or the average model according to the positive and negative sample points, and obtain one or more classifiers. In some embodiments, the training module 450 can train the classifiers using a Probabilistic Boosting Tree (PBT). The PBT may use a two-level PBT algorithm or a multi-level PBT algorithm. The training module 450 can send the trained classifiers to the storage module 430 for storage, or send them to the adjustment module 470 for use in computations.
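The positive/negative sampling described above can be sketched as follows; the function name, jitter radius, and minimum negative distance are illustrative assumptions, since the text only specifies that positives lie within a threshold distance of the annotated edge and negatives lie farther away or at random positions in space. The labeled points would then be fed to a boosting classifier such as the PBT mentioned above.

```python
import numpy as np

def sample_training_points(edge_points, volume_shape, pos_radius=1.5,
                           neg_min_dist=5.0, n_neg=5000, seed=0):
    """Generate positive/negative training samples around annotated chamber-edge points.

    edge_points : (M, 3) annotated chamber-edge coordinates (voxels)
    volume_shape: shape of the image volume, used to draw random negative locations
    """
    rng = np.random.default_rng(seed)
    # Positives: small random offsets around each annotated edge point.
    positives = edge_points + rng.uniform(-pos_radius, pos_radius, size=edge_points.shape)
    # Negatives: random voxels kept only if they are far from every edge point.
    negatives = []
    while len(negatives) < n_neg:
        p = rng.uniform(0, np.array(volume_shape) - 1)
        if np.min(np.linalg.norm(edge_points - p, axis=1)) > neg_min_dist:
            negatives.append(p)
    X = np.vstack([positives, np.array(negatives)])
    y = np.concatenate([np.ones(len(positives)), np.zeros(n_neg)])   # 1 = edge, 0 = background
    return X, y
```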
The matching module 460 can be used to match the image to be processed with the average model established by the model construction module 440, and to construct a three-dimensional mesh model corresponding to the image to be processed. The image to be processed can be obtained from the image reconstruction module 420 or the storage module 430. In some embodiments, the matching module 460 can match the average model onto the image to be processed by methods such as the Hough transform, obtaining a three-dimensional heart chamber mesh model roughly matched with the image to be processed. The matching module 460 can obtain the parameter information required by the Hough transform through the acquisition module 410, the storage module 430, user input, or other means. The matching module 460 can send the matched three-dimensional heart chamber mesh model to the storage module 430 for storage, or send it to the adjustment module 470 for further optimization.
The adjustment module 470 can be used to optimize the model so that it is closer to the real heart (the cardiac image data to be processed). The adjustment module 470 can obtain the roughly matched heart chamber mesh model from the matching module 460 or the storage module 430. In some embodiments, the adjustment module 470 can determine the optimal heart chamber edges according to the probability that data points within a certain range of the chamber edges on the matched heart model belong to the chamber edge. The adjustment module 470 can further perform an accurate adjustment of the three-dimensional heart chamber mesh model. The accurate adjustment may include a similarity transformation, piecewise affine transformations, and/or a fine deformation based on an energy function, etc. In some embodiments, the adjustment module 470 can convert the accurately adjusted three-dimensional heart chamber mesh model into an image format to obtain a heart chamber edge segmentation map (as shown in Fig. 26). The adjustment module 470 can send the accurately adjusted heart chamber model or the heart chamber segmentation map to the storage module 430 for storage, or send it to the display 280 for display.
It should be noted that the above description of the processing device 130 is provided only for convenience of description and does not limit the present application to the scope of the illustrated embodiments. It will be understood that, after understanding the working principle of the device, those skilled in the art may combine modules, or form subsystems connected with other modules, without departing from this principle, and may make various modifications and changes in form and detail to the implementation of the above device. For example, the model construction module 440 and/or the training module 450 may be removed or merged with the storage module 430. Such variations are within the protection scope of the present application.
Fig. 5 is an exemplary flowchart of operating the processing device according to some embodiments of the present application. In step 510, image data can be obtained. In some embodiments, step 510 can be performed by the acquisition module 410. The image data can be obtained from the imaging device 110, the database 120, or from outside the control and processing system 100. The image data may include raw image data collected by CT, positron emission tomography (PET), single photon emission computed tomography (SPECT), MRI (magnetic resonance imaging), ultrasound, and other medical imaging devices. In some embodiments, the image data may be raw image data of the heart or a local region of the heart. In some embodiments, step 510 may include preprocessing the acquired raw cardiac image data and sending the preprocessed raw image data to the image reconstruction module 420 or the storage module 430 in the processing device 130. The preprocessing may include distortion correction, denoising, smoothing, enhancement of the image, etc.
In step 520, a cardiac image can be reconstructed from the cardiac image data. This step can be performed by the image reconstruction module 420 in the processing device 130 based on an image reconstruction technique. The cardiac image data can be obtained through the acquisition module 410 or the storage module 430. The cardiac image may include a comprehensive digitized cardiogram, a digitized cardiac tomographic X-ray image, a cardiac phase-contrast image, a computed radiography (CR) cardiac image, a multi-modality cardiac image, etc. The cardiac image may be a two-dimensional image or a three-dimensional image. The format of the cardiac image may include JPEG, TIFF, GIF, FPX, etc. The image reconstruction technique may include solving simultaneous equations, Fourier transform reconstruction, direct back projection reconstruction, filtered back projection reconstruction, Fourier back projection reconstruction, convolution back projection reconstruction, iterative reconstruction, etc. In some embodiments, step 520 can preprocess the acquired cardiac image data and obtain multiple cardiac sectional views or projection views. In some embodiments, the acquired cardiac image data or the preprocessed cardiac image data may include multiple cardiac sectional views. The image reconstruction module 420 can reconstruct a cardiac image or model according to the information provided by a series of cardiac sectional views. The information provided by the cardiac sectional views may include the tissue density of the heart tissue, its absorption of radiation, and other information. The reconstructed cardiac image can be displayed by the display 280 or stored by the storage module 430. The reconstructed cardiac image can also be further processed by the model construction module 440 in the processing device 130.
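Purely as an illustration of one of the reconstruction techniques listed above (filtered back projection), a slice can be reconstructed from projection data with scikit-image; the library choice and the placeholder sinogram are assumptions, not part of the patent.

```python
import numpy as np
from skimage.transform import iradon

theta = np.linspace(0.0, 180.0, 180, endpoint=False)   # projection angles in degrees
sinogram = np.random.rand(256, 180)                     # placeholder projection data, one column per angle
slice_image = iradon(sinogram, theta=theta)             # filtered back projection (default ramp filter)
```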
In step 530, a three-dimensional average cardiac mesh model can be built. This step can be performed by the model construction module 440 in the processing device 130 according to multiple reference models. Step 530 can obtain the multiple reference models through the acquisition module 410, the storage module 430, or user input. In some embodiments, step 530 may include performing image registration on the multiple reference models. The image registration may include gray-level-based image registration, transform-domain-based image registration, feature-based image registration, etc., where the features may include feature points, feature regions, edge features, etc. In some embodiments, the multiple reference models may be heart chamber partition data or reference models whose chamber edges have been annotated by a user. The average model may include a model computed by the Point Distribution Model (PDM), Active Shape Model (ASM), Active Contour Model (also referred to as Snakes), Active Appearance Model (AAM), etc. In some embodiments, step 530 may include determining the relationships among the chambers on the constructed average model according to the chamber edge data of the multiple reference models, and establishing a two-dimensional association factor matrix. In some embodiments, the three-dimensional average cardiac mesh model or the average model containing association factor information can be sent directly by the processor 220 to the storage module 430 in the processing device 130 for storage, or sent to the matching module 460 for further processing.
In step 540, the cardiac image data can be matched with the three-dimensional average cardiac mesh model. Further, the matching may include matching the first edge points in the cardiac image data with the three-dimensional average cardiac mesh model. In some embodiments, step 540 can be performed by the matching module 460 using an image matching method. The image matching method may include a matching method based on the nearest-neighbor distance ratio (NNDR), a search algorithm for adjacent feature points, target detection based on the Hough transform, etc. In some embodiments, the average cardiac model established by the model construction module 440 can be matched, through a generalized Hough transform, to the first edges in the cardiac image data obtained by the image reconstruction module 420, yielding the matched cardiac model. In some embodiments, a weighted generalized Hough transform can be applied based on the probability that each point in the cardiac image data to be matched belongs to an edge. The probabilities can be computed by inputting each point of the cardiac image data to be matched into a classifier trained by the training module 450. In some embodiments, an edge probability map of the heart to be matched can be built from the probabilities of the points of the heart to be matched. The edge probability map may include a gray-level gradient map, a color gradient map (as shown in Fig. 24), etc. In some embodiments, the cardiac image can be preprocessed before the probability of each point of the cardiac image data being an edge is calculated. For example, positions that cannot possibly be heart edges can be excluded, so as to reduce the amount of computation of the classifier. For CT images, for instance, the CT values of muscle tissue are generally greater than -50, so positions with CT values less than -50 can be marked with a mask and the classifier does not need to evaluate the points at those positions. In some embodiments, the matching module 460 in the processing device 130 can send the matched cardiac model or three-dimensional cardiac mesh model to the storage module 430 for storage, or send it to the adjustment module 470 for further optimization.
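A minimal sketch of the masking step and the resulting edge probability map; the -50 threshold comes from the text, while the `predict_proba` interface is an assumed stand-in for the trained classifier.

```python
import numpy as np

def edge_probability_map(ct_volume, classifier, hu_threshold=-50.0):
    """Build an edge probability map, skipping voxels that cannot be cardiac tissue.

    ct_volume : 3D array of CT values
    classifier: object with predict_proba(points) -> 1D array of edge probabilities (assumed interface)
    """
    prob_map = np.zeros(ct_volume.shape, dtype=np.float32)
    mask = ct_volume > hu_threshold          # muscle tissue generally has CT values above -50
    coords = np.argwhere(mask)               # classify only the unmasked voxels
    prob_map[mask] = classifier.predict_proba(coords)
    return prob_map
```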
In step 550, the accurately adjusted heart chamber segmentation map can be obtained. This step can be performed by the adjustment module 470 in the processing device 130. In some embodiments, step 550 can determine edge target points according to the chamber edges of the matched three-dimensional cardiac mesh model. For example, in some embodiments, the edge target points can be determined according to the probabilities of second edge points within a certain range of the chamber edges of the matched three-dimensional cardiac mesh model. In some embodiments, the probabilities can be computed with a second classifier trained on second edge points. In some embodiments, the probabilities can be computed by invoking a first classifier trained on the multiple reference models or the average model. In some embodiments, step 550 can deform the three-dimensional cardiac mesh model based on the determined edge target points, so as to obtain a three-dimensional cardiac mesh model with further adjusted chamber edges. The deformation may include a similarity transformation, an affine transformation, and other slight image deformation methods. For example, in some embodiments, a similarity transformation, piecewise affine transformations, and/or a fine deformation based on an energy function can be applied successively based on the determined edge target points. In some embodiments, the adjustment module 470 in the processing device 130 can convert the adjusted three-dimensional cardiac mesh model into a heart chamber segmentation image through a mask (as shown in Fig. 26). Different chambers in the chamber segmentation image can be marked with different colors. In some embodiments, the adjustment module 470 in the processing device 130 can send the accurately adjusted heart chamber model or heart chamber segmentation map to the storage module 430 for storage, or send it to the display 280 for display.
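As one possible realization of the similarity-transformation step, the scale, rotation, and translation between the model's second-edge points and the determined target points could be estimated with the Umeyama method; this particular estimator is an assumption, since the text only states that a similarity transformation is applied based on the determined edge target points.

```python
import numpy as np

def fit_similarity_transform(model_points, target_points):
    """Estimate scale s, rotation R, translation t minimizing ||s*R*p + t - q||^2 (Umeyama).

    model_points : (N, 3) second-edge points of the mesh model
    target_points: (N, 3) corresponding edge target points found in the image
    """
    mu_p, mu_q = model_points.mean(axis=0), target_points.mean(axis=0)
    p_c, q_c = model_points - mu_p, target_points - mu_q
    cov = q_c.T @ p_c / len(model_points)
    U, S, Vt = np.linalg.svd(cov)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])    # keep a proper rotation
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / p_c.var(axis=0).sum()
    t = mu_q - s * R @ mu_p
    return s, R, t

# The fitted transform is then applied to all model points: adjusted = s * points @ R.T + t
```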
It should be noted that the above description of the chamber segmentation process performed by the processing device 130 is provided only for convenience of description and does not limit the present application to the scope of the illustrated embodiments. It will be understood that, after understanding the working principle of the device, those skilled in the art may adjust the order of the steps, or add or delete some steps, without departing from this principle. For example, step 530 of building the average model may be removed. As another example, the adjustment module 470 may apply only one or two of the above deformations to the mesh model, or apply other forms of slight deformation. Such variations are within the protection scope of the present application.
Fig. 6 is a schematic diagram of an exemplary model construction module according to some embodiments of the present application. The model construction module 440 may include an acquisition unit 610, a registration unit 620, a labeling unit 630, an average model generation unit 640, and an association factor generation unit 650. The connections between the units in the model construction module 440 may be wired, wireless, or a combination of both. Any unit may be local, remote, or a combination of both.
The acquisition unit 610 can be used to obtain multiple reference models. The acquisition unit 610 can obtain this information from the database 120, from storage devices outside the control and processing system 100, or from user input. Its function can be implemented by the processor 220 in Fig. 2. In some embodiments, the multiple reference models may include cardiac image data of one patient scanned at different times, different positions, and different angles. In some embodiments, the multiple sets of cardiac data may include cardiac image data of different patients scanned at different positions and different angles. In some embodiments, the acquisition unit 610 can be used to obtain modeling algorithms, parameters, and other information. The acquisition unit 610 can send the acquired multiple reference models and/or other information to the registration unit 620, the labeling unit 630, the average model generation unit 640, or the association factor generation unit 650.
The registration unit 620 can be used to adjust, through an image registration method, the multiple reference models acquired by the acquisition unit 610, so that the positions, scales, etc., of the multiple reference models are consistent. The image registration may include registration based on spatial dimensions, registration based on features, registration based on transformation properties, registration based on optimization algorithms, registration based on image modality, registration based on the subject, etc. In some embodiments, the registration unit 620 can register the multiple reference models into one common coordinate system. The registration unit 620 can send the registered reference models to the storage module 430 for storage, or send them to the labeling unit 630 and/or the average model generation unit 640 for further processing.
The labeling unit 630 can be used to label the multiple data points (also referred to as the point set) of the chamber edges of the multiple reference models. The cardiac images or models may be the multiple reference models registered by the registration unit 620, or the average model built by the average model generation unit 640. For example, the chamber edges may be labeled manually by a user on the multiple reference models after the registration unit 620 performs image registration. As another example, the chamber edges may be labeled automatically by the labeling unit 630 according to clearly distinct chamber edge features. In some embodiments, the labeling unit 630 can divide the whole cardiac image or model of each of the multiple reference models into six parts according to the chambers, namely the left ventricle, left atrium, right ventricle, right atrium, myocardium, and aorta. In some embodiments, the labeling unit 630 can also divide the whole cardiac image or model of the multiple reference models into a sharp class and a non-sharp class according to the degree of variation (also referred to as the gradient) at the chamber edges of the reference models. Specifically, the labeling unit 630 can label the edge points of a chamber that communicate with the outside, or that vary little with respect to the outside, as the non-sharp class, and label the edge points that connect with other internal chambers, or that vary more with respect to the outside, as the sharp class, as indicated by the two arrows in Fig. 17. For example, the labeling unit 630 can divide the whole cardiac image or model of the multiple reference models into ten categories: left ventricle edge, left atrium sharp edge, left atrium non-sharp edge, right ventricle sharp edge, right ventricle non-sharp edge, right atrium sharp edge, right atrium non-sharp edge, aorta edge, left myocardium sharp edge, and left myocardium non-sharp edge (as shown in Fig. 18). In some embodiments, the labeling unit 630 can register the multiple reference models into one common coordinate system and label the chamber edges on the multiple reference models by comparing the positions of the points on the multiple reference models with the points on the average model obtained by the average model generation unit 640. For example, the labeling unit 630 can take the category of the point on the average model that is closest to a corresponding point on a reference model as the category of that point on the reference model. The labeling unit 630 can send the multiple reference models labeled with the chamber edge point sets to the storage module 430 for storage, or send them to the training module 450, the average model generation unit 640, and/or the association factor generation unit 650 for further processing or computation.
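A rough sketch of gradient-based sharp/non-sharp labeling of annotated edge points; the Sobel operator and the threshold value are illustrative choices, not prescribed by the text.

```python
import numpy as np
from scipy.ndimage import sobel

def label_edge_sharpness(volume, edge_points, grad_threshold=120.0):
    """Label edge points as 'sharp' or 'non-sharp' from the local gradient magnitude."""
    grad_mag = np.sqrt(sum(sobel(volume, axis=a) ** 2 for a in range(3)))
    idx = np.round(edge_points).astype(int)                 # edge points in voxel coordinates
    values = grad_mag[idx[:, 0], idx[:, 1], idx[:, 2]]
    return np.where(values >= grad_threshold, "sharp", "non-sharp")
```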
Averaging model generation unit 640 may be used to build a three-dimensional cardiac average mesh model. In some embodiments, averaging model generation unit 640 may extract the chamber edges from the multiple labeled reference models or the average model, process the chamber edge model of each reference model or average model to obtain multiple reference mesh models, and calculate an average mesh model by an image model construction method. The image model construction method may include the Point Distribution Model (PDM), the Active Shape Model (ASM), the Active Contour Model (also referred to as Snakes), the Active Appearance Model (AAM), etc. In some embodiments, averaging model generation unit 640 may divide the whole-heart average model, after the chambers are labeled, into six submodels that are independent or combined with each other, for example, a left ventricle model, a left atrium model, a right ventricle model, a right atrium model, a left myocardium model, and an aorta model (as shown in Figure 20). In some embodiments, averaging model generation unit 640 may extract the multiple chamber edges, determine the distribution of control points on the chamber edges, and connect the control points to form a mesh. In some embodiments, averaging model generation unit 640 may obtain the average mesh model of the heart chambers, together with model parameters such as the corresponding eigenvalues and eigenvectors, from the mesh models by the ASM modeling method. In some embodiments, averaging model generation unit 640 may add the influence of the association factors on the control points in the average model calculation. For example, in the ASM calculation, averaging model generation unit 640 may calculate the adjustment result of a control point using a weighted average (i.e., Σ(Fi·Wi)), where Fi is the deformation parameter of a chamber and Wi is the influence coefficient or weight of that chamber on the control point. Through the weighted average calculation based on the association factors, the adjustment of a control point on the model can be influenced by the results of multiple chambers, thereby coupling the multiple chambers. Averaging model generation unit 640 may send the obtained three-dimensional cardiac average mesh model to memory module 430 for storage, or to association factor generation unit 650 for calculation. Averaging model generation unit 640 may also send the obtained three-dimensional cardiac average mesh model to training module 450 and/or matching module 460 for further processing.
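The weighted average Σ(Fi·Wi) over chamber deformation parameters can be illustrated with a short sketch. The array shapes and function name below are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def weighted_control_point_adjustment(deformations, weights):
    """Combine the per-chamber deformation proposals for every control point.

    deformations: (n_chambers, n_points, 3); F_i is the displacement that
        chamber i's deformation would apply to each control point.
    weights: (n_chambers, n_points); W_i is the association-factor weight of
        chamber i for each control point (values in [0, 1]).
    Returns the weighted adjustment sum_i(F_i * W_i) for every control point.
    """
    return np.sum(deformations * weights[:, :, None], axis=0)

# Example: 6 chambers, 4 control points.
rng = np.random.default_rng(0)
F = rng.normal(size=(6, 4, 3))   # per-chamber deformation parameters
W = rng.uniform(size=(6, 4))     # association-factor weights
print(weighted_control_point_adjustment(F, W).shape)   # (4, 3)
```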
Association factor generation unit 650 may be used to establish the relation between each chamber and the control points on the average mesh model. In some embodiments, the relation may be a two-dimensional association factor matrix whose rows and columns correspond to the chambers and the control points, and the values of the matrix may represent the influence coefficient or weight of each chamber on each control point. In some embodiments, the values of the matrix may be any real number between 0 and 1.
In some embodiments, association factor generation unit 650 may establish the association factor matrix according to the chamber to which a control point on the mesh model belongs and the positional relation between the control point and the other chambers. In some embodiments, association factor generation unit 650 may calculate the influence range of the association factors, or the influence coefficients, according to the distance between a control point and the other chambers. For example, association factor generation unit 650 may limit the calculation of the association factor influence coefficients by the maximum distance between a control point and the other chambers. In some embodiments, association factor generation unit 650 may adjust the influence ranges and influence coefficients between different chambers according to how closely the chambers are connected. As shown in Figure 21, in the mesh control point model, a light control point is influenced only by the chamber it belongs to, while the dark points at the chamber junctions indicate control points influenced by multiple connected chambers; the darker the color, the greater the influence from the other chambers. Association factor generation unit 650 may send the obtained two-dimensional association factor matrix to memory module 430 for storage, or to averaging model generation unit 640 and/or adjusting module 470 for weighted calculation.
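A minimal sketch of a distance-based association factor matrix of the kind described above follows; the linear falloff and the maximum-distance cutoff are illustrative assumptions, not the patent's exact rule.

```python
import numpy as np
from scipy.spatial import cKDTree

def association_factor_matrix(control_points, chamber_surfaces, max_dist=20.0):
    """Build an (n_chambers, n_control_points) association-factor matrix in [0, 1].

    control_points: (n, 3) control point coordinates.
    chamber_surfaces: list of (m_k, 3) point clouds, one per chamber.
    max_dist: assumed cutoff; beyond this distance a chamber has no influence.
    """
    W = np.zeros((len(chamber_surfaces), len(control_points)))
    for k, surface in enumerate(chamber_surfaces):
        d, _ = cKDTree(surface).query(control_points)   # distance to the nearest point of chamber k
        W[k] = np.clip(1.0 - d / max_dist, 0.0, 1.0)    # linear falloff: 1 on the chamber, 0 beyond max_dist
    return W
```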
It should be noted that the above description of model construction module 440 is provided for convenience of description only and does not limit the present application to the scope of the illustrated embodiments. It will be appreciated that, after understanding the working principle of the module, those skilled in the art may combine the units of the module, or connect a constituent subsystem with other units, and make various modifications and changes to the form and details of the module without departing from this principle. For example, registration unit 620 and/or mark unit 630 may be removed, or merged with acquiring unit 610 or memory module 430. As another example, the multiple reference models or the average model may include cardiac data or models whose edges have been labeled by a user. As another example, the multiple reference models or the average model may include cardiac data on which rough or fine chamber segmentation has been performed. Such variations are all within the protection scope of the present application.
Fig. 7 is an exemplary flowchart of building an average model according to some embodiments of the present application. In step 710, multiple heart reference models may be obtained. The multiple heart reference models may be obtained from database 120, from user input, or from a storage device outside control and processing system 100. In some embodiments, the multiple heart reference models may include cardiac image data of one patient scanned at different times, positions, and angles. In some embodiments, the multiple heart reference models may include cardiac image data of different patients scanned at different positions and angles. In some embodiments, the multiple heart reference models may include cardiac data or models whose edges have been labeled by experts. In some embodiments, the multiple heart reference models may include cardiac data on which rough or fine chamber segmentation has been performed.
In step 720, image registration may be performed on the acquired reference models. This step may be performed by registration unit 620 in model construction module 440. In some embodiments, any two reference models may be transformed into the same coordinate system by translation, rotation, scaling, etc., so that the points corresponding to the same spatial position in the two reference models correspond one to one, thereby achieving information fusion. In some embodiments, the image registration may include registration based on spatial dimensionality, registration based on features, registration based on transformation properties, registration based on optimization algorithms, registration based on image modalities, registration based on subjects, etc. The registration based on spatial dimensionality may include 2D/2D registration, 2D/3D registration, or 3D/3D registration. The registration based on features may include registration based on feature points (e.g., discontinuity points, corner points, line intersections), registration based on surface regions (e.g., curves, surfaces), registration based on pixel values, registration based on surfaces, etc. The registration based on transformation properties may include registration based on rigid transformations, affine transformations, projective transformations, and/or curve transformations. The registration based on optimization algorithms may include registration based on gradient descent, the Newton method, the Powell method, genetic algorithms, etc. The registration based on image modalities may include single-modality and/or multi-modality registration. The registration based on subjects may include registration of images from the same patient, registration of images from different patients, and/or registration between patient data and an atlas.
In step 730, chamber edges may be labeled on the multiple registered reference models. Step 730 may be performed by mark unit 630 in model construction module 440. In some embodiments, chamber edge points may be labeled manually by a user on the multiple heart reference models, and the edge point sets formed on each reference model may divide the heart into six parts, namely the left ventricle, the left atrium, the right ventricle, the right atrium, the myocardium, and the aorta. In some embodiments, the heart may be divided into 10 classes according to the degree of variation of the chamber edges with respect to the outside and the inside: left ventricle edge, left atrium sharp edge, left atrium non-sharp edge, right ventricle sharp edge, right ventricle non-sharp edge, right atrium sharp edge, right atrium non-sharp edge, aorta edge, left myocardium sharp edge, and left myocardium non-sharp edge (as shown in Figure 18). A sharp edge may refer to a chamber edge that communicates with the outside of the heart or whose variation is obvious. A non-sharp edge may refer to a chamber edge that is connected to the interior or to other chambers, or whose variation is not obvious.
In step 740, the control points on the multiple reference models may be determined. This step may be performed by averaging model generation unit 640 in model construction module 440 based on the multiple reference models that have undergone image registration and chamber edge labeling. In some embodiments, the axis of each chamber may be determined according to the image registration results and the chamber edge labeling information of the multiple reference models. The axis may be the direction of a line connecting any two designated points on the chamber. For example, the determined axis may be the long axis formed by the line connecting the two points on the chamber that are farthest apart. In some embodiments, the labeled chamber edges may be extracted from the reference models; each chamber may be sliced along cross sections perpendicular to its determined axis, and dense point sets may be formed on the slice edges according to the surface features of the cross sections, forming the point model of the average model (as shown in Figure 19). In some embodiments, the control points on each chamber may be determined from the point model. The control points may be a subset of the points of the point model. For example, the larger the subset, the larger the mesh model, the greater the amount of calculation during cardiac segmentation, and the better the segmentation result; the smaller the selected subset, the smaller the mesh model, the smaller the amount of calculation, and the faster the segmentation. In some embodiments, the number of control points on a chamber may vary. For example, in the rough segmentation stage, the number of control points may be relatively small, so as to locate the chamber edges quickly; in the fine segmentation stage, the number of control points may be relatively large, so as to achieve fine segmentation of the chamber edges.
In step 750, the heart average mesh model may be built from the control points. In some embodiments, step 750 may connect the control points into polygonal meshes according to the relations between them. For example, in some embodiments, a triangular mesh may be formed by connecting adjacent control points on adjacent slices. In some embodiments, the average mesh model may be obtained by an image deformation method. The image deformation method may include the Point Distribution Model (PDM), the Active Shape Model (ASM), the Active Contour Model (also referred to as Snakes), the Active Appearance Model (AAM), etc. For example, in some embodiments, the average mesh model of the multiple heart reference models may be obtained by the ASM calculation method from the triangular mesh built from the control points (as shown in Figure 20). In some embodiments, step 750 may perform a weighted average model calculation on the control point mesh model based on the two-dimensional association factor matrix. For example, in the ASM calculation, averaging model generation unit 640 may calculate the adjustment result of a control point using a weighted average (i.e., Σ(Fi·Wi)), where Fi is the deformation parameter of a chamber and Wi is the influence coefficient or weight of that chamber on the control point.
It should be noted that the above description of the process by which model construction module 440 builds the average model is provided for convenience of description only and does not limit the present application to the scope of the illustrated embodiments. It will be appreciated that, after understanding the working principle of the module, those skilled in the art may adjust the order of the steps arbitrarily, or add or delete steps, without departing from this principle. For example, step 710 and step 720 may be merged. As another example, steps 730 to 750 may be repeated in a loop. Such variations are all within the protection scope of the present application.
Fig. 8 is a schematic diagram of an exemplary training module according to some embodiments of the present application. Training module 450 may include a classification unit 810 and a classifier generation unit 820. The connections between the units in training module 450 may be wired, wireless, or a combination of both. Any unit may be local, remote, or a combination of both.
Classification unit 810 may be used to divide the possible chamber edge points on the multiple reference models or the average model into different chamber classes. This function may be implemented by processor 220. In some embodiments, classification unit 810 may classify the possible edge points on the reference models or the average model according to the chamber classes defined by mark unit 630 (as shown in Figure 22). For example, classification unit 810 may divide the possible edge points near the chambers of the reference models or the average model into 10 chamber classes, namely: left ventricle edge, left atrium sharp edge, left atrium non-sharp edge, right ventricle sharp edge, right ventricle non-sharp edge, right atrium sharp edge, right atrium non-sharp edge, aorta edge, left myocardium sharp edge, and left myocardium non-sharp edge. The classification may be implemented by a variety of classification methods, including but not limited to decision tree algorithms, Bayes classification algorithms, artificial neural network (ANN) classification algorithms, k-nearest neighbors (kNN), support vector machines (SVM), classification algorithms based on association rules, ensemble learning classification algorithms, etc. In some embodiments, classification unit 810 may divide the points near a chamber edge into positive samples and negative samples according to the distance between the points and the chamber edge. For example, the positive samples may be the data points within a certain threshold range of the chamber edge, and the negative samples may be the data points farther from the edge and at other random positions in space. In some embodiments, classification unit 810 may send the classification results of the possible edge points or data on the multiple reference models or the average model to memory module 430 for storage, or to classifier generation unit 820 for further processing.
Classifier generation unit 820 may be used to obtain trained classifiers. In some embodiments, classifier generation unit 820 may train classifiers on the edge points of the multiple reference models or the average model according to the edge point classes defined by classification unit 810, and obtain trained classifiers (as shown in Figure 23). In some embodiments, classifier generation unit 820 may train the classifiers using PBT. In some embodiments, after receiving the coordinates of any point, a trained classifier may output the probability corresponding to that point, i.e., the probability that the point lies on a chamber edge. In some embodiments, classifier generation unit 820 may send the trained classifiers to memory module 430 for storage, or to matching module 460 and/or adjusting module 470 for calculation.
It should be noted that the above description of training module 450 is provided for convenience of description only and does not limit the present application to the scope of the illustrated embodiments. It will be appreciated that, after understanding the working principle of the module, those skilled in the art may combine the units of the module, or connect a constituent subsystem with other units, and make various modifications and changes to the form and details of the module without departing from this principle. For example, classification unit 810 may divide the multiple reference models or the average model into chambers, and the chamber classes after this division may be finer than the chamber classes defined during labeling. Such variations are all within the protection scope of the present application.
Fig. 9 is an exemplary flowchart of training a classifier according to some embodiments of the present application. In step 910, classification unit 810 in training module 450 may obtain sample points in the multiple reference models or the average model. In some embodiments, training module 450 may extract the chamber edges based on the chamber segmentation results on the labeled reference models or average model (as shown in Figure 22), take the points within a certain range of each chamber edge as positive samples, and take the points farther from the chamber edge and at other random positions in space as negative samples. For example, the range around the chamber edge may be 0.1 cm, 0.5 cm, 1 cm, 2 cm, etc.
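As an illustration of the positive/negative sampling in step 910, the sketch below jitters labeled edge points within a band (e.g., 1 cm) and rejects negatives falling inside that band; the sampling scheme and parameter values are assumptions consistent with the examples above, not the patent's exact procedure.

```python
import numpy as np
from scipy.spatial import cKDTree

def sample_training_points(edge_points, volume_shape, pos_radius=1.0, n_negative=5000, seed=0):
    """Draw positive samples near the labeled chamber edge and negative samples elsewhere.

    edge_points: (n, 3) labeled chamber-edge coordinates (assumed already in cm).
    volume_shape: spatial extent used to draw random negative positions.
    pos_radius: points within this distance of the edge count as positives.
    """
    rng = np.random.default_rng(seed)
    # Positives: jitter every edge point inside a ball of radius pos_radius.
    directions = rng.normal(size=edge_points.shape)
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    radii = pos_radius * rng.uniform(size=(len(edge_points), 1))
    positives = edge_points + directions * radii
    # Negatives: random positions in the volume, rejected if too close to the edge.
    negatives = rng.uniform(high=np.asarray(volume_shape, dtype=float), size=(n_negative, 3))
    dist, _ = cKDTree(edge_points).query(negatives)
    negatives = negatives[dist > pos_radius]
    return positives, negatives
```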
In step 920, classification unit 810 in training module 450 may classify the obtained positive and negative sample points. In some embodiments, training module 450 may add the positive and negative sample points to different chamber classes according to a classification method. In some embodiments, the positive samples may be points within a certain range of the average model edge, and the negative samples may be points outside that range. In some embodiments, the range around the average model edge may be set to zero, in which case the positive samples are the edge points of the average model themselves. In some embodiments, the positive and negative samples may be classified based on the degree of sharpness and the positions of the sample points, the position of a sample point being the chamber to which the sample belongs. For example, training module 450 may divide the positive and negative sample points into 10 chamber classes according to the labeled chamber classes: left ventricle edge, left atrium sharp edge, left atrium non-sharp edge, right ventricle sharp edge, right ventricle non-sharp edge, right atrium sharp edge, right atrium non-sharp edge, aorta edge, left myocardium sharp edge, and left myocardium non-sharp edge. The classification method may include decision tree classification algorithms, Bayes classification algorithms, artificial neural network (ANN) classification algorithms, k-nearest neighbors (kNN), support vector machines (SVM), classification algorithms based on association rules, ensemble learning classification algorithms, etc. The decision tree classification algorithms may include ID3, C4.5, C5.0, CART, PUBLIC, SLIQ, SPRINT, etc. The Bayes classification algorithms may include the naive Bayes algorithm, the TAN algorithm (tree-augmented naive Bayes network), etc. The artificial neural network classification algorithms may include BP networks, radial basis function (RBF) networks, Hopfield networks, stochastic neural networks (e.g., Boltzmann machines), competitive neural networks (e.g., Hamming networks, self-organizing maps), etc. The classification algorithms based on association rules may include CBA, ADT, CMAR, etc. The ensemble learning classification algorithms may include Bagging, Boosting, AdaBoost, PBT, etc.
In step 930, training module 450 may obtain the classifiers trained on these classes. In some embodiments, classifier generation unit 820 in training module 450 may train on the above sample point classes using the PBT algorithm and obtain one or more trained classifiers (as shown in Figure 23). The PBT may include a two-level PBT algorithm or a multi-level PBT algorithm. In some embodiments, the classifiers may include one or more classifiers trained with the points within a certain range of the edges of the multiple reference models or the average model as positive samples (also referred to as the "first classifier"). In some embodiments, the classifiers may include one or more classifiers trained with the points within a certain range of the edges of the image to be processed as positive samples (also referred to as the "second classifier").
It should be noted that the above description of the classifier training process of training module 450 is provided for convenience of description only and does not limit the present application to the scope of the illustrated embodiments. It will be appreciated that, after understanding the working principle of the module, those skilled in the art may adjust the order of the steps arbitrarily, or add or delete steps, without departing from this principle. For example, positive and negative samples may not be distinguished in step 910 and step 920, and all points near the chamber edges may be classified directly. As another example, the maximum distance of the positive and negative sample points from the chamber edge may be 2 cm. Such variations are all within the protection scope of the present application.
Figure 10 is a schematic diagram of the structure of an exemplary model matching module according to some embodiments of the present application. As shown in Figure 10, matching module 460 may include an acquiring unit 1010, an image point extraction unit 1020, a Hough transform unit 1030, and a model matching unit 1040. The connections between the units in matching module 460 may be wired, wireless, or a combination of both. Any unit may be local, remote, or a combination of both.
Acquiring unit 1010 may obtain an image. The obtained image is the image to be processed. In some embodiments, the image may be an image reconstructed from image data. The reconstructed image may be obtained from other modules of processing device 130. For example, the reconstructed image may be obtained by acquiring unit 1010 from image reconstruction module 420. As another example, the reconstructed image may be an image stored in memory module 430 after reconstruction by image reconstruction module 420. In some embodiments, the image may be input into the system via an external device, for example, through communication port 250. In some embodiments, acquiring unit 1010 may obtain the average model, which may be the three-dimensional cardiac average mesh model generated by averaging model generation unit 640. In some embodiments, acquiring unit 1010 may obtain the first classifier trained by training module 450.
In some embodiments, acquiring unit 1010 may obtain the parameters needed when matching module 460 performs image matching. For example, acquiring unit 1010 may obtain the parameters of a generalized Hough transform. In some embodiments, the parameters of the generalized Hough transform may be obtained based on the three-dimensional average mesh model and the control points of its chamber edges. For example, by determining the centroid of the average model edge and calculating, for all control points on the average model edge, the offset relative to the centroid and the gradient direction relative to the centroid, the offset vector (hereinafter referred to as the gradient vector) of the control points corresponding to each gradient direction can be obtained. In some embodiments, the average model may be placed in an x-y-z coordinate system, and the coordinates of each gradient vector in that coordinate system may be determined. In some embodiments, the coordinates of each gradient vector may be converted to coordinates in a polar coordinate system. Specifically, the angle between the projection of the gradient vector on the x-y plane and the x axis may be taken as the first angle θ, with a value range of −180 degrees to 180 degrees, and the angle between the gradient vector and the x-y plane may be taken as the second angle φ, with a value range of −90 degrees to 90 degrees. In some embodiments, the two angles θ and φ describing the gradient vector may be discretized to obtain a table of the following form (also referred to as an R-table). In some embodiments, the offsets in the R-table may be scaled or rotated by different angles to detect shapes of different sizes or orientations.
Gradient angles (φ, θ)        Offsets of the corresponding points
(0, 90)                       (x0, y0, z0), (x3, y3, z3), …
(0, 80)                       (x2, y2, z2), (x5, y5, z5), …
(10, 90)                      (x4, y4, z4), (x6, y6, z6), …
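A minimal sketch of constructing such an R-table from the edge control points and the model centroid is given below. Indexing the table by the angles of the offset vector itself, and the 10-degree bins, are simplifying assumptions for illustration.

```python
import numpy as np
from collections import defaultdict

def build_r_table(edge_control_points, centroid, bin_deg=10):
    """Map discretized angles (phi, theta) to the offsets from edge points to the centroid.

    edge_control_points: (n, 3) control points on the average model edge.
    centroid: (3,) centroid of the average model edge.
    bin_deg: angular bin width in degrees (assumed 10-degree bins, as in the table above).
    """
    r_table = defaultdict(list)
    for p in np.asarray(edge_control_points, dtype=float):
        offset = np.asarray(centroid, dtype=float) - p            # offset vector toward the centroid
        theta = np.degrees(np.arctan2(offset[1], offset[0]))      # angle of the x-y projection, [-180, 180]
        phi = np.degrees(np.arcsin(offset[2] / (np.linalg.norm(offset) + 1e-12)))  # angle to the x-y plane, [-90, 90]
        key = (bin_deg * round(phi / bin_deg), bin_deg * round(theta / bin_deg))
        r_table[key].append(tuple(offset))
    return r_table
```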
Image point extraction unit 1020 may obtain the edge probability map of the image to be processed. Specifically, in some embodiments, image point extraction unit 1020 may input the coordinates of the points on the image into the classifier obtained by acquiring unit 1010, calculate for each point the probability that it lies on a chamber edge, and obtain the edge probability map of the image from the probability distribution of the points. In some embodiments, the edge probability map may include a gray gradient map, a color gradient map (as shown in Figure 24), etc. In some embodiments, image point extraction unit 1020 may take the points of the edge probability map whose probability exceeds a certain threshold as first edge points. The threshold may be any real number between 0 and 1, for example, 0.3, 0.5, etc.
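The edge probability map and the thresholding into first edge points can be sketched as follows, assuming a classifier object with a scikit-learn-style predict_proba method and a feature-extraction helper; both interfaces are assumptions for illustration, not the patent's PBT implementation.

```python
import numpy as np

def edge_probability_map(volume, classifier, feature_fn, threshold=0.3):
    """Score every voxel with its probability of lying on a chamber edge.

    volume: 3-D image array.
    classifier: assumed to expose predict_proba(features) -> (n, 2), with the
        edge probability in column 1.
    feature_fn: assumed helper mapping (volume, coords) -> (n, d) feature matrix.
    Returns the probability map and the boolean mask of first edge points.
    """
    coords = np.indices(volume.shape).reshape(3, -1).T        # all voxel coordinates
    probabilities = classifier.predict_proba(feature_fn(volume, coords))[:, 1]
    prob_map = probabilities.reshape(volume.shape)
    return prob_map, prob_map > threshold
```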
The model matching unit may match the average model onto the image to be processed. Specifically, in some embodiments, the model matching unit may match the average model onto the edge probability map of the image by a weighted generalized Hough transform. The weighted generalized Hough transform may include obtaining all possible edge reference points of the image from the first edge points and the R-table, obtaining the accumulated probability values of all edge reference points by weighted accumulation, and taking the edge reference point with the maximum accumulated probability as the centroid of the image. The transformation parameters from the model centroid to the image centroid are taken as the transformation parameters of the model. The edge reference points may be obtained by coordinate-transforming the first edge points of the image according to the parameters in the R-table. The weighted accumulation may be the process of accumulating the probabilities of the first edge points located at the same edge reference point (i.e., the first edge points that fall on the same edge reference point after being shifted according to the parameters in the R-table). With the image centroid obtained, the model centroid may be transformed, according to the transformation parameters, to a position coinciding with the image centroid. The transformation parameters may include the rotation angle, the scaling ratio, etc. In some embodiments, the model matching unit may rotate, scale, etc. the points on the model according to the determined transformation parameters, so as to obtain a model matched to the image to be processed (as shown in Figure 25).
It should be noted that the above description of matching module 460 is provided for convenience of description only and does not limit the present application to the scope of the illustrated embodiments. It will be appreciated that, after understanding the working principle of the module, those skilled in the art may combine the units of the module, or connect a constituent subsystem with other units, and make various modifications and changes to the form and details of the module without departing from this principle. For example, image point extraction unit 1020 may be removed, and the edge probability map of the image to be processed may be obtained directly from training module 450. Such variations are all within the protection scope of the present application.
Figure 11 is an exemplary flowchart of matching the average model and the reconstructed image according to some embodiments of the present application. In step 1110, the average model, the image to be processed, and the trained second classifier may be obtained. In some embodiments, the average model may be the three-dimensional cardiac average mesh model obtained by averaging model generation unit 640 from the multiple reference models by an image model construction method. The image model construction method may include the Point Distribution Model (PDM), the Active Shape Model (ASM), the Active Contour Model (also referred to as Snakes), the Active Appearance Model (AAM), etc. Step 1110 may be implemented by acquiring unit 1010. In some embodiments, the image to be processed obtained by acquiring unit 1010 may be an image reconstructed by image reconstruction module 420. In some embodiments, step 1110 may obtain the R-table based on the average model.
In step 1120, the parameters of the generalized Hough transform may be determined. Specifically, in some embodiments, step 1120 may obtain the first edge points of the image to be processed based on its edge probability map. The first edge points may be the points on the edge probability map whose probability exceeds a certain threshold; for example, the threshold may be 0.3. In some embodiments, the edge probability map may be obtained by inputting the coordinates of the points on the image into the classifier obtained by acquiring unit 1010, calculating for each point the probability that it lies on a chamber edge, and using the probability distribution of the points. In some embodiments, the angles θ and φ of the gradient direction of each first edge point on the image may be calculated, the offsets of the first edge point may be determined from the R-table, and the differences between the coordinates of the first edge point and all the corresponding offsets may be taken as the coordinates of the possible edge reference points. Further, all edge reference points may be weighted and accumulated according to the number of votes they receive and the probability values of the corresponding first edge points; the weighted accumulation sums the probabilities of the first edge points corresponding to the same edge reference point. In some embodiments, the parameters in the R-table corresponding to the edge reference point with the maximum accumulated probability may be taken as the transformation parameters of the image to be processed. The transformation parameters may include the rotation angle, the scaling ratio, etc. The weighted accumulation may be expressed as:
A(j) = Σi pi·σij    (1)

where A(j) is the accumulated probability value of the j-th possible edge reference point, i is the index of the first edge points, j is the index of the possible edge reference points being voted on in the voting image, pi is the probability value of each first edge point, and σij is a 0-1 binary function, i.e., its value is 1 when the i-th first edge point contributes a vote to the j-th possible edge reference point, and 0 otherwise.
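Equation (1) amounts to a probability-weighted vote over candidate centroid positions. A sketch under the data layout assumed in the earlier R-table sketch follows; note that the reference point is formed here by adding the stored offset, matching that sketch rather than the sign convention in the text, which is an assumption.

```python
import numpy as np
from collections import defaultdict

def weighted_hough_vote(edge_points, edge_probs, edge_angles, r_table, bin_size=1.0):
    """Accumulate A(j) = sum_i p_i * sigma_ij over candidate centroids and return the winner.

    edge_points: (n, 3) coordinates of the first edge points.
    edge_probs:  (n,) probability values p_i of the first edge points.
    edge_angles: (n, 2) discretized (phi, theta) angles matching the R-table keys.
    r_table: dict mapping (phi, theta) -> list of offset vectors (see the sketch above).
    bin_size: size of the accumulator cells.
    """
    accumulator = defaultdict(float)
    for point, prob, key in zip(edge_points, edge_probs, map(tuple, edge_angles)):
        for offset in r_table.get(key, []):
            reference = tuple(np.round((point + np.asarray(offset)) / bin_size).astype(int))
            accumulator[reference] += prob                     # weighted accumulation of the vote
    best = max(accumulator, key=accumulator.get)
    return np.asarray(best) * bin_size, accumulator[best]
```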
In step 1130, the model corresponding to the image to be processed may be obtained. Specifically, based on the determined weighted generalized Hough transform parameters, the first edge points of the image may be transformed. For example, according to the angle and scaling ratio in the R-table corresponding to the winning edge reference point, the coordinates of the first edge points on the image may be transformed, and the corresponding information on the average model may be mapped onto the image, obtaining the image to be processed in correspondence with the average mesh model.
Figure 12 is a schematic diagram of the structure of an exemplary adjusting module according to some embodiments of the present application. As shown in Figure 12, adjusting module 470 may include an acquiring unit 1210, a target point determining unit 1220, and a model transformation unit 1230. The connections between the units in adjusting module 470 may be wired, wireless, or a combination of both. Any unit may be local, remote, or a combination of both.
Acquiring unit 1210 may obtain the model and the trained second classifier. Specifically, acquiring unit 1210 may obtain the coordinate data of the second edge points on the model. In some embodiments, the second edge points of the model may be the control points on the model. In some embodiments, acquiring unit 1210 may obtain the second classifier trained by training module 450. The classifier may be 10 classifiers obtained by PBT classification training based on the 10 chamber classes divided by chamber and edge sharpness, for example, left ventricle edge, left atrium sharp edge, left atrium non-sharp edge, right ventricle sharp edge, right ventricle non-sharp edge, right atrium sharp edge, right atrium non-sharp edge, aorta edge, left myocardium sharp edge, and left myocardium non-sharp edge. Because the gray-level change inside and outside some chamber edges is not obvious and their sharpness is relatively low, such edges are not further divided according to sharpness. In some embodiments, acquiring unit 1210 may obtain the model after processing by model transformation unit 1230.
Target point determining unit 1220 may determine the target points corresponding to the second edge points on the model. Taking one second edge point on the model as an example, target point determining unit 1220 may determine multiple candidate points around that second edge point. In some embodiments, target point determining unit 1220 may input the multiple candidate points determined around the second edge point into the classifier obtained by acquiring unit 1210, determine the probabilities that the second edge point and its surrounding candidate points correspond to the image edge, and determine the target point of the second edge point according to these probabilities. In some embodiments, target point determining unit 1220 may determine the corresponding target points of all second edge points on the model.
Model transformation unit 1230 may adjust the model. In some embodiments, model transformation unit 1230 may adjust the positions of the model edge points based on the target points determined by target point determining unit 1220. The adjustment may include a similarity transformation, piecewise affine transformations, and/or a fine deformation based on an energy function, etc. In some embodiments, model transformation unit 1230 may adjust the model multiple times, and the target points need to be redetermined for each adjustment. Specifically, in some embodiments, model transformation unit 1230 may judge whether a preset condition is met after a model adjustment, for example, whether the number of model adjustments reaches a certain threshold. If the number of model adjustments reaches the threshold, the accurately matched model is output; if the number of model adjustments is less than the preset threshold, a signal is sent to target point determining unit 1220 to determine the target points again, and model transformation unit 1230 then transforms the model edge points again. In some embodiments, model transformation unit 1230 may obtain the accurately adjusted heart chamber model, which may be very close to the true heart.
It should be noted that the above description of adjusting module 470 is provided for convenience of description only and does not limit the present application to the scope of the illustrated embodiments. It will be appreciated that, after understanding the working principle of the module, those skilled in the art may combine the units of the module, or connect a constituent subsystem with other units, and make various modifications and changes to the form and details of the module without departing from this principle. For example, model transformation unit 1230 may use a preset number of iterations instead of determining the number of iterations of adjusting module 470 by threshold judgment. Such variations are all within the protection scope of the present application.
Figure 13 is an exemplary flowchart of adjusting the model according to some embodiments of the present application. In step 1310, the second edge points on the model and the trained classifier may be obtained. In some embodiments, the classifiers obtained by acquiring unit 1210 and acquiring unit 1010 are not of the same type. The classifier obtained by acquiring unit 1010 may be trained by training module 450 with the points within a certain range of the average mesh model edge as positive samples, while the classifier obtained by acquiring unit 1210 may be trained with the points within a certain range of the edge of the image to be processed as positive samples. In some embodiments, the classifier obtained by acquiring unit 1010 may be the first classifier, and the classifier obtained by acquiring unit 1210 may be the second classifier.
In step 1320, the target point of a second edge point on the model may be determined based on the second classifier. In some embodiments, step 1320 may input the candidate points within a certain range of a model second edge point into the second classifier and obtain the probability that each candidate point belongs to the image edge. In some embodiments, the target point of the model second edge point may be determined by target point determining unit 1220 based on the determined probabilities.
In step 1330, the second edge points on the model may be transformed toward the target points based on the determined target points. In some embodiments, step 1330 may transform the model second edge points using a variety of transformation modes. For example, model transformation unit 1230 may correct the model second edge points using similarity transformations and affine transformations.
In step 1340, it may be judged whether the adjustment result meets a preset condition. In some embodiments, the preset condition may be whether the number of adjustments reaches a certain threshold. In some embodiments, the threshold is adjustable. When the number of adjustments reaches the threshold, the process proceeds to step 1350 and the accurately matched model is output; when the number of adjustments is less than the threshold, the process returns to step 1320, where target point determining unit 1220 may determine the target points corresponding to the new model edge points based on those new edge points.
Figure 14 is an exemplary flowchart of determining a target point according to some embodiments of the present application. Flow 1400 may be implemented by target point determining unit 1220. Figure 14 shows the process of determining the target point corresponding to one point on the average model edge; those skilled in the art will understand that this method can be used to obtain multiple target points corresponding to multiple edge points. In some embodiments, flow 1400 may correspond to step 1320.
In step 1410, the normal of an average model edge point may be determined. In some embodiments, the direction of the normal points from the interior of the average model to the exterior. Specific methods of obtaining the normal may refer to, for example, flow 1500 and its description.
In step 1420, the step size and the search range along the normal direction of the average model edge point may be obtained. In some embodiments, the step size and the search range may be preset values. In some embodiments, the step size and the search range may be input by a user, for example, through an external device via communication port 250 into processing device 130. In some embodiments, the search range is a line segment starting from the model edge point, along at least one of the two directions of the line on which the normal lies (toward the outside or the inside of the model).
In step 1430, one or more candidate points may be determined based on the step size and the search range. For example, if the search range is 10 centimeters and the step size is set to 1 centimeter, 10 points may be determined in each of the two directions along the line on which the normal lies, giving 21 candidate points in total (including the edge point itself). In some embodiments, the step size and the number of steps may be determined instead, and the candidate points may be determined from them. For example, if the step size is set to 0.5 centimeters and the number of steps to 3, 3 points may be determined in each of the two directions along the normal, the farthest candidate point being 1.5 cm from the edge point, giving 7 candidate points in total.
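The candidate generation of steps 1420 and 1430 can be sketched as follows; searching symmetrically in both directions along the normal matches the 21-point example above.

```python
import numpy as np

def candidate_points(edge_point, normal, step=1.0, search_range=10.0):
    """Return candidates spaced `step` apart along the normal, within `search_range`
    on both sides of the edge point; the edge point itself is included."""
    direction = np.asarray(normal, dtype=float)
    direction /= np.linalg.norm(direction)
    n_steps = int(search_range // step)
    offsets = np.arange(-n_steps, n_steps + 1) * step        # e.g. -10 cm ... +10 cm
    return np.asarray(edge_point, dtype=float) + offsets[:, None] * direction

# With a 10 cm range and a 1 cm step this yields the 21 candidates of the example above.
print(candidate_points([0.0, 0.0, 0.0], [0.0, 0.0, 1.0]).shape)   # (21, 3)
```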
In step 1440, the probabilities that the one or more candidate points correspond to a certain range of the image edge may be determined. In some embodiments, the second classifier is trained with the points within a certain range of the image edge as positive samples. The range may be a preset value set by the machine or by a user; for example, the preset value may be 1 centimeter.
In step 1450, one of the one or more candidate points may be determined as the target point based on the probabilities that the candidate points correspond to a certain range of the image edge. In some embodiments, the target point may be obtained based on the following function:

F = max(Pi − λ·di²)    (2)

where Pi is the probability that candidate point i corresponds to a certain range of the image edge, di is the Euclidean distance between candidate point i and the average model edge point, and λ is a weight, a constant used to balance the relation between the distance and the probability value.
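Equation (2) selects, among the candidates along the normal, the point with the best trade-off between edge probability and distance to the current edge point; a short sketch (with an assumed value of λ) follows.

```python
import numpy as np

def select_target_point(candidates, probabilities, edge_point, lam=0.01):
    """Pick the candidate maximizing P_i - lambda * d_i^2, as in equation (2).

    candidates: (n, 3) candidate coordinates along the normal.
    probabilities: (n,) edge probabilities P_i from the second classifier.
    edge_point: (3,) the current average model edge point.
    lam: weight lambda balancing distance against probability (assumed value).
    """
    d2 = np.sum((np.asarray(candidates) - np.asarray(edge_point)) ** 2, axis=1)   # d_i^2
    scores = np.asarray(probabilities) - lam * d2
    best = int(np.argmax(scores))
    return np.asarray(candidates)[best], scores[best]
```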
In some embodiments, multiple target points of multiple model edge points may be determined based on flow 1400, and the multiple model edge points and the model may then be transformed according to the multiple target points. The specific transformation process may refer to, for example, Figure 16 and its description.
Figure 15 is an exemplary flowchart of determining the normal of an edge point according to some embodiments of the present application. In some embodiments, flow 1500 may correspond to step 1410.
In step 1510, multiple polygons may be determined from the multiple edge points of the average model. In some embodiments, the multiple polygons may be formed by connecting the multiple edge points. The multiple polygons may be triangles, quadrilaterals, other polygons, etc. In some embodiments, the process of determining multiple polygons from multiple edge points may also be referred to as meshing, in which the multiple polygons may be referred to as meshes and the multiple edge points may be referred to as nodes. In some embodiments, the surface of the average model may already contain multiple polygons corresponding to the average model edge points, in which case step 1510 may be omitted.
In step 1520, the multiple polygons adjacent to an average model edge point may be determined.
In step 1530, the multiple normals corresponding to the planes of the multiple polygons may be determined. In some embodiments, the multiple normal directions corresponding to the planes of the multiple polygons lie on the same side (the outside or the inside of the average model). In some embodiments, the multiple normal vectors corresponding to the planes of the multiple polygons are unit vectors.
In step 1540, the normal of the edge point may be determined based on the multiple normals. In some embodiments, the multiple normal vectors corresponding to the multiple polygons may be summed or averaged.
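Steps 1510 to 1540 amount to the standard vertex-normal computation on a mesh; a sketch under the assumption of triangular faces (consistent with the triangular meshes described earlier) follows.

```python
import numpy as np

def vertex_normals(vertices, faces):
    """Average the unit normals of the faces adjacent to each vertex (steps 1520-1540).

    vertices: (n, 3) coordinates of the edge points (nodes).
    faces: (m, 3) integer indices of the triangles connecting the edge points.
    """
    v = np.asarray(vertices, dtype=float)
    f = np.asarray(faces, dtype=int)
    face_normals = np.cross(v[f[:, 1]] - v[f[:, 0]], v[f[:, 2]] - v[f[:, 0]])
    face_normals /= np.linalg.norm(face_normals, axis=1, keepdims=True) + 1e-12   # unit face normals
    normals = np.zeros_like(v)
    for corner in range(3):                       # accumulate each face normal on its three vertices
        np.add.at(normals, f[:, corner], face_normals)
    normals /= np.linalg.norm(normals, axis=1, keepdims=True) + 1e-12
    return normals
```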
Figure 16 is an exemplary flowchart of transforming the average model edge points according to some embodiments of the present application. In some embodiments, flow 1600 may be implemented by model transformation unit 1230.
In step 1610, a similarity transformation may be performed on the average model edge points. For example, the mesh formed by the average model edge points may be treated as a whole, and the whole average model may be transformed toward the direction of the target points determined from the chamber edge points, mainly by operations such as translation, rotation, and scaling.
In step 1620, piecewise affine transformations may be performed on the average model edge points. In some embodiments, the mesh formed by the average model edge points may be divided according to certain rules. For example, the heart model may be divided according to the heart chambers. As shown in Figure 24, the model mesh may be divided into six parts by chamber: the left ventricle, the left atrium, the right ventricle, the right atrium, the aorta, and the left myocardium. In some embodiments, the piecewise affine transformations refer to performing affine transformations separately on the meshes of the divided parts. The affine transformation may move and deform the multiple nodes of each part respectively. In some embodiments, an average model edge point may be influenced by multiple chambers, and the effect of the different chambers on the edge point can be expressed in the form of the association factors. During the affine transformations, the average model edge points are transformed toward their target points; because an edge point is influenced by multiple chambers during the transformation, the association factors can serve as the weight values of the transformation parameters (e.g., the translation displacement, the deformation ratio). According to the target points and the association factors corresponding to the edge points, model transformation unit 1230 transforms the edge points on the sub-meshes of the average model to their corresponding positions using the piecewise affine transformations, as sketched below.
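A sketch of the association-weighted piecewise affine update described above: each chamber proposes an affine transform fitted to its own edge points and targets, and every edge point blends the proposals with its association factors. The least-squares fit and the blending rule are illustrative assumptions about details the text leaves open.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 3-D affine transform (4x3 matrix) mapping src points onto dst points."""
    src_h = np.hstack([src, np.ones((len(src), 1))])       # homogeneous coordinates
    A, *_ = np.linalg.lstsq(src_h, dst, rcond=None)
    return A

def piecewise_affine_update(points, targets, chamber_ids, assoc):
    """Blend the per-chamber affine proposals for every edge point with association factors.

    points: (n, 3) current edge points; targets: (n, 3) their target points.
    chamber_ids: (n,) chamber label of each edge point.
    assoc: (n_chambers, n) association-factor weights of each chamber for each point.
    """
    points_h = np.hstack([points, np.ones((len(points), 1))])
    blended = np.zeros_like(points)
    total_weight = np.zeros((len(points), 1))
    for k in range(assoc.shape[0]):
        in_chamber = chamber_ids == k
        if not np.any(in_chamber):
            continue
        A = fit_affine(points[in_chamber], targets[in_chamber])   # chamber k's affine transform
        weight = assoc[k][:, None]
        blended += weight * (points_h @ A)                        # association-weighted proposal
        total_weight += weight
    return blended / np.clip(total_weight, 1e-8, None)
```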
In step 1630, a fine deformation based on an energy function may be performed on the average model edge points. In some embodiments, the energy function may be expressed as:

E = Σc (Eext + α·Eint)    (3)

where Eext is the external energy, representing the relation between the current point and the detected target point; Eint is the internal energy, representing the relation between the current point and the corresponding edge point of the average model; α is a weight used to balance the internal and external energies, and different chambers may use different weights; and c denotes each chamber. When the current point is close to both the target point and the corresponding average model edge point, the energy function is minimal, i.e., the optimal coordinate point is obtained. The smaller the total energy E, the more accurate the result.
The external energy function may be expressed as:

Eext = Σi wi · ( g(v̂i)/‖g(v̂i)‖ · (vi − v̂i) )²    (4)

where i indexes the points; wi is the weight of each point (i.e., the reliability of the point); vi is the coordinate of the current point; v̂i is the point detected by the PBT classifier; g(v̂i) is the gradient (a vector) at that point; and ‖g(v̂i)‖ is the magnitude of the gradient. The internal energy function may be expressed as:

Eint = Σi Σj Σk wi,k · ((vi − vj) − Taffine,k(mi − mj))²    (5)

where i indexes the points; j indexes the neighborhood of point i (so that vi − vj corresponds to an edge of each triangle at the position of the current point); wi,k is the association factor (the factor of each chamber k for the current point i); mi and mj are points on the average model (obtained by PDM/ASM), with mi − mj corresponding to an edge of each triangle of the average mesh model; and Taffine,k is the transformation obtained by the piecewise affine transformation of each chamber k. All point coordinates vi are three-dimensional spatial coordinates.
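For concreteness, a sketch of evaluating equations (3) to (5) for a single chamber follows; treating Taffine as the 3×3 linear part of the chamber's affine transform and the per-point weighting layout are assumptions about details the text leaves open.

```python
import numpy as np

def external_energy(v, v_detected, gradients, w):
    """sum_i w_i * ( g_i/||g_i|| . (v_i - v_detected_i) )^2, cf. equation (4)."""
    g = gradients / (np.linalg.norm(gradients, axis=1, keepdims=True) + 1e-12)
    projections = np.sum(g * (v - v_detected), axis=1)
    return float(np.sum(w * projections ** 2))

def internal_energy(v, m, edges, w_assoc, T_affine):
    """sum over mesh edges of w_{i,k} * ((v_i - v_j) - T_affine(m_i - m_j))^2, cf. equation (5)."""
    i, j = edges[:, 0], edges[:, 1]
    dv = v[i] - v[j]
    dm = (m[i] - m[j]) @ T_affine.T            # linear part of the chamber's affine transform
    return float(np.sum(w_assoc[i] * np.sum((dv - dm) ** 2, axis=1)))

def total_energy(v, v_detected, gradients, w_ext, m, edges, w_assoc, T_affine, alpha):
    """E = E_ext + alpha * E_int for a single chamber, cf. equation (3)."""
    return (external_energy(v, v_detected, gradients, w_ext)
            + alpha * internal_energy(v, m, edges, w_assoc, T_affine))
```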
By the weighted generalized Hough transform, the model adjustment, and the model transformation, an accurately matched model and image can be obtained. As shown in Figure 25, after the accurate matching, each chamber of the model heart is clearly segmented.
The basic concepts have been described above. It is apparent to those skilled in the art that the foregoing disclosure is merely an example and does not constitute a limitation of the present application. Although it is not explicitly stated herein, those skilled in the art may make various modifications, improvements, and corrections to the present application. Such modifications, improvements, and corrections are suggested in the present application and therefore still fall within the spirit and scope of the exemplary embodiments of the present application.
Meanwhile, the present application uses specific words to describe its embodiments. For example, "one embodiment", "an embodiment", and/or "some embodiments" mean a certain feature, structure, or characteristic related to at least one embodiment of the present application. Therefore, it should be emphasized and noted that "an embodiment", "one embodiment", or "an alternative embodiment" mentioned two or more times in different places in this specification does not necessarily refer to the same embodiment. In addition, certain features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
In addition, those skilled in the art will understand that the aspects of the present application may be illustrated and described in terms of several patentable classes or situations, including any new and useful process, machine, product, or composition of matter, or any new and useful improvement thereof. Accordingly, the various aspects of the present application may be performed entirely by hardware, entirely by software (including firmware, resident software, microcode, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block", "module", "engine", "unit", "component", or "system". In addition, the aspects of the present application may be embodied as a computer product located in one or more computer-readable media, the product including computer-readable program code.
A computer-readable signal medium may include a propagated data signal containing computer program code, for example in baseband or as part of a carrier wave. The propagated signal may take many forms, including an electromagnetic form, an optical form, etc., or a suitable combination thereof. A computer-readable signal medium may be any computer-readable medium other than a computer-readable storage medium that can communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device. The program code on a computer-readable signal medium may be propagated by any suitable medium, including radio, cable, fiber-optic cable, RF, similar media, or any combination of the above.
In addition, unless explicitly stated in the claims, the order of the processing elements and sequences described herein, the use of numbers and letters, or the use of other names is not intended to limit the order of the flows and methods of the present application. Although the above disclosure discusses, through various examples, some embodiments of the invention that are currently considered useful, it should be understood that such details serve an illustrative purpose only, and that the appended claims are not limited to the disclosed embodiments; on the contrary, the claims are intended to cover all amendments and equivalent combinations that conform to the spirit and scope of the embodiments of the present application. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Although the present invention has been disclosed above with preferred embodiments, they are not intended to limit the present invention. Any person skilled in the art may, without departing from the spirit and scope of the present invention, make possible changes and modifications to the technical solution of the present invention using the methods and technical content disclosed above. Therefore, any simple modification, equivalent change, or modification made to the above embodiments according to the technical essence of the present invention, without departing from the content of the technical solution of the present invention, falls within the protection scope of the technical solution of the present invention.

Claims (10)

1. An image segmentation method, comprising:
obtaining image data;
reconstructing an image based on the image data, wherein the image includes one or more first edges;
obtaining a model, wherein the model includes one or more second edges corresponding to the one or more first edges;
matching the model and the reconstructed image; and
adjusting the one or more second edges of the model according to the one or more first edges.
2. The method according to claim 1, wherein the adjusting the one or more second edges of the model includes: determining a reference point on a second edge; determining a target point corresponding to the reference point; and adjusting the second edge of the model according to the target point.
3. The method according to claim 2, wherein the determining a target point corresponding to the reference point includes:
determining a normal of the reference point;
obtaining a step size and a search range;
determining one or more candidate points along the normal according to the step size and the search range;
obtaining a first classifier;
determining, according to the first classifier, the probabilities that the one or more candidate points correspond to the first edge; and
determining the target point based on the probabilities that the one or more candidate points correspond to the first edge.
4. The method according to claim 3, wherein the determining a normal of the reference point includes: determining one or more polygonal meshes adjacent to the reference point; determining one or more normals corresponding to the one or more polygonal meshes; and determining the normal of the reference point according to the one or more normals.
5. The method according to claim 2, wherein the adjusting the second edge of the model includes:
performing a similarity transformation on the second edge;
determining control points of the model, generating association factors of the control points of the model according to the relation between the control points of the model and the one or more second edges of the model, and performing an affine transformation on the second edge according to the association factors; or
fine-tuning the second edge based on an energy function.
6. The method according to claim 1, wherein the image data includes a brain image, a skull image, a chest image, a cardiac image, a breast image, an abdominal image, a kidney image, a liver image, a pelvis image, a perineum image, a limb image, a spine image, or a bone image.
7. An image segmentation system, comprising:
a memory configured to store data and instructions; and
a processor in communication with the memory, wherein, when executing the instructions in the memory, the processor is configured to:
obtain image data;
reconstruct an image based on the image data, wherein the image includes one or more first edges;
obtain a model, wherein the model includes one or more second edges corresponding to the one or more first edges;
match the model with the reconstructed image; and
adjust the one or more second edges of the model according to the one or more first edges.
8. The system of claim 7, wherein the processor is further configured to:
determine a reference point on a second edge;
determine a target point corresponding to the reference point; and
adjust the second edge of the model according to the target point.
9. The system of claim 8, wherein the processor is further configured to:
determine a normal at the reference point;
obtain a step size and a search range;
determine one or more candidate points along the normal according to the step size and the search range;
obtain a first classifier;
determine, according to the first classifier, a probability that each of the one or more candidate points corresponds to the first edge; and
determine the target point based on the probabilities that the one or more candidate points correspond to the first edge.
10. The system of claim 7, wherein the adjusting of the second edge of the model includes: performing a similarity transformation on the second edge; determining control points of the model, generating association factors of the control points according to a relationship between the control points of the model and the one or more second edges of the model, and performing an affine transformation on the second edge according to the association factors; or fine-tuning the second edge based on an energy function.
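The target-point search recited in claims 3, 4, and 9 can be illustrated with a short sketch. The following minimal Python/numpy example assumes a triangle mesh and a generic per-point classifier; the names vertex_normal, find_target_point, and edge_probability are introduced here purely for illustration and do not come from the patent itself.

    import numpy as np

    def vertex_normal(vertices, faces, vertex_idx):
        # Average the unit normals of the mesh faces adjacent to the vertex
        # (the step recited in claim 4), then re-normalize.
        normals = []
        for face in faces:
            if vertex_idx in face:
                a, b, c = (vertices[i] for i in face)
                n = np.cross(b - a, c - a)
                length = np.linalg.norm(n)
                if length > 0:
                    normals.append(n / length)
        n = np.mean(normals, axis=0)
        return n / np.linalg.norm(n)

    def find_target_point(ref_point, normal, step, search_range, edge_probability):
        # Sample candidate points along the normal within +/- search_range at the
        # given step size, score each candidate with the classifier, and keep the
        # most probable one (the steps recited in claims 3 and 9).
        offsets = np.arange(-search_range, search_range + step, step)
        candidates = ref_point + offsets[:, None] * normal
        probabilities = np.array([edge_probability(p) for p in candidates])
        return candidates[np.argmax(probabilities)]

    # Toy usage with a single triangle and a stand-in classifier that prefers
    # points near the plane z = 0.5; a trained boundary classifier would be
    # used in practice.
    vertices = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.]])
    faces = [(0, 1, 2)]
    normal = vertex_normal(vertices, faces, 0)
    target = find_target_point(vertices[0], normal, step=0.1, search_range=1.0,
                               edge_probability=lambda p: np.exp(-abs(p[2] - 0.5)))

In practice, the edge_probability callback would be the trained first classifier of claim 3, evaluated on image features sampled around each candidate point.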
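Claims 5 and 10 list a similarity transformation as one option for moving the model's second edge toward the detected first edge. The sketch below assumes point correspondences between the model edge and the image edge are already available (for example, the target points found above) and uses a closed-form Umeyama-style least-squares fit; fit_similarity and adjust_edge are illustrative names, and this particular fitting method is an assumption rather than something specified by the patent.

    import numpy as np

    def fit_similarity(src, dst):
        # Closed-form least-squares similarity transform (scale s, rotation R,
        # translation t) mapping the src points approximately onto the dst points.
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        src0, dst0 = src - src_c, dst - dst_c
        U, S, Vt = np.linalg.svd(dst0.T @ src0)
        sign = 1.0 if np.linalg.det(U @ Vt) >= 0 else -1.0
        D = np.diag([1.0] * (src.shape[1] - 1) + [sign])  # guard against reflection
        R = U @ D @ Vt
        s = np.trace(np.diag(S) @ D) / (src0 ** 2).sum()
        t = dst_c - s * R @ src_c
        return s, R, t

    def adjust_edge(edge_points, target_points):
        # Pull the model's second-edge points toward the image's first edge with
        # one global similarity transform (one of the options in claims 5 and 10).
        s, R, t = fit_similarity(edge_points, target_points)
        return (s * (R @ edge_points.T)).T + t

    # Toy usage: the target points are a scaled and shifted copy of the edge, so
    # the adjusted edge should land on them (up to numerical error).
    edge = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
    targets = 2.0 * edge + np.array([3., -1.])
    adjusted = adjust_edge(edge, targets)

A global similarity fit of this kind preserves the model's overall shape; per-point refinement, such as the energy-based fine-tuning also mentioned in claims 5 and 10, would then operate on the transformed edge.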
CN201710311908.8A 2017-05-05 2017-05-05 Image segmentation method and system Active CN107424162B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710311908.8A CN107424162B (en) 2017-05-05 2017-05-05 Image segmentation method and system
US15/710,815 US10482604B2 (en) 2017-05-05 2017-09-20 Systems and methods for image processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710311908.8A CN107424162B (en) 2017-05-05 2017-05-05 Image segmentation method and system

Publications (2)

Publication Number Publication Date
CN107424162A true CN107424162A (en) 2017-12-01
CN107424162B CN107424162B (en) 2019-12-20

Family

ID=60425385

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710311908.8A Active CN107424162B (en) 2017-05-05 2017-05-05 Image segmentation method and system

Country Status (1)

Country Link
CN (1) CN107424162B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103310449A (en) * 2013-06-13 2013-09-18 沈阳航空航天大学 Lung segmentation method based on improved shape model
US20160063726A1 (en) * 2014-08-28 2016-03-03 Koninklijke Philips N.V. Model-based segmentation of an anatomical structure
CN105719278A (en) * 2016-01-13 2016-06-29 西北大学 Organ auxiliary positioning segmentation method based on statistical deformation model
CN105976384A (en) * 2016-05-16 2016-09-28 天津工业大学 Human body thoracic and abdominal cavity CT image aorta segmentation method based on GVF Snake model

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LING, Huaqiang et al.: "Application of the Active Shape Model Method in Liver CT Image Segmentation", Journal of Zhejiang University of Technology *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109685816A (en) * 2018-12-27 2019-04-26 上海联影医疗科技有限公司 Image segmentation method, device, equipment and storage medium
CN109685816B (en) * 2018-12-27 2022-05-13 上海联影医疗科技股份有限公司 Image segmentation method, device, equipment and storage medium
CN110675486A (en) * 2019-08-28 2020-01-10 电子科技大学 Frequency domain reconstruction method for non-rigid human body movement
CN110675486B (en) * 2019-08-28 2023-03-07 电子科技大学 Frequency domain reconstruction method for non-rigid human body movement
CN111242877A (en) * 2019-12-31 2020-06-05 北京深睿博联科技有限责任公司 Mammary X-ray image registration method and device
WO2024002110A1 (en) * 2022-06-27 2024-01-04 Shanghai United Imaging Healthcare Co., Ltd. Methods and systems for determining image control point
CN117635942A (en) * 2023-12-05 2024-03-01 齐鲁工业大学(山东省科学院) Cardiac MRI image segmentation method based on edge feature enhancement
CN117635942B (en) * 2023-12-05 2024-05-07 齐鲁工业大学(山东省科学院) Cardiac MRI image segmentation method based on edge feature enhancement

Also Published As

Publication number Publication date
CN107424162B (en) 2019-12-20

Similar Documents

Publication Publication Date Title
CN107220965A (en) A kind of image partition method and system
US10482604B2 (en) Systems and methods for image processing
US10546014B2 (en) Systems and methods for segmenting medical images based on anatomical landmark-based features
US20190355117A1 (en) Techniques for Segmentation of Lymph Nodes, Lung Lesions and Other Solid or Part-Solid Objects
CN107424162A (en) A kind of image partition method and system
CN109215032A (en) The method and system of image segmentation
US8577130B2 (en) Hierarchical deformable model for image segmentation
US9218542B2 (en) Localization of anatomical structures using learning-based regression and efficient searching or deformation strategy
Xing et al. Lesion segmentation in ultrasound using semi-pixel-wise cycle generative adversarial nets
Li et al. Learning image context for segmentation of the prostate in CT-guided radiotherapy
US11935246B2 (en) Systems and methods for image segmentation
Kieselmann et al. Cross‐modality deep learning: contouring of MRI data from annotated CT data only
CN111709485B (en) Medical image processing method, device and computer equipment
EP3493154A1 (en) Segmentation system for segmenting an object in an image
US20220301224A1 (en) Systems and methods for image segmentation
CN107220984A (en) A kind of image partition method, system and grid model
US20220222873A1 (en) Devices and process for synthesizing images from a source nature to a target nature
Wu et al. Prostate segmentation based on variant scale patch and local independent projection
Feng et al. Supervoxel based weakly-supervised multi-level 3D CNNs for lung nodule detection and segmentation
CN113570627A (en) Training method of deep learning segmentation network and medical image segmentation method
CN111798424A (en) Medical image-based nodule detection method and device and electronic equipment
CN107230211A (en) A kind of image partition method and system
CN111724395A (en) Heart image four-dimensional context segmentation method, device, storage medium and device
Gao et al. Hybrid decision forests for prostate segmentation in multi-channel MR images
Yang et al. Automatic segmentation of the clinical target volume and organs at risk for rectal cancer radiotherapy using structure-contextual representations based on 3D high-resolution network

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: No. 2258 Jiading Road, Jiading Industrial Zone, Jiading District, Shanghai 201807

Patentee after: Shanghai Lianying Medical Technology Co., Ltd

Address before: No. 2258 Jiading Road, Jiading Industrial Zone, Jiading District, Shanghai 201807

Patentee before: SHANGHAI UNITED IMAGING HEALTHCARE Co.,Ltd.

CP01 Change in the name or title of a patent holder