CN106028943B - Ultrasonic virtual endoscopic imaging system and method and device thereof
- Publication number: CN106028943B (application CN201480075906.8A)
- Authority
- CN
- China
- Prior art keywords
- region
- point
- pre-marked point
- marked point
- display interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
Abstract
An ultrasonic virtual endoscopic imaging system, method, and device. The system includes: a display processing module (110), for simultaneously displaying, on the same display interface, at least one planar image and at least one stereoscopic image obtained using virtual endoscopy; a point selection module (120), for obtaining a pre-marked point or region from any one of the images shown on the display interface; a spatial mapping module (130), for obtaining, according to the spatial position coordinates of the pre-marked point or region and based on a spatial mapping relationship, the corresponding positions of the pre-marked point or region in all images on the display interface; and a marking module (140), for marking the corresponding positions of the pre-marked point or region in all images on the display interface. The system and method solve the problem that prior-art 3D ultrasonic virtual endoscopic imaging cannot display the correspondence between the 3D image and the 2D images.
Description
Technical field
The present invention relates to ultrasonic virtual endoscopic imaging technology, and more particularly to an ultrasonic virtual endoscopic imaging system, method, and device.
Background technique
Virtual endoscopy (Virtual Endoscopy, VE) is an application of virtual-reality technology in modern medicine. Using medical images as the raw data, it combines image processing, computer graphics, scientific visualization, and related techniques to simulate a traditional optical endoscope. Virtual endoscopy overcomes the drawback of a traditional optical endoscope, which must be inserted into the body; it is a completely non-contact examination method and can be used clinically for auxiliary diagnosis, surgical planning, precise intra-operative positioning, and so on. Virtual endoscopy has been applied maturely in 3D CT and MR imaging for many years; owing to the characteristics of CT and MR imaging, it is mainly used for organs with cavity structures, such as the colon, trachea, blood vessels, esophagus, and heart.
Ultrasound, as a fast, non-invasive imaging technique, is widely used in the clinic. Since conventional 2D ultrasound suffers from defects such as limited scanning range and missing information, 3D ultrasound images, which can show the actual position and orientation of organs and deeply embedded structures, provide more accurate information for clinicians. 3D and 4D ultrasound technology is developing rapidly and is widely used in obstetrics and gynecology, vascular medicine, urology, and other fields.
Ultrasonic virtual endoscopy places the observation viewpoint inside a fluid-filled lumen or blood vessel and, combined with visualization techniques, displays the internal structure of the cavity in three dimensions, enabling more precise diagnosis and treatment planning. Unlike the fly-over mode of conventional 3D ultrasound, ultrasonic virtual endoscopy can be understood as observation in a fly-through mode. This technique is a unique ultrasonic post-processing method that provides a completely new viewing angle for observing the structures inside an organ. However, current 3D ultrasonic virtual endoscopic displays only provide the three orthogonal multi-planar reformation (MPR) planes and the 3D-rendered virtual endoscopic image. On the one hand, compared with CT and MR, ultrasound images have poorer resolution; artifacts often appear in the 3D-rendered image and are easily confused with lesions, affecting the doctor's judgment of the condition, so the doctor must constantly refer to the 2D images to decide whether a given position in the 3D image is a lesion or an artifact. On the other hand, because the endoscopic mode differs in viewing angle from conventional 3D imaging, and doctors are more accustomed to reading two-dimensional ultrasound images, users find it difficult to imagine the spatial structural relationships and often cannot relate the 3D image to the 2D images in practice. Given these problems in the prior art, the relevant image-processing technology needs further improvement.
Summary of the invention
In view of the problem that prior-art 3D ultrasonic virtual endoscopic imaging cannot display the correspondence between the 3D image and the 2D images, and therefore cannot provide medical staff with a more accurate, detailed, and user-friendly basis for judging physiological information, it is necessary to provide an ultrasonic virtual endoscopic imaging system, method, and device.
An ultrasonic virtual endoscopic imaging system provided by the invention comprises:
a display processing module, for simultaneously displaying, on the same display interface, at least one planar image and at least one stereoscopic image obtained using virtual endoscopy;
a point selection module, for obtaining a pre-marked point or region from any one of the images shown on the display interface;
a spatial mapping module, for obtaining, according to the spatial position coordinates of the pre-marked point or region and based on a spatial mapping relationship, the corresponding positions of the pre-marked point or region in all images on the display interface; and
a marking module, for marking the corresponding positions of the pre-marked point or region in all images on the display interface.
Based on the above system structure, the invention also provides an ultrasonic virtual endoscopic imaging method, comprising:
simultaneously displaying, on the same display interface, at least one planar image and at least one stereoscopic image obtained using virtual endoscopy;
obtaining a pre-marked point or region from any one of the images shown on the display interface;
according to the spatial position coordinates of the pre-marked point or region, obtaining, based on a spatial mapping relationship, the corresponding positions of the pre-marked point or region in all images on the display interface; and
marking the corresponding positions of the pre-marked point or region in all images on the display interface.
Based on the above system and method, the invention also provides an ultrasonic virtual endoscopic imaging device, comprising:
an image acquisition module, for transmitting and receiving ultrasonic waves to obtain three-dimensional image data of the target tissue;
a memory, for storing the three-dimensional image data and the results of operations performed on it;
a human-computer interaction module, for obtaining instructions for operating on the three-dimensional image data;
a display, for simultaneously displaying at least one planar image and at least one stereoscopic image obtained using virtual endoscopy; and
a processor module, for obtaining a pre-marked point or region from any one of the images shown on the display interface, obtaining, according to the spatial position coordinates of the pre-marked point or region and based on a spatial mapping relationship, the corresponding positions of the pre-marked point or region in all images on the display interface, and marking those corresponding positions.
The ultrasonic virtual endoscopic imaging system, method, and device of the invention, building on the conventional ultrasonic virtual endoscopic mode, conveniently establish the correspondence between the 3D image and the 2D images through human-computer interaction. From a point or region of interest specified by the user in a 2D or 3D image, the corresponding positions in the 3D and 2D images can be obtained through the spatial mapping relationship, marked, and displayed, thereby providing medical staff with a more accurate, detailed, and user-friendly basis for judging physiological information.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of the ultrasonic virtual endoscopic imaging system of the invention;
Fig. 2 is a schematic diagram of the display interface of the invention;
Fig. 3 is a structural schematic diagram of the spatial mapping module 130 of the invention;
Fig. 4 is a schematic diagram of parallel projection and perspective projection;
Fig. 5 is a schematic diagram of ray casting;
Fig. 6 is a schematic diagram of equidistant sampling along the ray direction;
Fig. 7 is a schematic diagram of the imaging plane;
Fig. 8 is a structural schematic diagram of one embodiment of the marking module 140;
Fig. 9 is a structural schematic diagram of another embodiment of the marking module 140;
Fig. 10 is a structural schematic diagram of one embodiment of the ultrasonic virtual endoscopic imaging system of the invention;
Fig. 11 is a schematic diagram of the display interface of the embodiment of Fig. 10;
Fig. 12 is a structural schematic diagram of another embodiment of the ultrasonic virtual endoscopic imaging system of the invention;
Fig. 13 is a schematic diagram of the display interface of the embodiment of Fig. 12;
Fig. 14 is a structural schematic diagram of a further embodiment of the ultrasonic virtual endoscopic imaging system of the invention;
Fig. 15 is a schematic diagram of the display interface of the embodiment of Fig. 14;
Fig. 16 is a flow diagram of the ultrasonic virtual endoscopic imaging method of the invention;
Fig. 17 is a structural schematic diagram of the ultrasonic virtual endoscopic imaging device of the invention;
Fig. 18 is a flow diagram of an embodiment of the ultrasonic virtual endoscopic imaging method of the invention;
Fig. 19 is a schematic diagram of a preferred embodiment of the invention for realizing coordinate conversion;
Fig. 20 is a schematic diagram of another preferred embodiment of the invention for realizing coordinate conversion.
Detailed description of the embodiments
Based on 3D ultrasonic virtual endoscopic imaging technology, the present invention provides an ultrasonic virtual endoscopic imaging system, method, and device. Building on the conventional ultrasonic virtual endoscopic mode, the correspondence between the 3D image and the 2D images is conveniently established through human-computer interaction: from a point or region of interest specified by the user in a 2D or 3D image, the corresponding positions in the 3D and 2D images are obtained through the spatial mapping relationship, marked, and displayed. The specific embodiments of the invention are described in detail below with reference to the accompanying drawings.
The ultrasonic virtual endoscopic imaging system of the invention is mainly used on an ultrasonic virtual endoscopic imaging device. Such a device typically includes: an image acquisition module for transmitting and receiving ultrasonic waves and parsing them to obtain three-dimensional image data of the target tissue; a memory for storing the three-dimensional image data and the results of operations performed on it; a human-computer interaction module for obtaining instructions for operating on the three-dimensional image data; and a display for showing images based on multi-planar reformation (MPR). As shown in Fig. 1 and Fig. 2, based on this hardware environment, the present embodiment provides an ultrasonic virtual endoscopic imaging system 100, comprising:
a display processing module 110, for simultaneously displaying, on the same display interface 1, at least one planar image and at least one stereoscopic image obtained using virtual endoscopy. Here a planar image is a two-dimensional sectional image obtained by multi-planar reformation, shown in regions A, B, and C of Fig. 2, and a stereoscopic image is the 3D-rendered virtual endoscopic image, shown in region 3D of Fig. 2;
a point selection module 120, for obtaining a pre-marked point or region 2 from any one of the images shown on the display interface, such as the pre-marked point or region 2 represented by the black dot in Fig. 2;
a spatial mapping module 130, for obtaining, according to the spatial position coordinates of the pre-marked point or region 2 and based on a spatial mapping relationship, the corresponding positions of the pre-marked point or region 2 in all images on the display interface; and
a marking module 140, for marking the corresponding positions of the pre-marked point or region 2 in all images on the display interface. As shown in Fig. 2, corresponding black dots 2 are marked in regions A, B, C, and 3D to indicate the position of the pre-marked point or region in each image.
As shown in Fig. 2, by displaying the specified point or region of interest (such as the pre-marked point or region 2 in Fig. 2) in association across all images on the same display interface, the present embodiment provides medical staff with a more accurate, detailed, and user-friendly basis for judging physiological information, and helps them grasp the correspondence between the 3D image and the 2D images intuitively and directly from the images on the display.
On the basis of the above embodiment, as shown in Fig. 3, the spatial mapping module 130 in the present embodiment includes:
a first conversion unit 131, for converting the spatial position coordinates of a pre-marked point or region taken from a planar image into the corresponding position of that pre-marked point or region in the stereoscopic image;
a second conversion unit 132, for converting the spatial position coordinates of a pre-marked point or region taken from the stereoscopic image into the corresponding positions of that pre-marked point or region in all planar images; and
a judging unit 133, for judging whether the pre-marked point or region comes from one of the planar images or from the stereoscopic image, and calling the first or the second conversion unit according to the result.
In this embodiment, the first conversion unit 131 is mainly used to convert a coordinate position in a 2D image into a coordinate position in the 3D image, and the second conversion unit 132 is mainly used to convert a coordinate position in the 3D image into coordinate positions in the 2D images. The concrete realization of the spatial mapping relationship is described below.
Ultrasonic virtual endoscopy differs from the parallel projection used by conventional 3D volume rendering: to simulate the imaging effect of an endoscope, it uses perspective projection. The view volume of parallel projection, shown in Fig. 4(a), is a rectangular parallelepiped with identical ends; that is, the distance between an object and the screen does not affect its apparent size. In contrast, the view volume of perspective projection is a frustum resembling a truncated pyramid, large at one end and small at the other, so the closer an object is to the screen, the larger it appears, as shown in Fig. 4(b).
The ray-casting mode in perspective projection is shown in Fig. 5. Suppose a ray is emitted from the eye position P along the unit direction vector d, intersects the view plane UV at point S(u, v), and intersects the object at point Q. If the straight-line distance from P to Q is t, the coordinate of Q is Q = P + t·d.
Fig. 6 illustrates equidistant sampling along the ray direction. Suppose a group of rays 503 is emitted from the eye position P (501), passes through the view plane 502, enters the object at point F on the object boundary 504, exits at point L, and is sampled at equal intervals while traversing the object. At each sample, a color and an opacity value are obtained according to a predefined rule; these are accumulated and blended to obtain the pixel value of point S(u, v) on the image plane, which is displayed as the final result on a screen (the view plane) of width W and height H, as shown in Fig. 7.
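The sampling-and-compositing loop just described can be sketched as follows (a minimal Python sketch; the nearest-neighbour sampling, the transfer function, and all names are illustrative assumptions, not the patent's implementation):

```python
import numpy as np

def cast_ray(volume, P, d, t_near, t_far, dt, transfer):
    """Sample `volume` at equal intervals dt along the perspective ray
    Q = P + t*d and composite the samples front to back.
    `transfer` maps a sampled intensity to a (color, opacity) pair."""
    P = np.asarray(P, float)
    d = np.asarray(d, float)
    color_acc, alpha_acc = 0.0, 0.0
    t = t_near
    while t <= t_far and alpha_acc < 0.99:     # early ray termination
        Q = P + t * d                          # sample position in XYZ space
        idx = np.round(Q).astype(int)          # nearest-neighbour sampling
        if np.all(idx >= 0) and np.all(idx < volume.shape):
            c, a = transfer(volume[tuple(idx)])
            color_acc += (1.0 - alpha_acc) * a * c   # front-to-back blend
            alpha_acc += (1.0 - alpha_acc) * a
        t += dt
    return color_acc, alpha_acc
```

In practice trilinear interpolation and an opacity-corrected transfer function would be used; the early termination once the accumulated opacity approaches 1 is a common ray-casting optimization.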
Based on the principles described with reference to Figs. 4 to 7, in order to convert a 2D coordinate into a 3D coordinate, the present embodiment provides a preferred implementation in which the first conversion unit 131 includes the following units:
a first positioning unit, for obtaining, from the pre-marked point or region in a planar image, the spatial position coordinates in the XYZ space system of the corresponding second mapped point or region (i.e. point Q in Fig. 5);
a first calculation unit, for calculating, based on the perspective projection imaging mode in the XYZ space system, the straight-line equation from the eye position through the first mapped point or region in the view plane (i.e. point S in Fig. 5) to the second mapped point or region (see formula (1) below), and the equation of the view plane (see formula (2) below);
a first solving unit, for solving the straight-line equation and the view-plane equation together to obtain the coordinates, in the XYZ space system, of the first mapped point or region (i.e. point S in Fig. 5) in the view plane; and
a first position conversion unit, for converting the obtained coordinates of the first mapped point or region (i.e. point S in Fig. 5) in the XYZ space system into its coordinates in the view plane, thereby obtaining the corresponding position of the pre-marked point or region in the stereoscopic image.
The concrete principle of the present embodiment is as follows. The object is sampled in the XYZ space system; during 2D/3D interactive positioning, the eye position P remains constant, with coordinates (x0, y0, z0). After the user selects a pre-marked point or region on the interface, the coordinate of the corresponding point Q in Fig. 5 is uniquely determined as (x1, y1, z1). The equation of the space line PQ can therefore be expressed as formula (1) below.
Given that the distance from P to the view plane is D and the view-plane normal is N(A, B, C), the view-plane equation can be written as formula (2) below.
Solving formulas (1) and (2) simultaneously yields the intersection S(x, y, z) of the line PQ with the view plane. This intersection is a point in the XYZ space system; it is mapped into the UV plane as S(u, v), and the pixel at position (u, v) on the view plane is exactly the position on the 3D-rendered image corresponding to the spatial point Q.
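Since formulas (1) and (2) appear only as images in the original, the intersection step can be sketched under the stated definitions (eye P, selected point Q, view-plane normal N(A, B, C) at distance D); the function names and the assumed in-plane axes are illustrative:

```python
import numpy as np

def line_plane_intersection(P, Q, N, D):
    """Formula (1): X = P + t*(Q - P); formula (2): N . X = D.
    Returns the intersection S of the line PQ with the view plane."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    N = np.asarray(N, float) / np.linalg.norm(N)
    d = Q - P
    denom = N @ d
    if abs(denom) < 1e-12:
        raise ValueError("line PQ is parallel to the view plane")
    t = (D - N @ P) / denom
    return P + t * d                      # S(x, y, z) in the XYZ system

def xyz_to_uv(S, origin, u_axis, v_axis):
    """Express S in the view plane's (u, v) coordinates, given the
    plane origin and orthonormal in-plane axes (assumed known)."""
    r = np.asarray(S, float) - np.asarray(origin, float)
    return float(r @ u_axis), float(r @ v_axis)
```

The pixel at the resulting (u, v) is then the mark position on the 3D-rendered image.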
Based on the principles described with reference to Figs. 4 to 7, in order to convert a 3D coordinate into a 2D coordinate, the present embodiment provides a preferred implementation in which the second conversion unit 132 includes the following units:
a second positioning unit, for obtaining, from the pre-marked point or region in the stereoscopic image, the spatial position coordinates of the corresponding first mapped point or region in the view plane under the perspective projection imaging mode (i.e. point S in Fig. 5);
a second position conversion unit, for converting the spatial position coordinates of the first mapped point or region in the view plane (i.e. point S in Fig. 5) into coordinates under the XYZ space system;
a second calculation unit, for establishing the straight-line equation (formula (1) above) from the eye position through the first mapped point or region (i.e. point S in Fig. 5) to the corresponding second mapped point or region in the XYZ space system (i.e. point Q in Fig. 5), and performing ray casting and tracking from the XYZ coordinates of the first mapped point or region along the ray direction vector;
a second judging unit, for establishing, according to the ray direction vector and the sampling interval, the spatial position of the second mapped point or region (i.e. point Q in Fig. 5) in the XYZ space system, and judging at each sampling step whether that spatial position satisfies a preset condition; and
a second solving unit, for obtaining, from the spatial position of the second mapped point or region (i.e. point Q in Fig. 5) when the preset condition is satisfied, the corresponding positions of the pre-marked point or region in all planar images. The preset condition mentioned above may be whether the sampled gray value at the spatial position of the second mapped point or region (i.e. point Q in Fig. 5) reaches a preset threshold.
The concrete principle of the present embodiment is as follows. As shown in Fig. 2 and Figs. 4 to 7, when the user selects a pre-marked point or region 2 in the 3D-rendered region of the image, the eye position P remains constant during the interaction, with coordinates (x0, y0, z0). The point set by the user in the 3D-rendered region lies at position S in the imaging plane (u, v) (i.e. the view plane). Transforming S(u, v) into the XYZ space system gives its coordinates S(x1, y1, z1). The line connecting P and S and extending toward Q can then be expressed as formula (1) above.
From formula (1), the ray direction vector of the ray is expressed as formula (3) below.
According to the ray-casting algorithm, the object space is sampled along this ray, so the spatial position coordinates of point Q (the second mapped point or region) in the XYZ space system can be expressed as formula (4) below, where Δt is the sampling interval, i = 0, 1, … is the sample index, and P is the coordinate of the eye position. During sampling, the first sample position at which the preset condition is satisfied (for example, the gray value exceeds a threshold Gth) is the position coordinate in the 3D image that finally needs to be marked.
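The ray casting from the selected screen point can be sketched as follows (a sketch assuming the volume is addressed in voxel coordinates; the nearest-neighbour lookup and the helper name are illustrative assumptions):

```python
import numpy as np

def find_surface_point(volume, P, S, dt, g_th, t_max):
    """From the eye P, cast a ray through the selected screen point S:
    direction d = (S - P)/|S - P| (formula (3)), samples
    Q_i = S + i*dt*d (formula (4)). Return the first sample whose
    gray value reaches the preset threshold g_th."""
    P = np.asarray(P, float)
    S = np.asarray(S, float)
    d = S - P
    d /= np.linalg.norm(d)                   # ray direction vector
    i = 0
    while i * dt <= t_max:
        Q = S + i * dt * d                   # i-th equidistant sample
        idx = np.round(Q).astype(int)
        if np.all(idx >= 0) and np.all(idx < volume.shape):
            if volume[tuple(idx)] >= g_th:   # preset condition met
                return Q                     # 3D position to mark
        i += 1
    return None
```

The returned Q is then mapped into the planar images to place the corresponding 2D marks.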
Based on the above coordinate-conversion process, the spatial mapping module 130 further includes a unit which, if the pre-marked point or region comes from a planar image, converts the spatial position coordinates of the pre-marked point or region from that planar image into the positions of the pre-marked point or region in the other planar images. Here the mapping of the point of interest between planar images can be obtained from the mapping relationship between the spatial position of point Q in Fig. 5 and the two-dimensional images obtained by taking multiple sections through the position of point Q.
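One way to realize this plane-to-plane mapping is to project the spatial point Q onto each sectional plane; the representation of each MPR plane by an origin and two orthonormal in-plane axes is an assumption for illustration:

```python
import numpy as np

def q_to_mpr_planes(Q, planes):
    """Map the spatial point Q into each MPR sectional plane.
    Each plane is (origin, u_axis, v_axis) with orthonormal in-plane
    axes; the in-plane position is the projection of Q onto the axes."""
    Q = np.asarray(Q, float)
    out = {}
    for name, (origin, u_axis, v_axis) in planes.items():
        r = Q - np.asarray(origin, float)
        out[name] = (float(r @ u_axis), float(r @ v_axis))
    return out
```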
Based on the above embodiments, when the corresponding positions of the obtained pre-marked point or region in all images on the display interface are marked, the corresponding position can be specially marked with a color or symbol, as shown in Fig. 2, to make it stand out. A black square marks the pre-marked point or region in Fig. 2; a cross, dot, rectangle, box, triangle, or one of various other icons may also be used, and the chosen color should contrast strongly with the background image. At the same time, so that the icon does not block the image or interfere with the user's observation, the transparency of the color mark should be adjustable to achieve a see-through effect.
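The adjustable-transparency mark described here amounts to ordinary alpha blending; a minimal sketch (square marker on an RGB image, all names illustrative):

```python
import numpy as np

def overlay_marker(image, center, half_size, color, alpha):
    """Blend a square marker of the given color into an RGB image with
    adjustable transparency (0 = invisible, 1 = opaque), so the
    underlying anatomy remains visible through the mark."""
    img = image.astype(float).copy()
    r, c = center
    r0, r1 = max(r - half_size, 0), min(r + half_size + 1, img.shape[0])
    c0, c1 = max(c - half_size, 0), min(c + half_size + 1, img.shape[1])
    patch = img[r0:r1, c0:c1]
    img[r0:r1, c0:c1] = (1.0 - alpha) * patch + alpha * np.asarray(color, float)
    return img
```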
Based on the above embodiment, in order to give the user more marking modes to choose from freely as needed, and to allow multiple selected pre-marked points or regions to be marked distinguishably on the same display interface at the same time, the marking module 140, as shown in Fig. 8, includes:
a mark-instruction receiving unit 141, for providing on the display interface a prompt box, key, or instruction input box for selecting a marker symbol and/or color, so as to obtain a first marking instruction; and
a mark execution unit 142, for adding the marker symbol and/or color information selected in the first marking instruction to the corresponding positions of the pre-marked point or region in all images on the display interface.
Based on the above embodiment, as shown in Fig. 9, in order to allow free choice of the color transparency, the marking module 140 in the present embodiment further includes:
a transparency selection unit 143, for providing on the display interface a prompt box, key, or instruction input box for selecting the transparency of the filled region, so as to obtain a second marking instruction; and
a display adjustment unit 144, for adjusting, according to the transparency selection information in the second marking instruction, the color transparency of the marks added at the corresponding positions of the pre-marked point or region in all images.
Based on the above embodiments, so that the actual size of the relevant position can be read directly from the mark position, the ultrasonic virtual endoscopic imaging system of the present embodiment, as shown in Figs. 10 and 11, further includes a background superposition module 150 for superimposing a ruler, or a background image containing scale information, onto the planar image and/or the stereoscopic image as a background for display. Specifically, as shown in Fig. 11 (where the dotted box 3 indicates an intercepted part of the display interface, likewise below), the imaging region 4 inside the dotted box 3 in Fig. 11(a) is a displayed image of width W and height H (either a two-dimensional sectional image obtained by multi-planar reformation (MPR) or a stereoscopic image); rulers 5 are superimposed above and to the left of the imaging region 4, so the size corresponding to a position marked in the two-dimensional or stereoscopic image can be read off visually and clearly from the graduations on the ruler. Here the ruler above the imaging region 4 is preferably of length W, and the ruler on its left of length H. The size corresponding to a mark position can also be shown intuitively in another way: as shown in Fig. 11(b), the imaging region 4 inside the dotted box 3 is a displayed image of width W and height H on which an equidistant grid image 6 is superimposed as background; this equidistant grid image 6 is one kind of background image containing scale information. The invention is not limited to this kind of background image: any background image that intuitively shows the quantitative positional relationships of the mark positions on the display interface will do. For example, the ruler 5 in Fig. 11(a) may be replaced by a single equidistantly divided row or column of grid lines, which can likewise indicate scale information.
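The equidistant grid background can be sketched as an alpha-blended overlay (the grid pitch, color, and transparency are illustrative parameters, not values from the patent):

```python
import numpy as np

def add_grid_background(image, spacing, color, alpha=0.3):
    """Superimpose an equidistant grid (a background carrying scale
    information) onto the display image, blended so it does not
    obscure the anatomy. `spacing` is the grid pitch in pixels."""
    img = image.astype(float).copy()
    color = np.asarray(color, float)
    for r in range(0, img.shape[0], spacing):
        img[r, :] = (1 - alpha) * img[r, :] + alpha * color
    for c in range(0, img.shape[1], spacing):
        img[:, c] = (1 - alpha) * img[:, c] + alpha * color
    return img
```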
Based on the above embodiment, as shown in Fig. 12, so that the ruler or the scale information can be adjusted according to the customer's needs, the ultrasonic virtual endoscopic imaging system 100 of this embodiment further includes:
a distance setting unit 170, for providing on the display interface a prompt box, key, or instruction input box for selecting the graduation of the ruler or of the scale information and/or the graduation scale, so as to obtain a scale adjustment instruction; and
a scale adjustment unit 160, for adjusting, according to the graduation and/or graduation-scale information in the scale adjustment instruction, the graduation of the ruler or of the scale information in the background image.
Based on the above embodiment, as shown in Figs. 12 and 13, in order to understand more intuitively and accurately the relationship between a marked position in the image and the superimposed size information, the ultrasonic virtual endoscopic imaging system 100 of this embodiment further includes a lead-line generation module 180, for generating, when the position of the pre-marked point or region is selected, vertical and/or parallel lead lines in each image on the display interface, linking the mark to the scale information on the ruler or in the background image. As shown in Fig. 13, the imaging region 4 inside the dotted box 3 is a displayed image of width W and height H; when the marked point in it (the black square in the figure) is selected, vertical and parallel lead lines 7 are generated connecting it to the ruler 5 above and to the left of the imaging region 4, so that the observer can very easily read off the corresponding position of the mark point. In addition, so as not to interfere with observation of the rest of the imaging region, the vertical and parallel lead lines 7 preferably disappear when the pre-marked point or region is deselected.
Based on the above embodiment, as shown in Figures 14 and 15, the ultrasonic virtual endoscopic imaging system 100 of this embodiment further includes:

a distance measuring module 191, for measuring the distance between the positions of two pre-marked points or regions obtained by the point selection module; and

a distance marking module 192, for superimposing information containing that distance onto the planar image and/or the stereoscopic image for display.

Specifically, as shown in Figure 15, the imaging region 4 inside the dashed box 3 is a displayed image of width W and height H. The corresponding distance information 8 is superimposed between each pair of the pre-marked point or region 2-1 marked with a black square, the pre-marked point or region 2-2 marked with a black triangle, and the pre-marked point or region 2-3 marked with a black dot. In this way the relative positional relationship of multiple marked positions can be grasped at a glance, providing the system user with a more user-friendly, more intuitive quantitative display of the analysis results.
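As a rough sketch of how the superimposed distance value could be computed (the patent does not give an implementation; function and parameter names here are illustrative, and per-axis voxel spacing is an assumption), the physical distance between two marked points in voxel coordinates is the spacing-scaled Euclidean distance:

```python
import math

def point_distance_mm(p1, p2, spacing=(1.0, 1.0, 1.0)):
    """Physical distance between two marked points given in voxel
    coordinates, scaled per axis by the voxel spacing in mm."""
    return math.sqrt(sum(((a - b) * s) ** 2
                         for a, b, s in zip(p1, p2, spacing)))

# Distance between two marks in a volume with 0.5 mm isotropic voxels:
d = point_distance_mm((10, 20, 30), (10, 24, 33), spacing=(0.5, 0.5, 0.5))
print(d)  # 2.5
```

The resulting value is what a module like 192 would render next to the line connecting the two marks.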
Based on the above ultrasonic virtual endoscopic imaging system, as shown in Figure 16, this embodiment further provides an ultrasonic virtual endoscopic imaging method, comprising:

step 610, simultaneously displaying, on the same display interface, at least one planar image and at least one stereoscopic image obtained using virtual endoscopy;

step 620, obtaining a pre-marked point or region from any one of the images displayed on the display interface;

step 630, according to the spatial position coordinates of the pre-marked point or region, obtaining, based on a spatial mapping relationship, the corresponding positions of the pre-marked point or region in all images on the display interface;

step 640, marking the corresponding positions of the pre-marked point or region in all images on the display interface.
This embodiment in effect provides a marker display method for imaging. Using the method of this embodiment, images obtained by virtual endoscopy can be marked and displayed, so that the positional correspondence of a point of interest can be read intuitively and clearly from a display interface showing planar and stereoscopic images simultaneously, adding a new function to conventional ultrasonic virtual endoscopic imaging systems. The pre-marked point or region in step 620 may be obtained through a human-computer interaction module, by marking the corresponding position in a displayed image.
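One way the human-computer interaction step could resolve a pointer event into an image position is sketched below. This is not the patent's implementation; the viewport layout and all names are illustrative assumptions.

```python
def click_to_image_coords(click_xy, viewport_origin, viewport_size, image_size):
    """Map a mouse click given in display-interface pixels to pixel
    coordinates in the image shown inside that viewport.
    Returns None when the click falls outside the viewport."""
    cx, cy = click_xy
    ox, oy = viewport_origin
    vw, vh = viewport_size
    iw, ih = image_size
    if not (ox <= cx < ox + vw and oy <= cy < oy + vh):
        return None
    # Scale from viewport pixels to image pixels.
    return (int((cx - ox) * iw / vw), int((cy - oy) * ih / vh))

# A click at (150, 120) inside a 200x200 viewport showing a 512x512 image:
print(click_to_image_coords((150, 120), (100, 100), (200, 200), (512, 512)))
# (128, 51)
```

Each displayed planar or stereoscopic image would own one such viewport, so the same click handler can decide which image the pre-marked point comes from.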
Based on the above embodiment, in this embodiment, as shown in Figure 18, the process in step 630 of obtaining, from the spatial position coordinates of the pre-marked point or region and based on the spatial mapping relationship, the corresponding positions of the pre-marked point or region in all images on the display interface includes:

step 631, judging whether the pre-marked point or region comes from any one of the planar images or from the stereoscopic image;

step 632, if the pre-marked point or region comes from a planar image, converting the spatial position coordinates of the pre-marked point or region from the planar image into the corresponding position of the pre-marked point or region in the stereoscopic image;

step 633, if the pre-marked point or region comes from the stereoscopic image, converting the spatial position coordinates of the pre-marked point or region from the stereoscopic image into the corresponding positions of the pre-marked point or region in all planar images.

In this embodiment, step 632 essentially converts a coordinate position in a 2D image into a coordinate position in the 3D image, and step 633 converts a coordinate position in the 3D image into coordinate positions in the 2D images. For concrete implementations of the spatial mapping relationship, refer to the description below.
Based on the above embodiment, as shown in Figure 19, step 632 of converting the spatial position coordinates of the pre-marked point or region from the planar image into the corresponding position of the pre-marked point or region in the stereoscopic image is in this embodiment preferably realized as follows:

step 321, according to the pre-marked point or region from the planar image, obtaining the spatial position coordinates in the XYZ coordinate system of the second mapping point or region corresponding to that pre-marked point or region;

step 322, based on the perspective projection imaging mode in the XYZ coordinate system, calculating the line equation from the viewpoint position through the first mapping point or region in the view plane to the second mapping point or region, together with the view plane equation;

step 323, solving the line equation jointly with the view plane equation to obtain the coordinates of the first mapping point or region in the view plane in the XYZ coordinate system;

step 324, converting the obtained coordinates of the first mapping point or region in the XYZ coordinate system into the coordinates of the first mapping point or region in the view plane, thereby obtaining the corresponding position of the pre-marked point or region in the stereoscopic image.
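The core of steps 321 to 323 is a line-plane intersection: the ray from the viewpoint toward the 3D point Q is intersected with the view plane to find the projected point S. A minimal sketch follows, assuming the view plane is given by a point and a normal vector (the patent's formulas (1) to (4) are not reproduced here, and all names are illustrative):

```python
def line_plane_intersection(eye, q, plane_point, plane_normal):
    """Point S where the ray from the viewpoint `eye` toward the volume
    point `q` crosses the view plane (given by a point on the plane and
    its normal). Returns None if the ray is parallel to the plane."""
    d = [qi - ei for qi, ei in zip(q, eye)]            # ray direction E -> Q
    w = [pi - ei for pi, ei in zip(plane_point, eye)]  # E -> plane point
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(d, plane_normal)
    if abs(denom) < 1e-12:
        return None
    t = dot(w, plane_normal) / denom                   # parametric distance
    return [ei + t * di for ei, di in zip(eye, d)]

# Viewpoint at the origin, view plane z = 1, volume point Q = (2, 4, 4):
s = line_plane_intersection((0, 0, 0), (2, 4, 4), (0, 0, 1), (0, 0, 1))
print(s)  # [0.5, 1.0, 1.0]
```

Step 324 would then map the resulting XYZ coordinates of S into the view plane's own 2D coordinate frame to locate the mark in the stereoscopic image.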
Based on the above implementation, as shown in Figure 20, step 633 of converting the spatial position coordinates of the pre-marked point or region from the stereoscopic image into the corresponding positions of the pre-marked point or region in all planar images in this embodiment includes:

step 331, according to the pre-marked point or region from the stereoscopic image, obtaining the spatial position coordinates of the first mapping point or region in the view plane that corresponds to that pre-marked point or region under the perspective projection imaging mode;

step 332, converting the spatial position coordinates of the first mapping point or region in the view plane into coordinates in the XYZ coordinate system;

step 333, based on the line equation established from the viewpoint position through the first mapping point or region to the second mapping point or region in the XYZ coordinate system that corresponds to the pre-marked point or region, computing the ray casting vector of a ray launched from the XYZ coordinates of the first mapping point or region, and tracking it;

step 334, establishing the spatial position of the second mapping point or region in the XYZ coordinate system according to the ray casting vector and the sampling interval, and judging at each sampling time whether the spatial position of the second mapping point or region satisfies a preset condition;

step 335, according to the spatial position of the second mapping point or region at the moment the preset condition is satisfied, obtaining the corresponding positions of the pre-marked point or region in all planar images.
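Steps 333 and 334 amount to marching along a ray in fixed sampling intervals until a preset condition is met. A minimal sketch, assuming the condition is an arbitrary predicate on the sampled position (in practice it might be a voxel-intensity threshold marking the rendered surface; names are illustrative):

```python
def cast_ray(start, direction, step, max_steps, hit):
    """March from the view-plane point `start` along the normalized
    `direction` in fixed `step` increments, returning the first sample
    position where the predicate `hit` (the 'preset condition') is
    satisfied, or None if no sample satisfies it."""
    norm = sum(c * c for c in direction) ** 0.5
    d = [c / norm for c in direction]
    for i in range(max_steps):
        p = [s + i * step * c for s, c in zip(start, d)]
        if hit(p):
            return p
    return None

# Toy condition: "tissue" occupies z >= 5; a ray cast along +z stops at z = 5.
q = cast_ray((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), step=0.5,
             max_steps=100, hit=lambda p: p[2] >= 5.0)
print(q)  # [0.0, 0.0, 5.0]
```

The returned position plays the role of the second mapping point Q, from which step 335 reads off the slice positions in the planar images.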
Here the first mapping point or region is the point S in Figure 5, and the second mapping point or region is the point Q in Figure 5. The principles of steps 632 and 633 are explained in the above description of the first converting unit 131 and the second converting unit 132 and are not repeated here.
Based on the above embodiment, as shown in Figure 18, this embodiment further includes: if the pre-marked point or region comes from a planar image, a step 634 of converting the spatial position coordinates of the pre-marked point or region from that planar image into the corresponding positions of the pre-marked point or region in the other planar images. Here the mapping of the position of the point of interest between planar images can be obtained from the mapping relationship, as in Figure 5, between the spatial position of the point Q and the two-dimensional images of the multiple sections taken through the position of Q.
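For axis-aligned section images, the plane-to-plane mapping via the 3D point Q reduces to dropping one coordinate, as sketched below. This assumes axis-aligned slices in voxel coordinates, which the patent does not require; the tolerance parameter and names are illustrative.

```python
def point_on_slice(q, axis, slice_index, tol=0.5):
    """In-plane coordinates of the 3D voxel point `q` on the axis-aligned
    slice `slice_index` along `axis` (0=X, 1=Y, 2=Z), or None when the
    point does not lie on (or within `tol` of) that slice."""
    if abs(q[axis] - slice_index) > tol:
        return None
    return tuple(c for i, c in enumerate(q) if i != axis)

q = (12, 40, 25)
print(point_on_slice(q, axis=2, slice_index=25))  # (12, 40)
print(point_on_slice(q, axis=2, slice_index=30))  # None
```

A mark selected in one section can thus be propagated to every other displayed section that passes through Q.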
Based on each of the above embodiments, in order to give the user more choices and the freedom to select the marking style as needed, and to allow multiple selected pre-marked points or regions to be displayed distinguishably on the same display interface at the same time, step 640 of marking the corresponding positions of the pre-marked point or region in all images on the display interface in this embodiment includes:

providing on the display interface a prompt box, key, or instruction input box for selecting a marker symbol and/or color, to obtain a first marking instruction;

adding the marker symbol and/or color information selected in the first marking instruction to the corresponding positions of the pre-marked point or region in all images on the display interface.

The marker symbol in this embodiment is one of a cross, a dot, a rectangle, a box, a triangle, and the like.
Based on each of the above embodiments, in order to allow free selection of the color transparency so that the marker icon neither blocks the image nor interferes with the user's observation, step 640 of marking the corresponding positions of the pre-marked point or region in all images on the display interface in this embodiment further includes:

providing on the display interface a prompt box, key, or instruction input box for selecting the transparency of the filled region, to obtain a second marking instruction;

adjusting, according to the transparency selection information in the second marking instruction, the color transparency applied at the corresponding positions of the pre-marked point or region in all the images.
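Applying the selected transparency is, in the simplest reading, per-pixel alpha blending of the marker color over the underlying image, as sketched here (the patent does not specify the blending model; this linear blend and all names are assumptions):

```python
def blend(marker_rgb, image_rgb, alpha):
    """Alpha-blend a marker color over the underlying image pixel;
    `alpha` in [0, 1], where 0 leaves the image pixel untouched (fully
    transparent marker) and 1 paints the marker color opaquely."""
    return tuple(round(alpha * m + (1 - alpha) * p)
                 for m, p in zip(marker_rgb, image_rgb))

# Half-transparent red marker over a mid-gray image pixel:
print(blend((255, 0, 0), (128, 128, 128), 0.5))  # (192, 64, 64)
```

Lower alpha values let the anatomy underneath the filled marker region remain visible, which is the behavior the second marking instruction controls.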
Based on the above embodiment, by selecting different marker symbols and/or colors, the positions of multiple pre-marked points or regions can be displayed distinguishably on the same display interface at the same time.
Based on the above embodiment, so that the actual size at a position can be read directly from the marked position, the above method in this embodiment further includes: adding a ruler or a background image containing scale marker information to the planar image and/or the stereoscopic image as a background for display. For the specific marker display mode, refer to the above description of Figure 11.
Based on the above embodiment, in order to make the relationship between a marked position in an image and the superimposed dimension information easier to read accurately, the above method in this embodiment further includes: when the position of the pre-marked point or region is selected, generating a vertical lead line and/or a horizontal lead line in each image on the display interface, linked to the ruler or to the scale marker information in the background image so as to indicate the graduation. For the specific marker display mode, refer to the above description of Figure 13.
Based on the above embodiment, so that the ruler or the scale marker information can be adjusted to the user's needs, the ultrasonic virtual endoscopic imaging method in this embodiment further includes: providing on the display interface a prompt box, key, or instruction input box for selecting the graduation of the ruler and/or of the scale marker information, to obtain a scale adjustment instruction; and adjusting, according to the graduation information in the scale adjustment instruction, the graduation of the ruler or of the scale marker information in the background image.
Based on the above embodiment, the ultrasonic virtual endoscopic imaging method in this embodiment further includes: measuring the distance between the positions of two pre-marked points or regions obtained in the point selection step, and superimposing information containing that distance onto the planar image and/or the stereoscopic image for display. By marking the distance information, this embodiment makes the relative positional relationship between multiple marked positions directly readable, providing the system user with a more user-friendly, more intuitive quantitative display of the analysis results.
For further details of the ultrasonic virtual endoscopic imaging method, refer to the description of the ultrasonic virtual endoscopic imaging system above; for example, the coordinate conversion method based on the spatial mapping relationship is described in the discussion of formulas (1) to (4) above.
From the description of the above embodiments, those skilled in the art will clearly understand that the systems and methods of the above embodiments can be implemented by software together with the necessary general-purpose hardware platform, or of course by hardware, although in many cases the former is the preferred implementation. On this understanding, the technical solution of the ultrasonic virtual endoscopic imaging system and method of the present invention, or the part of it that contributes over the prior art, can be embodied in the form of a software product. That computer software product is stored in a non-volatile computer-readable storage medium (such as a ROM, magnetic disk, or optical disc) and includes instructions for causing a terminal device (which may be a mobile phone, computer, server, network device, or the like) to execute the system structures and methods described in the embodiments of the present invention.
When the ultrasonic virtual endoscopic imaging system and method presented in the above embodiments are applied in an ultrasonic virtual endoscopic imaging apparatus, they endow it with new functions: an existing ultrasonic virtual endoscopic imaging apparatus gains the ability to visualize the correspondence between 2D and 3D images and to quantify marked positions, providing the user of the apparatus with more intuitive and more accurate reference information. The details are as follows.
Based on the above, as shown in Figure 17, the present invention further provides an ultrasonic virtual endoscopic imaging apparatus 700, comprising:

an image acquisition module 705, for transmitting and receiving ultrasonic waves to obtain three-dimensional image data of the target tissue;

a memory 702, for storing the three-dimensional image data and the results of operations performed on it;

a human-computer interaction module 701, for obtaining instructions for operating on the three-dimensional image data;

a display 703, for simultaneously displaying at least one planar image and at least one stereoscopic image obtained using virtual endoscopy; and

a processor module 704, for obtaining the position of a pre-marked point or region from any one of the images displayed on the display interface, obtaining, according to the spatial position coordinates of the pre-marked point or region and based on a spatial mapping relationship, the corresponding positions of the pre-marked point or region in all images on the display interface, and marking those corresponding positions.
Based on the above ultrasonic virtual endoscopic imaging apparatus, when the apparatus of this embodiment obtains, from the spatial position coordinates of the pre-marked point or region and based on the spatial mapping relationship, the corresponding positions of the pre-marked point or region in all images on the display interface, it obtains a conversion instruction by judging whether the pre-marked point or region comes from any one of the planar images or from the stereoscopic image. According to that conversion instruction it selectively either converts the spatial position coordinates of the pre-marked point or region from the planar image into the corresponding position of the pre-marked point or region in the stereoscopic image, or converts the spatial position coordinates of the pre-marked point or region from the stereoscopic image into the corresponding positions of the pre-marked point or region in all planar images. For the specific process of obtaining the conversion instruction, refer to the above description given with reference to Figure 18; it is not repeated here.
In the ultrasonic virtual endoscopic imaging apparatus of this embodiment, for the processor module 704 converting the spatial position coordinates of a pre-marked point or region from a planar image into the corresponding position in the stereoscopic image, or converting the spatial position coordinates of a pre-marked point or region from the stereoscopic image into the corresponding positions in all planar images, refer to the description of steps 632 and 633 given above with reference to Figures 19 and 20; it is not repeated here.
Based on the above ultrasonic virtual endoscopic imaging apparatus, in order to provide more marking-style choices and to display the positions of multiple pre-marked points or regions distinguishably on the same display interface at the same time, the processor module 704 provides on the display interface a prompt box, key, or instruction input box for selecting a marker symbol and/or color, to obtain a first marking instruction containing the set marker symbol and/or color information, and adds the marker symbol and/or color information selected in the first marking instruction to the corresponding positions of the pre-marked point or region in all images on the display interface. Here the marker symbol may be one of a cross, a dot, a rectangle, a box, a triangle, and the like.
Based on the above ultrasonic virtual endoscopic imaging apparatus, so that the icon neither blocks the image nor interferes with the user's observation after a marker color is selected, the processor module 704 provides on the display interface a prompt box, key, or instruction input box for selecting the transparency of the filled region, to obtain a second marking instruction containing transparency selection information, and adjusts, according to the transparency selection information in the second marking instruction, the color transparency applied at the corresponding positions of the pre-marked point or region in all the images.
Based on the above ultrasonic virtual endoscopic imaging apparatus, in order to let the user read the dimension information of a marked position intuitively from the display interface, the processor module 704 also adds a ruler or a background image containing scale marker information to the planar image and/or the stereoscopic image as a background for display. For the description of the ruler and of the background image containing scale marker information, refer to the above description of Figure 11.
Based on the above ultrasonic virtual endoscopic imaging apparatus, in order to let the user freely choose the size of the calibration information in the ruler and the background image, the processor module 704 also provides on the display interface a prompt box, key, or instruction input box for selecting the graduation of the ruler and/or of the scale marker information, to obtain a scale adjustment instruction containing the set graduation information, and adjusts, according to the graduation information in the scale adjustment instruction, the graduation of the ruler or of the scale marker information in the background image.
Based on the above ultrasonic virtual endoscopic imaging apparatus, in order to make the relative positional relationship between multiple marked positions intuitive, the processor module 704 measures the distance between the positions of two pre-marked points or regions selected in the same displayed image, and superimposes information containing that distance onto the planar image and/or the stereoscopic image for display.
Based on the above ultrasonic virtual endoscopic imaging apparatus, in order to read accurately the dimension information of a marked position against the ruler or the calibration information, when the position of the pre-marked point or region is selected, a vertical lead line and/or a horizontal lead line is generated in each image on the display interface, linked to the ruler or to the scale marker information in the background image so as to indicate the graduation.
The embodiments described above express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be pointed out that those of ordinary skill in the art can make various modifications and improvements without departing from the concept of the invention, all of which fall within the scope of protection of the invention. Therefore, the scope of protection of this patent shall be governed by the appended claims.
Claims (23)
1. An ultrasonic virtual endoscopic imaging system, characterized in that the system comprises:
a display processing module, for simultaneously displaying, on the same display interface, at least one planar image and at least one stereoscopic image obtained using virtual endoscopy;
a point selection module, for obtaining a pre-marked point or region from any one of the images displayed on the display interface;
a space mapping module, for obtaining, according to the spatial position coordinates of the pre-marked point or region and based on a spatial mapping relationship, the corresponding positions of the pre-marked point or region in all images on the display interface; and
a marking module, for marking the corresponding positions of the pre-marked point or region in all images on the display interface;
the space mapping module comprising:
a first converting unit, for converting the spatial position coordinates of the pre-marked point or region from the planar image into the corresponding position of the pre-marked point or region in the stereoscopic image;
the first converting unit comprising:
a first position unit, for obtaining, according to the pre-marked point or region from the planar image, the spatial position coordinates in the XYZ coordinate system of the second mapping point or region corresponding to that pre-marked point or region;
a first computing unit, for calculating, based on the perspective projection imaging mode in the XYZ coordinate system, the line equation from the viewpoint position through the first mapping point or region in the view plane to the second mapping point or region, together with the view plane equation;
a first solving unit, for solving the line equation jointly with the view plane equation to obtain the coordinates of the first mapping point or region in the view plane in the XYZ coordinate system; and
a first position converting unit, for converting the obtained coordinates of the first mapping point or region in the XYZ coordinate system into the coordinates of the first mapping point or region in the view plane, thereby obtaining the corresponding position of the pre-marked point or region in the stereoscopic image.
2. The ultrasonic virtual endoscopic imaging system according to claim 1, characterized in that the space mapping module further comprises:
a second converting unit, for converting the spatial position coordinates of the pre-marked point or region from the stereoscopic image into the corresponding positions of the pre-marked point or region in all planar images;
a judging unit, for judging whether the pre-marked point or region comes from any one of the planar images or from the stereoscopic image, and calling the first converting unit or the second converting unit according to the judgment result.
3. The ultrasonic virtual endoscopic imaging system according to claim 2, characterized in that the space mapping module further comprises: a unit for executing, if the pre-marked point or region comes from a planar image, the step of converting the spatial position coordinates of the pre-marked point or region from that planar image into the corresponding positions of the pre-marked point or region in the other planar images.
4. The ultrasonic virtual endoscopic imaging system according to claim 3, characterized in that the second converting unit comprises:
a second position unit, for obtaining, according to the pre-marked point or region from the stereoscopic image, the spatial position coordinates of the first mapping point or region in the view plane that corresponds to that pre-marked point or region under the perspective projection imaging mode;
a second position converting unit, for converting the spatial position coordinates of the first mapping point or region in the view plane into coordinates in the XYZ coordinate system;
a second computing unit, for computing, based on the line equation established from the viewpoint position through the first mapping point or region to the second mapping point or region in the XYZ coordinate system that corresponds to the pre-marked point or region, the ray casting vector of a ray launched from the XYZ coordinates of the first mapping point or region, and tracking it;
a second judging unit, for establishing the spatial position of the second mapping point or region in the XYZ coordinate system according to the ray casting vector and the sampling interval, and judging at each sampling time whether the spatial position of the second mapping point or region satisfies a preset condition; and
a second solving unit, for obtaining, according to the spatial position of the second mapping point or region at the moment the preset condition is satisfied, the corresponding positions of the pre-marked point or region in all planar images.
5. a kind of ultrasonic virtual endoscopic imaging system, which is characterized in that the system comprises:
Display processing module is put down for showing simultaneously on same display interface using at least one that virtual Endoscopy obtains
Face image and at least one stereo-picture;
Selecting module is pinpointed, for obtaining preliminary making point or region from any piece image shown on the display interface;
Space mapping module is obtained for the spatial position coordinate according to the preliminary making point or region based on space reflection relationship
Obtain the position in the corresponding all images on the display interface of the preliminary making point or region;And
Mark module, it is corresponding in the corresponding all images on the display interface of the preliminary making point or region for marking
Position;
wherein the spatial mapping module includes:
a second converting unit, configured to convert the spatial position coordinates of a preliminary marking point or region from the stereoscopic image into the corresponding positions of the preliminary marking point or region in all planar images;
the second converting unit including:
a second position unit, configured to obtain, according to the preliminary marking point or region from the stereoscopic image, the spatial position coordinates of a corresponding first mapping point or region in the view plane under a perspective projection imaging mode;
a second position converting unit, configured to convert the spatial position coordinates of the first mapping point or region in the view plane into coordinates in the XYZ coordinate system;
a second computing unit, configured to calculate, based on a linear distance equation established from the viewpoint position through the first mapping point or region to a second mapping point or region corresponding to the preliminary marking point or region in the XYZ coordinate system, a ray casting vector that casts and traces a ray from the XYZ coordinates of the first mapping point or region;
a second judging unit, configured to establish the spatial position of the second mapping point or region in the XYZ coordinate system according to the ray casting vector and a sampling interval, and to judge whether the spatial position of the second mapping point or region satisfies a preset condition at each sampling instant; and
a second solving unit, configured to obtain, according to the spatial position of the second mapping point or region when the preset condition is satisfied, the corresponding positions of the preliminary marking point or region in all planar images.
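Outside the claim language, the ray-casting step described by the second computing and judging units (casting a ray from the first mapping point along the casting vector and sampling at a fixed interval until a preset condition is met) can be sketched as follows. The NumPy volume, nearest-neighbour sampling, and intensity-threshold condition are illustrative assumptions, not specified in the patent:

```python
import numpy as np

def cast_ray(volume, origin, direction, step=0.5, threshold=100, max_steps=1000):
    """March from `origin` along `direction` in volume (XYZ) coordinates,
    sampling every `step` units until the sampled intensity exceeds
    `threshold` (an assumed preset condition) or the ray leaves the volume."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)                      # unit ray casting vector
    p = np.asarray(origin, dtype=float)
    for _ in range(max_steps):
        idx = tuple(np.round(p).astype(int))    # nearest-neighbour sample
        if any(i < 0 or i >= s for i, s in zip(idx, volume.shape)):
            return None                         # ray left the volume
        if volume[idx] >= threshold:
            return p                            # second mapping point found
        p = p + step * d
    return None
```

The returned position is the "second mapping point" in XYZ coordinates, from which the corresponding positions in the planar slices can be read off.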
6. The ultrasonic virtual endoscopic imaging system according to claim 5, characterized in that the spatial mapping module further includes:
a first converting unit, configured to convert the spatial position coordinates of a preliminary marking point or region from a planar image into the corresponding position of the preliminary marking point or region in the stereoscopic image; and
a judging unit, configured to judge whether the preliminary marking point or region comes from any one of the planar images or from the stereoscopic image, and to call the first converting unit or the second converting unit according to the judgment result.
7. The ultrasonic virtual endoscopic imaging system according to claim 6, characterized in that the first converting unit includes:
a first position unit, configured to obtain, according to the preliminary marking point or region from the planar image, the spatial position coordinates of a second mapping point or region corresponding to the preliminary marking point or region in the XYZ coordinate system;
a first computing unit, configured to calculate, based on the perspective projection imaging mode in the XYZ coordinate system, the linear distance equation from the viewpoint position through a first mapping point or region in the view plane to the second mapping point or region, together with the view plane equation;
a first solving unit, configured to solve the linear distance equation and the view plane equation jointly to obtain the coordinates of the first mapping point or region in the view plane in the XYZ coordinate system; and
a first position converting unit, configured to convert the obtained XYZ coordinates of the first mapping point or region into the coordinates of the first mapping point or region in the view plane, so as to obtain the corresponding position of the preliminary marking point or region in the stereoscopic image.
8. The ultrasonic virtual endoscopic imaging system according to claim 1 or 5, characterized in that the marking module includes:
a marking instruction receiving unit, configured to provide, on the display interface, a prompt box, key, or instruction input box for selecting a marker symbol and/or color, so as to obtain a first marking instruction; and
a marking execution unit, configured to add the marker symbol and/or color information selected in the first marking instruction to the corresponding positions of the preliminary marking point or region in all corresponding images on the display interface.
9. The ultrasonic virtual endoscopic imaging system according to claim 8, characterized in that the marking module further includes:
a transparency selecting unit, configured to provide, on the display interface, a prompt box, key, or instruction input box for selecting a fill-region transparency, so as to obtain a second marking instruction; and
a display adjustment unit, configured to adjust, according to the transparency selection information in the second marking instruction, the color opacity added to the corresponding positions of the preliminary marking point or region in all images.
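The transparency adjustment described above can be read as alpha-blending the marker color over the underlying pixel. A sketch under that assumption (the blending rule itself is not specified in the claim):

```python
def blend_marker(pixel, marker, alpha):
    """Blend an RGB marker color over an RGB pixel; `alpha` in [0, 1] is
    the fill-region opacity chosen in the second marking instruction."""
    return tuple(round(alpha * m + (1 - alpha) * p)
                 for p, m in zip(pixel, marker))
```

With `alpha = 0` the underlying image shows through untouched; with `alpha = 1` the marker fully covers it.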
10. The ultrasonic virtual endoscopic imaging system according to claim 1 or 5, characterized in that the system further includes:
a background superimposing module, configured to superimpose a ruler or a background image containing scale marker information onto the planar image and/or the stereoscopic image for display as a background.
11. The ultrasonic virtual endoscopic imaging system according to claim 10, characterized in that the system further includes:
a scale setting unit, configured to provide, on the display interface, a prompt box, key, or instruction input box for selecting the graduation of the ruler or of the scale marker information and/or the image scale, so as to obtain a scale adjustment instruction; and
a scale adjustment unit, configured to adjust, according to the graduation and/or image scale information in the scale adjustment instruction, the graduation of the ruler or the scale marker information in the background image.
12. The ultrasonic virtual endoscopic imaging system according to claim 1 or 5, characterized in that the system further includes:
a distance measuring module, configured to measure the distance between the positions of two preliminary marking points or regions obtained by the point selecting module; and
a distance marking module, configured to superimpose information containing the distance onto the planar image and/or the stereoscopic image for display.
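The distance measurement in this claim reduces to the Euclidean distance between the two marked positions in XYZ space. A minimal sketch, where the per-axis voxel spacing is a hypothetical addition for converting to physical units:

```python
import math

def marked_distance(p1, p2, spacing=(1.0, 1.0, 1.0)):
    """Euclidean distance between two preliminary marking points,
    optionally scaled per axis by the voxel spacing (e.g. in mm)."""
    return math.sqrt(sum(((a - b) * s) ** 2
                         for a, b, s in zip(p1, p2, spacing)))
```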
13. The ultrasonic virtual endoscopic imaging system according to claim 10, characterized in that the system further includes:
a guide-line generating module, configured to generate, when the position of the preliminary marking point or region is selected, vertical and/or parallel lines respectively in all images on the display interface, and to associate them with the scale marker information in the ruler or the background image.
14. An ultrasonic virtual endoscopic imaging method, characterized in that the method includes:
displaying, on the same display interface, at least one planar image and at least one stereoscopic image obtained by virtual endoscopy;
obtaining a preliminary marking point or region from any image displayed on the display interface;
obtaining, according to the spatial position coordinates of the preliminary marking point or region and based on a spatial mapping relationship, the positions of the preliminary marking point or region in all corresponding images on the display interface; and
marking the corresponding positions of the preliminary marking point or region in all corresponding images on the display interface;
wherein the step of obtaining the positions of the preliminary marking point or region in all corresponding images on the display interface includes:
judging whether the preliminary marking point or region comes from any one of the planar images; and
if the preliminary marking point or region comes from the planar image, converting the spatial position coordinates of the preliminary marking point or region from the planar image into the corresponding position of the preliminary marking point or region in the stereoscopic image;
wherein the step of converting the spatial position coordinates of the preliminary marking point or region from the planar image into the corresponding position of the preliminary marking point or region in the stereoscopic image includes:
obtaining, according to the preliminary marking point or region from the planar image, the spatial position coordinates of a second mapping point or region corresponding to the preliminary marking point or region in the XYZ coordinate system;
calculating, based on the perspective projection imaging mode in the XYZ coordinate system, the linear distance equation from the viewpoint position through a first mapping point or region in the view plane to the second mapping point or region, together with the view plane equation;
solving the linear distance equation and the view plane equation jointly to obtain the coordinates of the first mapping point or region in the view plane in the XYZ coordinate system; and
converting the obtained XYZ coordinates of the first mapping point or region into the coordinates of the first mapping point or region in the view plane, so as to obtain the corresponding position of the preliminary marking point or region in the stereoscopic image.
15. The ultrasonic virtual endoscopic imaging method according to claim 14, characterized in that the method further includes:
if the preliminary marking point or region comes from a planar image, the step of converting the spatial position coordinates of the preliminary marking point or region from any one of the planar images into the corresponding positions of the preliminary marking point or region in the other planar images.
16. An ultrasonic virtual endoscopic imaging method, characterized in that the method includes:
displaying, on the same display interface, at least one planar image and at least one stereoscopic image obtained by virtual endoscopy;
obtaining a preliminary marking point or region from any image displayed on the display interface;
obtaining, according to the spatial position coordinates of the preliminary marking point or region and based on a spatial mapping relationship, the positions of the preliminary marking point or region in all corresponding images on the display interface; and
marking the corresponding positions of the preliminary marking point or region in all corresponding images on the display interface;
wherein the step of obtaining the positions of the preliminary marking point or region in all corresponding images on the display interface includes:
judging whether the preliminary marking point or region comes from the stereoscopic image; and
if the preliminary marking point or region comes from the stereoscopic image, converting the spatial position coordinates of the preliminary marking point or region from the stereoscopic image into the corresponding positions of the preliminary marking point or region in all planar images, by:
obtaining, according to the preliminary marking point or region from the stereoscopic image, the spatial position coordinates of a corresponding first mapping point or region in the view plane under the perspective projection imaging mode;
converting the spatial position coordinates of the first mapping point or region in the view plane into coordinates in the XYZ coordinate system;
calculating, based on the linear distance equation established from the viewpoint position through the first mapping point or region to a second mapping point or region corresponding to the preliminary marking point or region in the XYZ coordinate system, a ray casting vector that casts and traces a ray from the XYZ coordinates of the first mapping point or region;
establishing the spatial position of the second mapping point or region in the XYZ coordinate system according to the ray casting vector and a sampling interval, and judging whether the spatial position of the second mapping point or region satisfies a preset condition at each sampling instant; and
obtaining, according to the spatial position of the second mapping point or region when the preset condition is satisfied, the corresponding positions of the preliminary marking point or region in all planar images.
17. The ultrasonic virtual endoscopic imaging method according to claim 14 or 16, characterized in that the step of marking the corresponding positions of the preliminary marking point or region in all corresponding images on the display interface includes:
providing, on the display interface, a prompt box, key, or instruction input box for selecting a marker symbol and/or color, so as to obtain a first marking instruction; and
adding the marker symbol and/or color information selected in the first marking instruction to the corresponding positions of the preliminary marking point or region in all corresponding images on the display interface.
18. The ultrasonic virtual endoscopic imaging method according to claim 17, characterized in that the step of marking the corresponding positions of the preliminary marking point or region in all corresponding images on the display interface further includes:
providing, on the display interface, a prompt box, key, or instruction input box for selecting a fill-region transparency, so as to obtain a second marking instruction; and
adjusting, according to the transparency selection information in the second marking instruction, the color opacity added to the corresponding positions of the preliminary marking point or region in all images.
19. The ultrasonic virtual endoscopic imaging method according to claim 14 or 16, characterized in that the method further includes: superimposing a ruler or a background image containing scale marker information onto the planar image and/or the stereoscopic image for display as a background.
20. The ultrasonic virtual endoscopic imaging method according to claim 19, characterized in that the method further includes: providing, on the display interface, a prompt box, key, or instruction input box for selecting the graduation of the ruler or of the scale marker information and/or the image scale, so as to obtain a scale adjustment instruction; and adjusting, according to the graduation and/or image scale information in the scale adjustment instruction, the graduation of the ruler or the scale marker information in the background image.
21. The ultrasonic virtual endoscopic imaging method according to claim 14 or 16, characterized in that the method further includes: measuring the distance between the positions of two preliminary marking points or regions, and superimposing information containing the distance onto the planar image and/or the stereoscopic image for display.
22. The ultrasonic virtual endoscopic imaging method according to claim 19, characterized in that the method further includes: when the position of the preliminary marking point or region is selected, generating vertical and/or parallel lines respectively in all images on the display interface, and associating them with the scale marker information in the ruler or the background image.
23. An ultrasonic virtual endoscopic imaging apparatus containing the ultrasonic virtual endoscopic imaging system according to any one of claims 1-13, characterized in that the apparatus includes:
an image acquisition module, configured to transmit and receive ultrasonic waves to obtain three-dimensional image data of a target tissue;
a memory, configured to store the three-dimensional image data and the results of operations performed on the three-dimensional image data;
a human-computer interaction module, configured to obtain instructions for operating on the three-dimensional image data;
a display, configured to simultaneously display at least one planar image and at least one stereoscopic image obtained by virtual endoscopy; and
a processor module, configured to obtain the position of a preliminary marking point or region from any image displayed on the display interface, to obtain, according to the spatial position coordinates of the preliminary marking point or region and based on a spatial mapping relationship, the positions of the preliminary marking point or region in all corresponding images on the display interface, and to mark the corresponding positions of the preliminary marking point or region in all corresponding images on the display interface.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2014/088141 WO2016054775A1 (en) | 2014-10-08 | 2014-10-08 | Ultrasonic virtual endoscopic imaging system and method, and apparatus thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106028943A CN106028943A (en) | 2016-10-12 |
CN106028943B true CN106028943B (en) | 2019-04-12 |
Family
ID=55652464
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201480075906.8A Active CN106028943B (en) | 2014-10-08 | 2014-10-08 | Ultrasonic virtual endoscopic imaging system and method and device thereof |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN106028943B (en) |
WO (1) | WO2016054775A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018214063A1 (en) * | 2017-05-24 | 2018-11-29 | 深圳迈瑞生物医疗电子股份有限公司 | Ultrasonic device and three-dimensional ultrasonic image display method therefor |
CN109044400A (en) * | 2018-08-31 | 2018-12-21 | 上海联影医疗科技有限公司 | Ultrasound image annotation method, apparatus, processor, and readable storage medium |
CN109345632B (en) * | 2018-09-17 | 2023-04-07 | 深圳达闼科技控股有限公司 | Method for acquiring image, related device and readable storage medium |
CN110989901B (en) * | 2019-11-29 | 2022-01-18 | 北京市商汤科技开发有限公司 | Interactive display method and device for image positioning, electronic equipment and storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101166470A (en) * | 2005-04-28 | 2008-04-23 | 株式会社日立医药 | Image display device and program |
CN101647717A (en) * | 2008-08-13 | 2010-02-17 | 株式会社东芝 | Ultrasonic diagnostic apparatus, ultrasonic image display apparatus, and medical image diagnostic apparatus |
CN103565470A (en) * | 2012-08-07 | 2014-02-12 | 香港理工大学 | Ultrasonic image automatic annotating method and system based on three-dimensional virtual image |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10246355A1 (en) * | 2002-10-04 | 2004-04-15 | Rust, Georg-Friedemann, Dr. | Interactive virtual endoscopy method, requires two representations of three-dimensional data record with computed relative position of marked image zone in one representation |
US20060173324A1 (en) * | 2003-03-13 | 2006-08-03 | Koninklijke Philips Electronics N.V. | 3d imaging system and method for signaling an object of interest in a volume of data |
JP4177217B2 (en) * | 2003-09-24 | 2008-11-05 | アロカ株式会社 | Ultrasonic diagnostic equipment |
2014
- 2014-10-08 CN CN201480075906.8A patent/CN106028943B/en active Active
- 2014-10-08 WO PCT/CN2014/088141 patent/WO2016054775A1/en active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101166470A (en) * | 2005-04-28 | 2008-04-23 | 株式会社日立医药 | Image display device and program |
CN101647717A (en) * | 2008-08-13 | 2010-02-17 | 株式会社东芝 | Ultrasonic diagnostic apparatus, ultrasonic image display apparatus, and medical image diagnostic apparatus |
CN103565470A (en) * | 2012-08-07 | 2014-02-12 | 香港理工大学 | Ultrasonic image automatic annotating method and system based on three-dimensional virtual image |
Also Published As
Publication number | Publication date |
---|---|
CN106028943A (en) | 2016-10-12 |
WO2016054775A1 (en) | 2016-04-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5551957B2 (en) | Projection image generation apparatus, operation method thereof, and projection image generation program | |
JP5417609B2 (en) | Medical diagnostic imaging equipment | |
US9940747B2 (en) | Mapping 3D to 2D images | |
KR20030060492A (en) | Apparatus and method for displaying virtual endoscopy diaplay | |
CN106028943B (en) | Ultrasonic virtual endoscopic imaging system and method and device thereof | |
ITUB20155830A1 (en) | "NAVIGATION, TRACKING AND GUIDE SYSTEM FOR THE POSITIONING OF OPERATOR INSTRUMENTS" | |
JP2007195970A (en) | Tomographic system and method of visualization of tomographic display | |
CN105769240A (en) | Apparatus and method of displaying medical image | |
JP6112689B1 (en) | Superimposed image display system | |
JP2013017577A (en) | Image processing system, device, method, and medical image diagnostic device | |
JP2000279425A (en) | Navigation device | |
US8902305B2 (en) | System and method for managing face data | |
US20230309954A1 (en) | System for visualization and control of surgical devices utilizing a graphical user interface | |
CN113143463B (en) | Operation navigation device, system, calibration method, medium and electronic equipment | |
WO2014050019A1 (en) | Method and device for generating virtual endoscope image, and program | |
WO2016133847A1 (en) | Systems and methods for medical visualization | |
CN103593869A (en) | Scanning equipment and image display method thereof | |
US20140055448A1 (en) | 3D Image Navigation Method | |
KR20130059092A (en) | The method and apparatus combining a plurality of 2-dimensional images with 3-dimensional model | |
CN103654851B | Method and apparatus for displaying stereoscopic information related to an ultrasound cross-section | |
JP5268229B2 (en) | 3D vector quantity visualization method and apparatus | |
JP5974238B2 (en) | Image processing system, apparatus, method, and medical image diagnostic apparatus | |
WO2016190328A1 (en) | Ultrasonic diagnostic device | |
KR101611484B1 (en) | Method of providing medical image | |
JP2008067915A (en) | Medical picture display |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |