CN104699235B - Ultrasound-based three-dimensional spatial imaging interaction method and system - Google Patents
- Publication number: CN104699235B
- Application number: CN201410217954.8A
- Authority
- CN
- China
- Prior art keywords: module, dimensions, ultrasonic, ultrasonic wave, laser
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- A61B8/00 — Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B5/0033 — Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus
- A61B5/055 — Detecting, measuring or recording for diagnosis involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
- A61B8/466 — Displaying means of special interest adapted to display 3D data
- A61B8/481 — Diagnostic techniques involving the use of contrast agent, e.g. microbubbles introduced into the bloodstream
- G01N29/06 — Visualisation of the interior, e.g. acoustic microscopy
- G01N29/0609 — Display arrangements, e.g. colour displays
Abstract
The invention discloses an ultrasound-based three-dimensional spatial imaging interaction method and system. Unlike existing holographic imaging techniques, the method first establishes a three-dimensional spatial coordinate system, generates a target three-dimensional space, and releases into it ultrasonic microbubbles that respond to an ultrasound field. Ultrasonic waves then form an ultrasound field in the target space and generate potential-well positions in it; the microbubbles are driven through the target space and stabilized at those positions, while modulated lasers color them, so that a three-dimensional image forms in the target space. By repeating these steps continuously, the three-dimensional image is presented to the user as a 3D picture. The method and system not only display 3D pictures to the user but also let the user interact with them: during interaction the user experiences the true touch of the 3D picture, and the three-dimensional image retains its original structure.
Description
Technical field
The present invention relates to the field of three-dimensional spatial imaging interaction, and in particular to an ultrasound-based three-dimensional spatial imaging interaction method and system.
Background art
As science and technology develop, interaction between users and electronic devices has steadily evolved from the virtual toward the real: from the one-dimensional platform of Amazon's one-click ordering, to the two-dimensional finger-swipe surface of the smartphone, to three-dimensional gesture platforms such as Leap Motion and Kinect, interaction between user and device draws ever closer to reality.
Traditional three-dimensional imaging methods require a fixed imaging plane, such as a screen, a water curtain, or an air curtain. The fixed plane provides a continuous, stable reflecting surface that reflects the light of a source in space and thereby presents the corresponding colors.
With the popularity of 3D films and the development of holographic imaging, the development of three-dimensional spatial imaging has attracted growing attention from researchers in China and abroad. However, the images generated by current holographic techniques cannot give the user real feedback: the user cannot feel any physical touch while interacting with the 3D picture.
Summary of the invention
In view of the shortcomings of current imaging techniques, the technical problem addressed by the invention is to provide an ultrasound-based three-dimensional spatial imaging interaction method and system. The method needs no fixed imaging plane and enables real-time visual and tactile interaction between the user and the 3D picture: while observing the 3D picture in three-dimensional space, the user can also interact with it and receive its genuine touch and pressure feedback.
The ultrasound-based three-dimensional spatial imaging interaction method provided by the invention differs from existing holographic techniques: it not only displays 3D pictures to the user but also lets the user feel their true touch while interacting with them, and the generated 3D pictures are visible from every direction to observers at arbitrary positions. The 3D picture may be a static three-dimensional image, a dynamic three-dimensional image synthesized from the motion of several three-dimensional images, a three-dimensional video image, or a two-dimensional curved-surface image generated within the three-dimensional space (as shown in Figure 11).
A three-dimensional image in the sense of the invention is an image composed of all the three-dimensional spatial image points of at least one frame time T. The image may be three-dimensional; it may also be two-dimensional, i.e. a three-dimensional image with one dimension compressed to a minimum, or one-dimensional, i.e. a three-dimensional image with two dimensions compressed to a minimum.
The frame time T of the invention is the time in which the particulate matter traverses all the image points in the three-dimensional space. T is shorter than the persistence-of-vision time of the human eye, about one twenty-fourth of a second, which guarantees that the whole three-dimensional image appears continuous to the eye.
A three-dimensional spatial image point of the invention is each pixel composing the three-dimensional image: a point at which an ultrasonic microbubble, after the lasers of three colors are focused on it, reflects the laser light in the target three-dimensional space and presents the corresponding color. Each point carries 7 attribute values: one time attribute t, three spatial position attributes (r, θ, φ), and three color attributes (R, G, B).
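The seven attribute values of a three-dimensional spatial image point can be sketched as a small data structure. This is an illustrative sketch, not part of the patent; the class and field names are assumptions:

```python
from dataclasses import dataclass

@dataclass
class ImagePoint:
    """One three-dimensional spatial image point: 1 time attribute,
    3 spherical-coordinate attributes, 3 color attributes (7 values)."""
    t: float      # display moment within the frame (s)
    r: float      # radial distance (m)
    theta: float  # polar angle (rad)
    phi: float    # azimuthal angle (rad)
    R: float      # red laser intensity scalar
    G: float      # green laser intensity scalar
    B: float      # blue laser intensity scalar

# A single white point half a metre from the origin, drawn at t = 0:
p = ImagePoint(t=0.0, r=0.5, theta=1.57, phi=0.0, R=1.0, G=1.0, B=1.0)
print(len(ImagePoint.__dataclass_fields__))  # 7 attribute values, as in the text
```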
The target three-dimensional space is the three-dimensional space in which the three-dimensional image is presented. Unlike the two-dimensional curtain on which other three-dimensional imaging methods rely, it has spatial depth in addition to the curtain's height and width; and unlike a curtain's reflection of a light source, presentation of the image in the target space depends on the ultrasonic microbubbles' reflection of the lasers. In the target space only the ultrasonic microbubbles can reflect the laser light and present image points; apart from the microbubbles there is no other material in the target space that reflects the laser and presents image points (excluding the objects with which the user interacts with the image points, such as the arm, hand or pointing stick the user employs to issue instructions). The microbubbles respond to the ultrasound field: their motion is controlled by the field, and they can move to and stabilize at the field's potential-well positions, where they reflect the lasers of three colors and present the corresponding color. A potential well of the ultrasound field can be placed at any position in the target space, i.e. every position in the target space is reachable by the microbubbles.
An ultrasonic microbubble of the invention is an object that responds to an ultrasound field and moves toward the field's potential-well positions; the ultrasound field of the invention is the field formed by the ultrasonic waves.
The corresponding color of the invention is the color presented when the red, green and blue lasers are emitted at the intensities given by the color scalar values (R', G', B') and mixed on the particulate matter, where R' is the intensity value of the red laser, G' the intensity value of the green laser, and B' the intensity value of the blue laser.
Further, the ultrasound-based three-dimensional spatial imaging interaction method provided by the invention can also locate the position of the user's hand, recognize the corresponding gesture, obtain a user instruction, interact in real time with the generated three-dimensional image, and operate on changes of image characteristics. The hand position comprises the location of the user's hand in the three-dimensional space and the posture of the hand. The changes of image characteristics include changes of image attributes such as shape, color, size, position and attitude; an attitude change is a change of the image's pitch angle and rotation angle under the three-dimensional spherical coordinate system.
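An attitude change of this kind, a shift of pitch and rotation angle under the spherical coordinate system, can be sketched per point as follows. The function name and the angle-wrapping convention are illustrative assumptions, not from the patent:

```python
import math

def rotate_point(r, theta, phi, d_pitch, d_rot):
    """Attitude change of one image point under the spherical coordinate
    system: shift the pitch (polar) angle and the rotation (azimuthal)
    angle; the radial distance r is unchanged by a pure rotation."""
    theta = (theta + d_pitch) % math.pi       # pitch angle kept in [0, pi)
    phi = (phi + d_rot) % (2 * math.pi)       # rotation angle kept in [0, 2*pi)
    return r, theta, phi

# A quarter turn about the vertical axis leaves r untouched:
r, theta, phi = rotate_point(0.5, math.pi / 4, 0.0, 0.0, math.pi / 2)
print(r, round(phi, 6))  # 0.5 1.570796
```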
Further, the ultrasound-based three-dimensional spatial imaging interaction method provided by the invention can also recognize in real time the user's position within the generated three-dimensional image, ensuring that the image keeps its original structure during interaction: the integrity of the three-dimensional image is not damaged, and the proportions of its components remain unchanged.
The user instruction is any one or more of: expanding, closing, rotating, switching, scaling, moving, folding or merging the image; section display; part-to-whole transformation; image inversion; and detail display.
Image rotation means the image rotates about its center or a rotation axis.
Image merging means edge fusion between different images.
Image detail display means showing the detailed information contained in a local part of the image; the detailed information may be attributes such as the length of the local part or the hardness of the material the image presents.
To solve the technical problem of the invention, the technical scheme is as follows:
An ultrasound-based three-dimensional spatial imaging interaction method, comprising the following steps:
Step 1: establish a three-dimensional spherical coordinate system, generate the target three-dimensional space, and let ultrasonic microbubbles enter the target space;
Step 2: obtain the parameter information of all the three-dimensional spatial image points in the three-dimensional image; the parameters of a point comprise its moment t, its spherical coordinates (r, θ, φ) and its color scalar values (R, G, B);
Step 3: obtain the position information of the user in the target space at moment t and, according to that position, adjust and generate the spherical coordinates (r', θ', φ') and color scalar values (R', G', B') of the image points at moment t; that is, this step takes the user's position in the target space, adjusts the original coordinates (r, θ, φ) and colors (R, G, B) of the points at moment t, and generates the new coordinates (r', θ', φ') and colors (R', G', B');
Step 4: according to the spherical coordinates (r', θ', φ') of the image points at moment t, generate the ultrasound field; the field controls the motion of the microbubbles, gives them the corresponding velocity so that they deliver the corresponding pressure feedback to the user on contact, and stabilizes them at the potential-well positions of the field in the target space;
Step 5: according to the spherical coordinates (r', θ', φ') of the image points at moment t, adjust the laser directions; according to the color scalar values (R', G', B') at moment t, adjust the laser intensities; emit the red, green and blue lasers, focus them on the microbubbles so that they present the corresponding colors; then stop emitting and await the next instruction;
Step 6: repeat steps 2–5 so that within one frame time T the microbubbles traverse all the image points in the target space, generating the three-dimensional image in the target space.
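Steps 2 to 5, repeated over all image points within one frame time T, can be sketched as a control loop. All callables here are hypothetical stand-ins for the hardware modules described in the text, not the patent's implementation:

```python
import time

def render_frame(image_points, locate_user, adjust_point, drive_field, fire_lasers):
    """One frame of the interaction loop (steps 2-5 over every point).
    Returns the elapsed time, which must stay below the frame time T."""
    start = time.monotonic()
    for point in image_points:                 # step 2: per-point parameters
        user_pos = locate_user()               # step 3: locate the user
        coords, color = adjust_point(point, user_pos)
        drive_field(coords)                    # step 4: trap a microbubble at (r', theta', phi')
        fire_lasers(coords, color)             # step 5: color it, then stop the lasers
    return time.monotonic() - start

# Dry run with stub modules and three points of a toy image:
points = [((0.1, 0.0, 0.0), (255, 0, 0)),
          ((0.1, 1.0, 0.0), (0, 255, 0)),
          ((0.1, 1.0, 1.0), (0, 0, 255))]
elapsed = render_frame(points,
                       locate_user=lambda: None,
                       adjust_point=lambda p, u: p,
                       drive_field=lambda c: None,
                       fire_lasers=lambda c, col: None)
print(elapsed < 1 / 24)  # stubs finish well inside one frame time T
```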
The frame time T is the time required for the microbubbles to traverse all the three-dimensional spatial image points in the target space, i.e. the time needed to complete the whole three-dimensional image; T is shorter than the persistence-of-vision time of the human eye, guaranteeing that the whole image is captured by the eye.
Traversing all the image points in the target space means the microbubbles, following the order of the moments t, present the corresponding color scalar values (R', G', B') at the spherical coordinates (r', θ', φ') of all the points composing the image. All the image points in the target space are the points generated in the target space within one frame time T, as the points at each moment t are influenced by the user's position and/or user operation instruction at that moment.
Within one frame time T, the mapping between the three-dimensional spatial image points and the image generated in the target space is as shown in Figure 10. The original spherical coordinates (r, θ, φ) and color scalar values (R, G, B) of a point may be transformed into the new coordinates (r', θ', φ') and colors (R', G', B') by the influence of the user's position at moment t alone, by the combined influence of the user's position and the user operation instruction at moment t, or by the influence of the user operation instruction at moment t alone.
In step 4 of the ultrasound-based three-dimensional spatial imaging interaction method, the ultrasound field controlling the motion of the microbubbles means the microbubbles couple synchronously with the field: the resonant frequency ω_c of a microbubble equals the frequency ω_e of the ultrasound field, the microbubble is sensitive to the field, and it is acted on by the field. The field force comprises the forces along one or more uncorrelated gradient vector directions at the microbubble's position, reflecting in real time the distribution of the ultrasound field within the microbubble's sensing range.
Microbubbles meeting the above condition are selected by resonant-frequency matching, which comprises the direct frequency determination method and the frequency-scanning determination method.
The direct frequency determination method computes the resonant frequency from the mass m and elasticity coefficient k of the microbubble. The coefficient k depends on the bubble's external dimensions and hardness; both m and k can be obtained by looking up the corresponding material parameters. The resonant frequency ω_c of the microbubble is then calculated as

ω_c = √(k / m)
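Under this mass-spring reading of the direct determination method, the resonance computation and a matching check can be sketched as follows; the relative tolerance used for matching is an illustrative choice, not a value from the patent:

```python
import math

def resonant_frequency(m, k):
    """Direct determination: treat the microbubble as a mass-spring
    oscillator of mass m (kg) and elasticity coefficient k (N/m),
    so that omega_c = sqrt(k / m) (rad/s)."""
    return math.sqrt(k / m)

def matches_field(m, k, omega_e, rel_tol=0.01):
    """Resonant-frequency matching: keep a bubble whose omega_c lies
    within a relative tolerance of the field frequency omega_e."""
    return abs(resonant_frequency(m, k) - omega_e) <= rel_tol * omega_e

print(resonant_frequency(1e-12, 1.0))    # 1000000.0 rad/s for a picogram bubble
print(matches_field(1e-12, 1.0, 1.0e6))  # True
```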
The frequency-scanning determination method continuously sweeps the frequency of the ultrasonic source and determines the resonant frequency of the microbubble by observing its response in ultrasound fields of the same intensity but different frequencies, or measures it directly with a frequency meter.
According to the spherical coordinates (r', θ', φ') of the image points at moment t, multiple ultrasonic generators produce an ultrasound field whose potential wells are located at the coordinate points (r', θ', φ') in the target space; the microbubbles move toward the wells of the field and finally stabilize there. Precisely because the coordinates (r', θ', φ') of the image points coincide with the potential-well positions (r', θ', φ'), the particulate matter can present the corresponding color at the corresponding position and become a three-dimensional spatial image point.
Preferably, the ultrasound-based three-dimensional spatial imaging interaction method provided by the invention can further recognize user operation instructions and, based on them, perform the corresponding transformation on the generated three-dimensional image, comprising the following steps:
Step 1: establish a three-dimensional spherical coordinate system, generate the target three-dimensional space, and let ultrasonic microbubbles enter the target space;
Step 2: obtain the parameter information of all the three-dimensional spatial image points in the image; the parameters of a point comprise its moment t, its spherical coordinates (r, θ, φ) and its color scalar values (R, G, B);
Step 3: obtain the user's position in the target space at moment t and, combining the user's positions in the target space over the n frame times T before t, analyze and obtain the user operation instruction; then, according to the user's position at moment t and the obtained instruction, adjust and generate the spherical coordinates (r', θ', φ') and color scalar values (R', G', B') of the image points at moment t. The user position information comprises the current moment t_m and the spherical coordinates (r_m, θ_m, φ_m) of the user's fingertip. The frame time T is the time required for the microbubbles to traverse all the image points in the target space;
Step 4: according to the spherical coordinates (r', θ', φ') of the image points at moment t, generate the ultrasound field; the field controls the motion of the microbubbles, gives them the corresponding velocity so that they deliver the corresponding pressure feedback to the user on contact, and stabilizes them at the potential-well positions of the field in the target space;
Step 5: according to the spherical coordinates (r', θ', φ') of the image points at moment t, adjust the laser directions; according to the color scalar values (R', G', B') at moment t, adjust the laser intensities; emit the red, green and blue lasers focused on the microbubbles so that they present the corresponding colors; then stop emitting and await the next instruction;
Step 6: repeat steps 2–5 so that within one frame time T the microbubbles traverse all the image points in the target space, generating the three-dimensional image in the target space.
The user instruction is any one or more of: expanding, closing, rotating, switching, scaling, moving, folding or merging the image; section display; part-to-whole transformation; image inversion; and detail display.
Image rotation means the image rotates about its center or a rotation axis.
Image merging means edge fusion between different images.
Image detail display means showing the detailed information contained in a local part of the image; the detailed information may be attributes such as the length of the local part or the hardness of the material the image presents.
Preferably, the invention also provides an ultrasound-based three-dimensional spatial imaging interaction method that not only further recognizes user operation instructions but also controls the emission or stopping of the lasers according to the user's spatial relation to the three-dimensional image generated in the previous frame time T and/or the user operation instruction, ensuring that the integrity of the presented image is not disturbed by the user's operations. Its steps are as follows:
Step 1: establish a three-dimensional spherical coordinate system, generate the target three-dimensional space, and let ultrasonic microbubbles enter the target space;
Step 2: obtain the parameter information of all the three-dimensional spatial image points in the image; the parameters of a point comprise its moment t, its spherical coordinates (r, θ, φ) and its color scalar values (R, G, B);
Step 3: obtain the user's position in the target space at moment t and, combining the user's positions over the n frame times T before t, analyze and obtain the user operation instruction; then, according to the user's position in the target space at moment t and the instruction, adjust and generate the spherical coordinates (r', θ', φ') and color scalar values (R', G', B') of the image points. The user position information comprises the current moment t_m and the spherical coordinates (r_m, θ_m, φ_m) of the user's fingertip. The frame time T is the time required for the microbubbles to traverse all the image points, i.e. the time needed to complete the whole three-dimensional image; it is shorter than the persistence-of-vision time of the human eye, guaranteeing that the whole image is captured by the eye.
Step 4: according to the spherical coordinates (r', θ', φ') of the image points at moment t, generate the ultrasound field; the field controls the motion of the microbubbles, gives them the corresponding velocity so that they deliver the corresponding pressure feedback to the user on contact, and stabilizes them at the potential-well positions of the field in the target space;
Step 5: according to the user's spatial relation to the image generated in the previous frame time T and/or the user operation instruction, control the emission or stopping of the lasers; if emitting, adjust the laser directions according to the spherical coordinates (r', θ', φ') of the image points at moment t and the laser intensities according to the color scalar values (R', G', B'); emit the red, green and blue lasers focused on the microbubbles so that they present the corresponding colors; then stop emitting and await the next instruction;
Step 6: repeat steps 2–5 so that within one frame time T the microbubbles traverse all the image points, generating the three-dimensional image in the target space.
The invention also provides an ultrasound-based three-dimensional spatial imaging interaction system corresponding to the method. The interaction system comprises an ultrasonic generation module, an ultrasonic microbubble generation module, a laser generation module, an interactive information acquisition module, a first image storage, analysis and processing module, and a power module (its block diagram is shown in Figure 9), wherein:
The ultrasonic generation module emits ultrasonic waves, forms the ultrasound field, and forms potential wells in the three-dimensional space, thereby controlling the microbubbles to move to the well positions and stabilize there. Preferably, the module consists of at least three ultrasonic generators and at least one ultrasonic controller. The controller receives the ultrasonic control signal sent by the first image storage, analysis and processing module and sends execution parameters to each generator to control the direction, intensity and phase of the emitted ultrasonic waves; each generator receives the execution parameters sent by the controller and emits ultrasonic waves of the corresponding direction, intensity and phase, forming the ultrasound field and the potential wells in the space, thereby controlling the microbubbles to move to the well positions and stabilize there.
Specifically, the method by which the microbubbles stabilize at the potential-well positions of the ultrasound field in the target space is as follows:
The ultrasonic generation module consists of n (n ≥ 3) ultrasonic generators and at least one ultrasonic controller; the potential-well position corresponds to a set of execution parameters for each generator. Then, in the target three-dimensional space, the potential energy V_e(r_i) of a microbubble at each point of the ultrasound field is calculated from the superposition of the generators' contributions, where E_k is the energy emitted by the k-th generator, (r_ik, θ_ik, φ_ik) are the execution parameters of the k-th generator at moment t_i with itself as the origin of its spherical coordinate system, the phase adjustment parameter is that of the k-th generator at moment t_i, K is the proportionality constant of the ultrasound field, and q_ik is the intensity of the k-th ultrasonic wave at moment t_i.
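Since the patent's own expression for V_e(r_i) is not reproduced here, the superposition idea — each of the n generators contributing an amplitude set by its energy E_k, intensity q_ik and phase — can only be illustrated with a generic model. The inverse-distance spherical-wave form and the squared-magnitude convention below are assumptions, not the patent's formula:

```python
import cmath
import math

def field_potential(point, generators, K=1.0):
    """Illustrative superposition model: each generator contributes a
    spherical wave of amplitude E * q / distance with a phase offset;
    the potential energy at a point is taken proportional (constant K)
    to the squared magnitude of the summed complex amplitudes."""
    total = 0j
    for g in generators:
        distance = math.dist(point, g["pos"])            # r_ik from generator k
        amplitude = g["E"] * g["q"] / max(distance, 1e-9)
        total += amplitude * cmath.exp(1j * g["phase"])  # phase adjustment of generator k
    return K * abs(total) ** 2

# Three generators (the minimum n >= 3), one driven in antiphase:
gens = [{"pos": (0, 0, 0), "E": 1.0, "q": 1.0, "phase": 0.0},
        {"pos": (1, 0, 0), "E": 1.0, "q": 1.0, "phase": math.pi},
        {"pos": (0, 1, 0), "E": 1.0, "q": 1.0, "phase": 0.0}]
print(round(field_potential((0.5, 0.5, 0.5), gens), 6))
```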
Specifically, the microbubbles respond to the ultrasound field and form a microbubble density distribution. At moment t, at spatial position r_i, the pseudo-potential of the density distribution is determined by the microbubble density at that position:

V_c(r_i, t) = σ C(r_i, t)

where C(r_i, t) is the microbubble density at r_i, and σ is the density-to-potential-energy conversion coefficient of the microbubble density distribution within the range ε_i. σ can be obtained by measuring, in the equilibrium state, the microbubble density C_0 at a point r_i of an ultrasound field of potential energy V_0; the calculation formula is

σ = V_0 / C_0
At spatial position r and time t, the potential energy of the induced composite expression field is defined as:
V(r, t) = Vc(r, t) - Ve(r, t)
where V is the potential energy of the microbubbles in the induced composite expression field, Vc is the pseudo-potential energy of the microbubble density distribution, and Ve is the potential energy of the microbubbles in the ultrasound field;
At spatial position r and time t, the quantum superposition plane-wave function ψ(r, t) of an ultrasonic microbubble can be expressed as follows:
where A(r) is the amplitude of the wave function and ωc is its frequency, satisfying ωc = ωe, with ωe the frequency of the ultrasound field.
Further, since the microbubbles interact with the ultrasound field, the energy E(ri) by which a microbubble at position ri is changed per unit time under the gradient force of the ultrasound field is related to the potential difference ΔV(ri) of the induced composite expression field by:
E(ri)ψ(ri) = ΔV(ri)
The variation of the potential difference ΔV(ri) of the induced composite expression field with time t can be calculated by the following equation:
ΔV(ri, t) = ΔVc(ri, t) - ΔVe(ri, t)
where ΔVc(ri, t) is the pseudo-potential difference of the microbubble density distribution, ΔVe(ri, t) is the potential difference of the microbubbles in the ultrasound field, and N is the number of positions related to ri within the range of i.
Without additional potential energy, for the potential energy of the induced composite expression field at positions ri and rj to reach an equilibrium state, the following relation must be satisfied:
From the above equation, the potential variation with which the induced composite expression field at positions ri and rj reaches equilibrium is:
At time t, the resultant force on a microbubble at spatial position ri under the gradient of the induced composite expression field is calculated as follows:
where |cij|² is the ratio of the number of driven microbubbles in the gradient vector direction to the total number of excited microbubbles.
Therefore, the quantum superposition state of a microbubble at position ri can be expressed as follows.
Accordingly, the time-independent Schrödinger equation describing the motion state of the microbubble at time t is as follows:
where ψ(ri) is the wave function of the microbubble at position ri, m is the microbubble mass, and ħ is the reduced Planck constant.
When the kinetic energies of the microbubbles at ri are approximately equal, the above equation becomes:
[T(ri) + U(ri)]ψ(ri) = E·ψ(ri)
where T(ri) is the kinetic energy and U(ri) the potential energy of the microbubble at position ri.
Given E(ri)ψ(ri) = ΔV(ri), we then have:
The normal velocity component u and the tangential velocity component v of a microbubble along the equipotential surface are calculated from the Hamilton-Jacobi equation as follows:
When u = v, the tangential velocity component of the microbubble along the equipotential surface is 0, no collisions occur during microbubble motion, and the motion is most efficient; it then readily follows that the change in microbubble energy and the change in potential energy satisfy E = 2U.
From the above derivation, at time t all microbubbles at spatial position ri move along the corresponding gradient directions of the induced composite expression field; no collisions occur between the microbubbles, and the most efficient self-organization is guaranteed.
The velocity with which the microbubbles at position ri self-organize toward position rj is then calculated as follows:
When V(ri, t) - V(rj, t) > 0, vij > 0 and the microbubbles at position ri move toward rj; when V(ri, t) - V(rj, t) < 0, vij < 0 and the microbubbles move from position rj toward ri; when V(ri, t) - V(rj, t) = 0, vij = 0 and the microbubbles at ri are in equilibrium relative to rj. The microbubbles therefore ultimately stabilize at the potential-well position r of the ultrasound field where V(r, t) is minimal.
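The self-organization rule above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the patent's implementation: `potential_energy_field` is a hypothetical stand-in for the field-potential expression (the patent's own formula is not reproduced in the text), while `drift_direction` encodes only the sign rule for vij given above.

```python
import math

def potential_energy_field(point, generators, K=1.0):
    """Hypothetical stand-in for the field potential: sum of
    inverse-distance contributions K*q/d from each generator
    (gx, gy, gz, q). Not the patent's actual expression."""
    total = 0.0
    for gx, gy, gz, q in generators:
        d = math.dist(point, (gx, gy, gz))
        total += K * q / max(d, 1e-9)
    return total

def drift_direction(v_i, v_j):
    """Sign rule from the derivation: microbubbles at r_i move toward
    r_j when V(r_i) - V(r_j) > 0, the reverse when < 0, and rest at
    equilibrium when the potentials are equal."""
    diff = v_i - v_j
    if diff > 0:
        return 1
    if diff < 0:
        return -1
    return 0
```

Iterating this sign rule over neighboring positions moves every bubble down the potential landscape until it settles where V(r, t) is minimal, which is the potential-well behavior the derivation describes.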
The laser generation module is used to emit laser light of the three colors red, green and blue, and to control the ultrasonic microbubbles to present the corresponding colors. The laser generation module may consist of at least one red laser generator, at least one green laser generator, at least one blue laser generator and at least one laser generator controller. The red laser generator emits red laser light; the green laser generator emits green laser light; the blue laser generator emits blue laser light. The laser generator controller receives the laser control signals sent by the first image storage, analysis and processing module and controls the direction and intensity of the emitted laser light.
The interactive information acquisition module is used to determine the position of the user in the three-dimensional space, convert it into a user position signal and send it to the first image storage, analysis and processing module. The interactive information acquisition module may consist of at least three laser ranging devices and at least one identification device. The laser ranging devices determine the distance parameters from the user's body to the devices; the identification device identifies the user's body, and may be equipment such as Leap Motion or Kinect.
The ultrasonic microbubble generation module is used to generate ultrasonic microbubbles. The microbubbles are coupled and synchronized with the ultrasound field: coupling synchronization means that the resonant frequency ωc of the microbubbles equals the frequency ωe of the ultrasound field, so that the microbubbles are sensitive to the field and subject to its force. The force of the ultrasound field includes the forces in one or more uncorrelated gradient vector directions acting at the microbubble positions, and the distribution of the microbubbles in the ultrasound field dynamically reflects the induction range in real time. The ultrasonic microbubble generation module receives the control signal from the first image storage, analysis and processing module and generates microbubbles according to that signal.
Microbubbles meeting the above conditions can be selected by resonant-frequency matching, which includes a direct frequency measurement method and a frequency-sweep determination method.
The direct frequency measurement method calculates the resonant frequency from the mass m and the elastic coefficient k of the microbubble. The elastic coefficient k is related to the size and hardness of the microbubble; both the mass and the elastic coefficient can be obtained by looking up the corresponding material parameters. The resonant frequency of the microbubble is then calculated as follows:
The frequency-sweep determination method continuously sweeps the frequency of the ultrasonic source and determines the resonant frequency by observing the response of the microbubbles in ultrasound fields of the same intensity but different frequencies, or measures it directly with a frequency meter.
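As a sketch of the direct-measurement route, the mass-spring form omega_0 = sqrt(k/m) below is the standard resonance formula for a simple oscillator; it is an assumption here, since the patent's own expression is not reproduced in the text.

```python
import math

def resonant_frequency(m, k):
    """Angular resonant frequency (rad/s) of a mass-spring model
    of the microbubble: omega_0 = sqrt(k / m)."""
    return math.sqrt(k / m)

def matches_field(omega_bubble, omega_field, rel_tol=1e-3):
    """Coupling-synchronization check from the text: the bubble's
    resonant frequency omega_c must equal the field frequency omega_e."""
    return math.isclose(omega_bubble, omega_field, rel_tol=rel_tol)
```

A batch of candidate microbubbles could be filtered by calling `matches_field` against the operating frequency of the ultrasound field.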
The first image storage, analysis and processing module is used to store and read the image-point parameter information of the three-dimensional image points; to send microbubble generation signals to the ultrasonic microbubble generation module; to receive the user position signals sent by the interactive information acquisition module; to generate, after analysis and processing, the spherical coordinates (r', θ', φ') and color scalar values (R', G', B') of the three-dimensional image points at time t; to convert the spherical coordinates (r', θ', φ') into ultrasonic control signals sent to the ultrasonic generators, controlling the direction, intensity and phase of the emitted ultrasonic waves; and to convert the color scalar values (R', G', B') into laser control signals sent to the laser generation module, controlling the direction and intensity of the emitted laser light.
Preferably, the first image storage, analysis and processing module includes an image information storage module and a first data analysis and processing module.
The image information storage module stores the parameter information of the three-dimensional image points; this parameter information includes the time t of the image point, its spherical coordinates (r, θ, φ) and its color scalar value (R, G, B).
The first data analysis and processing module reads the parameter information of the three-dimensional image points in the image information storage module, receives and analyzes the user position signals, adjusts and generates the spherical coordinates (r', θ', φ') and color scalar values (R', G', B') of the three-dimensional image points at time t, converts the spherical coordinates into ultrasonic control signals sent to the ultrasonic generators to control the direction, intensity and phase of the emitted ultrasonic waves, converts the color scalar values into laser control signals sent to the laser generation module to control the direction and intensity of the emitted laser light, and sends microbubble generation signals to the microbubble generation module.
The power supply module supplies energy to, and is connected with, the ultrasonic wave generation module, the laser generation module, the interactive information acquisition module, the ultrasonic microbubble generation module and the first image storage, analysis and processing module.
The present invention also provides an ultrasound-based three-dimensional imaging interactive system corresponding to the ultrasound-based three-dimensional imaging interaction method. It can further identify user operation instructions, and can control the emission or cessation of laser light according to the spatial position of the user relative to the three-dimensional image generated in the previous frame period T and/or the user operation instructions, ensuring that the integrity of the presented three-dimensional image is not disturbed by user operations. The interactive system includes an ultrasonic wave generation module, a laser generation module, an interactive information acquisition module, an ultrasonic microbubble generation module, a second image storage, analysis and processing module and a power supply module (its structural block diagram is shown in Fig. 1), wherein:
The ultrasonic wave generation module receives the ultrasonic control signals sent by the second image storage, analysis and processing module, emits ultrasonic waves, forms an ultrasound field and creates potential wells in the three-dimensional space, thereby driving the microbubbles to the potential-well positions and holding them stable there. Preferably, the ultrasonic wave generation module may consist of at least three ultrasonic generators and at least one ultrasonic controller; the ultrasonic controller receives the ultrasonic control signals sent by the second image storage, analysis and processing module and sends execution instruction parameters to each ultrasonic generator, controlling the direction, intensity and phase of the emitted ultrasonic waves so as to form the ultrasound field and the potential wells in the three-dimensional space.
The laser generation module is used to emit laser light of the three colors red, green and blue, and to control the ultrasonic microbubbles to present the corresponding colors. It may consist of at least one red laser generator, at least one green laser generator, at least one blue laser generator and a laser generator controller. The red laser generator emits red laser light; the green laser generator emits green laser light; the blue laser generator emits blue laser light. The laser generator controller receives the laser control signals sent by the second image storage, analysis and processing module and controls the direction and intensity of the emitted laser light as well as the starting and stopping of its emission.
The interactive information acquisition module is used to determine the position of the user in the three-dimensional space, convert it into a user position signal and send it to the second image storage, analysis and processing module. It may consist of at least three laser ranging devices and at least one identification device; the laser ranging devices determine the distance parameters from the user's body to the devices, and the identification device identifies the user's body and may be equipment such as Leap Motion or Kinect.
The ultrasonic microbubble generation module is used to generate ultrasonic microbubbles. The microbubbles are coupled and synchronized with the ultrasound field: coupling synchronization means that the resonant frequency ωc of the microbubbles equals the frequency ωe of the ultrasound field, so that the microbubbles are sensitive to the field and subject to its force. The force of the ultrasound field includes the forces in one or more uncorrelated gradient vector directions acting at the microbubble positions, and the distribution of the microbubbles in the ultrasound field dynamically reflects the induction range in real time. The module receives the microbubble generation signal from the second image storage, analysis and processing module and generates microbubbles according to that signal.
The second image storage, analysis and processing module stores and reads the parameter information of the three-dimensional image points; sends microbubble generation signals to the microbubble generation module; receives the user position signals sent by the interactive information acquisition module; analyzes the position information of the n frame periods T before time t to obtain the user operation instruction; adjusts and generates, according to the user's position in the target three-dimensional space at time t and the user operation instruction, the spherical coordinates (r', θ', φ') and color scalar values (R', G', B') of the three-dimensional image points at time t; converts the spherical coordinates into ultrasonic control signals sent to the ultrasonic generators, controlling the direction, intensity and phase of the emitted ultrasonic waves; and, according to the spatial position of the user relative to the three-dimensional image generated in the previous frame period T, converts the color scalar values into laser control signals sent to the laser generation module, controlling the direction and intensity of the emitted laser light and the starting and stopping of its emission.
Preferably, the second image storage, analysis and processing module includes an image information storage module and a second data analysis and processing module, wherein:
The image information storage module stores the parameter information of the three-dimensional image points; this parameter information includes the time t of the image point, its spherical coordinates (r, θ, φ) and its color scalar value (R, G, B).
The second data analysis and processing module reads the parameter information of the three-dimensional image points in the image information storage module; receives the user position signals sent by the interactive information acquisition module; identifies the spatial position of the user relative to the three-dimensional image generated in the previous frame period T; analyzes the position signals of the n frame periods T before time t to obtain the user operation instruction; generates, according to the user's position in the target three-dimensional space at time t and/or the user operation instruction, the spherical coordinates (r', θ', φ') and color scalar values (R', G', B') of the three-dimensional image points at time t; converts the spherical coordinates into ultrasonic control signals sent to the ultrasonic generators, controlling the direction, intensity and phase of the emitted ultrasonic waves; converts, according to the spatial position of the user relative to the previously generated three-dimensional image, the color scalar values into laser control signals sent to the laser generation module, controlling the direction and intensity of the emitted laser light and the starting and stopping of its emission; and sends microbubble generation signals to the microbubble generation module.
The power supply module supplies energy to, and is connected with, the ultrasonic wave generation module, the laser generation module, the interactive information acquisition module, the ultrasonic microbubble generation module and the second image storage, analysis and processing module.
The beneficial effects of the invention are as follows. With the ultrasound-based three-dimensional imaging method and system, the present invention can present a three-dimensional image in the target three-dimensional space through ultrasonic microbubbles. Because the microbubbles are controlled by the ultrasonic potential wells of the ultrasound field, they can move to any position in the target three-dimensional space, so the imaging of the three-dimensional image does not depend on a fixed imaging plane. Furthermore, because the interactive information acquisition module can capture the position of the user in the target three-dimensional space, and thus the user's actions in the target space, the method and system of the invention can control the motion of the microbubbles and adjust the position of the three-dimensional image in the target space, enabling the user to interact with the three-dimensional image in real time.
Specifically, the real-time interaction includes: (1) user manipulation of the three-dimensional image: the user's gesture instructions can be recognized by the system of the invention, so the user can manipulate the three-dimensional image; (2) tactile perception of the three-dimensional image: because the method and system of the invention can adjust the movement velocity of the microbubbles according to the user's position, when the user touches the generated three-dimensional image, the corresponding pressure feedback lets the user experience the realistic touch of the object presented by the image; (3) the integrity of the three-dimensional image is not disturbed by user actions: because the three-dimensional imaging method and system of the invention can adjust the movement trajectories of the microbubbles according to the user's position, the structural integrity of the three-dimensional image is not disturbed when the user touches it. The interaction between the user and the system goes beyond existing three-dimensional vision and is also realistic in somatosensory touch.
Brief description of the drawings
Fig. 1 is a structural block diagram of the ultrasound-based three-dimensional imaging interactive system of the present invention;
Fig. 2 is a flowchart of the ultrasound-based three-dimensional imaging interaction method of the present invention;
Fig. 3 is a schematic diagram of the user instruction for image rotation in the present invention;
Fig. 4 is a schematic diagram of the user instruction for image scaling in the present invention;
Fig. 5 is a schematic diagram of the user instruction for displaying remarks and introductions in the present invention;
Fig. 6 is a schematic diagram of the user instruction for image forward/backward switching in the present invention;
Fig. 7 is a schematic diagram of ultrasound-field potential-well generation in the present invention;
Fig. 8 is a schematic diagram of the synchronized control of the laser generation module and the ultrasonic wave generation module in the present invention;
Fig. 9 is a structural block diagram of the ultrasound-based three-dimensional imaging interactive system of the present invention;
Fig. 10 is a relation diagram between the three-dimensional image data and the generated three-dimensional image in the present invention;
Fig. 11 is a schematic diagram of a three-dimensional image generated on a two-dimensional surface in the present invention;
In the figures: 5 - ultrasonic wave generation module; 6 - laser generation module; 11 - ultrasonic microbubble; 12 - origin of the coordinate system; 13 - potential-well position of the ultrasound field.
Embodiments
Hereinafter, exemplary embodiments are described in detail with reference to the drawings. However, the specific structural and functional details disclosed herein serve only the purpose of describing the exemplary embodiments. The exemplary embodiments may be realized in many alternative versions and should not be construed as limited to the exemplary embodiments set forth herein.
It is understood that although the terms "first", "second", etc. may be used here to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and similarly a second element could be termed a first element, without departing from the scope of the exemplary embodiments. Here, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It is understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements. Other words used to describe relationships between elements should be interpreted in the same way (e.g. "between" versus "directly between", "adjacent" versus "directly adjacent", etc.).
Here, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the language explicitly indicates otherwise. It will also be understood that the terms "comprises", "comprising", "includes" and/or "including", when used, indicate the presence of the stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
It should also be noted that in some alternative implementations, the functions/actions noted in the illustrative methods may occur out of the order shown in the drawings or described in the specification. For example, two figures or steps shown in sequence may in fact be executed substantially concurrently, or sometimes in reverse order or repeatedly, depending on the functions/actions involved. Similarly, additional intermediate steps may be performed before, after or between any of the steps shown or described.
To make the objects, technical solutions and advantages of the present invention clearer, the invention is further described below with reference to the drawings, taking a three-dimensional demonstration of human food digestion as an example.
The three-dimensional image information required for the demonstration in this example is pre-stored in the image information storage module of the system. The demonstrated scene is a classroom in which a teacher explains chemical digestion and absorption to the students; the teacher is the user of this demonstration, and the students and the teacher present are the observers.
The ultrasound-based three-dimensional imaging interactive system of the present invention includes an ultrasonic wave generation module, a laser generation module, an interactive information acquisition module, an ultrasonic microbubble generation module, a second image storage, analysis and processing module and a power supply module. The second image storage, analysis and processing module includes an image information storage module and a second data analysis and processing module, and the ultrasonic wave generation module consists of ultrasonic generators and an ultrasonic controller; the structural block diagram of the interactive system is shown in Fig. 1. The flow in which the interactive system generates a three-dimensional image in the target three-dimensional space and interacts with the user is shown in Fig. 2.
The specific working steps of the system of the invention are as follows:
Step 1: System initialization. The second data analysis and processing module takes the center point of the plane formed by the ultrasonic generators of the ultrasonic module as the origin of a spherical coordinate system, establishes the spherical coordinates, and records the depression angle of each ultrasonic generator in the coordinate axes as 0; the ultrasonic microbubble generation module then emits microbubbles into the target three-dimensional space.
Step 2: The second data analysis and processing module obtains the parameter information of the three-dimensional image points in the image information storage module. The parameter information of a three-dimensional image point includes the time t of the point, its spherical coordinates (r, θ, φ) and its color scalar value (R, G, B); that is, the information parameters of image point i are (ti, ri, θi, φi, Ri, Gi, Bi).
Step 3: The interactive information acquisition module obtains the position information (ti, rim, θim, φim) of the user in the target three-dimensional space at time ti and sends it to the second data analysis and processing module.
The specific implementation process is as follows.
Taking the determination of the positions of points on the user's hand as an example, the interactive information acquisition module consists of three laser ranging devices and one identification device. The hand points are first identified by the identification device; the three laser ranging devices then respectively obtain the spherical coordinate parameters (rjm, θjm, φjm) of a finger, taking the j-th laser ranging device as the origin. From the spherical position parameters (rj, θj, 0) of the laser ranging devices established within the system, the coordinates (rm, θm, φm) of the hand position can be calculated; the calculation formula is as follows:
where rjm is the distance from the user's finger m to the j-th laser ranging device, θjm is the rotation angle of finger m relative to the j-th laser ranging device, and φjm is the elevation angle of hand position m relative to the j-th laser ranging device.
By determining the position coordinates of each point of the user's hand, the interactive information acquisition module identifies the hand contour and sends the hand position information at the current moment to the second data analysis and processing module.
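The localization step can be sketched as follows. The patent's combination formula is not reproduced in the text, so this sketch simply assumes that each rangefinder reports its reading in a frame with the same axis orientation as the global frame, and that φ is an elevation angle measured from the x-y plane; both are assumptions for illustration.

```python
import math

def sph_to_cart(r, theta, phi):
    # phi is treated as an elevation angle above the x-y plane
    return (r * math.cos(phi) * math.cos(theta),
            r * math.cos(phi) * math.sin(theta),
            r * math.sin(phi))

def cart_to_sph(x, y, z):
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.atan2(y, x)
    phi = math.asin(z / r) if r else 0.0
    return (r, theta, phi)

def locate(rangefinder_sph, reading_sph):
    """Combine the j-th rangefinder's own position (r_j, theta_j, 0)
    with its reading (r_jm, theta_jm, phi_jm) by vector addition in
    Cartesian coordinates, then convert back to spherical."""
    gx, gy, gz = sph_to_cart(*rangefinder_sph)
    ox, oy, oz = sph_to_cart(*reading_sph)
    return cart_to_sph(gx + ox, gy + oy, gz + oz)
```

Averaging the positions recovered from the three rangefinders would give one robust estimate per hand point.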
Step 4: The second data analysis and processing module receives the hand position information at the user's current time ti, combines it with the user's positions in the target three-dimensional space during the n frame periods T before ti, and analyzes them to obtain the user operation instruction. Then, according to the user's position in the target three-dimensional space at time ti and the obtained operation instruction, it adjusts and generates the spherical coordinates (ri', θi', φi') and color scalar values (Ri', Gi', Bi') of the image points, and further converts the spherical coordinates (ri', θi', φi') into the instruction parameters (rik, θik, φik) of each ultrasonic generator and laser generator in the ultrasonic wave generation module.
The specific image adjustment uses the coordinate transformation method for spherical-coordinate images to translate, rotate, scale or shrink the image points in the original coordinate system. For an equal-proportion zoom of the image, the relations are: ri' = l·ri, θi' = θi, φi' = φi, where l is the proportionality coefficient. For a rotation of the figure, the relations are: ri' = ri, θi' = θi + Δθ, φi' = φi + Δφ, where Δθ and Δφ are the corresponding rotation angles. For a translation of the image, the spherical coordinates (ri, θi, φi) can first be converted to rectangular coordinates (xi, yi, zi); the translation then gives (xi', yi', zi'), which are converted back to spherical coordinates (ri', θi', φi'), where xi' = xi + Δx, yi' = yi + Δy, zi' = zi + Δz, and Δx, Δy, Δz are the translation amounts along the three axes. Other, more complex transformations can be generated by combining the simple transformations above; for example, a folding transformation can be obtained by superimposing rotation and translation transformations.
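The three elementary transformations given above (equal-proportion zoom, rotation, translation via rectangular coordinates) can be written directly in code; a minimal sketch, assuming the elevation-angle convention for the spherical coordinates (r, θ, φ):

```python
import math

def scale(point, l):
    # r' = l*r, angles unchanged (equal-proportion zoom)
    r, theta, phi = point
    return (l * r, theta, phi)

def rotate(point, d_theta, d_phi):
    # r' = r, theta' = theta + d_theta, phi' = phi + d_phi
    r, theta, phi = point
    return (r, theta + d_theta, phi + d_phi)

def translate(point, dx, dy, dz):
    # spherical -> rectangular, shift by (dx, dy, dz), -> spherical
    r, theta, phi = point
    x = r * math.cos(phi) * math.cos(theta) + dx
    y = r * math.cos(phi) * math.sin(theta) + dy
    z = r * math.sin(phi) + dz
    r2 = math.sqrt(x * x + y * y + z * z)
    return (r2, math.atan2(y, x), math.asin(z / r2) if r2 else 0.0)
```

Composing these primitives yields the more complex transformations the text mentions, e.g. a fold as a rotation followed by a translation.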
Taking an ultrasonic generator as an example, the specific conversion of the parameter information of image point i proceeds as follows.
Given the parameter information (ti, ri, θi, φi, Ri, Gi, Bi) of image point i, the ultrasonic wave generation module obtains from (ti, ri', θi', φi') the specific instruction parameters (rik, θik, φik) of the image point for each ultrasonic generator of the module. With the k-th ultrasonic generator at spherical coordinate parameters (rk, θk, 0), the calculation formula is as follows:
where rik is the distance from image point i to the k-th ultrasonic generator, θik is the rotation angle of image point i relative to the k-th generator, φik is the elevation angle of image point i relative to the k-th generator, c is the speed of sound, and the elevation-adjusted phase is that of the k-th ultrasonic generator.
In the present embodiment, taking the detection of the user's fingers as an example, the user instruction for rotating the three-dimensional image generated by the system may be set as the following action: the thumb and forefinger are straightened outward into a U-shape while the remaining fingers keep a fist, and the wrist is rotated, as shown in Fig. 3. The user instruction for scaling the three-dimensional image is: the fingers other than the thumb and forefinger keep a fist, the thumb and forefinger are extended into a U-shape, and the angle between thumb and forefinger is changed, as shown in Fig. 4. The user instruction to pause the system and display and select the corresponding menu is: the forefinger is stretched out and double-taps the corresponding three-dimensional image while the remaining fingers are bent, as shown in Fig. 5. The user instruction for switching the three-dimensional image forward and backward is: all fingers except the forefinger keep a fist, and the extended forefinger quickly strokes to the left or right, as shown in Fig. 6.
Step 5: According to the spherical coordinates (r', θ', φ') of the three-dimensional image points at time t, an ultrasound field is generated. The ultrasound field controls the motion of the ultrasonic microbubbles and gives them the corresponding velocities, so that when a microbubble contacts the user it delivers the corresponding pressure feedback to the user; each ultrasonic microbubble is stabilized at a potential-well position of the ultrasound field in the target three-dimensional space.
The potential energy of the ultrasound field is calculated as follows. Suppose, for example, that the ultrasonic wave generation module consists of three ultrasonic generators and one ultrasonic controller, and that the instruction parameters of the image point for the k-th ultrasonic generator are known. The potential energy Ve(ri) of an ultrasonic microbubble located at a focal point of the ultrasound field is then calculated by the following formula:
Here K is the proportionality constant of the ultrasound field, qik is the intensity of the k-th ultrasonic wave at time ti, Ek is the energy emitted by the k-th ultrasonic generator, (rik, θik, φik) are the execution instruction parameters of the k-th ultrasonic generator at time ti, taking itself as the spherical-coordinate origin, and the remaining term is the phase adjustment parameter of the k-th ultrasonic generator at time ti.
Meanwhile, to achieve a realistic tactile sensation for the three-dimensional image, an ultrasonic microbubble applies the corresponding pressure Fi to the finger; this is realized by the system controlling the motion velocity vi of the microbubble, with the following calculation:
Here mi is the mass of a single ultrasonic microbubble and Δt is the time during which the finger contacts the microbubble, taken to be a constant. The microbubble velocity can be calculated from the motion-time parameters of the microbubble: the displacement vector is the motion vector of the microbubble from point i-1 to point i, and Δti is the time difference between the time parameter ti-1 of point i-1 and the time parameter ti of point i currently set by the system, with the relation
Δti = μi(ti - ti-1)
where μi is the hardness parameter of the corresponding material, which can be set by looking up the relevant material parameters. The motion velocity vi of the microbubble is proportional to the resultant field potential-energy difference ΔV(ri) along the direction of motion; therefore, from the formula for the potential energy Ve(ri) of the microbubble in the ultrasound field, the motion velocity of the microbubble can be controlled by adjusting the emission intensities qk of the ultrasonic generators, and the specific conversion relation can be obtained by deriving the formulas.
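The impulse model described above (a contact force obtained from the momentum transferred over the contact time Δt, with the scaled interval Δti = μi(ti − ti-1)) can be sketched in Python. This is a minimal illustration under stated assumptions, since the patent's exact force formula is not reproduced in this text: the function names are invented, and taking Fi = mi·|vi|/Δt is an assumption consistent with the quantities defined above.

```python
def bubble_velocity(displacement, dt_i):
    # vi = (motion vector from point i-1 to point i) / Δti, componentwise.
    return tuple(d / dt_i for d in displacement)

def scaled_interval(mu_i, t_prev, t_curr):
    # Δti = μi · (ti - t(i-1)), with μi a material-hardness parameter.
    return mu_i * (t_curr - t_prev)

def feedback_force(m_i, v_i, dt_contact):
    # Assumed impulse model: Fi = mi · |vi| / Δt, momentum transferred over the contact time.
    speed = sum(c * c for c in v_i) ** 0.5
    return m_i * speed / dt_contact
```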
Step 6: According to the spherical coordinates (r', θ', φ') of the three-dimensional image points at time t, the laser generators receive their instruction parameters and adjust the laser directions; according to the colour scalar values (R', G', B') of the three-dimensional image at time t, the laser intensities are adjusted so as to emit lasers of the three colours red, green and blue, which are focused on an ultrasonic microbubble so that it displays the corresponding colour; emission then stops to await the next instruction.
Step 7: Steps 2 to 6 are repeated; within one frame time T the ultrasonic microbubbles traverse all the three-dimensional image points, and the three-dimensional image is presented.
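The frame loop of steps 2 to 6 can be sketched as a simple traversal of all image points within one frame time T. This is an illustrative assumption of the control flow only: `render_frame`, the callback interface, and the uniform per-point time budget are inventions of this sketch, not the patent's implementation.

```python
def render_frame(points, move_bubble, set_color, frame_time):
    # points: list of (t, r, theta, phi, R, G, B) image-point parameters for one frame.
    # Within one frame time T the microbubbles visit every image point (steps 2-6 repeated).
    dt = frame_time / max(len(points), 1)   # assumed uniform time budget per point
    for (t, r, theta, phi, R, G, B) in points:
        move_bubble(r, theta, phi)   # steer a microbubble to the point via the ultrasound field
        set_color(R, G, B)           # focus the red/green/blue lasers to colour the microbubble
    return dt
```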
Embodiment 1: demonstrating a chemical explosion process
Specifically, the interactive system of the present invention can identify a teacher's gestures in three-dimensional space and realize interaction between the teacher and the three-dimensional image. The teacher demonstrates a chemical explosion process as follows:
(1) The system starts, emits ultrasonic microbubbles into the target three-dimensional space, and the microbubbles are stabilized in the target space;
(2) The teacher first makes the gesture instruction for switching three-dimensional images forward and backward; the system identifies the teacher's gesture according to the aforementioned working steps 3 to 4, reads the parameter information of the chemical-explosion three-dimensional image stored in the image information memory module, and, according to working steps 1 to 7, controls the ultrasonic microbubbles to display the three-dimensional image in the target space, switching to the initial image of the chemical-explosion demonstration;
(3) The teacher double-taps the initial three-dimensional image with a finger to confirm playback; the system controls the motion of the ultrasonic microbubbles according to working steps 1 to 7 and displays the chemical-explosion scene images. The chemical-explosion scene images include all three-dimensional images of the actual chemical explosion process; that is, the three-dimensional images generated by the system illustrate the actual chemical explosion process;
(4) The teacher clicks with a finger; the system identifies the gesture according to working steps 3 to 4 and pauses the playback of the chemical-explosion scene, at which point every control parameter (ti, ri, θi, φi, Ri, Gi, Bi) of the ultrasonic microbubbles remains constant;
(5) The teacher uses fingers to enlarge or shrink the paused static chemical-explosion image, double-taps with a finger to display the corresponding prompts and data, and rotates the image with a rotation gesture to observe it from different angles.
The image-adjustment transformations (translation, rotation, scaling and so on) are applied to the original coordinates using the spherical-coordinate transformation method. For an equal-proportion zoom of the image: ri' = l·ri, θi' = θi, φi' = φi, where l is the proportionality coefficient. For a rotation of the image: ri' = ri, θi' = θi + Δθ, φi' = φi + Δφ, where Δθ and Δφ are the corresponding rotation angles. For a translation, the spherical coordinates (ri, θi, φi) are first converted to Cartesian coordinates (xi, yi, zi); the translation yields (xi', yi', zi'), which are converted back to spherical coordinates (ri', θi', φi'), with xi' = xi + Δx, yi' = yi + Δy, zi' = zi + Δz, where Δx, Δy, Δz are the translation amounts along the three coordinate axes;
(6) When the demonstration is finished, the teacher double-taps with a finger; the system identifies the gesture and stops generating the three-dimensional image.
Embodiment 2: demonstrating magnets attracting each other in a magnetic field
Specifically, the system of the present invention can also provide interactive haptic feedback during operation, giving the user the corresponding haptic pressure while the user operates, so that the user experiences the process of magnets attracting each other in a magnetic field.
Taking a student manipulating magnets that attract each other as an example, the process is as follows:
(1) The system starts, emits ultrasonic microbubbles into the target three-dimensional space, and the microbubbles are stabilized in the target space;
(2) The student first makes the gesture instruction for switching three-dimensional images forward and backward; the system identifies the student's gesture according to the aforementioned working steps 3 to 4, reads the parameter information of the magnets-attracting three-dimensional image stored in the image information memory module, and, according to working steps 1 to 7, controls the ultrasonic microbubbles to display the three-dimensional image in the target space, switching to the initial image of the magnet-attraction demonstration;
(3) The student double-taps the initial three-dimensional image with a finger to confirm playback; the system controls the motion of the ultrasonic microbubbles according to working steps 1 to 7 and displays the magnet-attraction scene images. These scene images include all three-dimensional images of actual magnets attracting each other; that is, the three-dimensional images generated by the system illustrate the actual attraction process;
(4) The student puts a hand into the magnetic field and pushes a magnet (a three-dimensional image) away; the system identifies the student's gesture according to working steps 3 to 4, adjusts the display position of the magnet's three-dimensional image so that it follows the movement trajectory of the student's hand, and, according to working step 5, applies the corresponding pressure to the fingers through the relative motion of the ultrasonic microbubbles, so that the student feels the pressure produced by the attractive force;
Specifically, an ultrasonic microbubble applies the corresponding pressure Fi to the finger, realized by the system controlling the motion velocity vi of the microbubble, with the following calculation:
Here mi is the mass of a single ultrasonic microbubble and Δt is the time during which the finger contacts the microbubble, taken to be a constant. The microbubble velocity can be calculated from the motion-time parameters of the microbubble: the displacement vector is the motion vector of the microbubble from point i-1 to point i, and Δti is the time difference between the time parameter ti-1 of point i-1 and the time parameter ti of point i currently set by the system, with the relation Δti = μi(ti - ti-1), where μi is the hardness parameter of the corresponding material, which can be set by looking up the relevant material parameters. The motion velocity vi of the microbubble is proportional to the resultant field potential-energy difference ΔV(ri) along the direction of motion; therefore, from the formula for the potential energy Ve(ri) of the microbubble in the ultrasound field, the motion velocity of the microbubble can be controlled by adjusting the emission intensities qk of the ultrasonic generators, and the specific conversion relation can be obtained by deriving the formulas.
(5) The student releases the magnet (the three-dimensional image); the system identifies the student's gesture and, according to working steps 1 to 7, controls the motion of the ultrasonic microbubbles to display the magnet-attraction scene images;
(6) When the demonstration is finished, the student double-taps with a finger; the system identifies the gesture and stops generating the three-dimensional image.
Embodiment 3: demonstrating the digestive system digesting food
Specifically, even when part of the three-dimensional image is occluded by a finger or another body part, the teacher can still display the entire three-dimensional image completely. This is illustrated below with the teacher demonstrating the digestive system digesting an apple:
(1) The system starts, emits ultrasonic microbubbles into the target three-dimensional space, and the microbubbles are stabilized in the target space;
(2) The teacher first makes the gesture instruction for switching three-dimensional images forward and backward; the system identifies the teacher's gesture according to the aforementioned working steps 3 to 4, reads the parameter information of the digestive-system-digesting-an-apple three-dimensional image from the image information memory module, and, according to working steps 1 to 7, controls the ultrasonic microbubbles to display the three-dimensional image in the target space, switching to the initial image of the digestion demonstration;
(3) The teacher double-taps the initial three-dimensional image with a finger to confirm playback; the system controls the motion of the ultrasonic microbubbles according to working steps 1 to 7 and displays the digestion scene images. These scene images include the three-dimensional image of a real apple, the three-dimensional image of a real digestive system, and all three-dimensional images of the real digestion process; that is, the three-dimensional images generated by the system illustrate the real process of the digestive system digesting an apple;
(4) The teacher makes the action of grasping the three-dimensional image of the apple and feeding it to the digestive system; the system identifies the position and action of the hand through the interactive information module according to the aforementioned working steps 3 to 4, adjusts the parameter information of the apple's three-dimensional image, and controls the apple image to enter the digestive system;
(5) The teacher points at the digestive system with a finger during the demonstration; the system identifies the finger position according to working steps 3 to 4. When the finger position overlaps the display area of the digestive system, the second image storage, analysis and processing module updates the movement trajectories of the ultrasonic microbubbles, controlling them to bypass the position occupied by the hand, and controls the opening and closing of the laser generators so that the microbubbles still display their normal colours. When the hand lies between a laser generator and a microbubble, that laser generator is closed and the intensities of the other laser generators are increased, so that the display of the digestive system is unaffected by the hand's movement;
The movement-trajectory update of the ultrasonic microbubbles is obtained by superposing the corresponding displacements on the original microbubble position parameters. For a translation, the spherical coordinates (ri, θi, φi) are first converted to Cartesian coordinates (xi, yi, zi); the translation yields (xi', yi', zi'), which are converted back to spherical coordinates (ri', θi', φi'), with xi' = xi + Δx, yi' = yi + Δy, zi' = zi + Δz, where Δx, Δy, Δz are the translation amounts along the three coordinate axes caused by the teacher's finger entering the corresponding region;
(6) When the demonstration is finished, the teacher double-taps with a finger to confirm; the system identifies the gesture according to working steps 3 to 4 and stops generating the three-dimensional image.
Embodiment 4: demonstrating throwing a ball to hit a target
Specifically, a student can also operate on the virtual objects presented in the three-dimensional image exactly as on real objects. This is illustrated below with a student demonstrating the process of throwing a ball to hit another ball:
(1) The system starts, emits ultrasonic microbubbles into the target three-dimensional space, and the microbubbles are stabilized in the target space;
(2) The student first makes the gesture instruction for switching three-dimensional images forward and backward; the system identifies the student's gesture according to the aforementioned working steps 3 to 4, reads the parameter information of the ball-throwing demonstration three-dimensional image from the image information memory module, and, according to working steps 1 to 7, controls the ultrasonic microbubbles to display the three-dimensional image in the target space, switching to the initial image of the ball-throwing demonstration.
(3) The student double-taps the initial three-dimensional image with a finger to confirm playback; the system controls the motion of the ultrasonic microbubbles according to working steps 1 to 7 and displays the ball-throwing scene images. These scene images include the three-dimensional image of a real ball and all three-dimensional images of a real ball being thrown and hitting its target; that is, the three-dimensional images generated by the system illustrate a real throwing-and-hitting process;
(4) The student grasps the three-dimensional image of the ball; the system identifies the position and action of the hand according to working steps 3 to 4, adjusts the parameter information of the ball's three-dimensional image so that the displayed ball follows the hand's movement, and, according to the aforementioned step 5, gives the hand pressure haptic feedback through the motion of the ultrasonic microbubbles;
(5) The student performs a throwing action; the system identifies the throwing action of the hand according to working steps 3 to 4, adjusts the parameter information of the ball's three-dimensional image, and displays the ball moving according to the laws of physics;
(6) The ball collides with other balls in the target three-dimensional space; the second data analysis and processing module of the system, according to working steps 3 to 4, determines the collision boundary between the thrown ball and the other balls, calculates the displacements of the image points of the other balls at the moment of collision, and adjusts the translational motion of those image points.
Specifically, the translational motion of the other balls follows from the translational motion of the thrown ball under the laws of momentum conservation and energy conservation for a perfectly elastic collision, i.e. the collision involves no loss of kinetic energy.
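The perfectly elastic collision invoked here conserves both momentum and kinetic energy, so for two balls colliding along a single axis the post-collision velocities follow in closed form. A minimal Python sketch (the one-dimensional reduction is an illustrative assumption; the patent works with full three-dimensional image-point displacements):

```python
def elastic_collision_1d(m1, v1, m2, v2):
    # Perfectly elastic collision along one axis: both momentum
    # m1*v1 + m2*v2 and kinetic energy are conserved, which gives
    # the standard closed-form post-collision velocities.
    v1p = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    v2p = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return v1p, v2p
```

For equal masses the velocities are simply exchanged, which is the familiar billiard-ball result.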
(7) When the demonstration is finished, the student double-taps with a finger to confirm; the system identifies the gesture according to working steps 3 to 4 and stops generating the three-dimensional image.
The embodiments above serve to illustrate the present invention rather than to limit it; any method and system that performs three-dimensional-space imaging using similar laser generators falls within the protection scope of the present invention. Any modifications and changes made to the invention within the spirit of the invention and the scope of the claims also fall within the protection scope of the present invention.
Claims (12)
1. A three-dimensional-space imaging interaction method based on ultrasonic waves, characterized by comprising the following steps:
Step 1: establishing a three-dimensional spherical coordinate system, generating a target three-dimensional space, and introducing ultrasonic microbubbles into the target three-dimensional space;
Step 2: obtaining the parameter information of all three-dimensional image points in a three-dimensional image, the parameter information of a three-dimensional image point comprising the time t of the image point, the spherical coordinates (r, θ, φ) of the image point, and its colour scalar values (R, G, B);
Step 3: obtaining the position information of the user in the target three-dimensional space at time t and, according to the position information, adjusting the spherical coordinates (r', θ', φ') and colour scalar values (R', G', B') of the three-dimensional image points generated at time t;
Step 4: according to the spherical coordinates (r', θ', φ') of the three-dimensional image points at time t, generating an ultrasound field, the ultrasound field controlling the motion of the ultrasonic microbubbles and giving them the corresponding velocities so that, upon contact with the user, the corresponding pressure feedback is delivered to the user, and finally stabilizing the ultrasonic microbubbles at potential-well positions of the ultrasound field in the target three-dimensional space;
Step 5: according to the spherical coordinates (r', θ', φ') of the three-dimensional image points at time t, adjusting the laser directions; according to the colour scalar values (R', G', B') of the three-dimensional image at time t, adjusting the laser intensities; emitting lasers of the three colours red, green and blue and focusing them on the ultrasonic microbubbles so that the microbubbles display the corresponding colours; then stopping the laser emission and awaiting the next instruction;
Step 6: repeating steps 2 to 5 so that, within one frame time T, the ultrasonic microbubbles traverse all the image points in the target three-dimensional space and the three-dimensional image is generated in the target three-dimensional space.
2. The ultrasonic-wave-based three-dimensional-space imaging interaction method according to claim 1, characterized in that step 3 is: obtaining the position information of the user in the target three-dimensional space at time t and, combining it with the user's position information in the target three-dimensional space during the n frame times T before t, analysing it to obtain a user operation instruction; then, according to the user's position information in the target three-dimensional space at time t and the obtained user operation instruction, adjusting the spherical coordinates (r', θ', φ') and colour scalar values (R', G', B') of the three-dimensional image points generated at time t.
3. The ultrasonic-wave-based three-dimensional-space imaging interaction method according to claim 2, characterized in that: in step 5, the emission or stopping of the laser is controlled according to the relative spatial position relationship between the user and the three-dimensional image generated during the previous frame time T, and/or the user operation instruction.
4. The ultrasonic-wave-based three-dimensional-space imaging interaction method according to claim 2 or 3, characterized in that: the user operation instruction is any one or more selected from instructions for expanding, closing, rotating, switching, scaling, moving, folding, merging, section display, part-to-whole transformation, and detail display of the three-dimensional image.
5. The ultrasonic-wave-based three-dimensional-space imaging interaction method according to claim 2 or 3, characterized in that: the user operation instruction is a flip instruction.
6. An ultrasonic-wave-based three-dimensional-space imaging interaction system, characterized in that: the interaction system comprises an ultrasonic wave generation module (5), a laser generation module (6), an interactive information acquisition module, an ultrasonic microbubble generation module, a first image storage, analysis and processing module, and a power module, wherein:
the ultrasonic wave generation module (5) is used to emit ultrasonic waves, forming an ultrasound field with potential wells in the three-dimensional space so as to control the ultrasonic microbubbles to move to the potential-well positions and be stabilized there;
the laser generation module (6) is used to emit lasers of the three colours red, green and blue and to control the ultrasonic microbubbles to display the corresponding colours;
the interactive information acquisition module is used to determine the position information of the user in the target three-dimensional space, convert it into a user position signal, and send it to the first image storage, analysis and processing module;
the ultrasonic microbubble generation module is used to generate ultrasonic microbubbles;
the first image storage, analysis and processing module is used to store and read the parameter information of the three-dimensional image points, receive the user position signal sent by the interactive information acquisition module and, after analysis and processing, adjust the spherical coordinates (r', θ', φ') and colour scalar values (R', G', B') of the three-dimensional image points generated at time t;
the power module is used to supply energy to the ultrasonic wave generation module, the laser generation module, the interactive information acquisition module, the ultrasonic microbubble generation module, and the first image storage, analysis and processing module.
7. The ultrasonic-wave-based three-dimensional-space imaging interaction system according to claim 6, characterized in that the first image storage, analysis and processing module comprises an image information memory module and a first data analysis and processing module, wherein:
the image information memory module is used to store the parameter information of the three-dimensional image points;
the first data analysis and processing module is used to read the parameter information of the three-dimensional image points in the image information memory module, receive the user position signal sent by the interactive information acquisition module and, after analysis and processing, adjust the spherical coordinates (r', θ', φ') and colour scalar values (R', G', B') of the three-dimensional image points generated at time t.
8. An ultrasonic-wave-based three-dimensional-space imaging interaction system, characterized in that: the interaction system comprises an ultrasonic wave generation module (5), a laser generation module (6), an interactive information acquisition module, an ultrasonic microbubble generation module, a second image storage, analysis and processing module, and a power module, wherein:
the ultrasonic wave generation module (5) is used to emit ultrasonic waves, forming an ultrasound field with potential wells in the three-dimensional space so as to control the ultrasonic microbubbles to move to the potential-well positions and be stabilized there;
the laser generation module (6) is used to emit lasers of the three colours red, green and blue and to control the ultrasonic microbubbles to display the corresponding colours;
the interactive information acquisition module is used to determine the position information of the user in the target three-dimensional space, convert it into a user position signal, and send it to the second image storage, analysis and processing module;
the ultrasonic microbubble generation module is used to generate ultrasonic microbubbles;
the second image storage, analysis and processing module is used to store and read the parameter information of the three-dimensional image points, receive the user position signal sent by the interactive information acquisition module, analyse the user's position information in the target three-dimensional space during the n frame times T before t to obtain a user operation instruction, and then, according to the user's position information in the target three-dimensional space at time t and the obtained user operation instruction, adjust the spherical coordinates (r', θ', φ') and colour scalar values (R', G', B') of the three-dimensional image points generated at time t; it converts the spherical coordinates (r', θ', φ') of the three-dimensional image points into ultrasonic control signals sent to the ultrasonic generators, controlling the direction, intensity and phase of the emitted ultrasonic waves; converts the colour scalar values (R', G', B') into laser control signals sent to the laser generation module, controlling the direction and intensity of the emitted lasers; and sends an ultrasonic-microbubble generation signal to the ultrasonic microbubble generation module;
the power module is used to supply energy to the ultrasonic wave generation module, the laser generation module, the interactive information acquisition module, the ultrasonic microbubble generation module, and the second image storage, analysis and processing module.
9. The ultrasonic-wave-based three-dimensional-space imaging interaction system according to claim 8, characterized in that: the second image storage, analysis and processing module comprises an image information memory module and a second data analysis and processing module, wherein:
the image information memory module is used to store the parameter information of the three-dimensional image points;
the second data analysis and processing module is used to read the parameter information of the three-dimensional image points in the image information memory module, receive the user position signal sent by the interactive information acquisition module, analyse the user's position information in the target three-dimensional space during the n frame times T before t to obtain a user operation instruction, and then, according to the user's position information in the target three-dimensional space at time t and the obtained user operation instruction, adjust the spherical coordinates (r', θ', φ') and colour scalar values (R', G', B') of the three-dimensional image points generated at time t; it converts the spherical coordinates (r', θ', φ') of the three-dimensional image points into ultrasonic control signals sent to the ultrasonic wave generation module, controlling the direction, intensity and phase of the emitted ultrasonic waves; according to the relative spatial position relationship between the user and the three-dimensional image generated during the previous frame time T, it converts the colour scalar values (R', G', B') into laser control signals sent to the laser generation module, controlling the direction and intensity of the emitted lasers and the starting and stopping of laser emission; and it sends an ultrasonic-microbubble generation signal to the ultrasonic microbubble generation module.
10. The ultrasonic-wave-based three-dimensional-space imaging interaction system according to claim 8, characterized in that: the user operation instruction is any one or more selected from instructions for expanding, closing, rotating, switching, scaling, moving, folding, merging, section display, part-to-whole transformation, and detail display of the three-dimensional image.
11. The ultrasonic-wave-based three-dimensional-space imaging interaction system according to claim 8, characterized in that: the user operation instruction is a flip instruction.
12. The ultrasonic-wave-based three-dimensional-space imaging interaction system according to any one of claims 6 to 11, characterized in that the ultrasonic wave generation module (5) consists of at least three ultrasonic generators and at least one ultrasonic controller.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410217954.8A CN104699235B (en) | 2013-12-05 | 2014-05-21 | Three dimensions imaging exchange method and system based on ultrasonic wave |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310655833 | 2013-12-05 | ||
CN2013106558337 | 2013-12-05 | ||
CN201410217954.8A CN104699235B (en) | 2013-12-05 | 2014-05-21 | Three dimensions imaging exchange method and system based on ultrasonic wave |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104699235A CN104699235A (en) | 2015-06-10 |
CN104699235B true CN104699235B (en) | 2017-12-01 |
Family
ID=53336131
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410216588.4A Active CN104689674B (en) | 2013-12-05 | 2014-05-21 | Focusing particles method, aggregation processing method and aggregation processing system based on the ultrasonic trap of broad sense quantum |
CN201410217954.8A Active CN104699235B (en) | 2013-12-05 | 2014-05-21 | Three dimensions imaging exchange method and system based on ultrasonic wave |
CN201410216911.8A Active CN104699234B (en) | 2013-12-05 | 2014-05-21 | Three dimensions imaging exchange method and system based on laser |
CN201410216890.XA Active CN104688265B (en) | 2013-12-05 | 2014-07-14 | Method and system for dynamically and directly displaying image |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410216588.4A Active CN104689674B (en) | 2013-12-05 | 2014-05-21 | Focusing particles method, aggregation processing method and aggregation processing system based on the ultrasonic trap of broad sense quantum |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410216911.8A Active CN104699234B (en) | 2013-12-05 | 2014-05-21 | Three dimensions imaging exchange method and system based on laser |
CN201410216890.XA Active CN104688265B (en) | 2013-12-05 | 2014-07-14 | Method and system for dynamically and directly displaying image |
Country Status (1)
Country | Link |
---|---|
CN (4) | CN104689674B (en) |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105302303A (en) * | 2015-10-15 | 2016-02-03 | 广东欧珀移动通信有限公司 | Game control method and apparatus and mobile terminal |
CN105607034A (en) * | 2015-12-23 | 2016-05-25 | 北京凌宇智控科技有限公司 | Three-dimensional space detection system, positioning method and system |
CN107121698B (en) * | 2016-02-24 | 2019-02-19 | 中国石油化工股份有限公司 | For optimizing the method, apparatus and system of 3-D seismics wave-field simulation and imaging |
CN106769707B (en) * | 2016-11-25 | 2023-03-21 | 中国科学院合肥物质科学研究院 | Potential well voltage-adjustable particle size spectrum measurement device and measurement method thereof |
CN106843502B (en) | 2017-03-10 | 2019-10-18 | 京东方科技集团股份有限公司 | A kind of the touch-control interaction systems and method of Three-dimensional Display |
CN107273831A (en) * | 2017-06-05 | 2017-10-20 | 苏州大学 | A kind of Three-dimensional target recognition method based on spherical space |
JP6946857B2 (en) * | 2017-08-24 | 2021-10-13 | 富士フイルムビジネスイノベーション株式会社 | Information processing equipment and programs |
CN107589845B (en) | 2017-09-19 | 2020-02-18 | 京东方科技集团股份有限公司 | Display system |
CN110376550B (en) * | 2018-04-12 | 2024-04-12 | 北京凌宇智控科技有限公司 | Three-dimensional space positioning method and system based on position compensation |
CN110376549A (en) * | 2018-04-12 | 2019-10-25 | 北京凌宇智控科技有限公司 | A kind of three dimension location method and system |
CN110376543A (en) * | 2018-04-12 | 2019-10-25 | 北京凌宇智控科技有限公司 | A kind of three dimension location method and system |
CN108771938A (en) * | 2018-04-18 | 2018-11-09 | 北京理工大学 | A kind of ultrasonic air gas purifying method and system |
CN109697941A (en) * | 2018-12-29 | 2019-04-30 | 广州欧科信息技术股份有限公司 | Historical and cultural heritage display systems based on hologram technology |
CN109946944B (en) * | 2019-03-01 | 2021-09-03 | 悠游笙活(北京)网络科技有限公司 | System and method for projecting photophoretic traps |
CN109901371B (en) * | 2019-03-01 | 2021-09-03 | 悠游笙活(北京)网络科技有限公司 | Holographic imaging system and method |
CN110502106A (en) * | 2019-07-26 | 2019-11-26 | 昆明理工大学 | A kind of interactive holographic display system and method based on 3D dynamic touch |
CN110989844A (en) * | 2019-12-16 | 2020-04-10 | 广东小天才科技有限公司 | Input method, watch, system and storage medium based on ultrasonic waves |
CN111322954B (en) * | 2020-03-19 | 2021-07-27 | 北京神工科技有限公司 | Assembly tool pose measuring method and device, storage medium and electronic equipment |
CN114911338A (en) * | 2021-02-09 | 2022-08-16 | 南京微纳科技研究院有限公司 | Contactless human-computer interaction system and method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102121817A (en) * | 2010-12-22 | 2011-07-13 | 浙江大学 | Compact digital holographic apparatus and method of particle field |
CN102361497A (en) * | 2011-11-15 | 2012-02-22 | 南京大学 | Display method and display system for spatial three-dimensional video |
US20120223909A1 (en) * | 2011-03-02 | 2012-09-06 | Smart Technologies Ulc | 3d interactive input system and method |
CN103229041A (en) * | 2010-12-03 | 2013-07-31 | Sony Corporation | 3D data analysis device, 3D data analysis method, and 3D data analysis program
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1274614A (en) * | 1999-05-25 | 2000-11-29 | 安徽省卫生干部进修学校 | Supersonic vibrating dedusting method and device |
US6447574B1 (en) * | 2001-06-29 | 2002-09-10 | Global Clean Air, Inc. | System, process and apparatus for removal of pollutants from gaseous streams |
JP2004351330A (en) * | 2003-05-29 | 2004-12-16 | Sanyo Electric Co Ltd | Air cleaner |
TWI413274B (en) * | 2005-03-18 | 2013-10-21 | Mitsubishi Chem Corp | Light-emitting device, white light-emitting device, lighting device and image display device |
US8648772B2 (en) * | 2009-08-20 | 2014-02-11 | Amazon Technologies, Inc. | Amalgamated display comprising dissimilar display devices |
2014
- 2014-05-21 CN CN201410216588.4A patent/CN104689674B/en active Active
- 2014-05-21 CN CN201410217954.8A patent/CN104699235B/en active Active
- 2014-05-21 CN CN201410216911.8A patent/CN104699234B/en active Active
- 2014-07-14 CN CN201410216890.XA patent/CN104688265B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103229041A (en) * | 2010-12-03 | 2013-07-31 | Sony Corporation | 3D data analysis device, 3D data analysis method, and 3D data analysis program
CN102121817A (en) * | 2010-12-22 | 2011-07-13 | 浙江大学 | Compact digital holographic apparatus and method of particle field |
US20120223909A1 (en) * | 2011-03-02 | 2012-09-06 | Smart Technologies Ulc | 3d interactive input system and method |
CN102361497A (en) * | 2011-11-15 | 2012-02-22 | 南京大学 | Display method and display system for spatial three-dimensional video |
Also Published As
Publication number | Publication date |
---|---|
CN104699234B (en) | 2018-02-02 |
CN104699234A (en) | 2015-06-10 |
CN104689674A (en) | 2015-06-10 |
CN104688265A (en) | 2015-06-10 |
CN104689674B (en) | 2017-09-05 |
CN104688265B (en) | 2017-01-25 |
CN104699235A (en) | 2015-06-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104699235B (en) | Three dimensions imaging exchange method and system based on ultrasonic wave | |
Hilliges et al. | HoloDesk: direct 3d interactions with a situated see-through display | |
Zhang et al. | Recent developments in game-based virtual reality educational laboratories using the microsoft kinect | |
CN103793060B (en) | A kind of user interactive system and method | |
TW201104494A (en) | Stereoscopic image interactive system | |
CN102448565A (en) | Real time retargeting of skeletal data to game avatar | |
JP2011521318A (en) | Interactive virtual reality image generation system | |
WO2005098517A2 (en) | Horizontal perspective hand-on simulator | |
Reyes et al. | Mixed reality guidance system for motherboard assembly using tangible augmented reality | |
Lok et al. | Incorporating dynamic real objects into immersive virtual environments | |
CN207601427U (en) | A kind of simulation laboratory based on virtual reality mixing | |
Duan et al. | Remote environment exploration with drone agent and haptic force feedback | |
TWM559476U (en) | System device with virtual reality and mixed reality house purchase experience | |
CN105608726A (en) | Three-dimensional interactive chatting method | |
Csongei et al. | ClonAR: Rapid redesign of real-world objects | |
Siegl et al. | An augmented reality human–computer interface for object localization in a cognitive vision system | |
Bolton et al. | BodiPod: interacting with 3d human anatomy via a 360 cylindrical display | |
KR20210042476A (en) | Augmented reality providing method and system using projection technology | |
Pokorny et al. | A database for reproducible manipulation research: CapriDB–Capture, Print, Innovate | |
CN109309827A (en) | More people's apparatus for real time tracking and method for 360 ° of suspension light field three-dimensional display systems | |
Akinjala et al. | Animating human movement & gestures on an agent using Microsoft kinect | |
Yan | Application of VR Virtual Reality Technology in 3D Image Interior Design System | |
Sherstyuk et al. | Semi-Automatic Surface Scanner for Medical Tangible User Interfaces | |
Morison | Perspective control: Technology to solve the multiple feeds problem in sensor systems | |
Gupta et al. | Training in virtual environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||