CN102169364A - Interaction module applied to stereoscopic interaction system and method of interaction module

Interaction module applied to stereoscopic interaction system and method of interaction module

Info

Publication number
CN102169364A
Authority
CN
China
Prior art keywords
coordinate
dimensional
interactive
eyes
interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010122713
Other languages
Chinese (zh)
Other versions
CN102169364B (en)
Inventor
赵子毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN 201010122713
Publication of CN102169364A
Application granted
Publication of CN102169364B
Status: Expired - Fee Related
Anticipated expiration


Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention relates to an interaction module applied to a stereoscopic interaction system, and to a method for the interaction module. The module and method correct the position of an interactive component according to the position of the user, or correct the position of a virtual object in the stereoscopic image together with its interaction judgment condition. Even when a change in the user's position shifts the perceived position of the virtual object in the stereoscopic image, the stereoscopic interaction system can still obtain an accurate interaction result from the corrected position of the interactive component, or from the corrected position of the virtual object and the interaction judgment condition.

Description

Interaction module applied to a stereoscopic interaction system and method thereof
Technical field
The present invention relates to a stereoscopic interaction system, and more particularly, to a stereoscopic interaction system that performs interaction by means of a stereoscopic display system.
Background art
In the prior art, stereoscopic display systems are used to provide stereoscopic images. As shown in Fig. 1, stereoscopic display systems can be divided into naked-eye (autostereoscopic) display systems and glasses-type stereoscopic display systems. For example, the naked-eye stereoscopic display system 110 on the left of Fig. 1 uses light splitting to provide different images at different angles (images DIM_θ1~DIM_θ8 in Fig. 1). Because the user's two eyes are located at different angles, the user receives the left image DIM_L (image DIM_θ4) with one eye and the right image DIM_R (image DIM_θ5) with the other, and thereby perceives the stereoscopic image provided by the naked-eye stereoscopic display system 110. The glasses-type stereoscopic display system 120 on the right of Fig. 1 comprises a display screen 121 and auxiliary glasses 122. The display screen 121 provides the left image DIM_L and the right image DIM_R; the auxiliary glasses 122 help the user's two eyes receive the left image DIM_L and the right image DIM_R respectively, so that the user can perceive the stereoscopic image.
However, the stereoscopic image perceived by the user changes with the user's position. Taking the glasses-type stereoscopic display system 120 as an example (the auxiliary glasses 122 are not shown in Fig. 2), as shown in Fig. 2, the stereoscopic image provided by the stereoscopic display system 120 contains a virtual object VO (for example, a tennis ball). The position of the virtual object VO in the left image DIM_L is LOC_ILVO, and its position in the right image DIM_R is LOC_IRVO. Suppose the user's left eye is at LOC_1LE and the right eye at LOC_1RE. The left-eye position LOC_1LE and the position LOC_ILVO of the virtual object VO define a straight line L_1L; the right-eye position LOC_1RE and the position LOC_IRVO define a straight line L_1R. The position at which the user perceives the virtual object VO is determined by the lines L_1L and L_1R: for example, when the two lines intersect at LOC_1CP, the user perceives the virtual object VO at LOC_1CP. Similarly, when the user's eyes are at LOC_2LE and LOC_2RE, the eye positions and the positions LOC_ILVO and LOC_IRVO of the virtual object VO define the lines L_2L and L_2R, and the user perceives the virtual object VO at their intersection LOC_2CP.
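For illustration, the following is a minimal sketch (not taken from the patent; all names are hypothetical) of the perceived-position geometry described above: each eye and the corresponding image-plane position define a viewing line, and the perceived position is the intersection of the two lines or, when they do not meet exactly, the midpoint of their closest approach.

```python
import numpy as np

def perceived_position(eye_l, img_l, eye_r, img_r):
    """Estimate where the viewer perceives a virtual object.

    eye_l, eye_r: 3D eye positions (e.g. LOC_1LE, LOC_1RE).
    img_l, img_r: 3D positions of the object in the left/right image on the
                  display plane (e.g. LOC_ILVO, LOC_IRVO).
    Returns the intersection of the two viewing lines, or the midpoint of
    their closest approach when the lines are skew.
    """
    p1 = np.asarray(eye_l, float)
    p2 = np.asarray(eye_r, float)
    d1 = np.asarray(img_l, float) - p1          # direction of the left viewing line
    d2 = np.asarray(img_r, float) - p2          # direction of the right viewing line
    w = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b
    if abs(denom) < 1e-12:                      # viewing lines (nearly) parallel
        t1, t2 = 0.0, e / c
    else:
        t1 = (b * e - c * d) / denom
        t2 = (a * e - b * d) / denom
    closest1 = p1 + t1 * d1
    closest2 = p2 + t2 * d2
    return (closest1 + closest2) / 2.0          # intersection, or reference midpoint
```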
Because the stereoscopic image perceived by the user changes with the user's position, an incorrect interaction result may be produced when the user interacts with the stereoscopic display system through an interaction module (such as a game console). Suppose the user plays a stereoscopic tennis game through the interaction module and the stereoscopic display system 120, holding an interactive component (such as a game controller) to swing the racket of the in-game character. The interaction module assumes that the user is directly in front of the stereoscopic display system, with the eyes at LOC_1LE and LOC_1RE. The interaction module controls the stereoscopic display system 120 to display the tennis ball at LOC_ILVO in the left image DIM_L and at LOC_IRVO in the right image DIM_R, and therefore assumes that the user perceives the 3D tennis ball at LOC_1CP (as shown in Fig. 2). When the distance between the position of the swing and LOC_1CP is smaller than an interaction critical distance D_TH, the interaction module judges that the user has hit the ball. However, if the user's eyes are actually at LOC_2LE and LOC_2RE, the 3D tennis ball is actually perceived at LOC_2CP. Suppose the distance between LOC_2CP and LOC_1CP is larger than D_TH. Then, when the user moves the interactive component (game controller) to LOC_2CP to swing, the interaction module judges that the ball was not hit, even though the user swung exactly at the perceived ball position LOC_2CP. In other words, the distortion of the stereoscopic image caused by the change of the user's eye position makes the interaction module misjudge the interaction between the user and the object, producing an incorrect interaction result and causing great inconvenience to the user.
Summary of the invention
The invention provides an interaction module applied to a stereoscopic interaction system. The stereoscopic interaction system has a stereoscopic display system for providing a stereoscopic image. The stereoscopic image has a virtual object, and the virtual object has a virtual coordinate and an interaction judgment condition. The interaction module comprises a positioning module, an interactive component, an interactive component positioning module, and an interaction decision circuit. The positioning module detects the position of the user in a scene to generate a three-dimensional reference coordinate. The interactive component positioning module detects the position of the interactive component to generate a three-dimensional interactive coordinate. The interaction decision circuit converts the virtual coordinate into a corrected virtual coordinate according to the three-dimensional reference coordinate, and determines the interaction result between the interactive component and the stereoscopic image according to the three-dimensional interactive coordinate, the corrected virtual coordinate, and the interaction judgment condition.
The invention further provides an interaction module applied to a stereoscopic interaction system. The stereoscopic interaction system has a stereoscopic display system for providing a stereoscopic image. The stereoscopic image has a virtual object, and the virtual object has a virtual coordinate and an interaction judgment condition. The interaction module comprises a positioning module, an interactive component, an interactive component positioning module, and an interaction decision circuit. The positioning module detects the position of the user in a scene to generate a three-dimensional reference coordinate. The interactive component positioning module detects the position of the interactive component to generate a three-dimensional interactive coordinate. The interaction decision circuit converts the three-dimensional interactive coordinate into a three-dimensional corrected interactive coordinate according to the three-dimensional reference coordinate, and determines the interaction result between the interactive component and the stereoscopic image according to the three-dimensional corrected interactive coordinate, the virtual coordinate, and the interaction judgment condition.
The invention further provides a method for determining an interaction result of a stereoscopic interaction system. The stereoscopic interaction system has a stereoscopic display system and an interactive component. The stereoscopic display system provides a stereoscopic image; the stereoscopic image has a virtual object, and the virtual object has a virtual coordinate and an interaction judgment condition. The method comprises detecting the position of the user in a scene to generate a three-dimensional reference coordinate, detecting the position of the interactive component to generate a three-dimensional interactive coordinate, and determining the interaction result between the interactive component and the stereoscopic image according to the three-dimensional reference coordinate, the three-dimensional interactive coordinate, the virtual coordinate, and the interaction judgment condition.
Description of drawings
Fig. 1 is a schematic diagram of stereoscopic display systems of the prior art.
Fig. 2 is a schematic diagram illustrating how the stereoscopic image provided by a prior-art stereoscopic display system changes with the user's position.
Fig. 3 and Fig. 4 are schematic diagrams of the stereoscopic interaction system of the present invention.
Fig. 5 is a schematic diagram of the first embodiment of the correction method of the present invention.
Fig. 6, Fig. 7 and Fig. 8 are schematic diagrams illustrating how the interaction decision circuit can reduce the number of search points to be processed in the first embodiment of the correction method of the present invention.
Fig. 9 and Fig. 10 are schematic diagrams of the second embodiment of the correction method of the present invention.
Fig. 11 and Fig. 12 are schematic diagrams of the third embodiment of the correction method of the present invention.
Fig. 13 is a schematic diagram illustrating that the stereoscopic interaction system of the present invention can control audio and visual effects.
Fig. 14 is a schematic diagram of the first embodiment of the eye positioning module of the present invention.
Fig. 15 is a schematic diagram of the first embodiment of the eye positioning circuit of the present invention.
Fig. 16 is a schematic diagram of another embodiment of the eye positioning module of the present invention.
Fig. 17 is a schematic diagram of another embodiment of the eye positioning circuit of the present invention.
Fig. 18 is a schematic diagram of another embodiment of the eye positioning circuit of the present invention.
Fig. 19 and Fig. 20 are schematic diagrams of another embodiment of the eye positioning circuit of the present invention.
Fig. 21 and Fig. 22 are schematic diagrams of another embodiment of the eye positioning circuit of the present invention.
Fig. 23 is a schematic diagram of another embodiment of the eye positioning module of the present invention.
Fig. 24 is a schematic diagram of the first embodiment of the three-dimensional scene sensor of the present invention.
Fig. 25 is a schematic diagram of the first embodiment of the eye coordinate generating circuit of the present invention.
Fig. 26 is a schematic diagram of another embodiment of the eye coordinate generating circuit of the present invention.
Fig. 27 is a schematic diagram of another embodiment of the eye coordinate generating circuit of the present invention.
Fig. 28 is a schematic diagram of another embodiment of the eye coordinate generating circuit of the present invention.
The reference numerals are described as follows:
110, 120, 310: stereoscopic display systems
121: display screen
122: auxiliary glasses
300: stereoscopic interaction system
320: interaction module
321: positioning module
322: interactive component
323: interactive component positioning module
324: interaction decision circuit
330: display control circuit
340: speaker
350: sound control circuit
1100, 1300, 1700: eye positioning modules
1110, 1120, 1810: image sensors
1130, 1200, 1400, 1500, 1600, 2300: eye positioning circuits
1140, 1920: three-dimensional coordinate conversion circuits
1210, 1910: eye detection circuits
1350, 2030: face detection circuits
1410, 2110, 2310: glasses detection circuits
1420, 2120, 2320: glasses coordinate conversion circuits
1530, 2230: tilt detectors
1640, 1820, 2340: infrared light-emitting components
1650: infrared light reflecting component
1660, 2360: infrared light sensing circuits
1710, 1800: three-dimensional scene sensors
1720, 1900, 2000, 2100, 2200: eye coordinate generating circuits
1830: light-sensing distance-measuring device
COND_PVO, COND_CVO: interaction judgment conditions
D_S: error distance
D_TH: interaction critical distance
D_MPR, D_MPL: distances
DIM_3D: stereoscopic image
DIM_θ1~DIM_θ8, DIM_L, DIM_R: images
INFO_D: distance information
INFO_TILT: tilt information
L_D: detecting light
L_R: reflected light
L_1L, L_1R, L_2L, L_2R, L_PL, L_PR, L_AL, L_AR, L_REFL, L_REFR, L_PJL, L_PJR: straight lines
LOC_3D_PIO, LOC_3D_CIO, LOC_3D_PVO, LOC_3D_CVO, LOC_3D_EYE, LOC_IRVO, LOC_ILVO, LOC_1CP, LOC_2CP, LOC_1LE, LOC_1RE, LOC_2LE, LOC_2RE, LOC_3D_LE, LOC_3D_RE, LOC_LE_PRE, LOC_RE_PRE, LOC_PTH, LOC_CTH, LOC_3D_IPJR, LOC_3D_IPJL, LOC_3D_SPJR, LOC_3D_SPJL, LOC_SEN1~LOC_SEN3, LOC_2D_EYE1~LOC_2D_EYE3, LOC_GLASS1, LOC_GLASS2, LOC_GLASS3, LOC_IR, LOC_MD: coordinates
MP: reference midpoint
P_A, P_X: search points
P_B: endpoint
P_C: center point
RA: search area
RT: interaction result
SC: scene
SIM_2D1~SIM_2D3: two-dimensional sensing images
SL_GLASS1~SL_GLASS3: glasses slopes
SL_IR: infrared light slope
SUF_PTH, SUF_CTH: critical surfaces
Embodiment
The invention provides a stereoscopic interaction system that, according to the user's position, corrects the position of the interactive component, or corrects the position of the virtual object in the stereoscopic image together with the interaction judgment condition. In this way, the stereoscopic interaction system of the present invention can obtain a correct interaction result from the corrected position of the interactive component, or from the corrected position of the virtual object and the corrected interaction judgment condition.
Please refer to Fig. 3 and Fig. 4, which are schematic diagrams of the stereoscopic interaction system 300 of the present invention. The stereoscopic interaction system 300 comprises a stereoscopic display system 310 and an interaction module 320. The stereoscopic display system 310 provides a stereoscopic image DIM_3D and may be implemented with the naked-eye stereoscopic display system 110 or the glasses-type stereoscopic display system 120. The interaction module 320 comprises a positioning module 321, an interactive component 322, an interactive component positioning module 323, and an interaction decision circuit 324. The positioning module 321 detects the position of the user in the scene SC to generate a three-dimensional reference coordinate. The interactive component positioning module 323 detects the position of the interactive component 322 to generate a three-dimensional interactive coordinate LOC_3D_PIO. The interaction decision circuit 324 determines the interaction result RT between the interactive component 322 and the stereoscopic image DIM_3D according to the three-dimensional reference coordinate, the three-dimensional interactive coordinate LOC_3D_PIO, and the stereoscopic image DIM_3D.
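For orientation, a minimal structural sketch (hypothetical names, not from the patent) of the interaction module just described, showing the inputs the interaction decision circuit consumes:

```python
from dataclasses import dataclass
from typing import Callable, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class VirtualObject:
    """Virtual object VO: a coordinate plus an interaction judgment condition."""
    virtual_coord: Vec3                     # LOC_3D_PVO
    judge: Callable[[Vec3], bool]           # COND_PVO, e.g. a critical-distance test

@dataclass
class InteractionModule:
    """Rough software analogue of interaction module 320 (all names are assumptions)."""
    reference_coord: Vec3    # from positioning module 321, e.g. LOC_3D_EYE
    interactive_coord: Vec3  # from interactive component positioning module 323, LOC_3D_PIO

    def decide(self, obj: VirtualObject) -> str:
        # Interaction decision circuit 324: here it simply applies the object's
        # judgment condition to the uncorrected interactive coordinate; the
        # correction methods are sketched in the embodiment sections below.
        return "contact" if obj.judge(self.interactive_coord) else "no contact"
```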
For convenience of description, the positioning module 321 is assumed in the present invention to be an eye positioning module. The eye positioning module 321 detects the position of the user's eyes in the scene SC to generate a three-dimensional eye coordinate LOC_3D_EYE as the three-dimensional reference coordinate, where LOC_3D_EYE comprises a three-dimensional left-eye coordinate LOC_3D_LE and a three-dimensional right-eye coordinate LOC_3D_RE. The interaction decision circuit 324 then determines the interaction result RT between the interactive component 322 and the stereoscopic image DIM_3D according to the three-dimensional eye coordinate LOC_3D_EYE, the three-dimensional interactive coordinate LOC_3D_PIO, and the stereoscopic image DIM_3D. The positioning module 321 of the present invention, however, is not limited to an eye detection module; for example, the positioning module 321 may locate the user by detecting other features of the user, such as the ears or the face.
The operating principle of the stereoscopic interaction system 300 of the present invention is further described below.
The stereoscopic image DIM_3D is composed of a left image DIM_L and a right image DIM_R, and contains a virtual object VO. For example, when the user plays a tennis game through the stereoscopic interaction system 300, the virtual object VO is the tennis ball, and the user controls another virtual object in DIM_3D (such as the tennis racket) through the interactive component 322. The virtual object VO has a virtual coordinate LOC_3D_PVO and an interaction judgment condition COND_PVO. More specifically, the position of the virtual object VO in the left image DIM_L provided by the stereoscopic display system 310 is LOC_ILVO, and its position in the right image DIM_R is LOC_IRVO. The interaction module 320 first assumes that the user is at a reference position (such as directly in front of the stereoscopic display system 310) and that the user's eyes are at a predetermined eye coordinate LOC_EYE_PRE, which comprises a predetermined left-eye coordinate LOC_LE_PRE and a predetermined right-eye coordinate LOC_RE_PRE. From the straight line L_PL (between the predetermined left-eye coordinate LOC_LE_PRE and the position LOC_ILVO of the virtual object VO in the left image DIM_L) and the straight line L_PR (between the predetermined right-eye coordinate LOC_RE_PRE and the position LOC_IRVO of the virtual object VO in the right image DIM_R), the stereoscopic interaction system 300 obtains the position LOC_3D_PVO at which a user located at LOC_EYE_PRE would perceive the virtual object VO, and sets the virtual coordinate of the virtual object VO to LOC_3D_PVO. More specifically, the user locates objects according to a stereoscopic imaging position model MODEL_LOC based on the images received by the two eyes: after receiving the left image DIM_L and the right image DIM_R, the user locates the stereoscopic imaging position of the virtual object VO from its position LOC_ILVO in DIM_L and its position LOC_IRVO in DIM_R through MODEL_LOC. In the present invention, the stereoscopic imaging position model MODEL_LOC is assumed to determine the stereoscopic imaging position of the virtual object VO from a first line (for example L_PL) connecting the position of VO in the left image DIM_L (for example LOC_ILVO) with the position of the user's left eye (for example LOC_LE_PRE), and a second line (for example L_PR) connecting the position of VO in the right image DIM_R (for example LOC_IRVO) with the position of the user's right eye (for example LOC_RE_PRE). When the first and second lines intersect, MODEL_LOC sets the stereoscopic imaging position of the virtual object VO to the coordinate of the intersection; when they do not intersect, MODEL_LOC first determines the reference midpoint having the minimum total distance to the two lines, and sets the stereoscopic imaging position of the virtual object VO to the coordinate of that reference midpoint. The interaction judgment condition COND_PVO of the virtual object VO is used by the interaction decision circuit 324 to determine the interaction result RT. For example, COND_PVO can be set such that when the distance between the position of the interactive component 322 and the virtual coordinate LOC_3D_PVO is smaller than an interaction critical distance D_TH, the interaction result RT indicates "contact", meaning that the interaction decision circuit 324 judges that the tennis racket controlled by the interactive component 322 touches the virtual object VO in the stereoscopic image DIM_3D (for example, hits the tennis ball); and when the distance between the position of the interactive component 322 and the virtual coordinate LOC_3D_PVO is larger than the interaction critical distance D_TH, the interaction result RT indicates "no contact", meaning that the interactive component 322 does not touch the virtual object VO in DIM_3D (for example, does not hit the ball).
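A minimal sketch (hypothetical names, not from the patent) of the interaction judgment condition COND_PVO just described, expressed as a critical-distance test around the virtual coordinate:

```python
import numpy as np

def make_critical_distance_condition(virtual_coord, d_th):
    """Build a COND_PVO-style judgment: 'contact' when the interactive
    component is within the interaction critical distance D_TH of the
    virtual coordinate LOC_3D_PVO."""
    centre = np.asarray(virtual_coord, float)

    def judge(interactive_coord):
        return np.linalg.norm(np.asarray(interactive_coord, float) - centre) < d_th

    return judge

# Usage (assumed values): cond_pvo = make_critical_distance_condition(loc_3d_pvo, d_th=0.05)
#                         rt = "contact" if cond_pvo(loc_3d_pio) else "no contact"
```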
In the present invention, the interaction decision circuit 324 determines the interaction result RT according to the three-dimensional eye coordinate (three-dimensional reference coordinate) LOC_3D_EYE, the three-dimensional interactive coordinate LOC_3D_PIO, and the stereoscopic image DIM_3D. More specifically, when the user does not watch the stereoscopic image DIM_3D from the predetermined eye coordinate LOC_EYE_PRE assumed by the stereoscopic interaction system 300, the position at which the user perceives the virtual object VO changes and the virtual object VO may appear somewhat deformed, which leads to an incorrect interaction result RT. The invention therefore provides three embodiments of a correction method, described below.
In the first embodiment of the correction method of the present invention, the interaction decision circuit 324 corrects, according to the position from which the user actually watches the stereoscopic image DIM_3D (the three-dimensional eye coordinate LOC_3D_EYE), the position at which the user actually intends to interact through the interactive component 322, and thereby obtains the correct interaction result RT. More specifically, using the stereoscopic imaging position model MODEL_LOC, the interaction decision circuit 324 computes the position at which the virtual object controlled by the interactive component 322 (such as the tennis racket) would be perceived if the user's eyes were at the predetermined eye coordinate LOC_EYE_PRE; this position is the three-dimensional corrected interactive coordinate LOC_3D_CIO. The interaction decision circuit 324 then determines, from LOC_3D_CIO, the virtual coordinate LOC_3D_PVO of the virtual object VO, and the interaction judgment condition COND_PVO, the interaction result RT that would be observed if the user's eyes were at LOC_EYE_PRE. Since the interaction result RT should not change with the user's position, the interaction result obtained in this way is the interaction result RT observed by a user whose eyes are at the three-dimensional eye coordinate LOC_3D_EYE.
Please refer to Fig. 5, a schematic diagram of the first embodiment of the correction method of the present invention. The interaction decision circuit 324 converts the three-dimensional interactive coordinate LOC_3D_PIO into the three-dimensional corrected interactive coordinate LOC_3D_CIO according to the three-dimensional eye coordinate (three-dimensional reference coordinate) LOC_3D_EYE. More specifically, from LOC_3D_EYE and LOC_3D_PIO the interaction decision circuit 324 computes the position at which the user would perceive the interactive component 322 if the eyes were at the predetermined eye coordinate LOC_EYE_PRE (that is, the three-dimensional corrected interactive coordinate LOC_3D_CIO). For example, the coordinate system of the predetermined eye coordinate LOC_EYE_PRE contains a plurality of search points P (such as the search point P_A in Fig. 5). From the search point P_A and the predetermined eye coordinates LOC_LE_PRE and LOC_RE_PRE, the interaction decision circuit 324 obtains the left search projection coordinate LOC_3D_SPJL of P_A projected onto the left image DIM_L and the right search projection coordinate LOC_3D_SPJR of P_A projected onto the right image DIM_R. Using the stereoscopic imaging position model MODEL_LOC assumed in the present invention, the interaction decision circuit 324 then obtains, from the search projection coordinates LOC_3D_SPJL and LOC_3D_SPJR and the three-dimensional eye coordinate LOC_3D_EYE, the endpoint P_B corresponding to the search point P_A in the coordinate system of LOC_3D_EYE, and further computes the error distance D_S between the endpoint P_B and the three-dimensional interactive coordinate LOC_3D_PIO. In this way, the interaction decision circuit 324 can compute the error distance D_S for every search point P in the coordinate system of LOC_EYE_PRE. When one search point (for example P_X) has the smallest error distance D_S, the interaction decision circuit 324 determines the three-dimensional corrected interactive coordinate LOC_3D_CIO from the position of P_X. When the user's eyes are at LOC_3D_EYE, the position of every virtual object in the stereoscopic image DIM_3D perceived by the user is transformed from the coordinate system of LOC_EYE_PRE to the coordinate system of LOC_3D_EYE. When LOC_3D_CIO is computed by the method illustrated in Fig. 5, the direction of this coordinate transformation is the same as the direction in which the perceived virtual objects are transformed, so the error introduced by the non-linear coordinate transformation is reduced and a more accurate three-dimensional corrected interactive coordinate LOC_3D_CIO is obtained.
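A compact sketch (hypothetical helper names, not the patent's implementation) of the search described for the first embodiment: each candidate point in the predetermined-eye coordinate system is projected onto the display through the predetermined eyes, mapped back through the actual eyes with the imaging model, and compared against LOC_3D_PIO; the candidate with the smallest error distance D_S yields LOC_3D_CIO. It reuses `perceived_position` from the earlier sketch and assumes the display lies in the plane z = 0.

```python
import numpy as np

def project_to_display(eye, point, display_z=0.0):
    """Project 'point' onto the display plane z = display_z along the line
    from 'eye' through 'point' (display assumed to be the plane z = 0)."""
    eye, point = np.asarray(eye, float), np.asarray(point, float)
    t = (display_z - eye[2]) / (point[2] - eye[2])
    return eye + t * (point - eye)

def corrected_interactive_coord(loc_3d_pio, eyes_actual, eyes_pre, candidates):
    """First-embodiment search: return the candidate search point P_X whose
    endpoint P_B (as perceived from the actual eyes) is closest to LOC_3D_PIO."""
    eye_l_act, eye_r_act = eyes_actual       # LOC_3D_LE, LOC_3D_RE
    eye_l_pre, eye_r_pre = eyes_pre          # LOC_LE_PRE, LOC_RE_PRE
    best, best_err = None, np.inf
    for p_a in candidates:                   # search points P in LOC_EYE_PRE's frame
        # Left/right search projection coordinates LOC_3D_SPJL / LOC_3D_SPJR
        spjl = project_to_display(eye_l_pre, p_a)
        spjr = project_to_display(eye_r_pre, p_a)
        # Endpoint P_B: where the actual eyes would perceive those projections
        p_b = perceived_position(eye_l_act, spjl, eye_r_act, spjr)
        err = np.linalg.norm(p_b - np.asarray(loc_3d_pio, float))  # error distance D_S
        if err < best_err:
            best, best_err = np.asarray(p_a, float), err
    return best   # three-dimensional corrected interactive coordinate LOC_3D_CIO
```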
To reduce the computation required when the interaction decision circuit 324 evaluates the error distance D_S of the search points P in the coordinate system of the predetermined eye coordinate LOC_EYE_PRE in the first embodiment of the correction method, the present invention further provides a simplification that reduces the number of search points P the interaction decision circuit 324 has to process. Please refer to Fig. 6, Fig. 7 and Fig. 8, which illustrate how the interaction decision circuit 324 can reduce the number of search points to be processed in the first embodiment of the correction method. The interaction decision circuit 324 converts the three-dimensional interactive coordinate LOC_3D_PIO in the coordinate system of the three-dimensional eye coordinate LOC_3D_EYE into a center point P_C in the coordinate system of the predetermined eye coordinate LOC_EYE_PRE. Because the center point P_C corresponds to LOC_3D_PIO, the search point P_X with the smallest error distance D_S will, in general, lie close to P_C. The interaction decision circuit 324 therefore only needs to compute the error distance D_S of the search points P adjacent to P_C to find the search point P_X with the smallest error distance D_S, and to determine the three-dimensional corrected interactive coordinate LOC_3D_CIO accordingly.
More specifically, as shown in Fig. 6, the three-dimensional interactive coordinate LOC_3D_PIO of the interactive component 322 and the user's three-dimensional left-eye coordinate LOC_3D_LE form a projection line L_PJL, which intersects the stereoscopic display system 310 at the position LOC_3D_IPJL; LOC_3D_IPJL is the three-dimensional left interaction coordinate at which the user sees the interactive component 322 projected onto the left image DIM_L provided by the stereoscopic display system 310. Similarly, LOC_3D_PIO and the user's three-dimensional right-eye coordinate LOC_3D_RE form a projection line L_PJR, which intersects the stereoscopic display system 310 at the position LOC_3D_IPJR; LOC_3D_IPJR is the three-dimensional right interaction coordinate at which the user sees the interactive component 322 projected onto the right image DIM_R. In other words, the interaction decision circuit 324 derives, from the three-dimensional eye coordinate LOC_3D_EYE and the three-dimensional interactive coordinate LOC_3D_PIO, the three-dimensional left interaction coordinate LOC_3D_IPJL and the three-dimensional right interaction coordinate LOC_3D_IPJR of the interactive component 322 projected onto the stereoscopic display system 310. The interaction decision circuit 324 then determines a left reference line L_REFL from LOC_3D_IPJL and the predetermined left-eye coordinate LOC_LE_PRE, and a right reference line L_REFR from LOC_3D_IPJR and the predetermined right-eye coordinate LOC_RE_PRE, and obtains the center point P_C in the coordinate system of LOC_EYE_PRE from L_REFL and L_REFR. For example, when L_REFL and L_REFR intersect at an intersection point CP (as shown in Fig. 6), the interaction decision circuit 324 determines the center point P_C from the position of CP. When L_REFL and L_REFR do not intersect (as shown in Fig. 7), the interaction decision circuit 324 obtains the reference midpoint MP having the minimum total distance to L_REFL and L_REFR, where the distance D_MPL between MP and L_REFL equals the distance D_MPR between MP and L_REFR; the reference midpoint MP is then the center point P_C. After obtaining P_C, as shown in Fig. 8, the interaction decision circuit 324 determines a search area RA around P_C and computes the error distance D_S only for the search points P within RA. Compared with the exhaustive search illustrated in Fig. 5, the approach illustrated in Fig. 6, Fig. 7 and Fig. 8 therefore further saves the computation the interaction decision circuit 324 needs to obtain the three-dimensional corrected interactive coordinate LOC_3D_CIO.
Please refer to Fig. 9 and Fig. 10, schematic diagrams of the second embodiment of the correction method of the present invention. The interaction decision circuit 324 converts the three-dimensional interactive coordinate LOC_3D_PIO into the three-dimensional corrected interactive coordinate LOC_3D_CIO according to the three-dimensional eye coordinate (three-dimensional reference coordinate) LOC_3D_EYE. More specifically, from LOC_3D_EYE and LOC_3D_PIO the interaction decision circuit 324 computes the position at which the user would perceive the interactive component 322 if the eyes were at the predetermined eye coordinate LOC_EYE_PRE (that is, LOC_3D_CIO). For example, as shown in Fig. 9, the three-dimensional interactive coordinate LOC_3D_PIO of the interactive component 322 and the user's three-dimensional left-eye coordinate LOC_3D_LE form a projection line L_PJL, which intersects the stereoscopic display system 310 at LOC_3D_IPJL, the three-dimensional left interaction coordinate at which the user sees the interactive component 322 projected onto the left image DIM_L; similarly, LOC_3D_PIO and the three-dimensional right-eye coordinate LOC_3D_RE form a projection line L_PJR, which intersects the stereoscopic display system 310 at LOC_3D_IPJR, the three-dimensional right interaction coordinate at which the user sees the interactive component 322 projected onto the right image DIM_R. That is, the interaction decision circuit 324 derives LOC_3D_IPJL and LOC_3D_IPJR from LOC_3D_EYE and LOC_3D_PIO. The interaction decision circuit 324 then determines a left reference line L_REFL from LOC_3D_IPJL and the predetermined left-eye coordinate LOC_LE_PRE, and a right reference line L_REFR from LOC_3D_IPJR and the predetermined right-eye coordinate LOC_RE_PRE. From L_REFL and L_REFR, the interaction decision circuit 324 obtains the position at which the user would perceive the interactive component 322 if the eyes were at LOC_EYE_PRE, namely the three-dimensional corrected interactive coordinate LOC_3D_CIO. Specifically, when L_REFL and L_REFR intersect at an intersection point CP, the coordinate of CP is LOC_3D_CIO; when they do not intersect (as shown in Fig. 10), the interaction decision circuit 324 obtains the reference midpoint MP having the minimum total distance to L_REFL and L_REFR, where the distance D_MPL between MP and L_REFL equals the distance D_MPR between MP and L_REFR, and the coordinate of MP is regarded as the position at which the user at LOC_EYE_PRE would perceive the interactive component 322 (LOC_3D_CIO). The interaction decision circuit 324 can therefore determine the interaction result RT from LOC_3D_CIO, the virtual coordinate LOC_3D_PVO of the virtual object VO, and the interaction judgment condition COND_PVO. Compared with the first embodiment, in the second embodiment the interaction decision circuit 324 obtains LOC_3D_IPJL and LOC_3D_IPJR from LOC_3D_PIO and LOC_3D_EYE, and then obtains LOC_3D_CIO from LOC_3D_IPJL, LOC_3D_IPJR and the predetermined eye coordinate LOC_EYE_PRE. In other words, the second embodiment converts the three-dimensional interactive coordinate LOC_3D_PIO, expressed in the coordinate system of LOC_3D_EYE, into a position in the coordinate system of LOC_EYE_PRE and uses that position as LOC_3D_CIO. Because the transformation between the coordinate system of LOC_EYE_PRE and that of LOC_3D_EYE is not linear (converting LOC_3D_CIO back in the analogous way does not recover LOC_3D_PIO), the LOC_3D_CIO obtained by the second embodiment is an approximation compared with that of the first embodiment. However, with the second embodiment the interaction decision circuit 324 does not need to compute the error distance D_S of the search points P, which greatly reduces the required computation.
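A compact sketch (hypothetical names, reusing `project_to_display` and `perceived_position` from the earlier sketches, display assumed at z = 0) of the construction shared by Figs. 6 to 10: the interactive component is projected through the actual eyes onto the display, giving LOC_3D_IPJL / LOC_3D_IPJR, and then re-imaged through the predetermined eyes via the reference lines L_REFL / L_REFR. In the first embodiment the resulting point serves as the center point P_C around which the search area RA is placed; in the second embodiment it is used directly as the corrected interactive coordinate LOC_3D_CIO.

```python
import numpy as np

def reproject_through_predetermined_eyes(loc_3d_pio, eyes_actual, eyes_pre):
    """Project LOC_3D_PIO onto the display through the actual eyes and locate
    the result again through the predetermined eyes (intersection of the
    reference lines L_REFL / L_REFR, or their reference midpoint MP)."""
    eye_l_act, eye_r_act = eyes_actual       # LOC_3D_LE, LOC_3D_RE
    eye_l_pre, eye_r_pre = eyes_pre          # LOC_LE_PRE, LOC_RE_PRE
    ipjl = project_to_display(eye_l_act, loc_3d_pio)   # LOC_3D_IPJL
    ipjr = project_to_display(eye_r_act, loc_3d_pio)   # LOC_3D_IPJR
    return perceived_position(eye_l_pre, ipjl, eye_r_pre, ipjr)

def search_area(p_c, radius=0.05, step=0.01):
    """Search area RA (first embodiment): a grid of search points around P_C."""
    offsets = np.arange(-radius, radius + step / 2, step)
    return [p_c + np.array([dx, dy, dz])
            for dx in offsets for dy in offsets for dz in offsets]

# Second embodiment: loc_3d_cio = reproject_through_predetermined_eyes(pio, act, pre)
# First embodiment:  loc_3d_cio = corrected_interactive_coord(
#     pio, act, pre, search_area(reproject_through_predetermined_eyes(pio, act, pre)))
```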
In the third embodiment of the correction method of the present invention, the interaction decision circuit 324 corrects the stereoscopic image DIM_3D (that is, the virtual coordinate LOC_3D_PVO of the virtual object VO and the interaction judgment condition COND_PVO) according to the position from which the user actually sees DIM_3D (the three-dimensional left-eye coordinate LOC_3D_LE and the three-dimensional right-eye coordinate LOC_3D_RE shown in Fig. 4), and thereby obtains the correct interaction result RT. More specifically, from the three-dimensional eye coordinate LOC_3D_EYE (LOC_3D_LE and LOC_3D_RE), the virtual coordinate LOC_3D_PVO, and the interaction judgment condition COND_PVO, the interaction decision circuit 324 computes the position at which the user actually sees the virtual object VO and the interaction judgment condition the user actually experiences when viewing from LOC_3D_EYE. The interaction decision circuit 324 can then determine the correct interaction result from the position of the interactive component 322 (the three-dimensional interactive coordinate LOC_3D_PIO), the position at which the user actually sees the virtual object VO (the corrected coordinate shown in Fig. 4), and the interaction judgment condition the user actually experiences (the corrected judgment condition shown in Fig. 4).
Please refer to Fig. 11 and Fig. 12, schematic diagrams of the third embodiment of the correction method of the present invention. In the third embodiment, the interaction decision circuit 324 corrects the stereoscopic image DIM_3D according to the three-dimensional eye coordinate (three-dimensional reference coordinate) LOC_3D_EYE in order to obtain the correct interaction result RT. More specifically, the interaction decision circuit 324 converts the virtual coordinate LOC_3D_PVO of the virtual object VO into a corrected virtual coordinate LOC_3D_CVO according to LOC_3D_EYE, and converts the interaction judgment condition COND_PVO into a corrected interaction judgment condition COND_CVO according to LOC_3D_EYE. The interaction decision circuit 324 then determines the interaction result RT from the three-dimensional interactive coordinate LOC_3D_PIO, the corrected virtual coordinate LOC_3D_CVO, and the corrected interaction judgment condition COND_CVO. For example, as shown in Fig. 11, the user watches the stereoscopic image DIM_3D from the three-dimensional left-eye coordinate LOC_3D_LE and the three-dimensional right-eye coordinate LOC_3D_RE. The interaction decision circuit 324 can therefore obtain, from the straight line L_AL (between LOC_3D_LE and the position LOC_ILVO of the virtual object VO in the left image DIM_L) and the straight line L_AR (between LOC_3D_RE and the position LOC_IRVO of the virtual object VO in the right image DIM_R), the position LOC_3D_CVO at which the user viewing from LOC_3D_EYE perceives the virtual object VO. In this way, the interaction decision circuit 324 obtains, from LOC_3D_EYE and the virtual coordinate LOC_3D_PVO, the position at which the user actually sees the virtual object VO (the corrected virtual coordinate LOC_3D_CVO). As shown in Fig. 12, the interaction judgment condition COND_PVO is determined by the interaction critical distance D_TH and the position of the virtual object VO, and can be regarded as a critical surface SUF_PTH centered on the position of the virtual object VO with radius D_TH. When the interactive component 322 enters the critical surface SUF_PTH, the interaction result RT determined by the interaction decision circuit 324 indicates "contact"; when it does not, RT indicates "no contact". Because the critical surface SUF_PTH can be regarded as formed by many critical points P_TH, each located at its virtual coordinate LOC_PTH, the interaction decision circuit 324 can use the method illustrated in Fig. 11 to obtain, from LOC_3D_EYE, the corrected virtual coordinate LOC_CTH of each critical point P_TH as actually experienced by the user. The corrected virtual coordinates LOC_CTH of all the critical points P_TH then form the corrected critical surface SUF_CTH, which is the corrected interaction judgment condition COND_CVO. That is, when the three-dimensional interactive coordinate LOC_3D_PIO of the interactive component 322 enters the corrected critical surface SUF_CTH, the interaction result RT determined by the interaction decision circuit 324 indicates "contact" (as shown in Fig. 12). In this way, the interaction decision circuit 324 can correct the stereoscopic image DIM_3D (the virtual coordinate LOC_3D_PVO and the interaction judgment condition COND_PVO of the virtual object VO) according to LOC_3D_EYE to obtain the position at which the user actually sees the virtual object VO (LOC_3D_CVO) and the interaction judgment condition the user actually experiences (COND_CVO), and can then correctly determine the interaction result RT from LOC_3D_PIO, LOC_3D_CVO and COND_CVO. In addition, in the general case the difference between COND_PVO and COND_CVO is small; for example, when the critical surface SUF_PTH is a sphere of radius D_TH, the corrected critical surface SUF_CTH is also a sphere with a radius close to D_TH. Therefore, in the third embodiment it is also possible to correct only the virtual coordinate LOC_3D_PVO of the virtual object VO without correcting the interaction judgment condition COND_PVO, in order to save the computation required by the interaction decision circuit 324; in other words, the interaction decision circuit 324 may compute the interaction result RT from the corrected virtual coordinate LOC_3D_CVO and the original interaction judgment condition COND_PVO.
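A minimal sketch (hypothetical names, reusing `perceived_position` from the earlier sketch) of the third embodiment: the virtual object's image-plane positions are re-imaged through the actual eyes to get the corrected virtual coordinate LOC_3D_CVO, and the hit test is then run against that corrected coordinate, here with the original critical distance D_TH, as the text notes is usually sufficient.

```python
import numpy as np

def corrected_virtual_coord(img_l_pos, img_r_pos, eye_l_act, eye_r_act):
    """Corrected virtual coordinate LOC_3D_CVO: where the user at the actual
    eye positions perceives the object shown at LOC_ILVO / LOC_IRVO."""
    return perceived_position(eye_l_act, img_l_pos, eye_r_act, img_r_pos)

def interaction_result(loc_3d_pio, loc_3d_cvo, d_th):
    """Apply the uncorrected critical-distance condition to the corrected
    virtual coordinate, as suggested for saving computation."""
    dist = np.linalg.norm(np.asarray(loc_3d_pio, float) - np.asarray(loc_3d_cvo, float))
    return "contact" if dist < d_th else "no contact"
```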
In the third embodiment of the correction method, the interaction decision circuit 324 corrects the stereoscopic image DIM_3D (the virtual coordinate LOC_3D_PVO and the interaction judgment condition COND_PVO) according to the position from which the user actually sees DIM_3D (the three-dimensional eye coordinate LOC_3D_EYE). Consequently, if the stereoscopic image DIM_3D contains a plurality of virtual objects (for example VO_1~VO_M), the interaction decision circuit 324 must compute a corrected virtual coordinate and a corrected interaction judgment condition for each virtual object VO_1~VO_M; the amount of data to be processed therefore grows with the number of virtual objects. In the first and second embodiments, by contrast, the interaction decision circuit 324 corrects the position of the interactive component 322 (the three-dimensional interactive coordinate LOC_3D_PIO) according to the position from which the user watches DIM_3D (LOC_3D_EYE), so only the three-dimensional corrected interactive coordinate LOC_3D_CIO of the interactive component 322 needs to be computed. In other words, compared with the third embodiment, the amount of data the interaction decision circuit 324 has to process in the first and second embodiments does not change even when the number of virtual objects increases.
Please refer to Fig. 13, a schematic diagram illustrating that the stereoscopic interaction system 300 of the present invention can control audio and visual effects. The stereoscopic interaction system 300 further comprises a display control circuit 330, a speaker 340, and a sound control circuit 350. The display control circuit 330 adjusts the stereoscopic image DIM_3D provided by the stereoscopic display system 310 according to the interaction result RT; for example, when the interaction decision circuit 324 determines that RT indicates "contact", the display control circuit 330 controls the stereoscopic display system 310 to display the stereoscopic image DIM_3D of the virtual object VO (the tennis ball) being hit by the interactive component 322 (corresponding to the tennis racket). The sound control circuit 350 adjusts the sound provided by the speaker 340 according to the interaction result RT; for example, when RT indicates "contact", the sound control circuit 350 controls the speaker 340 to output the sound of the virtual object VO (the tennis ball) being hit by the interactive component 322 (corresponding to the tennis racket).
Please refer to Fig. 14, a schematic diagram of the embodiment 1100 of the eye positioning module of the present invention. The eye positioning module 1100 comprises image sensors 1110 and 1120, an eye positioning circuit 1130, and a three-dimensional coordinate conversion circuit 1140. The image sensors 1110 and 1120 sense the scene SC containing the user's position and generate two-dimensional sensing images SIM_2D1 and SIM_2D2 respectively; the image sensor 1110 is placed at the sensing position LOC_SEN1 and the image sensor 1120 at the sensing position LOC_SEN2. The eye positioning circuit 1130 obtains, from the two-dimensional sensing images SIM_2D1 and SIM_2D2, the two-dimensional eye coordinate LOC_2D_EYE1 of the user's eyes in SIM_2D1 and the two-dimensional eye coordinate LOC_2D_EYE2 of the user's eyes in SIM_2D2. The three-dimensional coordinate conversion circuit 1140 computes the three-dimensional eye coordinate LOC_3D_EYE of the user's eyes from the two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 and the positions LOC_SEN1 and LOC_SEN2 of the image sensors 1110 and 1120; its operating principle is well known in the art and is not repeated here.
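As the conversion is standard stereo triangulation, a minimal sketch follows (assuming rectified sensors with a known baseline, focal length, and principal point; these parameters and names are assumptions for illustration, not the patent's implementation):

```python
def triangulate_eye(x_left_px, x_right_px, y_px, baseline_m, focal_px, cx, cy):
    """Triangulate a 3D point from a rectified stereo pair.

    x_left_px / x_right_px: horizontal pixel coordinate of the same eye in
    SIM_2D1 / SIM_2D2; y_px: vertical pixel coordinate; baseline_m: distance
    between the two image sensors; focal_px: focal length in pixels;
    cx, cy: principal point.
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must be in front of both sensors")
    z = focal_px * baseline_m / disparity        # depth from the sensor baseline
    x = (x_left_px - cx) * z / focal_px          # lateral position
    y = (y_px - cy) * z / focal_px               # vertical position
    return (x, y, z)                              # one component of LOC_3D_EYE
```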
Please refer to Fig. 15, a schematic diagram of the embodiment 1200 of the eye positioning circuit of the present invention. The eye positioning circuit 1200 comprises an eye detection circuit 1210. The eye detection circuit 1210 detects the user's eyes in the two-dimensional sensing image SIM_2D1 to obtain the two-dimensional eye coordinate LOC_2D_EYE1, and detects the user's eyes in the two-dimensional sensing image SIM_2D2 to obtain the two-dimensional eye coordinate LOC_2D_EYE2. Eye detection is well known in the art and is not described further.
Please refer to Fig. 16, a schematic diagram of the embodiment 1300 of the eye positioning module of the present invention. Compared with the eye positioning module 1100, the eye positioning module 1300 further comprises a face detection circuit 1350. The face detection circuit 1350 identifies the range of the user's face HM_1 in the two-dimensional sensing image SIM_2D1 and the range of the user's face HM_2 in the two-dimensional sensing image SIM_2D2; face detection is well known in the art and is not described further. With the face detection circuit 1350, the eye positioning circuit 1130 only needs to process the data within the ranges of the faces HM_1 and HM_2 to obtain the two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2. Compared with the eye positioning module 1100, the eye positioning module 1300 therefore reduces the portion of the two-dimensional sensing images SIM_2D1 and SIM_2D2 that the eye positioning circuit 1130 must process, and increases the processing speed of the eye positioning module.
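A minimal sketch (using OpenCV cascade classifiers as a stand-in; the specific detectors are assumptions, not the patent's circuits) of restricting eye detection to the detected face range, as described above:

```python
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def eyes_in_image(sensing_image):
    """Detect the face first (face detection circuit 1350), then search for
    eyes only inside the face range (eye positioning circuit 1130)."""
    gray = cv2.cvtColor(sensing_image, cv2.COLOR_BGR2GRAY)
    eyes_found = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        roi = gray[y:y + h, x:x + w]              # restrict processing to HM_1 / HM_2
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi, 1.1, 5):
            eyes_found.append((x + ex + ew // 2, y + ey + eh // 2))  # eye centers
    return eyes_found   # e.g. LOC_2D_EYE1 for this sensing image
```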
When the stereoscopic display system 310 is implemented as a glasses-type stereoscopic display system, the user's eyes may be covered by the auxiliary glasses. Fig. 17 therefore shows another embodiment 1400 of the eye positioning circuit of the present invention. The stereoscopic display system 310 comprises a display screen 311 and auxiliary glasses 312; the user wears the auxiliary glasses 312 to receive the left image DIM_L and the right image DIM_R provided by the display screen 311. The eye positioning circuit 1400 comprises a glasses detection circuit 1410 and a glasses coordinate conversion circuit 1420. The glasses detection circuit 1410 detects the auxiliary glasses 312 in the two-dimensional sensing image SIM_2D1 to obtain a two-dimensional glasses coordinate LOC_GLASS1 and a glasses slope SL_GLASS1, and detects the auxiliary glasses 312 in the two-dimensional sensing image SIM_2D2 to obtain a two-dimensional glasses coordinate LOC_GLASS2 and a glasses slope SL_GLASS2. The glasses coordinate conversion circuit 1420 computes the user's two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 from the two-dimensional glasses coordinates LOC_GLASS1 and LOC_GLASS2, the glasses slopes SL_GLASS1 and SL_GLASS2, and a known interocular distance D_EYE entered by the user or preset in the stereoscopic interaction system 300. In this way, even when the user's eyes are covered by the glasses, the eye positioning module of the present invention can still obtain the two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 through the eye positioning circuit 1400.
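A minimal sketch of the conversion just described, under the assumptions (not stated in the patent) that the glasses coordinate is the midpoint between the lenses, that the eyes lie on the line through that point with the detected slope, and that the known interocular distance D_EYE has already been expressed in image pixels:

```python
import math

def eyes_from_glasses(glass_center, glass_slope, d_eye_px):
    """Estimate the two 2D eye coordinates from the detected glasses.

    glass_center: (x, y) two-dimensional glasses coordinate (e.g. LOC_GLASS1).
    glass_slope:  detected glasses slope (e.g. SL_GLASS1), dy/dx in the image.
    d_eye_px:     known interocular distance D_EYE expressed in pixels.
    """
    cx, cy = glass_center
    norm = math.hypot(1.0, glass_slope)           # unit vector along the glasses line
    ux, uy = 1.0 / norm, glass_slope / norm
    half = d_eye_px / 2.0
    left_eye = (cx - half * ux, cy - half * uy)
    right_eye = (cx + half * ux, cy + half * uy)
    return left_eye, right_eye                    # e.g. the pair forming LOC_2D_EYE1
```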
Please refer to Fig. 18, a schematic diagram of another embodiment 1500 of the eye positioning circuit of the present invention. Compared with the eye positioning circuit 1400, the eye positioning circuit 1500 further comprises a tilt detector 1530. The tilt detector 1530 can be arranged on the auxiliary glasses 312 and generates tilt information INFO_TILT according to the tilt angle of the auxiliary glasses 312; for example, the tilt detector 1530 is a gyroscope. When the two-dimensional sensing images SIM_2D1 and SIM_2D2 contain only a few pixels corresponding to the auxiliary glasses 312, the glasses slopes SL_GLASS1 and SL_GLASS2 computed by the glasses detection circuit 1410 are prone to error. With the tilt information INFO_TILT provided by the tilt detector 1530, the glasses coordinate conversion circuit 1420 can correct the glasses slopes computed by the glasses detection circuit 1410: it corrects SL_GLASS1 and SL_GLASS2 according to INFO_TILT and accordingly generates corrected glasses slopes SL_GLASS1_C and SL_GLASS2_C. The glasses coordinate conversion circuit 1420 then computes the two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 from the two-dimensional glasses coordinates LOC_GLASS1 and LOC_GLASS2, the corrected glasses slopes SL_GLASS1_C and SL_GLASS2_C, and the known interocular distance D_EYE. In other words, compared with the eye positioning circuit 1400, in the eye positioning circuit 1500 the glasses coordinate conversion circuit 1420 can correct the error produced when the glasses detection circuit 1410 computes the glasses slopes SL_GLASS1 and SL_GLASS2, and thus compute the user's two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 more accurately.
Please refer to Figure 19. Figure 19 is a diagram of another embodiment 1600 of the eye positioning circuit provided by the present invention. Compared with the eye positioning circuit 1400, the eye positioning circuit 1600 further comprises an infrared light emitting component 1640, an infrared light reflecting component 1650, and an infrared light sensing circuit 1660. The infrared light emitting component 1640 is used to emit a detecting light L_D to the scene SC. The infrared light reflecting component 1650 is disposed on the auxiliary glasses 312 and reflects the detecting light L_D to produce a reflected light L_R. The infrared light sensing circuit 1660 generates, according to the reflected light L_R, a two-dimensional infrared coordinate LOC_IR corresponding to the position of the auxiliary glasses 312 and an infrared slope SL_IR corresponding to the tilt angle of the auxiliary glasses 312. Similar to the description of Figure 18, the glasses coordinate converting circuit 1420 can correct the glasses slopes SL_GLASS1 and SL_GLASS2 calculated by the glasses detecting circuit 1410 according to the information provided by the infrared light sensing circuit 1660 (the two-dimensional infrared coordinate LOC_IR and the infrared slope SL_IR), and accordingly generate corrected glasses slopes SL_GLASS1_C and SL_GLASS2_C. Thus, compared with the eye positioning circuit 1400, in the eye positioning circuit 1600 the glasses coordinate converting circuit 1420 can correct the error produced when the glasses detecting circuit 1410 calculates the glasses slopes SL_GLASS1 and SL_GLASS2, so as to calculate the user's two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 more accurately. In addition, the eye positioning circuit 1600 may have a plurality of infrared light reflecting components 1650. For example, in Figure 20 the eye positioning circuit 1600 has two infrared light reflecting components 1650, disposed respectively at positions corresponding to the user's eyes; in Figure 20 the infrared light reflecting components 1650 are disposed above the user's eyes as an example. The eye positioning circuit 1600 of Figure 19 has only one infrared light reflecting component 1650, so the infrared light sensing circuit 1660 has to detect the directivity of that single infrared light reflecting component 1650 to calculate the infrared slope SL_IR. In Figure 20, however, when the infrared light sensing circuit 1660 detects the reflected light L_R produced by the two infrared light reflecting components 1650, it can detect the positions of the two infrared light reflecting components 1650 and calculate the infrared slope SL_IR directly from them. Therefore, the eye positioning circuit 1600 implemented in the manner of Figure 20 can obtain the infrared slope SL_IR more simply and more accurately, so as to calculate the user's two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 more accurately.
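The advantage of the two-reflector arrangement of Figure 20 can be illustrated with a short sketch: with two detected infrared blobs the slope follows directly from their image positions, whereas a single reflector would require estimating the orientation of one blob. The blob coordinates below are purely illustrative.

```python
def ir_slope_from_reflectors(blob_a, blob_b):
    """Infrared slope estimated from the image positions of two
    reflecting components mounted above the user's eyes (the Figure 20
    arrangement)."""
    (x1, y1), (x2, y2) = blob_a, blob_b
    if x1 == x2:                 # vertical line: slope is undefined
        return float('inf')
    return (y2 - y1) / (x2 - x1)

print(ir_slope_from_reflectors((300, 238), (360, 242)))  # small positive tilt
```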
In addition, in the eye positioning circuit 1600 illustrated in Figures 19 and 20, when the user's head rotates by a large amount, the infrared light reflecting component 1650 may be deflected by too great an angle, so that the infrared light sensing circuit 1660 cannot sense enough energy of the reflected light L_R and consequently cannot calculate the infrared slope SL_IR correctly. Therefore, the present invention further provides another embodiment 2300 of the eye positioning circuit. Figures 21 and 22 are diagrams illustrating the eye positioning circuit 2300. Compared with the eye positioning circuit 1400, the eye positioning circuit 2300 further comprises one or more infrared light emitting components 2340 and an infrared light sensing circuit 2360. The structures and operating principles of the infrared light emitting component 2340 and the infrared light sensing circuit 2360 are similar to those of the infrared light emitting component 1640 and the infrared light sensing circuit 1660, respectively. In the eye positioning circuit 2300, the infrared light emitting components 2340 are disposed directly at positions corresponding to the user's eyes. In this way, even when the user's head rotates by a large amount, the infrared light sensing circuit 2360 can still sense enough energy of the detecting light L_D to detect the infrared light emitting components 2340 and accordingly calculate the infrared slope SL_IR. In Figure 21, the eye positioning circuit 2300 has one infrared light emitting component 2340, disposed approximately at the middle position between the user's eyes. In Figure 22, the eye positioning circuit 2300 has two infrared light emitting components 2340, disposed respectively above the user's eyes. Therefore, whereas Figure 21 has only one infrared light emitting component 2340, in Figure 22 the infrared light sensing circuit 2360 can, when it detects the two infrared light emitting components 2340, calculate the infrared slope SL_IR directly from the positions of the two infrared light emitting components 2340 without having to detect the directivity of a single infrared light emitting component 2340. Therefore, the eye positioning circuit 2300 implemented in the manner of Figure 22 can obtain the infrared slope SL_IR more simply and more accurately, so as to calculate the user's two-dimensional eye coordinates LOC_2D_EYE1 and LOC_2D_EYE2 more accurately.
Please refer to Figure 23. Figure 23 is a diagram of another embodiment 1700 of the eye positioning module of the present invention. The eye positioning module 1700 comprises a three-dimensional scene sensor 1710 and an eye coordinate generating circuit 1720. The three-dimensional scene sensor 1710 is used to sense the scene SC containing the user, to generate a two-dimensional sensing image SIM_2D3 and distance information INFO_D corresponding to the two-dimensional sensing image SIM_2D3. The distance information INFO_D has data of the distance between every point in the two-dimensional sensing image SIM_2D3 and the three-dimensional scene sensor 1710. The eye coordinate generating circuit 1720 is used to generate the three-dimensional eye coordinate LOC_3D_EYE according to the two-dimensional sensing image SIM_2D3 and the distance information INFO_D. For example, the eye coordinate generating circuit 1720 recognizes the pixels corresponding to the user's eyes in the two-dimensional sensing image SIM_2D3; then, according to the distance information INFO_D, the eye coordinate generating circuit 1720 obtains the distance between the three-dimensional scene sensor 1710 and the part of the scene SC sensed by those pixels. In this way, the eye coordinate generating circuit 1720 can generate the three-dimensional eye coordinate LOC_3D_EYE according to the positions of the pixels corresponding to the user's eyes in the two-dimensional sensing image SIM_2D3 and the corresponding distance data in the distance information INFO_D.
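As an illustration, the following sketch back-projects the eye pixel found in SIM_2D3 to a three-dimensional coordinate using the distance reported by INFO_D for that pixel. A pinhole camera model with assumed intrinsics is used here, since the description does not fix the camera model.

```python
import numpy as np

def eye_3d_from_pixel(eye_px, depth, fx, fy, cx, cy):
    """Back-project an eye pixel to a 3D point using its measured depth.
    The intrinsics (fx, fy, cx, cy) are assumed values; the patent leaves
    the exact conversion to the implementation."""
    u, v = eye_px
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

# Eye pixel at (350, 250) with an 800 mm measured distance (illustrative).
print(eye_3d_from_pixel((350, 250), depth=800.0,
                        fx=600.0, fy=600.0, cx=320.0, cy=240.0))
```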
Please refer to Figure 24. Figure 24 is a diagram of an embodiment 1800 of the three-dimensional scene sensor of the present invention. The three-dimensional scene sensor 1800 comprises an image sensor 1810, an infrared light emitting component 1820, and a light-sensing distance measuring device 1830. The image sensor 1810 senses the scene SC to generate the two-dimensional sensing image SIM_2D3. The infrared light emitting component 1820 emits a detecting light L_D to the scene SC so that the scene SC produces a reflected light L_R. The light-sensing distance measuring device 1830 is used to sense the reflected light L_R to generate the distance information INFO_D. For example, the light-sensing distance measuring device 1830 is a Z-sensor. Since the Z-sensor is well known in the industry, further description is omitted.
Please refer to Figure 25. Figure 25 is a diagram of an embodiment 1900 of the eye coordinate generating circuit of the present invention. The eye coordinate generating circuit 1900 comprises an eye detecting circuit 1910 and a three-dimensional coordinate converting circuit 1920. The eye detecting circuit 1910 is used to detect the user's eyes in the two-dimensional sensing image SIM_2D3 to obtain a two-dimensional eye coordinate LOC_2D_EYE3. The three-dimensional coordinate converting circuit 1920 calculates the three-dimensional eye coordinate LOC_3D_EYE according to the two-dimensional eye coordinate LOC_2D_EYE3, the distance information INFO_D, the distance-measuring position LOC_MD at which the light-sensing distance measuring device 1830 is disposed (as shown in Figure 24), and the sensing position LOC_SEN3 at which the image sensor 1810 is disposed (as shown in Figure 24).
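The following sketch illustrates, under assumptions the description does not spell out (a pinhole model, sensor axes aligned with the system axes, and a simple ray-projection re-referencing of the measured distance), one way the two-dimensional eye coordinate, the distance information, and the two mounting positions LOC_MD and LOC_SEN3 could be combined into the three-dimensional eye coordinate.

```python
import numpy as np

def eye_3d(eye_px, measured_distance, loc_sen3, loc_md, fx, fy, cx, cy):
    """Sketch of the conversion performed by the three-dimensional
    coordinate converting circuit 1920. Assumptions: pinhole intrinsics
    (fx, fy, cx, cy), image-sensor axes aligned with the system axes, and
    the distance from INFO_D re-referenced from the measuring device at
    LOC_MD to the image sensor at LOC_SEN3 by projecting their offset
    onto the pixel's viewing ray."""
    u, v = eye_px
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    ray /= np.linalg.norm(ray)
    offset = np.asarray(loc_md, float) - np.asarray(loc_sen3, float)
    depth = measured_distance + float(offset @ ray)   # distance seen from the image sensor
    return np.asarray(loc_sen3, float) + depth * ray  # point in the system frame
```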
Please refer to Figure 26. Figure 26 is a diagram of an embodiment 2000 of the eye coordinate generating circuit of the present invention. Compared with the eye coordinate generating circuit 1900, the eye coordinate generating circuit 2000 further comprises a face detecting circuit 2030. The face detecting circuit 2030 is used to recognize the range of the user's face HM_3 in the two-dimensional sensing image SIM_2D3. With the face detecting circuit 2030, the eye detecting circuit 1910 only needs to process the data within the range of the face HM_3 to obtain the two-dimensional eye coordinate LOC_2D_EYE3. Therefore, compared with the eye coordinate generating circuit 1900, the eye coordinate generating circuit 2000 reduces the range of the two-dimensional sensing image SIM_2D3 that the eye detecting circuit 1910 has to process, which increases the processing speed of the eye coordinate generating circuit 2000.
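A minimal sketch of this speed-up, using OpenCV's bundled Haar cascades purely as stand-ins for the face detecting circuit 2030 and the eye detecting circuit 1910, restricts the eye search to the detected face region:

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def locate_eyes(gray_image):
    """Detect faces first, then search for eyes only inside each face
    region, returning eye centers in full-image coordinates."""
    eyes_found = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray_image, 1.1, 5):
        roi = gray_image[y:y + h, x:x + w]          # restricted search range
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            eyes_found.append((x + ex + ew // 2, y + ey + eh // 2))
    return eyes_found
```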
In addition, considering that the user's eyes may be covered by the auxiliary glasses 312 when the stereoscopic display system 310 is implemented as a glasses-type stereoscopic display system, the present invention provides another embodiment 2100 of the eye coordinate generating circuit in Figure 27. The eye coordinate generating circuit 2100 comprises a glasses detecting circuit 2110 and a glasses coordinate converting circuit 2120. The glasses detecting circuit 2110 detects the auxiliary glasses 312 in the two-dimensional sensing image SIM_2D3 to obtain a two-dimensional glasses coordinate LOC_GLASS3 and a glasses slope SL_GLASS3. The glasses coordinate converting circuit 2120 calculates the user's three-dimensional eye coordinate LOC_3D_EYE according to the two-dimensional glasses coordinate LOC_GLASS3, the glasses slope SL_GLASS3, the known binocular distance D_EYE that the user inputs to the stereoscopic interaction system 300 in advance or that is predefined by the stereoscopic interaction system 300, and the distance information INFO_D. In this way, even when the user's eyes are covered by the glasses, the eye coordinate generating circuit 2100 of the present invention can still calculate the user's three-dimensional eye coordinate LOC_3D_EYE.
Please refer to Figure 28. Figure 28 is a diagram of another embodiment 2200 of the eye coordinate generating circuit provided by the present invention. Compared with the eye coordinate generating circuit 2100, the eye coordinate generating circuit 2200 further comprises a tilt detector 2230. The tilt detector 2230 can be disposed on the auxiliary glasses 312. The structure and operating principle of the tilt detector 2230 are similar to those of the tilt detector 1530, so further description is omitted. With the tilt information INFO_TILT provided by the tilt detector 2230, the glasses coordinate converting circuit 2120 can correct the glasses slope SL_GLASS3 calculated by the glasses detecting circuit 2110. For example, the glasses coordinate converting circuit 2120 corrects the glasses slope SL_GLASS3 according to the tilt information INFO_TILT and accordingly generates a corrected glasses slope SL_GLASS3_C. The glasses coordinate converting circuit 2120 can then calculate the three-dimensional eye coordinate LOC_3D_EYE according to the two-dimensional glasses coordinate LOC_GLASS3, the corrected glasses slope SL_GLASS3_C, the known binocular distance D_EYE and the distance information INFO_D. Thus, compared with the eye coordinate generating circuit 2100, in the eye coordinate generating circuit 2200 the glasses coordinate converting circuit 2120 can correct the error produced when the glasses detecting circuit 2110 calculates the glasses slope SL_GLASS3, so as to calculate the user's three-dimensional eye coordinate LOC_3D_EYE more accurately.
In sum, the stereoscopic interaction system 300 provided by the present invention can, according to the user's position, correct the position of the interactive component, or correct the position of the virtual object in the stereoscopic image together with the interaction judgment condition. In this way, even if a change of the user's position changes the position of the virtual object in the stereoscopic image seen by the user, the stereoscopic interaction system of the present invention can still obtain a correct interaction result according to the corrected position of the interactive component, or according to the corrected position of the virtual object and the corrected interaction judgment condition. In addition, when the positioning module of the present invention is an eye positioning module, even if the user wears the auxiliary glasses of a glasses-type stereoscopic display system and the user's eyes are therefore covered, the eye positioning module provided by the present invention can still calculate the positions of the user's eyes according to the known binocular distance input by the user in advance, which brings the user greater convenience.
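For illustration, the reference-line correction recited in claims 9, 10, 22 and 23 below can be sketched as the classical closest-point computation between two three-dimensional lines: when the left and right reference lines do not intersect, the midpoint of their common perpendicular segment is the reference midpoint that is equidistant from both lines. The formula below is a standard geometric construction, not the patent's own implementation.

```python
import numpy as np

def corrected_interactive_coordinate(eye_l, proj_l, eye_r, proj_r):
    """Left reference line: known left-eye coordinate -> projected left
    interactive coordinate; right reference line likewise. Returns their
    intersection, or the reference midpoint equidistant from both lines
    when they do not intersect."""
    p1 = np.asarray(eye_l, float)
    d1 = np.asarray(proj_l, float) - p1
    p2 = np.asarray(eye_r, float)
    d2 = np.asarray(proj_r, float) - p2
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:                 # parallel lines: fall back to a midpoint
        return (p1 + p2) / 2.0
    t = (b * e - c * d) / denom           # parameter of the closest point on the left line
    s = (a * e - b * d) / denom           # parameter of the closest point on the right line
    q1, q2 = p1 + t * d1, p2 + s * d2
    return (q1 + q2) / 2.0                # equals the intersection when the lines meet
```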
The above are only preferred embodiments of the present invention, and all equivalent changes and modifications made according to the claims of the present invention shall fall within the scope of the present invention.

Claims (25)

1. An interaction module applied to a stereoscopic interaction system, the stereoscopic interaction system having a stereoscopic display system, the stereoscopic display system being used to provide a stereoscopic image, the stereoscopic image having a virtual object, and the virtual object having a virtual coordinate and an interaction judgment condition, the interaction module being characterized by comprising:
a positioning module, used to detect a position of a user in a scene to generate a three-dimensional reference coordinate;
an interactive component;
an interactive component positioning module, used to detect a position of the interactive component to generate a three-dimensional interactive coordinate; and
an interactive determination circuit, used to convert the virtual coordinate into a corrected virtual coordinate according to the three-dimensional reference coordinate, and to determine an interaction result between the interactive component and the stereoscopic image according to the three-dimensional interactive coordinate, the corrected virtual coordinate and the interaction judgment condition.
2. The interaction module of claim 1, characterized in that the interactive determination circuit converts the interaction judgment condition into a corrected interaction judgment condition according to the three-dimensional reference coordinate; the interactive determination circuit determines the interaction result according to the three-dimensional interactive coordinate, the corrected virtual coordinate and the corrected interaction judgment condition; the interactive determination circuit calculates a critical surface according to an interactive critical distance and the virtual coordinate; the interactive determination circuit converts the critical surface into a corrected critical surface according to the three-dimensional reference coordinate; and the corrected interaction judgment condition is that, when the three-dimensional interactive coordinate enters the corrected critical surface, the interaction result represents contact.
3. The interaction module of claim 1, characterized in that the positioning module is an eye positioning module, and the eye positioning module is used to detect positions of the user's eyes in the scene to generate a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the stereoscopic display system comprises a display screen and an auxiliary glasses, the display screen is used to provide a left image and a right image, and the auxiliary glasses is used to assist in receiving the left image and the right image so as to obtain the stereoscopic image;
wherein the eye positioning module comprises:
a first image sensor, used to sense the scene to generate a first two-dimensional sensing image;
a second image sensor, used to sense the scene to generate a second two-dimensional sensing image;
an eye positioning circuit, comprising:
a glasses detecting circuit, used to detect the auxiliary glasses in the first two-dimensional sensing image to obtain a first two-dimensional glasses coordinate and a first glasses slope, and to detect the auxiliary glasses in the second two-dimensional sensing image to obtain a second two-dimensional glasses coordinate and a second glasses slope; and
a glasses coordinate converting circuit, used to calculate a first two-dimensional eye coordinate and a second two-dimensional eye coordinate according to the first two-dimensional glasses coordinate, the first glasses slope, the second two-dimensional glasses coordinate, the second glasses slope and a known binocular distance; and
a three-dimensional coordinate converting circuit, used to calculate the three-dimensional eye coordinate according to the first two-dimensional eye coordinate, the second two-dimensional eye coordinate, a first sensing position of the first image sensor and a second sensing position of the second image sensor.
4. The interaction module of claim 3, characterized in that the eye positioning circuit further comprises a tilt detector; the tilt detector is disposed on the auxiliary glasses; the tilt detector is used to generate tilt information according to a tilt angle of the auxiliary glasses; and the glasses coordinate converting circuit calculates the first two-dimensional eye coordinate and the second two-dimensional eye coordinate according to the tilt information, the first two-dimensional glasses coordinate, the first glasses slope, the second two-dimensional glasses coordinate, the second glasses slope and the known binocular distance.
5. The interaction module of claim 3, characterized in that the eye positioning circuit further comprises:
a first infrared light emitting component, used to emit a first detecting light; and
an infrared light sensing circuit, used to generate a two-dimensional infrared coordinate and an infrared slope according to the first detecting light;
wherein the glasses coordinate converting circuit calculates the first two-dimensional eye coordinate and the second two-dimensional eye coordinate according to the infrared slope, the first glasses slope, the second glasses slope, the two-dimensional infrared coordinate, the first two-dimensional glasses coordinate, the second two-dimensional glasses coordinate and the known binocular distance.
6. The interaction module of claim 1, characterized in that the positioning module is an eye positioning module, and the eye positioning module is used to detect positions of the user's eyes in the scene to generate a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the stereoscopic display system comprises a display screen and an auxiliary glasses, the display screen is used to provide a left image and a right image, and the auxiliary glasses is used to assist in receiving the left image and the right image so as to obtain the stereoscopic image;
wherein the eye positioning module comprises:
a three-dimensional scene sensor, comprising:
a third image sensor, used to sense the scene to generate a third two-dimensional sensing image;
an infrared light emitting component, used to emit a detecting light to the scene so that the scene produces a reflected light; and
a light-sensing distance measuring device, used to sense the reflected light to generate distance information;
wherein the distance information has data of a distance between every point in the third two-dimensional sensing image and the three-dimensional scene sensor; and
an eye coordinate generating circuit, comprising:
a glasses detecting circuit, used to detect the auxiliary glasses in the third two-dimensional sensing image to obtain a third two-dimensional glasses coordinate and a third glasses slope; and
a glasses coordinate converting circuit, used to calculate the three-dimensional eye coordinate according to the third two-dimensional glasses coordinate, the third glasses slope, a known binocular distance and the distance information.
7. The interaction module of claim 1, characterized in that the positioning module is an eye positioning module, and the eye positioning module is used to detect positions of the user's eyes in the scene to generate a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the eye positioning module comprises:
a three-dimensional scene sensor, used to sense the scene to generate a third two-dimensional sensing image and distance information corresponding to the third two-dimensional sensing image;
wherein the distance information has data of a distance between every point in the third two-dimensional sensing image and the three-dimensional scene sensor; and
an eye coordinate generating circuit, comprising:
an eye detecting circuit, used to detect the eyes in the third two-dimensional sensing image to obtain a third two-dimensional eye coordinate; and
a three-dimensional coordinate converting circuit, used to calculate the three-dimensional eye coordinate according to the third two-dimensional eye coordinate, the distance information, a distance-measuring position of the light-sensing distance measuring device, and a third sensing position of the third image sensor.
8. An interaction module applied to a stereoscopic interaction system, the stereoscopic interaction system having a stereoscopic display system, the stereoscopic display system being used to provide a stereoscopic image, the stereoscopic image having a virtual object, and the virtual object having a virtual coordinate and an interaction judgment condition, the interaction module being characterized by comprising:
a positioning module, used to detect a position of a user in a scene to generate a three-dimensional reference coordinate;
an interactive component;
an interactive component positioning module, used to detect a position of the interactive component to generate a three-dimensional interactive coordinate; and
an interactive determination circuit, used to convert the three-dimensional interactive coordinate into a three-dimensional corrected interactive coordinate according to the three-dimensional reference coordinate, and to determine an interaction result between the interactive component and the stereoscopic image according to the three-dimensional corrected interactive coordinate, the virtual coordinate and the interaction judgment condition.
9. The interaction module of claim 8, characterized in that the positioning module is an eye positioning module, and the eye positioning module is used to detect positions of the user's eyes in the scene to generate a three-dimensional eye coordinate as the three-dimensional reference coordinate; the interactive determination circuit derives, according to the three-dimensional eye coordinate and the three-dimensional interactive coordinate, a three-dimensional left interactive coordinate and a three-dimensional right interactive coordinate at which the interactive component is projected onto the stereoscopic display system; the interactive determination circuit determines a left reference line according to the three-dimensional left interactive coordinate and a known left-eye coordinate, and determines a right reference line according to the three-dimensional right interactive coordinate and a known right-eye coordinate; and the interactive determination circuit obtains the three-dimensional corrected interactive coordinate according to the left reference line and the right reference line.
10. The interaction module of claim 9, characterized in that when the left reference line and the right reference line intersect, the interactive determination circuit obtains the three-dimensional corrected interactive coordinate according to a coordinate of the intersection point of the left reference line and the right reference line; and when the left reference line and the right reference line do not intersect, the interactive determination circuit obtains, according to the left reference line and the right reference line, a reference midpoint having a minimum distance to the left reference line and the right reference line, wherein the distance between the reference midpoint and the left reference line equals the distance between the reference midpoint and the right reference line, and the interactive determination circuit obtains the three-dimensional corrected interactive coordinate according to a coordinate of the reference midpoint.
11. The interaction module of claim 9, characterized in that the interactive determination circuit obtains a central point according to the left reference line and the right reference line; the interactive determination circuit determines a search region according to the central point, the search region having M search points; the interactive determination circuit determines, according to the known eye coordinates, the M search points and the three-dimensional eye coordinate, M end points corresponding to the M search points in the coordinate system corresponding to the three-dimensional eye coordinate; the interactive determination circuit determines M error distances corresponding to the M end points according to the positions of the M end points and the three-dimensional interactive coordinate, respectively; and the interactive determination circuit determines the three-dimensional corrected interactive coordinate according to a Kth end point of the M end points having a minimum error distance; wherein M and K each represent a positive integer, and K≤M;
wherein the interactive determination circuit determines a left search projection coordinate and a right search projection coordinate according to a Kth search point of the M search points and the known eye coordinates; and the interactive determination circuit obtains, according to the left search projection coordinate, the right search projection coordinate and the three-dimensional eye coordinate, the Kth end point of the M end points corresponding to the Kth search point.
12. The interaction module of claim 8, characterized in that the positioning module is an eye positioning module, and the eye positioning module is used to detect positions of the user's eyes in the scene to generate a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein M search points exist in the coordinate system corresponding to the known eye coordinates; the interactive determination circuit determines, according to the known eye coordinates, the M search points and the three-dimensional eye coordinate, M end points corresponding to the M search points in the coordinate system corresponding to the three-dimensional eye coordinate; the interactive determination circuit determines M error distances corresponding to the M end points according to the positions of the M end points and the three-dimensional interactive coordinate, respectively; and the interactive determination circuit determines the three-dimensional corrected interactive coordinate according to a Kth end point of the M end points having a minimum error distance; wherein M and K each represent a positive integer, and K≤M;
wherein the interactive determination circuit determines a left search projection coordinate and a right search projection coordinate according to a Kth search point of the M search points and the known eye coordinates; and the interactive determination circuit obtains, according to the left search projection coordinate, the right search projection coordinate and the three-dimensional eye coordinate, the Kth end point of the M end points corresponding to the Kth search point.
13. The interaction module of claim 8, characterized in that the positioning module is an eye positioning module, and the eye positioning module is used to detect positions of the user's eyes in the scene to generate a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the stereoscopic display system comprises a display screen and an auxiliary glasses, the display screen is used to provide a left image and a right image, and the auxiliary glasses is used to assist in receiving the left image and the right image so as to obtain the stereoscopic image;
wherein the eye positioning module comprises:
a first image sensor, used to sense the scene to generate a first two-dimensional sensing image;
a second image sensor, used to sense the scene to generate a second two-dimensional sensing image;
an eye positioning circuit, comprising:
a glasses detecting circuit, used to detect the auxiliary glasses in the first two-dimensional sensing image to obtain a first two-dimensional glasses coordinate and a first glasses slope, and to detect the auxiliary glasses in the second two-dimensional sensing image to obtain a second two-dimensional glasses coordinate and a second glasses slope; and
a glasses coordinate converting circuit, used to calculate a first two-dimensional eye coordinate and a second two-dimensional eye coordinate according to the first two-dimensional glasses coordinate, the first glasses slope, the second two-dimensional glasses coordinate, the second glasses slope and a known binocular distance; and
a three-dimensional coordinate converting circuit, used to calculate the three-dimensional eye coordinate according to the first two-dimensional eye coordinate, the second two-dimensional eye coordinate, a first sensing position of the first image sensor and a second sensing position of the second image sensor.
14. The interaction module of claim 13, characterized in that the eye positioning circuit further comprises a tilt detector; the tilt detector is disposed on the auxiliary glasses; the tilt detector is used to generate tilt information according to a tilt angle of the auxiliary glasses; and the glasses coordinate converting circuit calculates the first two-dimensional eye coordinate and the second two-dimensional eye coordinate according to the tilt information, the first two-dimensional glasses coordinate, the first glasses slope, the second two-dimensional glasses coordinate, the second glasses slope and the known binocular distance.
15. The interaction module of claim 13, characterized in that the eye positioning circuit further comprises:
a first infrared light emitting component, used to emit a first detecting light; and
an infrared light sensing circuit, used to generate a two-dimensional infrared coordinate and an infrared slope according to the first detecting light;
wherein the glasses coordinate converting circuit calculates the first two-dimensional eye coordinate and the second two-dimensional eye coordinate according to the infrared slope, the first glasses slope, the second glasses slope, the two-dimensional infrared coordinate, the first two-dimensional glasses coordinate, the second two-dimensional glasses coordinate and the known binocular distance.
16. The interaction module of claim 8, characterized in that the positioning module is an eye positioning module, and the eye positioning module is used to detect positions of the user's eyes in the scene to generate a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the stereoscopic display system comprises a display screen and an auxiliary glasses, the display screen is used to provide a left image and a right image, and the auxiliary glasses is used to assist in receiving the left image and the right image so as to obtain the stereoscopic image;
wherein the eye positioning module comprises:
a three-dimensional scene sensor, comprising:
a third image sensor, used to sense the scene to generate a third two-dimensional sensing image;
an infrared light emitting component, used to emit a detecting light to the scene so that the scene produces a reflected light; and
a light-sensing distance measuring device, used to sense the reflected light to generate distance information corresponding to the third two-dimensional sensing image;
wherein the distance information has data of a distance between every point in the third two-dimensional sensing image and the three-dimensional scene sensor; and
an eye coordinate generating circuit, comprising:
a glasses detecting circuit, used to detect the auxiliary glasses in the third two-dimensional sensing image to obtain a third two-dimensional glasses coordinate and a third glasses slope; and
a glasses coordinate converting circuit, used to calculate the three-dimensional eye coordinate according to the third two-dimensional glasses coordinate, the third glasses slope, a known binocular distance and the distance information.
17. The interaction module of claim 8, characterized in that the positioning module is an eye positioning module, and the eye positioning module is used to detect positions of the user's eyes in the scene to generate a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein the eye positioning module comprises:
a three-dimensional scene sensor, used to sense the scene to generate a third two-dimensional sensing image and distance information corresponding to the third two-dimensional sensing image;
wherein the distance information has data of a distance between every point in the third two-dimensional sensing image and the three-dimensional scene sensor; and
an eye coordinate generating circuit, comprising:
an eye detecting circuit, used to detect the eyes in the third two-dimensional sensing image to obtain a third two-dimensional eye coordinate; and
a three-dimensional coordinate converting circuit, used to calculate the three-dimensional eye coordinate according to the third two-dimensional eye coordinate, the distance information, a distance-measuring position of the light-sensing distance measuring device, and a third sensing position of the third image sensor.
18. A method for determining an interaction result of a stereoscopic interaction system, the stereoscopic interaction system having a stereoscopic display system and an interactive component, the stereoscopic display system being used to provide a stereoscopic image, the stereoscopic image having a virtual object, and the virtual object having a virtual coordinate and an interaction judgment condition, the method being characterized by comprising:
detecting a position of a user in a scene to generate a three-dimensional reference coordinate;
detecting a position of the interactive component to generate a three-dimensional interactive coordinate; and
determining the interaction result between the interactive component and the stereoscopic image according to the three-dimensional reference coordinate, the three-dimensional interactive coordinate, the virtual coordinate and the interaction judgment condition.
19. The method of claim 18, characterized in that detecting the position of the user in the scene to generate the three-dimensional reference coordinate comprises detecting positions of the user's eyes in the scene to generate a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein determining the interaction result according to the three-dimensional reference coordinate, the three-dimensional interactive coordinate, the virtual coordinate and the interaction judgment condition comprises:
converting the virtual coordinate into a corrected virtual coordinate according to the three-dimensional eye coordinate; and
determining the interaction result according to the three-dimensional interactive coordinate, the corrected virtual coordinate and the interaction judgment condition.
20. The method of claim 18, characterized in that detecting the position of the user in the scene to generate the three-dimensional reference coordinate comprises detecting positions of the user's eyes in the scene to generate a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein determining the interaction result according to the three-dimensional reference coordinate, the three-dimensional interactive coordinate, the virtual coordinate and the interaction judgment condition comprises:
converting the virtual coordinate into a corrected virtual coordinate according to the three-dimensional eye coordinate;
converting the interaction judgment condition into a corrected interaction judgment condition according to the three-dimensional eye coordinate;
and
determining the interaction result according to the three-dimensional interactive coordinate, the corrected virtual coordinate and the corrected interaction judgment condition;
wherein converting the interaction judgment condition into the corrected interaction judgment condition according to the three-dimensional eye coordinate comprises:
calculating a critical surface according to an interactive critical distance and the virtual coordinate; and
converting the critical surface into a corrected critical surface according to the three-dimensional eye coordinate;
wherein the corrected interaction judgment condition is that, when the three-dimensional interactive coordinate enters the corrected critical surface, the interaction result represents contact.
21. The method of claim 18, characterized in that detecting the position of the user in the scene to generate the three-dimensional reference coordinate comprises detecting positions of the user's eyes in the scene to generate a three-dimensional eye coordinate as the three-dimensional reference coordinate;
wherein determining the interaction result according to the three-dimensional eye coordinate, the three-dimensional interactive coordinate, the virtual coordinate and the interaction judgment condition comprises:
converting the three-dimensional interactive coordinate into a three-dimensional corrected interactive coordinate according to the three-dimensional eye coordinate; and
determining the interaction result according to the three-dimensional corrected interactive coordinate, the virtual coordinate and the interaction judgment condition;
wherein the interaction judgment condition is that, when a distance between the three-dimensional corrected interactive coordinate and the virtual coordinate is smaller than an interactive critical distance, the interaction result represents contact.
22. The method of claim 21, characterized in that converting the three-dimensional interactive coordinate into the three-dimensional corrected interactive coordinate according to the three-dimensional eye coordinate comprises:
deriving, according to the three-dimensional eye coordinate and the three-dimensional interactive coordinate, a three-dimensional left interactive coordinate and a three-dimensional right interactive coordinate at which the interactive component is projected onto the stereoscopic display system;
determining a left reference line according to the three-dimensional left interactive coordinate and a known left-eye coordinate, and determining a right reference line according to the three-dimensional right interactive coordinate and a known right-eye coordinate;
and
obtaining the three-dimensional corrected interactive coordinate according to the left reference line and the right reference line.
23. The method of claim 22, characterized in that obtaining the three-dimensional corrected interactive coordinate according to the left reference line and the right reference line comprises:
when the left reference line and the right reference line intersect, obtaining the three-dimensional corrected interactive coordinate according to a coordinate of the intersection point of the left reference line and the right reference line; and
when the left reference line and the right reference line do not intersect, obtaining, according to the left reference line and the right reference line, a reference midpoint having a minimum distance to the left reference line and the right reference line, and obtaining the three-dimensional corrected interactive coordinate according to a coordinate of the reference midpoint;
wherein the distance between the reference midpoint and the left reference line equals the distance between the reference midpoint and the right reference line.
24. The method of claim 23, characterized in that obtaining the three-dimensional corrected interactive coordinate according to the left reference line and the right reference line comprises:
obtaining a central point according to the left reference line and the right reference line;
determining a search region according to the central point;
wherein the search region has M search points;
determining M end points corresponding to the M search points according to the known eye coordinates, the M search points and the three-dimensional eye coordinate;
determining M error distances corresponding to the M end points according to the positions of the M end points and the three-dimensional interactive coordinate, respectively; and
determining the three-dimensional corrected interactive coordinate according to a Kth end point of the M end points having a minimum error distance;
wherein M and K each represent a positive integer, and K≤M;
wherein determining the M end points corresponding to the M search points according to the known eye coordinates, the M search points and the three-dimensional eye coordinate comprises:
determining a left search projection coordinate and a right search projection coordinate according to a Kth search point of the M search points and the known eye coordinates; and
obtaining, according to the left search projection coordinate, the right search projection coordinate and the three-dimensional eye coordinate, the Kth end point of the M end points corresponding to the Kth search point.
25. The method of claim 21, characterized in that converting the three-dimensional interactive coordinate into the three-dimensional corrected interactive coordinate according to the three-dimensional eye coordinate comprises:
determining, according to known eye coordinates, M search points in the coordinate system corresponding to the known eye coordinates, and the three-dimensional eye coordinate, M end points corresponding to the M search points in the coordinate system corresponding to the three-dimensional eye coordinate;
determining M error distances corresponding to the M end points according to the positions of the M end points and the three-dimensional interactive coordinate, respectively; and
determining the three-dimensional corrected interactive coordinate according to a Kth end point of the M end points having a minimum error distance;
wherein M and K each represent a positive integer, and K≤M;
wherein determining the M end points corresponding to the M search points according to the known eye coordinates, the M search points in the coordinate system corresponding to the known eye coordinates, and the three-dimensional eye coordinate comprises:
determining a left search projection coordinate and a right search projection coordinate according to a Kth search point of the M search points and the known eye coordinates; and
obtaining, according to the left search projection coordinate, the right search projection coordinate and the three-dimensional eye coordinate, the Kth end point of the M end points corresponding to the Kth search point.
CN 201010122713 2010-02-26 2010-02-26 Interaction module applied to stereoscopic interaction system and method of interaction module Expired - Fee Related CN102169364B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201010122713 CN102169364B (en) 2010-02-26 2010-02-26 Interaction module applied to stereoscopic interaction system and method of interaction module

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201010122713 CN102169364B (en) 2010-02-26 2010-02-26 Interaction module applied to stereoscopic interaction system and method of interaction module

Publications (2)

Publication Number Publication Date
CN102169364A true CN102169364A (en) 2011-08-31
CN102169364B CN102169364B (en) 2013-03-27

Family

ID=44490550

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201010122713 Expired - Fee Related CN102169364B (en) 2010-02-26 2010-02-26 Interaction module applied to stereoscopic interaction system and method of interaction module

Country Status (1)

Country Link
CN (1) CN102169364B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103871045A (en) * 2012-12-11 2014-06-18 现代自动车株式会社 Display system and method
CN105630155A (en) * 2014-11-25 2016-06-01 三星电子株式会社 Computing apparatus and method for providing three-dimensional (3d) interaction
CN110637273A (en) * 2017-05-10 2019-12-31 微软技术许可有限责任公司 Presenting applications within a virtual environment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1687970A (en) * 2005-04-27 2005-10-26 蔡涛 Interactive controlling method for selecting 3-D image body reconstructive partial body
WO2006040740A1 (en) * 2004-10-15 2006-04-20 Philips Intellectual Property & Standard Gmbh System for 3d rendering applications using hands
CN101281422A (en) * 2007-04-02 2008-10-08 原相科技股份有限公司 Apparatus and method for generating three-dimensional information based on object as well as using interactive system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006040740A1 (en) * 2004-10-15 2006-04-20 Philips Intellectual Property & Standard Gmbh System for 3d rendering applications using hands
CN1687970A (en) * 2005-04-27 2005-10-26 蔡涛 Interactive controlling method for selecting 3-D image body reconstructive partial body
CN101281422A (en) * 2007-04-02 2008-10-08 原相科技股份有限公司 Apparatus and method for generating three-dimensional information based on object as well as using interactive system

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103871045A (en) * 2012-12-11 2014-06-18 现代自动车株式会社 Display system and method
CN103871045B (en) * 2012-12-11 2018-09-04 现代自动车株式会社 Display system and method
CN105630155A (en) * 2014-11-25 2016-06-01 三星电子株式会社 Computing apparatus and method for providing three-dimensional (3d) interaction
CN105630155B (en) * 2014-11-25 2020-03-17 三星电子株式会社 Computing device and method for providing three-dimensional (3D) interaction
CN110637273A (en) * 2017-05-10 2019-12-31 微软技术许可有限责任公司 Presenting applications within a virtual environment
CN110637273B (en) * 2017-05-10 2021-12-03 微软技术许可有限责任公司 Presenting applications within a virtual environment

Also Published As

Publication number Publication date
CN102169364B (en) 2013-03-27

Similar Documents

Publication Publication Date Title
US10846864B2 (en) Method and apparatus for detecting gesture in user-based spatial coordinate system
US11044402B1 (en) Power management for optical position tracking devices
US9465443B2 (en) Gesture operation input processing apparatus and gesture operation input processing method
CN106383596B (en) Virtual reality anti-dizzy system and method based on space positioning
JP5474202B2 (en) Method and apparatus for detecting a gazing point based on face detection and image measurement
US10802606B2 (en) Method and device for aligning coordinate of controller or headset with coordinate of binocular system
WO2014093946A1 (en) Calibration and registration of camera arrays using a single optical calibration target
US20110187638A1 (en) Interactive module applied in 3D interactive system and method
WO2013133929A1 (en) Visually guiding motion to be performed by a user
KR101740994B1 (en) Structure measuring unit for tracking, measuring and marking edges and corners of adjacent surfaces
US20120086637A1 (en) System and method utilized for human and machine interface
US20150116204A1 (en) Transparent display virtual touch apparatus not displaying pointer
WO2016008265A1 (en) Method and apparatus for locating position
CN102169364B (en) Interaction module applied to stereoscopic interaction system and method of interaction module
JP2903964B2 (en) Three-dimensional position and posture recognition method based on vision and three-dimensional position and posture recognition device based on vision
US20200116481A1 (en) Golf voice broadcast rangefinder
US20120219178A1 (en) Computer-readable storage medium, image processing apparatus, image processing system, and image processing method
US20080212870A1 (en) Combined beacon and scene navigation system
US10466046B1 (en) External display rangefinder
US20200159339A1 (en) Desktop spatial stereoscopic interaction system
Petrič et al. Real-time 3D marker tracking with a WIIMOTE stereo vision system: Application to robotic throwing
KR101718099B1 (en) System for screen golf using laser beam
US12001629B2 (en) Systems and methods for dynamic shape sketching using position indicator and processing device that displays visualization data based on position of position indicator
US20150049016A1 (en) Multimodal system and method facilitating gesture creation through scalar and vector data
KR101594740B1 (en) Display System for calculating a Coordinate of Impact Object and Drive Method of the Same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20130327

Termination date: 20170226

CF01 Termination of patent right due to non-payment of annual fee