WO2017128046A1 - Three-dimensional spatial positioning sensor, interactive display system, and 3D image generation method

Three-dimensional spatial positioning sensor, interactive display system, and 3D image generation method

Info

Publication number
WO2017128046A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
infrared light
display
remote control
interactive
Prior art date
Application number
PCT/CN2016/072209
Other languages
English (en)
French (fr)
Inventor
那庆林
麦浩晃
黄彦
Original Assignee
神画科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 神画科技(深圳)有限公司
Priority to PCT/CN2016/072209
Publication of WO2017128046A1

Links

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves

Definitions

  • the present invention relates to the field of image display and image generation, and more particularly to a three-dimensional spatial positioning sensor, an interactive display system, and an image generating method.
  • in the prior art, the position of the remote controller is usually determined from a single point together with the position information of the display unit;
  • the image of the interactive display system is also displayed from a preset file, so the effect is relatively simple and lacks a real three-dimensional sense of space.
  • the technical problem to be solved by the present invention is to provide an improved three-dimensional spatial positioning sensor, an interactive display system and a 3D image generating method.
  • the technical solution adopted by the present invention to solve this technical problem is: constructing a three-dimensional spatial positioning sensor comprising an infrared light emitting unit, a ranging monitoring unit, and an information processing unit;
  • an infrared light emitting unit that emits at least three infrared light signals to different positions on a target surface
  • the ranging monitoring unit acquires the time at which each infrared light signal, after being transmitted to the target surface, is reflected back to the ranging monitoring unit;
  • the information processing unit obtains the ranging information of the target surface according to the time difference between each infrared light signal being emitted by the infrared light emitting unit and returning to the ranging monitoring unit.
  • the optical axes of the at least three infrared light signals are disposed at an angle according to a predetermined space.
  • the infrared light emitting unit includes at least three infrared emitters
  • the information processing unit includes a modulation circuit unit for modulating signals of the plurality of infrared light emitters.
  • the information processing unit includes a timing circuit unit for calculating the time difference between each of the infrared light signals being emitted by the infrared light emitting unit and returning to the ranging monitoring unit.
  • the ranging monitoring unit includes a lens unit and a photodetector unit; the lens unit is configured to direct infrared light reflected back from the display surface to the photodetector unit, and the photodetector unit separately collects the reflected infrared light signals.
  • the lens unit comprises a beam splitting prism unit coated with a transflective film for transmitting part of the infrared light signal to the display surface and reflecting part of it to the ranging monitoring unit, which uses the reflected part to acquire the start time of the optical signal.
  • the ranging monitoring unit comprises a CMOS unit for collecting the transmission and return of the infrared light signal.
  • the present invention also constructs an interactive display system including a display unit and an interactive remote control unit.
  • the interactive remote control unit includes the three-dimensional spatial positioning sensor and a first communication unit for communicating with the display unit;
  • the three-dimensional spatial positioning sensor of the interactive remote control unit emits an infrared light signal to a display surface of the display unit, and obtains ranging positioning information of the interactive remote control unit relative to the display surface;
  • the first communication unit is communicably connected to the display unit, and the ranging positioning information of the interactive remote control unit relative to the display surface is sent to the display unit;
  • the interactive display system further includes a system monitoring unit for acquiring position information of the infrared light signal of the interactive remote control unit on the display surface and feeding back to the display unit.
  • the interactive display system further includes a 3D glasses unit, the 3D glasses unit including the three-dimensional spatial positioning sensor and a second communication unit for communicating with the display unit;
  • the three-dimensional spatial positioning sensor of the 3D glasses unit emits an infrared light signal to a display surface of the display unit, and obtains ranging positioning information of the 3D glasses unit relative to the display surface;
  • the second communication unit is communicably connected to the display unit, and the ranging positioning information of the 3D glasses unit relative to the display surface is sent to the display unit;
  • the system monitoring unit acquires position information of the infrared light signal of the 3D glasses unit on the display surface
  • the interactive display system further includes an algorithm unit; based on the ranging positioning information of the interactive remote control unit and/or the 3D glasses unit and the position information of their infrared light spots on the display surface, the algorithm unit calculates the spatial position of the interactive remote control unit and/or the 3D glasses unit relative to the display surface and relative to each other.
  • the spatial position includes the distance and angle of the interactive remote control unit and the 3D glasses unit relative to the display surface, or the distance and angle between the interactive remote control unit and the 3D glasses unit.
  • the wavelength of the infrared light signal used by the interactive remote control unit is different from the wavelength of the infrared light signal used by the 3D glasses unit.
  • the system monitoring unit includes a beam splitting prism for splitting the infrared light of different wavelengths from the interactive remote control unit and the 3D glasses unit, and two sensors for respectively collecting the positions of the infrared light signals of the interactive remote control unit and the 3D glasses unit.
  • the present invention also constructs a 3D image generating method of an interactive display system, the interactive display system includes a display unit, an interactive remote control unit, and a system monitoring unit, and the interactive remote control unit includes the three-dimensional spatial positioning sensor.
  • the 3D image generation method includes the following steps:
  • S1: the three-dimensional spatial positioning sensor of the interactive remote control unit emits an infrared light signal to a display surface of the display unit, and the interactive display system obtains the ranging positioning information of the interactive remote control unit relative to the display surface;
  • S2: the system monitoring unit obtains the position information of the infrared light signal emitted by the interactive remote control unit on the display surface of the display unit, and determines the first base point of the 3D image and/or its motion track according to the position information;
  • S3: a second base point is obtained by offsetting from the first base point on the display surface, using the ranging positioning information of the interactive remote control unit relative to the display surface;
  • S4: generate a 3D image of the 3D object and/or its motion track according to the positioning information of the interactive remote control unit relative to the display surface and the position information of the first base point and the second base point;
  • the 3D image corresponding to the first base point and/or its motion track is a zero parallax image;
  • the 3D image corresponding to the second base point and/or its motion track is a negative parallax image;
  • the absolute value of the negative parallax increases or decreases in the same direction as the distance value measured in step S1.
  • the interactive display system further includes a 3D glasses unit, and the 3D glasses unit includes the three-dimensional spatial positioning sensor;
  • the step S1 further includes: the three-dimensional spatial positioning sensor of the 3D glasses unit emits an infrared light signal to the display surface of the display unit, and the interactive display system obtains the ranging positioning information of the 3D glasses unit relative to the display surface and calculates the spatial position information between the interactive remote control unit and the 3D glasses unit;
  • the step S2 further includes: the system monitoring unit also acquires the position information of the infrared light signal emitted by the 3D glasses unit on the display surface of the display unit;
  • the step S3 further includes: determining the second base point of the 3D image and/or its motion track from the point at which the virtual spatial line connecting the 3D glasses unit and the top end (or another part) of the interactive remote control unit falls on the display screen.
  • the three-dimensional spatial positioning sensor and the interactive display system and the 3D image generating method of the present invention have the following beneficial effects:
  • the present invention obtains positioning information of a target surface by using a three-dimensional spatial positioning sensor, and lets the display system generate, from the positioning information of the interactive remote control unit relative to the display surface, a 3D image and/or motion track corresponding to the position of the interactive remote control unit, so that the user holding the interactive remote control unit sees a 3D image matching his standing position; for example, a 3D object such as a bullet or dart emitted from the interactive remote control unit moves toward the display surface, making the 3D effect more realistic.
  • FIG. 1 is a schematic diagram showing the principle of a three-dimensional spatial positioning sensor emitting an infrared light signal to a display surface for positioning according to an embodiment of the present invention
  • FIG. 2 is a schematic view showing the arrangement of the infrared emitters at an angle of the infrared light emitting unit of FIG. 1;
  • FIG. 3 is a schematic diagram of the principle of a ranging monitoring unit of a three-dimensional spatial positioning sensor including a lens unit and a photodetector unit in an embodiment of the present invention;
  • FIG. 4 is a schematic diagram of a ranging monitoring unit of a three-dimensional spatial positioning sensor including a CMOS unit according to an embodiment of the present invention;
  • FIG. 5 is a schematic diagram of an interactive remote control unit of an interactive display system in the embodiment of the present invention transmitting an infrared light signal to a display surface;
  • FIG. 6 is a schematic diagram of the infrared light signals emitted to the display surface by the interactive remote control unit and the 3D glasses unit of the interactive display system in an embodiment of the present invention;
  • FIG. 7 is a schematic diagram of the first base point obtained after the interactive remote control unit of FIG. 6 emits infrared light to the display surface;
  • FIG. 8 is a schematic view showing a second base point obtained by offsetting from the first base point in FIG. 7;
  • FIG. 9 shows the actual 3D images for the left eye and the right eye generated from the first base point and the second base point in FIG. 8;
  • FIG. 10 is a 3D image with a 3D effect presented when wearing the 3D glasses unit and looking at the actual 3D image generated in FIG. 9;
  • FIG. 11 shows the actual 3D images for the left eye and the right eye of the first base point and the second base point generated according to the positioning information of the interactive remote control unit and the 3D glasses unit relative to the display surface and to each other;
  • FIG. 12 is a 3D image with a 3D effect presented when wearing the 3D glasses unit and looking at the actual 3D image generated in FIG. 11.
  • a three-dimensional spatial positioning sensor 11 in a preferred embodiment of the present invention includes an infrared light emitting unit 111, a ranging monitoring unit 112, and an information processing unit 113.
  • the infrared light emitting unit 111 includes three infrared emitters.
  • the three infrared emitters emit three infrared light signals to different positions of the target surface.
  • the target surface is the display surface S of the display unit, and the display surface S may be a plane or a curved surface.
  • the ranging monitoring unit 112 acquires the time at which each infrared light signal, after being transmitted to the target surface, is reflected back to the ranging monitoring unit 112.
  • the information processing unit 113, according to the time difference between each infrared light signal being emitted by the infrared light emitting unit 111 and returning to the ranging monitoring unit 112, obtains the length of the path travelled by the infrared light, thereby obtaining the ranging positioning information relative to the target surface; the ranging positioning information includes at least three infrared flight distance values relative to the display surface S.
  • the emission time of the infrared light can be obtained by the system setting or by the ranging monitoring unit 112.
  • the optical axes of the three infrared light signals are arranged at predetermined spatial angles to one another; the three infrared light signals diverge in space, so the three points they form on the display surface S make up a triangle. Combining this triangle with the positioning information relative to the display surface S, the position and angle of the three-dimensional spatial positioning sensor 11 relative to the display surface S can be accurately calculated, such as the distance from the muzzle of an interactive remote control unit 1 (for example a remote control gun) to the target point and the angle between the shooting axis and the display surface S.
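The geometry above can be sketched numerically. The following is a minimal illustration, not the patent's implementation: assuming three known emitter directions in the sensor frame and the three measured flight distances, it computes the triangle of hit points, fits the plane of the display surface S, and derives the sensor's perpendicular distance and the angle between the shooting axis and the surface. All names and numeric values are hypothetical.

```python
import numpy as np

# Hypothetical emitter directions (unit vectors in the sensor frame)
# and the three flight distances measured by the time-of-flight unit.
directions = np.array([
    [0.0,  0.10, 1.0],
    [-0.1, -0.05, 1.0],
    [0.1, -0.05, 1.0],
])
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
distances = np.array([2.05, 2.10, 2.08])  # metres, example values

points = directions * distances[:, None]       # hit points on the display surface
normal = np.cross(points[1] - points[0], points[2] - points[0])
normal /= np.linalg.norm(normal)               # unit normal of the fitted plane

axis = np.array([0.0, 0.0, 1.0])               # shooting axis of the remote
perp_distance = abs(points[0] @ normal)        # sensor-to-surface distance
angle = np.degrees(np.arcsin(abs(axis @ normal)))  # axis vs. surface angle
```

Because the three hit points fully determine the plane, both the perpendicular distance and the tilt of the shooting axis follow from one ranging measurement per emitter.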
  • the infrared light emitting unit 111 can also emit more than three infrared light signals to different positions of the target surface; the time difference between each infrared light signal being emitted by the infrared light emitting unit 111 and returning to the ranging monitoring unit 112 is obtained, giving the ranging positioning information relative to the target surface.
  • the information processing unit 113 includes a modulation circuit unit 1131 for modulating the signals of the plurality of infrared light emitters, and a timing circuit unit 1132 for calculating the time difference between each infrared light signal being emitted by the infrared light emitting unit 111 and returning to the ranging monitoring unit 112. The modulation circuit unit 1131 modulates the signals of the infrared emitters; the information processing unit 113 calculates the time difference between emission and return of each beam and, from it, the length of each infrared path from the exit point to the display surface S, thereby computing the ranging positioning information of the three-dimensional spatial positioning sensor 11 relative to the display surface S.
  • the ranging monitoring unit 112 includes a CMOS unit 1121 for collecting transmission and return of infrared light signals.
  • the CMOS unit 1121 has a lens. The emission time of the infrared light can be set by the system: when a light signal is emitted, the system records the time and sends it to the information processing unit 113. The infrared light reaches the display surface S and is reflected; the CMOS unit 1121 captures the returned infrared light signal and the time at which it returns to the ranging monitoring unit 112, and feeds this back to the information processing unit 113, which obtains from the time differences the lengths of the paths travelled by the infrared rays, thereby obtaining the ranging positioning information of the three-dimensional spatial positioning sensor 11 relative to the target surface.
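The time-difference-to-distance step described above amounts to a one-way time-of-flight conversion. A minimal sketch follows; the helper name is hypothetical, as the patent does not give an explicit formula.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def path_length_from_time(t_emit_s: float, t_return_s: float) -> float:
    """One-way distance to the display surface from a round-trip
    time-of-flight measurement (hypothetical helper)."""
    round_trip = t_return_s - t_emit_s
    return round_trip * C / 2.0

# a 14 ns round trip corresponds to roughly 2.1 m one way
d = path_length_from_time(0.0, 14e-9)
```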
  • the ranging monitoring unit 112 includes a lens unit, and a photodetector unit 1123.
  • the lens unit directs the infrared light reflected back from the display surface S to the photodetector unit 1123.
  • the photodetector unit 1123 collects the infrared light signal reflected by the display surface S, and feeds back the light information to the information processing unit 113.
  • the photodetector unit 1123 can also collect the initial emission of the infrared light, which can of course also be recorded by the system instead.
  • the infrared light emitting unit 111 further includes a dichroic prism unit 1111 disposed in the emission direction of the infrared light. The dichroic prism unit 1111 is coated with a transflective film for transmitting part of the infrared light signal to the display surface S and reflecting part of it to the ranging monitoring unit 112, which collects it as the start of the optical signal. After the infrared light passes through the dichroic prism unit 1111, one beam is reflected to the ranging monitoring unit 112 and its information is fed back to the information processing unit 113 to obtain the emission time; the other beam is transmitted to the display surface S and then reflected back to the ranging monitoring unit 112 to obtain the return time.
  • the information processing unit 113 calculates, from the time difference of each beam, the length of the infrared path from the exit point to the display surface S, thereby computing the ranging positioning information of the three-dimensional spatial positioning sensor 11 relative to the display surface S.
  • the interactive display system in a preferred embodiment of the present invention includes a display unit and an interactive remote control unit 1.
  • the display unit can be a display screen, a projector, etc.
  • the display surface S of the display unit is a display screen.
  • the interactive remote control unit 1 is held in the user's hand; it is usually a game controller such as a remote control gun. It interacts with the display unit, which generates a 3D image adapted to the position of the hand-held remote control unit 1, making the 3D image more realistic.
  • the interactive remote control unit 1 includes a three-dimensional spatial positioning sensor 11 and a first communication unit 12 for communicating with the display unit. After the interactive remote control unit 1 interacts with the display unit, the three-dimensional spatial positioning sensor 11 of the interactive remote control unit 1 emits an infrared light signal to the display surface S of the display unit, and obtains the ranging positioning information of the interactive remote control unit 1 relative to the display surface S.
  • the first communication unit 12 is communicably connected to the display unit and transmits the ranging positioning information of the interactive remote control unit 1 relative to the display surface S to the display unit.
  • the interactive display system further includes a system monitoring unit 3 for acquiring the position information of the infrared light signal of the interactive remote control unit 1 on the display surface S and feeding it back to the display unit. After the infrared light is emitted to the display surface S, the system monitoring unit 3 acquires the position information of the infrared light on the display surface S. As shown in FIG. 7, the system monitoring unit 3 captures up to three infrared light spots on the screen. If the application needs to determine the optical axis of the interactive remote control unit 1, the midpoint of the three points or other parameters can be calculated, which is more convenient for application operations.
  • the interactive display system further includes a 3D glasses unit 2. The 3D glasses unit 2 is worn on the user's head and the interactive remote control unit 1 is held in the user's hand; the interaction among the interactive remote control unit 1, the 3D glasses unit 2, and the display unit generates a 3D image adapted to the position of the remote control unit 1 and of the human eye, making the 3D image more realistic.
  • the 3D glasses unit 2 includes a three-dimensional spatial positioning sensor 11 and a second communication unit 22 for communication connection with the display unit.
  • the three-dimensional spatial positioning sensor 11 of the 3D glasses unit 2 emits an infrared light signal to the display surface S of the display unit, and obtains ranging positioning information of the 3D glasses unit 2 with respect to the display surface S.
  • the second communication unit 22 is communicably connected to the display unit, and transmits the ranging positioning information of the 3D glasses unit 2 relative to the display surface S to the display unit, and the ranging positioning information includes information such as the distance of the 3D glasses unit 2 with respect to the display surface S.
  • the system monitoring unit 3 acquires the position information of the infrared light signal of the 3D glasses unit 2 on the display surface S and feeds it back to the display unit.
  • the axis angle of the center of the 3D glasses unit 2 relative to the display surface S can be calculated by an algorithm, which is helpful for subsequent applications.
  • the interactive display system further includes an algorithm unit, and the algorithm unit calculates the interactive remote control unit 1, the 3D glasses unit according to the ranging positioning information of the interactive remote control unit 1, the 3D glasses unit 2, and the position information of the infrared light emission on the display surface S. 2 spatial position relative to the display surface S and relative to the spatial position between each other.
  • the spatial position includes the distance and angle with respect to the display surface S, so the position and orientation of the interactive remote control unit 1 and the 3D glasses unit 2 relative to the display surface S can be accurately obtained.
  • the system monitoring unit 3 includes a beam splitting prism 31 for splitting the infrared light of different wavelengths from the interactive remote control unit 1 and the 3D glasses unit 2, and two sensors 32 for respectively collecting the positions of the infrared light signals of the interactive remote control unit 1 and the 3D glasses unit 2. The sensor 32 can be a CMOS chip or a CCD chip. Because the infrared light of the interactive remote control unit 1 and that of the 3D glasses unit 2 are split onto separate sensors 32, the spatial position information of each can be obtained independently.
  • the display unit generates a corresponding 3D image according to the ranging positioning information of the interactive remote control unit 1 relative to the display surface S and the position information of the infrared light signal on the display surface S, letting the user holding the interactive remote control unit 1 see the 3D image corresponding to his standing position and making the 3D effect more realistic.
  • the 3D image generation method on the display unit includes the following steps:
  • S1: the three-dimensional spatial positioning sensor 11 of the interactive remote control unit 1 emits an infrared light signal to the display surface S of the display unit, and the interactive display system obtains the ranging positioning information of the interactive remote control unit 1 relative to the display surface S;
  • the distance location information may include information such as the distance of the interactive remote control unit 1 with respect to the display surface S.
  • S2: the system monitoring unit 3 obtains the position information of the infrared light signal emitted by the interactive remote control unit 1 on the display surface S of the display unit, and determines the first base point K of the 3D image and/or its motion track according to the position information.
  • the first base point is the point at which the shooting axis of the interactive remote control unit 1 (such as a remote control gun) extends to the display surface. In other embodiments, it may be the center of the three beams of infrared light, or another location.
  • S3: the second base point J is obtained by offsetting from the first base point K on the display surface S, using the ranging positioning information of the interactive remote control unit 1 relative to the display surface S.
  • the interactive remote control unit 1 is held by the user, usually below the eyes, with the display surface S in front of the user, so bullets, darts, and the like shot from the interactive remote control unit 1 should appear to come from below on the display surface S.
  • the obtained second base point J should be located between the first base point K and the bottom side of the display surface S.
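One way to realize the offset described above is to shift the first base point K downward by an amount that grows with the measured distance. The linear gain below is purely an illustrative assumption; the patent only requires J to lie between K and the bottom edge of the display surface.

```python
def second_base_point(k_xy, distance_m, gain=0.05, y_bottom=0.0):
    """Offset the second base point J downward from the first base point K.
    The linear gain is an assumption for illustration; the patent only
    requires J to lie between K and the bottom edge of the display."""
    kx, ky = k_xy
    jy = max(y_bottom, ky - gain * distance_m)  # clamp at the bottom edge
    return (kx, jy)

# K at (0.5, 0.4) in normalized screen coordinates, remote 2.1 m away
j = second_base_point((0.5, 0.4), distance_m=2.1)
```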
  • S4: generate a 3D image of the 3D object and/or its motion track according to the ranging positioning information of the interactive remote control unit 1 relative to the display surface S and the position information of the first base point K and the second base point J.
  • the 3D image corresponding to the first base point K and/or its motion trajectory is a zero parallax image
  • the 3D image corresponding to the second base point J and/or its motion trajectory is a negative parallax image
  • the absolute value of the negative parallax increases or decreases in the same direction as the distance value measured in step S1.
  • the 3D images of the generated 3D object and/or its motion track are generated on both sides of the line segment connecting the first base point K and the second base point J, as the actual images seen by the left eye and the right eye respectively. The image at the first base point K has zero parallax, the image below the first base point K has negative parallax, and the image above the first base point K has positive parallax.
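The parallax assignment above can be sketched as a mapping from a point's height relative to the first base point K to a horizontal offset between the left-eye and right-eye images. The linear gain is an illustrative assumption; the patent only fixes the sign convention and that the magnitude follows the measured distance.

```python
def eye_positions(x, y, k_y, distance_m, gain=0.02):
    """Left-eye and right-eye screen x-positions for a point of the 3D object.
    Parallax is zero at the first base point height k_y, negative below it
    (left/right images crossed), positive above it; its magnitude scales
    with the remote's measured distance.  Illustrative linear model."""
    parallax = gain * distance_m * (y - k_y)  # sign follows height vs. K
    half = parallax / 2.0
    return (x - half, x + half)

# below K: negative parallax, so the left-eye image lies to the right
left_x, right_x = eye_positions(0.5, 0.2, k_y=0.4, distance_m=2.1)
```

With crossed (negative-parallax) images the fused point appears in front of the screen, which is what makes the bullet seem to fly out toward the viewer.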
  • as shown in FIG. 10, when the 3D glasses are worn, a 3D image with a 3D effect is seen: a 3D object such as a bullet or dart emitted from near the interactive remote control unit 1 moves along its track toward the display surface S.
  • the operation steps include:
  • Step S1 further comprises: the three-dimensional spatial positioning sensor 11 of the 3D glasses unit 2 emits an infrared light signal to the display surface S of the display unit; the interactive display system derives the ranging positioning information of the 3D glasses unit 2 relative to the display surface S and calculates the spatial position information between the interactive remote control unit 1 and the 3D glasses unit 2;
  • Step S2 further comprises: the system monitoring unit 3 obtains the position information of the infrared light signal emitted by the 3D glasses unit 2 on the display surface S of the display unit, from which the positional relationships among the interactive remote control unit 1, the 3D glasses unit 2, and the display surface S are derived.
  • Step S3 further comprises: after the spatial positions of the interactive remote control unit 1 and the 3D glasses unit 2 are obtained, the second base point J of the 3D image and/or its motion trajectory is determined from the point where the virtual spatial line connecting the 3D glasses unit 2 and the top end (or another part) of the interactive remote control unit 1, extended to the display picture, lands.
  • the generated 3D image of the 3D object and/or its motion trajectory includes, on the two sides of the line segment connecting the first base point K and the second base point J, the actual images or motion trajectories visible to the left eye and the right eye separately; the image at the first base point K has zero parallax, the image below the first base point K has negative parallax, and the image above the first base point K has positive parallax.
  • the image the user sees after wearing the 3D glasses is, for example, a bullet emitted directly from the muzzle of the gun-shaped interactive remote control unit 1, passing through the position on the display surface S at which the muzzle points and flying into the distance until it disappears. The 3D-effect 3D object and its motion trajectory will not be offset from the interactive remote control unit 1, making the visual effect more realistic.
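The parallax assignment described above (zero parallax at K, negative below K, positive above K, with the negative-parallax magnitude growing with the measured distance) can be sketched as a mapping from a point on the K–J track to a left-eye/right-eye image pair. The linear interpolation and the `gain` constant are illustrative assumptions; the patent does not specify the exact mapping.

```python
def eye_images(x, y, k_y, j_y, distance_m, gain=0.02):
    """Return (left-eye x, right-eye x) for a trajectory point at screen
    position (x, y). Parallax p = x_right - x_left: zero at the first base
    point K (y == k_y), most negative at the second base point J (y == j_y).
    The negative-parallax magnitude grows with the measured sensor-to-surface
    distance (`gain` is an assumed tuning constant, not from the patent)."""
    max_parallax = gain * distance_m        # larger when the user stands farther away
    f = (k_y - y) / (k_y - j_y)             # 0 at K, 1 at J, negative above K
    p = -max_parallax * f                   # negative below K, positive above K
    return x - p / 2.0, x + p / 2.0
```

At K the two eye images coincide (zero parallax); at J they are crossed by `gain * distance_m`, so the start of the trajectory appears to pop out toward the viewer, farther out the farther the viewer stands.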

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Position Input By Displaying (AREA)

Abstract

A three-dimensional space positioning sensor (11), an interactive display system, and a 3D image generation method. The three-dimensional space positioning sensor (11) comprises: an infrared light emitting unit (111), which emits at least three infrared light beams to a target surface; a ranging monitoring unit (112), which obtains the time at which the infrared light signals are reflected back to the ranging monitoring unit (112); and an information processing unit (113), which derives ranging positioning information relative to the target surface from the time difference between the emission and return of each beam. Using the ranging positioning information obtained by the three-dimensional space positioning sensor (11), the display system generates a 3D image and/or a motion trajectory corresponding to the position of the interactive remote control unit, based on the ranging positioning information of the interactive remote control unit relative to the display surface (S) and the position information of the infrared light on the display surface (S). A user holding the interactive remote control unit thus sees a 3D image matching his or her standing position, for example the motion trajectory of a 3D object such as a bullet or dart launched from the interactive remote control unit toward the display surface (S), making the 3D effect more realistic.

Description

Three-dimensional space positioning sensor, interactive display system, and 3D image generation method

Technical Field
[0001] The present invention relates to the field of image display and image generation and imaging, and more particularly to a three-dimensional space positioning sensor, an interactive display system, and an image generation method.
Background Art
[0002] In interactive display systems of the related art, a remote control is usually positioned by judging position information relative to the display unit from a single point. In addition, the images of the interactive display system are displayed from preset files, so the effect is rather uniform and lacks a true sense of three-dimensional space.
Technical Problem
[0003] The technical problem to be solved by the present invention is to provide an improved three-dimensional space positioning sensor, interactive display system, and 3D image generation method.
Solution to the Problem

Technical Solution
[0004] The technical solution adopted by the present invention to solve its technical problem is: constructing a three-dimensional space positioning sensor comprising an infrared light emitting unit, a ranging monitoring unit, and an information processing unit;
[0005] the infrared light emitting unit emits at least three infrared light beams to different positions on a target surface;
[0006] the ranging monitoring unit obtains the time at which the infrared light signals, after being emitted to the target surface, are reflected back to the ranging monitoring unit;
[0007] the information processing unit derives ranging positioning information relative to the target surface from the time difference between the emission of each infrared light beam by the infrared light emitting unit and its return to the ranging monitoring unit.
[0008] Preferably, the optical axes of the at least three infrared light beams are arranged at predetermined spatial angles to one another.
[0009] Preferably, the infrared light emitting unit comprises at least three infrared emitters, and the information processing unit comprises a modulation circuit unit for modulating the signals of the multiple infrared emitters.
[0010] Preferably, the information processing unit comprises a timing circuit unit for calculating the time difference between the emission of each infrared light beam by the infrared light emitting unit and its return to the ranging monitoring unit.
[0011] Preferably, the ranging monitoring unit comprises a lens unit and a photodetector unit; the lens unit reflects the infrared light returned from the display surface to the photodetector unit, and the photodetector unit collects the reflected infrared light signals.
[0012] Preferably, the lens unit comprises a beam-splitting prism unit coated with a half-reflective, half-transmissive film, which transmits part of the infrared light signal to the display surface and reflects part of it to the ranging monitoring unit for capturing the start time of the light signal.
[0013] Preferably, the ranging monitoring unit comprises a CMOS unit for capturing the emission and return of the infrared light signals.
[0014] The present invention also constructs an interactive display system comprising a display unit and an interactive remote control unit;
[0015] the interactive remote control unit comprises the three-dimensional space positioning sensor described above and a first communication unit for communicating with the display unit;
[0016] the three-dimensional space positioning sensor of the interactive remote control unit emits infrared light signals to the display surface of the display unit to derive ranging positioning information of the interactive remote control unit relative to the display surface;
[0017] the first communication unit is communicatively connected to the display unit and sends the ranging positioning information of the interactive remote control unit relative to the display surface to the display unit;
[0018] the interactive display system further comprises a system monitoring unit for obtaining the position information of the infrared light signals of the interactive remote control unit on the display surface and feeding it back to the display unit.
[0019] Preferably, the interactive display system further comprises a 3D glasses unit, which comprises the three-dimensional space positioning sensor described above and a second communication unit for communicating with the display unit;
[0020] the three-dimensional space positioning sensor of the 3D glasses unit emits infrared light signals to the display surface of the display unit to derive ranging positioning information of the 3D glasses unit relative to the display surface;
[0021] the second communication unit is communicatively connected to the display unit and sends the ranging positioning information of the 3D glasses unit relative to the display surface to the display unit;
[0022] the system monitoring unit obtains the position information of the infrared light signals of the 3D glasses unit on the display surface and feeds it back to the display unit.
[0023] Preferably, the interactive display system further comprises an algorithm unit that calculates, from the ranging positioning information of the interactive remote control unit and/or the 3D glasses unit and the positions of their infrared light on the display surface, the spatial positions of the interactive remote control unit and/or the 3D glasses unit relative to the display surface and relative to each other.
[0024] Preferably, the spatial positions include the distances and angles of the interactive remote control unit and the 3D glasses unit relative to the display surface, or the distance and angle between the interactive remote control unit and the 3D glasses unit.
[0025] Preferably, the infrared light wavelength used by the interactive remote control unit differs from that used by the 3D glasses unit.
[0026] Preferably, the system monitoring unit comprises a beam-splitting prism for separating the different infrared wavelengths of the interactive remote control unit and the 3D glasses unit, and two sensors for capturing the positions of the infrared light signals of the interactive remote control unit and the 3D glasses unit respectively.
[0027] The present invention also constructs a 3D image generation method for an interactive display system comprising a display unit, an interactive remote control unit, and a system monitoring unit, the interactive remote control unit comprising the three-dimensional space positioning sensor described above, the 3D image generation method comprising the following steps:
[0028] S1: the three-dimensional space positioning sensor of the interactive remote control unit emits infrared light signals to the display surface of the display unit, and the interactive display system derives ranging positioning information of the interactive remote control unit relative to the display surface;
[0029] S2: the system monitoring unit obtains, in real time, the position information of the infrared light signals emitted by the interactive remote control unit on the display surface of the display unit, and determines from this position information the first base point of the 3D image and/or its motion trajectory;
[0030] S3: using the position information of the interactive remote control unit relative to the display surface, a second base point is obtained by offsetting the first base point a certain distance toward the bottom edge of the display surface;
[0031] S4: a 3D image of a 3D object and/or its motion trajectory is generated according to the position information of the interactive remote control unit relative to the display surface and the position information of the first and second base points; the 3D image and/or motion trajectory corresponding to the first base point is a zero-parallax image, the 3D image and/or motion trajectory corresponding to the second base point is a negative-parallax image, and the absolute value of the negative parallax increases and decreases in the same direction as the ranging value measured in step S1.
[0032] Preferably, the interactive display system further comprises a 3D glasses unit comprising the three-dimensional space positioning sensor described above;
[0033] step S1 further comprises: the three-dimensional space positioning sensor of the 3D glasses unit emits infrared light signals to the display surface of the display unit, and the interactive display system derives ranging positioning information of the 3D glasses unit relative to the display surface and calculates the spatial position information between the interactive remote control unit and the 3D glasses unit;
[0034] step S2 further comprises: the system monitoring unit obtains, in real time, the position information of the infrared light signals emitted by the 3D glasses unit on the display surface of the display unit;
[0035] step S3 further comprises: the second base point for displaying the 3D image and/or its motion trajectory is determined from the point where the virtual spatial line connecting the 3D glasses unit and the top end (or another part) of the interactive remote control unit, extended to the display picture, lands.
Advantageous Effects of the Invention

Advantageous Effects
[0036] The three-dimensional space positioning sensor, interactive display system, and 3D image generation method of the present invention have the following advantageous effects: the three-dimensional space positioning sensor derives positioning information relative to the target surface, so the display system can generate a 3D image and/or a motion trajectory corresponding to the position of the interactive remote control unit from its positioning information relative to the display surface. A user holding the interactive remote control unit thus sees a 3D image matching his or her standing position, for example the motion trajectory of a 3D object such as a bullet or dart launched from the corresponding position of the interactive remote control unit toward the display surface, making the 3D effect more realistic.
Brief Description of the Drawings

Description of the Drawings
[0037] The present invention is further described below with reference to the accompanying drawings and embodiments, in which:
[0038] FIG. 1 is a schematic diagram of the principle by which the three-dimensional space positioning sensor in an embodiment of the present invention emits infrared light signals to the display surface for positioning;
[0039] FIG. 2 is a schematic diagram of the arrangement, at mutual angles, of the infrared emitters of the infrared light emitting unit of FIG. 1;
[0040] FIG. 3 is a schematic diagram of the principle when the ranging monitoring unit of the three-dimensional space positioning sensor comprises a lens unit and a photodetector unit;
[0041] FIG. 4 is a schematic diagram of the principle when the ranging monitoring unit of the three-dimensional space positioning sensor comprises a one-dimensional CMOS unit;
[0042] FIG. 5 is a schematic diagram of the interactive remote control unit of the interactive display system emitting infrared light signals to the display surface;
[0043] FIG. 6 is a schematic diagram of the interactive remote control unit and the 3D glasses unit of the interactive display system each emitting infrared light signals to the display surface;
[0044] FIG. 7 is a schematic diagram of the first base point obtained after the interactive remote control unit of FIG. 6 emits to the display surface;
[0045] FIG. 8 is a schematic diagram of the second base point derived from the first base point of FIG. 7;
[0046] FIG. 9 shows the actual 3D images generated on both sides of the line connecting the first and second base points of FIG. 8, as seen with the left eye and the right eye opened separately;
[0047] FIG. 10 shows the 3D image with 3D effect presented when the actual 3D images generated in FIG. 9 are viewed through the 3D glasses unit;
[0048] FIG. 11 shows the actual 3D images, generated from the positioning information of the interactive remote control unit and the 3D glasses unit relative to the display surface and to each other, on both sides of the line connecting the first and second base points, as seen with the left eye and the right eye opened separately;
[0049] FIG. 12 shows the 3D image with 3D effect presented when the actual 3D images generated in FIG. 11 are viewed through the 3D glasses unit.

Best Mode for Carrying Out the Invention
Best Mode of the Invention
[0050] For a clearer understanding of the technical features, objects, and effects of the present invention, specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
[0051] As shown in FIG. 1, the three-dimensional space positioning sensor 11 in a preferred embodiment of the present invention comprises an infrared light emitting unit 111, a ranging monitoring unit 112, and an information processing unit 113. The infrared light emitting unit 111 comprises three infrared emitters, which emit three infrared light beams to different positions on the target surface. Further, the target surface is the display surface S of the display unit, which may be flat or curved.
[0052] The ranging monitoring unit 112 obtains the time at which an infrared light signal, after being emitted to the target surface, is reflected back to the ranging monitoring unit 112. The information processing unit 113 derives the path length traveled by each infrared beam from the time difference between its emission by the infrared light emitting unit 111 and its return to the ranging monitoring unit 112, thereby obtaining ranging positioning information relative to the target surface; this information includes the flight distances of the at least three infrared beams relative to the display surface S. The emission time of the infrared light may be set by the system or obtained through the ranging monitoring unit 112.
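The time-of-flight relation in the preceding paragraph — path length derived from each beam's emit/return time difference — reduces to a one-line calculation. A minimal sketch (the function name and units are ours, not the patent's):

```python
C = 299_792_458.0  # speed of light in m/s

def beam_distance(t_emit_s: float, t_return_s: float) -> float:
    """One beam's sensor-to-surface distance: half the round-trip path
    that light covers in the measured emit/return time difference."""
    dt = t_return_s - t_emit_s
    if dt < 0:
        raise ValueError("return time precedes emit time")
    return C * dt / 2.0
```

A 10 ns round trip corresponds to roughly 1.5 m, which illustrates why the timing circuit (unit 1132 below) must resolve sub-nanosecond intervals for room-scale accuracy.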
[0053] As shown in FIG. 2 and FIGS. 5 to 7, the optical axes of the three infrared light beams are arranged at predetermined spatial angles to one another, so the three beams form angles in space and the three points where they hit the display surface S form a triangle. Combined with the ranging positioning information relative to the display surface S, the incidence angles of the at least three infrared beams of the three-dimensional space positioning sensor 11 on the display surface S can then be calculated, accurately positioning the sensor 11 relative to the display surface S — for example the distance from the muzzle of an interactive remote control unit 1 such as a remote control gun to the target point, and the angle between the gun's shooting axis and the display surface S — making positioning more accurate.
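The geometry just described — three beams at known mutual angles whose measured distances fix the sensor's pose relative to the display surface — can be sketched by placing the three hit points in the sensor's frame and fitting the plane through them. The beam layout (three directions tilted 5° off the optical axis, 120° apart) is an illustrative assumption, not the patent's specified arrangement:

```python
import numpy as np

def sensor_pose(dirs, dists):
    """From three known unit beam directions and their measured distances,
    return (distance along the sensor's optical axis to the surface,
    incidence angle in degrees between the axis and the surface normal)."""
    pts = [d * u for u, d in zip(dirs, dists)]        # beam hit points, sensor frame
    n = np.cross(pts[1] - pts[0], pts[2] - pts[0])    # plane normal from the triangle
    n = n / np.linalg.norm(n)
    axis = np.array([0.0, 0.0, 1.0])                  # sensor's optical axis
    t = float(np.dot(pts[0], n) / np.dot(axis, n))    # axis meets the plane at t * axis
    angle = float(np.degrees(np.arccos(np.clip(abs(np.dot(axis, n)), 0.0, 1.0))))
    return t, angle

# Assumed layout: three unit beams tilted 5 degrees off the axis, 120 degrees apart.
theta = np.radians(5.0)
dirs = [np.array([np.sin(theta) * np.cos(p), np.sin(theta) * np.sin(p), np.cos(theta)])
        for p in np.radians([0.0, 120.0, 240.0])]
```

With equal measured distances the surface must be facing the sensor head-on (zero incidence angle); unequal distances tilt the fitted plane, which is exactly the shooting-axis angle the paragraph refers to.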
[0054] In other embodiments, the infrared light emitting unit 111 may also emit more than three infrared light beams to different positions on the target surface, the ranging positioning information relative to the target surface being derived from the time difference between the emission of each beam by the infrared light emitting unit 111 and its return to the ranging monitoring unit 112.
[0055] As shown in FIGS. 3 and 4, the information processing unit 113 comprises a modulation circuit unit 1131 for modulating the signals of the multiple infrared emitters and a timing circuit unit 1132 for calculating the time difference between the emission of each infrared beam by the infrared light emitting unit 111 and its return to the ranging monitoring unit 112. The modulation circuit unit 1131 modulates the signals of the multiple infrared emitters, and the information processing unit 113 calculates from each beam's emit/return time difference the length from the emission point to the display surface S, thereby calculating the ranging positioning information of the three-dimensional space positioning sensor 11 relative to the display surface S.
[0056] As shown in FIG. 4, the ranging monitoring unit 112 comprises a CMOS unit 1121, fitted with a lens, for capturing the emission and return of the infrared light signals. The emission time of the infrared light can be determined by the system: when the light signal is emitted, the system records the time immediately, derives the emission time, and feeds it back to the information processing unit 113. After the infrared light reaches the display surface S it is reflected; the CMOS unit 1121 captures the returning infrared light signal, the return time to the ranging monitoring unit 112 is obtained and fed back to the information processing unit 113, and the information processing unit 113 derives from the time difference the path length traveled by the infrared light, thereby obtaining the ranging positioning information of the three-dimensional space positioning sensor 11 relative to the target surface.
[0057] As shown in FIG. 3, in other embodiments the ranging monitoring unit 112 comprises a lens unit and a photodetector unit 1123. The lens unit passes the infrared light through toward the display surface S and reflects part of the infrared light, including the light reflected back from the display surface S, to the photodetector unit 1123. The photodetector unit 1123 captures the infrared light signals returned from the display surface S and feeds the light information back to the information processing unit 113. The photodetector unit 1123 can also capture the initial emission of the infrared light, though this may of course also be captured by the system.
[0058] The infrared light emitting unit 111 further comprises a beam-splitting prism unit 1111 arranged in the emission direction of the infrared light. The beam-splitting prism unit 1111 is coated with a half-reflective, half-transmissive film, which transmits part of the infrared light signal to the display surface S and reflects part of it to the ranging monitoring unit 112 for capturing the start time of the light signal. When the infrared light passes through the beam-splitting prism unit 1111, one path is reflected by the film to the ranging monitoring unit 112, which feeds the light information back to the information processing unit 113 to obtain the emission time; the other path passes through toward the display surface S and is then reflected back to the ranging monitoring unit 112 to obtain the return time. The information processing unit 113 calculates from each beam's arrival time difference the length from the emission point to the display surface S, thereby calculating the ranging positioning information of the three-dimensional space positioning sensor 11 relative to the display surface S.
[0059] As shown in FIG. 5, the interactive display system in a preferred embodiment of the present invention comprises a display unit and an interactive remote control unit 1. The display unit may be a display screen, a projector, or the like; the display surface S of the display unit is the display surface of the screen or the surface on which the projector's image falls. The interactive remote control unit 1 is held in the user's hand and is typically a game controller such as a remote control gun. It interacts with the display unit, which correspondingly generates a 3D image matching the position at which the interactive remote control unit 1 is held, making the 3D image more realistic.
[0060] The interactive remote control unit 1 comprises a three-dimensional space positioning sensor 11 and a first communication unit 12 for communicating with the display unit. During interaction between the interactive remote control unit 1 and the display unit, the sensor 11 of the interactive remote control unit 1 emits infrared light signals to the display surface S of the display unit to derive the ranging positioning information of the interactive remote control unit 1 relative to the display surface S. The first communication unit 12 is communicatively connected to the display unit and sends this ranging positioning information to the display unit.
[0061] The interactive display system further comprises a system monitoring unit 3 for obtaining the position information of the infrared light signals of the interactive remote control unit 1 on the display surface S and feeding it back to the display unit. After the infrared light is emitted to the display surface S, the system monitoring unit 3 obtains its position on the display surface S. As shown in FIG. 7, when the system monitoring unit 3 captures the picture, there are at least three infrared light spots on it; if the application needs to determine the optical axis of the interactive remote control unit 1, a calculation may be used to obtain the midpoint of the three points or other parameters, which is more convenient for the application.
[0062] As shown in FIG. 6, the interactive display system further comprises a 3D glasses unit 2, worn on the user's head, while the interactive remote control unit 1 is held in the user's hand. The interactive remote control unit 1 and the 3D glasses unit 2 interact with the display unit, which correspondingly generates a 3D image matching both the position at which the interactive remote control unit 1 is held and the position of the user's eyes, making the 3D image more realistic.
[0063] Further, the 3D glasses unit 2 comprises a three-dimensional space positioning sensor 11 and a second communication unit 22 for communicating with the display unit. The sensor 11 of the 3D glasses unit 2 emits infrared light signals to the display surface S of the display unit to derive the ranging positioning information of the 3D glasses unit 2 relative to the display surface S. The second communication unit 22 is communicatively connected to the display unit and sends this ranging positioning information, which includes the distance of the 3D glasses unit 2 from the display surface S, to the display unit. At the same time, the system monitoring unit 3 obtains the position information of the infrared light signals of the 3D glasses unit 2 on the display surface S and feeds it back to the display unit. As shown in FIG. 7, by locating the at least three infrared light spots of the 3D glasses unit 2 on the display surface S, the axis angle of the center of the 3D glasses unit 2 relative to the display surface S can be calculated algorithmically, which is more convenient for subsequent applications.
[0064] The interactive display system further comprises an algorithm unit that calculates, from the ranging positioning information of the interactive remote control unit 1 and the 3D glasses unit 2 and the positions of their infrared light on the display surface S, the spatial positions of the interactive remote control unit 1 and the 3D glasses unit 2 relative to the display surface S and relative to each other. The spatial positions include the distances and angles relative to the display surface S, so the positions and orientations of the interactive remote control unit 1 and the 3D glasses unit 2 relative to the display surface S can be accurately derived.
[0065] To facilitate the calculation of the spatial position information of the interactive remote control unit 1 and the 3D glasses unit 2, the infrared light wavelength used by the interactive remote control unit 1 differs from that used by the 3D glasses unit 2. Correspondingly, the system monitoring unit 3 comprises a beam-splitting prism 31 for separating the different infrared wavelengths of the interactive remote control unit 1 and the 3D glasses unit 2, and two sensors 32 for capturing the positions of their infrared light signals respectively. The sensors 32 may be CMOS or CCD chips. After being separated, the infrared light of the interactive remote control unit 1 and the 3D glasses unit 2 falls on the two sensors 32 respectively, so the spatial position information of each can be derived separately.
[0066] With reference to FIGS. 7 and 8, the display unit generates and displays the corresponding 3D image from the ranging positioning information of the interactive remote control unit 1 relative to the display surface S and the position of its infrared light signals on the display surface S, so a user holding the interactive remote control unit 1 sees a 3D image matching his or her standing position, making the 3D effect more realistic.
[0067] The 3D image generation method on the display unit comprises the following steps:
[0068] S1: the three-dimensional space positioning sensor 11 of the interactive remote control unit 1 emits infrared light signals to the display surface S of the display unit, and the interactive display system derives the ranging positioning information of the interactive remote control unit 1 relative to the display surface S; this ranging positioning information may include the distance of the interactive remote control unit 1 from the display surface S.
[0069] S2: the system monitoring unit 3 obtains, in real time, the position information of the infrared light signals emitted by the interactive remote control unit 1 on the display surface S of the display unit, and determines from this position information the first base point K of the 3D image and/or its motion trajectory. Preferably, the first base point K is the point where the shooting axis of the interactive remote control unit 1 (such as a remote control gun), extended, meets the display surface; in other embodiments it may be the center of the three infrared beams, and in still other embodiments the first base point K may be another position.
[0070] S3: as shown in FIG. 8, using the ranging positioning information of the interactive remote control unit 1 relative to the display surface S, a second base point J is obtained by offsetting the first base point K a certain distance toward the bottom edge of the display surface S. Since, when the user is standing, the interactive remote control unit 1 is usually held below the eyes and the display surface S is in front of the user, the bullets, darts, and the like shot from the interactive remote control unit 1 should appear to be launched from below the display surface S, so that they seem to fly toward the picture from near the direction of the hand-held interactive remote control unit 1. The second base point J obtained should therefore lie between the first base point K and the bottom edge of the display surface S.
[0071] S4: a 3D image of a 3D object and/or its motion trajectory is generated according to the ranging positioning information of the interactive remote control unit 1 relative to the display surface S and the position information of the first base point K and the second base point J. The 3D image and/or motion trajectory corresponding to the first base point K is a zero-parallax image, that corresponding to the second base point J is a negative-parallax image, and the absolute value of the negative parallax increases and decreases in the same direction as the ranging value in step S1. This is because the farther the user is from the display surface S, the longer the measured distance, and the farther the starting point of the image's motion trajectory must appear from the display surface S, so a larger negative-parallax value at the second base point J is needed to achieve the effect.
[0072] As shown in FIG. 9, the generated 3D image of the 3D object and/or its motion trajectory includes the actual images or motion trajectories, generated on the two sides of the line segment connecting the first base point K and the second base point J, that are visible when the left eye and the right eye are opened separately: the image at the first base point K has zero parallax, the image below the first base point K has negative parallax, and the image above the first base point K has positive parallax. As shown in FIG. 10, when the 3D glasses are worn, the user sees the 3D image with its 3D effect, including the motion trajectory of a 3D object such as a bullet or dart launched from a position near the direction of the interactive remote control unit 1 toward the display surface S.
[0073] As shown in FIG. 11, when the user wears the 3D glasses unit 2 and holds the interactive remote control unit 1 at the same time, a true 3D effect corresponding more closely to the eye position and the interactive remote control unit 1 can be generated on the display unit from the ranging positioning information of the 3D glasses unit 2 and the interactive remote control unit 1 relative to the display surface S. Correspondingly, the operation steps include:
[0074] step S1 further comprises: the three-dimensional space positioning sensor 11 of the 3D glasses unit 2 emits infrared light signals to the display surface S of the display unit, and the interactive display system derives the ranging positioning information of the 3D glasses unit 2 relative to the display surface S and calculates the spatial position information between the interactive remote control unit 1 and the 3D glasses unit 2;
[0075] step S2 further comprises: the system monitoring unit 3 obtains, in real time, the position information of the infrared light signals emitted by the 3D glasses unit 2 on the display surface S of the display unit, from which the positional relationships among the interactive remote control unit 1, the 3D glasses unit 2, and the display surface S can be derived.
[0076] step S3 further comprises: after the spatial positions of the interactive remote control unit 1 and the 3D glasses unit 2 are obtained, the second base point J for displaying the 3D image and/or its motion trajectory is determined from the point where the virtual spatial line connecting the 3D glasses unit 2 and the top end (or another part) of the interactive remote control unit 1, extended to the display picture, lands.
[0077] The generated 3D image of the 3D object and/or its motion trajectory includes the actual images or motion trajectories, generated on the two sides of the line segment connecting the first base point K and the second base point J, that are visible when the left eye and the right eye are opened separately: the image at the first base point K has zero parallax, the image below the first base point K has negative parallax, and the image above the first base point K has positive parallax.
[0078] As shown in FIG. 12, the image the user sees after putting on the 3D glasses is, for example, a bullet emitted directly from the muzzle of the gun-shaped interactive remote control unit 1, passing through the position on the display surface S at which the muzzle points and flying into the distance until it disappears. The 3D-effect 3D object and its motion trajectory will not be offset from the interactive remote control unit 1, making the visual effect more realistic.
[0079] It will be understood that the above technical features may be used in any combination without limitation.
[0080] The above are only embodiments of the present invention and do not thereby limit the patent scope of the present invention. Any equivalent structural or process transformation made using the contents of the description and drawings of the present invention, applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present invention.

Claims

Claims

1. A three-dimensional space positioning sensor (11), characterized by comprising an infrared light emitting unit (111), a ranging monitoring unit (112), and an information processing unit (113);
the infrared light emitting unit (111) emits at least three infrared light beams to different positions on a target surface;
the ranging monitoring unit (112) obtains the time at which the infrared light signals, after being emitted to the target surface, are reflected back to the ranging monitoring unit (112);
the information processing unit (113) derives ranging positioning information relative to the target surface from the time difference between the emission of each infrared light beam by the infrared light emitting unit (111) and its return to the ranging monitoring unit (112).

2. The three-dimensional space positioning sensor (11) according to claim 1, characterized in that the optical axes of the at least three infrared light beams are arranged at predetermined spatial angles to one another.

3. The three-dimensional space positioning sensor (11) according to claim 1 or 2, characterized in that the infrared light emitting unit (111) comprises at least three infrared emitters, and the information processing unit (113) comprises a modulation circuit unit (1131) for modulating the signals of the multiple infrared emitters.

4. The three-dimensional space positioning sensor (11) according to claim 3, characterized in that the information processing unit (113) comprises a timing circuit unit (1132) for calculating the time difference between the emission of each infrared light beam by the infrared light emitting unit (111) and its return to the ranging monitoring unit (112).

5. The three-dimensional space positioning sensor (11) according to claim 4, characterized in that the ranging monitoring unit (112) comprises a lens unit and a photodetector unit (1123), the lens unit reflecting the infrared light returned from the display surface (S) to the photodetector unit (1123), and the photodetector unit (1123) collecting the reflected infrared light signals.

6. The three-dimensional space positioning sensor (11) according to claim 5, characterized in that the infrared light emitting unit (111) further comprises a beam-splitting prism unit (1111) coated with a half-reflective, half-transmissive film, which transmits part of the infrared light signal to the display surface (S) and reflects part of it to the ranging monitoring unit (112) for capturing the start time of the light signal.

7. The three-dimensional space positioning sensor (11) according to claim 4, characterized in that the ranging monitoring unit (112) comprises a CMOS unit (1121) for capturing the emission and return of the infrared light signals.

8. An interactive display system comprising a display unit and an interactive remote control unit (1), characterized in that:
the interactive remote control unit (1) comprises the three-dimensional space positioning sensor (11) according to any one of claims 1 to 7 and a first communication unit (12) for communicating with the display unit;
the three-dimensional space positioning sensor (11) of the interactive remote control unit (1) emits infrared light signals to the display surface (S) of the display unit to derive ranging positioning information of the interactive remote control unit (1) relative to the display surface (S);
the first communication unit (12) is communicatively connected to the display unit and sends the ranging positioning information of the interactive remote control unit (1) relative to the display surface (S) to the display unit;
the interactive display system further comprises a system monitoring unit (3) for obtaining the position information of the infrared light signals of the interactive remote control unit (1) on the display surface (S) and feeding it back to the display unit.

9. The interactive display system according to claim 8, characterized in that it further comprises a 3D glasses unit (2), which comprises the three-dimensional space positioning sensor (11) according to any one of claims 1 to 7 and a second communication unit (22) for communicating with the display unit;
the three-dimensional space positioning sensor (11) of the 3D glasses unit (2) emits infrared light signals to the display surface (S) of the display unit to derive ranging positioning information of the 3D glasses unit (2) relative to the display surface (S);
the second communication unit (22) is communicatively connected to the display unit and sends the ranging positioning information of the 3D glasses unit (2) relative to the display surface (S) to the display unit;
the system monitoring unit (3) obtains the position information of the infrared light signals of the 3D glasses unit (2) on the display surface (S) and feeds it back to the display unit.

10. The interactive display system according to claim 9, characterized in that it further comprises an algorithm unit that calculates, from the ranging positioning information of the interactive remote control unit (1) and/or the 3D glasses unit (2) and the positions of their infrared light on the display surface (S), the spatial positions of the interactive remote control unit (1) and/or the 3D glasses unit (2) relative to the display surface (S) and relative to each other.

11. The interactive display system according to claim 10, characterized in that the spatial positions include the distances and angles of the interactive remote control unit (1) and the 3D glasses unit (2) relative to the display surface (S), or the distance and angle between the interactive remote control unit (1) and the 3D glasses unit (2).

12. The interactive display system according to claim 9, characterized in that the infrared light wavelength used by the interactive remote control unit (1) differs from that used by the 3D glasses unit (2).

13. The interactive display system according to claim 12, characterized in that the system monitoring unit (3) comprises a beam-splitting prism (31) for separating the different infrared wavelengths of the interactive remote control unit (1) and the 3D glasses unit (2), and two sensors (32) for capturing the positions of the infrared light signals of the interactive remote control unit (1) and the 3D glasses unit (2) respectively.

14. A 3D image generation method for an interactive display system comprising a display unit, an interactive remote control unit (1), and a system monitoring unit (3), characterized in that the interactive remote control unit (1) comprises the three-dimensional space positioning sensor (11) according to any one of claims 1 to 7, and the 3D image generation method comprises the following steps:
S1: the three-dimensional space positioning sensor (11) of the interactive remote control unit (1) emits infrared light signals to the display surface (S) of the display unit, and the interactive display system derives ranging positioning information of the interactive remote control unit (1) relative to the display surface (S);
S2: the system monitoring unit (3) obtains, in real time, the position information of the infrared light signals emitted by the interactive remote control unit (1) on the display surface (S) of the display unit, and determines from this position information the first base point (K) of the 3D image and/or its motion trajectory;
S3: using the position information of the interactive remote control unit (1) relative to the display surface (S), a second base point (J) is obtained by offsetting the first base point (K) a certain distance toward the bottom edge of the display surface (S);
S4: a 3D image of a 3D object and/or its motion trajectory is generated according to the position information of the interactive remote control unit (1) relative to the display surface (S) and the position information of the first base point (K) and the second base point (J); the 3D image and/or motion trajectory corresponding to the first base point (K) is a zero-parallax image, that corresponding to the second base point (J) is a negative-parallax image, and the absolute value of the negative parallax increases and decreases in the same direction as the ranging value in step S1.

15. The 3D image generation method according to claim 14, characterized in that the interactive display system further comprises a 3D glasses unit (2) comprising the three-dimensional space positioning sensor (11) according to any one of claims 1 to 7;
step S1 further comprises: the three-dimensional space positioning sensor (11) of the 3D glasses unit (2) emits infrared light signals to the display surface (S) of the display unit, and the interactive display system derives ranging positioning information of the 3D glasses unit (2) relative to the display surface (S) and calculates the spatial position information between the interactive remote control unit (1) and the 3D glasses unit (2);
step S2 further comprises: the system monitoring unit (3) obtains, in real time, the position information of the infrared light signals emitted by the 3D glasses unit (2) on the display surface (S) of the display unit;
step S3 further comprises: the second base point (J) for displaying the 3D image and/or its motion trajectory is determined from the point where the virtual spatial line connecting the 3D glasses unit (2) and the top end (or another part) of the interactive remote control unit (1), extended to the display picture, lands.
PCT/CN2016/072209 2016-01-26 2016-01-26 Three-dimensional space positioning sensor, interactive display system, and 3D image generation method WO2017128046A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/072209 WO2017128046A1 (zh) 2016-01-26 2016-01-26 Three-dimensional space positioning sensor, interactive display system, and 3D image generation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/072209 WO2017128046A1 (zh) 2016-01-26 2016-01-26 Three-dimensional space positioning sensor, interactive display system, and 3D image generation method

Publications (1)

Publication Number Publication Date
WO2017128046A1 true WO2017128046A1 (zh) 2017-08-03

Family

ID=59397123

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/072209 WO2017128046A1 (zh) 2016-01-26 2016-01-26 Three-dimensional space positioning sensor, interactive display system, and 3D image generation method

Country Status (1)

Country Link
WO (1) WO2017128046A1 (zh)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020175897A1 (en) * 1999-08-27 2002-11-28 Pelosi Michael J. 3D cursor or joystick device
CN1811665A (zh) * 2005-01-24 2006-08-02 原相科技股份有限公司 个人电脑的互动式输入装置及游戏***设备
CN101376057A (zh) * 2007-08-27 2009-03-04 原相科技股份有限公司 具警示功能的交互式游戏方法与***
CN101676682A (zh) * 2008-09-19 2010-03-24 鸿富锦精密工业(深圳)有限公司 遥控感测***
CN102008823A (zh) * 2009-04-26 2011-04-13 艾利维公司 控制视频游戏中物体移动的方法和***
CN102935288A (zh) * 2012-10-31 2013-02-20 深圳市德力信科技有限公司 一种人机互动游戏实现装置及方法
CN103152905A (zh) * 2011-12-06 2013-06-12 松下电器产业株式会社 照明***
CN103801076A (zh) * 2012-11-09 2014-05-21 西安景行数创信息科技有限公司 一种互动飞碟射击游戏***
CN104548599A (zh) * 2013-10-29 2015-04-29 西安景行数创信息科技有限公司 一种利用投影技术和感应技术的游戏装置


Similar Documents

Publication Publication Date Title
US9322654B2 (en) Laser tracker with a target sensing unit for target tracking and orientation detection
KR101902283B1 (ko) 물체의 3차원 정보 획득을 위한 카메라 센싱 장치 및 이를 이용한 가상 골프 시뮬레이션 장치
CN106550228B (zh) 获取三维场景的深度图的设备
US9132342B2 (en) Dynamic environment and location based augmented reality (AR) systems
CN110178156A (zh) 包括可调节焦距成像传感器的距离传感器
CN104685541A (zh) 用于确定三维对象上注视点的方法和装置
US10962631B2 (en) Method for operating a laser distance measuring device
CN104634276A (zh) 三维测量***、拍摄设备和方法、深度计算方法和设备
US10222176B2 (en) Simulated gun shooting and target position sensing apparatus and method
CN206541271U (zh) 一种光学定位***和虚拟现实***
CN109068033A (zh) 景深摄像模组
JP2019009592A (ja) 遠隔通信方法、遠隔通信システム及び自律移動装置
CN108195305A (zh) 一种双目检测***及其深度检测方法
CN103499335A (zh) 一种三维测距方法及其装置
CN110095783A (zh) Vr测距信息展示装置及vr测距模组
KR102527207B1 (ko) 촬영 장치 및 물체의 거동 산출 장치
CN106483529A (zh) 一种光学***
WO2017128046A1 (zh) Three-dimensional space positioning sensor, interactive display system, and 3D image generation method
JP4907564B2 (ja) 測距双眼鏡
KR101977307B1 (ko) 항공 사격 채점 시스템 및 방법
JP7061304B2 (ja) ゴルフショット撮影装置、ゴルフショット解析システム及びボールスピン計測装置
CN108628447A (zh) 一种医学影像ar显示***
TWI220156B (en) Optical range-finder
CN105717487A Three-dimensional space positioning sensor, interactive display system, and 3D image generation method
CN211043667U VR laser rangefinder

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16886933

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16886933

Country of ref document: EP

Kind code of ref document: A1