WO2017128046A1 - Three-dimensional spatial positioning sensor, interactive display system, and 3D image generation method - Google Patents

Three-dimensional spatial positioning sensor, interactive display system, and 3D image generation method

Info

Publication number
WO2017128046A1
WO2017128046A1 PCT/CN2016/072209
Authority
WO
WIPO (PCT)
Prior art keywords
unit
infrared light
display
remote control
interactive
Prior art date
Application number
PCT/CN2016/072209
Other languages
English (en)
Chinese (zh)
Inventor
那庆林
麦浩晃
黄彦
Original Assignee
神画科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 神画科技(深圳)有限公司 filed Critical 神画科技(深圳)有限公司
Priority to PCT/CN2016/072209 priority Critical patent/WO2017128046A1/fr
Publication of WO2017128046A1 publication Critical patent/WO2017128046A1/fr

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves

Definitions

  • the present invention relates to the field of image display and image generation, and more particularly to a three-dimensional spatial positioning sensor, an interactive display system, and an image generating method.
  • the positioning of the remote controller is usually determined by a single point and the position information of the display unit.
  • the image of the interactive display system is also displayed from a preset file, so the effect is relatively simple and lacks a real three-dimensional sense of space.
  • the technical problem to be solved by the present invention is to provide an improved three-dimensional spatial positioning sensor, an interactive display system and a 3D image generating method.
  • the technical solution adopted by the present invention to solve the technical problem thereof is: constructing a three-dimensional spatial positioning sensor, comprising an infrared light emitting unit, a ranging monitoring unit, and an information processing unit;
  • an infrared light emitting unit that emits at least three infrared light signals to different positions on a target surface
  • the ranging monitoring unit acquires the time at which each infrared light signal, after being transmitted to the target surface, is reflected back to the ranging monitoring unit;
  • the information processing unit obtains the ranging information of the target surface according to the time difference between each infrared light signal being emitted by the infrared light emitting unit and returning to the ranging monitoring unit.
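The time-of-flight relation implied above (distance from emission/return time difference) can be sketched as follows. This is an illustrative Python sketch, not part of the patent; the function name and timestamps are assumptions.

```python
# Time-of-flight ranging sketch: the one-way distance is half of the
# round-trip time multiplied by the speed of light, since the infrared
# signal travels to the target surface and back.

C = 299_792_458.0  # speed of light in m/s

def distance_from_time_of_flight(emit_time_s: float, return_time_s: float) -> float:
    """Return the one-way distance in metres from emission/return timestamps."""
    round_trip = return_time_s - emit_time_s
    if round_trip < 0:
        raise ValueError("return time must not precede emission time")
    return C * round_trip / 2.0
```

For example, a 10 ns round trip corresponds to roughly 1.5 m of one-way distance, which is why the timing circuitry must resolve nanosecond-scale differences.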
  • the optical axes of the at least three infrared light signals are disposed at predetermined spatial angles to one another.
  • the infrared light emitting unit includes at least three infrared emitters
  • the information processing unit includes a modulation circuit unit for modulating signals of the plurality of infrared light emitters.
  • the information processing unit includes a timing circuit unit for calculating the time difference between each infrared light signal being emitted by the infrared light emitting unit and returning to the ranging monitoring unit.
  • the ranging monitoring unit includes a lens unit and a photodetector unit; the lens unit is configured to direct infrared light reflected back from the display surface to the photodetector unit, and the photodetector unit separately collects the reflected infrared light signals.
  • the lens unit comprises a beam splitting prism unit, and the beam splitting prism unit is plated with a transflective film for transmitting part of each infrared light signal to the display surface and reflecting part of it to the ranging monitoring unit;
  • the reflected part of the infrared light signal is used by the ranging monitoring unit to acquire the emission start time of the optical signal.
  • the ranging monitoring unit comprises a CMOS unit for collecting the transmission and return of the infrared light signals.
  • the present invention also constructs an interactive display system including a display unit and an interactive remote control unit.
  • the interactive remote control unit includes the three-dimensional spatial positioning sensor and a first communication unit for communicating with the display unit;
  • the three-dimensional spatial positioning sensor of the interactive remote control unit emits an infrared light signal to a display surface of the display unit, and obtains ranging positioning information of the interactive remote control unit relative to the display surface;
  • the first communication unit is communicably connected to the display unit, and the ranging positioning information of the interactive remote control unit relative to the display surface is sent to the display unit;
  • the interactive display system further includes a system monitoring unit for acquiring position information of the infrared light signal of the interactive remote control unit on the display surface and feeding back to the display unit.
  • the interactive display system further includes a 3D glasses unit, the 3D glasses unit including the three-dimensional spatial positioning sensor and a second communication unit for communicating with the display unit;
  • the three-dimensional spatial positioning sensor of the 3D glasses unit emits an infrared light signal to a display surface of the display unit, and obtains ranging positioning information of the 3D glasses unit relative to the display surface;
  • the second communication unit is communicably connected to the display unit, and the ranging positioning information of the 3D glasses unit relative to the display surface is sent to the display unit;
  • the system monitoring unit acquires position information of the infrared light signal of the 3D glasses unit on the display surface
  • the interactive display system further includes an algorithm unit; based on the ranging positioning information of the interactive remote control unit and/or the 3D glasses unit and the position information of their infrared light emission on the display surface, the algorithm unit calculates the spatial position of the interactive remote control unit and/or the 3D glasses unit relative to the display surface and relative to each other.
  • the spatial position includes the distance and angle of the interactive remote control unit and the 3D glasses unit relative to the display surface, or the distance and angle between the interactive remote control unit and the 3D glasses unit.
  • the wavelength of the infrared light signal used by the interactive remote control unit is different from the wavelength of the infrared light signal used by the 3D glasses unit.
  • the system monitoring unit includes a beam splitting prism for splitting the infrared light of different wavelengths from the interactive remote control unit and the 3D glasses unit, and two sensors for respectively collecting the positions of the infrared light signals of the interactive remote control unit and the 3D glasses unit.
  • the present invention also constructs a 3D image generating method of an interactive display system, the interactive display system includes a display unit, an interactive remote control unit, and a system monitoring unit, and the interactive remote control unit includes the three-dimensional spatial positioning sensor.
  • the 3D image generation method includes the following steps:
  • S1: the three-dimensional spatial positioning sensor of the interactive remote control unit emits an infrared light signal to a display surface of the display unit, and the interactive display system obtains ranging positioning information of the interactive remote control unit relative to the display surface;
  • S2: the system monitoring unit obtains position information of the infrared light signal emitted by the interactive remote control unit on the display surface of the display unit, and determines a first base point of the 3D image and/or its motion track according to the position information;
  • S3: a second base point of the 3D image and/or its motion track is obtained by offsetting from the first base point according to the ranging positioning information of the interactive remote control unit relative to the display surface;
  • S4: generate a 3D image of the 3D object and/or its motion track according to the ranging positioning information of the interactive remote control unit relative to the display surface and the position information of the first base point and the second base point;
  • the 3D image corresponding to the first base point and/or its motion track is a zero parallax image
  • the 3D image corresponding to the second base point and/or its motion trajectory is a negative parallax image
  • the absolute value of the negative parallax increases or decreases in the same direction as the distance value obtained in step S1.
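The relationship just stated, where the magnitude of the negative parallax varies in the same direction as the measured distance, could be realized by a mapping like the following minimal sketch. The linear form and the gain constant are illustrative assumptions; the patent specifies only that the two values move in the same direction.

```python
def negative_parallax(distance_m: float, gain_px_per_m: float = 12.0) -> float:
    """Illustrative mapping from the remote's measured distance to the
    display surface to a signed negative-parallax value in pixels.
    The linear rule and the gain are assumptions, not from the patent."""
    if distance_m < 0:
        raise ValueError("distance must be non-negative")
    # Negative sign: negative parallax makes the object appear in front
    # of the screen; its magnitude grows with the measured distance.
    return -gain_px_per_m * distance_m
```

Any monotone mapping would satisfy the stated same-direction requirement; the linear version is just the simplest choice.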
  • the interactive display system further includes a 3D glasses unit, and the 3D glasses unit includes the three-dimensional spatial positioning sensor;
  • the step S1 further includes: the three-dimensional spatial positioning sensor of the 3D glasses unit emits an infrared light signal to a display surface of the display unit, the interactive display system obtains the ranging positioning information of the 3D glasses unit relative to the display surface, and spatial position information between the interactive remote control unit and the 3D glasses unit is calculated;
  • the step S2 further includes: the system monitoring unit further acquires position information of the infrared light signal emitted by the 3D glasses unit on the display surface of the display unit;
  • the step S3 further includes: determining the second base point of the 3D image and/or its motion track according to the point at which the spatial virtual line connecting the 3D glasses unit and the top end (or other part) of the interactive remote control unit falls on the display screen.
  • the three-dimensional spatial positioning sensor and the interactive display system and the 3D image generating method of the present invention have the following beneficial effects:
  • the present invention obtains positioning information of a target surface by using the three-dimensional spatial positioning sensor, and lets the display system generate, from the positioning information of the interactive remote control unit relative to the display surface, a 3D image and/or motion track corresponding to the position of the interactive remote control unit, so that the user holding the interactive remote control unit sees a 3D image matched to where they stand; for example, a 3D object such as a bullet or dart emitted from the interactive remote control unit appears to move to the display surface, making the 3D effect more realistic.
  • FIG. 1 is a schematic diagram showing the principle of a three-dimensional spatial positioning sensor emitting an infrared light signal to a display surface for positioning according to an embodiment of the present invention
  • FIG. 2 is a schematic view showing the arrangement of the infrared emitters at an angle of the infrared light emitting unit of FIG. 1;
  • FIG. 3 is a schematic diagram of the principle of a ranging monitoring unit of the three-dimensional spatial positioning sensor, including a lens unit and a photodetector unit, in an embodiment of the present invention;
  • FIG. 4 is a schematic diagram of a ranging monitoring unit of the three-dimensional spatial positioning sensor, including a CMOS unit, according to an embodiment of the present invention;
  • FIG. 5 is a schematic diagram of an interactive remote control unit of an interactive display system in the embodiment of the present invention transmitting an infrared light signal to a display surface;
  • FIG. 6 is a schematic diagram of the interactive remote control unit and the 3D glasses unit of the interactive display system in an embodiment of the present invention emitting infrared light signals to the display surface;
  • FIG. 7 is a schematic diagram of the first base point obtained after the interactive remote control unit of FIG. 6 is transmitted to the display surface;
  • FIG. 8 is a schematic view showing a second base point obtained by offsetting from the first base point in FIG. 7;
  • FIG. 9 shows the actual left-eye and right-eye images generated according to the first base point and the second base point in FIG. 8;
  • FIG. 10 is a 3D image of the 3D effect seen when wearing the 3D glasses unit and looking at the actual images generated in FIG. 9;
  • FIG. 11 shows the actual left-eye and right-eye images of the first base point and the second base point generated according to the positioning information of the interactive remote control unit and the 3D glasses unit relative to the display surface and to each other;
  • FIG. 12 is a 3D image of the 3D effect seen when wearing the 3D glasses unit and looking at the actual images generated in FIG. 11.
  • a three-dimensional spatial positioning sensor 11 in a preferred embodiment of the present invention includes an infrared light emitting unit 111, a ranging monitoring unit 112, and an information processing unit 113.
  • the infrared light emitting unit 111 includes three infrared emitters.
  • the three infrared emitters emit three infrared light signals to different positions of the target surface.
  • the target surface is the display surface S of the display unit, and the display surface S may be a plane or a curved surface.
  • the ranging monitoring unit 112 acquires the time at which each infrared light signal, after being transmitted to the target surface, is reflected back to the ranging monitoring unit 112.
  • the information processing unit 113 obtains, according to the time difference between each infrared light signal being emitted by the infrared light emitting unit 111 and returning to the ranging monitoring unit 112, the length of the route the infrared light travels, thereby obtaining ranging positioning information relative to the target surface; the ranging positioning information includes at least three infrared light flight-distance values relative to the display surface S.
  • the emission time of the infrared light can be obtained by the system setting or by the ranging monitoring unit 112.
  • the optical axes of the three infrared light signals are arranged at predetermined spatial angles to one another, so the three beams diverge in space, and the three points where they strike the display surface S form a triangle. Combining this positioning information with the known angles of the at least three infrared beams of the three-dimensional spatial positioning sensor 11 relative to the display surface S, the position of the three-dimensional spatial positioning sensor 11 relative to the display surface S can be calculated accurately, for example the distance from the muzzle of an interactive remote control unit 1 such as a remote control gun to the target point, and the angle between the shooting axis and the display surface S.
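The triangulation just described can be sketched as follows: given three beam directions and the three measured flight distances, the hit points span a triangle, whose plane yields the sensor's perpendicular distance and tilt. The beam directions and the pose computation are illustrative assumptions; the patent states only that the three points form a triangle from which position and angle are calculated.

```python
import math

def surface_pose(directions, distances):
    """directions: three unit vectors of the beams from the sensor;
    distances: three measured flight distances in metres.
    Returns (perpendicular distance to the surface plane,
             tilt of the central beam axis from the plane normal, degrees).
    Illustrative sketch; assumes the surface is locally planar."""
    # Hit points on the display surface: distance along each beam.
    pts = [[d * u for u in vec] for vec, d in zip(directions, distances)]
    # Two edges of the triangle spanned by the three hit points.
    e1 = [b - a for a, b in zip(pts[0], pts[1])]
    e2 = [b - a for a, b in zip(pts[0], pts[2])]
    # Plane normal = cross product of the two edges, normalized.
    n = [e1[1] * e2[2] - e1[2] * e2[1],
         e1[2] * e2[0] - e1[0] * e2[2],
         e1[0] * e2[1] - e1[1] * e2[0]]
    norm = math.sqrt(sum(c * c for c in n))
    n = [c / norm for c in n]
    # Perpendicular distance from the sensor (at the origin) to the plane.
    dist = abs(sum(nc * pc for nc, pc in zip(n, pts[0])))
    # Tilt between the central beam axis and the plane normal.
    cosang = abs(sum(nc * ac for nc, ac in zip(n, directions[0])))
    return dist, math.degrees(math.acos(min(1.0, cosang)))
```

With the sensor aimed perpendicularly at a surface 2 m away, the computed distance is 2 m and the tilt is 0 degrees, matching the geometric intuition.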
  • the infrared light emitting unit 111 can also emit more than three infrared light signals to different positions on the target surface; the time difference between each signal being emitted by the infrared light emitting unit 111 and returning to the ranging monitoring unit 112 is obtained, yielding the ranging positioning information relative to the target surface.
  • the information processing unit 113 includes a modulation circuit unit 1131 for modulating the signals of the plurality of infrared emitters, and a timing circuit unit 1132 for calculating the time difference between each infrared light signal being emitted by the infrared light emitting unit 111 and returning to the ranging monitoring unit 112; the modulation circuit unit 1131 modulates the signals of the infrared emitters, and the information processing unit 113 calculates the time difference between the emission and return of each beam.
  • from these time differences, the length of each infrared ray's path from its exit point to the display surface S is obtained, which is used to calculate the ranging positioning information of the three-dimensional spatial positioning sensor 11 relative to the display surface S.
  • the ranging monitoring unit 112 includes a CMOS unit 1121 for collecting transmission and return of infrared light signals.
  • the CMOS unit 1121 has a lens. The emission time of the infrared light can be set by the system: when the light signal is emitted, the system records the time and sends it to the information processing unit 113. The infrared light strikes the display surface S and is reflected; the CMOS unit 1121 captures the returned infrared light signal and the time at which it returns to the ranging monitoring unit 112, and feeds this back to the information processing unit 113, which obtains from the time differences the lengths of the paths the infrared rays travel, thereby obtaining the ranging positioning information of the three-dimensional spatial positioning sensor 11 relative to the target surface.
  • the ranging monitoring unit 112 includes a lens unit, and a photodetector unit 1123.
  • the lens unit directs the infrared light reflected back from the display surface S to the photodetector unit 1123.
  • the photodetector unit 1123 collects the infrared light signal reflected by the display surface S, and feeds back the light information to the information processing unit 113.
  • the photodetector unit 1123 can also collect the initial emission of infrared light, which of course can also be collected by the system.
  • the infrared light emitting unit 111 further includes a dichroic prism unit 1111 disposed in the emission direction of the infrared light. The dichroic prism unit 1111 is plated with a transflective film that transmits part of each infrared light signal to the display surface S and reflects part of it to the ranging monitoring unit 112, which uses it to collect the start of the optical signal. After the infrared light passes through the dichroic prism unit 1111, one beam is reflected by the film to the ranging monitoring unit 112, and this light information is fed back to the information processing unit 113 to obtain the emission time; the other beam is transmitted to the display surface S and then reflected back to the ranging monitoring unit 112 to obtain the return time. From the time difference of each beam, the information processing unit 113 calculates the length of the infrared ray's path from its exit point to the display surface S, thereby calculating the ranging positioning information of the three-dimensional spatial positioning sensor 11 relative to the display surface S.
  • the interactive display system in a preferred embodiment of the present invention includes a display unit and an interactive remote control unit 1.
  • the display unit can be a display screen, a projector, etc.
  • the display surface S of the display unit is a display screen.
  • the interactive remote control unit 1 is held by the user in the hand; it is usually a game handle such as a remote control gun that interacts with the display unit, and a 3D image adapted to the position of the hand-held interactive remote control unit 1 is generated correspondingly, making the 3D image more realistic.
  • the interactive remote control unit 1 includes a three-dimensional spatial positioning sensor 11 and a first communication unit 12 for communicating with the display unit. After the interactive remote control unit 1 interacts with the display unit, the three-dimensional spatial positioning sensor 11 of the interactive remote control unit 1 emits an infrared light signal to the display surface S of the display unit, and obtains the ranging positioning information of the interactive remote control unit 1 relative to the display surface S.
  • the first communication unit 12 is communicably connected to the display unit, and transmits the ranging positioning information of the interactive remote control unit 1 to the display surface S to the display unit.
  • the interactive display system further includes a system monitoring unit 3 for acquiring the position information of the infrared light signals of the interactive remote control unit 1 on the display surface S and feeding it back to the display unit. After the infrared light reaches the display surface S, the system monitoring unit 3 acquires the position of the infrared light on the display surface S. As shown in FIG. 7, the captured infrared light signal appears as up to three spots on the screen. If an application needs to determine the optical axis of the interactive remote control unit 1, a calculation such as taking the midpoint of the three points, or other parameters, can be used, which is more convenient for application operations.
  • the interactive display system further includes a 3D glasses unit 2. The 3D glasses unit 2 is worn on the user's head and the interactive remote control unit 1 is held in the user's hand; the interaction among the interactive remote control unit 1, the 3D glasses unit 2 and the display unit generates a 3D image adapted to both the position of the remote control unit 1 and the position of the human eye, making the 3D image more realistic.
  • the 3D glasses unit 2 includes a three-dimensional spatial positioning sensor 11 and a second communication unit 22 for communication connection with the display unit.
  • the three-dimensional spatial positioning sensor 11 of the 3D glasses unit 2 emits an infrared light signal to the display surface S of the display unit, and obtains ranging positioning information of the 3D glasses unit 2 with respect to the display surface S.
  • the second communication unit 22 is communicably connected to the display unit, and transmits the ranging positioning information of the 3D glasses unit 2 relative to the display surface S to the display unit, and the ranging positioning information includes information such as the distance of the 3D glasses unit 2 with respect to the display surface S.
  • the system monitoring unit 3 acquires the position information of the infrared light signal of the 3D glasses unit 2 on the display surface S and feeds it back to the display unit.
  • the axis angle of the center of the 3D glasses unit 2 relative to the display surface S can be calculated by an algorithm, which is more convenient for subsequent applications.
  • the interactive display system further includes an algorithm unit, and the algorithm unit calculates the interactive remote control unit 1, the 3D glasses unit according to the ranging positioning information of the interactive remote control unit 1, the 3D glasses unit 2, and the position information of the infrared light emission on the display surface S. 2 spatial position relative to the display surface S and relative to the spatial position between each other.
  • the spatial position includes the distance and the angle with respect to the display surface S, and the position and the orientation of the interactive remote control unit 1, the 3D glasses unit 2 relative to the display surface S, and the like can be accurately obtained.
  • the system monitoring unit 3 includes a beam splitting prism 31 for splitting the infrared light of different wavelengths from the interactive remote control unit 1 and the 3D glasses unit 2, and two sensors 32 for respectively collecting the positions of the infrared light signals of the interactive remote control unit 1 and the 3D glasses unit 2. The sensors 32 can be CMOS or CCD chips; the infrared light of the interactive remote control unit 1 and the 3D glasses unit 2 is split onto the two sensors 32 respectively, so the spatial position information of the interactive remote control unit 1 and of the 3D glasses unit 2 can be obtained separately.
  • the display unit generates a corresponding 3D image according to the ranging positioning information of the interactive remote control unit 1 with respect to the display surface S and the position information of the infrared light signal on the display surface S. Let the user holding the interactive remote control unit 1 see the 3D image corresponding to its standing position, making the 3D effect more realistic.
  • the 3D image generation method on the display unit includes the following steps:
  • S1: the three-dimensional spatial positioning sensor 11 of the interactive remote control unit 1 emits an infrared light signal to the display surface S of the display unit, and the interactive display system obtains the ranging positioning information of the interactive remote control unit 1 relative to the display surface S;
  • the distance location information may include information such as the distance of the interactive remote control unit 1 with respect to the display surface S.
  • S2: the system monitoring unit 3 obtains the position information of the infrared light signal emitted by the interactive remote control unit 1 on the display surface S of the display unit, and determines the first base point K of the 3D image and/or its motion track according to the position information.
  • the first base point is the point at which the shooting axis of the interactive remote control unit 1, such as a remote control gun, extends to the display surface. In other embodiments it may be the center of the three infrared beams, or another location.
  • S3: the second base point J is obtained by offsetting from the first base point K on the display surface S, using the ranging positioning information of the interactive remote control unit 1 relative to the display surface S.
  • the interactive remote control unit 1 is held by the user, usually below eye level, and the display surface S is in front of the user, so the bullets, darts, etc. shot from the interactive remote control unit 1 should appear to come from below the display surface S. The resulting second base point J should therefore be located between the first base point K and the bottom edge of the display surface S.
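The offset from K to J described above could be computed as in the following sketch, which assumes screen coordinates with y increasing downward and a linear distance-to-offset rule; both the coordinate convention and the offset constant are illustrative assumptions, since the patent only requires J to lie between K and the bottom edge.

```python
def second_base_point(k_xy, distance_m, screen_height_px, offset_px_per_m=40.0):
    """Offset the first base point K downward toward the bottom edge of
    the display surface to obtain the second base point J.
    The linear offset rule and constants are illustrative assumptions."""
    kx, ky = k_xy
    # Offset grows with the remote's measured distance; clamp so J stays
    # on the display surface (between K and the bottom edge).
    jy = min(ky + offset_px_per_m * distance_m, screen_height_px - 1)
    return (kx, jy)
```

With K at (100, 200) on a 1080-pixel-high surface and the remote 2 m away, J lands at (100, 280), directly below K as the text requires.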
  • S4 Generate a 3D image of the 3D object and/or a motion track thereof according to the ranging positioning information of the interactive remote control unit 1 with respect to the display surface S and the position information of the first base point K and the second base point J.
  • the 3D image corresponding to the first base point K and/or its motion trajectory is a zero parallax image
  • the 3D image corresponding to the second base point J and/or its motion trajectory is a negative parallax image
  • the absolute value of the negative parallax increases or decreases in the same direction as the distance value measured in step S1.
  • the generated 3D object image and/or its motion track are the actual images or motion tracks seen by the left eye and the right eye, drawn on the two sides of the line segment connecting the first base point K and the second base point J. The image at the first base point K has zero parallax, the image below the first base point K has negative parallax, and the image above the first base point K has positive parallax.
  • as shown in FIG. 10, when the 3D glasses are worn, a 3D image with a 3D effect is seen: a 3D object such as a bullet or dart appears to be launched from near the direction of the interactive remote control unit 1 toward the display surface S, with a motion track moving to the display surface S.
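The left-eye/right-eye rendering described above amounts to shifting the object horizontally in opposite directions for the two eyes by half the signed parallax. The split convention below is an illustrative assumption; the patent does not prescribe a particular sign convention.

```python
def eye_offsets(parallax_px: float):
    """Split a signed parallax into (left-eye, right-eye) horizontal
    pixel shifts. With this convention, negative parallax crosses the
    two images (left image drawn to the right of the right image), so
    the object appears in front of the screen; zero parallax places it
    on the screen plane. Convention is an illustrative assumption."""
    half = parallax_px / 2.0
    return (-half, +half)  # (left-eye x shift, right-eye x shift)
```

A renderer would draw the object at the base point's x coordinate plus the respective shift for each eye's image, reproducing the zero/negative/positive parallax zones described for points at, below, and above K.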
  • when the interactive display system further includes the 3D glasses unit 2, the operation steps include:
  • Step S1 further includes: the three-dimensional spatial positioning sensor 11 of the 3D glasses unit 2 emits an infrared light signal to the display surface S of the display unit, and the interactive display system obtains ranging positioning information of the 3D glasses unit 2 relative to the display surface S. And calculating spatial position information between the interactive remote control unit 1 and the 3D glasses unit 2;
  • Step S2 further includes: the system monitoring unit 3 obtains the position information, on the display surface S of the display unit, of the infrared light signal emitted by the 3D glasses unit 2, and thereby obtains the positional relationship among the interactive remote control unit 1, the 3D glasses unit 2, and the display surface S.
  • Step S3 further includes: after the spatial positions of the interactive remote control unit 1 and the 3D glasses unit 2 are obtained, the point where the virtual spatial line from the 3D glasses unit 2 through the top end (or another part) of the interactive remote control unit 1 extends to meet the display screen determines the second base point J of the 3D image and/or its motion trajectory.
  • the generated 3D image of the 3D object and/or its motion trajectory comprises the left-eye and right-eye images actually seen on either side of the line segment connecting the first base point K and the second base point J; in the image or motion trajectory, the image at the first base point K has zero parallax, the image below the first base point K has negative parallax, and the image above the first base point K has positive parallax.
  • the image the user sees after wearing the 3D glasses is, for example, a bullet emitted directly from the muzzle of the gun-shaped interactive remote control unit 1 and shot far into the distance through the display surface S pointed at by the muzzle, until it disappears. The 3D-effect 3D object and its motion trajectory will not be offset from the interactive remote control unit 1, so the visual effect is more realistic.

Abstract

Disclosed are a three-dimensional (3D) positioning sensor (11), an interactive display system, and a 3D image generation method. The 3D positioning sensor (11) comprises: an infrared emitter (111) emitting at least three infrared signals toward a target surface; a ranging detection unit (112) acquiring the times at which the reflected infrared signals reach the ranging detection unit (112); and an information processing unit (113) generating, according to the time difference between the emission and return of each infrared beam, ranging and positioning information about the target surface. By obtaining ranging and positioning information about a target surface, the 3D positioning sensor (11) enables a display system to generate, according to the ranging and positioning information of an interactive remote control unit relative to a display surface (S) and the position information of an infrared beam on the display surface (S), a corresponding 3D image and/or a motion trajectory thereof relative to the interactive remote control unit, so that a user holding the interactive remote control unit can see the 3D image corresponding to his or her standing position, for example a 3D object such as a bullet emitted from the interactive remote control unit toward the display surface (S), or the motion trajectory of a 3D object such as a dart toward the display surface (S), thereby providing a more realistic 3D experience.
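The abstract describes ranging from the time difference between emitting an infrared beam and receiving its reflection, i.e. a time-of-flight measurement. A minimal sketch of that computation follows (the round-trip times are hypothetical example values; the function name is illustrative):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_s):
    """Distance to the target surface from a round-trip time: the beam
    travels the sensor-to-surface distance twice, hence the division by 2."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# The sensor emits at least three beams at different angles; three
# round-trip times yield three distances to the target surface, from
# which the sensor's position and orientation relative to the surface
# can be resolved (hypothetical times, in seconds).
times = [20e-9, 22e-9, 21e-9]
distances = [tof_distance(t) for t in times]
```

A 20 ns round trip corresponds to roughly 3 m, which is why nanosecond-scale timing resolution is required for room-scale ranging of this kind.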
PCT/CN2016/072209 2016-01-26 2016-01-26 Capteur de positionnement tridimensionnel, système d'affichage interactif, et procédé de génération d'image 3d WO2017128046A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/072209 WO2017128046A1 (fr) 2016-01-26 2016-01-26 Capteur de positionnement tridimensionnel, système d'affichage interactif, et procédé de génération d'image 3d

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/072209 WO2017128046A1 (fr) 2016-01-26 2016-01-26 Capteur de positionnement tridimensionnel, système d'affichage interactif, et procédé de génération d'image 3d

Publications (1)

Publication Number Publication Date
WO2017128046A1 true WO2017128046A1 (fr) 2017-08-03

Family

ID=59397123

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/072209 WO2017128046A1 (fr) 2016-01-26 2016-01-26 Capteur de positionnement tridimensionnel, système d'affichage interactif, et procédé de génération d'image 3d

Country Status (1)

Country Link
WO (1) WO2017128046A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020175897A1 (en) * 1999-08-27 2002-11-28 Pelosi Michael J. 3D cursor or joystick device
CN1811665A (zh) * 2005-01-24 2006-08-02 原相科技股份有限公司 个人电脑的互动式输入装置及游戏***设备
CN101376057A (zh) * 2007-08-27 2009-03-04 原相科技股份有限公司 具警示功能的交互式游戏方法与***
CN101676682A (zh) * 2008-09-19 2010-03-24 鸿富锦精密工业(深圳)有限公司 遥控感测***
CN102008823A (zh) * 2009-04-26 2011-04-13 艾利维公司 控制视频游戏中物体移动的方法和***
CN102935288A (zh) * 2012-10-31 2013-02-20 深圳市德力信科技有限公司 一种人机互动游戏实现装置及方法
CN103152905A (zh) * 2011-12-06 2013-06-12 松下电器产业株式会社 照明***
CN103801076A (zh) * 2012-11-09 2014-05-21 西安景行数创信息科技有限公司 一种互动飞碟射击游戏***
CN104548599A (zh) * 2013-10-29 2015-04-29 西安景行数创信息科技有限公司 一种利用投影技术和感应技术的游戏装置

Similar Documents

Publication Publication Date Title
US9322654B2 (en) Laser tracker with a target sensing unit for target tracking and orientation detection
KR101902283B1 (ko) 물체의 3차원 정보 획득을 위한 카메라 센싱 장치 및 이를 이용한 가상 골프 시뮬레이션 장치
CN106550228B (zh) 获取三维场景的深度图的设备
US9132342B2 (en) Dynamic environment and location based augmented reality (AR) systems
CN110178156A (zh) 包括可调节焦距成像传感器的距离传感器
CN104685541A (zh) 用于确定三维对象上注视点的方法和装置
US10962631B2 (en) Method for operating a laser distance measuring device
CN104634276A (zh) 三维测量***、拍摄设备和方法、深度计算方法和设备
KR101920473B1 (ko) 센서 융합 기반으로 3차원 위치와 방향을 추정하는 장치 및 방법
US10222176B2 (en) Simulated gun shooting and target position sensing apparatus and method
CN206541271U (zh) 一种光学定位***和虚拟现实***
CN109068033A (zh) 景深摄像模组
JP2019009592A (ja) 遠隔通信方法、遠隔通信システム及び自律移動装置
CN108195305A (zh) 一种双目检测***及其深度检测方法
CN103499335A (zh) 一种三维测距方法及其装置
CN110095783A (zh) Vr测距信息展示装置及vr测距模组
CN106483529A (zh) 一种光学***
KR102527207B1 (ko) 촬영 장치 및 물체의 거동 산출 장치
WO2017128046A1 (fr) Capteur de positionnement tridimensionnel, système d'affichage interactif, et procédé de génération d'image 3d
JP4907564B2 (ja) 測距双眼鏡
KR101977307B1 (ko) 항공 사격 채점 시스템 및 방법
JP7061304B2 (ja) ゴルフショット撮影装置、ゴルフショット解析システム及びボールスピン計測装置
TWI220156B (en) Optical range-finder
CN105717487A (zh) 三维空间定位传感器及互动显示***和3d图像生成方法
CN211043667U (zh) Vr激光测距仪

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16886933

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16886933

Country of ref document: EP

Kind code of ref document: A1