WO2022095605A1 - Handle control tracker - Google Patents

Handle control tracker

Info

Publication number
WO2022095605A1
Authority
WO
WIPO (PCT)
Prior art keywords: light, emitting, marks, handle, mark
Prior art date
Application number
PCT/CN2021/118171
Other languages
English (en)
French (fr)
Inventor
吴涛
Original Assignee
青岛小鸟看看科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛小鸟看看科技有限公司
Publication of WO2022095605A1
Priority to US17/877,219 (US11712619B2)

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/23 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F13/235 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082 Virtual reality

Definitions

  • Embodiments of the present disclosure relate to the technical field of virtual reality, and in particular, to a handle control tracker.
  • VR (Virtual Reality)
  • AR (Augmented Reality)
  • MR (Mixed Reality)
  • an electromagnetic transmitter can be embedded in the handle and an electromagnetic receiver in the VR headset, so that the position and attitude of the handle in three-dimensional space can be computed in real time on the principle of electromagnetic tracking;
  • an ultrasonic transmitter can be embedded in the handle and an ultrasonic receiver in the VR headset, so that the position and attitude of the handle in three-dimensional space can be computed in real time on the principle of ultrasonic tracking.
  • however, the electromagnetic sensor of the handle is sensitive to electromagnetic signals in the environment and is easily disturbed by complex ambient signals, causing it to produce erroneous electromagnetic tracking data for the handle.
  • when the electromagnetic sensor of the handle is relatively close to a desktop computer, or to audio equipment, a TV, a refrigerator or the like, the handle's tracking performance degrades under the influence of the other electromagnetic signals; the use of handles with electromagnetic sensors is therefore quite limited.
  • for the same reason, the use of handles with ultrasonic sensors is also limited.
  • a handle control tracker including:
  • the light-emitting unit is disposed at the end of the handle body and forms a preset angle with the handle body;
  • the light-emitting unit includes a first surface, a second surface, a plurality of first light-emitting marks and a plurality of second light-emitting marks, the second surface covering the first surface; the first light-emitting marks and the second light-emitting marks are all arranged on the first surface, and the plurality of first light-emitting marks are distributed in a ring;
  • the first light-emitting marks and the second light-emitting marks are configured to be lit for capture by an imaging device
  • the number of the first light-emitting marks is seventeen
  • the number of the second light-emitting marks is two
  • the two second light-emitting marks are symmetrically arranged among the seventeen first light-emitting marks.
  • the number of the first light-emitting marks is twenty
  • the number of the second light-emitting marks is two
  • the second light-emitting marks are strip-shaped
  • one of the second light-emitting marks is arranged on the upper edge of the first surface
  • the other second light-emitting mark is arranged on the lower edge of the first surface
  • the twenty first light-emitting marks are distributed between the two second light-emitting marks.
  • the preset angle is 40°-135°.
  • the wavelength of the first light-emitting marks is in the range of 450nm-690nm.
  • the first surface and the second surface are both annular.
  • the ratio of the radius of the first surface to the radius of the second surface is 1:1.5.
  • a plurality of the first light-emitting marks are connected in series.
  • a plurality of the second light-emitting marks are connected in series.
  • a wireless transmission module is arranged on the handle body.
  • a sensor, arranged on the handle body.
  • the blinking frequency of the first light-emitting marks is 30Hz.
  • the lighting time of the first light-emitting marks and the second light-emitting marks is 15μs-100μs.
  • the beneficial effect of the technical solutions of the embodiments of the present disclosure is that, by capturing the first and second light-emitting marks on the handle control tracker with an imaging device, the position and attitude of the handle control tracker in three-dimensional space can be computed accurately and in real time, with very simple operation. Moreover, the capture process is unaffected by electromagnetic and ultrasonic signals in the surrounding environment, so the tracker has a wide range of applications.
  • FIG. 1 is a schematic structural diagram of a handle control tracker provided by an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of the first surface in FIG. 1 when unfolded
  • FIG. 3 is a schematic structural diagram of a handle control tracker provided by another embodiment of the present disclosure.
  • FIG. 4 is a schematic view of the first surface of FIG. 3 when unfolded.
  • this embodiment provides a handle control tracker. It makes it convenient for the imaging device to capture the first light-emitting marks 3 and second light-emitting marks 4 on the handle control tracker, so that the position and attitude of the handle control tracker in three-dimensional space can be computed accurately and in real time.
  • the handle control tracker includes a handle body 1 and a light-emitting unit 2; the light-emitting unit 2 is disposed at the end of the handle body 1 and forms a preset angle with the handle body 1; the light-emitting unit 2 includes a first surface, a second surface, a plurality of first light-emitting marks 3 and a plurality of second light-emitting marks 4, the second surface covering the first surface; the first light-emitting marks 3 and the second light-emitting marks 4 are all disposed on the first surface, and the plurality of first light-emitting marks 3 are distributed in a ring; the first light-emitting marks 3 and the second light-emitting marks 4 are configured to be lit for capture by an imaging device; the first light-emitting marks 3 light up during a first time period, and the second light-emitting marks 4 light up during a second time period.
  • the imaging device consists of two or more tracking cameras built into the VR headset, configured to capture in real time the first light-emitting marks 3 and second light-emitting marks 4 on the optical handle control tracker.
  • by detecting the image region in which the second light-emitting marks 4 appear, the approximate position of the spots of the first light-emitting marks 3 in the image can be determined; within this region, a computer image-processing algorithm then detects in real time the two-dimensional coordinates of the spots corresponding to the first light-emitting marks 3 and second light-emitting marks 4 in the tracking camera image.
  • a computer vision algorithm matches each first light-emitting mark 3 and second light-emitting mark 4 on the handle control tracker to its two-dimensional coordinates in the image.
  • finally, a PnP (Perspective-n-Point) pose-solving algorithm iteratively optimizes the attitude and position of the handle control tracker relative to the current three-dimensional environment.
  • the placement rules of the first and second light-emitting marks on the light-emitting unit improve tracking stability when the light sources in the surrounding environment are complex, so that the six-degree-of-freedom pose tracking of the handle control tracker is more stable and accurate.
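The detect-then-solve pipeline described above (find bright spots, read off their 2D image coordinates, then feed the matched 2D-3D correspondences to a PnP solver) can be sketched for its first stage. This is a minimal illustration, not the patent's actual algorithm: it assumes a plain brightness threshold and connected-component grouping for spot detection; the matching and PnP steps would typically be delegated to a library solver such as OpenCV's `solvePnP`.

```python
import numpy as np

def detect_spots(image, threshold=200):
    """Return the 2D centroids of bright spots in a grayscale image.

    A minimal stand-in for LED spot detection: threshold the image,
    group bright pixels into 4-connected components, and return each
    component's centroid as (x, y) image coordinates.
    """
    mask = image >= threshold
    labels = np.zeros(image.shape, dtype=int)
    next_label = 0
    centroids = []
    for y in range(image.shape[0]):
        for x in range(image.shape[1]):
            if mask[y, x] and labels[y, x] == 0:
                next_label += 1
                stack = [(y, x)]
                pixels = []
                labels[y, x] = next_label
                while stack:  # flood-fill one connected component
                    cy, cx = stack.pop()
                    pixels.append((cx, cy))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < image.shape[0] and 0 <= nx < image.shape[1]
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = next_label
                            stack.append((ny, nx))
                xs = [p[0] for p in pixels]
                ys = [p[1] for p in pixels]
                centroids.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return centroids

# Two synthetic 3x3 "LED spots" on a dark frame.
frame = np.zeros((40, 60), dtype=np.uint8)
frame[10:13, 20:23] = 255
frame[30:33, 50:53] = 255
print(sorted(detect_spots(frame)))  # [(21.0, 11.0), (51.0, 31.0)]
```

The centroids produced here are the two-dimensional spot coordinates that the patent's matching and PnP stages would consume.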
  • the number of the first light-emitting marks 3 is seventeen
  • the number of the second light-emitting marks 4 is two
  • seventeen first light-emitting marks 3 and two second light-emitting marks 4 of a certain geometric shape coexist on one circuit board; the seventeen first light-emitting marks 3 are connected to each other in series, and the two second light-emitting marks 4 are connected to each other in series.
  • the seventeen first light-emitting marks 3 and the two second light-emitting marks 4 of a certain geometric shape are lit simultaneously according to a certain duty cycle, with a lighting time in the range of 15μs-100μs.
  • once the hardware and the control parameters of the first light-emitting marks 3 and second light-emitting marks 4 are fixed, their lighting time must be the same from frame to frame.
  • the number of the first light-emitting marks 3 is twenty
  • the number of the second light-emitting marks 4 is two
  • the second light-emitting marks 4 are strip-shaped; one second light-emitting mark 4 is arranged on the upper edge of the first surface, the other on the lower edge of the first surface, and the twenty first light-emitting marks 3 are distributed between the two second light-emitting marks 4.
  • the physical distribution on the first surface of the twenty first light-emitting marks 3 of the left-hand handle control tracker and the physical distribution on the first surface of the twenty first light-emitting marks 3 of the right-hand handle control tracker can be made symmetric according to a certain symmetry rule.
  • the preset angle is 40°-135°. This not only conforms to the user's habits, making the handle body 1 convenient to hold, but also facilitates capture of the first light-emitting marks 3 and second light-emitting marks 4 by the imaging device.
  • the wavelength of the first light-emitting marks 3 is in the range of 450nm-690nm, which facilitates their capture by the imaging device.
  • in this embodiment, the first light-emitting marks 3 operate in the infrared band, at a wavelength of 850nm or 940nm.
  • the shapes of the first surface and the second surface are both annular.
  • the ratio of the radius of the first surface to the radius of the second surface is 1:1.5. This facilitates the arrangement of the first light-emitting mark 3 and the second light-emitting mark 4 between the first surface and the second surface, and facilitates the protection of the first light-emitting mark 3 and the second light-emitting mark 4 by the second surface.
  • a plurality of the first light-emitting marks 3 are connected in series.
  • the plurality of first light-emitting marks 3 are arranged on the same circuit board, which makes it convenient to light them simultaneously and hence for the imaging device to capture them.
  • a plurality of the second light-emitting marks 4 are connected in series.
  • the plurality of second light-emitting marks 4 are arranged on the same circuit board, which makes it convenient to light them simultaneously and hence for the imaging device to capture them.
  • a wireless transmission module is also included, arranged on the handle body 1.
  • the handle control tracker has a wireless transmission module, and a wireless receiving module is built into the VR headset. Through the control processing unit built into the VR headset, the blinking frequency of every first light-emitting mark 3 and every second light-emitting mark 4 of the tracker is synchronized with the exposure shutter of the tracking cameras built into the VR headset;
  • that is, during the exposure window of each frame of the headset's built-in tracking cameras, the first light-emitting marks 3 and second light-emitting marks 4 on the handle control tracker light up.
  • considering handle power consumption and the actual usage environment, the lighting time of the first light-emitting marks 3 and second light-emitting marks 4 is generally about 15μs-100μs.
  • for synchronization accuracy, the exposure time of the tracking cameras is generally about 30μs-150μs, so that the lighting period of the first and second light-emitting marks is fully captured within the exposure window; the capture frequency of the tracking cameras is set to 60Hz.
  • to prevent the first and second light-emitting marks from lighting simultaneously and forming overlapping or merged spots in the tracking camera image, they are lit separately. For example, the second light-emitting marks 4 light up in the odd-numbered frames captured by the tracking cameras, and the first light-emitting marks 3 light up in the even-numbered frames; both the first light-emitting marks 3 and the second light-emitting marks 4 may be LEDs.
  • using the physical characteristics of the second light-emitting marks 4, their region in the image is detected and tracked in real time on the odd-numbered frames; this rectangular region is where the spots of the first light-emitting marks 3 of the handle control tracker lie.
  • to account for the movement speed of the handle, on the even-numbered frames the rectangle corresponding to the second light-emitting marks 4 in the preceding odd-numbered frame is expanded by 1.5 times along both its long and short sides.
  • this yields the image region in which the spots of the first light-emitting marks 3 lie in the current frame; spot detection of the first light-emitting marks 3 is performed only inside this region and not outside it, which greatly improves the accuracy and stability of the spot detection, reduces its complexity, and improves detection efficiency to a certain extent. It should be noted that the factor of 1.5 was determined by actual tests as the maximum frame-to-frame displacement corresponding to the tracker's maximum movement speed at the 60Hz tracking camera frequency.
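The 1.5× search-region expansion between an odd frame and the following even frame can be expressed directly. A minimal sketch, assuming a center-plus-size rectangle representation and an example image resolution (the patent fixes only the 1.5 factor):

```python
def expand_roi(cx, cy, w, h, factor=1.5, img_w=1280, img_h=800):
    """Expand a detection rectangle around its center by `factor` along
    both the long and the short side, clamped to the image bounds.

    (cx, cy) is the rectangle center, (w, h) its width and height.
    Returns the expanded rectangle as (x0, y0, x1, y1) pixel bounds,
    i.e. the region in which the ring-mark spots are searched for in
    the current even frame.
    """
    new_w, new_h = w * factor, h * factor
    x0 = max(0, cx - new_w / 2)
    y0 = max(0, cy - new_h / 2)
    x1 = min(img_w, cx + new_w / 2)
    y1 = min(img_h, cy + new_h / 2)
    return x0, y0, x1, y1

# Region of the second marks found in the previous odd frame:
roi = expand_roi(cx=640, cy=400, w=200, h=100)
print(roi)  # (490.0, 325.0, 790.0, 475.0)
```

Pixels outside the returned bounds would simply be skipped by the spot detector, which is where the complexity saving described above comes from.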
  • since the flicker sensitivity of the human eye is above 50Hz, light flickering above that rate is not perceived.
  • however, the spots of the first and second light-emitting marks are captured by the VR headset's tracking cameras at 30Hz; that is, following the steps above, the blinking frequency of the first light-emitting marks 3 and second light-emitting marks 4 on the handle control tracker is 30Hz, which the human eye can perceive as flicker.
  • therefore, after the tracker and the headset's tracking cameras are synchronized wirelessly at device start-up, in the interval from the start of each camera frame until that frame's exposure shutter opens, the first and second light-emitting marks are each lit twice more, each lighting again lasting 15μs-100μs; at 33 frames per second this makes the first light-emitting marks 3 and second light-emitting marks 4 of the handle control tracker light up 99 times per second, above the flicker-sensitivity range of the human eye.
  • the lighting time of all first light-emitting marks 3 and second light-emitting marks 4 is the same in every frame, which helps the imaging device capture them accurately.
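The flicker arithmetic above (one exposure-synchronized lighting plus two extra lightings per frame) can be checked numerically. A small sketch; the lighting count per frame is taken from the text, the function name is illustrative:

```python
def perceived_flash_rate(frame_rate_hz, lightings_per_frame):
    """Total number of LED lightings per second as seen by the eye."""
    return frame_rate_hz * lightings_per_frame

# One lighting inside the camera exposure window plus two extra
# lightings before the shutter opens, per frame, at 33 frames/s:
rate = perceived_flash_rate(33, 3)
print(rate)  # 99 flashes per second, above the ~50 Hz flicker threshold
```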
  • the tracking distance of the handle control tracker can range from 3cm to 150cm, which helps users better interact with virtual reality, augmented reality and mixed reality through the handle control tracker.
  • the sensor is arranged on the handle body 1 .
  • an IMU inertial navigation sensor is embedded in the handle control tracker.
  • the sensor is at least a six-axis sensor, i.e., an accelerometer unit and a gyroscope unit; it may also be a nine-axis sensor, i.e., an accelerometer unit, a gyroscope unit and a geomagnetic unit.
  • the output frequency of the IMU inertial navigation sensor is at least 90Hz.
  • a wireless network transmission unit is embedded in the handle control tracker, and the transmission protocol can be 2.4G network protocol or BLE (Bluetooth Low Energy) protocol.
  • the wireless transmission frequency is 100 Hz or more.
  • through the IMU inertial navigation sensor, the six-degree-of-freedom information of the handle control tracker can be obtained accurately and in real time at high frequency.
  • the blinking frequency of the first light-emitting marks 3 is 30Hz, which facilitates their capture by the imaging device.
  • the lighting time of the first light-emitting marks 3 and second light-emitting marks 4 is 15μs-100μs, which helps the imaging device capture them accurately.
  • by capturing the first light-emitting marks 3 and second light-emitting marks 4 on the handle control tracker with the imaging device, the position and attitude of the handle control tracker in three-dimensional space can be computed accurately and in real time, with very simple operation. Moreover, the capture process is unaffected by electromagnetic and ultrasonic signals in the surrounding environment, so the tracker has a wide range of applications.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present disclosure relates to a handle control tracker, including a handle body and a light-emitting unit; the light-emitting unit is disposed at an end of the handle body and forms a preset angle with the handle body; the light-emitting unit includes a first surface, a second surface, a plurality of first light-emitting marks and a plurality of second light-emitting marks, the second surface covering the first surface; the first light-emitting marks and the second light-emitting marks are all disposed on the first surface, and the plurality of first light-emitting marks are distributed in a ring; the first light-emitting marks and the second light-emitting marks are configured to be lit for capture by an imaging device; the first light-emitting marks light up during a first time period, and the second light-emitting marks light up during a second time period.

Description

Handle Control Tracker
The present disclosure claims priority to the Chinese patent application filed with the Chinese Patent Office on November 9, 2020, with application number 202011242190.X and entitled "Handle Control Tracker" (一种手柄控制追踪器), the entire content of which is incorporated herein by reference.
Technical Field
Embodiments of the present disclosure relate to the technical field of virtual reality, and in particular to a handle control tracker.
Background
With the development of VR (Virtual Reality), AR (Augmented Reality) and MR (Mixed Reality) technologies, the handle plays an increasingly important role. Through the handle, users interact with virtual reality, augmented reality and mixed reality scenes.
At present, an electromagnetic transmitter can be embedded in the handle and an electromagnetic receiver in the VR headset, and the position and attitude of the handle in three-dimensional space computed in real time on the principle of electromagnetic tracking; alternatively, an ultrasonic transmitter can be embedded in the handle and an ultrasonic receiver in the VR headset, and the position and attitude of the handle in three-dimensional space computed in real time on the principle of ultrasonic tracking.
However, the electromagnetic sensor of the handle is sensitive to electromagnetic signals in the environment and is easily disturbed by complex ambient signals, causing it to produce erroneous electromagnetic tracking data for the handle. For example, when the handle's electromagnetic sensor is relatively close to a desktop computer, or to audio equipment, a TV, a refrigerator or the like, tracking performance degrades under the influence of the other electromagnetic signals; the use of handles with electromagnetic sensors is therefore quite limited. For the same reason, the use of handles with ultrasonic sensors is also limited.
Summary
An object of the embodiments of the present disclosure is to provide an improved handle control tracker.
According to an embodiment of the present disclosure, a handle control tracker is provided, including:
a handle body;
a light-emitting unit, disposed at an end of the handle body and forming a preset angle with the handle body;
wherein the light-emitting unit includes a first surface, a second surface, a plurality of first light-emitting marks and a plurality of second light-emitting marks, the second surface covering the first surface; the first light-emitting marks and the second light-emitting marks are all disposed on the first surface, and the plurality of first light-emitting marks are distributed in a ring;
the first light-emitting marks and the second light-emitting marks are configured to be lit for capture by an imaging device; and
the first light-emitting marks light up during a first time period, and the second light-emitting marks light up during a second time period.
Optionally, the number of the first light-emitting marks is seventeen, the number of the second light-emitting marks is two, and the two second light-emitting marks are symmetrically arranged among the seventeen first light-emitting marks.
Optionally, the number of the first light-emitting marks is twenty, the number of the second light-emitting marks is two, and the second light-emitting marks are strip-shaped; one second light-emitting mark is disposed on the upper edge of the first surface, the other on the lower edge of the first surface, and the twenty first light-emitting marks are distributed between the two second light-emitting marks.
Optionally, the preset angle is 40°-135°.
Optionally, the wavelength of the first light-emitting marks is in the range of 450nm-690nm.
Optionally, the first surface and the second surface are both annular.
Optionally, the ratio of the radius of the first surface to the radius of the second surface is 1:1.5.
Optionally, the plurality of first light-emitting marks are connected in series.
Optionally, the plurality of second light-emitting marks are connected in series.
Optionally, the tracker further includes:
a wireless transmission module, disposed on the handle body.
Optionally, the tracker further includes:
a sensor, disposed on the handle body.
Optionally, the blinking frequency of the first light-emitting marks is 30Hz.
Optionally, the lighting time of the first light-emitting marks and the second light-emitting marks is 15μs-100μs.
A beneficial effect of the technical solutions of the embodiments of the present disclosure is that, by capturing the first and second light-emitting marks on the handle control tracker with an imaging device, the position and attitude of the handle control tracker in three-dimensional space can be computed accurately and in real time, with very simple operation. Moreover, the capture process is unaffected by electromagnetic and ultrasonic signals in the surrounding environment, so the tracker has a wide range of applications.
Other features and advantages of the present disclosure will become clear from the following detailed description of exemplary embodiments with reference to the accompanying drawings.
Brief Description of the Drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the present disclosure and, together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a schematic structural diagram of a handle control tracker provided by an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of the first surface in FIG. 1 when unfolded;
FIG. 3 is a schematic structural diagram of a handle control tracker provided by another embodiment of the present disclosure;
FIG. 4 is a schematic diagram of the first surface in FIG. 3 when unfolded.
In the figures: 1, handle body; 2, light-emitting unit; 3, first light-emitting mark; 4, second light-emitting mark.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, the numerical expressions and the values set forth in these embodiments do not limit the scope of the present disclosure.
The following description of at least one exemplary embodiment is merely illustrative and in no way limits the present disclosure or its application or use.
Techniques, methods and devices known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate they should be regarded as part of the specification.
In all examples shown and discussed here, any specific value should be interpreted as merely exemplary, not limiting. Other examples of the exemplary embodiments may therefore have different values.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; once an item has been defined in one drawing, it need not be discussed further in subsequent drawings.
As shown in FIGS. 1 to 4, this embodiment provides a handle control tracker. It allows the imaging device to capture the first light-emitting marks 3 and second light-emitting marks 4 on the handle control tracker, so that the position and attitude of the handle control tracker in three-dimensional space can be computed accurately and in real time.
Specifically, the handle control tracker includes a handle body 1 and a light-emitting unit 2. The light-emitting unit 2 is disposed at the end of the handle body 1 and forms a preset angle with the handle body 1; the light-emitting unit 2 includes a first surface, a second surface, a plurality of first light-emitting marks 3 and a plurality of second light-emitting marks 4, the second surface covering the first surface; the first light-emitting marks 3 and the second light-emitting marks 4 are all disposed on the first surface, and the plurality of first light-emitting marks 3 are distributed in a ring; the first light-emitting marks 3 and the second light-emitting marks 4 are configured to be lit for capture by an imaging device; the first light-emitting marks 3 light up during a first time period, and the second light-emitting marks 4 light up during a second time period.
In this embodiment, the imaging device consists of two or more tracking cameras built into the VR headset, configured to capture in real time the first light-emitting marks 3 and second light-emitting marks 4 on the optical handle control tracker. By detecting the image region in which the second light-emitting marks 4 appear, the approximate position of the spots of the first light-emitting marks 3 in the image can be determined; within this region, a computer image-processing algorithm detects in real time the two-dimensional coordinates of the spots corresponding to the first and second light-emitting marks in the tracking camera image. A computer vision algorithm then matches each first light-emitting mark 3 and second light-emitting mark 4 on the handle control tracker to its two-dimensional image coordinates; finally, a PnP pose-solving algorithm iteratively optimizes the attitude and position of the handle control tracker relative to the current three-dimensional environment.
The placement rules of the first light-emitting marks 3 and second light-emitting marks 4 on the light-emitting unit 2 give the handle controller tracking stability even when the light sources in the surrounding environment are complex, so that the six-degree-of-freedom pose tracking of the handle control tracker is more stable and accurate.
Optionally, as shown in FIGS. 1 and 2, the number of the first light-emitting marks 3 is seventeen, the number of the second light-emitting marks 4 is two, and the two second light-emitting marks 4 are symmetrically arranged among the seventeen first light-emitting marks 3.
In this embodiment, the seventeen first light-emitting marks 3 and the two second light-emitting marks 4 of a certain geometric shape coexist on one circuit board; the seventeen first light-emitting marks 3 are connected to each other in series, and the two second light-emitting marks 4 are connected to each other in series.
In addition, the seventeen first light-emitting marks 3 and the two second light-emitting marks 4 of a certain geometric shape are lit simultaneously according to a certain duty cycle, with a lighting time in the range of 15μs-100μs; once the hardware and the control parameters of the first light-emitting marks 3 and second light-emitting marks 4 are fixed, their lighting time must be the same from frame to frame.
Optionally, as shown in FIGS. 3 and 4, the number of the first light-emitting marks 3 is twenty, the number of the second light-emitting marks 4 is two, and the second light-emitting marks 4 are strip-shaped; one second light-emitting mark 4 is disposed on the upper edge of the first surface, the other on the lower edge of the first surface, and the twenty first light-emitting marks 3 are distributed between the two second light-emitting marks 4.
In this embodiment, the physical distribution on the first surface of the twenty first light-emitting marks 3 of the left-hand handle control tracker and that of the twenty first light-emitting marks 3 of the right-hand handle control tracker can be made symmetric according to a certain symmetry rule.
Optionally, the preset angle is 40°-135°. This not only conforms to the user's habits, making the handle body 1 convenient to hold, but also facilitates capture of the first light-emitting marks 3 and second light-emitting marks 4 by the imaging device.
Optionally, the wavelength of the first light-emitting marks 3 is in the range of 450nm-690nm, which facilitates their capture by the imaging device.
In this embodiment, the first light-emitting marks 3 operate in the infrared band, at a wavelength of 850nm or 940nm.
Optionally, the first surface and the second surface are both annular.
Optionally, the ratio of the radius of the first surface to the radius of the second surface is 1:1.5. This facilitates arranging the first light-emitting marks 3 and second light-emitting marks 4 between the first surface and the second surface, and allows the second surface to protect them.
Optionally, the plurality of first light-emitting marks 3 are connected in series. They are arranged on the same circuit board, which makes it convenient to light them simultaneously and hence for the imaging device to capture them.
Optionally, the plurality of second light-emitting marks 4 are connected in series. They are arranged on the same circuit board, which makes it convenient to light them simultaneously and hence for the imaging device to capture them.
Optionally, a wireless transmission module is further included, disposed on the handle body 1.
In this embodiment, the handle control tracker has a wireless transmission module, and a wireless receiving module is built into the VR headset. Through the control processing unit built into the VR headset, the blinking of every first light-emitting mark 3 and every second light-emitting mark 4 of the handle control tracker is synchronized with the exposure shutter of the tracking cameras built into the VR headset; that is, during the period in each frame when the exposure shutter of the headset's built-in tracking cameras is open, the first light-emitting marks 3 and second light-emitting marks 4 on the handle control tracker light up. Considering handle power consumption and the actual usage environment, the lighting time of the first and second light-emitting marks is generally about 15μs-100μs. For synchronization accuracy, the exposure time of the tracking cameras is generally about 30μs-150μs, so that the lighting period of the first and second light-emitting marks is fully captured within the exposure window; the capture frequency of the tracking cameras is set to 60Hz. To prevent the first and second light-emitting marks from lighting at the same time and forming overlapping or merged spots in the tracking camera image, they are lit separately; for example, the second light-emitting marks 4 light up in the odd-numbered frames captured by the tracking cameras and the first light-emitting marks 3 light up in the even-numbered frames, where both the first and second light-emitting marks may be LEDs. Using the physical characteristics of the second light-emitting marks 4, their region in the image is detected and tracked in real time on the odd-numbered frames; this rectangular region is where the spots of the first light-emitting marks 3 of the handle control tracker lie. To account for the movement speed of the handle, on the even-numbered frames the rectangle corresponding to the second light-emitting marks 4 in the preceding odd-numbered frame is expanded by 1.5 times along both its long and short sides, yielding the image region in which the spots of the first light-emitting marks 3 lie in the current frame. Spot detection of the first light-emitting marks 3 is performed only inside this region and not outside it, which greatly improves the accuracy and stability of the spot detection, reduces its complexity, and improves detection efficiency to a certain extent. It should be noted that the factor of 1.5 was determined by actual tests as the maximum frame-to-frame displacement corresponding to the tracker's maximum movement speed at the 60Hz tracking camera frequency.
In addition, since the flicker sensitivity of the human eye is above 50Hz, light flickering above that rate is not perceived. However, the spots of the first and second light-emitting marks are captured by the VR headset's tracking cameras at 30Hz; that is, following the steps above, the blinking frequency of the first and second light-emitting marks on the handle control tracker is 30Hz, which is not friendly to the user's eyes in a real environment, as the eye can perceive 30Hz flicker. Therefore, after the handle control tracker and the headset's tracking cameras have been synchronized wirelessly at device start-up, in the interval from the start of each camera frame until that frame's exposure shutter opens, the first and second light-emitting marks of the handle control tracker are each lit twice more, each lighting again lasting 15μs-100μs. In this way, at 33 frames per second, the first light-emitting marks 3 and second light-emitting marks 4 of the handle control tracker light up 99 times per second, above the flicker-sensitivity range of the human eye.
In the present application, the lighting time of all first light-emitting marks 3 and second light-emitting marks 4 is the same in every frame, which helps the imaging device capture them accurately.
Optionally, the tracking distance of the handle control tracker can range from 3cm to 150cm, which helps users better interact with virtual reality, augmented reality and mixed reality through the handle control tracker.
Optionally, the tracker further includes:
a sensor, disposed on the handle body 1.
For example, an IMU inertial navigation sensor is embedded in the handle control tracker. The sensor is at least a six-axis sensor, i.e., an accelerometer unit and a gyroscope unit; it may also be a nine-axis sensor, i.e., an accelerometer unit, a gyroscope unit and a geomagnetic unit. The output frequency of the IMU inertial navigation sensor is at least 90Hz.
In addition, a wireless network transmission unit is embedded in the handle control tracker; the transmission protocol may be a 2.4G network protocol or the BLE (Bluetooth Low Energy) protocol, and the wireless transmission frequency is 100Hz or more.
Through the IMU inertial navigation sensor, the six-degree-of-freedom information of the handle control tracker can be obtained accurately and in real time at high frequency.
Optionally, the blinking frequency of the first light-emitting marks 3 is 30Hz, which facilitates their capture by the imaging device.
Optionally, the lighting time of the first light-emitting marks 3 and second light-emitting marks 4 is 15μs-100μs, which helps the imaging device capture them accurately.
In this embodiment, by capturing the first light-emitting marks 3 and second light-emitting marks 4 on the handle control tracker with the imaging device, the position and attitude of the handle control tracker in three-dimensional space can be computed accurately and in real time, with very simple operation. Moreover, the capture process is unaffected by electromagnetic and ultrasonic signals in the surrounding environment, so the tracker has a wide range of applications.
Although some specific embodiments of the present disclosure have been described in detail by way of example, those skilled in the art should understand that the above examples are for illustration only and are not intended to limit the scope of the present disclosure. Those skilled in the art should understand that the above embodiments may be modified without departing from the scope and spirit of the present disclosure. The scope of the present disclosure is defined by the appended claims.

Claims (13)

  1. A handle control tracker, characterized by comprising:
    a handle body;
    a light-emitting unit, disposed at an end of the handle body and forming a preset angle with the handle body;
    wherein the light-emitting unit comprises a first surface, a second surface, a plurality of first light-emitting marks and a plurality of second light-emitting marks, the second surface covering the first surface; the first light-emitting marks and the second light-emitting marks are all disposed on the first surface, and the plurality of first light-emitting marks are distributed in a ring;
    the first light-emitting marks and the second light-emitting marks are configured to be lit for capture by an imaging device; and
    the first light-emitting marks light up during a first time period, and the second light-emitting marks light up during a second time period.
  2. The handle control tracker according to claim 1, characterized in that the number of the first light-emitting marks is seventeen, the number of the second light-emitting marks is two, and the two second light-emitting marks are symmetrically arranged among the seventeen first light-emitting marks.
  3. The handle control tracker according to claim 1 or 2, characterized in that the number of the first light-emitting marks is twenty, the number of the second light-emitting marks is two, the second light-emitting marks are strip-shaped, one second light-emitting mark is disposed on the upper edge of the first surface, the other second light-emitting mark is disposed on the lower edge of the first surface, and the twenty first light-emitting marks are distributed between the two second light-emitting marks.
  4. The handle control tracker according to any one of claims 1-3, characterized in that the preset angle is 40°-135°.
  5. The handle control tracker according to any one of claims 1-4, characterized in that the wavelength of the first light-emitting marks is in the range of 450nm-690nm.
  6. The handle control tracker according to any one of claims 1-5, characterized in that the first surface and the second surface are both annular.
  7. The handle control tracker according to any one of claims 1-6, characterized in that the ratio of the radius of the first surface to the radius of the second surface is 1:1.5.
  8. The handle control tracker according to any one of claims 1-7, characterized in that the plurality of first light-emitting marks are connected in series.
  9. The handle control tracker according to any one of claims 1-8, characterized in that the plurality of second light-emitting marks are connected in series.
  10. The handle control tracker according to any one of claims 1-9, characterized by further comprising:
    a wireless transmission module, disposed on the handle body.
  11. The handle control tracker according to any one of claims 1-10, characterized by further comprising:
    a sensor, disposed on the handle body.
  12. The handle control tracker according to any one of claims 1-11, characterized in that the blinking frequency of the first light-emitting marks is 30Hz.
  13. The handle control tracker according to any one of claims 1-12, characterized in that the lighting time of the first light-emitting marks and the second light-emitting marks is 15μs-100μs.
PCT/CN2021/118171 2020-11-09 2021-09-14 Handle control tracker WO2022095605A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/877,219 US11712619B2 (en) 2020-11-09 2022-07-29 Handle controller

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011242190.X 2020-11-09
CN202011242190.XA CN112451962B (zh) 2020-11-09 2020-11-09 Handle control tracker

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/877,219 Continuation US11712619B2 (en) 2020-11-09 2022-07-29 Handle controller

Publications (1)

Publication Number Publication Date
WO2022095605A1 true WO2022095605A1 (zh) 2022-05-12

Family

ID=74826331

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/118171 WO2022095605A1 (zh) 2020-11-09 2021-09-14 Handle control tracker

Country Status (3)

Country Link
US (1) US11712619B2 (zh)
CN (1) CN112451962B (zh)
WO (1) WO2022095605A1 (zh)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112451962B (zh) * 2020-11-09 2022-11-29 青岛小鸟看看科技有限公司 Handle control tracker
CN113225870B (zh) * 2021-03-29 2023-12-22 青岛小鸟看看科技有限公司 VR device positioning method and VR device
CN113318435A (zh) 2021-04-27 2021-08-31 青岛小鸟看看科技有限公司 Control method and apparatus for a handle control tracker, and head-mounted display device
CN117017496B (zh) * 2023-09-28 2023-12-26 真健康(北京)医疗科技有限公司 Flexible body-surface positioning device and navigation/positioning system for puncture surgery

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1794010A (zh) * 2005-12-19 2006-06-28 北京威亚视讯科技有限公司 位置姿态跟踪***
US20130307772A1 (en) * 2012-05-21 2013-11-21 Everest Display Inc. Interactive projection system with light spot identification and control method thereof
CN105511649A (zh) * 2016-01-06 2016-04-20 王帆 一种多点定位***及多点定位方法
CN108257177A (zh) * 2018-01-15 2018-07-06 天津锋时互动科技有限公司深圳分公司 基于空间标识的定位***与方法
CN110573993A (zh) * 2017-04-26 2019-12-13 脸谱科技有限责任公司 使用led追踪环的手持控制器
CN112451962A (zh) * 2020-11-09 2021-03-09 青岛小鸟看看科技有限公司 一种手柄控制追踪器

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7627139B2 (en) * 2002-07-27 2009-12-01 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
US8313379B2 (en) * 2005-08-22 2012-11-20 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
TW200918135A (en) * 2007-09-07 2009-05-01 Konami Digital Entertainment Action judgment device, game device, and computer program
US10532277B2 (en) * 2015-06-11 2020-01-14 Facebook Technologies, Llc Hand-held controllers with light-emitting diodes synchronized to an external camera
US10007339B2 (en) * 2015-11-05 2018-06-26 Oculus Vr, Llc Controllers with asymmetric tracking patterns
US10496157B2 (en) * 2017-05-09 2019-12-03 Microsoft Technology Licensing, Llc Controlling handheld object light sources for tracking
CN107219963A (zh) * 2017-07-04 2017-09-29 深圳市虚拟现实科技有限公司 虚拟现实手柄图形空间定位方法和***
US20190012835A1 (en) * 2017-07-07 2019-01-10 Microsoft Technology Licensing, Llc Driving an Image Capture System to Serve Plural Image-Consuming Processes
CN110119192B (zh) * 2018-02-06 2024-05-28 广东虚拟现实科技有限公司 视觉交互装置
US20190302903A1 (en) * 2018-03-30 2019-10-03 Microsoft Technology Licensing, Llc Six dof input device
US10740924B2 (en) * 2018-04-16 2020-08-11 Microsoft Technology Licensing, Llc Tracking pose of handheld object
CN110837295A (zh) * 2019-10-17 2020-02-25 重庆爱奇艺智能科技有限公司 一种手持控制设备及其追踪定位的方法、设备与***
CN111354018B (zh) * 2020-03-06 2023-07-21 合肥维尔慧渤科技有限公司 一种基于图像的物体识别方法、装置及***
CN111459279A (zh) * 2020-04-02 2020-07-28 重庆爱奇艺智能科技有限公司 主动式补光设备、3dof手柄、vr设备及追踪***

Also Published As

Publication number Publication date
US11712619B2 (en) 2023-08-01
CN112451962A (zh) 2021-03-09
CN112451962B (zh) 2022-11-29
US20220362659A1 (en) 2022-11-17

Similar Documents

Publication Publication Date Title
WO2022095605A1 (zh) Handle controller tracker
US10656731B2 (en) Peripheral device for head-mounted display
CN108700939B (zh) 用于增强现实的***和方法
TWI471808B Pupil detection device
US10663729B2 (en) Peripheral device for head-mounted display
JP6066676B2 Head-mounted display and video presentation system
CN103529929B Gesture recognition system and glasses capable of recognizing gesture movements
US10198866B2 (en) Head-mountable apparatus and systems
WO2016184107A1 Wearable device for gaze focus positioning and gaze focus positioning method
KR20150093831A Direct interaction system for mixed reality environments
US9292086B2 (en) Correlating pupil position to gaze location within a scene
TW201416908A Pupil tracking device
US11027195B2 (en) Information processing apparatus, information processing method, and program
JP6719418B2 Electronic device
US20180217671A1 (en) Remote control apparatus, remote control method, remote control system, and program
JP7283958B2 Device with a plurality of markers
CN103793045B Pupil tracking device
CN109300528B Cognitive rehabilitation training system and method
JP2021060627A Information processing device, information processing method, and program
US11216066B2 (en) Display device, learning device, and control method of display device
KR100871867B1 Apparatus and method for personalized content service based on awareness of user context information
CN103565399A Pupil detection device
JP5732446B2 Head-mounted display and motion detection method
US11042747B2 (en) Masking method for augmented reality effects
JP6518028B1 Display device, display method, program, and non-transitory computer-readable information recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21888304

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11-08-2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21888304

Country of ref document: EP

Kind code of ref document: A1