CN110609626B - Virtual reality control system and method - Google Patents


Info

Publication number
CN110609626B
CN110609626B (application CN201910806003.7A)
Authority
CN
China
Prior art keywords
touch
pen
stylus
virtual reality
neural network
Prior art date
Legal status
Active
Application number
CN201910806003.7A
Other languages
Chinese (zh)
Other versions
CN110609626A (en)
Inventor
黄昌正
陈曦
周言明
张娜
Current Assignee
Dongguan Yilian Interation Information Technology Co ltd
Fantasy Zhuhai Technology Co ltd
Guangzhou Huantek Co ltd
Original Assignee
Dongguan Yilian Interation Information Technology Co ltd
Guangzhou Huantek Co ltd
Priority date
Filing date
Publication date
Application filed by Dongguan Yilian Interation Information Technology Co ltd and Guangzhou Huantek Co ltd
Priority to CN201910806003.7A
Publication of CN110609626A
Application granted
Publication of CN110609626B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/0354 Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545 Pens or stylus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention discloses a virtual reality control system and method. The system comprises a position tracker, a stylus and a display. The position tracker comprises a processor, a neural network unit and a video recording unit; the stylus comprises a microcontroller, a pressure sensor, an inertial sensor and at least two LED lamps. The processor is connected to the neural network unit, the video recording unit, the microcontroller and the display; the LED lamps, the pressure sensor and the inertial sensor are all connected to the microcontroller; and the LED lamps are arranged at the two ends of the stylus. By using a stylus in place of a gamepad, the system reduces the volume and weight of the operating device, and by computing the spatial position information with the neural network unit it reduces calculation error. This makes it practical to extend virtual reality technology to fine-grained operations, while ensuring that the user does not tire easily when operating the stylus for long periods. The invention can be applied to the field of virtual reality technology.

Description

Virtual reality control system and method
Technical Field
The invention relates to the field of virtual reality technology, and in particular to a virtual reality control system and method.
Background
Virtual reality technology is a computer simulation technique for creating and experiencing virtual worlds: the computer generates a simulated environment, an interactive three-dimensional dynamic view with multi-source information fusion and simulation of entity behavior, in which the user is immersed. In virtual games built on this technology, the player typically steers the in-game character with a gamepad, and the movement path is shown on a display. However, a gamepad is generally bulky and heavy, making it hard for the user to operate for long periods. Moreover, in applications that require relatively fine operations, such as handwriting, drawing, three-dimensional modeling and UI interaction, the contact area is very small and the displayed traces are fine, so the direction-control gamepad of the prior art is difficult to extend to these fine applications.
Disclosure of Invention
To solve the above technical problems, the invention aims to provide a virtual reality control system and method whose operating device, used to control the moving direction, is small and light.
The first technical scheme adopted by the invention is as follows:
A virtual reality control system comprises a position tracker, a stylus and a display. The position tracker comprises a processor, a neural network unit and a video recording unit; the stylus comprises a microcontroller, a pressure sensor, an inertial sensor and at least two LED lamps. The processor is connected to the neural network unit, the video recording unit, the microcontroller and the display; the LED lamps, the pressure sensor and the inertial sensor are all connected to the microcontroller; and the LED lamps are arranged at the two ends of the stylus.
Further, the stylus also comprises a capacitive key, arranged on the surface of the stylus and connected to the microcontroller.
Further, the video recording unit comprises at least two cameras for capturing moving images of the LED lamps at the two ends of the stylus.
Further, an infrared filter is arranged on the lens of each camera.
Further, the stylus also comprises a lithium battery connected to the microcontroller.
The second technical scheme adopted by the invention is as follows:
A virtual reality control method comprises the following steps:
acquiring at least two moving images of a stylus;
sending the moving images to a neural network unit, so that the neural network unit calculates the spatial position information of the stylus from them;
acquiring pressure sensor data and inertial sensor data of the stylus;
calculating movement data of the stylus, comprising trajectory coordinates and a tilt angle, from the spatial position information, the pressure sensor data and the inertial sensor data;
and sending the movement data of the stylus to a display, so that the display shows the movement trajectory of the stylus according to the movement data.
Further, between the step of acquiring pressure sensor data and inertial sensor data of the stylus and the step of calculating movement data of the stylus, the method further comprises the following step:
acquiring key information of the stylus.
Further, the acquiring of at least two moving images of the stylus specifically comprises:
acquiring, through the video recording unit, at least two moving images of the light of the LED lamps on the stylus.
Further, the acquiring of pressure sensor data and inertial sensor data of the stylus specifically comprises:
acquiring pen-tip pressure data detected by the pressure sensor on the stylus and rotation information detected by the inertial sensor on the stylus.
Further, before the step of acquiring at least two moving images of the stylus, the method further comprises the following step:
training the neural network of the neural network unit.
The system of the invention has the following beneficial effects. The stylus replaces a gamepad, reducing the volume and weight of the operating device so that virtual reality technology can be extended to fine-grained operations. The pressure sensor and the inertial sensor added to the stylus let the position tracker acquire pen-tip pressure and rotation information, and the LED lamps added at the two ends of the stylus let the position tracker capture moving images of the stylus through the video recording unit. The neural network unit then calculates the spatial position information of the stylus, reducing calculation error, and the processor combines it with the pressure sensor data and inertial sensor data to compute the movement trajectory, so that the display can show the trajectory of the stylus quickly and accurately.
The method of the invention has the following beneficial effects. The acquired moving images of the stylus are sent to the neural network unit, which calculates the spatial position information of the stylus and reduces calculation error. The movement trajectory of the stylus is then computed by combining the pressure sensor data and inertial sensor data, and the movement data is sent to the display so that it can show the trajectory quickly and accurately.
Drawings
Fig. 1 is a block diagram of a control system for virtual reality according to an embodiment of the present invention;
FIG. 2 is a schematic view of a usage scenario according to an embodiment of the present invention;
fig. 3 is a flowchart of a virtual reality control method according to an embodiment of the present invention;
fig. 4 is a schematic diagram illustrating a calculation principle of spatial location information according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the figures and specific embodiments. The step numbers in the following embodiments are set for convenience of illustration only; they do not constrain the order of the steps, which may be adapted as those skilled in the art see fit.
Referring to fig. 1, an embodiment of the invention provides a virtual reality control system comprising a position tracker, a stylus and a display. The position tracker includes a processor, a neural network unit and a video recording unit; the stylus includes a microcontroller, a pressure sensor, an inertial sensor and at least two LED lamps. The processor is connected to the neural network unit, the video recording unit, the microcontroller and the display; the LED lamps, the pressure sensor and the inertial sensor are all connected to the microcontroller; and the LED lamps are disposed at the two ends of the stylus. The stylus is an electronic stylus, small and light. The position tracker is an infrared spatial position tracker. The display is a head-mounted display with virtual reality and augmented reality functions. The stylus is connected to the position tracker through a wireless communication module, and the position tracker is connected to the display through a USB interface or a wireless communication module. The LED lamps are infrared LEDs.
The video recording unit captures moving images of the LED lamps at the two ends of the stylus during use, indirectly recording the movement trajectory of the stylus through the movement of the LED lamps;
the neural network unit processes the moving images with a pre-trained artificial neural network and calculates the spatial position information of the stylus;
the pressure sensor detects the pressure data at the pen tip of the stylus and sends it to the microcontroller;
the inertial sensor detects the rotation data of the stylus and sends it to the microcontroller;
the microcontroller collects the pressure data detected by the pressure sensor and the rotation data detected by the inertial sensor and sends them to the processor through the wireless communication module;
and the processor sends the moving images collected by the video recording unit to the neural network unit, receives the spatial position information of the stylus returned by the neural network unit, receives the pressure data and rotation data uploaded by the microcontroller, processes these to obtain the movement trajectory of the stylus, and finally sends the trajectory to the display through the wireless communication module or the USB interface so that the display shows the movement trajectory of the stylus.
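The data flow through the processor just described can be sketched in a few lines of Python. This is an illustrative sketch only, not code from the patent; the names `StylusPacket` and `process_frame`, and the exact fields, are our assumptions.

```python
from dataclasses import dataclass


@dataclass
class StylusPacket:
    """Data uploaded by the stylus microcontroller over the wireless link."""
    tip_pressure: float  # pressure sensor reading, arbitrary units
    rotation: tuple      # inertial sensor reading, e.g. (roll, pitch, yaw) in degrees


def process_frame(spatial_position, packet):
    """Combine the neural-network position fix with the stylus sensor data.

    `spatial_position` is the line segment returned by the neural network
    unit: a pair of 3-D points for the two LED-marked ends of the stylus.
    Returns the movement sample forwarded to the display.
    """
    tip, tail = spatial_position
    return {
        "trajectory_point": tip,          # the pen-tip end drives the trace
        "pressure": packet.tip_pressure,  # can modulate stroke width
        "tilt": packet.rotation,          # used to correct writing error
    }
```

The choice of which segment endpoint is the pen tip, and how pressure and tilt are consumed downstream, are left open by the patent; the sketch just shows the fusion of the two data paths into one sample.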
This embodiment uses a stylus in place of a gamepad, reducing the volume and weight of the operating device so that virtual reality technology can be extended to fine-grained operations. The pressure sensor and inertial sensor added to the stylus let the position tracker acquire pen-tip pressure and rotation information of the stylus, and the LED lamps added at its two ends let the position tracker capture moving images of the stylus through the video recording unit. The neural network unit then calculates the spatial position information of the stylus, reducing calculation error, and the processor combines it with the pressure sensor data and inertial sensor data to obtain the movement trajectory, so that the display can show the trajectory of the stylus quickly and accurately.
In a further preferred embodiment, the stylus also includes a capacitive key disposed on its surface and connected to the microcontroller. The keys on prior-art gamepads are mechanical: they generate vibration noise during use and their sensitivity is low, which degrades the accuracy of pressure data acquisition and also tires the operator. A capacitive key is small, highly sensitive and quiet; using capacitive keys therefore improves the accuracy of the acquired pressure data and reduces operator fatigue.
In a further preferred embodiment, the video recording unit includes at least two cameras that capture moving images of the LED lamps at the two ends of the stylus. In actual use, the cameras are installed facing the position where the stylus is operated. A single camera leaves large blind angles; recording with at least two cameras reduces the blind-angle range so that the actual movement trajectory of the stylus can be recorded accurately.
As shown in fig. 2, while the user operates the stylus 102, the display 103 is worn on the head and the camera 101 records the whole process of the user operating the stylus 102 on the desktop 104; the camera 101 is placed on the desktop 104 with its front facing the operating position of the stylus 102.
In a further preferred embodiment, an infrared filter is disposed on the lens of each camera. An infrared filter is a deep-red filter used for infrared photography; even a filter of modest colour depth has a corresponding effect. Mounting the filter directly in front of the camera lets the camera reliably capture weak infrared light.
In a further preferred embodiment, the stylus also includes a lithium battery connected to the microcontroller. Prior-art gamepads are generally powered through an external power cord, which restricts their range of use. Powering the stylus with a lithium battery keeps the stylus small while also widening its range of use.
Referring to fig. 3, the invention also provides a virtual reality control method. This embodiment is executed by the processor of the system embodiment shown in fig. 1 and comprises the following steps:
s201, acquiring at least two moving images of the touch pen; specifically, moving images of LED lamps at two ends of a touch control pen are obtained through a camera.
S202, sending the moving images to the neural network unit, so that it calculates the spatial position information of the stylus from them. The neural network in the neural network unit is an artificial neural network trained in advance, to ensure the accuracy of the calculation.
As shown in fig. 4, which includes an imaging plane 301, a lens plane 302, a virtual image plane 303 and an optical axis 304, fig. 4 illustrates the principle of calculating the spatial positions of the light-spot markers at the two ends of the electronic stylus with two cameras: the matched light spots are located in three-dimensional space from the pair of differing pictures taken by the binocular vision system. Let the midpoint between the centres of the two lenses be the coordinate origin A of the binocular system, with the distance between the centres equal to b; the optical-axis direction is the positive z axis, and the direction from the centre of the left lens to the centre of the right lens is the positive x axis. Let the three-dimensional coordinates of the light spot to be measured be (x, y, z), its image coordinates in the left and right cameras be (x_l, y_l) and (x_r, y_r), and the focal length of both cameras be f. From the properties of similar triangles:

x_l = f(x + b/2)/z,  x_r = f(x - b/2)/z,  y_l = y_r = f y/z

Solving gives:

z = f b/(x_l - x_r),  x = b(x_l + x_r)/(2(x_l - x_r)),  y = b y_l/(x_l - x_r)

The three-dimensional coordinates of the light spot can thus be calculated. The two three-dimensional points determine a line segment in space, and this segment is the spatial position information of the electronic stylus.
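The triangulation above is easy to check numerically. The following is a hedged sketch, not the patent's implementation; the function names are ours, and the cameras are assumed rectified so that y_l = y_r.

```python
def triangulate(xl, yl, xr, yr, f, b):
    """Recover the 3-D coordinates of one LED light spot from its left/right
    image coordinates, using the similar-triangle relations.

    f: focal length (same for both cameras); b: baseline between lens centres.
    The origin is the midpoint A between the two lens centres.
    """
    d = xl - xr                   # disparity; must be non-zero
    z = f * b / d
    x = b * (xl + xr) / (2.0 * d)
    y = b * yl / d                # yl == yr for rectified cameras
    return (x, y, z)


def stylus_segment(spot_a, spot_b, f, b):
    """The two triangulated LED spots define the line segment that is the
    spatial position information of the stylus. Each spot is (xl, yl, xr, yr)."""
    return (triangulate(*spot_a, f, b), triangulate(*spot_b, f, b))
```

For example, a point at (0.1, 0.05, 1.0) with f = 1 and b = 0.2 projects to (0.2, 0.05) in the left image and (0.0, 0.0) in the right, and `triangulate` recovers the original coordinates from those measurements.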
S203, acquiring pressure sensor data and inertial sensor data of the stylus. The pressure sensor data is the pen-tip pressure detected by the pressure sensor inside the stylus; the inertial sensor data is the rotation data detected by the inertial sensor inside the stylus.
S204, calculating movement data of the stylus, comprising trajectory coordinates and a tilt angle, from the spatial position information, the pressure sensor data and the inertial sensor data. In this embodiment, the movement trajectory of the stylus is computed by a calculation program pre-stored in the processor.
S205, sending the movement data of the stylus to the display, so that it shows the movement trajectory of the stylus according to the movement data. The display is head-mounted; by sending the trajectory to it, the user sees in the virtual interface the trajectory of the stylus under his control, and the display can also show, according to the tilt angle, the writing error introduced by the tilt.
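The patent does not specify how the tilt angle in the movement data is derived; one plausible sketch is to measure the angle between the triangulated LED-to-LED segment and the writing surface, here assumed to be the x-z plane with y pointing up. The function name and the surface convention are our assumptions.

```python
import math


def tilt_angle_deg(tip, tail):
    """Tilt of the stylus: angle between the tip-to-tail segment and the
    writing surface (taken here as the x-z plane, y pointing up).

    `tip` and `tail` are the triangulated 3-D coordinates of the two LEDs.
    Returns 90 for a pen held vertically and 0 for one lying flat.
    """
    dx, dy, dz = (tail[i] - tip[i] for i in range(3))
    horizontal = math.hypot(dx, dz)           # projection onto the surface
    return math.degrees(math.atan2(abs(dy), horizontal))
```

A display could use this angle to compensate the offset between the sensed LED position and the actual contact point of the pen tip, i.e. the writing error mentioned above.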
In this embodiment, the acquired moving images of the stylus are sent to the neural network unit, which calculates the spatial position information of the stylus and thereby reduces calculation error; the movement trajectory is then computed by combining the pressure sensor data and inertial sensor data, and the movement data is sent to the display so that it can show the trajectory quickly and accurately.
In a further preferred embodiment, between the step of acquiring pressure sensor data and inertial sensor data of the stylus and the step of calculating the movement data of the stylus, the method also includes the following step:
acquiring key information of the stylus. The key information is the operating state of a capacitive key on the stylus; for example, when the user presses the key to highlight or erase the stylus trajectory shown on the display, the processor recomputes and modifies the trajectory sent to the display according to the key information.
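How the processor modifies the trajectory from the key state is not detailed in the patent; the sketch below illustrates one possibility, an erase mode that removes trajectory points near the pen tip while the key is held. All names and the key-state encoding are hypothetical.

```python
def apply_key_action(trajectory, key_state, cursor, radius=0.01):
    """Modify the trajectory sent to the display according to the capacitive
    key state: while `key_state` is "erase", drop points within `radius`
    of the current pen-tip position `cursor` (all points are 3-D tuples)."""
    if key_state != "erase":
        return trajectory

    def dist2(p):
        return sum((p[i] - cursor[i]) ** 2 for i in range(3))

    return [p for p in trajectory if dist2(p) > radius ** 2]
```

Highlighting could be handled the same way, tagging nearby points instead of dropping them.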
In a further preferred embodiment, the acquiring of at least two moving images of the stylus specifically comprises:
acquiring, through the video recording unit, at least two moving images of the light of the LED lamps on the stylus. Without a light source, the movement of the stylus would be hard to distinguish from that of other objects in subsequent processing; the infrared LED lamps at its two ends make the movement of the stylus in the image conspicuous and easy to distinguish.
In a further preferred embodiment, the acquiring of pressure sensor data and inertial sensor data of the stylus specifically comprises:
acquiring pen-tip pressure data detected by the pressure sensor on the stylus and rotation information detected by the inertial sensor on the stylus. In handwriting, drawing, UI (user interface) operation, three-dimensional modeling and the like, the content written with the stylus must be determined from its rotation information and pen-tip pressure. This embodiment ensures that the content shown on the display matches what the user writes with the stylus.
In a further preferred embodiment, before the step of acquiring at least two moving images of the stylus, the method also includes the following step:
training the neural network of the neural network unit. The training uses prior-art neural network training methods. Training the neural network reduces the error in calculating the spatial position information of the stylus and thus ensures the correctness of the content shown on the display.
In summary, the system of the invention uses a stylus in place of a gamepad, reducing the volume and weight of the operating device so that virtual reality technology can be extended to fine-grained operations. A pressure sensor and an inertial sensor on the stylus let the position tracker acquire its pen-tip pressure and rotation information, and LED lamps at its two ends let the position tracker capture moving images of the stylus through a video recording unit. The neural network unit calculates the spatial position information of the stylus, reducing calculation error, and the processor combines it with the pressure sensor data and inertial sensor data to compute the movement trajectory, so that the display shows the trajectory quickly and accurately. The method of the invention sends the acquired moving images of the stylus to the neural network unit, which calculates the spatial position information and reduces calculation error; the movement trajectory is then computed with the pressure sensor data and inertial sensor data and sent to the display, which shows it quickly and accurately. Furthermore, the method improves the accuracy of the spatial position information by training the neural network.
While the preferred embodiments of the present invention have been illustrated and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A virtual reality control system, characterized in that it comprises a position tracker, a stylus and a display; the position tracker comprises a processor, a neural network unit and a video recording unit; the stylus comprises a microcontroller, a pressure sensor, an inertial sensor and at least two LED lamps; the processor is connected to the neural network unit, the video recording unit, the microcontroller and the display; the LED lamps, the pressure sensor and the inertial sensor are all connected to the microcontroller; and the LED lamps are arranged at the two ends of the stylus.
2. The virtual reality control system according to claim 1, characterized in that the stylus further comprises a capacitive key, arranged on the surface of the stylus and connected to the microcontroller.
3. The virtual reality control system according to claim 1, characterized in that the video recording unit comprises at least two cameras for capturing moving images of the LED lamps at the two ends of the stylus.
4. The virtual reality control system according to claim 3, characterized in that an infrared filter is arranged on the lens of each camera.
5. The virtual reality control system according to claim 1, characterized in that the stylus further comprises a lithium battery connected to the microcontroller.
6. A virtual reality control method applied to the virtual reality control system according to any one of claims 1 to 5, characterized in that: the method comprises the following steps:
acquiring at least two moving images of a touch control pen;
sending the moving image to a neural network unit, and enabling the neural network unit to calculate the spatial position information of the touch pen according to the moving image;
acquiring pressure sensor data and inertial sensor data of a stylus;
calculating movement data of the touch pen according to the spatial position information, the pressure sensor data and the inertial sensor data, wherein the movement data comprises track coordinates and an inclination angle;
and sending the movement data of the touch pen to a display, so that the display displays the movement track of the touch pen according to the movement data.
7. The method for controlling virtual reality according to claim 6, wherein: between the step of acquiring pressure sensor data and inertial sensor data of the stylus and the step of calculating movement data of the stylus according to the spatial position information, the pressure sensor data and the inertial sensor data, the method further includes the following steps:
and acquiring key information of the touch pen.
8. The method for controlling virtual reality according to claim 6, wherein: the acquiring of the at least two moving images of the stylus specifically includes:
at least two moving images of the light of the LED lamp on the touch pen are obtained through the video recording unit.
9. The virtual reality control method according to claim 6, wherein the acquiring of the pressure sensor data and the inertial sensor data of the stylus specifically comprises:
acquiring pen-tip pressure data detected by the pressure sensor on the stylus, and acquiring rotation information detected by the inertial sensor on the stylus.
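A hedged sketch of reading the two sensor channels named in claim 9: pen-tip pressure as a raw ADC count, and rotation as a tilt angle derived from accelerometer components. The 12-bit ADC range and the gravity-vector tilt formula are assumptions, not details from the patent.

```python
import math

ADC_MAX = 4095  # assumed 12-bit pressure ADC

def normalize_pressure(adc_count):
    """Map a raw ADC count to a 0.0-1.0 pressure level."""
    return max(0, min(adc_count, ADC_MAX)) / ADC_MAX

def tilt_from_accel(ax, ay, az):
    """Tilt of the pen axis from vertical, in degrees, via the gravity vector."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

p = normalize_pressure(2048)        # about 0.5: half pressure
t = tilt_from_accel(0.0, 0.0, 1.0)  # 0.0: pen held upright
```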
10. The virtual reality control method according to claim 6, wherein before the step of acquiring the at least two moving images of the stylus, the method further comprises:
training the neural network of the neural network unit.
CN201910806003.7A 2019-08-29 2019-08-29 Virtual reality control system and method Active CN110609626B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910806003.7A CN110609626B (en) 2019-08-29 2019-08-29 Virtual reality control system and method

Publications (2)

Publication Number Publication Date
CN110609626A CN110609626A (en) 2019-12-24
CN110609626B true CN110609626B (en) 2023-01-17

Family

ID=68891079

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910806003.7A Active CN110609626B (en) 2019-08-29 2019-08-29 Virtual reality control system and method

Country Status (1)

Country Link
CN (1) CN110609626B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109313500A (en) * 2016-06-09 2019-02-05 微软技术许可有限责任公司 The passive optical and inertia of very thin form factor track
CN209248486U (en) * 2018-11-13 2019-08-13 宁波视睿迪光电有限公司 A kind of virtual display interaction stylus

Also Published As

Publication number Publication date
CN110609626A (en) 2019-12-24

Similar Documents

Publication Publication Date Title
Memo et al. Head-mounted gesture controlled interface for human-computer interaction
US10261595B1 (en) High resolution tracking and response to hand gestures through three dimensions
CN110308789B (en) Method and system for mixed reality interaction with peripheral devices
US10203765B2 (en) Interactive input system and method
KR101652535B1 (en) Gesture-based control system for vehicle interfaces
US9229540B2 (en) Deriving input from six degrees of freedom interfaces
US20160098095A1 (en) Deriving Input from Six Degrees of Freedom Interfaces
CN110322500A (en) Immediately optimization method and device, medium and the electronic equipment of positioning and map structuring
US9122916B2 (en) Three dimensional fingertip tracking
CN112926423B (en) Pinch gesture detection and recognition method, device and system
CN114127669A (en) Trackability enhancement for passive stylus
US20130120250A1 (en) Gesture recognition system and method
US20120319945A1 (en) System and method for reporting data in a computer vision system
US20110157017A1 (en) Portable data processing appartatus
CN102810015B (en) Input method based on space motion and terminal
CN102906671A (en) Gesture input device and gesture input method
CN102226880A (en) Somatosensory operation method and system based on virtual reality
CN104364733A (en) Position-of-interest detection device, position-of-interest detection method, and position-of-interest detection program
CN103617642B (en) A kind of digital book drawing method and device
US11995254B2 (en) Methods, devices, apparatuses, and storage media for mapping mouse models for computer mouses
CN106293099A (en) Gesture identification method and system
Perra et al. Adaptive eye-camera calibration for head-worn devices
CN114706489A (en) Virtual method, device, equipment and storage medium of input equipment
CN110609626B (en) Virtual reality control system and method
Schlattmann et al. Markerless 4 gestures 6 DOF real‐time visual tracking of the human hand with automatic initialization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230616

Address after: 510635 self made room 01-011, third floor, No. 721, Tianhe North Road, Tianhe District, Guangzhou City, Guangdong Province (office only)

Patentee after: GUANGZHOU HUANTEK Co.,Ltd.

Patentee after: DONGGUAN YILIAN INTERATION INFORMATION TECHNOLOGY Co.,Ltd.

Patentee after: Fantasy (Zhuhai) Technology Co.,Ltd.

Address before: Room 01, 17 / F, Xingguang Yingjing, 119 Shuiyin Road, Yuexiu District, Guangzhou City, Guangdong Province 510075

Patentee before: GUANGZHOU HUANTEK Co.,Ltd.

Patentee before: DONGGUAN YILIAN INTERATION INFORMATION TECHNOLOGY Co.,Ltd.