CN109613982A - Display and interaction method of a head-mounted AR display device - Google Patents

Display and interaction method of a head-mounted AR display device

Info

Publication number
CN109613982A
CN109613982A (application CN201811525502.0A)
Authority
CN
China
Prior art keywords
head-mounted
eye
image
pupil
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811525502.0A
Other languages
Chinese (zh)
Inventor
叶成环
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201811525502.0A
Publication of CN109613982A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to the field of display technology, and in particular to a display and interaction method for a head-mounted AR display device, comprising a display method for the head-mounted AR display device and an interaction method for the head-mounted AR display device. The invention is not limited to displaying content from a mobile terminal (mobile phone): other picture output devices can also be used, giving a wide range of application and good compatibility. At the same time, data on the user's head movement and eye movement are collected and processed algorithmically to provide interaction, enhancing the interactive experience.

Description

Display and interaction method of a head-mounted AR display device
Technical field
The present invention relates to the field of display technology, and in particular to a display and interaction method for a head-mounted AR display device.
Background technique
Augmented reality (AR) is a technology that "seamlessly" integrates real-world information with virtual-world information. Entity information that is otherwise difficult to experience within a certain time and space range of the real world (visual information, sound, taste, touch, etc.) is simulated by computer technology and then superimposed, so that virtual information is applied to the real world: the real environment and virtual objects are added to the same picture or space in real time and perceived simultaneously by the human senses, achieving a sensory experience beyond reality.
The current prior art projects content with lenses based on the free-form-surface reflective optics principle. This technique is mostly used in mobile-phone box viewers: after a phone is inserted into a box fitted with such a lens, the content on the phone is projected onto the lens to achieve the effect. The prior art can only display content from a mobile terminal and has no interactive function.
Summary of the invention
In view of the deficiencies of the prior art, the present invention provides a display and interaction method for a head-mounted AR display device, comprising a display method for the head-mounted AR display device and an interaction method for the head-mounted AR display device. The display method of the head-mounted AR display device comprises the following steps:
A display driver board is connected to the signal output port of a picture output device and to the signal input port of a high-resolution portable screen, so that the signal output by the picture output device is transmitted to the portable screen in real time and displayed;
The content displayed on the portable screen is reflected into the human eye by a free-form-surface reflective lens arranged in front of it at an angle.
Preferably, the portable screen is located obliquely above the human eye, at a 45° angle to the direction of sight.
Preferably, the free-form-surface reflective lens is made of a dedicated polycarbonate (PC), with the lens coated after molding.
The interaction method of the head-mounted AR display device comprises the following steps:
Eye movement data are collected by a camera module arranged at the middle of the free-form-surface reflective lens; the eye movement data include user blink data and eyeball displacement data;
Data on the rotation angle of the user's head are collected by a gyroscope module arranged in the head-mounted AR display device;
The collected eye movement data and head rotation angle data are processed by a data processing module and fed into algorithmic calculation to realize the interactive function.
Preferably, the method by which the camera module collects the eyeball displacement data comprises the following steps: eye extraction; edge detection; eye center localization; eyeball tracking.
Preferably, the eye extraction method is: a face image is obtained by the camera module; eye features are extracted; the face image is converted into a corresponding grayscale image; the face grayscale image is processed; and the position of the human eye is determined from changes in the extracted eye feature values.
Preferably, the edge detection method is: using the property that brightness changes sharply at image edges, edge detection is performed on the obtained grayscale eye region to obtain the pupil region.
Preferably, the eye center localization method is: pupil contour feature points are obtained by a gradient method, an adaptive threshold is computed to determine the pupil position and size, and the pupil center is determined by cluster fitting of the pupil contour.
Preferably, the eyeball tracking method is as follows:
Calculating the gaze point offset:
A time threshold T is set and the eye movement time is t; if t is greater than T the eye is in the moving phase, and if t is less than or equal to T it is in the stationary fixation phase;
Based on the pupil center point obtained while the eye is stationary, the current pupil center point O1 is recorded with coordinates (x1, y1); two adjacent frames are then captured and the pupil center point O2 of the second frame is obtained with coordinates (x2, y2); based on the two pupil centers, an appropriate coordinate system is established, and the displacement (xd, yd) and rotation angle θ produced by one eye movement are calculated to give the gaze point offset;
Mapping the viewpoint offset:
Based on the displacement difference between the two captured frames, the eyes are moved back and forth over the calibrated coordinate points on the screen for calibration, and the mapping function is solved with a least squares curve fitting algorithm; after the eye movement feature information is obtained, the corresponding system message response is performed based on the acquired eye movement command displacement and angle information.
Preferably, the specific steps of the edge detection include:
A Gaussian weighted smoothing function is defined and the image is smoothed by convolution with it; the Gaussian weighted smoothing function is defined as
$$G(x_d, y_d) = \frac{1}{2\pi\sigma^2}\exp\left(-\frac{x_d^2 + y_d^2}{2\sigma^2}\right)$$
where σ is the mean square deviation of the Gaussian distribution, and $x_d$ and $y_d$ denote the pixel coordinate values of the two-dimensional image;
The image gradient is computed with a gradient operator to obtain the partial derivative matrices of the convolution-smoothed image in the x and y directions, and non-maxima suppression is applied to the matrices to obtain a binary image;
Threshold screening is applied to the binary image; the image obtained after screening contains only the relatively bright pupil boundary and its interior region, giving the pupil region.
The present invention is not limited to displaying content from a mobile terminal (mobile phone): other picture output devices can also be used, giving a wide range of application and good compatibility. At the same time, data on the user's head movement and eye movement are collected and processed algorithmically to provide interaction, enhancing the interactive experience.
Other features and advantages of the present invention will be set forth in the following description and will in part become apparent from the description or be understood by practicing the invention. The objectives and other advantages of the invention can be realized and obtained by the structures particularly pointed out in the written description, the claims, and the accompanying drawings.
The technical solution of the present invention is described in further detail below with reference to the drawings and embodiments.
Detailed description of the invention
The accompanying drawings are provided for a further understanding of the present invention and constitute a part of the specification; together with the embodiments they serve to explain the invention and are not to be construed as limiting it. In the drawings:
Fig. 1 is a schematic diagram of the connection relationships of the head-mounted AR display device in one embodiment of the invention;
Fig. 2 is a flowchart of the display method of the head-mounted AR display device in one embodiment of the invention;
Fig. 3 is a flowchart of the interaction method of the head-mounted AR display device in one embodiment of the invention;
Fig. 4 is a flowchart of the method of collecting eyeball displacement data in one embodiment of the invention;
Fig. 5 is a flowchart of the eye center localization method in one embodiment of the invention;
Fig. 6 is a flowchart of the eye movement feature information extraction method in one embodiment of the invention.
Specific embodiment
Hereinafter, preferred embodiments of the present invention are described with reference to the accompanying drawings. It should be understood that the preferred embodiments described herein are only for illustrating and explaining the present invention and are not intended to limit it.
As shown in Figs. 1 to 3, the present invention provides a display and interaction method for a head-mounted AR display device, comprising a display method for the head-mounted AR display device and an interaction method for the head-mounted AR display device.
The display method of the head-mounted AR display device comprises the following steps:
A display driver board 1 is connected to the signal output port (HDMI interface) of a picture output device and to the signal input port (MIPI interface) of a high-resolution portable screen 2, so that the signal output by the picture output device is transmitted to the portable screen 2 in real time and displayed (the displayed content includes video as well as non-video content such as software and games);
The video content displayed on the portable screen 2 is reflected into the human eye by a free-form-surface reflective lens 3 arranged in front of it at an angle.
The portable screen 2 is located obliquely above the human eye, at a 45° angle to the direction of sight.
The free-form-surface reflective lens 3 is made of a dedicated polycarbonate (PC); the lens is coated after molding with a film similar to an automotive specular reflection film.
The interaction method of the head-mounted AR display device comprises the following steps:
Eye movement data are collected by a camera module 4 arranged at the middle of the free-form-surface reflective lens 3; the eye movement data include user blink data and eyeball displacement data;
Data on the rotation angle of the user's head are collected by a gyroscope module 5 arranged in the head-mounted AR display device;
The collected eye movement data and head rotation angle data are processed by a data processing module and fed into algorithmic calculation to realize the interactive function.
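As a hedged sketch of how the two data streams might be combined algorithmically (the patent does not specify the fusion, so the head-pointer-plus-gaze-offset scheme and the gain constants below are purely assumptions):

```python
# Sketch: combine head rotation (gyroscope) and eye offset into one pointer.
# The fusion scheme and all constants are assumptions, not the patent's method.
def pointer_position(yaw_deg, pitch_deg, eye_dx, eye_dy,
                     screen_w=1920, screen_h=1080,
                     head_gain=20.0, eye_gain=2.0):
    """Return a clamped (x, y) screen position for the interaction pointer."""
    x = screen_w / 2 + head_gain * yaw_deg + eye_gain * eye_dx
    y = screen_h / 2 + head_gain * pitch_deg + eye_gain * eye_dy
    return (min(max(x, 0), screen_w - 1), min(max(y, 0), screen_h - 1))
```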
Compared with the prior art, the above technical solution has the following beneficial effects: the present invention is not limited to displaying content from a mobile terminal (mobile phone), as other picture output devices can also be used, giving a wide range of application and good compatibility; at the same time, data on the user's head movement and eye movement are collected and processed algorithmically to provide interaction, enhancing the interactive experience.
As shown in Figs. 4 to 6, in one embodiment, the method by which the camera module 4 collects the eyeball displacement data comprises the following steps: eye extraction; edge detection; eye center localization; eyeball tracking.
The eye extraction method is: first, a face image is obtained by the camera module 4; second, eye features are extracted using Haar-matrix-based (Haar-like) features; finally, the face image is converted into a corresponding grayscale image, the face grayscale image is processed, and the position of the human eye is determined from changes in the extracted eye feature values.
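A minimal sketch of this step, assuming OpenCV and its bundled Haar cascade classifier files; the patent names only Haar features, so the cascades and the pipeline below are illustrative, not the patented implementation:

```python
# Sketch: locate eye regions with OpenCV's bundled Haar cascades.
# The cascade files and this pipeline are illustrative assumptions.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def extract_eye_regions(frame_bgr):
    """Return grayscale eye-region crops found in one camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)  # grayscale conversion
    eyes = []
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face_roi = gray[fy:fy + fh, fx:fx + fw]          # face grayscale region
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_roi):
            eyes.append(face_roi[ey:ey + eh, ex:ex + ew])
    return eyes
```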
The edge detection method uses the property that brightness changes sharply at image edges, i.e. the gray values of the pupil region are generally lower than those of other regions, and performs edge detection on the obtained grayscale eye region.
First, a Gaussian weighted smoothing function is defined and the image is smoothed by convolution with it; the Gaussian weighted smoothing function is defined as
$$G(x_d, y_d) = \frac{1}{2\pi\sigma^2}\exp\left(-\frac{x_d^2 + y_d^2}{2\sigma^2}\right)$$
where σ is the mean square deviation of the Gaussian distribution, and $x_d$ and $y_d$ denote the pixel coordinate values of the two-dimensional image. The image gradient is then computed with a gradient operator to obtain the partial derivative matrices of the convolution-smoothed image in the x and y directions, and non-maxima suppression is applied to the matrices to obtain a binary image. Finally, threshold screening is applied to the binary image; the image obtained after screening contains only the relatively bright pupil boundary and its interior region, giving the pupil region.
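The steps just described (Gaussian smoothing, gradient computation, non-maxima suppression, threshold screening) coincide with the classic Canny edge detection pipeline, so a sketch can lean on OpenCV's implementation; the σ and threshold values below are assumptions, since the patent does not specify them:

```python
# Sketch: the described edge-detection chain via OpenCV's Canny pipeline
# (Gaussian smoothing + gradient + non-maxima suppression + thresholding).
# sigma, t_lo, and t_hi are illustrative guesses, not values from the patent.
import cv2

def pupil_edges(eye_gray, sigma=1.4, t_lo=40, t_hi=120):
    """Return a binary edge image of a grayscale eye crop."""
    smoothed = cv2.GaussianBlur(eye_gray, (0, 0), sigma)  # Gaussian convolution
    return cv2.Canny(smoothed, t_lo, t_hi)  # gradient + NMS + double threshold
```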
The eye center localization method of the present invention is: pupil contour feature points are obtained by a gradient method, an adaptive threshold is computed to determine the pupil position and size, and the pupil center is determined by cluster fitting of the pupil contour. Alternatively, the eye center localization method is: in the gray-value comparison phase, an N*N window is slid over the entire pupil region to find the region whose sum of gray values is maximal within the pupil region; based on this maximal N*N region, the intersection point of the window region's diagonals is taken geometrically as the pupil center point.
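A minimal sketch of the N*N sliding-window variant, assuming the eye crop has already been binarized so that pupil pixels carry the highest gray values; the window size N and the input representation are assumptions:

```python
# Sketch: N*N sliding-window pupil-center search using a box filter.
# The window whose gray-value sum is maximal is found; the intersection
# of its diagonals (its geometric center) is taken as the pupil center.
import cv2
import numpy as np

def pupil_center(pupil_img, n=15):
    """Return (x, y) of the pupil center via the maximum-sum N*N window."""
    sums = cv2.boxFilter(pupil_img.astype(np.float32), -1, (n, n),
                         normalize=False, borderType=cv2.BORDER_CONSTANT)
    cy, cx = np.unravel_index(np.argmax(sums), sums.shape)
    # boxFilter anchors each window at its center, so (cx, cy) is already
    # the diagonal-intersection point of the maximum window.
    return int(cx), int(cy)
```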
The eyeball tracking method of the present invention is as follows:
Once the center of the human pupil has been accurately obtained, the relevant feature information can be calculated and extracted from the shift of the pupil's line of sight, and a corresponding system control response is performed based on the obtained feature information, thereby realizing the intended eye movement control.
Eye movement feature information extraction based on gaze point movement is mainly realized by the following steps:
Calculating the gaze point offset:
A time threshold T is set and the eye movement time is t; if t is greater than T the eye is in the moving phase, and if t is less than or equal to T it is in the stationary fixation phase;
Based on the pupil center point obtained while the eye is stationary, the current pupil center point O1 is recorded with coordinates (x1, y1); two adjacent frames are then captured and the pupil center point O2 of the second frame is obtained with coordinates (x2, y2); based on the two pupil centers, an appropriate coordinate system is established, and the displacement (xd, yd) and rotation angle θ produced by one eye movement are calculated to give the gaze point offset, for example as in the sketch below.
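A sketch of this offset calculation; treating θ as the polar angle of the displacement vector, and the value of T, are assumptions, since the text only states that a displacement and a rotation angle are computed in an appropriate coordinate system:

```python
# Sketch: gaze-point offset from two successive pupil centers O1 and O2.
# T (in seconds) and the definition of theta are illustrative assumptions.
import math

def gaze_offset(o1, o2, t, T=0.1):
    """Return (xd, yd, theta) for one eye movement, or None while fixating."""
    if t <= T:                      # t <= T: stationary fixation phase
        return None
    (x1, y1), (x2, y2) = o1, o2     # pupil centers of two adjacent frames
    xd, yd = x2 - x1, y2 - y1       # displacement (xd, yd)
    theta = math.atan2(yd, xd)      # rotation angle of the movement
    return xd, yd, theta
```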
Mapping the viewpoint offset:
Based on the displacement difference between the two captured frames, the eyes are moved back and forth over the calibrated coordinate points on the screen for calibration, and the mapping function is solved with a least squares curve fitting algorithm; after the eye movement feature information is obtained, the corresponding system message response is performed based on the acquired eye movement command displacement and angle information.
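A sketch of the least squares calibration step; the second-order polynomial form of the mapping function is a common choice in gaze calibration but is an assumption here, as the patent names only least squares curve fitting:

```python
# Sketch: least-squares fit of the pupil-offset -> screen-point mapping.
# The 2nd-order polynomial basis is an assumed form of the mapping function.
import numpy as np

def _basis(offsets):
    xd, yd = offsets[:, 0], offsets[:, 1]
    return np.column_stack([np.ones_like(xd), xd, yd, xd * yd, xd**2, yd**2])

def fit_mapping(offsets, screen_pts):
    """offsets: (k, 2) pupil offsets; screen_pts: (k, 2) calibration points."""
    A = _basis(np.asarray(offsets, dtype=float))
    coef, *_ = np.linalg.lstsq(A, np.asarray(screen_pts, dtype=float),
                               rcond=None)  # one solve per screen axis
    return coef  # shape (6, 2)

def gaze_to_screen(offset, coef):
    """Map one pupil offset to a predicted screen point (x, y)."""
    return (_basis(np.asarray([offset], dtype=float)) @ coef)[0]
```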
After the eye movement feature information has been obtained by the above method, the corresponding system message response is performed based on the acquired eye movement command displacement and angle information, thereby accomplishing the intended eye movement control task and realizing eye-movement-controlled human-computer interaction. There are two approaches to eye movement interaction: the first monitors the user's eye movement behavior and collects the eye movement behavior data produced during human-computer interaction with the mobile device; the second realizes eye movement interaction through deliberately designed eye movement behaviors.
To achieve the best interactive experience and to guarantee the universality of the eye-movement interaction mode, the eye-movement-controlled human-computer interaction method of the present invention uses the first approach, i.e. natural eye movement behavior during human-computer interaction: the kinetic and postural characteristics of natural eye movement behavior are studied through the capture of specific eye movement postures and behaviors together with the corresponding visual feedback and prompts. The key point of this choice is that the user's natural eye movement control process is not interrupted; the human-computer interaction experience is improved by gathering statistics on, and making use of, the informative content of natural eye movement behavior.
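As a sketch of how an acquired eye movement command could be dispatched to a system message response; the movement threshold and the event names are illustrative assumptions, since the patent only states that displacement and angle information drive the response:

```python
# Sketch: quantize one eye-movement command into a system message.
# move_px and the four scroll events are illustrative assumptions.
import math

def dispatch(xd, yd, theta, move_px=8):
    """Map an eye-movement command (displacement, angle) to a UI event name."""
    if math.hypot(xd, yd) < move_px:
        return "FIXATE"  # small displacement: treat as dwell, no command
    # Quantize the movement direction (image coordinates, y down) into
    # four 90-degree sectors.
    sector = int(((math.degrees(theta) + 45) % 360) // 90)
    return ["SCROLL_RIGHT", "SCROLL_DOWN", "SCROLL_LEFT", "SCROLL_UP"][sector]
```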
The above technical solution has the following beneficial effects:
Using the eye-movement-controlled human-computer interaction method of the present invention, natural eye movement behavior data arising during human-computer interaction with a mobile device are obtained by monitoring the user's eye movement behavior; through effective analysis of large amounts of data, pupil localization and eye movement feature extraction are realized, and interactive behavior based on eye movement control is thereby achieved. This method frees the user's hands from the smart device, reduces the user's effort, and helps the user manipulate the device directly, realizing more efficient and more natural interaction.
Content not specified in detail in the present invention can use the prior art and is therefore not repeated here.
Obviously, those skilled in the art can make various changes and modifications to the invention without departing from its spirit and scope. Thus, if these modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is intended to include them as well.

Claims (10)

1. A display method for a head-mounted AR display device, characterized by comprising the following steps:
a display driver board (1) is connected to the signal output port of a picture output device and to the signal input port of a high-resolution portable screen (2), so that the signal output by the picture output device is transmitted to the portable screen (2) in real time and displayed;
the content displayed on the portable screen (2) is reflected into the human eye by a free-form-surface reflective lens (3) arranged in front of it at an angle.
2. The display method for a head-mounted AR display device according to claim 1, characterized in that the portable screen (2) is located obliquely above the human eye, at a 45° angle to the direction of sight.
3. The display method for a head-mounted AR display device according to claim 1, characterized in that the free-form-surface reflective lens (3) is made of a dedicated polycarbonate (PC), with the lens coated after molding.
4. An interaction method for a head-mounted AR display device, characterized by comprising the following steps:
eye movement data are collected by a camera module (4) arranged at the middle of the free-form-surface reflective lens (3); the eye movement data include user blink data and eyeball displacement data;
data on the rotation angle of the user's head are collected by a gyroscope module (5) arranged in the head-mounted AR display device;
the collected eye movement data and head rotation angle data are processed by a data processing module and fed into algorithmic calculation to realize the interactive function.
5. The interaction method for a head-mounted AR display device according to claim 4, characterized in that the method by which the camera module (4) collects the eyeball displacement data comprises the following steps: eye extraction; edge detection; eye center localization; eyeball tracking.
6. The interaction method for a head-mounted AR display device according to claim 5, characterized in that the eye extraction method is: a face image is obtained by the camera module (4); eye features are extracted; the face image is converted into a corresponding grayscale image; the face grayscale image is processed; and the position of the human eye is determined from changes in the extracted eye feature values.
7. The interaction method for a head-mounted AR display device according to claim 5, characterized in that the edge detection method is: using the property that brightness changes sharply at image edges, edge detection is performed on the obtained grayscale eye region to obtain the pupil region.
8. The interaction method for a head-mounted AR display device according to claim 5, characterized in that the eye center localization method is: pupil contour feature points are obtained by a gradient method, an adaptive threshold is computed to determine the pupil position and size, and the pupil center is determined by cluster fitting of the pupil contour.
9. The interaction method for a head-mounted AR display device according to claim 5, characterized in that the eyeball tracking method is:
calculating the gaze point offset: a time threshold T is set and the eye movement time is t; if t is greater than T the eye is in the moving phase, and if t is less than or equal to T it is in the stationary fixation phase; based on the pupil center point obtained while the eye is stationary, the current pupil center point O1 is recorded with coordinates (x1, y1); two adjacent frames are then captured and the pupil center point O2 of the second frame is obtained with coordinates (x2, y2); based on the two pupil centers, an appropriate coordinate system is established, and the displacement (xd, yd) and rotation angle θ produced by one eye movement are calculated to give the gaze point offset;
mapping the viewpoint offset: based on the displacement difference between the two captured frames, the eyes are moved back and forth over the calibrated coordinate points on the screen for calibration, and the mapping function is solved with a least squares curve fitting algorithm; after the eye movement feature information is obtained, the corresponding system message response is performed based on the acquired eye movement command displacement and angle information.
10. The interaction method for a head-mounted AR display device according to claim 7, characterized in that the specific steps of the edge detection include:
a Gaussian weighted smoothing function is defined and the image is smoothed by convolution with it, the Gaussian weighted smoothing function being defined as
$$G(x_d, y_d) = \frac{1}{2\pi\sigma^2}\exp\left(-\frac{x_d^2 + y_d^2}{2\sigma^2}\right)$$
where σ is the mean square deviation of the Gaussian distribution, and $x_d$ and $y_d$ denote the pixel coordinate values of the two-dimensional image;
the image gradient is computed with a gradient operator to obtain the partial derivative matrices of the convolution-smoothed image in the x and y directions, and non-maxima suppression is applied to the matrices to obtain a binary image;
threshold screening is applied to the binary image; the image obtained after screening contains only the relatively bright pupil boundary and its interior region, giving the pupil region.
CN201811525502.0A 2018-12-13 2018-12-13 Display and interaction method of a head-mounted AR display device Pending CN109613982A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811525502.0A CN109613982A (en) 2018-12-13 2018-12-13 Display and interaction method of a head-mounted AR display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811525502.0A CN109613982A (en) 2018-12-13 2018-12-13 Display and interaction method of a head-mounted AR display device

Publications (1)

Publication Number Publication Date
CN109613982A 2019-04-12

Family

ID=66008216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811525502.0A Pending CN109613982A (en) 2018-12-13 2018-12-13 Display and interaction method of a head-mounted AR display device

Country Status (1)

Country Link
CN (1) CN109613982A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110879469A (en) * 2019-10-31 2020-03-13 华为技术有限公司 Head-mounted display equipment
CN111427150A (en) * 2020-03-12 2020-07-17 华南理工大学 Eye movement signal processing method used under virtual reality head-mounted display and wearable device
CN111665939A (en) * 2020-06-03 2020-09-15 广州市南方人力资源评价中心有限公司 Test question processing method and device based on head-mounted display equipment and electronic equipment
CN113253846A (en) * 2021-06-02 2021-08-13 樊天放 HID (human interface device) interactive system and method based on gaze deflection trend
CN113534959A (en) * 2021-07-27 2021-10-22 咪咕音乐有限公司 Screen display method, screen display device, virtual reality equipment and program product

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110213664A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
CN104615241A (en) * 2015-01-04 2015-05-13 谭希韬 Wearable glass control method and system based on rotation of head
CN105378632A (en) * 2013-06-12 2016-03-02 微软技术许可有限责任公司 User focus controlled graphical user interface using a head mounted device
CN105629479A (en) * 2016-04-05 2016-06-01 杭州映墨科技有限公司 Catadioptric head-wearing display optical system for displaying three-dimensional scene
CN105934730A (en) * 2014-01-23 2016-09-07 微软技术许可有限责任公司 Automated content scrolling
CN107463258A (en) * 2017-08-07 2017-12-12 北京铂石空间科技有限公司 Head-mounted display apparatus, wear-type show interactive system and display exchange method
CN108595008A (en) * 2018-04-27 2018-09-28 北京计算机技术及应用研究所 Man-machine interaction method based on eye movement control

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110213664A1 (en) * 2010-02-28 2011-09-01 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
CN105378632A (en) * 2013-06-12 2016-03-02 微软技术许可有限责任公司 User focus controlled graphical user interface using a head mounted device
CN105934730A (en) * 2014-01-23 2016-09-07 微软技术许可有限责任公司 Automated content scrolling
CN104615241A (en) * 2015-01-04 2015-05-13 谭希韬 Wearable glass control method and system based on rotation of head
CN105629479A (en) * 2016-04-05 2016-06-01 杭州映墨科技有限公司 Catadioptric head-wearing display optical system for displaying three-dimensional scene
CN107463258A (en) * 2017-08-07 2017-12-12 北京铂石空间科技有限公司 Head-mounted display apparatus, wear-type show interactive system and display exchange method
CN108595008A (en) * 2018-04-27 2018-09-28 北京计算机技术及应用研究所 Man-machine interaction method based on eye movement control

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110879469A (en) * 2019-10-31 2020-03-13 华为技术有限公司 Head-mounted display equipment
WO2021082798A1 (en) * 2019-10-31 2021-05-06 华为技术有限公司 Head-mounted display device
CN111427150A (en) * 2020-03-12 2020-07-17 华南理工大学 Eye movement signal processing method used under virtual reality head-mounted display and wearable device
CN111427150B (en) * 2020-03-12 2021-03-30 华南理工大学 Eye movement signal processing method used under virtual reality head-mounted display and wearable device
CN111665939A (en) * 2020-06-03 2020-09-15 广州市南方人力资源评价中心有限公司 Test question processing method and device based on head-mounted display equipment and electronic equipment
CN113253846A (en) * 2021-06-02 2021-08-13 樊天放 HID (human interface device) interactive system and method based on gaze deflection trend
CN113253846B (en) * 2021-06-02 2024-04-12 樊天放 HID interaction system and method based on gaze deflection trend
CN113534959A (en) * 2021-07-27 2021-10-22 咪咕音乐有限公司 Screen display method, screen display device, virtual reality equipment and program product

Similar Documents

Publication Publication Date Title
CN109613982A (en) Display and interaction method of a head-mounted AR display device
US20240007605A1 (en) Apparatus and method for performing motion capture using a random pattern on capture surfaces
AU2015348151B2 (en) Real-time visual feedback for user positioning with respect to a camera and a display
KR102212209B1 (en) Method, apparatus and computer readable recording medium for eye gaze tracking
US7113618B2 (en) Portable virtual reality
CN108153424B (en) Eye movement and head movement interaction method of head display equipment
CN107516335A (en) The method for rendering graph and device of virtual reality
CN107317987A (en) The display data compression method and equipment of virtual reality, system
CN101408800B (en) Method for performing three-dimensional model display control by CCD camera
US20100110069A1 (en) System for rendering virtual see-through scenes
US11170521B1 (en) Position estimation based on eye gaze
JP2016511888A (en) Improvements in and on image formation
CN112232310B (en) Face recognition system and method for expression capture
CN105763829A (en) Image processing method and electronic device
CN108334832A (en) A kind of gaze estimation method based on generation confrontation network
Ravnik et al. Dynamic anamorphosis as a special, computer-generated user interface
Zitnick et al. Manipulation of video eye gaze and head orientation for video teleconferencing
CN106851240A (en) The method and device of image real time transfer
CN105933690A (en) Adaptive method and device for adjusting 3D image content size
Nikolov et al. Gaze-contingent display using texture mapping and opengl: system and applications
CN114267070A (en) VR glasses capable of capturing human body actions and expressions and capturing method thereof
CN110097644B (en) Expression migration method, device and system based on mixed reality and processor
CN113923501B (en) LED screen panoramic display method and system based on VR virtual reality
CN112052827B (en) Screen hiding method based on artificial intelligence technology
Kijima Wearable Interface Devices

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190412