CN116027910B - Eye bitmap generation method and system based on VR eye movement tracking technology - Google Patents


Info

Publication number: CN116027910B (granted patent)
Application number: CN202310315777.6A
Authority: CN (China)
Legal status: Active
Inventor: 李朝远
Assignee (original and current): Guangzhou Shijing Medical Software Co., Ltd.
Original language: Chinese (zh)
Other versions: CN116027910A (application publication)


Abstract

The invention discloses an eye bitmap generation method and system based on VR eye-tracking technology. The method comprises: building a planar background directly in front of the user's field of view with VR technology and obtaining the user's primary eye position through a preset eye-position calibration method; constructing a plurality of target objects on the planar background, with the VR technology, according to the primary eye position and preset target-object position information, and displaying them randomly one at a time; obtaining, through a preset eye-tracking technique, the eye-tracking gaze lines with which the user fixates each target; computing, with the VR technology, the three-dimensional coordinates at which each gaze line falls on the planar background; and deriving the user's eye-position deviation map from those coordinates using a preset deviation-map drawing method. The method improves the efficiency and accuracy of eye-position examination while reducing operational complexity.

Description

Eye bitmap generation method and system based on VR eye movement tracking technology
Technical Field
The invention relates to the technical field of medical examination of strabismus and amblyopia, and in particular to an eye bitmap generation method and system based on VR eye-tracking technology.
Background
In China, with the rapid development of computer graphics, computer system engineering, and related technologies, virtual reality (VR) technology has become increasingly important and has attracted wide interest and attention. Research and application of VR, including the construction of virtual environments and scene models and the development of distributed VR systems, continue to grow in both depth and breadth.
In the prior art, eye position is generally examined with a synoptophore. The patient is asked to superimpose two dissimilar pictures, such as a tractor and a house, so that the tractor appears to enter the house; if the patient has normal retinal correspondence, the horizontal, vertical, and cyclotorsional deviations can be read directly from the dial at the same time, and the deviation measured this way is the objective angle of deviation. Simultaneous-perception pictures such as a lion and a cage are also used: the cage picture is placed on the side of the fixating eye and the lion picture on the other side, the tube on the fixating-eye side is fixed at the 0 position, and the patient moves the tube arm on the other side back and forth by hand until the lion appears centred in the cage; the reading indicated by that tube arm is the subjective angle of deviation.
When the eye position is examined with the synoptophore, there are also no-result situations. For example, when the two pictures cannot be superimposed, simultaneous perception is absent, and two cases arise: (1) only one picture is seen, which is recorded as monocular suppression; (2) both pictures are seen but cannot be superimposed, which is recorded as same-side diplopia, or a crossing point is recorded (when the eyes fixate a certain point, the two pictures suddenly swap positions; that point is the crossing point). In addition, the instruments and equipment used in the prior art are relatively complex to operate, and examination efficiency is low.
Disclosure of Invention
The invention discloses an eye bitmap generation method and system based on VR eye-tracking technology, which improve the efficiency and accuracy of eye-position examination and reduce operational complexity.
To achieve the above object, in a first aspect, the present invention provides an eye bitmap generation method based on VR eye tracking technology, including:
building a planar background directly in front of the user's field of view with VR technology, obtaining the user's primary eye position through a preset eye-position calibration method, and constructing a plurality of target objects on the planar background, with the VR technology, according to the primary eye position and preset target-object position information;
displaying the target objects randomly one at a time, obtaining, through a preset eye-tracking technique, the eye-tracking gaze lines with which the user fixates each target object, and obtaining, with the VR technology, a plurality of three-dimensional coordinates at which the gaze lines fall on the planar background; the gaze lines comprise a left-eye tracking gaze line and a right-eye tracking gaze line; and
obtaining an eye-position deviation map of the user according to the plurality of three-dimensional coordinates and a preset deviation-map drawing method.
In the eye bitmap generation method based on VR eye-tracking technology provided by the invention, a virtual-reality scene is first built directly in front of the user's field of view, and the user is guided to fixate a planar background in that scene so that the user's primary eye position can be obtained; the subsequent examination is therefore based on the user's true eye position, which improves its accuracy. After the primary eye position is determined, a plurality of target objects are constructed on the planar background from preset target position information, which further safeguards the accuracy of the method. The targets are then displayed randomly one at a time, the eye-tracking gaze line with which the user fixates each target is captured, and the three-dimensional coordinates at which each gaze line intersects the planar background are computed; from these coordinates the user's eye-position deviation map is obtained, giving a simple and intuitive examination result. Because the targets are constructed in VR around the user's primary eye position and displayed randomly, examination efficiency and accuracy are improved, and because the examination is carried out entirely in VR, operational complexity is reduced.
As a preferred example, obtaining the user's primary eye position through a preset eye-position calibration method, based on the planar background built directly in front of the user's field of view with VR technology, specifically includes:
building a VR scene with the VR technology, and placing in it two parallel cameras and a planar background facing them; the planar background covers the cameras' entire field of view; and
having the user view the planar background through the parallel cameras and determining the centre of the cameras' field of view, thereby obtaining the user's primary eye position.
In this way, a virtual scene and a planar background are built directly in front of the user's field of view with a preset VR technology, so that eye-position examination can be carried out conveniently against the planar background; the cameras placed in the virtual scene stand in for the user's eyes viewing the background, and the exact centre of the cameras' field of view is taken as the user's primary eye position.
As a preferred example, constructing a plurality of target objects on the planar background, with the VR technology, according to the primary eye position and preset target-object position information specifically includes:
taking the primary eye position as the centre, and constructing target objects on the planar background, with the VR technology, at the centre and at a plurality of positions around it.
Designing the target positions around the primary eye position ensures the accuracy of the eye-position examination, and building the targets in VR reduces operational complexity.
As a preferred example, obtaining the plurality of three-dimensional coordinates at which the user's eye-tracking gaze lines fall on the planar background, with the VR technology, specifically includes:
aiming a handle ray preset in the VR technology at the target object and pressing the trigger, and capturing the user's eye-tracking gaze lines with a VR headset preset in the VR technology; the gaze lines comprise a left-eye tracking gaze line and a right-eye tracking gaze line; and
calculating, through an API provided by the VR headset, the three-dimensional position of the intersection of each gaze line with the planar background, thereby obtaining a plurality of three-dimensional coordinates; the three-dimensional coordinates include left-eye coordinates and right-eye coordinates.
With a simple operation, namely aiming the handle ray at the target and pressing the trigger, the preset VR headset captures the user's left-eye and right-eye gaze lines and the three-dimensional coordinates where those lines cross the planar background, which reduces operational complexity; moreover, because capture is triggered by the user's key press, the user is known to be fixating the target, which further improves examination accuracy.
As a preferred example, obtaining the user's eye-position deviation map according to the three-dimensional coordinates and a preset deviation-map drawing method specifically includes:
obtaining a binocular fixation-point deviation value for the current eye position by calculating the distance between the left-eye and right-eye coordinates; and
obtaining the user's eye-position deviation map by scaling and plotting the plurality of three-dimensional coordinates according to the binocular fixation-point deviation values.
Deriving the binocular fixation-point deviation value from the three-dimensional coordinates and then drawing the deviation map from both gives a simple, clear picture of the user's eye-position condition and improves examination efficiency.
In a second aspect, the invention provides an eye bitmap generation system based on VR eye-tracking technology, comprising a target module, a tracking module, and a drawing module;
the target module is used for obtaining the user's primary eye position through a preset eye-position calibration method, based on a planar background built directly in front of the user's field of view with VR technology, and for constructing a plurality of target objects on the planar background, with the VR technology, according to the primary eye position and preset target-object position information;
the tracking module is used for displaying the target objects randomly one at a time, obtaining, through a preset eye-tracking technique, the eye-tracking gaze lines with which the user fixates each target object, and obtaining, with the VR technology, a plurality of three-dimensional coordinates at which the gaze lines fall on the planar background; the gaze lines comprise a left-eye tracking gaze line and a right-eye tracking gaze line; and
the drawing module is used for obtaining the user's eye-position deviation map according to the three-dimensional coordinates and a preset deviation-map drawing method.
In the eye bitmap generation system based on VR eye-tracking technology provided by the invention, a virtual-reality scene is first built directly in front of the user's field of view, and the user is guided to fixate a planar background in that scene so that the user's primary eye position can be obtained; the subsequent examination is therefore based on the user's true eye position, which improves its accuracy. After the primary eye position is determined, a plurality of target objects are constructed on the planar background from preset target position information, which further safeguards accuracy. The targets are then displayed randomly one at a time, the eye-tracking gaze line with which the user fixates each target is captured, and the three-dimensional coordinates at which each gaze line intersects the planar background are computed; from these coordinates the user's eye-position deviation map is obtained, giving a simple and intuitive examination result. Because the targets are constructed in VR around the user's primary eye position and displayed randomly, examination efficiency and accuracy are improved, and because the examination is carried out entirely in VR, operational complexity is reduced.
As a preferable example, the target module includes a scene unit, a calibration unit, and a building unit;
the scene unit is used for building a VR scene with the VR technology and placing in it two parallel cameras and a planar background facing them; the planar background covers the cameras' field of view;
the calibration unit is used for having the user view the planar background through the parallel cameras and determining the centre of the cameras' field of view, thereby obtaining the user's primary eye position; and
the building unit is used for taking the primary eye position as the centre and constructing target objects on the planar background, with the VR technology, at the centre and at a plurality of positions around it.
In this way, a virtual scene and a planar background are built directly in front of the user's field of view with a preset VR technology, so that eye-position examination can be carried out conveniently against the planar background. The cameras placed in the virtual scene stand in for the user's eyes viewing the background, and the exact centre of the cameras' field of view is taken as the user's primary eye position; target objects are then constructed on the planar background, with the VR technology, at that centre and at a plurality of positions around it. Because the targets are triggered according to the user's actual eye-position condition, the different eye positions are determined reliably and the accuracy of the examination is ensured.
As a preferable example, the tracking module includes a gaze-tracking unit and a coordinate unit;
the gaze-tracking unit is used for aiming a handle ray preset in the VR technology at the target object and pressing the trigger, and capturing the user's eye-tracking gaze lines with a VR headset preset in the VR technology; the gaze lines comprise a left-eye tracking gaze line and a right-eye tracking gaze line; and
the coordinate unit is used for calculating, through an API provided by the VR headset, the three-dimensional position of the intersection of each gaze line with the planar background, thereby obtaining a plurality of three-dimensional coordinates; the three-dimensional coordinates include left-eye coordinates and right-eye coordinates.
With a simple operation, namely aiming the handle ray at the target and pressing the trigger, the preset VR headset captures the user's left-eye and right-eye gaze lines and the three-dimensional coordinates where those lines cross the planar background, which reduces operational complexity; moreover, because capture is triggered by the user's key press, the user is known to be fixating the target, which further improves examination accuracy.
As a preferable example, the drawing module includes a deviation-value unit and a drawing unit;
the deviation-value unit is used for obtaining the binocular fixation-point deviation value of the current eye position by calculating the distance between the left-eye and right-eye coordinates; and
the drawing unit is used for obtaining the user's eye-position deviation map by scaling and plotting the plurality of three-dimensional coordinates according to the binocular fixation-point deviation values.
Deriving the binocular fixation-point deviation value from the three-dimensional coordinates and then drawing the deviation map from both gives a simple, clear picture of the user's eye-position condition and improves examination efficiency.
In a third aspect, the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements an eye bitmap generation method based on VR eye tracking technology as set forth in the first aspect.
Drawings
Fig. 1 is a schematic flow diagram of the eye bitmap generation method based on VR eye-tracking technology provided by an embodiment of the invention;
fig. 2 is a schematic structural diagram of the eye bitmap generation system based on VR eye-tracking technology provided by an embodiment of the invention;
fig. 3 is a schematic diagram of the target-object positions provided by an embodiment of the invention;
fig. 4 is a schematic diagram of the drawing of the eye-position deviation map provided by an embodiment of the invention.
Detailed Description
The following description of the embodiments of the invention is made clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of the invention.
Example 1
This embodiment provides an eye bitmap generation method based on VR eye-tracking technology; its implementation flow is shown in fig. 1 and mainly includes steps 101 to 103:
step 101: according to a planar background built right in front of a field of view of a user by a VR technology, obtaining eye position correction of the user through a preset eye position calibration method, and constructing a plurality of target objects on the planar background by the VR technology according to the eye position correction and preset target object position information.
In this embodiment, the steps mainly include: building a VR scene through the VR technology, and placing two parallel cameras and a plane background opposite to the parallel cameras in the VR scene; the planar background covers a field of view of the parallel cameras; the parallel camera is used for controlling the user to watch the plane background, the visual field center of the parallel camera is determined, and then the eye position alignment of the user is obtained; and taking the eye position righting as a center, and respectively constructing a plurality of target objects on the plane background at a plurality of positions outside the center and the center through the VR technology.
Specifically: the VR technology comprises a developed VR program, which builds the VR scene and places two parallel cameras (cam) in it. The user observes the VR scene through these cameras; the images they render correspond to what the user's left and right eyes see through the VR headset when wearing it. A white plane is placed in the scene facing the cameras as the background, sized to cover the cameras' entire field of view. A red ball is placed on the white plane at the exact centre of the two cameras' field of view; its position represents the centre of the visual field, i.e. the primary eye position. Eight further red balls are then placed at 8 positions around the centre point, so that the 9 red-ball positions represent the 9 gaze points of the eye position map. The white background seen by the user's left and right eyes is shown in fig. 3 and comprises the white background and the 9 red balls on it.
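The 9-ball layout described above (one ball at the view-field centre, eight around it) can be sketched with a few lines of geometry. The coordinates and spacing below are illustrative assumptions; the patent does not give concrete values or a concrete arrangement beyond "centre plus 8 surrounding positions".

```python
# Sketch of the 9-point gaze-target layout: the centre target sits at the
# calibrated primary eye position, and eight more targets surround it on a
# 3x3 grid. The grid arrangement and spacing are assumptions for illustration.

def nine_point_targets(center, spacing):
    """Return 9 (x, y) target positions on the planar background.

    center  -- (x, y) of the view-field centre (primary eye position)
    spacing -- offset of the outer targets from the centre
    """
    cx, cy = center
    offsets = [(dx, dy) for dy in (spacing, 0.0, -spacing)
                        for dx in (-spacing, 0.0, spacing)]
    return [(cx + dx, cy + dy) for dx, dy in offsets]

targets = nine_point_targets((0.0, 0.0), 0.2)
```

Displaying one of these positions at random while hiding the rest, as the embodiment describes, then reduces to picking one element of `targets` per trial.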
In this step, a virtual scene and a planar background are built directly in front of the user's field of view with a preset VR technology, so that eye-position examination can be carried out conveniently against the planar background. The cameras placed in the virtual scene stand in for the user's eyes viewing the background, and the exact centre of the cameras' field of view is taken as the user's primary eye position; target objects are then constructed on the planar background, with the VR technology, at that centre and at a plurality of positions around it. Because the targets are triggered according to the user's actual eye-position condition, the different eye positions are determined reliably and the accuracy of the examination is ensured.
Step 102: displaying the target objects randomly one at a time, obtaining, through a preset eye-tracking technique, the eye-tracking gaze lines with which the user fixates each target object, and obtaining, with the VR technology, a plurality of three-dimensional coordinates at which the gaze lines fall on the planar background; the gaze lines comprise a left-eye tracking gaze line and a right-eye tracking gaze line.
In this embodiment, this step mainly includes: aiming a handle ray preset in the VR technology at the target object and pressing the trigger, and capturing the user's eye-tracking gaze lines (left-eye and right-eye) with a VR headset preset in the VR technology; then calculating, through an API provided by the VR headset, the three-dimensional position of the intersection of each gaze line with the planar background, thereby obtaining a plurality of three-dimensional coordinates comprising left-eye coordinates and right-eye coordinates.
Specifically: the VR mode of the headset is set to a fixed-head-position mode, which guarantees that the user's field of view does not change as the head rotates, so the user must rotate the eyeballs to fixate the red ball in each direction; this simulates the 9-direction eye-movement chart examination used in reality. One of the 9 red balls on the white plane is displayed at random while the others are hidden. The user aims the handle ray of the VR headset at the displayed ball and presses the trigger, which makes the ball disappear; at the same moment, the headset system returns the left-eye and right-eye gaze rays l_ray and r_ray traced while the ball was being fixated. The three-dimensional positions of the collision points of these rays with the white plane are calculated through an API provided by the system, yielding the two three-dimensional coordinates l_ps(x, y, z) for the left eye and r_ps(x, y, z) for the right eye.
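The computation delegated here to the headset API amounts to a standard ray/plane intersection. The sketch below is a minimal stand-in, assuming the background plane is given by a point on it and a normal vector; the function name and inputs are illustrative, not the actual headset API.

```python
# Minimal ray/plane intersection, standing in for the headset API call that
# returns where a tracked gaze ray (l_ray or r_ray) hits the white plane.
# A ray is origin + t * direction; the plane is (plane_point, plane_normal).

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the 3-D point where the ray meets the plane, or None if parallel."""
    dot = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(dot) < 1e-9:          # gaze ray parallel to the background plane
        return None
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / dot
    return tuple(o + t * d for o, d in zip(origin, direction))

# A gaze ray from the camera origin looking straight at a plane 2 units away:
hit = ray_plane_intersection((0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 0, -1))
```

Applied once to l_ray and once to r_ray, this yields the l_ps and r_ps coordinates described above.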
In this step, with a simple operation, namely aiming the handle ray at the target and pressing the trigger, the preset VR headset captures the user's left-eye and right-eye gaze lines and the three-dimensional coordinates where those lines cross the planar background, which reduces operational complexity; moreover, because capture is triggered by the user's key press, the user is known to be fixating the target, which further improves examination accuracy.
Step 103: obtaining the user's eye-position deviation map according to the plurality of three-dimensional coordinates and a preset deviation-map drawing method.
In this embodiment, this step mainly includes: obtaining a binocular fixation-point deviation value for the current eye position by calculating the distance between the left-eye and right-eye coordinates; and obtaining the user's eye-position deviation map by scaling and plotting the plurality of three-dimensional coordinates according to the binocular fixation-point deviation values.
Specifically: the binocular fixation-point deviation value of the current eye position is obtained by calculating the distance between the left-eye and right-eye coordinate positions (a straight-line distance: the larger it is, the larger the deviation between the two eyes' fixation points). The next red ball is then displayed, cycling until all 9 red balls have been clicked, which yields 18 recorded coordinates for the 9 ball positions, as tabulated below:
(Table: the 18 recorded left-eye and right-eye coordinates for the 9 gaze positions; rendered as an image in the source and not reproduced here.)
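The binocular fixation-point deviation value defined in this step is simply the straight-line distance between l_ps and r_ps. A minimal sketch, with made-up sample coordinates for illustration:

```python
# The deviation value for one gaze position is the Euclidean distance between
# the left-eye and right-eye intersection points on the background plane.
import math

def fixation_deviation(l_ps, r_ps):
    """Straight-line distance between the left- and right-eye gaze points."""
    return math.dist(l_ps, r_ps)

# Illustrative coordinates (not from the patent): a 3-4-5 offset in x and y.
deviation = fixation_deviation((0.00, 0.01, 2.0), (0.03, 0.05, 2.0))
```

Repeating this for all 9 gaze positions gives the 9 deviation values used when drawing the map.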
An eye-position deviation map is then drawn by scaling and plotting the x and y values of the 18 recorded coordinates together with the obtained binocular fixation-point deviation values. The resulting map, shown in fig. 4, contains the eye-position coordinates of the left and right eyes, and the user's eye-position examination result is read from it.
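One plausible reading of the scaling step can be sketched as follows: the recorded plane coordinates are mapped linearly into chart space so that all 18 points (9 left-eye and 9 right-eye) can be drawn together on one deviation map. The transform and its parameters are assumptions; the patent does not specify the exact mapping.

```python
# Linear scaling of recorded (x, y) plane coordinates into chart coordinates,
# so left- and right-eye points can be plotted on one deviation map.
# scale and chart_center are illustrative assumptions.

def scale_points(points, scale, chart_center):
    """Map plane (x, y) coordinates into chart coordinates around chart_center."""
    cx, cy = chart_center
    return [(cx + scale * x, cy + scale * y) for x, y in points]

# Two sample recorded points, mapped onto a 500x500 chart centred at (250, 250):
chart_pts = scale_points([(0.0, 0.0), (0.2, -0.2)], 100.0, (250.0, 250.0))
```

Plotting the scaled left-eye and right-eye point for each gaze position side by side then makes the deviation at each of the 9 positions directly visible.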
In this step, the binocular fixation-point deviation value is derived from the three-dimensional coordinates and the deviation map is drawn from both, so the user's eye-position condition is obtained simply and clearly and examination efficiency is improved.
On the other hand, this embodiment also provides an eye bitmap generation system based on VR eye-tracking technology; its structure is shown in fig. 2 and mainly comprises a target module 201, a tracking module 202, and a drawing module 203.
The target module 201 is used for obtaining the user's primary eye position through a preset eye-position calibration method, based on a planar background built directly in front of the user's field of view with VR technology, and for constructing a plurality of target objects on the planar background, with the VR technology, according to the primary eye position and preset target-object position information.
The tracking module 202 is used for displaying the target objects randomly one at a time, obtaining, through a preset eye-tracking technique, the eye-tracking gaze lines with which the user fixates each target object, and obtaining, with the VR technology, a plurality of three-dimensional coordinates at which the gaze lines fall on the planar background; the gaze lines comprise a left-eye tracking gaze line and a right-eye tracking gaze line.
The drawing module 203 is used for obtaining the user's eye-position deviation map according to the plurality of three-dimensional coordinates and a preset deviation-map drawing method.
In this embodiment, the target module 201 includes a scene unit, a calibration unit, and a setup unit.
The scene unit is used for constructing a VR scene through the VR technology, and placing two parallel cameras and a plane background right opposite to the parallel cameras in the VR scene; the planar background covers a field of view of the parallel cameras.
The calibration unit is used for controlling the user to look at the plane background through the parallel camera, determining the visual field center of the parallel camera, and further obtaining the eye position alignment of the user.
The building unit is used for taking the eye position as a center, and constructing a plurality of target objects on the plane background at a plurality of positions outside the center and the center through the VR technology.
In this embodiment, the tracking module 202 includes a gaze-tracking unit and a coordinate unit.
The gaze-tracking unit is used to capture the user's eye-tracking lines of sight through a VR headset preset in the VR technology when a controller ray preset in the VR technology is aimed at the target object and a trigger key is pressed; the eye-tracking line of sight includes a left-eye tracking line of sight and a right-eye tracking line of sight.
The coordinate unit is used to calculate, through an API preset in the VR headset, the three-dimensional coordinate of the intersection of each eye-tracking line of sight with the planar background, obtaining a plurality of three-dimensional coordinates at which the user's eye-tracking lines of sight fall on the planar background; the three-dimensional coordinates include left-eye coordinates and right-eye coordinates.
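The geometry behind this calculation is a standard ray–plane intersection. In practice the headset SDK's API performs this step, so the following Python sketch is only an illustration: the eye origin, gaze direction, and plane parameters used below are assumed values, not taken from the patent.

```python
# Illustrative sketch: intersection of a gaze ray (origin + direction)
# with the planar background, given as a point on the plane and its normal.
def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the 3D point where the ray meets the plane, or None if the
    ray is parallel to the plane or the hit lies behind the eye."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # gaze ray parallel to the background plane
    t = sum((p - o) * n for o, p, n in zip(origin, plane_point, plane_normal)) / denom
    if t < 0:
        return None  # intersection behind the eye
    return tuple(o + t * d for o, d in zip(origin, direction))

# Assumed example: left eye at x = -0.03 m looking straight ahead at a
# background plane 1 m away, facing the cameras.
hit = ray_plane_intersection((-0.03, 0.0, 0.0), (0.0, 0.0, 1.0),
                             (0.0, 0.0, 1.0), (0.0, 0.0, -1.0))
```

Running the same computation for the left-eye and right-eye gaze rays yields the left-eye and right-eye coordinates used by the drawing module.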
In this embodiment, the drawing module 203 includes a deviation value unit and a drawing unit.
The deviation-value unit is used to obtain the binocular fixation-point deviation value of the current eye position by calculating the distance between the left-eye coordinate and the right-eye coordinate.
The drawing unit is used to obtain the eye-position deviation map of the user by scaling and plotting the plurality of three-dimensional coordinates according to the binocular fixation-point deviation values.
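A minimal sketch of the arithmetic involved, assuming the deviation value is the Euclidean distance between the left-eye and right-eye gaze points and an illustrative scaling rule; the embodiment does not specify the exact formulas, so both are assumptions:

```python
# Assumed formulas, not taken from the patent text: the binocular
# fixation-point deviation as a Euclidean distance, and a scale factor
# that shrinks the plot as the deviation grows so the map stays in view.
import math

def fixation_deviation(left, right):
    """Euclidean distance between the left- and right-eye gaze points."""
    return math.dist(left, right)

def scale_points(points, deviation, base=1.0):
    """Scale gaze points for plotting by an assumed deviation-dependent factor."""
    k = base / (1.0 + deviation)
    return [tuple(k * c for c in p) for p in points]

dev = fixation_deviation((0.0, 0.0, 1.0), (0.03, 0.04, 1.0))  # ~0.05 m
```

Plotting the scaled left-eye and right-eye points for every target position then yields the eye-position deviation map.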
This embodiment also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the eye bitmap generation method based on VR eye-tracking technology described in this embodiment.
According to the eye bitmap generation method and system based on VR eye-tracking technology, a virtual-reality scene is built directly in front of the user's field of view through VR technology, and the user is made to gaze at a planar background in that scene, from which the corrected eye position of the user is obtained; the subsequent examination is then conducted according to the user's actual eye-position condition, which improves the accuracy of the eye-position examination. After the corrected eye position is determined, a plurality of target objects are constructed on the planar background according to preset target-object position information, which ensures the accuracy of the eye bitmap generation method. The target objects are then displayed randomly one at a time, the eye-tracking line of sight of the user gazing at each target object is captured, the three-dimensional coordinates at which each line of sight crosses the planar background are computed, and the user's eye-position deviation map is obtained from those coordinates, giving an intuitive eye-position examination result. In the invention, based on the user's corrected eye position, target objects are constructed through preset VR technology and displayed randomly one at a time to obtain the user's eye-position examination results, which improves both examination efficiency and accuracy; at the same time, using VR technology to examine the user's eye position reduces the complexity of the operation.
The foregoing embodiments have been provided for the purpose of illustrating the general principles of the present invention, and are not to be construed as limiting the scope of the invention. It should be noted that any modifications, equivalent substitutions, improvements, etc. made by those skilled in the art without departing from the spirit and principles of the present invention are intended to be included in the scope of the present invention.

Claims (6)

1. An eye bitmap generation method based on VR eye-tracking technology, characterized by comprising the following steps:
obtaining the corrected eye position of the user through a preset eye-position calibration method, based on a planar background constructed directly in front of the user's field of view using VR technology, and constructing a plurality of target objects on the planar background through the VR technology according to the corrected eye position and preset target-object position information;
displaying the target objects randomly one at a time; aiming a controller ray preset in the VR technology at the target object and pressing a trigger key; capturing the user's eye-tracking lines of sight through a VR headset preset in the VR technology; and calculating, through an API preset in the VR headset, the three-dimensional coordinate of the intersection of each eye-tracking line of sight with the planar background, thereby obtaining a plurality of three-dimensional coordinates at which the user's eye-tracking lines of sight fall on the planar background; the eye-tracking line of sight comprises a left-eye tracking line of sight and a right-eye tracking line of sight; the three-dimensional coordinates comprise left-eye coordinates and right-eye coordinates;
and calculating the distance between the left-eye coordinate and the right-eye coordinate to obtain the binocular fixation-point deviation value of the current eye position, and scaling and plotting the plurality of three-dimensional coordinates according to the binocular fixation-point deviation values to obtain the eye-position deviation map of the user.
2. The eye bitmap generation method based on VR eye-tracking technology of claim 1, wherein obtaining the corrected eye position of the user through a preset eye-position calibration method, based on the planar background constructed directly in front of the user's field of view using VR technology, specifically comprises:
building a VR scene through the VR technology, and placing in the VR scene two parallel cameras and a planar background directly facing them; the planar background covers the field of view of the parallel cameras;
and having the parallel cameras constructed in the VR scene gaze at the planar background in place of the user's eyes, and taking the center of the cameras' field of view as the corrected eye position of the user.
3. The eye bitmap generation method based on VR eye-tracking technology of claim 1, wherein constructing a plurality of target objects on the planar background through the VR technology according to the corrected eye position and preset target-object position information specifically comprises:
taking the corrected eye position as the center, and constructing a plurality of target objects on the planar background through the VR technology, at the center and at a plurality of positions around it.
4. An eye bitmap generation system based on VR eye-tracking technology, characterized by comprising a target module, a tracking module and a drawing module;
the target module is used to obtain the corrected eye position of the user through a preset eye-position calibration method, based on a planar background constructed directly in front of the user's field of view using VR technology, and to construct a plurality of target objects on the planar background through the VR technology according to the corrected eye position and preset target-object position information;
the tracking module is used to display the target objects randomly one at a time, to capture the user's eye-tracking lines of sight through a VR headset preset in the VR technology when a controller ray preset in the VR technology is aimed at the target object and a trigger key is pressed, and to calculate, through an API preset in the VR headset, the three-dimensional coordinate of the intersection of each eye-tracking line of sight with the planar background, obtaining a plurality of three-dimensional coordinates at which the user's eye-tracking lines of sight fall on the planar background; the eye-tracking line of sight comprises a left-eye tracking line of sight and a right-eye tracking line of sight; the three-dimensional coordinates comprise left-eye coordinates and right-eye coordinates;
the drawing module is used to obtain the binocular fixation-point deviation value of the current eye position by calculating the distance between the left-eye coordinate and the right-eye coordinate, and to scale and plot the three-dimensional coordinates according to the binocular fixation-point deviation values to obtain the eye-position deviation map of the user.
5. The eye bitmap generation system based on VR eye-tracking technology of claim 4, wherein the target module comprises a scene unit, a calibration unit and a building unit;
the scene unit is used to build a VR scene through the VR technology, and to place in the VR scene two parallel cameras and a planar background directly facing them; the planar background covers the field of view of the parallel cameras;
the calibration unit is used to have the parallel cameras constructed in the VR scene gaze at the planar background in place of the user's eyes, and to take the center of the cameras' field of view as the corrected eye position of the user;
the building unit is used to take the corrected eye position as the center and to construct a plurality of target objects on the planar background through the VR technology, at the center and at a plurality of positions around it.
6. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the eye bitmap generation method based on VR eye-tracking technology of any one of claims 1 to 3.
CN202310315777.6A 2023-03-29 2023-03-29 Eye bitmap generation method and system based on VR eye movement tracking technology Active CN116027910B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310315777.6A CN116027910B (en) 2023-03-29 2023-03-29 Eye bitmap generation method and system based on VR eye movement tracking technology

Publications (2)

Publication Number Publication Date
CN116027910A CN116027910A (en) 2023-04-28
CN116027910B true CN116027910B (en) 2023-07-04

Family

ID=86091265

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310315777.6A Active CN116027910B (en) 2023-03-29 2023-03-29 Eye bitmap generation method and system based on VR eye movement tracking technology

Country Status (1)

Country Link
CN (1) CN116027910B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014052758A (en) * 2012-09-06 2014-03-20 Hiroshima City Univ Sight line measurement method
JP2017107359A (en) * 2015-12-09 2017-06-15 Kddi株式会社 Image display device, program, and method that displays object on binocular spectacle display of optical see-through type
CN108634926A (en) * 2018-05-14 2018-10-12 杭州市余杭区第五人民医院 Vision testing method, device, system based on VR technologies and storage medium
US10742944B1 (en) * 2017-09-27 2020-08-11 University Of Miami Vision defect determination for facilitating modifications for vision defects related to double vision or dynamic aberrations

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000152285A (en) * 1998-11-12 2000-05-30 Mr System Kenkyusho:Kk Stereoscopic image display device
JP5714949B2 (en) * 2010-10-05 2015-05-07 パナソニック株式会社 Eye displacement measurement device
US8888287B2 (en) * 2010-12-13 2014-11-18 Microsoft Corporation Human-computer interface system having a 3D gaze tracker
US10231614B2 (en) * 2014-07-08 2019-03-19 Wesley W. O. Krueger Systems and methods for using virtual reality, augmented reality, and/or a synthetic 3-dimensional information for the measurement of human ocular performance
CN103239347B (en) * 2013-05-09 2015-05-13 北京大学 System for treating visual dysfunction by adopting ocular dominance regulation
WO2014188727A1 (en) * 2013-05-22 2014-11-27 国立大学法人神戸大学 Line-of-sight measurement device, line-of-sight measurement method and line-of-sight measurement program
WO2016002656A1 (en) * 2014-06-30 2016-01-07 凸版印刷株式会社 Line-of-sight measurement system, line-of-sight measurement method, and program
KR101862499B1 (en) * 2015-11-27 2018-05-29 포브, 아이엔씨. Viewpoint detecting system, point of gaze determining method and point of gaze determining program
JP6776970B2 (en) * 2017-03-24 2020-10-28 株式会社Jvcケンウッド Gaze detection device, gaze detection method and gaze detection program
DE112018006164B4 (en) * 2018-01-05 2021-05-12 Mitsubishi Electric Corporation VISUAL DIRECTION CALIBRATION DEVICE, VISUAL DIRECTION CALIBRATION PROCEDURE, AND VISUAL DIRECTION CALIBRATION PROGRAM
US20200121195A1 (en) * 2018-10-17 2020-04-23 Battelle Memorial Institute Medical condition sensor
CN114027783A (en) * 2021-11-30 2022-02-11 首都医科大学附属北京同仁医院 First-eye strabismus diagnosis method based on virtual reality and eye movement tracking technology
CN114052649A (en) * 2021-11-30 2022-02-18 首都医科大学附属北京同仁医院 Alternate covering strabismus diagnosis method based on virtual reality and eye movement tracking technology
CN114610161B (en) * 2022-05-10 2022-07-22 北京明仁视康科技有限公司 Visual target control method and system of visual rehabilitation device
CN115409774A (en) * 2022-07-13 2022-11-29 广州视景医疗软件有限公司 Eye detection method based on deep learning and strabismus screening system
CN115590462A (en) * 2022-12-01 2023-01-13 广州视景医疗软件有限公司(Cn) Vision detection method and device based on camera

Similar Documents

Publication Publication Date Title
US10699439B2 (en) Method and an apparatus for determining a gaze point on a three-dimensional object
US10198865B2 (en) HMD calibration with direct geometric modeling
EP1292877B1 (en) Apparatus and method for indicating a target by image processing without three-dimensional modeling
CN105872526B (en) Binocular AR wears display device and its method for information display
Canessa et al. Calibrated depth and color cameras for accurate 3D interaction in a stereoscopic augmented reality environment
US20070279590A1 (en) Sight-Line Detection Method and Device, and Three-Dimensional View-Point Measurement Device
CN113808160B (en) Sight direction tracking method and device
US20100315414A1 (en) Display of 3-dimensional objects
CN108259887B (en) Method and device for calibrating fixation point and method and device for calibrating fixation point
CN109901290B (en) Method and device for determining gazing area and wearable device
CN108369744A (en) It is detected by the 3D blinkpunkts of binocular homography
Tang et al. Evaluation of calibration procedures for optical see-through head-mounted displays
CN108537103B (en) Living body face detection method and device based on pupil axis measurement
CN109828663A (en) Determination method and device, the operating method of run-home object of aiming area
CN104679222A (en) Medical office system based on human-computer interaction, medical information sharing system and method
US20190281280A1 (en) Parallax Display using Head-Tracking and Light-Field Display
Hua et al. A testbed for precise registration, natural occlusion and interaction in an augmented environment using a head-mounted projective display (HMPD)
CN116027910B (en) Eye bitmap generation method and system based on VR eye movement tracking technology
Liao et al. AR interfaces for disocclusion—a comparative study
Andersen et al. A hand-held, self-contained simulated transparent display
US20030179249A1 (en) User interface for three-dimensional data sets
CN110060349A (en) A method of extension augmented reality head-mounted display apparatus field angle
CN107884930B (en) Head-mounted device and control method
CN107403406B (en) Method and system for converting between solid image and virtual image
CN113961068A (en) Close-distance real object eye movement interaction method based on augmented reality helmet

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant