CN107908285B - Data processing method, device and system - Google Patents


Info

Publication number
CN107908285B
CN107908285B (application CN201711115810.1A)
Authority
CN
China
Prior art keywords
user
eye movement
movement range
virtual electronic
eyeball
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711115810.1A
Other languages
Chinese (zh)
Other versions
CN107908285A (en)
Inventor
郭凤阳
张洲
兰顺
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201711115810.1A priority Critical patent/CN107908285B/en
Publication of CN107908285A publication Critical patent/CN107908285A/en
Application granted granted Critical
Publication of CN107908285B publication Critical patent/CN107908285B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

According to the method, the eye movement range output by the virtual electronic device is controlled, based on the position of the user's eyeball, to be projected onto the eyeball, so that the user can view the virtual scene.

Description

Data processing method, device and system
Technical Field
The present invention relates to the field of optical technologies, and in particular, to a data processing method, apparatus, and system.
Background
With the continuous development of technology, virtual display devices have advanced rapidly. In a typical head-mounted display device, as shown in fig. 1, the size of the exit pupil can be determined from the distance between the eye and the lens (the eye relief), which in turn determines the eye movement range (eye box) available to a user wearing the device.
It should be noted that a larger eye movement range makes viewing more comfortable for the user. However, because a virtual display device is constrained in size and volume, it cannot be made arbitrarily large; on the premise of not increasing the device's size, its eye movement range is therefore fixed. When the user's wearing angle changes, the line of sight shifts out of this fixed range, the virtual scene can no longer be seen, and the user experience suffers.
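The geometric trade-off described above can be sketched numerically. The formula below is a common simple-magnifier approximation; the function name and sample values are illustrative assumptions, not taken from the patent:

```python
import math

def eye_box_width(exit_pupil_mm: float, eye_relief_mm: float,
                  full_fov_deg: float) -> float:
    """Approximate usable eye-box width for a simple magnifier-style HMD.

    Uses the common geometric approximation
        eye_box ~= exit_pupil - 2 * eye_relief * tan(FOV / 2):
    for a fixed exit pupil, the eye box shrinks as the eye sits
    farther from the lens.  Distances in millimetres; clamped at 0.
    """
    half_fov = math.radians(full_fov_deg) / 2.0
    return max(0.0, exit_pupil_mm - 2.0 * eye_relief_mm * math.tan(half_fov))

# Moving the eye away from the lens shrinks the eye box:
near = eye_box_width(exit_pupil_mm=20.0, eye_relief_mm=10.0, full_fov_deg=40.0)
far = eye_box_width(exit_pupil_mm=20.0, eye_relief_mm=15.0, full_fov_deg=40.0)
print(f"eye box at 10 mm: {near:.1f} mm, at 15 mm: {far:.1f} mm")
```

This is only a sketch of why the eye box is fixed once the optics are fixed; the exact relation depends on the lens design.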
Disclosure of Invention
In view of this, the present invention provides a data processing method that adjusts the eye movement range of a virtual electronic device according to the position of the user's eyeball, solving the prior-art problem that the user cannot see the virtual scene when the line of sight changes. The technical scheme is as follows:
a data processing method is applied to virtual electronic equipment and comprises the following steps:
acquiring a position parameter of eyeballs of a user;
and controlling the eye movement range output by the virtual electronic equipment to be projected to the user eyeball based on the position parameter, wherein the projection direction of the eye movement range is changed along with the change of the position parameter of the user eyeball.
Optionally, the acquiring the position parameter of the eyeball of the user includes:
acquiring a coordinate position of the user eyeball on a first preset plane;
and acquiring an included angle between the eyeballs of the user and a first preset direction.
Optionally, the controlling the eye movement range output by the virtual electronic device to be projected to the user eyeball includes:
dividing the first preset plane into four quadrants, and controlling the projection direction of the eye movement range output by the virtual electronic equipment to rotate in a second direction according to the included angle when the coordinate position of the eyeball of the user is in the first quadrant and the second quadrant;
and when the coordinate position of the eyeball of the user is in a third quadrant and a fourth quadrant, controlling the projection direction of the eye movement range output by the virtual electronic equipment to rotate in a third direction according to the included angle, wherein the third direction and the second direction are opposite.
Optionally, the data processing method further includes:
acquiring a coordinate range of the eye movement range output by the virtual electronic equipment on the first preset plane;
and judging whether the coordinate position of the eyeball of the user on a first preset plane is in the coordinate range of the eye movement range on the first preset plane, if so, controlling the projection direction of the eye movement range output by the virtual electronic equipment not to rotate.
A data processing apparatus applied to a virtual electronic device, the data processing apparatus comprising:
the first acquisition module is used for acquiring the position parameters of eyeballs of the user;
and the control module is used for controlling the eye movement range output by the virtual electronic equipment to be projected to the eyeballs of the user based on the position parameters, wherein the projection direction of the eye movement range is changed along with the change of the position parameters of the eyeballs of the user.
Optionally, the first obtaining module includes:
the first acquisition unit is used for acquiring the coordinate position of the user eyeball on a first preset plane;
and the second acquisition unit is used for acquiring the included angle between the eyeballs of the user and the first preset direction.
Optionally, the control module includes:
the dividing unit is used for dividing the first preset plane into four quadrants;
the first control unit is used for controlling the projection direction of the eye movement range output by the virtual electronic equipment to rotate in a second direction according to the included angle when the coordinate position of the eyeball of the user is in a first quadrant and a second quadrant;
and the second control unit is used for controlling the projection direction of the eye movement range output by the virtual electronic equipment to rotate in a third direction according to the included angle when the coordinate position of the eyeball of the user is in a third quadrant and a fourth quadrant, and the third direction and the second direction are opposite.
Optionally, the method further includes:
the second acquisition module is used for acquiring the coordinate range of the eye movement range output by the virtual electronic equipment on the first preset plane;
and the judging module is used for judging whether the coordinate position of the eyeball of the user on a first preset plane is in the coordinate range of the eye movement range on the first preset plane, and if so, controlling the projection direction of the eye movement range output by the virtual electronic equipment not to rotate.
A data processing system comprising:
a memory for storing a program;
a processor for executing the program, wherein the program executes any one of the above data processing methods when running.
A storage medium storing a program which, when executed by a processor, implements any one of the above-described data processing methods.
The technical scheme has the following beneficial effects:
according to the data processing method provided by the invention, the eye movement range output by the virtual electronic equipment can be controlled to be projected to the eyeballs of the user according to the positions of the eyeballs of the user, so that the user can watch a virtual scene.
Drawings
To illustrate the embodiments of the present invention or the prior-art technical solutions more clearly, the drawings used in their description are briefly introduced below. Obviously, the drawings described below show only embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram illustrating a principle of determining an eye movement range according to an eye distance in a virtual electronic device;
fig. 2 is a schematic flow chart of a data processing method according to an embodiment of the present invention;
fig. 3 is a schematic diagram illustrating a light refraction angle of a virtual electronic device according to an embodiment of the invention;
fig. 4 is a schematic flowchart of a data processing method according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a first preset plane in a data processing method according to an embodiment of the present invention;
fig. 6 is a schematic view of angles of light refracted by a virtual electronic device based on different positions of eyeballs of a user in a data processing method according to an embodiment of the present invention;
fig. 7 is a schematic flowchart of a data processing method according to an embodiment of the present invention;
fig. 8 is a schematic diagram illustrating a first preset plane division in a data processing method according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments that a person skilled in the art can derive from them without creative effort shall fall within the protection scope of the present invention.
Referring to fig. 2, fig. 2 is a schematic flow chart of a data processing method according to an embodiment of the present invention, where the data processing method is applied to a virtual electronic device, and includes the steps of:
s21, acquiring the position parameters of the eyeballs of the user;
the inventor finds that the eye movement range projected by the virtual electronic device in the prior art is a range with a fixed size, however, due to different user faces, such as different heights of nose bridges of different users, the angles of virtual scenes projected by the virtual electronic device are different when different users wear the same virtual electronic device, and therefore, the wearing angle of wearing the virtual electronic device needs to be manually adjusted by the user to optimize the angle of the viewed virtual scene, which undoubtedly reduces the experience of the user in using the virtual electronic device.
Based on this, the present embodiment provides a data processing method that first obtains the position parameter of the user's eyeball. Specifically, as shown in fig. 3, a near-infrared (NIR) sensor may be disposed in the virtual electronic device. When a user wears the device (e.g., a head-mounted display, HMD), infrared light emitted by the NIR sensor illuminates the user's eye, the iris reflects the infrared light, and the NIR sensor detects the reflected light to determine the position parameter of the eyeball. The position parameter may include the specific position and orientation of the eyeball.
Specifically, the present embodiment further provides a method for obtaining the position parameter of the user's eyeball, as shown in fig. 4, including the steps of:
s41, acquiring the coordinate position of the user eyeball on a first preset plane;
and S42, acquiring an included angle between the eyeball of the user and the first preset direction.
With reference to fig. 5, a preset plane may be predefined; for example, it may be a plane A perpendicular to a reference ray emitted by the virtual electronic device. The intersection of this reference ray with the first preset plane is defined as the coordinate origin (0, 0). The coordinate position of the user's eyeball on the first preset plane is then determined from the projection position, detected by the NIR sensor, of the light refracted by the iris onto that plane. For example, the position of the user's eye in the figure may be (-1, 1), located in the first quadrant of plane A.
In addition, the present embodiment obtains the included angle between the user's eyeball and a first preset direction. Preferably, the first preset direction is defined as the direction of a reference ray emitted by the virtual electronic device, such as the horizontal direction in the figure. As can be seen there, the angle between the user's eyeball and the first preset direction is α; this step obtains the value of that angle.
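The two acquisition steps S41 and S42 can be sketched as follows. The function name, the use of `atan2`, and the calibration distance are illustrative assumptions; the patent does not specify how the angle is computed from the sensor reading:

```python
import math

def eyeball_position_params(spot_x: float, spot_y: float,
                            plane_distance: float):
    """Sketch of steps S41/S42.

    spot_x, spot_y  -- coordinates of the iris reflection on the first
                       preset plane (plane A), with the reference ray's
                       intersection as origin (0, 0).
    plane_distance  -- assumed calibration distance from the sensor to
                       plane A along the first preset direction.

    Returns ((x, y), alpha_deg): the coordinate position (S41) and the
    included angle with the first preset direction in degrees (S42).
    """
    offset = math.hypot(spot_x, spot_y)       # radial offset from the origin
    alpha_deg = math.degrees(math.atan2(offset, plane_distance))
    return (spot_x, spot_y), alpha_deg

# The eyeball example from the text, at (-1, 1) on plane A:
coords, alpha = eyeball_position_params(-1.0, 1.0, plane_distance=8.0)
```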
It should be noted that the definitions above are only illustrative; the present invention is not limited to this way of defining the preset direction or of acquiring the user's eyeball position information. For example, the plane on which the virtual electronic device displays the virtual scene may instead be defined as the first preset plane, in which case the projection position of the user's eyeball on that plane is obtained.
And S22, controlling the eye movement range output by the virtual electronic equipment to be projected to the user eyeball based on the position parameter, wherein the projection direction of the eye movement range changes along with the change of the position parameter of the user eyeball.
In this embodiment, the virtual electronic device may adjust its projection angle according to the position parameter of the user's eyeball. Specifically, a MEMS sensor may be disposed on the device. When the eyeball is not at the angle that best views the virtual scene, the light refraction angle of the MEMS sensor is adjusted according to the obtained included angle between the eyeball and the preset direction, so that the eye movement range projected by the device (into which the virtual scene is projected) directly faces the user's eyeball, giving the user a good viewing experience.
Specifically, in this embodiment, as shown in fig. 6, the MEMS sensor may be disposed at the lower end of the virtual electronic device so that the light is refracted by it. For example, in fig. 6a, after the user wears the device, the obtained position of the eyeball is above the first preset direction (e.g., the horizontal direction); the MEMS sensor is then rotated by a certain angle so that the refracted light directly faces the user's eye. The MEMS adjustment angle can be determined from the included angle between the eyeball and the first preset direction: if that angle is 10 degrees, the light refracted by the MEMS is controlled to make a 10-degree angle with the first preset direction, ensuring that it directly faces the eyeball.
For another example, in fig. 6b, after the user wears the device, the obtained position of the eyeball lies along the first preset direction (e.g., the horizontal direction). The MEMS sensor is then not rotated: the default light emitted by the device travels horizontally, and the displayed virtual scene is vertical and directly faces the user's eye.
Likewise, in fig. 6c, after the user wears the device, the obtained position of the eyeball is below the first preset direction; the MEMS sensor is then rotated so that the refracted light directly faces the user's eye. If the included angle between the eyeball and the first preset direction is -10 degrees, the light refracted by the MEMS is controlled to make a -10-degree angle with the first preset direction, again ensuring that it directly faces the eyeball.
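The three cases of fig. 6 amount to steering the MEMS refraction angle to match the measured included angle. A minimal sketch, assuming the 1:1 mapping of the examples and a hypothetical mechanical sweep limit for the mirror:

```python
def mems_command(included_angle_deg: float, max_sweep_deg: float = 15.0) -> float:
    """Return the MEMS refraction angle for a measured included angle.

    Follows the 1:1 mapping of the fig. 6 examples (eyeball 10 degrees
    above the first preset direction -> refract at +10 degrees;
    -10 -> -10; 0 -> no rotation), clamped to an assumed mechanical
    sweep limit of the mirror.
    """
    return max(-max_sweep_deg, min(max_sweep_deg, included_angle_deg))

print(mems_command(10.0), mems_command(0.0), mems_command(-10.0))
```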
In addition, the present embodiment further provides another implementation method for controlling the eye movement range output by the virtual electronic device to be projected to the eyes of the user, as shown in fig. 7, the implementation method includes the steps of:
s71, dividing the first preset plane into four quadrants, and controlling the projection direction of the eye movement range output by the virtual electronic equipment to rotate in a second direction according to the included angle when the coordinate position of the eyeball of the user is in a first quadrant and a second quadrant;
and S72, when the coordinate position of the eyeball of the user is in a third quadrant and a fourth quadrant, controlling the projection direction of the eye movement range output by the virtual electronic equipment to rotate in a third direction according to the included angle, wherein the third direction and the second direction are opposite.
Specifically, referring to fig. 5, in this embodiment the first preset plane is divided into a plurality of parts, for example four quadrants, and the projection direction of the virtual electronic device is then controlled according to the quadrant in which the user's eyeball is located.
For example, when the coordinate position of the user's eyeball is in the first or second quadrant, the projection direction of the eye movement range output by the virtual electronic device is rotated in the second direction (e.g., from bottom to top) according to the included angle, so that the eye movement range moves from the original origin into the first quadrant in the figure.
For another example, when the coordinate position of the user's eyeball is in the third or fourth quadrant, the projection direction of the eye movement range output by the virtual electronic device is rotated, according to the included angle, in a third direction opposite to the second direction.
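Steps S71 and S72 can be sketched as a small dispatch on the eyeball's quadrant. The standard mathematical quadrant convention and all names below are assumptions for illustration:

```python
def quadrant(x: float, y: float) -> int:
    """Quadrant of the eyeball coordinate on the first preset plane,
    using the standard convention (1: x>=0, y>=0; 2: x<0, y>=0;
    3: x<0, y<0; 4: x>=0, y<0)."""
    if y >= 0:
        return 1 if x >= 0 else 2
    return 3 if x < 0 else 4

def rotation_direction(x: float, y: float) -> str:
    """S71/S72: quadrants 1-2 rotate the projection direction in the
    'second' direction; quadrants 3-4 in the opposite 'third' direction."""
    return "second" if quadrant(x, y) in (1, 2) else "third"

print(rotation_direction(-1.0, 1.0))  # the eye position example from fig. 5
```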
It should be noted that this embodiment is not limited to the division described above. For example, the plane may instead be divided into the multiple regions shown in fig. 8. If the default eye movement range output by the virtual electronic device is region 8, then when the user's eyeball is located in region 4 the user cannot view the virtual scene the device presents. In this case, the embodiment obtains the position parameter of the user's eyeball and adjusts the light refracted by the device so that the eye movement range it outputs falls in region 4, ensuring the user's optimal viewing angle. Besides using a MEMS sensor to refract the light of the light source at the required angle, other mechanical structures can achieve the same effect; for example, a reflector and a rotating mechanism can be added to the virtual electronic device to perform the light-refraction function.
On the basis of the foregoing embodiment, this embodiment further provides a data processing method that first obtains the coordinate range, on the first preset plane, of the eye movement range output by the virtual electronic device, and then determines whether the coordinate position of the user's eyeball on that plane lies within this range. If it does, the projection direction of the eye movement range output by the device is controlled not to rotate.
That is, in the above embodiment, the angle of the output eye movement range is adjusted whenever the acquired position of the user's eyeball forms an included angle with the first preset direction. To further save energy, the present embodiment presets the default eye movement range output by the virtual electronic device and first determines, from the current eyeball position, whether any adjustment is needed at all.
For example, assume the default eye movement range of the virtual electronic device is the area formed by regions 4, 5, 7, and 8 in fig. 8. When the user's eyeball is located in region 4, it is determined that region 4 belongs to the default eye movement range; the user can already view the virtual scene played by the device, and no adjustment is performed. When the eyeball lies outside the default range, for example in region 15 of fig. 8, the device triggers an adjustment instruction, and the above embodiment is executed: the eye movement range is adjusted according to the acquired position parameter of the eyeball, so that its projection direction changes along with that parameter.
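The energy-saving check above can be sketched as a containment test against the default eye box. The axis-aligned rectangle and the sample coordinates standing in for regions 4/5/7/8 are illustrative assumptions:

```python
def needs_adjustment(eye_xy, box_min, box_max) -> bool:
    """True only when the eyeball lies outside the default eye-box
    rectangle on the first preset plane; in that case the projection
    direction must be re-steered, otherwise it is left unrotated to
    save energy.  Axis-aligned bounds are an assumption."""
    (x, y), (x0, y0), (x1, y1) = eye_xy, box_min, box_max
    return not (x0 <= x <= x1 and y0 <= y <= y1)

# Say the default eye box (regions 4/5/7/8 of fig. 8) spans x in [0, 2]
# and y in [-2, 0] -- sample coordinates, not taken from the patent:
print(needs_adjustment((1.0, -1.0), (0.0, -2.0), (2.0, 0.0)))  # eyeball inside
print(needs_adjustment((3.0, 1.0), (0.0, -2.0), (2.0, 0.0)))   # eyeball outside
```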
Corresponding to the above method, as shown in fig. 9, an embodiment of the present invention further provides a data processing apparatus, which is applied to a virtual electronic device, where the virtual electronic device is capable of displaying a virtual scene, and the apparatus includes:
a first obtaining module 91, configured to obtain a position parameter of an eyeball of a user;
and the control module 92 is configured to control, based on the position parameter, an eye movement range output by the virtual electronic device to be projected to the user eyeball, where a projection direction of the eye movement range changes along with a change in the position parameter of the user eyeball.
Optionally, the first obtaining module includes:
the first acquisition unit is used for acquiring the coordinate position of the user eyeball on a first preset plane;
and the second acquisition unit is used for acquiring the included angle between the eyeballs of the user and the first preset direction.
Optionally, the control module includes:
the dividing unit is used for dividing the first preset plane into four quadrants;
the first control unit is used for controlling the projection direction of the eye movement range output by the virtual electronic equipment to rotate in a second direction according to the included angle when the coordinate position of the eyeball of the user is in a first quadrant and a second quadrant;
and the second control unit is used for controlling the projection direction of the eye movement range output by the virtual electronic equipment to rotate in a third direction according to the included angle when the coordinate position of the eyeball of the user is in a third quadrant and a fourth quadrant, and the third direction and the second direction are opposite.
Optionally, the data processing apparatus provided in this embodiment further includes:
the second acquisition module is used for acquiring the coordinate range of the eye movement range output by the virtual electronic equipment on the first preset plane;
and the judging module is used for judging whether the coordinate position of the eyeball of the user on a first preset plane is in the coordinate range of the eye movement range on the first preset plane, and if so, controlling the projection direction of the eye movement range output by the virtual electronic equipment not to rotate.
The data processing apparatus comprises a processor and a memory. The first acquisition module, the control module, and so on are stored in the memory as program units, and the processor executes these program units to realize the corresponding functions.
The processor comprises a kernel, which calls the corresponding program unit from the memory; one or more kernels may be provided. By acquiring the position parameter of the user's eyeball and, based on it, controlling the eye movement range output by the virtual electronic device to be projected onto the eyeball, with the projection direction changing along with the position parameter, the apparatus solves the prior-art problem of poor user experience caused by having to manually adjust how the virtual electronic device is worn.
The memory may include volatile memory in a computer-readable medium, random-access memory (RAM), and/or non-volatile memory such as read-only memory (ROM) or flash memory (flash RAM), and includes at least one memory chip.
An embodiment of the present invention provides a storage medium on which a program is stored, the program implementing the data processing method when executed by a processor.
The embodiment of the invention provides a processor, which is used for running a program, wherein the data processing method is executed when the program runs.
The embodiment of the invention provides equipment, which comprises a processor, a memory and a program which is stored on the memory and can run on the processor, wherein the processor executes the program and realizes the following steps:
acquiring a position parameter of eyeballs of a user;
and controlling the eye movement range output by the virtual electronic equipment to be projected to the user eyeball based on the position parameter, wherein the projection direction of the eye movement range is changed along with the change of the position parameter of the user eyeball.
Optionally, the acquiring the position parameter of the eyeball of the user includes:
acquiring a coordinate position of the user eyeball on a first preset plane;
and acquiring an included angle between the eyeballs of the user and a first preset direction.
Optionally, the controlling the eye movement range output by the virtual electronic device to be projected to the user eyeball includes:
dividing the first preset plane into four quadrants, and controlling the projection direction of the eye movement range output by the virtual electronic equipment to rotate in a second direction according to the included angle when the coordinate position of the eyeball of the user is in the first quadrant and the second quadrant;
and when the coordinate position of the eyeball of the user is in a third quadrant and a fourth quadrant, controlling the projection direction of the eye movement range output by the virtual electronic equipment to rotate in a third direction according to the included angle, wherein the third direction and the second direction are opposite.
Optionally, the data processing method further includes:
acquiring a coordinate range of the eye movement range output by the virtual electronic equipment on the first preset plane;
and judging whether the coordinate position of the eyeball of the user on a first preset plane is in the coordinate range of the eye movement range on the first preset plane, if so, controlling the projection direction of the eye movement range output by the virtual electronic equipment not to rotate.
The present application further provides a computer program product adapted to perform a program for initializing the following method steps when executed on a data processing device:
acquiring a position parameter of eyeballs of a user;
and controlling the eye movement range output by the virtual electronic equipment to be projected to the user eyeball based on the position parameter, wherein the projection direction of the eye movement range is changed along with the change of the position parameter of the user eyeball.
Optionally, the acquiring the position parameter of the eyeball of the user includes:
acquiring a coordinate position of the user eyeball on a first preset plane;
and acquiring an included angle between the eyeballs of the user and a first preset direction.
Optionally, the controlling the eye movement range output by the virtual electronic device to be projected to the user's eyeball includes:
dividing the first preset plane into four quadrants, and, when the coordinate position of the user's eyeball lies in the first quadrant or the second quadrant, controlling the projection direction of the eye movement range output by the virtual electronic device to rotate in a second direction by the included angle;
and, when the coordinate position of the user's eyeball lies in the third quadrant or the fourth quadrant, controlling the projection direction of the eye movement range output by the virtual electronic device to rotate in a third direction by the included angle, wherein the third direction is opposite to the second direction.
Optionally, the data processing method further includes:
acquiring a coordinate range, on the first preset plane, of the eye movement range output by the virtual electronic device;
and determining whether the coordinate position of the user's eyeball on the first preset plane falls within the coordinate range of the eye movement range on the first preset plane; if so, controlling the projection direction of the eye movement range output by the virtual electronic device not to rotate.
In summary, the present application provides a data processing method, apparatus, and system. The method controls the eye movement range output by the virtual electronic device to be projected to the user's eyeball according to the position of the user's eyeball, so that the user can view the virtual scene.
The embodiments in the present description are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts reference may be made between the embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed method, apparatus, and device may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only one logical division, and other divisions are possible in practice; multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection between devices or units through communication interfaces, and may be electrical, mechanical, or in another form.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment. In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A data processing method, applied to a virtual electronic device, the data processing method comprising:
obtaining position parameters of a user's eyeball, wherein the position parameters at least comprise: an included angle between the user's eyeball and a first preset direction;
adjusting, based on the position parameters, a light refraction angle of a sensor arranged on the virtual electronic device, and controlling an eye movement range output by the virtual electronic device to be projected to the user's eyeball, wherein a virtual scene is projected within the eye movement range, and a projection direction of the eye movement range changes as the position parameters of the user's eyeball change.
2. The data processing method according to claim 1, wherein the position parameters further comprise:
a coordinate position of the user's eyeball on a first preset plane.
3. The data processing method according to claim 2, wherein the controlling the eye movement range output by the virtual electronic device to be projected to the user's eyeball comprises:
dividing the first preset plane into four quadrants, and, when the coordinate position of the user's eyeball lies in the first quadrant or the second quadrant, controlling the projection direction of the eye movement range output by the virtual electronic device to rotate in a second direction by the included angle;
and, when the coordinate position of the user's eyeball lies in the third quadrant or the fourth quadrant, controlling the projection direction of the eye movement range output by the virtual electronic device to rotate in a third direction by the included angle, wherein the third direction is opposite to the second direction.
4. The data processing method according to claim 2, further comprising:
acquiring a coordinate range, on the first preset plane, of the eye movement range output by the virtual electronic device;
and determining whether the coordinate position of the user's eyeball on the first preset plane falls within the coordinate range of the eye movement range on the first preset plane; if so, controlling the projection direction of the eye movement range output by the virtual electronic device not to rotate.
5. A data processing apparatus, applied to a virtual electronic device, the data processing apparatus comprising:
a first obtaining module, configured to obtain position parameters of a user's eyeball, wherein the position parameters at least comprise: an included angle between the user's eyeball and a first preset direction;
and a control module, configured to adjust, based on the position parameters, a light refraction angle of a sensor arranged on the virtual electronic device and to control the eye movement range output by the virtual electronic device to be projected to the user's eyeball, wherein a virtual scene is projected within the eye movement range, and the projection direction of the eye movement range changes as the position parameters of the user's eyeball change.
6. The data processing apparatus according to claim 5, wherein the first obtaining module is further configured to:
acquire a coordinate position of the user's eyeball on a first preset plane.
7. The data processing apparatus according to claim 6, wherein the control module comprises:
a dividing unit, configured to divide the first preset plane into four quadrants;
a first control unit, configured to control, when the coordinate position of the user's eyeball lies in the first quadrant or the second quadrant, the projection direction of the eye movement range output by the virtual electronic device to rotate in a second direction by the included angle;
and a second control unit, configured to control, when the coordinate position of the user's eyeball lies in the third quadrant or the fourth quadrant, the projection direction of the eye movement range output by the virtual electronic device to rotate in a third direction by the included angle, the third direction being opposite to the second direction.
8. The data processing apparatus according to claim 6, further comprising:
a second acquisition module, configured to acquire a coordinate range, on the first preset plane, of the eye movement range output by the virtual electronic device;
and a judging module, configured to determine whether the coordinate position of the user's eyeball on the first preset plane falls within the coordinate range of the eye movement range on the first preset plane and, if so, to control the projection direction of the eye movement range output by the virtual electronic device not to rotate.
9. A data processing system, comprising:
a memory for storing a program;
a processor for executing the program, wherein the program, when executed, performs the data processing method according to any one of claims 1 to 4.
10. A storage medium, characterized in that it stores a program which, when executed by a processor, implements the data processing method according to any one of claims 1 to 4.
CN201711115810.1A 2017-11-13 2017-11-13 Data processing method, device and system Active CN107908285B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711115810.1A CN107908285B (en) 2017-11-13 2017-11-13 Data processing method, device and system


Publications (2)

Publication Number Publication Date
CN107908285A CN107908285A (en) 2018-04-13
CN107908285B true CN107908285B (en) 2021-09-14

Family

ID=61845233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711115810.1A Active CN107908285B (en) 2017-11-13 2017-11-13 Data processing method, device and system

Country Status (1)

Country Link
CN (1) CN107908285B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108760246B (en) * 2018-05-25 2020-01-07 上海复瞻智能科技有限公司 Method for detecting eye movement range in head-up display system
CN110413121B (en) * 2019-07-29 2022-06-14 Oppo广东移动通信有限公司 Control method of virtual reality equipment, virtual reality equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104391567A (en) * 2014-09-30 2015-03-04 深圳市亿思达科技集团有限公司 Display control method for three-dimensional holographic virtual object based on human eye tracking
CN104506836A (en) * 2014-11-28 2015-04-08 深圳市亿思达科技集团有限公司 Personal holographic three-dimensional display method and device based on eyeball tracking
CN104683786A (en) * 2015-02-28 2015-06-03 上海玮舟微电子科技有限公司 Human eye tracking method and device of naked eye 3D equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9262999B1 (en) * 2013-05-13 2016-02-16 Amazon Technologies, Inc. Content orientation based on user orientation
US9626783B2 (en) * 2015-02-02 2017-04-18 Kdh-Design Service Inc. Helmet-used device capable of automatically adjusting positions of displayed information and helmet thereof
US9984507B2 (en) * 2015-11-19 2018-05-29 Oculus Vr, Llc Eye tracking for mitigating vergence and accommodation conflicts
CN105912109A (en) * 2016-04-06 2016-08-31 众景视界(北京)科技有限公司 Screen automatic switching device of head-wearing visual device and head-wearing visual device
CN107247511B (en) * 2017-05-05 2019-07-16 浙江大学 A kind of across object exchange method and device captured based on eye movement in virtual reality



Similar Documents

Publication Publication Date Title
KR102121134B1 (en) Eye-traceable wearable devices
US11290706B2 (en) Display systems and methods for determining registration between a display and a user's eyes
US11880033B2 (en) Display systems and methods for determining registration between a display and a user's eyes
CN106999034B (en) Wearable device and method for outputting virtual image
US11714487B2 (en) Gaze and smooth pursuit based continuous foveal adjustment
US20240126086A1 (en) Systems and methods for operating a display system based on user perceptibility
US10852817B1 (en) Eye tracking combiner having multiple perspectives
US8752963B2 (en) See-through display brightness control
US20140146394A1 (en) Peripheral display for a near-eye display device
JP2019527377A (en) Image capturing system, device and method for automatic focusing based on eye tracking
US10725302B1 (en) Stereo imaging with Fresnel facets and Fresnel reflections
WO2018223663A1 (en) Vr image processing method, device, and apparatus
CN105474071A (en) Projection processor for projective display system
CN112384883A (en) Wearable device and control method thereof
CN107908285B (en) Data processing method, device and system
US11307654B1 (en) Ambient light eye illumination for eye-tracking in near-eye display
CN116569221A (en) Flexible illumination for imaging systems
CN116583885A (en) Gesture optimization in biometric authentication systems
CN116472564A (en) Automatically selecting biometric based on quality of acquired image
CN116529787A (en) Multi-wavelength biological identification imaging system
US20240045498A1 (en) Electronic apparatus
US20240105046A1 (en) Lens Distance Test for Head-Mounted Display Devices
EP4350417A1 (en) Augmented reality apparatus and method for providing vision measurement and vision correction
CN117761892A (en) Lens distance testing for head mounted display devices
CN116529786A (en) Multi-camera biometric imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant