CN111221412A - Cursor positioning method and device based on eye control - Google Patents

Cursor positioning method and device based on eye control

Info

Publication number
CN111221412A
Authority
CN
China
Prior art keywords
eye
pupil position
screen
unit
user
Prior art date
Legal status
Pending
Application number
CN201911379933.5A
Other languages
Chinese (zh)
Inventor
胡强
胡琅
徐平
侯立涛
冯杰
何斌
方威
侯少毅
黎天韵
黄丽玲
Current Assignee
Ji Hua Laboratory
Original Assignee
Ji Hua Laboratory
Priority date
Filing date
Publication date
Application filed by Ji Hua Laboratory
Priority to CN201911379933.5A
Publication of CN111221412A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/197 Matching; Classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention provides a cursor positioning method and device based on eye control. The method comprises the following steps: A. while a user looks at the four corners of a screen, collecting an image of one eye and identifying the pupil position of the eye; B. establishing a mapping relation between the pupil position and positions on the screen; C. collecting a current image of the eye and identifying its pupil position; D. calculating the position of the user's gaze on the screen according to the mapping relation; E. sending the calculation result to the controlled host so that the controlled host makes the screen display the cursor at the corresponding position. The device comprises a pupil position acquisition module, a mapping module, a calculation module and a communication module. The method and device enable the cursor position to be controlled with the eyes, which is convenient for people whose hands cannot function normally to control a computer.

Description

Cursor positioning method and device based on eye control
Technical Field
The invention relates to the technical field of computer control, in particular to a cursor positioning method and device based on eye control.
Background
A computer generally controls the position of the on-screen cursor through the movement of a mouse, which is in turn operated by hand. For people with hand disabilities, or whose hands cannot function normally because of other conditions, it is therefore difficult to control a computer with a mouse.
Disclosure of Invention
In view of the above-mentioned shortcomings in the prior art, the present invention aims to provide a cursor positioning method and device based on eye control.
To achieve this objective, the invention adopts the following technical solution:
a cursor positioning method based on eye control comprises the following steps:
A. when a user looks at four corners of a screen, an image of one eye is collected, and the pupil position of the eye is identified;
B. establishing a mapping relation between the pupil position and the position on the screen;
C. collecting a current image of the eye and identifying the pupil position of the eye;
D. calculating the position of the sight of the user on the screen according to the mapping relation;
E. and sending the calculation result to the controlled host computer so that the controlled host computer controls the screen to display the cursor at the corresponding position.
In the cursor positioning method based on eye control, step A specifically includes:
A1. entering a calibration state when a user completes a first preset action by using eyes;
A2. executing the following four times in succession: when the user completes a second preset action with the eyes, collecting an image of the eye and identifying the pupil position of the eye;
A3. judging whether the four pupil positions are the positions of the four corners of a rectangle; if yes, executing step A4, otherwise executing step A5;
A4. recording the four pupil positions and exiting the calibration state;
A5. issuing a reminder and re-executing step A2.
In the cursor positioning method based on eye control, the first preset action is a preset number of consecutive glares; the second preset action is a preset number of consecutive blinks.
In the cursor positioning method based on eye control, the coordinates of the pupil position in a preset coordinate system are identified in step A2, and in step A3 the four sets of pupil-position coordinates are used to determine whether they are the positions of the four corners of a rectangle.
In the cursor positioning method based on eye control, step B specifically includes:
B1. establishing an eye coordinate system by taking the first pupil position obtained in step A as the origin, the line connecting the origin to one non-diagonal pupil position as the abscissa axis, and the line connecting the origin to the other non-diagonal pupil position as the ordinate axis;
B2. according to the preset screen size, calculating the on-screen abscissa length corresponding to a unit abscissa length in the eye coordinate system, and the on-screen ordinate length corresponding to a unit ordinate length in the eye coordinate system.
In the cursor positioning method based on eye control, in step D, the coordinates of the corresponding position on the screen are calculated from the coordinates of the current pupil position in the eye coordinate system.
An eye control based cursor positioning device comprising:
the pupil position acquisition module is used for acquiring an image of one eye of the user and identifying the pupil position of the eye;
the mapping module is used for establishing a mapping relation between the pupil position and the position on the screen;
the computing module is used for computing the position of the sight of the user on the screen according to the mapping relation;
and the communication module is used for sending the calculation result to the controlled host computer so that the controlled host computer controls the screen to display the cursor at the corresponding position.
In the cursor positioning device based on eye control, the pupil position collecting module includes:
the image acquisition unit is used for acquiring an eye image of a user;
an identification unit for identifying a pupil position;
the first judging unit is used for judging whether the eyes of the user complete a first preset action or not;
the first execution unit is used for entering a calibration state when a user completes a first preset action by using eyes;
the second judgment unit is used for judging whether the eyes of the user complete a second preset action or not;
the second execution unit is used for driving the identification unit to identify the current pupil position when the user completes a second preset action by using eyes;
the third judging unit is used for judging whether the four pupil positions identified by the identification unit, driven four times by the second execution unit, are the positions of the four corners of a rectangle;
the third execution unit is used for recording the four pupil positions and exiting the calibration state when the judgment result of the third judging unit is yes;
and the fourth execution unit is used for driving the second execution unit to re-identify the four pupil positions when the judgment result of the third judging unit is negative.
In the eye-control-based cursor positioning device, the mapping module comprises:
the coordinate establishing unit is used for establishing an eye coordinate system by taking the first of the four pupil positions recorded by the third execution unit as the origin, the line connecting the origin to one non-diagonal pupil position as the abscissa axis, and the line connecting the origin to the other non-diagonal pupil position as the ordinate axis;
and the calculating unit is used for calculating, according to the preset screen size, the on-screen abscissa length corresponding to a unit abscissa length in the eye coordinate system, and the on-screen ordinate length corresponding to a unit ordinate length in the eye coordinate system.
The cursor positioning device based on eye control further comprises a glasses frame and a main body portion arranged in front of and above one rim of the glasses frame; the pupil position acquisition module, the mapping module, the calculation module and the communication module are housed in the main body portion.
Advantageous effects:
The invention provides a cursor positioning method and device based on eye control. An image of one eye is collected while the user looks at the four corners of the screen, and the pupil position of the eye is identified; a mapping relation between the pupil position and positions on the screen is established; a current image of the eye is collected and its pupil position identified; the position of the user's gaze on the screen is calculated according to the mapping relation; and the calculation result is sent to the controlled host so that the controlled host makes the screen display the cursor at the corresponding position. The cursor position is thus controlled with the eyes, which is convenient for people whose hands cannot function normally to control a computer.
Drawings
Fig. 1 is a flowchart of a cursor positioning method based on eye control according to the present invention.
Fig. 2 is a specific process of step A in the cursor positioning method based on eye control according to the present invention.
Fig. 3 is a schematic structural diagram of a cursor positioning device based on eye control according to the present invention.
Fig. 4 is a schematic structural diagram of a pupil position acquisition module in the cursor positioning device based on eye control according to the present invention.
Fig. 5 is a schematic structural diagram of a mapping module in the cursor positioning device based on eye control according to the present invention.
FIG. 6 is a perspective view of a cursor positioning device based on eye control provided by the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like indicate orientations and positional relationships based on those shown in the drawings, are used only for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they should therefore not be construed as limiting the present invention. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
The following disclosure provides embodiments or examples for implementing different configurations of the invention. To simplify the disclosure of the present invention, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present invention. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, the present invention provides examples of various specific processes and materials, but one of ordinary skill in the art may recognize applications of other processes and/or uses of other materials.
Referring to fig. 1 and 2, the cursor positioning method based on eye control provided by the present invention includes the steps of:
A. when a user looks at the four corners of the screen, an image of one of the eyes is collected, and the pupil position of the eye is identified.
Because the relative position, angle, and distance between the user and the screen vary, the correspondence between the pupil position of the human eye and the specific on-screen position of the eye's line of sight also varies; the screen position therefore needs to be calibrated first.
Specifically, step A includes the following steps:
A1. entering a calibration state when a user completes a first preset action with eyes
When a user first starts using the computer, or the user's position or posture has changed, the screen position needs to be calibrated. Whether to enter the calibration state is determined by collecting the user's eye images and judging whether the user has completed the first preset action.
The first preset action may be, but is not limited to, a preset number of consecutive glares (alternatives include a preset number of blinks, of up-and-down eye movements, of left-and-right eye movements, and the like). The preset number is preferably 2-3: if it is 1, an unintentional glare (or blink, up-and-down movement, left-and-right movement) easily triggers calibration by mistake; if it is more than 3, the user tires easily and misoperation becomes likely.
A2. Executing the following four times in succession: when the user completes the second preset action with the eyes, collecting an image of the eye and identifying its pupil position
After entering the calibration state, the user looks at the four corners of the screen in turn and completes the second preset action at each corner. Whether the second preset action has been completed is judged from the collected eye images; if so, the pupil position at that moment is identified.
The second preset action may be, but is not limited to, a preset number of consecutive blinks (alternatives include a preset number of glares, of up-and-down eye movements, of left-and-right eye movements, and the like). The preset number is preferably 2-3: if it is 1, an unintentional blink (or glare, up-and-down movement, left-and-right movement) easily triggers a false detection; if it is more than 3, the user tires easily and misoperation becomes likely.
The second preset action may be the same as or different from the first preset action. When the two actions are the same, the calibration state is entered once the first action is completed, and every identical action completed thereafter is judged to be a second preset action.
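For illustration, the following sketch counts consecutive eye actions against the preset number described above. It is a minimal sketch only: the class name, the one-second grouping window, and the use of a monotonic clock are assumptions of this illustration, not details specified in the patent.

```python
import time

class ActionCounter:
    """Counts consecutive eye actions (e.g. blinks or glares) and fires
    once the preset number of actions occurs within a short window."""

    def __init__(self, preset_count=2, window_s=1.0):
        self.preset_count = preset_count  # the patent prefers 2-3
        self.window_s = window_s          # assumed grouping window
        self._timestamps = []

    def register(self, now=None):
        """Record one detected action; return True when the preset
        number of consecutive actions has been completed."""
        now = time.monotonic() if now is None else now
        # Keep only actions that still fall inside the grouping window.
        self._timestamps = [t for t in self._timestamps
                            if now - t <= self.window_s]
        self._timestamps.append(now)
        if len(self._timestamps) >= self.preset_count:
            self._timestamps.clear()
            return True
        return False
```

One counter instance per action type would then gate the calibration flow: when the counter for the first preset action fires, the device enters the calibration state; while calibrating, each firing of the second-action counter triggers one pupil-position capture.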
In step A2, identifying the pupil position specifically means identifying the coordinates of the pupil position in a preset coordinate system. The preset coordinate system is a coordinate system preset in the system program, i.e., the inherent coordinate system of the device, usually the body coordinate system of the device's camera.
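The patent does not prescribe a pupil detection algorithm. Purely as a hedged sketch, one simple approach is to threshold the darkest region of the eye image and take the centroid of the largest contour as the pupil position in the camera's own pixel (preset) coordinate system; the threshold value and the function name below are assumptions of this illustration.

```python
import cv2

def find_pupil_center(eye_image_bgr):
    """Return the pupil center (x, y) in pixel coordinates, or None."""
    gray = cv2.cvtColor(eye_image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # The pupil is typically the darkest region; 40 is an assumed threshold.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)  # largest dark blob
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # centroid
```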
A3. Judging whether the pupil positions of the four times are the positions of four corners of a rectangle or not; if yes, go to step A4, otherwise go to step A5
Specifically, the four sets of pupil-position coordinates (coordinates in the preset coordinate system) are used to judge whether they are the positions of the four corners of a rectangle.
For example, the equations of the straight lines connecting each pair of adjacent points are calculated, yielding four line equations (the distances between all pairs of points are first computed together with the corresponding line equations, and the two lines with the largest distances are discarded, since these are the diagonals). The included angles between adjacent lines are then calculated, giving four angle values; when all four values fall within an effective range centered on 90°, the judgment result is yes. The effective angle range may be, but is not limited to, 89°-91°. The judgment method is not limited to this.
In a preferred embodiment, the user looks at the four corners of the screen in clockwise or counterclockwise order, so that the line connecting each consecutively collected pair of points is exactly one of the straight lines required for the rectangle judgment (the step of computing and eliminating the diagonal line equations is thus omitted), which simplifies the calculation.
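A minimal sketch of the rectangle judgment follows, assuming the four pupil positions were collected in clockwise (or counterclockwise) order as in the preferred embodiment, so that consecutive points are adjacent corners and no diagonal needs to be eliminated; the function name and the default tolerance are illustrative.

```python
import math

def is_rectangle(points, tol_deg=1.0):
    """points: four (x, y) pupil positions in clockwise order.
    True if every corner angle lies within 90 +/- tol_deg degrees
    (the patent's example effective range is 89-91 degrees)."""
    for i in range(4):
        prev_pt, cur, nxt = points[i - 1], points[i], points[(i + 1) % 4]
        # Vectors from the current corner to its two neighbors.
        v1 = (prev_pt[0] - cur[0], prev_pt[1] - cur[1])
        v2 = (nxt[0] - cur[0], nxt[1] - cur[1])
        n1, n2 = math.hypot(*v1), math.hypot(*v2)
        if n1 == 0 or n2 == 0:
            return False  # two collected points coincide
        cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if abs(angle - 90.0) > tol_deg:
            return False
    return True
```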
A4. Recording the four pupil positions and exiting the calibration state
Here, the recorded four pupil positions are coordinate values in the preset coordinate system.
A5. Issuing reminder information and re-executing step A2
The reminder information may be, but is not limited to, voice information and/or image information. Voice information may be emitted by a voice module of the device, or first sent to the controlled host and then played by the controlled host through a loudspeaker; image information may be first sent to the controlled host and then displayed by the controlled host on the screen.
B. And establishing a mapping relation between the pupil position and the position on the screen.
The step B specifically comprises the following steps:
B1. establishing an eye coordinate system by taking the first pupil position obtained in step A as the origin (another pupil position may also serve as the origin), the line connecting the origin to one non-diagonal pupil position as the abscissa axis, and the line connecting the origin to the other non-diagonal pupil position as the ordinate axis; generally, the longer of the two connecting lines is taken as the abscissa and the shorter as the ordinate, to match the length and width directions of a rectangular screen;
B2. according to the preset screen size (the screen's length and width are fixed, and can be entered manually on the computer or recognized automatically by the computer and then sent to the device), calculating the on-screen abscissa length corresponding to a unit abscissa length in the eye coordinate system, and the on-screen ordinate length corresponding to a unit ordinate length in the eye coordinate system.
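The sketch below illustrates steps B1-B2 under the same clockwise-order assumption: it builds the eye coordinate system from the four recorded corner positions and computes how many screen pixels correspond to one unit along each eye-coordinate axis. All identifiers are illustrative, not terminology from the patent.

```python
import math

def build_mapping(corners, screen_w_px, screen_h_px):
    """corners: four (x, y) pupil positions in clockwise order, the
    first being the origin. Returns (origin, x_axis, y_axis, sx, sy),
    where x_axis/y_axis are unit vectors of the eye coordinate system
    and sx/sy are screen pixels per unit eye-coordinate length."""
    origin = corners[0]
    # The two corners adjacent to the origin (non-diagonal) define the axes.
    a = (corners[1][0] - origin[0], corners[1][1] - origin[1])
    b = (corners[3][0] - origin[0], corners[3][1] - origin[1])
    la, lb = math.hypot(*a), math.hypot(*b)
    # Per step B1, the longer line becomes the abscissa (screen width).
    (xv, lx), (yv, ly) = ((a, la), (b, lb)) if la >= lb else ((b, lb), (a, la))
    x_axis = (xv[0] / lx, xv[1] / lx)
    y_axis = (yv[0] / ly, yv[1] / ly)
    sx = screen_w_px / lx  # on-screen length per unit eye abscissa
    sy = screen_h_px / ly  # on-screen length per unit eye ordinate
    return origin, x_axis, y_axis, sx, sy
```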
C. A current image of the eye is acquired and its pupil position is identified.
D. Calculating the position of the sight of the user on the screen according to the mapping relation
The coordinates of the corresponding position on the screen are calculated from the coordinates of the current pupil position in the eye coordinate system.
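A matching sketch of step D: the current pupil position is projected onto the two eye-coordinate axes and scaled by the per-unit lengths obtained during calibration (see the sketch above); the function and parameter names are assumptions.

```python
def pupil_to_screen(pupil, origin, x_axis, y_axis, sx, sy):
    """pupil: (x, y) in the camera's preset coordinate system.
    origin, x_axis, y_axis, sx, sy: values from build_mapping().
    Returns the gaze position (u, v) in screen pixels."""
    dx, dy = pupil[0] - origin[0], pupil[1] - origin[1]
    # Coordinates of the pupil in the eye coordinate system
    # (dot products with the unit axis vectors).
    ex = dx * x_axis[0] + dy * x_axis[1]
    ey = dx * y_axis[0] + dy * y_axis[1]
    return ex * sx, ey * sy
```

The resulting (u, v) is what step E sends to the controlled host, which then displays the cursor at that screen position.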
E. And sending the calculation result to the controlled host computer so that the controlled host computer controls the screen to display the cursor at the corresponding position.
With the above cursor positioning method based on eye control, an image of one eye is collected while the user looks at the four corners of the screen and the pupil position of the eye is identified; a mapping relation between the pupil position and positions on the screen is established; a current image of the eye is collected and its pupil position identified; the position of the user's gaze on the screen is calculated according to the mapping relation; and the calculation result is sent to the controlled host so that the controlled host makes the screen display the cursor at the corresponding position. The cursor position is thus controlled with the eyes, which is convenient for people whose hands cannot function normally to control a computer.
Referring to fig. 3-6, the present invention further provides a cursor positioning device based on eye control, comprising: the pupil position acquisition module 1, the mapping module 2, the calculation module 3 and the communication module 4;
the pupil position acquisition module 1 is used for acquiring an image of one eye of a user and identifying the pupil position of the eye;
the mapping module 2 is used for establishing a mapping relation between the pupil position and the position on the screen;
the computing module 3 is used for computing the position of the sight of the user on the screen according to the mapping relation;
the communication module 4 is configured to send the calculation result to the controlled host, so that the controlled host controls the screen to display the cursor at the corresponding position.
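Purely as a structural sketch, the four modules might be composed as below; the class and method names are assumptions of this illustration, not an API defined by the patent.

```python
class CursorPositioningDevice:
    """Illustrative composition of modules 1-4 described above."""

    def __init__(self, acquisition, mapping, calculator, communicator):
        self.acquisition = acquisition    # pupil position acquisition module 1
        self.mapping = mapping            # mapping module 2
        self.calculator = calculator      # calculation module 3
        self.communicator = communicator  # communication module 4

    def tick(self):
        """One positioning cycle: acquire, map, calculate, send."""
        pupil = self.acquisition.capture_and_identify()
        screen_pos = self.calculator.gaze_on_screen(pupil,
                                                    self.mapping.relation)
        self.communicator.send(screen_pos)  # host then moves the cursor
```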
Further, referring to fig. 4, the pupil position collecting module 1 includes: the system comprises an image acquisition unit 1.1, an identification unit 1.2, a first judgment unit 1.3, a first execution unit 1.4, a second judgment unit 1.5, a second execution unit 1.6, a third judgment unit 1.7, a third execution unit 1.8 and a fourth execution unit 1.9;
the image acquisition unit 1.1 is used for acquiring an eye image of a user;
wherein, the identification unit 1.2 is used for identifying the pupil position;
the first judging unit 1.3 is used for judging whether the eyes of the user complete a first preset action;
the first execution unit 1.4 is used for entering a calibration state when a user completes a first preset action by using eyes;
the second judging unit 1.5 is used for judging whether the eyes of the user complete a second preset action;
the second execution unit 1.6 is configured to drive the identification unit 1.2 to identify the current pupil position when the user completes a second preset action with the eyes;
the third judging unit 1.7 is configured to judge whether the four pupil positions identified by the identification unit 1.2, driven four times by the second execution unit 1.6, are the positions of the four corners of a rectangle;
the third execution unit 1.8 is configured to record the four pupil positions and to exit the calibration state when the judgment result of the third judging unit 1.7 is yes;
the fourth execution unit 1.9 is configured to drive the second execution unit 1.6 to re-identify the four pupil positions when the judgment result of the third judging unit 1.7 is negative.
In some embodiments, the pupil position acquisition module 1 further includes a voice module (not shown in the figures), which is configured to issue a voice reminder when the judgment result of the third judging unit 1.7 is negative.
Further, referring to fig. 5, the mapping module 2 includes: a coordinate establishing unit 2.1 and a calculating unit 2.2;
the coordinate establishing unit 2.1 is configured to establish an eye coordinate system by taking the first of the four pupil positions recorded by the third execution unit 1.8 as the origin, the line connecting the origin to one non-diagonal pupil position as the abscissa axis, and the line connecting the origin to the other non-diagonal pupil position as the ordinate axis;
the calculating unit 2.2 is configured to calculate, according to the preset screen size, the on-screen abscissa length corresponding to a unit abscissa length in the eye coordinate system, and the on-screen ordinate length corresponding to a unit ordinate length in the eye coordinate system.
In some preferred embodiments, see fig. 6, the eye-control-based cursor positioning device further comprises a glasses frame 5 and a main body portion 6 disposed in front of and above one rim of the glasses frame 5; the pupil position acquisition module 1, the mapping module 2, the calculation module 3 and the communication module 4 are housed in the main body portion 6. The main body portion 6 is fixed above and in front of the user's eye by the glasses frame 5, so that it can collect and analyze images of the eye without obstructing the user's sight. The rims of the glasses frame 5 may or may not hold lenses.
In view of the above, the cursor positioning device based on eye control collects an image of one eye while the user looks at the four corners of the screen and identifies the pupil position; establishes a mapping relation between the pupil position and positions on the screen; collects a current image of the eye and identifies its pupil position; calculates the position of the user's gaze on the screen according to the mapping relation; and sends the calculation result to the controlled host so that the controlled host makes the screen display the cursor at the corresponding position. The cursor position is thus controlled with the eyes, which is convenient for people whose hands cannot function normally to control a computer.
Although the present invention has been described above with reference to preferred embodiments, these embodiments are not intended to limit it. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the present invention, and such changes and modifications likewise fall within its scope.

Claims (10)

1. A cursor positioning method based on eye control is characterized by comprising the following steps:
A. when a user looks at four corners of a screen, an image of one eye is collected, and the pupil position of the eye is identified;
B. establishing a mapping relation between the pupil position and the position on the screen;
C. collecting a current image of the eye and identifying the pupil position of the eye;
D. calculating the position of the sight of the user on the screen according to the mapping relation;
E. and sending the calculation result to the controlled host computer so that the controlled host computer controls the screen to display the cursor at the corresponding position.
2. The method for cursor positioning based on eye control of claim 1, wherein step A specifically comprises:
A1. entering a calibration state when a user completes a first preset action by using eyes;
A2. executing the following four times in succession: when the user completes a second preset action with the eyes, collecting an image of the eye and identifying the pupil position of the eye;
A3. judging whether the four pupil positions are the positions of the four corners of a rectangle; if yes, executing step A4, otherwise executing step A5;
A4. recording the four pupil positions and exiting the calibration state;
A5. issuing a reminder and re-executing step A2.
3. The cursor positioning method based on eye control of claim 2, wherein the first preset action is a preset number of consecutive glares; the second preset action is a preset number of consecutive blinks.
4. The cursor positioning method based on eye control of claim 2, wherein the coordinates of the pupil position in a preset coordinate system are identified in step A2, and in step A3 the four sets of pupil-position coordinates are used to determine whether they are the positions of the four corners of a rectangle.
5. The method of claim 2, wherein step B specifically comprises:
B1. establishing an eye coordinate system by taking the first pupil position obtained in step A as the origin, the line connecting the origin to one non-diagonal pupil position as the abscissa axis, and the line connecting the origin to the other non-diagonal pupil position as the ordinate axis;
B2. according to the preset screen size, calculating the on-screen abscissa length corresponding to a unit abscissa length in the eye coordinate system, and the on-screen ordinate length corresponding to a unit ordinate length in the eye coordinate system.
6. An eye control-based cursor positioning method according to claim 5, wherein in step D, the coordinates of the corresponding position on the screen are calculated from the coordinates of the current pupil position in the eye coordinate system.
7. An eye control based cursor positioning device, comprising:
the pupil position acquisition module is used for acquiring an image of one eye of the user and identifying the pupil position of the eye;
the mapping module is used for establishing a mapping relation between the pupil position and the position on the screen;
the computing module is used for computing the position of the sight of the user on the screen according to the mapping relation;
and the communication module is used for sending the calculation result to the controlled host computer so that the controlled host computer controls the screen to display the cursor at the corresponding position.
8. An eye control-based cursor positioning device according to claim 7, wherein the pupil position acquisition module comprises:
the image acquisition unit is used for acquiring an eye image of a user;
an identification unit for identifying a pupil position;
the first judging unit is used for judging whether the eyes of the user complete a first preset action or not;
the first execution unit is used for entering a calibration state when a user completes a first preset action by using eyes;
the second judgment unit is used for judging whether the eyes of the user complete a second preset action or not;
the second execution unit is used for driving the identification unit to identify the current pupil position when the user completes a second preset action by using eyes;
the third judging unit is used for judging whether the four pupil positions identified by the identification unit, driven four times by the second execution unit, are the positions of the four corners of a rectangle;
the third execution unit is used for recording the four pupil positions and exiting the calibration state when the judgment result of the third judging unit is yes;
and the fourth execution unit is used for driving the second execution unit to re-identify the four pupil positions when the judgment result of the third judging unit is negative.
9. An eye control-based cursor positioning device according to claim 7, wherein said mapping module comprises:
the coordinate establishing unit is used for establishing an eye coordinate system by taking the first of the four pupil positions recorded by the third execution unit as the origin, the line connecting the origin to one non-diagonal pupil position as the abscissa axis, and the line connecting the origin to the other non-diagonal pupil position as the ordinate axis;
and the calculating unit is used for calculating, according to the preset screen size, the on-screen abscissa length corresponding to a unit abscissa length in the eye coordinate system, and the on-screen ordinate length corresponding to a unit ordinate length in the eye coordinate system.
10. An eye control-based cursor positioning device according to any one of claims 7-9, further comprising a glasses frame, and a main body portion disposed in front of and above one rim of the glasses frame; the pupil position acquisition module, the mapping module, the calculation module and the communication module are housed in the main body portion.
CN201911379933.5A 2019-12-27 2019-12-27 Cursor positioning method and device based on eye control Pending CN111221412A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911379933.5A CN111221412A (en) 2019-12-27 2019-12-27 Cursor positioning method and device based on eye control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911379933.5A CN111221412A (en) 2019-12-27 2019-12-27 Cursor positioning method and device based on eye control

Publications (1)

Publication Number Publication Date
CN111221412A (en) 2020-06-02

Family

ID=70826681

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911379933.5A Pending CN111221412A (en) 2019-12-27 2019-12-27 Cursor positioning method and device based on eye control

Country Status (1)

Country Link
CN (1) CN111221412A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1889016A (en) * 2006-07-25 2007-01-03 周辰 Eye-to-computer cursor automatic positioning controlling method and system
CN102339147A (en) * 2008-04-10 2012-02-01 江国庆 Arithmetic device and application thereof
WO2012121405A1 (en) * 2011-03-07 2012-09-13 Sharp Kabushiki Kaisha A user interface, a device having a user interface and a method of providing a user interface
US10372202B1 (en) * 2014-01-20 2019-08-06 Ca, Inc. Positioning a cursor on a display monitor based on a user's eye-gaze position
CN105867603A (en) * 2015-12-08 2016-08-17 乐视致新电子科技(天津)有限公司 Eye-controlled method and device
CN109933200A (en) * 2019-03-15 2019-06-25 北京环境特性研究所 Computer vision control method based on near-infrared eyes image

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112656483A (en) * 2020-12-21 2021-04-16 中南大学湘雅医院 Visual portable choledochoscope lithotomy forceps

Similar Documents

Publication Publication Date Title
WO2020042345A1 (en) Method and system for acquiring line-of-sight direction of human eyes by means of single camera
CN109690553A (en) The system and method for executing eye gaze tracking
US20190129501A1 (en) Interactive Motion-Based Eye Tracking Calibration
CN109343700B (en) Eye movement control calibration data acquisition method and device
CN111190486B (en) Partition display method and device based on eye control
US20160327813A1 (en) Method for determining at least one optical design parameter for a progressive ophthalmic lens
WO2010142455A2 (en) Method for determining the position of an object in an image, for determining an attitude of a persons face and method for controlling an input device based on the detection of attitude or eye gaze
JP2020140630A (en) Fixation point estimation system, fixation point estimation method, fixation point estimation program, and information recording medium for recording the same
CN111221412A (en) Cursor positioning method and device based on eye control
US20180108116A9 (en) Adjusting a direction of a picture displayed on a screen
CN109144262B (en) Human-computer interaction method, device, equipment and storage medium based on eye movement
CN113989832A (en) Gesture recognition method and device, terminal equipment and storage medium
EP2261772A1 (en) Method for controlling an input device based on the detection of attitude or eye gaze
CN205750115U (en) Wearable computing device and the wearable device with it
JP2017049781A (en) Glasses-type wearable device, control method thereof, and information management server
CN106547339B (en) Control method and device of computer equipment
EP2261857A1 (en) Method for determining the position of an object in an image, for determining an attitude of a persons face and method for controlling an input device based on the detection of attitude or eye gaze
CN111651043B (en) Augmented reality system supporting customized multi-channel interaction
JP2019086916A (en) Work support device, work support method, and work support program
CN113359996A (en) Life auxiliary robot control system, method and device and electronic equipment
JP5088192B2 (en) Drawing apparatus and program
KR20060096612A (en) Method for modeling pupil shape
CN106095088B (en) A kind of electronic equipment and its image processing method
CN104731332B (en) A kind of information processing method and electronic equipment
JP2020008981A (en) Support device and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination