CN106491071A - Eyesight testing method and terminal - Google Patents

Eyesight testing method and terminal

Info

Publication number
CN106491071A
CN106491071A
Authority
CN
China
Prior art keywords
test
user gesture
user
character
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510559458.5A
Other languages
Chinese (zh)
Inventor
刘飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Original Assignee
ZTE Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp filed Critical ZTE Corp
Priority to CN201510559458.5A priority Critical patent/CN106491071A/en
Priority to PCT/CN2015/098093 priority patent/WO2016131337A1/en
Publication of CN106491071A publication Critical patent/CN106491071A/en
Pending legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/02: Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B3/028: Subjective types for testing visual acuity; for determination of refraction, e.g. phoropters
    • A61B3/032: Devices for presenting test symbols or characters, e.g. test chart projectors

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention discloses an eyesight testing method. The method includes: displaying a visual acuity chart on a screen and, after the test starts, acquiring the initial position of a user gesture; after highlighting a test character, detecting the movement direction of the user gesture, comparing the detected gesture direction with the opening direction of the test character, and judging from the comparison result whether the user can see the test character; when the test needs to continue, highlighting the next test character, and when the test need not continue, determining an eyesight test result and feeding the result back to the user. The invention can simulate, on a terminal, the process of testing vision at an optician, allowing users to conveniently test their own vision at any time.

Description

Eyesight testing method and terminal
Technical field
The present invention relates to the field of terminal technology, and in particular to an eyesight testing method and terminal.
Background technology
People do not take sufficient care of their eyes. In particular, some near-sighted people pay little attention to their own eyesight, so that the degree of their myopia slowly increases until their vision is affected. However, going to a hospital or an optician to have one's vision tested is cumbersome. A quick and convenient eyesight testing method is therefore lacking.
Summary of the invention
The technical problem to be solved by the present invention is to provide an eyesight testing method and terminal capable of simulating, on a terminal, the process of testing vision at an optician, allowing users to conveniently test their own vision at any time.
An embodiment of the present invention provides an eyesight testing method. The method includes:
displaying a visual acuity chart on a screen and, after the test starts, acquiring the initial position of a user gesture;
after highlighting a test character, detecting the movement direction of the user gesture, comparing the detected gesture direction with the opening direction of the test character, and judging from the comparison result whether the user can see the test character;
when the test needs to continue, highlighting the next test character; when the test need not continue, determining an eyesight test result and feeding the eyesight test result back to the user.
Optionally, acquiring the initial position of the user gesture includes:
detecting the initial ambient brightness with a light-sensing proximity sensor and capturing a user gesture image with a front-facing camera, then determining the initial position of the user gesture from the captured gesture image; or capturing a user gesture image with the front-facing camera and determining the initial position of the user gesture from the captured gesture image.
Optionally, detecting the movement direction of the user gesture after highlighting a test character includes:
after a test character is highlighted, sensing changes in ambient light brightness with the light-sensing proximity sensor and, once the sensor detects a change in ambient brightness, triggering the front-facing camera to capture a user gesture image; or, after a test character is highlighted, triggering the front-facing camera to capture a user gesture image after a waiting period has elapsed;
determining the current position of the user gesture from the gesture image captured by the front-facing camera;
comparing the current position of the user gesture with its previous position, and determining the movement direction of the user gesture from the comparison result.
Optionally, comparing the current position of the user gesture with its previous position and determining the movement direction from the comparison result includes:
if the current position of the user gesture differs from its previous position, analyzing the direction of the position change and taking that direction as the movement direction of the user gesture.
Optionally, comparing the current position of the user gesture with its previous position and determining the movement direction from the comparison result includes:
if the current position of the user gesture is the same as its previous position, extracting the fingertip image of the extended finger from the current gesture image, analyzing the direction in which the fingertip points, and taking the fingertip direction as the movement direction of the user gesture.
Optionally, comparing the detected gesture direction with the opening direction of the test character and judging from the comparison result whether the user can see the test character includes:
if the detected gesture direction matches the opening direction of the test character, judging that the user can see the test character; if the detected gesture direction does not match the opening direction of the test character, judging that the user cannot see the test character.
Optionally, determining whether the test needs to continue includes:
if the user can see the current test character, and the current test row of the visual acuity chart has not been fully tested or a next row remains to be tested, judging that the test needs to continue;
if the user cannot see the current test character, and the number of test characters in the current row of the visual acuity chart that the user cannot see reaches a threshold, judging that the test need not continue.
Optionally, highlighting a test character includes:
displaying the test character highlighted, or displaying the test character blinking, or displaying a cursor below the test character.
An embodiment of the present invention further provides an eyesight testing terminal, including:
an initial display and positioning module, configured to display a visual acuity chart on a screen and, after the test starts, acquire the initial position of a user gesture;
a test module, configured to detect the movement direction of the user gesture after a test character is highlighted, compare the detected gesture direction with the opening direction of the test character, and judge from the comparison result whether the user can see the test character;
a control module, configured to highlight the next test character when the test needs to continue, and, when the test need not continue, determine an eyesight test result and feed the eyesight test result back to the user.
Optionally, the initial display and positioning module acquires the initial position of the user gesture by:
detecting the initial ambient brightness with a light-sensing proximity sensor and capturing a user gesture image with a front-facing camera, then determining the initial position of the user gesture from the captured gesture image; or capturing a user gesture image with the front-facing camera and determining the initial position of the user gesture from the captured gesture image.
Optionally, the test module detects the movement direction of the user gesture after a test character is highlighted by:
after a test character is highlighted, sensing changes in ambient light brightness with the light-sensing proximity sensor and, once the sensor detects a change in ambient brightness, triggering the front-facing camera to capture a user gesture image; or, after a test character is highlighted, triggering the front-facing camera to capture a user gesture image after a waiting period has elapsed;
determining the current position of the user gesture from the gesture image captured by the front-facing camera;
comparing the current position of the user gesture with its previous position, and determining the movement direction of the user gesture from the comparison result.
Optionally, the test module compares the current position of the user gesture with its previous position and determines the movement direction from the comparison result by:
if the current position of the user gesture differs from its previous position, analyzing the direction of the position change and taking that direction as the movement direction of the user gesture.
Optionally, the test module compares the current position of the user gesture with its previous position and determines the movement direction from the comparison result by:
if the current position of the user gesture is the same as its previous position, extracting the fingertip image of the extended finger from the current gesture image, analyzing the direction in which the fingertip points, and taking the fingertip direction as the movement direction of the user gesture.
Optionally, the test module compares the detected gesture direction with the opening direction of the test character and judges from the comparison result whether the user can see the test character by:
if the detected gesture direction matches the opening direction of the test character, judging that the user can see the test character; if the detected gesture direction does not match the opening direction of the test character, judging that the user cannot see the test character.
Optionally, the control module determines whether the test needs to continue by:
if the user can see the current test character, and the current test row of the visual acuity chart has not been fully tested or a next row remains to be tested, judging that the test needs to continue;
if the user cannot see the current test character, and the number of test characters in the current row of the visual acuity chart that the user cannot see reaches a threshold, judging that the test need not continue.
Compared with the prior art, the eyesight testing method and terminal provided by the present invention display a proportionally reduced visual acuity chart on the screen, highlight the test letters row by row during the test, and use the front-facing camera together with the light-sensing proximity sensor to capture and locate the up, down, left, and right directions of the user gesture, thereby simulating the process of testing vision at an optician and allowing users to conveniently test their own vision at any time.
Description of the drawings
Fig. 1 is a schematic diagram of an eyesight testing method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of an eyesight testing terminal according to an embodiment of the present invention.
Specific embodiment
To make the objects, technical solutions, and advantages of the present invention clearer, embodiments of the invention are described in detail below with reference to the accompanying drawings. It should be noted that, where no conflict arises, the embodiments of this application and the features in the embodiments may be combined with one another in any manner.
As shown in Fig. 1, an embodiment of the present invention provides an eyesight testing method. The method includes:
S101: displaying a visual acuity chart on a screen and, after the test starts, acquiring the initial position of a user gesture;
Here, displaying the visual acuity chart on the screen includes:
automatically lighting the screen, automatically adjusting the brightness to a level suitable for vision testing, and displaying the visual acuity chart on the screen;
for example, after the user taps the test option, a proportionally reduced visual acuity chart is displayed full screen;
Acquiring the initial position of the user gesture includes:
detecting the initial ambient brightness with a light-sensing proximity sensor and capturing a user gesture image with a front-facing camera, then determining the initial position of the user gesture from the captured gesture image; or capturing a user gesture image with the front-facing camera and determining the initial position of the user gesture from the captured gesture image;
S102: after highlighting a test character, detecting the movement direction of the user gesture, comparing the detected gesture direction with the opening direction of the test character, and judging from the comparison result whether the user can see the test character;
Here, highlighting a test character includes:
displaying the test character highlighted, or displaying the test character blinking, or displaying a cursor below the test character;
Optionally, the cursor is a line-shaped cursor or a dot-shaped cursor;
Here, the test character is a capital letter E;
Detecting the movement direction of the user gesture after highlighting a test character includes:
after a test character is highlighted, sensing changes in ambient light brightness with the light-sensing proximity sensor and, once the sensor detects a change in ambient brightness, triggering the front-facing camera to capture a user gesture image; or, after a test character is highlighted, triggering the front-facing camera to capture a user gesture image after a waiting period has elapsed;
determining the current position of the user gesture from the gesture image captured by the front-facing camera;
comparing the current position of the user gesture with its previous position, and determining the movement direction of the user gesture from the comparison result;
Here, the waiting period allows the user time to respond to what is shown on the screen; its length may be an empirical value;
Comparing the current position of the user gesture with its previous position and determining the movement direction from the comparison result includes:
if the current position of the user gesture differs from its previous position, analyzing the direction of the position change and taking that direction as the movement direction of the user gesture;
alternatively, if the current position of the user gesture is the same as its previous position, extracting the fingertip image of the extended finger from the current gesture image, analyzing the direction in which the fingertip points, and taking the fingertip direction as the movement direction of the user gesture;
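The two cases above (hand moved between frames, or hand stationary with an extended finger) can be sketched as a single direction classifier. This is a minimal illustration under assumed image coordinates (origin top-left, y increasing downward); the patent describes the cases in prose and does not prescribe a classification rule.

```python
def gesture_direction(pos1, pos2, fingertip_vec=None):
    """Classify the gesture movement as 'up', 'down', 'left', or 'right'.

    pos1 and pos2 are (x, y) gesture positions from two consecutive
    camera frames, in image coordinates.  If the hand did not move,
    fall back to the fingertip pointing vector.
    """
    dx = pos2[0] - pos1[0]
    dy = pos2[1] - pos1[1]
    if (dx, dy) == (0, 0):
        if fingertip_vec is None:
            return None  # no movement and no fingertip information
        dx, dy = fingertip_vec
    if abs(dx) >= abs(dy):        # dominant horizontal component
        return 'right' if dx > 0 else 'left'
    return 'down' if dy > 0 else 'up'
```

A real implementation would also smooth out small jitters between frames rather than treating any nonzero displacement as a swipe.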
Comparing the detected gesture direction with the opening direction of the test character and judging from the comparison result whether the user can see the test character includes:
if the detected gesture direction matches the opening direction of the test character, judging that the user can see the test character; if the detected gesture direction does not match the opening direction of the test character, judging that the user cannot see the test character;
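The comparison above can be sketched as a simple lookup, assuming a tumbling E whose upright form opens to the right and whose rotations are counter-clockwise multiples of 90 degrees; the mapping is an illustrative assumption, since the patent only states that the two directions are compared for consistency.

```python
# Opening direction of the tumbling E at each rotation.  This mapping
# is an assumption for illustration, not specified by the patent.
E_OPENING = {0: 'right', 90: 'up', 180: 'left', 270: 'down'}

def user_saw_character(gesture_dir, rotation_deg):
    """True if the detected gesture direction matches the opening
    direction of the highlighted test character."""
    return gesture_dir == E_OPENING[rotation_deg % 360]
```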
S103: when the test needs to continue, highlighting the next test character; when the test need not continue, determining an eyesight test result and feeding the eyesight test result back to the user;
Here, determining whether the test needs to continue includes:
if the user can see the current test character, and the current test row of the visual acuity chart has not been fully tested or a next row remains to be tested, judging that the test needs to continue;
if the user cannot see the current test character, and the number of test characters in the current row of the visual acuity chart that the user cannot see reaches a threshold, judging that the test need not continue;
Here, the threshold is 3, for example;
Feeding the eyesight test result back to the user includes:
displaying the eyesight test result on the screen.
As shown in Fig. 2, an embodiment of the present invention provides an eyesight testing terminal, including:
an initial display and positioning module 201, configured to display a visual acuity chart on a screen and, after the test starts, acquire the initial position of a user gesture;
a test module 202, configured to detect the movement direction of the user gesture after a test character is highlighted, compare the detected gesture direction with the opening direction of the test character, and judge from the comparison result whether the user can see the test character;
a control module 203, configured to highlight the next test character when the test needs to continue, and, when the test need not continue, determine an eyesight test result and feed the eyesight test result back to the user.
The initial display and positioning module 201 acquires the initial position of the user gesture by:
detecting the initial ambient brightness with a light-sensing proximity sensor and capturing a user gesture image with a front-facing camera, then determining the initial position of the user gesture from the captured gesture image; or capturing a user gesture image with the front-facing camera and determining the initial position of the user gesture from the captured gesture image.
The test module 202 detects the movement direction of the user gesture after a test character is highlighted by:
after a test character is highlighted, sensing changes in ambient light brightness with the light-sensing proximity sensor and, once the sensor detects a change in ambient brightness, triggering the front-facing camera to capture a user gesture image; or, after a test character is highlighted, triggering the front-facing camera to capture a user gesture image after a waiting period has elapsed;
determining the current position of the user gesture from the gesture image captured by the front-facing camera;
comparing the current position of the user gesture with its previous position, and determining the movement direction of the user gesture from the comparison result.
The test module 202 compares the current position of the user gesture with its previous position and determines the movement direction from the comparison result by:
if the current position of the user gesture differs from its previous position, analyzing the direction of the position change and taking that direction as the movement direction of the user gesture.
The test module 202 compares the current position of the user gesture with its previous position and determines the movement direction from the comparison result by:
if the current position of the user gesture is the same as its previous position, extracting the fingertip image of the extended finger from the current gesture image, analyzing the direction in which the fingertip points, and taking the fingertip direction as the movement direction of the user gesture.
The test module 202 compares the detected gesture direction with the opening direction of the test character and judges from the comparison result whether the user can see the test character by:
if the detected gesture direction matches the opening direction of the test character, judging that the user can see the test character; if the detected gesture direction does not match the opening direction of the test character, judging that the user cannot see the test character.
The control module 203 determines whether the test needs to continue by:
if the user can see the current test character, and the current test row of the visual acuity chart has not been fully tested or a next row remains to be tested, judging that the test needs to continue;
if the user cannot see the current test character, and the number of test characters in the current row of the visual acuity chart that the user cannot see reaches a threshold, judging that the test need not continue.
The test module 202 highlights a test character by:
displaying the test character highlighted, or displaying the test character blinking, or displaying a cursor below the test character.
Application example
An application example of the present invention provides a method of testing eyesight, including the following steps:
S301: the application program opens, and the screen displays "Test" and "Exit" options;
S302: after the user taps the test option, the screen displays a proportionally reduced vision chart;
Here, the eye is typically about 30 cm from the screen;
S303: the front-facing camera and the light proximity sensor are turned on to capture and recognize the gesture sliding direction;
When a gesture is in front of the device, the ambient light brightness around the light proximity sensor changes. The signal emitter in the proximity sensor emits a signal, the hand reflects it back, and once the receiver picks up the reflected signal the gesture is located, even if the ambient brightness around the sensor has by then returned to its original level. At this point the front-facing camera captures the gesture's position, recorded as position 1.
When the gesture slides, the ambient brightness around the proximity sensor changes once more from the original brightness to a new brightness. After the proximity sensor detects this brightness change, the front-facing camera receives a signal and captures the gesture position again, recorded as position 2. The processor takes the change from position 1 to position 2 as the gesture sliding direction. This is one complete cycle in which the proximity sensor and the front-facing camera cooperate to capture a gesture sliding direction.
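One capture cycle of the sensor-and-camera cooperation described above could look like the following sketch, where `sensor.brightness()` and `camera.gesture_position()` are assumed stand-ins for the terminal's actual sensor and camera interfaces (the patent names no API), and the 3-second window matches the timer used in the application example.

```python
import time

def capture_swipe(sensor, camera, timeout_s=3.0):
    """Capture two gesture positions; the second capture is triggered
    by a change in ambient brightness sensed by the proximity sensor.

    `sensor.brightness()` and `camera.gesture_position()` are assumed
    stand-ins for the terminal's sensor and camera interfaces.
    """
    pos1 = camera.gesture_position()   # position 1: hand in front of device
    baseline = sensor.brightness()
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if sensor.brightness() != baseline:        # hand swept past sensor
            return pos1, camera.gesture_position() # position 2
        time.sleep(0.01)
    return pos1, None  # no movement detected within the window
```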
When the user's gesture is stationary, that is, the two gesture positions 1 and 2 captured by the front-facing camera are the same, the computing module in the arithmetic processor scans the captured data, analyzes the fingertip position of the extended finger, and takes the direction the fingertip points as the gesture sliding direction.
While the gesture direction is being captured, a timer T1 and a counter N1 are started. After a test letter is highlighted, T1 starts timing, and the front-facing camera and light proximity sensor capture the gesture sliding direction. T1 is set to a maximum of 3 seconds; capture ends after those 3 seconds, and the information processing module then checks whether the opening direction of the highlighted letter "E" matches the captured gesture direction. If they match, the counter increments by 1 and the next letter is highlighted for testing; when the counter reaches 3, the test switches to the next, smaller row of letters. If a row has fewer than 3 letters, once it has been tested the test automatically switches to the next smaller row and the counter resets. If at any point the gesture and the letter opening do not match, the test ends and step S304 is executed; if the gesture matches the letter opening all the way through the last row, the test completes and step S304 is executed.
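The timer-and-counter loop described above can be summarized as follows. This is a simplified sketch: rows are lists of E opening directions, `answer_for` stands in for the camera/sensor capture under the 3-second timer, and any mismatch ends the test, as described; the names are illustrative, not from the patent.

```python
def run_test(rows, answer_for, advance_after=3):
    """Run the row-by-row tumbling-E test loop.

    rows       -- list of rows, largest letters first; each row is a
                  list of opening directions for its characters
    answer_for -- callback returning the user's gesture direction for
                  a highlighted character (in the terminal this wraps
                  the camera/sensor capture under the 3 s timer)
    Returns the index of the last row passed, or -1 if none was.
    """
    last_passed = -1
    for row_idx, row in enumerate(rows):
        correct = 0
        for opening in row:
            if answer_for(opening) != opening:
                return last_passed        # any mismatch ends the test
            correct += 1
            if correct >= advance_after:  # three correct: next row
                break
        last_passed = row_idx
    return last_passed
```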
S304: the test is completed, and the eyesight test result is calculated and displayed on the screen;
For example, after the test the user is prompted: "Your vision is 5.0. Tap Test to continue testing; otherwise exit."
The eyesight testing method and terminal provided by the above embodiments display a proportionally reduced visual acuity chart on the screen, highlight the test letters row by row during the test, and use the front-facing camera and light-sensing proximity sensor to capture and locate the up, down, left, and right directions of the user gesture, thereby simulating the process of testing vision at an optician and allowing users to conveniently test their own vision at any time.
Those of ordinary skill in the art will appreciate that all or some of the steps of the above method may be performed by a program instructing the relevant hardware, and that the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk, or an optical disc. Optionally, all or some of the steps of the above embodiments may also be implemented with one or more integrated circuits; correspondingly, each module/unit in the above embodiments may be implemented in the form of hardware or in the form of a software functional module. The present invention is not limited to any particular combination of hardware and software.
It should be noted that the present invention may have various other embodiments. Without departing from the spirit and essence of the invention, those of ordinary skill in the art may make various corresponding changes and variations according to the present invention, and all such changes and variations shall fall within the protection scope of the appended claims of the invention.

Claims (15)

1. An eyesight testing method, the method comprising:
displaying a visual acuity chart on a screen and, after the test starts, acquiring the initial position of a user gesture;
after highlighting a test character, detecting the movement direction of the user gesture, comparing the detected gesture direction with the opening direction of the test character, and judging from the comparison result whether the user can see the test character;
when the test needs to continue, highlighting the next test character; when the test need not continue, determining an eyesight test result and feeding the eyesight test result back to the user.
2. The method of claim 1, characterized in that:
acquiring the initial position of the user gesture includes:
detecting the initial ambient brightness with a light-sensing proximity sensor and capturing a user gesture image with a front-facing camera, then determining the initial position of the user gesture from the captured gesture image; or capturing a user gesture image with the front-facing camera and determining the initial position of the user gesture from the captured gesture image.
3. The method of claim 1 or 2, characterized in that:
detecting the movement direction of the user gesture after highlighting a test character includes:
after a test character is highlighted, sensing changes in ambient light brightness with the light-sensing proximity sensor and, once the sensor detects a change in ambient brightness, triggering the front-facing camera to capture a user gesture image; or, after a test character is highlighted, triggering the front-facing camera to capture a user gesture image after a waiting period has elapsed;
determining the current position of the user gesture from the gesture image captured by the front-facing camera;
comparing the current position of the user gesture with its previous position, and determining the movement direction of the user gesture from the comparison result.
4. The method of claim 3, characterized in that:
comparing the current position of the user gesture with its previous position and determining the movement direction of the user gesture from the comparison result includes:
if the current position of the user gesture differs from its previous position, analyzing the direction of the position change and taking that direction as the movement direction of the user gesture.
5. The method of claim 3, characterized in that:
comparing the current position of the user gesture with its previous position and determining the movement direction of the user gesture from the comparison result includes:
if the current position of the user gesture is the same as its previous position, extracting the fingertip image of the extended finger from the current gesture image, analyzing the direction in which the fingertip points, and taking the fingertip direction as the movement direction of the user gesture.
6. the method for claim 1, it is characterised in that:
The opening direction in the user gesture direction for detecting and the test character is compared, according to than Compared with result judgement user whether it can be seen that the test character, including:
As the user gesture direction is consistent with the opening direction of the test character, then judge that user can The test character is seen, the opening direction such as the user gesture direction and the test character is inconsistent, Then judge that user can not see the test character.
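The decision in claim 6 is a direct equality check between the two directions. A minimal sketch (function name is ours, not the patent's):

```python
def saw_character(gesture_direction, opening_direction):
    """True iff the detected gesture direction matches the opening direction
    of the displayed tumbling-E optotype; an undetected gesture counts as a
    miss (our assumption)."""
    return gesture_direction is not None and gesture_direction == opening_direction
```

For example, an E opening rightward answered with a leftward swipe is judged a miss.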
7. The method of claim 1 or 6, wherein:
said determining whether the test needs to continue comprises:
if the user can see the current test character, and the current test row of the eye chart has not been fully tested or a next row remains to be tested, judging that the test needs to continue;
if the user cannot see the current test character, and the number of test characters in the current test row of the eye chart that the user cannot see has reached a threshold, judging that the test need not continue.
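The stopping rule in claim 7 can be sketched as a small predicate. The miss threshold and the row-bookkeeping parameters are assumptions; the patent only states them abstractly ("reaches a threshold", "row not fully tested / next row exists").

```python
def should_continue(saw_it, chars_left_in_row, has_next_row,
                    misses_in_row, miss_threshold=3):
    """Decide whether the eyesight test proceeds to another character."""
    if saw_it:
        # First branch of claim 7: continue while the current row is
        # unfinished or another row remains to be tested.
        return chars_left_in_row > 0 or has_next_row
    # Second branch: stop once misses in this row reach the threshold.
    return misses_in_row < miss_threshold
```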
8. the method for claim 1, it is characterised in that:
Described highlight one test character, including:
The test character is highlighted, or described in flickering display, tests character, or in the test The lower section display highlighting of character.
9. A terminal for testing eyesight, comprising:
an initial display and positioning module, configured to display an eye chart on a screen and, after the test starts, obtain the initial position of a user gesture;
a test module, configured to detect the direction of motion of the user gesture after a test character is highlighted, compare the detected user gesture direction with the opening direction of the test character, and judge from the comparison result whether the user can see the test character;
a control module, configured to highlight the next test character when the test needs to continue, and, when the test need not continue, determine the eyesight test result and feed the result back to the user.
10. The terminal of claim 9, wherein:
the initial display and positioning module is configured to obtain the initial position of the user gesture by:
detecting the initial ambient brightness with a light/proximity sensor, capturing a user gesture image with the front-facing camera, and determining the initial position of the user gesture from the captured image; or capturing a user gesture image with the front-facing camera and determining the initial position of the user gesture from the captured image.
11. The terminal of claim 9 or 10, wherein:
the test module is configured to detect the direction of motion of the user gesture after a test character is highlighted by:
after a test character is highlighted, sensing a change in ambient light brightness with the light/proximity sensor, and triggering the front-facing camera to capture a user gesture image once the sensor senses the change; or, after a test character is highlighted, triggering the front-facing camera to capture a user gesture image after a waiting delay;
determining the current position of the user gesture from the user gesture image captured by the front-facing camera;
comparing the current position of the user gesture with its last position, and determining the direction of motion of the user gesture from the comparison result.
12. The terminal of claim 11, wherein:
the test module is configured to compare the current position of the user gesture with its last position and determine the direction of motion from the comparison result by:
if the current position of the user gesture differs from its last position, analyzing the direction of the position change, and taking that direction as the direction of motion of the user gesture.
13. The terminal of claim 11, wherein:
the test module is configured to compare the current position of the user gesture with its last position and determine the direction of motion from the comparison result by:
if the current position of the user gesture is the same as its last position, extracting the fingertip image of the extended finger from the current gesture image, analyzing the direction in which the fingertip points, and taking that direction as the direction of motion of the user gesture.
14. The terminal of claim 9, wherein:
the test module is configured to compare the detected user gesture direction with the opening direction of the test character and judge from the comparison result whether the user can see the test character by:
if the user gesture direction matches the opening direction of the test character, judging that the user can see the test character; if the user gesture direction does not match the opening direction of the test character, judging that the user cannot see the test character.
15. The terminal of claim 9 or 14, wherein:
the control module is configured to determine whether the test needs to continue by:
if the user can see the current test character, and the current test row of the eye chart has not been fully tested or a next row remains to be tested, judging that the test needs to continue;
if the user cannot see the current test character, and the number of test characters in the current test row of the eye chart that the user cannot see has reached a threshold, judging that the test need not continue.
CN201510559458.5A 2015-09-06 2015-09-06 Method and terminal for testing eyesight Pending CN106491071A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201510559458.5A CN106491071A (en) 2015-09-06 2015-09-06 Method and terminal for testing eyesight
PCT/CN2015/098093 WO2016131337A1 (en) 2015-09-06 2015-12-21 Method and terminal for detecting vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510559458.5A CN106491071A (en) 2015-09-06 2015-09-06 Method and terminal for testing eyesight

Publications (1)

Publication Number Publication Date
CN106491071A true CN106491071A (en) 2017-03-15

Family

ID=56688678

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510559458.5A Pending CN106491071A (en) 2015-09-06 2015-09-06 Method and terminal for testing eyesight

Country Status (2)

Country Link
CN (1) CN106491071A (en)
WO (1) WO2016131337A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108992034A (en) * 2017-06-06 2018-12-14 沈荷芳 Virtual reality eye detection system and eye detection method thereof
TWI788486B (en) * 2017-12-22 2023-01-01 日商視覺科技研究所股份有限公司 Visual function inspection system, optical characteristic calculation system, optical member selection method, optical member manufacturing method, display member manufacturing method, lighting device manufacturing method, visual function inspection device, optical characteristic calculation device, visual function inspection method, optical Calculation method of characteristics, computer program, and recording medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109171637A (en) * 2018-09-30 2019-01-11 苏州安视沛清科技有限公司 Vision testing method, device, computer storage medium and computer equipment
CN111700583B (en) * 2020-05-23 2023-04-18 福建生物工程职业技术学院 Detection method of indoor shared self-service vision detection system

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1883376A (en) * 2006-06-19 2006-12-27 陈宁宁 Self-testing eye chart with speech instruction
CN202179524U (en) * 2011-08-11 2012-04-04 王金华 Microcomputer E-table eyesight detector
CN103106391A (en) * 2011-11-14 2013-05-15 株式会社东芝 Gesture recognition apparatus and method thereof
CN103376890A (en) * 2012-04-16 2013-10-30 富士通株式会社 Gesture remote control system based on vision
CN103514437A (en) * 2012-12-24 2014-01-15 Tcl集团股份有限公司 Three-dimensional hand gesture recognition device and three-dimensional hand gesture recognition method
CN103890695A (en) * 2011-08-11 2014-06-25 视力移动技术有限公司 Gesture based interface system and method
CN103976706A (en) * 2014-05-20 2014-08-13 科云(上海)信息技术有限公司 Intelligent vision examination device
CN104000553A (en) * 2014-05-23 2014-08-27 何明光 Electronic vision detecting system adopting double-blind design
CN104095609A (en) * 2014-05-20 2014-10-15 大连戴安科技有限公司 Novel wearable intelligent myopia treatment instrument integrated with preventing, treating and detecting functions
WO2014168558A1 (en) * 2013-04-11 2014-10-16 Crunchfish Ab Portable device using passive sensor for initiating touchless gesture control
CN204557489U (en) * 2015-03-12 2015-08-12 山东大学 Based on the contactless tripper of video image gesture identification

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102984344A (en) * 2012-10-16 2013-03-20 广东欧珀移动通信有限公司 Method for testing vision with mobile phone and mobile phone
CN203074671U (en) * 2013-01-31 2013-07-24 浙江工贸职业技术学院 Intelligent eye test device
EP2821888B1 (en) * 2013-07-01 2019-06-12 BlackBerry Limited Gesture detection using ambient light sensors
CN104598224A (en) * 2014-12-26 2015-05-06 上海沙斐网络科技有限公司 Vision detection method based on terminal and vision detection terminal

Also Published As

Publication number Publication date
WO2016131337A1 (en) 2016-08-25

Similar Documents

Publication Publication Date Title
US11650659B2 (en) User input processing with eye tracking
US10884488B2 (en) Electronic device and method for controlling display
CN105279459B (en) A kind of terminal glance prevention method and mobile terminal
JP5942586B2 (en) Tablet terminal and operation reception program
JP5807989B2 (en) Gaze assist computer interface
JP6131540B2 (en) Tablet terminal, operation reception method and operation reception program
US11360605B2 (en) Method and device for providing a touch-based user interface
CN106231178B (en) A kind of self-timer method and mobile terminal
CN106491071A (en) A kind of method for giving a test of one's eyesight and terminal
CN104731314B (en) Last known browsing position indicating is provided using the biometric data towards movement
KR102326489B1 (en) Electronic device and method for controlling dispaying
US20180324366A1 (en) Electronic make-up mirror device and background switching method thereof
CN108829239A (en) Control method, device and the terminal of terminal
CN107239222A (en) The control method and terminal device of a kind of touch-screen
CN109101110A (en) A kind of method for executing operating instructions, device, user terminal and storage medium
CN104199608B (en) The method of quick open record and touch terminal on touching terminal
JP2011243108A (en) Electronic book device and electronic book operation method
CN107509024A (en) One kind is taken pictures processing method and mobile terminal
CN110174937A (en) Watch the implementation method and device of information control operation attentively
CN109960405A (en) Mouse operation method, device and storage medium
KR20140019215A (en) Camera cursor system
CN108600715A (en) A kind of method for controlling projection and projection device
JP6397508B2 (en) Method and apparatus for generating a personal input panel
CN113491502A (en) Eyeball tracking calibration inspection method, device, equipment and storage medium
CN110164444A (en) Voice input starting method, apparatus and computer equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20170315