CN111443665A - Intelligent ward control method and system based on eye movement signals - Google Patents

Intelligent ward control method and system based on eye movement signals

Info

Publication number
CN111443665A
CN111443665A
Authority
CN
China
Prior art keywords
menu
area
patient
eye movement
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010165965.1A
Other languages
Chinese (zh)
Inventor
葛盛
江一川
王鹏
何静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Southeast University filed Critical Southeast University
Priority application: CN202010165965.1A
Publication: CN111443665A
Legal status: Pending

Classifications

    • G05B19/41845: Total factory control characterised by system universality, reconfigurability, modularity
    • G05B19/4183: Total factory control characterised by data acquisition, e.g. workpiece identification
    • G05B19/4188: Total factory control characterised by CIM planning or realisation
    • G06F3/013: Eye tracking input arrangements
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus

Abstract

The invention relates to an intelligent ward control method and system based on eye movement signals, belonging to the intersection of human-computer interaction, signal processing, and smart-home technology. The system comprises an eye tracking module, a menu display module, a data analysis module, and a command output module. The menu display module presents a control menu to the patient; the eye tracking module records the patient's eye movement signal; the data analysis module resolves the mapping between the eye movement signal and the menu options; and the command output module transmits the selected menu option wirelessly to the sub-modules of the ward function module, which includes units such as an electric hospital bed, a ward call unit, a television, an air conditioner, and electric curtains. With the proposed control method and system, patients can express their daily-living needs and control ward facilities using gaze alone.

Description

Intelligent ward control method and system based on eye movement signals
Technical Field
The invention discloses an intelligent ward control method and system based on eye movement signals. In particular, it relates to a method by which a patient expresses needs and controls daily-living facilities in a ward by gazing at a menu shown on a display. It belongs to the intersection of human-computer interaction, signal processing, and smart-home technology.
Background
A large proportion of stroke patients suffer from speech and motor dysfunction, as does a significant proportion of the elderly population. In addition, tracheal or nasogastric intubation, or inflammation of the lungs, larynx, or oral cavity, can prevent normal speech in some patients. All of these groups are unable to express themselves through ordinary speaking, writing, or typing, and urgently need an assistive device to help them communicate.
A physiological-signal-based human-computer interface acquires physiological signals from the body and, through signal processing, feature extraction, and feature classification, converts them into commands issued to the outside world. Research and equipment that derive commands from electroencephalogram, electromyogram, and cerebral blood-oxygen-metabolism signals are now common. However, these methods require the patient to wear sensors, which can cause skin discomfort over long periods. They are also susceptible to myoelectric or electromagnetic interference present in the use environment, which degrades signal quality and can lead to misclassification.
Eye tracking is a technology that follows the movement trajectory of the eyeball, typically with an infrared camera. Using eye tracking to implement a human-computer interface is an emerging approach. This application aims to use eye tracking to help patients express their daily-living needs and control facilities in the ward.
Disclosure of Invention
The invention aims to overcome the shortcomings of the background art by providing an intelligent ward control method and system based on eye movement signals. It presents a control menu to a patient with speech or motor impairment, tracks the patient's eye movements, identifies by signal analysis the menu option the patient is attending to, and outputs a control signal to the corresponding device or equipment. This realizes intelligent control of ward equipment and solves the technical problem that such equipment cannot otherwise meet the daily-living needs of motor-impaired patients.
To achieve this aim, the invention adopts the following technical scheme. The eye-movement-based intelligent ward control scheme comprises the following three parts.
One) Construction of the control system based on a multi-level menu:
1) present the patient's daily-living need expression and facility control on a display in menu form, constructing a multi-level menu as a tree structure so that the need-expression and facility-control commands are finely subdivided;
2) the main menu consists of the options: call, entertainment, room management, bed adjustment, diet, and toilet. The first-level submenu of the call option contains: call doctor, call nurse, call care worker, call family. The first-level submenu of the entertainment option contains: television and radio; under each, a second-level submenu provides: power off, channel up/down, volume up/down. The first-level submenu of the room management option contains: air conditioner, lighting, curtains, each with a second-level submenu whose contents can be set according to the control requirements of the device. The first-level submenu of the bed adjustment option contains: head up, head down, legs up, legs down, left side up, left side down, right side up, right side down, reset. The first-level submenu of the diet option contains: food and beverage, each with a second-level submenu whose subdivision can be set according to actual use. The first-level submenu of the toilet option contains: stool and urine;
3) every submenu includes a return-to-main-menu option, and the number and content of the options at each level can be edited as needed.
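The tree-structured menu described above can be represented as a nested mapping. This is a minimal sketch; the English option names are illustrative renderings, and as the patent notes, the options are editable, so the second-level items under diet are left open.

```python
# Illustrative sketch of the patent's tree-structured control menu.
# A dict maps an option to its submenu; a list holds leaf commands.
MENU_TREE = {
    "call": ["doctor", "nurse", "care worker", "family"],
    "entertainment": {
        "television": ["power off", "channel up", "channel down",
                       "volume up", "volume down"],
        "radio": ["power off", "channel up", "channel down",
                  "volume up", "volume down"],
    },
    "room management": {
        "air conditioner": ["on", "off"],
        "lighting": ["on", "off"],
        "curtain": ["open", "close"],
    },
    "bed adjustment": ["head up", "head down", "legs up", "legs down",
                       "left up", "left down", "right up", "right down",
                       "reset"],
    "diet": {"food": [], "beverage": []},  # subdivision set per actual use
    "toilet": ["stool", "urine"],
}

def submenu(path):
    """Walk the tree along a list of option names.

    Every submenu additionally carries an implicit
    'return to main menu' option, which is not stored here.
    """
    node = MENU_TREE
    for key in path:
        node = node[key]
    return node
```

For example, `submenu(["entertainment", "television"])` yields the five second-level television commands, matching the two-row, three-column layouts shown later in the figures once the return button is added.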
Two) Analysis of the spatial and temporal characteristics of the gaze point:
1) configure the eyeball-tracking parameters of the infrared camera, and acquire the pupil and corneal reflection spots to obtain the eye movement coordinates corresponding to the gaze point;
2) calibrate the gaze point: display calibration points at the midpoints of the four screen edges and at the four corners, collect the eye movement coordinates while the observer fixates each point, and establish the mapping between screen positions and the eye movement coordinate range. Compute the deviation between the observer's gaze point and each calibration point; calibration is complete when the mean deviation is below 1 degree of visual angle, otherwise the calibration process continues until this error requirement is met;
3) sample the coordinates of the patient's gaze point on the display every 2 ms with the eye tracker. Using a 200 ms time window, compute the fixation duration of each screen pixel within the window, divide the screen into sub-regions according to the number and positions of the options in the currently displayed menu, total the fixation duration of all pixels in each sub-region, and take the sub-region with the maximum total as the gaze area for that window;
4) taking the moment the current menu is first displayed as time 0, slide the 200 ms window along the time axis starting from 200 ms with a 100 ms step, and determine the gaze area for each sliding window. When 5 consecutive sliding windows yield the same gaze area, output the menu option corresponding to that area as the patient's selection.
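The sub-degree calibration check in step 2) can be sketched as follows. The pixel pitch and viewing distance are illustrative assumptions, since the patent does not specify the screen geometry; only the acceptance criterion (mean deviation under 1 degree of visual angle) comes from the text.

```python
import math

def angular_deviation_deg(gaze_px, target_px, px_per_cm, eye_to_screen_cm):
    """Angular error in degrees between a measured gaze point and a
    calibration target, both in screen pixels. px_per_cm and
    eye_to_screen_cm are assumed setup parameters, not from the patent."""
    dx = (gaze_px[0] - target_px[0]) / px_per_cm
    dy = (gaze_px[1] - target_px[1]) / px_per_cm
    offset_cm = math.hypot(dx, dy)
    return math.degrees(math.atan2(offset_cm, eye_to_screen_cm))

def calibration_ok(pairs, px_per_cm, eye_to_screen_cm, limit_deg=1.0):
    """Calibration succeeds when the mean deviation over all
    calibration points is below one degree of visual angle."""
    errs = [angular_deviation_deg(g, t, px_per_cm, eye_to_screen_cm)
            for g, t in pairs]
    return sum(errs) / len(errs) < limit_deg
```

If the mean deviation fails the check, the procedure above would simply be repeated, as step 2) requires.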
Three) Control of the units of the ward function module:
1) the ward function module consists of an electric hospital bed, a ward call unit, a television, a radio, a lighting unit, an air conditioner, and an electric curtain unit. The electric bed unit provides integrated manual and electric control of head raising, leg raising, and left/right side turning of the bed; the ward call unit can call doctors, nurses, care workers, and family members; the television, radio, lighting, air conditioner, and electric curtain units each provide remote control of the corresponding device, including raising and lowering the curtains;
2) every control of the ward function module can be operated either by the patient's gaze or manually;
3) control units can be added to or removed from the ward function module.
By adopting the above technical scheme, the invention achieves the following beneficial effects:
(1) Gaze tracking is used to express daily-living needs and control facilities. The human-computer interface to the ward equipment is realized through eye tracking, so an intelligent ward that responds automatically to the patient's eye movement signals can be built simply by adding an eye tracker and the accompanying software to existing ward equipment.
(2) To prevent gaze drift while the patient views the menu from raising the heat totals of several sub-regions at once and thereby corrupting command output, the method counts the gaze area within sliding time windows after acquiring the raw gaze-point heat map, and accepts a gaze area as the patient's only when several consecutive sliding windows agree, ensuring that control commands are issued accurately.
(3) Each menu option can be edited online as needed, which makes the system easy to adapt to different use environments and application requirements, and helps users extend the ward function module to suit their actual needs.
Drawings
Fig. 1 is a system configuration diagram.
Fig. 2 is a system signal flow diagram.
Fig. 3 is a ward function block diagram.
Fig. 4 is a diagram of a smart ward control main menu displayed by the menu display module.
Fig. 5 is the first-level submenu corresponding to the call option.
Fig. 6 is the first-level submenu corresponding to the entertainment option.
Fig. 7 is the first-level submenu corresponding to the room management option.
Fig. 8 is the first-level submenu corresponding to the bed adjustment option.
Fig. 9 is the first-level submenu corresponding to the diet option.
Fig. 10 is the first-level submenu corresponding to the toilet option.
Fig. 11 is the second-level submenu corresponding to the television option.
Fig. 12 is the second-level submenu corresponding to the radio option.
Fig. 13 is a diagram illustrating the screen divided into sub-regions.
Fig. 14 is a schematic diagram of the sliding windows and sliding step.
Fig. 15 is an example of selecting the sub-region with the largest total fixation duration in a time window as the gaze area.
Fig. 16 is an example of determining a stable gaze area and outputting it as the patient's intent.
Detailed Description
The technical scheme of the invention is explained in detail in the following with reference to the attached drawings.
As shown in fig. 1, the intelligent ward control system based on eye movement signals disclosed by the invention comprises an eye tracking module, a menu display module, a data analysis module, and a command output module. As shown in fig. 2, the menu display module sends the content of each option of the currently displayed menu, together with the on-screen coordinates of the option buttons, to the data analysis module. The eye tracking module tracks the trajectory coordinates of the patient's eyeballs to acquire the raw gaze-point heat map. The data analysis module first applies Gaussian spatial smoothing to the raw heat map output by the eye tracking module, then uses a threshold method to keep only the heat distribution exceeding the threshold. It divides the screen into menu-option sub-regions according to the button coordinates sent by the menu display module, sums the heat values of all pixels in each sub-region, takes the sub-region with the largest sum as the patient's gaze area, reports the selected option back to the menu display module for display, and generates the control command for the control unit corresponding to that option. The command output module transmits the control command wirelessly to the corresponding units of the ward function module, which carry out the corresponding control functions.
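The core of the data-analysis step (threshold the heat map, sum the heat per menu sub-region, pick the hottest region) can be sketched as below. This is a minimal illustration, not the patent's implementation: the Gaussian spatial smoothing is assumed to have been applied already, and the screen is assumed to divide into an equal-size grid as in fig. 13.

```python
import numpy as np

def gaze_area(heatmap, rows, cols, threshold):
    """Return the (row, col) of the menu sub-region with the largest
    total heat, after zeroing heat values below the threshold.

    heatmap: 2-D array of per-pixel heat values, assumed already
    spatially smoothed; its height/width must divide evenly into the
    rows x cols grid for this sketch.
    """
    h = np.where(heatmap >= threshold, heatmap, 0.0)
    H, W = h.shape
    # Reshape into (rows, block_h, cols, block_w) and sum each block.
    sums = h.reshape(rows, H // rows, cols, W // cols).sum(axis=(1, 3))
    r, c = np.unravel_index(np.argmax(sums), sums.shape)
    return int(r), int(c)
```

For a two-row, three-column menu, the function returns which of the six option sub-regions currently holds the most gaze heat; sub-threshold drift elsewhere on the screen is discarded before the sums are taken.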
To prevent gaze drift while the patient views the menu from raising the heat totals of several sub-regions at once and thereby corrupting command output, the method counts the gaze area within sliding time windows after acquiring the raw gaze-point heat map, and accepts a gaze area as the patient's only when a number of consecutive sliding windows (for example, 5) yield the same area.
As shown in fig. 3, the ward function module consists of an electric hospital bed, a ward call unit, a television, a radio, a lighting unit, an air conditioner, and an electric curtain unit. The units addressed by the command output module are the bed's integrated manual-electric control handle, which adjusts head raising, leg raising, and left/right side turning, and the remote controllers of the television, radio, lighting, air conditioner, and electric curtain.
The need-expression and facility-control functions for daily living are displayed on the screen as a tree-structured menu. As shown in fig. 4, the main menu fills the display and consists of evenly spaced option buttons: call, entertainment, room management, bed adjustment, diet, and toilet. As shown in fig. 5, the first-level submenu of the call option consists of the buttons: call doctor, call nurse, call family, and return. As shown in fig. 6, the first-level submenu of the entertainment option consists of the buttons: television, radio, and return. As shown in fig. 7, the first-level submenu of the room management option consists of the buttons: air conditioner, lighting, curtain, and return. As shown in fig. 8, the first-level submenu of the bed adjustment option consists of the buttons: head up, head down, legs up, legs down, left side up, left side down, right side up, right side down, reset, and return. As shown in fig. 9, the first-level submenu of the diet option consists of the buttons: food, beverage, and return. As shown in fig. 10, the first-level submenu of the toilet option consists of the buttons: stool, urine, and return. As shown in fig. 11, the second-level submenu of the television option consists of the buttons: channel up, channel down, volume up, volume down, television off, and return. As shown in fig. 12, the second-level submenu of the radio option consists of the buttons: channel up, channel down, volume up, volume down, radio off, and return.
As shown in fig. 13, the menu display module divides the screen into sub-regions according to the number and positions of the currently displayed options; since the current menu has six options in two rows and three columns, the screen is divided into six sub-regions in two rows and three columns.
As shown in fig. 14, the gaze area within sliding time windows is counted as follows: taking the moment the current menu is first displayed as time 0, compute the heat value of each screen pixel within a 200 ms window; starting from 200 ms, slide the window along the time axis with a 100 ms step and determine the gaze area for each sliding window. When 5 consecutive sliding windows yield the same gaze area, the menu option corresponding to that area is taken as the patient's selection and its control command is output.
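The per-window region vote and the five-consecutive-window confirmation rule can be sketched as follows. The sample format (one (x, y) pixel coordinate per 2 ms sample) and the grid dimensions are illustrative assumptions; with a fixed sample period, dwell time in a region is proportional to its sample count.

```python
def gaze_region(samples, rows, cols, width, height):
    """Map each (x, y) gaze sample from one 200 ms window to a
    sub-region of a rows x cols menu grid and return the region
    with the most samples, i.e. the longest total fixation."""
    counts = {}
    for x, y in samples:
        r = min(int(y * rows / height), rows - 1)
        c = min(int(x * cols / width), cols - 1)
        counts[(r, c)] = counts.get((r, c), 0) + 1
    return max(counts, key=counts.get)

def confirmed_selection(window_regions, needed=5):
    """Return a region only once it has won `needed` consecutive
    sliding windows; otherwise return None (no selection yet)."""
    run, last = 0, None
    for region in window_regions:
        run = run + 1 if region == last else 1
        last = region
        if run >= needed:
            return region
    return None
```

A drifting gaze that alternates between regions never accumulates five agreeing windows, so no command is issued, which is exactly the robustness property the scheme claims.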
Fig. 15 is an example of selecting the sub-region with the largest total fixation duration in a time window as the gaze area. The screen is divided into equally sized sub-regions according to the number and positions of the options in the displayed menu. The total fixation duration of all pixels in each sub-region within the 200 ms window is computed, and the sub-region with the maximum total is taken as the gaze area for that window. In the figure, the sub-region in the first row and second column has the largest total fixation duration, so it is taken as the gaze area.
Fig. 16 is an example of determining a stable gaze area and outputting it as the patient's intent. When the gaze areas of 5 consecutive time windows are identical, the option corresponding to that area is output as the patient's intent. In the figure, all 5 consecutive windows yield the sub-region in the first row and second column, so the option corresponding to that sub-region is output as the patient's intent.
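A small arithmetic consequence of the stated timing follows directly and can be checked in a few lines: with a 200 ms window, a 100 ms step, and 5 agreeing windows required, the fifth window spans 400 to 600 ms, so the earliest possible selection arrives 600 ms after the menu appears. The helper names below are illustrative, not from the patent.

```python
def window_starts(total_ms, window_ms=200, step_ms=100):
    """Start times of the sliding windows that fit inside
    `total_ms` of gaze data after the menu appears."""
    return list(range(0, total_ms - window_ms + 1, step_ms))

def earliest_decision_ms(needed=5, window_ms=200, step_ms=100):
    """Time at which the `needed`-th consecutive window closes,
    i.e. the minimum latency before any command can be issued."""
    return window_ms + (needed - 1) * step_ms
```

This 600 ms floor trades responsiveness for robustness against the gaze drift discussed above; lowering `needed` would speed up selection at the cost of more accidental commands.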

Claims (8)

1. An intelligent ward control method based on eye movement signals, characterized in that the eye movement trajectory coordinates of a patient are collected to obtain a raw gaze-point heat map; the menu display area is divided into menu-option sub-regions according to the on-screen coordinates of the option buttons; the heat values of all pixels in each sub-region are totaled to determine the patient's gaze area; and the control command corresponding to the menu option in the patient's gaze area is output.
2. The intelligent ward control method based on eye movement signals of claim 1, wherein after the raw gaze-point heat map is obtained, the heat values of all pixels in each menu-option sub-region are counted within a sliding time window to determine a candidate gaze area, and the candidate is accepted as the patient's gaze area only when several consecutive sliding windows yield the same area.
3. The intelligent ward control method based on eye movement signals of claim 1, wherein determining the patient's gaze area from the pixel heat values specifically comprises: applying Gaussian spatial smoothing to the acquired raw gaze-point heat map, using a threshold method to keep the heat distribution exceeding the threshold, summing the heat values of all pixels in each menu-option sub-region, and taking the sub-region with the largest sum as the patient's gaze area.
4. The intelligent ward control method based on eye movement signals of claim 1, wherein the menu is a tree-structured multi-level menu comprising a main menu with call, entertainment, room management, bed adjustment, diet, and toilet options; a first-level submenu corresponding to each main-menu option; and second-level submenus under the first-level submenus. The first-level submenu of the call option contains: call doctor, call nurse, call care worker, call family. The first-level submenu of the entertainment option contains: television and radio, whose second-level submenus contain: power off, channel up/down, volume up/down. The first-level submenu of the room management option contains: air conditioner, lighting, curtains, whose second-level submenus contain the corresponding on/off items. The first-level submenu of the bed adjustment option contains: head up, head down, legs up, legs down, left side up, left side down, right side up, right side down, reset. The first-level submenu of the diet option contains: food and beverage. The first-level submenu of the toilet option contains: stool and urine.
5. A computer-readable storage medium on which a computer program is stored, which, when executed by a processor, carries out the method of any one of claims 1 to 4.
6. An intelligent ward control system based on eye movement signals, characterized by comprising:
an eye tracking module that tracks the trajectory coordinates of the patient's eyeballs to obtain a raw gaze-point heat map;
a menu display module that sends the content of each option of the currently displayed menu, and the on-screen coordinates of the option buttons, to the data analysis module;
a data analysis module that divides the menu display area into menu-option sub-regions according to the on-screen button coordinates, receives the raw gaze-point heat map from the eye tracking module, totals the heat values of all pixels in each sub-region to determine the patient's gaze area, and outputs the control command corresponding to the menu option in the patient's gaze area; and
a command output module that transmits the control command corresponding to the menu option in the patient's gaze area to the control unit of the ward function module.
7. An intelligent ward control device based on eye movement signals, characterized by comprising:
an eye tracker that tracks the trajectory coordinates of the patient's eyeballs to obtain a raw gaze-point heat map;
a display screen that sends the content of each option of the currently displayed menu, and the on-screen coordinates of the option buttons, to the controller; and
a controller that divides the menu display area into menu-option sub-regions according to the on-screen button coordinates, receives the raw gaze-point heat map from the eye tracker, totals the heat values of all pixels in each sub-region to determine the patient's gaze area, outputs the control command corresponding to the menu option in the patient's gaze area, and transmits that command to the control unit of the ward function module.
8. The intelligent ward control device based on eye movement signals of claim 7, characterized in that the eye tracker is fixed above the display screen, and the display screen is fixed above the hospital bed, tilted downward.
CN202010165965.1A 2020-03-11 2020-03-11 Intelligent ward control method and system based on eye movement signals Pending CN111443665A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010165965.1A CN111443665A (en) 2020-03-11 2020-03-11 Intelligent ward control method and system based on eye movement signals

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010165965.1A CN111443665A (en) 2020-03-11 2020-03-11 Intelligent ward control method and system based on eye movement signals

Publications (1)

Publication Number Publication Date
CN111443665A true CN111443665A (en) 2020-07-24

Family

ID=71627500

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010165965.1A Pending CN111443665A (en) 2020-03-11 2020-03-11 Intelligent ward control method and system based on eye movement signals

Country Status (1)

Country Link
CN (1) CN111443665A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103677221A (en) * 2012-08-31 2014-03-26 Utechzone Co., Ltd. Eye-controlled communication system
CN108270996A (en) * 2016-12-30 2018-07-10 Axis AB Gaze heat map
CN107783945A (en) * 2017-11-13 2018-03-09 Shandong Normal University Search-result web page attention assessment method and device based on eye movement tracking
CN207663395U (en) * 2017-12-26 2018-07-27 Weihai Anpeng Information Technology Co., Ltd. Intelligent control device for ward electrical appliances
CN109522887A (en) * 2019-01-24 2019-03-26 Beijing 7invensun Information Technology Co., Ltd. Gaze tracking method, apparatus, device and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3123794A1 (en) * 2021-06-14 2022-12-16 Centre Hospitalier Regional Universitaire De Tours Communication interface adapted based on a cognitive assessment for patients deprived of speech
WO2022263757A1 (en) * 2021-06-14 2022-12-22 Centre Hospitalier Regional Universitaire De Tours Communication interface adapted according to a cognitive evaluation for speech-impaired patients

Similar Documents

Publication Publication Date Title
EP3809241B1 (en) System and method for enabling communication through eye feedback
USRE39539E1 (en) System and method for monitoring eye movement
US6542081B2 (en) System and method for monitoring eye movement
Zhang et al. An EOG-based human–machine interface to control a smart home environment for patients with severe spinal cord injuries
CN107480462B (en) Intelligent clinical interaction system
US10039445B1 (en) Biosensors, communicators, and controllers monitoring eye movement and methods for using them
JP2024012497A (en) Communication methods and systems
WO2018013968A1 (en) Posture analysis systems and methods
CN110840666A (en) Wheelchair mechanical arm integrated system based on electro-oculogram and machine vision and control method thereof
Bang et al. New computer interface combining gaze tracking and brainwave measurements
CN106681494A (en) Environment control method based on brain computer interface
CN109948435A (en) Sitting posture prompting method and device
CN109508094B (en) Visual induction brain-computer interface method combined with asynchronous eye movement switch
IL268575A (en) System and method for patient monitoring
CN106681509A (en) Interface operating method and system
CN111443665A (en) Intelligent ward control method and system based on eye movement signals
Pandey et al. Assistance for paralyzed patient using eye motion detection
CN105763937A (en) Display device with health monitoring function and health monitoring method
CN102467230A Method and system for operating and controlling a cursor by human body
CN113160260A (en) Head-eye double-channel intelligent man-machine interaction system and operation method
CN113297966A (en) Night learning method based on multiple stimuli
CN111514001A (en) Full-automatic intelligent scraping device and working method thereof
Wang et al. Research on a spatial–temporal characterisation of blink-triggered eye control interactions
CN115662591A (en) Ward integrated management platform and server
WO2020132941A1 (en) Identification method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200724