WO2014146269A1 - Controlling electronic systems and devices using eye status and eye movements - Google Patents

Controlling electronic systems and devices using eye status and eye movements

Info

Publication number
WO2014146269A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
status
controller
alert
response
Prior art date
Application number
PCT/CN2013/072958
Other languages
French (fr)
Inventor
Honglei JIN
Original Assignee
Motorola Mobility Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motorola Mobility Llc filed Critical Motorola Mobility Llc
Priority to PCT/CN2013/072958
Publication of WO2014146269A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/013 Eye tracking input arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/52 Details of telephonic subscriber devices including functional features of a camera

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)

Abstract

A controller enables a user to control a mobile phone or other device using eye commands. In one example embodiment, the controller determines (230) an eye status of one or more of a user's eyes in a set of image information, and then changes an operational condition of the device, based on the determined eye status. For example, the controller can determine from image information that a user has blinked or winked one or more times and translate this eye status into the command of answering or dismissing an incoming telephone call, or opening or dismissing a text or email message.

Description

CONTROLLING ELECTRONIC SYSTEMS AND DEVICES USING EYE STATUS AND EYE MOVEMENTS
Technical Field
[0001] Various embodiments in the present disclosure concern control of electronic devices, such as mobile telephones and computers, via eye movements and status.
Background
[0002] Mobile telephones present a variety of alerts that require users to manually interact with their telephones. For example, many phones provide incoming call alerts, alarm clock alerts, expiring timer alerts, incoming text message alerts, and reminder alerts, each of which may require the user to touch his or her phone in some way to silence and/or otherwise terminate an audible, visual, and/or tactile alert. Manual interactions with their mobile phones, however, are not always convenient or possible for users whose hands are not free to pickup or touch the telephone, and for users whose hands are wet or dirty. Absent proper manual interaction, the alert typically continues for a period of time, potentially distracting the user and/or those nearby from other tasks.
[0003] Accordingly, there is an opportunity for new hands-free approaches to answering alerts and/or controlling other aspects of electronic devices, such as mobile phones.
Brief Description of Drawings
[0004] The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention, and explain various principles and advantages of those embodiments.
[0005] FIG. 1 is a block diagram of an example apparatus or system
corresponding to one or more embodiments.
[0006] FIG. 2 is a flow chart of an example method of operating a system, such as that shown in FIG. 1, and therefore corresponding to one or more embodiments.
[0007] FIG. 3 is a block diagram of an example computing device or system corresponding to one or more embodiments.
[0008] Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to facilitate understanding.
Detailed Description
[0009] This document, which incorporates the drawings and the appended claims, describes one or more specific embodiments of one or more inventions. These
embodiments, offered not to limit but only to exemplify and teach the invention, are shown and described in sufficient detail to enable those skilled in the art to implement or practice the invention(s). Thus, where appropriate to avoid obscuring the invention(s), the description may omit certain information known to those of skill in the art.
Example Device Embodiment(s)
[0010] FIG. 1 shows an example of a generic apparatus 100 incorporating some principles of the present invention. Apparatus 100 includes an alert element 110, an optical input component 120, and a controller 130.
[0011] Alert element 110 includes circuitry and other components to provide an audio alert, a visual alert, a mechanical vibration alert, or any combination thereof. The alert element 110 may be implemented as a loudspeaker, an electronic display (LCD, OLED, etc.), one or more single or multi-colored LEDs, a vibrator, and/or a haptic element. For example, alert element 110 may provide an audio ring tone and a flashing light and/or caller identification (caller ID) information via speech synthesis or via textual display in response to an incoming telephone call. In some embodiments, the alert is provided in the form of all or a portion of a textual email message header or as an image or icon associated with a sender of a message.
[0012] Optical input component 120, in some embodiments, includes one or more digital cameras or image sensors. In particular embodiments, the optical input component further includes one or more illumination sources, for example infrared sources in the form of light-emitting diodes (LEDs). In an example embodiment, the camera includes an indicator light that provides a visual indication to a user that the optical input component or camera is operational and/or that another operational condition of apparatus 100 had been satisfied, as determined by controller 130.
[0013] Controller 130, which is coupled to alert element 110 and to optical input component 120, includes an eye status detector 131. In the example embodiment, the eye status detector 131 is implemented as one or more programmable processor logic circuits, for performing one or more functions described herein. "Logic," as used herein, includes hardware, firmware, software and/or combinations of each to perform a function(s) or an action(s), and/or to cause a function or action from another component. For example, based on a desired application or need, logic may include a software controlled microprocessor, discrete logic such as an application specific integrated circuit (ASIC), a programmable logic device, memory device containing instructions, or the like, or combinational logic embodied in hardware. Logic may also be fully embodied as software (coded machine readable and executable instructions) stored on a non-transitory, tangible medium, which can be executed by a processor to effect an end result or series of desired actions in a process. Logic may suitably be used to implement one or more modules configured to perform one or more functions.
[0014] More specifically, controller module 130 is configured to control, for example, switch, actuate, or cause, alert element 110 to issue an alert in response to a signal representative or indicative of a predetermined event. Examples of predetermined events include an incoming call (voice or video), an incoming message (text, SMS, video, instant message, email), a timer alarm (stopwatch or clock), or a calendar reminder.
[0015] In FIG. 1, the event signal is shown as a broken-line arrow to indicate that its origin may be internal or external to apparatus 100 or internal or external to controller module 130. In particular embodiments, controller module 130 includes portions configured or programmed to provide an alarm clock function, a mobile telephony function, a messaging function, and/or a calendar function, each of which may generate the event signal.
[0016] In addition to causing alert element 110 to initiate issuance of an alert in response to the event signal, controller module 130 also activates optical input
component 120. In response to this activation, optical input component 120 begins providing image information, for example as a sequence of high-definition video image frames, to eye status detector 131. Eye status detector 131 is configured to detect whether a user's eyes appear in the images and then determine whether the eyes are gazing at or toward the optical input component. Some embodiments include a facial recognition function within eye status detector 131 to confirm the identity of the user as the owner or other authorized user of the apparatus. Other embodiments may use other approaches to identifying the user, or, like the example embodiment, allow the eye status detector to read and respond to the gaze of anyone within its field of view. If the eyes are determined to be gazing at the apparatus, eye status detector 131 suppresses or stops all or a portion of the ongoing alert, for example silencing an audible portion and/or stopping a vibrational portion of the ongoing alert.
[0017] In some embodiments, the eye status detector 131 continues receiving image data and then further determines, based on a further detected eye status, how to respond to the actual event itself. For example, the eye status detector 131 may determine, based on received images, that the user's eyes have remained open or closed longer than a predetermined amount of time, that the user has blinked more than a predetermined number of times within a predetermined window, that the user has winked (kept one eye open and one eye closed) during a particular image frame or during a particular window of time, or that the user has right winked or left winked. One or more of these status determinations can then be used as a command to direct the further response. For example, if the event were an incoming telephone call in a mobile telephone or in a tablet computer having a Voice-over-IP telephony application, a left wink (closed left eye, open right eye) can direct answering the call, and a right wink (closed right eye, open left eye) or double closure (both eyes closed, then both eyes opened, then both eyes closed, all within a predetermined time period) can direct dismissing the call.
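To make the mapping concrete, the following is a minimal sketch in Python of how such a wink-to-command table might look. The EyeStatus names and the call_action() helper are illustrative assumptions, not part of the disclosure.

```python
from enum import Enum

class EyeStatus(Enum):
    BOTH_OPEN = "both_open"
    BOTH_CLOSED = "both_closed"
    LEFT_WINK = "left_wink"    # closed left eye, open right eye
    RIGHT_WINK = "right_wink"  # closed right eye, open left eye

# Per-event command table for an incoming call, mirroring the example above.
INCOMING_CALL_COMMANDS = {
    EyeStatus.LEFT_WINK: "answer",
    EyeStatus.RIGHT_WINK: "dismiss",
}

def call_action(status):
    """Translate a detected eye status into a call action, or None if the
    status carries no command for this event type."""
    return INCOMING_CALL_COMMANDS.get(status)
```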
Example Method(s) of Operation
[0018] FIG. 2 shows a flow chart 200 of one or more example methods of operating an apparatus or system, such as apparatus 100, or one or more components thereof, with particular emphasis on eye status detector 131. Flow chart 200 includes process blocks 210-299, which are arranged and described in a serial execution sequence in the example embodiment. However, other embodiments execute two or more blocks in parallel using multiple processors or processor-like circuits, or a single processor organized as two or more virtual machines or subprocessors. Other embodiments also alter the process sequence or provide different functional partitions to achieve analogous results. Moreover, still other embodiments implement the blocks as two or more interconnected hardware modules with related control and data signals communicated between and through the modules. Additionally, the process flow can be readily recast in terms of a state diagram. Thus, the example process flow applies to software, hardware, and/or firmware implementations.
[0019] At block 210, the example method begins with receipt of an event signal.
In the example embodiment, this entails controller module 130 or apparatus 100 receiving an event signal, such as an incoming call notification signal, an incoming text message notification signal, an incoming email message notification signal, an alarm clock alert signal, or a low-battery status notification signal. Execution then continues at block 220.
[0020] Block 220 entails activating an alert and collecting image information. In the example embodiment, activating the alert entails activating one or more alert elements 110 to produce an audible, visible, and/or tactile alert. Collecting image information entails activating one or more image or optical sensors within optical input component 120. In some
embodiments, collecting this information entails activating one or more illumination sources, for example infrared LEDs.
[0021] Block 230 entails determining eye status based on the collected image information. In the example embodiment, this determination entails detecting or recognizing facial image regions or patterns within each image frame and further identifying eyes within each recognized facial region. (Some embodiments may perform a facial recognition test to confirm that facial image regions in the collected image data match or correspond to those of an authorized user.) The eye regions are then further analyzed or classified according to known techniques as one of: open, closed, transitional (between open and closed), or unknown. Known techniques can then be used to determine a point or direction of gaze. Many methods are available to determine eye movement and gaze direction. Some methods use a sequence of video images to extract the eye position. Some methods use an infrared light transmitter to send out infrared light and capture the reflected infrared light image with an infrared camera, detecting the eye movement according to the characteristics of the eye's response to the light. Some embodiments may use facially mounted sensors, reflectors, or other elements to facilitate determination of eye gaze direction. After eye status is determined, execution continues at block 240.
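As one hedged illustration of block 230, the sketch below uses the open-source OpenCV library's bundled Haar cascades to locate a face and then eyes within it. The cascade files named here ship with OpenCV; the open/closed heuristic (a Haar eye detector tends to fire only on open eyes) is a simplifying assumption, not the patent's specific algorithm.

```python
import cv2

# Pre-trained Haar cascades bundled with OpenCV: one possible way to locate
# faces, and then eyes within each face region, as block 230 describes.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def classify_eye_status(frame):
    """Return 'open', 'closed', or 'unknown' for the first detected face.

    A crude sketch: no eyes found inside a detected face is treated as
    'closed'. A production system would use a dedicated open/closed
    classifier and also report the 'transitional' state from the text.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return "unknown"      # no face: eye status cannot be determined
    x, y, w, h = faces[0]
    eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
    return "open" if len(eyes) >= 2 else "closed"
```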
[0022] Block 240 determines whether the eye status information indicates that the user has acknowledged the alert by gazing at the device. An eye status is determined to be a gaze if both eyes are open and directed at the device for a predetermined period of time. If the determination is negative (no), execution branches to block 250; if the determination is affirmative, execution proceeds to block 260.
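A minimal sketch of the block 240 gaze test, assuming upstream code already reports per frame whether both eyes are open and directed at the device; the half-second hold threshold is an illustrative choice, since the disclosure leaves the predetermined period unspecified.

```python
import time

GAZE_HOLD_SECONDS = 0.5  # illustrative threshold; not specified in the text

class GazeAcknowledger:
    """Sketch of block 240: a gaze acknowledges the alert only after both
    eyes have been open and directed at the device for a continuous period."""

    def __init__(self, hold_seconds=GAZE_HOLD_SECONDS):
        self.hold_seconds = hold_seconds
        self._gaze_start = None  # timestamp when the current gaze began

    def update(self, both_eyes_open, directed_at_device):
        """Feed one frame's status; return True once the gaze is held."""
        now = time.monotonic()
        if both_eyes_open and directed_at_device:
            if self._gaze_start is None:
                self._gaze_start = now
            return (now - self._gaze_start) >= self.hold_seconds
        self._gaze_start = None  # gaze broken: restart the timer
        return False
```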
[0023] Block 250 entails determining whether the alert has reached its end time. If it has not, execution returns to block 230 to determine eye status again based on the most recent image information. If the alert has reached its terminal time, execution advances to block 299, which entails ending the alert and taking a default (predetermined) action in response to the incoming event. For example, if the event is an incoming phone call, a default action could be to route the incoming call to a voice mail application for handling.
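Blocks 220 through 250 together form a polling loop. A sketch under assumed interfaces (the alert and camera objects and the detect_gaze() callback are hypothetical names, not from the disclosure) might look like this:

```python
import time

def run_alert_phase(alert, camera, detect_gaze, timeout_s):
    """Sketch of blocks 220-250: poll eye status until an acknowledging
    gaze arrives or the alert reaches its end time."""
    alert.start()                          # block 220: activate the alert
    camera.activate()                      # block 220: collect image info
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:     # block 250: end time reached?
        frame = camera.read_frame()
        if detect_gaze(frame):             # blocks 230/240: acknowledging gaze?
            return "acknowledged"          # proceed to block 260
    alert.stop()
    return "timed_out"                     # block 299: take the default action
```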
[0024] Block 260 entails suppressing, muting, or silencing all or part of the ongoing alert when a gaze is detected per block 240. For example, the alert may include audible, visible, and tactile portions. In this case, partial suppression of the alert entails terminating the audible and tactile portions of the alert. Other embodiments may reduce the volume or sound level of the audible portion to a non-zero amount, for example 10, 20, 30, 40, or 50% of its original level. Similarly, some embodiments reduce the level of one or more portions of the alert, reflecting that the user's gaze indicates the attention-commanding function of the alert has been largely, if not fully, realized. Execution advances next to block 270.
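A sketch of the block 260 partial suppression, assuming a hypothetical Alert object whose attributes and methods are illustrative names; the 20% level is one of the example reductions (10-50%) mentioned above.

```python
def suppress_alert(alert, audible_scale=0.2):
    """Sketch of block 260: partially suppress an ongoing alert once a
    gaze is detected.

    audible_scale=0.2 keeps the ring tone at 20% of its original level,
    one of the example reductions given in the text.
    """
    if alert.has_tactile:
        alert.stop_vibration()  # the tactile portion is terminated outright
    if alert.has_audible:
        alert.set_volume(alert.original_volume * audible_scale)
    # The visible portion (e.g., caller ID on the display) is left running.
```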
[0025] Block 270 entails determining a current eye status. As in block 230, the example embodiment determines eye status by using one or more known computational and/or pattern recognition techniques for recognizing eyes in one or more of a sequence of video images and determining the eye status as left open, closed, transitional, or unknown, and right open, closed, transitional, or unknown in each of a set of images.
[0026] Block 280 entails determining whether the eye status information for the image data matches with one of a set of eye commands available for the event (or type of alert). In the exemplary embodiment, this entails comparing the eye status for a set of two or more consecutive image frames in a given temporal window after receipt of an acknowledgement gaze (both eyes directed toward the apparatus) to determine if the eye status matches that associated with one of a set of one or more predetermined action eye commands for the event. For example, if the event is an incoming call, the set of predetermined eye commands may be to answer or dismiss, with the answer command triggered by a sequence of images indicative of two blinks (double closure of both eyes) within a given time window and the dismiss command triggered if a sequence of images indicates no blinks (no dual eye closures) within the given time window. The answer or dismiss command, in some embodiments, is triggered by a wink occurring in a predetermined number of consecutive frames.
[0027] More particularly, if an eye command is detected, execution branches to block 290, which entails ending the alert (any portion which is still active) and taking an action based on the detected eye command, for example answering or dismissing the call, or opening an email message and marking it as read. In some embodiments, the email is marked read in response to an amount of time a user's eyes are determined by the eye status detector to be gazing at a display screen displaying the email message. The amount of time is, in some embodiments, a function of the length of the email message. Other embodiments can monitor for specific eye status information associated with a command to mark the email as read, allowing the user to decide whether to mark as read or leave unmarked. If no eye command is detected, execution branches to block 295, where the eye status detector module 131 determines whether the alert has timed out. If the alert has not timed out, execution branches back to block 270 to determine an eye status from additional image data. If the alert has timed out, execution advances to block 299, at which point the alert is ended and any default action associated with the event, for example, an incoming call, is taken by the controller. From this juncture, the controller in some embodiments shifts into an idle or wait state until the next event occurs, restarting the process at block 210.
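One way to realize the block 280 matching for an incoming call is a sliding-window blink counter, sketched below. The window length, the two-blink threshold, and the choice to dismiss when the window elapses without the answer pattern are illustrative assumptions, not values from the disclosure.

```python
import time
from collections import deque

class BlinkCommandMatcher:
    """Sketch of block 280 for an incoming call: count double-eye closures
    (blinks) in a sliding window after the acknowledgement gaze. Two blinks
    map to 'answer'; a window that elapses without the answer pattern maps
    to 'dismiss'."""

    def __init__(self, window_seconds=2.0, blinks_to_answer=2):
        self.window_seconds = window_seconds
        self.blinks_to_answer = blinks_to_answer
        self._blink_times = deque()
        self._window_start = time.monotonic()

    def update(self, both_eyes_closed, previously_open):
        """Feed one frame's status; return 'answer', 'dismiss', or None.

        previously_open is the prior frame's status, so an open-to-closed
        edge counts as exactly one blink.
        """
        now = time.monotonic()
        if both_eyes_closed and previously_open:
            self._blink_times.append(now)
        # Drop blinks that have fallen out of the sliding window.
        while self._blink_times and now - self._blink_times[0] > self.window_seconds:
            self._blink_times.popleft()
        if len(self._blink_times) >= self.blinks_to_answer:
            return "answer"
        if now - self._window_start > self.window_seconds:
            return "dismiss"  # window elapsed without the answer pattern
        return None
```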
[0028] FIG. 2 also includes a divider line 201 between blocks 260 and 270, which separates flow chart 200 into an upper portion (blocks 210-260) and a lower portion (blocks 270-299). In some embodiments, the upper and lower portions can be executed independently; and in other embodiments, one of the upper and lower portions or a part thereof is entirely omitted.
Exemplary System(s)
[0029] FIG. 3 shows an example electronic device or system 300, which incorporates an eye status detector, identical or similar to that described previously with respect to FIG. 1 and FIG. 2. System or apparatus 300 is generally representative of a personal computer, desktop computer, laptop computer, tablet computer, workstation, personal digital assistant, smart phone, mobile telephone, global positioning receiver, gaming system, remote controller, or any other device having an image or optical sensing capability and a processing capability and one or more user controllable functions.
Specifically, device 300 includes a central bus 301 which interconnects a power module 310, a transceiver module 320, a user interface module 330, a camera module 340, a processor module 350, and a memory module 360.
[0030] Power module 310 includes components and circuitry for providing power to various components of the electronic device 300. In the example embodiment, module 310 includes a power supply, one or more batteries, battery-charging circuitry, and an AC adapter module and plug (none of which are shown separately in the figure).
[0031] Transceiver module 320 includes one or more transceivers, transmitters, and/or receiver circuits for enabling communications with external devices, systems, and/or networks via available communications protocols. Some embodiments include circuitry for enabling personal area, local area, wide area, or metropolitan area wireless communications via one or more of the following protocols: GSM (Global System for Mobile Communications), Bluetooth, WiFi, WiMAX, GPS (Global Positioning System), LTE (Long Term Evolution), and UMTS (Universal Mobile Telecommunications
System). Transceiver module 320 may also include one or more antennae 322, which are configured according to any known or developed structures for radiating and/or receiving electromagnetic energy as desired for one or more of the wireless transceivers, transmitters, and/or receiver circuits.
[0032] User interface module 330 includes one or more microphones, keyboards, alpha-numeric keyboards, pointing elements, isolated buttons, soft and/or hard keys, touch screens, jog wheels, and/or any other known input components. Additionally, the user interface module includes one or more alert elements, such as a loudspeaker, electronic display (LCD, OLED), LEDs, and/or vibrator for creating audible, visible, and/or tactile alerts.
[0033] Camera module 340 includes one or more light or optical sensors, for example in the form of one or more gridded arrays of image sensors. In some embodiments, the multiple image sensors are arranged to collect data from opposite directions, such as on the front and rear major surfaces of an apparatus housing.
[0034] Processor module 350 includes one or more processors, processing circuits, or controllers. In the example embodiment, processor module 350 takes any convenient or desirable form and implements the flow chart of FIG. 2.
[0035] Memory module 360 takes the example form of one or more electronic, magnetic, or optical data-storage devices that store code (machine-readable or executable instructions). Specifically, memory module 360 stores code for operating system module 361, applications module 362, and eye status detector module 363 (131).
[0036] In the example embodiment, operating system module 361 takes the form of a conventional operating system (OS), such as Google Chrome OS, Android OS, Apple OS X, Apple iOS, Microsoft Windows, Microsoft Windows Mobile, or Linux.
[0037] Applications module 362 includes one or more applications, such as a word processor, a browser, a blogging application, a social networking application, a messaging application, a presentation application, a timer or alarm clock application, a telephony application, a calendar application, and a game application, one or more of which can generate an event requiring an alert and provide an event signal to eye status detector module 363 (which is similar in functionality to eye status detector 131 described with respect to FIGs. 1 and 2).
[0038] In some embodiments, one or more portions of the functionality of eye status detector module 363 are provided as part of the host operating system; whereas in others, one or more portions of the functionality are provided as part of an application, for example, a plug-in, add-on, or extension to one or more of the applications within applications module 362. In some embodiments, the eye status detector module includes a user profile or user preference portion, which allows a user to define and/or associate eye commands with various command options in one or more applications within applications module 362 or with one or more operating system commands. These preferences are stored, for example, with other application preferences, and accessed by the user for setting and adjustment along with other user preferences.
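A minimal sketch of such a preference table, with assumed application names and eye-command identifiers (none of these bindings come from the disclosure):

```python
# Per-application table binding eye commands to actions; all keys and
# values are hypothetical examples of user-configurable preferences.
DEFAULT_EYE_COMMAND_PREFERENCES = {
    "telephony": {"left_wink": "answer_call", "right_wink": "dismiss_call"},
    "email":     {"double_blink": "open_message", "long_gaze": "mark_read"},
    "alarm":     {"double_blink": "snooze"},
}

def rebind(prefs, app, eye_command, action):
    """Let the user re-associate an eye command with a different action."""
    prefs.setdefault(app, {})[eye_command] = action
```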
Conclusion
[0039] In the foregoing specification, specific embodiments have been described.
However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings.
[0040] The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims including any amendments made during the pendency of this application and all equivalents of those claims as issued.
[0041] Moreover, in this document, relational terms, such as second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," "has",
"having," "includes", "including," "contains", "containing" or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises ...a", "has ...a", "includes ...a", "contains ...a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, contains the element. The terms "a" and "an" are defined as one or more unless explicitly stated otherwise herein. The terms "substantially", "essentially", "approximately", "about" or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1% and in another embodiment within 0.5%. The term "coupled" as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is "configured" in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
[0042] It will be appreciated that some embodiments may include one or more generic or specialized processors (or "processing devices") such as microprocessors, digital signal processors, customized processors and field programmable gate arrays (FPGAs) and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used.
[0043] Moreover, some embodiments can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (e.g., including a processor) to perform a method as described and claimed herein. Likewise, a computer-readable storage medium can include a non-transitory machine readable storage device, having stored thereon a computer program that includes a plurality of code sections for performing operations, steps or a set of instructions.
[0044] Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an
EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
[0045] The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.

Claims

The claimed invention is:
1. An apparatus comprising:
an alert element;
an optical input component; and
a controller coupled to the alert element and the optical input component, wherein the controller is configured to:
initiate an alert via the alert element in response to a predetermined event; identify an eye status based on image information from the optical input component; and
change the alert in response to determining that the eye status satisfies a first predetermined condition.
2. The apparatus of claim 1:
wherein the predetermined event is an incoming telephone call;
wherein the alert element includes a loudspeaker; and
wherein the controller is configured to reduce a sound level of the loudspeaker in response to the eye status satisfying the first predetermined condition.
3. The apparatus of claim 2, wherein the controller is further configured to:
identify another eye status based on further image information from the optical input component; and
to answer the incoming telephone call in response to determining that the another eye status satisfies a predetermined condition.
4. The apparatus of claim 2, wherein the controller is further configured to:
identify another eye status based on further image information from the optical input component; and to dismiss the incoming telephone call in response to determining that the another eye status satisfies a predetermined condition.
5. The apparatus of claim 1:
wherein the predetermined event is receipt of an alarm signal from an alarm
clock;
wherein the alert element includes a loudspeaker; and
wherein the controller is configured to reduce a sound level of the loudspeaker in response to the eye status satisfying the first predetermined condition.
6. The apparatus of claim 5, wherein the controller is further configured to:
identify another eye status based on further image information from the optical input component; and
snooze the alarm clock in response to determining that the another eye status satisfies a predetermined condition.
7. The apparatus of claim 1:
wherein the predetermined event is a calendar reminder notification;
wherein the alert element includes a loudspeaker and a display; and
wherein the controller is configured to reduce a sound level of the loudspeaker in response to the eye status satisfying the first predetermined condition.
8. The apparatus of claim 7, wherein the controller is further configured to:
identify another eye status based on further image information from the optical input component; and
open a calendar application in response to determining that the another eye status satisfies a predetermined condition.
9. The apparatus of claim 1:
wherein the predetermined event is an email receipt notification;
wherein the alert element includes a loudspeaker and a display; and
wherein the controller is configured to reduce a sound level of the loudspeaker in response to the eye status satisfying the first predetermined condition.
10. The apparatus of claim 9, wherein the controller is further configured to:
identify another eye status based on further image information from the optical input component; and
open an email in response to determining that the another eye status satisfies a predetermined condition.
11. The apparatus of claim 1:
wherein the image information includes a set of two or more consecutive images from a sequence of images; and
wherein the controller is configured to determine whether the image information represents two opened eyes of a person for a predetermined amount of time and to decrease a sound level of a loudspeaker in response to determining that the image information represents the two opened eyes of the person for the predetermined amount of time.
12. The apparatus of claim 11, wherein determining that the image information represents the two opened eyes includes determining that the image information includes a left eye and a right eye that are opened concurrently.
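As a non-limiting sketch of one plausible way to determine that a left eye and a right eye are opened concurrently across a set of consecutive images, the following assumes OpenCV and its bundled Haar eye cascade; the disclosure does not prescribe any particular detection library, and the frame count standing in for the predetermined amount of time is an arbitrary illustrative choice.

import cv2

# The bundled open-eye cascade generally does not fire on closed eyes,
# so two detections are treated as a left eye and a right eye that are
# opened concurrently.
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def both_eyes_open(gray_frame):
    eyes = eye_cascade.detectMultiScale(
        gray_frame, scaleFactor=1.1, minNeighbors=5)
    return len(eyes) >= 2

def open_for_duration(gray_frames, required_consecutive=10):
    # At a known frame rate, a run of consecutive detections serves as
    # a proxy for the predetermined amount of time.
    run = 0
    for frame in gray_frames:
        run = run + 1 if both_eyes_open(frame) else 0
        if run >= required_consecutive:
            return True
    return False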
13. The apparatus of claim 1, further comprising:
a wireless transceiver module, coupled to the controller, configured to provide an incoming communication notification to the controller, wherein the controller initiates the alert responsive to receiving the incoming communication notification.
14. The apparatus of claim 1, wherein the optical input component is a camera;
wherein the eye status is one of a closed eye and an open eye of a single person; and
wherein the first predetermined condition is that the eye status be that of the closed eye and the open eye for at least a predetermined amount of time.
15. The apparatus of claim 1, wherein the alert element includes an LED.
16. The apparatus of claim 1, wherein the alert element includes a haptic component.
17. A method comprising:
receiving a signal indicative of an incoming communication;
activating a camera to receive image information in response to receiving the signal;
providing an alert notification in response to receiving the signal; and
changing the alert notification based on information regarding a status of at least one eye in the image information.
18. The method of claim 17, wherein changing the alert notification includes discontinuing at least a portion of the alert notification.
19. The method of claim 17:
wherein the signal is indicative of an incoming telephone call;
wherein the image information includes data representative of a left eye and right eye of a person; and
wherein the method further comprises:
answering the incoming telephone call in response to a status of the left eye being different than a status of the right eye.
20. The method of claim 19, wherein the status of one of the left eye and the right eye is open and the status of another of the right eye and the left eye is closed.
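A non-limiting sketch of the condition of claims 19 and 20, in which the status of the left eye differs from that of the right eye (a wink), again assuming OpenCV Haar cascades rather than any method prescribed by this disclosure: a face is located first, and a wink is inferred when exactly one open eye is detected within it.

import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_wink(gray_frame):
    # Return True when a face is present but only one open eye is found,
    # i.e. one eye is open while the other is closed.
    faces = face_cascade.detectMultiScale(gray_frame, 1.1, 5)
    for (x, y, w, h) in faces:
        roi = gray_frame[y:y + h // 2, x:x + w]  # eyes lie in the upper half
        eyes = eye_cascade.detectMultiScale(roi, 1.1, 5)
        if len(eyes) == 1:
            return True
    return False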

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2013/072958 WO2014146269A1 (en) 2013-03-20 2013-03-20 Controlling electronic systems and devices using eye status and eye movements

Publications (1)

Publication Number Publication Date
WO2014146269A1 (en) 2014-09-25

Family

ID=51579288

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2013/072958 WO2014146269A1 (en) 2013-03-20 2013-03-20 Controlling electronic systems and devices using eye status and eye movements

Country Status (1)

Country Link
WO (1) WO2014146269A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102385308A (en) * 2010-09-06 2012-03-21 英业达股份有限公司 Portable electronic device and operating method thereof
CN102467088A (en) * 2010-11-16 2012-05-23 深圳富泰宏精密工业有限公司 Face recognition alarm clock and method for wakening user by face recognition alarm clock
CN102799277A (en) * 2012-07-26 2012-11-28 深圳先进技术研究院 Wink action-based man-machine interaction method and system

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9978265B2 (en) 2016-04-11 2018-05-22 Tti (Macao Commercial Offshore) Limited Modular garage door opener
US10015898B2 (en) 2016-04-11 2018-07-03 Tti (Macao Commercial Offshore) Limited Modular garage door opener
US10127806B2 (en) 2016-04-11 2018-11-13 Tti (Macao Commercial Offshore) Limited Methods and systems for controlling a garage door opener accessory
US10157538B2 (en) 2016-04-11 2018-12-18 Tti (Macao Commercial Offshore) Limited Modular garage door opener
US10237996B2 (en) 2016-04-11 2019-03-19 Tti (Macao Commercial Offshore) Limited Modular garage door opener

Similar Documents

Publication Publication Date Title
US11886695B2 (en) Notification processing method and electronic device
US10423235B2 (en) Primary device that interfaces with a secondary device based on gesture commands
CN109154858B (en) Intelligent electronic device and operation method thereof
US8928723B2 (en) Mobile terminal and control method thereof
EP3331226B1 (en) Method and device for reading messages
US20180046336A1 (en) Instant Message Processing Method and Apparatus, and Storage Medium
US10610152B2 (en) Sleep state detection method, apparatus and system
KR102053361B1 (en) Method for performing smart rotation an electronic device thereof
EP3121701A1 (en) Method and apparatus for single-hand operation on full screen
US9760165B2 (en) Mobile terminal device and input operation receiving method for switching input methods
US9380433B2 (en) Mobile terminal and control method thereof
US9348412B2 (en) Method and apparatus for operating notification function in user device
EP2708994A2 (en) Triggering method and wireless handheld device
WO2014084224A1 (en) Electronic device and line-of-sight input method
CN110673783B (en) Touch control method and electronic equipment
KR20160113906A (en) Mobile terminal and control method thereof
EP3410282A1 (en) Electronic device and method for controlling user interface of electronic device
US11410647B2 (en) Electronic device with speech recognition function, control method of electronic device with speech recognition function, and recording medium
US20150234586A1 (en) Mobile terminal and method of controlling the same
EP2928163A1 (en) Messaging system and method thereof
WO2014146269A1 (en) Controlling electronic systems and devices using eye status and eye movements
US20160139770A1 (en) Method for presenting prompt on mobile terminal and the same mobile terminal
CN106126050B (en) Menu display method and device
KR20150111834A (en) Mobile terminal and method for controlling the same
EP3848780B1 (en) Electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13878976

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 26/11/2015)

122 Ep: pct application non-entry in european phase

Ref document number: 13878976

Country of ref document: EP

Kind code of ref document: A1