WO2018113187A1 - A display control method and display device - Google Patents

A display control method and display device

Info

Publication number
WO2018113187A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
frequency
terminal
state
image
Prior art date
Application number
PCT/CN2017/086116
Other languages
English (en)
French (fr)
Inventor
王明良
Original Assignee
惠科股份有限公司
重庆惠科金渝光电科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 惠科股份有限公司, 重庆惠科金渝光电科技有限公司 filed Critical 惠科股份有限公司
Priority to US 15/557,808, granted as US10255874B2
Publication of WO2018113187A1

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
                • G06F3/013 Eye tracking input arrangements
              • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
            • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
              • G06F3/147 Digital output to display device using display panels
        • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
            • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
              • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
                • G06V40/161 Detection; Localisation; Normalisation
                  • G06V40/167 Detection; Localisation; Normalisation using comparisons between temporally consecutive images
                • G06V40/168 Feature extraction; Face representation
                  • G06V40/171 Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
                • G06V40/174 Facial expression recognition
                  • G06V40/176 Dynamic expression
              • G06V40/18 Eye characteristics, e.g. of the iris
                • G06V40/19 Sensors therefor
                • G06V40/193 Preprocessing; Feature extraction
                • G06V40/197 Matching; Classification
      • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
        • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
          • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
            • G09G5/003 Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
          • G09G2320/00 Control of display operating conditions
            • G09G2320/08 Arrangements within a display terminal for setting, manually or automatically, display parameters of the display terminal
          • G09G2330/00 Aspects of power supply; Aspects of display protection and defect management
            • G09G2330/02 Details of power systems and of start or stop of display operation
              • G09G2330/021 Power management, e.g. power saving
                • G09G2330/022 Power management, e.g. power saving in absence of operation, e.g. no data being entered during a predetermined time
          • G09G2354/00 Aspects of interface with display user
          • G09G2360/00 Aspects of the architecture of display systems
            • G09G2360/14 Detecting light within display terminals, e.g. using a single or a plurality of photosensors

Definitions

  • The present disclosure relates to the field of display control technologies, and in particular, to a display control method and a display device.
  • The embodiments of the present application provide a display control method and a display device that can detect, via an added image sensor, whether a user is present in front of the terminal, and perform intelligent display control of the terminal and the display device according to the eye blinking frequency calculated from the acquired images.
  • In one aspect, an embodiment of the present application provides a display control method, where the method includes:
  • acquiring an image in an imaging area in front of the terminal in real time;
  • calculating a frequency of the user's eye blinking according to the acquired image;
  • selecting a corresponding operation according to the frequency of the eye blinking and a preset rule, where the preset rule is a correspondence between the frequency of the eye blinking and operations; and
  • controlling the terminal to perform a corresponding function according to the operation.
  • In another aspect, an embodiment of the present application provides a display device, where the display device includes:
  • a storage module for storing program instructions; and
  • a processing module electrically connected to the display panel and the storage module, and configured to invoke and execute the program instructions to perform the following steps:
  • acquiring an image in an imaging area in front of the terminal in real time;
  • calculating a frequency of the user's eye blinking according to the acquired image;
  • selecting a corresponding operation according to the frequency of the eye blinking and a preset rule, where the preset rule is a correspondence between the frequency of the eye blinking and operations; and
  • controlling the terminal to perform a corresponding function according to the operation.
  • In a further aspect, an embodiment of the present application provides a display device, where the display device includes:
  • an image acquisition module configured to acquire an image in an imaging area in front of the terminal in real time;
  • a frequency calculation module configured to calculate a frequency of the user's eye blinking according to the acquired image;
  • an operation selection module configured to select a corresponding operation according to the frequency of the eye blinking and a preset rule, where the preset rule is a correspondence between the frequency of the eye blinking and operations; and
  • a control module configured to control the terminal to perform a corresponding function according to the operation.
  • In the embodiments of the present application, the image sensor is used to detect whether a user is present, the frequency of the user's eye blinking is calculated from the acquired images, and intelligent display control of the terminal and the display device is performed according to a preset rule. For example, if the blinking frequency is in a first frequency range, a first operation is performed; if the blinking frequency is in a second frequency range, a second operation is performed. This improves the intelligent control level of electronic products with a display function and improves the user experience.
  • FIG. 1 is a schematic flow chart of a display control method according to an embodiment of the present application.
  • FIG. 2 is a schematic sub-flow diagram of a display control method according to a first embodiment of the present application.
  • FIG. 3 is another schematic sub-flow diagram of a display control method according to a second embodiment of the present application.
  • FIG. 4 is a schematic block diagram of a terminal provided by an embodiment of the present application.
  • FIG. 5 is a schematic block diagram of a frequency calculation module of a terminal according to a first embodiment of the present application.
  • FIG. 6 is a schematic block diagram of an operation selection module of a terminal according to the first embodiment of the present application.
  • FIG. 7 is a schematic block diagram of a control module of a terminal according to a second embodiment of the present application.
  • FIG. 8 is a schematic block diagram of a user of a terminal according to an embodiment of the present application.
  • FIG. 9 is a schematic block diagram of a display device according to an embodiment of the present application.
  • The terminal can be implemented in various forms. The terminals described in the embodiments of the present application include, but are not limited to, fixed terminals such as desktop computers, LCD TVs, and digital TVs, as well as portable communication devices having a touch-sensitive surface (e.g., a touch-screen display and/or a touch pad), such as mobile phones, laptops, or tablet computers.
  • The display device can likewise be implemented in various forms. The display devices described in the embodiments of the present application include, but are not limited to, displays such as organic light-emitting diode displays, liquid crystal displays, plasma displays, cathode-ray tube displays, and the like. In the following description, fixed terminals including a display and a touch-sensitive surface are described; however, the description applies equally to portable mobile terminals such as notebook computers.
  • The terminal supports a variety of applications, such as one or more of the following: drawing applications, presentation applications, word processing applications, website creation applications, disk burning applications, spreadsheet applications, gaming applications, phone applications, video conferencing applications, email applications, instant messaging applications, workout support applications, photo management applications, digital camera applications, digital video camera applications, web browsing applications, digital music player applications, and/or digital video player applications.
  • Various applications that can be executed on the terminal can use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and the corresponding information displayed on the terminal can be adjusted and/or changed from one application to the next and/or within a respective application. In this way, the common physical architecture of the terminal (e.g., a touch-sensitive surface) can support the various applications.
  • FIG. 1 is a schematic flowchart of a display control method according to an embodiment of the present application.
  • The display control method is applied to a terminal that has an image sensor. The image sensor can acquire, in real time, the image in the imaging area in front of the terminal, convert the optical image into an electronic signal, and transmit it to the processing module of the terminal, which then computes the corresponding data. As shown in FIG. 1, the method may include steps S11 to S14.
  • S11: Acquire an image in the imaging area in front of the terminal in real time. Specifically, the optical image acquired in real time is converted into an electronic signal and transmitted to the processing module of the terminal for corresponding processing. For example, if the presence of a user is detected in the acquired image of the imaging area in front of the terminal and the user's eyes are blinking, this indicates that a user in the camera area in front of the terminal is watching the terminal.
  • In some embodiments, the usage state of the current terminal may be determined according to whether a user is present and whether the user's eyes are blinking. The usage states of the terminal include a normal working state and a standby state.
  • S12: Calculate the frequency of the user's eye blinking based on the acquired images. Specifically, eye images within a preset time period are extracted from the acquired images, the number of times the user blinks is counted from those eye images, and the blinking frequency is calculated from the counted number of blinks and the preset time period.
  • In some embodiments, ranges of the user's eye blinking frequency are set. Since a user normally blinks about 15 times per minute, the program can set the corresponding frequency ranges according to the normal blinking rate of the human eye: a blinking frequency of 10-20 times per minute is set as the first frequency range, a blinking frequency of 1-10 times per minute is set as the second frequency range, and each further frequency range is set in steps of 10 blinks per minute.
  • S13: Select a corresponding operation according to the blinking frequency and the preset rule. When the calculated frequency of the user's eye blinking is in the first frequency range, the terminal performs a first operation; when the calculated frequency is in the second frequency range, the terminal performs a second operation.
  • In some embodiments, the operations may be set according to the function operations of the terminal. For example, the first operation may be automatically switching the current application scenario or the playing channel of the terminal, or automatically adjusting the brightness; the second operation may be automatically lowering or raising the volume, or adjusting the contrast.
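  • The frequency-range rule described above can be sketched as a small lookup table. The range bounds and operation names below are illustrative assumptions for the example only, not values fixed by the present application.

```python
# Hypothetical sketch of the "preset rule": a table mapping blinking-frequency
# ranges (blinks per minute) to terminal operations. Ranges are set in steps
# of 10, as described in the embodiment above.
PRESET_RULE = [
    (10, 20, "switch_channel"),  # first frequency range -> first operation
    (1, 10, "adjust_volume"),    # second frequency range -> second operation
    (0, 1, "enter_standby"),     # effectively no blinking -> standby
]

def select_operation(blink_frequency: float) -> str:
    """Select the operation whose frequency range contains blink_frequency."""
    for low, high, operation in PRESET_RULE:
        if low <= blink_frequency < high:
            return operation
    return "no_op"  # frequency outside all configured ranges
```

A user-customized rule, as mentioned below, would simply replace the entries of this table.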
  • In some embodiments, the preset rule may also be set by the user, who can assign the operation corresponding to each blinking frequency range according to personal preference.
  • S14: Control the terminal to perform a corresponding function according to the operation. Specifically, the operation to be performed by the terminal is selected according to the blinking frequency and the preset rule, and the terminal is then controlled to perform the corresponding function. For example, when the user accidentally falls asleep while watching a video on the terminal, the calculated blinking frequency is zero, and the terminal is controlled to enter the standby mode; when the user wakes up, the blinking frequency is calculated again. When the blinking frequency is too slow, the terminal is controlled to lower the volume; when the blinking frequency is too fast, the terminal is controlled to switch channels. This realizes intelligent control of the terminal's working state, ensures more intelligent control of the terminal display, facilitates use, improves the user experience, and saves power.
  • In addition, the above embodiments can also set corresponding function operations according to user preferences.
  • In the above embodiments, an operation is performed according to the blinking frequency calculated from the acquired images and the preset rule, which ensures intelligent control of the terminal display and improves the user experience. That is, the foregoing embodiments may perform a corresponding operation on the terminal display according to the range in which the blinking frequency falls within the preset time period: for example, when the calculated frequency of the user's eye blinking is in the first frequency range, the terminal performs a first operation; when it is in the second frequency range, the terminal performs a second operation. The operation performed by the terminal is thus determined dynamically from the blinking frequency range, improving the user experience while the user watches a video.
  • The above embodiments may also allow the user to determine the correspondence between the blinking frequency and the operation through a custom setting.
  • FIG. 2 is a sub-flowchart of step S12 provided by the first embodiment of the present application. In step S12, the frequency of the user's eye blinking is calculated based on the acquired images. Step S12 includes steps S21 to S25.
  • In some embodiments, the eye image state in the image may be identified according to feature information of the eye. For example, the pupil shape and the iris-sclera boundary of the user's eyes may be identified programmatically. When the user's eyes are open, eye feature information such as the pupil shape and the iris-sclera boundary can be recognized; when the eyes are closed, the corresponding eye feature information cannot be recognized.
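  • As a toy illustration of this recognition step, the presence of dark, pupil-like pixels in the eye region can serve as the recognizable feature: visible when the eye is open, absent when it is closed. The dark-pixel-ratio heuristic and its thresholds below are purely illustrative assumptions; a real implementation would detect the pupil shape and the iris-sclera boundary.

```python
# Classify a grayscale eye-region crop (rows of 0-255 pixel values) as
# "open" or "closed" based on whether enough dark, pupil-like pixels are
# present. Threshold values are illustrative, not from the patent.
def classify_eye_state(eye_region: list[list[int]],
                       dark_level: int = 50,
                       min_dark_ratio: float = 0.05) -> str:
    """Return "open" if enough dark (pupil-like) pixels are found, else "closed"."""
    pixels = [p for row in eye_region for p in row]
    dark = sum(1 for p in pixels if p < dark_level)
    return "open" if dark / len(pixels) >= min_dark_ratio else "closed"
```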
  • The number of blinks within the preset time period is thus counted, and the blinking frequency over the preset time period is calculated from the counted number and the length of the period.
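  • The counting and frequency steps above can be sketched as follows: a blink is registered whenever the eye states of two temporally adjacent images are inconsistent, and the frequency is the count divided by the preset time period. The state labels are illustrative assumptions.

```python
# Count blinks from a sequence of per-frame eye states and derive the
# blinking frequency over a preset time period, as described in S21-S25.
def count_blinks(eye_states: list[str]) -> int:
    """Count one blink for each pair of adjacent images whose states differ."""
    return sum(
        1 for prev, curr in zip(eye_states, eye_states[1:]) if prev != curr
    )

def blink_frequency(eye_states: list[str], period_minutes: float) -> float:
    """Blinks per minute over the preset time period."""
    return count_blinks(eye_states) / period_minutes
```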
  • FIG. 3 is another schematic sub-flow diagram of a display control method according to a second embodiment of the present application. Specifically, steps S31 to S36 are included.
  • Step S31: Determine whether the current usage state of the terminal is the standby state or the normal working state. If the terminal is in the standby state, proceed to step S32; if the terminal is in the normal working state, proceed to step S36. Specifically, the usage states of the terminal include a standby state and a normal working state.
  • Step S32: If the terminal is in the standby state, detect whether a user is present within a preset range of the terminal. If a user is present, proceed to step S33; if no user is present, proceed to step S35. Specifically, whether a user is present is determined by whether the user's body image can be located in the image generated by the image sensor within the preset range of the terminal.
  • In some embodiments, an infrared sensor may also be disposed at the terminal, and the presence of a user entering the preset range of the terminal may be detected by the infrared sensor.
  • Step S33: Determine whether the time for which a user has been present within the preset range of the terminal reaches a preset duration. If it does, proceed to step S34; if not, proceed to step S35. Specifically, when the presence of a user within the preset range of the terminal is detected, the timing module starts working. In some feasible embodiments, the preset duration may be set by the user, which prevents the terminal from being triggered when a user enters the preset range and then suddenly leaves.
  • Step S34: Control the terminal to work normally. Specifically, when the terminal is in the normal working state, the step of acquiring the image in the imaging area in front of the terminal in real time is executed.
  • Step S35: Control the terminal to remain in the standby mode.
  • Step S36: The terminal is currently in the normal working state. Specifically, when the terminal is in the normal working state, the step of acquiring the image in the imaging area in front of the terminal in real time is executed.
  • In the above embodiments, the usage state of the terminal is intelligently controlled according to whether a user is present and whether the user's presence time reaches the preset duration, which improves the intelligent control performance of the terminal and prevents the user experience from being affected.
  • In some embodiments, when the terminal is in the standby state, the image sensor or the infrared sensor keeps working; once the presence of a user in the imaging area in front of the terminal is detected and the preset duration is reached, the terminal switches from the standby state to the normal working state. In other embodiments, the user can also press a start button manually to enter the normal working state.
  • The above embodiments thus ensure automatic switching of the terminal's usage state, where the usage states of the terminal include the standby state and the normal working state.
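  • The standby-to-working logic of steps S31 to S36 can be sketched as a small state machine: while in standby, the terminal switches to the normal working state only after a user has been continuously present for the preset duration. The class name and tick-based timing below are illustrative assumptions.

```python
# Sketch of the S31-S36 flow: each tick reports whether a user is detected
# in front of the terminal; the timer resets if the user leaves early.
class TerminalStateMachine:
    STANDBY = "standby"
    WORKING = "working"

    def __init__(self, preset_duration_ticks: int):
        self.state = self.STANDBY
        self.preset_duration_ticks = preset_duration_ticks
        self.presence_ticks = 0  # timing module: consecutive ticks with a user

    def tick(self, user_present: bool) -> str:
        if self.state == self.STANDBY:
            if user_present:
                self.presence_ticks += 1
                if self.presence_ticks >= self.preset_duration_ticks:
                    self.state = self.WORKING  # S34: start working normally
            else:
                self.presence_ticks = 0       # S35: remain in standby
        # S36: in the working state, image acquisition continues elsewhere
        return self.state
```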
  • In some embodiments, step S12 of the first embodiment and the specific process of the second embodiment may also be combined: after the terminal enters the normal working state, the blinking frequency may be calculated according to the method provided in the first embodiment, the frequency range in which it falls is determined, and the corresponding operation to be performed by the terminal is determined accordingly.
  • FIG. 4 is a schematic block diagram of a terminal 100 according to an embodiment of the present application.
  • the terminal 100 includes an image acquisition module 10, a frequency calculation module 20, an operation selection module 30, and a control module 40.
  • The image acquisition module 10 is configured to acquire, in real time, the image in the imaging area in front of the terminal generated by the image sensor. Specifically, the optical image acquired in real time is converted into an electronic signal and transmitted to the processing module of the terminal for corresponding processing. For example, if the presence of a user is detected in the acquired image of the imaging area in front of the terminal and the user's eyes are blinking, this indicates that a user in the camera area in front of the terminal is watching the terminal.
  • In some embodiments, the usage state of the current terminal may be determined according to whether a user is present and whether the user's eyes are blinking. The usage states of the terminal include a normal working state and a standby state.
  • The frequency calculation module 20 is configured to calculate the frequency of the user's eye blinking based on the acquired images. Specifically, eye images within a preset time period are extracted from the acquired images, the number of times the user blinks is counted from those eye images, and the blinking frequency is calculated from the counted number of blinks and the preset time period.
  • The operation selection module 30 is configured to select a corresponding operation according to the blinking frequency and a preset rule, where the preset rule is a correspondence between the blinking frequency and operations. The operation selection module 30 includes a first operation module 32 and a second operation module 34 (as shown in FIG. 6).
  • In some embodiments, ranges of the user's eye blinking frequency are set. Since a user normally blinks about 15 times per minute, the program can set the corresponding frequency ranges according to the normal blinking rate of the human eye: a blinking frequency of 10-20 times per minute is set as the first frequency range, a blinking frequency of 1-10 times per minute is set as the second frequency range, and each further frequency range is set in steps of 10 blinks per minute.
  • In some embodiments, the user can customize the blinking frequency ranges. The operation to be performed by the terminal is then selected according to the blinking frequency range and the preset rule: when the calculated frequency of the user's eye blinking is in the first frequency range, the terminal performs the first operation; when it is in the second frequency range, the terminal performs the second operation.
  • In some embodiments, the operations may be set according to the function operations of the terminal. For example, the first operation may be automatically switching the current application scenario or the playing channel of the terminal, or automatically adjusting the brightness; the second operation may be automatically lowering or raising the volume, or adjusting the contrast.
  • In some embodiments, the preset rule may also be set by the user, who can assign the operation corresponding to each blinking frequency range according to personal preference.
  • The control module 40 is configured to control the terminal to perform a corresponding function according to the operation. Specifically, the operation to be performed by the terminal is selected according to the blinking frequency and the preset rule, and the terminal is then controlled to perform the corresponding function. For example, when the user accidentally falls asleep while watching a video on the terminal, the calculated blinking frequency is zero, and the terminal is controlled to enter the standby mode; when the user wakes up, the blinking frequency is calculated again. When the blinking frequency is too slow, the terminal is controlled to lower the volume; when the blinking frequency is too fast, the terminal is controlled to switch channels. This realizes intelligent control of the terminal's working state, ensures more intelligent control of the terminal display, facilitates use, improves the user experience, and saves power. In addition, the above embodiment can also set corresponding function operations according to user preferences.
  • In the above embodiments, an operation is performed according to the blinking frequency calculated from the acquired images and the preset rule, which ensures intelligent control of the terminal display and improves the user experience. That is, the foregoing embodiments may perform a corresponding operation on the terminal display according to the range in which the blinking frequency falls within the preset time period: for example, when the calculated frequency of the user's eye blinking is in the first frequency range, the terminal performs a first operation; when it is in the second frequency range, the terminal performs a second operation. The operation performed by the terminal is thus determined dynamically from the blinking frequency range, improving the user experience while the user watches a video.
  • The above embodiments may also allow the user to determine the correspondence between the blinking frequency and the operation through a custom setting.
  • FIG. 5 is a schematic block diagram of the frequency calculation module 20 provided by the embodiment of the present application.
  • The frequency calculation module 20 calculates the frequency of the user's eye blinking based on the acquired images.
  • the frequency calculation module 20 includes an extraction module 22, an identification module 24, and a calculation module 26.
  • The extraction module 22 is configured to extract images within a preset time period. Specifically, the preset time period can be set by the user.
  • The identification module 24 is configured to identify the eye image state in each image, where the eye image state includes an eyes-closed state and an eyes-open state.
  • In some embodiments, the eye image state in the image may be identified according to feature information of the eye. For example, the pupil shape and the iris-sclera boundary of the user's eyes may be identified programmatically. When the user's eyes are open, eye feature information such as the pupil shape and the iris-sclera boundary can be recognized; when the eyes are closed, the corresponding eye feature information cannot be recognized.
  • The calculation module 26 is configured to count one blink whenever the eye-image states of two adjacent images differ. Specifically, the recognized eye-image states are compared, and when the states of two images at adjacent time points differ, one blink is counted.
  • The calculation module 26 is further configured to count the number of blinks within the preset time period. Specifically, the number of blinks can be counted by comparing the eye-image states of the images acquired within the preset time, so that the number of blinks within the preset time period is obtained.
  • The calculation module 26 is further configured to calculate the blink frequency from the counted number of blinks and the preset time period.
  • In this way, the number of blinks within the preset time period is counted, and the blink frequency for the preset time period is calculated.
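The counting rule above (one blink per state change between adjacent images, then count divided by the window) can be sketched in a few lines. Function names are illustrative, not from the patent:

```python
def count_blinks(states):
    """Count one blink per state change between adjacent frames ('open'/'closed')."""
    return sum(1 for a, b in zip(states, states[1:]) if a != b)

def blink_frequency(states, window_seconds):
    """Blinks per minute over the preset time window."""
    return count_blinks(states) * 60.0 / window_seconds

# Example: one closed frame inside a 10-second window gives two state changes
states = ["open", "open", "closed", "open", "open"]
print(count_blinks(states))           # 2
print(blink_frequency(states, 10.0))  # 12.0 blinks per minute
```

Note that under this rule a full close-and-reopen cycle registers as two state changes; a deployment could halve the count if one physical blink should count once.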
  • FIG. 7 is a schematic block diagram of the control module 40 provided by an embodiment of the present application.
  • The control module 40 intelligently controls the usage state of the terminal according to whether a user is present and whether the user has been detected for a preset duration, improving the intelligent control performance of the terminal and preventing the user experience from being degraded.
  • Specifically, the control module 40 includes a first determining module 42, a detecting module 44, and a timing module 46.
  • The first determining module 42 is configured to determine whether the terminal is currently in the standby state or the normal working state. Specifically, the current usage state of the terminal includes the standby state and the normal working state.
  • The detecting module 44 is configured to detect, when the terminal is in the standby state, whether a user is present within a preset range of the terminal. Specifically, whether a user is present within the preset range may be determined by locating the user's body in the images generated by the image sensor, thereby detecting the user's presence.
  • An infrared sensor may also be disposed at the terminal, so that a user entering the preset range of the terminal is detected by the infrared sensor.
  • The timing module 46 is configured to determine whether the time for which a user has been present within the preset range of the terminal reaches a preset duration. Specifically, the timing module starts working when a user is detected within the preset range of the terminal.
  • The preset duration may be set by the user, which avoids reacting to a user who enters the preset range of the terminal and then suddenly leaves.
  • The control module 40 is configured to control the terminal to work normally. Specifically, when the terminal is in the normal working state, the step of acquiring images in the imaging area in front of the terminal in real time is performed.
  • The control module 40 is also configured to control the terminal to remain in standby mode.
  • The first determining module 42 is further configured to determine that the terminal is currently in the normal working state, in which case the real-time acquisition of images in the imaging area in front of the terminal is performed.
  • In the above embodiment, the usage state of the terminal is controlled intelligently according to whether a user is present and whether the detection time reaches the preset duration, improving the intelligent control performance of the terminal and preventing the user experience from being degraded.
  • It can be understood that, when the terminal is in the standby state, the image sensor or infrared sensor keeps working; once a user is detected in the imaging area in front of the terminal for the preset duration, the terminal switches from the standby state to the normal working state. In other embodiments, the user may also press a button to start the terminal and enter the normal working state.
  • The above embodiment ensures automatic switching of the usage state of the terminal, which includes the standby state and the normal working state.
  • Step S12 of the first embodiment and the specific flow of the second embodiment may also be combined.
  • For example, when the terminal is in the normal working state, the blink-frequency range may be calculated according to the method provided in the first embodiment, thereby determining the corresponding operation for the terminal to perform.
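The standby-to-active switching described above (leave standby only after a user has been continuously present for the preset duration, and reset the timer if the user leaves early) can be sketched as a small state machine. Class and method names are illustrative assumptions:

```python
class PresenceController:
    """Minimal sketch of the presence-gated state switch: the terminal leaves
    standby only after continuous user presence for `required_seconds`."""

    def __init__(self, required_seconds):
        self.required = required_seconds
        self.state = "standby"
        self._present_since = None  # timestamp when continuous presence began

    def update(self, user_present, now):
        """Feed one presence sample (e.g., from the image or infrared sensor)."""
        if self.state == "standby":
            if user_present:
                if self._present_since is None:
                    self._present_since = now
                if now - self._present_since >= self.required:
                    self.state = "active"  # switch to the normal working state
            else:
                self._present_since = None  # user left before the threshold
        return self.state

ctrl = PresenceController(required_seconds=3)
print(ctrl.update(True, 0))   # standby: presence just began
print(ctrl.update(False, 2))  # standby: user left, timer resets
print(ctrl.update(True, 5))   # standby: presence restarts at t=5
print(ctrl.update(True, 8))   # active: 3 s of continuous presence reached
```

A manual power button, as the text mentions, would simply set `state = "active"` directly.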
  • FIG. 8 is a schematic block diagram of a terminal according to an embodiment of the present application, which may also serve as another embodiment of a display device.
  • The terminal 100 described in this embodiment includes at least one input device 200, at least one output device 300, at least one processing module (CPU) 400, an image sensor 500, and a storage module 600.
  • The input device 200, the output device 300, the processing module 400, the image sensor 500, and the storage module 600 are connected by a bus 700.
  • The input device 200 may specifically be a touch panel (touch screen), physical buttons, a fingerprint recognition module, or a mouse.
  • The output device 300 may specifically be a display screen.
  • The storage module 600 may be a high-speed RAM storage module or a non-volatile memory module, such as a disk storage module.
  • The storage module 600 is configured to store a set of program codes; the input device 200, the output device 300, and the processing module 400 are used to invoke the program code stored in the storage module 600 and perform the following operations:
  • The processing module 400 is configured to:
  • acquire, in real time, the images generated by the image sensor in the imaging area in front of the terminal; the current working state of the terminal can be determined according to whether eyes are present in the images and whether they are blinking;
  • calculate the blink frequency of the user's eyes from the acquired images; specifically, the eye images within a preset time period are extracted from the acquired images, the number of blinks is counted from these eye images, and the blink frequency can be calculated from the counted number of blinks and the preset time period;
  • select a corresponding operation according to the blink frequency and a preset rule, the preset rule being the correspondence between blink frequency and operation.
  • Specifically, blink-frequency ranges are set for the user.
  • For example, the normal blink frequency of a user's eyes is around 15 blinks per minute,
  • so the corresponding frequency ranges may be set programmatically according to the normal blink range of the human eye:
  • a blink frequency of 10-20 blinks per minute is set as the first frequency range,
  • a blink frequency of 1-10 blinks per minute is set as the second frequency range,
  • and further frequency ranges are set in steps of 10 blinks per minute.
  • When the calculated blink frequency of the user's eyes falls within the first frequency range, the terminal performs the first operation;
  • when the calculated blink frequency of the user's eyes falls within the second frequency range, the terminal performs the second operation.
  • The first operation of the terminal may be set according to the functional operations of the terminal; for example, the first operation may be automatically switching the terminal's current application scene or playing channel, or automatically adjusting the brightness. If the blink frequency falls within the second frequency range, the terminal performs the second operation, which may be automatically lowering or raising the volume or adjusting the contrast.
  • The preset rule may also be customized by the user, who may configure the operation corresponding to each blink-frequency range according to personal preference.
  • The terminal is controlled to perform the corresponding function according to the selected operation. Specifically, the operation to be performed by the terminal is selected according to the blink frequency and the preset rule, and the terminal is controlled to perform the corresponding function accordingly. For example, when the user accidentally falls asleep while watching video on the terminal, the calculated blink frequency is zero and the terminal is controlled to enter standby mode; when the user wakes up, the blink frequency is calculated again.
  • The terminal may be controlled to lower the volume when the blink frequency is too low, and to switch channels when the blink frequency is too high. The working state of the terminal is thus controlled intelligently, making the terminal display more intelligent, facilitating use, improving the user experience, and saving power.
  • The above embodiment may also set the corresponding operations according to the user's preferences.
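The preset rule described above can be held as a simple lookup table of frequency ranges mapped to operations, stepped in tens as the text suggests. The exact bounds and operation names below are illustrative assumptions, not values fixed by the patent:

```python
PRESET_RULES = [
    ((0, 1), "enter_standby"),     # no blinking detected: user asleep or absent
    ((1, 10), "lower_volume"),     # second frequency range: drowsy viewer
    ((10, 20), "switch_channel"),  # first frequency range
]

def select_operation(blinks_per_minute):
    """Return the operation whose half-open range contains the frequency."""
    for (low, high), operation in PRESET_RULES:
        if low <= blinks_per_minute < high:
            return operation
    return "no_op"  # frequency outside all configured ranges

print(select_operation(0))   # enter_standby
print(select_operation(5))   # lower_volume
print(select_operation(15))  # switch_channel
```

User customization, as the text allows, amounts to replacing the entries of `PRESET_RULES` from a settings screen.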
  • Operations are performed according to the blink frequency calculated from the acquired images and the preset rule, thereby ensuring intelligent control of the terminal display and improving the user experience. That is, the foregoing embodiment may operate the terminal display according to the range within which the blink frequency falls during the preset time period and the preset rule: when the calculated blink frequency of the user's eyes falls within the first frequency range, the terminal performs the first operation; when it falls within the second frequency range, the terminal performs the second operation. The operation performed by the terminal is thus determined dynamically from the blink-frequency range, improving the user experience while the user watches video.
  • In the above embodiment, the correspondence between blink frequency and operation may also be defined by the user through custom settings.
  • The processing module 400 calculates the blink frequency of the user's eyes from the acquired images. Specifically, the processing module 400 is configured to:
  • extract the images within a preset time period;
  • the preset time period may be set by the user;
  • identify the eye-image state in each image, the eye-image state including a closed-eye state and an open-eye state;
  • the eye-image state in each image may be identified from the feature information of the eye;
  • for example, the pupil shape and the color boundary between the iris and the sclera of the user's eyes may be recognized programmatically, so that
  • when the eye is open, eye features such as the pupil shape and the iris/sclera boundary can be recognized, and when the eye is closed, the corresponding eye features cannot be recognized;
  • count one blink whenever the eye-image states of two adjacent images differ; specifically, the recognized eye-image states are compared, and when the states of two images at adjacent time points differ, one blink is counted;
  • count the number of blinks within the preset time period, by comparing the eye-image states of the images acquired during that period;
  • calculate the blink frequency from the counted number of blinks and the preset time period.
  • In the above embodiment, the eye-image states of the images within the preset time period are identified and compared, the number of blinks within the period is counted, and the blink frequency for the preset time period is thereby obtained.
  • The processing module 400 intelligently controls the usage state of the terminal according to whether a user is present and whether the user has been detected for a preset duration, improving the intelligent control performance of the terminal and preventing the user experience from being degraded.
  • Specifically, the processing module 400 is configured to:
  • determine whether the terminal is currently in the standby state or the normal working state;
  • the current usage state of the terminal includes the standby state and the normal working state;
  • detect, when the terminal is in the standby state, whether a user is present within a preset range of the terminal; an infrared sensor may also be disposed at the terminal, so that a user entering the preset range of the terminal is detected by the infrared sensor;
  • determine whether the time for which a user has been present within the preset range of the terminal reaches a preset duration; specifically, the timing starts when a user is detected within the preset range of the terminal;
  • the preset duration may be set by the user, which avoids reacting to a user who enters the preset range of the terminal and then suddenly leaves;
  • control the terminal to work normally; specifically, when the terminal is in the normal working state, the real-time acquisition of images in the imaging area in front of the terminal is performed;
  • control the terminal to remain in standby mode;
  • determine that the terminal is currently in the normal working state, in which case the real-time acquisition of images in the imaging area in front of the terminal is performed.
  • In the above embodiment, the usage state of the terminal is controlled intelligently according to whether a user is present and whether the detection time reaches the preset duration, improving the intelligent control performance of the terminal and preventing the user experience from being degraded.
  • It can be understood that, when the terminal is in the standby state, the image sensor or infrared sensor keeps working; once a user is detected in the imaging area in front of the terminal for the preset duration, the terminal switches from the standby state to the normal working state. In other embodiments, the user may also press a button to start the terminal and enter the normal working state.
  • The above embodiment ensures automatic switching of the usage state of the terminal, which includes the standby state and the normal working state.
  • Step S12 of the first embodiment and the specific flow of the second embodiment may also be combined.
  • For example, when the terminal is in the normal working state, the blink-frequency range may be calculated according to the method provided in the first embodiment, thereby determining the corresponding operation for the terminal to perform.
  • FIG. 9 is a schematic block diagram of a display device 10 according to an embodiment of the present application.
  • The display device 10 described in this embodiment includes a display screen 30 for displaying images and an image sensor; the display device 10 further includes:
  • an image acquisition module configured to acquire, in real time, images in an imaging area in front of the terminal;
  • a frequency calculation module configured to calculate the blink frequency of a user's eyes from the acquired images;
  • an operation selection module configured to select a corresponding operation according to the blink frequency and a preset rule, the preset rule being the correspondence between blink frequency and operation; and
  • a control module configured to control the terminal to perform the corresponding function according to the operation.
  • The display device 10 described in the embodiments of the present application includes, but is not limited to, a display device 10 whose display screen 30 is an organic light-emitting diode display, a liquid crystal display, a plasma display, a cathode ray tube display, or the like.
  • The display device 10 is integrally formed around its periphery.
  • The non-display area of the display device 10 has a mounting hole 20 for mounting the image sensor,
  • and the image sensor is disposed in the mounting hole 20.
  • The image sensor includes, but is not limited to, a camera, a CCD, or a CMOS sensor, and may be used to acquire, in real time, images in the imaging area in front of the display screen 30.
  • The display screen 30 includes, but is not limited to, an organic light-emitting diode display screen, a liquid crystal display screen, a plasma display screen, a cathode ray tube display screen, and the like.
  • The display screen 30 has a TFT substrate and a color filter substrate with a liquid crystal layer disposed between them, and is used for displaying images.
  • The mounting hole 20 is located on the vertical centerline of the non-display area of the display device 10.
  • The image sensor may be fixed in the mounting hole 20 by silicone.
  • In some feasible embodiments, the display device 10 may have multiple image sensors, each with a corresponding mounting hole 20 in one-to-one correspondence; alternatively, one mounting hole 20 may accommodate multiple image sensors.
  • In the above embodiment, the presence of a user may be detected by the image sensor, the blink frequency may be calculated from the acquired images, and intelligent display control of the display device may be performed according to the preset rule: for example, if the blink frequency falls within the first frequency range, the display device performs the first operation, and if the blink frequency falls within the second frequency range, the display device performs the second operation, thereby improving the intelligent control level of the display device and enhancing the user experience.
  • the disclosed terminal and method may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • The division into modules is only a division by logical function; in actual implementation there may be other divisions: for example, multiple modules or components may be combined or integrated into another system, and some features may be omitted or not executed.
  • The mutual coupling, direct coupling, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or modules, and may be electrical, mechanical, or in other forms.


Abstract

Embodiments of the present application disclose a display control method, a terminal, and a display device. The method includes: acquiring, in real time, images in an imaging area in front of the terminal; calculating the blink frequency of a user's eyes from the acquired images; selecting a corresponding operation according to the blink frequency and a preset rule, the preset rule being the correspondence between blink frequency and operation; and controlling the terminal and the display device to perform the corresponding function according to the operation.

Description

Display control method and display device
Technical Field
The present disclosure relates to the technical field of display control, and in particular to a display control method and a display device.
Background
With the continuous development of technology and the economy, electronic products such as smartphones, tablet computers, liquid crystal televisions, and liquid crystal displays have become unprecedentedly widespread, and increasingly high demands are placed on their intelligent control.
Taking the liquid crystal television as an example, watching television has become an indispensable part of daily life, so increasingly high demands are placed on the intelligent display control of televisions. For example, while watching television, people feel excited when they see an interesting program and accordingly wish the television would automatically raise the volume and brightness; they feel tired when they see a boring program and accordingly wish the television would automatically switch channels; or, after watching for too long, their eyes grow tired and they accordingly wish the television would automatically lower the volume and brightness. In the prior art, however, the television cannot automatically perform the corresponding operations when the user becomes tired or excited while watching; the degree of intelligence of these electronic products is therefore still insufficient, which seriously affects the user experience.
Summary
Embodiments of the present application provide a display control method and a display device, which can detect whether a user is present in front of the terminal by means of an added image sensor and perform intelligent display control of the terminal and the display device according to the blink frequency calculated from the acquired images.
To achieve the above object, in one aspect, an embodiment of the present application provides a display control method, including:
acquiring, in real time, images in an imaging area in front of a terminal;
calculating the blink frequency of a user's eyes from the acquired images;
selecting a corresponding operation according to the blink frequency and a preset rule, the preset rule being the correspondence between blink frequency and operation; and
controlling the terminal to perform a corresponding function according to the operation.
To achieve the above object, in another aspect, an embodiment of the present application provides a display device, including:
a display panel;
a storage module configured to store program instructions; and
a processing module, electrically connected to the display panel and the storage module and configured to invoke and execute the program instructions to perform the following steps:
acquiring, in real time, images in an imaging area in front of a terminal;
calculating the blink frequency of a user's eyes from the acquired images;
selecting a corresponding operation according to the blink frequency and a preset rule, the preset rule being the correspondence between blink frequency and operation; and
controlling the terminal to perform a corresponding function according to the operation.
To achieve the above object, in another aspect, an embodiment of the present application provides a display device, including:
a display panel;
an image acquisition module configured to acquire, in real time, images in an imaging area in front of a terminal;
a frequency calculation module configured to calculate the blink frequency of a user's eyes from the acquired images;
an operation selection module configured to select a corresponding operation according to the blink frequency and a preset rule, the preset rule being the correspondence between blink frequency and operation; and
a control module configured to control the terminal to perform a corresponding function according to the operation.
In the embodiments of the present application, the image sensor detects whether a user is present, the blink frequency is calculated from the acquired images, and intelligent display control of the terminal and the display device is performed according to the preset rule: for example, if the blink frequency falls within a first frequency range, a first operation is performed accordingly, and if the blink frequency falls within a second frequency range, a second operation is performed accordingly. The intelligent control level of electronic products with display functions can therefore be improved, and the user experience enhanced.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those of ordinary skill in the art may obtain other drawings from them without creative effort.
FIG. 1 is a schematic flowchart of a display control method provided by an embodiment of the present application;
FIG. 2 is a schematic sub-flowchart of a display control method provided by the first embodiment of the present application;
FIG. 3 is another schematic sub-flowchart of a display control method provided by the second embodiment of the present application;
FIG. 4 is a schematic block diagram of a terminal provided by an embodiment of the present application;
FIG. 5 is a schematic block diagram of a frequency calculation module of a terminal provided by the first embodiment of the present application;
FIG. 6 is a schematic block diagram of an operation selection module of a terminal provided by the first embodiment of the present application;
FIG. 7 is a schematic block diagram of a control module of a terminal provided by the second embodiment of the present application;
FIG. 8 is a schematic block diagram of a terminal provided by an embodiment of the present application;
FIG. 9 is a schematic block diagram of a display device provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
It should be understood that, when used in this specification and the appended claims, the terms "comprise" and "include" indicate the presence of the described features, integers, steps, operations, elements, and/or components, but do not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or combinations thereof.
It should also be understood that the terminology used in this specification is for the purpose of describing particular embodiments only and is not intended to limit the present application. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms unless the context clearly indicates otherwise.
In specific implementations, the terminal may be embodied in various forms. For example, the terminals described in the embodiments of the present application include, but are not limited to, fixed terminals such as desktop computers, liquid crystal televisions, and digital TVs having a touch-sensitive surface (for example, a touch-screen display and/or a touchpad). It should also be understood that, in some embodiments, the terminal device may be a portable communication device having a touch-sensitive surface (for example, a touch-screen display and/or a touchpad), such as a mobile phone, a laptop computer, or a tablet computer.
In specific implementations, the display device may be embodied in various forms. For example, the display devices described in the embodiments of the present application include, but are not limited to, display devices having display screens such as organic light-emitting diode displays, liquid crystal displays, plasma displays, and cathode ray tube displays.
In the following discussion, a fixed terminal including a display and a touch-sensitive surface is described. However, those skilled in the art will understand that, apart from elements specifically intended for fixed placement, the configurations according to the embodiments of the present application can also be applied to portable mobile terminals such as notebook computers, and likewise to one or more other physical user interface devices such as a physical keyboard, a mouse, and/or a joystick.
The terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a game application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
The various applications that can be executed on the terminal may use at least one common physical user interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface and the corresponding information displayed on the terminal may be adjusted and/or changed between applications and/or within a given application. In this way, a common physical architecture of the terminal (for example, the touch-sensitive surface) can support the various applications with user interfaces that are intuitive and transparent to the user.
Referring to FIG. 1, which is a schematic flowchart of a display control method provided by an embodiment of the present application: the display control method is applied to a terminal having an image sensor. The image sensor is a sensor capable of acquiring, in real time, images in an imaging area in front of the terminal, converting the optical images into electronic signals, and transmitting them to a processing module of the terminal, from which corresponding data can be calculated. As shown in the figure, the method may include steps S11 to S14.
S11: acquire, in real time, images in the imaging area in front of the terminal. Specifically, the optical images acquired in real time are converted into electronic signals and transmitted to the processing module of the terminal for corresponding processing. For example, if the presence of a user is detected in the acquired images and the user's eyes are blinking, this indicates that a user in the imaging area in front of the terminal is watching the terminal. The current usage state of the terminal, which includes the normal working state and the standby state, can be determined according to the presence of the user and whether the user's eyes are blinking.
S12: calculate the blink frequency of the user's eyes from the acquired images. Specifically, the eye images within a preset time period are extracted from the acquired images, the number of blinks is counted from these eye images, and the blink frequency can be calculated from the counted number of blinks and the preset time period.
S13: select a corresponding operation according to the blink frequency and a preset rule, the preset rule being the correspondence between blink frequency and operation. Specifically, blink-frequency ranges are set for the user. For example, the normal blink frequency of a user's eyes is around 15 blinks per minute, so the corresponding frequency ranges may be set programmatically according to the normal blink range of the human eye: 10-20 blinks per minute is set as a first frequency range, 1-10 blinks per minute is set as a second frequency range, and further ranges are set in steps of 10. In some feasible embodiments, the blink-frequency ranges may also be customized by the user. The terminal is then made to perform the corresponding operation according to the blink-frequency range and the preset rule: when the calculated blink frequency of the user's eyes falls within the first frequency range, the terminal performs a first operation; when it falls within the second frequency range, the terminal performs a second operation. For example, if the blink frequency falls within the first frequency range, the first operation of the terminal may be set according to the functional operations of the terminal, and may be automatically switching the terminal's current application scene or playing channel, or automatically adjusting the brightness; if the blink frequency falls within the second frequency range, the terminal performs the second operation, which may be automatically lowering or raising the volume or adjusting the contrast. In some feasible embodiments, the preset rule may also be customized by the user, who may configure the operation corresponding to each blink-frequency range according to personal preference.
S14: control the terminal to perform the corresponding function according to the operation. Specifically, the operation to be performed by the terminal is selected according to the blink frequency and the preset rule, and the terminal is controlled to perform the corresponding function accordingly. For example, when the user accidentally falls asleep while watching video on the terminal, the calculated blink frequency is zero and the terminal is controlled to enter standby mode; when the user wakes up, the blink frequency is calculated again; the terminal may also be controlled to lower the volume when the blink frequency is too low and to switch channels when it is too high. The working state of the terminal is thus controlled intelligently, making the terminal display more intelligent, facilitating use, improving the user experience, and saving power. In addition, the above embodiment may also set the corresponding function operations according to the user's preferences.
In the above embodiment, operations are performed according to the blink frequency calculated from the acquired images and the preset rule, thereby ensuring intelligent control of the terminal display and improving the user experience. That is, the terminal display may be operated according to the range within which the blink frequency falls during the preset time period and the preset rule: when the calculated blink frequency of the user's eyes falls within the first frequency range, the terminal performs the first operation; when it falls within the second frequency range, the terminal performs the second operation. The operation performed by the terminal is thus determined dynamically from the blink-frequency range, improving the user experience while the user watches video. In addition, the correspondence between blink frequency and operation may also be defined by the user through custom settings.
Referring to FIG. 2, which is a schematic sub-flowchart of step S12 provided by the first embodiment of the present application: in this embodiment, the blink frequency of the user's eyes is calculated from the acquired images. Specifically, step S12 includes S21 to S25.
S21: extract the images within a preset time period. Specifically, the preset time period may be set by the user.
S22: identify the eye-image state in each image, the eye-image state including a closed-eye state and an open-eye state. Specifically, the eye-image state may be identified from the feature information of the eye; for example, the pupil shape and the color boundary between the iris and the sclera of the user's eyes may be recognized programmatically, so that when the user's eye-image state is the open-eye state, eye features such as the pupil shape and the iris/sclera boundary can be recognized, and when the eye is closed, the corresponding eye features cannot be recognized.
S23: count one blink whenever the eye-image states of two adjacent images differ. Specifically, the recognized eye-image states are compared, and when the states of two images at adjacent time points differ, one blink is counted.
S24: count the number of blinks within the preset time period. Specifically, the number of blinks can be counted by comparing the eye-image states of the images acquired within the preset time, so that the number of blinks within the preset time period is obtained.
S25: calculate the blink frequency from the counted number of blinks and the preset time period.
In the above embodiment, the eye-image states of the images within the preset time period are identified and compared, the number of blinks within the period is counted, and the blink frequency for the preset time period is thereby obtained.
Referring to FIG. 3, which is another schematic sub-flowchart of a display control method provided by the second embodiment of the present application: specifically, it includes steps S31 to S36.
S31: determine whether the terminal is currently in the standby state or the normal working state. If the terminal is in the standby state, proceed to step S32; if the terminal is in the normal working state, proceed to step S36. Specifically, the current usage state of the terminal includes the standby state and the normal working state.
S32: if in the standby state, detect whether a user is present within a preset range of the terminal. If a user is present, proceed to step S33; if not, proceed to step S35. Specifically, whether a user is present within the preset range of the terminal may be determined by locating the user's body in the images generated by the image sensor, thereby detecting the user's presence. In some feasible embodiments, an infrared sensor may also be disposed at the terminal, so that a user entering the preset range of the terminal is detected by the infrared sensor.
S33: determine whether the time for which a user has been present within the preset range of the terminal reaches a preset duration. If it does, proceed to step S34; if not, proceed to step S35. Specifically, when a user is detected within the preset range of the terminal, the timing module starts working. In some feasible embodiments, the preset duration may be set by the user, which avoids reacting to a user who enters the preset range of the terminal and then suddenly leaves.
S34: control the terminal to work normally. Specifically, when the terminal is in the normal working state, the step of acquiring images in the imaging area in front of the terminal in real time is performed.
S35: control the terminal to remain in standby mode.
S36: the terminal is currently in the normal working state. Specifically, when the terminal is in the normal working state, the step of acquiring images in the imaging area in front of the terminal in real time is performed.
In the above embodiment, the usage state of the terminal is controlled intelligently according to whether a user is present and whether the detection time reaches the preset duration, improving the intelligent control performance of the terminal and preventing the user experience from being degraded. It can be understood that, when the terminal is in the standby state, the image sensor or infrared sensor keeps working; once a user is detected in the imaging area in front of the terminal for the preset duration, the terminal switches from the standby state to the normal working state. In other embodiments, the user may also press a button to start the terminal and enter the normal working state. The above embodiment ensures automatic switching of the usage state of the terminal, which includes the standby state and the normal working state.
In some feasible embodiments, step S12 of the first embodiment and the specific flow of the second embodiment may also be combined; for example, when the terminal is in the normal working state, the blink-frequency range may be calculated according to the method provided in the first embodiment, thereby determining the corresponding operation for the terminal to perform.
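The combined flow just described, in which the terminal, once in the normal working state, derives the blink frequency for each observation window and looks up the operation, can be sketched end to end. The helper names and the rule table are illustrative assumptions, not the patent's implementation:

```python
def blink_frequency_per_minute(states, window_seconds):
    """One blink per state change between adjacent frames, scaled to per-minute."""
    blinks = sum(1 for a, b in zip(states, states[1:]) if a != b)
    return blinks * 60.0 / window_seconds

def operation_for_window(states, window_seconds, rules):
    """Map the window's blink frequency to an operation via the preset rule."""
    freq = blink_frequency_per_minute(states, window_seconds)
    for (low, high), op in rules:
        if low <= freq < high:
            return op
    return "no_op"

rules = [((0, 1), "enter_standby"),
         ((1, 10), "lower_volume"),
         ((10, 60), "switch_channel")]

# Three state changes in a 10-second window -> 18 blinks per minute
print(operation_for_window(["open", "closed", "open", "closed"], 10.0, rules))
```

In a deployment, this lookup would run once per preset time period while the presence-detection flow of the second embodiment gates whether frames are captured at all.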
Referring to FIG. 4, which is a schematic block diagram of a terminal 100 according to an embodiment of the present application: the terminal 100 includes an image acquisition module 10, a frequency calculation module 20, an operation selection module 30, and a control module 40.
The image acquisition module 10 is configured to acquire, in real time, the images generated by the image sensor in the imaging area in front of the terminal. Specifically, the optical images acquired in real time are converted into electronic signals and transmitted to the processing module of the terminal for corresponding processing; for example, if the presence of a user is detected in the acquired images and the user's eyes are blinking, this indicates that a user in the imaging area in front of the terminal is watching the terminal. The current usage state of the terminal, which includes the normal working state and the standby state, can be determined according to the presence of the user and whether the user's eyes are blinking.
The frequency calculation module 20 is configured to calculate the blink frequency of the user's eyes from the acquired images. Specifically, the eye images within a preset time period are extracted from the acquired images, the number of blinks is counted from these eye images, and the blink frequency can be calculated from the counted number of blinks and the preset time period.
The operation selection module 30 is configured to select a corresponding operation according to the blink frequency and a preset rule, the preset rule being the correspondence between blink frequency and operation; the operation selection module 30 includes a first operation module 32 and a second operation module 34 (as shown in FIG. 6). Specifically, blink-frequency ranges are set for the user. For example, the normal blink frequency of a user's eyes is around 15 blinks per minute, so the corresponding frequency ranges may be set programmatically according to the normal blink range of the human eye: 10-20 blinks per minute is set as a first frequency range, 1-10 blinks per minute is set as a second frequency range, and further ranges are set in steps of 10. In some feasible embodiments, the blink-frequency ranges may also be customized by the user. The terminal is then made to perform the corresponding operation according to the blink-frequency range and the preset rule: when the calculated blink frequency falls within the first frequency range, the terminal performs a first operation; when it falls within the second frequency range, the terminal performs a second operation. For example, the first operation may be set according to the functional operations of the terminal and may be automatically switching the current application scene or playing channel, or automatically adjusting the brightness; the second operation may be automatically lowering or raising the volume or adjusting the contrast. In some feasible embodiments, the preset rule may also be customized by the user according to personal preference.
The control module 40 is configured to control the terminal to perform the corresponding function according to the operation. Specifically, the operation to be performed by the terminal is selected according to the blink frequency and the preset rule, and the terminal is controlled to perform the corresponding function accordingly. For example, when the user accidentally falls asleep while watching video on the terminal, the calculated blink frequency is zero and the terminal is controlled to enter standby mode; when the user wakes up, the blink frequency is calculated again; the terminal may also be controlled to lower the volume when the blink frequency is too low and to switch channels when it is too high. The working state of the terminal is thus controlled intelligently, making the terminal display more intelligent, facilitating use, improving the user experience, and saving power. In addition, the above embodiment may also set the corresponding function operations according to the user's preferences.
在上述实施例中,根据所述获取到的图像处理计算出的眼睛眨动的频率和预设的规则进行相应的操作,从而保证了终端显示的智能化控制,提高用户的体验。即上述实施例可以根据预设时间段内眼睛眨动频率范围和预设的规则来对终端显示进行相应操作,例如,当计算出的所述用户眼睛眨动的频率位于第一频率范围,所述终端执行第一操作;当计算出的所述用户眼睛眨动的频率位于第二频率范围,所述终端执行第二操作;从而实现根据眼睛眨动频率范围动态确定所述终端执行的操作,在用户观看视频时提高用户的体验。另外,上述实施例还可以通过用户进行自定义设置来确定所述眼睛眨动的频率和操作的对应关系。
请参看图5,其是本申请实施例提供的频率计算模块20的示意性框图。在本实施例中,频率计算模块20根据获取的图像计算用户的眼睛眨动的频率。具体地,频率计算模块20包括提取模块22、识别模块24以及计算模块26。
所述提取模块22,用于提取预设时间段内的所述图像。具体地,该预设时间段可以由用户进行设置。
所述识别模块24,用于识别出所述图像中的眼部图像状态,所述眼部图像状态包括闭眼状态和睁眼状态。具体地,根据眼睛的特征信息可以识别出所述图像中的该眼部图像状态,例如,可以通过程序设计对用户眼睛的瞳孔外形、异色边缘(虹膜、巩膜)等信息进行识别处理,因此,当用户的眼部图像状态处于睁眼状态时,可以识别到眼睛的瞳孔外形以及异色边缘等眼睛特征信息,处于闭眼状态时,不可以识别到相应的眼睛特征信息。
所述计算模块26,若相邻的两幅图像的眼部图像状态不一致时,用于计算为眼睛眨动一次。具体地,将该识别出所述图像中的眼部图像状态进行比对,当相邻时间点内的两幅图像的眼部图像状态不一致时,计算为眼睛眨动一次。
所述计算模块26,用于计算预设时间段眼睛眨动的次数。具体地,可以根据预设时间内的图像中的眼部图像状态的比对计算眼睛眨动次数,从而可以计算出预设时间段内眼睛眨动的次数。
所述计算模块26,还用于根据计算的次数与预设时间段计算出所述眼睛眨动的频率。
上述实施例中,根据识别出预设时间段内的所述图像中的眼部图像状态并且进行相应比对,计算预设时间段内眼睛眨动次数,从而计算预设时间段的眼睛眨动频率。
请参看图7,其是本申请实施例提供的控制模块40的示意性框图。在本实施例中,控制模块40根据侦测使用者是否存在且侦测到使用者的时间是否达到预设时长来智能控制终端所处的使用状态,提高终端的智能化控制性能,从而避免用户体验受影响。具体地,控制模块40包括第一判断模块42、侦测模块44以及计时模块46。
所述第一判断模块42,用于判断终端当前所处使用状态是待机状态还是正常工作状态。具体地,所述终端当前所处使用状态包括待机状态和正常工作状态。
所述侦测模块44,若是待机状态,用于侦测在所述终端的预设范围内是否存在使用者。具体地,确定终端预设范围内是否存在使用者可以根据图像传感器生成的图像定位使用者的人体图像,从而侦测到使用者的存在。在一些可行的实施例中,还可以在终端设置红外传感器,使用者进入到终端的预设范围内时可以根据红外传感器侦测到使用者的存在。
所述计时模块46,用于计算所述终端的预设范围存在使用者的时间是否达到预设时长。具体地,当所述终端的预设范围侦测到使用者的存在,计时模块开始工作。在一些可行的实施例中,该预设时长可以由用户进行设置,可以避免用户出现进入到终端的预设范围内又突然离开等情况。
所述控制模块40,用于控制终端正常工作。具体地,当终端处于正常工作状态时,执行所述实时获取终端前方的摄像区域内的图像。
所述控制模块40,用于控制终端维持待机模式。
所述第一判断模块42,用于判断所述终端当前处于正常工作状态。具体地,当终端处于正常工作状态时,执行所述实时获取终端前方的摄像区域内的图像。
在上述实施例中,根据侦测使用者是否存在且侦测到使用者的时间是否达到预设时长来智能控制终端所处的使用状态,提高终端的智能化控制性能,从而避免用户体验受影响。可以理解地,在上述实施例中若该终端处于待机状态,图像传感器或红外传感器也一直在工作,一旦侦测到终端前摄像区域内使用者的存在且达到预设时长,该终端将从待机状态切换到正常工作状态,在其他实施例中,也可以由用户手动按键开机,进入正常工作状态。上述实施例可以确保终端的自动切换使用状态,终端所处的使用状态包括待机状态和正常工作状态。
在一些可行的实施例中,还可以将第一实施例的步骤S12和第二实施例的具体流程进行合并,例如,当终端处于正常工作状态时可以按照第一实施例提供的方法计算眼睛眨动频率范围,进而确定终端执行相应操作。
请参看图8,其为本申请实施例提供的一种终端的用户示意框图,其也可以是显示装置的另一实施例。本实施例中所描述的终端100,包括:至少一个输入设备200、至少一个输出设备300、以及至少一个处理模块(CPU)400、图像传感器500以及存储模块600。输入设备200、输出设备300、处理模块400、图像传感器500、存储模块600信号通过总线700连接。
输入设备200具体可为触控面板(触摸屏)、物理按键、指纹识别模组和鼠标。
输出设备300具体可为显示屏。
存储模块600可以是高速RAM存储模块,也可为非不稳定的存储模块(non-volatile memory),例如磁盘存储模块。上述存储模块600用于存储一组程序代码,上述输入设备200、输出设备300和处理模块400用于调用存储模块600中存储的程序代码,执行如下操作:
处理模块400,用于:
实时获取所述图像传感器产生的所述终端前摄像区域内的图像。具体地,若在获取到的终端图像传感器产生的图像中检测到摄像区域内存在眼睛且眼睛在眨动,表明终端前摄像区域内有用户正在观看终端。其中,可以根据眼睛是否存在及眼睛是否在眨动来确定当前终端所处的工作状态。
根据获取的图像计算用户的眼睛眨动的频率。具体地,从该获取的图像中提取预设时间段内的眼部图像,根据预设时间段的眼部图像计算用户眼睛眨动的次数,可以根据计算的用户眼睛眨动的次数与预设时间段计算出所述眼睛眨动的频率。
根据所述眼睛眨动的频率和预设的规则选择相应的操作,所述预设规则为所述眼睛眨动的频率和操作的对应关系。具体地,设置用户的眼睛眨动频率范围,例如,用户的眼睛正常眨动频率范围为一分钟眨动15次,程序设计可以根据人眼正常眨动范围来进行相应的频率范围的设置,设置10-20次每分钟的眼睛眨动频率为第一频率范围,设置1-10次每分钟的眼睛眨动频率为第二频率范围,设置以10次呈阶梯式累加为各频率范围。在一些可行的实施例中,还可以由用户进行自定义设置眼睛眨动的频率范围。因此,根据所述眼睛眨动频率范围和所述预设规则选择终端执行相应的操作,当计算出的所述用户眼睛眨动的频率位于第一频率范围,所述终端执行第一操作;当计算出的所述用户眼睛眨动的频率位于第二频率范围,所述终端执行第二操作。例如,若该眼睛眨动频率位于第一频率范围,可以根据终端的功能操作来设置相应的所述终端的第一操作,所述第一操作可以是自动切换终端当前应用场景或播放频道,也可以是自动调节亮度等操作;若该眼睛眨动频率位于第二频率范围,终端执行第二操作,所述第二操作可以是自动调低或调高音量或者调节对比度等操作。在一些可行的实施例中,还可以由用户自定义设置所述预设规则,对眼睛眨动频率范围进行相应操作的设置可以根据个人喜好进行设置。
控制所述终端根据所述操作执行相应的功能。具体地,根据所述眼睛眨动频率和所述预设规则选择终端执行的操作,从而控制所述终端根据所述操作执行相应的功能。例如,当用户使用终端观看视频不小心睡着时,计算眼睛眨动频率为零,从而控制终端进入待机模式,当用户醒来时,计算眼睛眨动频率,还可以根据眼睛眨动频率过慢时控制终端调低音量,眼睛眨动频率过快时控制终端执行切换频道等操作;从而实现智能化控制终端的工作状态,确保更智能化控制终端显示,方便用户使用,提升用户体验,且节约电力资源。另外,上述实施例还可以根据用户喜好设置相应的操作。
在上述实施例中,根据所述获取到的图像处理计算出的眼睛眨动的频率和预设的规则进行相应的操作,从而保证了终端显示的智能化控制,提高用户的体验。即上述实施例可以根据预设时间段内眼睛眨动频率范围和预设的规则来对终端显示进行相应操作,例如,当计算出的所述用户眼睛眨动的频率位于第一频率范围,所述终端执行第一操作;当计算出的所述用户眼睛眨动的频率位于第二频率范围,所述终端执行第二操作;从而实现根据眼睛眨动频率范围动态确定所述终端执行的操作,在用户观看视频时提高用户的体验。另外,上述实施例还可以通过用户进行自定义设置来确定所述眼睛眨动的频率和操作的对应关系。
进一步地,处理模块400根据获取的图像计算用户的眼睛眨动的频率,具体地,处理模块400用于:
提取预设时间段内的所述图像。具体地,该预设时间段可以由用户进行设置。
识别出所述图像中的眼部图像状态,所述眼部图像状态包括闭眼状态和睁眼状态。具体地,根据眼睛的特征信息可以识别出所述图像中的该眼部图像状态,例如,可以通过程序设计对用户眼睛的瞳孔外形、异色边缘(虹膜、巩膜)等信息进行识别处理,因此,当用户的眼部图像状态处于睁眼状态时,可以识别到眼睛的瞳孔外形以及异色边缘等眼睛特征信息,处于闭眼状态时,不可以识别到相应的眼睛特征信息。
若相邻的两幅图像的眼部图像状态不一致时,计算为眼睛眨动一次。具体地,将该识别出所述图像中的眼部图像状态进行比对,当相邻时间点内的两幅图像的眼部图像状态不一致时,计算为眼睛眨动一次。
计算预设时间段眼睛眨动的次数。具体地,可以根据预设时间内的图像中的眼部图像状态的比对计算眼睛眨动次数,从而可以计算出预设时间段内眼睛眨动的次数。
根据计算的次数与预设时间段计算出所述眼睛眨动的频率。
上述实施例中,根据识别出预设时间段内的所述图像中的眼部图像状态并且进行相应比对,计算预设时间段内眼睛眨动次数,从而计算预设时间段的眼睛眨动频率。
进一步地,处理模块400根据侦测使用者是否存在且侦测到使用者的时间是否达到预设时长来智能控制终端所处的使用状态,提高终端的智能化控制性能,从而避免用户体验受影响。具体地,处理模块400用于
判断终端当前所处使用状态是待机状态还是正常工作状态。具体地,所述终端当前所处使用状态包括待机状态和正常工作状态。
若是待机状态,侦测在所述终端的预设范围内是否存在使用者。具体地,确定终端预设范围内是否存在使用者可以根据图像传感器生成的图像定位使用者的人体图像,从而侦测到使用者的存在。在一些可行的实施例中,还可以在终端设置红外传感器,使用者进入到终端的预设范围内时可以根据红外传感器侦测到使用者的存在。
判断所述终端的预设范围存在使用者的时间是否达到预设时长。具体地,当所述终端的预设范围侦测到使用者的存在,计时模块开始工作。在一些可行的实施例中,该预设时长可以由用户进行设置,可以避免用户出现进入到终端的预设范围内又突然离开等情况。
控制终端正常工作。具体地,当终端处于正常工作状态时,执行所述实时获取终端前方的摄像区域内的图像。
控制终端维持待机模式。
所述终端当前处于正常工作状态。具体地,当终端处于正常工作状态时,执行所述实时获取终端前方的摄像区域内的图像。
在上述实施例中,根据侦测使用者是否存在且侦测到使用者的时间是否达到预设时长来智能控制终端所处的使用状态,提高终端的智能化控制性能,从而避免用户体验受影响。可以理解地,在上述实施例中若该终端处于待机状态,图像传感器或红外传感器也一直在工作,一旦侦测到终端前摄像区域内使用者的存在且达到预设时长,该终端将从待机状态切换到正常工作状态,在其他实施例中,也可以由用户手动按键开机,进入正常工作状态。上述实施例可以确保终端的自动切换使用状态,终端所处的使用状态包括待机状态和正常工作状态。
在一些可行的实施例中,还可以将第一实施例的步骤S12和第二实施例的具体流程进行合并,例如,当终端处于正常工作状态时可以按照第一实施例提供的方法计算眼睛眨动频率范围,进而确定终端执行相应操作。
请参看图9,其为本申请实施例提供的一种显示装置10的示意性框图。本实施例中所描述的显示装置10,包括:一显示屏30和一图像传感器,该显示屏30用于显示图像画面;该显示装置10还包括,
图像获取模块,用于实时获取终端前方的摄像区域内的图像;
频率计算模块,用于根据获取的图像计算用户的眼睛眨动的频率;
操作选择模块,用于根据所述眼睛眨动的频率和预设的规则选择相应的操作,所述预设规则为所述眼睛眨动的频率和操作的对应关系;以及
控制模块,用于控制所述终端根据所述操作执行相应的功能。
本申请实施例中描述的显示装置10包括但不限于诸如具有有机发光二极管显示屏,液晶显示屏、等离子显示屏、阴极射线管显示屏等显示屏30的显示装置10。
该显示装置10为四周一体成型,该显示装置10的非显示区域内具有用于安装图像传感器的安装孔20,该图像传感器设于安装孔20内,其中,图像传感器包括但不限于摄像头、CCD、CMOS,该图像传感器可用于实时获取获取位于显示屏30前方的摄像区域内的图像。
所述显示屏30, 包括但不限于有机发光二极管显示屏,液晶显示屏、等离子显示屏、阴极射线管显示屏等。该显示屏30具有TFT基板和彩膜基板,所述TFT基板和所述彩膜基板中间设有液晶层,该显示屏30用于显示图像画面。
所述安装孔20位于所述显示装置10的非显示区域的纵向中垂线上,该图像传感器可以通过硅胶固定于安装孔20内。
在一些可行的实施例中,显示装置10的图像传感器可以有多个,因此,所对应的安装孔20也对应有多个,即安装孔20与所述图像传感器一一对应;另外,一个安装孔20也可以对应多个图像传感器。
In the above embodiments, the image sensor can detect whether a user is present, the blink frequency can be calculated from the acquired images, and the display device is then intelligently controlled according to the preset rule: for example, if the blink frequency falls within a first frequency range, the display device performs a first operation; if it falls within a second frequency range, the display device performs a second operation. This improves the intelligent-control level of the display device and enhances the user experience.
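The preset rule, a correspondence between blink-frequency ranges and operations, can be sketched as a range lookup. The numeric bounds and operation names below are hypothetical, since the disclosure does not fix specific frequency ranges.

```python
def select_operation(freq, rules):
    """Preset rule as a list of (low, high, operation) frequency ranges.
    Returns the operation whose half-open range [low, high) contains the
    blink frequency, or None if no range matches."""
    for low, high, operation in rules:
        if low <= freq < high:
            return operation
    return None

# Hypothetical first and second frequency ranges (blinks per second).
rules = [
    (0.5, 1.0, "first operation"),
    (1.0, 2.0, "second operation"),
]
print(select_operation(0.7, rules))  # first operation
```

Half-open ranges keep the rule unambiguous at a shared boundary such as 1.0, which would otherwise match both ranges.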
Those of ordinary skill in the art will appreciate that the modules and algorithm steps of the examples described in the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of function. Whether these functions are performed in hardware or software depends on the particular application and design constraints of the technical solution. Skilled artisans may implement the described functions differently for each particular application, but such implementations should not be considered beyond the scope of this application.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the terminal and modules described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed terminal and method may be implemented in other ways. For example, the device embodiments described above are merely illustrative; the division into modules is only a logical functional division, and other divisions are possible in actual implementation: multiple modules or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or modules, and may be electrical, mechanical, or in other forms.
The above are only specific embodiments of this application, but the protection scope of this application is not limited thereto. Any person skilled in the art can easily conceive of various equivalent modifications or substitutions within the technical scope disclosed in this application, and all such modifications or substitutions shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (20)

  1. A display control method, comprising:
    acquiring, in real time, images of a camera area in front of a terminal;
    calculating a user's eye-blink frequency from the acquired images;
    selecting a corresponding operation according to the blink frequency and a preset rule, the preset rule being a correspondence between blink frequencies and operations; and
    controlling the terminal to perform a corresponding function according to the operation.
  2. The method of claim 1, wherein calculating the user's eye-blink frequency from the acquired images comprises:
    extracting the images within a preset time period;
    identifying an eye-image state in the images, the eye-image state comprising a closed-eye state and an open-eye state;
    counting one eye blink when the eye-image states of two adjacent images differ;
    counting the number of eye blinks within the preset time period; and
    calculating the eye-blink frequency from the counted number and the preset time period.
  3. The method of claim 1, wherein the preset rule being a correspondence between blink frequencies and operations comprises:
    the terminal performing a first operation if the blink frequency falls within a first frequency range; and
    the terminal performing a second operation if the blink frequency falls within a second frequency range.
  4. The method of claim 1, further comprising:
    determining whether the terminal's current use state is a standby state or a normal working state;
    if it is the standby state, detecting whether a user is present within a preset range of the terminal;
    determining whether the user has been present within the preset range of the terminal for a preset duration; and
    if the preset duration is reached, controlling the terminal to enter the normal working state and performing the acquiring, in real time, of images of the camera area in front of the terminal.
  5. The method of claim 4, wherein calculating the user's eye-blink frequency from the acquired images comprises:
    extracting the images within a preset time period;
    identifying an eye-image state in the images, the eye-image state comprising a closed-eye state and an open-eye state;
    counting one eye blink when the eye-image states of two adjacent images differ;
    counting the number of eye blinks within the preset time period; and
    calculating the eye-blink frequency from the counted number and the preset time period.
  6. The method of claim 4, wherein the preset rule being a correspondence between blink frequencies and operations comprises:
    the terminal performing a first operation if the blink frequency falls within a first frequency range; and
    the terminal performing a second operation if the blink frequency falls within a second frequency range.
  7. A display device, comprising:
    a display panel;
    a storage module configured to store program instructions; and
    a processing module, connected to the display panel and the storage module and configured to call and execute the program instructions to perform the following steps: acquiring, in real time, images of a camera area in front of a terminal; calculating a user's eye-blink frequency from the acquired images; selecting a corresponding operation according to the blink frequency and a preset rule, the preset rule being a correspondence between blink frequencies and operations; and controlling the terminal and the display device to perform a corresponding function according to the operation.
  8. The display device of claim 7, wherein, when performing the step of calculating the user's eye-blink frequency from the acquired images, the processing module specifically performs the following steps:
    extracting the images within a preset time period;
    identifying an eye-image state in the images, the eye-image state comprising a closed-eye state and an open-eye state;
    counting one eye blink when the eye-image states of two adjacent images differ;
    counting the number of eye blinks within the preset time period; and
    calculating the eye-blink frequency from the counted number and the preset time period.
  9. The display device of claim 7, wherein, for the preset rule being a correspondence between blink frequencies and operations, the processing module specifically performs the following steps:
    causing the terminal to perform a first operation if the blink frequency falls within a first frequency range; and
    causing the terminal to perform a second operation if the blink frequency falls within a second frequency range.
  10. The display device of claim 7, wherein the processing module calls and executes the program instructions to further perform the following steps:
    determining whether the terminal's current use state is a standby state or a normal working state;
    if it is the standby state, detecting whether a user is present within a preset range of the terminal;
    determining whether the user has been present within the preset range of the terminal for a preset duration; and
    if the preset duration is reached, controlling the terminal to enter the normal working state and performing the acquiring, in real time, of images of the camera area in front of the terminal.
  11. The display device of claim 10, wherein, when performing the step of calculating the user's eye-blink frequency from the acquired images, the processing module specifically performs the following steps:
    extracting the images within a preset time period;
    identifying an eye-image state in the images, the eye-image state comprising a closed-eye state and an open-eye state;
    counting one eye blink when the eye-image states of two adjacent images differ;
    counting the number of eye blinks within the preset time period; and
    calculating the eye-blink frequency from the counted number and the preset time period.
  12. The display device of claim 10, wherein, for the preset rule being a correspondence between blink frequencies and operations, the processing module specifically performs the following steps:
    causing the terminal to perform a first operation if the blink frequency falls within a first frequency range; and
    causing the terminal to perform a second operation if the blink frequency falls within a second frequency range.
  13. A display device, comprising a display panel and:
    an image acquisition module configured to acquire, in real time, images of a camera area in front of a terminal;
    a frequency calculation module configured to calculate a user's eye-blink frequency from the acquired images;
    an operation selection module configured to select a corresponding operation according to the blink frequency and a preset rule, the preset rule being a correspondence between blink frequencies and operations; and
    a control module configured to control the terminal to perform a corresponding function according to the operation.
  14. The display device of claim 13, wherein the frequency calculation module comprises:
    an extraction module configured to extract the images within a preset time period;
    an identification module configured to identify an eye-image state in the images, the eye-image state comprising a closed-eye state and an open-eye state;
    a calculation module configured to count one eye blink when the eye-image states of two adjacent images differ;
    the calculation module being further configured to count the number of eye blinks within the preset time period; and
    the calculation module being further configured to calculate the eye-blink frequency from the counted number and the preset time period.
  15. The display device of claim 13, wherein the operation selection module comprises:
    a first operation module configured to cause the terminal to perform a first operation if the blink frequency falls within a first frequency range; and
    a second operation module configured to cause the terminal to perform a second operation if the blink frequency falls within a second frequency range.
  16. The display device of claim 13, wherein the control module comprises:
    a first judging module configured to determine whether the terminal's current use state is a standby state or a normal working state;
    a detection module configured to, if it is the standby state, detect whether a user is present within a preset range of the terminal;
    a timing module configured to determine whether the user has been present within the preset range of the terminal for a preset duration; and
    the control module being configured to, if the preset duration is reached, control the terminal to enter the normal working state and perform the acquiring, in real time, of images of the camera area in front of the terminal.
  17. The display device of claim 13, further comprising:
    an identification module configured to identify an eye-image state in the images, the eye-image state comprising a closed-eye state and an open-eye state;
    a first judging module configured to determine whether the display device's current use state is a standby state or a normal working state;
    a detection module configured to, if it is the standby state, detect whether a user is present within a preset range of the display device; and
    a timing module configured to determine whether the user has been present within the preset range of the display device for a preset duration.
  18. The display device of claim 17, wherein the frequency calculation module comprises:
    an extraction module configured to extract the images within a preset time period;
    an identification module configured to identify an eye-image state in the images, the eye-image state comprising a closed-eye state and an open-eye state;
    a calculation module configured to count one eye blink when the eye-image states of two adjacent images differ;
    the calculation module being further configured to count the number of eye blinks within the preset time period; and
    the calculation module being further configured to calculate the eye-blink frequency from the counted number and the preset time period.
  19. The display device of claim 17, wherein the operation selection module comprises:
    a first operation module configured to cause the terminal to perform a first operation if the blink frequency falls within a first frequency range; and
    a second operation module configured to cause the terminal to perform a second operation if the blink frequency falls within a second frequency range.
  20. The display device of claim 17, wherein the control module is configured to, if the preset duration is reached, control the terminal to enter the normal working state and perform the acquiring, in real time, of images of the camera area in front of the terminal.
PCT/CN2017/086116 2016-12-19 2017-05-26 Display control method and display device WO2018113187A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/557,808 US10255874B2 (en) 2016-12-19 2017-05-26 Display controlling method and display device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611178597.4A CN106681503A (zh) 2016-12-19 2016-12-19 Display control method, terminal, and display device
CN201611178597.4 2016-12-19

Publications (1)

Publication Number Publication Date
WO2018113187A1 true WO2018113187A1 (zh) 2018-06-28

Family

ID=58870884

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/086116 WO2018113187A1 (zh) 2016-12-19 2017-05-26 一种显示控制方法及显示装置

Country Status (3)

Country Link
US (1) US10255874B2 (zh)
CN (1) CN106681503A (zh)
WO (1) WO2018113187A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112738407A (zh) * 2021-01-06 2021-04-30 富盛科技股份有限公司 Method and apparatus for controlling multiple cameras

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106681503A (zh) * 2016-12-19 2017-05-17 惠科股份有限公司 一种显示控制方法、终端及显示装置
CN106681060A (zh) * 2017-03-10 2017-05-17 惠科股份有限公司 一种封胶方法、封胶结构及显示装置
CN108932058B (zh) 2018-06-29 2021-05-18 联想(北京)有限公司 显示方法、装置及电子设备
US10802585B2 (en) 2018-07-12 2020-10-13 Apple Inc. Electronic devices with display operation based on eye activity
CN110958422B (zh) * 2018-09-25 2021-08-27 杭州萤石软件有限公司 Method and device for displaying infrared detection information in a visible-light image
CN110582014A (zh) * 2019-10-17 2019-12-17 深圳创维-Rgb电子有限公司 Television, television control method, control device, and readable storage medium
CN110784763B (zh) * 2019-11-07 2021-11-02 深圳创维-Rgb电子有限公司 Display terminal control method, display terminal, and readable storage medium
CN113760097A (zh) * 2021-09-16 2021-12-07 Oppo广东移动通信有限公司 Volume control method and device, terminal, and computer-readable storage medium
CN115170075B (zh) * 2022-07-06 2023-06-16 深圳警通人才科技有限公司 Smart office system based on digital platform technology

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101287086A (zh) * 2007-04-10 2008-10-15 深圳Tcl新技术有限公司 Method and system for automatic power-on/off of a television
CN104267814A (zh) * 2014-09-25 2015-01-07 联想(北京)有限公司 Information processing method and electronic device
CN106155317A (zh) * 2016-06-29 2016-11-23 深圳市金立通信设备有限公司 Terminal screen control method and terminal
CN106681503A (zh) * 2016-12-19 2017-05-17 惠科股份有限公司 Display control method, terminal, and display device

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9772689B2 (en) * 2008-03-04 2017-09-26 Qualcomm Incorporated Enhanced gesture-based image manipulation
US8106783B2 (en) * 2008-03-12 2012-01-31 Denso Corporation Input apparatus, remote controller and operating device for vehicle
JP4561914B2 (ja) * 2008-09-22 2010-10-13 ソニー株式会社 Operation input device, operation input method, and program
KR101078057B1 (ko) * 2009-09-08 2011-10-31 주식회사 팬택 Mobile terminal with a photographing control function using image recognition, and photographing control system using image recognition
JP6106921B2 (ja) * 2011-04-26 2017-04-05 株式会社リコー Imaging device, imaging method, and imaging program
KR101850034B1 (ko) * 2012-01-06 2018-04-20 엘지전자 주식회사 Mobile terminal and control method thereof
KR101850035B1 (ko) * 2012-05-02 2018-04-20 엘지전자 주식회사 Mobile terminal and control method thereof
CN102915193B (zh) * 2012-10-24 2015-04-01 广东欧珀移动通信有限公司 Web browsing method and device, and intelligent terminal
CN103472918A (zh) * 2013-09-12 2013-12-25 京东方科技集团股份有限公司 Eye-protection display device and operation method thereof
US10048748B2 (en) * 2013-11-12 2018-08-14 Excalibur Ip, Llc Audio-visual interaction with user devices
KR20150089283A (ko) * 2014-01-27 2015-08-05 엘지전자 주식회사 Wearable terminal and system including same
KR102240632B1 (ko) * 2014-06-10 2021-04-16 삼성디스플레이 주식회사 Driving method of an electronic device providing a bio-effect image
KR102240639B1 (ko) * 2014-06-12 2021-04-15 엘지전자 주식회사 Glass-type terminal and control method thereof
KR102184272B1 (ko) * 2014-06-25 2020-11-30 엘지전자 주식회사 Glass-type terminal and control method thereof
CN106156806A (zh) * 2015-04-01 2016-11-23 冠捷投资有限公司 Fatigue-prevention method for a display
US20160343229A1 (en) * 2015-05-18 2016-11-24 Frank Colony Vigilance detection method and apparatus
KR20160138806A (ko) * 2015-05-26 2016-12-06 엘지전자 주식회사 Glass-type terminal and control method thereof
KR20170037466A (ko) * 2015-09-25 2017-04-04 엘지전자 주식회사 Mobile terminal and control method thereof
CN105809139A (zh) * 2016-03-15 2016-07-27 广东欧珀移动通信有限公司 Eyeball information collection method and device
US20180025050A1 (en) * 2016-07-21 2018-01-25 Yen4Ken,Inc. Methods and systems to detect disengagement of user from an ongoing
CN106446831B (zh) * 2016-09-24 2021-06-25 江西欧迈斯微电子有限公司 Face recognition method and device



Also Published As

Publication number Publication date
CN106681503A (zh) 2017-05-17
US10255874B2 (en) 2019-04-09
US20180293954A1 (en) 2018-10-11


Legal Events

Code  Title (Description)
WWE   Wipo information: entry into national phase (Ref document number: 15557808; Country of ref document: US)
121   Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17885350; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: DE)
32PN  Ep: public notification in the ep bulletin as address of the addressee cannot be established (Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17.10.2019))
122   Ep: pct application non-entry in european phase (Ref document number: 17885350; Country of ref document: EP; Kind code of ref document: A1)