CN113703640A - Display device and intelligent touch method thereof

Info

Publication number: CN113703640A
Application number: CN202111016337.8A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: control window, touch, display, content, user
Inventors: 王子锋, 揭育顺
Current Assignee: BOE Technology Group Co Ltd
Original Assignee: BOE Technology Group Co Ltd
Application filed by BOE Technology Group Co Ltd
Priority to CN202111016337.8A
Publication of CN113703640A
Priority to PCT/CN2022/108148 (published as WO2023029822A1)
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1407 General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a display device and an intelligent touch method thereof, which reduce the steps of invoking a small window through a menu for touch switching and improve the efficiency of human-computer interaction. The display device includes a display screen, a touch component, and a controller, where: the display screen is configured to display content; the touch component is configured to receive a touch signal input by a user; and the controller is configured to perform: triggering display of a control window on the display screen of the display device according to a detected user position, and controlling the control window to be in a first state, where the first state represents a transition state in which it is judged, according to a received touch signal input by the user, whether to switch to a second state; controlling the control window to switch from the first state to the second state according to a detected positional relationship between a touch position of the touch signal input by the user and the control window; and determining that the control window is in the second state, and controlling the display screen through the control window.

Description

Display device and intelligent touch method thereof
Technical Field
The present disclosure relates to the technical field of human-computer interaction, and in particular to a display device and an intelligent touch method thereof.
Background
As the most important interface for human-computer interaction, the display screen is increasingly applied to many aspects of daily life. With the development of technology, many interactive functions have been added to display screens; for example, a user can interact with a display screen through voice control, touch control, and other modes, where touch control includes infrared, capacitive, and other technologies. As display screens become more versatile, the scenarios in which they are applied become more varied, and the screen sizes available for different scenarios become more diverse.
For current large-size display screens, such as conference whiteboards of 86 inches or more, the large screen size makes full-screen touch operation inconvenient for the user. One touch mode currently provided is to open a small window through an operation menu on the display screen; the user performs touch operations in the small window, which are synchronized to the display screen. However, in this mode the user must make several selections through the operation menu before switching to the small window, and the small-screen operation is performed in a fixed small window, so human-computer interaction efficiency is low. Another mode is to perform online touch through another device connected to the display device so that the two operate synchronously; however, this mode requires additional equipment, is inconvenient for direct human-computer interaction, involves cumbersome operation, and provides a poor experience.
Disclosure of Invention
The present disclosure provides a display device and an intelligent touch method thereof, which reduce the steps of invoking a small window through a menu for touch switching, provide more intelligent and convenient human-computer interaction, and improve human-computer interaction efficiency.
In a first aspect, a display device provided in an embodiment of the present disclosure includes a display screen, a touch component, and a controller, where:
the display screen is used for displaying content;
the touch control assembly is used for receiving a touch control signal input by a user;
the controller is configured to perform:
triggering display of a control window on a display screen of the display device according to a detected user position, and controlling the control window to be in a first state, where the first state represents a transition state in which it is judged, according to a received touch signal input by the user, whether to switch to a second state;
controlling the control window to switch from the first state to the second state according to a detected positional relationship between a touch position of the touch signal input by the user and the control window; and
determining that the control window is in the second state, and controlling the display screen through the control window.
The display device provided by this embodiment can estimate the user's next action from the user's behavior, gradually judging, from the user position and the touch position of the touch signal, whether the user needs the control window. This provides a way to open the control window without the user having to take any explicit steps, and improves interaction efficiency.
As an alternative embodiment, the processor is configured to perform:
determining the distance between the user and the display device according to the detected position of the user;
and if the distance is smaller than the distance threshold, triggering the display of the control window on the display screen of the display equipment.
As an alternative embodiment, the processor is configured to perform:
determining the abscissa of the center of the control window according to the abscissa of the user position;
and determining the display position of the control window on the display screen according to the abscissa of the center of the control window.
As an alternative embodiment, the processor is configured to perform:
and if the touch position is located in the area of the control window, controlling the control window to be in a second state.
As an alternative embodiment, the processor is configured to perform:
and controlling the control window to be in a second state according to the detected position relation between the touch position operated by the user and the control window and the touch content generated in a first range taking the touch position as the center.
As an optional implementation manner, the processor is specifically further configured to perform:
if the touch position is located in the area where the control window is located, determining a display position corresponding to the touch position on the display screen, and if the touch content matches display content displayed on the display screen within a second range centered on the display position, controlling the control window to be in the second state; or, alternatively,
if the touch position is located in the area where the control window is located, obtaining display content displayed on the display screen within a third range centered on the touch position, and if the touch content generated in the range centered on the touch position matches that display content, closing the control window and deleting the content displayed at the display position corresponding to the touch position.
As an optional implementation, the processor is configured to determine that the touch content matches the display content by:
determining that the touch content matches the display content according to at least one of the size and the data type of the touch content and the display content; or, alternatively,
after the touch content is enlarged according to a preset rule, or the display content is reduced according to a preset rule, determining that the touch content matches the display content according to the sizes of the touch content and the display content.
As an optional implementation, when triggering the display of the control window on the display screen of the display device according to the detected user position, the processor is further specifically configured to perform:
and mapping at least part of content on the display screen to the control window for displaying after zooming.
As an optional implementation manner, the processor is specifically further configured to perform:
and controlling the transparency of the content mapped in the control window to be gradually reduced in the process of converting the control window from the first state to the second state.
As an alternative embodiment, the processor is configured to perform:
if at least part of the content on the display screen is scaled and mapped to the control window for display according to the detected user position, displaying the mapped content with a first-level transparency; or, alternatively,
if the touch position is located in the area where the control window is located, displaying the mapped content with a second-level transparency, where the second-level transparency is lower than the first-level transparency; or, alternatively,
if the touch position is located in the area where the control window is located and the touch content matches the display content, displaying the mapped content in a non-transparent manner.
As an optional implementation manner, after triggering the display of the control window on the display screen of the display device, the processor is further specifically configured to perform:
if the touch signal is not detected within a preset time period, closing the control window; or, alternatively,
if the difference between the detection time of the touch signal and the detection time of the user position is greater than a time threshold, closing the control window.
As an alternative embodiment, the processor is configured to determine the user position by:
determining the user position according to depth information in a depth image of the user captured by the display device; or, alternatively,
scanning with the digital radar array of the display device and determining the user position according to the scanning result.
As an optional implementation manner, the processor is specifically further configured to:
if the control window is in a first state, displaying content through the control window, and receiving a touch signal through the display screen;
and if the control window is in the second state, receiving a touch signal through the control window, and displaying the content through the area outside the control window.
In a second aspect, a method for intelligent touch provided by the embodiments of the present disclosure includes:
triggering display of a control window on a display screen of the display device according to a detected user position, and controlling the control window to be in a first state, where the first state represents a transition state in which it is judged, according to a received touch signal input by the user, whether to switch to a second state;
controlling the control window to switch from the first state to the second state according to a detected positional relationship between a touch position of the touch signal input by the user and the control window; and
determining that the control window is in the second state, and controlling the display screen through the control window.
As an optional implementation, the triggering display of a control window on a display screen of a display device according to the detected user position includes:
determining the distance between the user and the display device according to the detected position of the user;
and if the distance is smaller than the distance threshold, triggering the display of the control window on the display screen of the display equipment.
As an optional implementation, the triggering display of the control window on the display screen of the display device includes:
determining the abscissa of the center of the control window according to the abscissa of the user position;
and determining the display position of the control window on the display screen according to the abscissa of the center of the control window.
As an optional implementation manner, the controlling the control window to be in the second state according to the detected position relationship between the touch position of the touch signal input by the user and the control window includes:
and if the touch position is located in the area of the control window, controlling the control window to be in a second state.
As an optional implementation manner, controlling the control window to be in the second state according to the detected position relationship between the touch position of the touch signal input by the user and the control window includes:
and controlling the control window to be in a second state according to the detected position relation between the touch position operated by the user and the control window and the touch content generated in a first range taking the touch position as the center.
As an optional implementation manner, the controlling the control window to be in the second state according to the detected position relationship between the touch position operated by the user and the control window, and the touch content generated in the first range with the touch position as the center includes:
if the touch position is located in the area where the control window is located, determining a display position corresponding to the touch position on the display screen, and if the touch content matches display content displayed on the display screen within a second range centered on the display position, controlling the control window to be in the second state; or, alternatively,
if the touch position is located in the area where the control window is located, obtaining display content displayed on the display screen within a third range centered on the touch position, and if the touch content generated in the range centered on the touch position matches that display content, closing the control window and deleting the content displayed at the display position corresponding to the touch position.
As an alternative implementation, whether the touch content matches the display content is determined by:
determining that the touch content matches the display content according to at least one of the size and the data type of the touch content and the display content; or, alternatively,
after the touch content is enlarged according to a preset rule, or the display content is reduced according to a preset rule, determining that the touch content matches the display content according to the sizes of the touch content and the display content.
As an optional implementation, the triggering, according to the detected user position, display of a control window on a display screen of the display device further includes:
and mapping at least part of content on the display screen to the control window for displaying after zooming.
As an optional implementation manner, the mapping at least part of the content on the display screen to the control window for displaying after zooming further includes:
and controlling the transparency of the content mapped in the control window to be gradually reduced in the process of converting the control window from the first state to the second state.
As an optional implementation, the controlling the transparency of the content mapped in the control window to be gradually reduced includes:
if at least part of the content on the display screen is scaled and mapped to the control window for display according to the detected user position, displaying the mapped content with a first-level transparency; or, alternatively,
if the touch position is located in the area where the control window is located, displaying the mapped content with a second-level transparency, where the second-level transparency is lower than the first-level transparency; or, alternatively,
if the touch position is located in the area where the control window is located and the touch content matches the display content, displaying the mapped content in a non-transparent manner.
As an optional implementation, after triggering the display of the control window on the display screen of the display device, the method further includes:
if the touch signal is not detected within a preset time period, closing the control window; or, alternatively,
if the difference between the detection time of the touch signal and the detection time of the user position is greater than a time threshold, closing the control window.
As an alternative embodiment, the user position is determined by:
determining the user position according to depth information in a depth image of the user captured by the display device; or, alternatively,
scanning with the digital radar array of the display device and determining the user position according to the scanning result.
As an optional implementation, the method further includes:
if the control window is in a first state, displaying content through the control window, and receiving a touch signal through the display screen;
and if the control window is in the second state, receiving a touch signal through the control window, and displaying the content through the area outside the control window.
In a third aspect, an embodiment of the present disclosure further provides an apparatus for intelligent touch, including:
the display unit is used for triggering display of a control window on a display screen of the display device according to the detected user position and controlling the control window to be in a first state, where the first state represents a transition state in which it is judged, according to a received touch signal input by the user, whether to switch to a second state;
the conversion unit is used for controlling the control window to switch from the first state to the second state according to a detected positional relationship between the touch position of the touch signal input by the user and the control window;
and the control unit is used for determining that the control window is in the second state and controlling the display screen through the control window.
As an optional implementation manner, the display unit is specifically configured to:
determining the distance between the user and the display device according to the detected position of the user;
and if the distance is smaller than the distance threshold, triggering the display of the control window on the display screen of the display equipment.
As an optional implementation manner, the display unit is specifically configured to:
determining the abscissa of the center of the control window according to the abscissa of the user position;
and determining the display position of the control window on the display screen according to the abscissa of the center of the control window.
As an optional implementation manner, the conversion unit is specifically configured to:
and if the touch position is located in the area of the control window, controlling the control window to be in a second state.
As an optional implementation manner, the conversion unit is specifically configured to:
and controlling the control window to be in a second state according to the detected position relation between the touch position operated by the user and the control window and the touch content generated in a first range taking the touch position as the center.
As an optional implementation manner, the conversion unit is specifically configured to:
if the touch position is located in the area where the control window is located, determining a display position corresponding to the touch position on the display screen, and if the touch content matches display content displayed on the display screen within a second range centered on the display position, controlling the control window to be in the second state; or, alternatively,
if the touch position is located in the area where the control window is located, obtaining display content displayed on the display screen within a third range centered on the touch position, and if the touch content generated in the range centered on the touch position matches that display content, closing the control window and deleting the content displayed at the display position corresponding to the touch position.
As an optional implementation manner, the conversion unit is specifically configured to:
determine that the touch content matches the display content according to at least one of the size and the data type of the touch content and the display content; or, alternatively,
after the touch content is enlarged according to a preset rule, or the display content is reduced according to a preset rule, determine that the touch content matches the display content according to the sizes of the touch content and the display content.
As an optional implementation manner, the display unit is further specifically configured to:
and mapping at least part of content on the display screen to the control window for displaying after zooming.
As an optional implementation manner, the display unit is further specifically configured to:
and controlling the transparency of the content mapped in the control window to be gradually reduced in the process of converting the control window from the first state to the second state.
As an optional implementation manner, the display unit is further specifically configured to:
if at least part of the content on the display screen is scaled and mapped to the control window for display according to the detected user position, displaying the mapped content with a first-level transparency; or, alternatively,
if the touch position is located in the area where the control window is located, displaying the mapped content with a second-level transparency, where the second-level transparency is lower than the first-level transparency; or, alternatively,
if the touch position is located in the area where the control window is located and the touch content matches the display content, displaying the mapped content in a non-transparent manner.
As an optional implementation manner, the apparatus further includes a closing unit which, after the display of the control window is triggered on the display screen of the display device, is specifically configured to:
if the touch signal is not detected within a preset time period, close the control window; or, alternatively,
if the difference between the detection time of the touch signal and the detection time of the user position is greater than a time threshold, close the control window.
As an optional implementation manner, the display unit is specifically configured to determine the user position by:
determining the user position according to depth information in a depth image of the user captured by the display device; or, alternatively,
scanning with the digital radar array of the display device and determining the user position according to the scanning result.
As an optional implementation manner, the apparatus further includes a determining unit specifically configured to:
if the control window is in a first state, displaying content through the control window, and receiving a touch signal through the display screen;
and if the control window is in the second state, receiving a touch signal through the control window, and displaying the content through the area outside the control window.
In a fourth aspect, the disclosed embodiments also provide a computer storage medium, on which a computer program is stored, which when executed by a processor is configured to implement the steps of the method according to the first aspect.
These and other aspects of the disclosure will be more readily apparent from the following description of the embodiments.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings described below are only some embodiments of the present disclosure, and that those skilled in the art can obtain other drawings based on them without inventive effort.
Fig. 1 is a schematic diagram of a conventional touch method according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a display device according to an embodiment of the disclosure;
fig. 3A is a schematic diagram of a first control window display provided in the embodiment of the present disclosure;
FIG. 3B is a schematic diagram of a second control window display provided by the present disclosure;
fig. 4A is a schematic diagram illustrating a method for detecting a user location according to an embodiment of the disclosure;
fig. 4B is a schematic diagram illustrating a method for detecting a user location according to an embodiment of the disclosure;
FIG. 5 is a schematic diagram illustrating a display position of a control window according to an embodiment of the present disclosure;
FIG. 6A is a diagram illustrating a first control window display provided by an embodiment of the present disclosure;
FIG. 6B is a diagram illustrating a second control window display provided by the present disclosure;
FIG. 6C is a diagram illustrating a third exemplary control window display provided by the present disclosure;
fig. 7A is a schematic diagram of a first touch content matching provided in the present disclosure;
fig. 7B is a schematic diagram of a second touch content matching provided in the present disclosure;
fig. 7C is a schematic diagram of third touch content matching provided in the embodiment of the disclosure;
fig. 7D is a schematic diagram of a fourth touch content matching provided in the present disclosure;
fig. 8A is a schematic diagram of a fifth touch content matching provided in the present disclosure;
fig. 8B is a schematic diagram of a sixth touch content matching provided in the present disclosure;
fig. 8C is a schematic diagram of a seventh touch content matching provided in the embodiment of the present disclosure;
fig. 8D is a schematic diagram of an eighth touch content matching provided in the embodiment of the present disclosure;
fig. 9 is a flowchart illustrating a detailed implementation of a method for intelligent touch control according to an embodiment of the present disclosure;
fig. 10 is a flowchart illustrating an implementation of a method for intelligent touch control according to an embodiment of the present disclosure;
fig. 11 is a schematic diagram of an intelligent touch device according to an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the present disclosure clearer, the present disclosure is described in further detail below with reference to the accompanying drawings. It is apparent that the described embodiments are only some of the embodiments of the present disclosure, rather than all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments disclosed herein without creative effort shall fall within the protection scope of the present disclosure.
The term "and/or" in the embodiments of the present disclosure describes an association relationship of associated objects, and means that there may be three relationships, for example, a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
The application scenario described in the embodiment of the present disclosure is for more clearly illustrating the technical solution of the embodiment of the present disclosure, and does not form a limitation on the technical solution provided in the embodiment of the present disclosure, and as a person having ordinary skill in the art knows, with the occurrence of a new application scenario, the technical solution provided in the embodiment of the present disclosure is also applicable to similar technical problems. In the description of the present disclosure, the term "plurality" means two or more unless otherwise specified.
As the most important interface for human-computer interaction, the display screen is increasingly applied to many aspects of daily life. With the development of technology, many interactive functions have been added to display screens; for example, a user can interact with a display screen through voice control, touch control, and other modes, where touch control includes infrared, capacitive, and other technologies. As display screens become more versatile, the scenarios in which they are applied become more varied, and the available sizes become more diverse. For current large-size display screens, such as conference whiteboards exceeding 86 inches, the large screen size makes full-screen touch operation inconvenient for the user. As shown in fig. 1, one touch mode currently provided is to open a small window 101 through an operation menu 100 on the display device; the user performs touch operations in the small window 101, which are synchronized to the display device. However, in this mode the user needs to make multiple selections through the operation menu 100 to switch to the small window 101, the position of the small window 101 is fixed, and touch operations are performed in that fixed window, so human-computer interaction efficiency is low. Another mode is to perform online touch through another device 102 connected to the display device to achieve synchronous operation of the two, but this mode requires additional equipment, such as a terminal or a tablet, is inconvenient for direct human-computer interaction, involves cumbersome operation, and provides a poor experience. In order to reduce the steps in which a user repeatedly opens a small window through an operation menu while operating a display device, this embodiment provides, for large-size display devices, a method that automatically determines whether the small window needs to be used to receive a touch signal and control the display screen to perform a corresponding operation, together with a corresponding display device.
The display device provided by this embodiment includes, but is not limited to, a large-size display screen, a large-size smart tablet, a large-size electronic whiteboard, and other display devices. The display device in this embodiment may further be equipped with a camera module for capturing the user, and may also be equipped with a digital radar array for scanning the user and confirming the user position. The camera module and/or the digital radar array may be integrated into the display device or connected to it as a separate module through a communication link, which is not limited in this embodiment.
The core idea of the intelligent touch method provided in this embodiment is to judge, by detecting the user, whether the user is about to perform a touch operation, and then to determine the user's intent from the received touch signal: for example, whether the current touch input is an operation on the control window intended to control what the display screen shows, or an operation on the display screen itself. By accurately distinguishing whether the user needs the control window to control the display screen, the display device automatically confirms the user's intent without the user having to actively open the control window through a menu, and, once it is determined that the user interacts with the display device through the control window, provides the user with a control window configured with a touch function so that the display screen can be controlled through it. The control window in this embodiment and the small window mentioned above are the same concept; both refer to a window used to receive a touch signal and control the display screen.
It should be noted that, in the initial state after the display device in this embodiment is powered on, the default control mode is to receive the touch signal through the display screen of the display device and to display accordingly; at this time the display screen both receives the touch signal and performs the corresponding display. If the control mode is switched to control-window control, the display screen can still receive touch signals, but the touch signal is mainly received through the control window, and display is synchronized between the control window and the display screen. The control window is located at the position on the display screen corresponding to the user, its size is smaller than that of the display screen, and the size can be set so that the user can control the full screen conveniently without moving.
As shown in fig. 2, the display device provided in this embodiment includes a display screen 200, a touch component 201, and a controller 202, wherein:
the display screen 200 is used for displaying content;
the touch component 201 is used for receiving a touch signal input by a user;
the controller 202 is configured to perform:
triggering display of a control window on a display screen 200 of the display device according to the detected user position, and controlling the control window to be in a first state, wherein the first state represents a transition state of judging whether to switch to a second state according to a received touch signal input by a user; the first state of the control window indicates that the control window has a display function but does not have a control function.
The area of the control window in this embodiment is shown in fig. 3A and fig. 3B. This embodiment does not unduly limit the shape, size, or outline of the control window region.
Controlling the control window to be switched from a first state to a second state according to the detected position relation between the touch position of the touch signal input by the user and the control window;
and determining that the control window is in the second state, and controlling the display screen 200 through the control window. It should be noted that the second state in this embodiment is used to represent that the control window has a control function and a display function, and is used to implement full-screen control display.
In some embodiments, if the control window is in the first state, displaying content through the control window, and receiving a touch signal through the display screen; at this time, the touch screen is still in a full-screen touch state, the touch signal is received through the full screen, and the control window only has a display function. And if the control window is in the second state, receiving the touch signal through the control window, and displaying the content through the area outside the control window.
If the control window is in the first state, content is displayed through the control window; in this state the control window is only used to display content, including the mapped content of the scaled full screen and content written by the user in the control window, while the touch signal is received through the display screen to control the display.
And if the control window is in a second state, displaying the content through the control window and receiving a touch signal. In this state, the control window can not only display content, but also receive a touch signal input by a user and perform full-screen control display on the display screen.
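To make the two window states more concrete, the following minimal Python sketch (not part of the original disclosure; the names ControlWindowState and route_input are illustrative assumptions) shows how input and display could be routed depending on whether the control window is hidden, in the first state, or in the second state.

```python
from enum import Enum, auto

class ControlWindowState(Enum):
    HIDDEN = auto()
    FIRST = auto()   # transition state: the window only mirrors screen content
    SECOND = auto()  # the window receives touch and drives the full screen

def route_input(state: ControlWindowState) -> dict:
    """Return which surface receives the touch signal and which surfaces display content."""
    if state is ControlWindowState.SECOND:
        return {"touch_sink": "control_window",
                "display": ["control_window", "display_screen_outside_window"]}
    if state is ControlWindowState.FIRST:
        return {"touch_sink": "display_screen",
                "display": ["display_screen", "control_window_mapped_content"]}
    return {"touch_sink": "display_screen", "display": ["display_screen"]}

if __name__ == "__main__":
    for state in ControlWindowState:
        print(state.name, route_input(state))
```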
In implementation, the display device in this embodiment detects the user position, determines the distance between the user and the display screen from that position, estimates the user's behavior, and judges whether the user is about to perform a touch operation on the display screen. When it is determined from the user position that a touch operation is imminent, a control window is displayed on the display screen and the content of the display screen is scaled and shown in the control window, so that the control window displays the full-screen mapped content. In practical application, the control window is displayed in advance, when the user approaches the display screen and before any touch operation is performed, so that whether the user needs to control the full screen through the control window can then be judged from the user's actual touch behavior. The specific judgment is based on the positional relationship between the touch position of the touch signal input by the user and the control window.
In some examples, if the touch position is located in the area where the control window is located, it is determined that the control window is in the second state, a touch function is configured for the control window, a touch signal is received through the control window, and the display screen is controlled by using the touch signal. It is easily understood that if the user needs to use the control window, the touch position of the touch signal input by the user must be located within the control window, otherwise the user is considered not to need to use the control window.
In some examples, in order to accurately determine the user's behavior, that is, whether the user is about to perform a touch operation on the display device, this embodiment determines the distance between the user and the display device from the user position and then judges whether a touch operation is imminent. In implementation, the distance between the user and the display device is determined according to the detected user position; if the distance is smaller than the distance threshold, display of the control window is triggered on the display screen of the display device.
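A minimal sketch of this distance-based trigger is shown below; the threshold value and the function name should_show_control_window are assumptions for illustration, since the disclosure does not fix a specific threshold.

```python
DISTANCE_THRESHOLD = 1.0  # metres; illustrative value, not specified in the disclosure

def should_show_control_window(user_distance: float,
                               threshold: float = DISTANCE_THRESHOLD) -> bool:
    """Trigger display of the control window (first state) when the detected
    distance between the user and the display device is below the threshold."""
    return user_distance < threshold

# Example: a user detected 0.6 m from the screen triggers the window; 2.5 m does not.
assert should_show_control_window(0.6) is True
assert should_show_control_window(2.5) is False
```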
In some examples, the display device in this embodiment may detect the user position through a camera assembly and/or a digital radar array, and determine the user position in any one or a combination of the following ways:
Mode 1: determining the user position according to depth information in a depth image of the user captured by the display device.
In implementation, the camera assembly includes, but is not limited to, a binocular camera mounted on the display device; the user position and the distance from the user to the display device (binocular camera) are determined using the depth information in the captured depth image containing the user, which provides an accurate basis for judging the user's behavior.
Mode 2: scanning with the digital radar array of the display device and determining the user position according to the scanning result.
In implementation, whether a user has been scanned is determined from the fluctuation information between the transmit beam and the receive beam of the digital radar array, the user position is determined, and the distance between the user and the display device is then obtained.
As shown in fig. 4A, a digital radar array 400 is installed on the display device for scanning the user; as shown in fig. 4B, L2 represents the position of the digital radar array 400 relative to the user, and the display device determines the user position from the fluctuation information between the transmit beam and the receive beam and thereby determines the distance between the user and the display device to be X. If the distance X is smaller than the distance threshold L1, it indicates that the user is about to perform a touch operation on the display device, and display of the control window is triggered on the display screen; if the distance X is greater than the distance threshold L1, it indicates that the user is not yet about to perform a touch operation, and the display screen remains in the default full-screen touch mode.
Mode 3: determining a first user position according to depth information in the depth image of the user captured by the display device, scanning with the digital radar array of the display device and determining a second user position according to the scanning result, and then weighting and summing the first user position and the second user position according to their respective weights to determine the final user position.
In this way, the display device determines the user positions according to the camera assembly and the digital radar array, and finally performs weighted summation on the user positions determined by the camera assembly and the digital radar array according to the corresponding weights, so as to determine the final user position.
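The weighted combination in mode 3 could be sketched as follows; the weight values are assumptions, since the disclosure only states that the two estimates are weighted and summed.

```python
def fuse_user_position(camera_pos, radar_pos, w_camera=0.5, w_radar=0.5):
    """Weighted sum of the user position estimated from the depth camera and the
    position estimated from the digital radar array (mode 3 above)."""
    if abs(w_camera + w_radar - 1.0) > 1e-9:
        raise ValueError("weights should sum to 1")
    return tuple(w_camera * c + w_radar * r for c, r in zip(camera_pos, radar_pos))

# Example: camera estimate (1.2, 0.8), radar estimate (1.0, 0.9), camera trusted more.
print(fuse_user_position((1.2, 0.8), (1.0, 0.9), w_camera=0.6, w_radar=0.4))
```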
In some examples, to improve the accuracy of determining the user's behavior, this embodiment further judges whether the user is about to perform a touch operation based on the user position, the positional relationship between the touch position and the control window, and the time at which the touch signal is received.
In some examples, the control window is closed if no operation on the control window is detected for more than a preset period of time, or if the difference between the detection time of the touch signal and the detection time of the user position is greater than a time threshold.
In implementation, after a touch signal of the user is received and before the positional relationship between the touch position of the touch signal and the control window is determined, the detection time of the touch signal is determined, and it is required that the difference between the detection time of the touch signal and the detection time of the user position is smaller than the time threshold. In implementation, the detection time is recorded through a clock in the display device, and the time threshold can be customized. By introducing the detection time as a judgment basis, this embodiment judges the user's operation behavior more accurately, provides more grounds for configuring and closing the touch function of the control window, and improves the user's seamless experience.
In some examples, the control window and the touch function configured for it are closed if the time interval between two adjacent detected touch signals input by the user is greater than a preset time. This prevents the control window from remaining displayed indefinitely after the user has stopped operating for a long time. With the time threshold, the control window is automatically closed after the user has not used it for touch operation for a long time, and the display returns to the full-screen control state.
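The two closing conditions above could look like the following sketch; the timeout values and the helper name should_close_window are illustrative assumptions.

```python
from typing import Optional

NO_TOUCH_TIMEOUT_S = 30.0           # illustrative preset period without any touch
POSITION_TO_TOUCH_MAX_GAP_S = 10.0  # illustrative time threshold between position and touch detection

def should_close_window(now: float,
                        position_detect_time: float,
                        touch_detect_time: Optional[float]) -> bool:
    """Close the control window when no touch arrives within the preset period,
    or when the gap between the touch detection time and the user-position
    detection time exceeds the time threshold."""
    if touch_detect_time is None:
        return now - position_detect_time > NO_TOUCH_TIMEOUT_S
    return touch_detect_time - position_detect_time > POSITION_TO_TOUCH_MAX_GAP_S
```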
In some examples, as shown in fig. 5, the position of the control window in this embodiment is not fixed, which improves the user experience and allows the user to move less when performing touch operations on a large-size display screen.
In some embodiments, the abscissa of the center of the control window is determined from the abscissa of the user position; and determining the display position of the control window on the display screen according to the abscissa of the center of the control window.
In practice, the position of the control window displayed on the display screen is determined by the following steps:
step 1) determining the abscissa of the center of the control window according to the abscissa of the user position;
step 2) determining a vertical coordinate of the center of the control window according to a preset vertical coordinate;
and 3) determining the display position of the control window on the display screen according to the abscissa and the ordinate.
The central abscissa X of the control window is aligned with the abscissa of the user position, and the central ordinate Y of the control window may be preset according to the average height of users or according to user requirements, which is not limited here.
It is easy to understand that the position of the control window in this embodiment can move along with the user, so the control window is always located directly in front of the user, reducing how much the user needs to move and making operation easier.
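A possible sketch of this placement rule is given below; clamping the window to the screen edge is an added assumption, not something stated in the disclosure.

```python
def control_window_position(user_x: float, screen_width: float,
                            window_width: float, window_height: float,
                            preset_center_y: float):
    """Centre the control window on the user's abscissa, with a preset ordinate
    (e.g. derived from average user height), and return its top-left corner."""
    half_w = window_width / 2.0
    center_x = min(max(user_x, half_w), screen_width - half_w)  # keep the window on screen
    return center_x - half_w, preset_center_y - window_height / 2.0

# Example: a 3840-px-wide screen, an 800x500 window, user standing near the right edge.
print(control_window_position(user_x=3700, screen_width=3840,
                              window_width=800, window_height=500,
                              preset_center_y=1400))
```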
In some embodiments, in order to prompt the user about whether to perform a touch operation in the control window, after the user position information is determined, if the distance between the user and the display screen is smaller than the distance threshold, at least part of the content on the display screen is scaled and then mapped to the control window for display.
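One way to realize this mapping is a simple linear coordinate transform between the control window and the full screen, sketched below under the assumption that the window shows a uniformly scaled copy of the whole screen; the function name map_window_to_screen is illustrative.

```python
def map_window_to_screen(touch_xy, window_rect, screen_size):
    """Map a touch point inside the control window to the corresponding point on
    the full display screen."""
    tx, ty = touch_xy
    wx, wy, ww, wh = window_rect   # window left, top, width, height
    sw, sh = screen_size           # screen width, height
    u = (tx - wx) / ww             # normalised position inside the window
    v = (ty - wy) / wh
    return u * sw, v * sh

# A touch at the centre of an 800x500 window maps to the centre of a 4K screen.
print(map_window_to_screen((500, 350), (100, 100, 800, 500), (3840, 2160)))
```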
In some embodiments, the transparency of the content mapped in the control window is progressively reduced during the transition of the control window from the first state to the second state.
In some embodiments, the control window may change its transparency level by level based on the following:
if at least part of the content on the display screen is scaled and mapped to the control window for display according to the detected user position, the mapped content is displayed with a first-level transparency; or, alternatively,
if the touch position is located in the area where the control window is located, the mapped content is displayed with a second-level transparency, where the second-level transparency is lower than the first-level transparency; or, alternatively,
if the touch position is located in the area where the control window is located and the touch content matches the display content, the mapped content is displayed in a non-transparent manner.
In specific implementation, when it is only estimated from the user position that the user is about to perform a touch operation on the display screen, the content mapped in the control window can be displayed in a transparent manner, so that the user is reminded that the control window can be used for touch display without the display screen content being completely blocked. Optionally, the control window itself may also be displayed in a progressively transparent manner. It should be noted that the displayed control window is in the first state at this time: it only has a display function and does not receive the touch signal. If the user needs the control window, the user will inevitably perform a touch operation inside the transparently displayed control window; if the user performs a touch operation outside the control window, it is considered that the user does not need the control window, and the transparently displayed control window is closed.
Further, as shown in fig. 6A, if it is detected that the distance L between the user position and the display device is smaller than the distance threshold, the content of the control window is displayed with a first level of transparency. As shown in fig. 6B, if it is then detected that the user moves closer to the display device and that the input touch signal is located in the area of the control window, the content of the control window is displayed with a second level of transparency, which is lower than the first level. It is easy to understand that the more judgment conditions the user behavior satisfies, the more reliable the judgment result, and the lower the transparency of the control window content is adjusted; once it is finally determined that the user will use the control window, as shown in fig. 6C, the control window is displayed completely in a non-transparent manner.
In some embodiments, if it is detected that the distance between the user position and the display device is smaller than the distance threshold, the content of the control window is displayed with the first-level transparency; if the user then moves forward and inputs a touch signal, the difference between the detection time of the touch signal and the detection time of the user position is smaller than the time threshold, and the touch position of the touch signal is located in the area where the control window is located, the mapped content is displayed with the set second-level transparency. After it is determined that the display screen is to be controlled through the control window, the mapped content in the control window is displayed in a non-transparent manner. These display modes with different transparencies remind the user and progressively emphasize the content of the control window, improving the user experience.
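The staged transparency can be summarized as a small lookup, sketched below; the alpha values are illustrative assumptions, since the disclosure only orders the levels relative to one another.

```python
def window_content_alpha(distance_below_threshold: bool,
                         touch_in_window: bool,
                         content_matches: bool) -> float:
    """Alpha of the mapped content in the control window (0 = invisible, 1 = opaque)."""
    if distance_below_threshold and touch_in_window and content_matches:
        return 1.0   # non-transparent: the second state is confirmed
    if distance_below_threshold and touch_in_window:
        return 0.7   # second-level transparency (less transparent than level one)
    if distance_below_threshold:
        return 0.4   # first-level transparency
    return 0.0       # control window not shown
```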
In some examples, this embodiment further controls the control window to enter the second state, that is, determines that a touch function is configured for the control window, according to both the detected position relationship between the touch position of the user operation and the control window and the touch content generated within a first range centered on the touch position. After it is determined that the touch position of the user's touch signal is located in the area where the control window is located, in order to avoid misjudgment, this embodiment further determines, according to the specific touch content input by the user, whether the user is using the control window for touch operation, so as to improve the accuracy of the determination.
In some examples, if the touch content generated at the touch position matches the display content on the display screen, it is determined that the touch signal is received through the control window, and the content corresponding to the touch signal is synchronously displayed on the display screen. The determination combining the touch position and the touch content is performed through the following steps:
step 2-1) if the touch position is located in the area of the control window, determining a display position corresponding to the touch position on the display screen;
step 2-2) if the touch content matches the display content displayed on the display screen within a second range centered on the display position, controlling the control window to be in the second state.
It should be noted that, in this embodiment, the touch content corresponding to the touch position refers to the touch content input by the user within a first range centered on the touch position, and the display content corresponding to the display position refers to the content displayed on the display screen within a second range centered on the display position.
In implementation, while it has not yet been determined whether the user is using the control window, touch content received in the control window is still displayed synchronously on the display screen. Whether a touch function is configured for the control window is then further judged according to whether the touch content corresponding to the touch position matches the display content. For accuracy of the judgment, the touch content corresponding to the touch position includes all the touch content input by the user within the preset area, and the display content is the corresponding content displayed within the preset area: the touch content specifically refers to what the user inputs in the control window, and the display content specifically refers to what is displayed on the display screen, including content that is displayed on the display screen after being enlarged from the user's input in the control window. For example, as shown in fig. 7A, the user inputs "do" at the touch position and "do" is displayed at the display position; if it is further determined that the touch content "do you do" corresponding to the touch position does not match the display content "do" corresponding to the display position, then, as shown in fig. 7B, it is determined that the user does not need the control window and is actually performing a touch operation on the display screen, so the control window is closed and "do" remains displayed at the display position. As shown in fig. 7C, the user inputs "do" at the touch position and "do" is displayed at the display position; if it is further determined that the touch content "do you do" corresponding to the touch position matches the display content "do you do" corresponding to the display position, then, as shown in fig. 7D, it is determined that the user needs the control window, a touch function is configured for the control window, and the control window may also be displayed in a non-transparent manner.
It should be noted that, if there is no touch content input by the user on the display screen before the touch signal is detected, whether to configure the touch function for the control window is determined directly from the position relationship between the touch position of the touch signal and the control window; in that case it is only necessary to judge whether the touch position falls within the area of the control window, and no matching between the touch content and the display content is required.
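For illustration, the combined position-and-content judgment of steps 2-1) and 2-2), including the shortcut used when the screen holds no earlier user input, can be sketched as follows; the function and parameter names are hypothetical:

```python
def decide_window_state(touch_pos, window_rect, touch_content,
                        display_content_near, had_prior_touch_content):
    """Decide whether the control window should enter the second state
    (i.e. be configured with a touch function).

    touch_pos               -- (x, y) of the touch signal
    window_rect             -- (x, y, w, h) of the control window on screen
    touch_content           -- content the user entered within the first range
    display_content_near    -- display content within the second range around
                               the display position mapped from touch_pos
    had_prior_touch_content -- False if the screen held no user input before
                               this touch signal was detected
    """
    x, y, w, h = window_rect
    in_window = x <= touch_pos[0] <= x + w and y <= touch_pos[1] <= y + h
    if not in_window:
        return "close_window"
    if not had_prior_touch_content:
        # position alone is enough; no content matching is required
        return "second_state"
    if touch_content == display_content_near:
        return "second_state"
    return "close_window"
```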
In some embodiments, the touch content is determined to match the display content by:
and determining that the touch content matches the display content according to at least one of the size and the data type of the touch content and the display content. It should be noted that, since the content in the control window in this embodiment is obtained by scaling the content of the display screen, an exact matching basis is available when matching content, and sizes can be matched in equal proportion. For example, if the ratio of the display content to the content in the control window is 5:1, the size of the touch content is enlarged by 5 times and then compared with the size of the display content for equality.
Optionally, the data types in this embodiment include, but are not limited to: at least one of text, graphics, and tables.
In some embodiments, after the touch content is enlarged according to a preset rule or the display content is reduced according to a preset rule, the touch content is determined to be matched with the display content according to the sizes of the touch content and the display content.
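For illustration, a minimal sketch of the size and data-type matching with equal-proportion scaling described above; the attribute names and the 5% tolerance are assumptions made for the example, not taken from the disclosure:

```python
import math

def contents_match(touch_content, display_content, scale=5):
    """Match touch content against display content by data type and size.

    The control window shows the screen content scaled down, so sizes are
    compared in equal proportion: the touch content size is enlarged by
    `scale` (e.g. 5 when the display-to-window ratio is 5:1) before being
    compared with the display content size. The objects are assumed to carry
    `width`, `height` and `kind` ("text", "graphic" or "table") attributes.
    """
    if touch_content.kind != display_content.kind:
        return False
    return (math.isclose(touch_content.width * scale, display_content.width, rel_tol=0.05)
            and math.isclose(touch_content.height * scale, display_content.height, rel_tol=0.05))
```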
In some embodiments, in addition to matching the touch content against the display content, in order to determine the user's touch behavior more accurately, this embodiment further judges the touch behavior according to whether the touch content matches the content originally displayed on the display screen around the touch position, which is specifically implemented as follows:
step 3-1) if the touch position is located in the area where the control window is located, obtaining the content displayed on the display screen within a third range centered on the touch position;
it should be noted that, in this embodiment, the touch content generated in the first range of the touch position includes: touch content input by a user in a first range with the touch position as a center; the display content comprises the following steps: and displaying content on the display screen within a third range taking the touch position as a center. The display content represents the content originally displayed on the display screen, that is, the display content includes the content originally displayed on the display screen and just lies in the control window, that is, the display content is the content displayed on the display screen behind the control window without being blocked.
step 3-2) if the touch content matches this originally displayed content, closing the control window together with the touch function configured for it, and deleting the display content at the display position corresponding to the touch position.
In implementation, as shown in fig. 8A, the user inputs "do" at the touch position and "do" is displayed at the display position; if it is further determined that the touch content "do you do" corresponding to the touch position matches the originally displayed content "do you do", then, as shown in fig. 8B, it is determined that the user does not need the control window and is actually performing a touch operation on the display screen, so the control window is closed and "do" is displayed at the display position. As shown in fig. 8C, the user inputs "do" at the touch position and "do" is displayed at the display position; if it is further determined that the touch content "do you do" does not match the originally displayed content, which here is whatever the display screen shows behind the control window and is in fact nothing but the screen background, then, as shown in fig. 8D, it is determined that the user needs the control window, a touch function is configured for the control window, and the control window may also be displayed in a non-transparent manner, including displaying the content and/or the background of the control window non-transparently.
In some embodiments, after the touch content is enlarged according to a preset rule or the display content is reduced according to a preset rule, the touch content is determined to be matched with the display content according to the sizes of the touch content and the display content.
In this embodiment, the touch content input by the user in the control window is matched against content from two different sources: the display content mapped from the control window and the content originally displayed on the display screen. If the touch content input in the control window matches the mapped display content, it indicates that the user intends to use the control window for touch display, and a touch function is configured for the control window; if the touch content matches the originally displayed content, it indicates that the user is simply performing a touch operation on the display screen that happens to fall inside the control window, so the display content at the display position corresponding to the touch position is deleted and the control window is closed. Adding this judgment basis further avoids misjudgment, improves the accuracy of the determination, and provides an accurate and imperceptible interactive experience.
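For illustration, the two matching checks can be combined into one decision, as in the following sketch; the parameter names are hypothetical, and `underlying_screen_content` stands for the content originally displayed behind the control window:

```python
def handle_touch_in_window(touch_content, mapped_display_content,
                           underlying_screen_content):
    """Distinguish 'user writes through the control window' from
    'user writes directly on the screen and just happens to hit the window'.

    mapped_display_content    -- content shown in the second range around the
                                 display position (the screen copy of the window)
    underlying_screen_content -- content originally shown on the screen in the
                                 third range around the touch position, i.e.
                                 what lies behind the control window
    """
    if touch_content == mapped_display_content:
        # input made through the window: give the window the touch function
        return "configure_touch_function"
    if touch_content == underlying_screen_content:
        # input was really aimed at the screen itself: remove the window and
        # the duplicated content at the mapped display position
        return "close_window_and_delete_mirrored_content"
    return "keep_waiting"
```

A match with the mirrored window content enables the window's touch function, while a match with the underlying screen content closes the window and removes the duplicated content, which are exactly the two outcomes of fig. 7 and fig. 8.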
This embodiment further provides a detailed intelligent touch method. The distance between the user and the display device is determined from the detected user position; when the distance is smaller than the distance threshold, a control window is displayed according to the first-level transparency at the position of the display screen corresponding to the user position, and the content of the display screen is scaled in equal proportion and mapped into the control window for display. If a touch signal input by the user is then detected and the difference between the detection time of the touch signal and the detection time of the user position is smaller than the time threshold, the position relationship between the touch position of the touch signal and the control window is determined. If the touch position is located in the area where the control window is located, the matching relationship of the touch content corresponding to the touch position is further determined: if the touch content matches the mapped display content, it indicates that the user needs to use the control window, and a touch function is configured for the control window; if the touch content matches the content originally displayed on the display screen, it indicates that the user is performing a touch operation directly on the display screen, so the control window and the touch function configured for it are closed and the display content at the display position corresponding to the touch position is deleted. It should be noted that, if there is no touch content input by the user on the display screen before the touch signal is received, it is only necessary to determine the position relationship between the touch position of the touch signal and the control window in order to decide whether to configure the touch function for the control window and display it non-transparently; no matching of touch content is required. Through this process, the control window is opened and closed imperceptibly while the user operates the display screen, without any extra steps: opening, display, touch-function configuration and closing of the control window are all driven by the user's behavior, touch position and touch content, providing an efficient and convenient user experience.
As shown in fig. 9, the implementation flow of the above method is specifically as follows:
step 900, after start-up, by default the display screen receives touch signals and controls display;
step 901, detecting the position of a user;
step 902, judging whether the distance between the detected user position and the display device is smaller than a distance threshold, if so, executing step 903, otherwise, executing step 911;
and 903, triggering a display control window on the display screen, scaling the content on the display screen according to the first-level transparency in an equal proportion, and displaying the content on the control window.
Step 904, detecting a touch signal input by a user;
step 905, judging whether the difference value between the detection time of the touch signal and the detection time of the user position is smaller than a time threshold, if so, executing step 906, otherwise, executing step 911;
step 906, determining whether the touch position of the touch signal is located in the area where the control window is located, if so, executing step 907, otherwise, executing step 911;
step 907, displaying the content in the control window according to the set second-level transparency, the second-level transparency being lower than the first-level transparency;
step 908, determining whether the touch content corresponding to the touch position matches the display content corresponding to the display position, or matches the content originally displayed on the display screen around the touch position; if the touch content matches the display content corresponding to the display position, executing step 909, otherwise executing step 911.
Here, the display content corresponding to the display position refers to the content displayed on the display screen within a second range centered on the display position.
The originally displayed content refers to the content displayed on the display screen within a third range centered on the touch position.
step 909, configuring a touch function for the control window, and displaying the content in the control window in a non-transparent manner;
step 910, determining whether a time interval between two adjacent detected touch signals input by the user is greater than a preset time, if so, executing step 911, otherwise, executing step 906.
step 911, closing the control window, deleting the display content at the display position corresponding to the touch position, and returning to the default mode in which the display screen receives touch signals and controls display.
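For illustration, the whole flow of fig. 9 can be rendered as a single loop; `device` and all of its methods below are hypothetical helpers that simply mirror the numbered steps, not a real API:

```python
def smart_touch_loop(device, distance_threshold, time_threshold, idle_timeout):
    """Illustrative rendering of the fig. 9 flow (steps 900-911)."""
    device.route_touch_to_screen()                               # step 900
    while True:
        pos, pos_time = device.detect_user_position()            # step 901
        if device.distance_to(pos) >= distance_threshold:        # step 902
            device.close_control_window()                        # step 911
            continue
        device.show_control_window(pos, transparency="level-1")  # step 903
        touch, touch_time = device.detect_touch_signal()         # step 904
        if touch_time - pos_time >= time_threshold:              # step 905
            device.close_control_window()                        # step 911
            continue
        # steps 906-910: keep serving touches through the window until two
        # consecutive touch signals are separated by more than idle_timeout
        while device.in_control_window(touch.position):               # step 906
            device.set_window_transparency("level-2")                 # step 907
            if not device.touch_content_matches_display(touch):       # step 908
                break
            device.configure_window_touch_function()                  # step 909
            touch, gap = device.wait_next_touch()                      # step 910
            if gap > idle_timeout:
                break
        # step 911: close the window, delete the mirrored content at the
        # display position, and hand control back to the display screen
        device.close_control_window()
```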
Based on the same inventive concept, the embodiment of the present disclosure further provides an intelligent touch method. Since the method corresponds to the device in the embodiment of the present disclosure and solves the problem on a similar principle, the implementation of the method may refer to the implementation of the device, and repeated details are not described again.
As shown in fig. 10, an implementation flow of the method for intelligent touch control provided in this embodiment is as follows:
step 1000, according to the detected user position, triggering display of a control window on a display screen of the display device, and controlling the control window to be in a first state, wherein the first state represents a transition state of judging whether to switch to a second state or not according to a received touch signal input by a user;
step 1001, controlling the control window to be switched from a first state to a second state according to the detected position relation between the touch position of the touch signal input by the user and the control window;
step 1002, determining that the control window is in a second state, and controlling the display screen through the control window.
As an optional implementation, the triggering, according to the detected user position, a display control window on a display screen of a display device includes:
determining the distance between the user and the display device according to the detected position of the user;
and if the distance is smaller than the distance threshold, triggering the display of the control window on the display screen of the display equipment.
As an optional implementation, the triggering display of the control window on the display screen of the display device includes:
determining the abscissa of the center of the control window according to the abscissa of the user position;
and determining the display position of the control window on the display screen according to the abscissa of the center of the control window.
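For illustration, a minimal sketch of deriving the window's display position from the abscissa of the user position; the clamping and the bottom-edge placement are assumptions made for the example, since the disclosure only requires that the window centre's abscissa follow the user's abscissa:

```python
def control_window_origin(user_x, screen_w, screen_h, win_w, win_h, bottom_margin=100):
    """Place the control window so that its horizontal centre follows user_x.

    The centre is clamped so the window stays fully on screen; the vertical
    placement near the bottom edge is an illustrative choice only.
    """
    center_x = min(max(user_x, win_w / 2), screen_w - win_w / 2)
    origin_x = center_x - win_w / 2
    origin_y = screen_h - win_h - bottom_margin
    return origin_x, origin_y
```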
As an optional implementation manner, the controlling the control window to be in the second state according to the detected position relationship between the touch position of the touch signal input by the user and the control window includes:
and if the touch position is located in the area of the control window, controlling the control window to be in a second state.
As an optional implementation manner, controlling the control window to be in the second state according to the detected position relationship between the touch position of the touch signal input by the user and the control window includes:
and controlling the control window to be in a second state according to the detected position relation between the touch position operated by the user and the control window and the touch content generated in a first range taking the touch position as the center.
As an optional implementation manner, the controlling the control window to be in the second state according to the detected position relationship between the touch position operated by the user and the control window, and the touch content generated in the first range with the touch position as the center includes:
if the touch position is located in the area where the control window is located, determining a display position corresponding to the touch position on the display screen, and if the touch content matches the display content displayed on the display screen within a second range centered on the display position, controlling the control window to be in the second state; or
if the touch position is located in the area where the control window is located, obtaining the content originally displayed on the display screen within a third range centered on the touch position, and if the touch content generated in the range centered on the touch position matches that originally displayed content, closing the control window and deleting the content displayed at the display position corresponding to the touch position.
As an alternative implementation, the content match is determined by:
determining that the touch content matches the display content according to at least one of the size and the data type of the touch content and the display content; or
determining that the touch content matches the originally displayed content according to at least one of the size and the data type of the touch content and the originally displayed content.
As an alternative implementation, the content match is determined by:
after the touch content is enlarged according to a preset rule or the display content is reduced according to a preset rule, determining that the touch content matches the display content according to the sizes of the two; or
after the touch content is enlarged according to a preset rule or the originally displayed content is reduced according to a preset rule, determining that the touch content matches the originally displayed content according to the sizes of the two.
As an optional implementation, the triggering, according to the detected user position, display of a control window on a display screen of the display device further includes:
and mapping at least part of content on the display screen to the control window for displaying after zooming.
As an optional implementation manner, the mapping at least part of the content on the display screen to the control window for displaying after zooming further includes:
and controlling the transparency of the content mapped in the control window to be gradually reduced in the process of converting the control window from the first state to the second state.
As an optional implementation, the controlling the transparency of the content mapped in the control window to be gradually reduced includes:
if at least part of the content on the display screen is zoomed and mapped into the control window for display according to the detected user position, displaying the mapped content according to the first-level transparency; or
if the touch position is located in the area where the control window is located, displaying the mapped content according to the second-level transparency, wherein the second-level transparency is lower than the first-level transparency; or
if the touch position is located in the area where the control window is located and the touch content matches the display content, displaying the mapped content in a non-transparent manner.
As an optional implementation, after triggering the display of the control window on the display screen of the display device, the method further includes:
if the touch signal is not detected within a preset time period, closing the control window; or
if the difference between the detection time of the touch signal and the detection time of the user position is greater than a time threshold, closing the control window.
As an alternative embodiment, the user position is determined by:
determining the position of the user according to depth information in a depth image of the user captured by the display device; or
scanning with the digital radar array of the display device, and determining the position of the user according to the scanning result.
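For illustration, the depth-image branch could look roughly like the following; the nearest-blob heuristic and all names are assumptions, since the disclosure only requires that the user position be derived from the depth information:

```python
import numpy as np

def user_position_from_depth(depth_image, fx, cx, max_range_m=3.0):
    """Rough sketch: estimate the user's horizontal position and distance from
    a depth image (in metres), assuming the nearest valid region is the user.

    fx, cx -- camera focal length and principal-point abscissa in pixels.
    """
    valid = (depth_image > 0) & (depth_image < max_range_m)
    if not valid.any():
        return None
    distance = float(depth_image[valid].min())          # nearest valid depth
    near_user = valid & (depth_image < distance + 0.3)  # pixels close to that depth
    u = float(np.argwhere(near_user)[:, 1].mean())      # mean pixel column of the region
    x = (u - cx) * distance / fx                        # back-project the abscissa
    return x, distance
```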
As an optional implementation, the method further includes:
if the control window is in a first state, displaying content through the control window, and receiving a touch signal through the display screen;
and if the control window is in the second state, receiving a touch signal through the control window, and displaying the content through the area outside the control window.
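For illustration, the state-dependent routing of touch input and display output can be sketched as follows; `event.kind` and the return labels are hypothetical:

```python
def route_event(window_state, event):
    """Route input and drawing according to the control-window state.

    In the first state the window only displays (touches go to the screen);
    in the second state the window receives touches and the rest of the
    screen keeps displaying.
    """
    if event.kind == "touch":
        return "control_window" if window_state == "second" else "display_screen"
    if event.kind == "draw":
        return "control_window" if window_state == "first" else "screen_outside_window"
    raise ValueError(f"unknown event kind: {event.kind}")
```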
Based on the same inventive concept, the embodiment of the present disclosure further provides an intelligent touch apparatus. Since the apparatus corresponds to the method in the embodiment of the present disclosure and solves the problem on a similar principle, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not described again.
As shown in fig. 11, the apparatus includes:
the display unit 1100 is configured to trigger display of a control window on a display screen of the display device according to the detected user position, and control the control window to be in a first state, where the first state represents a transition state of determining whether to transition to a second state according to a received touch signal input by a user;
a conversion unit 1101, configured to control the control window to switch from a first state to a second state according to a detected positional relationship between a touch position of a touch signal input by a user and the control window;
the control unit 1102 is configured to determine that the control window is in the second state, and control the display screen through the control window.
As an optional implementation manner, the display unit 1100 is specifically configured to:
determining the distance between the user and the display device according to the detected position of the user;
and if the distance is smaller than the distance threshold, triggering the display of the control window on the display screen of the display equipment.
As an optional implementation manner, the display unit 1100 is specifically configured to:
determining the abscissa of the center of the control window according to the abscissa of the user position;
and determining the display position of the control window on the display screen according to the abscissa of the center of the control window.
As an optional implementation manner, the conversion unit 1101 is specifically configured to:
and if the touch position is located in the area of the control window, controlling the control window to be in a second state.
As an optional implementation manner, the conversion unit 1101 is specifically configured to:
and controlling the control window to be in a second state according to the detected position relation between the touch position operated by the user and the control window and the touch content generated in a first range taking the touch position as the center.
As an optional implementation manner, the conversion unit 1101 is specifically configured to:
if the touch position is located in the area where the control window is located, determining a display position corresponding to the touch position on the display screen, and if the touch content matches the display content displayed on the display screen within a second range centered on the display position, controlling the control window to be in the second state; or
if the touch position is located in the area where the control window is located, obtaining the content originally displayed on the display screen within a third range centered on the touch position, and if the touch content generated in the range centered on the touch position matches that originally displayed content, closing the control window and deleting the content displayed at the display position corresponding to the touch position.
As an optional implementation manner, the conversion unit 1101 is specifically configured to:
determining that the touch content matches the display content according to at least one of the size and the data type of the touch content and the display content; or
determining that the touch content matches the originally displayed content according to at least one of the size and the data type of the touch content and the originally displayed content.
As an optional implementation manner, the conversion unit 1101 is specifically configured to:
after the touch content is enlarged according to a preset rule or the display content is reduced according to a preset rule, determining that the touch content matches the display content according to the sizes of the two; or
after the touch content is enlarged according to a preset rule or the originally displayed content is reduced according to a preset rule, determining that the touch content matches the originally displayed content according to the sizes of the two.
As an optional implementation manner, the display unit is further specifically configured to:
and mapping at least part of content on the display screen to the control window for displaying after zooming.
As an optional implementation manner, the display unit 1100 is further specifically configured to:
and controlling the transparency of the content mapped in the control window to be gradually reduced in the process of converting the control window from the first state to the second state.
As an optional implementation manner, the display unit 1100 is further specifically configured to:
if at least part of the content on the display screen is zoomed and mapped into the control window for display according to the detected user position, displaying the mapped content according to the first-level transparency; or
if the touch position is located in the area where the control window is located, displaying the mapped content according to the second-level transparency, wherein the second-level transparency is lower than the first-level transparency; or
if the touch position is located in the area where the control window is located and the touch content matches the display content, displaying the mapped content in a non-transparent manner.
As an optional implementation manner, the apparatus further includes a closing unit which, after display of the control window is triggered on the display screen of the display device, is specifically configured to:
if the touch signal is not detected within a preset time period, closing the control window; or
if the difference between the detection time of the touch signal and the detection time of the user position is greater than a time threshold, closing the control window.
As an optional implementation manner, the display unit 1100 is specifically configured to determine the user position by:
determining the position of the user according to depth information in a depth image of the user captured by the display device; or
scanning with the digital radar array of the display device, and determining the position of the user according to the scanning result.
As an optional implementation manner, the apparatus further includes a determining unit specifically configured to:
if the control window is in a first state, displaying content through the control window, and receiving a touch signal through the display screen;
and if the control window is in the second state, receiving a touch signal through the control window, and displaying the content through the area outside the control window.
Based on the same inventive concept, the disclosed embodiments also provide a computer storage medium having a computer program stored thereon, which when executed by a processor, implements the steps of:
triggering display of a control window on a display screen of the display equipment according to the detected user position, and controlling the control window to be in a first state, wherein the first state represents a transition state of judging whether to switch to a second state or not according to a received touch signal input by a user;
controlling the control window to be switched from a first state to a second state according to the detected position relation between the touch position of the touch signal input by the user and the control window;
and determining that the control window is in a second state, and controlling the display screen through the control window.
As will be appreciated by one skilled in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications can be made in the present disclosure without departing from the spirit and scope of the disclosure. Thus, if such modifications and variations of the present disclosure fall within the scope of the claims of the present disclosure and their equivalents, the present disclosure is intended to include such modifications and variations as well.

Claims (22)

1. A display device, comprising a display screen, a touch assembly and a controller, wherein:
the display screen is used for displaying content;
the touch assembly is used for receiving a touch signal input by a user;
the controller is configured to perform:
triggering display of a control window on a display screen of the display equipment according to the detected user position, and controlling the control window to be in a first state, wherein the first state represents a transition state of judging whether to switch to a second state or not according to a received touch signal input by a user;
controlling the control window to be switched from a first state to a second state according to the detected position relation between the touch position of the touch signal input by the user and the control window;
and determining that the control window is in a second state, and controlling the display screen through the control window.
2. The display device of claim 1, wherein the processor is configured to perform:
determining the distance between the user and the display device according to the detected position of the user;
and if the distance is smaller than the distance threshold, triggering the display of the control window on the display screen of the display equipment.
3. The display device of claim 1, wherein the processor is configured to perform:
determining the abscissa of the center of the control window according to the abscissa of the user position;
and determining the display position of the control window on the display screen according to the abscissa of the center of the control window.
4. The display device of claim 1, wherein the processor is configured to perform:
and if the touch position is located in the area of the control window, controlling the control window to be in a second state.
5. The display device of claim 1, wherein the processor is configured to perform:
and controlling the control window to be in a second state according to the detected position relation between the touch position operated by the user and the control window and the touch content generated in a first range taking the touch position as the center.
6. The display device of claim 5, wherein the processor is configured to perform:
if the touch position is located in the area where the control window is located, determining a display position corresponding to the touch position on the display screen, and if the touch content matches the display content displayed on the display screen within a second range centered on the display position, controlling the control window to be in the second state; or
if the touch position is located in the area where the control window is located, obtaining the content originally displayed on the display screen within a third range centered on the touch position, and if the touch content generated in the range centered on the touch position matches that originally displayed content, closing the control window and deleting the content displayed at the display position corresponding to the touch position.
7. The display device of claim 6, wherein the processor is configured to:
determining that the touch content matches the display content according to at least one of the size and the data type of the touch content and the display content; or
determining that the touch content matches the originally displayed content according to at least one of the size and the data type of the touch content and the originally displayed content.
8. The display device of claim 6, wherein the processor is configured to perform:
after the touch content is enlarged according to a preset rule or the display content is reduced according to a preset rule, determining that the touch content matches the display content according to the sizes of the two; or
after the touch content is enlarged according to a preset rule or the originally displayed content is reduced according to a preset rule, determining that the touch content matches the originally displayed content according to the sizes of the two.
9. The display device according to any one of claims 1 to 8, wherein, in triggering display of the control window on the display screen of the display device according to the detected user position, the processor is further configured to perform:
and mapping at least part of content on the display screen to the control window for displaying after zooming.
10. The display device of claim 9, wherein the processor is further configured to perform:
and controlling the transparency of the content mapped in the control window to be gradually reduced in the process of converting the control window from the first state to the second state.
11. The display device of claim 10, wherein the processor is configured to perform:
if at least part of the content on the display screen is zoomed and mapped into the control window for display according to the detected user position, displaying the mapped content according to the first-level transparency; or
if the touch position is located in the area where the control window is located, displaying the mapped content according to the second-level transparency, wherein the second-level transparency is lower than the first-level transparency; or
if the touch position is located in the area where the control window is located and the touch content matches the display content, displaying the mapped content in a non-transparent manner.
12. The display device of any one of claims 1 to 8 and 10 to 11, wherein, after triggering display of a control window on a display screen of the display device, the processor is further specifically configured to perform:
if the touch signal is not detected within a preset time period, closing the control window; or
if the difference between the detection time of the touch signal and the detection time of the user position is greater than a time threshold, closing the control window.
13. The display device of any of claims 1-8, 10-11, wherein the processor is configured to determine the user location by:
determining the position of the user according to depth information in a depth image of the user captured by the display device; or
scanning with the digital radar array of the display device, and determining the position of the user according to the scanning result.
14. The display device of any of claims 1-8, 10-11, wherein the processor is further specifically configured to:
if the control window is in a first state, displaying content through the control window, and receiving a touch signal through the display screen;
and if the control window is in the second state, receiving a touch signal through the control window, and displaying the content through the area outside the control window.
15. An intelligent touch method, wherein the method comprises:
triggering display of a control window on a display screen of the display equipment according to the detected user position, and controlling the control window to be in a first state, wherein the first state represents a transition state of judging whether to switch to a second state or not according to a received touch signal input by a user;
controlling the control window to be switched from a first state to a second state according to the detected position relation between the touch position of the touch signal input by the user and the control window;
and determining that the control window is in a second state, and controlling the display screen through the control window.
16. The method of claim 15, wherein the triggering display of a control window on a display screen of the display device in accordance with the detected user position comprises:
determining the distance between the user and the display device according to the detected position of the user;
and if the distance is smaller than the distance threshold, triggering the display of the control window on the display screen of the display equipment.
17. The method of claim 15, wherein the position of the control window is determined by:
determining the abscissa of the center of the control window according to the abscissa of the user position;
and determining the display position of the control window on the display screen according to the abscissa of the center of the control window.
18. The method of claim 15, wherein the controlling the control window to be in the second state according to the detected position relationship between the touch position of the touch signal input by the user and the control window comprises:
and if the touch position is located in the area of the control window, controlling the control window to be in a second state.
19. The method of claim 15, wherein the controlling the control window to be in the second state according to the detected position relationship between the touch position of the touch signal input by the user and the control window comprises:
and controlling the control window to be in a second state according to the detected position relation between the touch position operated by the user and the control window and the touch content generated in a first range taking the touch position as the center.
20. The method of any of claims 15 to 19, wherein the triggering of the display of the control window on the display screen of the display device in dependence on the detected user position further comprises:
and mapping at least part of content on the display screen to the control window for displaying after zooming.
21. The method of claim 20, wherein the mapping of at least part of the content on the display screen to the control window for display after zooming further comprises:
and controlling the transparency of the content mapped in the control window to be gradually reduced in the process of converting the control window from the first state to the second state.
22. A computer storage medium having a computer program stored thereon, wherein the program when executed by a processor implements the steps of the method of any of claims 15 to 21.
CN202111016337.8A 2021-08-31 2021-08-31 Display device and intelligent touch method thereof Pending CN113703640A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111016337.8A CN113703640A (en) 2021-08-31 2021-08-31 Display device and intelligent touch method thereof
PCT/CN2022/108148 WO2023029822A1 (en) 2021-08-31 2022-07-27 Display device and intelligent touch-control method therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111016337.8A CN113703640A (en) 2021-08-31 2021-08-31 Display device and intelligent touch method thereof

Publications (1)

Publication Number Publication Date
CN113703640A true CN113703640A (en) 2021-11-26

Family

ID=78658329

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111016337.8A Pending CN113703640A (en) 2021-08-31 2021-08-31 Display device and intelligent touch method thereof

Country Status (2)

Country Link
CN (1) CN113703640A (en)
WO (1) WO2023029822A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120113151A1 (en) * 2010-11-08 2012-05-10 Shinichi Nakano Display apparatus and display method
CN111722781A (en) * 2020-06-22 2020-09-29 京东方科技集团股份有限公司 Intelligent interaction method and device and storage medium
CN111897463A (en) * 2020-07-29 2020-11-06 海信视像科技股份有限公司 Screen interface interactive display method and display equipment
CN111913621A (en) * 2020-07-29 2020-11-10 海信视像科技股份有限公司 Screen interface interactive display method and display equipment
WO2023029822A1 (en) * 2021-08-31 2023-03-09 京东方科技集团股份有限公司 Display device and intelligent touch-control method therefor

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2624116B1 (en) * 2012-02-03 2017-09-06 EchoStar Technologies L.L.C. Display zoom controlled by proximity detection
CN111913622B (en) * 2020-07-29 2022-04-19 海信视像科技股份有限公司 Screen interface interactive display method and display equipment
CN112346639B (en) * 2020-11-04 2023-01-10 北京小米移动软件有限公司 Method, device and equipment for displaying application interface and storage medium
CN112923653A (en) * 2021-03-01 2021-06-08 合肥美菱物联科技有限公司 Refrigerator intelligent control system and method based on position and distance analysis

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120113151A1 (en) * 2010-11-08 2012-05-10 Shinichi Nakano Display apparatus and display method
CN102467345A (en) * 2010-11-08 2012-05-23 夏普株式会社 Display apparatus and display method
CN111722781A (en) * 2020-06-22 2020-09-29 京东方科技集团股份有限公司 Intelligent interaction method and device and storage medium
CN111897463A (en) * 2020-07-29 2020-11-06 海信视像科技股份有限公司 Screen interface interactive display method and display equipment
CN111913621A (en) * 2020-07-29 2020-11-10 海信视像科技股份有限公司 Screen interface interactive display method and display equipment
WO2023029822A1 (en) * 2021-08-31 2023-03-09 京东方科技集团股份有限公司 Display device and intelligent touch-control method therefor

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023029822A1 (en) * 2021-08-31 2023-03-09 京东方科技集团股份有限公司 Display device and intelligent touch-control method therefor

Also Published As

Publication number Publication date
WO2023029822A1 (en) 2023-03-09

Similar Documents

Publication Publication Date Title
US10551987B2 (en) Multiple screen mode in mobile terminal
US20170347153A1 (en) Method of zooming video images and mobile terminal
CN103186345B (en) The section system of selection of a kind of literary composition and device
US10275151B2 (en) Apparatus and method for cursor control and text selection and editing based on gesture-based touch inputs received in a virtual keyboard display area
CN110531920B (en) Display method and device of sidebar, terminal and storage medium
CN106415472B (en) Gesture control method and device, terminal equipment and storage medium
CN108958627B (en) Touch operation method and device, storage medium and electronic equipment
US8106883B2 (en) Mobile terminal and method for moving a cursor and executing a menu function using a navigation key
KR101929316B1 (en) Method and apparatus for displaying keypad in terminal having touchscreen
US9785324B2 (en) Device, method, and storage medium storing program
CN108829314B (en) Screenshot selecting interface selection method, device, equipment and storage medium
US20130159903A1 (en) Method of displaying graphic user interface using time difference and terminal supporting the same
CN108595074A (en) Status bar operating method, device and computer readable storage medium
CN106959797B (en) A kind of setting method and mobile terminal notifying footmark
CN104238726A (en) Intelligent glasses control method, intelligent glasses control device and intelligent glasses
CN107562289A (en) Charge anti-interference method and device
CN102929528A (en) Device with picture switching function and picture switching method
CN108803986A (en) A kind of method of adjustment and device of mobile terminal virtual key
US11455071B2 (en) Layout method, device and equipment for window control bars
CN109002339A (en) touch operation method, device, storage medium and electronic equipment
CN113703640A (en) Display device and intelligent touch method thereof
CN107765900A (en) Intelligent pen, control method, device, equipment and storage medium of intelligent pen
CN108845756B (en) Touch operation method and device, storage medium and electronic equipment
CN102880413A (en) Method for controlling display of mobile terminal with touch screen and mobile terminal
CN108984097B (en) Touch operation method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination