WO2021042910A1 - User interaction method and electronic device - Google Patents

User interaction method and electronic device

Info

Publication number
WO2021042910A1
WO2021042910A1 PCT/CN2020/104906 CN2020104906W
Authority
WO
WIPO (PCT)
Prior art keywords
area
operation event
touch screen
electronic device
event
Prior art date
Application number
PCT/CN2020/104906
Other languages
English (en)
French (fr)
Inventor
陈浩
陈晓晓
王卿
郑爱华
胡凯
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Publication of WO2021042910A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/1633 - Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1637 - Details related to the display arrangement, including those related to the mounting of the display in the housing

Definitions

  • This application relates to the technical field of electronic devices, and in particular to a user interaction method and electronic device based on an extended area.
  • As a computer input device, the touch screen provides a simple, convenient, and natural means of human-computer interaction.
  • In operation, the user touches the touch screen mounted on the front of the display with a finger or another object, and the system then locates and selects the information input according to the icon or menu position touched.
  • The embodiments of the present application provide a user interaction method and electronic device, which can make user operations more convenient, improve user experience, and increase interaction efficiency.
  • In a first aspect, an embodiment of the present application provides an electronic device. The electronic device includes: at least one touch screen, where the at least one touch screen includes a first area and a second area, the first area is used to provide display output and touch input for a first application, and the second area is used to provide touch input; and a processor, configured to receive a first operation event on the second area, where the first operation event is a touch input event, and further configured to provide a second operation event for the first area according to the first operation event, so that the electronic device acts according to the second operation event, where the second operation event is obtained by mapping the first operation event.
  • The processor is further configured to send the second operation event to the first application, so that the first application acts according to the second operation event.
  • The at least one touch screen may include a first touch screen and a second touch screen, with the first area located on the first touch screen and the second area on the second touch screen; the processor is further configured to map a first operation event in the second area to a second operation event for the first area.
  • Alternatively, the first area and the second area are located on different parts of a same touch screen of the at least one touch screen; the processor is further configured to map, through the event input subsystem, the first operation event in the second area to the second operation event for the first area, and to arrange the window corresponding to the first application on the first area.
  • In another implementation with the first area and the second area on different parts of a same touch screen, the processor is further configured to receive the first operation event by using a second application and map the first operation event in the second area to the second operation event for the first area, and to arrange the window corresponding to the first application on the first area.
  • The first area has a first size, the second area has a second size, and the first size and the second size are in equal proportion; the processor is further configured to map the coordinates of the first operation event in the second area to the coordinates of the second operation event in the first area according to the ratio of the second size to the first size.
  • The first area has a first size, a third area in the second area has a third size, and the first size and the third size are in equal proportion; the processor is further configured to map the coordinates of the first operation event in the third area to the coordinates of the second operation event in the first area according to the ratio of the third size to the first size.
  • A fourth area in the first area has a fourth size, the third area in the second area has a third size, and the fourth size and the third size are in equal proportion; the processor is further configured to map the coordinates of the first operation event in the third area to the coordinates of the second operation event in the fourth area according to the ratio of the third size to the fourth size.
  • The operations included in the first operation event may differ from the operations included in the second operation event; for example, the first operation event includes one operation while the second operation event includes two or more operations.
  • The processor is further configured to provide the second operation event for the first area according to the first application.
  • The electronic device further includes a memory, configured to store a mapping relationship between a first operation in the second area and a second operation in the first area; the processor is further configured to provide the second operation event for the first area according to the mapping relationship and the first operation in the second area.
  • In a second aspect, the embodiments of the present application provide a user interaction method, applied to an electronic device equipped with at least one touch screen. The at least one touch screen includes a first area and a second area, where the first area is used to provide display output and touch input for a first application, and the second area is used to provide touch input. The method includes: receiving a first operation event on the second area, where the first operation event is a touch input event; and providing a second operation event for the first area according to the first operation event, so that the electronic device acts according to the second operation event, where the second operation event is obtained by mapping the first operation event.
  • Providing the second operation event for the first area includes sending the second operation event to the first application, so that the first application acts according to the second operation event.
  • The at least one touch screen may include a first touch screen and a second touch screen, with the first area on the first touch screen and the second area on the second touch screen; the method then includes mapping a first operation event in the second area to a second operation event for the first area.
  • Alternatively, the first area and the second area are located on different parts of a same touch screen of the at least one touch screen; the method further includes mapping, through the event input subsystem, the first operation event in the second area to the second operation event for the first area, and arranging the window corresponding to the first application on the first area.
  • In another such implementation, the method further includes receiving the first operation event by using a second application and mapping the first operation event in the second area to the second operation event for the first area, and arranging the window corresponding to the first application on the first area.
  • Providing the second operation event for the first area may include providing the second operation event for the first area according to the first application.
  • The method may further include: recording user operation events in the second area and operation events in the first area; and storing a mapping relationship between the operation events in the second area and the operation events in the first area. Providing the second operation event for the first area then includes providing the second operation event for the first area according to the mapping relationship and the first operation event in the second area.
  • an embodiment of the present application provides a computer storage medium, the computer storage medium includes computer instructions, when the computer instructions run on an electronic device, the electronic device is caused to execute the method described in the second aspect .
  • embodiments of the present application provide a computer program product, and the program code included in the computer program product implements the method described in the second aspect when the program code included in the computer program product is executed by a processor in an electronic device.
  • the user interaction method and electronic device provided by the embodiments of the present application can provide convenience for user operations, improve user experience, and enhance interaction efficiency.
  • Figure 1 shows several situations of the extended area;
  • Figure 1(a) shows the extended area located at the edge area of a curved or normal screen;
  • Figure 1(b) shows the extended area located in a certain part of the screen area of a curved or normal screen;
  • Figure 1(c) shows the extended area located in one screen, or part of one screen, of a folding screen;
  • Figure 1(d) shows the extended area located in the folding region of a folding screen in the folded state.
  • Fig. 2 is a schematic diagram of a user interaction system according to an embodiment of the present application.
  • Fig. 3 is a schematic diagram of a user interaction system according to another embodiment of the present application.
  • Fig. 4 is a schematic diagram of a user interaction system according to another embodiment of the present application.
  • Fig. 5 is a schematic diagram of a user interaction system according to still another embodiment of the present application.
  • Fig. 6 is a schematic diagram of a user interaction system according to still another embodiment of the present application.
  • Figure 7 is a schematic diagram of a physical screen divided into a display area and an extended area;
  • Figure 8 is a schematic diagram of a 1:1 mapping between the display area and the extended area;
  • Figure 9 is a schematic diagram of a non-proportional mapping between the display area and the extended area;
  • Figure 10 is another schematic diagram of a non-proportional mapping between the display area and the extended area;
  • Figure 11 is a schematic diagram illustrating the mapping of different gestures between the display area and the extended area;
  • Figure 12 is a schematic flowchart of a user interaction method according to an embodiment of the present application;
  • Figure 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
  • Figure 14 is a schematic diagram of an operation event mapping taking effect locally.
  • The terms "first" and "second" are used for descriptive purposes only, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, features defined with "first" and "second" may explicitly or implicitly include one or more of those features.
  • The terms "include", "comprise", "have" and their variants all mean "including but not limited to", unless otherwise specifically emphasized.
  • The embodiments of the application disclose a user interaction method and an electronic device adopting the method.
  • The electronic device may include at least one touch screen, and the at least one touch screen may include two different areas. According to their main form and function, these two areas are referred to as the display area and the extended area respectively.
  • The display area can provide display output and touch input for applications like a normal touch screen, while the extended area focuses on receiving touch input.
  • When the electronic device receives an operation event on the extended area, the operation event is mapped to the display area, so that the electronic device and the applications working on the display area can act according to the operation event that originally occurred in the extended area.
  • Touch input events on the extended area can also be combined with touch input events on the display area to provide input events for applications or other system functions.
  • Of course, the embodiments of the present application can be applied to area divisions of other forms and functions. The present application therefore takes the display area and the extended area as examples, which should not be construed as limiting.
  • The user interaction method of the embodiments of the present application can be applied to various electronic devices, including but not limited to portable electronic devices such as mobile phones, tablet computers, personal digital assistants (PDAs), wearable devices, and laptop computers.
  • Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices running iOS, Android, Microsoft, or other operating systems.
  • The electronic device may also be another type of electronic device, such as a household appliance like a refrigerator or washing machine, or automotive or industrial electronic equipment.
  • The embodiments of the present application do not specifically limit the type of electronic device.
  • The extended area can be regarded either as a separate input function for the display area, or as a supplement to the input function of the display area.
  • The extended area can be implemented on a physical screen other than the one where the display area is located, or on a different part of the same physical screen as the display area, selected and distinguished according to the screen form and/or state.
  • Figure 1 shows several situations where the same physical screen is used to implement an extended area.
  • Figure 1(a) shows the extended area located at the edge area of a curved or normal screen, in which the middle part is the main display area A1 and two elongated strips on the sides are set off as the extended area A2.
  • Figure 1(b) shows the extended area located in a certain part of the screen area of a curved or normal screen, where a larger block of the screen is set off as the extended area A2.
  • Figure 1(c) shows the extended area located in one screen, or part of one screen, of a folding screen. The folding screen can be folded into two parts of similar size, similar to the upper and lower halves of a laptop; the two screen parts that lie in different planes in the folded state are divided into a display area A1 and an extended area A2.
  • Figure 1(d) shows the extended area located in the folding region of a folding screen in the folded state. The folding screen can be folded into a book shape with a folding region in the middle; the folding region can be set as the extended area A2, while the unfolded (flat) portions are set as the display area A1.
  • In the situations shown in Figure 1, the extended area can provide independent input, or supplementary input, for the display area. The specific content of the input provided by the extended area is discussed below.
  • Of course, Figure 1 is only an example, and the extended area in this application is not limited to the situations in Figure 1.
  • Fig. 2 is a schematic diagram of a user interaction system according to an embodiment of the present application.
  • the user interaction system can be implemented in an electronic device.
  • The electronic device includes two physical screens, such as a liquid crystal display (LCD) 212 and an LCD 222. Both the LCD 212 and the LCD 222 are touch screens.
  • The display area A1 is implemented on the LCD 212, and may be all or part of the LCD 212.
  • The extended area A2 is implemented on the LCD 222, and may be all or part of the LCD 222.
  • The display screen referred to in this application may include, but is not limited to, a liquid crystal display screen, a light-emitting diode (LED) display screen, and an organic light-emitting diode (OLED) display screen. Merely for convenience of description, LCD is used as the example below.
  • Corresponding to the LCD 212, the LCD driver 215 receives one or more operations of the user on the LCD 212, combines these operations into various operation events, and marks them as operations B1 occurring in the display area A1.
  • The framework layer input subsystem 219 provides the operation event B1 to the application (not shown). It should be noted that the application here can be one application or multiple applications.
  • Taking the Android system as an example, in the framework layer input subsystem 219, the start() method in InputManagerService is called through JNI to start the native-layer InputReaderThread and InputDispatcherThread threads, thereby starting the operation of the input system.
  • InputReaderThread mainly executes content related to InputReader: it reads events from EventHub and preprocesses them; it then processes each event according to the policy; finally it sends a message to InputDispatcher to notify it that the event has occurred.
  • InputDispatcher then starts the distribution of events, and distributes them to the window manager (WindowManager) or the application through an InputChannel.
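As an illustration of where such processing could sit, the following minimal Java sketch shows the classification step that an event processing unit performs before dispatch. It assumes the Figure 7 layout (display area A1 as the top half of a W x H panel, extended area A2 as the bottom half); the class and method names are hypothetical, and the real Android pipeline (EventHub, InputReader, InputDispatcher) is native C++ rather than Java.

```java
import android.graphics.Rect;
import android.view.MotionEvent;

// Hypothetical sketch of the patent's "event processing unit": decide
// whether a raw touch event belongs to the display area or the extended area.
final class AreaClassifier {
    private final Rect displayArea;   // A1: top half of the panel (assumed)
    private final Rect extendedArea;  // A2: bottom half of the panel (assumed)

    AreaClassifier(int panelWidth, int panelHeight) {
        displayArea = new Rect(0, 0, panelWidth, panelHeight / 2);
        extendedArea = new Rect(0, panelHeight / 2, panelWidth, panelHeight);
    }

    /** Returns "B1" for events in A1, "B2" for events in A2. */
    String classify(MotionEvent ev) {
        int x = (int) ev.getRawX();
        int y = (int) ev.getRawY();
        if (displayArea.contains(x, y)) return "B1";
        if (extendedArea.contains(x, y)) return "B2";
        return "OUTSIDE";
    }
}
```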
  • On the output side, the data that needs to be displayed, including graphics and video data from applications and composition data from the system, is provided to the LCD driver 214 via the framework layer display system, so that the output of the system and application programs is presented on the LCD 212.
  • Taking the Android system as an example, the display system may include a window manager 218 and a surface compositor 214.
  • The window manager 218 creates windows under the instructions of the application, and creates a surface for each window to let the application draw various objects on it.
  • A surface is an object that points to video memory and is used to be drawn on the screen; all visible windows have a surface that can be drawn on.
  • The system uses the Surface Flinger 214 service to render each surface onto the final screen according to the correct depth information.
  • For the LCD 222, the LCD driver 215 likewise receives the user's operations on the LCD 222 and combines these operations into various operation events.
  • These operation events are marked as operations B2 in the extended area A2, which is not repeated here. The corresponding operation events are then sent to the framework layer input subsystem 219. It should be noted that different drivers can be used for driving the LCD 212 and the LCD 222.
  • In the embodiments of the present application, the framework layer input subsystem 219 can be provided with an event processing unit that determines whether an operation event occurred in the display area or the extended area, and processes the operation events that occurred in the extended area (i.e., on the LCD 222).
  • These operation events are mapped to the display area (i.e., the LCD 212), and the operation events that originally occurred on the LCD 222 but have been mapped to the LCD 212 are then provided.
  • In one example, the operation events that originally occurred on the LCD 222 but are mapped to the LCD 212, together with the operation events that occurred on the LCD 212 (if any), are sent to the application running on the LCD 212, so that the application can act according to those operation events.
  • Those skilled in the art will appreciate that the mapping process can be implemented as a function, and the function can be implemented in any one of the various links of the framework layer input subsystem 219, so as to map operation events from the extended area to the display area.
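A minimal sketch of such a mapping function, under the Figure 7/8 assumption that mapping from A2 to A1 is a pure vertical offset; the class name is hypothetical and this is not framework code:

```java
import android.view.MotionEvent;

// Hypothetical mapping function: re-express an extended-area (A2) event
// as an equivalent display-area (A1) event, assuming A2 is the bottom
// half and A1 the top half of a panel of the given height.
final class ExtendedAreaMapper {
    private final int halfHeight; // H / 2

    ExtendedAreaMapper(int panelHeight) {
        this.halfHeight = panelHeight / 2;
    }

    /** Copies the source event and shifts its coordinates from A2 into A1. */
    MotionEvent mapToDisplayArea(MotionEvent src) {
        MotionEvent mapped = MotionEvent.obtain(src); // copy, keeps timestamps
        mapped.offsetLocation(0, -halfHeight);        // slide A2 coords into A1
        return mapped;
    }
}
```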
  • The embodiment shown in Figure 2 can also be applied to situations with more than two physical screens; in that case, two of them can be selected as the screens that carry the display area and the extended area respectively.
  • Fig. 3 is a schematic diagram of a user interaction system according to another embodiment of the present application.
  • the user interaction system can be implemented in an electronic device.
  • The electronic device has one physical touch screen, on which a display area A1 and an extended area A2 can be set.
  • The physical screen and its display area A1 and extended area A2 can be configured as shown in Figure 1.
  • The display area A1 can provide system and application interface output, while the display area A1 and the extended area A2 can each provide touch input.
  • The LCD driver 315 can receive the user's operations on the LCD 312 and combine these operations into various operation events. At the same time, the coordinates of the area where each operation event occurs are determined. In this embodiment, operations can be divided, according to the coordinates of the occurrence area, into operations B1 in the display area A1 and operations B2 in the extended area A2. The corresponding operation events are then sent to the framework layer input subsystem 319.
  • The framework layer input subsystem 319 provides the operation events B1 that occurred in the display area A1 and the operation events B2 that are mapped from the extended area A2 to the display area A1.
  • In one example, the framework layer input subsystem 319 provides the operation events occurring in the display area A1 to the application 219; for an operation event B2 occurring in the extended area A2, the framework layer input subsystem 319 provides the mapped operation event B2 to the application 219.
  • Here, mapping means that an operation event that originally occurred in the extended area A2 is treated, after coordinate conversion, as an operation event that occurred in the display area A1.
  • The application can create one or more windows through the window manager 318, and the window manager 318 creates a surface for each window to let the application draw various graphics or display videos on it.
  • A surface is an object that points to video memory and is used to be drawn on the screen; all visible windows have a surface that can be drawn on.
  • The Surface Flinger 314 in the underlying display system can render each surface onto the final screen according to the correct depth information.
  • The surface compositor 314 can determine the size of the screen according to the system configuration or software code. In this embodiment, the size of the screen is set to the display area A1 through system configuration or software code. In this way, the surface compositor 314 will configure the buffer and the size of the composed image according to the set screen size. Likewise, the window manager 318 determines the size of the window canvas according to the set screen size.
  • Fig. 4 is a schematic diagram of a user interaction system according to another embodiment of the present application.
  • The user interaction system of Figure 4 differs from that of Figure 3 in that, in Figure 4, the window manager 418 puts each application in a forced split-screen state, so that the window of each application is restricted to the display area A1, and consequently the pictures or videos of each application can only be drawn in the display area A1.
  • In one example, when receiving calls such as getDisplaySize or getRealSize from an application, the window manager 418 may return the size of A1 through the interface. In this way, each application will regard the display area A1 as the display area set by the system, while the system level still perceives the entire screen.
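The idea of reporting A1's size instead of the full panel can be sketched as follows. Note that Display#getSize and Display#getRealSize are backed by the framework and clamping them is not a public extension point, so this hypothetical shim is only an illustration:

```java
import android.graphics.Point;

// Hypothetical framework-side shim: report the display-area size A1
// (here assumed to be the top half of the panel) to forced-split apps,
// instead of the full physical panel size.
final class ClampedDisplayInfo {
    private final int panelWidth;
    private final int panelHeight;

    ClampedDisplayInfo(int panelWidth, int panelHeight) {
        this.panelWidth = panelWidth;
        this.panelHeight = panelHeight;
    }

    /** What a forced-split-screen application would see as the display size. */
    void getSize(Point out) {
        out.set(panelWidth, panelHeight / 2); // report A1 only
    }
}
```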
  • Fig. 5 is a schematic diagram of a user interaction system according to still another embodiment of the present application.
  • The user interaction system of Figure 5 differs from that of Figure 4 in that a dedicated application 525 is provided, and the application 525 is configured to correspond to the extended area A2.
  • The framework layer input subsystem 319 does not perform special processing on an operation event B2 that occurs in the extended area A2, but distributes it in the normal manner, so that the application 525 receives the operation event B2 that occurred in the extended area A2.
  • The application 525 processes and redistributes the operation event B2, so that the operation event B2 returns to the framework layer input subsystem 319, except that the operation event B2 at this point has been mapped to the display area A1.
  • The framework layer input subsystem 319 provides the operation events B2 mapped onto the display area A1 and the operation events B1 occurring on the display area A1, for example by distributing them to the various applications 219.
  • In one example, the application 525 returns the operation event to the framework layer input subsystem by injecting a simulated event via injectInputEvent.
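A sketch of how a companion application such as 525 might re-inject a mapped event follows. The injectInputEvent method named above is a hidden system API, and the public Instrumentation#sendPointerSync used here requires the INJECT_EVENTS permission when targeting other applications' windows, so this is an illustration rather than a drop-in implementation:

```java
import android.app.Instrumentation;
import android.os.SystemClock;
import android.view.MotionEvent;

// Hypothetical injector: synthesize a tap at already-mapped A1 coordinates
// and hand it back to the input pipeline.
final class EventInjector {
    private final Instrumentation instrumentation = new Instrumentation();

    void injectTap(float x, float y) {
        long t = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(t, t, MotionEvent.ACTION_DOWN, x, y, 0);
        MotionEvent up = MotionEvent.obtain(t, t + 50, MotionEvent.ACTION_UP, x, y, 0);
        instrumentation.sendPointerSync(down); // dispatch press
        instrumentation.sendPointerSync(up);   // dispatch release
        down.recycle();
        up.recycle();
    }
}
```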
  • Fig. 6 is a schematic diagram of a user interaction system according to still another embodiment of the present application.
  • Figure 6 differs from Figure 3 in that the application 525 and a corresponding framework layer input subsystem 419 are specially provided.
  • For the working mechanism of the application 525 and the framework layer input subsystem 419, refer to the description above in conjunction with Figure 5, which is not repeated here.
  • Although it was mentioned above that the extended area receives touch input, it is also possible, at the same time, to draw a cursor or similar icon at the position of the touch input in the extended area through the display system (for example, using views), so as to give the user visual feedback.
  • Fig. 7 is a schematic diagram of a physical screen divided into a display area and an extended area.
  • The physical screen division can be performed on a physical screen as shown in Figure 1, or on separate physical screens.
  • The physical screen A has a size of W x H; its origin is defined as (0, 0) and its diagonal point as (W, H).
  • If the physical screen A is divided equally into the display area A1 and the extended area A2, the coordinate origin of the display area A1 is (0, 0) and its diagonal point is (W, H/2); the coordinate origin of the extended area A2 is (0, 0) and its diagonal point is (W, H/2).
  • Areas on separate physical screens can be divided in the same way.
  • Figure 8 is a schematic diagram of 1:1 mapping between the display area and the extended area.
  • The mapping can be implemented on the physical screen shown in Figure 7.
  • The physical screen is divided equally into a display area A1 and an extended area A2.
  • The coordinate origin of the display area A1 is (0, 0) and its diagonal point is (W, H/2); the coordinate origin of the extended area A2 is (0, 0) and its diagonal point is (W, H/2). The mapping from the extended area A2 to the display area A1 can therefore be completed by area substitution, without any mapping of position coordinates.
  • In other words, any touch behavior of the user in the extended area A2 (for example, at P2) can be directly mapped to the display area A1 (for example, to P1) in a 1:1 relationship.
  • Fig. 9 is a schematic diagram of a non-proportional mapping between the display area and the extended area.
  • When the extended area A2 and the display area A1 are not in equal proportion, a certain area A3 can be selected in the extended area, and a proportional mapping performed between A3 and A1.
  • For example, the size of A1 is (W1, H1), the size of A3 is (W3, H3), and the local coordinates of a point P2 located in A3 are (a, b). P2 is then mapped to the point P1 in A1, and the local coordinates of P1 are ((W1/W3)*a, (H1/H3)*b).
  • Fig. 10 is another schematic diagram of non-proportional mapping between the display area and the extended area.
  • When the extended area A2 and the display area A1 are not in equal proportion, an area A3 can be selected in the extended area and an area A4 in the display area, and a proportional mapping performed between A3 and A4. In one case, A3 can coincide with A2, so that the proportional mapping is performed between A2 and A4. A sketch of this rectangle-to-rectangle mapping follows.
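One formula covers Figures 8-10: map a point from a source rectangle A3 inside the extended area to a destination rectangle A4 inside the display area. With A4 = A1 this is the Figure 9 case, and with equal-sized A3 = A2 and A4 = A1 it degenerates to the 1:1 substitution of Figure 8. The helper below is hypothetical, not framework code:

```java
import android.graphics.PointF;
import android.graphics.RectF;

// Hypothetical rectangle-to-rectangle mapper implementing the patent's
// proportional formula: P1 = ((W1/W3)*a, (H1/H3)*b) in destination-local
// coordinates, where (a, b) are the source-local coordinates of P2.
final class RectMapper {
    static PointF map(RectF src, RectF dst, PointF p) {
        float a = p.x - src.left;                               // local a in A3
        float b = p.y - src.top;                                // local b in A3
        float x = dst.left + (dst.width() / src.width()) * a;   // (W1/W3)*a
        float y = dst.top + (dst.height() / src.height()) * b;  // (H1/H3)*b
        return new PointF(x, y);
    }
}
```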
  • The mapping of operation events from the extended area to the display area is reflected not only in the change of area and the mapping of position/coordinates, but also in the mapping of the gesture that the operation itself represents.
  • Some gesture mapping tables can be prefabricated in the input subsystem: when an operation event B22 occurs in the extended area, an operation event B12 occurs in the display area through the mapping, and B12 can be different from B22.
  • Figure 11 is a schematic diagram illustrating the mapping of different gestures between the display area and the extended area: a single tap at the point P1 of the extended area A2 is mapped to a behavior of sliding from point P2 to point P3 in the display area A1. The longer P1 is pressed, the longer the sliding distance from P2 to P3.
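The Figure 11 behavior can be sketched as follows: a press of duration d at P1 in A2 becomes a swipe in A1 whose length grows with d. The scale factor, cap, and event cadence are invented illustration values:

```java
import android.os.SystemClock;
import android.view.MotionEvent;

// Hypothetical gesture mapper: turn a timed press into a DOWN/MOVE/UP
// swipe sequence whose distance scales with how long P1 was held.
final class TapToSwipeMapper {
    static MotionEvent[] toSwipe(float startX, float startY, long pressMillis) {
        float distance = Math.min(pressMillis * 0.5f, 400f); // 0.5 px per ms, capped
        long t = SystemClock.uptimeMillis();
        return new MotionEvent[] {
            MotionEvent.obtain(t, t, MotionEvent.ACTION_DOWN, startX, startY, 0),
            MotionEvent.obtain(t, t + 40, MotionEvent.ACTION_MOVE,
                    startX, startY + distance / 2, 0),
            MotionEvent.obtain(t, t + 80, MotionEvent.ACTION_UP,
                    startX, startY + distance, 0),
        };
    }
}
```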
  • The mapping of operation events occurring in the extended area to the display area can take effect globally (in all scenarios). For example, whenever the tap operation of Figure 11 occurs, a sliding behavior appears in the display area.
  • Certain preset gestures can also be set to take effect only locally. For example, the game Honor of Kings has a skill combo that requires the user to press the three keys T2, T3 and T4 together; in an embodiment of the present application, the input subsystem can define a function button S1 in the extended area for Honor of Kings.
  • When the user presses the button S1 in the extended area, the event processing module recognizes the Honor of Kings game scene and automatically distributes the T2, T3 and T4 key combination events to the input subsystem (see Figure 14).
  • For settings that take effect globally or locally, a whitelist can be used: InputDispatcher distributes the T2, T3 and T4 key combination events corresponding to S1 to the application specified in the whitelist (for example, the Honor of Kings game). A sketch of such a whitelist lookup follows.
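The package name and key codes below are placeholders, since the patent does not specify them:

```java
import java.util.Map;

// Hypothetical whitelist: the extended-area button S1 expands into the
// key combination T2+T3+T4 only for whitelisted foreground packages.
final class ComboWhitelist {
    private static final Map<String, int[]> COMBOS = Map.of(
            "com.example.game",                       // placeholder package
            new int[] {/* T2 */ 46, /* T3 */ 47, /* T4 */ 48}); // placeholder codes

    /** Returns the key codes to dispatch for S1, or null if not whitelisted. */
    static int[] comboFor(String foregroundPackage) {
        return COMBOS.get(foregroundPackage);
    }
}
```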
  • The user can also define mappings interactively through a recording interface. First, the operation in the extended area is defined: the user can choose a tap, a long press, or a slide, or directly perform an operation on the screen (for example, sliding a distance L), and the operation in the extended area is recorded. The extended-area operation can be described as: tap, long press for time T, slide, two-finger spread, and so on.
  • After the gesture in the extended area is defined, the key behavior/gesture in the display area is recorded. The key behavior in the display area records information including coordinates, position, and duration.
  • A mapping relationship is established between the recorded user operation S in the extended area and the operation T in the display area, and the mapping relationship is saved in a configuration file or in another manner. In the subsequent operation phase, once an operation of the same kind as S is detected in the extended area, the input subsystem directly restores and sends the T event.
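Persisting such a recorded pair might be sketched as follows; the schema and keys are invented for illustration, and any configuration store (a file, SharedPreferences, a database) would serve:

```java
import org.json.JSONObject;

// Hypothetical mapping store: pair the recorded extended-area gesture S
// with the recorded display-area action T so T can be replayed later.
final class MappingStore {
    static JSONObject record(String gestureS, JSONObject actionT) throws Exception {
        JSONObject entry = new JSONObject();
        entry.put("trigger", gestureS);  // e.g. "long_press:800ms" in A2
        entry.put("replay", actionT);    // coordinates, position, duration in A1
        return entry;                    // caller persists this, e.g. to a file
    }
}
```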
  • Fig. 12 is a schematic flowchart of a user interaction method according to an embodiment of the present application.
  • the user interaction method can be applied to an electronic device equipped with at least one touch screen.
  • the at least one touch screen includes a first area and a second area.
  • the first area is used to provide display output and touch input for the first application; the second area is used to provide touch input.
  • the electronic device may be the electronic device described in one of FIGS. 2-6.
  • The method includes: in step 1200, receiving a first operation event on the second area, where the first operation event is a touch input event.
  • In step 1202, a second operation event for the first area is provided, so that the electronic device can act according to the second operation event; the second operation event is obtained by mapping the first operation event.
  • Providing the second operation event for the first area includes sending the second operation event to the first application, so that the first application can act according to the second operation event.
  • In one embodiment, the at least one touch screen includes a first touch screen and a second touch screen, the first area is located on the first touch screen, and the second area is located on the second touch screen; the method includes mapping the first operation event in the second area to the second operation event for the first area.
  • In another embodiment, the first area and the second area are located on different parts of a same touch screen of the at least one touch screen; the method further includes mapping the first operation event in the second area to the second operation event for the first area, and configuring the window corresponding to the first application on the first area.
  • In yet another embodiment, the first area and the second area are located on different parts of a same touch screen; the method further includes receiving the first operation event by a second application and mapping the first operation event in the second area to the second operation event for the first area, and arranging the window corresponding to the first application on the first area.
  • FIG. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • the electronic device includes a processor 1310, a memory 1320, and at least one touch screen (such as a touch screen 1330, a touch screen 1340).
  • the at least one touch screen includes a first area and a second area.
  • the first area is used to provide display output and touch input for at least one application; the second area is used to provide touch input.
  • The memory 1320 is used to store computer execution instructions; when the electronic device runs, the processor 1310 executes the computer execution instructions stored in the memory 1320, so that the electronic device executes the method shown in Figure 12.
  • In some embodiments, the terminal further includes a communication bus 1350, through which the processor 1310 can be connected to the memory 1320 and to the at least one touch screen (which may be the touch screen 1330 or the touch screen 1340).
  • The method steps in the embodiments of the present application can be implemented by hardware, or by a processor executing software instructions.
  • The software instructions can be composed of corresponding software modules (such as the input subsystem, window manager, and surface compositor above).
  • The software modules can be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor, so that the processor can read information from, and write information to, the storage medium.
  • The storage medium may also be an integral part of the processor. The processor and the storage medium may be located in an ASIC.
  • The above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • When implemented using software, they may be implemented in whole or in part in the form of a computer program product.
  • The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part.
  • The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device.
  • The computer instructions may be stored in a computer-readable storage medium or transmitted through a computer-readable storage medium.
  • The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (such as infrared, radio, or microwave).
  • The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center integrating one or more available media.
  • The available medium may be a magnetic medium (for example, a floppy disk, hard disk, or magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of this application provide a user interaction method and an electronic device. The electronic device includes: at least one touch screen, where the at least one touch screen includes a first area and a second area, the first area is used to provide display output and touch input for a first application, and the second area is used to provide touch input; and a processor, configured to receive a first operation event on the second area, where the first operation event is a touch input event, and further configured to provide a second operation event for the first area according to the first operation event, so that the electronic device acts according to the second operation event, where the second operation event is obtained by mapping the first operation event. The user interaction method and electronic device of the embodiments of this application can make user operations more convenient, improve user experience, and increase interaction efficiency.

Description

User interaction method and electronic device
This application claims priority to Chinese Patent Application No. 201910841190.2, filed with the Chinese Patent Office on September 6, 2019 and entitled "User interaction method and electronic device", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the technical field of electronic devices, and in particular to a user interaction method and electronic device based on an extended area.
Background
As a computer input device, the touch screen provides a simple, convenient, and natural means of human-computer interaction. In operation, the user touches the touch screen mounted on the front of the display with a finger or another object, and the system then locates and selects the information input according to the icon or menu position touched by the finger.
With the development of screen technology, product forms such as curved screens and folding screens have appeared. The curves of a curved screen bring different visual and tactile sensations to different parts of the screen, and folding screens amplify this difference further. At the same time, improvements in screen materials allow screens to grow ever larger and allow different functions to be assigned to different areas. The emergence of these new screen product forms brings the possibility of a richer user experience.
Summary
The embodiments of this application provide a user interaction method and electronic device, which can make user operations more convenient, improve user experience, and increase interaction efficiency.
In a first aspect, an embodiment of this application provides an electronic device. The electronic device includes: at least one touch screen, where the at least one touch screen includes a first area and a second area, the first area is used to provide display output and touch input for a first application, and the second area is used to provide touch input; and a processor, configured to receive a first operation event on the second area, where the first operation event is a touch input event, and further configured to provide a second operation event for the first area according to the first operation event, so that the electronic device acts according to the second operation event, where the second operation event is obtained by mapping the first operation event.
With reference to the first aspect, in a possible implementation of the first aspect, the processor is further configured to send the second operation event to the first application, so that the first application acts according to the second operation event.
With reference to the first aspect, in a possible implementation of the first aspect, the at least one touch screen includes a first touch screen and a second touch screen, the first area is located on the first touch screen, and the second area is located on the second touch screen; the processor is further configured to map the first operation event in the second area to the second operation event for the first area.
With reference to the first aspect, in a possible implementation of the first aspect, the first area and the second area are located on different parts of a same touch screen of the at least one touch screen; the processor is further configured to map, through the event input subsystem, the first operation event in the second area to the second operation event for the first area; and the processor is further configured to configure the window corresponding to the first application on the first area.
With reference to the first aspect, in a possible implementation of the first aspect, the first area and the second area are located on different parts of a same touch screen of the at least one touch screen; the processor is further configured to receive the first operation event by using a second application, and to map the first operation event in the second area to the second operation event for the first area; and the processor is further configured to configure the window corresponding to the first application on the first area.
With reference to the first aspect, in a possible implementation of the first aspect, the first area has a first size, the second area has a second size, and the first size and the second size are in equal proportion; the processor is further configured to map the coordinates of the first operation event in the second area to the coordinates of the second operation event in the first area according to the ratio of the second size to the first size.
With reference to the first aspect, in a possible implementation of the first aspect, the first area has a first size, a third area in the second area has a third size, and the first size and the third size are in equal proportion; the processor is further configured to map the coordinates of the first operation event in the third area to the coordinates of the second operation event in the first area according to the ratio of the third size to the first size.
With reference to the first aspect, in a possible implementation of the first aspect, a fourth area in the first area has a fourth size, a third area in the second area has a third size, and the fourth size and the third size are in equal proportion; the processor is further configured to map the coordinates of the first operation event in the third area to the coordinates of the second operation event in the fourth area according to the ratio of the third size to the fourth size.
With reference to the first aspect, in a possible implementation of the first aspect, the operation included in the first operation event is different from the operation included in the second operation event.
With reference to the first aspect, in a possible implementation of the first aspect, the first operation event includes one operation, and the second operation event includes two or more operations.
With reference to the first aspect, in a possible implementation of the first aspect, the processor is further configured to provide the second operation event for the first area according to the first application.
With reference to the first aspect, in a possible implementation of the first aspect, the electronic device further includes a memory, configured to store a mapping relationship between a first operation in the second area and a second operation in the first area;
the processor is further configured to provide the second operation event for the first area according to the mapping relationship and the first operation in the second area.
In a second aspect, an embodiment of this application provides a user interaction method, applied to an electronic device provided with at least one touch screen, where the at least one touch screen includes a first area and a second area, the first area is used to provide display output and touch input for a first application, and the second area is used to provide touch input. The method includes: receiving a first operation event on the second area, where the first operation event is a touch input event; and providing a second operation event for the first area according to the first operation event, so that the electronic device acts according to the second operation event, where the second operation event is obtained by mapping the first operation event.
With reference to the second aspect, in a possible implementation of the second aspect, providing the second operation event for the first area includes sending the second operation event to the first application, so that the first application acts according to the second operation event.
With reference to the second aspect, in a possible implementation of the second aspect, the at least one touch screen includes a first touch screen and a second touch screen, the first area is located on the first touch screen, and the second area is located on the second touch screen; the method includes mapping the first operation event in the second area to the second operation event for the first area.
With reference to the second aspect, in a possible implementation of the second aspect, the first area and the second area are located on different parts of a same touch screen of the at least one touch screen; the method further includes mapping, through the event input subsystem, the first operation event in the second area to the second operation event for the first area; and the method further includes configuring the window corresponding to the first application on the first area.
With reference to the second aspect, in a possible implementation of the second aspect, the first area and the second area are located on different parts of a same touch screen of the at least one touch screen; the method further includes receiving the first operation event by using a second application, and mapping the first operation event in the second area to the second operation event for the first area; and the method further includes configuring the window corresponding to the first application on the first area.
With reference to the second aspect, in a possible implementation of the second aspect, providing the second operation event for the first area includes providing the second operation event for the first area according to the first application.
With reference to the second aspect, in a possible implementation of the second aspect, the method further includes: recording user operation events in the second area and operation events in the first area; and saving a mapping relationship between the operation events in the second area and the operation events in the first area; and providing the second operation event for the first area includes providing the second operation event for the first area according to the mapping relationship and the first operation event in the second area.
In a third aspect, an embodiment of this application provides a computer storage medium. The computer storage medium includes computer instructions which, when run on an electronic device, cause the electronic device to execute the method described in the second aspect.
In a fourth aspect, an embodiment of this application provides a computer program product. When the program code included in the computer program product is executed by a processor in an electronic device, the method described in the second aspect is implemented.
The user interaction method and electronic device provided by the embodiments of this application can make user operations more convenient, improve user experience, and increase interaction efficiency.
Brief Description of the Drawings
Figure 1 shows several situations of the extended area: Figure 1(a) shows the extended area located at the edge area of a curved or normal screen; Figure 1(b) shows the extended area located in a certain part of the screen area of a curved or normal screen; Figure 1(c) shows the extended area located in one screen, or part of one screen, of a folding screen; Figure 1(d) shows the extended area located in the folding region of a folding screen in the folded state;
Figure 2 is a schematic diagram of a user interaction system according to an embodiment of this application;
Figure 3 is a schematic diagram of a user interaction system according to another embodiment of this application;
Figure 4 is a schematic diagram of a user interaction system according to yet another embodiment of this application;
Figure 5 is a schematic diagram of a user interaction system according to still another embodiment of this application;
Figure 6 is a schematic diagram of a user interaction system according to still another embodiment of this application;
Figure 7 is a schematic diagram of a physical screen divided into a display area and an extended area;
Figure 8 is a schematic diagram of a 1:1 mapping between the display area and the extended area;
Figure 9 is a schematic diagram of a non-proportional mapping between the display area and the extended area;
Figure 10 is another schematic diagram of a non-proportional mapping between the display area and the extended area;
Figure 11 is a schematic diagram illustrating the mapping of different gestures between the display area and the extended area;
Figure 12 is a schematic flowchart of a user interaction method according to an embodiment of this application;
Figure 13 is a schematic structural diagram of an electronic device according to an embodiment of this application;
Figure 14 is a schematic diagram of an operation event mapping taking effect locally.
Detailed Description
The technical solutions in the embodiments of the present invention will be described below with reference to the accompanying drawings. Apparently, the described embodiments are only some rather than all of the embodiments of this application.
In the description of this specification, "one embodiment", "some embodiments" and the like mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of this application. Therefore, the phrases "in one embodiment", "in some embodiments", "in some other embodiments", "in still other embodiments" and the like appearing in different places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments", unless otherwise specifically emphasized.
In the description of this specification, unless otherwise stated, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects, and indicates that three relationships may exist; for example, A and/or B may indicate the following three cases: only A exists, both A and B exist, and only B exists. In addition, in the description of the embodiments of this application, "a plurality of" means two or more than two.
In the description of this specification, the terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Therefore, features defined with "first" and "second" may explicitly or implicitly include one or more of those features. The terms "include", "comprise", "have" and their variants all mean "including but not limited to", unless otherwise specifically emphasized.
The embodiments of this application disclose a user interaction method and an electronic device adopting the method. The electronic device may include at least one touch screen, and the at least one touch screen may include two different areas. According to their main form and function, these two areas are referred to as the display area and the extended area respectively; the display area, like a normal touch screen, can provide display output and touch input for applications, while the extended area focuses on receiving touch input. When the electronic device receives an operation event on the extended area, the operation event is mapped onto the display area, so that the electronic device and the applications working on the display area can act according to the operation event that originally occurred in the extended area. Touch input events (i.e., operation events) on the extended area can also be combined with touch input events on the display area to provide input events for applications or other system functions. Of course, the embodiments of this application can be applied to area divisions of other forms and functions; this application therefore takes the display area and the extended area as examples, which should not be construed as limiting.
The user interaction method of the embodiments of this application can be applied to various electronic devices, including but not limited to portable electronic devices such as mobile phones, tablet computers, personal digital assistants (PDAs), wearable devices, and laptop computers. Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices running iOS, Android, Microsoft, or other operating systems. The electronic device may also be another type of electronic device, such as a household appliance like a refrigerator or washing machine, or automotive or industrial electronic equipment. The embodiments of this application do not specifically limit the type of the electronic device.
Before describing the embodiments of this application, the definition of the extended area is first discussed. In this application, the extended area can be regarded either as a separate input function for the display area or as a supplement to the input function of the display area. The extended area can be implemented on a physical screen other than the one where the display area is located, or on a different part of the same physical screen as the display area, selected and distinguished according to the screen form and/or state.
Figure 1 shows several situations where the same physical screen is used to implement an extended area. Figure 1(a) shows the extended area located at the edge area of a curved or normal screen, in which the middle part is the main display area A1 and two elongated strips on the sides are set off as the extended area A2. Figure 1(b) shows the extended area located in a certain part of the screen area of a curved or normal screen, where a larger block of the screen is set off as the extended area A2. Figure 1(c) shows the extended area located in one screen, or part of one screen, of a folding screen, where the folding screen can be folded into two parts of similar size, similar to the upper and lower halves of a laptop; the two screen parts that lie in different planes in the folded state are divided into a display area A1 and an extended area A2. Figure 1(d) shows the extended area located in the folding region of a folding screen in the folded state, where the folding screen can be folded into a book shape with a folding region in the middle; the folding region can be set as the extended area A2, while the unfolded portions (i.e., the flat portions) are set as the display area A1. In the situations shown in Figure 1, the extended area can provide independent input, or supplementary input, for the display area. The specific content of the input provided by the extended area is discussed below.
It should be noted that in Figure 1(a), Figure 1(b) and Figure 1(d), the extended area and the display area belong to the same physical screen, so the display control and input control of the extended area and the display area are originally implemented by the same hardware and software modules. In order to implement the different functions that the embodiments of this application assign to the extended area and the display area, the information/events of the extended area and the display area need to be processed separately at the software module level.
In Figure 1(c), considering the characteristics of the folding screen, the extended area and the display area may be driven and controlled by different hardware. In this case, a communication mechanism for events/information from the different hardware needs to be established in the software modules in order to implement the different functions that the embodiments of this application assign to the extended area and the display area. If the extended area and the display area use the same hardware, adjustments can be made at the software module level by referring to the approaches of Figure 1(a), Figure 1(b) and Figure 1(d).
Of course, Figure 1 is only an example, and the extended area in this application is not limited to the situations in Figure 1.
Figure 2 is a schematic diagram of a user interaction system according to an embodiment of this application. The user interaction system can be implemented in an electronic device. As shown in Figure 2, the electronic device includes two physical screens, for example a liquid crystal display (LCD) 212 and an LCD 222. Both the LCD 212 and the LCD 222 are touch screens. The display area A1 is implemented on the LCD 212 and may be all or part of the LCD 212. The extended area A2 is implemented on the LCD 222 and may be all or part of the LCD 222. Of course, the display screen referred to in this application may include, but is not limited to, a liquid crystal display screen, a light-emitting diode (LED) display screen, and an organic light-emitting diode (OLED) display screen. Merely for convenience of description, LCD is used as the example below.
Corresponding to the LCD 212, the LCD driver 215 receives one or more operations of the user on the LCD 212, combines these operations into various operation events, and marks them as operations B1 occurring in the display area A1. The framework layer input subsystem 219 provides the operation event B1 to the application (not shown). It should be noted that the application here can be one application or multiple applications.
Taking the Android system as an example, in the framework layer input subsystem 219, the start() method in InputManagerService is called through JNI to start the native-layer InputReaderThread and InputDispatcherThread threads, thereby starting the operation of the input system. InputReaderThread mainly executes content related to InputReader: it reads events from EventHub and preprocesses them; it then processes each event according to the policy; finally it sends a message to InputDispatcher to notify it that the event has occurred. InputDispatcher then starts the distribution of events and distributes them to the window manager (WindowManager) or the application through an InputChannel.
On the other hand, the data that needs to be displayed, including graphics and video data from applications and composition data from the system, is provided to the LCD driver 214 via the framework layer display system, so that the output of the system and application programs is presented on the LCD 212.
Taking the Android system as an example, the display system may include a window manager 218 and a surface compositor 214. The window manager 218 creates windows under the instructions of the application and creates a surface for each window to let the application draw various objects on it. A surface is an object that points to video memory and is used to be drawn on the screen; all visible windows have a surface that can be drawn on. The system uses the Surface Flinger 214 service to render each surface onto the final screen according to the correct depth information.
For the LCD 222, the LCD driver 215 receives the user's operations on the LCD 222 and combines these operations into various operation events, which are marked as operations B2 in the extended area A2; this is not repeated here. The corresponding operation events are then sent to the framework layer input subsystem 219. It should be noted that different drivers can be used for driving the LCD 212 and the LCD 222.
In the embodiments of this application, the framework layer input subsystem 219 can be provided with an event processing unit that determines whether an operation event occurred in the display area or the extended area, processes the operation events that occurred in the extended area (i.e., on the LCD 222), maps these operation events onto the display area (i.e., the LCD 212), and then provides the operation events that originally occurred on the LCD 222 but have been mapped onto the LCD 212. In one example, the operation events that originally occurred on the LCD 222 but are mapped onto the LCD 212, together with the operation events that occurred on the LCD 212 (if any), are sent to the application running on the LCD 212, so that the application acts according to those operation events.
Those skilled in the art will appreciate that the mapping process can be implemented as a function, and the function can be implemented in any one of the various links of the framework layer input subsystem 219, so as to map operation events from the extended area onto the display area.
The embodiment shown in Figure 2 can also be applied to situations with more than two physical screens; in that case, two of them can be selected as the screens that carry the display area and the extended area respectively.
Figure 3 is a schematic diagram of a user interaction system according to another embodiment of this application. The user interaction system can be implemented in an electronic device.
As shown in Figure 3, the electronic device has one physical touch screen, on which a display area A1 and an extended area A2 can be set; the physical screen and its display area A1 and extended area A2 can be configured as shown in Figure 1. The display area A1 can provide system and application interface output, while the display area A1 and the extended area A2 can each provide touch input.
The LCD driver 315 can receive the user's operations on the LCD 312 and combine these operations into various operation events. At the same time, the coordinates of the area where each operation event occurs are determined. In this embodiment, operations can be divided, according to the coordinates of the occurrence area, into operations B1 in the display area A1 and operations B2 in the extended area A2. The corresponding operation events are then sent to the framework layer input subsystem 319.
The framework layer input subsystem 319 provides the operation events B1 occurring in the display area A1 and the operation events B2 mapped from the extended area A2 onto the display area A1. In one example, the framework layer input subsystem 319 provides the operation events occurring in the display area A1 to the application 219; for an operation event B2 occurring in the extended area A2, the framework layer input subsystem 319 provides the mapped operation event B2 to the application 219. Mapping here means that an operation event that originally occurred in the extended area A2 is treated, after coordinate conversion, as an operation event that occurred in the display area A1.
On the other hand, the application can create one or more windows through the window manager 318, and the window manager 318 creates a surface for each window to let the application draw various graphics or display videos on it. A surface is an object that points to video memory and is used to be drawn on the screen; all visible windows have a surface that can be drawn on. The Surface Flinger 314 in the underlying display system can render each surface onto the final screen according to the correct depth information.
The surface compositor 314 can determine the size of the screen according to the system configuration or software code. In this embodiment, the size of the screen is set to the display area A1 through system configuration or software code. In this way, the surface compositor 314 will configure the buffer and the size of the composed image according to the set screen size. Likewise, the window manager 318 determines the size of the window canvas according to the set screen size.
Figure 4 is a schematic diagram of a user interaction system according to yet another embodiment of this application. The user interaction system of Figure 4 differs from that of Figure 3 in that, in Figure 4, the window manager 418 puts each application in a forced split-screen state, so that the window of each application is restricted to the display area A1, and consequently the pictures or videos of each application can only be drawn in the display area A1. In one example, when receiving calls such as getDisplaySize or getRealSize from an application, the window manager 418 may return the size of A1 through the interface. In this way, each application will regard the display area A1 as the display area set by the system, while the system level still perceives the entire screen.
Figure 5 is a schematic diagram of a user interaction system according to still another embodiment of this application. The user interaction system of Figure 5 differs from that of Figure 4 in that a dedicated application 525 is provided, and the application 525 is configured to correspond to the extended area A2. The framework layer input subsystem 319 does not perform special processing on an operation event B2 occurring in the extended area A2, but distributes it in the normal manner, so that the application 525 receives the operation event B2 that occurred in the extended area A2. The application 525 processes and redistributes the operation event B2, so that the operation event B2 returns to the framework layer input subsystem 319, except that the operation event B2 at this point has been mapped onto the display area A1. The framework layer input subsystem 319 provides the operation events B2 mapped onto the display area A1 and the operation events B1 occurring on the display area A1, for example by distributing them to the various applications 219. In one example, the application 525 returns the operation event to the framework layer input subsystem by injecting a simulated event via injectInputEvent.
Figure 6 is a schematic diagram of a user interaction system according to still another embodiment of this application. Figure 6 differs from Figure 3 in that the application 525 and a corresponding framework layer input subsystem 419 are specially provided. For the working mechanism of the application 525 and the framework layer input subsystem 419, refer to the description above in conjunction with Figure 5, which is not repeated here.
It should be noted that although it was mentioned above that the extended area receives touch input, it is also possible, at the same time, to draw a cursor or similar icon at the position of the touch input in the extended area through the display system (for example, using views), so as to give the user visual feedback.
Figure 7 is a schematic diagram of a physical screen divided into a display area and an extended area. The division can be performed on a physical screen as shown in Figure 1, or on separate physical screens. As shown in Figure 7, the physical screen A has a size of W x H; its origin is defined as (0, 0) and its diagonal point as (W, H). If the physical screen A is divided equally into the display area A1 and the extended area A2, the coordinate origin of the display area A1 is (0, 0) and its diagonal point is (W, H/2); the coordinate origin of the extended area A2 is (0, 0) and its diagonal point is (W, H/2). Areas on separate physical screens can be divided in the same way.
Figure 8 is a schematic diagram of a 1:1 mapping between the display area and the extended area. The mapping can be implemented on the physical screen shown in Figure 7. As shown in Figure 8, the physical screen is divided equally into a display area A1 and an extended area A2. Referring to Figure 7, the coordinate origin of the display area A1 is (0, 0) and its diagonal point is (W, H/2); the coordinate origin of the extended area A2 is (0, 0) and its diagonal point is (W, H/2). The mapping from the extended area A2 to the display area A1 can therefore be completed by area substitution, without any mapping of position coordinates. In other words, any touch behavior of the user in the extended area A2 (for example, at P2) can be directly mapped to the display area A1 (for example, to P1) in a 1:1 relationship.
Figure 9 is a schematic diagram of a non-proportional mapping between the display area and the extended area. As shown in Figure 9, when the extended area A2 and the display area A1 are not in equal proportion, a certain area A3 can be selected in the extended area, and a proportional mapping performed between A3 and A1. For example, the size of A1 is (W1, H1) and the size of A3 is (W3, H3). The local coordinates of a point P2 located in A3 are (a, b). P2 is then mapped to the point P1 in A1, and the local coordinates of P1 are ((W1/W3)*a, (H1/H3)*b).
Figure 10 is another schematic diagram of a non-proportional mapping between the display area and the extended area. As shown in Figure 10, when the extended area A2 and the display area A1 are not in equal proportion, an area A3 can be selected in the extended area and an area A4 in the display area, and a proportional mapping performed between A3 and A4. In one case, A3 can coincide with A2, so that the proportional mapping is performed between A2 and A4.
The mapping of operation events from the extended area to the display area is reflected not only in the change of area and the mapping of position/coordinates, but also in the mapping of the gesture that the operation itself represents. Some gesture mapping tables can be prefabricated in the input subsystem: when an operation event B22 occurs in the extended area, an operation event B12 occurs in the display area through the mapping, and B12 can be different from B22.
Figure 11 is a schematic diagram illustrating the mapping of different gestures between the display area and the extended area. A single tap at the point P1 of the extended area A2 is mapped to a behavior of sliding from point P2 to point P3 in the display area A1; the longer P1 is pressed, the longer the sliding distance from P2 to P3.
In one embodiment, the mapping of operation events occurring in the extended area to the display area can take effect globally (in all scenarios). For example, whenever the tap operation of Figure 11 occurs, a sliding behavior appears in the display area.
Of course, certain preset gestures can also be set to take effect only locally. For example, the game Honor of Kings has a skill combo that requires the user to press the three keys T2, T3 and T4 together in order to release the combo. In an embodiment of this application, the input subsystem can define a function button S1 in the extended area for Honor of Kings. When the user presses the button S1 in the extended area, the event processing module recognizes the Honor of Kings game scene and automatically distributes the T2, T3 and T4 key combination events to the input subsystem (see Figure 14).
For settings that take effect globally or locally, a whitelist can be used. InputDispatcher determines to distribute the T2, T3 and T4 key combination events corresponding to S1 to the application specified in the whitelist (for example, the Honor of Kings game).
The user can also define mappings interactively through a recording interface. First, the operation in the extended area is defined: the user can choose a tap, a long press, or a slide, or directly perform an operation on the screen (for example, sliding a distance L), and the operation in the extended area is recorded. The extended-area operation can be described as: tap, long press for time T, slide, two-finger spread, and so on.
After the gesture in the extended area is defined, the key behavior/gesture in the display area is recorded. The key behavior in the display area records information including coordinates, position, and duration.
A mapping relationship is established between the recorded user operation S in the extended area and the operation T in the display area, and the mapping relationship is saved in a configuration file or in another manner. In the subsequent operation phase, once an operation of the same kind as S is detected in the extended area, the input subsystem directly restores and sends the T event.
Figure 12 is a schematic flowchart of a user interaction method according to an embodiment of this application. The user interaction method can be applied to an electronic device provided with at least one touch screen. The at least one touch screen includes a first area and a second area, where the first area is used to provide display output and touch input for a first application, and the second area is used to provide touch input. The electronic device may be the electronic device described in any one of Figures 2-6.
As shown in Figure 12, the method includes: in step 1200, receiving a first operation event on the second area, where the first operation event is a touch input event.
In step 1202, a second operation event for the first area is provided, so that the electronic device acts according to the second operation event, where the second operation event is obtained by mapping the first operation event.
In one embodiment, providing the second operation event for the first area includes sending the second operation event to the first application, so that the first application acts according to the second operation event.
In one embodiment, the at least one touch screen includes a first touch screen and a second touch screen, the first area is located on the first touch screen, and the second area is located on the second touch screen; the method includes mapping the first operation event in the second area to the second operation event for the first area.
In one embodiment, the first area and the second area are located on different parts of a same touch screen of the at least one touch screen; the method further includes mapping the first operation event in the second area to the second operation event for the first area; and the method further includes configuring the window corresponding to the first application on the first area.
In one embodiment, the first area and the second area are located on different parts of a same touch screen of the at least one touch screen; the method further includes receiving the first operation event by using a second application, and mapping the first operation event in the second area to the second operation event for the first area; and the method further includes configuring the window corresponding to the first application on the first area.
For implementation details of the user interaction method, refer to the foregoing description in conjunction with Figures 1-11, which is not repeated here.
Figure 13 is a schematic structural diagram of an electronic device according to an embodiment of this application. As shown in Figure 13, the electronic device includes a processor 1310, a memory 1320, and at least one touch screen (for example, a touch screen 1330 and a touch screen 1340). The at least one touch screen includes a first area and a second area, where the first area is used to provide display output and touch input for at least one application, and the second area is used to provide touch input. The memory 1320 is used to store computer execution instructions; when the electronic device runs, the processor 1310 executes the computer execution instructions stored in the memory 1320, so that the electronic device executes the method shown in Figure 12.
In some embodiments, the terminal further includes a communication bus 1350, through which the processor 1310 can be connected to the memory 1320 and to the at least one touch screen (which may be the touch screen 1330 or the touch screen 1340).
The method steps in the embodiments of this application can be implemented by hardware, or by a processor executing software instructions. The software instructions can be composed of corresponding software modules (such as the input subsystem, window manager, and surface compositor above), and the software modules can be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor, so that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be an integral part of the processor. The processor and the storage medium may be located in an ASIC.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented using software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of this application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted through a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (such as infrared, radio, or microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, hard disk, or magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
It can be understood that the various numerals involved in the embodiments of this application are merely distinctions made for convenience of description and are not intended to limit the scope of the embodiments of this application.

Claims (20)

  1. An electronic device, comprising:
    at least one touch screen, wherein the at least one touch screen comprises a first area and a second area, the first area is used to provide display output and touch input for a first application, and the second area is used to provide touch input; and
    a processor, configured to receive a first operation event on the second area, wherein the first operation event is a touch input event, and further configured to provide a second operation event for the first area according to the first operation event, so that the electronic device acts according to the second operation event, wherein the second operation event is obtained by mapping the first operation event.
  2. The electronic device according to claim 1, wherein the processor is further configured to send the second operation event to the first application, so that the first application acts according to the second operation event.
  3. The electronic device according to claim 1 or 2, wherein the at least one touch screen comprises a first touch screen and a second touch screen, the first area is located on the first touch screen, and the second area is located on the second touch screen;
    the processor is further configured to map the first operation event in the second area to the second operation event for the first area.
  4. The electronic device according to claim 1 or 2, wherein
    the first area and the second area are located on different parts of a same touch screen of the at least one touch screen;
    the processor is further configured to map, through an event input subsystem, the first operation event in the second area to the second operation event for the first area; and
    the processor is further configured to configure the window corresponding to the first application on the first area.
  5. The electronic device according to claim 1 or 2, wherein
    the first area and the second area are located on different parts of a same touch screen of the at least one touch screen;
    the processor is further configured to receive the first operation event by using a second application, and to map the first operation event in the second area to the second operation event for the first area; and
    the processor is further configured to configure the window corresponding to the first application on the first area.
  6. The electronic device according to claim 1 or 2, wherein
    the first area has a first size, the second area has a second size, and the first size and the second size are in equal proportion; and
    the processor is further configured to map the coordinates of the first operation event in the second area to the coordinates of the second operation event in the first area according to the ratio of the second size to the first size.
  7. The electronic device according to claim 1 or 2, wherein
    the first area has a first size, a third area in the second area has a third size, and the first size and the third size are in equal proportion; and
    the processor is further configured to map the coordinates of the first operation event in the third area to the coordinates of the second operation event in the first area according to the ratio of the third size to the first size.
  8. The electronic device according to claim 1 or 2, wherein
    a fourth area in the first area has a fourth size, a third area in the second area has a third size, and the fourth size and the third size are in equal proportion; and
    the processor is further configured to map the coordinates of the first operation event in the third area to the coordinates of the second operation event in the fourth area according to the ratio of the third size to the fourth size.
  9. The electronic device according to claim 1 or 2, wherein the first operation event comprises one operation, and the second operation event comprises two or more operations.
  10. The electronic device according to claim 1 or 2, wherein the processor is further configured to provide the second operation event for the first area according to the first application.
  11. The electronic device according to claim 1 or 2, wherein the electronic device further comprises a memory, configured to store a mapping relationship between a first operation in the second area and a second operation in the first area; and
    the processor is further configured to provide the second operation event for the first area according to the mapping relationship and the first operation in the second area.
  12. A user interaction method, applied to an electronic device provided with at least one touch screen, wherein the at least one touch screen comprises a first area and a second area, the first area is used to provide display output and touch input for a first application, and the second area is used to provide touch input, the method comprising:
    receiving a first operation event on the second area, wherein the first operation event is a touch input event; and
    providing a second operation event for the first area according to the first operation event, so that the electronic device acts according to the second operation event, wherein the second operation event is obtained by mapping the first operation event.
  13. The method according to claim 12, wherein providing the second operation event for the first area comprises sending the second operation event to the first application, so that the first application acts according to the second operation event.
  14. The method according to claim 12 or 13, wherein the at least one touch screen comprises a first touch screen and a second touch screen, the first area is located on the first touch screen, and the second area is located on the second touch screen;
    the method comprises mapping the first operation event in the second area to the second operation event for the first area.
  15. The method according to claim 12 or 13, wherein the first area and the second area are located on different parts of a same touch screen of the at least one touch screen;
    the method further comprises mapping, through an event input subsystem, the first operation event in the second area to the second operation event for the first area; and
    the method further comprises configuring the window corresponding to the first application on the first area.
  16. The method according to claim 12 or 13, wherein the first area and the second area are located on different parts of a same touch screen of the at least one touch screen;
    the method further comprises receiving the first operation event by using a second application, and mapping the first operation event in the second area to the second operation event for the first area; and
    the method further comprises configuring the window corresponding to the first application on the first area.
  17. The method according to claim 12 or 13, wherein providing the second operation event for the first area comprises providing the second operation event for the first area according to the first application.
  18. The method according to claim 12 or 13, further comprising: recording user operation events in the second area and operation events in the first area; and saving a mapping relationship between the operation events in the second area and the operation events in the first area;
    wherein providing the second operation event for the first area comprises providing the second operation event for the first area according to the mapping relationship and the first operation event in the second area.
  19. A computer storage medium, comprising computer instructions which, when run on an electronic device, cause the electronic device to execute the method according to any one of claims 12-18.
  20. A computer program product, wherein when program code included in the computer program product is executed by a processor in an electronic device, the method according to any one of claims 12-18 is implemented.
PCT/CN2020/104906 2019-09-06 2020-07-27 User interaction method and electronic device WO2021042910A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910841190.2A CN110806830A (zh) 2019-09-06 2019-09-06 User interaction method and electronic device
CN201910841190.2 2019-09-06

Publications (1)

Publication Number Publication Date
WO2021042910A1 (zh)

Family

ID=69487487

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/104906 WO2021042910A1 (zh) 2019-09-06 2020-07-27 User interaction method and electronic device

Country Status (2)

Country Link
CN (1) CN110806830A (zh)
WO (1) WO2021042910A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110806830A (zh) * 2019-09-06 2020-02-18 华为技术有限公司 User interaction method and electronic device
CN112269518B (zh) * 2020-11-17 2022-03-22 三星电子(中国)研发中心 Touch method and device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140340336A1 (en) * 2013-05-15 2014-11-20 Samsung Electronics Co., Ltd. Portable terminal and method for controlling touch screen and system thereof
CN108319422A (zh) * 2017-01-18 2018-07-24 中兴通讯股份有限公司 Multi-screen interactive touch display method, device, storage medium and terminal
CN108595076A (zh) * 2018-05-01 2018-09-28 苏州鸥鹄智能科技有限公司 Touch interaction method for an electronic device
CN108992924A (zh) * 2018-08-21 2018-12-14 苏州蜗牛数字科技股份有限公司 Method for triggering ordered touch screen operation events
CN110806830A (zh) * 2019-09-06 2020-02-18 华为技术有限公司 User interaction method and electronic device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103593136A (zh) * 2013-10-21 2014-02-19 广东欧珀移动通信有限公司 Method and device for one-handed operation of a large-screen touch terminal, and touch terminal
CN108459790A (zh) * 2018-05-01 2018-08-28 苏州鸥鹄智能科技有限公司 Touch interaction method for an electronic device
CN108595075A (zh) * 2018-05-01 2018-09-28 苏州鸥鹄智能科技有限公司 Touch interaction method for an electronic device
CN108563379A (zh) * 2018-05-01 2018-09-21 苏州鸥鹄智能科技有限公司 Touch interaction method for an electronic device
CN109062464B (zh) * 2018-06-27 2021-03-02 Oppo广东移动通信有限公司 Touch operation method and device, storage medium and electronic device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140340336A1 (en) * 2013-05-15 2014-11-20 Samsung Electronics Co., Ltd. Portable terminal and method for controlling touch screen and system thereof
CN108319422A (zh) * 2017-01-18 2018-07-24 中兴通讯股份有限公司 Multi-screen interactive touch display method, device, storage medium and terminal
CN108595076A (zh) * 2018-05-01 2018-09-28 苏州鸥鹄智能科技有限公司 Touch interaction method for an electronic device
CN108992924A (zh) * 2018-08-21 2018-12-14 苏州蜗牛数字科技股份有限公司 Method for triggering ordered touch screen operation events
CN110806830A (zh) * 2019-09-06 2020-02-18 华为技术有限公司 User interaction method and electronic device

Also Published As

Publication number Publication date
CN110806830A (zh) 2020-02-18

Similar Documents

Publication Publication Date Title
US20230049473A1 (en) Method and device for managing tab window indicating application group including heterogeneous applications
US10754711B2 (en) Multi-window control method and electronic device supporting the same
CN109074276B (zh) Tabs in system task switchers
JP5373011B2 (ja) Electronic device and information display method thereof
EP2360569B1 (en) Method and apparatus for providing informations of multiple applications
US8924885B2 (en) Desktop as immersive application
US20100325527A1 (en) Overlay for digital annotations
US10417018B2 (en) Navigation of immersive and desktop shells
US20130057572A1 (en) Multiple Display Device Taskbars
JP2017517055A (ja) Command user interface for displaying and scaling selectable controls and commands
US9843665B2 (en) Display of immersive and desktop shells
US20120066640A1 (en) Apparatus for providing multi-mode warping of graphical user interface objects
WO2018119575A1 (zh) Display method and electronic device
US10241977B2 (en) Combining and displaying multiple document areas
US9164974B2 (en) Method and apparatus for providing an electronic book service in a mobile device
WO2021042910A1 (zh) User interaction method and electronic device
WO2023061280A1 (zh) Application display method and apparatus, and electronic device
WO2024037418A1 (zh) Display method and apparatus, electronic device, and readable storage medium
WO2019071594A1 (zh) Display processing method and electronic device
US20140043267A1 (en) Operation Method of Dual Operating Systems, Touch Sensitive Electronic Device Having Dual Operating Systems, and Computer Readable Storage Medium Having Dual Operating Systems
US20130127745A1 (en) Method for Multiple Touch Control Virtual Objects and System thereof
JP5995206B2 (ja) Information processing apparatus
WO2023093661A1 (zh) Interface control method and apparatus, electronic device, and storage medium
WO2022143337A1 (zh) Display control method and apparatus, electronic device, and storage medium
CN115617225A (zh) Application interface display method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20859759

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20859759

Country of ref document: EP

Kind code of ref document: A1