WO2015161653A1 - Terminal operation method and terminal device (一种终端操作方法及终端设备) - Google Patents

Terminal operation method and terminal device (一种终端操作方法及终端设备)

Info

Publication number
WO2015161653A1
WO2015161653A1 (PCT/CN2014/092946, CN2014092946W)
Authority
WO
WIPO (PCT)
Prior art keywords
interface
sub
touch operation
main interface
drawing buffer
Prior art date
Application number
PCT/CN2014/092946
Other languages
English (en)
French (fr)
Inventor
汪玮
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2015161653A1 publication Critical patent/WO2015161653A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to the field of communications, and in particular, to a terminal operating method and a terminal device.
  • a tablet (also called a Tablet PC) is a small, portable personal computer with a touch screen as its basic input device. The touch screen (also known as tablet technology) allows users to work with a stylus or digital pen instead of a traditional keyboard or mouse. Users can enter information via built-in handwriting recognition, an on-screen soft keyboard, voice recognition, or a real keyboard (on models so equipped).
  • because the screen of an ordinary tablet is larger than 6 inches, the user cannot conveniently operate it with one hand. For example, when the tablet is held in the right hand alone, the lower-left corner of the tablet must still be tapped with the left hand, and vice versa. Large-screen mobile terminals have the same single-hand operation problem.
  • Embodiments of the present invention provide a terminal operation method and a terminal device to solve the problem that a large-screen terminal device is inconvenient to operate with one hand.
  • an embodiment of the present invention provides a terminal device, where the terminal device includes a display unit, an acquiring unit, and a processing unit connected to the acquiring unit, where:
  • the display unit is configured to display, in a main interface displayed by the terminal and in response to triggering by a user, a sub-interface onto which the picture in the main interface is mapped;
  • the acquiring unit is configured to detect a touch operation occurring in the sub-interface
  • the processing unit is configured to map, according to the mapping relationship between the main interface and the sub-interface, the location of the touch operation in the sub-interface to a corresponding location in the main interface, and to perform the mapped touch operation at the corresponding location in the main interface.
  • the mapping relationship includes a scaling ratio of the main interface to the sub-interface
  • the display unit is specifically configured to: create a second drawing buffer that is independent of the first drawing buffer of the main interface; reduce the picture drawn in the first drawing buffer according to the scaling ratio and then draw it into the second drawing buffer; and output the picture drawn in the second drawing buffer to the display screen to form the sub-interface.
  • the display unit is further configured to keep the picture drawn in the second drawing buffer and the picture drawn in the first drawing buffer in synchronization with each other.
  • the performing, by the processing unit, of the mapped touch operation at the corresponding location in the main interface specifically includes: the processing unit identifies the operation type corresponding to the touch operation at the location in the main interface; performs business processing according to the operation type; and sends the processing result to the display unit;
  • the display unit is further configured to draw a picture to be displayed to the first drawing buffer according to the processing result, and output the image to the main interface;
  • the keeping, by the display unit, of the picture drawn in the second drawing buffer in synchronization with the picture drawn in the first drawing buffer specifically includes: when the display unit draws the picture to be displayed to the first drawing buffer, it reduces the picture to be displayed according to the zoom ratio, then draws it into the second drawing buffer, and outputs the picture drawn in the second drawing buffer to the sub-interface.
  • the processing unit is specifically configured to: acquire a parameter for describing the touch operation, where the parameter includes at least the coordinates of the touch operation in the sub-interface; map, according to the mapping relationship between the sub-interface and the main interface, the coordinates of the touch operation in the sub-interface to coordinates in the main interface; update the coordinates in the parameter describing the touch operation to the mapped coordinates; and perform the mapped touch operation in the main interface according to the updated parameter.
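  • As a rough sketch (not from the patent; the function name, the dict-shaped parameter, and the fixed 2:1 ratio are illustrative assumptions), the coordinate update described in this claim might look like:

```python
def map_sub_to_main(sub_x, sub_y, scale):
    """Map a touch location in the sub-interface to the corresponding
    location in the main interface using the main:sub scaling ratio."""
    return sub_x * scale, sub_y * scale

# The parameter describing the touch operation carries at least its
# coordinates; only the coordinates are updated during mapping, while
# other state (e.g. duration) is left untouched.
touch = {"x": 100, "y": 75, "duration_ms": 40}
touch["x"], touch["y"] = map_sub_to_main(touch["x"], touch["y"], 2)
# touch now describes the same operation at (200, 150) in the main interface
```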
  • an embodiment of the present invention provides a method for operating a terminal, including:
  • in response to the trigger of the user, the terminal device displays, in the displayed main interface, a sub-interface mapped with the picture in the main interface;
  • when a touch operation occurring in the sub-interface is detected, its location is mapped according to the mapping relationship between the main interface and the sub-interface, and the touch operation is performed at the corresponding position in the main interface.
  • the mapping relationship includes a zoom ratio of the main interface to the sub-interface, and the displaying, by the terminal in the displayed main interface, of the sub-interface mapped with the picture in the main interface includes:
  • the method further includes:
  • the performing the touch operation in the corresponding position in the main interface includes:
  • the synchronizing the picture drawn in the second drawing buffer with the picture drawn in the first drawing buffer comprises:
  • when the terminal device draws the picture to be displayed to the first drawing buffer, the picture to be displayed is reduced according to the zoom ratio and then drawn to the second drawing buffer, and the picture drawn in the second drawing buffer is output to the sub-interface.
  • the mapping, according to the mapping relationship between the main interface and the sub-interface, of the location of the touch operation in the sub-interface to the corresponding location of the main interface specifically includes:
  • obtaining a parameter for describing the touch operation, the parameter including at least the coordinates of the touch operation in the sub-interface;
  • the performing of the mapped touch operation at the corresponding location in the main interface specifically includes:
  • the mapped touch operation is performed in the main interface according to the updated parameter.
  • the embodiment of the present invention displays, in the displayed main interface, a sub-interface mapped with the picture in the main interface; when a touch operation in the sub-interface is acquired, the touch operation in the sub-interface is mapped to a touch operation corresponding to the main interface, making it convenient for the user to operate the content of the entire screen interface.
  • FIG. 1 is a schematic diagram of a logical structure of a terminal device according to an embodiment of the present invention;
  • FIG. 2 is a schematic diagram showing display of a sub-interface and a main interface in a terminal device according to an embodiment of the present invention
  • FIG. 3 is a first schematic flowchart of a method for operating a terminal according to an embodiment of the present invention;
  • FIG. 4 is a second schematic flowchart of a method for operating a terminal according to an embodiment of the present invention;
  • FIG. 5 is a schematic structural diagram of hardware of a terminal device according to an embodiment of the present invention.
  • the terminal device may be a fixed terminal device or a mobile terminal device.
  • a mobile terminal device refers to a terminal device that can be used while mobile, and may include a mobile phone, a notebook computer, a tablet computer, a PDA, or a POS machine, and may also include an on-board computer.
  • as networks and technologies move toward ever-greater bandwidth, the mobile communications industry will move toward a true era of mobile information.
  • mobile terminals already have powerful processing capabilities and are changing from simple call tools into integrated information processing platforms; as a result, humanized and intelligent operation of mobile terminals has become a focus of attention for those skilled in the art and for users.
  • the embodiment of the present invention adds another display interface convenient for one-hand operation to the original display interface of the terminal device to form a dual display interface.
  • the display interface capable of one-hand operation has a mapping relationship with the original display interface and is a thumbnail display of the original display interface; the user can complete operations on the original display interface within the thumbnail display interface.
  • the terminal device includes a display unit 01, an acquiring unit 02, and a processing unit 03 connected to both the display unit 01 and the acquiring unit 02, wherein:
  • the display unit 01 is configured to display a sub-interface mapped with a picture in the main interface in the main interface of the display in response to the trigger of the user.
  • the sub-interface is a thumbnail display of the main interface.
  • the sub-interface is displayed on top of the main interface to form a dual display interface with the main interface, as shown in FIG. 2, which is a schematic diagram of the sub-interface and the main interface in the terminal device.
  • 201 is the main interface
  • 202 is the sub-interface.
  • the obtaining unit 02 is configured to detect a touch operation occurring in the sub-interface.
  • the processing unit 03 is configured to map a location of the touch operation in the sub-interface to a corresponding location in the main interface according to a mapping relationship between the main interface and the sub-interface, and perform a mapped touch operation on the corresponding location in the main interface.
  • the embodiment of the present invention maps the content in the main interface to the sub-interface, so that the user can perform touch operations on the main interface from within the sub-interface; this makes it convenient to operate the main interface by one-handed operation of the sub-interface, thereby realizing one-handed operation of the entire screen of a large-screen terminal device.
  • when the sub-interface is displayed, the terminal device adopts an output manner independent of the main interface and displays the sub-interface on the original display interface to form a dual display interface.
  • the independent output of the sub-interface can be realized by creating a separate drawing buffer for the sub-interface.
  • This function of creating a separate drawing buffer can be implemented in the display unit 01.
  • the specific implementation of the components shown in FIG. 1 will be described in detail below.
  • the mapping relationship in the embodiment of the present invention includes a scaling ratio of the main interface and the sub-interface.
  • the display unit 01 creates a second drawing buffer that is independent of the first drawing buffer of the main interface.
  • the picture drawn in the first drawing buffer is reduced according to the scaling ratio and then drawn into the second drawing buffer.
  • Outputting the picture drawn in the second drawing buffer to the display screen forms the sub-interface.
  • the above embodiment creates, in addition to the first drawing buffer of the main interface, a second drawing buffer that outputs the sub-interface, so that the sub-interface on the terminal device can be controlled independently and displaying it does not affect the main interface.
  • the display unit 01 is further configured to keep the content drawn in the second drawing buffer synchronized with the content drawn in the first drawing buffer; that is, when the content displayed in the main interface changes, the content displayed in the sub-interface is always kept consistent with it.
  • the above synchronous update operation can be completed by the processing unit 03 cooperating with the display unit 01.
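  • One way to picture this synchronized update is the following minimal sketch, under the assumptions that buffers are plain 2D pixel lists and the reduction is a nearest-neighbour downscale (real devices would use GPU surfaces; all names are invented for illustration):

```python
class DisplayUnit:
    """Keep the second (sub-interface) buffer in sync with the first
    (main-interface) buffer: every draw to the first buffer also
    produces a reduced copy in the second buffer."""
    def __init__(self, main_w, main_h, scale):
        self.scale = scale
        self.first_buffer = [[0] * main_w for _ in range(main_h)]
        self.second_buffer = [[0] * (main_w // scale)
                              for _ in range(main_h // scale)]

    def draw(self, pixels):
        # Draw the picture to be displayed to the first buffer...
        self.first_buffer = pixels
        # ...and immediately draw the reduced picture to the second
        # buffer, so the sub-interface always matches the main interface.
        s = self.scale
        self.second_buffer = [[pixels[y * s][x * s]
                               for x in range(len(pixels[0]) // s)]
                              for y in range(len(pixels) // s)]
```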
  • the specific processing performed by the processing unit 03 for the mapped operation at the corresponding position in the main interface includes: identifying the operation type corresponding to the touch operation at the location in the main interface; performing business processing according to the identified operation type; and sending the processing result to the display unit 01.
  • the display unit 01 draws the picture to be displayed to the first drawing buffer according to the processing result of the processing unit 03 and outputs it to the main interface.
  • the display unit 01 is configured to keep the picture drawn in the second drawing buffer in synchronization with the picture drawn in the first drawing buffer, and specifically includes: when the display unit 01 draws the picture to be displayed to the first drawing buffer, The picture to be displayed is reduced in accordance with the zoom ratio, and then drawn to the second drawing buffer, and the picture drawn in the second drawing buffer is output to the sub-interface.
  • since drawing the main interface into the first drawing buffer and outputting the first drawing buffer to the display screen are capabilities of existing terminal devices, the process by which the display unit 01 outputs the first drawing buffer to the main interface is not described in detail in the present invention.
  • the mapping of the location of the touch operation in the sub-interface to the corresponding location in the main interface includes: the processing unit 03 acquires a parameter for describing the touch operation, the parameter including at least the coordinates of the touch operation in the sub-interface; maps the coordinates of the touch operation in the sub-interface to coordinates in the main interface according to the mapping relationship between the sub-interface and the main interface; and updates the coordinates in the parameter describing the touch operation to the mapped coordinates.
  • the performing the mapped touch operation by the processing unit 03 in the corresponding position in the main interface specifically includes: performing the mapped touch operation in the main interface according to the updated parameter.
  • the zoom ratio of the main interface and the sub-interface may be used as a mapping relationship between the main interface and the sub-interface.
  • the scaling ratio can be preset on the terminal device.
  • alternatively, the size of the sub-interface is preset on the terminal device, and the scaling ratio of the main interface to the sub-interface is calculated from the size of the main interface and the preset size of the sub-interface.
  • the coordinates in the main interface and the sub-interface are all relative coordinates; that is, the main interface and the sub-interface each use a certain point in their own interface as the coordinate origin, and coordinates in each interface are expressed relative to that origin.
  • the position of the coordinate origin of the main interface within the main interface is consistent with the position of the coordinate origin of the sub-interface within the sub-interface (for example, the bottom-left corner of each interface serves as its coordinate origin).
  • otherwise, the mapping relationship may further include an offset between the coordinate origins of the main interface and the sub-interface, which is used to perform the mapping conversion.
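  • If the two origins do not coincide, the conversion simply adds the origin offset on top of the scaling. A hedged sketch (the function name and the convention that the offset is the sub-interface origin expressed in main-interface coordinates are assumptions):

```python
def map_with_offset(sub_x, sub_y, scale, dx, dy):
    """Map sub-interface coordinates to main-interface coordinates when
    the mapping relationship carries both a scaling ratio and the
    offset (dx, dy) between the two coordinate origins."""
    return dx + sub_x * scale, dy + sub_y * scale
```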
  • the obtaining unit 02 can be implemented by further improving the existing touch screen technology.
  • the acquisition unit 02 detects a touch operation on the touch screen using an existing touch screen technology, and acquires a parameter for describing the touch operation, the parameter including at least coordinates of the touch operation in the screen coordinate system.
  • the obtaining unit 02 acquires the coordinate range of the sub-interface in the screen coordinate system, and recognizes a touch operation occurring in the sub-interface by comparing the coordinates of the touch operation in the screen coordinate system with the coordinate range of the sub-interface in the screen coordinate system.
  • after recognizing a touch operation occurring in the sub-interface, the obtaining unit 02 further passes the parameters of the touch operation to the processing unit 03, so that the processing unit 03 completes the mapping process.
  • the obtaining unit 02 may first convert the coordinates of the touch operation included in the parameter in the screen coordinate system to the coordinates of the touch operation in the sub-interface before transmitting the parameter for describing the touch operation to the processing unit 03.
  • alternatively, the obtaining unit 02 may skip this conversion, which is then performed by the processing unit 03 after it receives the parameter, to obtain a parameter including the coordinates of the touch operation in the sub-interface.
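  • The two responsibilities just described, recognizing that a touch falls inside the sub-interface and converting its screen coordinates to sub-interface coordinates, can be sketched as follows (the rectangle representation of the sub-interface's coordinate range is an illustrative assumption):

```python
def locate_touch(screen_x, screen_y, sub_rect):
    """Return the touch coordinates relative to the sub-interface if the
    touch falls inside it, or None if it acts directly on the main
    interface. sub_rect = (left, top, width, height) in screen coords."""
    left, top, width, height = sub_rect
    if left <= screen_x < left + width and top <= screen_y < top + height:
        return screen_x - left, screen_y - top
    return None
```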
  • the terminal device may further provide functions for the user to operate on the sub-interface itself, for example, changing the position of the sub-interface, changing the size of the sub-interface, or performing an enlarged display in the sub-interface. Therefore, the terminal device may further include a sub-interface operation unit 04, wherein:
  • the obtaining unit 02 is further configured to determine that the touch operation is a touch operation for the main interface, or a touch operation for the sub-interface; if the touch operation is a touch operation for the main interface, the trigger processing unit 03 performs The processing of the location of the touch operation in the sub-interface is mapped to the corresponding location in the main interface; if the touch operation is a touch operation for the sub-interface, the trigger sub-interface operation unit 04 performs processing.
  • the sub-interface operation unit 04 is configured to identify the operation type corresponding to the touch operation on the sub-interface and perform business processing in the sub-interface according to the operation type.
  • when the operation type is setting the position of the sub-interface, the business processing performed by the sub-interface operation unit 04 specifically includes: adjusting the output position of the second drawing buffer according to the set position, so that the display unit 01 outputs the picture drawn in the second drawing buffer to the display according to the adjusted output position to form the sub-interface.
  • when the operation type is setting the size of the sub-interface, the business processing performed by the sub-interface operation unit 04 specifically includes: adjusting the size of the second drawing buffer according to the set size; if the zoom ratio is preset, the sub-interface operation unit 04 also updates the zoom ratio according to the adjusted size of the second drawing buffer.
  • when the operation type of the touch operation in the sub-interface is enlarged display, the business processing performed by the sub-interface operation unit 04 specifically includes: the sub-interface operation unit 04 generates the data of an enlarged display frame, the display unit 01 draws the enlarged display frame, the display content within a specified radius of the coordinates in the sub-interface is enlarged according to a preset ratio, and the display unit 01 draws the enlarged content inside the enlarged display frame.
  • the embodiment of the present invention displays, in the displayed main interface, a sub-interface mapped with the content in the main interface; when a touch operation in the sub-interface is acquired, the touch operation in the sub-interface is mapped to a touch operation corresponding to the main interface, realizing the user's one-handed touch operation on the entire main interface through the sub-interface and thereby improving the operability of the terminal device.
  • FIG. 3 is a schematic diagram of a method for operating a terminal according to an embodiment of the present invention. As shown in FIG. 3, the method includes:
  • Step 301: in response to triggering by the user, the terminal device displays, in the displayed main interface, a sub-interface mapped with the picture in the main interface.
  • the user's trigger may be pressing a corresponding physical button, touching a corresponding virtual button, or performing a corresponding gesture operation (for example, shaking the mobile phone left and right).
  • Both the main interface and the sub-interface can be implemented using "double buffering" drawing technology.
  • double buffering means creating a buffer in memory that is consistent with the screen drawing area: graphics are first drawn into this buffer, and the contents of the buffer are then copied to the screen drawing area in one step, thus forming a display interface on the screen.
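  • A toy illustration of double buffering (pixel lists stand in for the real drawing area; the class and method names are invented for this sketch):

```python
class DoubleBuffer:
    """Draw into an off-screen buffer, then copy it to the visible
    'screen' area in one step, as described above."""
    def __init__(self, width, height):
        self.back = [[0] * width for _ in range(height)]    # in-memory buffer
        self.screen = [[0] * width for _ in range(height)]  # screen drawing area

    def draw_pixel(self, x, y, value):
        self.back[y][x] = value  # drawing never touches the screen directly

    def flip(self):
        # Copy the finished picture to the screen drawing area at one time.
        self.screen = [row[:] for row in self.back]
```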
  • the drawing buffer created for the main interface is referred to as a first drawing buffer
  • the drawing buffer created for the sub-interface is referred to as a second drawing buffer.
  • when the main interface is displayed, the graphics are first drawn into the first drawing buffer in memory and then output from the first drawing buffer to the screen.
  • when the sub-interface is displayed, the graphics are first drawn into the second drawing buffer in memory and then output from the second drawing buffer to the screen.
  • in step 301, the terminal device displays, in the displayed main interface, the sub-interface mapped with the picture in the main interface by mapping the picture drawn in the first drawing buffer to the picture drawn in the second drawing buffer.
  • the mapping relationship may include a scaling ratio of the main interface and the sub-interface.
  • the specific implementation process of step 301 is as follows: the terminal device creates a second drawing buffer independent of the first drawing buffer, where the first drawing buffer is the drawing buffer that outputs the main interface; the picture drawn in the first drawing buffer is reduced according to the zoom ratio and then drawn into the second drawing buffer; and the picture drawn in the second drawing buffer is output to the display to form the sub-interface.
  • Step 302: the terminal device detects a touch operation occurring in the sub-interface.
  • specifically, the terminal device detects a touch operation on the touch screen; then acquires the coordinates of the touch operation in the screen coordinate system and the coordinate range of the sub-interface in the screen coordinate system; and identifies touch operations occurring within the sub-interface by comparing the coordinates of the touch operation in the screen coordinate system with the coordinate range of the sub-interface in the screen coordinate system.
  • Step 303: the terminal device maps, according to the preset mapping relationship between the main interface and the sub-interface, the location of the touch operation in the sub-interface to a corresponding location in the main interface.
  • after the terminal device detects a touch operation occurring in the sub-interface, the terminal device acquires a parameter for describing the touch operation, where the parameter includes at least the coordinates of the touch operation in the sub-interface; maps, according to the mapping relationship between the sub-interface and the main interface, the coordinates of the touch operation in the sub-interface to coordinates in the main interface; and updates the coordinates in the parameter describing the touch operation to the mapped coordinates.
  • the parameters for describing the touch operation may further include status information such as the duration of the touch operation. For example, when the user long-presses a virtual button with a camera icon in the generated sub-interface, the terminal device acquires both the location of the touch operation and its duration.
  • Step 304: the terminal device performs the mapped touch operation at the corresponding location in the main interface.
  • the terminal device identifies a corresponding operation type of the touch operation in the main interface.
  • the terminal device performs business processing according to the operation type, and draws a picture to be displayed to the first drawing buffer according to the processing result, and outputs the picture in the first drawing buffer to the main interface.
  • the terminal device may identify the corresponding operation type of the touch operation in the main interface according to the mapped parameters of the touch operation, for example, the coordinates of the touch operation in the main interface.
  • the specific recognition process can be implemented by using existing gesture recognition technology, and details are not described herein again.
  • parameters such as the touch duration may also be used in the identification process. Since such parameters do not change during mapping, the embodiment of the present invention describes in detail only the coordinate parameters, which are the only ones that need to change during mapping.
  • through the above steps, the terminal device implements the user's one-handed touch operation of the entire screen within the sub-interface.
  • since the picture displayed in the main interface changes with the user's operations, after displaying the sub-interface the terminal device also keeps the picture drawn in the second drawing buffer synchronized with the picture drawn in the first drawing buffer, so that the picture displayed in the sub-interface is consistent with the picture displayed in the main interface.
  • the process of this synchronization update typically occurs after step 304.
  • specifically, when the terminal device draws the picture to be displayed to the first drawing buffer, it further maps the picture to be displayed to the second drawing buffer and outputs the picture drawn in the second drawing buffer to the sub-interface.
  • the mapping of the picture to be displayed to the second drawing buffer in the above process may include: reducing the picture to be displayed according to the zoom ratio of the main interface to the sub-interface and then drawing it into the second drawing buffer.
  • the embodiment of the present invention displays, in the displayed main interface, a sub-interface mapped with the picture in the main interface; when a touch operation in the sub-interface is acquired, the touch operation in the sub-interface is mapped to a touch operation corresponding to the main interface, making it convenient for the user to operate the content of the entire screen interface.
  • the terminal operation method provided by the embodiment of the present invention is described in detail below with reference to FIG. 4, as shown in FIG. 4, the method includes:
  • the terminal device receives an operation by which the user triggers the one-hand operation function key.
  • the one-hand operation function key may be a physical button or a virtual button or a corresponding gesture.
  • a bitmap can be created in the second drawing buffer when the second drawing buffer is created, and the display picture of the sub-interface is drawn on the bitmap in the second drawing buffer.
  • the zoom ratio of the main interface and the sub-interface may be preset, and the bitmap is established according to the zoom ratio of the main interface and the sub-interface.
  • for example, the display size of the main interface a is 800*600.
  • the terminal device calculates the size of the sub-interface b as 400*300 by using the preset zoom ratio.
  • the terminal device creates a bitmap of size 400*300 in the second drawing buffer. It should be noted that the size of a bitmap in a drawing buffer of the terminal device is usually the same as the size of the corresponding display interface.
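  • The size calculation in this example can be expressed directly. This assumes, as the 800*600 to 400*300 example suggests, a simple integer zoom ratio of 2; the function name is illustrative:

```python
def sub_interface_size(main_w, main_h, zoom_ratio):
    """Bitmap size to allocate in the second drawing buffer, derived
    from the main-interface size and the preset zoom ratio."""
    return main_w // zoom_ratio, main_h // zoom_ratio
```

  • With `sub_interface_size(800, 600, 2)` the bitmap comes out as 400*300, matching the example above.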
  • the terminal device reduces the picture in the first drawing buffer according to the scaling ratio of the main interface to the sub-interface and then draws it into the second drawing buffer.
  • the terminal device detects a touch operation on the screen, and obtains a parameter for describing the touch operation.
  • the parameter describing the touch operation includes at least coordinates of the touch operation in the screen coordinate system.
  • Each point in the display of the terminal device has its own coordinates, and the coordinate system used to describe the coordinates of each point in the display is called the screen coordinate system.
  • the coordinate system of the sub-interface and the coordinate system of the main interface are also involved in the embodiment of the present invention.
  • the coordinate origin of the coordinate system of the sub-interface is located in the sub-interface, and the coordinates in the sub-interface are represented by the coordinate origin of the sub-interface.
  • the coordinate origin in the main interface is located in the main interface, and the coordinates in the main interface are represented relative to the coordinate origin of the main interface. Normally, the coordinate origins of the three coordinate systems can coincide.
  • the terminal device determines whether the detected touch operation occurs in the sub-interface. If yes, proceed to step 406. Otherwise, go to step 408.
  • the terminal device acquires a position of the sub-interface in the screen, and the position is represented by a coordinate range of the sub-interface in the screen coordinate system.
  • The terminal device determines whether the coordinates of the touch operation in the screen coordinate system fall within the coordinate range of the sub-interface in the screen coordinate system. If so, the touch operation occurred within the sub-interface; otherwise, it is a touch operation acting directly on the main interface.
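The hit test described above amounts to a point-in-rectangle check in screen coordinates. The sketch below assumes a sub-interface described by an origin and size in the screen coordinate system; the function name and the particular values are illustrative.

```python
def hit_test(touch, sub_origin, sub_size):
    """Return True if a touch (screen coordinates) falls inside the
    sub-interface's coordinate range in the screen coordinate system."""
    (tx, ty), (ox, oy), (w, h) = touch, sub_origin, sub_size
    return ox <= tx < ox + w and oy <= ty < oy + h

# Hypothetical 400*300 sub-interface whose origin sits at (50, 80).
in_sub = hit_test((120, 150), (50, 80), (400, 300))    # inside
on_main = hit_test((500, 400), (50, 80), (400, 300))   # outside
```

A touch that fails this test is treated as acting directly on the main interface.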
  • The terminal device determines whether the touch operation targets the main interface or the sub-interface.
  • If the touch operation targets the main interface, execution continues with step 407; if it targets the sub-interface, step 410 is performed: the terminal device identifies the operation type corresponding to the touch operation in the sub-interface and processes it according to that operation type.
  • When the operation type corresponding to the touch operation in the sub-interface is setting the position of the sub-interface, the terminal device adjusts the output attribute of the second drawing buffer according to the set position, and outputs the picture drawn in the second drawing buffer to the display screen according to the adjusted output attribute.
  • Similarly, when the operation type is setting the size of the sub-interface, the terminal device adjusts the size of the second drawing buffer according to the set size. If a zoom ratio has been preset, the terminal device also updates the zoom ratio according to the adjusted size of the second drawing buffer.
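Updating the zoom ratio after a resize can be sketched as recomputing the ratio from the main interface's size and the second drawing buffer's new size; the helper below is an illustrative assumption, not the patent's implementation.

```python
def update_ratio(main_size, new_sub_size):
    """Recompute the main-to-sub zoom ratio after the user resizes
    the sub-interface (i.e. after the second drawing buffer's size
    has been adjusted). Sizes are (width, height) pairs."""
    return main_size[0] / new_sub_size[0]

# Shrinking the 400*300 sub-interface of an 800*600 main interface
# to 200*150 raises the zoom ratio from 2.0 to 4.0.
ratio = update_ratio((800, 600), (200, 150))
```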
  • When the operation type corresponding to the touch operation in the sub-interface is magnified display, the terminal device generates a magnified display frame, and the display content within a specified radius of the touch coordinates in the sub-interface is enlarged according to a preset ratio and shown in the magnified display frame.
  • the terminal device maps a location of the touch operation occurring in the sub-interface to a corresponding location in the main interface.
  • The terminal device converts the coordinates of the touch operation obtained in step 404 from the screen coordinate system into coordinates in the sub-interface's coordinate system. Then, according to the mapping relationship between the sub-interface and the main interface, the resulting sub-interface coordinates are mapped to the corresponding coordinates of the main interface. The touch operation thus becomes a touch operation acting on the main interface.
  • the zoom ratio of the main interface and the sub-interface is 2:1
  • the display size of the main interface a is a:800*600
  • the size of the sub-interface b is b:400*300
  • The user performs a horizontal slide in the sub-interface from coordinate A: (x, y) = (100, 100) to coordinate B: (x, y) = (200, 100); the terminal device maps the coordinates of the touch operation in the sub-interface into the main interface according to the preset mapping relationship between the sub-interface and the main interface.
  • The mapping relationship used is: main-interface coordinate = sub-interface coordinate * zoom ratio, so the mapped operation is a slide from (100*2, 100*2) to (200*2, 100*2) in the main interface.
  • The coordinates of the touch operation are relative coordinates: regardless of where the user drags the sub-interface, the bottom-left corner of the sub-interface (or any fixed point in the sub-interface) can be used as the coordinate origin b(0, 0).
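Putting the two conversions together (screen coordinates → sub-interface relative coordinates → main-interface coordinates, using main = sub * zoom ratio), a minimal sketch might look like this. The function name and the coinciding-origin assumption are illustrative.

```python
def map_to_main(screen_pt, sub_origin, ratio):
    """Map a touch point from screen coordinates into main-interface
    coordinates: first express it relative to the sub-interface's
    origin, then scale by the main-to-sub zoom ratio."""
    sx = screen_pt[0] - sub_origin[0]
    sy = screen_pt[1] - sub_origin[1]
    return (sx * ratio, sy * ratio)

# With a 2:1 ratio and origins coinciding, a slide in the sub-interface
# from A(100, 100) to B(200, 100) maps to (200, 200) -> (400, 200).
a = map_to_main((100, 100), (0, 0), 2)
b = map_to_main((200, 100), (0, 0), 2)
```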
  • the terminal device performs the touch operation in the main interface.
  • the terminal device identifies the corresponding operation type of the touch operation in the main interface according to the parameter of the touch operation, and performs service processing according to the identified operation type.
  • If the touch operation occurred directly in the main interface, the parameter describing it is the parameter obtained in step 404; if the touch operation was mapped into the main interface, the parameter describing it is the parameter updated in step 407.
  • After determining the operation type of the touch operation in the main interface, the terminal device on the one hand executes the function of the service (for example, opening a music program) and on the other hand draws and displays the picture of the service (for example, the music program's interface). Therefore, the terminal device also draws the picture to be displayed into the first drawing buffer according to the processing result, and outputs the picture in the first drawing buffer to the main interface.
  • This step can be implemented by using the prior art, and details are not described herein again.
  • the terminal device updates the screen displayed in the sub-interface according to the update of the screen displayed in the main interface.
  • The terminal device maps the drawing data in the processing result obtained in step 408 according to the mapping relationship between the main interface and the sub-interface, draws into the second drawing buffer according to the mapped data, and outputs the picture in the second drawing buffer to the sub-interface.
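The synchronous update in this step can be sketched as a pair of buffers where every draw into the first buffer also writes a reduced copy into the second, so the sub-interface always matches the main interface. The `DualBuffer` class and the nearest-neighbor reduction are illustrative assumptions.

```python
class DualBuffer:
    """Keep the sub-interface's (second) buffer in sync with the
    main interface's (first) buffer at a fixed integer zoom ratio."""

    def __init__(self, ratio):
        self.ratio = ratio
        self.first = None   # main-interface frame
        self.second = None  # reduced sub-interface frame

    def draw(self, frame):
        # Drawing the picture to be displayed into the first buffer
        # also mirrors a reduced copy into the second buffer.
        self.first = frame
        self.second = [row[::self.ratio] for row in frame[::self.ratio]]

buf = DualBuffer(2)
buf.draw([[c for c in range(8)] for _ in range(6)])  # 8*6 "frame"
```

Every subsequent `draw` call refreshes both buffers, which is the invariant this step maintains.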
  • FIG. 5 is a schematic structural diagram of hardware of a terminal device according to an embodiment of the present invention.
  • the terminal device can be a mobile phone, a tablet computer, a PDA, or the like.
  • the terminal device includes a processor 501, an input/output device 502, a memory 503, and a bus 504.
  • the processor 501, the input/output device 502, and the memory 503 are communicably connected by a bus 504.
  • The processor 501 is the control center of the terminal; by running or executing software programs stored in the memory 503 and calling data stored in the memory 503, it performs the various functions of the terminal device and processes data.
  • The input/output device 502 may be a touch screen, used for sensing the user's touch operations, transmitting them to the processor 501 for processing, and displaying the processing results output by the processor 501.
  • The touch screen may include a display panel and a touch panel. Optionally, the display panel may be configured in the form of an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
  • The touch panel may cover the display panel. When the touch panel detects a touch operation on or near it, it transmits the operation to the processor 501 to determine the type of the touch event; the processor 501 then provides corresponding visual output on the display panel according to the type of the touch event.
  • the memory 503 may be a read only memory (ROM), a static storage device, a dynamic storage device, or a random access memory (RAM).
  • the memory 503 in the embodiment of the present invention may include two parts, a ROM and a RAM, in which instructions of the operating system and other application programs and application data may be stored in the ROM, and the first drawing buffer and the second drawing buffer may be created in the RAM.
  • the instructions stored in the memory 503 are executed by the processor 501.
  • the processor 501 can be a general-purpose central processing unit (CPU), a microprocessor, an application specific integrated circuit (ASIC), or one or more integrated circuits for executing related programs.
  • Bus 504 can include a path for communicating information between various components, such as processor 501, memory 503, and input/output device 502.
  • The processor 501 is configured to execute the instructions in the memory 503 to implement the following functions: displaying, in response to a trigger from the user, within the main interface displayed by the input/output device 502, a sub-interface onto which the picture in the main interface is mapped; detecting, through the input/output device 502, a touch operation occurring in the sub-interface; mapping the touch operation in the sub-interface into the main interface according to the preset mapping relationship between the sub-interface and the main interface; and then performing the mapped touch operation in the main interface.
  • the user can trigger on the input/output device 502, and the input/output device 502 sends the user's trigger to the processor 501.
  • the user's touch operation is detected by the input/output device 502, and parameters for describing the touch operation are acquired by the input/output device 502, and the parameters are passed to the processor 501.
  • For the processes in which the processor 501 generates the sub-interface, maps the touch operation in the sub-interface to the main interface, and performs the mapped touch operation in the main interface, reference may be made to the method embodiments shown in FIG. 3 and FIG. 4, and details are not described herein again.
  • Although FIG. 5 shows only the processor 501, the memory 503, the input/output device 502, and the bus 504, in a specific implementation those skilled in the art will appreciate that the device also includes other components necessary for normal operation and, depending on specific needs, may include hardware components implementing other functions.
  • the disclosed apparatus and method can be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • The division of the modules or units is only a logical function division; in actual implementation there may be other division manners, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
  • The technical solution of the present invention essentially, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present invention.
  • The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention provide a terminal operation method and a terminal device, relating to the field of communications. By displaying, within the displayed main interface, a sub-interface onto which the picture in the main interface is mapped, a user can perform one-handed touch operations on the entire screen from the sub-interface, making it convenient to operate the content of the whole screen interface without affecting the browsing experience. The solution includes: in response to a trigger from the user, the terminal device displays, within the displayed main interface, a sub-interface onto which the picture in the main interface is mapped; detects a touch operation occurring in the sub-interface; maps, according to the mapping relationship between the main interface and the sub-interface, the position of the touch operation in the sub-interface to the corresponding position in the main interface; and performs the touch operation at the corresponding position in the main interface.

Description

A Terminal Operation Method and Terminal Device
This application claims priority to Chinese Patent Application No. 201410173006.9, filed on April 25, 2014 and entitled "Terminal Operation Method and Terminal Device", which is incorporated herein by reference in its entirety.
Technical Field
The present invention relates to the field of communications, and in particular to a terminal operation method and a terminal device.
Background
With the rapid development of communication technology, both the functionality and the appearance of mobile terminals, as portable devices carrying many functions, have been greatly improved; the wide popularization and application of tablet computers is particularly notable.
A tablet computer (Tablet Personal Computer, tablet PC) is a small, portable personal computer that uses a touch screen as its basic input device. Its touch screen (also known as digitizer technology) allows the user to work with a stylus or digital pen instead of a traditional keyboard or mouse. Users can enter information through built-in handwriting recognition, an on-screen soft keyboard, speech recognition, or a real keyboard (if the model is equipped with one). However, because the screens of ordinary tablet computers are larger than 6 inches, users cannot conveniently operate them with one hand; for example, when holding the tablet with the right hand alone, the user must use the left hand to tap the lower-left corner of the tablet, and vice versa. Large-screen mobile terminals have the same one-handed operation problem.
Therefore, in view of the above problem, it is necessary to provide a solution that allows a user to conveniently operate the content of the entire screen interface while operating a handheld mobile terminal device with one hand.
Summary
Embodiments of the present invention provide a terminal operation method and a terminal device to solve the problem that large-screen terminal devices are inconvenient to operate with one hand.
To achieve the above objective, the embodiments of the present invention adopt the following technical solutions:
In a first aspect, an embodiment of the present invention provides a terminal device, where the terminal device includes a display unit, an acquiring unit, and a processing unit connected to the acquiring unit, where:
the display unit is configured to display, in response to a trigger from a user, within the main interface displayed by the terminal, a sub-interface onto which the picture in the main interface is mapped;
the acquiring unit is configured to detect a touch operation occurring in the sub-interface; and
the processing unit is configured to map, according to the mapping relationship between the main interface and the sub-interface, the position of the touch operation in the sub-interface to the corresponding position in the main interface, and perform the mapped touch operation at the corresponding position in the main interface.
In a first possible implementation of the first aspect, the mapping relationship includes the zoom ratio of the main interface to the sub-interface, and
the display unit is specifically configured to create a second drawing buffer independent of the first drawing buffer of the main interface; reduce the picture drawn in the first drawing buffer according to the zoom ratio and draw it into the second drawing buffer; and output the picture drawn in the second drawing buffer to the display screen to form the sub-interface.
With reference to the first possible implementation of the first aspect, in a second possible implementation of the first aspect,
the display unit is further configured to keep the picture drawn in the second drawing buffer updated in synchronization with the picture drawn in the first drawing buffer.
With reference to the second possible implementation of the first aspect, in a third possible implementation of the first aspect,
that the processing unit is configured to perform the mapped touch operation at the corresponding position in the main interface specifically includes: the processing unit identifies the operation type corresponding, in the main interface, to the touch operation at the position; performs service processing according to the operation type; and sends the processing result to the display unit;
the display unit is further configured to draw the picture to be displayed into the first drawing buffer according to the processing result and output it to the main interface; and
that the display unit is configured to keep the picture drawn in the second drawing buffer updated in synchronization with the picture drawn in the first drawing buffer specifically includes: when drawing the picture to be displayed into the first drawing buffer, the display unit reduces the picture to be displayed according to the zoom ratio, draws it into the second drawing buffer, and outputs the picture drawn in the second drawing buffer to the sub-interface.
With reference to the foregoing first aspect and the first to third possible implementations of the first aspect, in a fourth possible implementation of the first aspect,
the processing unit is specifically configured to acquire a parameter describing the touch operation, the parameter including at least the coordinates of the touch operation in the sub-interface; map, according to the mapping relationship between the sub-interface and the main interface, the coordinates of the touch operation in the sub-interface to coordinates in the main interface; update the coordinates in the parameter describing the touch operation to the mapped coordinates; and perform, according to the updated parameter, the processing of performing the mapped touch operation in the main interface.
In a second aspect, an embodiment of the present invention provides a terminal operation method, including:
in response to a trigger from a user, displaying, by a terminal device within the displayed main interface, a sub-interface onto which the picture in the main interface is mapped;
detecting a touch operation occurring in the sub-interface;
mapping, according to the mapping relationship between the main interface and the sub-interface, the position of the touch operation in the sub-interface to the corresponding position in the main interface; and
performing the touch operation at the corresponding position in the main interface.
In a first possible implementation of the second aspect, the mapping relationship includes the zoom ratio of the main interface to the sub-interface, and the displaying, by the terminal within the displayed main interface, a sub-interface onto which the picture in the main interface is mapped includes:
creating a second drawing buffer independent of the first drawing buffer of the main interface;
reducing the picture drawn in the first drawing buffer according to the zoom ratio and drawing it into the second drawing buffer; and
outputting the picture drawn in the second drawing buffer to the display screen to form the sub-interface.
With reference to the first possible implementation of the second aspect, in a second possible implementation of the second aspect, the method further includes:
keeping the picture drawn in the second drawing buffer updated in synchronization with the picture drawn in the first drawing buffer.
With reference to the second possible implementation of the second aspect, in a third possible implementation of the second aspect, the performing the touch operation at the corresponding position in the main interface specifically includes:
identifying the operation type corresponding, in the main interface, to the touch operation at the position;
performing service processing according to the operation type; and
drawing the picture to be displayed into the first drawing buffer according to the processing result, and outputting the picture drawn in the first drawing buffer to the main interface;
the keeping the picture drawn in the second drawing buffer updated in synchronization with the picture drawn in the first drawing buffer specifically includes:
when drawing the picture to be displayed into the first drawing buffer, reducing, by the terminal device, the picture to be displayed according to the zoom ratio, drawing it into the second drawing buffer, and outputting the picture drawn in the second drawing buffer to the sub-interface.
With reference to the foregoing second aspect and the first to third possible implementations of the second aspect, in a fourth possible implementation of the second aspect, the mapping, according to the mapping relationship between the main interface and the sub-interface, the position of the touch operation in the sub-interface to the corresponding position in the main interface specifically includes:
acquiring a parameter describing the touch operation, the parameter including at least the coordinates of the touch operation in the sub-interface;
mapping, according to the mapping relationship between the sub-interface and the main interface, the coordinates of the touch operation in the sub-interface to coordinates in the main interface; and
updating the coordinates in the parameter describing the touch operation to the mapped coordinates;
the performing the mapped touch operation at the corresponding position in the main interface specifically includes:
performing the mapped touch operation in the main interface according to the updated parameter.
In the embodiments of the present invention, a sub-interface onto which the picture in the main interface is mapped is displayed within the displayed main interface; when a touch operation in the sub-interface is acquired, the sub-interface touch operation is mapped to a touch operation corresponding to the main interface, so that the user can perform one-handed touch operations on the entire main interface from the sub-interface, which makes it convenient to operate the content of the entire screen interface.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
FIG. 1 is a schematic diagram of the logical structure of a terminal device according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the display of the sub-interface and the main interface in a terminal device according to an embodiment of the present invention;
FIG. 3 is a first schematic flowchart of a terminal operation method according to an embodiment of the present invention;
FIG. 4 is a second schematic flowchart of a terminal operation method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of the hardware structure of a terminal device according to an embodiment of the present invention.
Detailed Description
In the following description, for the purpose of illustration rather than limitation, specific details such as particular system structures, interfaces, and technologies are set forth to provide a thorough understanding of the present invention. However, it should be clear to those skilled in the art that the present invention may also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known apparatuses, circuits, and methods are omitted so that unnecessary detail does not obscure the description of the present invention.
A terminal device may be a fixed terminal device or a mobile terminal device, where a mobile terminal device is a terminal device that can be used while moving and may include a mobile phone, a notebook, a tablet computer, a PDA, or a POS machine, and may also include a vehicle-mounted computer. As networks and technologies develop toward ever greater bandwidth, the mobile communications industry will move toward a true mobile information era. On the other hand, with the rapid development of integrated circuit technology, mobile terminals already possess powerful processing capabilities and are changing from simple communication tools into comprehensive information processing platforms, which also makes humanized and intelligent operation of mobile terminals a focus of attention for those skilled in the art and for users.
To address the problem that large-screen terminal devices are inconvenient to operate with one hand, the embodiments of the present invention add, on top of the original display interface of the terminal device, another display interface that is convenient for one-handed operation, forming a dual display interface. The display interface that can be operated with one hand has a mapping relationship with the original display interface and is a thumbnail display of it; the user can complete operations on the original display interface within this thumbnail interface. The solution of the present invention is described in detail below through specific embodiments. For ease of description, in the embodiments of the present invention, the original display interface of the terminal device is called the main interface, and the thumbnail display interface of the original display interface is called the sub-interface. FIG. 1 is a schematic diagram of the logical structure of a terminal device according to an embodiment of the present invention. As shown in FIG. 1, the terminal device includes a display unit 01, an acquiring unit 02, and a processing unit 03 connected to both the display unit 01 and the acquiring unit 02, where:
the display unit 01 is configured to display, in response to a trigger from the user, within the displayed main interface, a sub-interface onto which the picture in the main interface is mapped. The sub-interface is a thumbnail display of the main interface; it is displayed on the main interface and forms a dual display interface with it, as shown in FIG. 2, which is a schematic diagram of the display of the sub-interface and the main interface in the terminal device, where 201 is the main interface and 202 is the sub-interface;
the acquiring unit 02 is configured to detect a touch operation occurring in the sub-interface; and
the processing unit 03 is configured to map, according to the mapping relationship between the main interface and the sub-interface, the position of the touch operation in the sub-interface to the corresponding position in the main interface, and perform the mapped touch operation at the corresponding position in the main interface.
By mapping the content of the main interface into the sub-interface, the embodiments of the present invention enable the user to perform touch operations on the main interface within the sub-interface, so that the user can operate the main interface by operating the sub-interface with one hand, achieving one-handed operation of the entire screen of a large-screen terminal device.
When displaying the sub-interface, the terminal device in the embodiments of the present invention uses an output mode independent of the main interface, displaying the sub-interface on the original display interface to form a dual display interface. The independent output of the sub-interface can be achieved by creating an independent drawing buffer for the sub-interface; this function of creating an independent drawing buffer can be implemented in the display unit 01. The specific implementations of the components shown in FIG. 1 are described in detail below. In the embodiments of the present invention, the mapping relationship includes the zoom ratio of the main interface to the sub-interface.
The display unit 01 creates a second drawing buffer independent of the first drawing buffer of the main interface, reduces the picture drawn in the first drawing buffer according to the zoom ratio, draws it into the second drawing buffer, and outputs the picture drawn in the second drawing buffer to the display screen to form the sub-interface.
In the above embodiment, by creating a second drawing buffer in addition to the first drawing buffer of the main interface to output the sub-interface, the sub-interface on the terminal device can be controlled independently, avoiding any effect on the main interface when the sub-interface is displayed.
Because the content displayed in the main interface changes with the user's operations, the display unit 01 is further configured to keep the content drawn in the second drawing buffer updated in synchronization with the content drawn in the first drawing buffer, that is, to always keep the content displayed in the sub-interface consistent with the content displayed in the main interface.
The above synchronous update can be completed by the processing unit 03 in cooperation with the display unit 01.
Specifically, the processing by which the processing unit 03 performs the mapped touch operation at the corresponding position in the main interface includes: identifying the operation type corresponding, in the main interface, to the touch operation at the position; performing service processing according to the identified operation type; and sending the processing result to the display unit 01.
The display unit 01 draws the picture to be displayed into the first drawing buffer according to the processing result of the processing unit 03 and outputs it to the main interface.
That the display unit 01 keeps the picture drawn in the second drawing buffer updated in synchronization with the picture drawn in the first drawing buffer specifically includes: when drawing the picture to be displayed into the first drawing buffer, the display unit 01 reduces the picture to be displayed according to the zoom ratio, draws it into the second drawing buffer, and outputs the picture drawn in the second drawing buffer to the sub-interface.
It should be noted that because the main interface and the first drawing buffer that outputs the display picture of the main interface are capabilities that existing terminal devices already possess, the process by which the display unit 01 outputs the main interface through the first drawing buffer is not described again in the present invention.
In an embodiment of the present invention, that the processing unit 03 is configured to map the position of the touch operation in the sub-interface to the corresponding position in the main interface specifically includes: the processing unit 03 acquires a parameter describing the touch operation, the parameter including at least the coordinates of the touch operation in the sub-interface; maps, according to the mapping relationship between the sub-interface and the main interface, the coordinates of the touch operation in the sub-interface to coordinates in the main interface; and updates the coordinates in the parameter describing the touch operation to the mapped coordinates.
That the processing unit 03 performs the mapped touch operation at the corresponding position in the main interface specifically includes: performing the mapped touch operation in the main interface according to the updated parameter.
Because the sub-interface in the embodiments of the present invention is used for a thumbnail display of the content in the main interface, the zoom ratio of the main interface to the sub-interface can serve as the mapping relationship between the main interface and the sub-interface. The zoom ratio may be preset on the terminal device; alternatively, the size of the sub-interface may be set on the terminal device, and the zoom ratio of the main interface to the sub-interface may be calculated from the size of the main interface and the preset size of the sub-interface.
The processing unit 03 may map the coordinates of the touch operation in the sub-interface to coordinates in the main interface, according to the mapping relationship between the sub-interface and the main interface, using the formula: main-interface coordinate = sub-interface coordinate * zoom ratio.
For example, assuming the zoom ratio of the main interface to the sub-interface is 2:1 and a coordinate in the sub-interface is a = (100, 100), then after coordinate a is mapped according to the zoom ratio, the coordinate in the main interface is a' = (100*2, 100*2).
In the above embodiment, the coordinates in both the main interface and the sub-interface are expressed as relative coordinates, that is, each of the main interface and the sub-interface takes some point within its own interface as the coordinate origin, and the coordinates in each interface are expressed relative to its own origin. The position of the main interface's coordinate origin within the main interface is consistent with the position of the sub-interface's coordinate origin within the sub-interface (for example, both take the bottom-left corner point of the interface as the origin). When the coordinates in the main interface and the sub-interface are expressed as absolute coordinates, the mapping relationship may further include the offset between the coordinate origins of the main interface and the sub-interface for mapping conversion.
In the embodiments of the present invention, the acquiring unit 02 may be implemented by further improving existing touch screen technology. First, the acquiring unit 02 uses existing touch screen technology to detect a touch operation on the touch screen and acquires a parameter describing the touch operation, the parameter including at least the coordinates of the touch operation in the screen coordinate system. Second, the acquiring unit 02 acquires the coordinate range of the sub-interface in the screen coordinate system and identifies the touch operation occurring in the sub-interface by comparing the coordinates of the touch operation in the screen coordinate system with the coordinate range of the sub-interface in the screen coordinate system.
After identifying a touch operation occurring in the sub-interface, the acquiring unit 02 further passes the parameter of the touch operation to the processing unit 03 so that the processing unit 03 can complete the mapping.
Before passing the parameter describing the touch operation to the processing unit 03, the acquiring unit 02 may first convert the coordinates of the touch operation in the screen coordinate system, included in the parameter, into the coordinates of the touch operation in the sub-interface. Optionally, the acquiring unit 02 may skip this conversion, and the processing unit 03 may perform the conversion after receiving the parameter, to obtain a parameter that includes the coordinates of the touch operation in the sub-interface.
It should be noted that if the coordinate system of the sub-interface and the screen coordinate system share the same origin, no conversion between the screen coordinate system and the sub-interface's coordinate system is needed.
Further, to improve the user experience, the terminal device may also provide the user with functions for operating the sub-interface itself within the sub-interface, for example, changing the position of the sub-interface, changing the size of the sub-interface, or performing magnified display within the sub-interface. Therefore, the terminal device may further include a sub-interface operation unit 04, where:
the acquiring unit 02 is further configured to determine whether the touch operation is a touch operation targeting the main interface or a touch operation targeting the sub-interface; if the touch operation targets the main interface, trigger the processing unit 03 to perform the processing of mapping the position of the touch operation in the sub-interface to the corresponding position in the main interface; if the touch operation targets the sub-interface, trigger the sub-interface operation unit 04 to process it; and
the sub-interface operation unit 04 is configured to identify the operation type corresponding, in the sub-interface, to the touch operation targeting the sub-interface, and perform service processing according to the operation type in the sub-interface.
Specifically, when the operation type corresponding to the touch operation in the sub-interface is setting the position of the sub-interface, that the sub-interface operation unit 04 performs service processing according to the operation type in the sub-interface specifically includes: the sub-interface operation unit 04 adjusts the output position of the second drawing buffer according to the set position, so that the display unit 01 outputs the picture drawn in the second drawing buffer to the display screen according to the adjusted output position to form the sub-interface.
Similarly, when the operation type corresponding to the touch operation in the sub-interface is setting the size of the sub-interface, that the sub-interface operation unit 04 performs service processing according to the operation type in the sub-interface specifically includes: the sub-interface operation unit 04 adjusts the size of the second drawing buffer according to the set size. If a zoom ratio has been preset, the sub-interface operation unit 04 also updates the zoom ratio according to the adjusted size of the second drawing buffer.
When the operation type corresponding to the touch operation in the sub-interface is magnified display, that the sub-interface operation unit 04 performs service processing according to the operation type in the sub-interface specifically includes: the sub-interface operation unit 04 generates the data of a magnified display frame, the display unit 01 draws the magnified display frame, the display content within a specified radius of the touch coordinates in the sub-interface is enlarged according to a preset ratio, and the display unit 01 draws the enlarged content in the magnified display frame.
It should be noted that the above functions of the sub-interface operation unit 04 are all optional and can be configured according to service needs.
In the embodiments of the present invention, by displaying within the displayed main interface a sub-interface onto which the content in the main interface is mapped, and by mapping a touch operation acquired in the sub-interface to a touch operation corresponding to the main interface, the user can perform one-handed touch operations on the entire main interface from the sub-interface, improving the operability of the terminal device.
FIG. 3 shows a terminal operation method according to an embodiment of the present invention. As shown in FIG. 3, the method includes:
301. In response to a trigger from a user, the terminal device displays, within the displayed main interface, a sub-interface onto which the picture in the main interface is mapped.
The user's trigger may be pressing a corresponding physical key, touching a corresponding virtual key, or performing a corresponding gesture (for example, shaking the phone from side to side once).
Both the main interface and the sub-interface can be implemented with "double buffering" drawing technology. "Double buffering" means creating in memory a buffer consistent with the on-screen drawing area, first drawing graphics into this buffer, and then copying the graphics in this buffer to the on-screen drawing area in one step, thereby forming the display interface on the screen. In the embodiments of the present invention, the drawing buffer created for the main interface is called the first drawing buffer, and the drawing buffer created for the sub-interface is called the second drawing buffer. When displaying the main interface, graphics are first drawn into the first drawing buffer in memory and then output from the first drawing buffer to the screen; likewise, when displaying the sub-interface, graphics are first drawn into the second drawing buffer in memory and then output from the second drawing buffer to the screen.
In step 301, displaying within the displayed main interface a sub-interface onto which the picture in the main interface is mapped can be achieved by mapping the picture drawn in the first drawing buffer to the picture drawn in the second drawing buffer.
The mapping relationship may include the zoom ratio of the main interface to the sub-interface, and step 301 may be implemented as follows: the terminal device creates a second drawing buffer independent of the first drawing buffer, where the first drawing buffer is the drawing buffer that outputs the main interface; reduces the picture drawn in the first drawing buffer according to the zoom ratio and draws it into the second drawing buffer; and outputs the picture drawn in the second drawing buffer to the display screen to form the sub-interface.
302. The terminal device detects a touch operation occurring in the sub-interface.
Specifically, the terminal device detects a touch operation on the touch screen; then acquires the coordinates of the touch operation in the screen coordinate system and the coordinate range of the sub-interface in the screen coordinate system; and identifies the touch operation occurring in the sub-interface by comparing the coordinates of the touch operation in the screen coordinate system with the coordinate range of the sub-interface in the screen coordinate system.
303. The terminal device maps, according to the preset mapping relationship between the main interface and the sub-interface, the position of the touch operation in the sub-interface to the corresponding position in the main interface.
Specifically, after detecting the touch operation occurring in the sub-interface, the terminal device acquires a parameter describing the touch operation, the parameter including at least the coordinates of the touch operation in the sub-interface; maps, according to the mapping relationship between the sub-interface and the main interface, the coordinates of the touch operation in the sub-interface to coordinates in the main interface; and updates the coordinates in the parameter describing the touch operation to the mapped coordinates.
In addition, the parameter describing the touch operation may also include state information such as the duration of the touch operation. For example, when the user long-presses a virtual key bearing a camera icon within the generated sub-interface, the terminal device acquires the position of the touch operation and its duration.
304. The terminal device performs the mapped touch operation at the corresponding position in the main interface.
Specifically, after mapping the touch operation in the sub-interface into the main interface, the terminal device identifies the operation type corresponding to the touch operation in the main interface, performs service processing according to that operation type, draws the picture to be displayed into the first drawing buffer according to the processing result, and outputs the picture in the first drawing buffer to the main interface.
The terminal device may identify the operation type corresponding to the touch operation in the main interface according to the mapped parameter of the touch operation, for example, the coordinates of the touch operation in the main interface. The specific identification process can be implemented with existing gesture recognition technology and is not described again here. It should be noted that parameters such as touch duration may also be used in the identification process; because such parameters do not change during mapping, the embodiments of the present invention describe in detail only the coordinate parameters that need to change during mapping.
At this point, the terminal device has enabled the user to complete one-handed touch operation of the entire screen within the sub-interface.
Because the picture displayed in the main interface changes with the user's operations, after displaying the sub-interface the terminal device also keeps the picture drawn in the second drawing buffer updated in synchronization with the picture drawn in the first drawing buffer, so that the picture displayed in the sub-interface remains consistent with the picture displayed in the main interface. This synchronous update usually occurs after step 304.
Specifically, when drawing the picture to be displayed into the first drawing buffer, the terminal device further maps the picture to be displayed, draws it into the second drawing buffer, and outputs the picture drawn in the second drawing buffer to the sub-interface.
Mapping the picture to be displayed and drawing it into the second drawing buffer may specifically include: reducing the picture to be displayed according to the zoom ratio of the main interface to the sub-interface and drawing it into the second drawing buffer.
In the embodiments of the present invention, a sub-interface onto which the picture in the main interface is mapped is displayed within the displayed main interface; when a touch operation in the sub-interface is acquired, the sub-interface touch operation is mapped to a touch operation corresponding to the main interface, so that the user can perform one-handed touch operations on the entire main interface from the sub-interface, which makes it convenient to operate the content of the entire screen interface.
The terminal operation method provided by an embodiment of the present invention is described in detail below with reference to FIG. 4. As shown in FIG. 4, the method includes:
401. The terminal device receives an operation in which the user triggers a one-handed-operation function key.
The one-handed-operation function key may be a physical key, a virtual key, or a corresponding gesture.
402. In response to the user's trigger, create a second drawing buffer independent of the first drawing buffer of the main interface.
When creating the second drawing buffer, a bitmap may be established in the second drawing buffer, and the display picture of the sub-interface may be drawn on that bitmap.
Because the sub-interface in the embodiments of the present invention provides a thumbnail display of the picture in the main interface, the zoom ratio of the main interface to the sub-interface may be preset, and the bitmap is established according to that zoom ratio. For example, assuming the zoom ratio of the main interface to the sub-interface is 2:1 and the display size of main interface a is 800*600, the terminal device calculates the size of sub-interface b as 400*300 using the preset zoom ratio and creates a bitmap of size 400*300 in the second drawing buffer. It should be noted that, usually, the size of the bitmap in a drawing buffer of the terminal device is the same as the size of the corresponding display interface.
403. Draw the picture in the first drawing buffer into the second drawing buffer according to the mapping relationship, and output the picture drawn in the second drawing buffer to the display screen to form the sub-interface.
Specifically, the terminal device reduces the picture in the first drawing buffer according to the zoom ratio of the main interface to the sub-interface and draws it into the second drawing buffer.
404. The terminal device detects a touch operation on the screen and obtains a parameter describing the touch operation.
The parameter describing the touch operation includes at least the coordinates of the touch operation in the screen coordinate system. Every point in the display of the terminal device has its own coordinates, and the coordinate system used to describe the coordinates of each point in the display is called the screen coordinate system. The embodiments of the present invention also involve the coordinate system of the sub-interface and the coordinate system of the main interface: the origin of the sub-interface's coordinate system is located in the sub-interface, and coordinates in the sub-interface are expressed relative to that origin; the origin of the main interface's coordinate system is located in the main interface, and coordinates in the main interface are expressed relative to that origin. Normally, the origins of the three coordinate systems may coincide.
405. The terminal device determines whether the detected touch operation occurred within the sub-interface; if so, it continues with step 406; otherwise, it jumps to step 408.
Specifically, the terminal device acquires the position of the sub-interface on the screen, expressed as the coordinate range of the sub-interface in the screen coordinate system, and determines whether the coordinates of the touch operation in the screen coordinate system fall within that range; if so, the touch operation occurred within the sub-interface; otherwise, it is a touch operation acting directly on the main interface.
406. The terminal device determines whether the touch operation targets the main interface or the sub-interface.
Specifically, if the touch operation is determined to target the main interface, execution continues with step 407; if it targets the sub-interface, step 410 is performed: the terminal device identifies the operation type corresponding to the touch operation in the sub-interface and processes it according to that operation type.
When the operation type corresponding to the touch operation in the sub-interface is setting the position of the sub-interface, the terminal device adjusts the output attribute of the second drawing buffer according to the set position and outputs the picture drawn in the second drawing buffer to the display screen according to the adjusted output attribute.
Similarly, when the operation type corresponding to the touch operation in the sub-interface is setting the size of the sub-interface, the terminal device adjusts the size of the second drawing buffer according to the set size. If a zoom ratio has been preset, the terminal device also updates the zoom ratio according to the adjusted size of the second drawing buffer.
When the operation type corresponding to the touch operation in the sub-interface is magnified display, the terminal device generates a magnified display frame, enlarges the display picture within a specified radius of the touch coordinates in the sub-interface according to a preset ratio, and shows it in the magnified display frame.
407. The terminal device maps the position of the touch operation occurring in the sub-interface to the corresponding position in the main interface.
Specifically, the terminal device converts the coordinates of the touch operation in the screen coordinate system obtained in step 404 into coordinates in the sub-interface's coordinate system, and then maps the resulting sub-interface coordinates to the corresponding coordinates of the main interface according to the mapping relationship between the sub-interface and the main interface, so that the touch operation becomes a touch operation acting on the main interface.
For example, assume the zoom ratio of the main interface to the sub-interface is 2:1, the display size of main interface a is 800*600, and the size of sub-interface b is 400*300. The user performs a horizontal slide in the sub-interface, sliding from coordinate A: (x, y) = (100, 100) to coordinate B: (x, y) = (200, 100). The terminal device maps the coordinates of this touch operation in the sub-interface into the main interface according to the preset mapping relationship between the sub-interface and the main interface, where the mapping relationship used is: sub-interface coordinate value * zoom ratio. In this example, the touch operation mapped into the main interface is therefore a slide from C: (x, y) = (100*2, 100*2) to C': (x, y) = (200*2, 100*2). It should be noted that the coordinates of the touch operation are all relative coordinates; for example, regardless of where the user drags the sub-interface, the bottom-left corner point of the sub-interface (or any fixed point in the sub-interface) can be taken as the coordinate origin b(0, 0).
408. The terminal device performs the touch operation in the main interface.
The terminal device identifies the operation type corresponding to the touch operation in the main interface according to the parameter of the touch operation, and performs service processing according to the identified operation type.
If the touch operation occurred directly in the main interface, the parameter describing it is the parameter obtained in step 404; if the touch operation was mapped into the main interface, the parameter describing it is the parameter updated in step 407.
After determining the operation type corresponding to the touch operation in the main interface, the terminal device on the one hand executes the function of the service (for example, opening a music program) and on the other hand draws and displays the picture of the service (for example, the music program's interface). Therefore, the terminal device also draws the picture to be displayed into the first drawing buffer according to the processing result and outputs the picture in the first drawing buffer to the main interface. This step can be implemented with the prior art and is not described again here.
409. After performing the touch operation, the terminal device updates the picture displayed in the sub-interface according to the update of the picture displayed in the main interface.
Specifically, the terminal device maps the drawing data in the processing result obtained in step 408 according to the mapping relationship between the main interface and the sub-interface, draws into the second drawing buffer according to the mapped data, and outputs the picture in the second drawing buffer to the sub-interface.
At this point, one-handed touch operation of the entire screen by the user from within the sub-interface has been fully achieved.
FIG. 5 is a schematic diagram of the hardware structure of a terminal device according to an embodiment of the present invention. The terminal device may be a mobile phone, a tablet computer, a PDA, or the like. As shown in FIG. 5, the terminal device includes a processor 501, an input/output device 502, a memory 503, and a bus 504.
The processor 501, the input/output device 502, and the memory 503 are communicatively connected through the bus 504.
The processor 501 is the control center of the terminal; by running or executing software programs stored in the memory 503 and calling data stored in the memory 503, it performs the various functions of the terminal device and processes data.
The input/output device 502 may be a touch screen, configured to sense the user's touch operations, transmit them to the processor 501 for processing, and display the processing results output by the processor 501. The touch screen may include a display panel and a touch panel. Optionally, the display panel may be configured in the form of an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode). The touch panel may cover the display panel; when the touch panel detects a touch operation on or near it, it transmits the operation to the processor 501 to determine the type of the touch event, and the processor 501 then provides corresponding visual output on the display panel according to the type of the touch event.
The memory 503 may be a read-only memory (ROM), a static storage device, a dynamic storage device, or a random access memory (RAM). In the embodiments of the present invention, the memory 503 may include both a ROM part and a RAM part, where instructions of the operating system and other application programs as well as application data may be stored in the ROM, and the first drawing buffer and the second drawing buffer may be created in the RAM. The instructions stored in the memory 503 are executed by the processor 501.
The processor 501 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits, configured to execute related programs.
The bus 504 may include a path for transferring information between the components (for example, the processor 501, the memory 503, and the input/output device 502).
In the embodiments of the present invention, the processor 501 is configured to execute the instructions in the memory 503 to implement the following functions: in response to a trigger from the user, displaying, within the main interface displayed by the input/output device 502, a sub-interface onto which the picture in the main interface is mapped; detecting, through the input/output device 502, a touch operation occurring in the sub-interface; mapping the touch operation in the sub-interface into the main interface according to the preset mapping relationship between the sub-interface and the main interface; and then performing the mapped touch operation in the main interface.
The user may perform the trigger on the input/output device 502, and the input/output device 502 sends the user's trigger to the processor 501.
The user's touch operation is detected by the input/output device 502, which acquires a parameter describing the touch operation and passes the parameter to the processor 501.
For the processes in which the processor 501 generates the sub-interface, maps the touch operation in the sub-interface to the main interface, and performs the mapped touch operation in the main interface, reference may be made to the method embodiments shown in FIG. 3 and FIG. 4, and details are not described herein again.
It should be noted that although the hardware shown in FIG. 5 includes only the processor 501, the memory 503, the input/output device 502, and the bus 504, in a specific implementation process those skilled in the art should understand that the apparatus also contains other components necessary for normal operation; likewise, according to specific needs, it may also contain hardware components implementing other functions.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the division into the functional modules described above is merely used as an example; in practical applications, the above functions may be allocated to different functional modules as required, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. For the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described herein again.
In the several embodiments provided in this application, it should be understood that the disclosed devices and methods may be implemented in other manners. For example, the device embodiments described above are merely illustrative; the division of the modules or units is merely a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units, that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
When the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention essentially, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing descriptions are merely specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto; any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (18)

  1. A terminal device, wherein the terminal device comprises a display unit, an acquiring unit, and a processing unit connected to the acquiring unit, wherein:
    the display unit is configured to display, in response to a trigger from a user, within the main interface displayed by the terminal, a sub-interface onto which the picture in the main interface is mapped;
    the acquiring unit is configured to detect a touch operation occurring in the sub-interface; and
    the processing unit is configured to map, according to the mapping relationship between the main interface and the sub-interface, the position of the touch operation in the sub-interface to the corresponding position in the main interface, and perform the mapped touch operation at the corresponding position in the main interface.
  2. The terminal device according to claim 1, wherein the mapping relationship comprises the zoom ratio of the main interface to the sub-interface, and the display unit is specifically configured to create a second drawing buffer independent of the first drawing buffer of the main interface; reduce the picture drawn in the first drawing buffer according to the zoom ratio and draw it into the second drawing buffer; and output the picture drawn in the second drawing buffer to the display screen to form the sub-interface.
  3. The terminal device according to claim 2, wherein the display unit is further configured to keep the picture drawn in the second drawing buffer updated in synchronization with the picture drawn in the first drawing buffer.
  4. The terminal device according to claim 3, wherein:
    that the processing unit is configured to perform the mapped touch operation at the corresponding position in the main interface specifically comprises: the processing unit identifies the operation type corresponding, in the main interface, to the touch operation at the position; performs service processing according to the operation type; and sends the processing result to the display unit;
    the display unit is further configured to draw the picture to be displayed into the first drawing buffer according to the processing result and output it to the main interface; and
    that the display unit is configured to keep the picture drawn in the second drawing buffer updated in synchronization with the picture drawn in the first drawing buffer specifically comprises: when drawing the picture to be displayed into the first drawing buffer, the display unit reduces the picture to be displayed according to the zoom ratio, draws it into the second drawing buffer, and outputs the picture drawn in the second drawing buffer to the sub-interface.
  5. The terminal device according to any one of claims 1 to 3, wherein the processing unit is specifically configured to acquire a parameter describing the touch operation, the parameter comprising at least the coordinates of the touch operation in the sub-interface; map, according to the mapping relationship between the sub-interface and the main interface, the coordinates of the touch operation in the sub-interface to coordinates in the main interface; update the coordinates in the parameter describing the touch operation to the mapped coordinates; and perform, according to the updated parameter, the processing of performing the mapped touch operation in the main interface.
  6. The terminal device according to any one of claims 1 to 5, wherein:
    the acquiring unit is specifically configured to detect a touch operation on the touch screen; acquire the coordinates of the touch operation in the screen coordinate system and the coordinate range of the sub-interface in the screen coordinate system; and identify the touch operation occurring in the sub-interface by comparing the coordinates of the touch operation in the screen coordinate system with the coordinate range of the sub-interface in the screen coordinate system.
  7. The terminal device according to any one of claims 2 to 4, wherein the terminal device further comprises a sub-interface operation unit;
    the acquiring unit is further configured to determine whether the touch operation is a touch operation targeting the main interface or a touch operation targeting the sub-interface; if the touch operation targets the main interface, trigger the processing unit to perform the processing of mapping the position of the touch operation in the sub-interface to the corresponding position in the main interface; if the touch operation targets the sub-interface, trigger the sub-interface operation unit to process it; and
    the sub-interface operation unit is configured to identify the operation type corresponding, in the sub-interface, to the touch operation targeting the sub-interface, and perform service processing according to the operation type in the sub-interface.
  8. The terminal device according to claim 7, wherein when the operation type corresponding to the touch operation in the sub-interface is setting the position of the sub-interface, that the sub-interface operation unit performs service processing according to the operation type in the sub-interface specifically comprises: the sub-interface operation unit adjusts the output position of the second drawing buffer according to the set position, so that the output unit outputs the content drawn in the second drawing buffer to the display screen according to the adjusted output position to form the sub-interface.
  9. The terminal device according to claim 7, wherein when the operation type corresponding to the touch operation in the sub-interface is magnified display, that the sub-interface operation unit performs service processing according to the operation type in the sub-interface specifically comprises: the sub-interface operation unit generates data of a magnified display frame, provides the data to the drawing module to draw the magnified display frame, enlarges the display content within a specified radius of the coordinates of the touch operation in the sub-interface according to a preset ratio, and notifies the drawing module to draw the enlarged content in the magnified display frame.
  10. A terminal operation method, comprising:
    in response to a trigger from a user, displaying, by a terminal device within the displayed main interface, a sub-interface onto which the picture in the main interface is mapped;
    detecting a touch operation occurring in the sub-interface;
    mapping, according to the mapping relationship between the main interface and the sub-interface, the position of the touch operation in the sub-interface to the corresponding position in the main interface; and
    performing the touch operation at the corresponding position in the main interface.
  11. The method according to claim 10, wherein the mapping relationship comprises the zoom ratio of the main interface to the sub-interface, and the displaying, by the terminal within the displayed main interface, a sub-interface onto which the picture in the main interface is mapped comprises:
    creating a second drawing buffer independent of the first drawing buffer of the main interface;
    reducing the picture drawn in the first drawing buffer according to the zoom ratio and drawing it into the second drawing buffer; and
    outputting the picture drawn in the second drawing buffer to the display screen to form the sub-interface.
  12. The method according to claim 11, further comprising:
    keeping the picture drawn in the second drawing buffer updated in synchronization with the picture drawn in the first drawing buffer.
  13. The method according to claim 12, wherein the performing the touch operation at the corresponding position in the main interface specifically comprises:
    identifying the operation type corresponding, in the main interface, to the touch operation at the position;
    performing service processing according to the operation type; and
    drawing the picture to be displayed into the first drawing buffer according to the processing result, and outputting the picture drawn in the first drawing buffer to the main interface;
    and the keeping the picture drawn in the second drawing buffer updated in synchronization with the picture drawn in the first drawing buffer specifically comprises:
    when drawing the picture to be displayed into the first drawing buffer, reducing, by the terminal device, the picture to be displayed according to the zoom ratio, drawing it into the second drawing buffer, and outputting the picture drawn in the second drawing buffer to the sub-interface.
  14. The method according to any one of claims 10 to 12, wherein the mapping, according to the mapping relationship between the main interface and the sub-interface, the position of the touch operation in the sub-interface to the corresponding position in the main interface specifically comprises:
    acquiring a parameter describing the touch operation, the parameter comprising at least the coordinates of the touch operation in the sub-interface;
    mapping, according to the mapping relationship between the sub-interface and the main interface, the coordinates of the touch operation in the sub-interface to coordinates in the main interface; and
    updating the coordinates in the parameter describing the touch operation to the mapped coordinates;
    and the performing the mapped touch operation at the corresponding position in the main interface specifically comprises:
    performing the mapped touch operation in the main interface according to the updated parameter.
  15. The method according to any one of claims 10 to 14, wherein the detecting a touch operation occurring in the sub-interface specifically comprises:
    detecting a touch operation on the touch screen;
    acquiring the coordinates of the touch operation in the screen coordinate system and the coordinate range of the sub-interface in the screen coordinate system; and
    identifying the touch operation occurring in the sub-interface by comparing the coordinates of the touch operation in the screen coordinate system with the coordinate range of the sub-interface in the screen coordinate system.
  16. The method according to any one of claims 11 to 13, wherein before the touch operation in the sub-interface is mapped into the main interface, the method further comprises:
    determining whether the touch operation is a touch operation targeting the main interface or a touch operation targeting the sub-interface;
    if the touch operation targets the main interface, performing the operation of mapping the position of the touch operation in the sub-interface to the corresponding position in the main interface; and
    if the touch operation targets the sub-interface, identifying the operation type corresponding to the touch operation in the sub-interface, and performing service processing according to the operation type in the sub-interface.
  17. The method according to claim 16, wherein when the operation type corresponding to the touch operation in the sub-interface is setting the position of the sub-interface, the performing service processing according to the operation type in the sub-interface comprises:
    adjusting, by the terminal device, the output position of the second drawing buffer according to the set position, and outputting the content drawn in the second drawing buffer to the display screen according to the adjusted output position.
  18. The method according to claim 16, wherein when the operation type corresponding to the touch operation in the sub-interface is magnified display, the performing service processing according to the operation type in the sub-interface comprises:
    generating, by the terminal device, a magnified display frame, enlarging the display content within a specified radius of the coordinates of the touch operation in the sub-interface according to a preset ratio, and showing it in the magnified display frame.
PCT/CN2014/092946 2014-04-25 2014-12-03 一种终端操作方法及终端设备 WO2015161653A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410173006.9A CN103955339A (zh) 2014-04-25 2014-04-25 一种终端操作方法及终端设备
CN201410173006.9 2014-04-25

Publications (1)

Publication Number Publication Date
WO2015161653A1 true WO2015161653A1 (zh) 2015-10-29

Family

ID=51332617

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/092946 WO2015161653A1 (zh) 2014-04-25 2014-12-03 一种终端操作方法及终端设备

Country Status (2)

Country Link
CN (1) CN103955339A (zh)
WO (1) WO2015161653A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109992186A (zh) * 2019-04-08 2019-07-09 努比亚技术有限公司 单手操作方法、装置、终端及存储介质
WO2021057203A1 (zh) * 2019-09-24 2021-04-01 华为技术有限公司 一种操作方法和电子设备

Families Citing this family (17)

Publication number Priority date Publication date Assignee Title
CN104461332B (zh) * 2013-09-24 2019-03-08 联想(北京)有限公司 一种信息处理方法和电子设备
CN103955339A (zh) * 2014-04-25 2014-07-30 华为技术有限公司 一种终端操作方法及终端设备
CN104238745B (zh) * 2014-07-31 2017-11-28 天津三星通信技术研究有限公司 一种移动终端单手操作方法及移动终端
CN105630369B (zh) * 2014-11-06 2020-02-07 上海乐今通信技术有限公司 单手操作移动终端的实现方法及移动终端
CN104484111A (zh) * 2014-12-30 2015-04-01 小米科技有限责任公司 触摸屏的内容显示方法及装置
CN104660826B (zh) * 2015-03-13 2018-05-01 硕诺科技(深圳)有限公司 一种手机大屏转小屏的方法
CN105155153B (zh) * 2015-08-28 2018-03-16 深圳思瑞普科技有限公司 一种电脑刺绣机局部花样显示的处理方法
CN107168630B (zh) * 2016-03-07 2020-05-05 广州市动景计算机科技有限公司 一种终端设备、页面控制装置及页面控制方法
CN106020678A (zh) * 2016-04-29 2016-10-12 青岛海信移动通信技术股份有限公司 一种在移动设备进行触控操作的方法和装置
CN106371749A (zh) * 2016-08-30 2017-02-01 青岛海信移动通信技术股份有限公司 一种终端控制的方法和装置
CN106445354A (zh) * 2016-11-24 2017-02-22 北京小米移动软件有限公司 终端设备的触摸控制方法及装置
CN107515712A (zh) * 2017-06-30 2017-12-26 西安易朴通讯技术有限公司 界面显示方法及电子设备
WO2019056167A1 (zh) * 2017-09-19 2019-03-28 深圳传音通讯有限公司 一种全屏幕单手操作方法、终端设备及计算机可读介质
CN110888581A (zh) * 2019-10-11 2020-03-17 广州视源电子科技股份有限公司 元素传递方法、装置、设备及存储介质
CN111124201A (zh) * 2019-11-29 2020-05-08 华为技术有限公司 一种单手操作的方法和电子设备
CN111722781A (zh) * 2020-06-22 2020-09-29 京东方科技集团股份有限公司 智能交互方法及设备、存储介质
CN114157889B (zh) * 2020-08-18 2024-04-16 海信视像科技股份有限公司 一种显示设备及触控协助交互方法

Citations (4)

Publication number Priority date Publication date Assignee Title
CN101470595A (zh) * 2007-12-25 2009-07-01 新利虹科技股份有限公司 屏幕辅助***
CN103472996A (zh) * 2013-09-17 2013-12-25 深圳市佳创软件有限公司 一种移动设备接收触控方法及设备
CN103677556A (zh) * 2012-09-24 2014-03-26 北京三星通信技术研究有限公司 快速定位应用程序的方法及设备
CN103955339A (zh) * 2014-04-25 2014-07-30 华为技术有限公司 一种终端操作方法及终端设备

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US20130127738A1 (en) * 2011-11-23 2013-05-23 Microsoft Corporation Dynamic scaling of touch sensor
CN102830917A (zh) * 2012-08-02 2012-12-19 上海华勤通讯技术有限公司 Mobile terminal and touch establishment method therefor
CN102915201B (zh) * 2012-09-17 2015-08-05 广东欧珀移动通信有限公司 One-handed operation method for a large-screen touch mobile phone
CN103488419B (zh) * 2013-08-26 2017-09-08 宇龙计算机通信科技(深圳)有限公司 Operation method of a communication terminal, and communication terminal
CN103744582B (zh) * 2014-01-21 2017-06-20 宇龙计算机通信科技(深圳)有限公司 Terminal control apparatus and terminal control method

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109992186A (zh) * 2019-04-08 2019-07-09 努比亚技术有限公司 One-handed operation method and apparatus, terminal and storage medium
CN109992186B (zh) * 2019-04-08 2024-01-12 努比亚技术有限公司 One-handed operation method and apparatus, terminal and storage medium
WO2021057203A1 (zh) * 2019-09-24 2021-04-01 华为技术有限公司 Operation method and electronic device

Also Published As

Publication number Publication date
CN103955339A (zh) 2014-07-30

Similar Documents

Publication Publication Date Title
WO2015161653A1 (zh) Terminal operation method and terminal device
US20240137462A1 (en) Display apparatus and control methods thereof
CN110471596B (zh) Split-screen switching method and apparatus, storage medium, and electronic device
RU2662690C2 (ru) Device and method for controlling an object of a user device
US9965039B2 (en) Device and method for displaying user interface of virtual input device based on motion recognition
US11158057B2 (en) Device, method, and graphical user interface for processing document
WO2020258929A1 (zh) Folder interface switching method and terminal device
WO2020134744A1 (zh) Icon moving method and mobile terminal
US20160292922A1 (en) Display control device, display control method, and recording medium
WO2021057337A1 (zh) Operation method and electronic device
US20170199662A1 (en) Touch operation method and apparatus for terminal
CN109471692B (zh) Display control method and terminal device
CN109828850B (zh) Information display method and terminal device
US9377901B2 (en) Display method, a display control method and electric device
US20160012612A1 (en) Display control method and system
WO2022161432A1 (zh) Display control method and apparatus, electronic device, and medium
KR102535334B1 (ko) Image display method and mobile terminal
WO2018010316A1 (zh) Method and apparatus for desktop page management
US11209914B1 (en) Method and apparatus for detecting orientation of electronic device, and storage medium
WO2020173316A1 (zh) Image display method, terminal and mobile terminal
CN108600544B (zh) One-handed control method and terminal
US20140359410A1 (en) Method and apparatus for gesture-based data processing
JP6540367B2 (ja) Display control device, communication terminal, communication system, display control method, and program
CN113655929 (zh) Adaptation processing method and apparatus for interface display, and electronic device
WO2015067023A1 (zh) Motion-sensing control method for video conferencing, terminal and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14889977

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14889977

Country of ref document: EP

Kind code of ref document: A1