WO2017028703A1 - Interface Object Categorization Method and Apparatus (界面对象归类方法和装置) - Google Patents

Interface Object Categorization Method and Apparatus

Info

Publication number: WO2017028703A1
Application number: PCT/CN2016/094105
Authority: WO — WIPO (PCT)
Prior art keywords: interface object, touch screen, points, interface, object container
Other languages: English (en), French (fr)
Inventor: Zhang Miao (张淼)
Original Assignee: Alibaba Group Holding Limited (阿里巴巴集团控股有限公司)
Application filed by Alibaba Group Holding Limited
Publication of WO2017028703A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present application belongs to the field of mobile communications, and in particular relates to an interface object categorization method and apparatus.
  • An intelligent terminal has an independent operating system and independent running space.
  • The user can install applications provided by third-party service providers, such as software, games, and navigation, and the terminal device can access wireless networks through a wireless communication network.
  • The present application provides an interface object categorization method and apparatus to solve the technical problems of the prior art in categorizing interface objects.
  • An interface object categorization method includes: detecting contacts at a plurality of points of a touch screen, the plurality of points forming a content collection area on the touch screen; when the contacts at the plurality of points gradually gather on the touch screen, gradually concentrating the display images of the interface objects in the content collection area; and, when it is detected that the contacts at the plurality of points have gathered to a preset size, creating and displaying an interface object container and saving the interface objects in the content collection area in the interface object container.
  • An interface object categorization method includes: detecting contacts on a plurality of interface objects displayed on the touch screen; when the contacts on the plurality of interface objects gradually gather on the touch screen, gradually concentrating the display images of the plurality of interface objects; and, when it is detected that the contacts on the interface objects have gathered into a preset range, creating and displaying an interface object container and saving the plurality of interface objects in the interface object container.
  • An interface object categorization method includes: detecting contacts at a plurality of points of a touch screen, the plurality of points forming a content collection area on the touch screen; detecting contacts at a plurality of points of the content collection area, and when those contacts gradually gather on the touch screen, gradually concentrating the display images of the interface objects in the content collection area; and, when it is detected that the contacts at the plurality of points of the content collection area have gathered into a preset range, creating and displaying an interface object container and saving the interface objects in the content collection area in the interface object container.
  • An interface object categorization device includes: a first detecting module configured to detect contacts at a plurality of points of the touch screen, the plurality of points forming a content collection area on the touch screen; a display module configured to gradually concentrate the display images of the interface objects in the content collection area when the contacts at the plurality of points gradually gather on the touch screen; and a first processing module configured to create and display an interface object container when it is detected that the contacts at the plurality of points have gathered to a preset size, the interface objects in the content collection area being saved in the interface object container.
  • An interface object categorization device includes: a second detecting module configured to detect contacts on a plurality of interface objects displayed on the touch screen and to gradually concentrate the display images of the plurality of interface objects when the contacts on them gradually gather on the touch screen; and a second processing module configured to create and display an interface object container when it is detected that the contacts on the plurality of interface objects have gathered into a preset range, the plurality of interface objects being saved in the interface object container.
  • An interface object categorization device includes: a fifth detecting module configured to detect contacts at a plurality of points of the touch screen, the plurality of points forming a content collection area on the touch screen; a sixth detecting module configured to detect contacts at a plurality of points of the content collection area and to gradually concentrate the display images of the interface objects in the content collection area when those contacts gradually gather on the touch screen; and a third processing module configured to create and display an interface object container when it is detected that the contacts at the plurality of points of the content collection area have gathered into a preset range, the interface objects in the content collection area being saved in the interface object container.
  • The present application can achieve the following technical effects: the operation steps for categorizing interface objects are simplified; categorization can be completed quickly through a multi-touch gesture operation, improving operation efficiency; and a simple, vivid mode of human-computer interaction is formed.
  • FIG. 1 is a schematic flowchart of a method for categorizing an interface object according to an embodiment of the present application;
  • FIG. 2 is a schematic diagram of an exemplary interface for forming a content collection area according to an embodiment of the present application;
  • FIG. 3 is a schematic diagram of an exemplary interface in which interface objects are compressed or cover each other in an embodiment of the present application;
  • FIG. 4(a) is a schematic diagram of an exemplary interface showing the upper-level directory while the interface objects are compressed in an embodiment of the present application;
  • FIG. 4(b) is a schematic diagram of an exemplary interface showing the interface object container created after categorization in an embodiment of the present application;
  • FIG. 5 is a schematic flowchart of a method for categorizing an interface object according to an embodiment of the present application;
  • FIG. 6 is a schematic diagram of an exemplary interface for determining selected interface objects in an embodiment of the present application;
  • FIG. 7 is a schematic diagram of an exemplary interface in which selected interface objects are compressed or cover each other according to an embodiment of the present application;
  • FIG. 8(a) is a schematic diagram of an exemplary interface showing the upper-level directory while the selected interface objects are compressed or cover each other according to an embodiment of the present application;
  • FIG. 8(b) is a schematic diagram of an exemplary interface showing the interface object container created after categorization in an embodiment of the present application;
  • FIG. 9 is a schematic flowchart of a method for categorizing an interface object according to an embodiment of the present application;
  • FIGS. 10(a)-(c) are schematic diagrams of exemplary interfaces of an interface object categorization method provided by an embodiment of the present application;
  • FIG. 11 is a schematic structural block diagram of an interface object categorization device according to an embodiment of the present application;
  • FIG. 12 is a schematic structural block diagram of an interface object categorization device according to an embodiment of the present application;
  • FIGS. 13(a)-(c) are schematic diagrams of exemplary interfaces for categorizing application icons in an embodiment of the present application;
  • FIGS. 14(a)-(c) are schematic diagrams of exemplary interfaces for categorizing picture files in an embodiment of the present application;
  • FIGS. 15(a)-(c) are schematic diagrams of exemplary interfaces for categorizing item list entries in an embodiment of the present application;
  • FIGS. 16(a)-(c) are schematic diagrams of exemplary interfaces for categorizing mailing list entries in an embodiment of the present application.
  • FIG. 1 shows a method for categorizing an interface object according to an embodiment of the present disclosure; the method is applicable to a terminal device and includes the following steps.
  • In step S10, contacts at a plurality of points of the touch screen are detected, and the plurality of points form a content collection area on the touch screen.
  • The terminal device supports a multi-touch function. Through a touch event triggered by the user's gesture operation, it detects contacts at a plurality of points of the touch screen and forms a content collection area on the touch screen from the plurality of points. For example, as shown in FIG. 2, when the touch screen detects contacts at a plurality of points, a content collection area is formed on the touch screen.
  • The interface objects displayed in the content collection area become the interface objects to be categorized.
  • An interface object may be an application icon displayed on the touch screen, a mail item in a mailing list, an item in a product list, a picture, or another interface object.
  • The detected contacts at the plurality of points may be any contacts with the touch screen: a contact may fall on an interface object of the interactive interface displayed by the touch screen, or at another position of the interactive interface. All the points in contact with the touch screen together form the content collection area, and the interface objects displayed in it become the interface objects to be categorized. A single interface object displayed wholly or partially within the content collection area becomes an interface object to be categorized.
  • In step S11, when it is detected that the contacts at the plurality of points of the touch screen gradually gather on the touch screen, the display images of the interface objects in the content collection area are gradually concentrated.
  • The terminal device can detect, from the contact positions, that the contacts at the plurality of points gradually gather on the touch screen. The interface objects displayed in the content collection area are then gradually concentrated as the user's gesture gathers; that is, their display images begin to approach each other.
  • In step S12, when it is detected that the contacts at the plurality of points of the touch screen have gathered to the preset size, an interface object container is created and displayed, and the interface objects in the content collection area are saved in the interface object container.
  • A threshold is preset in the background to detect whether the contacts at the plurality of points of the touch screen have gathered to a preset size; the preset size may be a pixel distance between the contact points of the user's fingers on the touch screen.
  • An instruction to create an interface object container is triggered, and an interface object container is created in the upper-level directory of the interface objects; the interface object container may be a folder or a directory in the corresponding directory hierarchy.
  • An instruction to transfer the interface objects in the content collection area is triggered, and those interface objects are transferred to the created interface object container for storage.
  • The created interface object container is displayed, and the interface objects of the previously formed content collection area have been saved in it.
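The flow of steps S10–S12 can be sketched in code. This is a minimal illustration, not the patent's implementation: the object model, the bounding-box collection area, and the 40-pixel preset size are all assumptions.

```python
# Sketch of steps S10-S12: a content collection area is formed from the
# initial contact points; when the contacts gather to a preset size, an
# interface object container (a folder) is created for the enclosed objects.
# All names and the 40-pixel preset size are illustrative assumptions.

def spread(points):
    """Largest pairwise distance (pixels) among the current contact points."""
    return max(
        ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
        for i, (x1, y1) in enumerate(points)
        for (x2, y2) in points[i + 1:]
    )

def collection_area(points):
    """Bounding box of the initial contact points (step S10)."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    return min(xs), min(ys), max(xs), max(ys)

def on_touch_move(initial_points, current_points, objects, preset_size=40.0):
    """Steps S11-S12: once the contacts have gathered to the preset size,
    create and return a container holding the objects in the collection area;
    otherwise return None (the display keeps concentrating the images)."""
    x0, y0, x1, y1 = collection_area(initial_points)
    if spread(current_points) > preset_size:
        return None  # S11: contacts are still gathering
    inside = [o for o in objects if x0 <= o["x"] <= x1 and y0 <= o["y"] <= y1]
    return {"name": "new folder", "items": inside}  # S12
```

For example, with initial contacts at (0, 0) and (100, 100), an icon displayed at (50, 50) is collected once the contacts close to within the preset size, while an icon at (500, 500) lies outside the collection area and is not.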
  • The embodiment of the present application forms a content collection area by detecting the user's multi-touch operation, automatically creates an interface object container following the user's pinch gesture, and automatically saves the interface objects in the content collection area to the container. This simplifies the operation steps of categorizing interface objects, allows categorization to be completed quickly through multi-touch gestures, improves operation efficiency, and forms a simple and vivid mode of human-computer interaction.
  • Step S12 further includes the following steps.
  • In step S121, when it is detected that the contacts at the plurality of points of the touch screen have gathered to a first preset size, the display images of the interface objects begin to compress or cover each other.
  • A first threshold and a second threshold are set in the background to determine the degree of convergence of the contacts at the plurality of points of the touch screen.
  • When the pixel distance between the contact points of the user's fingers on the touch screen is less than or equal to the first threshold, it is determined that the contacts have gathered to the first preset size.
  • At this time the display images of the interface objects are already highly concentrated, and they are further compressed or cover each other, as shown in FIG. 3, further enhancing the visual concentration of the interface objects so as to guide the user to continue the gathering gesture.
  • In step S122, when it is detected that the contacts at the plurality of points of the touch screen have gathered to the second preset size, an interface object container is created and displayed, and the interface objects in the content collection area are saved in it.
  • The interface object container is automatically created and the interface objects in the content collection area are saved in it, which simplifies the operation steps of categorizing interface objects.
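The two-threshold scheme of steps S121–S122 amounts to mapping the current spread of the contacts to a display stage. A minimal sketch, assuming invented threshold values:

```python
# Two background thresholds (values are illustrative assumptions): past the
# first, the icon images start to compress/cover each other (S121); past the
# second, the interface object container is created (S122).
FIRST_PRESET_SIZE = 120.0   # assumed pixel value for the first threshold
SECOND_PRESET_SIZE = 40.0   # assumed pixel value for the second threshold

def gather_stage(contact_spread):
    """Map the current spread (pixels) of the contact points to a stage."""
    if contact_spread <= SECOND_PRESET_SIZE:
        return "create_container"   # S122: create and show the container
    if contact_spread <= FIRST_PRESET_SIZE:
        return "compress_icons"     # S121: icons compress or cover each other
    return "concentrate_icons"      # S11: images merely move closer together
```

Checking the second (smaller) threshold first keeps the stages mutually exclusive as the spread shrinks.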
  • In step S11, when it is detected that the contacts at the plurality of points of the touch screen gradually gather on the touch screen, the display images of the interface objects in the content collection area are gradually concentrated while the upper-level file directory of the interface objects is displayed.
  • The upper-level file directory shows the other file directories at the level where the interface object container created in step S12 is located. For example, as shown in FIG. 4(a), when the display images of the interface objects in the content collection area are gradually concentrated and the upper-level file directory of the interface objects is simultaneously displayed, including folder 1 and folder 2, the interface object container created in step S12 is at the same level as folder 1 and folder 2.
  • In step S12, when it is detected that the contacts at the plurality of points of the touch screen have gathered to the preset size, an interface object container is created and displayed, the interface objects in the content collection area are saved in it, and the created interface object container is displayed in the upper-level file directory of the interface objects.
  • The interface objects in the content collection area are saved in the created interface object container, and the categorized interface objects can be browsed by opening it.
  • the interface object categorization method further includes the following steps.
  • In step S13, the user is prompted to rename the created interface object container.
  • After the interface objects in the content collection area are saved in the created interface object container, the user is prompted to rename it.
  • When an interface object container is created, a default name is usually assigned, such as "new folder" or "new category".
  • The prompt may automatically change the file name of the created interface object container to an editable state, or prompt the user by text to rename it.
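The default-naming behaviour mentioned above could, for instance, be implemented as follows; the numbering scheme is an assumption, not something the application specifies:

```python
def default_container_name(existing_names, base="new folder"):
    """Pick a default name ('new folder', 'new folder 2', ...) for a newly
    created interface object container before the rename prompt is shown.
    The suffix-numbering scheme is an illustrative assumption."""
    if base not in existing_names:
        return base
    n = 2
    while f"{base} {n}" in existing_names:
        n += 1
    return f"{base} {n}"
```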
  • the interface object categorization method further includes the following steps.
  • In step S14, contacts at a plurality of points of the interface object container are detected, and when the contacts at the plurality of points of the interface object container gradually spread apart on the touch screen, the interface object container is deleted and the plurality of interface objects it saved are displayed in the current file directory.
  • The terminal device detects that the contacts at the plurality of points of the interface object container gradually spread apart on the touch screen; that is, it detects the user's opening gesture on the touch screen. An instruction to transfer the interface objects saved in the interface object container to the current file directory is then triggered. After the transfer is completed the interface object container is deleted, the touch screen displays the interface objects transferred out of it, and the interface object container has been removed from the current file directory.
  • In other words, the user can make an opening gesture over the display image of the interface object container created during categorization, and when the terminal device detects the gesture, the interface objects saved in the container are displayed.
  • The interface object container is deleted so that, when interface objects are categorized incorrectly, the display state before categorization can be restored quickly and easily, further simplifying user operation and improving operation efficiency.
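The reverse, opening gesture of step S14 might be sketched as below; the dictionary container and list directory are assumed representations, not the patent's data model:

```python
def on_open_gesture(container, current_directory, prev_spread, curr_spread):
    """Step S14: if the contacts on the container spread apart, transfer its
    saved interface objects back into the current file directory and delete
    the container (signalled by returning None); otherwise keep it."""
    if curr_spread > prev_spread:                     # contacts spreading apart
        current_directory.extend(container["items"])  # transfer objects out
        container["items"].clear()
        return None                                   # container is deleted
    return container
```

Returning None models deletion: after the transfer, the directory holds the objects and no longer contains the container.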
  • The contacts at the plurality of points of the touch screen may also be detected through a touchpad or a virtual touch space.
  • When the terminal device is configured with a touch panel, the user completes contact with a plurality of points of the touch screen by performing a gesture operation on the touch panel.
  • When the terminal device is equipped with an optical touch system, a virtual touch area is formed in the surrounding three-dimensional space by infrared light, and the user completes contact with a plurality of points of the touch screen by a gesture operation in the virtual touch area.
  • FIG. 5 shows a method for categorizing an interface object according to an embodiment of the present disclosure; the method is applicable to a terminal device and includes the following steps.
  • In step S20, contacts on a plurality of interface objects displayed on the touch screen are detected, and when the contacts on the plurality of interface objects gradually gather on the touch screen, the display images of the plurality of interface objects are gradually concentrated.
  • The terminal device supports a multi-touch function, and through a touch event triggered by the user's gesture operation it detects contacts on a plurality of interface objects displayed on the touch screen, as shown in FIG. 6. In this method the user's gesture must touch the display positions of the interface objects, and the interface objects on which contact is detected are treated as the interface objects to be categorized.
  • An interface object may be an application icon displayed on the touch screen, a mail item in a mailing list, an item in a product list, a picture, or another interface object.
  • The terminal device can detect, from the contact positions, that the contacts on the plurality of interface objects gradually gather on the touch screen. The display images of the plurality of interface objects then gradually become concentrated as the user's gesture gathers; that is, they begin to approach each other.
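The difference from the FIG. 1 method is that here each object is selected only if a contact lands on its displayed image. A possible hit test, with an assumed rectangular-bounds object model:

```python
def select_touched_objects(contacts, objects):
    """Step S20: an interface object is selected only if some contact point
    falls within its displayed image (modelled here, as an assumption, by an
    axis-aligned bounding rectangle per object)."""
    selected = []
    for obj in objects:
        x0, y0, x1, y1 = obj["bounds"]
        if any(x0 <= cx <= x1 and y0 <= cy <= y1 for cx, cy in contacts):
            selected.append(obj)
    return selected
```

Objects no contact touches are left out of the selection, unlike the collection-area method, where merely lying inside the area suffices.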
  • In step S21, when it is detected that the contacts on the plurality of interface objects have gathered into the preset range, an interface object container is created and displayed, and the plurality of interface objects are saved in the interface object container.
  • A threshold is preset in the background to detect whether the contacts on the plurality of interface objects displayed on the touch screen have gathered to a preset size; the threshold may be a pixel distance between the contact points of the user's fingers on the touch screen.
  • An instruction to create an interface object container is triggered, and an interface object container is created in the upper-level directory of the interface objects; the interface object container may be a folder or a directory in the corresponding directory hierarchy.
  • An instruction to transfer the selected plurality of interface objects is triggered, and the selected interface objects are transferred to the created interface object container for saving.
  • The created interface object container is displayed, and the selected plurality of interface objects have been saved in it.
  • The embodiment of the present application determines the plurality of selected interface objects by detecting the user's multi-touch operation, automatically creates an interface object container following the user's pinch gesture, and automatically saves the selected interface objects to the container. This simplifies the operation steps of categorizing interface objects, allows categorization to be completed quickly through multi-touch gestures, improves operation efficiency, and forms a simple and vivid mode of human-computer interaction.
  • step S20 further includes the following steps.
  • In step S201, when it is detected that the contacts on the plurality of interface objects displayed on the touch screen have gathered to the first preset size, the display images of the plurality of interface objects begin to compress or cover each other.
  • A first threshold and a second threshold are set in the background to determine the degree of convergence of the contacts on the plurality of interface objects displayed on the touch screen.
  • When the pixel distance between the contact points of the user's fingers on the touch screen is less than or equal to the first threshold, it is determined that the contacts on the plurality of displayed interface objects have gathered to the first preset size.
  • At this time the display images of the interface objects are already highly concentrated, and the display images of the selected interface objects are further compressed or cover each other, as shown in FIG. 7, further enhancing the visual concentration of the interface objects so as to guide the user to continue gathering the selected interface objects.
  • In step S202, when it is detected that the contacts on the plurality of interface objects displayed on the touch screen have gathered to the second preset size, an interface object container is created and displayed, and the selected plurality of interface objects are saved in it.
  • The interface object container is automatically created and the selected interface objects are saved in it, which simplifies the operation steps of categorizing interface objects.
  • In step S20, when it is detected that the contacts on the plurality of interface objects displayed on the touch screen gradually gather, the display images of the selected interface objects are gradually concentrated while the upper-level file directory of the plurality of interface objects is simultaneously displayed.
  • The upper-level file directory shows the other file directories at the level where the interface object container created in step S21 is located. For example, as shown in FIG. 8(a), when the display images of the selected interface objects are gradually concentrated and the upper-level file directory of the plurality of interface objects is simultaneously displayed, including folder 1 and folder 2, the interface object container created in step S21 is at the same level as folder 1 and folder 2.
  • In step S21, when it is detected that the contacts on the plurality of interface objects displayed on the touch screen have gathered to a preset size, an interface object container is created and displayed, the selected interface objects are saved in it, and the created interface object container is displayed in the upper-level file directory of the plurality of interface objects.
  • A new interface object container (for example, folder 3) is displayed in the upper-level file directory of the plurality of interface objects, that is, at the directory level of the created interface object container, which includes the newly created container and the other interface object containers or files at that level.
  • The selected interface objects are saved in the created interface object container and can be browsed by opening it.
  • the interface object categorization method further includes the following steps.
  • In step S22, the user is prompted to rename the created interface object container.
  • After the selected interface objects are saved in the created interface object container, the user is prompted to rename it.
  • When an interface object container is created, a default name is usually assigned, such as "new folder" or "new category".
  • The prompt may automatically change the file name of the created interface object container to an editable state, or prompt the user by text to rename it.
  • the interface object categorization method further includes the following steps.
  • In step S23, contacts at a plurality of points of the interface object container are detected, and when the contacts at the plurality of points of the interface object container gradually spread apart on the touch screen, the interface object container is deleted and the plurality of interface objects it saved are displayed in the current file directory.
  • The terminal device detects that the contacts at the plurality of points of the interface object container gradually spread apart on the touch screen; that is, it detects the user's opening gesture on the touch screen. An instruction to transfer the interface objects saved in the interface object container to the current file directory is then triggered. After the transfer is completed the interface object container is deleted, the touch screen displays the interface objects transferred out of it, and the interface object container has been removed from the current file directory.
  • In other words, the user can make an opening gesture over the display image of the interface object container created during categorization, and when the terminal device detects the gesture, the interface objects saved in the container are displayed.
  • The interface object container is deleted so that, when interface objects are categorized incorrectly, the display state before categorization can be restored quickly and easily, further simplifying user operation and improving operation efficiency.
  • When detecting the contacts on the plurality of interface objects displayed on the touch screen, the terminal device may detect them through a touchpad or a virtual touch space.
  • When the terminal device is configured with a touch panel, the user completes the contacts on the plurality of displayed interface objects by performing a gesture operation on the touch panel.
  • When the terminal device is equipped with an optical touch system, a virtual touch area is formed in the surrounding three-dimensional space by infrared light, and the user completes the contacts on the plurality of displayed interface objects by a gesture operation in the virtual touch area.
  • FIG. 9 is a method for categorizing an interface object according to an embodiment of the present application, which is applicable to a terminal device, and the method includes the following steps.
  • step S30 contacts for a plurality of points of the touch screen are detected, the plurality of points forming a content collection area on the touch screen.
  • The contacts with the plurality of points on the touch screen are non-simultaneous contacts. When it is inconvenient to determine the content collection area, or the interface objects to be categorized, directly with a single gesture, a plurality of points are determined one after another by successive contacts on the touch screen, and the points so determined enclose a content collection area.
  • As shown in FIG. 10(a), the user successively determines a plurality of contact points on the touch screen to enclose the content collection area.
  • The user can determine the contact points on the touch screen according to the display positions of the interface objects to be categorized, and the area enclosed by these contact points serves as the content collection area.
  • The interface objects enclosed by the content collection area are treated as the interface objects to be categorized.
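Which displayed objects count as "enclosed" by the content collection area can be sketched with a standard point-in-polygon (ray-casting) test. This is an illustrative assumption rather than the patent's implementation; the object coordinates, dictionary fields, and function names are hypothetical.

```python
# Hypothetical sketch: decide which interface objects fall inside the
# content collection area enclosed by the user's touch points.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside the polygon of touch points?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray cast to the right of (x, y)
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def objects_to_categorize(touch_points, objects):
    """Return the objects whose display centers lie inside the area."""
    return [o for o in objects if point_in_polygon(o["x"], o["y"], touch_points)]

area = [(0, 0), (200, 0), (200, 200), (0, 200)]  # three or more contact points
icons = [{"name": "Mail", "x": 50, "y": 50},
         {"name": "Maps", "x": 300, "y": 50}]
print([o["name"] for o in objects_to_categorize(area, icons)])  # → ['Mail']
```

A production implementation would instead intersect each object's bounding box with the area, since the description also counts partially covered objects.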
  • In step S31, contact with a plurality of points in the content collection area is detected, and as the contacts gradually gather on the touch screen, the display images of the interface objects within the content collection area gradually converge.
  • When the user makes a pinch gesture on the content collection area, the terminal device detects the contacts with the plurality of points in the area gradually gathering on the touch screen. The display images of the interface objects in the content collection area then gradually converge, as shown in FIG. 10(b). That is, after the user has enclosed a content collection area on the touch screen, the terminal device categorizes the interface objects within the area when it detects a pinch gesture made inside it.
  • In step S32, when it is detected that the contacts with the plurality of points in the content collection area have gathered to within a preset range, an interface object container is created and displayed, and the interface objects within the content collection area are saved in the container.
  • A threshold is preset in the background to detect whether the contacts with the plurality of points in the content collection area have gathered to the preset size; the threshold may be the distance, in pixels, between the user's finger contact points on the touch screen.
  • When the threshold is reached, an instruction to create an interface object container is triggered, and a container is created in the parent directory of the interface objects; the container may be a folder or a directory at the corresponding directory level.
  • When the container is created successfully, an instruction to transfer the interface objects in the content collection area is triggered, and the objects are transferred into the created container for storage.
  • As shown in FIG. 10(c), the interface displayed on the touch screen shows the created interface object container, and the interface objects from the content collection area have been saved in it.
  • In this embodiment, the user successively determines a plurality of points within the display range of the touch screen, and the content collection area is enclosed by these points.
  • After detecting the user's gesture operation in the content collection area, the terminal device automatically creates an interface object container and automatically saves the interface objects in the area into it. This simplifies the steps of categorizing interface objects: a multi-touch gesture operation completes the categorization quickly, improving operation efficiency and forming a simple, vivid mode of human-computer interaction.
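The gather-and-create flow of steps S30–S32 can be sketched as a small event handler: track how far apart the contact points are during the pinch, and once the spread shrinks below a preset pixel threshold, create a container in the parent directory and move the collected objects into it. The threshold value, object representation, and function names are assumptions for illustration only.

```python
from itertools import combinations
from math import hypot

GATHER_THRESHOLD_PX = 40  # assumed "preset size" threshold, in pixels

def contact_spread(points):
    """Largest pairwise distance between the current contact points."""
    return max(hypot(x1 - x2, y1 - y2)
               for (x1, y1), (x2, y2) in combinations(points, 2))

def on_pinch_update(points, collected_objects, directory):
    """Called on each touch-move event while the pinch is in progress."""
    if contact_spread(points) > GATHER_THRESHOLD_PX:
        return None  # still gathering; keep animating icons toward each other
    # Gathered to the preset size: create the container in the parent
    # directory and move the collected objects into it.
    container = {"name": "New Folder", "items": list(collected_objects)}
    for obj in collected_objects:
        if obj in directory:
            directory.remove(obj)
    directory.append(container)
    return container

parent_dir = [{"name": "Mail"}, {"name": "Maps"}]
result = on_pinch_update([(100, 100), (110, 108)], list(parent_dir), parent_dir)
print(result["name"], [o["name"] for o in result["items"]])  # → New Folder ['Mail', 'Maps']
```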
  • In one embodiment, step S32 further includes the following steps.
  • In step S321, when it is detected that the contacts with the plurality of points in the content collection area have gathered to a first preset size, the display images of the interface objects within the area begin to compress or cover one another.
  • A first threshold and a second threshold are set in the background to determine the degree to which the contacts with the plurality of points in the content collection area have gathered.
  • When the pixel distance between the user's finger contact points on the touch screen is less than or equal to the first threshold, it is determined that the contacts have gathered to the first preset size.
  • At this point the display images of the interface objects are already highly concentrated, and the display images of the interface objects in the content collection area are further compressed or overlapped, visually reinforcing their concentration and guiding the user to continue gathering the selected interface objects.
  • In step S322, when it is detected that the contacts with the plurality of points in the content collection area have gathered to the second preset size, an interface object container is created and displayed, and the interface objects within the area are saved in the container.
  • At this point the interface object container is created automatically and the interface objects in the content collection area are saved in it, simplifying the steps of categorizing interface objects.
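The two-threshold logic of steps S321–S322 amounts to a three-state classification of the pinch: a looser first threshold switches the icons into a "compress/overlap" visual state to guide the user, and a tighter second threshold actually creates the container. The numeric values and state names below are illustrative assumptions.

```python
# Hypothetical two-threshold sketch of steps S321–S322.

FIRST_PRESET_PX = 80    # assumed first threshold (start compressing icons)
SECOND_PRESET_PX = 30   # assumed second threshold (create the container)

def classify_gather_state(spread_px):
    if spread_px <= SECOND_PRESET_PX:
        return "create_container"   # save the objects into a new folder/directory
    if spread_px <= FIRST_PRESET_PX:
        return "compress_icons"     # icons begin to compress or cover each other
    return "gathering"              # icons merely move closer together

for spread in (120, 60, 20):
    print(spread, "->", classify_gather_state(spread))
```

Checking the tighter threshold first keeps the states mutually exclusive even though both conditions hold once the spread is below the second threshold.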
  • In one embodiment, in step S31, when it is detected that the contacts with the plurality of points in the content collection area are gradually gathering on the touch screen, the display images of the interface objects within the area gradually converge while the parent file directory of the interface objects is displayed at the same time.
  • For example, in an interface displaying application icons, the display images of the icons in the enclosed content collection area gradually converge while the other folders used for categorizing application icons are displayed.
  • In step S32, the created interface object container is displayed in the parent file directory of the interface objects in the content collection area.
  • That is, the created interface object container is displayed together with the other contents of the parent file directory.
  • For example, both the created folder and the other folders used for categorizing application icons are displayed.
  • In one embodiment, the interface object categorization method further includes the following steps.
  • In step S33, a prompt is issued to rename the created interface object container.
  • After the interface objects in the content collection area have been saved in the created container, the user is prompted to rename it.
  • When a container is created automatically, a default name is usually assigned to it, such as "New Folder" or "New Category".
  • The user is then prompted to rename the created container according to his or her own classification of the interface objects.
  • The prompt may take the form of automatically making the file name of the created container editable, or of a text message asking the user to rename it.
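Assigning a default name before the rename prompt typically requires avoiding collisions with existing names. A minimal sketch, assuming "New Folder"-style numbering (the description mentions defaults such as "New Folder" and "New Category"; the numbering scheme here is an assumption):

```python
# Hypothetical sketch of the default-name step before the step-S33 rename prompt.

def default_container_name(existing_names, base="New Folder"):
    """Pick the first unused default name, numbering duplicates."""
    if base not in existing_names:
        return base
    n = 2
    while f"{base} {n}" in existing_names:
        n += 1
    return f"{base} {n}"

print(default_container_name(set()))                           # → New Folder
print(default_container_name({"New Folder"}))                  # → New Folder 2
print(default_container_name({"New Folder", "New Folder 2"}))  # → New Folder 3
```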
  • In one embodiment, the interface object categorization method further includes the following steps.
  • In step S34, contact with a plurality of points on the interface object container is detected, and when the contacts gradually spread apart on the touch screen, the container is deleted from the current file directory and the interface objects it saved are displayed there.
  • When the user touches the interface object container displayed on the touch screen with two or more fingers and the terminal device detects the contacts gradually spreading apart, i.e. a spread gesture on the touch screen, an instruction is triggered to transfer the interface objects saved in the container to the current file directory. After the transfer is complete, the container is deleted; the touch screen displays the interface objects transferred out of the container, and the container itself is removed from the current file directory.
  • When the user believes the categorization is wrong, he or she can make a spread gesture on the displayed image of the interface object container created during categorization; when the terminal device detects the gesture, it displays the interface objects saved in the container and deletes the container.
  • In this way, when interface objects have been categorized incorrectly, the display state before categorization can be restored quickly and conveniently, further simplifying user operation and improving operation efficiency.
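The undo path of step S34 is the mirror image of the pinch: once the spread of the contacts on the container exceeds a threshold, the container's objects are moved back to the current directory and the container itself is removed. The threshold, data shapes, and names below are illustrative assumptions, not the patent's code.

```python
# Hypothetical sketch of the step-S34 spread (reverse-pinch) undo gesture.

SPREAD_THRESHOLD_PX = 120  # assumed: gesture recognized past this contact spread

def on_spread_update(spread_px, container, directory):
    """Called while two or more fingers on the container move apart."""
    if spread_px >= SPREAD_THRESHOLD_PX and container in directory:
        directory.remove(container)           # delete the container...
        directory.extend(container["items"])  # ...and restore its saved objects
        return True
    return False

folder = {"name": "New Folder", "items": [{"name": "Mail"}, {"name": "Maps"}]}
current_dir = [folder]
on_spread_update(150, folder, current_dir)
print([o["name"] for o in current_dir])  # → ['Mail', 'Maps']
```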
  • In one embodiment, the contacts with the plurality of points on the touch screen may be detected through a touchpad or a virtual touch space.
  • When the terminal device is equipped with a touchpad, the user completes the contacts with the plurality of points that enclose the content collection area by operating the touchpad.
  • Alternatively, the terminal device is equipped with an optical touch system that uses infrared light to form a virtual touch area in the surrounding three-dimensional space, and the user completes the contacts that enclose the content collection area by operating within the virtual touch area.
  • Likewise, the contacts with the plurality of points in the content collection area may be detected through the touchpad or the virtual touch space in order to recognize gesture operations on the area.
  • FIG. 11 shows an interface object categorization apparatus according to an embodiment of the present application, including:
  • a first detecting module 30, configured to detect contact with a plurality of points on the touch screen, the plurality of points forming a content collection area on the touch screen;
  • a display module 31, configured to gradually converge the display images of the interface objects in the content collection area when the contacts with the plurality of points are detected gradually gathering on the touch screen;
  • a first processing module 32, configured to create and display an interface object container when the contacts with the plurality of points have gathered to a preset size, the interface objects in the content collection area being saved in the container.
  • In one embodiment, the first processing module 32 includes:
  • a first processing submodule, configured so that, when it is detected that the contacts with the plurality of points have gathered to the first preset size, the display images of the interface objects begin to compress or cover one another;
  • a second processing submodule, configured to create and display an interface object container when it is detected that the contacts have gathered to the second preset size, the interface objects in the content collection area being saved in the container.
  • In one embodiment, the display module 31 includes:
  • a first display submodule, configured to gradually converge the display images of the interface objects in the content collection area when the contacts with the plurality of points are detected gradually gathering on the touch screen, while simultaneously displaying the parent file directory of the interface objects.
  • In one embodiment, the first processing module 32 includes:
  • a second display submodule, configured to display the created interface object container in the parent file directory of the interface objects.
  • In one embodiment, the apparatus further comprises:
  • a first prompt module, configured to prompt the user to rename the created interface object container.
  • FIG. 12 shows an interface object categorization apparatus according to an embodiment of the present application, including:
  • a second detecting module 40, configured to detect contact with a plurality of interface objects displayed on the touch screen, the display images of the interface objects gradually converging as the contacts gradually gather on the touch screen;
  • a second processing module 41, configured to create and display an interface object container when the contacts with the plurality of interface objects have gathered to within a preset range, the plurality of interface objects being saved in the container.
  • In one embodiment, the second processing module 41 includes:
  • a third processing submodule, configured so that, when it is detected that the contacts with the plurality of interface objects have gathered to the first preset size, the display images of the interface objects begin to compress or cover one another;
  • a fourth processing submodule, configured to create and display an interface object container when it is detected that the contacts have gathered to the second preset size, the plurality of interface objects being saved in the container.
  • In one embodiment, the second detecting module 40 includes:
  • a third display submodule, configured to gradually converge the display images of the plurality of interface objects when the contacts with them gradually gather on the touch screen, while simultaneously displaying the parent file directory of the interface objects.
  • In one embodiment, the second processing module 41 includes:
  • a fourth display submodule, configured to display the created interface object container in the parent file directory of the plurality of interface objects.
  • In one embodiment, the apparatus further comprises:
  • a second prompt module, configured to prompt the user to rename the created interface object container.
  • The interface object categorization method and apparatus provided by the embodiments of the present application are further illustrated below with a few application scenarios.
  • When categorizing application icons displayed on the touch screen, the terminal device detects contact with a plurality of points on the touch screen, and the plurality of points form a content collection area. As shown in FIG. 13(a), the user's contact with three points on the touch screen is detected; the three points form a content collection area, and the application icons displayed within it are the icons to be categorized.
  • When the user keeps the fingers in contact with the three points and makes a pinch gesture, the terminal device detects the contacts gradually gathering on the touch screen, and the application icons in the content collection area gradually converge toward one another, as shown in FIG. 13(b).
  • When the contacts with the three points have gathered to the preset size, a folder is created, and its icon is displayed in the interface shown by the terminal device, as shown in FIG. 13(c). The application icons from the content collection area have been saved in the created folder, and the user can tap the folder icon to see them.
  • Application icons can also be categorized from a selection: the user selects the icons to be grouped into one category by multi-touch, and the selected icons are grouped into a folder with a pinch gesture. After the icons have been categorized, the user is prompted to rename the created folder, for example naming it "Social Life", "Utilities", and so on according to the category of the application icons.
  • When categorizing picture files displayed on the touch screen, the terminal device detects contact with a plurality of picture files displayed on the touch screen. As shown in FIG. 14(a), the user's contact with three displayed picture files is detected, and these three files are treated as the picture files to be categorized.
  • When the user keeps contact with the three picture files and makes a pinch gesture, the terminal device detects the contacts gradually gathering on the touch screen, and the icons of the three picture files gradually converge toward one another, as shown in FIG. 14(b).
  • When the contacts have gathered to the preset size, a folder is created, and its icon is displayed in the interface shown by the terminal device, as shown in FIG. 14(c). The three selected picture files have been saved in the created folder, and the user can tap the folder icon to see them.
  • Picture files can also be categorized by forming a content collection area on the touch screen: the user encloses a content collection area by multi-touch, the picture files displayed within the area are treated as the files to be categorized, and a pinch gesture groups them into a folder. The user is then prompted to rename the folder, for example with names such as "March 2012 Trip" or "Group Photo".
  • When categorizing item list entries displayed on the touch screen, the terminal device detects contact with a plurality of points on the touch screen, and the plurality of points form a content collection area. As shown in FIG. 15(a), the user's contact with two points on the touch screen is detected, and the two points form a content collection area; the item list entries displayed within the area are the entries to be categorized. The content collection area need not completely cover each entry: an item list entry is treated as an entry to be categorized as long as it is partially displayed within the area.
  • When the user keeps contact with the two points and makes a pinch gesture, the terminal device detects the contacts gradually gathering on the touch screen, and the item list entries in the content collection area gradually converge toward one another, as shown in FIG. 15(b), while the parent directories of the entries, for example "Food & Beverage", "Milk Powder", and "Clothing", are displayed at the same time.
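The partial-coverage rule for list entries (an entry counts as selected as soon as any part of it lies inside the collection area) reduces to a rectangle-intersection test if the area is approximated by the bounding box of the contact points. Everything below — the approximation, coordinates, and names — is an illustrative assumption.

```python
# Hypothetical sketch: select every list entry whose row rectangle partially
# overlaps the collection area (bounding box of the contact points).

def bounding_box(points):
    xs, ys = zip(*points)
    return min(xs), min(ys), max(xs), max(ys)

def rects_overlap(a, b):
    """Do two axis-aligned rectangles (x1, y1, x2, y2) overlap at all?"""
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    return ax1 < bx2 and bx1 < ax2 and ay1 < by2 and by1 < ay2

area = bounding_box([(0, 90), (300, 150)])         # two contact points
entries = [("Milk Powder A", (0, 80, 300, 120)),   # partially inside -> selected
           ("Clothing B",    (0, 300, 300, 340))]  # fully outside   -> not selected
print([name for name, rect in entries if rects_overlap(rect, area)])
```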
  • As shown in FIG. 15(c), when the contacts with the two points have gathered to the preset size, a directory is created, and its entry is displayed in the interface shown by the terminal device. The item list entries from the content collection area have been saved in the created directory, and the user can tap the directory entry to see them.
  • Item list entries can also be categorized from a selection: the user selects the entries to be grouped into one category by multi-touch, and the selected entries are grouped into a directory with a pinch gesture. After the entries have been categorized, the user is prompted to rename the created directory, for example "Home Appliances", "Daily Goods", and so on according to the category of the entries.
  • When categorizing mailing list entries displayed on the touch screen, the terminal device detects contact with a plurality of points on the touch screen, and the plurality of points form a content collection area. As shown in FIG. 16(a), the user's contact with two points is detected, and the two points form a content collection area; the mailing list entries displayed within the area are the entries to be categorized. The area need not completely cover each entry: a mailing list entry is treated as an entry to be categorized as long as it is partially displayed within the area.
  • When the user keeps contact with the two points and makes a pinch gesture, the terminal device detects the contacts gradually gathering on the touch screen, and the mailing list entries in the content collection area gradually converge toward one another, as shown in FIG. 16(b), while the parent directories of the entries, for example "Customer A" and "Customer B", are displayed at the same time.
  • As shown in FIG. 16(c), when the contacts with the two points have gathered to the preset size, a directory is created, and its entry is displayed in the interface shown by the terminal device. The mailing list entries have been saved in the created directory, and the user can tap the directory entry to see them.
  • Mailing list entries can also be categorized from a selection: the user selects the entries to be grouped into one category by multi-touch, and the selected entries are grouped into a directory with a pinch gesture. After the entries have been categorized, the user is prompted to rename the created directory, for example "Customer C", "Internal Notification", and so on according to the category of the entries.
  • a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • the memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in a computer readable medium, such as read only memory (ROM) or flash memory.
  • Memory is an example of a computer readable medium.
  • Computer readable media include permanent and non-permanent, removable and non-removable media.
  • Information storage can be implemented by any method or technology.
  • the information can be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic tape cartridges, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by a computing device.
  • As defined herein, computer readable media do not include transitory computer readable media, such as modulated data signals and carrier waves.
  • For example, if a first device is coupled to a second device, the first device may be directly electrically coupled to the second device, or indirectly electrically coupled to the second device through other devices or coupling means.
  • The above description shows merely preferred embodiments of the present application and is not intended to limit the application in any form. The scope of protection of the application is defined by the appended claims.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An interface object categorization method includes: detecting contact with a plurality of points on a touch screen, the plurality of points forming a content collection area on the touch screen (S10); when the contacts with the plurality of points are detected gradually gathering on the touch screen, gradually converging the display images of the interface objects within the content collection area (S11); and when the contacts are detected to have gathered to a preset size, creating and displaying an interface object container, the interface objects within the content collection area being saved in the container (S12). The method simplifies the steps of categorizing interface objects, improves operation efficiency, and forms a simple, vivid mode of human-computer interaction.

Description

Interface object categorization method and apparatus
This application claims priority to Chinese Patent Application No. 201510507368.1, filed on August 18, 2015 and entitled "Interface object categorization method and apparatus", the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of mobile communications, and in particular to an interface object categorization method and apparatus.
Background
A smart terminal is a terminal device that has an independent operating system and independent running space, on which the user can install applications provided by third-party service providers, such as software, games, and navigation, and that can access wireless networks through a wireless communication network. The rapid spread of smart terminals and mobile Internet technology has profoundly changed the way people communicate and live. Through a smart terminal, people can browse all kinds of information anytime and anywhere, such as text, voice, pictures, and video, and can install whatever applications they like, making the device's functions ever more complete.
At present, categorizing the content displayed in an interface of a smart terminal requires the following steps. As shown in FIG. 1, a new folder or category container is first created in the interface; the content to be categorized is then moved or dragged into the new folder or container, completing the categorization operation. Usually, however, the content to be categorized and the folder or container are not at the same level of the file system, so categorization requires many operation steps and gives the user little guidance.
Summary
In view of this, the present application provides an interface object categorization method and apparatus to solve the prior-art technical problem that categorizing interface objects requires many operation steps.
To solve the above technical problem, the present application discloses an interface object categorization method, including: detecting contact with a plurality of points on a touch screen, the plurality of points forming a content collection area on the touch screen; when the contacts with the plurality of points are detected gradually gathering on the touch screen, gradually converging the display images of the interface objects within the content collection area; and when the contacts are detected to have gathered to a preset size, creating and displaying an interface object container, the interface objects within the content collection area being saved in the container.
To solve the above technical problem, the present application further discloses an interface object categorization method, including: detecting contact with a plurality of interface objects displayed on a touch screen, the display images of the interface objects gradually converging as the contacts gradually gather on the touch screen; and when the contacts with the plurality of interface objects are detected to have gathered to within a preset range, creating and displaying an interface object container, the plurality of interface objects being saved in the container.
To solve the above technical problem, the present application further discloses an interface object categorization method, including: detecting contact with a plurality of points on a touch screen, the plurality of points forming a content collection area on the touch screen; detecting contact with a plurality of points in the content collection area, the display images of the interface objects within the area gradually converging as the contacts gradually gather on the touch screen; and when the contacts with the plurality of points in the area are detected to have gathered to within a preset range, creating and displaying an interface object container, the interface objects within the area being saved in the container.
To solve the above technical problem, the present application further discloses an interface object categorization apparatus, including: a first detecting module, configured to detect contact with a plurality of points on a touch screen, the plurality of points forming a content collection area on the touch screen; a display module, configured to gradually converge the display images of the interface objects within the content collection area when the contacts with the plurality of points are detected gradually gathering on the touch screen; and a first processing module, configured to create and display an interface object container when the contacts are detected to have gathered to a preset size, the interface objects within the content collection area being saved in the container.
To solve the above technical problem, the present application further discloses an interface object categorization apparatus, including: a second detecting module, configured to detect contact with a plurality of interface objects displayed on a touch screen, the display images of the interface objects gradually converging as the contacts gradually gather on the touch screen; and a second processing module, configured to create and display an interface object container when the contacts with the plurality of interface objects are detected to have gathered to within a preset range, the plurality of interface objects being saved in the container.
To solve the above technical problem, the present application further discloses an interface object categorization apparatus, including: a fifth detecting module, configured to detect contact with a plurality of points on a touch screen, the plurality of points forming a content collection area on the touch screen; a sixth detecting module, configured to detect contact with a plurality of points in the content collection area, the display images of the interface objects within the area gradually converging as the contacts gradually gather on the touch screen; and a third processing module, configured to create and display an interface object container when the contacts with the plurality of points in the area are detected to have gathered to within a preset range, the interface objects within the area being saved in the container.
Compared with the prior art, the present application can achieve the following technical effects: the steps of categorizing interface objects are simplified, the categorization can be completed quickly through multi-touch gesture operations, operation efficiency is improved, and a simple, vivid mode of human-computer interaction is formed.
Of course, any product implementing the present application does not necessarily need to achieve all of the above technical effects at the same time.
Brief Description of the Drawings
The drawings described here are provided for further understanding of the present application and constitute a part of it; the illustrative embodiments of the application and their description serve to explain the application and do not unduly limit it. In the drawings:
FIG. 1 is a schematic flowchart of an interface object categorization method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an exemplary interface in which a content collection area is formed according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an exemplary interface in which interface objects compress or cover one another according to an embodiment of the present application;
FIG. 4(a) is a schematic diagram of an exemplary interface in which the parent directory is displayed while interface objects are compressed according to an embodiment of the present application;
FIG. 4(b) is a schematic diagram of an exemplary interface in which the created interface object container is displayed after categorization according to an embodiment of the present application;
FIG. 5 is a schematic flowchart of an interface object categorization method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an exemplary interface for determining the selected interface objects according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an exemplary interface in which the selected interface objects are compressed or covered according to an embodiment of the present application;
FIG. 8(a) is a schematic diagram of an exemplary interface in which the parent directory is displayed while the selected interface objects are compressed or covered according to an embodiment of the present application;
FIG. 8(b) is a schematic diagram of an exemplary interface in which the created interface object container is displayed after categorization according to an embodiment of the present application;
FIG. 9 is a schematic flowchart of an interface object categorization method according to an embodiment of the present application;
FIGS. 10(a)-(c) are schematic diagrams of exemplary interfaces of the interface object categorization method provided by an embodiment of the present application;
FIG. 11 is a schematic structural block diagram of an interface object categorization apparatus according to an embodiment of the present application;
FIG. 12 is a schematic structural block diagram of an interface object categorization apparatus according to an embodiment of the present application;
FIGS. 13(a)-(c) are schematic diagrams of exemplary interfaces for categorizing application icons according to an embodiment of the present application;
FIGS. 14(a)-(c) are schematic diagrams of exemplary interfaces for categorizing picture files according to an embodiment of the present application;
FIGS. 15(a)-(c) are schematic diagrams of exemplary interfaces for categorizing item list entries according to an embodiment of the present application;
FIGS. 16(a)-(c) are schematic diagrams of exemplary interfaces for categorizing mailing list entries according to an embodiment of the present application.
Detailed Description
The implementation of the present application is described in detail below with reference to the drawings and embodiments, so that how the application applies technical means to solve technical problems and achieve technical effects can be fully understood and implemented accordingly.
FIG. 1 shows an interface object categorization method according to an embodiment of the present application. The method is applicable to a terminal device and includes the following steps.
In step S10, contact with a plurality of points on the touch screen is detected, and the plurality of points form a content collection area on the touch screen.
The terminal device supports multi-touch. Through touch events triggered by the user's gesture operations, contact with a plurality of points on the touch screen is detected, and a content collection area is formed on the touch screen from these points. For example, as shown in FIG. 2, when contacts with a plurality of points are detected on the touch screen, a content collection area is formed on it.
The interface objects displayed within the content collection area become the interface objects to be categorized. An interface object may be an application icon displayed on the touch screen, a mail entry in a mailing list, an item entry in an item list, a picture, another file, or the like.
The detected contacts may be with any point on the touch screen: a contact may fall on an interface object in the interactive interface displayed on the touch screen, or elsewhere in that interface. All the points in contact with the touch screen together form the content collection area, and the interface objects displayed within it all become the objects to be categorized. A single interface object becomes an object to be categorized whether it is displayed entirely or only partially within the content collection area.
In step S11, when the contacts with the plurality of points are detected gradually gathering on the touch screen, the display images of the interface objects within the content collection area gradually converge.
When the user makes a pinch gesture on the touch screen with two or more fingers, the terminal device can detect from the contact positions that the contacts with the plurality of points are gradually gathering. The display images of the interface objects displayed within the content collection area then gradually converge as the user's gesture gathers, i.e. they begin to move toward one another.
In step S12, when the contacts with the plurality of points are detected to have gathered to a preset size, an interface object container is created and displayed, and the interface objects within the content collection area are saved in the container.
A threshold is preset in the background to detect whether the contacts have gathered to the preset size; the threshold may be the distance, in pixels, between the user's finger contact points on the touch screen. When that distance is less than or equal to the threshold, the contacts are determined to have gathered to the preset size.
At this point, an instruction to create an interface object container is triggered, and a container is created in the parent directory of the interface objects; the container may be a folder or a directory at the corresponding directory level. When the container is created successfully, an instruction to transfer the interface objects in the content collection area is triggered, and the objects are transferred into the created container for storage. The interface displayed on the touch screen shows the created container, and the interface objects from the previously formed content collection area have all been saved in it.
In this embodiment of the present application, a content collection area is formed by detecting the user's multi-touch operations; an interface object container is created automatically as the user makes a pinch gesture, and the interface objects within the area are saved into the container automatically. This simplifies the steps of categorizing interface objects: a multi-touch gesture operation completes the categorization quickly, improving operation efficiency and forming a simple, vivid mode of human-computer interaction.
In one embodiment, step S12 further includes the following steps.
In step S121, when it is detected that the contacts with the plurality of points have gathered to a first preset size, the display images of the interface objects begin to compress or cover one another.
A first threshold and a second threshold are set in the background to determine the degree to which the contacts with the plurality of points have gathered. When the pixel distance between the user's finger contact points on the touch screen is less than or equal to the first threshold, the contacts are determined to have gathered to the first preset size. At this point the display images of the interface objects are already highly concentrated, and they are further compressed or overlapped, as shown in FIG. 3, visually reinforcing their concentration and guiding the user to continue gathering the interface objects.
In step S122, when it is detected that the contacts have gathered to a second preset size, an interface object container is created and displayed, and the interface objects within the content collection area are saved in the container.
When the pixel distance between the user's finger contact points on the touch screen is less than or equal to the second threshold, the contacts are determined to have gathered to the second preset size. At this point the container is created automatically and the interface objects within the content collection area are saved in it, simplifying the steps of categorizing interface objects.
In one embodiment, in step S11, when the contacts with the plurality of points are detected gradually gathering on the touch screen, the display images of the interface objects within the content collection area gradually converge while the parent file directory of the interface objects is displayed at the same time.
The parent file directory consists of the other file directories at the directory level where the interface object container created in step S12 resides. For example, as shown in FIG. 4(a), when the display images of the interface objects within the content collection area gradually converge, the parent file directory of the interface objects, including Folder 1 and Folder 2, is displayed at the same time; the container created in step S12 is then at the same level as Folder 1 and Folder 2.
In step S12, when the contacts are detected to have gathered to the preset size, an interface object container is created and displayed, the interface objects within the content collection area are saved in it, and the created container is displayed in the parent file directory of the interface objects.
As shown in FIG. 4(b), after a new interface object container, for example Folder 3, is created, the parent file directory of the interface objects, i.e. the directory level where the created container resides, is displayed, including the newly created container and the other containers or files at that level. The interface objects from the content collection area are saved in the created container, and the categorized objects can be browsed by opening it.
In one embodiment, the interface object categorization method further includes the following steps.
In step S13, a prompt is issued to rename the created interface object container.
After the interface objects within the content collection area have been saved in the created container, the user is prompted to rename it. When a container is created automatically, a default name is usually assigned to it, such as "New Folder" or "New Category". The user is then prompted to rename the created container according to his or her own classification of the interface objects. The prompt may take the form of automatically making the file name of the created container editable, or of a text message asking the user to rename it.
In one embodiment, the interface object categorization method further includes the following steps.
In step S14, contact with a plurality of points on the interface object container is detected, and when the contacts gradually spread apart on the touch screen, the container is deleted from the current file directory and the interface objects it saved are displayed.
When the user touches the interface object container displayed on the touch screen with two or more fingers and the terminal device detects the contacts gradually spreading apart, i.e. the user's spread gesture on the touch screen, an instruction is triggered to transfer the interface objects saved in the container to the current file directory. After the transfer is complete, the container is deleted; the touch screen displays the interface objects transferred out of the container, and the container itself is removed from the current file directory. When the user believes the categorization is wrong, he or she can make a spread gesture on the displayed image of the container created during categorization; when the terminal device detects the gesture, it displays the interface objects saved in the container and deletes the container, so that the display state before categorization can be restored quickly and conveniently, further simplifying user operation and improving operation efficiency.
In one embodiment, when detecting the contacts with the plurality of points on the touch screen, the terminal device may detect them through a touchpad or a virtual touch space. When the terminal device is equipped with a touchpad, the user completes the contacts by performing gesture operations on the touchpad. Alternatively, the terminal device is equipped with an optical touch system that uses infrared light to form a virtual touch area in the surrounding three-dimensional space, and the user completes the contacts by gesturing within the virtual touch area.
FIG. 5 shows an interface object categorization method according to an embodiment of the present application. The method is applicable to a terminal device and includes the following steps.
In step S20, contact with a plurality of interface objects displayed on the touch screen is detected, and when the contacts with the plurality of interface objects gradually gather on the touch screen, the display images of the interface objects gradually converge.
The terminal device supports multi-touch. Through touch events triggered by the user's gesture operations, contact with a plurality of interface objects displayed on the touch screen is detected, as shown in FIG. 6. Here the user's gesture must touch the display positions of the interface objects, and the interface objects so touched are treated as the objects to be categorized. An interface object may be an application icon displayed on the touch screen, a mail entry in a mailing list, an item entry in an item list, a picture, another file, or the like.
When the user makes a pinch gesture on the touch screen with two or more fingers, the terminal device can detect from the contact positions that the contacts with the plurality of interface objects are gradually gathering. The display images of the interface objects then gradually converge as the gesture gathers, i.e. they begin to move toward one another.
In step S21, when the contacts with the plurality of interface objects are detected to have gathered to within a preset range, an interface object container is created and displayed, and the plurality of interface objects are saved in the container.
A threshold is preset in the background to detect whether the contacts with the plurality of displayed interface objects have gathered to the preset size; the threshold may be the distance, in pixels, between the user's finger contact points on the touch screen. When that distance is less than or equal to the threshold, the contacts are determined to have gathered to the preset size.
At this point, an instruction to create an interface object container is triggered, and a container is created in the parent directory of the interface objects; the container may be a folder or a directory at the corresponding directory level. When the container is created successfully, an instruction to transfer the selected interface objects is triggered, and they are transferred into the created container for storage. The interface displayed on the touch screen shows the created container, and the selected interface objects have all been saved in it.
In this embodiment of the present application, the selected interface objects are determined by detecting the user's multi-touch operations; an interface object container is created automatically as the user makes a pinch gesture, and the selected objects are saved into it automatically. This simplifies the steps of categorizing interface objects: a multi-touch gesture operation completes the categorization quickly, improving operation efficiency and forming a simple, vivid mode of human-computer interaction.
In one embodiment, step S20 further includes the following steps.
In step S201, when it is detected that the contacts with the plurality of displayed interface objects have gathered to a first preset size, the display images of the interface objects begin to compress or cover one another.
A first threshold and a second threshold are set in the background to determine the degree to which the contacts with the plurality of displayed interface objects have gathered. When the pixel distance between the user's finger contact points on the touch screen is less than or equal to the first threshold, the contacts are determined to have gathered to the first preset size. At this point the display images of the interface objects are already highly concentrated, and the display images of the selected objects are further compressed or overlapped, as shown in FIG. 7, visually reinforcing their concentration and guiding the user to continue gathering the selected objects.
In step S202, when it is detected that the contacts have gathered to a second preset size, an interface object container is created and displayed, and the selected interface objects are saved in the container.
When the pixel distance between the user's finger contact points on the touch screen is less than or equal to the second threshold, the contacts are determined to have gathered to the second preset size. At this point the container is created automatically and the selected interface objects are saved in it, simplifying the steps of categorizing interface objects.
In one embodiment, in step S20, when the contacts with the plurality of displayed interface objects are detected gradually gathering on the touch screen, the display images of the selected objects gradually converge while the parent file directory of the objects is displayed at the same time.
The parent file directory consists of the other file directories at the directory level where the interface object container created in step S21 resides. For example, as shown in FIG. 8(a), when the display images of the selected interface objects gradually converge, the parent file directory of the objects, including Folder 1 and Folder 2, is displayed at the same time; the container created in step S21 is then at the same level as Folder 1 and Folder 2.
In step S21, when the contacts are detected to have gathered to the preset size, an interface object container is created and displayed, the selected interface objects are saved in it, and the created container is displayed in the parent file directory of the objects.
As shown in FIG. 8(b), after a new interface object container, for example Folder 3, is created, the parent file directory of the interface objects, i.e. the directory level where the created container resides, is displayed, including the newly created container and the other containers or files at that level. The selected interface objects are saved in the created container and can be browsed by opening it.
在一个实施例中,该界面对象归类方法还包括以下步骤。
在步骤S22中,提示对创建的界面对象容器进行重命名。
将被选中的多个界面对象保存在创建的界面对象容器内之后,提示用户对创建的界 面对象容器重新命名。自动创建界面对象容器时,通常为该界面对象容器添加默认名称,例如“新文件夹”“新的分类”等默认名称。此时提示用户根据自己对界面对象的分类为创建的界面对象容器重新命名。提示的方式可以是自动将创建的界面对象容器的文件名变为可编辑的状态,或者通过文字提示用户为创建的界面对象容器重新命名。
在一个实施例中,该界面对象归类方法还包括以下步骤。
在步骤S23中,检测到针对界面对象容器的多个点的接触,当针对界面对象容器的多个点的接触在触摸屏上逐渐扩散时,在当前的文件目录中删除界面对象容器并显示界面对象容器保存的多个界面对象。
用户两根以上手指与触摸屏显示的界面对象容器同时接触,终端设备检测到针对该界面对象容器的多个点的接触在触摸屏上逐渐扩散时,即检测到用户在触摸屏上的张开手势。此时,触发将该界面对象容器内保存的界面对象转移到当前的文件目录的指令,转移完成后将该界面对象容器删除,触摸屏随即显示从界面对象容器中转移出的界面对象。在用户认为对界面对象归类有误时,可以在触摸屏上针对归类时创建的界面对象容器的显示图像做张开手势,终端设备检测到该手势时,显示该界面对象容器保存的界面对象并删除该界面对象容器,从而在对界面对象归类有误时,能够便捷快速地恢复归类前的显示状态,进一步简化用户操作,提高操作效率。
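张开手势的识别及"删除容器、恢复界面对象"的处理可用如下示意代码概括(is_spread_gesture、dissolve_container 等名称均为说明所设的假想实现):

```python
from itertools import combinations
from math import hypot

def _max_gap(points):
    """各接触点之间的最大像素间隔。"""
    return max(hypot(x1 - x2, y1 - y2)
               for (x1, y1), (x2, y2) in combinations(points, 2))

def is_spread_gesture(prev_points, cur_points):
    """接触点间距逐渐扩散即视为张开手势(示意)。"""
    return _max_gap(cur_points) > _max_gap(prev_points)

def dissolve_container(directory, name):
    """将容器内保存的界面对象转移回当前文件目录,然后删除该容器。"""
    objs = directory["containers"].pop(name)   # 从当前目录删除界面对象容器
    directory["objects"].extend(objs)          # 恢复显示转移出的界面对象
    return objs
```

归类有误时,对容器的显示图像做张开手势即可恢复归类前的状态。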
在一个实施例中,终端设备在检测针对触摸屏显示的多个界面对象的接触时,可通过触控板或者虚拟触控空间来检测到针对触摸屏显示的多个界面对象的接触。当该终端设备配置了触控板时,用户通过对该触控板的手势操作来完成针对触摸屏显示的多个界面对象的接触。或者,该终端设备配备了光学触控系统,通过红外光在周围立体空间中形成一虚拟触控区域,用户通过在该虚拟触控区域内的手势操作来完成针对触摸屏显示的多个界面对象的接触。
图9是本申请实施例提供的一种界面对象归类方法,适用于终端设备,该方法包括以下步骤。
在步骤S30中,检测到针对触摸屏的多个点的接触,该多个点在触摸屏形成一内容收集区域。
该针对触摸屏的多个点的接触为非同时的针对多个点的接触。在不便于直接用手势确定出内容收集区域或者需要归类的界面对象时,在触摸屏上先后通过接触而分别确定出多个点,通过该分别确定出的多个点来围出一内容收集区域。如图10(a)所示,用户在触摸屏上先后分别确定多个接触点,以围出内容收集区域。用户可根据需要进行归类的界面对象的显示位置在触摸屏上确定多个接触点,将该多个接触点围成的区域作为内容收集区域。该内容收集区域所围起来的界面对象将作为被归类的界面对象。
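先后确定的多个接触点围出内容收集区域,再判断界面对象是否落入该区域的过程,可用如下示意代码说明(此处将区域简化为各接触点的外接矩形,实际实现中也可以是任意多边形;函数名均为说明所设)。依后文实施例,界面对象只要有部分显示在区域内即作为被归类的对象:

```python
def collection_region(points):
    """以先后确定的多个接触点围出内容收集区域。
    简化为取各接触点的外接矩形,返回 (left, top, right, bottom)。"""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def object_in_region(obj_rect, region):
    """界面对象只要部分显示在内容收集区域内,即作为被归类的对象。
    obj_rect / region 均为 (left, top, right, bottom) 形式的矩形。"""
    l1, t1, r1, b1 = obj_rect
    l2, t2, r2, b2 = region
    # 两矩形有任意重叠即判定在区域内
    return r1 >= l2 and l1 <= r2 and b1 >= t2 and t1 <= b2
```

例如用两个或三个接触点围出区域后,对区域内(含部分重叠)的界面对象统一归类。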
在步骤S31中,检测到针对内容收集区域的多个点的接触,并且当针对内容收集区域的多个点的接触在触摸屏上逐渐聚拢时,内容收集区域内的界面对象的显示图像逐渐集中。
用户针对该内容收集区域在触摸屏上做出捏合手势时,终端设备检测到针对内容收集区域的多个点的接触,并且针对内容收集区域的多个点的接触在触摸屏上逐渐聚拢。此时,该内容收集区域内的界面对象的显示图像逐渐集中,如图10(b)所示。即用户在触摸屏上围出内容收集区域之后,终端设备检测到在内容收集区域内做出了捏合手势时,将该内容收集区域内的界面对象进行归类。
在步骤S32中,当检测到针对内容收集区域的多个点的接触已聚拢到预设范围之内时,创建并显示一个界面对象容器,内容收集区域内的界面对象被保存在界面对象容器内。
在后台预设一临界值来检测针对内容收集区域的多个点的接触是否已聚拢至预设大小,该临界值可以是用户手指与触摸屏的各个接触点之间间隔的像素值。当用户手指与触摸屏的各个接触点之间间隔的像素值小于或等于该临界值时,则确定该针对内容收集区域的多个点的接触已聚拢至预设大小。
此时,触发创建界面对象容器的指令,在所述界面对象的上一级目录创建一个界面对象容器,其中,该界面对象容器可以是文件夹或者相应目录层级中的一个目录。当创建该界面对象容器成功时,触发转移内容收集区域内的界面对象的指令,将内容收集区域内的界面对象转移至创建的界面对象容器中进行保存。如图10(c)所示,在触摸屏所显示的界面中,显示该创建的界面对象容器,内容收集区域内的界面对象都已被保存在该界面对象容器内。
本申请实施例由用户在触摸屏显示范围内先后确定出多个点,利用这多个点围成内容收集区域。终端设备检测到用户在内容收集区域的手势操作后自动创建界面对象容器,并将内容收集区域内的界面对象自动保存到该界面对象容器内,简化了对界面对象进行归类的操作步骤,通过多点触控的手势操作即可快速完成界面对象的归类,提高了操作效率,并形成一种简单生动的人机交互方式。
在一个实施例中,步骤S32进一步包括以下步骤。
在步骤S321中,当检测到针对内容收集区域的多个点的接触已聚拢至第一预设大小时,内容收集区域内的界面对象的显示图像开始互相压缩或者覆盖。
在后台设置第一临界值和第二临界值来确定针对内容收集区域的多个点的接触的聚拢程度。当用户手指与触摸屏的各个接触点之间间隔的像素值小于或等于该第一临界值时,则确定该针对内容收集区域的多个点的接触已聚拢至第一预设大小。此时界面对象的显示图像已非常集中,进一步对内容收集区域内的界面对象的显示图像进行互相压缩或者覆盖的处理,从而在图像显示效果上进一步增强界面对象的集中程度,以引导用户继续聚拢被选中的界面对象。
在步骤S322中,当检测到针对内容收集区域的多个点的接触已聚拢至第二预设大小时,创建并显示一个界面对象容器,内容收集区域内的界面对象被保存在界面对象容器内。
当用户手指与触摸屏的各个接触点之间间隔的像素值小于或等于该第二临界值时,则确定该针对内容收集区域的多个点的接触已聚拢至第二预设大小。此时自动创建界面对象容器并将内容收集区域内的界面对象保存在界面对象容器内,简化了对界面对象进行归类的操作步骤。
在一个实施例中,在步骤S31中,当检测到针对内容收集区域的多个点的接触在触摸屏上逐渐聚拢时,内容收集区域内的界面对象的显示图像逐渐集中,同时显示内容收集区域内的界面对象的上一级文件目录。例如,在显示应用程序图标的界面中,围出的内容收集区域内的应用程序图标的显示图像在逐渐集中的同时,显示出其他用于归类应用程序图标的文件夹。
在步骤S32中,在内容收集区域内的界面对象的上一级文件目录中,显示创建的界面对象容器。
同时显示创建的界面对象容器和上一级文件目录。例如,在显示应用程序图标的界面中,同时显示创建的文件夹和其他用于归类应用程序图标的文件夹。
在一个实施例中,该界面对象归类方法还包括以下步骤。
在步骤S33中,提示对创建的界面对象容器进行重命名。
将内容收集区域内的界面对象保存在创建的界面对象容器内之后,提示用户对创建的界面对象容器重新命名。自动创建界面对象容器时,通常为该界面对象容器添加默认名称,例如“新文件夹”、“新的分类”等默认名称。此时提示用户根据自己对界面对象的分类为创建的界面对象容器重新命名。提示的方式可以是自动将创建的界面对象容器的文件名变为可编辑的状态,或者通过文字提示用户为创建的界面对象容器重新命名。
在一个实施例中,该界面对象归类方法还包括以下步骤。
在步骤S34中,检测到针对界面对象容器的多个点的接触,当针对界面对象容器的多个点的接触在触摸屏上逐渐扩散时,在当前的文件目录中删除界面对象容器并显示界面对象容器保存的界面对象。
用户两根以上手指与触摸屏显示的界面对象容器同时接触,终端设备检测到针对该界面对象容器的多个点的接触在触摸屏上逐渐扩散时,即检测到用户在触摸屏上的张开手势。此时,触发将该界面对象容器内保存的界面对象转移到当前的文件目录的指令,转移完成后将该界面对象容器删除,触摸屏随即显示从界面对象容器中转移出的界面对象。在用户认为对界面对象归类有误时,可以在触摸屏上针对归类时创建的界面对象容器的显示图像做张开手势,终端设备检测到该手势时,显示该界面对象容器保存的界面对象并删除该界面对象容器,从而在对界面对象归类有误时,能够便捷快速地恢复归类前的显示状态,进一步简化用户操作,提高操作效率。
在一个实施例中,终端设备在检测针对触摸屏的多个点的接触时,可通过触控板或者虚拟触控空间来检测到针对触摸屏的多个点的接触。当该终端设备配置了触控板时,用户通过对该触控板的操作来完成针对触摸屏的多个点的接触,以围出内容收集区域。或者,该终端设备配备了光学触控系统,通过红外光在周围立体空间中形成一虚拟触控区域,用户通过在该虚拟触控区域内的操作来完成针对触摸屏的多个点的接触,以围出内容收集区域。在检测针对内容收集区域的多个点的接触时,也可以通过触控板或者虚拟触控空间来检测到针对内容收集区域的多个点的接触,以识别针对该内容收集区域的手势操作。
图11是本申请实施例提供的一种界面对象归类装置,包括:
第一检测模块30,用于检测到针对触摸屏的多个点的接触,多个点在触摸屏形成一内容收集区域;
显示模块31,用于当检测到针对触摸屏的多个点的接触在触摸屏上逐渐聚拢时,内容收集区域内的界面对象的显示图像逐渐集中;
第一处理模块32,用于当检测到针对触摸屏的多个点的接触已聚拢至预设大小时,创建并显示一个界面对象容器,内容收集区域内的界面对象被保存在界面对象容器内。
在一个实施例中,该第一处理模块32包括:
第一处理子模块,用于当检测到针对触摸屏的多个点的接触已聚拢至第一预设大小时,界面对象的显示图像开始互相压缩或者覆盖;
第二处理子模块,用于当检测到针对触摸屏的多个点的接触已聚拢至第二预设大小时,创建并显示一个界面对象容器,内容收集区域内的界面对象被保存在界面对象容器内。
在一个实施例中,该显示模块31包括:
第一显示子模块,用于当检测到针对触摸屏的多个点的接触在触摸屏上逐渐聚拢时,内容收集区域内的界面对象的显示图像逐渐集中,同时显示界面对象的上一级文件目录。
该第一处理模块32包括:
第二显示子模块,用于在界面对象的上一级文件目录中,显示创建的界面对象容器。
在一个实施例中,该装置还包括:
第一提示模块,用于提示对创建的界面对象容器进行重命名。
图12是本申请实施例提供的一种界面对象归类装置,包括:
第二检测模块40,用于检测到针对触摸屏显示的多个界面对象的接触,并且当针对多个界面对象的接触在触摸屏上逐渐聚拢时,多个界面对象的显示图像逐渐集中;
第二处理模块41,用于当检测到针对多个界面对象的接触已聚拢到预设范围之内时,创建并显示一个界面对象容器,多个界面对象被保存在界面对象容器内。
在一个实施例中,该第二处理模块41包括:
第三处理子模块,用于当检测到针对多个界面对象的接触已聚拢至第一预设大小时,多个界面对象的显示图像开始互相压缩或者覆盖;
第四处理子模块,用于当检测到针对多个界面对象的接触已聚拢至第二预设大小时,创建并显示一个界面对象容器,多个界面对象被保存在界面对象容器内。
在一个实施例中,该第二检测模块40包括:
第三显示子模块,用于当检测到针对多个界面对象的接触在触摸屏上逐渐聚拢时,多个界面对象的显示图像逐渐集中,同时显示多个界面对象的上一级文件目录。
该第二处理模块41包括:
第四显示子模块,用于在多个界面对象的上一级文件目录中,显示创建的界面对象容器。
在一个实施例中,该装置还包括:
第二提示模块,用于提示对创建的界面对象容器进行重命名。
下面通过几个应用场景对本申请实施例提供的界面对象归类方法和装置做进一步示例性的说明。
对触摸屏显示的应用程序图标进行归类时,终端设备检测到针对触摸屏的多个点的接触,多个点在触摸屏形成一内容收集区域。如图13(a)所示,检测到用户针对触摸屏的三个点的接触,这三个点在触摸屏形成一内容收集区域,被显示在该内容收集区域内的应用程序图标将作为被归类的应用程序图标。当用户手指保持对这三个点的接触并作出捏合手势时,终端设备检测到针对这三个点的接触在触摸屏上逐渐聚拢,此时内容收集区域内的应用程序图标逐渐集中,如图13(b)所示,内容收集区域内的应用程序图标彼此逐渐聚集。当检测到针对这三个点的接触已聚拢至预设大小时,创建一个文件夹,在终端设备所显示的界面中显示该文件夹的图标,如图13(c)所示,内容收集区域内的应用程序图标已被保存在创建的文件夹内,用户点击该文件夹的图标即可看到该文件夹内保存的应用程序图标。而在一个实施例中,对应用程序图标的归类也可以针对被选中的应用程序图标来进行,即用户通过多点触控选中需要归为一类的应用程序图标,通过捏合手势将被选中的应用程序图标归类到一个文件夹内。应用程序图标归类完毕后,提示用户对创建的文件夹重新命名,例如根据应用程序图标的类别给创建的文件夹命名为“社交生活”、“实用工具”等等。
在对图片文件进行归类时,终端设备检测到针对触摸屏显示的多个图片文件的接触,如图14(a)所示,检测到用户针对触摸屏显示的三个图片文件的接触,这三个图片文件将作为被归类的图片文件。当用户手指保持对这三个图片文件的接触并作出捏合手势时,终端设备检测到针对这三个图片文件的接触在触摸屏上逐渐聚拢,此时触摸屏显示的这三个图片文件的图标逐渐集中,如图14(b)所示,被选中的三个图片文件彼此逐渐聚集。当检测到针对这三个图片文件的接触已聚拢至预设大小时,创建一个文件夹,在终端设备所显示的界面中显示该文件夹的图标,如图14(c)所示,被选中的三个图片文件已被保存在创建的文件夹内,用户点击该文件夹的图标即可看到该文件夹内保存的三个图片文件。而在一个实施例中,针对图片文件的归类也可以通过在触摸屏形成内容收集区域来完成,即用户通过多点触控在触摸屏形成一内容收集区域,被显示在该内容收集区域内的图片文件将作为需要归类的图片文件,通过捏合手势将内容收集区域内的图片文件归类到一个文件夹内,并提示用户对该文件夹重新命名,例如,将该文件夹命名为“2012年3月旅游”、“集体合影”等名称。
在对触摸屏显示的商品列表条目进行归类时,终端设备检测到针对触摸屏的多个点的接触,多个点在触摸屏形成一内容收集区域。如图15(a)所示,检测到用户针对触摸屏的两个点的接触,这两个点在触摸屏形成一内容收集区域,被显示在该内容收集区域内的商品列表条目将作为被归类的商品列表条目;形成的内容收集区域并没有完整地覆盖每个商品列表条目,而商品列表条目只要部分被显示在该内容收集区域内,都将作为被归类的商品列表条目。当用户手指保持对这两个点的接触并作出捏合手势时,终端设备检测到针对这两个点的接触在触摸屏上逐渐聚拢,此时内容收集区域内的商品列表条目逐渐集中,如图15(b)所示,内容收集区域内的商品列表条目彼此逐渐聚集,同时显示商品列表条目的上一级目录,例如“食品饮料”、“奶粉”、“衣服”等目录。当检测到针对这两个点的接触已聚拢至预设大小时,创建一个目录,在终端设备所显示的界面中显示该目录的条目,如图15(c)所示,内容收集区域内的商品列表条目已被保存在创建的目录内,用户点击该目录条目即可看到该目录内保存的商品列表条目。而在一个实施例中,对商品列表条目的归类也可以针对被选中的商品列表条目来进行,即用户通过多点触控选中需要归为一类的商品列表条目,通过捏合手势将被选中的商品列表条目归类到一个目录内。商品列表条目归类完毕后,提示用户对创建的目录重新命名,例如根据商品列表条目的类别给创建的目录命名为“家用电器”、“日杂百货”等等。
在对触摸屏显示的邮件列表条目进行归类时,终端设备检测到针对触摸屏的多个点的接触,多个点在触摸屏形成一内容收集区域。如图16(a)所示,检测到用户针对触摸屏的两个点的接触,这两个点在触摸屏形成一内容收集区域,被显示在该内容收集区域内的邮件列表条目将作为被归类的邮件列表条目;形成的内容收集区域并没有完整地覆盖每个邮件列表条目,而邮件列表条目只要部分被显示在该内容收集区域内,都将作为被归类的邮件列表条目。当用户手指保持对这两个点的接触并作出捏合手势时,终端设备检测到针对这两个点的接触在触摸屏上逐渐聚拢,此时内容收集区域内的邮件列表条目逐渐集中,如图16(b)所示,内容收集区域内的邮件列表条目彼此逐渐聚集,同时显示邮件列表条目的上一级目录,例如“客户A”、“客户B”等目录。当检测到针对这两个点的接触已聚拢至预设大小时,创建一个目录,在终端设备所显示的界面中显示该目录的条目,如图16(c)所示,内容收集区域内的邮件列表条目已被保存在创建的目录内,用户点击该目录条目即可看到该目录内保存的邮件列表条目。而在一个实施例中,对邮件列表条目的归类也可以针对被选中的邮件列表条目来进行,即用户通过多点触控选中需要归为一类的邮件列表条目,通过捏合手势将被选中的邮件列表条目归类到一个目录内。邮件列表条目归类完毕后,提示用户对创建的目录重新命名,例如根据邮件列表条目的类别给创建的目录命名为“客户C”、“内部通知”等等。
在一个典型的配置中,计算设备包括一个或多个处理器(CPU)、输入/输出接口、网络接口和内存。
内存可能包括计算机可读介质中的非永久性存储器,随机存取存储器(RAM)和/或非易失性内存等形式,如只读存储器(ROM)或闪存(flash RAM)。内存是计算机可读介质的示例。
计算机可读介质包括永久性和非永久性、可移动和非可移动媒体,可以由任何方法或技术来实现信息存储。信息可以是计算机可读指令、数据结构、程序的模块或其他数据。计算机的存储介质的例子包括,但不限于相变内存(PRAM)、静态随机存取存储器(SRAM)、动态随机存取存储器(DRAM)、其他类型的随机存取存储器(RAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、快闪记忆体或其他内存技术、只读光盘只读存储器(CD-ROM)、数字多功能光盘(DVD)或其他光学存储、磁盒式磁带、磁带磁盘存储或其他磁性存储设备或任何其他非传输介质,可用于存储可以被计算设备访问的信息。按照本文中的界定,计算机可读介质不包括暂存电脑可读媒体(transitory media),如调制的数据信号和载波。
说明书及权利要求当中使用了某些词汇来指称特定组件。本领域技术人员应可理解,硬件制造商可能会用不同名词来称呼同一个组件。本说明书及权利要求并不以名称的差异来作为区分组件的方式,而是以组件在功能上的差异来作为区分的准则。如在通篇说明书及权利要求当中所提及的“包含”为一开放式用语,故应解释成“包含但不限定于”。“大致”是指在可接受的误差范围内,本领域技术人员能够在一定误差范围内解决所述技术问题,基本达到所述技术效果。此外,“耦接”一词在此包含任何直接及间接的电性耦接手段。因此,若文中描述一第一装置耦接于一第二装置,则代表所述第一装置可直接电性耦接于所述第二装置,或通过其他装置或耦接手段间接地电性耦接至所述第二装置。说明书后续描述为实施本申请的较佳实施方式,然所述描述乃以说明本申请的一般原则为目的,并非用以限定本申请的范围。本申请的保护范围当视所附权利要求所界定者为准。
还需要说明的是,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的商品或者系统不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种商品或者系统所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括所述要素的商品或者系统中还存在另外的相同要素。
上述说明示出并描述了本申请的若干优选实施例,但如前所述,应当理解本申请并非局限于本文所披露的形式,不应看作是对其他实施例的排除,而可用于各种其他组合、修改和环境,并能够在本文所述发明构想范围内,通过上述教导或相关领域的技术或知识进行改动。而本领域技术人员所进行的改动和变化不脱离本申请的精神和范围,则都应在本申请所附权利要求的保护范围内。

Claims (42)

  1. 一种界面对象归类方法,其特征在于,包括:
    检测到针对触摸屏的多个点的接触,所述多个点在所述触摸屏形成一内容收集区域;
    当检测到针对所述触摸屏的多个点的接触在所述触摸屏上逐渐聚拢时,所述内容收集区域内的界面对象的显示图像逐渐集中;
    当检测到针对所述触摸屏的多个点的接触已聚拢至预设大小时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内。
  2. 如权利要求1所述的方法,其特征在于,所述当检测到针对所述触摸屏的多个点的接触已聚拢至预设大小时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内包括:
    当检测到针对所述触摸屏的多个点的接触已聚拢至第一预设大小时,所述界面对象的显示图像开始互相压缩或者覆盖;
    当检测到针对所述触摸屏的多个点的接触已聚拢至第二预设大小时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内。
  3. 如权利要求1所述的方法,其特征在于,所述当检测到针对所述触摸屏的多个点的接触在所述触摸屏上逐渐聚拢时,所述内容收集区域内的界面对象的显示图像逐渐集中包括:
    当检测到针对所述触摸屏的多个点的接触在所述触摸屏上逐渐聚拢时,所述内容收集区域内的界面对象的显示图像逐渐集中,同时显示所述界面对象的上一级文件目录。
  4. 如权利要求3所述的方法,其特征在于,所述当检测到针对所述触摸屏的多个点的接触已聚拢至预设大小时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内包括:
    在所述界面对象的上一级文件目录中,显示所述创建的界面对象容器。
  5. 如权利要求1所述的方法,其特征在于,所述方法还包括:
    提示对所述创建的界面对象容器进行重命名。
  6. 如权利要求1所述的方法,其特征在于,所述方法还包括:
    检测到针对所述界面对象容器的多个点的接触,当针对所述界面对象容器的多个点的接触在所述触摸屏上逐渐扩散时,在当前的文件目录中删除所述界面对象容器并显示所述界面对象容器保存的多个界面对象。
  7. 如权利要求1所述的方法,其特征在于,所述检测到针对触摸屏的多个点的接触包括:
    通过触控板或者虚拟触控空间检测到针对触摸屏的多个点的接触。
  8. 一种界面对象归类方法,其特征在于,包括:
    检测到针对触摸屏显示的多个界面对象的接触,并且当针对所述多个界面对象的接触在所述触摸屏上逐渐聚拢时,所述多个界面对象的显示图像逐渐集中;
    当检测到针对所述多个界面对象的接触已聚拢到预设范围之内时,创建并显示一个界面对象容器,所述多个界面对象被保存在所述界面对象容器内。
  9. 如权利要求8所述的方法,其特征在于,所述当检测到针对所述多个界面对象的接触已聚拢到预设范围之内时,创建并显示一个界面对象容器,所述多个界面对象被保存在所述界面对象容器内包括:
    当检测到针对所述多个界面对象的接触已聚拢至第一预设大小时,所述多个界面对象的显示图像开始互相压缩或者覆盖;
    当检测到针对所述多个界面对象的接触已聚拢至第二预设大小时,创建并显示一个界面对象容器,所述多个界面对象被保存在所述界面对象容器内。
  10. 如权利要求8所述的方法,其特征在于,所述当检测到针对所述多个界面对象的接触在所述触摸屏上逐渐聚拢时,所述多个界面对象的显示图像逐渐集中包括:
    当检测到针对所述多个界面对象的接触在所述触摸屏上逐渐聚拢时,所述多个界面对象的显示图像逐渐集中,同时显示所述多个界面对象的上一级文件目录。
  11. 如权利要求10所述的方法,其特征在于,所述当检测到针对所述多个界面对象的接触已聚拢至预设大小时,创建并显示一个界面对象容器,所述多个界面对象被保存在所述界面对象容器内包括:
    在所述多个界面对象的上一级文件目录中,显示所述创建的界面对象容器。
  12. 如权利要求8所述的方法,其特征在于,所述方法还包括:
    提示对所述创建的界面对象容器进行重命名。
  13. 如权利要求8所述的方法,其特征在于,所述方法还包括:
    检测到针对所述界面对象容器的多个点的接触,当针对所述界面对象容器的多个点的接触在所述触摸屏上逐渐扩散时,在当前的文件目录中删除所述界面对象容器并显示所述界面对象容器保存的多个界面对象。
  14. 如权利要求8所述的方法,其特征在于,所述检测到针对触摸屏显示的多个界面对象的接触包括:
    通过触控板或者虚拟触控空间检测到针对触摸屏显示的多个界面对象的接触。
  15. 一种界面对象归类方法,其特征在于,包括:
    检测到针对触摸屏的多个点的接触,所述多个点在所述触摸屏形成一内容收集区域;
    检测到针对所述内容收集区域的多个点的接触,并且当针对所述内容收集区域的多个点的接触在所述触摸屏上逐渐聚拢时,所述内容收集区域内的界面对象的显示图像逐渐集中;
    当检测到针对所述内容收集区域的多个点的接触已聚拢到预设范围之内时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内。
  16. 如权利要求15所述的方法,其特征在于,所述当检测到针对所述内容收集区域的多个点的接触已聚拢到预设范围之内时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内包括:
    当检测到针对所述内容收集区域的多个点的接触已聚拢至第一预设大小时,所述内容收集区域内的界面对象的显示图像开始互相压缩或者覆盖;
    当检测到针对所述内容收集区域的多个点的接触已聚拢至第二预设大小时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内。
  17. 如权利要求15所述的方法,其特征在于,所述当针对所述内容收集区域的多个点的接触在所述触摸屏上逐渐聚拢时,所述内容收集区域内的界面对象的显示图像逐渐集中包括:
    当检测到针对所述内容收集区域的多个点的接触在所述触摸屏上逐渐聚拢时,所述内容收集区域内的界面对象的显示图像逐渐集中,同时显示所述内容收集区域内的界面对象的上一级文件目录。
  18. 如权利要求17所述的方法,其特征在于,所述当检测到针对所述内容收集区域的多个点的接触已聚拢到预设范围之内时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内包括:
    在所述内容收集区域内的界面对象的上一级文件目录中,显示所述创建的界面对象容器。
  19. 如权利要求15所述的方法,其特征在于,所述方法还包括:
    提示对所述创建的界面对象容器进行重命名。
  20. 如权利要求15所述的方法,其特征在于,所述方法还包括:
    检测到针对所述界面对象容器的多个点的接触,当针对所述界面对象容器的多个点的接触在所述触摸屏上逐渐扩散时,在当前的文件目录中删除所述界面对象容器并显示所述界面对象容器保存的界面对象。
  21. 如权利要求15所述的方法,其特征在于,所述检测到针对触摸屏的多个点的接触包括:
    通过触控板或者虚拟触控空间检测到针对触摸屏的多个点的接触。
  22. 一种界面对象归类装置,其特征在于,包括:
    第一检测模块,用于检测到针对触摸屏的多个点的接触,所述多个点在所述触摸屏形成一内容收集区域;
    显示模块,用于当检测到针对所述触摸屏的多个点的接触在所述触摸屏上逐渐聚拢时,所述内容收集区域内的界面对象的显示图像逐渐集中;
    第一处理模块,用于当检测到针对所述触摸屏的多个点的接触已聚拢至预设大小时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内。
  23. 如权利要求22所述的装置,其特征在于,所述第一处理模块包括:
    第一处理子模块,用于当检测到针对所述触摸屏的多个点的接触已聚拢至第一预设大小时,所述界面对象的显示图像开始互相压缩或者覆盖;
    第二处理子模块,用于当检测到针对所述触摸屏的多个点的接触已聚拢至第二预设大小时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内。
  24. 如权利要求22所述的装置,其特征在于,所述显示模块包括:
    第一显示子模块,用于当检测到针对所述触摸屏的多个点的接触在所述触摸屏上逐渐聚拢时,所述内容收集区域内的界面对象的显示图像逐渐集中,同时显示所述界面对象的上一级文件目录。
  25. 如权利要求24所述的装置,其特征在于,所述第一处理模块包括:
    第二显示子模块,用于在所述界面对象的上一级文件目录中,显示所述创建的界面对象容器。
  26. 如权利要求22所述的装置,其特征在于,所述装置还包括:
    第一提示模块,用于提示对所述创建的界面对象容器进行重命名。
  27. 如权利要求22所述的装置,其特征在于,所述装置还包括:
    第二检测模块,用于检测到针对所述界面对象容器的多个点的接触,当针对所述界面对象容器的多个点的接触在所述触摸屏上逐渐扩散时,在当前的文件目录中删除所述界面对象容器并显示所述界面对象容器保存的多个界面对象。
  28. 如权利要求22所述的装置,其特征在于,所述第一检测模块包括:
    第一检测子模块,用于通过触控板或者虚拟触控空间检测到针对触摸屏的多个点的接触。
  29. 一种界面对象归类装置,其特征在于,包括:
    第三检测模块,用于检测到针对触摸屏显示的多个界面对象的接触,并且当针对所述多个界面对象的接触在所述触摸屏上逐渐聚拢时,所述多个界面对象的显示图像逐渐集中;
    第二处理模块,用于当检测到针对所述多个界面对象的接触已聚拢到预设范围之内时,创建并显示一个界面对象容器,所述多个界面对象被保存在所述界面对象容器内。
  30. 如权利要求29所述的装置,其特征在于,所述第二处理模块包括:
    第三处理子模块,用于当检测到针对所述多个界面对象的接触已聚拢至第一预设大小时,所述多个界面对象的显示图像开始互相压缩或者覆盖;
    第四处理子模块,用于当检测到针对所述多个界面对象的接触已聚拢至第二预设大小时,创建并显示一个界面对象容器,所述多个界面对象被保存在所述界面对象容器内。
  31. 如权利要求29所述的装置,其特征在于,所述第三检测模块包括:
    第三显示子模块,用于当检测到针对所述多个界面对象的接触在所述触摸屏上逐渐聚拢时,所述多个界面对象的显示图像逐渐集中,同时显示所述多个界面对象的上一级文件目录。
  32. 如权利要求31所述的装置,其特征在于,所述第二处理模块包括:
    第四显示子模块,用于在所述多个界面对象的上一级文件目录中,显示所述创建的界面对象容器。
  33. 如权利要求29所述的装置,其特征在于,所述装置还包括:
    第二提示模块,用于提示对所述创建的界面对象容器进行重命名。
  34. 如权利要求29所述的装置,其特征在于,所述装置还包括:
    第四检测模块,用于检测到针对所述界面对象容器的多个点的接触,当针对所述界面对象容器的多个点的接触在所述触摸屏上逐渐扩散时,在当前的文件目录中删除所述界面对象容器并显示所述界面对象容器保存的多个界面对象。
  35. 如权利要求29所述的装置,其特征在于,所述第三检测模块包括:
    第二检测子模块,用于通过触控板或者虚拟触控空间检测到针对触摸屏显示的多个界面对象的接触。
  36. 一种界面对象归类装置,其特征在于,包括:
    第五检测模块,用于检测到针对触摸屏的多个点的接触,所述多个点在所述触摸屏形成一内容收集区域;
    第六检测模块,用于检测到针对所述内容收集区域的多个点的接触,并且当针对所述内容收集区域的多个点的接触在所述触摸屏上逐渐聚拢时,所述内容收集区域内的界面对象的显示图像逐渐集中;
    第三处理模块,用于当检测到针对所述内容收集区域的多个点的接触已聚拢到预设范围之内时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内。
  37. 如权利要求36所述的装置,其特征在于,所述第三处理模块包括:
    第五处理子模块,用于当检测到针对所述内容收集区域的多个点的接触已聚拢至第一预设大小时,所述内容收集区域内的界面对象的显示图像开始互相压缩或者覆盖;
    第六处理子模块,用于当检测到针对所述内容收集区域的多个点的接触已聚拢至第二预设大小时,创建并显示一个界面对象容器,所述内容收集区域内的界面对象被保存在所述界面对象容器内。
  38. 如权利要求36所述的装置,其特征在于,所述第六检测模块包括:
    第五显示子模块,用于当检测到针对所述内容收集区域的多个点的接触在所述触摸屏上逐渐聚拢时,所述内容收集区域内的界面对象的显示图像逐渐集中,同时显示所述内容收集区域内的界面对象的上一级文件目录。
  39. 如权利要求38所述的装置,其特征在于,所述第三处理模块包括:
    第六显示子模块,用于在所述内容收集区域内的界面对象的上一级文件目录中,显示所述创建的界面对象容器。
  40. 如权利要求36所述的装置,其特征在于,所述装置还包括:
    第三提示模块,用于提示对所述创建的界面对象容器进行重命名。
  41. 如权利要求36所述的装置,其特征在于,所述装置还包括:
    第七检测模块,用于检测到针对所述界面对象容器的多个点的接触,当针对所述界面对象容器的多个点的接触在所述触摸屏上逐渐扩散时,在当前的文件目录中删除所述界面对象容器并显示所述界面对象容器保存的界面对象。
  42. 如权利要求36所述的装置,其特征在于,所述第五检测模块包括:
    第三检测子模块,用于通过触控板或者虚拟触控空间检测到针对触摸屏的多个点的接触。
PCT/CN2016/094105 2015-08-18 2016-08-09 界面对象归类方法和装置 WO2017028703A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510507368.1A CN106469015A (zh) 2015-08-18 2015-08-18 界面对象归类方法和装置
CN201510507368.1 2015-08-18

Publications (1)

Publication Number Publication Date
WO2017028703A1 true WO2017028703A1 (zh) 2017-02-23

Family

ID=58050405

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/094105 WO2017028703A1 (zh) 2015-08-18 2016-08-09 界面对象归类方法和装置

Country Status (2)

Country Link
CN (1) CN106469015A (zh)
WO (1) WO2017028703A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106896998B (zh) * 2016-09-21 2020-06-02 阿里巴巴集团控股有限公司 一种操作对象的处理方法及装置
CN106951141B (zh) 2017-03-16 2019-03-26 维沃移动通信有限公司 一种图标的处理方法及移动终端
CN109871171A (zh) * 2018-12-29 2019-06-11 天津字节跳动科技有限公司 一种文档程序合并的方法、装置、介质和电子设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2372516A2 (en) * 2010-04-05 2011-10-05 Sony Ericsson Mobile Communications AB Methods, systems and computer program products for arranging a plurality of icons on a touch sensitive display
CN102883066A (zh) * 2012-09-29 2013-01-16 惠州Tcl移动通信有限公司 基于手势识别实现文件操作的方法及手机
CN102999286A (zh) * 2011-09-16 2013-03-27 腾讯科技(深圳)有限公司 一种快速创建文件夹的***及方法
CN103294264A (zh) * 2013-05-15 2013-09-11 贝壳网际(北京)安全技术有限公司 数据处理方法及装置
CN103294401A (zh) * 2013-06-03 2013-09-11 广东欧珀移动通信有限公司 一种具有触摸屏的电子设备的图标处理方法及装置
US20140047370A1 (en) * 2012-08-07 2014-02-13 Samsung Electronics Co., Ltd. Method and apparatus for copy-and-paste of object

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130080179A (ko) * 2012-01-04 2013-07-12 삼성전자주식회사 휴대용 단말기에서 아이콘 관리 방법 및 장치
CN103885670A (zh) * 2012-12-10 2014-06-25 广东欧珀移动通信有限公司 一种移动设备的桌面图标管理方法和装置
CN104461357A (zh) * 2014-11-28 2015-03-25 上海斐讯数据通信技术有限公司 一种信息条目处理方法及移动终端

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2372516A2 (en) * 2010-04-05 2011-10-05 Sony Ericsson Mobile Communications AB Methods, systems and computer program products for arranging a plurality of icons on a touch sensitive display
CN102999286A (zh) * 2011-09-16 2013-03-27 腾讯科技(深圳)有限公司 一种快速创建文件夹的***及方法
US20140047370A1 (en) * 2012-08-07 2014-02-13 Samsung Electronics Co., Ltd. Method and apparatus for copy-and-paste of object
CN102883066A (zh) * 2012-09-29 2013-01-16 惠州Tcl移动通信有限公司 基于手势识别实现文件操作的方法及手机
CN103294264A (zh) * 2013-05-15 2013-09-11 贝壳网际(北京)安全技术有限公司 数据处理方法及装置
CN103294401A (zh) * 2013-06-03 2013-09-11 广东欧珀移动通信有限公司 一种具有触摸屏的电子设备的图标处理方法及装置

Also Published As

Publication number Publication date
CN106469015A (zh) 2017-03-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16836567

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16836567

Country of ref document: EP

Kind code of ref document: A1