CN115470757A - Content ordering method and terminal equipment - Google Patents

Content ordering method and terminal equipment

Info

Publication number
CN115470757A
Authority
CN
China
Prior art keywords
user
note
interface
instruction
terminal device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210981781.1A
Other languages
Chinese (zh)
Inventor
廖源
肖冬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202210981781.1A
Publication of CN115470757A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/166 Handling natural language data; Text processing; Editing, e.g. inserting or deleting
    • G06F16/338 Information retrieval of unstructured textual data; Querying; Presentation of query results
    • G06F16/35 Information retrieval of unstructured textual data; Clustering; Classification
    • G06F16/381 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually, using identifiers, e.g. barcodes, RFIDs
    • G06F40/109 Handling natural language data; Text processing; Formatting; Font handling; Temporal or kinetic typography

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Library & Information Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the application provides a content ordering method and a terminal device. The method helps the user record information more conveniently when using note software on the terminal device, thereby improving the user experience. In the method, the terminal device displays an interface of a note to be edited; receives an instruction for triggering the note to be edited, and displays a layout editing interface of the note to be edited; receives and responds to a path-drawing instruction on the layout editing interface, and displays the drawn path on the layout editing interface, where the drawn path is used to divide the note to be edited into a plurality of segments; receives and responds to an instruction for marking a custom identifier on any one of the segments, and displays the corresponding custom identifier on that segment; and receives and responds to an instruction for sorting the segments according to the custom identifiers marked on them, sorts the segments according to the custom identifier displayed on each segment, and displays an interface of the sorted note.

Description

Content ordering method and terminal equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a content ordering method and a terminal device.
Background
With the popularization of terminal devices, the application scenarios of note software such as memos and notes have gradually widened. To improve efficiency and match users' habits, much of the current note software supports finger or stylus input so as to better simulate writing with an ordinary pen and paper. How to enable users to record information such as text, pictures, and doodles more conveniently in note software is therefore of significant research interest.
Disclosure of Invention
The content ordering method and terminal device provided in this application help the user record information more conveniently when using note software on the terminal device, thereby improving the user experience.
In a first aspect, an embodiment of the present application provides a content ordering method. The method can be applied to a terminal device and includes the following steps: displaying an interface of a note to be edited; receiving an instruction for triggering the note to be edited, and displaying a layout editing interface of the note to be edited; receiving and responding to a path-drawing instruction on the layout editing interface, and displaying the drawn path on the layout editing interface, where the drawn path is used to divide the note to be edited into a plurality of segments; receiving and responding to an identification instruction for marking a custom identifier on any one of the segments, and displaying the corresponding custom identifier on that segment; and receiving and responding to a first sorting instruction for sorting the segments according to the custom identifiers marked on them, sorting the segments according to the custom identifier displayed on each segment, and displaying an interface of the sorted note.
According to the method, when the user uses note software, the terminal device can offer the user a new way of working and can intelligently sort the contents of a note. The user first manually divides the content of a note into a plurality of segments through path-drawing instructions, and then marks a corresponding custom identifier on each segment, so that the terminal device can intelligently sort the segments based on the custom identifiers and obtain a reordered target note. The method therefore helps the user use note software more conveniently, makes it simpler to arrange note content in order, and better matches the user's habitual way of operating, improving the user's efficiency with note software and further improving the user experience.
In a possible design, sorting the segments according to the custom identifier displayed on each segment may be implemented as follows: segments bearing different custom identifiers are displayed in the display order of those identifiers, and segments bearing the same custom identifier are displayed in adjacent positions.
In this design, the terminal device can classify all segments based on the custom identifier the user has added to each segment: segments marked with the same or similar custom identifiers are grouped into the same class and displayed in adjacent positions, and the display order of the sorted segments can be determined from the order of the custom identifiers marked on them. Displaying the sorting result in this way better matches the user's intention and can further improve the user experience.
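To make the grouping behaviour just described concrete, the following Java sketch shows one way a stable group-by-identifier sort could be realised. The class and method names (NoteSegment, sortByIdentifier) are assumptions introduced for illustration only and do not appear in this application.

```java
import java.util.Comparator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical model of one divided part of the note and its custom identifier.
class NoteSegment {
    final String content;     // text/picture content of this segment
    final String identifier;  // custom identifier drawn on it, e.g. "star", "triangle"

    NoteSegment(String content, String identifier) {
        this.content = content;
        this.identifier = identifier;
    }
}

class SegmentSorter {
    /**
     * Sorts segments so that segments with the same identifier end up adjacent,
     * groups appear in the order in which their identifier first appears in the note,
     * and the original order inside each group is preserved (stable sort).
     */
    static List<NoteSegment> sortByIdentifier(List<NoteSegment> segments) {
        // Record the display order in which each identifier first appears.
        Map<String, Integer> firstSeen = new LinkedHashMap<>();
        for (NoteSegment s : segments) {
            firstSeen.putIfAbsent(s.identifier, firstSeen.size());
        }
        // A stable sort by that order keeps same-identifier segments adjacent.
        return segments.stream()
                .sorted(Comparator.comparingInt((NoteSegment s) -> firstSeen.get(s.identifier)))
                .collect(Collectors.toList());
    }
}
```

Because the sort is stable, segments that share an identifier also keep their original relative order within the group, which matches the adjacent-position behaviour described above.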
In one possible design, the method further includes: receiving, on the interface of the sorted note, a second sorting instruction for moving at least one designated segment to a target position, and moving the at least one designated segment to the target position for display.
In this design, the terminal device can sort automatically based on the custom identifier marked on each segment, and can also incorporate the user's manual adjustment of the automatically sorted note, so that the content of the target note is sorted more flexibly and matches the user's intention more closely, improving the user experience.
In one possible design, receiving an identification instruction for marking a custom identifier on any one of the segments may be implemented as: receiving an operation instruction in which the user manually draws a custom identifier on any segment; alternatively, it may be implemented as: displaying a plurality of preset identifiers on the layout editing interface, and receiving an operation instruction in which the user selects an identifier from the displayed preset identifiers and draws the selected identifier on any segment selected by the user.
In this design, the terminal device can provide the user with different ways of marking a custom identifier on each segment according to the actual service scenario: letting the user draw by hand improves the convenience of editing the note, while providing preset identifiers for the user to select helps guarantee the accuracy with which the terminal device classifies the segments based on their identifiers. In a specific implementation, the mode can be configured according to the user's needs, improving the user experience.
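As a rough illustration of the two input modes described in this design, the sketch below assumes a hypothetical stroke recognizer that maps a hand-drawn mark to the closest preset symbol; all names here are invented for the example and are not part of this application.

```java
// Hypothetical ways an identification instruction can be resolved into an identifier.
enum PresetIdentifier { STAR, TRIANGLE, SQUARE }

interface StrokeRecognizer {
    // Assumed component that maps a hand-drawn stroke to the closest known symbol,
    // or returns null if nothing matches confidently.
    PresetIdentifier recognize(float[][] strokePoints);
}

class IdentifierInput {
    private final StrokeRecognizer recognizer;

    IdentifierInput(StrokeRecognizer recognizer) {
        this.recognizer = recognizer;
    }

    // Mode 1: the user draws the identifier by hand on a segment; a recognition step is needed.
    PresetIdentifier fromHandDrawing(float[][] strokePoints) {
        return recognizer.recognize(strokePoints);
    }

    // Mode 2: the user picks one of the preset identifiers shown on the layout editing
    // interface and drags it onto a segment; no recognition step is needed.
    PresetIdentifier fromPresetSelection(PresetIdentifier selected) {
        return selected;
    }
}
```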
In one possible design, the custom identifier may be represented by, but is not limited to, one or a combination of the following forms: numbers, letters, symbols.
In one possible design, the method further includes: receiving, on the interface of the sorted note, an editing instruction for typesetting and editing the sorted note based on a preset typesetting rule, typesetting and editing the sorted note, and displaying an interface of the typeset and edited note. In this design, the terminal device can not only sort the content of the note to be edited but also, on that basis, apply intelligent typesetting to the sorted note, improving its uniformity, appearance, and the like.
In one possible design, the preset typesetting rule may include, but is not limited to, at least one of the following: adjusting the fonts of the sorted note to be consistent; adjusting the word spacing of the sorted note to be consistent; converting handwritten words in the sorted note into text; and adjusting the paragraph widths of the sorted note to be consistent.
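The preset typesetting rules listed above can be viewed as independent normalisation passes over the sorted note. The Java sketch below illustrates this under assumed names (NoteDocument, LayoutRule, LayoutEditor); the default values are placeholders, not values prescribed by this application.

```java
import java.util.Set;

// Assumed minimal view of a sorted note that a typesetting pass can adjust.
interface NoteDocument {
    void setUniformFont(String fontName);
    void setUniformWordSpacing(float spacing);
    void convertHandwritingToText();
    void setUniformParagraphWidth(float width);
}

enum LayoutRule { UNIFORM_FONT, UNIFORM_WORD_SPACING, HANDWRITING_TO_TEXT, UNIFORM_PARAGRAPH_WIDTH }

class LayoutEditor {
    // Applies whichever subset of preset rules is enabled; the constants are illustrative only.
    static void applyRules(NoteDocument note, Set<LayoutRule> rules) {
        if (rules.contains(LayoutRule.UNIFORM_FONT))            note.setUniformFont("default-font");
        if (rules.contains(LayoutRule.UNIFORM_WORD_SPACING))    note.setUniformWordSpacing(1.0f);
        if (rules.contains(LayoutRule.HANDWRITING_TO_TEXT))     note.convertHandwritingToText();
        if (rules.contains(LayoutRule.UNIFORM_PARAGRAPH_WIDTH)) note.setUniformParagraphWidth(320f);
    }
}
```

For instance, passing EnumSet.allOf(LayoutRule.class) would apply every rule at once, while a smaller set applies only the rules the user has enabled.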
In a second aspect, an embodiment of the present application further provides a terminal device, where the terminal device includes at least one memory and at least one processor; wherein the at least one memory is configured to store computer program code comprising computer instructions; the computer instructions, when executed by the at least one processor, cause the terminal device to perform the method of any one of the possible designs of the first aspect.
In a third aspect, an embodiment of the present application further provides a content ordering apparatus, which includes modules/units that perform the method in any one of the possible designs of the first aspect. These modules/units may be implemented by hardware, or by hardware executing corresponding software.
In a fourth aspect, a computer-readable storage medium is provided, in which a computer program (also referred to as code, or instructions) is stored, which when run on a computer, causes the computer to perform the method in any one of the possible designs of the first aspect.
In a fifth aspect, there is provided a computer program product comprising: computer program (also called code, or instructions), which when executed, causes the method in any of the possible designs of the first aspect described above to be performed.
In a sixth aspect, a graphical user interface on a terminal device is provided, the terminal device having a display screen, one or more memories, and one or more processors configured to execute one or more computer programs stored in the one or more memories, the graphical user interface comprising a graphical user interface displayed by the terminal device when the terminal device executes any one of the possible designs of the first aspect of the embodiments of the present application.
It should be noted that, for the beneficial effects of the respective designs of the terminal device provided in the second aspect to the sixth aspect of the embodiment of the present application, reference is made to the beneficial effect of any one of the possible designs in the first aspect, and details are not repeated here.
Drawings
FIG. 1 is a schematic diagram of a memo interface;
fig. 2 is a schematic diagram of a hardware architecture of a terminal device according to an embodiment of the present application;
fig. 3 is a block diagram of a software system architecture of a terminal device according to an embodiment of the present application;
fig. 4 is a first interface diagram of a content sorting method according to an embodiment of the present application;
fig. 5 is a second interface diagram of a content sorting method according to an embodiment of the present application;
fig. 6 is a third interface diagram of a content sorting method according to an embodiment of the present application;
FIG. 7 is a fourth interface diagram of a content ranking method according to an embodiment of the present application;
FIG. 8 is a fifth interface diagram of a content ordering method according to an embodiment of the present application;
FIG. 9 is a sixth interface diagram of a content ordering method according to an embodiment of the present application;
fig. 10 is a flowchart illustrating a content sorting method according to an embodiment of the present application;
fig. 11 is a seventh interface diagram of a content sorting method according to an embodiment of the present application.
Detailed Description
With the rapid development of society, mobile terminal devices such as mobile phones are becoming more and more popular. A mobile phone not only provides communication, but also has powerful processing, storage, photographing, and data-editing functions. It can therefore serve as both a communication tool and a mobile database for the user, allowing data to be recorded or edited anytime and anywhere; editing operations may include, but are not limited to, cutting, copying, pasting, and deleting data content. Based on the mobility and convenience of the mobile terminal device, storing and editing data content on it suits a variety of scenarios such as file creation and data recording.
Applications (APPs) with various functions, such as music, video, and memo APPs, may be installed on the terminal device. The embodiments of this application focus on note-type APPs installed on the terminal device, such as memos and notes.
Many note-type APPs allow the terminal device to be used with a stylus (also referred to as a "marker pen" or the like) or to accept finger-writing input. For example, (a) in fig. 1 shows a scene in which a user edits in the editing interface of a memo installed on a mobile phone using a stylus, and (b) in fig. 1 shows a scene in which a user edits in the same interface using a finger.
Based on this, an embodiment of the present application provides a content ordering method that can be applied to any note in a note APP, and is used to perform layout editing operations such as re-segmenting, re-classifying, or re-ordering the data content contained in the note. The data content of the note may include, but is not limited to, one or a combination of the following types: text, pictures, video, or diagrams.
The embodiments of the present application can be applied to terminal devices with touch panels, such as mobile phones, tablet computers, wearable devices (e.g., watches, bracelets), vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPCs), netbooks, personal digital assistants (PDAs), and smart home devices (e.g., smart televisions, smart speakers). It should be understood that the embodiments of the present application do not limit the specific type of the terminal device.
Exemplary terminal devices to which the embodiments of the present application can be applied include, but are not limited to, devices carrying the operating systems shown in Figure BDA0003799795450000041 or other operating systems. The portable electronic device may also be another portable electronic device, such as a laptop computer with a touch-sensitive surface (e.g., a touch panel), and so on.
Fig. 2 shows a schematic diagram of a possible hardware structure of the terminal device. The terminal device 200 includes: a radio frequency (RF) circuit 210, a power supply 220, a processor 230, a memory 240, an input unit 250, a display unit 260, an audio circuit 270, a communication interface 280, and a wireless-fidelity (Wi-Fi) module 290. Those skilled in the art will appreciate that the hardware configuration shown in fig. 2 does not constitute a limitation on the terminal device 200; the terminal device 200 provided in the embodiments of the present application may include more or fewer components than those shown, may combine two or more components, or may have a different arrangement of components. The various components shown in fig. 2 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
The following describes each component of the terminal device 200 in detail with reference to fig. 2:
the RF circuit 210 may be used to receive and transmit data during communication or a call. Specifically, the RF circuit 210 sends downlink data from the base station to the processor 230 for processing, and sends uplink data to be transmitted to the base station. Generally, the RF circuit 210 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier (LNA), a duplexer, and the like.
In addition, the RF circuit 210 may also communicate with other devices via a wireless communication network. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile Communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The Wi-Fi technology belongs to a short-distance wireless transmission technology, and the terminal device 200 may connect to an Access Point (AP) through the Wi-Fi module 290, thereby implementing access to a data network. The Wi-Fi module 290 may be used for receiving and transmitting data during communication.
The terminal device 200 may be physically connected to other devices through the communication interface 280. Optionally, the communication interface 280 is connected to the communication interface of the other device through a cable, so as to implement data transmission between the terminal device 200 and the other device.
The terminal device 200 can also implement a communication service, and implement interaction with a service-side device or other terminal devices, so that the terminal device 200 needs to have a data transmission function, that is, the terminal device 200 needs to include a communication module inside. Although fig. 2 shows communication modules such as the RF circuit 210, the Wi-Fi module 290, and the communication interface 280, it is understood that at least one of the above-mentioned components or other communication modules (such as a bluetooth module) for implementing communication exists in the terminal device 200 for data transmission.
For example, when the terminal device 200 is a mobile phone, the terminal device 200 may include the RF circuit 210, may further include the Wi-Fi module 290, or may include a bluetooth module (not shown in fig. 2); when the terminal device 200 is a computer, the terminal device 200 may include the communication interface 280, may further include the Wi-Fi module 290, or may include a bluetooth module (not shown in fig. 2); when the terminal device 200 is a tablet computer, the terminal device 200 may include the Wi-Fi module, or may include a bluetooth module (not shown in fig. 2).
The memory 240 may be used to store software programs and modules. The processor 230 executes various functional applications and data processing of the terminal device 200 by executing software programs and modules stored in the memory 240. Alternatively, the memory 240 may mainly include a program storage area and a data storage area. The storage program area may store an operating system (mainly including a kernel layer, a system layer, an application framework layer, an application layer, and other corresponding software programs or modules).
Further, the memory 240 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In this embodiment, the memory 240 may store note content newly created by the user in the note APP, note content updated after editing existing notes, and other stored data related to implementing the method provided by this application.
The input unit 250 may be used to receive editing operations of a plurality of different types of data objects such as numeric or character information input by a user and to generate key signal inputs related to user settings and function control of the terminal device 200. Alternatively, the input unit 250 may include a touch panel 251 and other input devices 252.
The touch panel 251, also referred to as a touch screen, may collect a touch operation performed by a user on or near the touch panel 251 (for example, an operation performed by the user on or near the touch panel 251 using any suitable object or accessory such as a finger, a stylus, etc.), and drive a corresponding connection device according to a preset program. In this embodiment, the touch panel 251 may collect, based on a relevant editing interface of the note APP displayed by the terminal device through the display unit 260, an editing operation of a user on the relevant editing interface of the note APP, for example, the editing operation may be a user operation such as adding a character, deleting a picture, and adjusting a sequence of note content.
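For readers who want a concrete picture of how a touch panel such as the touch panel 251 might collect a drawing path on the layout editing interface, the sketch below assumes an Android-style View/MotionEvent API; it is illustrative only and is not an implementation prescribed by this application.

```java
import android.content.Context;
import android.graphics.Path;
import android.view.MotionEvent;
import android.view.View;

// Assumed custom view on the layout editing interface that records the user's drawing path.
public class DrawPathView extends View {
    private final Path drawPath = new Path();

    public DrawPathView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                drawPath.moveTo(event.getX(), event.getY());
                return true;
            case MotionEvent.ACTION_MOVE:
                drawPath.lineTo(event.getX(), event.getY());
                invalidate();   // redraw so the user sees the path while drawing it
                return true;
            case MotionEvent.ACTION_UP:
                // The completed path would then be handed to the note-segmentation logic.
                return true;
            default:
                return false;
        }
    }
}
```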
Optionally, the other input devices 252 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 260 may be used to display information input by a user or information provided to a user and various menus of the terminal apparatus 200. The display unit 260 is a display system of the terminal device 200, and is used for presenting an interface to implement human-computer interaction. The display unit 260 may include a display panel 261. Alternatively, the display panel 261 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like. In this embodiment, the display unit 260 may be configured to display a relevant editing interface of the note APP for the user, for example, a display interface of a memo shown in fig. 1, so as to enable the user to view, create or edit a note, and the like.
The processor 230 is a control center of the terminal device 200, connects various components using various interfaces and lines, performs various functions of the terminal device 200 and processes data by operating or executing software programs and/or modules stored in the memory 240 and calling data stored in the memory 240, thereby implementing various services based on the terminal device 200. In this embodiment, the processor 230 may be configured to implement the content ordering method provided in this embodiment.
The terminal device 200 further comprises a power supply 220, such as a battery, for powering the various components. Optionally, the power supply 220 may be logically connected to the processor 230 through a power management system, so as to implement functions of managing charging, discharging, power consumption, and the like through the power management system.
As shown in fig. 2, the terminal device 200 further includes an audio circuit 270, a microphone 271 and a speaker 272, which can provide an audio interface between the user and the terminal device 200. The audio circuit 270 may be configured to convert audio data into a signal that can be recognized by the speaker 272 and to transmit the signal to the speaker 272 for conversion by the speaker 272 into an audio signal for output. The microphone 271 is used for collecting external sound signals (such as voice of a human being, other sounds, etc.), converting the collected external sound signals into signals that can be recognized by the audio circuit 270, and sending the signals to the audio circuit 270. The audio circuit 270 may also be used to convert signals sent by the microphone 271 into audio data, and output the audio data to the RF circuit 210 for transmission to, for example, another terminal device, or output the audio data to the memory 240 for further processing. In the embodiment of the application, when a user edits a note through a note APP, a new text content can be added through the microphone 271 and based on a speech-to-text technology.
Although not shown, the terminal device 200 may further include a camera, at least one sensor, and the like, which are not described in detail herein. The at least one sensor may include, but is not limited to, a pressure sensor, an air pressure sensor, an acceleration sensor, a distance sensor, a fingerprint sensor, a touch sensor, a temperature sensor, and the like.
The Operating System (OS) according to the embodiment of the present application is the most basic system software that runs on the terminal device 200. The software system of the terminal device 200 may adopt a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application exemplifies a software system architecture of the terminal device 200 by taking an operating system adopting a layered architecture as an example.
Fig. 3 is a block diagram of a software system architecture of a terminal device according to an embodiment of the present application. As shown in fig. 3, the software system architecture of the terminal device may be a layered architecture, for example, the software may be divided into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the operating system is divided into five layers, from top to bottom, an application layer, an application framework layer (FWK), runtime and system libraries, a kernel layer, and a hardware layer.
The application layer may include a series of application packages. As shown in fig. 3, the application layer may include a camera, settings, a skin module, a User Interface (UI), a third party application, and the like. The third-party application program may include a Wireless Local Area Network (WLAN), music, a call, bluetooth, a video, a memo, and the like.
In the embodiment of this application, the application layer can be used to present the editing interface. The editing interface provides the user with the editing operations of the note-type APPs, such as memos and notes, that the embodiments of this application focus on; it may be, for example, the layout editing interface involved in the following embodiments. For example, the user may change the order of text content and picture content in the layout editing interface of the memo APP. The editing interface may be the memo display interface shown in (a) or (b) of fig. 1.
In a possible implementation, applications may be developed in the java language by calling the application programming interfaces (APIs) provided by the application framework layer; through the application framework layer, a developer can interact with the lower layers of the operating system (e.g., the hardware layer and the kernel layer) to develop their application. The application framework layer mainly provides a series of services and management systems of the operating system.
The application framework layer provides an application programming interface and a programming framework for the application of the application layer. The application framework layer includes some predefined functions. As shown in FIG. 3, the application framework layers may include an activity manager, a window manager, a content provider, a view system, a phone manager, an explorer, a notification manager, and the like.
The activity manager is used to manage the life cycle of each application, provide a common navigation back function, and provide an interactive interface for the windows of all programs.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The telephone manager is used for providing a communication function of the terminal equipment. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like. In the embodiment of the application, when the user adds the pictures in the memo, all the pictures stored locally can be displayed for the user by calling the resource manager.
The notification manager enables the application to display notification information in the status bar, can be used to convey notification-type messages, can disappear automatically after a brief dwell, and does not require user interaction. Such as a notification manager used to inform download completion, message alerts, etc. The notification manager may also be a notification that appears in the form of a chart or scrollbar text in a status bar at the top of the system, such as a notification of a running application in the background, or a notification that appears on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is given, the terminal device vibrates, and an indicator light flashes.
The runtime includes a core library and a virtual machine. The runtime is responsible for scheduling and management of the operating system.
The core library comprises two parts: one part contains the functions that the java language needs to call, and the other part is the core library of the operating system. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media framework (media frame), three-dimensional graphics processing library (e.g., openGL ES), two-dimensional graphics engine (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of the two-dimensional and 3D layers for multiple applications.
The media framework supports a variety of commonly used audio, video format playback and recording, as well as still image files, and the like. The media framework may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The two-dimensional graphics engine is a two-dimensional drawing engine.
In some embodiments, the three-dimensional graphics processing library may be used to render three-dimensional motion trace images and the two-dimensional graphics engine may be used to render two-dimensional motion trace images.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The hardware layer may include various types of sensors, such as acceleration sensors, gravity sensors, touch sensors, and the like.
In general, the terminal device 200 can run a plurality of applications at the same time. In a simple case, one application corresponds to one process; in more complex cases, one application may correspond to a plurality of processes. Each process has a process number (process ID).
It should be understood that "at least one of the following" or similar expressions in the embodiments of the present application refer to any combination of these items, including any combination of single item(s) or plural item(s). For example, at least one (one) of a, b, or c, may represent: a, b, c, a and b, a and c, b and c, or a, b and c, wherein a, b and c can be single or multiple. "plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists singly, A and B exist simultaneously, and B exists singly, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
It is to be understood that the terms "first," "second," and the like, in the description of the present application, are used for distinguishing between descriptions and not necessarily for describing a sequential or chronological order, or for indicating or implying a relative importance.
It should be understood that the hardware structure of the terminal device may be as shown in fig. 2, and the software system architecture may be as shown in fig. 3, wherein software programs and/or modules corresponding to the software system architecture in the terminal device may be stored in the memory 240, and the processor 230 may execute the software programs and applications stored in the memory 240 to execute a flow of a content ordering method provided in the embodiment of the present application.
In order to facilitate understanding of the content ordering method provided by the present application, the following describes an implementation process using the method provided by the present application with reference to the contents shown in fig. 4 to fig. 11.
The embodiments of this application are suited to the scenario of a note APP installed on the terminal device. The main design concept is as follows: the terminal device receives and responds to the user's manual segmentation, by drawing paths, of a note to be edited displayed in the interface of the note APP, so that the note to be edited is divided into a plurality of segments, and detects the custom identifier the user marks on each segment; the terminal device then automatically classifies and reorders the segments based on the custom identifiers, obtaining a target note with ordered content. First, application scenarios to which the embodiments of the present application apply are described below with reference to several examples. It should be understood that the embodiments of the present application are not limited to the following application scenarios.
In the first scenario, a memo is taken as an example to describe sorting the content of a note contained in the memo using the method provided by the embodiment of this application. For example, the display interface of a note contained in the memo may be the note interface 41 shown in fig. 4. The note interface 41 includes multiple pieces of text and image content belonging to the note. The note interface 41 may also include a plurality of editing controls for editing the illustrated note, for example: a "switch to keyboard input" control for instructing editing operations such as adding or deleting note content via a keyboard; a "color selection" control for selecting a color of the note content (such as a font color); an "insert picture" control for inserting a picture stored locally or taken by the camera; a "pen" control and a "pencil" control for selecting the display effect of fonts in the note; an "eraser" control for erasing content in the note; and a "tidy" control 401 for instructing layout editing of the note. In addition, although not shown in fig. 4, in some other embodiments the note interface 41 may further include other editing controls, for example a "share" control, a "delete note" control, or a "collect note" control, which may be set according to the user's needs. Besides, the note interface 41 may further include a "return arrow" for exiting the illustrated note and displaying the first page of the memo, a status bar (which may include the mobile network, wireless network, time, and remaining power), and the like; it can be understood that the status bar may also contain the name of the operator, a Bluetooth icon, and so on.
For example, the terminal device may receive and respond to the user's click operation on the "tidy" control 401 on the note interface 41 to trigger the layout editing mode. It should be noted that entering the layout editing mode may also be triggered in other ways; for example, the user may directly perform a preset gesture operation on the note interface 41 (such as a double-click operation or a finger-joint tapping operation), or the user may trigger entry into the layout editing mode through a voice instruction (such as "enter layout editing mode"), and the like.
In an optional embodiment, when the layout editing mode is triggered, the terminal device may correspondingly display a layout editing interface, so that the user performs layout editing on it. As an example, the layout editing interfaces during the user's editing may be the layout editing interfaces 42, 43, and 44 in fig. 4. The layout editing interfaces 42, 43, and 44 may include a plurality of controls for the user to lay out and edit the note to be edited, such as a "marker pen" control 402, a "sort" control 403, and a "finish tidying" control 404. It can be appreciated that in other embodiments, other layout editing controls may also be included in the layout editing interfaces 42, 43, and 44, for example an undo control and a redo control (whose icons appear as Figure BDA0003799795450000091 and Figure BDA0003799795450000092 in the original), a confirm-edit "√" control, and the like, which can be set according to the user's needs. In addition, content such as the "return arrow" and the status bar may also be included in the layout editing interfaces 42, 43, and 44.
In some embodiments, the terminal device may receive the user's click operation on the "marker pen" control 402 on the layout editing interface, allowing the user to perform layout editing on the interface with the marker pen. For example, the terminal device may detect a drawing operation performed by the user with the marker pen on the layout editing interface and, in response, display the marker pen's drawing path (for example, the drawing frames shown on the layout editing interface 42 may be drawing paths of the marker pen) and display the custom identifier the user draws on a drawing frame with the marker pen. The drawing frames in the embodiment of this application are used to divide the note: each drawing frame can correspond to one segment, and the custom identifier on a drawing frame can be used to classify or mark that segment so that the segments can subsequently be reordered. For example, the layout editing interface 42 in fig. 4 includes 4 drawing frames from top to bottom, corresponding to the note content of 4 segments; the first and third drawing frames from the top use a five-pointed star symbol as the custom identifier, and the second drawing frame uses a triangle symbol. The custom identifier may also use other symbols, such as a circle or a square; the specific shape of the custom identifier is not limited here.
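A minimal data-structure sketch of the relationship just described (drawing frame, segment, custom identifier) is shown below; the field names and the simple vertical-containment test are assumptions made for illustration and are not defined by this application.

```java
import java.util.ArrayList;
import java.util.List;

// Assumed model: a frame the user draws around part of the note, plus the symbol drawn on it.
class DrawingFrame {
    final float top, bottom;      // vertical extent of the frame on the layout editing interface
    final String identifier;      // custom identifier drawn on the frame, e.g. "star"

    DrawingFrame(float top, float bottom, String identifier) {
        this.top = top;
        this.bottom = bottom;
        this.identifier = identifier;
    }

    // A content element belongs to this frame's segment if it lies inside the frame vertically.
    boolean contains(float elementY) {
        return elementY >= top && elementY <= bottom;
    }
}

class NoteSplitter {
    // Collects the note elements (text lines, pictures, ...) covered by one drawing frame.
    static List<String> segmentFor(DrawingFrame frame, List<Float> elementYs, List<String> elements) {
        List<String> segment = new ArrayList<>();
        for (int i = 0; i < elements.size(); i++) {
            if (frame.contains(elementYs.get(i))) {
                segment.add(elements.get(i));
            }
        }
        return segment;
    }
}
```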
After the user finishes editing the drawing frames and the custom identifiers on the layout editing interface, the layout editing interface 43 shown in fig. 4 may be displayed. The layout editing interface 43 includes 5 drawing frames corresponding to 5 note segments; from top to bottom, the first and third drawing frames use a five-pointed star symbol as the custom identifier, the second and fourth drawing frames use a triangle symbol, and the fifth drawing frame uses a square symbol.
On the layout editing interface 43, the terminal device may further detect the user's click operation on the "sort" control 403. In response, the terminal device reorders the segments based on the custom identifiers and the drawing frames and displays the layout editing interface 44 of the sorted note. The layout editing interface 44 differs from the layout editing interface 43 in the positions of the segments within the note. For example, in the layout editing interface 44, the segments corresponding to the drawing frames with the triangle custom identifier are sorted into the first and second positions, the segments corresponding to the drawing frames with the five-pointed star custom identifier are sorted into the third and fourth positions, and the segment corresponding to the drawing frame with the square symbol is sorted into the fifth position. This makes it convenient for the user to group related parts of the note together and combine them in order.
On the layout editing interface 44, if the terminal device detects the user's click operation on the "finish tidying" control 404, the terminal device exits the layout editing mode in response and displays the sorted target note interface 45. Compared with the note interface 41, the target note interface 45 shows the note content reordered, which is convenient for the user to review; and compared with the layout editing interfaces 42, 43, and 44, the target note interface 45 no longer displays the frame-selection lines, custom identifiers, and so on drawn by the user, i.e., it shows the display effect after layout editing.
In a second scenario, the processing flow is similar to that of the first scenario. The difference lies in how the user draws the custom identifier on the layout editing interface: the terminal device may further detect that the user, using the marker pen, selects one of a plurality of preset identifiers displayed in an "identifier" control 501 provided on the layout editing interface 52 shown in fig. 5 and draws with it (for example, via a long press plus a drag operation); the preset identifiers included in the "identifier" control 501 include a five-pointed star, a triangle, a square, and the like. For example, on the layout editing interface 52 in fig. 5, the user may long-press the five-pointed star symbol in the "identifier" control 501 and, once the symbol becomes draggable, drag it onto the first drawing frame. It should be noted that the five-pointed star identifier in the "identifier" control 501 is not deleted by the user's drag operation, so the five-pointed star symbol can continue to be used as the custom identifier of other drawing frames; for example, it can also be dragged onto the third drawing frame. Optionally, the terminal device may copy the five-pointed star symbol in response to the user's long-press operation and move the copied symbol along with the user's marker pen in response to the drag operation. It can be understood that, as shown on the layout editing interface 52, the user may also drag the triangle symbol in the "identifier" control 501 onto the second and fourth drawing frames, and the square symbol onto the fifth drawing frame.
In this scenario, providing preset identifiers on the layout editing interface helps ensure the accuracy with which the terminal device classifies each note segment, avoiding inaccurate classification caused by recognition errors in custom identifiers drawn by hand.
In scenario three, the processing flow is similar to scenario one, except that the custom identifier may be represented by a number or a letter in addition to a symbol. In this scenario, the terminal device may determine the order of the note segments directly according to the order of the numbers or letters, which helps the user sort the note more intuitively.
Illustratively, taking numbers as an example of the custom identifier, in the layout editing interface 63 of fig. 6, the layout editing interface 63 includes 5 drawing frames, which correspond to 5 note partitions, respectively, where from top to bottom, a user assigns a number "3" to the custom identifier of the first drawing frame, a number "1" to the custom identifier of the second drawing frame, a number "4" to the custom identifier of the third drawing frame, a number "2" to the custom identifier of the fourth drawing frame, and a number "5" to the custom identifier of the fifth drawing frame. If the terminal device receives and responds to the click operation of the user on the "sort" control 403, the notes can be sorted according to the numerical identifiers marked by the split parts of the notes, and the sorted notes can be displayed as the composition editing interface 64. For example, in the composition editing interface 64, the drawing frame assigned with the numeral designation "1" is sorted in the position of the first bar in the note, the drawing frame assigned with the numeral designation "2" is sorted in the position of the second bar in the note, the drawing frame assigned with the numeral designation "3" is sorted in the position of the third bar in the note, the drawing frame assigned with the numeral designation "4" is sorted in the position of the fourth bar in the note, and the drawing frame assigned with the numeral designation "5" is sorted in the position of the fifth bar in the note.
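When the custom identifiers are numbers as in this scenario, the display order follows the labels directly rather than a grouping step. The short Java sketch below illustrates this under an assumed model (LabeledSegment and NumericOrderSorter are invented names, not terms from this application).

```java
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical segment carrying a numeric custom identifier, e.g. 1..5 in the scenario above.
class LabeledSegment {
    final int label;
    final String content;

    LabeledSegment(int label, String content) {
        this.label = label;
        this.content = content;
    }
}

class NumericOrderSorter {
    // Segments are displayed strictly in ascending order of their numeric labels.
    static List<LabeledSegment> sort(List<LabeledSegment> segments) {
        return segments.stream()
                .sorted(Comparator.comparingInt((LabeledSegment s) -> s.label))
                .collect(Collectors.toList());
    }
}
```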
Scenario four is similar to scenario three; the difference is in how the terminal device reorders the note segments on the basis of the drawing frames and custom identifiers the user has drawn on the layout editing interface. Besides the click operation on the "sort" control 403, the reordering may also proceed as shown in the layout editing interfaces 63 and 64 in fig. 7: the terminal device detects and responds to a manual sorting instruction performed by the user with the marker pen (such as a drag operation, a click plus drag, or a long press plus drag), sorting the segments one by one. For example, on the layout editing interface 63, the terminal device detects that the user moves the note segment marked with the numeric identifier "1" to the first position in the note and displays the result as the layout editing interface 64; on the layout editing interface 64, it further detects that the user moves the note segment marked with the numeric identifier "2" to the second position, displayed as the layout editing interface 65. When the user drags the segments marked "1" and "2", the other segments may be adjusted adaptively (for example, automatically moved down to the free positions below), and the display interface after this adaptive adjustment may be the layout editing interface 65 in fig. 7. For the display interfaces 41, 62, and 45 in fig. 7, refer to the description of scenario one; details are not repeated here.
In a fifth scenario, based on the first scenario shown in fig. 4, after the terminal device obtains the layout editing interface 44 and before it exits the layout editing mode and displays the target note interface 45, the terminal device may further detect a user operation of manually adjusting the note order on the reordered note, for example a drag operation on a note segment, a click plus drag, or a long press plus drag. For example, on the layout editing interface 441 displayed after automatic sorting as shown in fig. 8, the terminal device may further receive a user operation of dragging the first segment to the second position and, in response, display the layout editing interface 442 after the user has manually adjusted the note order.
On the layout editing interface 442, if the terminal device detects the user's click operation on the "finish tidying" control 404, the terminal device exits the layout editing mode in response and displays the target note interface 45'. Compared with the target note interface 45 in scenario one, the note contained in the target note interface 45' has the display positions of its first and second segments transposed.
In the sixth scenario, in addition to the reordering of the display order of note content introduced in scenarios one to five, intelligent layout editing of the note can be achieved by combining the method with a preset typesetting rule. The preset typesetting rule includes but is not limited to one or a combination of the following: adjusting fonts (for example, making all fonts consistent, or making the fonts of the same paragraph consistent), adjusting word spacing (for example, making all word spacing consistent, or making the word spacing of the same paragraph consistent), converting handwritten words contained in the sorted note into text, and adjusting paragraph widths (for example, making all paragraph widths of the sorted note consistent).
For example, assume that the note display interface 91 in fig. 9 is the note display interface before intelligent typesetting; it can be seen that the paragraph widths and fonts are not uniform and the display effect is poor. In this scenario, the terminal device may receive a preset editing instruction for typesetting and editing the sorted note based on a preset typesetting rule. The preset editing instruction may include, but is not limited to: a preset gesture (for example, the user may draw a preset graphic such as a circle on the note display interface 91), a preset user operation (for example, a long-press operation), or a click operation on a preset control (although not shown in fig. 9, an intelligent typesetting control may, for example, also be included on the note display interface 91). In response to the preset editing instruction, the terminal device performs intelligent typesetting based on the preset typesetting rule and displays the intelligently typeset note display interface 92. Compared with the note display interface 91, the note display interface 92 shows the note after intelligent typesetting, which satisfies certain typesetting rules (such as consistent fonts, consistent word spacing, and consistent paragraph spacing) and therefore has a better display effect. It should be noted that the preset typesetting rule may be configured by default, and the terminal device may also receive the user's personalized setting of the typesetting rule.
Based on the first to sixth scenarios described above, it should be noted that the terminal device may receive user operations performed by the user with a marker pen, and may also receive user operations performed by the user directly with a finger on each display interface, without editing through the marker pen.
It should be further noted that, in each of the above scenarios, when the terminal device displays a drawing path in response to the user drawing on the composition editing interface, the stroke color, stroke thickness, and the like of the drawing path may be preset, and the stroke thickness may also be determined in combination with the touch pressure detected by the touch panel, which is not limited in this application.
It is to be understood that the above-described scenarios may also be combined with each other, and the present application is not limited to the above scenarios.
The foregoing describes, with reference to a plurality of scenarios, the interface processing effects that can be achieved by the method provided in the embodiments of the present application. The following describes an implementation process of the content sorting method based on the terminal device provided by the present application, so as to explain how the interface processing effects described above with reference to fig. 4 to 9 are achieved and how the note type APP can thereby record information more conveniently. Referring to fig. 10, a schematic flow chart of a content sorting method according to a possible embodiment of the present application is provided. The method may be applied to a terminal device (e.g., a mobile phone, a tablet computer, etc.) and includes the following steps:
Step 1001, the terminal device displays an interface of the note to be edited on a display screen.
Illustratively, the terminal device detects a user instruction for entering the note to be edited and, in response, displays the interface of the note to be edited, as shown by interface 41 in fig. 4.
Wherein the user instruction may be, but is not limited to, the following:
Mode 1: a click operation of the user on any note entry in the home page list of a note type APP. As shown in the interface 111 in fig. 11, the user instruction may be a click operation performed by the user on memo content 1 in the home page list of a memo APP, so that the note interface of memo content 1 is displayed on the display screen; this note interface is the interface of the note to be edited.
Mode 2: a user voice instruction. For example, the user instruction may be a voice instruction such as "start the memo APP and open memo content 1". After the instruction is received, a microphone in the terminal device passes it through an audio circuit to the processor, which recognizes the instruction and displays the note interface of memo content 1 in the memo APP on the display screen as the interface of the note to be edited.
Mode 3: the terminal device displays a shortcut entry of the note type APP, and the user operates the shortcut entry. As shown in the interface 112 in fig. 11, the terminal device is a mobile phone, a service card of the memo APP is displayed on the desktop of the mobile phone, and a direct entry of memo content 1 is displayed on the service card; in this case, the user instruction may be a click operation performed by the user on the service card to enter the note interface of memo content 1 directly. In addition, the shortcut entry of the note type APP may also be a memo shortcut displayed on the desktop, a desktop widget of the memo, a shortcut entry of the memo included in the notification bar page, and the like, which is not limited.
Step 1002, the terminal device receives an instruction for triggering editing of the note to be edited, and displays a typesetting editing interface of the note to be edited.
For example, the interface of the note to be edited displayed by the terminal device is generally a note preview interface. To perform composition editing on the note to be edited, the user may input, on the note preview interface, an operation instruction for editing the note (referred to here as a "composition editing instruction"), which triggers the terminal device to switch from the interface of the note to be edited to the composition editing interface.
The composition editing instruction may be implemented in the following ways: a preset gesture operation indicating entry into the composition editing mode (for example, a double-click or single-click operation on the interface of the note to be edited, which may be a tap with a finger joint or a single or double click with a stylus), a click operation on a preset control (for example, the "tidying" control 401 shown in the interface 41 in fig. 4), a user voice instruction such as "enter the composition editing mode", and the like, which are not limited herein.
It can be understood that the composition editing interface of the note to be edited may include controls for assisting the user in composition editing, such as the "marker pen" control 402, the "sort" control 403, and the "finish sorting" control 404 shown in interfaces 42-44 in fig. 4. It may also include, in the upper right area of interfaces 42-44, an undo-edit control, a redo-edit control, and a confirm-edit "√" control; and, in the upper left area of interfaces 42-44, a "return arrow", a status bar, and the like. The function of each control can be found in the scenarios introduced above and is not repeated here.
In this way, after the terminal device detects the operation instruction indicating that the user wants to perform typesetting editing on the note to be edited, it displays a typesetting editing interface that is convenient for the user to edit. On the one hand, this provides the user with a more precise and more targeted typesetting editing interface; on the other hand, it avoids interference with other display scenarios: for example, when the user has not entered or has exited the typesetting editing mode, the preview effect before or after note editing can be displayed, so that the user can see the note's interface display effect more intuitively.
Step 1003, the terminal device receives and responds to a drawing path instruction on the typesetting editing interface, and displays a drawing path on the typesetting editing interface, where the drawing path is used to divide the note to be edited into a plurality of divided portions.
For example, the terminal device may detect the drawing path instruction input on the touch panel 251 with a marker pen, a finger, or the like.
For example, if the terminal device detects the drawing path instruction, information such as the position coordinates and touch pressure associated with the instruction may be collected by a sensor and transmitted to the processor 230 for processing. In response to the drawing path instruction, the processor 230 may then display the drawing path on the display screen along the path traversed by the instruction, using a default (or user-defined) drawing color, drawing pattern, and the like. In addition, the processor 230 may determine the thickness of the drawing path from the touch pressure of the instruction; for example, the larger the touch pressure, the thicker the displayed drawing path, and the smaller the touch pressure, the thinner the displayed drawing path. It can be understood that the hardware layer of the terminal device may further include a pressure sensor through which the terminal device obtains the touch pressure of the drawing path instruction. A sketch of such a pressure-to-width mapping follows.
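As a small illustration of the pressure-dependent stroke thickness described above, the following sketch maps a normalized touch pressure reading to a stroke width; the linear mapping and the width range are assumptions, not values taken from the disclosure.

```python
def stroke_width_from_pressure(pressure: float,
                               min_width: float = 1.0,
                               max_width: float = 6.0) -> float:
    """Map a normalized touch pressure (0.0-1.0) to a stroke width.

    A linear mapping is an illustrative choice; a real device might use a
    calibrated curve supplied by the pressure sensor driver.
    """
    pressure = max(0.0, min(1.0, pressure))  # clamp out-of-range readings
    return min_width + pressure * (max_width - min_width)

# Example: a light touch yields a thin line, a firm touch a thick one.
print(stroke_width_from_pressure(0.2))  # 2.0
print(stroke_width_from_pressure(0.9))  # 5.5
```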
Further, in response to the drawing path instruction, the terminal device may obtain, in addition to displaying the drawing path on the composition editing interface, the metadata corresponding to each drawn divided portion. For example, on the composition editing interface shown in the interface 43 in fig. 4, in response to the drawing path instruction, the terminal device may divide the note to be edited into 5 note divided portions through 5 drawing frames, and may further obtain 5 sets of metadata corresponding to the 5 note divided portions respectively. The metadata describes attribute information of each note divided portion, such as its display area and data type.
In an optional implementation, the terminal device may obtain, based on the path coordinates of each drawing frame drawn by the user, the plurality of position coordinate points of the touch panel that belong to that drawing frame, and then determine the corresponding metadata from those position coordinate points. For example, for a first drawing frame, if all of the position coordinate points belonging to the first drawing frame belong to first metadata, it may be determined that the first drawing frame corresponds to the first metadata. It can be understood here that the touch panel is composed of a large number of position coordinate points that are imperceptible to the user. A sketch of this containment test follows.
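The containment rule described above can be sketched as follows. The MetadataGroup fields and the representation of a drawing frame as a set of touch-panel coordinate points are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional, Set, Tuple

Point = Tuple[int, int]  # an (x, y) position coordinate point on the touch panel

@dataclass
class MetadataGroup:
    """Attribute record for one piece of note content (field names assumed)."""
    content_id: str
    data_type: str      # e.g. "text", "picture", "audio"
    points: Set[Point]  # the touch-panel coordinates this content occupies

def frame_to_metadata(frame_points: Set[Point],
                      groups: List[MetadataGroup]) -> Optional[MetadataGroup]:
    """Return the metadata group that covers every point of the drawn frame.

    Mirrors the rule above: if all position coordinate points belonging to
    the first drawing frame belong to the first metadata, the frame
    corresponds to that metadata.  Returns None when no single group does.
    """
    for group in groups:
        if frame_points <= group.points:  # subset test
            return group
    return None

# Tiny example: a 2x2 "picture" occupying four coordinates and a short text run
picture = MetadataGroup("pic-1", "picture", {(0, 0), (0, 1), (1, 0), (1, 1)})
text = MetadataGroup("txt-1", "text", {(5, 0), (5, 1)})
print(frame_to_metadata({(0, 0), (1, 1)}, [picture, text]).content_id)  # pic-1
```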
In addition, when the user manually divides the note content, the user may not accurately enclose all of a group of related content in a drawing frame (for example, the drawing frame may not cover all of the position coordinate points included in the corresponding note divided portion). In implementation, frame selection can be optimized with a magnetic-lasso-like tool so that the minimal note unit (such as a paragraph of text, a picture, or a voice clip) is selected in its entirety. This prevents the original note content from being cut and damaged, for example, prevents text content from being lost. A sketch of such frame snapping follows.
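A possible form of such frame snapping is sketched below, assuming each minimal note unit can be described by a bounding rectangle; the rectangle representation and the helper names are illustrative, not taken from the disclosure.

```python
from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # (left, top, right, bottom)

def intersects(a: Rect, b: Rect) -> bool:
    """True when the two rectangles overlap at all."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def union(a: Rect, b: Rect) -> Rect:
    """Smallest rectangle covering both inputs."""
    return (min(a[0], b[0]), min(a[1], b[1]), max(a[2], b[2]), max(a[3], b[3]))

def snap_frame_to_units(drawn_frame: Rect, unit_boxes: List[Rect]) -> Rect:
    """Expand a hand-drawn selection frame so that every minimal note unit
    (a paragraph, picture, voice clip, ...) it touches is fully enclosed,
    standing in for the magnetic-lasso style optimization mentioned above."""
    snapped = drawn_frame
    for box in unit_boxes:
        if intersects(drawn_frame, box):
            snapped = union(snapped, box)
    return snapped

# A sloppy frame that clips a picture gets widened to include all of it.
units = [(0, 0, 100, 40),    # paragraph 1
         (0, 50, 100, 120)]  # picture
print(snap_frame_to_units((5, 45, 90, 100), units))  # (0, 45, 100, 120)
```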
In another optional implementation, based on the data characteristics of pictures, the metadata stored in the note may be stored in groups. For example, picture data in the note may include many pixel points occupying many position coordinates of the touch panel; all of the pixel points included in the picture can then be stored in one group of metadata. In this way, when the terminal device obtains the metadata corresponding to a divided portion, it obtains the one or more groups of metadata corresponding to that portion, that is, the note is divided with metadata groups as the unit, which avoids the cutting damage that an inaccurate manual drawing by the user might otherwise cause.
Step 1004, the terminal device receives and responds to an identification instruction for marking a custom identifier on any one of the divided portions, and displays the corresponding custom identifier on that divided portion.
In an optional embodiment, the custom identifier may be drawn manually by the user, such as the five-pointed star, triangle, and square symbols drawn by the user shown in fig. 4. In this embodiment, the terminal device may display the custom identifier drawn by the user in the same way as the drawing path is displayed in step 1003, which is not repeated here.
In another optional embodiment, the custom identifier may also come from a plurality of preset identifiers displayed by the terminal device on the composition editing interface, from which the user selects one for a divided portion, as in the scenario shown in the interface 52 in fig. 5. Illustratively, receiving the identification instruction may be implemented as follows: the terminal device detects a selection operation of the user on a first preset identifier and, in response, copies the first preset identifier and puts the copy into a movable state; the terminal device then detects a drag operation of the user on the copied first preset identifier and, in response, displays the first preset identifier at the target drag position. The target drag position may be the position where the drag operation stops (which can also be understood as the divided portion that the user selects for the first preset identifier); for example, in the interface 52, the first preset identifier is a five-pointed star symbol, and the target drag position is on the first divided portion.
Based on the above example, the terminal device further recognizes the custom identifier marked by the user; for example, a neural network model capable of recognizing various custom identifiers may be trained in advance. It should be noted that the embodiments of the present application do not limit the specific type of the custom identifier, which may take any form such as numbers, letters, or symbols.
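The disclosure leaves the recognition model open (a pre-trained neural network is one option). Purely as a lightweight illustrative stand-in, and not the model described here, a nearest-template matcher over resampled and normalized stroke points can already separate a few simple symbols; the templates and point counts below are hypothetical.

```python
import math
from typing import Dict, List, Tuple

Point = Tuple[float, float]

def resample(points: List[Point], n: int = 32) -> List[Point]:
    """Resample a drawn stroke to n points evenly spaced along its length."""
    cum = [0.0]
    for i in range(1, len(points)):
        cum.append(cum[-1] + math.dist(points[i - 1], points[i]))
    total = cum[-1] or 1.0
    out, j = [], 1
    for k in range(n):
        target = total * k / (n - 1)
        while j < len(points) - 1 and cum[j] < target:
            j += 1
        seg = (cum[j] - cum[j - 1]) or 1.0
        t = (target - cum[j - 1]) / seg
        out.append((points[j - 1][0] + t * (points[j][0] - points[j - 1][0]),
                    points[j - 1][1] + t * (points[j][1] - points[j - 1][1])))
    return out

def normalize(points: List[Point]) -> List[Point]:
    """Scale and translate the stroke into the unit square."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    w, h = (max(xs) - min(xs)) or 1.0, (max(ys) - min(ys)) or 1.0
    return [((x - min(xs)) / w, (y - min(ys)) / h) for x, y in points]

def classify(stroke: List[Point], templates: Dict[str, List[Point]]) -> str:
    """Return the template name whose shape is closest to the drawn stroke."""
    probe = normalize(resample(stroke))
    best, best_dist = "", float("inf")
    for name, tpl in templates.items():
        ref = normalize(resample(tpl))
        dist = sum(math.dist(a, b) for a, b in zip(probe, ref)) / len(probe)
        if dist < best_dist:
            best, best_dist = name, dist
    return best

TEMPLATES = {
    "triangle": [(0, 0), (1, 0), (0.5, 1), (0, 0)],
    "square":   [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)],
}

# A slightly wobbly triangle is still matched to the triangle template.
drawn = [(0.03, 0.02), (0.97, 0.01), (0.52, 0.96), (0.01, 0.04)]
print(classify(drawn, TEMPLATES))  # triangle
```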
In addition, the terminal device may compare the plurality of position coordinate points at which the custom identifier is displayed on the touch panel with the position coordinate points at which each divided portion is displayed, and thereby determine the divided portion corresponding to the custom identifier. It can be understood that, for any custom identifier, the terminal device may determine the divided portion that contains the largest number of the identifier's position coordinate points; that divided portion is the one corresponding to the custom identifier. The terminal device may also store the custom identifier marked on each divided portion in the corresponding metadata. A small sketch of this assignment follows.
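A sketch of the "most coordinate points wins" assignment described above, assuming each divided portion's display area is available as a set of touch-panel coordinates (the ids and sets below are illustrative):

```python
from typing import Dict, Iterable, Set, Tuple

Point = Tuple[int, int]

def assign_identifier_to_segment(identifier_points: Iterable[Point],
                                 segment_points: Dict[str, Set[Point]]) -> str:
    """Return the divided portion that contains the most identifier points.

    segment_points maps an (assumed) segment id to the set of touch-panel
    coordinates occupied by that divided portion.
    """
    counts = {seg_id: 0 for seg_id in segment_points}
    for p in identifier_points:
        for seg_id, pts in segment_points.items():
            if p in pts:
                counts[seg_id] += 1
    return max(counts, key=counts.get)

# Two stacked divided portions; a star drawn mostly over the first one.
segments = {"segment-1": {(x, y) for x in range(10) for y in range(5)},
            "segment-2": {(x, y) for x in range(10) for y in range(5, 10)}}
star = [(2, 1), (3, 1), (2, 6)]
print(assign_identifier_to_segment(star, segments))  # segment-1
```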
It should be noted that, although the above description introduces step 1003 before step 1004, the execution order between step 1003 and step 1004 is not limited when the present application is implemented. In a specific implementation, step 1003 and step 1004 may be performed in an interleaved manner in response to the user's drawing path instructions and identification instructions. For example, the terminal device may receive and respond to one drawing path instruction to draw one divided portion, then receive and respond to an identification instruction for adding a custom identifier to that divided portion and display the custom identifier on it; the terminal device then responds to the user's next drawing path instruction and next identification instruction, and continues to draw the next divided portion and its corresponding custom identifier.
Step 1005, the terminal device receives and responds to a first sorting instruction for sorting the divided portions according to the custom identifiers on them, sorts the divided portions according to the custom identifiers displayed on them, and displays the interface of the sorted note.
The first sorting instruction may be implemented by, but is not limited to: a preset gesture operation indicating sorting (for example, a double-click or single-click operation on the interface of the note to be edited, which may be a tap with a finger joint or a single or double click with a stylus), a click operation on a preset control (for example, the "sorting" control 403 shown in the interface 43 in fig. 4), a user voice instruction such as "re-sorting", and the like, which are not limited herein.
Illustratively, the terminal device may sort the divided portions according to a pre-configured ordering of the custom identifiers and the custom identifier marked on each divided portion. For example, as shown in fig. 4, assuming that the pre-configured order of custom identifiers is triangle symbol first, five-pointed star symbol second, and square symbol third, the interface of the sorted note may be as shown in the interface 44 in fig. 4: the divided portions corresponding to the drawing frames marked with the triangle symbol are sorted into the first and second positions, the divided portions corresponding to the drawing frames marked with the five-pointed star symbol are sorted into the third and fourth positions, and the divided portion corresponding to the drawing frame marked with the square symbol is sorted into the fifth position. This makes it convenient for the user to move the contents of the respective parts of the note into an ordered combination according to the note content.
Based on the above example, the terminal device may further classify the custom identifiers and assign identical, similar, or same-category custom identifiers to the same class, so that divided portions whose custom identifiers belong to the same class are sorted together. For example, the two divided portions corresponding to the two triangle symbols in fig. 4 are sorted together, and the two divided portions corresponding to the two five-pointed star symbols are sorted together. In addition, for multiple divided portions with identical, similar, or same-category custom identifiers, their relative order may be determined by their display order in the original note (for example, in the interface 43 of fig. 4, the picture divided portion corresponding to the triangle symbol is displayed in the original note before the text divided portion corresponding to the triangle symbol; therefore, in the sorted interface 44, the two are not only sorted together, but the picture divided portion also remains displayed before the text divided portion), or their relative order may be determined randomly, which is not limited herein. A sketch of such a stable, identifier-based sort follows.
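The identifier-based ordering described in the last two paragraphs amounts to a stable sort keyed on identifier priority and original position. The sketch below assumes a simple priority table for the identifier classes; the class names and field names are illustrative.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Segment:
    content: str
    identifier: str      # e.g. "triangle", "star", "square"
    original_index: int  # display position in the note before sorting

# Assumed pre-configured ordering of the custom identifiers
IDENTIFIER_PRIORITY = {"triangle": 0, "star": 1, "square": 2}

def sort_segments(segments: List[Segment]) -> List[Segment]:
    """Sort divided portions so that portions sharing an identifier stay
    together, identifiers follow the pre-configured order, and portions with
    the same identifier keep their original display order (original_index
    breaks ties explicitly)."""
    fallback = len(IDENTIFIER_PRIORITY)  # unknown identifiers go last
    return sorted(segments,
                  key=lambda s: (IDENTIFIER_PRIORITY.get(s.identifier, fallback),
                                 s.original_index))

note = [Segment("text A", "star", 0),
        Segment("picture", "triangle", 1),
        Segment("text B", "square", 2),
        Segment("text C", "triangle", 3),
        Segment("text D", "star", 4)]
for seg in sort_segments(note):
    print(seg.identifier, seg.content)
# triangle picture / triangle text C / star text A / star text D / square text B
```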
In this embodiment, the terminal device may determine the target display area of each divided portion according to the sorted order and replace the original display area in the metadata corresponding to each divided portion with the target display area. The terminal device then displays each divided portion according to its target display area, so that the sorted target note is shown on the display panel. A sketch of this re-layout step follows.
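A sketch of writing the target display areas back into the metadata by stacking the sorted portions vertically; the field names, pixel unit, and fixed gap are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SegmentMeta:
    content: str
    height: int           # display height of the divided portion (assumed px)
    display_top: int = 0  # top coordinate of its display area in the note

def relayout(sorted_segments: List[SegmentMeta],
             top: int = 0, gap: int = 12) -> List[SegmentMeta]:
    """Stack the sorted divided portions vertically and write the resulting
    target display area back into each portion's metadata."""
    y = top
    for seg in sorted_segments:
        seg.display_top = y
        y += seg.height + gap
    return sorted_segments

ordered = [SegmentMeta("picture", 200), SegmentMeta("text C", 60),
           SegmentMeta("text A", 80)]
for seg in relayout(ordered):
    print(seg.content, seg.display_top)
# picture 0 / text C 212 / text A 284
```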
In addition, if the first sorting instruction triggers the terminal device to sort automatically according to the custom identifiers, the terminal device may further receive a second sorting instruction indicating that the user manually re-sorts the automatically sorted divided portions; for example, in the interface 441 in fig. 8, the user may manually adjust the order of the sorted note divided portions displayed in the first and second positions. In this case, the terminal device obtains the divided portions related to the second sorting instruction and adjusts them accordingly; it can be understood that, during the manual adjustment based on the second sorting instruction, the divided portions unrelated to the instruction do not need to be processed. For example, in the interface 441 in fig. 8, the display positions of the divided portions displayed in the third, fourth, and fifth positions after sorting do not need to change.
With the content sorting method provided by the embodiments of the present application, when a user uses a note APP, the terminal device can offer a new way of intelligently sorting the content in a note: the user can manually divide the content of one note into a plurality of divided portions through drawing path instructions and then mark each divided portion with a corresponding custom identifier, so that the terminal device can intelligently sort the divided portions based on the custom identifiers and obtain a typeset and edited target note. The method therefore helps the user use the note APP more conveniently, easily arranges the order of the note content, better matches the user's habitual way of operating, improves the user's efficiency when working with the note APP, and thereby improves the user experience.
Based on the above embodiments, the present application further provides a terminal device, where the terminal device includes a plurality of functional modules that interact with each other to realize the functions executed by the terminal device in the methods described in the embodiments of the present application, such as steps 1001 to 1005 performed by the terminal device in the embodiment shown in fig. 10. The functional modules can be implemented in software, hardware, or a combination of the two, and can be combined or divided arbitrarily in a specific implementation.
Based on the foregoing embodiments, the present application further provides a terminal device, where the terminal device includes at least one processor and at least one memory, the at least one memory stores computer program instructions, and when the terminal device runs, the at least one processor executes the functions performed by the terminal device in the methods described in the embodiments of the present application, for example, steps 1001 to 1005 performed by the terminal device in the embodiment shown in fig. 10.
Based on the above embodiments, the present application further provides a computer program product, comprising: a computer program (which may also be referred to as code, or instructions), which when executed, causes the methods described in embodiments of the present application to be performed.
Based on the above embodiments, the present application also provides a computer-readable storage medium, in which a computer program (also referred to as code or instructions) is stored, and when the code or instructions are executed by a computer, the computer executes the methods described in the embodiments of the present application.
Based on the above embodiments, the present application further provides a chip, where the chip is used to read a computer program stored in a memory, so as to implement the methods described in the embodiments of the present application.
Based on the above embodiments, the present application provides a chip system, which includes a processor and is used to support a computer device to implement the methods described in the embodiments of the present application. In one possible design, the chip system further includes a memory for storing programs and data necessary for the computer device. The chip system may be formed by a chip, or may include a chip and other discrete devices.

As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A content ordering method is applied to a terminal device and comprises the following steps:
displaying an interface of a note to be edited;
receiving an instruction for triggering editing of the note to be edited, and displaying a typesetting editing interface of the note to be edited;
receiving and responding to a drawing path instruction on the typesetting editing interface, and displaying a drawing path on the typesetting editing interface, wherein the drawing path is used for dividing the note to be edited into a plurality of divided parts; and,
receiving and responding to an identification instruction for identifying a custom identification on any one of the divided parts, and displaying the corresponding custom identification on any one of the divided parts;
and receiving and responding to a first sequencing instruction for sequencing each segmentation part according to the user-defined identification on each segmentation part, sequencing each segmentation part according to the user-defined identification displayed by each segmentation part, and displaying the interface of the sequenced notes.
2. The method of claim 1, wherein the sorting of the segments according to the custom identifier displayed by the segments comprises:
and displaying the segmentation parts with different custom identifications according to the preset display order of the different custom identifications, and displaying different segmentation parts with the same custom identification in adjacent positions.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
and receiving a second sorting instruction for moving at least one designated segmentation part to a target position on the interface of the sorted notes, and moving the at least one designated segmentation part to the target position for display.
4. The method according to any one of claims 1 to 3, wherein receiving an identification instruction for identifying a custom identifier on any one of the partitions comprises:
receiving an operation instruction of manually drawing a user-defined identifier on any one of the segmentation parts by a user; or,
displaying a plurality of preset identifications on the typesetting editing interface, and receiving an operation instruction of the user for selecting an identification from the displayed plurality of preset identifications and drawing the selected identification on any one of the segmentation parts selected by the user.
5. The method of any one of claims 1 to 4, wherein the custom identifier is represented by one or a combination of the following forms: numbers, letters, symbols.
6. The method according to any one of claims 1 to 5, further comprising:
and receiving, on the interface of the sorted notes, an editing instruction for typesetting and editing the sorted notes based on a preset typesetting rule, typesetting and editing the sorted notes, and displaying the interface of the typeset and edited notes.
7. The method according to claim 6, wherein the preset typesetting rules comprise at least one of the following:
adjusting the fonts of the ordered notes to be consistent;
adjusting the word spacing of the sorted notes to be consistent;
converting the handwriting words of the sorted notes into text words;
and adjusting the paragraph widths of the ordered notes to be consistent.
8. A terminal device comprising at least one processor coupled with at least one memory, the at least one processor configured to read and execute a computer program stored in the at least one memory to cause the method of any of claims 1-7 to be performed.
9. A computer-readable storage medium having stored therein instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1 to 7.
10. A computer program product comprising instructions for causing a computer to perform the method according to any one of claims 1 to 7 when the computer program product is run on the computer.
CN202210981781.1A 2022-08-16 2022-08-16 Content ordering method and terminal equipment Pending CN115470757A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210981781.1A CN115470757A (en) 2022-08-16 2022-08-16 Content ordering method and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210981781.1A CN115470757A (en) 2022-08-16 2022-08-16 Content ordering method and terminal equipment

Publications (1)

Publication Number Publication Date
CN115470757A true CN115470757A (en) 2022-12-13

Family

ID=84367933

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210981781.1A Pending CN115470757A (en) 2022-08-16 2022-08-16 Content ordering method and terminal equipment

Country Status (1)

Country Link
CN (1) CN115470757A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116594548A (en) * 2023-06-15 2023-08-15 广州银狐科技股份有限公司 Intelligent teaching blackboard management and control system based on cloud platform

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination