CN112306976A - Information processing method and device and electronic equipment - Google Patents

Information processing method and device and electronic equipment

Info

Publication number
CN112306976A
CN112306976A
Authority
CN
China
Prior art keywords
attribute
information
user
target multimedia
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010020764.2A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed (request not to publish the inventor's name)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing ByteDance Network Technology Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN202010020764.2A priority Critical patent/CN112306976A/en
Publication of CN112306976A publication Critical patent/CN112306976A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/48 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/487 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10 File systems; File servers
    • G06F16/17 Details of further file system functions
    • G06F16/176 Support for shared access to files; File sharing support
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/48 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/489 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using time information

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the invention disclose an information processing method and apparatus, and an electronic device. One embodiment of the method comprises: acquiring attribute information of a target multimedia selected by a user; showing attribute selection items, wherein the attribute selection items comprise a position attribute and/or a time attribute and at least one attribute value corresponding to each attribute; determining, according to the user's selection operation on the attribute selection items, a target attribute selected by the user and a target attribute value corresponding to the target attribute; and displaying the fusion information of the target multimedia and the target attribute value. The information-sharing user can thus choose to share, with a contact, the attribute information that better matches the current target multimedia, which improves user experience.

Description

Information processing method and device and electronic equipment
Technical Field
The present invention relates to the field of multimedia information technologies, and in particular, to an information processing method and apparatus, and an electronic device.
Background
With the development of electronic terminals, users can share various kinds of information with their contacts. For example, multimedia information may be sent to a contact so that it is shared with that contact.
In the related art, if the user has authorized the electronic terminal to obtain position information, position information for the multimedia is generated automatically when the multimedia information is sent to a contact, and is sent to the contact together with the multimedia. The multimedia information with the embedded position information can then be presented on the contact's terminal.
Disclosure of Invention
The embodiment of the disclosure provides an information processing method and device and electronic equipment.
In a first aspect, an embodiment of the present disclosure provides an information processing method, including: acquiring attribute information of a target multimedia selected by a user; showing attribute selection items, wherein the attribute selection items comprise position attributes and/or time attributes and at least one attribute value corresponding to each attribute; the position attributes comprise a current position attribute used for representing the current position of the user, a historical position attribute used for representing the position of the user when the target multimedia is made, and a multimedia position attribute used for representing the position matched with the information expressed by the target multimedia; according to the selection operation of the user on the attribute selection item, determining a target attribute selected by the user and a target attribute value corresponding to the target attribute; and displaying the fusion information of the target multimedia and the target attribute value.
In a second aspect, an embodiment of the present disclosure provides an information processing apparatus, including: the acquiring unit is used for acquiring the attribute information of the target multimedia selected by the user; the first presentation unit is used for presenting an attribute selection item, wherein the attribute selection item comprises a position attribute and/or a time attribute and at least one attribute value corresponding to each attribute, and the position attribute comprises a current position attribute used for representing the current position of a user, a historical position attribute used for representing the position of the user when the target multimedia is made and a multimedia position attribute used for representing the position matched with information expressed by the target multimedia; the determining unit is used for determining the target attribute selected by the user and the target attribute value corresponding to the target attribute according to the selection operation of the user on the attribute selection item; and the second display unit is used for displaying the fusion information of the target multimedia and the target attribute value.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: one or more processors; a storage device for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the steps of the information processing method according to any of the first aspects.
In a fourth aspect, the disclosed embodiments provide a computer readable medium, on which a computer program is stored, which when executed by a processor, implements the steps of the information processing method according to the first aspect.
According to the information processing method, the information processing device and the electronic equipment, the attribute information of the target multimedia selected by the user is acquired; showing attribute selection items, wherein the attribute selection items comprise position attributes and/or time attributes and at least one attribute value corresponding to each attribute; the position attributes comprise a current position attribute used for representing the current position of the user, a historical position attribute used for representing the position of the user when the target multimedia is made, and a multimedia position attribute used for representing the position matched with the information expressed by the target multimedia; according to the selection operation of the user on the attribute selection item, determining a target attribute selected by the user and a target attribute value corresponding to the target attribute; and displaying the fusion information of the target multimedia and the target attribute value. The information sharing user can select to share the attribute information which is more matched with the current target multimedia to the contact person, so that the user experience is improved.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
FIG. 1 is a flow diagram of some embodiments of an information processing method according to the present disclosure;
FIG. 2 is a flow diagram of further embodiments of an information processing method according to the present disclosure;
FIG. 3 is a schematic block diagram of some embodiments of an information processing apparatus according to the present disclosure;
FIG. 4 is a schematic diagram of an exemplary system architecture in which the information processing method according to some embodiments of the present disclosure may be applied;
FIG. 5 is a schematic block diagram of a computer system suitable for use in implementing an electronic device according to embodiments of the present application.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that the modifiers "a", "an", and "the" in this disclosure are intended to be illustrative rather than limiting; those skilled in the art will recognize that they should be understood as "one or more" unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
Referring to fig. 1, a flow of one embodiment of an information processing method according to the present disclosure is shown. The information processing method is applied to a terminal device. The information processing method shown in fig. 1 includes the following steps:
Step 101, acquiring attribute information of a target multimedia selected by a user.
The user can select a target multimedia from a plurality of multimedia items through selection operations such as clicking, touching, or hovering a cursor. The multimedia can be one of graphics, pictures, video, audio, and text, or a combination of at least two of these.
The attribute information of the target multimedia may include, for example: location attribute information, time attribute information, size attribute information, and the like.
After the user selects the target multimedia, the attribute information of the target multimedia selected by the user may be acquired through various methods. The target multimedia can be one of the above graphics, pictures, videos, audios and texts, or a combination of at least two of the above graphics, pictures, videos, audios and texts.
In some application scenarios, a plurality of multimedia data may be stored, in association with the attribute information corresponding to each of them, in the electronic device executing the information processing method. In these application scenarios, the user may produce the multimedia with this electronic device or with another electronic device in communication with it, for example by drawing a graphic, taking a picture, recording a video or audio, or editing text. When producing the multimedia, the user can authorize the electronic terminal to obtain position information, and the electronic device can record the corresponding position information, time information, and the like.
In these application scenarios, the attribute information corresponding to the target multimedia can be locally obtained according to the identifier of the target multimedia.
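As a concrete illustration, the following minimal sketch shows how such a local lookup by multimedia identifier might be organized. The storage layout, the field names, and the `local_attribute_store` dictionary are assumptions made for illustration only, not part of the disclosed method.

```python
# Hypothetical local store: multimedia identifier -> attribute information
# recorded when the multimedia was produced (assumed structure).
local_attribute_store = {
    "video_0001": {
        "position": {"latitude": 39.9042, "longitude": 116.4074},
        "time": "2020-01-09T10:30:00",
        "size": {"width": 1920, "height": 1080},
    },
}

def get_local_attribute_info(multimedia_id: str) -> dict:
    """Return the attribute information stored locally for the given multimedia, if any."""
    return local_attribute_store.get(multimedia_id, {})

print(get_local_attribute_info("video_0001"))
```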
Step 102, showing attribute selection items, wherein the attribute selection items comprise a position attribute and/or a time attribute and at least one attribute value corresponding to each attribute.
After the attribute information of the target multimedia selected by the user is acquired in step 101, the attribute selection item may be displayed in the display interface. The attribute selection item may include a location attribute and/or a time attribute, and at least one attribute value corresponding to each attribute.
In practice, each attribute and the attribute value corresponding to each attribute may be displayed in a list form, a thumbnail form, a hierarchical form, or a popup form.
In some application scenarios, the target attribute may include a position attribute, and the position attribute may include a current position attribute for characterizing the current position of the user, a historical position attribute for characterizing the position of the user when the target multimedia was produced, and a multimedia position attribute for characterizing a position matching the information expressed by the target multimedia. The attribute values corresponding to these position attributes may respectively be: a geographic information point corresponding to the user's current position, a geographic information point recorded when the target multimedia was produced, and a geographic information point determined from features extracted from the target multimedia.
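The following sketch illustrates one possible way to assemble these three kinds of position attribute values into attribute selection items. The class name, field names, and example values are assumptions introduced for illustration only.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class AttributeOption:
    attribute: str   # e.g. "position" or "time"
    kind: str        # "current", "historical", or "multimedia"
    value: str       # the attribute value offered to the user

def build_position_options(current_poi: str, creation_poi: str, content_poi: str) -> List[AttributeOption]:
    """Assemble the three kinds of position attribute values described above."""
    return [
        AttributeOption("position", "current", current_poi),      # user's current position
        AttributeOption("position", "historical", creation_poi),  # position when the target multimedia was produced
        AttributeOption("position", "multimedia", content_poi),   # position matching the content of the target multimedia
    ]

# Example: offer three geographic information points for the user to choose from.
options = build_position_options("Chaoyang Park", "Beijing Capital Airport", "Mount D scenic area")
```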
In some other application scenarios, the attribute may include a time attribute; and the display attribute selection item of the step 102 includes: displaying the time attribute and at least one time attribute value corresponding to the time attribute, wherein the at least one time attribute value comprises: the current time, the time corresponding to the target multimedia and/or the time determined by the characteristics extracted from the target multimedia.
Taking an image containing a tree as an example of the target multimedia, the time determined from the features extracted from the target multimedia may be, for example, the season corresponding to the image, determined according to features of the leaves on the tree extracted from the image, such as their size, shape, or color.
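As a purely hypothetical illustration of this idea, the sketch below maps assumed leaf features to a season with simple rules. An actual system would more likely use a trained classifier; the feature names, values, and rules here are invented for the example.

```python
def infer_season_from_leaf_features(leaf_color: str, leaf_size: str) -> str:
    """Hypothetical rule-based mapping from extracted leaf features to a season.

    This only illustrates deriving a time attribute value from multimedia content;
    it is not the method actually used by the disclosure.
    """
    if leaf_color == "green" and leaf_size == "small":
        return "spring"
    if leaf_color == "green":
        return "summer"
    if leaf_color in ("yellow", "red"):
        return "autumn"
    return "winter"  # bare branches or no leaves detected

print(infer_season_from_leaf_features("yellow", "large"))  # -> "autumn"
```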
Step 103, determining, according to the user's selection operation on the attribute selection items, the target attribute selected by the user and a target attribute value corresponding to the target attribute.
The user can perform a selection operation on each attribute shown in the attribute selection items, and the target attribute is determined from the at least one attribute according to this selection operation. The user can likewise perform a selection operation on the at least one attribute value corresponding to the target attribute, and the target attribute value corresponding to the target attribute is determined from the at least one attribute value according to that selection operation.
Step 104, displaying the fusion information of the target multimedia and the target attribute value.
In this embodiment, the target multimedia and the target attribute value may be fused and displayed on the local terminal. In addition, the fused information of the target multimedia and the target attribute value can be shared with other users through the server.
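A minimal sketch of one way the terminal might fuse a picture with the selected attribute value and display it locally is given below, using Pillow only as an example rendering path. The file name, the overlay style, and the idea of uploading the fused result afterwards are assumptions for illustration.

```python
from PIL import Image, ImageDraw  # Pillow, used here only as one possible rendering path

def fuse_and_display(image_path: str, target_attribute_value: str) -> Image.Image:
    """Overlay the selected attribute value onto the target picture (one possible form of fusion)."""
    image = Image.open(image_path).convert("RGB")
    draw = ImageDraw.Draw(image)
    draw.text((10, image.height - 30), target_attribute_value, fill="white")
    return image

fused = fuse_and_display("target_picture.jpg", "Mount D, autumn")
fused.show()          # display the fusion information on the local terminal
# fused.save("fused.jpg")  # the fused result could then be uploaded to the server for sharing with contacts
```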
In the information processing method provided by this embodiment, attribute information of a target multimedia selected by a user is acquired; attribute selection items are shown, wherein the attribute selection items comprise a position attribute and/or a time attribute and at least one attribute value corresponding to each attribute, and the position attribute comprises a current position attribute for characterizing the current position of the user, a historical position attribute for characterizing the position of the user when the target multimedia was produced, and a multimedia position attribute for characterizing a position matching the information expressed by the target multimedia; the target attribute selected by the user and the target attribute value corresponding to the target attribute are determined according to the user's selection operation on the attribute selection items; and the fusion information of the target multimedia and the target attribute value is displayed. Compared with prior-art schemes in which the system automatically generates multimedia fused with attribute values, the information processing method provided by this embodiment, through its interaction scheme with the user, allows the information-sharing user to autonomously select the target attribute of the target multimedia and the attribute value that are displayed together with the target multimedia. The information-sharing user can thus choose to share, with a contact, the attribute information that better matches the current target multimedia, which improves user experience.
Continuing to refer to FIG. 2, a flow diagram of additional embodiments of information processing methods according to the present disclosure is shown. The information processing method as shown in fig. 2 includes the steps of:
Step 201, in response to a selection operation performed by the user on the target multimedia, sending a request for acquiring attribute information of the target multimedia to the server.
In this embodiment, the electronic device executing the information processing method may establish a communication connection with a server that provides a data service to the electronic device in advance.
When a user performs a selection operation such as touch, click, cursor hovering on a target multimedia displayed on a screen of the electronic device, the electronic device may send a request for obtaining attribute information of the target multimedia to a server.
The request may include multimedia information. The multimedia information may include, for example, identification information, link information, and the like corresponding to the multimedia.
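A minimal sketch of such a request, assuming an HTTP interface, is shown below. The endpoint path and payload field names are illustrative assumptions rather than a defined API.

```python
import requests  # third-party HTTP client, used here only for illustration

def request_attribute_info(server_url: str, multimedia_id: str, multimedia_link: str) -> dict:
    """Ask the server for the attribute information of the selected target multimedia.

    The endpoint path and payload field names below are assumptions for this sketch.
    """
    payload = {"multimedia_id": multimedia_id, "multimedia_link": multimedia_link}
    response = requests.post(f"{server_url}/multimedia/attributes", json=payload, timeout=5)
    response.raise_for_status()
    return response.json()  # e.g. {"position": [...], "time": [...], "size": [...]}
```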
Step 202, receiving, from the server, the attribute information of the target multimedia determined according to the information of the target multimedia.
The information of the target multimedia may include, for example, an identifier corresponding to the target multimedia, a link corresponding to the target multimedia, and the like.
In some application scenarios, the server may store identifiers of multiple multimedia, and attributes and attribute values corresponding to the multiple multimedia in an associated manner. Alternatively, the server stores links of a plurality of multimedia, and attributes and attribute values corresponding to the plurality of multimedia in association with each other.
The server side can match attribute information corresponding to the target multimedia from the data stored in the server side according to the identification or the link of the target multimedia. The attribute information may include, for example, a location attribute, a time attribute, a size attribute, and at least one attribute value corresponding to each of the attributes.
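The sketch below illustrates one possible server-side lookup that matches attribute information by identifier first and by link second. The store layout and example entries are assumptions introduced for illustration.

```python
# Hypothetical server-side stores keyed by multimedia identifier and by link.
attribute_db_by_id = {"video_0001": {"position": "Mount D", "time": "2019-10-01"}}
attribute_db_by_link = {"https://example.com/v/0001": {"position": "Mount D", "time": "2019-10-01"}}

def match_attribute_info(multimedia_id: str = "", multimedia_link: str = "") -> dict:
    """Look up attribute information by identifier first, then by link."""
    if multimedia_id and multimedia_id in attribute_db_by_id:
        return attribute_db_by_id[multimedia_id]
    if multimedia_link and multimedia_link in attribute_db_by_link:
        return attribute_db_by_link[multimedia_link]
    return {}  # nothing stored; the server may fall back to feature extraction (see below)
```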
In some application scenarios, the information of the target multimedia and the attribute information corresponding to the target multimedia are stored in other electronic devices, and the server may obtain the information of the target multimedia and the attribute information corresponding to the target multimedia from the other electronic devices through a pre-established communication connection.
In some optional implementation manners of this embodiment, the server may further determine attribute information of the target multimedia by:
first, feature information is extracted from the target multimedia.
The following description takes the case in which the target multimedia is a picture as an example. The server can extract feature information from the picture by various picture feature extraction methods, for example: a feature extraction method using the Histogram of Oriented Gradients (HOG) algorithm, a feature extraction method using the Local Binary Pattern (LBP) algorithm, a feature extraction method using the Haar-like algorithm, and the like.
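Before turning to the second step, the sketch below illustrates this feature extraction step using the HOG and LBP implementations in scikit-image. The particular parameter values and the way the two feature sets are concatenated are choices made only for this example, not the parameters used by the disclosure.

```python
import numpy as np
from skimage import color, io
from skimage.feature import hog, local_binary_pattern

def extract_picture_features(picture_path: str) -> np.ndarray:
    """Extract a simple feature vector from a picture by combining HOG and LBP features."""
    image = io.imread(picture_path)
    gray = color.rgb2gray(image) if image.ndim == 3 else image
    hog_features = hog(gray, orientations=9, pixels_per_cell=(16, 16), cells_per_block=(2, 2))
    lbp = local_binary_pattern(gray, P=8, R=1.0, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, density=True)
    return np.concatenate([hog_features, lbp_hist])
```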
And secondly, determining attribute information corresponding to the target multimedia according to the characteristic information.
For example, the server may compare the feature information of the picture with feature information of pictures whose position information is known, so that the server can determine the position corresponding to the object in the picture. Illustratively, a picture including a mountain D may be sent to the server. The server extracts feature information E from the picture, compares the feature information E with a feature information library covering a plurality of mountains, and determines the name of mountain D according to the comparison result.
For another example, the season reflected by the picture may be determined from the color information (for example, the color of plants) in the picture feature information E.
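A minimal sketch of how the extracted feature information E might be compared against such a feature information library is given below. Nearest-neighbour matching by Euclidean distance and the distance threshold are assumptions for this sketch, not the comparison actually claimed.

```python
import numpy as np

def match_against_library(feature_e: np.ndarray, feature_library: dict) -> str:
    """Compare feature information E with a library of feature vectors of known mountains.

    Returns the name whose stored feature vector is closest; the library contents and
    the threshold of 10.0 are illustrative assumptions.
    """
    best_name, best_distance = "unknown", float("inf")
    for name, stored_features in feature_library.items():
        distance = float(np.linalg.norm(feature_e - stored_features))
        if distance < best_distance:
            best_name, best_distance = name, distance
    return best_name if best_distance < 10.0 else "unknown"
```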
The feature information library may include feature information extracted from pictures of a plurality of mountains with known names.
Step 203, acquiring the attribute information of the target multimedia selected by the user.
Step 204, showing attribute selection items, wherein the attribute selection items comprise position attributes and/or time attributes and at least one attribute value corresponding to each attribute.
Step 205, according to the selection operation of the attribute selection item by the user, determining the target attribute selected by the user and the target attribute value corresponding to the target attribute.
Step 206, displaying the fusion information of the target multimedia and the target attribute value.
The above steps 203 to 206 can refer to the detailed description of the embodiment shown in fig. 1, and are not repeated herein.
Compared with the embodiment shown in fig. 1, the method provided by this embodiment emphasizes that, after the user selects the target multimedia on the terminal electronic device, the terminal can send the information of the target multimedia to the server, and the server determines the attribute information matching the target multimedia. On the one hand, this reduces the amount of computation the terminal electronic device spends on determining the attribute information of the target multimedia; on the other hand, it can increase the accuracy of the determined attribute information, further improving user experience.
In some optional implementations of the embodiments of the information processing method of the present disclosure, before step 101 of the embodiment shown in fig. 1 and before step 201 of the embodiment shown in fig. 2, the information processing method may further include: in response to receiving a selection operation executed by the user on the target multimedia, presenting a selection item for prompting the user whether to add attribute information.
In these application scenarios, by prompting the user whether to add attribute information to the multimedia, the user is guided to complete the interactive operation of adding attribute information, so that attribute information matching the target multimedia is added to it.
With further reference to fig. 3, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an information processing apparatus, which corresponds to the embodiment of the method shown in fig. 1, and which is particularly applicable in various electronic devices.
As shown in fig. 3, the information processing apparatus of the present embodiment includes: an acquiring unit 301, a first presentation unit 302, a determining unit 303, and a second presentation unit 304. The acquiring unit 301 is configured to acquire attribute information of a target multimedia selected by a user; the first presentation unit 302 is configured to present attribute selection items, wherein the attribute selection items comprise a position attribute and/or a time attribute and at least one attribute value corresponding to each attribute, and the position attribute comprises a current position attribute for characterizing the current position of the user, a historical position attribute for characterizing the position of the user when the target multimedia was produced, and a multimedia position attribute for characterizing a position matching the information expressed by the target multimedia; the determining unit 303 is configured to determine, according to the user's selection operation on the attribute selection items, the target attribute selected by the user and a target attribute value corresponding to the target attribute; and the second presentation unit 304 is configured to display the fusion information of the target multimedia and the target attribute value.
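The sketch below shows one way the four units could be wired together in code. It is an illustrative arrangement under the assumption that each unit is a callable; it is not the actual implementation of the apparatus.

```python
class InformationProcessingApparatus:
    """Illustrative arrangement of the four units described above (assumed callables)."""

    def __init__(self, acquiring_unit, first_presentation_unit, determining_unit, second_presentation_unit):
        self.acquiring_unit = acquiring_unit                        # 301: acquires attribute info of the target multimedia
        self.first_presentation_unit = first_presentation_unit     # 302: presents the attribute selection items
        self.determining_unit = determining_unit                   # 303: determines the target attribute and its value
        self.second_presentation_unit = second_presentation_unit   # 304: displays the fused information

    def process(self, target_multimedia):
        attributes = self.acquiring_unit(target_multimedia)
        selection_operation = self.first_presentation_unit(attributes)
        target_attribute, target_value = self.determining_unit(attributes, selection_operation)
        return self.second_presentation_unit(target_multimedia, target_value)
```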
In this embodiment, specific processing of the obtaining unit 301, the first presenting unit 302, the determining unit 303, and the second presenting unit 304 of the information processing apparatus and technical effects thereof may refer to related descriptions of step 101, step 102, step 103, and step 104 in the corresponding embodiment of fig. 1, which are not repeated herein.
In some optional implementation manners of this embodiment, the obtaining unit 301 is further configured to: responding to the selection operation executed by the user on the target multimedia, and sending a request for acquiring the attribute information of the target multimedia to a server, wherein the request comprises the information of the target multimedia; and receiving attribute information of the target multimedia, which is sent by the server and determined according to the information of the target multimedia.
In some optional implementations of this embodiment, the attribute information of the target multimedia is determined by the server according to the following steps: extracting characteristic information from the target multimedia; and determining attribute information corresponding to the target multimedia according to the characteristic information.
In some optional implementation manners of this embodiment, the information processing apparatus further includes a third presentation unit (not shown in the figure), and the third presentation unit is configured to: and before the acquisition unit acquires the attribute information of the target multimedia selected by the user, in response to receiving the selection operation of the user on the target multimedia information, displaying a selection item for prompting the user whether to add the attribute information.
In some optional implementations of this embodiment, the attribute selection item includes a time attribute; and the first presentation unit 302 is further configured to: displaying the time attribute and at least one time attribute value corresponding to the time attribute, wherein the at least one time attribute value comprises: the current time, the time corresponding to the target multimedia and/or the time determined by the characteristics extracted from the target multimedia.
Referring to fig. 4, fig. 4 illustrates an exemplary system architecture to which the information processing method of one embodiment of the present disclosure may be applied.
As shown in fig. 4, the system architecture may include terminal devices 401, 402, 403, a network 404, and a server 405. The network 404 serves as a medium for providing communication links between the terminal devices 401, 402, 403 and the server 405. Network 404 may include various types of connections, such as wire, wireless communication links, or fiber optic cables, to name a few.
The terminal devices 401, 402, 403 may interact with a server 405 over a network 404 to receive or send messages or the like. The terminal devices 401, 402, 403 may have various client applications installed thereon, such as a web browser application, a search-type application, and a news-information-type application. The client application in the terminal device 401, 402, 403 may receive the instruction of the user, and complete the corresponding function according to the instruction of the user, for example, add the corresponding information to the information according to the instruction of the user.
The terminal devices 401, 402, and 403 may be hardware or software. When they are hardware, they may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III), MP4 players (Moving Picture Experts Group Audio Layer IV), laptop portable computers, desktop computers, and the like. When the terminal devices 401, 402, and 403 are software, they can be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (e.g., software or software modules used to provide distributed services) or as a single piece of software or software module. This is not specifically limited herein.
The server 405 may be a server providing various services, for example, receiving an information acquisition request sent by the terminal devices 401, 402, and 403, obtaining, according to the request, the presentation information corresponding to it in various ways, and sending the relevant data of the presentation information to the terminal devices 401, 402, 403.
It should be noted that the information processing method provided by the embodiment of the present disclosure may be executed by a terminal device, and accordingly, the information processing apparatus may be provided in the terminal devices 401, 402, and 403.
It should be understood that the number of terminal devices, networks, and servers in fig. 4 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to fig. 5, shown is a schematic diagram of an electronic device (e.g., the terminal device of fig. 4) suitable for use in implementing embodiments of the present disclosure. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the electronic device may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501 that may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the electronic device 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
Generally, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 507 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage devices 508 including, for example, magnetic tape, hard disk, etc.; and a communication device 509. The communication means 509 may allow the electronic device to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program carried on a non-transitory computer readable medium, the computer program containing program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 509, or installed from the storage means 508, or installed from the ROM 502. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing device 501.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire attribute information of a target multimedia selected by a user; show attribute selection items, wherein the attribute selection items comprise a position attribute and/or a time attribute and at least one attribute value corresponding to each attribute, and the position attribute comprises a current position attribute for characterizing the current position of the user, a historical position attribute for characterizing the position of the user when the target multimedia was produced, and a multimedia position attribute for characterizing a position matching the information expressed by the target multimedia; determine, according to the user's selection operation on the attribute selection items, a target attribute selected by the user and a target attribute value corresponding to the target attribute; and display the fusion information of the target multimedia and the target attribute value.
Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including but not limited to object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Here, the name of the unit does not constitute a limitation to the unit itself in some cases, and for example, the acquiring unit may also be described as a "unit that acquires attribute information of the target multimedia selected by the user".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the particular combination of the above features, but also encompasses other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions formed by replacing the above features with (but not limited to) features with similar functions disclosed in this disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (11)

1. An information processing method characterized by comprising:
acquiring attribute information of a target multimedia selected by a user;
showing attribute selection items, wherein the attribute selection items comprise position attributes and/or time attributes and at least one attribute value corresponding to each attribute; the position attributes comprise a current position attribute used for representing the current position of the user, a historical position attribute used for representing the position of the user when the target multimedia is made, and a multimedia position attribute used for representing the position matched with the information expressed by the target multimedia;
according to the selection operation of the user on the attribute selection item, determining a target attribute selected by the user and a target attribute value corresponding to the target attribute;
and displaying the fusion information of the target multimedia and the target attribute value.
2. The method of claim 1, wherein the obtaining of the attribute information of the target multimedia selected by the user comprises:
responding to the selection operation executed by the user on the target multimedia, and sending a request for acquiring the attribute information of the target multimedia to a server, wherein the request comprises the information of the target multimedia;
and receiving attribute information of the target multimedia, which is sent by the server and determined according to the information of the target multimedia.
3. The method of claim 2, wherein the attribute information of the target multimedia is determined by the server according to the following steps:
extracting characteristic information from the target multimedia;
and determining attribute information corresponding to the target multimedia according to the characteristic information.
4. The method according to claim 1, wherein before said obtaining the attribute information of the target multimedia selected by the user, the method further comprises:
and in response to receiving a selection operation executed by the user on the target multimedia information, presenting a selection item for prompting the user whether to add the attribute information.
5. The method of any of claims 1-4, wherein the attribute selection items comprise a time attribute; and
the showing attribute selection items comprises:
showing the time attribute and at least one time attribute value corresponding to the time attribute, wherein the at least one time attribute value comprises: the current time, the time corresponding to the target multimedia, and/or the time determined by the characteristics extracted from the target multimedia.
6. An information processing apparatus characterized by comprising:
the acquiring unit is used for acquiring the attribute information of the target multimedia selected by the user;
the first presentation unit is used for presenting an attribute selection item, wherein the attribute selection item comprises a position attribute and/or a time attribute and at least one attribute value corresponding to each attribute, and the position attribute comprises a current position attribute used for representing the current position of a user, a historical position attribute used for representing the position of the user when the target multimedia is made and a multimedia position attribute used for representing the position matched with information expressed by the target multimedia;
the determining unit is used for determining the target attribute selected by the user and the target attribute value corresponding to the target attribute according to the selection operation of the user on the attribute selection item;
and the second display unit is used for displaying the fusion information of the target multimedia and the target attribute value.
7. The apparatus of claim 6, wherein the obtaining unit is further configured to:
responding to the selection operation executed by the user on the target multimedia, and sending a request for acquiring the attribute information of the target multimedia to a server, wherein the request comprises the information of the target multimedia;
and receiving attribute information of the target multimedia, which is sent by the server and determined according to the information of the target multimedia.
8. The apparatus of claim 6, further comprising a third presentation unit, the third presentation unit configured to:
and before the acquisition unit acquires the attribute information of the target multimedia selected by the user, in response to receiving the selection operation of the user on the target multimedia information, displaying a selection item for prompting the user whether to add the attribute information.
9. The apparatus according to any of claims 6-8, wherein the attribute selection items comprise a time attribute; and
the first presentation unit is further configured to:
show the time attribute and at least one time attribute value corresponding to the time attribute, wherein the at least one time attribute value comprises: the current time, the time corresponding to the target multimedia, and/or the time determined by the characteristics extracted from the target multimedia.
10. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
11. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-5.
CN202010020764.2A 2020-01-09 2020-01-09 Information processing method and device and electronic equipment Pending CN112306976A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010020764.2A CN112306976A (en) 2020-01-09 2020-01-09 Information processing method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN112306976A true CN112306976A (en) 2021-02-02

Family

ID=74336674

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010020764.2A Pending CN112306976A (en) 2020-01-09 2020-01-09 Information processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN112306976A (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102307284A (en) * 2011-03-18 2012-01-04 海尔集团公司 Processing method and system for simultaneously displaying television program and menu
CN103874997A (en) * 2011-09-29 2014-06-18 三星电子株式会社 Apparatus and method for generating and retrieving location-tagged content in computing device
CN102722900A (en) * 2012-06-05 2012-10-10 深圳市中兴移动通信有限公司 Method and device for automatically adding introduction information to shot picture/video
US20140222865A1 (en) * 2013-01-29 2014-08-07 Michael William Casey Method, System and Program for Interactive Information Services
CN105843887A (en) * 2016-03-21 2016-08-10 联想(北京)有限公司 Information processing method and electronic device
CN105808782A (en) * 2016-03-31 2016-07-27 广东小天才科技有限公司 Picture label adding method and device
CN106648327A (en) * 2016-12-29 2017-05-10 北京珠穆朗玛移动通信有限公司 Picture processing method and mobile terminal
CN110110014A (en) * 2017-12-26 2019-08-09 阿里巴巴集团控股有限公司 Determination method, server and the user client of the location information of target object
CN108470055A (en) * 2018-03-15 2018-08-31 维沃移动通信有限公司 A kind of display methods and mobile terminal of text message
CN108600656A (en) * 2018-04-19 2018-09-28 北京深醒科技有限公司 The method and device of facial label is added in video
CN109325213A (en) * 2018-09-30 2019-02-12 北京字节跳动网络技术有限公司 Method and apparatus for labeled data
CN109542379A (en) * 2018-11-22 2019-03-29 联想(北京)有限公司 Content displaying method and device, computer system and computer readable storage medium
CN110460881A (en) * 2019-08-01 2019-11-15 广州虎牙科技有限公司 Management method, device, computer equipment and the storage medium of attribute tags

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115357630A (en) * 2022-10-24 2022-11-18 北京国电通网络技术有限公司 Information detection method, apparatus, device, computer readable medium and program product
CN115357630B (en) * 2022-10-24 2023-01-17 北京国电通网络技术有限公司 Information detection method, apparatus, device, computer readable medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination