CN111611430A - Song playing method, device, terminal and storage medium - Google Patents

Song playing method, device, terminal and storage medium Download PDF

Info

Publication number
CN111611430A
CN111611430A
Authority
CN
China
Prior art keywords
song
target
playing
target song
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010456609.5A
Other languages
Chinese (zh)
Inventor
杨亚斌
许盛灿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Kugou Computer Technology Co Ltd
Original Assignee
Guangzhou Kugou Computer Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Kugou Computer Technology Co Ltd filed Critical Guangzhou Kugou Computer Technology Co Ltd
Priority to CN202010456609.5A
Publication of CN111611430A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/64Browsing; Visualisation therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/63Querying
    • G06F16/638Presentation of query results

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the application provide a song playing method, device, terminal, and storage medium. The method includes the following steps: when a target song is played, displaying a song playing page, where the song playing page includes a dynamic effect mode option used to trigger playing of the target song in a dynamic effect mode; when a first selection signal corresponding to the dynamic effect mode option is received, acquiring at least one image matched with the target song; and displaying the at least one image matched with the target song on the song playing page. The technical scheme provides a novel song playing mode (namely, the dynamic effect mode), so that a user can view images matched with a song while enjoying it, which enriches the playing modes of songs.

Description

Song playing method, device, terminal and storage medium
Technical Field
The embodiments of the application relate to the field of computer technology, and in particular, to a song playing method, device, terminal, and storage medium.
Background
At present, most terminals provide a song playing function, and a user can enjoy songs through the song playing function provided by the terminal.
In the related art, a terminal provides the following two play modes when playing a song: an album mode and a lyric mode. In the album mode, the terminal displays, on a song playing page, album information of the album to which the song being played belongs, such as the album title and the singer's name. In the lyric mode, the terminal displays the lyrics of the song being played on the song playing page.
In the related art, the play modes provided by the terminal are thus limited.
Disclosure of Invention
The embodiments of the application provide a song playing method, device, terminal, and storage medium, which can enrich the playing modes of songs. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides a song playing method, where the method includes:
when a target song is played, a song playing page is displayed, the song playing page comprises a dynamic effect mode option, and the dynamic effect mode option is used for triggering the target song to be played in a dynamic effect mode;
when a first selection signal corresponding to the dynamic effect mode option is received, acquiring at least one image matched with the target song;
and displaying the at least one image matched with the target song on the song playing page.
In another aspect, an embodiment of the present application provides a song playback apparatus, where the apparatus includes:
the system comprises a page display module, a song playing module and a song playing module, wherein the page display module is used for displaying a song playing page when a target song is played, the song playing page comprises a dynamic effect mode option, and the dynamic effect mode option is used for triggering the target song to be played in a dynamic effect mode;
the image acquisition module is used for acquiring at least one image matched with the target song when a first selection signal corresponding to the dynamic effect mode option is received;
and the image display module is used for displaying the at least one image matched with the target song on the song playing page.
In yet another aspect, an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory, where the memory stores a computer program, and the computer program is loaded and executed by the processor to implement the song playing method according to the first aspect.
In still another aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and the computer program is loaded and executed by a processor to implement the song playing method according to the first aspect.
The technical scheme provided by the embodiments of the application brings at least the following beneficial effects:
When a song is played, images matched with the song being played are acquired and then displayed in sequence on the song playing page. This provides a novel song playing mode (namely, the dynamic effect mode), so that the user can view images matched with the song while enjoying it, enriching the playing modes of songs.
Drawings
FIG. 1 is a flow diagram illustrating a song playback method according to an exemplary embodiment of the present application;
FIG. 2 is a schematic view of an interface according to the embodiment of FIG. 1;
FIG. 3 is another interface diagram according to the embodiment of FIG. 1;
FIG. 4 is a flow chart of a song playback method shown in another exemplary embodiment of the present application;
FIG. 5 is a flow chart of a song playback method shown in another exemplary embodiment of the present application;
FIG. 6 is a block diagram of a song playback device shown in one exemplary embodiment of the present application;
fig. 7 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The embodiments of the application provide a novel song playing mode (namely, the dynamic effect mode): when a song is played, a plurality of images matched with the song being played are acquired and then displayed in sequence on the song playing page, so that the user can view images matched with the song while enjoying it, enriching the playing modes of songs.
According to the technical scheme provided by the embodiments of the application, the execution subject of each step may be a terminal with a song playing function, such as a mobile phone, a tablet computer, a Personal Computer (PC), a Moving Picture Experts Group Audio Layer III (MP3) player, or a Moving Picture Experts Group Audio Layer IV (MP4) player. Optionally, an application having a song playing function is installed in the terminal, and the execution subject of each step is the application. In the embodiments of the present application, only the case where the execution subject of each step is the terminal is described as an example.
Referring to fig. 1, a flowchart of a song playing method according to an embodiment of the present application is shown. The method comprises the following steps:
Step 101, when the target song is played, displaying a song playing page.
The target song is any song, either selected by the terminal by default or custom-selected by the user. When the user wants to play a certain song, the user clicks the name of that song, and it is played as the target song.
The song playing page includes a dynamic effect mode option, which is used to trigger playing of the target song in a dynamic effect mode. The dynamic effect mode is a mode in which, when a song is played, images matched with the song being played are displayed on the song playing page with animations, special effects, and the like.
Optionally, the song play page also provides other play mode options, such as a lyric mode option, an album mode option, and the like. The lyric mode is a mode in which, when a song is played, lyrics of the song being played are shown on a song playing page. The album mode is a mode in which, when a song is played, album information to which the song being played belongs is displayed on a song playing page.
Referring collectively to fig. 2, an interface diagram of a song playback page provided by one embodiment of the present application is shown. The song play page 21 includes a dynamic effect mode option 211, a lyric mode option 212, and an album mode option 213.
Step 102, when a first selection signal corresponding to the dynamic effect mode option is received, acquiring at least one image matched with the target song.
Optionally, the first selection signal is any one of the following: a single-click operation signal, a double-click operation signal, a long-press operation signal, a sliding operation signal, or a dragging operation signal acting on the dynamic effect mode option; a sliding operation signal acting on the song playing page; and the like. When the user wants to play the target song in the dynamic effect mode, the user touches the dynamic effect mode option to select it.
The at least one image matched with the target song may be at least one of the following: photos of the singer of the target song, landscape pictures of the place depicted by the target song, pictures custom-uploaded by the user, and the like.
In some embodiments, the terminal first detects whether a picture custom-uploaded by the user exists. If so, the custom-uploaded picture is determined as the at least one image matched with the target song. If not, the terminal detects whether a landscape picture of the place depicted by the target song exists; if so, that landscape picture is determined as the at least one image matched with the target song, and if not, a photo of the singer of the target song is acquired as the at least one image matched with the target song.
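The fallback order described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; the `Song` type and the three lookup tables are hypothetical stand-ins for whatever storage the terminal actually queries.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Song:
    song_id: str
    singer: str
    place: Optional[str] = None  # place depicted by the song, if any

def get_matched_images(song, user_uploads, landscape_pictures, singer_photos):
    """Pick images for the dynamic effect mode using the fallback order
    from the description: custom-uploaded pictures first, then landscape
    pictures of the depicted place, then photos of the singer."""
    if user_uploads.get(song.song_id):
        return user_uploads[song.song_id]
    if song.place and landscape_pictures.get(song.place):
        return landscape_pictures[song.place]
    return singer_photos.get(song.singer, [])
```

For example, with no custom upload and no landscape picture on file, the singer's photos are returned.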
Step 103, displaying the at least one image matched with the target song on the song playing page.
In the dynamic effect mode, the terminal successively displays the at least one image matched with the target song on the song playing page, so that the user can view images matched with the song while enjoying it, which enriches the playing modes of songs. Referring to fig. 2 in combination, a schematic diagram of an interface for playing a song in the dynamic effect mode according to an embodiment of the present application is shown. In the dynamic effect mode, when playing the song "Want You", the terminal displays image 3 at a certain beat.
Optionally, step 103 is implemented as the following sub-steps: acquiring beat information of a target song; and displaying at least one image matched with the target song on the song playing page according to the beat information of the target song.
The beat information of the target song carries the beats of the target song. A beat is a basic unit for measuring rhythm; it organizes units of fixed duration and accent in a piece of music. Optionally, the terminal stores a correspondence between beats and images, determines the image to display at each beat by querying the correspondence, and then displays the images beat by beat in the determined display order.
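The correspondence lookup described above can be sketched as follows. The behavior for beats without a stored entry is an assumption (the description does not specify it), so this sketch cycles through the matched images as a fallback:

```python
def image_for_beat(beat_index, matched_images, correspondence=None):
    """Return the image to display at the given beat. An explicit
    beat->image correspondence, if stored, takes priority; otherwise
    the matched images are cycled through in order."""
    if correspondence and beat_index in correspondence:
        return correspondence[beat_index]
    return matched_images[beat_index % len(matched_images)]
```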
Optionally, step 103 is implemented as the following sub-steps: acquiring a target special effect type; and displaying at least one image matched with the target song on the song playing page according to the target special effect type.
In the dynamic effect mode, the terminal renders the at least one image matched with the target song according to the target special effect type, and then displays the rendered at least one image on the song playing page.
The target special effect type is either a special effect type set by the terminal by default or a special effect type custom-selected by the user. Optionally, the user custom-selects the target special effect type as follows: the terminal displays a special effect type list including a plurality of special effect types; receives a second selection signal corresponding to a target special effect type among the plurality of special effect types; and acquires the target special effect type according to the second selection signal.
The plurality of special effect types include, but are not limited to: smoke, black-and-white film, streamer, feathers, fireworks, and fish filters. The second selection signal is any one of: a single-click operation signal, a double-click operation signal, a long-press operation signal, a sliding operation signal, and a dragging operation signal. In the embodiments of the present application, only the case where the second selection signal is a single-click operation signal is described as an example.
With reference to fig. 3, a schematic diagram of an interface for selecting a target special effect type according to an embodiment of the present application is shown. The song playing page 31 displayed by the terminal includes a special effect type list, the special effect type list includes a special effect 1, a special effect 2 and a special effect 3, the user clicks the special effect 3, the terminal receives a click operation signal acting on the special effect 3 at this time, and the special effect 3 is determined as a target special effect type.
In other possible implementations, the terminal receives a sliding operation signal acting in the picture display window and switches the currently used special effect type according to the sliding operation signal. The switching order is determined by the order of the special effect types in the special effect type list, or is determined randomly.
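Both switching orders described above (list order and random) can be sketched as follows. The effect names come from the list earlier in this description; the function itself is illustrative, not the patent's implementation:

```python
import random

EFFECT_TYPES = ["smoke", "black-and-white film", "streamer",
                "feathers", "fireworks", "fish filter"]

def switch_effect(current, effects=EFFECT_TYPES, random_order=False):
    """On a sliding operation, advance to the next special effect type,
    either in list order (wrapping around) or at random."""
    if random_order:
        return random.choice([e for e in effects if e != current])
    return effects[(effects.index(current) + 1) % len(effects)]
```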
In summary, according to the technical scheme provided by the embodiments of the application, when a song is played, a plurality of images matched with the song being played are acquired and then displayed in sequence on the song playing page. This provides a novel song playing mode (namely, the dynamic effect mode), so that the user can view images matched with the song while enjoying it, enriching the playing modes of songs.
Referring to fig. 4, a flowchart of a song playing method according to an embodiment of the present application is shown. The method comprises the following steps:
Step 401, when the target song is played, displaying a song playing page.
The song playing page comprises a dynamic effect mode option which is used for triggering the target song to be played in a dynamic effect mode.
Step 402, when a first selection signal corresponding to the dynamic effect mode option is received, acquiring at least one image matched with the target song.
Step 403, obtaining beat information of the target song.
Step 404, obtaining a target special effect type.
The execution sequence of steps 403 and 404 is not limited in the embodiment of the present application. In one implementation, the terminal performs step 403 first, and then performs step 404; in another implementation, the terminal performs step 404 first, and then performs step 403; in yet another implementation, the terminal performs step 403 and step 404 simultaneously.
Step 405, displaying the at least one image matched with the target song on the song playing page according to the beat information of the target song and the target special effect type.
In the embodiments of the application, in the dynamic effect mode, the terminal renders the at least one image matched with the target song using the target special effect type, and then displays the rendered images as each beat of the target song is played.
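Steps 403-405 can be sketched as a two-stage pipeline: render every matched image with the target effect once, then schedule one rendered image per beat. The string concatenation stands in for actual image rendering and is purely illustrative:

```python
def schedule_frames(beat_times, matched_images, effect_type):
    """Render the matched images with the target special effect type,
    then pair each beat time with the next rendered image in turn."""
    rendered = [f"{image}+{effect_type}" for image in matched_images]
    return [(t, rendered[i % len(rendered)])
            for i, t in enumerate(beat_times)]
```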
Step 406, receiving a pause playing instruction corresponding to the target song.
The pause playing instruction is used to trigger pausing the playing of the target song. Optionally, the song playing page further includes a play control. When the target song is being played, if a trigger signal corresponding to the play control is received, the terminal receives the pause playing instruction and pauses the target song accordingly. In addition, when the target song is paused, if a trigger signal corresponding to the play control is received, the terminal receives a resume playing instruction and continues playing the target song from the progress at which it was paused.
Step 407, pausing the display of the at least one image matched with the target song on the song playing page according to the pause playing instruction.
In the embodiments of the application, in the dynamic effect mode, the terminal plays the target song while tracking its playing progress. When the terminal receives the pause playing instruction, it pauses the playing of the target song and pauses the display of the at least one image matched with the target song, so that the playing progress of the target song stays synchronized with the image display progress.
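The single play control and the audio/image synchronization described in steps 406-407 can be sketched as a small state machine. This is illustrative only; the class and field names are hypothetical:

```python
class DynamicEffectPlayer:
    """The same play control toggles between pause and resume, and image
    display is paused and resumed together with the audio so that the
    playing progress and the image display progress stay synchronized."""

    def __init__(self):
        self.playing = False
        self.progress = 0.0           # seconds into the target song
        self.showing_images = False   # dynamic effect display state

    def on_play_control(self):
        self.playing = not self.playing
        self.showing_images = self.playing  # keep audio and images in sync

    def tick(self, dt):
        if self.playing:
            self.progress += dt       # progress advances only while playing
```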
Step 408, when the target song finishes playing, stopping the display of the at least one image matched with the target song on the song playing page.
In the embodiments of the application, in the dynamic effect mode, the terminal plays the target song while tracking its playing progress. When the target song finishes playing, the terminal stops playing the target song and stops displaying the at least one image matched with the target song, so that the playing progress of the target song stays synchronized with the image display progress.
Referring to fig. 5 in combination, a flowchart of a song playing method according to another embodiment of the present application is shown. The method may comprise the steps of:
Step 501, playing the target song.
Step 502, drum point information (i.e. beat information) of the target song is obtained.
Step 503, obtaining the target special effect type.
Step 504, detect whether the target song is in a playing state.
If the target song is in the playing state, steps 505 to 507 are executed; if the target song is not in the playing state, step 508 is executed.
Step 505, acquiring the current playing progress and the current beat information.
Step 506, rendering the image matched with the target song according to the target special effect type.
Step 507, displaying the rendered image matched with the target song.
Step 508, pausing or stopping the rendering of the image matched with the target song according to the target special effect type.
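One pass of the fig. 5 loop (steps 504-508) can be sketched as follows. The `+` concatenation again stands in for rendering with the target effect, and returning `None` models pausing or stopping the rendering:

```python
def render_pass(is_playing, current_beat, matched_images, effect_type):
    """Step 504: check the playing state. If playing, pick the image for
    the current beat (step 505), render it with the target effect
    (step 506), and return it for display (step 507). If not playing,
    skip rendering (step 508)."""
    if not is_playing:
        return None
    image = matched_images[current_beat % len(matched_images)]
    return f"{image}+{effect_type}"
```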
In summary, according to the technical scheme provided by the embodiments of the application, when a song is played, a plurality of images matched with the song being played are acquired and then displayed in sequence on the song playing page. This provides a novel song playing mode (namely, the dynamic effect mode), so that the user can view images matched with the song while enjoying it, enriching the playing modes of songs.
In the following, embodiments of the apparatus of the present application are described, and for portions of the embodiments of the apparatus not described in detail, reference may be made to technical details disclosed in the above-mentioned method embodiments.
Referring to fig. 6, a block diagram of a song playback apparatus provided in an exemplary embodiment of the present application is shown. The song playback apparatus may be implemented as all or a part of the terminal by software, hardware, or a combination of both. The song playback apparatus includes:
the page display module 601 is configured to display a song playing page when a target song is played, where the song playing page includes a dynamic effect mode option, and the dynamic effect mode option is used to trigger the target song to be played in a dynamic effect mode.
An image obtaining module 602, configured to acquire at least one image matched with the target song when a first selection signal corresponding to the dynamic effect mode option is received.
An image display module 603, configured to display the at least one image matched with the target song on the song playing page.
In summary, according to the technical scheme provided by the embodiments of the application, when a song is played, a plurality of images matched with the song being played are acquired and then displayed in sequence on the song playing page. This provides a novel song playing mode (namely, the dynamic effect mode), so that the user can view images matched with the song while enjoying it, enriching the playing modes of songs.
In an alternative embodiment provided based on the embodiment shown in fig. 6, the image presentation module 603 is configured to:
acquiring beat information of the target song;
and displaying the at least one image matched with the target song on the song playing page according to the beat information of the target song.
In an alternative embodiment provided based on the embodiment shown in fig. 6, the image presentation module 603 is configured to:
acquiring a target special effect type;
and displaying the at least one image matched with the target song on the song playing page according to the target special effect type.
Optionally, the image display module 603 is configured to:
displaying a special effect type list, wherein the special effect type list comprises a plurality of special effect types;
receiving a second selection signal corresponding to a target special effect type among the plurality of special effect types;
and acquiring the target special effect type according to the second selection signal.
In an optional embodiment provided based on the embodiment shown in fig. 6, the apparatus further comprises: an instruction receiving module (not shown in fig. 6).
And the instruction receiving module is used for receiving a pause playing instruction corresponding to the target song.
The image display module 603 is further configured to pause displaying the at least one image matched with the target song on the song playing page according to the pause playing instruction.
In an optional embodiment provided based on the embodiment shown in fig. 6, the image presentation module 603 is further configured to stop presenting the at least one image matching the target song on the song playing page when the target song finishes playing.
It should be noted that, when the apparatus provided in the foregoing embodiments implements its functions, the division into the above functional modules is merely illustrative; in practical applications, the functions may be assigned to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus embodiments and the method embodiments provided above belong to the same concept; for their specific implementation processes, refer to the method embodiments, and details are not repeated here.
Fig. 7 shows a block diagram of a terminal 700 according to an exemplary embodiment of the present application. The terminal 700 may be: a smartphone, a tablet, an MP3 player, an MP4 player, a laptop, or a desktop computer. Terminal 700 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and so on.
In general, terminal 700 includes: a processor 701 and a memory 702.
The processor 701 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 701 may be implemented in at least one hardware form of a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA). The processor 701 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also called a Central Processing Unit (CPU), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 701 may be integrated with a Graphics Processing Unit (GPU), which is responsible for rendering and drawing the content that the display screen needs to display.
Memory 702 may include one or more computer-readable storage media, which may be non-transitory. Memory 702 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 702 is used to store a computer program for execution by processor 701 to implement a song playback method provided by method embodiments herein.
In some embodiments, the terminal 700 may further optionally include: a peripheral interface 703 and at least one peripheral device. The processor 701, the memory 702, and the peripheral interface 703 may be connected by buses or signal lines. Each peripheral device may be connected to the peripheral interface 703 via a bus, a signal line, or a circuit board. Specifically, the peripheral devices include: at least one of a radio frequency circuit 704, a touch display screen 705, a camera assembly 706, an audio circuit 707, a positioning component 708, and a power source 709.
The peripheral interface 703 may be used to connect at least one Input/Output (I/O) related peripheral to the processor 701 and the memory 702. In some embodiments, processor 701, memory 702, and peripheral interface 703 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 701, the memory 702, and the peripheral interface 703 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 704 is used to receive and transmit Radio Frequency (RF) signals, also called electromagnetic signals. The radio frequency circuit 704 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 704 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 704 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 704 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the World Wide Web, metropolitan area networks, intranets, the various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or Wireless Fidelity (WiFi) networks. In some embodiments, the radio frequency circuit 704 may also include Near Field Communication (NFC) related circuitry, which is not limited in this application.
The display screen 705 is used to display a User Interface (UI). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 705 is a touch display screen, it also has the ability to capture touch signals on or above its surface. A touch signal may be input to the processor 701 as a control signal for processing. In this case, the display screen 705 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 705, disposed on the front panel of the terminal 700; in other embodiments, there may be at least two display screens 705, respectively disposed on different surfaces of the terminal 700 or in a folded design; in still other embodiments, the display screen 705 may be a flexible display disposed on a curved or folded surface of the terminal 700. The display screen 705 may even be arranged in a non-rectangular irregular pattern, that is, a shaped screen. The display screen 705 may be made of materials such as a Liquid Crystal Display (LCD) or an Organic Light-Emitting Diode (OLED).
The camera assembly 706 is used to capture images or video. Optionally, the camera assembly 706 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, Virtual Reality (VR) shooting, or other fused shooting functions. In some embodiments, the camera assembly 706 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 707 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert them into electrical signals, and input them to the processor 701 for processing, or to the radio frequency circuit 704 to realize voice communication. For stereo collection or noise reduction, multiple microphones may be provided at different parts of the terminal 700. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 701 or the radio frequency circuit 704 into sound waves. The speaker can be a traditional diaphragm speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert an electrical signal into sound waves audible to humans, or into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 707 may also include a headphone jack.
The positioning component 708 is used to locate the current geographic location of the terminal 700 for navigation or Location Based Services (LBS). The positioning component 708 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 709 is used to supply power to the various components of the terminal 700. The power supply 709 may be an alternating current supply, a direct current supply, a disposable battery, or a rechargeable battery. When the power supply 709 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. A wired rechargeable battery is a battery charged through a wired line, and a wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also support fast-charge technology.
In some embodiments, the terminal 700 further includes one or more sensors 710. The one or more sensors 710 include, but are not limited to: an acceleration sensor 711, a gyro sensor 712, a pressure sensor 713, a fingerprint sensor 714, an optical sensor 715, and a proximity sensor 716.
The acceleration sensor 711 can detect the magnitude of acceleration on the three coordinate axes of a coordinate system established with the terminal 700. For example, the acceleration sensor 711 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 701 may control the touch display 705 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 711. The acceleration sensor 711 may also be used to collect motion data of a game or of the user.
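The landscape/portrait decision described above can be sketched as follows. This is only an illustration of the idea, not code from the patent: the function name, axis convention, and comparison rule are assumptions.

```python
def choose_orientation(gx: float, gy: float) -> str:
    """Pick a UI orientation from the gravity components (in m/s^2) that an
    accelerometer reports on the device's x and y axes.

    When gravity acts mostly along the y axis the device is upright
    (portrait); when it acts mostly along the x axis the device is on its
    side (landscape). Axis convention and tie-breaking are illustrative.
    """
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

# Device held upright: gravity lies almost entirely on the y axis.
print(choose_orientation(0.5, 9.7))   # portrait
# Device rotated on its side: gravity shifts to the x axis.
print(choose_orientation(9.7, 0.5))   # landscape
```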
The gyro sensor 712 may detect the body direction and rotation angle of the terminal 700, and may cooperate with the acceleration sensor 711 to capture the user's 3D motion on the terminal 700. Based on the data collected by the gyro sensor 712, the processor 701 may implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 713 may be disposed on a side bezel of the terminal 700 and/or a lower layer of the touch display 705. When the pressure sensor 713 is disposed on the side bezel of the terminal 700, a user's grip signal on the terminal 700 can be detected, and the processor 701 performs left/right hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 713. When the pressure sensor 713 is disposed at the lower layer of the touch display 705, the processor 701 controls operability controls on the UI according to the user's pressure operation on the touch display 705. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 714 is used to collect a user's fingerprint, and the processor 701 identifies the user's identity from the fingerprint collected by the fingerprint sensor 714, or the fingerprint sensor 714 itself identifies the user's identity from the collected fingerprint. When the user's identity is identified as trusted, the processor 701 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 714 may be disposed on the front, back, or side of the terminal 700. When a physical button or a manufacturer Logo is provided on the terminal 700, the fingerprint sensor 714 may be integrated with the physical button or the manufacturer Logo.
The optical sensor 715 is used to collect the ambient light intensity. In one embodiment, the processor 701 may control the display brightness of the touch display 705 according to the ambient light intensity collected by the optical sensor 715. Specifically, when the ambient light intensity is high, the display brightness of the touch display 705 is increased; when the ambient light intensity is low, the display brightness of the touch display 705 is decreased. In another embodiment, the processor 701 may also dynamically adjust the shooting parameters of the camera assembly 706 according to the ambient light intensity collected by the optical sensor 715.
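The brightness adjustment described above amounts to a monotonic mapping from ambient light to display brightness. A minimal sketch follows; the linear mapping, the 0.1 brightness floor, and the `max_lux` saturation value are illustrative assumptions, not values from the patent.

```python
def display_brightness(lux: float, max_lux: float = 1000.0) -> float:
    """Map ambient light intensity (lux) to a display brightness in [0.1, 1.0].

    Brighter surroundings raise the screen brightness and darker surroundings
    lower it. The linear curve and constants are illustrative choices.
    """
    level = min(max(lux / max_lux, 0.0), 1.0)   # normalize and clamp to [0, 1]
    return round(0.1 + 0.9 * level, 3)          # keep a minimum readable brightness

print(display_brightness(1000))  # 1.0  (bright surroundings -> full brightness)
print(display_brightness(0))     # 0.1  (dark room -> minimum brightness)
```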
The proximity sensor 716, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 700. The proximity sensor 716 is used to collect the distance between the user and the front surface of the terminal 700. In one embodiment, when the proximity sensor 716 detects that the distance between the user and the front surface of the terminal 700 gradually decreases, the processor 701 controls the touch display 705 to switch from the screen-on state to the screen-off state; when the proximity sensor 716 detects that the distance gradually increases, the processor 701 controls the touch display 705 to switch from the screen-off state to the screen-on state.
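The screen-state rule above can be sketched as a small state update driven by two consecutive proximity readings. The function name, units, and equal-distance handling are illustrative assumptions.

```python
def next_screen_state(prev_mm: float, curr_mm: float, state: str) -> str:
    """Update the screen state ("on"/"off") from two consecutive proximity
    readings (distance to the front panel, in millimeters).

    A decreasing distance (the device approaching the user's face) turns the
    screen off; an increasing distance turns it back on; an unchanged
    distance keeps the current state.
    """
    if curr_mm < prev_mm:
        return "off"   # user is approaching the front panel
    if curr_mm > prev_mm:
        return "on"    # user is moving away
    return state       # no change in distance

print(next_screen_state(50.0, 10.0, "on"))   # off
print(next_screen_state(10.0, 50.0, "off"))  # on
```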
Those skilled in the art will appreciate that the structure shown in FIG. 7 does not constitute a limitation of the terminal 700, which may include more or fewer components than shown, combine certain components, or adopt a different arrangement of components.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein a computer program, which is loaded and executed by a processor of a terminal to implement the song playing method in the above-described method embodiments.
Alternatively, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product is also provided which, when executed by a processor, implements the song playing method provided in the above-described method embodiments.
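As a rough, non-authoritative illustration of the claimed flow (claims 1 and 2): a song playing page exposes a dynamic effect mode option; when the first selection signal is received, images matched to the target song are acquired and displayed according to the song's beat information. All names, data structures, and the lookup tables below are invented for the sketch and are not part of the claims.

```python
from dataclasses import dataclass, field

@dataclass
class SongPlayingPage:
    """Minimal model of the claimed flow. The matching and beat-analysis
    steps are stubbed with in-memory tables; a real system would obtain
    these from a server or an audio analyzer."""
    target_song: str
    shown_images: list = field(default_factory=list)

    # Illustrative stand-ins for image matching and beat information.
    MATCHED_IMAGES = {"song-a": ["img-1.png", "img-2.png"]}
    BEATS_PER_MINUTE = {"song-a": 120}

    def select_dynamic_effect_mode(self) -> float:
        """Handle the first selection signal: acquire the images matched to
        the target song, display them on the page, and return the per-image
        display interval in seconds derived from the beat information."""
        self.shown_images = self.MATCHED_IMAGES.get(self.target_song, [])
        bpm = self.BEATS_PER_MINUTE.get(self.target_song, 60)
        return 60.0 / bpm  # switch images once per beat

page = SongPlayingPage("song-a")
interval = page.select_dynamic_effect_mode()
print(page.shown_images)  # ['img-1.png', 'img-2.png']
print(interval)           # 0.5
```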
It should be understood that reference to "a plurality" herein means two or more. "And/or" describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates an "or" relationship between the preceding and following associated objects. As used herein, the terms "first," "second," and the like do not denote any order, quantity, or importance, but are used to distinguish one element from another.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method for playing a song, the method comprising:
displaying a song playing page when a target song is played, wherein the song playing page comprises a dynamic effect mode option, and the dynamic effect mode option is used for triggering the target song to be played in a dynamic effect mode;
when a first selection signal corresponding to the dynamic effect mode option is received, acquiring at least one image matched with the target song;
and displaying the at least one image matched with the target song on the song playing page.
2. The method of claim 1, wherein the presenting the at least one image matching the target song on the song playing page comprises:
acquiring beat information of the target song;
and displaying the at least one image matched with the target song on the song playing page according to the beat information of the target song.
3. The method of claim 1, wherein the presenting the at least one image matching the target song on the song playing page comprises:
acquiring a target special effect type;
and displaying the at least one image matched with the target song on the song playing page according to the target special effect type.
4. The method of claim 3, wherein obtaining the target special effect type comprises:
displaying a special effect type list, wherein the special effect type list comprises a plurality of special effect types;
receiving a second selection signal corresponding to a target special effect type among the plurality of special effect types;
and acquiring the target special effect type according to the second selection signal.
5. The method according to any one of claims 1 to 4, wherein, after the displaying of the at least one image matched with the target song on the song playing page, the method further comprises:
receiving a pause playing instruction corresponding to the target song;
and pausing, according to the pause playing instruction, the display of the at least one image matched with the target song on the song playing page.
6. The method according to any one of claims 1 to 4, wherein, after the displaying of the at least one image matched with the target song on the song playing page, the method further comprises:
and when the target song is finished playing, stopping displaying the at least one image matched with the target song on the song playing page.
7. A song playing apparatus, characterized in that the apparatus comprises:
a page display module, configured to display a song playing page when a target song is played, wherein the song playing page comprises a dynamic effect mode option, and the dynamic effect mode option is used for triggering the target song to be played in a dynamic effect mode;
the image acquisition module is used for acquiring at least one image matched with the target song when a first selection signal corresponding to the dynamic effect mode option is received;
and the image display module is used for displaying the at least one image matched with the target song on the song playing page.
8. The apparatus of claim 7, wherein the image presentation module is configured to:
acquiring beat information of the target song;
and displaying the at least one image matched with the target song on the song playing page according to the beat information of the target song.
9. A terminal, characterized in that the terminal comprises a processor and a memory, the memory storing a computer program which is loaded and executed by the processor to implement the song playing method according to any one of claims 1 to 6.
10. A computer-readable storage medium, in which a computer program is stored, the computer program being loaded and executed by a processor to implement the song playing method according to any one of claims 1 to 6.
CN202010456609.5A 2020-05-26 2020-05-26 Song playing method, device, terminal and storage medium Pending CN111611430A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010456609.5A CN111611430A (en) 2020-05-26 2020-05-26 Song playing method, device, terminal and storage medium


Publications (1)

Publication Number Publication Date
CN111611430A (en)

Family

ID=72202150

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010456609.5A Pending CN111611430A (en) 2020-05-26 2020-05-26 Song playing method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111611430A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101930446A (en) * 2009-06-26 2010-12-29 鸿富锦精密工业(深圳)有限公司 Electronic device and method for playing music in embedded electronic device
CN105049959A (en) * 2015-07-08 2015-11-11 腾讯科技(深圳)有限公司 Multimedia file playing method and device
CN107170471A (en) * 2017-03-24 2017-09-15 联想(北京)有限公司 The processing method and electronic equipment of a kind of music background
CN108259983A (en) * 2017-12-29 2018-07-06 广州市百果园信息技术有限公司 A kind of method of video image processing, computer readable storage medium and terminal
CN109615682A (en) * 2018-12-07 2019-04-12 北京微播视界科技有限公司 Animation producing method, device, electronic equipment and computer readable storage medium
CN110244998A (en) * 2019-06-13 2019-09-17 广州酷狗计算机科技有限公司 Page layout background, the setting method of live page background, device and storage medium


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113518235A (en) * 2021-04-30 2021-10-19 广州繁星互娱信息科技有限公司 Live video data generation method and device and storage medium
CN113518235B (en) * 2021-04-30 2023-11-28 广州繁星互娱信息科技有限公司 Live video data generation method, device and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200901)