CN108337367B - Musical instrument playing method and device based on mobile terminal - Google Patents


Info

Publication number
CN108337367B
Authority
CN
China
Prior art keywords
target
instrument
musical instrument
mobile terminal
parameter
Prior art date
Legal status
Active
Application number
CN201810032252.0A
Other languages
Chinese (zh)
Other versions
CN108337367A (en)
Inventor
陈文龙
何军
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810032252.0A priority Critical patent/CN108337367B/en
Publication of CN108337367A publication Critical patent/CN108337367A/en
Application granted granted Critical
Publication of CN108337367B publication Critical patent/CN108337367B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to context-related or environment-related conditions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60 Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/63 Querying
    • G06F16/632 Query formulation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60 Information retrieval; Database structures therefor; File system structures therefor of audio data
    • G06F16/68 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/686 Retrieval characterised by using metadata using information manually generated, e.g. tags, keywords, comments, title or artist information, time, location or usage information, user ratings
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/36 Accompaniment arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72442 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for playing music files

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Acoustics & Sound (AREA)
  • Environmental & Geological Engineering (AREA)
  • Library & Information Science (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Electrophonic Musical Instruments (AREA)
  • Telephone Function (AREA)

Abstract

The embodiment of the invention provides a musical instrument playing method based on a mobile terminal, wherein the method comprises the following steps: determining a target musical instrument parameter for accompanying the multimedia data in the process of playing the multimedia data; detecting an accompaniment operation acting on the mobile terminal; determining a target instrument sound source corresponding to the accompaniment operation based on the target instrument parameters; and when the multimedia data is played, playing the target musical instrument sound source. In the embodiment of the invention, the mobile terminal is used in place of a musical instrument to produce sound, the sound is output to the playback device in real time, and it is mixed with the currently played multimedia data, thereby achieving the purpose of performing an accompaniment to the multimedia data with the mobile terminal, enriching the multimedia playing function and enhancing the interaction between the user and the mobile terminal.

Description

Musical instrument playing method and device based on mobile terminal
Technical Field
The embodiment of the invention relates to the technical field of multimedia data processing, in particular to a musical instrument playing method and device based on a mobile terminal.
Background
With the development of information technology, more and more functions are provided in mobile terminals, which greatly enriches the life and work of users. For example, a mobile terminal has a multimedia player, and a user can play audio or video with it. However, the function of the existing multimedia player is too limited, and there is little interaction with the user.
Disclosure of Invention
The embodiment of the invention provides a musical instrument playing method based on a mobile terminal, and aims to solve the problems that the existing multimedia player is too limited in function and offers little interaction with the user.
In order to solve the technical problem, the invention is realized as follows: a musical instrument playing method based on a mobile terminal comprises the following steps: determining a target musical instrument parameter for accompanying the multimedia data in the process of playing the multimedia data; detecting an accompaniment operation acting on the mobile terminal; determining a target instrument sound source corresponding to the accompaniment operation based on the target instrument parameters; and when the multimedia data is played, playing the target musical instrument sound source.
In another aspect, an embodiment of the present invention further provides a mobile terminal-based musical instrument playing apparatus, where the apparatus includes: the target musical instrument parameter determining module is used for determining target musical instrument parameters for accompanying the multimedia data in the process of playing the multimedia data; the accompaniment operation detection module is used for detecting the accompaniment operation acted on the mobile terminal; the target instrument sound source determining module is used for determining a target instrument sound source corresponding to the accompaniment operation based on the target instrument parameters; and the playing module is used for playing the target musical instrument sound source when the multimedia data is played.
In the embodiment of the invention, in the process of playing the multimedia data, a user can make the mobile terminal play the corresponding target musical instrument sound source by triggering an accompaniment operation; the mobile terminal is used in place of a musical instrument to produce sound, the sound is output to the playback device in real time, and it is mixed with the currently played multimedia data, thereby achieving the purpose of performing an accompaniment to the multimedia data with the mobile terminal, enriching the multimedia playing function and enhancing the interaction between the user and the mobile terminal.
Drawings
Fig. 1 is a flowchart of steps of an embodiment of a mobile terminal-based musical instrument playing method according to an embodiment of the present invention;
fig. 2 is a flowchart of steps of another embodiment of a mobile terminal-based musical instrument playing method according to an embodiment of the present invention;
FIG. 3 is a schematic view of a designated control interface according to an embodiment of the invention;
FIG. 4 is a schematic diagram of interface partitioning according to an embodiment of the present invention;
FIG. 5 is a partial schematic view of a mobile terminal according to an embodiment of the invention;
fig. 6 is a block diagram showing the construction of an embodiment of a mobile terminal-based musical instrument playing apparatus according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a hardware structure of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a mobile terminal-based musical instrument playing method according to an embodiment of the present invention is shown, which may specifically include the following steps:
Step 101, determining a target musical instrument parameter for accompanying multimedia data in the process of playing the multimedia data;
in the embodiment of the invention, in the process of playing multimedia data such as pictures, audio and/or video by the mobile terminal, a user can use the mobile terminal to provide a musical instrument accompaniment.
First, target instrument parameters for accompanying multimedia data may be determined. When implemented, the target instrument parameter may be an instrument parameter selected by a user through an interactive interface (e.g., a multimedia playing interface, an instrument playing interface, etc.), and may include, but is not limited to, information such as an instrument identifier, an instrument pitch parameter, etc.
Step 102, detecting an accompaniment operation acting on the mobile terminal;
in a specific implementation, the accompaniment operation acting on the mobile terminal can be detected by a gravity sensor in the mobile terminal, and when the gravity sensor detects the pressure parameter, the accompaniment operation acting on the mobile terminal can be determined to be detected.
Step 103, determining a target musical instrument sound source corresponding to the accompaniment operation based on the target musical instrument parameters;
in a specific implementation, when an accompaniment operation is detected, the embodiment of the present invention may acquire a pressure parameter corresponding to the accompaniment operation, and determine a target instrument sound source corresponding to the pressure parameter in a table lookup manner.
Step 104, playing the target musical instrument sound source when the multimedia data is played.
After the target musical instrument sound source is obtained, the target musical instrument sound source can be played, and in the playing process of the multimedia data, the target musical instrument sound source and the multimedia data are played simultaneously, so that the effect of accompaniment is achieved.
In the embodiment of the invention, in the process of playing the multimedia data, a user can make the mobile terminal play the corresponding target musical instrument sound source by triggering an accompaniment operation; the mobile terminal is used in place of a musical instrument to produce sound, the sound is output to the playback device in real time, and it is mixed with the currently played multimedia data, thereby achieving the purpose of performing an accompaniment to the multimedia data with the mobile terminal, enriching the multimedia playing function and enhancing the interaction between the user and the mobile terminal.
Referring to fig. 2, a flowchart illustrating steps of another embodiment of a mobile terminal-based musical instrument playing method according to an embodiment of the present invention is shown, and may specifically include the following steps:
Step 201, in the process of playing multimedia data, determining target musical instrument parameters for accompanying the multimedia data;
in the embodiment of the invention, a performance mode can be set, and in the performance mode, in the process of playing the multimedia data, a user can use the mobile terminal playing the multimedia data currently to accompany the multimedia data.
In a specific implementation, the performance mode may be set by an APK (android package) or a switch integrated in a system of the mobile terminal.
Specifically, in an embodiment, an application having a performance mode may be downloaded and installed, and when the application is started, a user may add a musical instrument sound source to the multimedia data to accompany the multimedia data while playing the multimedia data.
In another embodiment, a switch of a performance mode may be provided in a system of the mobile terminal, and when the switch is turned on, if the multimedia data is played, a musical instrument sound source may be added to the multimedia data to accompany the multimedia data.
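As an illustration of the switch-based variant, the following Java sketch persists and reads a performance-mode flag with Android SharedPreferences; the preference file name and key are hypothetical and not taken from the embodiment.

```java
// Sketch: a system-level performance-mode switch backed by SharedPreferences.
// The names "player_settings" and "performance_mode" are illustrative only.
import android.content.Context;
import android.content.SharedPreferences;

public class PerformanceMode {
    private static final String PREFS = "player_settings";            // hypothetical preference file
    private static final String KEY_PERFORMANCE = "performance_mode"; // hypothetical key

    public static boolean isEnabled(Context context) {
        SharedPreferences prefs = context.getSharedPreferences(PREFS, Context.MODE_PRIVATE);
        return prefs.getBoolean(KEY_PERFORMANCE, false);
    }

    public static void setEnabled(Context context, boolean enabled) {
        context.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
               .edit()
               .putBoolean(KEY_PERFORMANCE, enabled)
               .apply();
    }
}
```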
First, target instrument parameters for accompanying multimedia data may be determined.
As a preferred example of the embodiment of the present invention, the target instrument parameters may include a target instrument identifier, a target pitch parameter, and the like. The instrument identifier may be an instrument name and/or an instrument icon; for example, for percussion instruments the identifiers may include, but are not limited to, the timpani, xylophone, marimba, vibraphone, glockenspiel, tubular bells, and the like.
The pitch parameter may be a parameter indicating pitch, i.e. the height of a tone. Pitch is determined by the vibration frequency, measured in Hertz (Hz), which indicates the number of vibrations per second: the higher the frequency, the higher the tone, and the lower the frequency, the lower the tone.
In a preferred embodiment of the present invention, if the target instrument parameter is the target instrument identifier, step 201 may include the following sub-steps:
a substep S11 of presenting an instrument identification list;
in the embodiment of the present invention, a sound source database (the generation manner of the sound source database will be described in step 204) may be preset, and sound source data corresponding to one or more pitches of multiple musical instruments may be stored in the sound source database, so that the musical instrument identifiers of all the musical instruments in the sound source database may be organized into a musical instrument identifier list.
After obtaining the list of instrument identifications, the list of instrument identifications may be presented in a designated control interface, wherein the instrument identifications in the list of instrument identifications may be selected.
In a preferred embodiment of the present invention, the sub-step S11 may further be:
and displaying the instrument identification list in a designated area of a display interface of the terminal.
For example, as shown in fig. 3, the presentation interface of the terminal may be a designated control interface, which may include a designated area; the designated area may be the area where the instrument identifier 301 is located. In implementation, the area where the instrument identifier 301 is located may be a number selector, the control of the instrument identifier list may be provided through the number selector, and the instrument identifier list may be implemented as a drop-down list.
In fig. 3, the designated control interface may also include instrument control areas such as previous instrument identifier 302, next instrument identifier 303, pause instrument performance 304, etc., and multimedia control areas that may include, but are not limited to, song name 305, previous song 306, next song 307, pause 308, etc.
In another preferred embodiment of the present invention, the sub-step S11 may further be:
and respectively displaying each instrument identification in the instrument identification list in different areas of a display interface of the terminal.
In implementation, the display interface (i.e., the designated control interface) of the terminal may be divided into a plurality of regions. For example, as shown in the interface division diagram of fig. 4, if there are 8 instrument identifiers in the instrument identifier list, the designated control interface may be divided into 8 regions 401, and each region displays one instrument identifier.
In practice, in order to realize the playing control of the song, a switching area 402 may be further included in the designated control interface of fig. 4, and clicking the switching area 402 may realize interface switching to switch the musical instrument control interface to the song control interface, and song information and song control information are displayed in the song control interface.
And a substep S12 of using the detected selected instrument identification as the target instrument identification.
In implementation, for the presentation manner shown in fig. 3, the instrument identifier corresponding to the selected value in the drop-down list may be obtained as the target instrument identifier, for example by calling a method of the number selector control to obtain the currently selected value in the drop-down list.
For the display mode of fig. 4, after the list of musical instrument identifiers is displayed, the selection operation of the user on the screen may be detected, the position information corresponding to the selection operation may be determined, the area corresponding to the position information may be determined as the target area, and the target musical instrument identifier corresponding to the target area may be determined according to the association relationship between the area and the musical instrument identifier.
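As an illustration of the association between display regions and instrument identifiers described above, the following Java sketch maps a touch position to one of eight regions, assuming purely for illustration a 2x4 grid layout as in fig. 4; the class name, grid layout and field names are assumptions, not part of the embodiment.

```java
// Sketch: map a touch position to an instrument identifier via its region.
// A 2x4 grid of 8 regions is assumed for illustration only.
public class InstrumentRegionMapper {
    private final String[] instrumentIds;   // one instrument identifier per region
    private final int columns = 2;
    private final int rows = 4;
    private final int screenWidth;
    private final int screenHeight;

    public InstrumentRegionMapper(String[] instrumentIds, int screenWidth, int screenHeight) {
        this.instrumentIds = instrumentIds;
        this.screenWidth = screenWidth;
        this.screenHeight = screenHeight;
    }

    /** Returns the instrument identifier associated with the touched region. */
    public String instrumentAt(float x, float y) {
        int col = Math.min((int) (x / (screenWidth / (float) columns)), columns - 1);
        int row = Math.min((int) (y / (screenHeight / (float) rows)), rows - 1);
        return instrumentIds[row * columns + col];
    }
}
```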
In this embodiment, if the musical instrument is a musical instrument with a fixed pitch, the step 202 can be executed after the target musical instrument identifier is determined.
In another preferred embodiment of the present invention, if the target instrument parameter is a target pitch parameter, step 201 may further include the following sub-steps:
a substep S13 of presenting a pitch adjustment list of the target instrument, the pitch adjustment list including a plurality of pitch parameters;
and a substep S14 of using the detected selected pitch parameter as a target pitch parameter.
For a musical instrument with adjustable pitch, various pitch parameters of the musical instrument may be set according to the characteristics of the instrument and stored in a pitch adjustment list. When the user selects a target instrument, a pitch setting interface may be popped up or jumped to, the pitch adjustment list is displayed in the pitch setting interface in the form of a drop-down list or a plain list, the selection operation of the user on the pitch adjustment list is detected, and the pitch parameter selected by the user is used as the target pitch parameter.
In this embodiment, if the musical instrument is a musical instrument with unfixed pitch, step 202 is executed after determining the target musical instrument identifier and the target pitch parameter.
Step 202, detecting an accompaniment operation acting on the mobile terminal;
in the embodiment of the invention, in the process of playing multimedia data such as pictures, audio and/or video and the like by the mobile terminal, after the target musical instrument parameters are determined, the accompaniment operation acting on the mobile terminal can be detected in the state that the designated control interface is opened.
In one embodiment, the accompaniment operation acting on the mobile terminal may be detected by a gravity sensor in the mobile terminal, and when the gravity sensor detects the pressure parameter, it may be determined that the accompaniment operation acting on the mobile terminal is detected.
In a specific implementation, as shown in the partial schematic diagram of a mobile terminal in fig. 5, the mobile terminal may include a main board 501, a screen module 503, a battery 504, and a rear cover 505, and the gravity sensor 502 (i.e., a gravity sensor chip) is disposed on the main board 501.
As shown in fig. 5, the accompaniment operation may be performed on the rear cover 505 of the mobile terminal, i.e., the accompaniment operation F1 in fig. 5, or the accompaniment operation may be performed on the screen module 503 of the mobile terminal, i.e., the accompaniment operation F2 in fig. 5 (preferably, the accompaniment operation is performed in the screen-off state when being performed on the screen module 503). The embodiment of the invention does not limit the acting direction of the accompaniment operation.
In fig. 5, when an accompaniment operation acts on the mobile terminal, a vibration of the mobile terminal is induced, and the vibration causes the gravity sensor 502 to measure a change in pressure data in a vertical direction (Z direction), thereby determining that the accompaniment operation acting on the mobile terminal is detected.
Step 203, acquiring a target pressure parameter corresponding to the accompaniment operation, which is measured by a gravity sensor in the mobile terminal;
in a specific implementation, after the gravity sensor detects the accompaniment operation, the pressure parameter of the accompaniment operation can be further acquired as the target pressure parameter.
In a preferred embodiment of the present invention, step 203 further comprises the following sub-steps:
a substep S21, determining the vibration duration of the mobile terminal caused by the accompaniment operation when the accompaniment operation is triggered to the mobile terminal;
and a substep S22 of obtaining a maximum value and a minimum value of all pressure data measured by the gravity sensor in the vibration duration, and taking a difference value between the maximum value and the minimum value as a target pressure parameter corresponding to the accompaniment operation.
For example, as shown in fig. 5, assuming that the accompaniment operation triggered by the user is a slapping operation on the rear cover of the terminal, a force acts on the terminal after the slapping operation; this force is denoted F1. F1 causes the terminal to vibrate, different magnitudes of F1 cause different amounts of vibration, and the vibration of the terminal causes the gravity sensor to measure a data change in the vertical (Z) direction. Generally, the time for which the terminal vibrates after being slapped is fixed, and this vibration duration is denoted S0. The variation of the gravity sensor data within S0 is used as the target pressure parameter corresponding to the accompaniment operation, namely Z = |Zmax - Zmin|, where Z is the variation in the vertical direction of the gravity sensor data, Zmax is the maximum value of the Z-direction data of the gravity sensor, and Zmin is the minimum value of the Z-direction data of the gravity sensor.
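As an illustration only, the following Java sketch measures Z = |Zmax - Zmin| with the Android accelerometer (which the embodiment refers to as a gravity sensor). The 200 ms window standing in for S0, the trigger threshold, and the class and callback names are assumptions; the embodiment only requires that the data variation over the fixed vibration duration be taken as the target pressure parameter.

```java
// Sketch: open a fixed measurement window when a tap-like jump appears on the
// Z axis, then report |Zmax - Zmin| over that window as the target pressure parameter.
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class TapPressureMeter implements SensorEventListener {
    public interface Callback { void onTargetPressure(float z); }

    private static final long WINDOW_MS = 200;   // assumed vibration duration S0
    private static final float TRIGGER = 1.5f;   // assumed Z jump (m/s^2) that opens a window

    private final Callback callback;
    private long windowStart = -1;
    private float zMax, zMin;
    private float lastZ = Float.NaN;

    public TapPressureMeter(SensorManager manager, Callback callback) {
        this.callback = callback;
        Sensor accel = manager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        manager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float z = event.values[2];                 // vertical (Z) axis
        long nowMs = event.timestamp / 1_000_000L; // timestamp is in nanoseconds
        if (windowStart < 0 && !Float.isNaN(lastZ) && Math.abs(z - lastZ) > TRIGGER) {
            windowStart = nowMs;                   // tap detected, open the window
            zMax = zMin = z;
        }
        if (windowStart >= 0) {
            zMax = Math.max(zMax, z);
            zMin = Math.min(zMin, z);
            if (nowMs - windowStart >= WINDOW_MS) {
                callback.onTargetPressure(Math.abs(zMax - zMin)); // Z = |Zmax - Zmin|
                windowStart = -1;
            }
        }
        lastZ = z;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```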
Step 204, matching the target pressure parameter from a sound source database corresponding to the target musical instrument parameter to obtain a corresponding target musical instrument sound source;
after the target pressure parameter is determined, the target pressure parameter can be searched from the corresponding relation between the reference pressure parameter of the target musical instrument and the musical instrument sound source stored in the preset sound source database, and the target musical instrument sound source corresponding to the target pressure parameter is obtained.
In a specific implementation, for a musical instrument with a fixed pitch, the sound source database may store the association relationship between each type of musical instrument and the corresponding reference pressure data; for a musical instrument with a non-fixed pitch, the sound source database may store the association relationship between each type of musical instrument and one or more corresponding pitch parameters, as well as the association relationship between each pitch parameter and the corresponding multiple types of reference pressure data.
In practice, the sound source database may be divided into a plurality of sub sound source databases according to the instrument identifiers, that is, each instrument identifier has a corresponding sound source database.
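As an illustration of how such a per-instrument sub sound-source database could be queried, the following Java sketch stores the correspondence between reference pressure parameters and sound source identifiers and returns the sound source whose reference pressure is closest to the measured target pressure parameter. The nearest-value matching policy and the class and field names are assumptions for illustration only; the embodiment merely requires that the target pressure parameter be matched against the stored correspondences by table lookup.

```java
// Sketch: per-instrument sub sound-source database keyed by reference pressure.
import java.util.Map;
import java.util.TreeMap;

public class SoundSourceDatabase {
    // reference pressure parameter -> identifier (e.g. file name) of the stored sound source
    private final TreeMap<Float, String> references = new TreeMap<>();

    public void put(float referencePressure, String soundSourceId) {
        references.put(referencePressure, soundSourceId);
    }

    /** Returns the sound source whose reference pressure is closest to the target pressure. */
    public String match(float targetPressure) {
        Map.Entry<Float, String> floor = references.floorEntry(targetPressure);
        Map.Entry<Float, String> ceil = references.ceilingEntry(targetPressure);
        if (floor == null) return ceil == null ? null : ceil.getValue();
        if (ceil == null) return floor.getValue();
        return (targetPressure - floor.getKey() <= ceil.getKey() - targetPressure)
                ? floor.getValue() : ceil.getValue();
    }
}
```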
In a preferred embodiment of the present invention, the sound source database corresponding to each instrument identifier may be generated as follows:
step S1, detecting a reference accompaniment operation triggered by a user aiming at each sound source in a preset sound source list;
in a specific implementation, a sound source list may be preset, where the sound source list may include multiple sound sources corresponding to one or more pitches, which are previously collected and can be emitted by the instrument corresponding to the current instrument identifier.
For each sound source in the sound source list, the user can be prompted to input a reference accompaniment operation, and a gravity sensor is adopted to detect the accompaniment operation.
Step S2, acquiring a reference pressure parameter corresponding to the reference accompaniment operation;
referring to the above-mentioned manner of obtaining the target pressure parameter, the gravity sensor may detect a pressure parameter variation within a vibration duration caused by the reference accompaniment operation as a reference pressure parameter.
And step S3, generating a corresponding relation between the reference pressure parameter and the corresponding reference accompaniment operation, and storing the corresponding relation in a sound source database.
After obtaining the reference pressure parameter, a corresponding relationship between the reference pressure parameter and the corresponding reference accompaniment operation may be generated, and the corresponding relationship may be stored in the sound source database. In the embodiment of the invention, the user can trigger the reference accompaniment operation corresponding to each sound source according to the use habit of the user, and the obtained sound source database meets the personalized requirements of the user.
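A minimal sketch of the calibration flow of steps S1 to S3 is given below, assuming the reference pressure is delivered by a listener such as the TapPressureMeter sketch above and stored in the SoundSourceDatabase sketch; the class and method names are illustrative, not part of the embodiment.

```java
// Sketch: build the per-instrument database from the user's reference accompaniment taps.
import java.util.List;

public class DatabaseCalibrator {
    private final SoundSourceDatabase database;
    private final List<String> soundSourceIds;  // preset sound source list for one instrument
    private int index = 0;

    public DatabaseCalibrator(SoundSourceDatabase database, List<String> soundSourceIds) {
        this.database = database;
        this.soundSourceIds = soundSourceIds;
    }

    /** Call once per reference accompaniment operation, in list order.
     *  Returns false when every sound source in the list has been calibrated. */
    public boolean onReferencePressure(float referencePressure) {
        database.put(referencePressure, soundSourceIds.get(index));
        index++;
        return index < soundSourceIds.size();
    }
}
```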
It should be noted that the embodiment of the present invention is not limited to the above-mentioned manner for generating the sound source database, and those skilled in the art may generate the sound source database in other manners, for example, a default sound source database is set. In a particular implementation, the instrument sound sources may be stored in encoded form. The sound source database may be stored in a local memory of the terminal, and/or may also be stored in the server, and when the sound source database is stored in the server, the client needs to send the target pressure parameter to the server, and the server performs the matching operation and returns the target musical instrument sound source.
Step 205, playing the target musical instrument sound source when the multimedia data is played.
After the target musical instrument sound source is obtained, the target musical instrument sound source can be played, and in the playing process of the multimedia data, the target musical instrument sound source and the multimedia data are played simultaneously, so that the effect of accompaniment is achieved.
In implementation, the instrument sound source and the multimedia data can be played simultaneously in a multithreading mode, for example, a first thread can be used for controlling the playing of the instrument sound source, and a second thread can be used for controlling the playing of the multimedia data, so that the aim of accompanying the multimedia data is fulfilled.
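The following Java sketch illustrates one way such simultaneous playback could be arranged on Android, using MediaPlayer for the multimedia data and SoundPool for the short instrument sound sources. These particular classes and the resource-based loading are assumptions; the embodiment only requires that the instrument sound source and the multimedia data be played at the same time, for example on separate threads.

```java
// Sketch: mix an instrument sound source over the currently playing song.
import android.content.Context;
import android.media.AudioAttributes;
import android.media.MediaPlayer;
import android.media.SoundPool;

public class AccompanimentPlayer {
    private final MediaPlayer songPlayer;
    private final SoundPool instrumentPool;
    private final int instrumentSoundId;

    public AccompanimentPlayer(Context context, int songResId, int instrumentResId) {
        songPlayer = MediaPlayer.create(context, songResId);   // plays the multimedia data
        instrumentPool = new SoundPool.Builder()
                .setMaxStreams(4)
                .setAudioAttributes(new AudioAttributes.Builder()
                        .setUsage(AudioAttributes.USAGE_MEDIA)
                        .build())
                .build();
        instrumentSoundId = instrumentPool.load(context, instrumentResId, 1);
    }

    public void startSong() {
        songPlayer.start();
    }

    /** Plays the matched instrument sound source on top of the running song. */
    public void playInstrument() {
        instrumentPool.play(instrumentSoundId, 1f, 1f, 1, 0, 1f);
    }
}
```

SoundPool keeps the short samples decoded in memory, so a tap can be answered with low latency while the song continues in MediaPlayer; the system mixes both outputs.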
In the embodiment of the invention, in the process of playing multimedia data, after the target musical instrument parameter is determined, the mobile terminal can determine the target pressure parameter by detecting the accompaniment operation triggered by a user, and determine the target musical instrument sound source by the target pressure parameter, so that the mobile terminal plays the multimedia data and simultaneously plays the target musical instrument sound source.
In order to enable those skilled in the art to better understand the embodiments of the present invention, the following description illustrates the embodiments of the present invention by way of a specific example, but it should be noted that the embodiments of the present invention are not limited thereto:
the performance mode is turned on by the APK or a switch integrated in the system of the mobile terminal, in which local and online songs can be played using a speaker or a headphone. After the instrument is selected, the force of a user for beating the rear cover or the screen of the mobile terminal is measured by a gravity sensor of the mobile terminal (the beating screen is preferably operated in a screen-off state), a sound source for sounding the instrument beaten by the force is called by looking up a table, the sound source is taken out from a memory, and the decoded sound source and the song which is played currently are played together after being mixed.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 6, a block diagram of a mobile terminal-based musical instrument playing apparatus according to an embodiment of the present invention is shown, which may specifically include the following modules:
a target musical instrument parameter determining module 601, configured to determine a target musical instrument parameter for accompanying the multimedia data in a multimedia data playing process;
an accompaniment operation detection module 602, configured to detect an accompaniment operation applied to the mobile terminal;
a target instrument sound source determining module 603, configured to determine a target instrument sound source corresponding to the accompaniment operation based on the target instrument parameters;
the playing module 604 is configured to play the target musical instrument sound source when the multimedia data is played.
In a preferred embodiment of the present invention, the target instrument sound source determining module 603 further includes the following sub-modules:
the target pressure parameter determining submodule is used for determining a corresponding target pressure parameter based on the accompaniment operation;
and the target pressure parameter matching sub-module is used for matching the target pressure parameters from the sound source database corresponding to the target musical instrument parameters to obtain corresponding target musical instrument sound sources.
In a preferred embodiment of the present invention, the target instrument parameter includes a target instrument identifier, and the target instrument parameter determining module 601 may include the following sub-modules:
the musical instrument identification list display submodule is used for displaying the musical instrument identification list;
and the instrument identification selection submodule is used for taking the detected selected instrument identification as the target instrument identification.
In a preferred embodiment of the present invention, the target instrument parameter further includes a target pitch parameter of the target instrument, and the target instrument parameter determining module 601 further includes the following sub-modules:
a pitch adjustment list display sub-module for displaying a pitch adjustment list of the target musical instrument, the pitch adjustment list including a plurality of pitch parameters;
and the pitch selection sub-module is used for taking the detected selected pitch parameter as a target pitch parameter.
In a preferred embodiment of the present invention, the target pressure parameter determination submodule includes:
and the measurement submodule is used for acquiring a target pressure parameter measured by a gravity sensor in the mobile terminal and corresponding to the accompaniment operation.
In a preferred embodiment of the present invention, the measurement sub-module includes:
the mobile terminal comprises a vibration duration determining unit, a vibration duration determining unit and a control unit, wherein the vibration duration determining unit is used for determining the vibration duration of the mobile terminal caused by the accompaniment operation when the accompaniment operation is triggered to the mobile terminal;
and the difference value calculating unit is used for acquiring the maximum value and the minimum value of all pressure data measured by the gravity sensor in the vibration duration, and taking the difference value between the maximum value and the minimum value as a target pressure parameter corresponding to the accompaniment operation.
In a preferred embodiment of the embodiments of the present invention, the apparatus further comprises:
the voice source database generating module is used for generating a voice source database and comprises:
the reference accompaniment operation detection submodule is used for detecting reference accompaniment operation triggered by a user aiming at each sound source in the preset sound source list;
the reference pressure parameter acquisition submodule is used for acquiring a reference pressure parameter corresponding to the reference accompaniment operation;
and the corresponding relation generation submodule is used for generating the corresponding relation between the reference pressure parameter and the corresponding reference accompaniment operation and storing the corresponding relation in a sound source database.
For the embodiment of the mobile terminal, since it is basically similar to the embodiment of the method, the description is relatively simple, and for relevant points, reference may be made to part of the description of the embodiment of the method.
The mobile terminal provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 to fig. 2, and is not described herein again to avoid repetition. In the process of playing multimedia data, the user can make the mobile terminal play the corresponding target musical instrument sound source by triggering an accompaniment operation; the mobile terminal is used in place of a musical instrument to produce sound, the sound is output to the playback device in real time and mixed with the currently played multimedia data, thereby achieving the purpose of performing an accompaniment to the multimedia data with the mobile terminal, enriching the multimedia playing function and enhancing the interaction between the user and the mobile terminal.
Fig. 7 is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, where the mobile terminal 70 includes, but is not limited to: radio frequency unit 71, network module 72, audio output unit 73, input unit 74, sensor 75, display unit 76, user input unit 77, interface unit 78, memory 79, processor 710, and power supply 711. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 7 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 710 is configured to determine a target musical instrument parameter for accompanying the multimedia data in a multimedia data playing process; detecting an accompaniment operation acting on the mobile terminal; determining a target instrument sound source corresponding to the accompaniment operation based on the target instrument parameters; and when the multimedia data is played, playing the target musical instrument sound source.
In the process of playing multimedia data, the mobile terminal provided by the embodiment of the invention allows the user, by triggering an accompaniment operation, to make the mobile terminal play the corresponding target musical instrument sound source; the mobile terminal is used in place of a musical instrument to produce sound, the sound is output to the playback device in real time and mixed with the currently played multimedia data, thereby achieving the purpose of performing an accompaniment to the multimedia data with the mobile terminal, enriching the multimedia playing function, and enhancing the interaction between the user and the mobile terminal.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 71 may be used for receiving and sending signals during a message sending and receiving process or a call process, and specifically, the processor 710 is configured to receive downlink data from a base station and process the received downlink data; in addition, the uplink data is transmitted to the base station. Typically, the radio frequency unit 71 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 71 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides wireless broadband internet access to the user via the network module 72, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 73 may convert audio data received by the radio frequency unit 71 or the network module 72 or stored in the memory 79 into an audio signal and output as sound. Also, the audio output unit 73 may also provide audio output related to a specific function performed by the mobile terminal 70 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 73 includes a speaker, a buzzer, a receiver, and the like.
The input unit 74 is for receiving an audio or video signal. The input unit 74 may include a Graphics Processing Unit (GPU) 741 and a microphone 742, and the graphics processor 741 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 76. The image frames processed by the graphics processor 741 may be stored in the memory 79 (or other storage medium) or transmitted via the radio frequency unit 71 or the network module 72. The microphone 742 may receive sounds and may be capable of processing such sounds into audio data. In the case of the phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 71.
The mobile terminal 70 also includes at least one sensor 75, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 761 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 761 and/or a backlight when the mobile terminal 70 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 75 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described in detail herein.
The display unit 76 is used to display information input by the user or information provided to the user. The Display unit 76 may include a Display panel 761, and the Display panel 761 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 77 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 77 includes a touch panel 771 and other input devices 772. The touch panel 771, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 771 (e.g., operations by a user on or near the touch panel 771 using a finger, stylus, or any suitable object or attachment). The touch panel 771 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 710, receives a command from the processor 710, and executes the command. In addition, the touch panel 771 can be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 771, the user input unit 77 may also include other input devices 772. In particular, other input devices 772 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 771 may be overlaid on the display panel 761, and when the touch panel 771 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 710 to determine the type of the touch event, and then the processor 710 provides a corresponding visual output on the display panel 761 according to the type of the touch event. Although the touch panel 771 and the display panel 761 are shown as two separate components in fig. 7 to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 771 and the display panel 761 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 78 is an interface for connecting an external device to the mobile terminal 70. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 78 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 70 or may be used to transmit data between the mobile terminal 70 and external devices.
The memory 79 may be used to store software programs as well as various data. The memory 79 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 79 may include high speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 710 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 79 and calling data stored in the memory 79, thereby integrally monitoring the mobile terminal. Processor 710 may include one or more processing units; preferably, the processor 710 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 710.
The mobile terminal 70 may also include a power supply 711 (e.g., a battery) for powering the various components, and the power supply 711 may be logically coupled to the processor 710 via a power management system that may be configured to manage charging, discharging, and power consumption.
In addition, the mobile terminal 70 includes some functional modules that are not shown, and thus, the detailed description thereof is omitted.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 710, a memory 79, and a computer program stored in the memory 79 and capable of running on the processor 710, where the computer program is executed by the processor 710 to implement the above-mentioned processes in the embodiment of the musical instrument playing method based on the mobile terminal, and can achieve the same technical effects, and in order to avoid repetition, the detailed description is omitted here.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above-mentioned mobile terminal-based musical instrument playing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, the detailed description is omitted here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (6)

1. A musical instrument playing method based on a mobile terminal is characterized by comprising the following steps:
determining a target musical instrument parameter for accompanying the multimedia data in the process of playing the multimedia data;
detecting an accompaniment operation acting on the mobile terminal;
determining a target instrument sound source corresponding to the accompaniment operation based on the target instrument parameters;
when the multimedia data is played, playing the target musical instrument sound source;
wherein the step of determining a target instrument sound source corresponding to the accompaniment operation based on the target instrument parameters comprises:
determining a corresponding target pressure parameter based on the accompaniment operation;
matching the target pressure parameter from a sound source database corresponding to the target musical instrument parameter to obtain a corresponding target musical instrument sound source;
wherein the target instrument parameters comprise a target instrument identification; or, the target instrument parameter comprises a target instrument identification and a target pitch parameter;
the step of determining a target instrument parameter to accompany the multimedia data comprises:
displaying an instrument identification list;
taking the detected selected instrument identification as a target instrument identification;
wherein after the step of using the detected selected instrument identifier as the target instrument identifier, the method further comprises the following steps:
if the musical instrument is a musical instrument with fixed pitch, detecting the accompaniment operation acting on the mobile terminal after determining the target musical instrument identifier;
and if the musical instrument is a musical instrument with unfixed pitch, detecting the accompaniment operation acting on the mobile terminal after determining the target musical instrument identifier and the target pitch parameter.
2. The method of claim 1, wherein the step of determining target instrument parameters for accompanying the multimedia data comprises:
showing a pitch adjustment list of the target instrument, the pitch adjustment list comprising a plurality of pitch parameters;
and taking the detected selected pitch parameter as the target pitch parameter.
3. The method of claim 1, wherein the step of determining a corresponding target pressure parameter based on the accompaniment actions comprises:
and acquiring a target pressure parameter which is measured by a gravity sensor in the mobile terminal and corresponds to the accompaniment operation.
4. The method according to claim 3, wherein the step of obtaining the target pressure parameter corresponding to the accompaniment manipulation, measured by a gravity sensor in the mobile terminal, comprises:
when an accompaniment operation is triggered to a mobile terminal, determining the vibration duration of the mobile terminal caused by the accompaniment operation;
and acquiring the maximum value and the minimum value of all pressure data measured by the gravity sensor in the vibration duration, and taking the difference value between the maximum value and the minimum value as a target pressure parameter corresponding to the accompaniment operation.
5. The method of claim 1, wherein the audio source database is generated by:
detecting a reference accompaniment operation triggered by a user aiming at each sound source in a preset sound source list;
acquiring a reference pressure parameter corresponding to the reference accompaniment operation;
and generating a corresponding relation between the reference pressure parameter and the corresponding reference accompaniment operation, and storing the corresponding relation in a sound source database.
6. A mobile terminal-based musical instrument playing apparatus, characterized in that the apparatus comprises:
the target musical instrument parameter determining module is used for determining target musical instrument parameters for accompanying the multimedia data in the process of playing the multimedia data;
the accompaniment operation detection module is used for detecting the accompaniment operation acted on the mobile terminal;
the target instrument sound source determining module is used for determining a target instrument sound source corresponding to the accompaniment operation based on the target instrument parameters;
the playing module is used for playing the target musical instrument sound source when the multimedia data is played;
wherein the target instrument sound source determination module includes:
the target pressure parameter determining submodule is used for determining a corresponding target pressure parameter based on the accompaniment operation;
the target pressure parameter matching submodule is used for matching the target pressure parameters from a sound source database corresponding to the target musical instrument parameters to obtain corresponding target musical instrument sound sources;
wherein the target instrument parameters comprise a target instrument identification; or, the target instrument parameter comprises a target instrument identification and a target pitch parameter;
the target instrument parameter determination module comprises:
the musical instrument identification list display submodule is used for displaying the musical instrument identification list;
an instrument identifier selection submodule for selecting the detected selected instrument identifier as a target instrument identifier;
the musical instrument identification selection submodule is also used for executing detection of the accompaniment operation acting on the mobile terminal after determining the target musical instrument identifier if the musical instrument is a musical instrument with fixed pitch; and if the musical instrument is a musical instrument with unfixed pitch, detecting the accompaniment operation acting on the mobile terminal after determining the target musical instrument identifier and the target pitch parameter.
CN201810032252.0A 2018-01-12 2018-01-12 Musical instrument playing method and device based on mobile terminal Active CN108337367B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810032252.0A CN108337367B (en) 2018-01-12 2018-01-12 Musical instrument playing method and device based on mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810032252.0A CN108337367B (en) 2018-01-12 2018-01-12 Musical instrument playing method and device based on mobile terminal

Publications (2)

Publication Number Publication Date
CN108337367A CN108337367A (en) 2018-07-27
CN108337367B true CN108337367B (en) 2021-01-12

Family

ID=62924951

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810032252.0A Active CN108337367B (en) 2018-01-12 2018-01-12 Musical instrument playing method and device based on mobile terminal

Country Status (1)

Country Link
CN (1) CN108337367B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110223664A (en) * 2019-05-22 2019-09-10 上海墩庐生物医学科技有限公司 The methods and applications of opera gong and drum are played in a kind of mobile phone and plate mobile terminal

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103258529A (en) * 2013-04-16 2013-08-21 初绍军 Performance method of electronic musical instrument and music
CN106297760A (en) * 2016-08-08 2017-01-04 西北工业大学 A kind of algorithm of software quick playing musical instrument

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7498504B2 (en) * 2004-06-14 2009-03-03 Condition 30 Inc. Cellular automata music generator
CN101465121B (en) * 2009-01-14 2012-03-21 苏州瀚瑞微电子有限公司 Method for implementing touch virtual electronic organ
CN101673540A (en) * 2009-08-13 2010-03-17 上海酷吧信息技术有限公司 Method and device for realizing playing music of mobile terminal
US8901406B1 (en) * 2013-07-12 2014-12-02 Apple Inc. Selecting audio samples based on excitation state
CN105391864B (en) * 2015-11-26 2019-08-30 努比亚技术有限公司 Device and method based on pressure control mobile terminal vibration
CN105635458B (en) * 2015-12-28 2019-05-24 Oppo广东移动通信有限公司 A kind of percussion music playing method and device
CN105700808A (en) * 2016-02-18 2016-06-22 广东欧珀移动通信有限公司 Music playing method and device and terminal device
CN105739995A (en) * 2016-03-03 2016-07-06 上海与德通讯技术有限公司 Playing simulating method and playing simulating module
CN105824529B (en) * 2016-03-11 2019-07-09 深圳市元征科技股份有限公司 A kind of method and intelligent wearable device controlling music

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103258529A (en) * 2013-04-16 2013-08-21 初绍军 Performance method of electronic musical instrument and music
CN106297760A (en) * 2016-08-08 2017-01-04 西北工业大学 A kind of algorithm of software quick playing musical instrument

Also Published As

Publication number Publication date
CN108337367A (en) 2018-07-27

Similar Documents

Publication Publication Date Title
CN108683927B (en) Anchor recommendation method and device and storage medium
KR20110064530A (en) Operation method of device based on a alteration ratio of touch area and apparatus using the same
CN109448761B (en) Method and device for playing songs
CN108763316B (en) Audio list management method and mobile terminal
CN108668024B (en) Voice processing method and terminal
CN108334272B (en) Control method and mobile terminal
CN110568926B (en) Sound signal processing method and terminal equipment
CN110097872B (en) Audio processing method and electronic equipment
CN108735194B (en) Beat prompting method and device
WO2021139535A1 (en) Method, apparatus and system for playing audio, and device and storage medium
CN109862430B (en) Multimedia playing method and terminal equipment
CN110796918A (en) Training method and device and mobile terminal
CN111933098A (en) Method and device for generating accompaniment music and computer readable storage medium
CN110958485A (en) Video playing method, electronic equipment and computer readable storage medium
CN107911777B (en) Processing method and device for return-to-ear function and mobile terminal
CN111048111A (en) Method, device and equipment for detecting rhythm point of audio frequency and readable storage medium
CN108337367B (en) Musical instrument playing method and device based on mobile terminal
CN111061407B (en) Video program operation control method, electronic device, and storage medium
CN111103983A (en) AR musical instrument playing method and electronic equipment
CN107945777B (en) Audio production method, mobile terminal and computer readable storage medium
CN110677770B (en) Sound production control method, electronic device, and medium
CN112118482A (en) Audio file playing method and device, terminal and storage medium
CN113515209A (en) Music screening method, device, equipment and medium
CN108418961B (en) Audio playing method and mobile terminal
CN111445929A (en) Voice information processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant