CN113035157B - Graphical music editing method, system and storage medium - Google Patents


Info

Publication number
CN113035157B
CN113035157B (application CN202110120091.2A)
Authority
CN
China
Prior art keywords
playing
notes
mouse
graphic
music
Prior art date
Legal status
Active
Application number
CN202110120091.2A
Other languages
Chinese (zh)
Other versions
CN113035157A (en)
Inventor
孙悦
李天驰
蔡欣嘉
Current Assignee
Shenzhen Dianmao Technology Co Ltd
Original Assignee
Shenzhen Dianmao Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Dianmao Technology Co Ltd
Priority to CN202110120091.2A
Publication of CN113035157A
Application granted
Publication of CN113035157B
Legal status: Active

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H1/00 Details of electrophonic musical instruments
    • G10H1/0008 Associated control or indicating means
    • G10H1/0025 Automatic or semi-automatic music composition, e.g. producing random music, applying rules from music theory or modifying a musical piece
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16 Sound input; Sound output
    • G06F3/165 Management of the audio stream, e.g. setting of volume, audio stream path
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2210/00 Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
    • G10H2210/101 Music composition or musical creation; Tools or processes therefor
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H2220/101 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
    • G10H2220/126 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters for graphical editing of individual notes, parts or phrases represented as variable length segments on a 2D or 3D representation, e.g. graphical edition of musical collage, remix files or pianoroll representations of MIDI-like files

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a graphical music editing method, system and storage medium. The method comprises the following steps: detecting a mouse event input by a user on a music canvas, and editing corresponding graphic notes on the music canvas according to the mouse event; converting the graphic notes into a corresponding data structure according to preset rules, where the data structure comprises position information, start-time information and end-time information of the graphic notes; and, when a playing instruction is received, playing the corresponding notes at each playing time point according to the data structure. In the embodiments of the invention, graphic notes are edited directly on the music canvas by detecting mouse events, and the edited graphic notes are converted into the corresponding data structure before being played. Because each graphic note carries its own start and end time, notes of different durations can be edited and played.

Description

Graphical music editing method, system and storage medium
Technical Field
The invention relates to the technical field of graphical programming, in particular to a graphical music editing method, a graphical music editing system and a storage medium.
Background
In a traditional visual music editor, notes are generally drawn with bitmap-based development technologies such as canvas. In existing note drawing, however, each beat of each note is an independent graphic, so it cannot be determined whether adjacent notes are continuous or independent. As a result, all notes can only be played with a fixed time value, and sustained or legato effects can be neither drawn nor presented, which limits the editing output of a graphical music editor.
Accordingly, the prior art is still in need of improvement and development.
Disclosure of Invention
In view of the above-mentioned shortcomings of the prior art, the present invention aims to provide a graphical music editing method, system and storage medium, so as to solve the problem that graphical music editing in the prior art cannot edit notes of different durations.
In order to achieve the above purpose, the invention adopts the following technical scheme:
a graphical music editing method comprising the steps of:
detecting a mouse event input by a user in a music canvas, and editing corresponding graphic notes on the music canvas according to the mouse event;
converting the graphic notes into corresponding data structures according to preset rules, wherein the data structures comprise position information, starting time information and ending time information of the graphic notes;
and, when a playing instruction is received, playing the corresponding notes at each playing time point according to the data structure.
In the graphical music editing method, before the step of detecting a mouse event input by a user in a music canvas and editing a corresponding graphical note on the music canvas according to the mouse event, the graphical music editing method further comprises:
and building a music canvas for editing notes based on the vector graphic library, wherein an editing area of the music canvas is in a grid structure.
In the graphical music editing method, the step of detecting a mouse event input by a user in a music canvas and editing a corresponding graphical note on the music canvas according to the mouse event comprises the following steps:
detecting a mouse pressing event and a mouse moving event input by a user;
respectively triggering a drawing function or a frame selection function according to the direction information in the mouse moving event;
and detecting a mouse release event input by the user, and drawing a graphic note of corresponding length or box-selecting the graphic notes in the corresponding area according to the mouse press position and the mouse release position.
In the graphical music editing method, the step of triggering a drawing function or a frame selection function according to the direction information in the mouse movement event comprises the following steps:
when the direction information in the mouse moving event is transverse, triggering a drawing function;
and triggering a box selection function when the direction information in the mouse movement event is longitudinal.
In the graphical music editing method, the step of converting the graphical notes into corresponding data structures according to preset rules, wherein the data structures comprise position information, starting time information and ending time information of the graphical notes comprises the following steps:
acquiring position information, starting time information and ending time information of the graphic notes;
storing the position information, the starting time information and the ending time information of the graphic notes into an array of playing data;
and storing the array of playing data together with its playing time point into the array of time data, the array of time data growing in time order to form the data structure.
In the graphical music editing method, when receiving a playing instruction, the step of playing the corresponding notes at each playing time point according to the data structure includes:
when a playing instruction is received, acquiring playing starting points, time values and pitch of all graphic notes according to the data structure;
and playing corresponding notes at each playing time point according to the playing starting points, the time values and the pitches of all the graphic notes.
In the graphical music editing method, before the step of playing the corresponding notes at each playing time point according to the playing start points, the time values and the pitch of all the graphical notes, the graphical music editing method further comprises:
detecting a positioning time point of the current positioning line;
comparing the positioning time point with the playing starting points of all the graphic notes, and judging whether the playing starting points are earlier than the positioning time point;
if yes, starting playing from the positioning time point; otherwise, starting playing from the earliest playing start point in all the playing start points.
In the graphical music editing method, after the step of playing the corresponding notes at each playing time point according to the data structure when the playing instruction is received, the method further includes:
the graphic note in the play state is switched to the highlight state.
Another embodiment of the present invention also provides a graphical music editing system, including: a processor, a memory, and a communication bus;
the memory has stored thereon a computer readable program executable by the processor;
the communication bus realizes connection communication between the processor and the memory;
the processor, when executing the computer readable program, implements the steps in the graphical music editing method as described above.
Another embodiment of the present invention also provides a computer-readable storage medium storing one or more programs executable by one or more processors to implement the steps in the graphical music editing method as described above.
Compared with the prior art, in the graphical music editing method, system and storage medium provided by the invention, the method edits graphic notes directly on the music canvas by detecting mouse events, and converts the edited graphic notes into a corresponding data structure before playing them. Because each graphic note carries its own start and end time, notes of different durations can be edited and played.
Drawings
FIG. 1 is a flowchart of a graphical music editing method according to a preferred embodiment of the present invention;
FIG. 2 is a flowchart of step S10 in a preferred embodiment of the graphical music editing method according to the present invention;
FIG. 3 is a flowchart of detecting a mouse event to edit a graphic note in an application embodiment of the graphic music editing method provided by the present invention;
FIG. 4 is a flowchart of step S20 in a preferred embodiment of the graphical music editing method according to the present invention;
FIG. 5 is a flowchart of step S30 in a preferred embodiment of the graphical music editing method provided by the present invention;
FIG. 6 is a flowchart of steps S33, S34 and S35 in a preferred embodiment of the graphical music editing method according to the present invention;
FIG. 7 is an interface diagram of a portion of a music canvas and graphic notes in an exemplary embodiment of a graphical music editing method provided by the present invention;
FIG. 8 is a schematic diagram of a hardware configuration of a graphical music editing system according to a preferred embodiment of the present invention;
FIG. 9 is a functional block diagram of a system for installing a graphical music editing program according to a preferred embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and effects of the present invention clearer and more specific, the present invention will be described in further detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, the graphical music editing method provided by the invention comprises the following steps:
s10, detecting a mouse event input by a user in a music canvas, and editing a corresponding graphic note on the music canvas according to the mouse event.
In this embodiment, graphic notes are edited by detecting mouse events input by the user on the music canvas. Editing operations such as drawing, deleting and copying are all driven by mouse events, so note editing is intuitive and flexible: different mouse inputs yield notes as corresponding vector graphics, graphical note editing is not limited to fixed shapes, and continuous and independent notes are effectively distinguished.
Specifically, the user may edit graphic notes on an existing music canvas to re-edit music, or edit notes on a newly created blank music canvas. In an optional embodiment, step S10 is therefore preceded by a step of building a music canvas for editing notes, specifically a music canvas built on the vector graphics library paper.js.
Referring specifically to fig. 2, a flowchart of step S10 in the graphical music editing method provided by the present invention is shown in fig. 2, where the step S10 includes:
s11, detecting a mouse pressing event and a mouse moving event which are input by a user;
s12, respectively triggering a drawing function or a frame selection function according to the direction information in the mouse moving event;
s13, detecting a mouse release event input by a user, and drawing graphic notes of corresponding length or framing the graphic notes in the corresponding area according to the mouse pressing position and the mouse release position.
In this embodiment, the user performs note editing operations on the music canvas with a mouse. From bottom to top, the music canvas consists of a grid layer, a drawing layer and a playing-line layer. The bottom grid layer divides the editable drawing area into rectangular cells, drawn one by one with alternating lines using paper.js. The second layer is the drawing layer, which holds the rectangular graphic notes: paper.js draws a rectangular structure as a note, and the rectangle's edges are matched to the grid by position so that the corresponding graphic note is obtained; a run of connected rectangles represents a sustained note, and a single rectangle represents a monosyllabic note. The third layer is the playing-line layer, on which the positioning line moves while notes are played.
The user therefore draws graphic notes with the mouse on the drawing layer. First, a mouse-down event and the subsequent mouse-move events input by the user are detected, and either the drawing function or the box-selection function is triggered according to the current movement direction: drawing when the movement is horizontal, box selection when it is vertical. Concretely, the direction can be distinguished by comparing the ordinate at mouse-down with the ordinate after movement; when the difference between them is smaller than a preset threshold, the movement is judged horizontal, otherwise vertical.
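The direction test described above can be sketched as a small helper. This is an illustrative sketch only; the names and the threshold value are assumptions, not taken from the patent:

```typescript
// Hypothetical sketch of the direction test: a drag counts as horizontal
// (drawing) while its vertical displacement stays below a threshold,
// and as vertical (box selection) otherwise.
type DragMode = "draw" | "boxSelect";

function classifyDrag(
  downY: number,     // ordinate at mouse-down
  moveY: number,     // ordinate after movement
  threshold: number, // preset threshold, e.g. half a grid row in pixels
): DragMode {
  return Math.abs(moveY - downY) < threshold ? "draw" : "boxSelect";
}
```

A small vertical displacement during a drag would thus trigger drawing, while a larger one would trigger box selection.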
After the corresponding function is triggered, the mouse-release event input by the user is detected, and a graphic note of corresponding length is drawn, or the graphic notes in the corresponding area are box-selected, by comparing the mouse-down position with the release position. By controlling where the mouse is pressed and released and in which direction it moves, the user can draw diverse graphic notes or select the notes in different areas. For example, a monosyllabic note is drawn by clicking a single cell, while a sustained note is drawn by clicking a cell and then dragging horizontally across several cells. This breaks the fixed-graphic limitation of existing visual music editing: graphic notes of any length can be drawn as needed to realise long-tone and legato effects, improving the flexibility and output quality of graphical music editing.
In an optional application embodiment, as shown in fig. 3, which is a flowchart of detecting a mouse event to edit a graphic note, the mouse-event-driven drawing and box-selection scheme comprises the following steps:
s101, pressing a mouse;
s102, moving a mouse;
s103, judging the movement direction of the mouse, if the movement direction is horizontal, executing the step S104, and if the movement direction is vertical, executing the step S107;
s104, dynamically generating a note rectangle from the mouse pressing coordinate to the current coordinate;
s105, loosening a mouse;
s106, attaching the current note to the grid;
s107, dynamically generating a dotted rectangle from the mouse pressing coordinate to the current coordinate;
s108, loosening a mouse;
s109, all notes in the dotted line area are boxed.
Specifically, in this optional embodiment, a mouse-down event and mouse-move events are detected first, and the movement direction is then determined. When the mouse moves horizontally, drawing mode is entered: a note rectangle from the mouse-down coordinate to the current coordinate is generated dynamically as the mouse moves, i.e. the rectangle grows with the movement until the mouse is released, at which point the rectangle is snapped to the grid to obtain the final drawn graphic note. In particular, when the mouse is released in drawing mode, the current row is further searched for other notes that overlap the newly drawn note; any overlapping notes are deleted and only the newly drawn note is kept, which guarantees the accuracy of note drawing and avoids data errors such as overlapping notes. When the mouse moves vertically, box-selection mode is entered: a dashed selection rectangle from the mouse-down coordinate to the current coordinate is generated dynamically until the mouse is released, at which point all notes inside the dashed area are marked as selected. Selected notes can then be copied, pasted, deleted, dragged, moved or zoomed according to subsequent mouse and keyboard input, so drawn graphic notes can be edited and modified.
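The release step in drawing mode described above, snapping the rectangle to whole grid cells and discarding overlapped notes in the same row, might look roughly like this. The `Note` shape and the cell-width parameter are hypothetical stand-ins for the patent's rectangles, not its actual implementation:

```typescript
// Hypothetical sketch: snap a dragged rectangle to whole grid cells,
// then keep only non-overlapping notes in the same row (the new note wins).
interface Note { row: number; start: number; end: number } // columns, end exclusive

function snapToGrid(x0: number, x1: number, cellWidth: number): [number, number] {
  const startCol = Math.floor(Math.min(x0, x1) / cellWidth);
  const endCol = Math.floor(Math.max(x0, x1) / cellWidth) + 1; // at least one cell
  return [startCol, endCol];
}

function commitNote(notes: Note[], fresh: Note): Note[] {
  // Drop any note in the same row whose column span overlaps the new note.
  const kept = notes.filter(
    (n) => n.row !== fresh.row || n.end <= fresh.start || n.start >= fresh.end,
  );
  return [...kept, fresh];
}
```

Clicking a single cell yields a one-cell note, while a horizontal drag across several cells yields a longer, sustained note.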
S20, converting the graphic notes into corresponding data structures according to preset rules, wherein the data structures comprise position information, starting time information and ending time information of the graphic notes.
In this embodiment, after the corresponding graphic notes are drawn, they are converted into a corresponding data structure according to preset rules in order to achieve accurate, continuous playback. The conversion yields precise playing information for each graphic note and defines the note data to be processed at each time point, which reduces the stutter caused by heavy detection and computation during playback and thus produces accurate, smooth note playback.
Referring to fig. 4, a flowchart of step S20 in the graphical music editing method provided by the present invention is shown. As shown in fig. 4, the step S20 includes:
s21, acquiring position information, starting time information and ending time information of the graphic notes;
s22, storing the position information, the starting time information and the ending time information of the graphic notes into an array of playing data;
s23, storing the array of the playing data and the playing time point into the array of the time data, and forming the data structure by a rule that the array of the time data is increased according to time.
In this embodiment, the position information, start-time information and end-time information of the graphic notes are obtained first. On the music canvas the horizontal axis of the editing area is the time axis and the vertical axis is the pitch axis: each row corresponds to a pitch and each column to a beat, so the position, start time and end time of every note the user has drawn can be read from the note's position and length. These pieces of information are stored in an array of playing data, which records that a given note, identified by its row and its index within the row, starts or stops playing at a given time point. The playing-data array, together with its playing time point, is then stored in an array of time data, which records which notes start or stop at each time point. Because the time-data array grows in time order according to the positions and lengths of the notes the user has drawn, the data structure is obtained by this conversion and enables subsequent precise control of note playback.
And S30, when a playing instruction is received, playing corresponding notes at each playing time point according to the data structure.
In this embodiment, after data conversion has produced the corresponding data structure, the corresponding notes are played at each playing time point according to that structure when the user inputs a playing instruction. Because the data structure contains the exact position, start time and end time of every graphic note, the playing state of each note no longer has to be confirmed in the conventional way of testing whether the positioning line intersects the note during playback, which improves playing accuracy. Moreover, since a sustained note's start and end times are recognised once it has been drawn, a legato effect is obtained during playback, perfecting the playback output of graphical music editing.
Referring to fig. 5, a flowchart of step S30 in the graphical music editing method provided by the present invention is shown. As shown in fig. 5, the step S30 includes:
s31, when a playing instruction is received, acquiring playing starting points, time values and pitch of all graphic notes according to the data structure;
s32, playing corresponding notes at each playing time point according to the playing starting points, the time values and the pitch of all the graphic notes.
In this embodiment, when playback begins, the playing start point, time value and pitch of every graphic note are obtained from the data structure: the pitch from the row in which the note lies, the playing start point from the time point of the note's leftmost cell, and the time value from the number of cells the note occupies (one cell corresponds to one beat). After playback starts, notes of the corresponding pitch and time value are played according to each note's start or stop state at each playing time point, realising the preview effect of graphical music editing.
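The mapping just described, row to pitch, leftmost cell to start point, cell count to time value, can be sketched as follows. The pitch table and the tempo parameter are illustrative assumptions, not values from the patent:

```typescript
// Hypothetical sketch: derive playback parameters from a drawn note.
interface GridNote { row: number; startCol: number; cols: number }

function toPlayback(
  note: GridNote,
  pitches: string[],      // one pitch name per row, e.g. ["C5", "B4", "A4"]
  secondsPerBeat: number, // one grid cell corresponds to one beat
) {
  return {
    pitch: pitches[note.row],                  // the row selects the pitch
    startTime: note.startCol * secondsPerBeat, // leftmost cell gives the start
    duration: note.cols * secondsPerBeat,      // occupied cells give the time value
  };
}
```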
Further, referring to fig. 6, before step S32 the method further comprises:
s33, detecting a positioning time point where the current positioning line is located;
s34, comparing the positioning time point with the playing starting points of all the graphic notes, and judging whether the playing starting points are earlier than the positioning time point;
s35, if yes, starting playing from the positioning time point; otherwise, starting playing from the earliest playing start point in all the playing start points.
In this embodiment, the playing-line layer of the music canvas carries a positioning line that marks the playing time point and moves to the corresponding time point as notes play. Before playback, the position of the current positioning line is therefore detected and the playback start point is adjusted accordingly. When some playing start point is earlier than the positioning time point, the positioning line is not before all the graphic notes; for example, playback was paused after a note finished, or the user moved the positioning line to a chosen note for preview. In that case playback starts from the positioning time point, improving music preview efficiency. When no playing start point is earlier than the positioning time point, the notes are in the default play-from-the-beginning state, and playback starts from the earliest playing start point. Different playback preview needs are thus satisfied.
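The start-point rule above reduces to a small decision. A minimal sketch with a hypothetical function name:

```typescript
// Hypothetical sketch: resume from the positioning line when at least one
// note begins before it; otherwise start from the earliest note onset.
function playbackStart(locatorTime: number, noteStarts: number[]): number {
  const anyEarlier = noteStarts.some((t) => t < locatorTime);
  return anyEarlier ? locatorTime : Math.min(...noteStarts);
}
```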
Further, the step S30 further includes:
the graphic note in the play state is switched to the highlight state.
In this embodiment, the note being played is switched to a highlighted display state so that the user can clearly see which note is currently playing. Specifically, the positioning line moves along with the music during playback; a note that intersects the positioning line is the note being played, and that note is switched to the highlighted state, giving a dynamic display of the correspondence between the positioning line and the notes.
In order to better understand the data conversion and playing process in the graphical music editing method provided by the present invention, the process of converting the data structure and playing the notes is described in detail below with reference to fig. 7, which shows a specific application embodiment.
FIG. 7 is an interface diagram of part of a music canvas and its graphic notes in an application embodiment. As shown in fig. 7, three continuous notes are drawn: the first note is the first of the first row, with start time point 1, end time point 4, and pitch high do; the second note is the first of the second row, with start time point 4, end time point 6, and pitch high si; the third note is the first of the third row, with start time point 2, end time point 4, and pitch high la. Of course, in a practical implementation different graphic notes can be given different colors, for example a preset color per pitch, so that notes of different pitches can be distinguished more intuitively while drawing. Before the music is played, the Paper.js rectangle vectors are converted into a time-driven data structure according to the following preset rule:
interface PlayData {
    type: 'start' | 'end';
    row: number;
    index: number;
}
interface TimeData {
    time: number;
    data: PlayData[];
}
timeDataArr: TimeData[] = []
Here timeDataArr is an array ordered by increasing time. During data conversion, the notes drawn by the user are converted into TimeData entries, each representing that a number of notes start or stop playing at a given time; these events are stored in an array of PlayData, which records whether a note starts or stops at that time point and the position of the note within its row. The graphic notes in fig. 7 are therefore converted into the following data structure according to the above rule:
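Under the rule just described, the conversion can be sketched as follows. This is a hedged illustration, not the patent's actual code: the DrawnNote input type and the toTimeData function name are assumptions made for the example, while PlayData and TimeData mirror the interfaces above.

```typescript
interface PlayData { type: 'start' | 'end'; row: number; index: number; }
interface TimeData { time: number; data: PlayData[]; }

// Assumed input shape: one record per note rectangle drawn on the canvas.
interface DrawnNote { row: number; index: number; start: number; end: number; }

function toTimeData(notes: DrawnNote[]): TimeData[] {
    const byTime = new Map<number, PlayData[]>();
    const push = (time: number, event: PlayData) => {
        if (!byTime.has(time)) byTime.set(time, []);
        byTime.get(time)!.push(event);
    };
    // Each note contributes a 'start' event at its start time point
    // and an 'end' event at its end time point.
    for (const n of notes) {
        push(n.start, { type: 'start', row: n.row, index: n.index });
        push(n.end, { type: 'end', row: n.row, index: n.index });
    }
    // Events sharing a time point are merged into one TimeData entry,
    // and the array is kept ordered by increasing time.
    return Array.from(byTime.entries())
        .sort(([a], [b]) => a - b)
        .map(([time, data]) => ({ time, data }));
}
```

Applied to the three notes of fig. 7, this yields the four-entry structure shown below, with the entry at time 4 carrying two 'end' events and one 'start' event.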
[
    {time: 1, data: [{type: 'start', row: 1, index: 1}]},
    {time: 2, data: [{type: 'start', row: 3, index: 1}]},
    {time: 4, data: [
        {type: 'end', row: 1, index: 1},
        {type: 'end', row: 3, index: 1},
        {type: 'start', row: 2, index: 1}
    ]},
    {time: 6, data: [{type: 'end', row: 2, index: 1}]}
]
The above data structure means that when playback starts, at time point 1 the 1st note of row 1 enters the start state, so the high do note is played and switched to the highlight state;
when time point 2 is entered, the 1st note of row 3 enters the start state, so the high la note is played and highlighted;
when time point 3 is entered, no note changes at this time point, and the playing state is unchanged;
when time point 4 is entered, the 1st note of row 1 and the 1st note of row 3 enter the stop state, meaning they stop after this beat is played, while the 1st note of row 2 enters the start state; the high do, high la and high si therefore sound simultaneously during this beat, and the high si note is switched to the highlight state;
when time point 5 is entered, the 1st note of row 1 and the 1st note of row 3 stop playing, and their highlight states are cancelled;
when time point 6 is entered, the 1st note of row 2 enters the stop state, so the high si stops after this beat is played; the canvas then resets to the default state, all highlight states are cancelled, the positioning line returns to its default starting point, and playback is complete.
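The walk-through above can be sketched as a per-beat dispatch over the converted structure. This is an illustrative sketch only: tick, playNote and stopNote are hypothetical names, not functions from the patent.

```typescript
interface PlayData { type: 'start' | 'end'; row: number; index: number; }
interface TimeData { time: number; data: PlayData[]; }

// Called once per time point as the positioning line advances.
function tick(timeDataArr: TimeData[], now: number,
              playNote: (row: number, index: number) => void,
              stopNote: (row: number, index: number) => void): void {
    // Find the entry for the current time point, if any; at time points
    // with no entry (e.g. time 3 in fig. 7) the playing state is unchanged.
    const entry = timeDataArr.find(t => t.time === now);
    if (!entry) return;
    for (const e of entry.data) {
        if (e.type === 'start') playNote(e.row, e.index); // also highlight it
        else stopNote(e.row, e.index);                    // also un-highlight it
    }
}
```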
It should be noted that the steps need not follow a fixed order; those skilled in the art will understand that in different embodiments the steps may be performed in different execution orders, in parallel, interchangeably, and so on.
As shown in fig. 8, based on the above graphical music editing method, the present invention further provides a graphical music editing system, which may be a computing device such as a mobile terminal, desktop computer, notebook, palmtop computer or server, and which includes a processor 10, a memory 20 and a display 30. Fig. 8 shows only some of the components of the graphical music editing system; it should be understood that not all of the illustrated components are required, and that more or fewer components may be implemented instead.
The memory 20 may in some embodiments be an internal storage unit of the graphical music editing system, such as a hard disk or memory of the system. In other embodiments the memory 20 may be an external storage device of the graphical music editing system, such as a plug-in hard disk, Smart Media Card (SMC), Secure Digital (SD) card or Flash Card provided on the system. Further, the memory 20 may include both an internal storage unit and an external storage device. The memory 20 is used to store application software installed in the graphical music editing system and various data, such as the program code of the graphical music editing program. In one embodiment, the memory 20 stores a graphical music editing program 40, and the graphical music editing program 40 is executable by the processor 10 to implement the graphical music editing methods of the embodiments of the present application.
The processor 10 may in some embodiments be a central processing unit (Central Processing Unit, CPU), a microprocessor or another data processing chip, used to run the program code or process the data stored in the memory 20, for example to perform the graphical music editing method.
The display 30 may in some embodiments be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch display, or the like. The display 30 is used to display information in the graphical music editing system and to present a visual user interface. The components 10-30 of the graphical music editing system communicate with each other via a system bus. In one embodiment, the steps of the graphical music editing method described above are implemented when the processor 10 executes the graphical music editing program 40 in the memory 20.
Please refer to fig. 9, which is a functional block diagram of a system on which a graphical music editing program is installed, according to a preferred embodiment of the present invention. In this embodiment, the system may be divided into one or more modules that are stored in the memory 20 and executed by one or more processors (the processor 10 in this embodiment) to carry out the present invention. For example, in fig. 9 the system is divided into a mouse detection module 21, a data conversion module 22 and a note playing module 23, which are connected in sequence.
The mouse detection module 21 is used for detecting a mouse event input by a user in a music canvas, and editing corresponding graphic notes on the music canvas according to the mouse event;
the data conversion module 22 is configured to convert the graphic notes into corresponding data structures according to a preset rule, where the data structures include position information, start time information, and end time information of the graphic notes;
the note playing module 23 is configured to play a corresponding note at each playing time point according to the data structure when receiving a playing instruction.
A module here refers to a series of computer program instruction segments capable of performing a specific function, and is better suited than a whole program to describing the execution of the graphical music editing program in the graphical music editing system. For the specific functions of the modules 21-23, reference is made to the embodiments of the method described above.
In summary, in the graphical music editing method, system and storage medium provided by the present invention, the graphical music editing method includes: building a music canvas for editing notes; detecting a mouse event input by a user in the music canvas, and editing corresponding graphic notes on the music canvas according to the mouse event; converting the graphic notes into corresponding data structures according to preset rules, the data structures comprising position information, start time information and end time information of the graphic notes; and, when a playing instruction is received, playing the corresponding notes at each playing time point according to the data structure. In the embodiments of the invention, graphic notes are edited directly on the music canvas by detecting mouse events, and the edited graphic notes are converted into the corresponding data structure before being played; since each graphic note carries its own start time and end time, notes of different durations can be edited and played.
The embodiments described above are merely illustrative. Units illustrated as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
From the above description of embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by software plus a general-purpose hardware platform, or by hardware alone. Based on this understanding, the foregoing technical solutions may be embodied essentially in the form of a software product stored in a computer-readable storage medium, such as a ROM/RAM, magnetic disk or optical disk, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform the methods of the various embodiments, or of some parts of the embodiments.
Conditional language such as "can," "could," "might," or "may," unless specifically stated otherwise or otherwise understood within the context as used, is generally intended to convey that particular embodiments can include (while other embodiments do not include) particular features, elements and/or operations. Thus, such conditional language is not generally intended to imply that features, elements and/or operations are in any way required for one or more embodiments, or that one or more embodiments must include logic for deciding, with or without input or prompting, whether these features, elements and/or operations are included or are to be performed in any particular embodiment.
What has been described in this specification and the drawings includes examples that can provide a graphical music editing method, system, and storage medium. It is, of course, not possible to describe every conceivable combination of components and/or methodologies for purposes of describing the various features of the present disclosure, but it may be appreciated that many further combinations and permutations of the disclosed features are possible. It is therefore evident that various modifications may be made to the disclosure without departing from its scope or spirit. Further or alternative embodiments of the disclosure may be apparent from consideration of the specification and drawings, and from practice of the disclosure as presented herein. The examples set forth in this specification and drawings are to be considered in all respects illustrative and not restrictive. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims (3)

1. A graphical music editing method, comprising the steps of:
detecting a mouse event input by a user in a music canvas, and editing corresponding graphic notes on the music canvas according to the mouse event;
converting the graphic notes into corresponding data structures according to preset rules, wherein the data structures comprise position information, starting time information and ending time information of the graphic notes;
when receiving a playing instruction, playing a corresponding note at each playing time point according to the data structure;
the method comprises the steps of detecting a mouse event input by a user in a music canvas, and editing corresponding graphic notes on the music canvas according to the mouse event, and further comprises:
building a music canvas for editing notes based on a vector graphic library, wherein an editing area of the music canvas is of a grid structure, and the music canvas is sequentially provided with a grid layer, a drawing layer and a playing line layer from bottom to top;
the step of detecting a mouse event input by a user in a music canvas and editing a corresponding graphic note on the music canvas according to the mouse event comprises the following steps:
detecting a mouse pressing event and a mouse moving event input by a user;
respectively triggering a drawing function or a frame selection function according to the direction information in the mouse moving event;
detecting a mouse release event input by the user, and drawing a graphic note of corresponding length, or frame-selecting graphic notes in a corresponding area, according to the mouse pressing position and the mouse release position;
the step of triggering a drawing function or a frame selection function according to the direction information in the mouse moving event comprises the following steps:
when the direction information in the mouse moving event is transverse, triggering a drawing function;
when the direction information in the mouse moving event is longitudinal, triggering a frame selection function;
when the mouse moves transversely, a drawing mode is entered; in the drawing mode, a note rectangle from the mouse pressing coordinates to the current coordinates is dynamically generated as the mouse moves, until the mouse is released, at which point the current note rectangle is snapped to the grid to obtain the final drawn graphic note; additionally, when the mouse is released in the drawing mode, the current row is searched for other notes overlapping the newly drawn graphic note, and if any exist, the overlapping notes are deleted and only the newly drawn graphic note is kept; when the mouse moves longitudinally, a frame selection mode is entered; in the frame selection mode, a dashed frame-selection rectangle from the mouse pressing coordinates to the current coordinates is dynamically generated as the mouse moves, until the mouse is released, at which point all notes within the dashed frame-selection rectangle are marked as selected, and the selected notes can then be edited according to subsequent mouse and keyboard events;
the step of converting the graphic notes into corresponding data structures according to preset rules, wherein the data structures comprise position information, starting time information and ending time information of the graphic notes, and the step comprises the following steps:
acquiring position information, starting time information and ending time information of the graphic notes;
storing the position information, the starting time information and the ending time information of the graphic notes into an array of playing data;
storing the array of the playing data and the playing time point into the array of the time data, and forming the data structure by the rule that the array of the time data is increased according to time;
when receiving the playing instruction, the step of playing the corresponding notes at each playing time point according to the data structure comprises the following steps:
when a playing instruction is received, acquiring playing starting points, time values and pitch of all graphic notes according to the data structure;
playing corresponding notes at each playing time point according to the playing starting points, the time values and the pitch of all the graphic notes;
before the step of playing the corresponding notes at each playing time point according to the playing starting points, the time values and the pitch of all the graphic notes, the method further comprises the following steps:
detecting a positioning time point of the current positioning line;
comparing the positioning time point with the playing starting points of all the graphic notes, and judging whether the playing starting points are earlier than the positioning time point;
if yes, starting playing from the positioning time point; otherwise, starting playing from the earliest playing start point in all the playing start points;
when receiving the playing instruction, after the step of playing the corresponding notes at each playing time point according to the data structure, the method further comprises:
the graphic note in the play state is switched to the highlight state.
2. A graphical music editing system comprising: a processor, a memory, and a communication bus;
the memory has stored thereon a computer readable program executable by the processor;
the communication bus realizes connection communication between the processor and the memory;
the processor, when executing the computer readable program, implements the steps in the graphical music editing method of claim 1.
3. A computer-readable storage medium storing one or more programs executable by one or more processors to implement the steps in the graphical music editing method of claim 1.
CN202110120091.2A 2021-01-28 2021-01-28 Graphical music editing method, system and storage medium Active CN113035157B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110120091.2A CN113035157B (en) 2021-01-28 2021-01-28 Graphical music editing method, system and storage medium


Publications (2)

Publication Number Publication Date
CN113035157A CN113035157A (en) 2021-06-25
CN113035157B true CN113035157B (en) 2024-04-16

Family

ID=76459388


Country Status (1)

Country Link
CN (1) CN113035157B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003271164A (en) * 2002-03-19 2003-09-25 Yamaha Music Foundation Musical sound generating method, musical sound generating program, storage medium, and musical sound generating device
JP2010091744A (en) * 2008-10-07 2010-04-22 Kawai Musical Instr Mfg Co Ltd Musical symbol input device and musical symbol input program
JP2012083564A (en) * 2010-10-12 2012-04-26 Yamaha Corp Music editing device and program
US9443501B1 (en) * 2015-05-13 2016-09-13 Apple Inc. Method and system of note selection and manipulation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6988343B2 (en) * 2017-09-29 2022-01-05 ヤマハ株式会社 Singing voice editing support method and singing voice editing support device




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant