GB2603485A - Melody concretization identification system - Google Patents

Melody concretization identification system

Info

Publication number
GB2603485A
GB2603485A (application GB2101538.3A)
Authority
GB
United Kingdom
Prior art keywords
concretization
command
music
audio signal
calculation module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2101538.3A
Other versions
GB202101538D0 (en)
Inventor
Lin Pei-Chun
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to GB2101538.3A priority Critical patent/GB2603485A/en
Publication of GB202101538D0 publication Critical patent/GB202101538D0/en
Publication of GB2603485A publication Critical patent/GB2603485A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G10 — MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H — ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
        • G10H1/00 — Details of electrophonic musical instruments
            • G10H1/0008 — Associated control or indicating means
            • G10H1/36 — Accompaniment arrangements
        • G10H2210/00 — Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
            • G10H2210/031 — Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
        • G10H2220/00 — Input/output interfacing specifically adapted for electrophonic musical tools or instruments
            • G10H2220/005 — Non-interactive screen display of musical or status data
            • G10H2220/021 — Indicator, i.e. non-screen output user interfacing, e.g. visual or tactile instrument status or guidance information using lights, LEDs, seven-segment displays
        • G10H2240/00 — Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
            • G10H2240/075 — Musical metadata derived from musical analysis or for use in electrophonic musical instruments
            • G10H2240/085 — Mood, i.e. generation, detection or selection of a particular emotional content or atmosphere in a musical piece
            • G10H2240/121 — Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
            • G10H2240/131 — Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
            • G10H2240/141 — Library retrieval matching, i.e. any of the steps of matching an inputted segment or phrase with musical database contents, e.g. query by humming, singing or playing; the steps may include, e.g. musical analysis of the input, musical feature extraction, query formulation, or details of the retrieval process

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Auxiliary Devices For Music (AREA)

Abstract

A melody concretization identification system includes a database and a calculation module. The database stores a plurality of audio signal data 11, such as music, and a concretization command 12 corresponding to each audio signal data. The audio signal data includes time 112 and amplitude 111 parameters, which are integrated into a coordinate diagram 13; the concretization commands are reflected on the coordinate diagram. The calculation module is coupled with the database and a target device such as a display. It receives and analyses the audio signal data to produce a plurality of period information, each including a frequency, an amplitude and a time. The calculation module compares each period information with each audio signal data to acquire the concretization command matching each period information, and outputs the acquired command to the target device, so that the target device generates a visual performance, such as a digital facial expression on a robot, corresponding to the audio. The invention analyses the audio in real time, thereby integrating visual performance effects with the audio on the target device.

Description

MELODY CONCRETIZATION IDENTIFICATION
SYSTEM
BACKGROUND OF THE INVENTION
1. Field of the Invention:
The present invention relates to processing systems, and more particularly, to a melody concretization identification system which integrates music and visual effects.
2. Description of the Related Art:
Current entertainment robots feature interaction with people. Such robots are set to carry out different motions according to preset music, thereby entertaining the user.
Conventional entertainment robots are usually provided, preset in their chips, with different motions corresponding to different pieces of music. However, the user must first press an activation button on the robot so that it plays the music and carries out the corresponding motion. The robot therefore repeats the same music and motions without variation, and motion variations are produced only through a passive control method.
To improve on the aforementioned shortcomings, Taiwan patent 1598871 discloses a robot which dances according to different music tempos, carrying out various motions based on the tempo. However, the music and the corresponding motion variations still have to be stored in the robot in advance, and when the robot is to perform, the user still needs to press a button on the robot to trigger the motion variations.
In other words, a robot according to the patent above still has to be provided with the music and corresponding motions first, and fails to produce motion variations on receiving external music, lacking a performance variation control mechanism.
SUMMARY OF THE INVENTION
To address the issues above, a melody concretization identification system is disclosed, which is able to receive music and carry out analysis at the same time, integrating the music variations with visual effects that are immediately presented on the target device, thereby increasing the variation capability of the present invention.
For achieving the aforementioned objectives, a melody concretization identification system in accordance with an embodiment of the present invention is provided, comprising: a database, storing a plurality of audio signal data and a concretization command corresponding to each audio signal data, wherein the audio signal data comprises an amplitude parameter and a time parameter, each audio signal data is a value of the time parameter corresponding to the amplitude parameter, the amplitude parameter and the time parameter of the audio signal data are integrated into a coordinate diagram, and the concretization command is reflected on the coordinate diagram according to the amplitude parameter matching a time unit of the time parameter; and a calculation module coupled with the database and a target device, the calculation module receiving and analyzing a music to produce a plurality of period information comprising frequency, amplitude, and time, the calculation module reflecting each period information on the coordinate diagram to be matched with each audio signal data to acquire the concretization command matching each period information, the calculation module outputting the concretization command to the target device, whereby the target device generates visual performance corresponding to the music.
With such configuration, the calculation module of the present invention simultaneously analyzes the music upon receiving it, and immediately matches the analyzed period information with the audio signal data in the database, so as to generate a concretization command corresponding to the music.
Therefore, the present invention is able to integrate the music and visual effect at the same time when playing the music, so as to immediately control the device to present visualization performance without the necessity of establishing the music and motions for performance in the robot in advance. Thus, the present invention realizes a variety of visual effect variations through an active controlling method.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a schematic view illustrating the processing analysis of the calculation module in accordance with an embodiment of the present invention.
Fig. 2 is a structural block diagram of the present invention.
Fig. 3 is a schematic view illustrating the coordinate diagram of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
The aforementioned and further advantages and features of the present invention will be understood by reference to the description of the preferred embodiment in conjunction with the accompanying drawings, in which the components are illustrated at proportions chosen for explanation rather than the actual component proportions.
Referring to Fig. 1 to Fig. 3, a melody concretization identification system 100 is disclosed, comprising a database 10 and a calculation module 20.
The database 10 stores a plurality of audio signal data 11 and a concretization command 12 corresponding to each audio signal data 11. The audio signal data 11 comprises an amplitude parameter 111 and a time parameter 112. The amplitude parameter 111 and the time parameter 112 of the audio signal data 11 are integrated into a coordinate diagram 13, and the concretization command 12 is reflected on the coordinate diagram 13. More particularly, referring to Fig. 3, the transverse axis 131 of the coordinate diagram 13 is formed of the time parameter 112 of each audio signal data 11, and the longitudinal axis 132 is formed of the amplitude parameter 111 of each audio signal data 11. In other words, each audio signal data 11 is a value of the time parameter 112 corresponding to the amplitude parameter 111. Therein, the time parameter 112 may be 0.1 seconds, 0.5 seconds, 1 second, or another time unit, and the concretization command 12 is reflected in the coordinate diagram 13 in accordance with a matched amplitude. Further, referring to Fig. 3, the concretization command 12 is able to be applied for controlling the displayed light colors and patterns. When the concretization command 12 is applied for controlling the displayed lights, the concretization command 12 is a hexadecimal code, and each concretization command 12 varies over time, i.e. corresponding to the transverse axis 131 of the coordinate diagram 13. When the concretization command 12 is applied for controlling the displayed patterns, each concretization command 12 varies over both time and amplitude, i.e. corresponding to the transverse axis 131 and the longitudinal axis 132 of the coordinate diagram 13.
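The coordinate diagram described above can be sketched as a simple lookup keyed by (time unit, amplitude). The class name, the stored command values, and the bucketing scheme below are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch of the database's coordinate diagram: each audio
# signal data point is a (time unit, amplitude) cell, and the cell holds
# the concretization command (hex colour code and/or expression pattern).

class ConcretizationDatabase:
    def __init__(self, time_unit=0.1):
        self.time_unit = time_unit   # e.g. 0.1 s, 0.5 s, or 1 s, as in the text
        self.commands = {}           # (time_index, amplitude_bucket) -> command

    def store(self, time_s, amplitude, command):
        """Reflect a concretization command onto the coordinate diagram."""
        key = (round(time_s / self.time_unit), round(amplitude, 1))
        self.commands[key] = command

    def lookup(self, time_s, amplitude):
        """Return the command at this (time, amplitude) cell, if any."""
        key = (round(time_s / self.time_unit), round(amplitude, 1))
        return self.commands.get(key)

db = ConcretizationDatabase(time_unit=0.1)
# The worked example from the description: amplitude 0.5 at 1.9 s -> yellow,
# suspicious-emotion pattern.
db.store(1.9, 0.5, {"color": "ffff00", "pattern": "suspicious"})
```

Because commands are keyed by discrete cells rather than stored per sample, the storage volume stays small, consistent with the description's remark about lowered data storage.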
The calculation module 20 is coupled with the database 10 and the target device 1. The calculation module 20 comprises an input unit 21 for receiving a music 2 which is played in an external environment or inputted as a signal. The input unit 21 is a microphone and a communication port coupled with a signal wire, wherein the microphone is applied for receiving the music 2 played in an external environment, and the communication port is applied for receiving the music 2 as a signal. The calculation module 20 comprises an analysis unit 22, which analyzes the music 2 received through the input unit 21 to produce a plurality of period information, wherein the period information comprises a frequency, an amplitude, and a time. More particularly, when the music 2 enters the analysis unit 22, the analysis unit 22 locates the frequency and amplitude of the music 2 at intervals. In the embodiment, the analysis unit 22 generates one period information over each 4-tempo period. The number of tempos per period in the present invention is not limited to this example and may be adjusted according to different demands.
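The analysis unit's behaviour can be approximated as follows. The patent does not specify the frequency-analysis method, so a plain FFT peak per window is assumed, and the tempo value, sample rate, and function name are hypothetical:

```python
import numpy as np

def analyze_periods(samples, sample_rate, tempo_bpm=120, beats_per_period=4):
    """Split the signal into fixed periods (4 beats, as in the embodiment)
    and return one (time, frequency, amplitude) record per period."""
    period_s = beats_per_period * 60.0 / tempo_bpm
    win = int(period_s * sample_rate)
    periods = []
    for start in range(0, len(samples) - win + 1, win):
        chunk = samples[start:start + win]
        spectrum = np.abs(np.fft.rfft(chunk))
        freqs = np.fft.rfftfreq(win, d=1.0 / sample_rate)
        dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
        periods.append({
            "time": start / sample_rate,                # period start, seconds
            "frequency": float(dominant),               # dominant frequency, Hz
            "amplitude": float(np.max(np.abs(chunk))),  # peak amplitude
        })
    return periods

# A 2-second 440 Hz test tone at 120 bpm yields exactly one 4-beat period.
sr = 8000
t = np.arange(sr * 2) / sr
info = analyze_periods(0.5 * np.sin(2 * np.pi * 440 * t), sr)
```

A real implementation would likely track the tempo from the signal itself rather than take it as a parameter; the fixed window here only mirrors the "4-tempo period" of the embodiment.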
The calculation module 20 further comprises a match unit 23 and an output unit 24. The match unit 23 is applied for matching each period information processed by the analysis unit 22 with each audio signal data 11 in the database 10, so as to acquire the concretization command 12 matching each period information. In other words, the match unit 23 reflects each period information and its time point in the coordinate diagram 13, thereby acquiring the matching concretization command 12. For example, if the amplitude of a period information is 0.5 and the time is 1.9 seconds, then in the coordinate diagram 13 the color is yellow and the pattern is a suspicious emotion, so the match unit 23 acquires the concretization command 12 of an ffff00 color code and a suspicious emotion pattern. The output unit 24 is connected with the target device 1 through a wired connection. The output unit 24 outputs the acquired concretization command 12 and the corresponding music 2 to the target device 1, so that the target device 1 produces the visual performance corresponding to the music 2, as shown in Fig. 1. In the embodiment, the calculation module 20 and the database 10 are disposed in the target device 1. The output unit 24 has a speaker. When the input unit 21 receives the signal of the music 2 through the communication port, the speaker plays the music 2 at the same time as the concretization command 12 is outputted to the target device 1, whereby the target device 1 produces the visual performance corresponding to the music 2.
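The match unit's reflection of a period onto the diagram can be sketched as a nearest-point lookup. Nearest-neighbour matching and the second diagram entry are assumptions made for illustration; the patent only gives the (1.9 s, 0.5) example:

```python
# Hypothetical match-unit sketch: a period's (time, amplitude) point is
# matched against stored audio-signal-data points, and the nearest stored
# point supplies the concretization command.

def match_command(period, diagram):
    """diagram: list of ((time_s, amplitude), command) pairs."""
    t, a = period["time"], period["amplitude"]
    nearest = min(diagram, key=lambda e: (e[0][0] - t) ** 2 + (e[0][1] - a) ** 2)
    return nearest[1]

diagram = [
    # Worked example from the description: yellow / suspicious emotion.
    ((1.9, 0.5), {"color": "ffff00", "pattern": "suspicious"}),
    # A second, assumed entry so the lookup has something to discriminate.
    ((3.0, 0.9), {"color": "ff0000", "pattern": "excited"}),
]
cmd = match_command({"time": 1.9, "amplitude": 0.5}, diagram)
```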
In addition, the target device 1 is allowed to be a lamp or a robot. The target device 1 comprises a plurality of light members each having a color code. Therein, the color code is an RGB value coded at 24 bits per pixel (bpp) through the RGB color model. For example, when the hexadecimal code of the concretization command 12 is ffff00, the corresponding color code is presented as a yellow color on the light member.
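Decoding a 6-digit hexadecimal concretization command into the 24-bpp RGB triple that drives a light member is straightforward; a minimal sketch:

```python
def hex_to_rgb(code):
    """Decode a 6-digit hex concretization command (e.g. 'ffff00') into
    the (R, G, B) triple of the 24-bpp RGB color model."""
    return tuple(int(code[i:i + 2], 16) for i in (0, 2, 4))

# ffff00 -> full red + full green, no blue, i.e. yellow on the light member.
yellow = hex_to_rgb("ffff00")
```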
Furthermore, when the target device 1 is a robot, the target device 1 also comprises a display interface. When the concretization command 12 is a pattern, the display interface displays the concretization command 12 as an expression pattern.
Additionally, the calculation module 20 comprises a storage unit 25 for storing the concretization command 12 matching the music 2. Therefore, each time the match unit 23 processes a music 2, the concretization command 12 matching the processed music 2 is stored in the storage unit 25. When the input unit 21 receives the identical music 2 next time, the analyzing and processing steps are omitted, and the concretization command 12 corresponding to the music 2 can be directly performed on the target device 1, whereby the calculation burden of the calculation module 20 is lowered.
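The storage unit's behaviour can be sketched as a cache keyed by a fingerprint of the received signal. The hashing scheme is an assumption, since the patent does not say how identical music is recognized:

```python
# Hypothetical storage-unit sketch: once a piece of music has been matched,
# its command sequence is cached under a fingerprint of the raw signal, so
# a repeat of the same music skips analysis and matching entirely.
import hashlib

class StorageUnit:
    def __init__(self):
        self._cache = {}

    @staticmethod
    def fingerprint(samples_bytes):
        """Content hash of the raw audio bytes (assumed identity test)."""
        return hashlib.sha256(samples_bytes).hexdigest()

    def get(self, samples_bytes):
        """Cached command sequence, or None on a cache miss."""
        return self._cache.get(self.fingerprint(samples_bytes))

    def put(self, samples_bytes, commands):
        self._cache[self.fingerprint(samples_bytes)] = commands

store = StorageUnit()
audio = b"\x00\x01\x02\x03"  # stand-in for raw audio bytes
store.put(audio, ["ffff00", "ff0000"])
```

Note that an exact byte hash only recognizes bit-identical input; recognizing the "identical music" played back through a microphone would need a perceptual fingerprint instead.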
Thus, when the input unit 21 of the melody concretization identification system 100 receives the music 2, whether played in an external environment or inputted as a signal, the input unit 21 transmits the music 2 to the analysis unit 22 at the same time, and the analysis unit 22 processes the music 2 into the period information. Then, the match unit 23 matches the period information against the database 10, and the output unit 24 outputs the concretization command 12 and the music 2 to the target device 1, whereby the target device 1 produces the visual performance according to the concretization command 12 immediately generated from the music 2.
Therefore, when the music 2 is played, the melody concretization identification system 100 integrates the music 2 with visual effects, so as to produce visual performance on the target device 1, thereby achieving a variety of visual effects through an active controlling method.
Further, the audio signal data 11 and the concretization command 12 are integrated in the coordinate diagram 13, so that the data storage volume is lowered. Also, the match unit 23 reflects the period information of the music 2 on the coordinate diagram 13, so that the matching concretization command 12 is located easily and efficiently, lowering the calculation time and load of the calculation module 20 and providing music-based real-time visual performance.
Although particular embodiments of the invention have been described in detail for purposes of illustration, various modifications and enhancements may be made without departing from the spirit and scope of the invention. Accordingly, the invention is not to be limited except as by the appended claims.

Claims (8)

What is claimed is:
  1. A melody concretization identification system (100), comprising: a database (10) storing a plurality of audio signal data (11) and a concretization command (12) corresponding to each audio signal data (11), the audio signal data (11) comprising an amplitude parameter (111) and a time parameter (112), each audio signal data (11) being a value of the time parameter (112) corresponding to the amplitude parameter (111), the amplitude parameter (111) and the time parameter (112) of the audio signal data (11) being integrated into a coordinate diagram (13), the concretization command (12) being reflected on the coordinate diagram (13) according to the amplitude parameter (111) matching a time unit of the time parameter (112); and a calculation module (20) coupled with the database (10) and a target device (1), the calculation module (20) receiving a music (2) and analyzing the music (2) to produce a plurality of period information, the period information comprising a frequency, an amplitude, and a time, the calculation module (20) reflecting each period information on the coordinate diagram (13) to be matched with each audio signal data (11) to acquire the concretization command (12) matching each period information, the calculation module (20) outputting the acquired concretization command (12) to the target device (1), so that the target device (1) generates visual performance corresponding to the music (2).
  2. The melody concretization and identification system (100) of claim 1, wherein the concretization command (12) is applied for controlling light colors; the concretization command (12) is a hexadecimal code; the target device (1) comprises a plurality of light members, and each light member comprises a color code corresponding to the concretization command (12).
  3. The melody concretization and identification system (100) of claim 1, wherein the concretization command (12) is a pattern; the target device (1) comprises a display interface for displaying the concretization command (12).
  4. The melody concretization and identification system (100) of claim 3, wherein the calculation module (20) comprises an output unit (24); the output unit (24) is coupled with the target device (1); when the music (2) is played, the output unit (24) outputs the corresponding concretization command (12) to the target device (1).
  5. The melody concretization and identification system (100) of claim 1, wherein the calculation module (20) comprises an analysis unit (22); the analysis unit (22) processes the music (2), such that the frequency and the amplitude of the music (2) over an interval are deemed as the period information.
  6. The melody concretization and identification system (100) of claim 5, wherein the calculation module (20) comprises a match unit (23); the match unit (23) matches the period information with the audio signal data (11), so as to acquire the corresponding concretization command (12).
  7. The melody concretization and identification system (100) of claim 1, wherein the calculation module (20) comprises an input unit (21); the input unit (21) receives the music (2) from an external environment or through signal transmission.
  8. The melody concretization and identification system (100) of claim 7, wherein the calculation module (20) comprises a storage unit (25) for storing the concretization command (12) matching the music (2).
GB2101538.3A 2021-02-04 2021-02-04 Melody concretization identification system Pending GB2603485A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2101538.3A GB2603485A (en) 2021-02-04 2021-02-04 Melody concretization identification system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2101538.3A GB2603485A (en) 2021-02-04 2021-02-04 Melody concretization identification system

Publications (2)

Publication Number Publication Date
GB202101538D0 GB202101538D0 (en) 2021-03-24
GB2603485A (en) 2022-08-10

Family

ID=74879088

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2101538.3A Pending GB2603485A (en) 2021-02-04 2021-02-04 Melody concretization identification system

Country Status (1)

Country Link
GB (1) GB2603485A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090165632A1 (en) * 2005-12-19 2009-07-02 Harmonix Music Systems, Inc. Systems and methods for generating video game content
EP2204774A2 (en) * 2008-12-05 2010-07-07 Sony Corporation Information processing apparatus, information processing method, and program
CN104574453A (en) * 2013-10-17 2015-04-29 付晓宇 Software for expressing music with images
CN105773612A (en) * 2016-03-28 2016-07-20 深圳前海勇艺达机器人有限公司 System and method for controlling dance of robot
WO2017136854A1 (en) * 2016-02-05 2017-08-10 New Resonance, Llc Mapping characteristics of music into a visual display
CN108305604A (en) * 2018-01-30 2018-07-20 浙江省公众信息产业有限公司 Music visualization, device and computer readable storage medium
US10770092B1 (en) * 2017-09-22 2020-09-08 Amazon Technologies, Inc. Viseme data generation


Also Published As

Publication number Publication date
GB202101538D0 (en) 2021-03-24

Similar Documents

Publication Publication Date Title
US7501571B2 (en) Lighting display responsive to vibration
TWI486904B (en) Method for rhythm visualization, system, and computer-readable memory
US7582015B2 (en) Program, information storage medium and game system
US8098831B2 (en) Visual feedback in electronic entertainment system
CA2720723C (en) Gesture-related feedback in electronic entertainment system
US6522417B1 (en) Communication terminal device that processes received images and transmits physical quantities that affect the receiving communication terminal device
CN104487208A (en) Method and device for generating robot control scenario
JP2018011201A (en) Information processing apparatus, information processing method, and program
KR20170024374A (en) Stage Image Displaying Apparatus Capable of Interaction of Performance Condition and Audience Participation and Displaying Method Using the Same
US11420129B2 (en) Gameplay event detection and gameplay enhancement operations
US11508393B2 (en) Controller for real-time visual display of music
US6876847B2 (en) Control of synchronous display of melody information and different information on mobile communication terminal
JP2019009770A (en) Sound input/output device
JP4555039B2 (en) Robot, robot control method, robot control program
US20210236931A1 (en) Gameplay event detection and gameplay enhancement operations
KR20190034120A (en) Sequential activity intelligent personal assistant
GB2603485A (en) Melody concretization identification system
CN106454491A (en) Method and device for playing voice information in video smartly
WO2012001750A1 (en) Game device, game control method, and game control program
CN113810837B (en) Synchronous sounding control method of display device and related equipment
JP2018075657A (en) Generating program, generation device, control program, control method, robot device and telephone call system
TWI689826B (en) Musical visualization system
US11406895B2 (en) Gameplay event detection and gameplay enhancement operations
KR20160140037A (en) Emotion matching module for controlling color temperature based on emotion and emotion lighting system having the emotion matching module
KR20050025552A (en) Digital cellular phone