CN110929051B - Integrated system and method for controlling music, vision and lamplight through image recognition - Google Patents

Integrated system and method for controlling music, vision and lamplight through image recognition

Info

Publication number
CN110929051B
CN110929051B (application CN201911080203.5A)
Authority
CN
China
Prior art keywords
data
program
light
control module
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911080203.5A
Other languages
Chinese (zh)
Other versions
CN110929051A (en)
Inventor
Wu Qiang (吴强)
Liu Jiang (刘江)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Nvidia Enti Technology Co ltd
Original Assignee
Shenzhen Enwei Media Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Enwei Media Co ltd filed Critical Shenzhen Enwei Media Co ltd
Priority to CN201911080203.5A priority Critical patent/CN110929051B/en
Publication of CN110929051A publication Critical patent/CN110929051A/en
Application granted granted Critical
Publication of CN110929051B publication Critical patent/CN110929051B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/43Querying
    • G06F16/432Query formulation
    • G06F16/434Query formulation using image data, e.g. images, photos, pictures taken by a user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/43Querying
    • G06F16/438Presentation of query results
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/587Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The invention discloses an integrated system and method for controlling music, visuals and lighting by image recognition. The method comprises the following steps: step S1, acquiring a two-dimensional pattern on an interaction module through an image acquisition module; step S2, the image recognition module acquires and recognizes the IDs and position attributes of the two-dimensional elements of the pattern, parses the data and sends it to the control module; step S3, after reading the received ID and position attributes, the control module triggers the corresponding visual effect and forwards the parsed data to the sound and light control module; step S4, after reading the data, the sound and light control module triggers the corresponding music and lighting effects. The invention integrates image recognition, audio-visual interaction and lighting control into a single system, so that different images trigger different visual changes and sound effects while the lighting flashes and changes with the rhythm of the music, creating a new interactive experience with good results.

Description

Integrated system and method for controlling music, vision and lamplight through image recognition
Technical Field
The invention belongs to the technical field of audio-visual interaction based on image recognition, and in particular relates to an integrated system and method for controlling music, visuals and lighting by image recognition.
Background
With the rapid development of technology, the interaction techniques of multimedia systems have become increasingly sophisticated. Compared with conventional user interfaces, the most important change brought by a multimedia user interface that incorporates video and audio is that the interface is no longer static, but a time-dependent, time-varying media interface.
Humans use speech and other time-varying media (e.g., gestures) quite differently from other media. In terms of how information is presented to the user, time-varying media are presented mainly sequentially, whereas the visual media we are most familiar with (text and graphics) are usually presented simultaneously. In a conventional static interface, the user either selects from a series of options (the explicit communication component of the interface) or interacts in an implicit manner (the implicit communication component). In a user interface for time-varying media, all options and files must be presented sequentially. Owing to the limits of media bandwidth and human attention, with time-varying media the user must control not only the content of the presented information, but also when and how it is presented.
In the prior art, there is no technology capable of controlling the corresponding presentation of music, visuals and lighting through image recognition.
Disclosure of Invention
To solve the above problems, the present invention aims to provide an integrated system and method for controlling music, visuals and lighting by image recognition, which can recognize different images and thereby trigger different visual changes and sound effects, while the lighting flashes and changes with the rhythm of the music.
To achieve the above purpose, the technical solution of the invention is as follows:
An integrated system for controlling music, visuals and lighting by image recognition, comprising:
an interaction module, used for the projection imaging of the two-dimensional patterns placed by a user's gesture or posture and for presenting the corresponding visual effects;
an image acquisition module, used for acquiring the two-dimensional pattern on the interaction module;
an image recognition module, used for recognizing the IDs and position attributes of the two-dimensional elements of the two-dimensional pattern, parsing the data and then sending the data to the control module;
a control module, used for receiving the IDs and position attributes recognized by the image recognition module, triggering the corresponding visual effect and then sending the parsed data to the sound and light control module;
and a sound and light control module, used for reading the data sent by the control module and then presenting the corresponding music and lighting effects.
In the invention, through the arrangement of the above modules, image recognition, audio-visual interaction and lighting control are integrated into one system: different images trigger different visual changes and sound effects, while the lighting flashes and changes with the rhythm of the music, forming a new interactive experience with good results.
Further, the interaction module comprises an acrylic plate and infrared fill lights; the acrylic plate is used for projection imaging of the two-dimensional pattern and for displaying the corresponding visual effect, and the infrared fill lights supplement the illumination of the two-dimensional pattern on the acrylic plate. This arrangement ensures the clarity of the projected image and of the camera capture.
Further, the thickness of the acrylic plate is 8 mm, and there are 2 infrared fill lights. This arrangement ensures optimal clarity of both the projected image and the camera capture.
Further, the image acquisition module is an infrared camera, the image recognition module is the reacTIVision program, the control module is a U3D engine, and the sound and light control module is a TouchDesigner program. The two-dimensional pattern on the acrylic plate is captured by the infrared camera, and the picture information is transmitted to the reacTIVision program, which obtains its unique ID and position attributes; reacTIVision parses the data and sends it to the U3D engine through the TUIO protocol. The U3D engine reads the received ID and position attributes, triggers the corresponding visual effect, and at the same time sends the parsed data to the TouchDesigner program through the UDP protocol; the TouchDesigner program controls the interaction of sound and light through its programmed logic to present the corresponding music and lighting effects. By adopting these technologies, the integrated system is more effective and more stable.
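Purely as an illustration of the data that flows between these modules, the following Python sketch defines a possible record for one recognized element (ID plus position attributes) and serializes it for transport; the field names and the JSON-over-UDP encoding are assumptions for this example and are not specified by the patent.
```python
from dataclasses import dataclass, asdict
import json

@dataclass
class MarkerEvent:
    """One recognized two-dimensional element on the interaction surface (illustrative fields)."""
    fiducial_id: int  # unique ID of the two-dimensional element
    x: float          # normalized horizontal position, 0.0-1.0
    y: float          # normalized vertical position, 0.0-1.0
    angle: float      # rotation in radians

    def to_udp_payload(self) -> bytes:
        # Serialize to JSON so a downstream program could read it over UDP.
        return json.dumps(asdict(self)).encode("utf-8")

# Example: an event for fiducial 7 near the centre of the board.
event = MarkerEvent(fiducial_id=7, x=0.52, y=0.48, angle=1.57)
print(event.to_udp_payload())
```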
A method of controlling music, visuals and lighting by image recognition, comprising the following steps:
step S1, image acquisition: acquiring the two-dimensional pattern on the interaction module through the image acquisition module;
step S2, the image recognition module acquires and recognizes the IDs and position attributes of the two-dimensional elements of the two-dimensional pattern, parses the data and sends it to the control module;
step S3, after reading the received IDs and position attributes, the control module triggers the corresponding visual effect and at the same time sends the parsed data to the sound and light control module;
step S4, after reading the data, the sound and light control module presents the corresponding music and lighting effects.
In the invention, the presentation of music, visuals and lighting is controlled through image recognition; image recognition, audio-visual interaction and lighting control are integrated into one system, different images trigger different visual changes and sound effects, and the lighting flashes and changes with the rhythm of the music, forming a new interactive experience with good results.
Further, the interaction module comprises an acrylic plate and infrared fill lights; the acrylic plate is used for projection imaging of the two-dimensional pattern and for displaying the corresponding visual effect, and the infrared fill lights supplement the illumination of the two-dimensional pattern on the acrylic plate. This arrangement ensures the clarity of the projected image and of the camera capture.
Further, the thickness of the acrylic plate is 8 mm, and there are 2 infrared fill lights. This arrangement ensures optimal clarity of both the projected image and the camera capture.
Further, in step S1, the image acquisition module is an infrared camera, and image acquisition is performed after the positions and angles of the infrared camera and the infrared fill lights have been repeatedly adjusted. In the invention, an infrared camera and infrared fill lights are adopted to avoid interference from visible light; 2 infrared fill lights are set up to illuminate the two-dimensional pattern on the interaction module, and the positions and angles of the camera and the fill lights need to be adjusted repeatedly to achieve the best capture.
Furthermore, before image acquisition, the projected picture must be aligned with the board. Specifically, a cross-grid correction is used: strips of black tape are stuck onto the acrylic plate horizontally and vertically at equal intervals, the infrared camera captures the resulting cross-grid image, PS is used to process the cross-grid image into an output-picture template for the program, and the output picture is made to match the black-tape cross-grid exactly through arena, so that the visuals triggered on the interaction module stay aligned with it and the visual experience is optimized.
Further, the image recognition module is the reacTIVision program: the two-dimensional pattern on the acrylic plate is captured by the infrared camera, and the picture information is transmitted to the reacTIVision program, which obtains its unique ID and position attributes; after parsing the data, reacTIVision sends it to the control module.
Further, the control module is a U3D engine and the sound and light control module is a TouchDesigner program; the reacTIVision program parses the data and sends it to the U3D engine through the TUIO protocol. The U3D engine reads the received ID and position attributes, triggers the corresponding visual effect, and at the same time sends the parsed data to the TouchDesigner program through the UDP protocol; the TouchDesigner program controls the interaction of sound and light through its programmed logic to present the corresponding music and lighting effects.
In the invention, these technologies are adopted with stability in mind, so that the modules coordinate better and run more stably.
Further, a dynamic library of interactive elements is built in the U3D engine; by reading the data sent by reacTIVision, different IDs trigger different visuals, and the logic and effects of the dynamic elements are debugged in the program. At the same time, the data is transmitted to the TouchDesigner program through the UDP protocol.
Further, in the U3D engine, the corresponding visual effect is presented by projecting the picture generated by the program onto the acrylic plate by means of rear projection.
Further, a sound element library is built in the TouchDesigner program, comprising rhythm audio, melody audio and ambient audio; by reading the data sent by the U3D engine, different IDs trigger different audio. At the same time, TouchDesigner analyzes the triggered music into high, middle and low frequencies, converts the audio information into color information, and maps the colors onto the lighting; the brightness and color of the lights are controlled in TouchDesigner through the DMX512 protocol, so that the lights flash and change with the rhythm of the music. In the invention, the sound is processed by the TouchDesigner program and output through loudspeakers connected by XLR (Cannon) cables, while the lighting data is processed by TouchDesigner and transmitted to the light controller through the DMX512 protocol, so that the lights flash and change with the rhythm of the music.
Further, all the audio in the sound element library is aligned to rhythm points in advance, and volume switches for the audio are set in the program, so that the sound does not become chaotic or harsh during interaction and the unity of the music is guaranteed.
The beneficial effects of the invention are as follows: compared with the prior art, the invention controls the corresponding presentation of music, visuals and lighting through image recognition, integrating image recognition, audio-visual interaction and lighting control into one system; different images trigger different visual changes and sound effects, while the lighting flashes and changes with the rhythm of the music, forming a new interactive experience with good results.
Drawings
FIG. 1 is a flow chart of a method for controlling music, vision and lighting by image recognition according to the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
To achieve the above object, the technical solution of the present invention is as follows.
The invention provides an integrated system for controlling music, visuals and lighting by image recognition, comprising:
an interaction module, used for the projection imaging of the two-dimensional patterns placed by a user's gesture or posture and for presenting the corresponding visual effects;
an image acquisition module, used for acquiring the two-dimensional pattern on the interaction module;
an image recognition module, used for recognizing the IDs and position attributes of the two-dimensional elements of the two-dimensional pattern, parsing the data and then sending the data to the control module;
a control module, used for receiving the IDs and position attributes recognized by the image recognition module, triggering the corresponding visual effect and then sending the parsed data to the sound and light control module;
and a sound and light control module, used for reading the data sent by the control module and then presenting the corresponding music and lighting effects.
In the invention, through the arrangement of the above modules, image recognition, audio-visual interaction and lighting control are integrated into one system: different images trigger different visual changes and sound effects, while the lighting flashes and changes with the rhythm of the music, forming a new interactive experience with good results.
The interaction module comprises an acrylic plate and infrared fill lights; the acrylic plate is used for projection imaging of the two-dimensional pattern, and the infrared fill lights supplement the illumination of the two-dimensional pattern on the acrylic plate. This arrangement ensures the clarity of the projected image and of the camera capture.
The thickness of the acrylic plate is 8 mm, and there are 2 infrared fill lights. This arrangement ensures optimal clarity of both the projected image and the camera capture.
The image acquisition module is an infrared camera, the image recognition module is the reacTIVision program, the control module is a U3D engine, and the sound and light control module is a TouchDesigner program. The two-dimensional pattern on the acrylic plate is captured by the infrared camera, and the picture information is transmitted to the reacTIVision program, which obtains its unique ID and position attributes; reacTIVision parses the data and sends it to the U3D engine through the TUIO protocol. The U3D engine reads the received ID and position attributes, triggers the corresponding visual effect, and at the same time sends the parsed data to the TouchDesigner program through the UDP protocol; the TouchDesigner program controls the interaction of sound and light through its programmed logic to present the corresponding music and lighting effects. By adopting these technologies, the integrated system is more effective and more stable.
The invention also provides a method for controlling music, visuals and lighting by image recognition, which, as shown in FIG. 1, comprises the following steps:
step S1, image acquisition: acquiring the two-dimensional pattern on the interaction module through the image acquisition module;
step S2, the image recognition module acquires and recognizes the IDs and position attributes of the two-dimensional elements of the two-dimensional pattern, parses the data and sends it to the control module;
step S3, after reading the received IDs and position attributes, the control module triggers the corresponding visual effect and at the same time sends the parsed data to the sound and light control module;
step S4, after reading the data, the sound and light control module presents the corresponding music and lighting effects.
In the invention, the presentation of music, visuals and lighting is controlled through image recognition; image recognition, audio-visual interaction and lighting control are integrated into one system, different images trigger different visual changes and sound effects, and the lighting flashes and changes with the rhythm of the music, forming a new interactive experience with good results.
The interaction module comprises an acrylic plate and infrared fill lights; the acrylic plate is used for projection imaging of the two-dimensional pattern, and the infrared fill lights supplement the illumination of the two-dimensional pattern on the acrylic plate. This arrangement ensures the clarity of the projected image and of the camera capture.
The thickness of the acrylic plate is 8 mm, and there are 2 infrared fill lights. This arrangement ensures optimal clarity of both the projected image and the camera capture.
In step S1, the image acquisition module is an infrared camera, and image acquisition is performed after the positions and angles of the infrared camera and the infrared fill lights have been repeatedly adjusted. In the invention, an infrared camera and infrared fill lights are adopted to avoid interference from visible light; 2 infrared fill lights are set up to illuminate the two-dimensional pattern on the interaction module, and the positions and angles of the camera and the fill lights need to be adjusted repeatedly to achieve the best capture.
Before image acquisition, the projected picture must be aligned with the board. Specifically, a cross-grid correction is used: strips of black tape are stuck onto the acrylic plate horizontally and vertically at equal intervals, the infrared camera captures the resulting cross-grid image, PS is used to process it into an output-picture template for the program, and the output picture is made to match the black-tape cross-grid exactly through arena, so that the visuals triggered on the interaction module stay aligned with it and the visual experience is optimized.
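The patent performs this alignment manually with the taped cross-grid, PS and arena. As a hedged illustration of the underlying idea rather than the patented procedure, the Python sketch below turns a few camera-to-output correspondences (placeholder coordinates) into a homography with OpenCV, which could then map any position detected by the camera onto the projected picture.
```python
import numpy as np
import cv2

# Grid crossings as seen by the infrared camera (pixels) and the points they
# should map to in the program's output picture. The four correspondences are
# placeholders; a real calibration would use every tape-grid crossing.
camera_pts = np.array([[102,  96], [518,  90], [524, 412], [ 98, 418]], dtype=np.float32)
screen_pts = np.array([[  0,   0], [1920,   0], [1920, 1080], [  0, 1080]], dtype=np.float32)

# Homography that warps camera coordinates onto output-picture coordinates.
H, _ = cv2.findHomography(camera_pts, screen_pts)

def camera_to_screen(x: float, y: float) -> tuple[float, float]:
    """Map a point detected in the camera image to the projected picture."""
    p = H @ np.array([x, y, 1.0])
    return float(p[0] / p[2]), float(p[1] / p[2])

print(camera_to_screen(310.0, 250.0))
```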
The image recognition module is the reacTIVision program: the two-dimensional pattern on the acrylic plate is captured by the infrared camera, the picture information is transmitted to the reacTIVision program, which obtains its unique ID and position attributes, and after parsing the data reacTIVision sends it to the control module.
The control module is a U3D engine and the sound and light control module is a TouchDesigner program; the reacTIVision program parses the data and then sends it to the U3D engine through the TUIO protocol. The U3D engine reads the received ID and position attributes, triggers the corresponding visual effect, and at the same time sends the parsed data to the TouchDesigner program through the UDP protocol; the TouchDesigner program controls the interaction of sound and light through its programmed logic to present the corresponding music and lighting effects.
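reacTIVision transmits TUIO, which is OSC carried over UDP (port 3333 by default), and in the patent the TUIO client is the U3D engine. The Python sketch below, using the python-osc library, is only a minimal illustration of the TUIO 1.1 object profile (/tuio/2Dobj) that carries the fiducial ID and position; the host, port and handler structure are assumptions, not the patented implementation.
```python
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_2dobj(address, *args):
    # TUIO 1.1 object-profile 'set' messages carry
    # (session_id, fiducial_id, x, y, angle, velocities/accelerations...).
    if args and args[0] == "set":
        session_id, fiducial_id, x, y, angle = args[1:6]
        print(f"fiducial {fiducial_id}: x={x:.3f} y={y:.3f} angle={angle:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/tuio/2Dobj", on_2dobj)

# reacTIVision sends TUIO (OSC over UDP) to port 3333 by default.
server = BlockingOSCUDPServer(("127.0.0.1", 3333), dispatcher)
server.serve_forever()
```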
In the invention, these technologies are adopted with stability in mind, so that the modules coordinate better and run more stably.
A dynamic library of interactive elements is built in the U3D engine; by reading the data sent by reacTIVision, different IDs trigger different visuals, and the logic and effects of the dynamic elements are debugged in the program. At the same time, the data is transmitted to the TouchDesigner program through the UDP protocol.
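A minimal sketch of the ID-to-visual lookup and the UDP hand-off described above, assuming a JSON payload and a hypothetical port 7000 for the TouchDesigner listener; the mapping table and the visual names are invented for the example.
```python
import json
import socket

# Hypothetical mapping from fiducial ID to the visual it should trigger.
VISUAL_FOR_ID = {7: "ripple", 12: "particles", 23: "bloom"}

TD_ADDRESS = ("127.0.0.1", 7000)  # assumed address of the TouchDesigner listener
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def handle_marker(fiducial_id: int, x: float, y: float, angle: float) -> None:
    visual = VISUAL_FOR_ID.get(fiducial_id, "idle")
    print(f"trigger visual '{visual}' for ID {fiducial_id}")
    # Forward the parsed data so the sound and light program can react as well.
    payload = json.dumps({"id": fiducial_id, "x": x, "y": y, "angle": angle})
    sock.sendto(payload.encode("utf-8"), TD_ADDRESS)

handle_marker(7, 0.52, 0.48, 1.57)
```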
In the U3D engine, the corresponding visual effect is presented by projecting the picture generated by the program onto the acrylic plate by means of rear projection.
A sound element library is built in the TouchDesigner program, comprising rhythm audio, melody audio and ambient audio; by reading the data sent by the U3D engine, different IDs trigger different audio. At the same time, TouchDesigner analyzes the triggered music into high, middle and low frequencies, converts the audio information into color information and maps the colors onto the lighting; the brightness and color of the lights are controlled in TouchDesigner through the DMX512 protocol, so that the lights flash and change with the rhythm of the music.
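The band analysis and the audio-to-color mapping could look roughly like the sketch below; the band boundaries, the red/green/blue channel assignment and the normalization are assumptions for illustration, and actually transmitting the DMX512 universe to a light controller is left out.
```python
import numpy as np

def audio_to_dmx(samples: np.ndarray, sample_rate: int) -> list[int]:
    """Split one block of audio into low/mid/high bands and map band energy
    to an RGB triple for a DMX fixture (channels 1-3 assumed to be R, G, B)."""
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

    # Assumed band boundaries: low < 250 Hz, mid 250-4000 Hz, high > 4000 Hz.
    low = spectrum[freqs < 250].mean()
    mid = spectrum[(freqs >= 250) & (freqs < 4000)].mean()
    high = spectrum[freqs >= 4000].mean()

    # Normalize band energies to 0-255 DMX levels.
    peak = max(low, mid, high, 1e-9)
    r, g, b = (int(255 * v / peak) for v in (low, mid, high))

    # A DMX512 universe holds up to 512 byte-sized channel values; only the
    # first three are used here.
    universe = [0] * 512
    universe[0:3] = [r, g, b]
    return universe

# Example: one 1024-sample block of a 440 Hz tone at 44.1 kHz.
t = np.arange(1024) / 44100
block = np.sin(2 * np.pi * 440 * t)
print(audio_to_dmx(block, 44100)[:3])
```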
All the audio in the sound element library is aligned to rhythm points in advance, and volume switches for the audio are set in the program, so that the sound does not become chaotic or harsh during interaction and the unity of the music is guaranteed.
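A small sketch of the volume-switch idea under the assumption of a fixed tempo: requested volume changes are deferred to the next bar boundary so that the layered stems stay on the rhythm points; the BPM, bar length and class names are illustrative, not taken from the patent.
```python
import math

BPM = 120.0
SECONDS_PER_BEAT = 60.0 / BPM
BEATS_PER_BAR = 4
BAR_LENGTH = SECONDS_PER_BEAT * BEATS_PER_BAR  # 2.0 s at 120 BPM

def next_bar_boundary(t: float) -> float:
    """Earliest bar boundary at or after time t (seconds)."""
    return math.ceil(t / BAR_LENGTH) * BAR_LENGTH

class Stem:
    """One audio layer (rhythm, melody or ambient) with a deferred volume switch."""
    def __init__(self, name: str):
        self.name = name
        self.volume = 0.0
        self.pending = None  # (apply_time, target_volume)

    def request_volume(self, now: float, target: float) -> None:
        # Defer the change to the next bar so entries and exits stay musical.
        self.pending = (next_bar_boundary(now), target)

    def update(self, now: float) -> None:
        if self.pending and now >= self.pending[0]:
            self.volume = self.pending[1]
            self.pending = None

melody = Stem("melody")
melody.request_volume(now=3.1, target=1.0)  # requested mid-bar...
melody.update(now=4.0)                      # ...applied at the 4.0 s bar line
print(melody.volume)                        # -> 1.0
```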
The beneficial effects of the invention are as follows: compared with the prior art, the invention controls the corresponding presentation of music, visuals and lighting through image recognition, integrating image recognition, audio-visual interaction and lighting control into one system; different images trigger different visual changes and sound effects, while the lighting flashes and changes with the rhythm of the music, forming a new interactive experience with good results.
According to the invention, the two-dimensional pattern on the interaction module is captured with an infrared camera; reacTIVision obtains the IDs and position attributes of the two-dimensional elements and sends the data to the U3D engine through the TUIO protocol, where program logic triggers the corresponding visual effects. At the same time, the parsed data is sent to the TouchDesigner program through the UDP protocol; in TouchDesigner each ID is mapped to its corresponding audio data, which includes rhythm audio, melody audio and ambient audio, so that different IDs trigger different audio. The triggered audio data is converted into data that the lighting can interpret, and the brightness and color of the lights are controlled through the DMX512 protocol, so that the lights change with the rhythm of the music.
The invention integrates image recognition, projection correction, audio-visual interaction and a lighting control system: by placing different recognition modules on the interaction table, different modules trigger different visual changes and sound effects, while the lights flash and change with the rhythm of the music.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.

Claims (5)

1. An integrated system for controlling music, visuals and lighting by image recognition, characterized by comprising:
an interaction module, used for the projection imaging of the two-dimensional patterns placed by a user's gesture or posture and for presenting the corresponding visual effect;
an image acquisition module, used for acquiring the two-dimensional pattern on the interaction module;
an image recognition module, used for recognizing the IDs and position attributes of the two-dimensional elements of the two-dimensional pattern, parsing the data and then sending the data to the control module;
a control module, used for receiving the IDs and position attributes recognized by the image recognition module, triggering the corresponding visual effect and then sending the parsed data to the sound and light control module;
and a sound and light control module, used for reading the data sent by the control module and then presenting the corresponding music and lighting effects;
wherein the interaction module comprises an acrylic plate and infrared fill lights, the acrylic plate being used for projection imaging of the two-dimensional pattern and for presenting the corresponding visual effect, and the infrared fill lights being used to supplement the illumination of the two-dimensional pattern on the acrylic plate; the thickness of the acrylic plate is 8 mm, and there are 2 infrared fill lights;
the image acquisition module is an infrared camera, the image recognition module is the reacTIVision program, the control module is a U3D engine, and the sound and light control module is a TouchDesigner program; the two-dimensional pattern on the acrylic plate is captured by the infrared camera, and the picture information is transmitted to the reacTIVision program, which obtains its unique ID and position attributes; the reacTIVision program parses the data and sends it to the U3D engine through the TUIO protocol; the U3D engine reads the received ID and position attributes, triggers the corresponding visual effect, and at the same time sends the parsed data to the TouchDesigner program through the UDP protocol; the TouchDesigner program controls the interaction of sound and light through its programmed logic to present the corresponding music and lighting effects.
2. A method for controlling music, visuals and lighting by image recognition, characterized by comprising the following steps:
step S1, image acquisition: acquiring the two-dimensional pattern on the interaction module through the image acquisition module;
step S2, the image recognition module acquires and recognizes the IDs and position attributes of the two-dimensional elements of the two-dimensional pattern, parses the data and sends it to the control module;
step S3, after reading the received IDs and position attributes, the control module triggers the corresponding visual effect and at the same time sends the parsed data to the sound and light control module;
step S4, after reading the data, the sound and light control module presents the corresponding music and lighting effects;
wherein the interaction module comprises an acrylic plate and infrared fill lights, the acrylic plate being used for projection imaging of the two-dimensional pattern and for presenting the corresponding visual effect, and the infrared fill lights being used to supplement the illumination of the two-dimensional pattern on the acrylic plate; the thickness of the acrylic plate is 8 mm, and there are 2 infrared fill lights;
in step S1, the image acquisition module is an infrared camera, and image acquisition is performed after the positions and angles of the infrared camera and the infrared fill lights have been repeatedly adjusted; before image acquisition the picture must be aligned with the board, specifically by cross-grid correction: strips of black tape are stuck onto the acrylic plate horizontally and vertically at equal intervals, the infrared camera captures the cross-grid image, PS is used to process the cross-grid image into an output-picture template for the program, and the output picture is made to match the black-tape cross-grid exactly through arena;
the image recognition module is the reacTIVision program: the two-dimensional pattern on the acrylic plate is captured by the infrared camera, the picture information is transmitted to the reacTIVision program, which obtains its unique ID and position attributes,
and after parsing the data the reacTIVision program sends it to the control module;
the control module is a U3D engine and the sound and light control module is a TouchDesigner program; the reacTIVision program parses the data and sends it to the U3D engine through the TUIO protocol; the U3D engine reads the received ID and position attributes, triggers the corresponding visual effect, and at the same time sends the parsed data to the TouchDesigner program through the UDP protocol; the TouchDesigner program controls the interaction of sound and light through its programmed logic to present the corresponding music and lighting effects.
3. The method for controlling music, visuals and lighting by image recognition according to claim 2, characterized in that a dynamic library of interactive elements is built in the U3D engine; by reading the data sent by reacTIVision, different IDs trigger different visuals, and the logic and effects of the dynamic elements are debugged in the program; at the same time, the data is transmitted to the TouchDesigner program through the UDP protocol.
4. The method for controlling music, visuals and lighting according to claim 3, characterized in that, in the U3D engine, the corresponding visual effect is presented by projecting the picture generated by the program onto the acrylic plate by means of rear projection.
5. The method for controlling music, visuals and lighting according to claim 4, characterized in that a sound element library is built in the TouchDesigner program, comprising rhythm audio, melody audio and ambient audio; by reading the data sent by the U3D engine, different IDs trigger different audio; at the same time, TouchDesigner analyzes the triggered music into high, middle and low frequencies, converts the audio information into color information and maps the colors onto the lighting, and the brightness and color of the lights are controlled in TouchDesigner through the DMX512 protocol, so that the lights flash and change with the rhythm of the music.
CN201911080203.5A 2019-11-07 2019-11-07 Integrated system and method for controlling music, vision and lamplight through image recognition Active CN110929051B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911080203.5A CN110929051B (en) 2019-11-07 2019-11-07 Integrated system and method for controlling music, vision and lamplight through image recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911080203.5A CN110929051B (en) 2019-11-07 2019-11-07 Integrated system and method for controlling music, vision and lamplight through image recognition

Publications (2)

Publication Number Publication Date
CN110929051A CN110929051A (en) 2020-03-27
CN110929051B true CN110929051B (en) 2023-05-16

Family

ID=69853611

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911080203.5A Active CN110929051B (en) 2019-11-07 2019-11-07 Integrated system and method for controlling music, vision and lamplight through image recognition

Country Status (1)

Country Link
CN (1) CN110929051B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117053167A (en) * 2023-08-23 2023-11-14 上海永加灯光音响工程有限公司 Multifunctional modularized acousto-optic integrated equipment capable of being moved conveniently

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2739518A1 (en) * 2003-09-16 2005-03-16 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
CN106371594A (en) * 2016-08-31 2017-02-01 李姣昂 Binocular infrared vision portable gesture-controlled projection system and method
CN109272808A (en) * 2018-11-13 2019-01-25 上海艺瓣文化传播有限公司 A kind of interaction systems towards the teaching of music beat rhythm and science popularization

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7601904B2 (en) * 2005-08-03 2009-10-13 Richard Dreyfuss Interactive tool and appertaining method for creating a graphical music display
US20150339301A1 (en) * 2014-05-20 2015-11-26 Syn Holdings LLC Methods and systems for media synchronization
CN107018121B (en) * 2016-10-13 2021-07-20 创新先进技术有限公司 User identity authentication method and device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2739518A1 (en) * 2003-09-16 2005-03-16 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
CN106371594A (en) * 2016-08-31 2017-02-01 李姣昂 Binocular infrared vision portable gesture-controlled projection system and method
CN109272808A (en) * 2018-11-13 2019-01-25 上海艺瓣文化传播有限公司 A kind of interaction systems towards the teaching of music beat rhythm and science popularization

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Xu Zhibo; Fang Dalei; Da Yan. Sound-media interaction design in new media music art. Journal of Fudan University (Natural Science Edition), 2018, (03), full text. *

Also Published As

Publication number Publication date
CN110929051A (en) 2020-03-27

Similar Documents

Publication Publication Date Title
US20140247216A1 (en) Trigger and control method and system of human-computer interaction operation command and laser emission device
CN107426887A (en) The operating mode switching method of Intelligent lightening device
WO2015184841A1 (en) Method and apparatus for controlling projection display
CN104035616B (en) A kind of multi-point touch touch-screen using diode display
US8204610B2 (en) Eletronic device, display device, and method of controlling audio/video output of an electronic device
CN110929051B (en) Integrated system and method for controlling music, vision and lamplight through image recognition
WO2023116396A1 (en) Rendering display method and apparatus, computer device, and storage medium
CN116321627A (en) Screen atmosphere lamp synchronous control method, system and control equipment based on image pickup
CN112148241B (en) Light processing method, device, computing equipment and storage medium
CN101847084A (en) Electronic device, display device and method for controlling audio-visual output of electronic device
US9323367B2 (en) Automatic annotation de-emphasis
KR100773905B1 (en) Apparatus for remote pointing using image sensor and method of the same
CN112913331A (en) Determining light effects based on video and audio information according to video and audio weights
CN112256191B (en) Intelligent blackboard, data processing method and device thereof and intelligent interactive panel
CN103680225A (en) Teaching all-in-one machine and multi-input display method thereof
US20230282186A1 (en) Systems and methods for improved production and presentation of video content
US20190387182A1 (en) Live streaming system and method for live streaming
CN203386147U (en) Multimedia desktop interaction system
CN201903865U (en) Interactive projector
CN108965787A (en) A kind of electronic equipment and conference system
CN205121439U (en) Interactive intelligence is dull and stereotyped
CN103295428A (en) Laser teaching interaction system
CN105786224A (en) Universal laser pointer and computer operation method
CN113490040A (en) HDMI (high-definition multimedia interface) -based transmission method and related device
CN106020433A (en) 3D vehicle terminal man-machine interactive system and interaction method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231114

Address after: 518000 135/136, floor 1, building 7, Vanke Xinghuo, No. 2, Wuhe Avenue (South), Nankeng community, Bantian street, Longgang District, Shenzhen, Guangdong Province

Patentee after: Shenzhen NVIDIA enti Technology Co.,Ltd.

Address before: 801E, 1st Floor, Building 8, Vanke Xinghuo, No. 2 Wuhe Avenue (South), Nankeng Community, Bantian Street, Longgang District, Shenzhen City, Guangdong Province, 518000

Patentee before: Shenzhen Enwei Media Co.,Ltd.