CN117227622A - Atmosphere lamp control method and device, electronic equipment and vehicle - Google Patents

Atmosphere lamp control method and device, electronic equipment and vehicle

Info

Publication number
CN117227622A
Authority
CN
China
Prior art keywords
information
light effect
effect control
control information
vehicle
Prior art date
Legal status
Pending
Application number
CN202210644466.XA
Other languages
Chinese (zh)
Inventor
齐晓磊
王鹿笛
叶晶
樊卉
Current Assignee
Shanghai Jidu Automobile Co Ltd
Original Assignee
Shanghai Jidu Automobile Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Jidu Automobile Co Ltd filed Critical Shanghai Jidu Automobile Co Ltd
Priority to CN202210644466.XA
Publication of CN117227622A

Landscapes

  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The application provides a control method and device of an atmosphere lamp, an electronic device and a vehicle. The control method is applied to a vehicle and specifically comprises the following steps: playing a first multimedia resource, wherein the first multimedia resource comprises at least one of audio, pictures and videos; when first light effect triggering information for triggering a first light effect is identified in the playing process of the first multimedia resource, determining first light effect control information, wherein the first light effect control information corresponds to the first light effect triggering information; and controlling the vehicle-mounted atmosphere lamp according to the first light effect control information. The application can realize real-time feedback of the vehicle-mounted atmosphere lamp to special information points in the playing process of the first multimedia resource, so that the cabin can provide a more immersive multimedia playing atmosphere and effect for the user, and the driving experience is improved.

Description

Atmosphere lamp control method and device, electronic equipment and vehicle
Technical Field
The present application relates to the field of vehicle control technologies, and in particular, to a method and an apparatus for controlling an atmosphere lamp, an electronic device, and a vehicle.
Background
With the development of intelligent technology, more and more services are integrated into the cabin to meet various personalized demands of users. At present, a vehicle-mounted atmosphere lamp is usually arranged in the cabin, and different atmospheres are created for the cabin by controlling the brightness or color of the atmosphere lamp. To improve the entertainment and intelligence of the cabin, the control mode of the atmosphere lamp needs to be further improved.
Disclosure of Invention
The application provides a control method and device of an atmosphere lamp, electronic equipment and a vehicle.
According to a first aspect of the present application, there is provided a control method of an atmosphere lamp, applied to a vehicle, the method comprising:
playing a first multimedia resource, wherein the first multimedia resource comprises at least one of audio, pictures and videos;
when first light effect triggering information for triggering first light effects is identified in the playing process of the first multimedia resources, first light effect control information is determined, and the first light effect control information corresponds to the first light effect triggering information;
and controlling the vehicle-mounted atmosphere lamp according to the first light effect control information.
In the embodiment of the application, in the playing process of the first multimedia resource, the cockpit domain controller (CDC) can capture light effect triggering information. When the CDC captures the first light effect triggering information, this indicates that a special light effect control mechanism is currently triggered, and the vehicle-mounted atmosphere lamp can be controlled based on the first light effect control information corresponding to the first light effect triggering information, so that the vehicle-mounted atmosphere lamp presents the first light effect. In this way, real-time feedback of the vehicle-mounted atmosphere lamp to special information points in the playing process of the first multimedia resource is realized, the cabin can provide a more immersive multimedia playing atmosphere and effect for the user, and the driving experience is improved.
Optionally, when the first light effect triggering information for triggering the first light effect is identified in the playing process of the first multimedia resource, determining the first light effect control information includes:
identifying first information in the playing process of the first multimedia resource, wherein the first information is associated with the first multimedia resource and comprises at least one of multimedia characteristic information of the first multimedia resource and man-machine interaction information associated with the first multimedia resource;
and determining the first light effect control information when target information matched with the first light effect trigger information is identified.
In this embodiment, in the process of playing the first multimedia resource, at least one of the multimedia feature information of the first multimedia resource and the man-machine interaction information associated with the first multimedia resource may be identified, and the light effect triggering information may be captured therein. This allows the vehicle-mounted atmosphere lamp to reflect emotion feedback on the first multimedia resource and on the user's interaction based on the first multimedia resource, so that emotional communication between the cabin and the user can be enhanced through the vehicle-mounted atmosphere lamp.
Optionally, the first light effect triggering information includes first keyword information, and the first light effect control information corresponds to the first keyword information;
when target information matched with the first light effect triggering information is identified, determining the first light effect control information comprises the following steps:
and when the target keyword information matched with the first keyword information is identified, determining the first light effect control information.
In this embodiment, when the CDC identifies the target keyword information matched with the first keyword information, the first light effect control information may be determined according to the first keyword information, so that real-time feedback of the vehicle-mounted atmosphere lamp to the special keyword may be implemented, and the playing effect and atmosphere of the multimedia resource in the cabin may be further improved.
Optionally, the first lighting effect triggering information includes first paragraph node information, and the first lighting effect control information corresponds to the first paragraph node information;
when target information matched with the first light effect triggering information is identified, determining the first light effect control information comprises the following steps:
determining the first lighting effect control information when target paragraph node information matched with the first paragraph node information is identified;
wherein the first paragraph node information includes at least one of:
paragraph node information for characterizing a music start node and/or a music end node;
paragraph node information for characterizing a chorus start node and/or a chorus end node;
paragraph node information for characterizing a node for prompting an interactor.
In this embodiment, when the CDC identifies the target paragraph node information matched with the first paragraph node information, the first light effect control information may be determined according to the first paragraph node information, so that the vehicle-mounted atmosphere lamp may play a role in prompting a special paragraph node, and further, the playing effect and atmosphere of the multimedia resource in the cabin may be improved.
Optionally, the first light effect triggering information includes first input information, and the first light effect control information corresponds to the first input information;
when target information matched with the first light effect triggering information is identified, determining the first light effect control information comprises the following steps:
and determining the first light effect control information when target input information matched with the first input information is identified.
In this embodiment, when the CDC identifies the target input information matched with the first input information, the first light effect control information may be determined according to the first input information, so that the on-vehicle atmosphere lamp may not only feed back the interaction information input by the user in real time, but also further improve the playing effect and atmosphere of the multimedia resource in the cabin.
Optionally, the first light effect triggering information includes first interaction feedback information, and the first light effect control information corresponds to the first interaction feedback information;
when target information matched with the first light effect triggering information is identified, determining the first light effect control information comprises the following steps:
determining the first light effect control information when target interaction feedback information matched with the first interaction feedback information is identified;
wherein the first interactive feedback information includes at least one of:
the interaction feedback information is used for representing the interaction start or the interaction end;
the interactive feedback information is used for representing the switching of interactors;
the interaction feedback information is used for representing that the interaction score is in a preset score range;
and the interaction feedback information is used for representing the interaction progress.
In this embodiment, when the CDC identifies the target interactive feedback information matched with the first interactive feedback information, the first light effect control information may be determined according to the first interactive feedback information, so that the vehicle-mounted atmosphere lamp may play a role in prompting the special interactive feedback information, and further, the playing effect and atmosphere of the multimedia resource in the cabin may be improved.
Optionally, the controlling the vehicle-mounted atmosphere lamp according to the first light effect control information includes:
and controlling the vehicle-mounted atmosphere lamp according to the first light effect control information and the second light effect control information, wherein the second light effect control information is determined according to the first multimedia resource.
In this embodiment, when the CDC identifies the first light effect triggering information, the vehicle-mounted atmosphere lamp is controlled according to the first light effect control information and the second light effect control information, where the second light effect control information may be determined according to the first multimedia resource. In this way, when the first light effect triggering information is captured, the resulting light effect is given the personalized features of the first multimedia resource through the second light effect control information, thereby improving the personalized feedback of the vehicle-mounted atmosphere lamp to different multimedia resources and further improving the playing effect and atmosphere of multimedia resources in the cabin.
Optionally, the controlling the vehicle-mounted atmosphere lamp according to the first light effect control information and the second light effect control information includes any one of the following:
according to the light effect control information with higher priority in the first light effect control information and the second light effect control information, controlling the vehicle-mounted atmosphere lamp;
and combining the first light effect control information with the second light effect control information to obtain target light effect control information, and controlling the vehicle-mounted atmosphere lamp according to the target light effect control information.
Optionally, the method further comprises:
identifying target tag information of the first multimedia resource, wherein the target tag information comprises at least one of type tag information, emotion tag information and musical instrument tag information;
and in the playing process of the first multimedia resource, under the condition that the first light effect triggering information is not recognized, controlling the vehicle-mounted atmosphere lamp according to third light effect control information, wherein the third light effect control information corresponds to the target tag information.
In this embodiment, the personalized features of the first multimedia resource may be represented by the target tag information, so as to determine the corresponding third light effect control information for different multimedia resources.
Optionally, the vehicle-mounted atmosphere lamp is arranged in at least two areas;
controlling the vehicle-mounted atmosphere lamp according to the first light effect control information comprises the following steps:
under the condition that the first light effect triggering information carries first position information, controlling vehicle-mounted atmosphere lamps in a target area according to the first light effect control information, wherein the target area is determined in the at least two areas based on the first position information;
The first light effect control information corresponds to the first light effect trigger information and the target area.
In this embodiment, the location information carried in the first lighting effect triggering information may correspondingly determine the atmosphere lamp area to be controlled, so as to further enhance interactivity between the vehicle-mounted atmosphere lamp and the user in the first multimedia resource playing process.
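As a small illustration of this area selection, the position information carried in the triggering information could be mapped onto one of the lamp areas roughly as follows; the area names and the mapping itself are assumptions, not part of the application:

```python
# Hypothetical mapping from the first position information to the target area,
# assuming the vehicle-mounted atmosphere lamp is arranged in at least two areas.
POSITION_TO_AREA = {
    "driver": "left_ip_strip",
    "front_passenger": "right_ip_strip",
    "rear_left": "left_rear_door_strip",
    "rear_right": "right_rear_door_strip",
}

def target_area(first_position_info: str) -> str:
    """Determine the target area to control from the first position information."""
    return POSITION_TO_AREA.get(first_position_info, "all_areas")

print(target_area("rear_left"))  # 'left_rear_door_strip'
```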
Optionally, the vehicle-mounted atmosphere lamp comprises a plurality of light emitting components;
the first light effect control information includes at least one of:
at least one of a division rule of display areas, the number of the light emitting parts in each of the display areas, and the number of the light emitting parts per unit display length;
and a rule that a parameter value of the display parameter of each light emitting component changes with time, wherein the display parameter comprises at least one of display color, display brightness, display color temperature, start display time, end display time, continuous display duration and flicker frequency.
According to a second aspect of the present application, there is provided a control device of an atmosphere lamp, applied to a vehicle, the device comprising:
the playing module is used for playing a first multimedia resource, wherein the first multimedia resource comprises at least one of audio, pictures and videos;
the identification determining module is used for determining first light effect control information when first light effect triggering information for triggering first light effects is identified in the playing process of the first multimedia resource, and the first light effect control information corresponds to the first light effect triggering information;
and the first control module is used for controlling the vehicle-mounted atmosphere lamp according to the first light effect control information.
Optionally, the identification determination module includes:
the identification unit is used for identifying first information in the playing process of the first multimedia resource, wherein the first information is associated with the first multimedia resource and comprises at least one of multimedia characteristic information of the first multimedia resource and man-machine interaction information associated with the first multimedia resource;
and the determining unit is used for determining the first light effect control information when the target information matched with the first light effect triggering information is identified.
Optionally, the first light effect triggering information includes first keyword information, and the first light effect control information corresponds to the first keyword information;
the determining unit is used for:
and when the target keyword information matched with the first keyword information is identified, determining the first light effect control information.
Optionally, the first lighting effect triggering information includes first paragraph node information, and the first lighting effect control information corresponds to the first paragraph node information;
the determining unit is used for:
determining the first lighting effect control information when target paragraph node information matched with the first paragraph node information is identified;
wherein the first paragraph node information includes at least one of:
paragraph node information for characterizing a music start node and/or a music end node;
paragraph node information for characterizing a chorus start node and/or a chorus end node;
paragraph node information for characterizing a node for prompting an interactor.
Optionally, the first light effect triggering information includes first input information, and the first light effect control information corresponds to the first input information;
the determining unit is used for:
and determining the first light effect control information when target input information matched with the first input information is identified.
Optionally, the first light effect triggering information includes first interaction feedback information, and the first light effect control information corresponds to the first interaction feedback information;
the determining unit is used for:
determining the first light effect control information when target interaction feedback information matched with the first interaction feedback information is identified;
wherein the first interactive feedback information includes at least one of:
the interaction feedback information is used for representing the interaction start or the interaction end;
the interactive feedback information is used for representing the switching of interactors;
the interaction feedback information is used for representing that the interaction score is in a preset score range;
and the interaction feedback information is used for representing the interaction progress.
Optionally, the first control module is configured to:
and controlling the vehicle-mounted atmosphere lamp according to the first light effect control information and the second light effect control information, wherein the second light effect control information is determined according to the first multimedia resource.
Optionally, the first control module is configured to:
according to the light effect control information with higher priority in the first light effect control information and the second light effect control information, controlling the vehicle-mounted atmosphere lamp;
and combining the first light effect control information with the second light effect control information to obtain target light effect control information, and controlling the vehicle-mounted atmosphere lamp according to the target light effect control information.
Optionally, the apparatus further comprises:
the identification module is used for identifying target tag information of the first multimedia resource, wherein the target tag information comprises at least one of type tag information, emotion tag information and musical instrument tag information;
and the second control module is used for controlling the vehicle-mounted atmosphere lamp according to third light effect control information when the first light effect triggering information is not recognized in the playing process of the first multimedia resource, wherein the third light effect control information corresponds to the target tag information.
Optionally, the vehicle-mounted atmosphere lamp is arranged in at least two areas;
the first control module is used for:
under the condition that the first light effect triggering information carries first position information, controlling vehicle-mounted atmosphere lamps in a target area according to the first light effect control information, wherein the target area is determined in the at least two areas based on the first position information;
the first light effect control information corresponds to the first light effect trigger information and the target area.
Optionally, the vehicle-mounted atmosphere lamp comprises a plurality of light emitting components;
the first light effect control information includes at least one of:
at least one of a division rule of display areas, the number of the light emitting parts in each of the display areas, and the number of the light emitting parts per unit display length;
and a rule that a parameter value of the display parameter of each light emitting component changes with time, wherein the display parameter comprises at least one of display color, display brightness, display color temperature, start display time, end display time, continuous display duration and flicker frequency.
According to a third aspect of the present application, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first aspect of the application.
According to a fourth aspect of the present application there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of the first aspect of the present application.
According to a fifth aspect of the present application there is provided a computer program product comprising a computer program which, when executed by a processor, implements the method of the first aspect of the present application.
According to a sixth aspect of the application there is provided a vehicle configured to perform the method of the first aspect of the application.
Drawings
Fig. 1 is a schematic diagram of setting an on-vehicle atmosphere lamp according to an embodiment of the present application;
fig. 2 is a schematic view of an appearance structure of a vehicle-mounted atmosphere lamp according to an embodiment of the present application;
fig. 3 is a flow chart of a control method of an atmosphere lamp according to an embodiment of the present application;
fig. 4a is a schematic diagram of a display unit of a vehicle-mounted atmosphere lamp according to an embodiment of the present application;
fig. 4b is a schematic diagram of a lighting effect of a vehicle-mounted atmosphere lamp according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a control device for an atmosphere lamp according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a vehicle according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application will now be described with reference to the accompanying drawings, in which various details of the embodiments of the present application are included to facilitate understanding, and are to be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
The embodiment of the application provides a control method of an atmosphere lamp, which can be used for controlling the atmosphere lamp in a vehicle cabin. The control method may be performed by the vehicle, specifically by a cockpit domain controller (Cockpit Domain Controller, CDC) or by a controller dedicated to controlling the atmosphere lamp, which is not limited herein. The control method of the atmosphere lamp will be described below by taking execution by the CDC as an example.
It should be noted that, in the embodiment of the present application, the number, type and arrangement form of the atmosphere lamps in the cabin are not limited. For example, the atmosphere lamps in the cabin may all be intelligent atmosphere lamps, or may include both general atmosphere lamps and intelligent atmosphere lamps, where an intelligent atmosphere lamp is an atmosphere lamp whose display parameters, such as brightness and color temperature, can be controlled by a program, and a general atmosphere lamp is an atmosphere lamp that can only be turned on or off, with fixed display parameters such as brightness and color temperature. For another example, the atmosphere lamps in the cabin may be provided in the form of at least one lamp strip or in the form of at least one lamp array, which is not particularly limited herein.
Illustratively, as shown in fig. 1, six intelligent atmosphere lamp strips are provided in the cabin, specifically including two intelligent atmosphere lamp strips 11-1 and 11-2 respectively provided on the right instrument panel (IP) and the left IP, and four intelligent atmosphere lamp strips 11-3, 11-4, 11-5 and 11-6 respectively provided on the right front door, right rear door, left front door and left rear door. The lengths of the six intelligent atmosphere lamp strips, that is, the number of light emitting components in each strip, may be the same or different. Further, the two intelligent atmosphere lamp strips 11-1 and 11-2 provided on the right IP and the left IP may be arranged separately; the two intelligent atmosphere lamp strips 11-3 and 11-4 provided on the right front door and the right rear door may be arranged separately or connected into a whole; and the two intelligent atmosphere lamp strips 11-5 and 11-6 provided on the left front door and the left rear door may be arranged separately or connected into a whole. Further, the six lamp strips 11-1, 11-2, 11-3, 11-4, 11-5 and 11-6 may be connected into a whole, forming the U-shaped lamp strip shown in fig. 2 in the cabin, which may be determined according to practical conditions and is not particularly limited. Eight ordinary atmosphere lamp arrays are further arranged in the cabin, specifically including two rings of ordinary atmosphere lamp arrays 12-1 and 12-2 respectively arranged around the right loudspeaker and the left loudspeaker, four ordinary atmosphere lamp arrays 12-3, 12-4, 12-5 and 12-6 respectively arranged on the right front door, right rear door, left front door and left rear door, and two ordinary atmosphere lamp arrays 12-7 and 12-8 respectively arranged on the right CC and the left CC of the cabin.
Referring to fig. 3, fig. 3 is a flowchart of a control method of an atmosphere lamp according to an embodiment of the application. As shown in fig. 3, the control method includes the steps of:
step 301, playing the first multimedia resource.
The first multimedia resource may be a multimedia resource that is being played or to be played in a running vehicle-mounted application. The vehicle-mounted application may include, but is not limited to, a music playing application, an audio playing application, a video playing application, a karaoke application, a recording application, a game application, and the like. The first multimedia resource may specifically include at least one of audio, pictures and video, for example, pure music and songs played in a music playing application or a karaoke application, or video and background music in a video playing application or a game application, which is not limited herein.
In particular, after the vehicle-mounted application is running, the CDC may play the first multimedia resource.
Step 302, when first light effect triggering information for triggering the first light effect is identified in the playing process of the first multimedia resource, determining first light effect control information, wherein the first light effect control information corresponds to the first light effect triggering information.
In a specific implementation, the CDC may preset one or more pieces of light effect triggering information, and correspondingly determine the light effect and light effect control information of each. During the playing of the first multimedia resource, the CDC may identify the first multimedia resource itself, for example, identify the music spectrum information of song audio, the lyrics of song audio, the image information of a picture, and so on. Alternatively, received information may be identified, for example audio received by a microphone or an operation performed by the user within the cabin, to determine whether light effect triggering information is present. When first light effect triggering information for triggering the first light effect is identified, the first light effect control information can be determined so as to trigger the first light effect.
The light effect triggering information may be determined based on the type of multimedia resource playable by the CDC. For example, in a case where the first multimedia resource is song audio, the light effect triggering information may be set as a keyword, for example "meteor", and when the CDC recognizes that "meteor" exists in the song lyrics, it may determine that the first light effect control information is the light effect control information corresponding to "meteor". Alternatively, the light effect triggering information may be set as a paragraph node, for example a chorus start node, and when the CDC identifies the chorus start node of the song, it may determine that the first light effect control information is the light effect control information corresponding to the chorus start node. For another example, in a case where the first multimedia resource is a picture, the light effect triggering information may be set as a flower image, and when the CDC recognizes that the picture includes a flower image, it may determine that the first light effect control information is the light effect control information corresponding to the flower image. The light effect triggering information may also be determined based on the content of the multimedia resource playable by the CDC or on interaction information, which may be determined according to the actual situation and is not limited here.
The first light effect control information is the light effect control information corresponding to the first light effect triggering information. A vehicle-mounted atmosphere lamp is generally composed of a plurality of light emitting components, such as lamp beads, and the light effect control information may include information for controlling the plurality of light emitting components.
Optionally, the light effect control information includes at least one of: 1) at least one of a division rule of display areas, the number of light emitting components in each display area, and the number of light emitting components per unit display length, where a display area is the unit in which the atmosphere lamp is controlled; 2) the parameter value of a display parameter of each light emitting component, and a rule according to which the parameter value changes over time, where the display parameter may include at least one of display color, display brightness, display color temperature, start display time, end display time, continuous display duration and flicker frequency, and the rule of change over time may include speed information of the change, such as the speed in the case of a uniform change, or the initial parameter value and acceleration in the case of a non-uniform change. By setting the light effect control information, the CDC can make the vehicle-mounted atmosphere lamp as a whole exhibit the first light effect.
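For illustration only, the pieces of light effect control information listed above can be pictured as a small data structure. The following Python sketch is a minimal, hypothetical model (the class and field names, such as LightEffectControlInfo, are assumptions and do not appear in the application):

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class DisplayParams:
    # Parameter values of the display parameters of one light emitting component.
    color_rgb: Tuple[int, int, int] = (255, 255, 255)  # display color
    brightness: float = 1.0                             # display brightness (0.0-1.0)
    color_temp_k: Optional[int] = None                  # display color temperature
    start_ms: int = 0                                   # start display time
    end_ms: Optional[int] = None                        # end display time
    blink_hz: float = 0.0                               # flicker frequency

@dataclass
class AreaRule:
    # One display area produced by the division rule, and the beads it contains.
    area_id: str
    bead_count: int
    params: DisplayParams = field(default_factory=DisplayParams)

@dataclass
class LightEffectControlInfo:
    # One piece of light effect control information: the area division plus a rule
    # describing how the parameter values change over time.
    areas: List[AreaRule]
    beads_per_unit_length: Optional[int] = None
    time_rule: str = "uniform"            # e.g. "uniform" (constant speed) or "accelerated"
    speed_ms_per_bead: Optional[int] = None
```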
Step 303, controlling the vehicle-mounted atmosphere lamp according to the first light effect control information.
In the embodiment of the application, in the playing process of the first multimedia resource, the CDC can capture light effect triggering information. When the CDC captures the first light effect triggering information, this indicates that a special light effect control mechanism is currently triggered, and the vehicle-mounted atmosphere lamp can be controlled based on the first light effect control information corresponding to the first light effect triggering information, so that the vehicle-mounted atmosphere lamp presents the first light effect. In this way, real-time feedback of the vehicle-mounted atmosphere lamp to special information points in the playing process of the first multimedia resource is realized, the cabin can provide a more immersive multimedia playing atmosphere and effect for the user, and the driving experience is improved.
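To make the overall flow of steps 301 to 303 easier to follow, here is a minimal sketch of how a CDC-side controller might capture light effect triggering information during playback. All names (AmbientLightController, identify_trigger, on_playback_tick) are hypothetical and only stand in for the behaviour described above, not for any actual CDC interface:

```python
from typing import Dict, Optional

class AmbientLightController:
    """Hypothetical CDC-side controller sketching steps 301-303."""

    def __init__(self, trigger_to_control: Dict[str, dict]):
        # Preset mapping: light effect triggering information -> light effect control information.
        self.trigger_to_control = trigger_to_control

    def identify_trigger(self, frame_info: dict) -> Optional[str]:
        # Step 302 (first half): inspect multimedia feature information or
        # man-machine interaction information produced during playback and
        # return the matched triggering information, if any.
        for trigger in self.trigger_to_control:
            if trigger in frame_info.get("lyrics", "") or trigger == frame_info.get("node"):
                return trigger
        return None

    def on_playback_tick(self, frame_info: dict) -> None:
        # Step 301 (playing the first multimedia resource) happens elsewhere;
        # this callback receives information about the resource as it plays.
        trigger = self.identify_trigger(frame_info)
        if trigger is not None:
            control_info = self.trigger_to_control[trigger]  # step 302 (second half)
            self.apply_light_effect(control_info)            # step 303

    def apply_light_effect(self, control_info: dict) -> None:
        # Step 303: drive the vehicle-mounted atmosphere lamp (details omitted here).
        print(f"controlling atmosphere lamp with {control_info}")

# Usage sketch: a "meteor" lyric during playback triggers the corresponding light effect.
cdc = AmbientLightController({"meteor": {"effect": "meteor", "speed_ms_per_bead": 20}})
cdc.on_playback_tick({"lyrics": "a meteor streaks across the sky"})
```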
In an alternative embodiment, step 302 includes:
identifying first information in the playing process of the first multimedia resource, wherein the first information is associated with the first multimedia resource and comprises at least one of multimedia characteristic information of the first multimedia resource and man-machine interaction information associated with the first multimedia resource;
and determining the first light effect control information when the target information matched with the first light effect trigger information is identified.
In this embodiment, during the playing of the first multimedia resource, first information associated with the first multimedia resource may be identified; specifically, at least one of the multimedia feature information of the first multimedia resource and the man-machine interaction information associated with the first multimedia resource may be identified, and the light effect triggering information may be captured therein. The multimedia feature information of the first multimedia resource can represent the emotional expression of the first multimedia resource, and light effect triggering information captured in it can embody emotion feedback of the vehicle-mounted atmosphere lamp on the first multimedia resource. The man-machine interaction information associated with the first multimedia resource can represent the emotional expression of the user's interaction based on the first multimedia resource, and light effect triggering information captured in it can embody emotion feedback of the vehicle-mounted atmosphere lamp on that interaction. In this way, emotional communication between the cabin and the user during the playing of the first multimedia resource can be enhanced through the vehicle-mounted atmosphere lamp.
In a specific implementation, the multimedia feature information of the first multimedia resource refers to feature information of the first multimedia resource itself, which may specifically include music feature information, text feature information, image feature information and the like, for example lyric information, subtitle information, rhythm feature information, audio feature information, tone feature information, paragraph node information and so on. The man-machine interaction information associated with the first multimedia resource may include user input information associated with the first multimedia resource, such as a fast-forward instruction, a song selection instruction, a song switching instruction, or singing audio input by the user. The man-machine interaction information associated with the first multimedia resource may further include interaction feedback information associated with the first multimedia resource; for example, if the first multimedia resource is independent pure music or a song, the interaction feedback information may include information indicating that the playing of the first multimedia resource is finished; or, if the first multimedia resource is a song sung by the user in a karaoke application, the interaction feedback information may include feedback information output by the karaoke application during the singing process, for example, information representing the singing progress, the singer, or the singing score.
Optionally, for the captured first light effect triggering information, four embodiments are described below:
in a first embodiment, the first light effect triggering information includes first keyword information;
step 302 includes: and when the target keyword information matched with the first keyword information is identified, determining first light effect control information, wherein the first light effect control information corresponds to the first keyword information.
In this embodiment, text information is a type of multimedia feature information of the first multimedia resource. The CDC may identify the text information of the first multimedia resource, for example, identify lyric information of song audio, subtitle information of a video, or text information in a picture, and perform the special light effect control corresponding to preset keyword information by capturing keyword information matching the preset keyword information. When the CDC identifies target keyword information matched with the first keyword information, the first light effect control information can be determined according to the first keyword information, so that real-time feedback of the vehicle-mounted atmosphere lamp to special keywords can be realized, further improving the playing effect and atmosphere of multimedia resources in the cabin.
In a specific implementation, the CDC may preset one or more keywords and their associated words, and correspondingly determine the light effect control information of each keyword. The preset keywords may be related to seasons, for example spring, summer, autumn and winter; to holidays, for example firecrackers, Chinese New Year and Spring Festival; or to weather, for example sunny, rainy, cloudy and snowy; and may also be other keywords for which special light effects can be realized, for example meteor, lapse, galaxy and blooming, which may be determined according to the actual situation and is not limited here.
The keyword information corresponding to a keyword may include the keyword or its associated word itself, or may include identification information of the keyword or associated word, which is not particularly limited here. During the playing of the first multimedia resource, the CDC may identify the text information of the first multimedia resource in real time based on semantic recognition, and when a target keyword identical or similar to the first keyword, or identical or similar to an associated word of the first keyword, is identified, it may be determined that light effect triggering information is captured, and the light effect control information corresponding to the first keyword is determined as the first light effect control information. Alternatively, the CDC may pre-mark the first multimedia resource before playing. Illustratively, the CDC may identify in advance, based on semantic recognition, a target keyword that is identical or similar to the first keyword or its associated word in the text information of the first multimedia resource, and mark the target keyword, for example, set a keyword mark at the playing time point of the target keyword in the song audio, or set a keyword mark on a picture in which the target keyword exists, so that during the playing of the first multimedia resource, the CDC can capture the light effect triggering information by detecting the keyword marks in real time.
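The pre-marking strategy just described can be sketched as follows. This is a hypothetical illustration (the mark_keywords helper and the timed-lyrics format are assumptions); it simply records, before playback, the playing time points at which a preset keyword or one of its associated words occurs:

```python
import re
from typing import Dict, List, Tuple

def mark_keywords(timed_lyrics: List[Tuple[int, str]],
                  keywords: Dict[str, List[str]]) -> List[Tuple[int, str]]:
    """Return (play_time_ms, keyword) marks for every preset keyword or
    associated word found in the timed lyrics of a song.

    timed_lyrics: list of (play_time_ms, lyric_line) pairs.
    keywords:     preset keyword -> list of its associated words.
    """
    marks = []
    for time_ms, line in timed_lyrics:
        for keyword, associations in keywords.items():
            if any(re.search(re.escape(word), line) for word in [keyword] + associations):
                marks.append((time_ms, keyword))
    return marks

# During playback the CDC would compare the current playing time with these marks
# and, when a mark is reached, use the light effect control information preset
# for that keyword as the first light effect control information.
print(mark_keywords([(61_000, "a meteor streaks across the sky")],
                    {"meteor": ["shooting star"]}))
```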
The light effect control information corresponding to each piece of keyword information may be determined according to the actual situation and is not limited here; for example, it may be determined based on the meaning of the keyword. For ease of understanding, two exemplary pieces of first light effect control information in this embodiment are described here:
1) Light effect control information corresponding to the keyword "meteor".
Taking the U-shaped lamp strip shown in fig. 2 as an example, the light effect control information includes:
One meteor display unit of the meteor light effect may have a unit length of 25 lamp beads, divided into five areas A to E with 5 lamp beads in each area. Within the same area, the display brightness of the lit lamp beads is kept consistent; the initial display brightness of the lit lamp beads decreases progressively from area A to area E; and the initial number of lit lamp beads may differ among the five areas. For example, one meteor display unit may be as shown in fig. 4a: the display brightness of the lit lamp beads in area A is 70%, that in area B is 65%, and so on; and the number of lit lamp beads is 5 in area A, 5 in area B, 5 in area C, 3 in area D, and 1 in area E. To show the effect of a meteor passing by, the lit lamp beads can be moved forward along the whole lamp strip at a speed of 20 ms per bead, the forward direction being from area A to area E, by controlling the lighting or extinguishing of the lamp beads and their display brightness. For example, as shown in fig. 4b, the display brightness of each lit lamp bead decreases non-uniformly with time/displacement, and the number of lit lamp beads decreases uniformly with time/displacement.
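Purely for illustration, the meteor light effect described above (a 25-bead display unit divided into areas A to E, advancing at 20 ms per bead with decaying brightness) might be approximated by the following sketch. The brightness values for areas C to E are extrapolated from the "and so on" in the text, and the frame representation is an assumption:

```python
from typing import List

# Initial meteor display unit, area A to area E: (number of lit beads, initial brightness).
# Each area spans 5 bead positions, 25 beads in total; 0.60/0.55/0.50 are extrapolated.
AREAS = [(5, 0.70), (5, 0.65), (5, 0.60), (3, 0.55), (1, 0.50)]
STEP_MS = 20  # the meteor advances by one bead every 20 ms

def meteor_frame(offset: int, strip_len: int, fade: float = 0.97) -> List[float]:
    """Per-bead brightness for one frame, with the meteor shifted forward by `offset` beads."""
    frame = [0.0] * strip_len
    pos = offset
    for lit_count, brightness in AREAS:
        for i in range(5):                      # 5 bead positions per area
            idx = pos + i
            if 0 <= idx < strip_len and i < lit_count:
                # brightness additionally decays as the meteor travels along the strip
                frame[idx] = round(brightness * (fade ** offset), 3)
        pos += 5
    return frame

# Example: the brightness pattern after the meteor has advanced 10 beads on a 100-bead strip.
print(meteor_frame(offset=10, strip_len=100)[10:35])
```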
2) Light effect control information corresponding to the keywords "fireworks"/"firecrackers".
Taking the U-shaped lamp strip shown in fig. 2 as an example, the light effect control information includes:
By controlling the turning on or off of the lamp beads and their display brightness, lit segments of lamp beads move from the midpoints of the right IP lamp strip and the left IP lamp strip towards both sides at a speed of 1 second per bead. The initial segment length is 5 lamp beads, with the color gradually transitioning from orange at the head bead to white at the tail bead, and the segment length gradually increases as it moves, for example from a length of 5 lamp beads to 7 lamp beads, and then to 10 lamp beads. In addition, to exhibit the effect of the fireworks gradually dissipating, the display brightness of each lit lamp bead decreases non-uniformly with time/displacement.
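A similar, shorter sketch for the fireworks/firecracker effect: a lit segment spreads from an IP strip midpoint, grows from 5 towards 10 beads, and fades from orange at its head to white at its tail. The growth step and colour values are assumptions used only to illustrate the idea:

```python
from typing import List, Tuple

def firework_segment(step: int) -> List[Tuple[int, int, int]]:
    """Per-bead (r, g, b) colors of one lit segment after `step` one-second moves."""
    length = min(5 + step, 10)                  # segment grows from 5 beads towards 10 beads
    orange, white = (255, 140, 0), (255, 255, 255)
    colors = []
    for i in range(length):
        t = i / max(length - 1, 1)              # 0 at the head bead, 1 at the tail bead
        colors.append(tuple(round(o + (w - o) * t) for o, w in zip(orange, white)))
    return colors

print(firework_segment(step=3))  # an 8-bead segment fading from orange to white
```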
In a second embodiment, the first lighting effect triggering information includes first paragraph node information;
step 302 includes: and when the target paragraph node information matched with the first paragraph node information is identified, determining first light effect control information, wherein the first light effect control information corresponds to the first paragraph node information.
In this embodiment, paragraph node information is a type of multimedia feature information of the first multimedia resource. The CDC may identify the paragraph node information of the first multimedia resource, for example, identify a chorus start node of song audio, a chapter end node or advertisement start node of a video, or a playing end node of a repeatedly played audio picture, and perform the special light effect control corresponding to preset paragraph node information by capturing target paragraph node information matched with the preset paragraph node information. When the CDC identifies target paragraph node information matched with the first paragraph node information, the first light effect control information is determined according to the first paragraph node information, so that the vehicle-mounted atmosphere lamp can play a role in prompting special paragraph nodes, further improving the playing effect and atmosphere of multimedia resources in the cabin.
In a specific implementation, the CDC may preset one or more paragraph nodes, and correspondingly determine the light effect control information of each paragraph node. For example, a preset paragraph node may be a chorus node, which may specifically include a chorus start node and/or a chorus end node, or may include a music start node and/or a music end node, or may further include an advertisement node, a chapter node, and the like in a video. In addition, in interactive applications such as game applications or karaoke applications, the paragraph nodes may also include nodes for prompting an interactor, for example, a node at which the singer needs to be switched during a chorus in a karaoke application, or a node at which the operator needs to be switched in an online mode of a game application. Furthermore, a preset paragraph node may be determined according to the content of different multimedia resources, for example, nodes in a song where the rhythm, emotion or pitch changes significantly, which may be determined according to the actual situation and is not limited here.
Paragraph node information corresponding to a paragraph node may include identification information of the paragraph node. The CDC may pre-mark the paragraph node before the first multimedia resource is played, for example, set a node mark at a time point when the paragraph node appears, so that during the playing process of the first multimedia resource, the CDC may capture the light effect triggering information by detecting the node mark in real time.
The light effect control information corresponding to each paragraph node information may be determined according to actual situations, and is not limited herein, for example, may be determined based on characteristics of the paragraph node information. For example, if the first paragraph node includes a chorus start node, the display color of all the light emitting components may be controlled to gradually deepen when the CDC recognizes the chorus start node.
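As an informal example of the chorus-start behaviour just mentioned, the sketch below deepens the display color of every light emitting component once a node mark of the chorus start node is reached; the node-mark representation and the deepening factor are assumptions:

```python
from typing import List, Tuple

def on_node_mark(node_type: str,
                 strip_colors: List[Tuple[int, int, int]]) -> List[Tuple[int, int, int]]:
    """Hypothetical handler: when the chorus start node is recognized, deepen the
    display color of all light emitting components (a real implementation would
    apply the change gradually over several frames)."""
    if node_type != "chorus_start":
        return strip_colors
    return [tuple(int(c * 0.8) for c in rgb) for rgb in strip_colors]

print(on_node_mark("chorus_start", [(200, 120, 255), (180, 180, 255)]))
```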
In a third embodiment, the first light effect triggering information includes first input information;
step 302 includes: and when the target input information matched with the first input information is identified, determining first light effect control information, wherein the first light effect control information corresponds to the first input information.
In this embodiment, input information is a type of man-machine interaction information associated with the first multimedia resource. The CDC may identify input information received during the playing of the first multimedia resource, for example, a touch input or voice input of the user, or audio received by a microphone, and perform the special light effect control corresponding to preset input information by capturing target input information matched with the preset input information. When the CDC identifies target input information matched with the first input information, the first light effect control information can be determined according to the first input information, so that the vehicle-mounted atmosphere lamp can feed back the interaction information input by the user in real time, further improving the playing effect and atmosphere of multimedia resources in the cabin.
In a specific implementation, the CDC may preset one or more pieces of input information, and correspondingly determine the light effect control information of each. For example, the preset input information may be touch input information, voice input information, or audio input information received by a microphone. The operations corresponding to a touch input or voice input may include, but are not limited to, a song selection operation, a song switching operation, a fast-forward operation, a volume increase or decrease operation, a pitch increase or decrease operation, and the like, which may be determined according to the functions of the music application.
The light effect control information corresponding to each piece of input information may be determined according to the actual situation and is not limited here; for example, it may be determined according to the response content corresponding to the input information. For example, if the first input information is song selection input in a karaoke application, when the CDC recognizes the song selection input, for example when a touch input on the selection button of a certain song is received, all the light emitting components may be controlled to keep the parameter values of display parameters such as display color and display brightness unchanged while flashing synchronously.
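A minimal sketch of this song-selection response, assuming a hypothetical set_strip_on callback supplied by the lamp driver: the display parameter values stay unchanged while all components flash together.

```python
import time
from typing import Callable

def flash_sync(set_strip_on: Callable[[bool], None],
               flashes: int = 3, interval_s: float = 0.3) -> None:
    """Flash all light emitting components synchronously while keeping the
    parameter values of their display parameters (color, brightness) unchanged.

    set_strip_on: hypothetical callback that turns the whole strip on or off
                  with its current parameter values.
    """
    for _ in range(flashes):
        set_strip_on(False)
        time.sleep(interval_s)
        set_strip_on(True)
        time.sleep(interval_s)

# Usage sketch (lamp_driver.set_on is an assumed interface):
# flash_sync(set_strip_on=lamp_driver.set_on)
```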
In a fourth embodiment, the first light effect triggering information includes first interactive feedback information;
Step 302 includes: and when target interactive feedback information matched with the first interactive feedback information is identified, determining first light effect control information, wherein the first light effect control information corresponds to the first interactive feedback information.
In this embodiment, interaction feedback information is a type of man-machine interaction information associated with the first multimedia resource. The CDC may identify the user's interaction feedback information during the playing of the first multimedia resource, for example, identify a singing score, a singing progress or a game progress, and perform the special light effect control corresponding to preset interaction feedback information by capturing target interaction feedback information matched with the preset interaction feedback information. When the CDC identifies target interaction feedback information matched with the first interaction feedback information, the first light effect control information can be determined according to the first interaction feedback information, so that the vehicle-mounted atmosphere lamp plays a role in prompting special interaction feedback, further improving the playing effect and atmosphere of multimedia resources in the cabin.
In a specific implementation, the CDC may preset one or more types of interaction feedback, and correspondingly determine the interaction feedback information and light effect control information of each. The preset interaction feedback may include the start and/or end of an interaction; for example, in a karaoke application, when the microphone receives the singer's audio, this indicates that the start of an interaction is identified; or, in a game application, when a game begins, this indicates that the start of an interaction is identified. The preset interaction feedback may also include an interactor switch; for example, in a karaoke application, when the microphone detects a change in the voiceprint characteristics of the singer's audio, this indicates that an interactor switch is identified; or, in a game application, when a change in the location of the interaction screen is detected (for example, switching from the primary screen to the secondary screen), this indicates that an interactor switch is identified. The preset interaction feedback may further include the interaction score falling within a preset score range; for example, in a karaoke application, when the CDC identifies a singing score greater than 90 points, this indicates that an interaction score greater than 90 points is identified; or, in a music game, when the CDC identifies a game score greater than 90 points, this indicates that an interaction score greater than 90 points is identified. The preset interaction feedback may further include the interaction progress; for example, in a karaoke application, when the CDC recognizes from the duration of the microphone input audio that the singing progress exceeds 50%, this indicates that an interaction progress over 50% is identified; or, in a music game, when the CDC identifies a game progress of more than 50%, this indicates that an interaction progress over 50% is identified.
The light effect control information corresponding to each piece of interaction feedback information may be determined according to the actual situation and is not limited here; for example, it may be determined based on the characteristics of the interaction feedback information. For ease of understanding, three exemplary pieces of light effect control information in this embodiment are described here:
1) Light effect control information corresponding to interaction feedback representing the singing progress.
In a karaoke application, the CDC may light up part of the vehicle-mounted atmosphere lamp when the microphone receives audio, and gradually increase the number of lit lamp beads as the duration of the input audio increases, so as to exhibit a "singing in progress" effect.
2) Light effect control information corresponding to interaction feedback representing that the singing score is within a preset score range.
In a karaoke application, when a singing score greater than 90 points is recognized, the CDC may control the display color of all the lamp beads to be a red color scheme; specifically, the display colors of the lamp beads may deepen sequentially in the clockwise direction, or cycle clockwise from light to dark, to exhibit a flowing effect. When a singing score between 80 and 90 points is recognized, the display color of all the lamp beads is controlled to be a blue color scheme, and when a singing score between 70 and 80 points is recognized, the display color of all the lamp beads is controlled to be a yellow color scheme.
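The score-to-color mapping in the preceding paragraph can be written down directly; the thresholds follow that example and the RGB values are placeholders:

```python
from typing import Optional, Tuple

def score_color_scheme(score: float) -> Optional[Tuple[str, Tuple[int, int, int]]]:
    """Map a recognized singing score to a display color scheme (placeholder RGB values)."""
    if score > 90:
        return "red", (200, 0, 0)
    if 80 <= score <= 90:
        return "blue", (0, 0, 200)
    if 70 <= score < 80:
        return "yellow", (200, 200, 0)
    return None  # no special light effect for other scores

print(score_color_scheme(93))  # ('red', (200, 0, 0))
```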
3) Light effect control information corresponding to interaction feedback representing the end of singing.
In a karaoke application, after recognizing that singing has ended, the CDC may control the U-shaped lamp strip shown in fig. 2 to display a rainbow effect. Specifically, 64 colors may be predefined, each color displayed with 3 to 4 lamp beads; while the score is being loaded, the colors of the lamp beads are switched so that all the lamp beads flow clockwise in rainbow colors at a speed of 20 ms per bead, and when the score has finished loading, all the lamp beads are controlled to flash three times in their current display colors.
In the embodiment of the application, when the CDC identifies the first light effect triggering information, it may control the vehicle-mounted atmosphere lamp according to the first light effect control information alone, or according to both the first light effect control information and second light effect control information, where the second light effect control information may be determined according to the first multimedia resource. In this way, when the first light effect triggering information is captured, the personalized characteristics of the first multimedia resource can be reflected through the second light effect control information, which improves the personalized feedback of the vehicle-mounted atmosphere lamp to different multimedia resources and further improves the playing effect and atmosphere of the multimedia resource in the cabin.
In a specific implementation, the second light effect control information may be determined according to the content of the first multimedia resource, and in a case where the first multimedia resource includes audio, the second light effect control information may also be determined according to music spectrum information of the first multimedia resource.
Optionally, the second light effect control information may also be determined according to at least one of type tag information, emotion tag information, and musical instrument tag information of the first multimedia resource. In specific implementation, the type tag information may be determined based on the type of the first multimedia resource; when the first multimedia resource is audio, the type tag information may include rock, folk, jazz, classical, and the like, when the first multimedia resource is video, it may include TV series, movies, online courses, variety shows, animation, and the like, and when the first multimedia resource is a picture, it may include scenery, people, buildings, and the like. The emotion tag information may be determined based on the emotion or mood conveyed by the first multimedia resource and may include cheerful, quiet, sad, inspiring, dark, and the like. The musical instrument tag information may be determined in the case where the first multimedia resource includes audio and may include guitar, piano, violin, orchestra, and the like.
The CDC may parse the first multimedia resource to be played or being played to identify at least one of its type tag information, emotion tag information, and musical instrument tag information. For example, the lyrics of a song may be parsed to identify the emotion tag information of the song, or the information displayed in the video frames, such as display brightness and color combinations, may be analyzed to identify the emotion tag information of the video. It should be noted that which of the type tag information, emotion tag information, and musical instrument tag information the CDC specifically extracts may be determined based on a user-defined setting or a default setting. For example, if the user-defined setting extracts all three of type tag information, emotion tag information, and instrument tag information, the CDC may determine target tag information based on the various combinations of different type tags, emotion tags, and instrument tags, and correspondingly determine the third light effect control information according to the target tag information; for example, the third light effect control information corresponding to cheerful, rock, and guitar may be different from the third light effect control information corresponding to sad, classical, and piano.
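To illustrate how tag combinations could map to control information, a minimal lookup-table sketch is given below; the table entries, the LightEffect fields, and the default value are assumptions, since the text only requires that different tag combinations may yield different control information:

```python
from dataclasses import dataclass

@dataclass
class LightEffect:
    color_system: str
    color_temperature: str
    flow_speed_ms: int

# Assumed example entries keyed by (emotion, type, instrument) tags.
TAG_TABLE = {
    ("cheerful", "rock", "guitar"): LightEffect("red", "warm", 20),
    ("sad", "classical", "piano"):  LightEffect("blue", "cold", 80),
}

def effect_for_tags(emotion: str, genre: str, instrument: str) -> LightEffect:
    return TAG_TABLE.get((emotion, genre, instrument),
                         LightEffect("white", "neutral", 40))
```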
Optionally, the vehicle-mounted atmosphere lamp is controlled according to the first light effect control information and the second light effect control information, for which two embodiments are provided:
in the first embodiment, the vehicle-mounted atmosphere lamp is controlled according to whichever of the first light effect control information and the second light effect control information has the higher priority.
In specific implementation, the CDC may obtain in advance the priority levels of different light effect control information; the priority levels may be user-defined or set by the CDC itself, and are not specifically limited herein. When multiple pieces of light effect control information conflict, the CDC may control the vehicle-mounted atmosphere lamp based on the light effect control information with the higher priority. In an optional embodiment, the CDC may set the priority of the first light effect control information corresponding to the first light effect triggering information to be the highest, so that when the first light effect triggering information is identified during playing of the first multimedia resource, the vehicle-mounted atmosphere lamp is controlled according to the first light effect control information.
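A minimal sketch of this priority rule is shown below; the numeric priority values and dictionary fields are assumptions, as the text only states that conflicts are resolved in favor of the higher priority and that the trigger-bound information may be given the highest priority:

```python
def select_effect(first_effect: dict, second_effect: dict, priority: dict) -> dict:
    """Return whichever piece of light effect control information has higher priority."""
    candidates = [e for e in (first_effect, second_effect) if e is not None]
    return max(candidates, key=lambda e: priority.get(e["name"], 0))

priority = {"trigger_effect": 10, "resource_effect": 5}     # assumed values
first = {"name": "trigger_effect", "pattern": "meteor"}
second = {"name": "resource_effect", "pattern": "spectrum"}
print(select_effect(first, second, priority)["pattern"])    # -> meteor
```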
In the second embodiment, the first light effect control information and the second light effect control information are combined to obtain target light effect control information, and the vehicle-mounted atmosphere lamp is controlled according to the target light effect control information.
In specific implementation, the CDC may combine the first light effect control information with the second light effect control information to obtain new light effect control information, recorded as target light effect control information, and control the vehicle-mounted atmosphere lamp according to the target light effect control information. Illustratively, taking the first light effect control information corresponding to "meteor" shown in fig. 4b as an example, the target light effect control information may include: adjusting the display color temperature of each lamp bead in that "meteor" light effect control information to a cold tone as a whole, while keeping the parameter values of the other display parameters, and the rules by which those parameter values change with time/displacement, unchanged. Also by way of example, in the case where two parallel light bands are provided in the cabin, the target light effect control information may include: the light emitting components on the lower light band follow the first light effect control information, and the light emitting components on the upper light band follow the second light effect control information.
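The two combination examples above could be sketched as follows; the field names and band identifiers are assumptions introduced here for illustration:

```python
def merge_color_temperature(first: dict, second: dict) -> dict:
    """Keep the 'meteor' parameters but take the color temperature from the second effect."""
    target = dict(first)
    target["color_temperature"] = second.get("color_temperature", "cold")
    return target

def split_across_bands(first: dict, second: dict) -> dict:
    """Drive the lower light band with the first effect and the upper band with the second."""
    return {"lower_band": first, "upper_band": second}
```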
In the embodiment of the present application, during playing of the first multimedia resource and when the first light effect triggering information is not identified, the vehicle-mounted atmosphere lamp may be controlled as described in the related art; for example, when the first multimedia resource includes audio, the vehicle-mounted atmosphere lamp is controlled to display rhythmically according to the music spectrum information of the first multimedia resource. When the CDC identifies the first light effect triggering information, the vehicle-mounted atmosphere lamp is controlled according to the first light effect control information corresponding to the first light effect triggering information, and when the execution of the first light effect control information is finished, the rhythmic display according to the music spectrum information of the first multimedia resource is resumed.
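The control flow described in this paragraph, namely rhythm display by default, the first light effect while its control information executes, and then a return to the rhythm display, could be sketched as follows; the class and function names are illustrative assumptions:

```python
class FirstEffect:
    """Stand-in for executing first light effect control information."""
    def __init__(self, total_steps: int = 5):
        self.remaining = total_steps
    def finished(self) -> bool:
        return self.remaining <= 0
    def step(self):
        self.remaining -= 1
        print("first light effect frame")

def render_spectrum(spectrum):
    print("rhythm display:", spectrum)

def control_loop(frames):
    """frames: iterable of (music_spectrum_info, trigger_info_or_None) per tick."""
    active = None
    for spectrum, trigger in frames:
        if trigger is not None:
            active = FirstEffect()          # first light effect triggering information captured
        if active is not None and not active.finished():
            active.step()                   # execute the first light effect control information
        else:
            active = None
            render_spectrum(spectrum)       # resume spectrum-driven display
```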
In an alternative embodiment, the method further comprises:
identifying target tag information of the first multimedia resource, wherein the target tag information comprises at least one of type tag information, emotion tag information and musical instrument tag information;
and in the playing process of the first multimedia resource, under the condition that the first light effect triggering information is not recognized, controlling the vehicle-mounted atmosphere lamp according to third light effect control information, where the third light effect control information corresponds to the target tag information.
In this embodiment, the CDC may determine the corresponding third light effect control information based on at least one of type tag information, emotion tag information, and musical instrument tag information of the first multimedia resource. In the playing process of the first multimedia resource, and under the condition that the first lighting effect triggering information is not recognized, the vehicle-mounted atmosphere lamp can be controlled based on the third lighting effect control information. In specific implementation, the embodiments of determining the type tag information, the emotion tag information, and the musical instrument tag information of the first multimedia resource may refer to the above related descriptions, and in order to avoid repetition, the description is omitted here.
In the embodiment of the application, the vehicle-mounted atmosphere lamps may be distributed in different areas in the cabin. For example, as shown in fig. 1, the lamp strip 11-2 on the left IP and the lamp strip 11-5 on the left front door correspond to the main driving area, the lamp strip 11-1 on the right IP and the lamp strip 11-3 on the right front door correspond to the co-driver area, the lamp strip 11-6 on the left rear door corresponds to the left rear area, the lamp strip 11-4 on the right rear door corresponds to the right rear area, and the lamp array 12-7 on the right CC and the lamp array 12-8 on the left CC correspond to the rear middle area. If the first light effect triggering information identified by the CDC carries first position information, a target area may be determined in the cabin according to the first position information, and the vehicle-mounted atmosphere lamps in the target area are controlled according to the first light effect control information, where the first light effect control information corresponds to the first light effect triggering information and the target area.
In a specific implementation, the first position information carried by the first light effect triggering information may include the position information of the interactor, and the CDC may determine the position information of the interactor according to the audio input line through which the microphone is connected, according to sound source localization of the audio, or according to the position of the interactive display screen, so as to determine the target area. For example, in the karaoke software, if the CDC recognizes from the audio received by the microphone that the current singer is the main driver, the lamp strip 11-2 on the left IP and the lamp strip 11-5 on the left front door corresponding to the main driving area shown in fig. 1 may be controlled according to the first light effect control information.
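The zone lookup described here could be sketched as below; the mapping from position labels to strip identifiers follows the areas listed above, but the labels and identifiers themselves are assumptions:

```python
ZONE_STRIPS = {
    "driver":      ["strip_11_2_left_IP", "strip_11_5_left_front_door"],
    "co_driver":   ["strip_11_1_right_IP", "strip_11_3_right_front_door"],
    "rear_left":   ["strip_11_6_left_rear_door"],
    "rear_right":  ["strip_11_4_right_rear_door"],
    "rear_middle": ["array_12_7_right_CC", "array_12_8_left_CC"],
}

def strips_for_position(position_info: str):
    """Map the interactor's position (e.g. 'driver') to the atmosphere lamps to control."""
    return ZONE_STRIPS.get(position_info, [])

print(strips_for_position("driver"))
# -> ['strip_11_2_left_IP', 'strip_11_5_left_front_door']
```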
In this embodiment, the location information carried in the first lighting effect triggering information may correspondingly determine the atmosphere lamp area to be controlled, so as to further enhance interactivity between the vehicle-mounted atmosphere lamp and the user in the first multimedia resource playing process.
In summary, in the embodiment of the present application, during the playing of the first multimedia resource, the CDC may capture light effect triggering information. When the CDC captures the first light effect triggering information, this indicates that a special light effect control mechanism is currently triggered, and the vehicle-mounted atmosphere lamp can be controlled based on the first light effect control information corresponding to the first light effect triggering information, so that the vehicle-mounted atmosphere lamp presents the first light effect. Real-time feedback of the vehicle-mounted atmosphere lamp to special information points during the playing of the first multimedia resource is thereby achieved, so that the cabin can provide the user with a more immersive multimedia playing atmosphere and effect, improving the driving experience.
Referring to fig. 5, fig. 5 is a block diagram of a control device for an atmosphere lamp according to an embodiment of the present application. The device can be applied to a vehicle.
As shown in fig. 5, the control device 500 for an atmosphere lamp includes:
a playing module 501, configured to play a first multimedia resource, where the first multimedia resource includes at least one of audio, a picture, and a video;
the identification determining module 502 is configured to determine first light efficiency control information when first light efficiency triggering information for triggering a first light efficiency is identified in a playing process of the first multimedia resource, where the first light efficiency control information corresponds to the first light efficiency triggering information;
and the first control module 503 is configured to control the vehicle-mounted atmosphere lamp according to the first light effect control information.
Optionally, the identification determination module 502 includes:
the identification unit is used for identifying first information in the playing process of the first multimedia resource, wherein the first information is associated with the first multimedia resource and comprises at least one of multimedia characteristic information of the first multimedia resource and man-machine interaction information associated with the first multimedia resource;
and the determining unit is used for determining the first light effect control information when the target information matched with the first light effect triggering information is identified.
Optionally, the first light effect triggering information includes first keyword information, and the first light effect control information corresponds to the first keyword information;
the determining unit is used for:
and when the target keyword information matched with the first keyword information is identified, determining the first light effect control information.
Optionally, the first lighting effect triggering information includes first paragraph node information, and the first lighting effect control information corresponds to the first paragraph node information;
the determining unit is used for:
determining the first lighting effect control information when target paragraph node information matched with the first paragraph node information is identified;
wherein the first paragraph node information includes at least one of:
paragraph node information for characterizing a music start node and/or a music end node;
paragraph node information for characterizing a chorus start node and/or a chorus end node;
paragraph node information for characterizing the prompt interactor node.
Optionally, the first light effect triggering information includes first input information, and the first light effect control information corresponds to the first input information;
the determining unit is used for:
And determining the first light effect control information when target input information matched with the first input information is identified.
Optionally, the first light effect triggering information includes first interaction feedback information, and the first light effect control information corresponds to the first interaction feedback information;
the determining unit is used for:
determining the first light effect control information when target interaction feedback information matched with the first interaction feedback information is identified;
wherein the first interactive feedback information includes at least one of:
the interaction feedback information is used for representing the interaction start or the interaction end;
the interactive feedback information is used for representing the switching of interactors;
the interaction feedback information is used for representing that the interaction score is in a preset score range;
and the interaction feedback information is used for representing the interaction progress.
Optionally, the first control module 503 is configured to:
and controlling the vehicle-mounted atmosphere lamp according to the first light effect control information and the second light effect control information, wherein the second light effect control information is determined according to the first multimedia resource.
Optionally, the first control module 503 is configured to any one of the following:
according to the light effect control information with higher priority in the first light effect control information and the second light effect control information, controlling the vehicle-mounted atmosphere lamp;
And combining the first light effect control information with the second light effect control information to obtain target light effect control information, and controlling the vehicle-mounted atmosphere lamp according to the target light effect control information.
Optionally, the control device 500 of the atmosphere lamp further comprises:
the identification module is used for identifying target tag information of the first multimedia resource, wherein the target tag information comprises at least one of type tag information, emotion tag information and musical instrument tag information;
and the second control module is used for controlling the vehicle-mounted atmosphere lamp according to third light effect control information when the first light effect triggering information is not recognized in the playing process of the first multimedia resource, and the third light effect control information corresponds to the target label information.
Optionally, the vehicle-mounted atmosphere lamp is arranged in at least two areas;
the first control module 503 is configured to:
under the condition that the first light effect triggering information carries first position information, controlling vehicle-mounted atmosphere lamps in a target area according to the first light effect control information, wherein the target area is determined in the at least two areas based on the first position information;
The first light effect control information corresponds to the first light effect trigger information and the target area.
Optionally, the vehicle-mounted atmosphere lamp comprises a plurality of light emitting components;
the first light effect control information includes at least one of:
at least one of a division rule of display areas, the number of the light emitting parts in each of the display areas, and the number of the light emitting parts per unit display length;
and a rule that a parameter value of the display parameter of each light emitting component changes with time, wherein the display parameter comprises at least one of display color, display brightness, display color temperature, start display time, end display time, continuous display duration and flicker frequency.
The above-mentioned control device 500 for an atmosphere lamp can implement the processes of the above-mentioned method embodiments and achieve the same beneficial effects, and in order to avoid repetition, the description is omitted here.
In the technical scheme of the application, the acquisition, storage, and application of the personal information of the users involved all comply with the provisions of relevant laws and regulations and do not violate public order and good customs.
According to embodiments of the present application, the present application also provides an electronic device, a readable storage medium and a computer program product.
Fig. 6 shows a schematic block diagram of an example electronic device 600 that may be used to implement an embodiment of the application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 6, the apparatus 600 includes a computing unit 601 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 602 or a computer program loaded from a storage unit 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the device 600 may also be stored. The computing unit 601, ROM 602, and RAM 603 are connected to each other by a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Various components in the device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, mouse, etc.; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the device 600 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The computing unit 601 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 601 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 601 performs the respective methods and processes described above, for example, the control method of the atmosphere lamp. For example, in some embodiments, the method of controlling an atmosphere lamp may be implemented as a computer software program, which is tangibly embodied on a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into the RAM 603 and executed by the computing unit 601, one or more steps of the above-described control method of the atmosphere lamp may be performed. Alternatively, in other embodiments, the computing unit 601 may be configured to perform the method of controlling the atmosphere lamp in any other suitable way (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chip (SoCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out methods of the present application may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present application, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: Local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present disclosure may be performed in parallel, sequentially, or in a different order, so long as the desired result of the technical solution of the present disclosure is achieved, and the present disclosure is not limited herein.
According to an embodiment of the present application, the present application also provides a vehicle configured to perform the control method of the atmosphere lamp provided by the embodiments of the present application. Optionally, as shown in fig. 7, the vehicle may include a computing unit 701, a ROM 702, a RAM 703, a bus 704, an input/output (I/O) interface 705, an input unit 706, an output unit 707, a storage unit 708, and a communication unit 709. The specific implementation of each part may refer to the description of the corresponding part of the electronic device in the foregoing embodiment, and in order to avoid repetition, details are not repeated here.
The above embodiments do not limit the scope of the present application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application should be included in the scope of the present application.

Claims (15)

1. A control method of an atmosphere lamp, characterized by being applied to a vehicle, the method comprising:
playing a first multimedia resource, wherein the first multimedia resource comprises at least one of audio, pictures and videos;
when first light effect triggering information for triggering first light effects is identified in the playing process of the first multimedia resources, first light effect control information is determined, and the first light effect control information corresponds to the first light effect triggering information;
and controlling the vehicle-mounted atmosphere lamp according to the first light effect control information.
2. The method of claim 1, wherein the determining the first light effect control information when the first light effect triggering information for triggering the first light effect is identified during the playing of the first multimedia resource comprises:
identifying first information in the playing process of the first multimedia resource, wherein the first information is associated with the first multimedia resource and comprises at least one of multimedia characteristic information of the first multimedia resource and man-machine interaction information associated with the first multimedia resource;
and determining the first light effect control information when target information matched with the first light effect trigger information is identified.
3. The method of claim 2, wherein the first light effect trigger information comprises first keyword information, the first light effect control information corresponding to the first keyword information;
when target information matched with the first light effect triggering information is identified, determining the first light effect control information comprises the following steps:
and when the target keyword information matched with the first keyword information is identified, determining the first light effect control information.
4. The method of claim 2, wherein the first light effect trigger information comprises first paragraph node information, the first light effect control information corresponding to the first paragraph node information;
when target information matched with the first light effect triggering information is identified, determining the first light effect control information comprises the following steps:
determining the first lighting effect control information when target paragraph node information matched with the first paragraph node information is identified;
wherein the first paragraph node information includes at least one of:
paragraph node information for characterizing a music start node and/or a music end node;
paragraph node information for characterizing a chorus start node and/or a chorus end node;
Paragraph node information for characterizing the prompt interactor node.
5. The method of claim 2, wherein the first light effect trigger information comprises first input information, the first light effect control information corresponding to the first input information;
when target information matched with the first light effect triggering information is identified, determining the first light effect control information comprises the following steps:
and determining the first light effect control information when target input information matched with the first input information is identified.
6. The method of claim 2, wherein the first light effect trigger information comprises first interactive feedback information, the first light effect control information corresponding to the first interactive feedback information;
when target information matched with the first light effect triggering information is identified, determining the first light effect control information comprises the following steps:
determining the first light effect control information when target interaction feedback information matched with the first interaction feedback information is identified;
wherein the first interactive feedback information includes at least one of:
the interaction feedback information is used for representing the interaction start or the interaction end;
The interactive feedback information is used for representing the switching of interactors;
the interaction feedback information is used for representing that the interaction score is in a preset score range;
and the interaction feedback information is used for representing the interaction progress.
7. The method according to any one of claims 1-6, wherein controlling the vehicle-mounted atmosphere lamp according to the first light effect control information comprises:
and controlling the vehicle-mounted atmosphere lamp according to the first light effect control information and the second light effect control information, wherein the second light effect control information is determined according to the first multimedia resource.
8. The method of claim 7, wherein the controlling the vehicle-mounted atmosphere lamp according to the first light effect control information and the second light effect control information comprises any one of:
according to the light effect control information with higher priority in the first light effect control information and the second light effect control information, controlling the vehicle-mounted atmosphere lamp;
and combining the first light effect control information with the second light effect control information to obtain target light effect control information, and controlling the vehicle-mounted atmosphere lamp according to the target light effect control information.
9. The method according to any one of claims 1-8, further comprising:
identifying target tag information of the first multimedia resource, wherein the target tag information comprises at least one of type tag information, emotion tag information and musical instrument tag information;
and in the playing process of the first multimedia resource, under the condition that the first light effect triggering information is not recognized, controlling the vehicle-mounted atmosphere lamp according to third light effect control information, wherein the third light effect control information corresponds to the target tag information.
10. The method according to any one of claims 1-9, wherein the vehicle atmosphere lights are arranged in at least two areas;
controlling the vehicle-mounted atmosphere lamp according to the first light effect control information comprises the following steps:
under the condition that the first light effect triggering information carries first position information, controlling vehicle-mounted atmosphere lamps in a target area according to the first light effect control information, wherein the target area is determined in the at least two areas based on the first position information;
the first light effect control information corresponds to the first light effect trigger information and the target area.
11. The method of any one of claims 1-10, wherein the vehicular atmosphere lamp comprises a plurality of light emitting components;
the first light effect control information includes at least one of:
at least one of a division rule of display areas, the number of the light emitting parts in each of the display areas, and the number of the light emitting parts per unit display length;
and a rule that a parameter value of the display parameter of each light emitting component changes with time, wherein the display parameter comprises at least one of display color, display brightness, display color temperature, start display time, end display time, continuous display duration and flicker frequency.
12. An atmosphere lamp control device, characterized by being applied to a vehicle, comprising:
the playing module is used for playing a first multimedia resource, wherein the first multimedia resource comprises at least one of audio, pictures and videos;
the identification determining module is used for determining first light effect control information when first light effect triggering information for triggering first light effects is identified in the playing process of the first multimedia resource, and the first light effect control information corresponds to the first light effect triggering information;
And the first control module is used for controlling the vehicle-mounted atmosphere lamp according to the first light effect control information.
13. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-11.
14. A non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of any one of claims 1-11.
15. A vehicle configured to perform the method of any one of claims 1-11.
CN202210644466.XA 2022-06-08 2022-06-08 Atmosphere lamp control method and device, electronic equipment and vehicle Pending CN117227622A (en)

Priority Applications (1)

Application Number: CN202210644466.XA; Priority Date: 2022-06-08; Filing Date: 2022-06-08; Title: Atmosphere lamp control method and device, electronic equipment and vehicle

Applications Claiming Priority (1)

Application Number: CN202210644466.XA; Priority Date: 2022-06-08; Filing Date: 2022-06-08; Title: Atmosphere lamp control method and device, electronic equipment and vehicle

Publications (1)

Publication Number: CN117227622A; Publication Date: 2023-12-15

Family

ID=89093569

Family Applications (1)

Application Number: CN202210644466.XA; Title: Atmosphere lamp control method and device, electronic equipment and vehicle

Country Status (1)

Country: CN; Link: CN117227622A (en)


Legal Events

Date Code Title Description
PB01 Publication