WO2024051347A1 - Method and apparatus for controlling light display - Google Patents

Method and apparatus for controlling light display (控制灯光显示的方法和装置)

Info

Publication number
WO2024051347A1
Authority
WO
WIPO (PCT)
Prior art keywords
lighting
display device
node
light
audio
Prior art date
Application number
PCT/CN2023/107419
Other languages
English (en)
French (fr)
Inventor
李华宇 (Li Huayu)
周锦 (Zhou Jin)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2024051347A1


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q3/00 Arrangement of lighting devices for vehicle interiors; Lighting devices specially adapted for vehicle interiors
    • B60Q3/80 Circuits; Control arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00 Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/10 Controlling the intensity of the light
    • H05B45/20 Controlling the colour of the light
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/105 Controlling the light source in response to determined parameters
    • H05B47/115 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/12 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by detecting audible sound
    • H05B47/165 Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]

Definitions

  • the present application relates to the field of lighting control, and more specifically, to a method and device for controlling a lighting display.
  • in-car breathing lights and ambient lights are increasingly used by car manufacturers.
  • the combination of in-cabin music and lighting effects allows the lights to move with the sound, giving the driver an immersive music experience, which is a feature that is loved by users.
  • however, the sound-synchronized lighting effects of current in-car lights are poor and cannot meet the diverse needs of users.
  • This application provides a method and device for controlling light display, which helps to improve the user's driving experience at the level of light interaction.
  • the mobile carriers may include road vehicles, water vehicles, air vehicles, industrial equipment, agricultural equipment, or entertainment equipment, etc.
  • the mobile carrier may be a vehicle in a broad sense: a means of transportation (such as a commercial vehicle, passenger car, truck, motorcycle, airplane, train, or ship), an industrial vehicle (such as a forklift, trailer, or tractor), an engineering vehicle (such as an excavator, bulldozer, or crane), agricultural equipment (such as a lawn mower or harvester), amusement equipment, or a toy vehicle; the embodiments of this application do not specifically limit the type of vehicle.
  • the mobile carrier can be a vehicle such as an airplane or a ship.
  • a method for controlling light display includes: obtaining audio data to be played; determining audio attributes of the audio data, where the audio attributes include at least one of the beat characteristics, speed characteristics, and melody characteristics of the audio data; and controlling, according to the audio attributes, changes in the lighting effect displayed by the lighting display device.
  • lighting effects can be designed based on the characteristics of one or more dimensions of the audio, which helps improve the coordination between the lighting effects and the music and the expressiveness of the lighting effects with respect to the music, thereby improving the user's driving and riding experience.
  • the lighting effect includes at least one of the following: light color, light brightness, and the dynamic effect formed by the lights; when the lighting display device includes a light strip, the dynamic effect formed by the lights includes but is not limited to: the direction and speed of the light change in the light strip, the flow length of the light, and the color change type (such as changing along the color wheel, or alternating between black and white).
  • the above audio data may be obtained from music playing software, or may be obtained through other methods, which is not specifically limited in the embodiment of the present application.
  • the audio attributes of the audio data may be determined by processing the audio data with music recognition algorithms.
  • the music recognition algorithms may include a note onset detection algorithm, a music rhythm detection algorithm, and a music melody recognition algorithm, based on techniques such as the pitch-class (chroma) contour and the constant-Q transform.
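As an illustration of how such attributes might be extracted, the following is a minimal Python sketch of an energy-based note onset detector. The patent does not disclose its actual algorithms; the frame size and threshold ratio here are assumptions for illustration only.

```python
# Illustrative sketch only: a crude energy-based note onset detector.
# frame_size and threshold_ratio are assumed parameters, not from the patent.

def onset_points(samples, frame_size=512, threshold_ratio=1.5):
    """Return frame indices where short-time energy jumps sharply,
    a minimal stand-in for a note onset detection algorithm."""
    energies = []
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        energies.append(sum(x * x for x in frame))
    onsets = []
    for i in range(1, len(energies)):
        # An onset candidate: energy rises well above the previous frame.
        if energies[i] > threshold_ratio * max(energies[i - 1], 1e-12):
            onsets.append(i)
    return onsets
```

A real system would work on spectral features (e.g., a constant-Q spectrogram) rather than raw frame energy, but the thresholding structure is the same.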
  • the audio attribute of the audio data may be directly obtained from the data source of the audio data. For example, if the data source has an audio attribute related tag, the audio attribute of the audio data may be directly determined based on the tag.
  • the lighting display device can be installed in the cabin of the vehicle.
  • it can include ambient lights, breathing lights, virtual lamp beads displayed on the vehicle display screen, etc.; or it can be installed outside the cockpit, for example as vehicle exterior lights, including but not limited to: brake lights, turn signals, low beams, high beams, daytime running lights, and position lights.
  • controlling changes in the lighting effects displayed by the lighting display device includes: controlling the nodes at which the lighting effect displayed by the lighting display device changes.
  • controlling the nodes of lighting effect change may include: controlling the time at which the lighting effect displayed by the lighting display device changes, and/or the specific form of the lighting effect before and after the change, such as the light color and the dynamic effect formed by the lights.
  • controlling the nodes of lighting effect change according to the audio attributes allows the lighting to change with at least one of the beat, speed, and melody of the audio, which enables the audio to express ideas or emotions through lighting and helps enhance the user's experience while listening to the audio.
  • the node where the lighting effect changes is associated with at least one of the following in the audio attributes: a beat point indicated by the beat feature, a point at which the downbeat frequency indicated by the speed feature changes, and a point at which the melody feature indicates a change in the melody.
  • the nodes that control the change of the lighting effect displayed by the lighting display device according to the audio attributes may include at least one of the following: controlling the change in light color before and after a beat point according to that beat point; controlling the lighting effect before and after the point at which the downbeat frequency changes according to that point; and controlling the dynamic effect of the lights before and after the point at which the melody changes according to that point.
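The beat-point case above can be sketched as follows; this Python fragment is illustrative only, and the palette and (time, color) node format are assumptions, not from the patent.

```python
# Illustrative sketch: turn beat timestamps (seconds) into lighting
# "nodes" that switch the strip to the next color in a palette.
# PALETTE and the node tuple shape are assumptions for illustration.

PALETTE = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]  # RGB colors

def nodes_from_beats(beat_times):
    """Return (time, color) pairs: at each beat point the light
    color changes to the next palette entry, cycling through it."""
    return [(t, PALETTE[i % len(PALETTE)]) for i, t in enumerate(beat_times)]
```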
  • the audio attribute also includes the rhythm characteristics of the audio data
  • the nodes that control the change of the lighting effect displayed by the lighting display device include: controlling, according to the rhythm characteristics, one or more of the light color before and after the node and the frequency of the nodes, where the frequency of the nodes can be understood as the speed of lighting effect change (or the lighting effect change frequency).
  • "before and after the node" covers one or more of the following cases: before the node, after the node, and both before and after the node, as chosen in the specific implementation.
  • the faster the rhythm indicated by the rhythm feature, the greater the change in light color before and after the node, and/or the faster the frequency of the nodes.
  • the magnitude of the light color change can be understood as the difference between the colors displayed by the same lamp bead on two adjacent occasions: the greater the difference in the red, green, blue (RGB) values of the two adjacent displays, the greater the change in light color.
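The color-change magnitude described above can be computed directly from the two RGB triples. The summed per-channel difference used in this Python sketch is an assumption; the patent does not fix a specific distance metric.

```python
# Illustrative sketch of the color-change magnitude: the difference
# between two successive RGB values of the same lamp bead.
# Summed absolute per-channel difference is an assumed metric.

def color_change_magnitude(prev_rgb, next_rgb):
    """Sum of absolute per-channel RGB differences between the color
    shown before the node and the color shown after it."""
    return sum(abs(a - b) for a, b in zip(prev_rgb, next_rgb))
```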
  • the faster the rhythm indicated by the rhythm feature, the more light colors are used for the audio.
  • for music with a faster tempo, the lighting display device can be controlled to display faster lighting effect changes, and/or larger light color changes before and after the node; for music with a slower tempo, such as classical music, the lighting display device can be controlled to display slower lighting effect changes, and/or smaller light color changes before and after the node.
  • controlling the light color and/or the speed of lighting effect change in combination with the rhythm characteristics of the audio helps further improve the expressiveness of the lights with respect to the audio. For example, making the lighting effects displayed for classical music with a weak rhythm more expressive helps further improve the user's experience when using lights that move with the music.
  • the method further includes: determining delay information, where the delay information includes at least one of the following: a first duration required to determine the audio attributes based on the audio data, a first delay generated by transmitting the lighting effect indication information, a second delay generated by encoding and decoding the lighting effect indication information, and a second duration required to generate the first lighting effect based on the lighting effect indication information; wherein the lighting effect indication information is generated based on the audio attributes and is used to indicate the first lighting effect displayed by the lighting display device; and the node that controls the change of the lighting effect displayed by the lighting display device includes: a node that controls the change of the lighting effect displayed by the lighting display device based on the audio attributes and the delay information, where the lighting effects before and after the node include the first lighting effect.
  • the nodes that control the change of the lighting effect displayed by the lighting display device include: compensating for the delay in the lighting effect displayed by the lighting display device according to at least one of the first duration, the second duration, the first delay, and the second delay, and then controlling the lighting display device to display the corresponding lighting effect.
  • a lighting display system that displays lights according to audio may include two or more links, where each link includes at least one lighting display device, and the light refresh rate of each of the two or more links may differ, so delay compensation can be performed on each link separately.
  • in this way, the node where the actual lighting effect changes can essentially coincide with the node determined based on the audio attributes of the audio data, thereby reducing the poor user experience caused by lighting effect delay.
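The per-link delay compensation described above amounts to issuing each link's command early by that link's total pipeline delay. The following Python sketch illustrates the idea; the link names and delay values are hypothetical.

```python
# Illustrative sketch of per-link delay compensation: each link is
# commanded early by its total delay (transmission + codec + render)
# so the visible change lands on the audio node. Values are assumed.

LINK_DELAYS_MS = {
    "door_strip": 40,       # hypothetical total delay for this link
    "dashboard_strip": 25,
}

def issue_times(node_time_ms, link_delays=LINK_DELAYS_MS):
    """For a lighting node at node_time_ms on the audio timeline,
    return when each link must be commanded (never before t=0)."""
    return {link: max(0, node_time_ms - d) for link, d in link_delays.items()}
```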
  • the audio attributes further include timbre characteristics
  • the timbre characteristics include a first timbre characteristic and a second timbre characteristic, and the nodes that control the change of the lighting effect displayed by the lighting display device include: nodes that control changes in the lighting effect displayed by the first lighting display device based on the first timbre characteristic; and/or nodes that control changes in the lighting effect displayed by the second lighting display device based on the second timbre characteristic; wherein the lighting display device includes the first lighting display device and the second lighting display device.
  • the first timbre feature is used to indicate relevant information of the first timbre in the audio data
  • the second timbre feature is used to indicate relevant information of the second timbre in the audio data.
  • each lighting display device performs lighting display according to one timbre characteristic.
  • for example, the audio data includes only one timbre at a time: within a first duration, the audio data includes the first timbre, and within a second duration, the audio data includes the second timbre; then the first lighting display device can be controlled to perform lighting display according to the first timbre within the first duration, and the second lighting display device can be controlled to perform lighting display according to the second timbre within the second duration.
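The timbre-to-device routing above can be sketched as a simple mapping from detected timbre segments to display assignments. This Python fragment is illustrative; the timbre labels, device names, and segment format are assumptions.

```python
# Illustrative sketch: route each detected timbre segment to its own
# lighting display device, as in the first/second timbre example above.
# TIMBRE_TO_DEVICE and the segment tuple format are assumptions.

TIMBRE_TO_DEVICE = {
    "vocal": "driver_door_strip",
    "guitar": "passenger_door_strip",
}

def route_segments(segments):
    """segments: list of (start_s, end_s, timbre). Return a list of
    (device, start_s, end_s) assignments; unmapped timbres are skipped."""
    out = []
    for start, end, timbre in segments:
        device = TIMBRE_TO_DEVICE.get(timbre)
        if device is not None:
            out.append((device, start, end))
    return out
```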
  • the first light display device and the second light display device are arranged in different locations.
  • the first light display device and the second light display device may be arranged in different areas of the cockpit.
  • for example: the first light display device may be a light strip disposed at the driver's door armrest and the second light display device a light strip disposed at the passenger door armrest; or the first light display device may be a central control screen (displaying lighting effects through virtual lamp beads) and the second light display device a light strip at the passenger door armrest; or the first light display device may be a light strip at the driver's door armrest and the second light display device at least one of a brake light, a turn signal, a low beam, a high beam, a daytime running light, and a position light.
  • the first lighting display device and the second lighting display device may be the same lighting display device.
  • making the lighting display device perform lighting display according to the timbre characteristics of the audio adds a lighting display mode, improving the sense of technology users perceive when using sound-synchronized lighting effects. In addition, it can also enrich the dynamic effect of the lights moving with the sound, thereby improving the user experience.
  • the timbre characteristics include vocal timbre characteristics and/or musical instrument timbre characteristics.
  • the voice timbre characteristics may indicate at least one of the following timbres: male voice timbre, female voice timbre, and child voice timbre.
  • the musical instrument timbre feature may indicate the timbre of at least one of the following instrument families: idiophones (body-sounding instruments), membranophones, aerophones, chordophones (string instruments), and electrophones.
  • idiophones include but are not limited to: the kouxian (jaw harp), yunban, and tanban (clappers)
  • membranophones include but are not limited to: the bass drum, double-sided drum, and octagonal drum
  • aerophones include but are not limited to: the single-reed pipe, flute, xiao, xun, trumpet, French horn, organ, accordion, and harmonica
  • chordophones include but are not limited to: the guqin, guzheng, and aijieke
  • electrophones include but are not limited to: the electric violin, electric cello, electric piano, electric organ, electric liuqin, and electric pipa.
  • the audio attribute further includes a harmony feature
  • the harmony feature includes a first harmony feature and a second harmony feature
  • the nodes that control the change of the lighting effect displayed by the lighting display device include: nodes that control the lighting effect changes displayed by the third lighting display device according to the first harmony characteristic; and/or nodes that control the lighting effect changes displayed by the fourth lighting display device according to the second harmony characteristic; wherein the lighting display device includes the third lighting display device and the fourth lighting display device.
  • the first harmony feature is used to indicate information about the first voice part in the audio data
  • the second harmony feature is used to indicate information about the second voice part in the audio data.
  • the third light display device and the fourth light display device are disposed in different locations.
  • the third light display device and the fourth light display device may be disposed in different areas of the cockpit.
  • the third light display device and the fourth light display device may be the same light display device.
  • the audio attribute also includes a chord feature
  • the chord feature includes a first chord feature and a second chord feature, and the nodes that control the change of the lighting effect displayed by the lighting display device include: nodes that control changes in the lighting effect displayed by the fifth lighting display device based on the first chord feature; and/or nodes that control changes in the lighting effect displayed by the sixth lighting display device based on the second chord feature; wherein the lighting display device includes the fifth lighting display device and the sixth lighting display device.
  • the first chord feature is used to indicate the relevant information of the first musical tone in the audio data
  • the second chord feature is used to indicate the relevant information of the second musical tone in the audio data.
  • the fifth light display device and the sixth light display device are disposed in different locations.
  • the fifth light display device and the sixth light display device may be disposed in different areas of the cockpit.
  • the fifth light display device and the sixth light display device may be the same light display device.
  • the method is applied to a cockpit.
  • the method also includes: obtaining the human body characteristics or identity information of the first user in the cockpit; and the nodes that control the change of the lighting effect displayed by the lighting display device include: nodes that control the change of the lighting effect displayed by the lighting display device based on the audio attributes and the human body characteristics or the identity information.
  • the human body characteristics include but are not limited to gender, age, emotion, etc.
  • for example, glaring light colors are used less or not at all, and/or the frequency of the nodes is reduced.
  • for example, when the human body characteristics indicate that the user is female, the light display device is controlled to display light colors preferred by women;
  • when the human body characteristics indicate that the user is male, the light display device is controlled to display light colors preferred by men.
  • controlling the node that changes the lighting effect displayed by the lighting display device includes: controlling the color of lights before and after the node and/or the frequency of the node based on the audio attribute and the human body characteristics.
  • controlling the node for changing the lighting effect displayed by the lighting display device based on the audio attributes and the identity information may also include: determining the lighting effect preference of the first user based on the identity information, and controlling, according to that preference, the color of the lights before and after the node and/or the frequency of the node.
  • the lighting effect preference includes the lighting color and/or lighting effect change frequency that the first user likes.
  • the identity information of one or more users is stored in the vehicle. If the identity information of the user in the cockpit matches the stored identity information of one of those users, that user's lighting effect preference can be determined from the match.
  • the lighting effect can be determined in combination with user preferences, human body characteristics, etc., which can meet the needs of users of different ages and/or different preferences, and help improve the user's interactive experience when using lighting display devices.
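Combining the audio-driven effect with user preference and human body characteristics, as described above, can be sketched as a simple adjustment of the node frequency. The profile fields and scaling rules in this Python fragment are assumptions for illustration.

```python
# Illustrative sketch: adjust the audio-derived node frequency using
# the occupant's stored preference and human characteristics.
# 'age' and 'preferred_max_hz' profile fields are assumptions.

def adjusted_node_frequency(base_hz, profile):
    """base_hz: node frequency derived from the audio attributes.
    profile: dict with optional 'preferred_max_hz' and 'age' keys."""
    hz = base_hz
    if profile.get("age", 0) >= 60:
        hz *= 0.5                      # calmer effects for older users
    cap = profile.get("preferred_max_hz")
    if cap is not None:
        hz = min(hz, cap)              # respect the stored preference
    return hz
```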
  • the node that controls the change of the lighting effect displayed by the lighting display device includes: when the vehicle is in a driving state, the first user is in the driver's seat of the cockpit, and the human body characteristics indicate that the first user is in a fatigued state, increasing the frequency of the node, and/or controlling the lights before and after the node to display warning colors.
  • warning colors include colors with warning effects, such as red, yellow, orange, etc.
  • warning lighting effects are displayed through lighting display equipment, which helps to improve user experience and driving safety at the same time.
  • the method further includes: obtaining environmental information outside the cockpit, the environmental information including one or more of temperature information, current season information, and light intensity information.
  • the node that controls the change of the lighting effect displayed by the lighting display device includes: controlling, according to the audio attributes and the environmental information, one or more of: the light color before and after the node, the frequency of the node, and the light brightness before and after the node.
  • the node that controls the lighting effect change displayed by the lighting display device includes: controlling the light color before and after the node based on the audio attributes and the temperature information outside the cockpit and/or the current season information.
  • for example, when the temperature outside the cockpit is lower than a preset temperature, the light color before and after the node is controlled to be a warm color.
  • the preset temperature may be 20 degrees, or it may be 25 degrees, or it may also be a temperature of other numerical values.
  • the color of the light displayed by the lighting display device is controlled based on the season information or the temperature outside the cabin: in summer or when the outside temperature is high, cool-toned lights are displayed to give people a feeling of coolness; in spring, autumn, winter, or when the outside temperature is low, warm-toned lights are displayed to give people a feeling of warmth.
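The temperature/season rule above can be sketched as a small decision function. The 20-degree preset comes from the example values in the text; the decision order and tone labels in this Python fragment are assumptions.

```python
# Illustrative sketch of the temperature/season rule: cool-toned light
# when hot outside or in summer, warm-toned light otherwise.
# PRESET_TEMP_C uses one of the example preset values from the text.

PRESET_TEMP_C = 20

def light_tone(temp_c=None, season=None):
    """Pick a light color tone from outside temperature and/or season."""
    if season == "summer" or (temp_c is not None and temp_c >= PRESET_TEMP_C):
        return "cool"   # e.g. blues/greens, to feel cooler
    return "warm"       # e.g. reds/oranges, to feel warmer
```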
  • the node that controls the lighting effect change displayed by the lighting display device includes: controlling the light brightness before and after the node according to the audio attributes and the light intensity information outside the cockpit.
  • a method for controlling light display includes: obtaining audio data to be played; and determining audio attributes of the audio data.
  • the audio attributes include at least one of the timbre characteristics, chord characteristics, and harmony characteristics of the audio data; according to the audio attributes, changes in the lighting effect displayed by the lighting display device are controlled.
  • the timbre feature includes a first timbre feature and a second timbre feature
  • controlling the change of the lighting effect displayed by the lighting display device includes: a node that controls the change of the lighting effect displayed by the first lighting display device according to the first timbre feature; and/or a node that controls the change of the lighting effect displayed by the second lighting display device according to the second timbre feature; wherein the lighting display device includes the first lighting display device and the second lighting display device.
  • the timbre characteristics include vocal timbre characteristics and/or musical instrument timbre characteristics.
  • the harmony feature includes a first harmony feature and a second harmony feature
  • controlling the change of the lighting effect displayed by the lighting display device includes: a node that controls the change of the lighting effect displayed by the third lighting display device according to the first harmony feature; and/or a node that controls the change of the lighting effect displayed by the fourth lighting display device according to the second harmony feature; wherein the lighting display device includes the third lighting display device and the fourth lighting display device.
  • the chord feature includes a first chord feature and a second chord feature
  • controlling the change of the lighting effect displayed by the lighting display device includes: a node that controls the change of the lighting effect displayed by the fifth lighting display device according to the first chord feature; and/or a node that controls the change of the lighting effect displayed by the sixth lighting display device according to the second chord feature; wherein the lighting display device includes the fifth lighting display device and the sixth lighting display device.
  • the audio attributes also include at least one of the beat characteristics, speed characteristics, and melody characteristics of the audio data; controlling the lighting effect changes displayed by the lighting display device includes: nodes that control the lighting effect changes displayed by the lighting display device based on those audio attributes.
  • the audio attribute also includes the rhythm characteristics of the audio data
  • controlling the change of the lighting effect displayed by the lighting display device includes: controlling, according to the rhythm characteristics, the color of the lights before and after the node, and/or the frequency of the node.
  • the method further includes: determining delay information, where the delay information includes at least one of the following: a first duration required to determine the audio attributes based on the audio data, a first delay generated by transmitting the lighting effect indication information, a second delay generated by encoding and decoding the lighting effect indication information, and a second duration required to generate the first lighting effect based on the lighting effect indication information; wherein the lighting effect indication information is generated based on the audio attributes and is used to indicate the first lighting effect displayed by the lighting display device; and the node that controls the change of the lighting effect displayed by the lighting display device includes: a node that controls the change of the lighting effect displayed by the lighting display device based on the audio attributes and the delay information, where the lighting effects before and after the node include the first lighting effect.
  • the method further includes: obtaining the human body characteristics or identity information of the first user in the cockpit; and controlling the change of the lighting effect displayed by the lighting display device includes: controlling the lighting effect changes displayed by the lighting display device according to the audio attributes and the human body characteristics or the identity information.
  • controlling the change of the lighting effect displayed by the lighting display device includes: when the vehicle in the cockpit is in a driving state, the first user is in the driver's seat, and the human body characteristics indicate that the first user is in a fatigued state, increasing the frequency of the node, and/or controlling the lights before and after the node to display a warning color.
  • the method further includes: obtaining temperature information outside the cockpit and/or current season information; and controlling changes in the lighting effect displayed by the lighting display device includes: controlling the color of the lights before and after the node according to the audio attributes and the temperature information outside the cockpit and/or the current season information.
  • the method also includes: obtaining light intensity information outside the cockpit; and controlling the change of the lighting effect displayed by the lighting display device includes: controlling the light brightness before and after the node according to the audio attributes and the light intensity information outside the cockpit.
  • the node where the lighting effect changes is associated with at least one of the following in the audio attributes: a beat point indicated by the beat feature, a point at which the downbeat frequency indicated by the speed feature changes, and a point at which the melody feature indicates a change in the melody.
  • a device for controlling light display includes: an acquisition unit for acquiring audio data to be played; a first determining unit for determining the audio attributes of the audio data, where the audio attributes include at least one of the beat characteristics, speed characteristics, and melody characteristics of the audio data; and a processing unit for controlling, based on the audio attributes, the nodes at which the lighting effects displayed by the lighting display device change.
  • the audio attributes also include the rhythm characteristics of the audio data
  • the processing unit is used to: control, according to the rhythm characteristics, one or more of the light color before and after the node and the frequency of the node, where the frequency of the node can be understood as the speed at which the lighting effect changes.
  • the device further includes a second determining unit, configured to determine delay information, where the delay information includes at least one of the following: a first duration required to determine the audio attributes based on the audio data, a first delay caused by transmitting the lighting effect indication information, a second delay caused by encoding and decoding the lighting effect indication information, and a second duration required to generate the first lighting effect based on the lighting effect indication information; wherein the lighting effect indication information is generated according to the audio attributes and is used to indicate the first lighting effect displayed by the lighting display device; and the processing unit is used to control, according to the audio attributes and the delay information, the node at which the lighting effect displayed by the lighting display device changes, where the lighting effects before and after the node include the first lighting effect.
  • the audio attributes further include timbre characteristics
  • the timbre characteristics include first timbre characteristics and second timbre characteristics
  • the processing unit includes a first processing unit and a second processing unit, wherein the first processing unit is used to: control the node at which the lighting effect displayed by the first lighting display device changes according to the first timbre characteristic; and/or the second processing unit is used to: control the node at which the lighting effect displayed by the second lighting display device changes according to the second timbre characteristic; wherein the lighting display device includes the first lighting display device and the second lighting display device.
  • the first processing unit and the second processing unit are the same processing unit.
  • the timbre characteristics include vocal timbre characteristics and/or musical instrument timbre characteristics.
  • the audio attributes also include harmony features
  • the harmony features include first harmony features and second harmony features
  • the processing unit includes a third processing unit and a fourth processing unit, wherein the third processing unit is used to: control the node at which the lighting effect displayed by the third lighting display device changes according to the first harmony characteristic; and/or the fourth processing unit is used to: control the node at which the lighting effect displayed by the fourth lighting display device changes according to the second harmony characteristic; wherein the lighting display device includes the third lighting display device and the fourth lighting display device.
  • the third processing unit and the fourth processing unit are the same processing unit.
  • the audio attribute further includes a chord feature
  • the chord feature includes a first chord feature and a second chord feature
  • the processing unit includes a fifth processing unit and a sixth processing unit
  • the fifth processing unit is used to: control the node at which the lighting effect displayed by the fifth lighting display device changes according to the first chord characteristic
  • the sixth processing unit is used to: control the node at which the lighting effect displayed by the sixth lighting display device changes according to the second chord characteristic; wherein the lighting display device includes the fifth lighting display device and the sixth lighting display device.
  • the fifth processing unit and the sixth processing unit are the same processing unit.
  • the acquisition unit is also used to: acquire human body characteristics or identity information of a first user in the cockpit; the processing unit is used to: control the node at which the lighting effect displayed by the lighting display device changes according to the audio attributes and the human body characteristics or the identity information.
  • the processing unit is configured to: when the vehicle in which the cockpit is located is in a driving state, the first user is in the driver's seat of the cockpit, and the human body characteristics indicate that the first user is in a fatigued state, increase the frequency of the node and/or control the lights before and after the node to display a warning color.
  • the acquisition unit is also used to: acquire environmental information outside the cockpit, where the environmental information includes one or more of temperature information, current season information, and light intensity information; the processing unit is used to: control one or more of the light color before and after the node, the frequency of the node, and the light brightness before and after the node according to the audio attribute and the environmental information.
  • the acquisition unit is also used to: acquire the temperature information outside the cockpit and/or the current season information; the processing unit is used to: control the light color before and after the node according to the audio attribute and the temperature information outside the cockpit and/or the current season information.
  • the acquisition unit is also used to: acquire the light intensity information outside the cockpit; the processing unit is used to: control the light brightness before and after the node according to the audio attribute and the light intensity information outside the cockpit.
  • the node where the lighting effect changes is associated with at least one of the following in the audio attributes: a beat point indicated by the beat feature, a point at which the downbeat frequency indicated by the speed feature changes, and a point at which the melody feature indicates a change in the melody.
  • a device for controlling light display includes: an acquisition unit, used to acquire audio data to be played; a first determination unit, used to determine audio attributes of the audio data, where the audio attributes include at least one of the timbre characteristics, chord characteristics, and harmony characteristics of the audio data; and a processing unit, used to control changes in the lighting effect displayed by the lighting display device according to the audio attributes.
  • the timbre feature includes a first timbre feature and a second timbre feature
  • the processing unit is configured to: control the node at which the lighting effect displayed by the first lighting display device changes according to the first timbre feature; and/or control the node at which the lighting effect displayed by the second lighting display device changes according to the second timbre feature; wherein the lighting display device includes the first lighting display device and the second lighting display device.
  • the timbre characteristics include vocal timbre characteristics and/or musical instrument timbre characteristics.
  • the harmony feature includes a first harmony feature and a second harmony feature
  • the processing unit is configured to: control the node at which the lighting effect displayed by the third lighting display device changes according to the first harmony feature; and/or control the node at which the lighting effect displayed by the fourth lighting display device changes according to the second harmony feature; wherein the lighting display device includes the third lighting display device and the fourth lighting display device.
  • the chord feature includes a first chord feature and a second chord feature
  • the processing unit is configured to: control the node at which the lighting effect displayed by the fifth lighting display device changes according to the first chord feature; and/or control the node at which the lighting effect displayed by the sixth lighting display device changes according to the second chord feature; wherein the lighting display device includes the fifth lighting display device and the sixth lighting display device.
  • the audio attributes also include at least one of the beat characteristics, speed characteristics, and melody characteristics of the audio data; the processing unit is also configured to: control, according to the audio attributes, the node at which the lighting effect displayed by the lighting display device changes.
  • the audio attributes also include the rhythm characteristics of the audio data
  • the processing unit is also used to: control, according to the rhythm characteristics, the light color before and after the node and/or the frequency of the node.
  • the device further includes a second determining unit, configured to determine delay information, where the delay information includes at least one of the following: a first duration required to determine the audio attribute based on the audio data, a first delay caused by transmitting the lighting effect indication information, a second delay caused by encoding and decoding the lighting effect indication information, and a second duration required to generate the first lighting effect based on the lighting effect indication information; wherein the lighting effect indication information is generated according to the audio attribute and is used to indicate the first lighting effect displayed by the lighting display device; the processing unit is used to control, according to the audio attribute and the delay information, the node at which the lighting effect displayed by the lighting display device changes, and the lighting effects before and after the node include the first lighting effect.
  • the acquisition unit is also used to: acquire human body characteristics or identity information of a first user in the cockpit; the processing unit is used to: control the node at which the lighting effect displayed by the lighting display device changes according to the audio attributes and the human body characteristics or the identity information.
  • the processing unit is configured to: when the vehicle in which the cockpit is located is in a driving state, the first user is in the driver's seat of the cockpit, and the human body characteristics indicate that the first user is in a fatigued state, increase the frequency of the node and/or control the lights before and after the node to display a warning color.
  • the acquisition unit is also used to: acquire the temperature information outside the cockpit and/or the current season information; the processing unit is also used to: control the light color before and after the node according to the audio attribute and the temperature information outside the cockpit and/or the current season information.
  • the acquisition unit is also used to: acquire the light intensity information outside the cockpit; the processing unit is also used to: control the light brightness before and after the node according to the audio attribute and the light intensity information outside the cockpit.
  • the node where the lighting effect changes is associated with at least one of the following in the audio attributes: a beat point indicated by the beat feature, a point at which the downbeat frequency indicated by the speed feature changes, and a point at which the melody feature indicates a change in the melody.
  • a device for controlling a light display includes: a memory, used to store a program; and a processor, used to execute the program stored in the memory; the processor is used to execute the method in any possible implementation manner of the first aspect or the second aspect above.
  • a mobile carrier which includes the device in any possible implementation manner of the above-mentioned third aspect to the fifth aspect, and the lighting display device.
  • the lighting display device may include ambient lights, breathing lights, vehicle-mounted display screens, head-up displays (HUD), and may also include other devices capable of displaying lights.
  • the light display device is set at at least one of: the instrument panel, the central control area display, behind the headrest of a front seat, the front center armrest, or the passenger seat, and can also be set at other locations in the vehicle.
  • the embodiments of this application do not specifically limit this.
  • a computer program product includes computer program code; when the computer program code is run on a computer, it enables the computer to execute the method in any possible implementation manner of the first aspect or the second aspect.
  • the above computer program code may be stored in whole or in part on a first storage medium, where the first storage medium may be packaged together with the processor or packaged separately from the processor; this is not specifically limited in the embodiments of this application.
  • a computer-readable medium stores program code; when the program code is run on a computer, it enables the computer to execute the method in any possible implementation manner of the first aspect or the second aspect.
  • in a ninth aspect, a chip includes a processor, used to call a computer program or computer instructions stored in a memory, so that the processor executes the method in any possible implementation manner of the first aspect or the second aspect.
  • the processor is coupled with the memory through an interface.
  • the chip system further includes a memory, and a computer program or computer instructions are stored in the memory.
  • lighting effect design can be carried out based on features of one or more dimensions of the audio, which helps to improve the coordination between the lighting effects and the audio and the expressiveness of the lighting effects with respect to the audio, thereby improving the user's driving experience.
  • controlling the nodes of lighting effect changes based on audio attributes allows the lighting to change with at least one of the audio's beat, speed, and melody, so that the ideas or emotions expressed by the audio are conveyed through the lighting, helping to improve the user's experience when listening to audio.
  • combining the rhythm characteristics of the audio to control the light color and/or the speed of lighting effect changes helps further improve the expressiveness of the lights with respect to the audio.
  • the lighting display device can also perform lighting display according to at least one of the timbre characteristics, harmony characteristics, and chord characteristics of the audio, enriching the display modes of the lighting display device and enhancing the user's experience of the lights moving with the sound.
  • the lighting effect can also be determined based on user preferences, human body characteristics, etc., which can meet the needs of users of different ages and/or different preferences, helping to improve users' interactive experience when using lighting display devices.
  • this application can make the node where the actual lighting effect changes basically coincide with the node determined based on the audio attribute of the audio data, reducing the problem of poor user experience caused by lighting effect delay.
  • Figure 1 is a schematic block diagram of a vehicle provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of an application scenario of a method for controlling light display provided by an embodiment of the present application.
  • Figure 3 is a schematic block diagram of a system for controlling light display provided by an embodiment of the present application.
  • Figure 4 is a schematic flow chart of a method for controlling light display provided by an embodiment of the present application.
  • Figure 5 is a schematic flow chart of a method for controlling light display provided by an embodiment of the present application.
  • Figure 6 is a schematic flow chart of a method for controlling light display provided by an embodiment of the present application.
  • Figure 7 is a schematic flow chart of a method for controlling light display provided by an embodiment of the present application.
  • Figure 8 is a schematic flow chart of a method for controlling light display provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of an application scenario of a method for controlling light display provided by an embodiment of the present application.
  • Figure 10 is a schematic block diagram of a device for controlling light display provided by an embodiment of the present application.
  • Figure 11 is a schematic block diagram of a device for controlling light display provided by an embodiment of the present application.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided by an embodiment of the present application.
  • Vehicle 100 may include perception system 120, display device 130, and computing platform 150.
  • Sensing system 120 may include several types of sensors that may be used to sense information about the environment surrounding vehicle 100 .
  • the sensing system 120 may include a positioning system.
  • the positioning system may be a global positioning system (GPS), a BeiDou system, or another positioning system; the sensing system 120 may further include one or more of an inertial measurement unit (IMU), a lidar, a millimeter-wave radar, an ultrasonic radar, a visual sensor, a sound sensor, a steering angle sensor, and a camera device.
  • the perception system 120 may also include one or more cameras installed inside or outside the smart cockpit to capture images inside or outside the cabin, for example, a camera of a driver monitor system (DMS), a camera of a cabin monitor system (CMS), and a driving recorder (dashcam) camera.
  • the cameras used to capture the interior and exterior of the cabin can be the same camera or different cameras.
  • the above one or more cameras can be used to collect facial information of users in the vehicle.
  • the camera and millimeter-wave radar in the cabin can be used to collect the actions or gestures of the user in the vehicle
  • the microphone in the cabin can be used to collect audio information in the vehicle.
  • the display device 130 in the embodiment of the present application mainly includes a light display device for displaying light.
  • the lighting display device may be a breathing light or an ambient light composed of a light-emitting diode (LED) light strip, and the LED light strip (hereinafter referred to as the light strip) may include multiple LED lamp beads; or, the light display device can also be other types of lights; or, the light display device can also be set on a vehicle display, such as the central control area display or a rearview mirror display, or it can also be set behind the headrest of a front seat.
  • the lighting display device involved in the embodiments of the present application may include lamp beads or lamp strips (or light strips), and its installation location is shown in (a) of Figure 2.
  • the lighting display device can be set at the central control area display screen, the instrument panel, or can be set on a liftable camera, or can be set at other locations in the cockpit.
  • the lighting display device can be a liftable device.
  • when not displaying, the lighting display device can be retracted behind the display screen, as shown at position a in (a) of Figure 2.
  • when the lighting display device performs lighting effect display, its position can be set at b and c in (a) of Figure 2 through lifting control, or it can also be hung in the upper left or upper right corner of the central control display.
  • when the light display device is installed on the instrument panel, its position can be e as shown in (a) of Figure 2, that is, the middle of the instrument panel, or it can also be raised to the top of the instrument panel, that is, position d shown in (a) of Figure 2.
  • the lighting display device may include a light strip arranged on the edge of the central control display screen, which is composed of several LED lamp beads; or the lighting display device may also be composed of LED lamp beads set on a liftable camera, as shown in (b) of Figure 2.
  • the lamp beads in the lighting display device are LED lamp beads, or they may be virtual lamp beads displayed on the vehicle display screen, as shown in (c) of Figure 2 .
  • the light display device may also include a light strip 1 disposed at the armrest of the cabin door, or a light strip 2 disposed on the top of the cabin; or, as shown in (d) of Figure 2, the lighting display device may also include a light strip 3 set at the center armrest of the front row of the cockpit, a light strip 4 set under the central control screen and the front passenger screen, or a light strip 5 set under the front windshield; alternatively, the lighting display device may also include light strips arranged at the door handles, air-conditioning outlets, storage boxes, cup holders, speakers, buckles, and the like in the cabin, which is not specifically limited in the embodiments of this application.
  • the lighting display device may also include the vehicle's headlights, taillights, brake lights, etc.
  • the computing platform 150 may include processors 151 to 15n (n is a positive integer).
  • the processor is a circuit with signal processing capabilities.
  • in one implementation, the processor may be a circuit with instruction reading and execution capabilities, such as a central processing unit (CPU), a microprocessor, a graphics processing unit (GPU), or a digital signal processor (DSP).
  • in another implementation, the processor can realize certain functions through the logical relationship of a hardware circuit, where the logical relationship of the hardware circuit is fixed or reconfigurable; for example, the processor may be a hardware circuit implemented by an application-specific integrated circuit (ASIC) or a programmable logic device (PLD), such as a field-programmable gate array (FPGA).
  • the process of the processor loading the configuration file and realizing the hardware circuit configuration can be understood as the process of the processor loading instructions to realize the functions of some or all of the above units.
  • alternatively, the processor can be a hardware circuit designed for artificial intelligence, which can be understood as a kind of ASIC, such as a neural network processing unit (NPU), a tensor processing unit (TPU), or a deep learning processing unit (DPU).
  • the computing platform 150 may also include a memory, which is used to store instructions. Some or all of the processors 151 to 15n may call instructions in the memory and execute the instructions to implement corresponding functions.
  • the processor can identify features of the music such as its rhythm, intensity, speed, and melody, and generate lighting effects based on at least one of the identified rhythm, intensity, speed, and melody features.
  • the processor can also obtain the environmental information, user body characteristics and user identity information detected by the sensing system 120, and adjust the lighting effect in combination with the above information and music characteristics.
  • the above-mentioned "light effect” includes at least one of the displayed light color, light brightness, and dynamic effects formed by the light.
  • the dynamic effects formed by the lights include but are not limited to: color gradient, light flashing, and light rhythm.
  • environmental information can include the environment inside and outside the car, such as light intensity and weather conditions; user body characteristics include but are not limited to gender, age, and emotion; user identity information includes but is not limited to biometric information stored in the mobile carrier and account numbers. Biometric information includes but is not limited to fingerprints, palmprints, facial information, iris information, and gait information; the account number may include account information for logging into the vehicle system.
  • the above-mentioned environmental information, user body characteristics and user identity information can also be stored in the memory in the computing platform 150 in the form of data.
  • the processor can process the above-mentioned environmental information, user body characteristics and user identity information, and obtain parameterized indicators to instruct the lighting display device to perform lighting effect display.
  • Figure 3 shows a system architecture diagram for controlling light display provided by an embodiment of the present application.
  • the system includes a music selection module, a music attribute identification module, a personalized lighting effect feature extraction module, a delay prediction and compensation module, a lighting effect generation module, and a lighting effect display module.
  • the music selection module, music attribute identification module, personalized lighting effect feature extraction module, delay prediction and compensation module, and lighting effect generation module may include one or more processors in the computing platform 150 shown in Figure 1; the lighting effect display module may include one or more lighting display devices in the display device 130 shown in Figure 1.
  • the music selection module is used to determine the audio used to generate the lighting effect according to the user's selection, and then perform segmented real-time processing of the audio. Further, the music selection module inputs the segmented real-time processed audio to the music attribute identification module.
  • the music attribute identification module identifies features of the music such as its intensity, beat, speed, melody, vocal timbre, instrument timbre, harmony, chords, and lyrics, and then determines the attributes required to generate lighting effects based on these features, such as the nodes where lighting effects change, the gradient points on the light strips, and the light strips to be changed.
  • the music attribute recognition module inputs the attribute information required to generate lighting effects into the personalized lighting effect feature extraction module.
  • this module can determine the lighting effect style based on the current music style, combined with environmental attributes, user body characteristics, and/or facial information, and then input the lighting effect style into the lighting effect generation module.
  • the lighting effect generation module generates lighting effects based on music attributes and lighting effect styles, and controls the lighting display module to display the lighting effects.
  • the music selection module also inputs the segmented, real-time-processed audio to the delay prediction and compensation module, which predicts the algorithm delay, transmission delay, and encoding/decoding delay; compensatory lighting effects are then added at the positions where delay exists in the lighting effects generated by the lighting effect generation module.
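As a rough sketch of the compensation idea described above (the function names and delay values are hypothetical; a real module would predict these delays from measurements rather than take them as constants):

```python
def total_delay_ms(algorithm_ms, transmission_ms, codec_ms):
    """Sum the predicted delay components (illustrative values only)."""
    return algorithm_ms + transmission_ms + codec_ms

def compensate_nodes(node_times_ms, delay_ms):
    """Shift each planned lighting-effect node earlier by the predicted delay,
    so the effect actually rendered lines up with the audio playback."""
    return [max(0, t - delay_ms) for t in node_times_ms]

delay = total_delay_ms(algorithm_ms=40, transmission_ms=15, codec_ms=10)
print(compensate_nodes([100, 500, 1000], delay))  # [35, 435, 935]
```

This makes the node where the actual lighting effect changes roughly coincide with the node determined from the audio attributes, as the summary above describes.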
  • the above-mentioned modules and devices are only examples. In actual applications, the above-mentioned modules and devices may be added or deleted according to actual needs.
  • the personalized lighting effect feature extraction module and the lighting effect generation module in Figure 3 can be combined into one module, that is, the functions of both are implemented by one module.
  • the correspondence between the audio features and the lighting effect features determined by the music attribute recognition module is as shown in Table 1.
  • the lighting effect characteristics can include when the lighting effect changes (or the node where the lighting effect changes), such as the node where the light flashes, the node where the light color changes, the node where the animation form changes.
  • the lighting effect characteristics can also include other features, for example, the direction of the light color gradient, the amplitude of the light color change, and the type of lighting effect.
  • the correspondence between audio features and lighting effect features is as follows:
  • Intensity feature: also called the loudness feature; the intensity of the mid-frequency band can be used as the intensity.
  • the maximum value point in the audio intensity (loudness) change curve can be determined to be the node where the light effect changes, such as the node where the light flashes (or the node where the light brightness changes).
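A minimal sketch of picking loudness maxima as flash nodes (it assumes the loudness curve has already been sampled into a list; the sampling itself is out of scope here):

```python
def flash_nodes(loudness):
    """Return indices of local maxima in a sampled loudness curve;
    each maximum is treated as a node where the light flashes."""
    return [i for i in range(1, len(loudness) - 1)
            if loudness[i - 1] < loudness[i] >= loudness[i + 1]]

curve = [0.2, 0.5, 0.3, 0.4, 0.9, 0.6, 0.1]
print(flash_nodes(curve))  # [1, 4]
```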
  • Beat feature: a beat, in the narrow sense, is the interval between quarter notes in the score. It should be understood that the beats in each measure can include strong beats and weak beats; or, the beats in each measure can include strong beats, secondary strong beats, and weak beats.
  • For a beat, the point where the first half of the beat meets the second half of the beat is the beat point.
  • the beat point can be used as the node where the light effect changes, for example, the node where the light color changes.
  • beat points can be used as nodes for light effect changes.
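For a constant tempo, beat points can be laid out at equal intervals; this is a simplification (real music varies in tempo, so a beat-tracking algorithm would be used in practice):

```python
def beat_points_ms(bpm, duration_ms, offset_ms=0):
    """Evenly spaced beat points, in milliseconds, for a constant tempo;
    each returned timestamp is a candidate node for a light-color change."""
    interval = 60_000 / bpm
    t, points = float(offset_ms), []
    while t <= duration_ms:
        points.append(round(t))
        t += interval
    return points

print(beat_points_ms(bpm=120, duration_ms=2000))  # [0, 500, 1000, 1500, 2000]
```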
  • Speed feature: that is, the frequency of strong beats (downbeats).
  • the amplitude of the light color change can be adjusted according to the speed characteristics. For example, the faster the frequency of strong beats occurs, the greater the amplitude of the light color change.
  • the light color change amplitude can be understood as the change between two colors displayed successively by the same lamp bead: the greater the difference between the RGB values of the two colors, the greater the light color change amplitude.
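One way to quantify that amplitude, as a sketch (Euclidean distance in RGB space is an assumption here; other color-difference metrics would also fit the description):

```python
def color_change_amplitude(rgb_a, rgb_b):
    """Amplitude of a color change between two successive colors of the same
    lamp bead, measured as Euclidean distance in RGB space."""
    return sum((a - b) ** 2 for a, b in zip(rgb_a, rgb_b)) ** 0.5

small = color_change_amplitude((200, 40, 40), (210, 40, 40))
large = color_change_amplitude((200, 40, 40), (40, 200, 200))
print(small < large)  # True: the second change is far larger
```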
  • when the downbeat frequency changes, the downbeat point at the position of the change is used as a node where the lighting effect changes, for example, a node where the lighting effect animation switches, such as from a dot-flicker animation to a light-and-shadow gradient animation.
  • the point where the first half of the downbeat meets the second half of the downbeat is the downbeat point.
  • for example, the frequency of strong beats can be divided into slow, medium, and fast: a frequency below 14 beats per minute is slow, a frequency between 14 and 27 beats per minute is medium, and a frequency above 27 beats per minute is fast; the downbeat point at the position where the frequency class changes can be used as the node where the lighting effect changes.
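The three-way classification above can be sketched directly (the thresholds come from the text; how the exact boundary values 14 and 27 are assigned is an assumption):

```python
def downbeat_tempo_class(beats_per_minute):
    """Classify the downbeat frequency: below 14 beats/min is slow,
    14-27 is medium, above 27 is fast."""
    if beats_per_minute < 14:
        return "slow"
    if beats_per_minute <= 27:
        return "medium"
    return "fast"

print([downbeat_tempo_class(b) for b in (10, 20, 30)])  # ['slow', 'medium', 'fast']
```

A change in the returned class between successive measurements would then mark a node where the lighting effect animation switches.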
  • the intensity feature and the beat feature can also be considered together to determine the node where the light color changes; for example, the node can be determined by a weighted sum of the two, with the weights determined based on the speed feature.
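A sketch of that weighted combination (the per-tempo weight values are hypothetical; the text only says the weights depend on the speed feature):

```python
def color_change_score(intensity, beat_strength, tempo_class):
    """Combine the intensity feature and the beat feature with weights chosen
    by tempo class; a high score marks a candidate color-change node."""
    weights = {"slow": (0.7, 0.3), "medium": (0.5, 0.5), "fast": (0.3, 0.7)}
    w_int, w_beat = weights[tempo_class]
    return w_int * intensity + w_beat * beat_strength

print(round(color_change_score(0.8, 0.4, "slow"), 2))  # 0.68
```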
  • Melody feature: the direction of the light gradient can be determined based on whether the melody ascends or descends; for example, if the melody ascends, a lighting design that gradients upward or toward the user's position can be displayed. It should be understood that moving from a lower pitch to a higher pitch is called an ascending melody, and moving from a higher pitch to a lower pitch is called a descending melody; for example, moving from the note do to the note re is an ascending melody, and moving from re to do is a descending melody.
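The mapping from melody movement to gradient direction, as a minimal sketch (MIDI-style pitch numbers are an assumed representation):

```python
def gradient_direction(prev_pitch, next_pitch):
    """Ascending melody -> gradient upward (or toward the user);
    descending melody -> gradient downward."""
    if next_pitch > prev_pitch:
        return "up"
    if next_pitch < prev_pitch:
        return "down"
    return "hold"

# do (60) -> re (62) ascends; re -> do descends
print(gradient_direction(60, 62), gradient_direction(62, 60))  # up down
```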
  • Vocal timbre feature: if the music contains multiple human voices, the voices can be classified by gender into male voices and female voices. One lighting display device, or one group of lighting display devices, among the multiple lighting display devices can then display lighting effects based on the male voice, and another one or another group can display lighting effects based on the female voice.
  • the multiple light display devices include the light strip at the driver's door armrest (or the first group of light display devices), the light strip at the front passenger door armrest (or the second group), the light strip at the second-row left door armrest (or the third group), the light strip at the second-row right door armrest (or the fourth group), and the light strip at the central control screen (or the fifth group).
  • four groups among the above-mentioned first to fifth groups of lighting display devices can be responsible for lighting effect display according to the audio of the guitar, bass, piano, and drums, respectively.
  • Harmony and chord characteristics: use a multi-light display in which each light (or each group of lights) is responsible for displaying light effects based on one characteristic of the harmony or chord.
  • the type and/or frequency of changes in lighting effects can be determined based on the emotions expressed in the lyrics. For example, when the emotion expressed in the lyrics is happy or positive, the types of lighting effects are relatively rich and/or the frequency of light effect changes is high; when the emotion expressed in the lyrics is sad, the types of lighting effects are relatively simple and/or the frequency of light effect changes is low.
  • The fundamental frequency (pitch) can describe the main pitch of music and human voices.
  • the maximum value point of the fundamental tone can be determined to be the node where the light effect changes, such as the node where the light flashes.
  • Lighting effect style can include light color, light effect change frequency, light brightness, etc.
  • the corresponding relationship between each feature and the lighting effect style is as follows:
  • the music style can be determined based on an audio attribute recognition algorithm, or based on music tag information carried by the acquired audio data, or by other methods; this is not specifically limited in the embodiments of this application. If the music style is classical or pure music, the corresponding lighting effect style has few light colors and a low frequency of light effect changes; if the music style is rock or heavy metal, the corresponding lighting effect style has many light colors and a high frequency of light effect changes; if the music style is lyrical music, the corresponding lighting effect style has moderate light colors and a moderate frequency of light effect changes.
  • the music style features include rhythm features, and then the light color and/or light effect change frequency is determined based on the rhythm features.
  • rhythm refers to the number of beats per unit duration (or note density).
  • note density can be used to characterize the speed of the rhythm.
  • the rhythm feature indicates that the faster the rhythm of the audio, the more colors the light changes displayed according to the audio control, and/or the faster the light effect changes frequency.
  • the rhythm feature indicates that the faster the audio rhythm changes, the greater the amplitude of the light color change before and after the node where the light effect changes is determined; and/or the faster the frequency of the node where the light effect changes.
  • the light color change amplitude can be understood as the change between two colors displayed consecutively by the same lamp bead: the greater the difference between the RGB values of the two adjacent displays, the greater the light color change amplitude.
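One simple way to quantify the color change amplitude described above is the Euclidean distance between the two RGB values (an assumption for illustration; the patent only says the amplitude grows with the RGB difference):

```python
import math

def color_change_amplitude(rgb_prev, rgb_curr):
    """Euclidean distance between two consecutive RGB colors of the
    same lamp bead; larger distance = larger change amplitude."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(rgb_prev, rgb_curr)))
```

Under this measure, red to green is a large jump, while red to a slightly darker red is a small one.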
  • Environmental attribute characteristics may include at least one of light intensity information, temperature information, and seasonal information.
  • the brightness of the light can be determined according to the ambient light intensity. For example, when the ambient light intensity is 10,000 lux or above, the light brightness can be set to the maximum brightness the light display device can achieve, that is, 100%; when the ambient light intensity is between 500 lux and 10,000 lux, the light brightness can be set between 40% and 100% of the maximum brightness, with greater ambient light intensity corresponding to higher light brightness; when the ambient light intensity is below 500 lux, the light brightness can be set to 40% of the maximum brightness that the light display device can achieve.
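The lux-to-brightness mapping above can be sketched as a piecewise function, assuming linear interpolation between the two endpoints (the patent only states the 40%–100% range, not the interpolation shape):

```python
def brightness_percent(lux):
    """Map ambient light intensity (lux) to light brightness (% of maximum)."""
    if lux >= 10000:
        return 100.0
    if lux <= 500:
        return 40.0
    # Assumed: linear interpolation between 40% at 500 lux and 100% at 10000 lux
    return 40.0 + (lux - 500) / (10000 - 500) * 60.0
```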
  • the light color can be mainly cool colors.
  • the light color can be mainly warm colors.
  • cool colors refer to colors close to the green system in the color spectrum, such as green, blue, and purple.
  • warm colors refer to colors close to the red system in the color spectrum, such as red-purple, red, orange, yellow, etc.
  • the preset temperature may be 20 degrees, or it may be 25 degrees, or it may be other temperatures.
  • User human body characteristics: determine the lighting effect style according to the user's age group. For example, when the user is between 18 and 35 years old (inclusive), use a lighting effect style with more light colors and a higher frequency of light effect changes; when the user is 36 or older, use a lighting effect style with fewer light colors and a lower frequency of light effect changes.
  • eye-catching colors may include red, blue, and purple.
  • the user's preferred lighting effect style can also be determined based on the user's identity information.
  • weights occupied by each of the above features in determining the light color and light effect change frequency may be different.
  • the weights occupied by the above three aspects may be 0.7, 0.1 and 0.2 respectively, or may be other weights; this is not specifically limited in the embodiments of this application.
  • the weights of each aspect can be determined according to the operating status of the vehicle.
  • the weights of the above three aspects when the vehicle is in a parking state can be 0.9, 0.05, and 0.05 respectively; when the vehicle is driving, the weights of the above three aspects can be 0.5, 0.2 and 0.3 respectively.
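A minimal sketch of state-dependent weighting, assuming the three aspects are (music style, environment, user) scores normalized to [0, 1] — both the aspect order and the normalization are assumptions for illustration:

```python
# Per-state weights from the example above: (music style, environment, user)
WEIGHTS = {
    "parked":  (0.9, 0.05, 0.05),
    "driving": (0.5, 0.2, 0.3),
}

def combined_score(state, music, environment, user):
    """Weighted sum of the three per-aspect scores for the given vehicle state."""
    w = WEIGHTS[state]
    return w[0] * music + w[1] * environment + w[2] * user
```

For example, while parked the music-style score dominates; while driving the environment and user scores contribute more.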
  • the node at which the lighting effect changes, that is, the moment when the lighting effect changes, can be determined; according to some other characteristics shown in Table 2, the specific lighting effects displayed according to the audio data, as well as the color and brightness of the lights, can be determined, for example, the lighting effects before and/or after the node where the lighting effect changes, as well as the color and brightness of the lights.
  • Figure 4 shows a method 400 for controlling light display provided by an embodiment of the present application.
  • the method 400 may be applied to the vehicle 100 shown in FIG. 1 , and the method may also be performed by the system shown in FIG. 3 .
  • the method 400 includes:
  • S401. Determine a first attribute of the audio data, where the first attribute includes at least one of the beat characteristics, speed characteristics and melody characteristics of the audio data.
  • this method can be applied to the cockpit of the vehicle 100 in the above embodiment, or can also be applied to the field of lighting control of smart homes.
  • the beat characteristics, speed characteristics and melody characteristics may be the characteristics shown in Table 1 of the above embodiment.
  • a note onset detection (onset detection) algorithm may be used to determine the beat of the audio data.
  • in the note onset detection algorithm, a low-frequency onset with strong energy (such as the position where a drum beat appears) is used as a roughly estimated beat position; further, the beat point is determined through short-term energy estimation.
  • short-term energy refers to the change in the average energy of the signal within a period of time.
  • the period of time may be 500 milliseconds, or may be other period of time.
  • the frequency of occurrence of downbeats may be determined through a rhythm detection algorithm.
  • the rhythm detection algorithm is used to identify note density (or beat frequency, that is, the number of beats per unit duration), and then determine the rhythm of the audio.
  • the location and number of strong beats are determined based on the short-term energy, and then the frequency of strong beats is determined.
  • the loudness maximum value can also be determined through short-term energy; a music melody recognition algorithm can be used to determine the melody changes in the audio data. It should be understood that other algorithms can also be used to determine the intensity characteristics, beat characteristics, speed characteristics, melody characteristics, etc. of the audio data, which are not specifically limited in the embodiments of the present application.
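A minimal sketch of the short-term energy and strong-beat estimation steps described above, assuming non-overlapping windows and a fixed peak threshold (both assumptions; the patent does not fix a window shape or threshold):

```python
def short_term_energy(samples, window):
    """Average energy of the signal per non-overlapping window of `window` samples."""
    return [sum(x * x for x in samples[i:i + window]) / window
            for i in range(0, len(samples) - window + 1, window)]

def strong_beat_windows(energies, threshold):
    """Indices of windows whose energy is a local maximum above `threshold`,
    used as candidate strong-beat positions."""
    peaks = []
    for i in range(1, len(energies) - 1):
        if energies[i] > threshold and energies[i] >= energies[i - 1] and energies[i] >= energies[i + 1]:
            peaks.append(i)
    return peaks
```

Counting the peaks over a known duration then gives the strong-beat frequency used by the rhythm detection step.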
  • the lighting effect includes at least one of the following: light color, light brightness, and dynamic effect formed by the light.
  • the dynamic effects caused by light include but are not limited to: color gradient, light flashing, and light rhythm.
  • a mid-range intensity change curve is generated based on the mid-range intensity change, and the maximum value point on the curve is determined to be the node where the light flashes.
  • the beat point is determined based on the beat, and the beat point can be used as the node where the light color changes.
  • the light color before and after the node can be randomly generated, or can also be determined based on external reference information.
  • the point where the speed characteristic changes is determined to be the node where the dynamic effect changes.
  • the dynamic effect before the node is a point flash animation
  • the dynamic effect after the node is a light and shadow gradient animation.
  • the jumping amplitude of the gradient point in the lighting effect changes from large to small.
  • the "gradient point" can be one or more lamp beads with different colors at the previous moment and the current moment
  • the "gradient point jump amplitude" can include the gradient point color change amplitude.
  • when the colors before and after the gradient point changes are highly saturated contrasting colors, the gradient point is considered to have a large jump amplitude; when the colors before and after the change are relatively similar, the gradient point is considered to have a small jump amplitude.
  • the "gradient point" can be one or more lamp beads arranged between two lamp beads with a large color contrast, and the "gradient point jump amplitude" can include the gradient point color change amplitude.
  • when the colors of the two adjacent lamp beads are highly saturated contrasting colors, the gradient point is considered to have a large jump amplitude; when the colors of the two adjacent lamp beads are relatively similar, the gradient point is considered to have a small jump amplitude.
  • the point where the melody changes is determined to be the node where the gradient direction changes. If the tune goes up, it can display a light effect that gradually changes upward or toward the user. If the tune goes downward, it can display a light effect that gradually changes downward or away from the user. If the tune before the node goes up and the tune after the node goes down, the light gradient direction changes from upward to downward (or from towards the user to away from the user).
  • the relationship between the above-mentioned first attribute and the node where the lighting effect changes is only an exemplary description.
  • the corresponding relationship can also take other forms, for example, determining the maximum value point on the mid-range intensity change curve as the node where the light color changes
  • the beat point is determined as the point where the light flashes
  • the point where the downbeat frequency changes is determined as the point where the gradient direction changes
  • the point where the melody is determined is the point where the dynamic effect changes.
  • only one of the first attributes may be considered, or two or more of the first attributes may be comprehensively considered.
  • music style can be combined to determine the first attribute that needs to be considered when designing lighting effects.
  • the node where the lighting effect changes can be determined mainly based on the characteristics of the tune.
  • the node where the lighting effect changes can be determined mainly based on intensity characteristics.
  • the first attribute also includes the rhythm characteristics of the audio data, and then controls the light color displayed by the lighting display device and/or the frequency of light effect changes according to the rhythm characteristics.
  • the method for controlling lighting display provided by the embodiments of the present application can design lighting effects according to multiple dimensions of musical characteristics, which helps to improve the coordination between the lighting effects and the music and the expressiveness of the lighting effects, thereby improving the user's driving experience.
  • Figure 5 shows a method 500 for controlling light display provided by an embodiment of the present application.
  • the method 500 may be applied to the vehicle 100 shown in FIG. 1 , and the method may also be performed by the system shown in FIG. 3 .
  • the method 500 includes:
  • the second attribute includes at least one of the timbre characteristics, chord characteristics, and harmony characteristics of the audio data.
  • this method can be applied to the cockpit of the vehicle 100 in the above embodiment, or can also be applied to the field of lighting control of smart homes.
  • a musical instrument identification algorithm based on a recursive graph can be used to determine the number of musical instruments in the audio data; an algorithm based on the pitch class profile (PCP) and the constant-Q transform (CQT) can be used to perform chord recognition and/or harmony recognition on the audio data and determine the number of chord tones and/or the number of harmony parts. It should be understood that other algorithms can also be used to determine the number of musical instruments, the number of chord tones, and the number of harmony parts, which are not specifically limited in the embodiments of the present application.
  • the first light display device includes one or more light display devices in the above embodiments.
  • the lighting display device includes a total of ten lighting display devices from lighting display device 1 to lighting display device 10 .
  • if the timbre feature indicates that the number of musical instruments in the audio data is 4, then four of the above-mentioned lighting display devices 1 to 10 can be controlled; for example, lighting display device 1 to lighting display device 4 can each perform lighting effect display according to the audio of one of the four musical instruments.
  • if the chord feature indicates that the number of chord tones in the audio data is 3, then three of the above-mentioned lighting display devices 1 to 10 can be controlled; for example, lighting display device 5 to lighting display device 7 can each perform lighting effect display according to one of the three musical tones.
  • if the harmony feature indicates that the number of harmony parts in the audio data is 3, then three of the above-mentioned lighting display devices 1 to 10 can be controlled; for example, lighting display device 8 to lighting display device 10 can each perform lighting effect display according to one of the three parts.
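The device-to-feature assignment in the example above can be sketched as follows, assuming devices are simply assigned in numeric order (an assumption; the patent leaves the assignment policy open):

```python
def assign_devices(n_instruments, n_chord_tones, n_parts, n_devices=10):
    """Assign lighting display devices 1..n_devices to instruments,
    chord tones, and harmony parts, in that order."""
    devices = list(range(1, n_devices + 1))
    out, pos = {}, 0
    for label, count in (("instrument", n_instruments),
                         ("chord_tone", n_chord_tones),
                         ("harmony_part", n_parts)):
        for k in range(count):
            out[devices[pos]] = (label, k + 1)
            pos += 1
    return out
```

With `assign_devices(4, 3, 3)`, devices 1–4 follow the instruments, 5–7 the chord tones, and 8–10 the harmony parts, matching the example.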
  • the lighting display device that displays lighting effects based on musical instruments and the lighting display device that displays lighting effects based on musical tones or voices can also be the same lighting display device; this is not specifically limited in the embodiments of this application.
  • the method for controlling lighting display provided by the embodiments of the present application can determine the lighting display devices used for lighting effect display based on at least one of the timbre characteristics, harmony characteristics, and chord characteristics of the audio data, which helps to improve the expressiveness of the lights, enhance the dynamic effect of the lights moving with the sound, and improve the user experience.
  • the method 500 may be executed before the method 400, or may be executed after the method 400, or may be executed in parallel with the method 400.
  • the embodiments of this application are not specifically limited.
  • method 500 can be combined with method 400, for example, by respectively controlling, according to the first attribute and the second attribute, the node where the lighting effect displayed by each of two or more lighting display devices changes.
  • Figure 6 shows a method 600 for controlling light display provided by an embodiment of the present application.
  • the method 600 may be applied to the vehicle 100 shown in FIG. 1 , and the method may also be performed by the system shown in FIG. 3 .
  • the method 600 includes:
  • the external reference information includes at least one of the following: human body characteristics of the user in the cockpit, identity information of the user in the cockpit, light intensity information outside the cockpit, temperature information outside the cockpit, and current season information.
  • the cockpit may be the cockpit of the vehicle 100 in the above embodiment, or it may be a cockpit in other vehicles, which is not specifically limited in the embodiment of the present application.
  • the human body characteristics may include the human body characteristics in the above embodiment
  • the identity information may include the identity information in the above embodiment.
  • obtaining the human body characteristics of the user in the cabin may include: obtaining the user's facial image captured by a camera device or a visual sensor in the cabin (such as a DMS or CMS). Furthermore, when the user is a driver, the user's fatigue level can be obtained by processing the facial image through a facial key point detection algorithm and a fatigue detection algorithm; alternatively, the user's age can be determined by processing the facial image through the C3AE face age recognition algorithm; alternatively, the user's emotional state can be determined by processing the facial image through a facial key point detection algorithm and an emotion recognition algorithm.
  • obtaining the light intensity outside the cockpit includes: obtaining the light intensity information collected by the photosensitive sensor; obtaining the temperature outside the cockpit includes: obtaining the temperature information collected by the temperature sensor; obtaining the current season information includes: obtaining the current season information from the cloud server Get current season information.
  • the light effect, light color, and brightness may include: the light effect, light color, and brightness before the node where the light effect changes, and/or the light effect, light color, and brightness after the node where the light effect changes.
  • determining the lighting effect displayed by the lighting display device, the color, brightness, and frequency of lighting effect changes based on the user's identity information may also include: determining the user's lighting style preference based on the identity information.
  • the identity information of one or more users is stored in the vehicle. If the identity information of the user in the cockpit matches the identity information of one of the one or more users, the lighting style can be determined based on the matched user's identity information.
  • the method of controlling light display provided by the embodiments of the present application can determine the lighting effect in combination with user preferences, cockpit external environment information, and so on; it can meet the needs of users of different ages and/or different preferences, and can also improve the user's comfort.
  • the method 600 may be executed before the method 400 and/or the method 500, or may be executed after the method 400 and/or the method 500, or may be executed in parallel with the method 400 and/or the method 500.
  • this is not specifically limited in the embodiments of this application.
  • method 600 can be combined with method 400, for example, according to the external reference information and the first attribute, the frequency of the node where the lighting effect displayed by the lighting display device changes, and/or the color of the lights before and after the node.
  • the frequency of nodes at which the lighting effect changes may include the above-mentioned lighting effect changing frequency.
  • audio data is obtained from the music player of the vehicle, and the audio data is segmented to obtain one or more segmented audio data.
  • the audio data can be segmented based on the self-similarity matrix (SSM) algorithm.
  • the audio data can be segmented according to the intro, chorus, and verse.
  • the audio data can be segmented based on exposition, development and recapitulation; alternatively, the audio data can also be segmented based on a clustering algorithm, grouping repeated audio into one category; alternatively, the audio data can be segmented by other methods, which are not specifically limited in the embodiments of this application. It should be understood that segmenting the audio data helps to reduce the time delay in the real-time generation and display of lighting effects.
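A minimal sketch of an SSM-style analysis, assuming per-frame feature vectors and cosine similarity, with boundary candidates placed where adjacent-frame similarity drops (a simplified stand-in for the SSM-based segmentation mentioned above, not the patent's exact method):

```python
import math

def cosine(a, b):
    """Cosine similarity of two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def self_similarity_matrix(frames):
    """SSM: pairwise similarity of every frame against every other frame."""
    return [[cosine(f, g) for g in frames] for f in frames]

def boundary_candidates(frames, threshold=0.5):
    """Frame indices where similarity to the previous frame falls below threshold."""
    return [i for i in range(1, len(frames))
            if cosine(frames[i - 1], frames[i]) < threshold]
```

Repeated sections (e.g. a recurring chorus) show up as off-diagonal blocks of high similarity in the SSM.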
  • the lighting effect can be saved based on the user's identity information or based on the audio data, so that the user can use it next time.
  • the lighting display device can be directly controlled to display the lighting effect.
  • Figure 7 shows a method 700 for controlling light display provided by an embodiment of the present application.
  • the method 700 may be applied to the vehicle 100 shown in FIG. 1 , and the method may also be performed by the system shown in FIG. 3 .
  • the method 700 includes:
  • the audio data may include the audio data in the above embodiment; the audio attribute may include the first attribute and/or the second attribute in the above embodiment.
  • the first duration may include the duration required by the audio recognition algorithm when identifying audio data to obtain audio attributes.
  • the more complex the audio data is, the longer the first duration is.
  • the more musical instruments, harmonies, and chords the audio data contains, the more complex the audio data is, and the longer the first duration is.
  • S702. Determine the first delay caused by transmitting the lighting effect indication information, determine the second delay caused by encoding and decoding the lighting effect indication information, and determine the second duration required to generate the first lighting effect based on the lighting effect indication information.
  • the lighting effect indication information is generated based on the audio attribute and is used to instruct the lighting display device to display the first lighting effect.
  • the first delay is determined based on the network bandwidth for transmitting the lighting effect indication information in the vehicle and the data frame length of the lighting effect indication information.
  • the smaller the network bandwidth and the longer the data frame length, the longer the first delay.
  • the second delay is determined according to the data frame length of the lighting effect indication information. The longer the data frame length is, the longer the second delay is.
  • S703. Perform delay compensation on the first lighting effect displayed by the lighting display device according to at least one of the first duration, the second duration, the first delay, and the second delay.
  • the lighting effect displayed by the lighting display device lags behind the audio output by the audio device, resulting in the node where the lighting effect actually changes being inconsistent with the node determined based on the first attribute.
  • Delay compensation is performed on the first lighting effect displayed by the lighting display device according to the first duration, the first delay and the second delay, so that the node where the lighting effect actually changes is consistent with the node determined based on the first attribute. match; or the time difference between the node where the lighting effect actually changes and the node determined based on the first attribute is less than or equal to the first threshold.
  • the first threshold may be 16 milliseconds, or 15 milliseconds, or another value higher than the human eye recognition frequency, which is not specifically limited in the embodiments of the present application.
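The delay-compensation steps S702/S703 can be sketched as follows, assuming the transmission delay is simply frame length over bandwidth and that compensation means firing the lighting command early (assumptions for illustration; the patent does not give formulas):

```python
def total_delay_ms(frame_bits, bandwidth_bps, codec_delay_ms, gen_delay_ms):
    """Sum of transmission (first) delay, codec (second) delay, and the
    duration needed to generate the lighting effect."""
    first_delay_ms = frame_bits / bandwidth_bps * 1000.0
    return first_delay_ms + codec_delay_ms + gen_delay_ms

def compensated_trigger_time(node_time_ms, delay_ms):
    """Fire the lighting command early so the visible change lands on the node."""
    return node_time_ms - delay_ms

def within_threshold(actual_ms, target_ms, threshold_ms=16.0):
    """Check the actual change node stays within the first threshold of the target."""
    return abs(actual_ms - target_ms) <= threshold_ms
```

For example, a 1000-bit frame over a 1 Mbit/s link adds 1 ms of transmission delay on top of the codec and generation delays.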
  • the audio sampling rate and the light refresh rate can also be combined to perform lighting effect compensation on the lighting effects displayed by the lighting display device. For example, at an audio sampling rate of 44100 hertz (Hz), the playback time of each audio frame is approximately 23.2 milliseconds (that is, one frame of audio can be updated every 23.2 milliseconds). Assuming the light refresh rate is 20 Hz, the lights refresh every 50 milliseconds. Compensation lighting effects can then be added where there is a delay between audio updates and light refreshes to smooth the lighting effects, such as adding a water ripple (whitening) lighting effect for buffer compensation.
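A sketch of the audio-update versus light-refresh mismatch described above, assuming 1024-sample audio frames at 44100 Hz (yielding the ~23.2 ms frame time; the frame size and the 10 ms gap threshold are assumptions for illustration):

```python
AUDIO_FRAME_MS = 1024 / 44100 * 1000  # ~23.2 ms per assumed 1024-sample frame
LIGHT_REFRESH_MS = 1000 / 20          # 50 ms per refresh at 20 Hz

def refresh_gap_ms(tick_index):
    """Time since the most recent audio-frame update at a given light refresh tick."""
    t = tick_index * LIGHT_REFRESH_MS
    return t % AUDIO_FRAME_MS

def needs_compensation(tick_index, max_gap_ms=10.0):
    """Flag refresh ticks far from an audio update, where a buffer
    (water ripple) compensation effect could smooth the lights."""
    return refresh_gap_ms(tick_index) > max_gap_ms
```

Because 50 ms is not an integer multiple of ~23.2 ms, the gap drifts from tick to tick, so some refreshes land close to an audio update and others do not.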
  • a lighting display device that displays lights according to audio may include two or more links, wherein each link includes at least one lighting display device; the light refresh rate of each of the two or more links may be different, so delay compensation can be performed on each link separately.
  • the method for controlling lighting display provided by the embodiment of the present application can make the node where the actual lighting effect changes basically coincide with the node determined based on the first attribute of the audio data, which helps to improve the user's experience of using the lighting display device.
  • Figure 8 shows a method 800 for controlling light display provided by an embodiment of the present application.
  • the method 800 may be applied to the vehicle 100 shown in FIG. 1 , and the method may also be performed by the system shown in FIG. 3 .
  • the method 800 includes:
  • the audio data may be the audio data in the above embodiment, or may be other audio data to be played.
  • the audio attribute includes at least one of the beat characteristics, speed characteristics and melody characteristics of the audio data.
  • the audio attribute may be the audio attribute in the above method 700; or the audio attribute may include the first attribute in the above embodiment, or may also include the second attribute in the above embodiment, or may also include other audio attributes. related properties.
  • the method of determining audio attributes may refer to the description in the above embodiment, and will not be described again here.
  • the method process of controlling the node of the lighting effect change displayed by the lighting display device according to the audio attribute can be referred to the description in the above embodiment, and will not be described again here.
  • controlling the node where the lighting effect displayed by the lighting display device changes includes at least one of the following: controlling the time when the lighting effect displayed by the lighting display device changes, and/or the specific forms of the lighting effects before and after the change.
  • the light effect includes at least one of the following: light color, light brightness, and dynamic effect formed by the light; wherein, when the light display device includes a light strip, the dynamic effect formed by the light includes but is not limited to: the direction, speed, and flow length of the light changes in the light strip, and the color change type (such as changing according to the color wheel or changing between black and white).
  • the lighting effects may also include: dynamic effects formed by lights in the light strip dimension, and/or dynamic effects formed by lights in the time dimension.
  • the lighting display device includes a light strip 910, a light strip 920, and a light strip 930, wherein each light strip includes a plurality of lamp beads.
  • the direction of light change in the light strip may include: from lamp bead 911 to lamp bead 913, or from lamp bead 913 to lamp bead 911.
  • the above-mentioned light changes may include but are not limited to: the lamp beads light up in sequence, the lamp beads go out in sequence, and the color of the lamp beads gradually changes. The speed of light change can be reflected in the duration.
  • the flow of light can be shown by the lamp beads turning on and/or turning off in sequence.
  • the flow length of the light can be reflected by the number of lamp beads in the light strip. For example, the light display can be controlled from lamp bead 911 to lamp bead 913, or from lamp bead 911 to lamp bead 912, where the flow length of the light in the former case is longer than in the latter case.
  • the dynamic effect formed by light in the time dimension can be reflected in the time of light display from the light strip 910 to the light strip 930 .
  • At least two of the light strips 910, 920, and 930 can be controlled to display lights at the same time.
  • the light strip 910, the light strip 920 and the light strip 930 can be controlled to perform light display in sequence. For example, after the light strip 910 completes its light display, the light strip 920 performs its light display; after the light display of the light strip 920 ends, the light strip 930 performs its light display.
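The back-to-back sequencing of strips 910, 920 and 930 described above can be sketched as a simple schedule (durations are arbitrary illustrative values):

```python
def sequential_schedule(strip_ids, durations_ms):
    """Return (strip_id, start_ms, end_ms) tuples so each strip starts
    only after the previous one has finished."""
    schedule, t = [], 0
    for strip, dur in zip(strip_ids, durations_ms):
        schedule.append((strip, t, t + dur))
        t += dur
    return schedule
```

With `sequential_schedule(["910", "920", "930"], [400, 300, 500])`, strip 920 starts exactly when 910 ends, and 930 when 920 ends.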
  • the audio attribute includes at least one of timbre characteristics, chord characteristics, and harmony characteristics of the audio data; according to at least one of the timbre characteristics, chord characteristics, and harmony characteristics, the lights displayed by the light display device are controlled. effect changes.
  • the first timbre feature is used to indicate relevant information of the first timbre in the audio data
  • the second timbre feature is used to indicate relevant information of the second timbre in the audio data.
  • the first harmony feature is used to indicate related information of the first part in the audio data
  • the second harmony feature is used to indicate related information of the second part in the audio data.
  • the first chord feature is used to indicate the relevant information of the first musical tone in the audio data
  • the second chord feature is used to indicate the relevant information of the second musical tone in the audio data.
  • a node for changing the lighting effect displayed by the lighting display device is controlled based on the audio attributes and the human body characteristics or identity information of the first user in the cockpit. For example, control the color of the lights before and after the node, and/or the frequency of the node.
  • the human body characteristics or identity information of the first user in the cockpit may include one or more items of the external reference information in the above embodiment.
  • for the specific process of controlling the node at which the lighting effect displayed by the lighting display device changes according to the audio attribute and the human body characteristic or identity information of the first user in the cockpit, reference can be made to the description in the foregoing embodiments; details are not repeated here.
  • the light color before and after the node is controlled based on the audio attributes, the temperature information outside the cockpit and/or the current season information.
  • the temperature information outside the cabin and/or the current season information may include one or more of the external reference information in the above embodiments.
  • the temperature information may include the outside temperature.
  • control the light brightness before and after the node based on the audio attributes and the light intensity information outside the cockpit.
  • the light intensity information outside the cabin may include one or more items of the external reference information in the foregoing embodiments, such as the ambient light intensity.
  • the method of controlling light display provided by the embodiments of this application can control the lighting effect according to features of one or more dimensions of the music, which helps improve the coordination between the lighting effect and the music and the expressiveness of the lighting effect with respect to the music, thereby improving the user's driving experience.
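As an illustrative sketch only (not part of the claimed method), the association between beat points and lighting-effect change nodes can be expressed as a schedule of (time, color) pairs; the beat timestamps are assumed to come from an upstream beat-detection step, and the palette and color-cycling rule here are hypothetical:

```python
def schedule_nodes(beat_times, palette):
    """Produce one lighting-effect change node per detected beat point.

    beat_times: beat timestamps in seconds from an upstream beat detector.
    palette: candidate light colors, cycled through at successive nodes.
    Returns (time, color) pairs: at each node the displayed color changes.
    """
    return [(t, palette[i % len(palette)]) for i, t in enumerate(beat_times)]


# Hypothetical beats at 120 BPM (one every 0.5 s) with a three-color palette.
nodes = schedule_nodes([0.0, 0.5, 1.0, 1.5], ["red", "green", "blue"])
```

A real controller would feed such a schedule to the lighting display devices, with the node frequency and colors further adjusted by the rhythm feature and the external reference information described above.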
  • Figure 10 shows a schematic block diagram of a device 2000 for controlling light display provided by an embodiment of the present application.
  • the device 2000 includes an acquisition unit 2010, a first determination unit 2020 and a processing unit 2030.
  • the apparatus 2000 may include units for performing the methods in Figures 4 to 8. Moreover, the units in the device 2000 and the foregoing other operations and/or functions are respectively intended to implement the corresponding processes of the method embodiments in Figures 4 to 8.
  • the acquisition unit 2010 can be used to perform S801 in the method 800
  • the first determining unit 2020 can be used to perform S802 in the method 800
  • the processing unit 2030 can be used to perform S803 in the method 800.
  • the device 2000 includes: an obtaining unit 2010 configured to obtain audio data to be played; a first determining unit 2020 configured to determine an audio attribute of the audio data, where the audio attribute includes at least one of a beat feature, a tempo feature, and a melody feature of the audio data; and a processing unit 2030 configured to control, according to the audio attribute, a node at which the lighting effect displayed by the lighting display device changes.
  • the audio attributes also include rhythm characteristics of the audio data
  • the processing unit 2030 is configured to: control, according to the rhythm feature, one or more of the light color before and after the node and the frequency of the node, where the frequency of the node is used to indicate how quickly the lighting effect changes.
  • the device further includes a second determining unit configured to determine delay information, where the delay information includes at least one of the following: a first duration required to determine the audio attribute from the audio data, a first delay incurred in transmitting lighting effect indication information, a second delay incurred in encoding and decoding the lighting effect indication information, and a second duration required to generate a first lighting effect based on the lighting effect indication information; the lighting effect indication information is generated according to the audio attribute and is used to indicate the first lighting effect displayed by the lighting display device; and the processing unit 2030 is configured to control, according to the audio attribute and the delay information, the node at which the lighting effect displayed by the lighting display device changes.
  • the lighting effects before and after the node include the first lighting effect.
  • the first determining unit 2020 and the second determining unit are the same determining unit.
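The delay compensation performed with the delay information above can be sketched as follows; this is a minimal illustration under the assumption that the four delay components are known and additive, and the numeric values are hypothetical:

```python
def compensated_trigger_time(node_time, analysis_dur, tx_delay, codec_delay, render_dur):
    """Time at which the lighting command should be issued so that the
    first lighting effect actually appears at the intended node.

    Subtracts the total pipeline latency (audio analysis, transmission,
    encoding/decoding, and effect generation) from the node time,
    clamping at zero for nodes too close to the start of playback.
    """
    total_latency = analysis_dur + tx_delay + codec_delay + render_dur
    return max(0.0, node_time - total_latency)


# A node at 2.0 s with 0.2 s of total pipeline latency fires at 1.8 s.
trigger = compensated_trigger_time(2.0, analysis_dur=0.05, tx_delay=0.02,
                                   codec_delay=0.01, render_dur=0.12)
```

When several links with different refresh rates drive different lighting display devices, each link can be compensated separately with its own latency terms.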
  • the audio attributes also include timbre characteristics
  • the timbre characteristics include first timbre characteristics and second timbre characteristics.
  • the processing unit 2030 includes a first processing unit and a second processing unit, where the first processing unit is configured to: control, according to the first timbre feature, a node at which the lighting effect displayed by a first lighting display device changes; and/or the second processing unit is configured to: control, according to the second timbre feature, a node at which the lighting effect displayed by a second lighting display device changes; the lighting display device includes the first lighting display device and the second lighting display device.
  • the first processing unit and the second processing unit are the same processing unit.
  • the timbre characteristics include vocal timbre characteristics and/or musical instrument timbre characteristics.
  • the audio attribute further includes a harmony feature
  • the harmony feature includes a first harmony feature and a second harmony feature
  • the processing unit 2030 includes a third processing unit and a fourth processing unit, where the third processing unit is configured to: control, according to the first harmony feature, a node at which the lighting effect displayed by a third lighting display device changes; and/or the fourth processing unit is configured to: control, according to the second harmony feature, a node at which the lighting effect displayed by a fourth lighting display device changes; the lighting display device includes the third lighting display device and the fourth lighting display device.
  • the third processing unit and the fourth processing unit are the same processing unit.
  • the audio attribute further includes a chord feature
  • the chord feature includes a first chord feature and a second chord feature
  • the processing unit 2030 includes a fifth processing unit and a sixth processing unit, where the fifth processing unit is configured to: control, according to the first chord feature, a node at which the lighting effect displayed by a fifth lighting display device changes; and/or the sixth processing unit is configured to: control, according to the second chord feature, a node at which the lighting effect displayed by a sixth lighting display device changes; the lighting display device includes the fifth lighting display device and the sixth lighting display device.
  • the fifth processing unit and the sixth processing unit are the same processing unit.
  • the acquisition unit 2010 is also used to: acquire a human body characteristic or identity information of the first user in the cockpit; the processing unit 2030 is configured to: control, according to the audio attribute and the human body characteristic or the identity information, the node at which the lighting effect displayed by the lighting display device changes.
  • the processing unit 2030 is configured to: when the vehicle in which the cabin is located is in a driving state, the first user is in the driver's seat of the cabin, and the human body characteristic indicates that the first user is in a fatigued state, increase the frequency of the node and/or control light of a warning color to be displayed before and after the node.
  • the acquisition unit 2010 is also configured to: acquire environmental information outside the cabin, where the environmental information includes one or more of temperature information, current season information, and light intensity information;
  • the processing unit 2030 is configured to: control, according to the audio attribute and the environment information, one or more of the light color before and after the node, the frequency of the node, and the light brightness before and after the node.
  • the acquisition unit 2010 is also configured to: acquire the temperature information outside the cabin and/or the current season information; the processing unit 2030 is configured to: control, according to the audio attribute and the temperature information outside the cabin and/or the current season information, the light color before and after the node.
  • the acquisition unit 2010 is also configured to: acquire light intensity information outside the cabin; and the processing unit 2030 is configured to: control the brightness of lights before and after the node according to the audio attribute and the light intensity information outside the cabin.
  • the node at which the lighting effect changes is associated with at least one of the following in the audio attribute: a beat point indicated by the beat feature, a point at which the downbeat frequency changes as indicated by the tempo feature, and a point of melody change indicated by the melody feature.
  • the division of units in the above device is only a division of logical functions.
  • the units may be fully or partially integrated into a physical entity, or may be physically separated.
  • the unit in the device can be implemented in the form of a processor calling software; for example, the device includes a processor, the processor is connected to a memory, instructions are stored in the memory, and the processor calls the instructions stored in the memory to implement any of the above methods.
  • the processor is, for example, a general-purpose processor, such as a CPU or a microprocessor
  • the memory is a memory within the device or a memory outside the device.
  • the units in the device can be implemented in the form of hardware circuits, and some or all of the functions of the units can be implemented through the design of the hardware circuits, which can be understood as one or more processors; for example, in one implementation, the hardware circuit is an ASIC, which realizes some or all of the functions of the above units through the design of the logical relationship of the components in the circuit.
  • for another example, in another implementation, the hardware circuit can be implemented through a PLD; taking an FPGA as an example, it can include a large number of logic gate circuits, and the connection relationship between the logic gate circuits is configured through a configuration file, thereby realizing the functions of some or all of the above units.
  • all units of the above device may be fully realized by the processor calling software, fully realized by hardware circuits, or partly realized by the processor calling software with the remainder realized by hardware circuits.
  • the processor is a circuit with signal processing capabilities.
  • the processor may be a circuit with instruction reading and execution capabilities, such as a CPU, a microprocessor, a GPU, or a DSP; in another implementation, the processor can realize certain functions through the logical relationship of a hardware circuit, where the logical relationship of the hardware circuit is fixed or reconfigurable.
  • the processor is a hardware circuit implemented by an ASIC or a PLD, for example, an FPGA.
  • the process of the processor loading the configuration file and realizing the hardware circuit configuration can be understood as the process of the processor loading instructions to realize the functions of some or all of the above units.
  • it can also be a hardware circuit designed for artificial intelligence, which can be understood as an ASIC, such as an NPU, a TPU, or a DPU.
  • each unit in the above device can be one or more processors (or processing circuits) configured to implement the above method, such as a CPU, GPU, NPU, TPU, DPU, microprocessor, DSP, ASIC, or FPGA, or a combination of at least two of these processor forms.
  • each unit in the above device may be integrated together in whole or in part, or may be implemented independently. In one implementation, these units are integrated together and implemented as a system-on-a-chip (SOC).
  • SOC may include at least one processor for implementing any of the above methods or implementing the functions of each unit of the device.
  • the at least one processor may be of different types, for example, a CPU and an FPGA, a CPU and an artificial intelligence processor, or a CPU and a GPU.
  • each operation performed by the above-mentioned obtaining unit 2010, first determining unit 2020, and processing unit 2030 may be performed by the same processor, or by different processors, for example, by multiple processors respectively;
  • each operation performed by the above-mentioned first processing unit, second processing unit, third processing unit, fourth processing unit, fifth processing unit, and sixth processing unit may be executed by the same processor, or by different processors, for example, by multiple processors respectively.
  • one or more processors can be connected to one or more sensors in the perception system 120 in Figure 1 to obtain information about the user's location from the one or more sensors and process it; in another example, one or more processors may also be connected to one or more lighting display devices in the display device 130, to respectively control the lighting effect of each lighting display device among the one or more lighting display devices.
  • the one or more processors described above may be processors provided in a vehicle machine, or may also be processors provided in other vehicle-mounted terminals.
  • the above-mentioned device 2000 may be a chip provided in a vehicle machine or other vehicle-mounted terminal.
  • the above-mentioned device 2000 may be the computing platform 150 as shown in FIG. 1 provided in the vehicle.
  • Embodiments of the present application also provide a device, which includes a processing unit and a storage unit, where the storage unit is used to store instructions, and the processing unit executes the instructions stored in the storage unit, so that the device performs the methods or steps performed in the above embodiments.
  • the above-mentioned processing unit includes the processors 151-15n shown in Figure 1.
  • the above-mentioned acquisition unit may be a certain sensor in the sensing system 120 shown in FIG. 1 , or may also be the processor 151 - 15n shown in FIG. 1 .
  • FIG. 11 is a schematic block diagram of a device for controlling light display according to an embodiment of the present application.
  • the device 2100 for controlling light display shown in FIG. 11 may include: a processor 2110, a transceiver 2120, and a memory 2130.
  • the processor 2110, the transceiver 2120 and the memory 2130 are connected through an internal connection path.
  • the memory 2130 is used to store instructions.
  • the processor 2110 is used to execute the instructions stored in the memory 2130, and the transceiver 2120 receives/sends some parameters.
  • the memory 2130 can be coupled with the processor 2110 through an interface or integrated with the processor 2110 .
  • transceiver 2120 may include but is not limited to a transceiver device such as an input/output interface to realize communication between the device 2100 and other devices or communication networks.
  • the processor 2110 may use a general-purpose CPU, microprocessor, ASIC, GPU or one or more integrated circuits to execute relevant programs to implement the method for controlling light display in the method embodiment of the present application.
  • the processor 2110 may also be an integrated circuit chip with signal processing capabilities.
  • each step of the method for controlling light display in this application can be completed by instructions in the form of hardware integrated logic circuits or software in the processor 2110 .
  • the above-mentioned processor 2110 can also be a general-purpose processor, DSP, ASIC, FPGA or other programmable logic device, discrete gate or transistor logic device, or discrete hardware component.
  • Each method, step and logical block diagram disclosed in the embodiment of this application can be implemented or executed.
  • a general-purpose processor may be a microprocessor or the processor may be any conventional processor, etc.
  • the steps of the method disclosed in conjunction with the embodiments of the present application can be directly implemented by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • Software modules can be located in a storage medium mature in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, or registers.
  • the storage medium is located in the memory 2130.
  • the processor 2110 reads the information in the memory 2130 and executes the method for controlling the light display according to the method embodiment of the present application in conjunction with its hardware.
  • the memory 2130 may be a read-only memory (ROM), a static storage device, a dynamic storage device or a random access memory (RAM).
  • the transceiver 2120 uses a transceiver device such as but not limited to a transceiver to implement communication between the device 2100 and other devices or communication networks. For example, the user's location information can be obtained through the transceiver 2120.
  • An embodiment of the present application also provides a mobile carrier, which may include the above device 2000 or the above device 2100.
  • the mobile carrier may be the vehicle in the above embodiment.
  • Embodiments of the present application also provide a computer program product.
  • the computer program product includes: computer program code.
  • when the computer program code is run on a computer, it causes the computer to execute the methods in Figures 4 to 8.
  • Embodiments of the present application also provide a computer-readable storage medium.
  • the computer-readable medium stores program code or instructions; when the program code or instructions are run on a processor, the processor executes the methods in the above-mentioned Figures 4 to 8.
  • An embodiment of the present application also provides a chip, including: at least one processor and a memory.
  • the at least one processor is coupled to the memory and is used to read and execute instructions in the memory to execute the methods in the above-mentioned Figures 4 to 8.
  • "at least one" refers to one or more, and "a plurality of" refers to two or more.
  • "And/or" describes the association relationship of associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone, where A and B may be singular or plural.
  • the character “/” generally indicates that the related objects are in an “or” relationship.
  • "At least one of the following" or similar expressions refer to any combination of these items, including any combination of a single item or a plurality of items.
  • "At least one of a, b, or c" can mean: a, b, c, a and b, a and c, b and c, or a, b and c, where a, b, and c can each be single or multiple.
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division; in actual implementation, there may be other division manners; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed to multiple network units. Some or all of the units may be selected based on actual needs to achieve the objectives of this embodiment.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • if the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application can essentially be embodied in the form of a software product.
  • the computer software product is stored in a storage medium and includes a number of instructions to enable a computer device (which can be a personal computer, a server, or network equipment, etc.) to perform all or part of the steps of the methods described in various embodiments of this application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disc, and other media that can store program code.


Abstract

This application provides a method and apparatus for controlling light display. The method includes: obtaining audio data to be played; determining an audio attribute of the audio data, where the audio attribute includes at least one of a beat feature, a tempo feature, and a melody feature of the audio data; and controlling, according to the audio attribute, a change in the lighting effect displayed by a lighting display device. The method of this application can be applied to intelligent vehicles and electric vehicles, and helps improve the user's driving and riding experience at the level of light interaction.

Description

Method and Apparatus for Controlling Light Display
This application claims priority to Chinese Patent Application No. 202211080757.7, filed with the Chinese Patent Office on September 5, 2022 and entitled "Method and Apparatus for Controlling Light Display", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of light control, and more specifically, to a method and apparatus for controlling light display.
Background
With the continuous development of the automotive industry, users have increasingly high requirements for a personalized and healthy in-cabin environment. In-cabin breathing lights and ambient lights, as a crucial part of vehicle styling and overall appearance, are being adopted by more and more vehicle manufacturers. Combining in-cabin music with lighting effects so that the lights move with the music, giving drivers and passengers an immersive music experience, is a feature favored by users. However, the expressiveness of the current in-vehicle lights-move-with-music effect is poor and cannot meet the widely varying needs of different users.
Therefore, a solution for controlling light display that can improve user experience needs to be developed.
Summary
This application provides a method and apparatus for controlling light display, which helps improve the user's driving and riding experience at the level of light interaction.
The method provided in this application can be applied to a mobile carrier, where the mobile carrier may include a road vehicle, a watercraft, an aircraft, industrial equipment, agricultural equipment, entertainment equipment, or the like. For example, the mobile carrier may be a vehicle in the broad sense, which may be a means of transport (such as a commercial vehicle, a passenger vehicle, a truck, a motorcycle, an aircraft, a flying car, a train, or a ship), an industrial vehicle (such as a forklift, a trailer, or a tractor), an engineering vehicle (such as an excavator, a bulldozer, or a crane), agricultural equipment (such as a mower or a harvester), amusement equipment, a toy vehicle, or the like; the embodiments of this application do not specifically limit the type of the vehicle. For another example, the mobile carrier may be a means of transport such as an aircraft or a ship.
According to a first aspect, a method for controlling light display is provided. The method includes: obtaining audio data to be played; determining an audio attribute of the audio data, where the audio attribute includes at least one of a beat feature, a tempo feature, and a melody feature of the audio data; and controlling, according to the audio attribute, a change in the lighting effect displayed by a lighting display device.
In the foregoing technical solution, the lighting effect can be designed according to features of one or more dimensions of the audio, which helps improve the coordination between the lighting effect and the music and the expressiveness of the lighting effect with respect to the music, thereby improving the user's driving and riding experience.
Exemplarily, the lighting effect includes at least one of the following: light color, light brightness, and a dynamic effect formed by the light. When the lighting display device includes a light strip, the dynamic effect formed by the light includes but is not limited to: the direction and speed of light changes in the light strip, the flow length of the light, and the type of color change (for example, changing along a color wheel or in black and white).
Exemplarily, the foregoing audio data may be obtained from music playback software, or may be obtained in other manners, which is not specifically limited in the embodiments of this application.
Exemplarily, the audio attribute of the audio data may be determined by processing the audio data with a music recognition algorithm. For example, the music recognition algorithm may include a note onset detection algorithm, a music rhythm detection algorithm, a music melody recognition algorithm, or an algorithm based on pitch-class profiles and the constant-Q transform. Alternatively, the audio attribute of the audio data may be obtained directly from the data source of the audio data. For example, if the data source carries tags related to audio attributes, the audio attribute of the audio data can be determined directly from the tags.
Exemplarily, when the method is applied to a vehicle, the lighting display device may be disposed inside the cockpit of the vehicle, for example, including an ambient light, a breathing light, or virtual light beads displayed on an in-vehicle display; or it may be disposed outside the cockpit of the vehicle, for example, a vehicle lamp, including but not limited to: a brake light, a turn signal, a low beam, a high beam, a daytime running light, or a position light.
With reference to the first aspect, in some implementations of the first aspect, controlling the change in the lighting effect displayed by the lighting display device includes: controlling a node at which the lighting effect displayed by the lighting display device changes.
"Controlling a node at which the lighting effect displayed by the lighting display device changes" may include: controlling the moment at which the displayed lighting effect changes, and/or the specific form of the lighting effect before and after the change, such as the light color and the dynamic effect formed by the light.
In the foregoing technical solution, the node of lighting effect change is controlled according to the audio attribute, so that the light changes with at least one of the beat, tempo, and melody of the audio. This allows the idea or emotion expressed by the audio to be made concrete through light, which helps improve the user experience when listening to the audio.
With reference to the first aspect, in some implementations of the first aspect, the node of lighting effect change is associated with at least one of the following in the audio attribute: a beat point indicated by the beat feature, a point at which the downbeat frequency changes as indicated by the tempo feature, and a point of melody change indicated by the melody feature.
Exemplarily, controlling the node at which the lighting effect displayed by the lighting display device changes according to the audio attribute may include at least one of the following: controlling, according to the beat point, a change in light color before and after the beat point; controlling, according to the point at which the downbeat frequency changes, a change in the dynamic effect formed by the light before and after that point; and controlling, according to the point of melody change, a change in the dynamic effect formed by the light before and after that point.
With reference to the first aspect, in some implementations of the first aspect, the audio attribute further includes a rhythm feature of the audio data, and controlling the node at which the lighting effect displayed by the lighting display device changes includes: controlling, according to the rhythm feature, one or more of the light color before and after the node and the frequency of the node, where the frequency of the node can be understood as the speed at which the lighting effect changes (or the frequency of lighting effect changes).
It should be noted that, in this application, "before and after the node" includes one or more of the following cases: before the node, after the node, and both before and after the node, which may be selected during specific implementation.
In some possible implementations, the faster the rhythm change indicated by the rhythm feature, the larger the amplitude of light color change before and after the node, and/or the higher the frequency of the node. The amplitude of light color change can be understood as the magnitude of the change in the light color displayed by the same light bead in two consecutive displays: the larger the difference between the red, green, blue (RGB) values of two consecutive displays, the larger the amplitude of the light color change.
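The amplitude of light color change defined above can be sketched as a per-channel RGB difference. This is an illustrative metric only; the sum-of-absolute-differences form is an assumption, since the application only requires that a larger RGB difference mean a larger amplitude:

```python
def rgb_change_amplitude(prev_rgb, next_rgb):
    """Magnitude of the color change of one light bead across two
    consecutive displays, as the summed per-channel RGB difference."""
    return sum(abs(a - b) for a, b in zip(prev_rgb, next_rgb))


# Red to blue is a large jump; identical colors give zero amplitude.
large = rgb_change_amplitude((255, 0, 0), (0, 0, 255))
none = rgb_change_amplitude((10, 10, 10), (10, 10, 10))
```

A faster rhythm would then be mapped to consecutive palette colors with a larger amplitude under this metric, and/or to a higher node frequency.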
In some possible implementations, the faster the rhythm change indicated by the rhythm feature, the more light colors are used for the audio.
Exemplarily, for music with a fast rhythm, such as rock music, the lighting display device may be controlled to display faster lighting effect changes, and/or the amplitude of light color change before and after the node may be controlled to be larger; for music with a slow rhythm, such as classical music, the lighting display device may be controlled to display slower lighting effect changes, and/or the amplitude of light color change before and after the node may be controlled to be smaller.
In the foregoing technical solution, on the basis of determining the node of lighting effect change, the light color and/or the speed of lighting effect change are controlled in combination with the rhythm feature of the audio, which helps further improve the expressiveness of the light with respect to the audio. For example, lighting effects displayed for classical music with a weak rhythm become more expressive, which helps further improve the user experience when using the lights-move-with-music feature.
With reference to the first aspect, in some implementations of the first aspect, the method further includes: determining delay information, where the delay information includes at least one of the following: a first duration required to determine the audio attribute from the audio data, a first delay incurred in transmitting lighting effect indication information, a second delay incurred in encoding and decoding the lighting effect indication information, and a second duration required to generate a first lighting effect based on the lighting effect indication information, where the lighting effect indication information is generated according to the audio attribute and is used to indicate the first lighting effect displayed by the lighting display device; and controlling the node at which the lighting effect displayed by the lighting display device changes includes: controlling, according to the audio attribute and the delay information, the node at which the lighting effect displayed by the lighting display device changes, where the lighting effect before and after the node includes the first lighting effect.
It should be understood that controlling the node of lighting effect change according to the audio attribute and the delay information includes: performing delay compensation on the lighting effect displayed by the lighting display device according to at least one of the first duration, the second duration, the first delay, and the second delay, and then controlling the lighting display device to display the corresponding lighting effect.
In some possible implementations, the lighting display devices that display light according to the audio may involve two or more links, where each link includes at least one lighting display device, and the light refresh rate of each of the two or more links may differ; in that case, delay compensation may be performed separately for each link.
In the foregoing technical solution, the node at which the lighting effect actually changes can be made to substantially coincide with the node determined from the audio attribute of the audio data, reducing the poor user experience caused by lighting effect delays.
With reference to the first aspect, in some implementations of the first aspect, the audio attribute further includes a timbre feature, the timbre feature includes a first timbre feature and a second timbre feature, and controlling the node at which the lighting effect displayed by the lighting display device changes includes: controlling, according to the first timbre feature, a node at which the lighting effect displayed by a first lighting display device changes; and/or controlling, according to the second timbre feature, a node at which the lighting effect displayed by a second lighting display device changes, where the lighting display device includes the first lighting display device and the second lighting display device.
Exemplarily, the first timbre feature is used to indicate information related to a first timbre in the audio data, and the second timbre feature is used to indicate information related to a second timbre in the audio data.
Exemplarily, when the audio data includes two or more timbres at the same moment, each of two or more lighting display devices may be controlled separately to display light according to one timbre feature.
In some possible manners, when the audio data includes only one timbre at any given moment, and the audio data includes the first timbre during a first time period and the second timbre during a second time period, the first lighting display device may be controlled to display light according to the first timbre during the first time period, and the second lighting display device may be controlled to display light according to the second timbre during the second time period.
In some possible implementations, the first lighting display device and the second lighting display device are disposed at different positions. For example, when the method is applied to a cockpit, the first lighting display device and the second lighting display device may be disposed in different areas of the cockpit. Exemplarily, the first lighting display device may be a light strip disposed at the driver's door armrest, and the second lighting display device may be a light strip disposed at the front passenger's door armrest; or the first lighting display device may be the central control screen (displaying lighting effects through virtual light beads), and the second lighting display device may be a light strip disposed at the front passenger's door armrest; or the first lighting display device may be a light strip disposed at the driver's door armrest, and the second lighting display device may be at least one of a brake light, a turn signal, a low beam, a high beam, a daytime running light, and a position light.
In some possible implementations, the first lighting display device and the second lighting display device may be the same lighting display device.
In the foregoing technical solution, having the lighting display devices display light according to the timbre features of the audio adds display modes to the lighting display devices and enhances the sense of technology the user experiences when using the lights-move-with-music effect. In addition, it can also improve the dynamic effect of the light during the lights-move-with-music process, thereby improving the user experience.
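The per-timbre routing described above can be sketched as a lookup from the timbres detected at a given instant to the lighting display devices that should respond; the timbre labels and device names below are hypothetical placeholders, not identifiers from the application:

```python
def route_timbres(active_timbres, routing):
    """Return the lighting display devices that should display light,
    given the timbres present in the audio at this instant."""
    return sorted({routing[t] for t in active_timbres if t in routing})


# Hypothetical mapping: vocals drive the driver's door strip,
# guitar drives the front passenger's door strip.
routing = {"vocal": "driver_door_strip", "guitar": "passenger_door_strip"}
devices = route_timbres(["vocal", "guitar"], routing)
```

Each selected device would then receive its own node schedule derived from the corresponding timbre feature.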
With reference to the first aspect, in some implementations of the first aspect, the timbre feature includes a vocal timbre feature and/or an instrument timbre feature.
Exemplarily, the vocal timbre feature may indicate at least one of the following timbres: a male voice, a female voice, and a child's voice.
Exemplarily, the instrument timbre feature may indicate the timbre of at least one of the following instrument families: idiophones, membranophones, aerophones, chordophones, and electrophones. The idiophones include but are not limited to: the jaw harp, yunban, and tanban clappers; the membranophones include but are not limited to: the bass drum, double-sided drum, and octagonal drum; the aerophones include but are not limited to: the single reed, dizi, xiao, xun, trumpet, French horn, organ, accordion, and harmonica; the chordophones include but are not limited to: the guqin, guzheng, and Eijieke; the electrophones include but are not limited to: the electric violin, electric cello, electric piano, electric organ, electric liuqin, and electric pipa.
With reference to the first aspect, in some implementations of the first aspect, the audio attribute further includes a harmony feature, the harmony feature includes a first harmony feature and a second harmony feature, and controlling the node at which the lighting effect displayed by the lighting display device changes includes: controlling, according to the first harmony feature, a node at which the lighting effect displayed by a third lighting display device changes; and/or controlling, according to the second harmony feature, a node at which the lighting effect displayed by a fourth lighting display device changes, where the lighting display device includes the third lighting display device and the fourth lighting display device.
Exemplarily, the first harmony feature is used to indicate information related to a first voice part in the audio data, and the second harmony feature is used to indicate information related to a second voice part in the audio data.
In some possible implementations, the third lighting display device and the fourth lighting display device are disposed at different positions. For example, when the method is applied to a cockpit, the third lighting display device and the fourth lighting display device may be disposed in different areas of the cockpit.
In some possible implementations, the third lighting display device and the fourth lighting display device may be the same lighting display device.
With reference to the first aspect, in some implementations of the first aspect, the audio attribute further includes a chord feature, the chord feature includes a first chord feature and a second chord feature, and controlling the node at which the lighting effect displayed by the lighting display device changes includes: controlling, according to the first chord feature, a node at which the lighting effect displayed by a fifth lighting display device changes; and/or controlling, according to the second chord feature, a node at which the lighting effect displayed by a sixth lighting display device changes, where the lighting display device includes the fifth lighting display device and the sixth lighting display device.
Exemplarily, the first chord feature is used to indicate information related to a first musical tone in the audio data, and the second chord feature is used to indicate information related to a second musical tone in the audio data.
In some possible implementations, the fifth lighting display device and the sixth lighting display device are disposed at different positions. For example, when the method is applied to a cockpit, the fifth lighting display device and the sixth lighting display device may be disposed in different areas of the cockpit.
In some possible implementations, the fifth lighting display device and the sixth lighting display device may be the same lighting display device.
With reference to the first aspect, in some implementations of the first aspect, the method is applied to a cockpit, and the method further includes: obtaining a human body characteristic or identity information of a first user in the cockpit; and controlling the node at which the lighting effect displayed by the lighting display device changes includes: controlling, according to the audio attribute and the human body characteristic or the identity information, the node at which the lighting effect displayed by the lighting display device changes.
Exemplarily, the human body characteristic includes but is not limited to gender, age, and emotion.
Exemplarily, when the human body characteristic indicates that the first user is a child, harsh colors are reduced or not used, and/or the frequency of the node is lowered.
Exemplarily, when the human body characteristic indicates that the first user is female, the lighting display device is controlled to display light colors preferred by women.
Exemplarily, when the human body characteristic indicates that the first user is male, the lighting display device is controlled to display light colors preferred by men.
In some possible implementations, controlling the node at which the lighting effect displayed by the lighting display device changes includes: controlling, according to the audio attribute and the human body characteristic, the light color before and after the node and/or the frequency of the node.
In some possible implementations, controlling the node at which the lighting effect displayed by the lighting display device changes according to the audio attribute and the identity information may further include: determining the lighting effect preference of the first user based on the identity information, and controlling, according to the lighting effect preference, the light color before and after the node and/or the frequency of the node, where the lighting effect preference includes the light colors preferred by the first user and/or the preferred frequency of lighting effect changes.
In some possible implementations, identity information of one or more users is stored in the vehicle; if the identity information of a user in the cockpit matches the identity information of the one or more users, the lighting effect preference can be determined according to the identity information of the user in the cockpit.
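The identity-based preference lookup described above can be sketched as follows; the preference fields and user identifiers are hypothetical, and a real implementation would load the stored preferences from the vehicle's user profiles:

```python
def lighting_preference(occupant_id, stored_prefs, default_pref):
    """Return the saved lighting-effect preference when the occupant's
    identity matches a stored user; otherwise fall back to a default."""
    return stored_prefs.get(occupant_id, default_pref)


# Hypothetical store: one known user with preferred colors and node frequency.
stored = {"user_a": {"colors": ["blue", "purple"], "node_freq": "slow"}}
default = {"colors": ["white"], "node_freq": "medium"}
matched = lighting_preference("user_a", stored, default)
unmatched = lighting_preference("guest", stored, default)
```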
In the foregoing technical solution, the lighting effect can be determined in combination with user preferences, human body characteristics, and the like, which can meet the needs of users of different ages and/or different preferences and helps improve the user's interactive experience when using the lighting display device.
With reference to the first aspect, in some implementations of the first aspect, controlling the node at which the lighting effect displayed by the lighting display device changes includes: when the vehicle in which the cockpit is located is in a driving state, the first user is in the driver's seat of the cockpit, and the human body characteristic indicates that the first user is in a fatigued state, increasing the frequency of the node and/or controlling light of a warning color to be displayed before and after the node.
Exemplarily, the warning color includes a color with a warning effect, such as red, yellow, or orange.
In the foregoing technical solution, displaying a warning lighting effect through the lighting display device helps improve driving safety while improving user experience.
With reference to the first aspect, in some implementations of the first aspect, the method further includes: obtaining environmental information outside the cockpit, where the environmental information includes one or more of temperature information, current season information, and light intensity information; and controlling the node at which the lighting effect displayed by the lighting display device changes includes: controlling, according to the audio attribute and the environmental information, one or more of the light color before and after the node, the frequency of the node, and the light brightness before and after the node.
Optionally, the temperature information outside the cockpit and/or the current season information is obtained; and controlling the node at which the lighting effect displayed by the lighting display device changes includes: controlling, according to the audio attribute and the temperature information outside the cockpit and/or the current season information, the light color before and after the node.
Exemplarily, when the current season information indicates that the current season is spring, autumn, or winter, and/or the temperature information indicates that the temperature outside the cockpit is lower than or equal to a preset temperature, the light color before and after the node is controlled to be a warm-toned color; when the current season is summer and/or the temperature outside the cockpit is higher than the preset temperature, the light color before and after the node is controlled to be a cool-toned color. Exemplarily, the preset temperature may be 20 degrees, 25 degrees, or another value.
In the foregoing technical solution, the light color displayed by the lighting display device is controlled according to the season information or the temperature outside the cockpit, so that in summer or when the outside temperature is high, cool-toned light is displayed, giving a cool feeling; and in spring, autumn, and winter, or when the outside temperature is low, warm-toned light is displayed, giving a warm feeling.
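The season- and temperature-based choice of color tone can be sketched as follows; the 25-degree preset is one of the example values given above, and treating "summer or above the preset temperature" as cool and everything else as warm is a simplifying assumption:

```python
def node_color_tone(season, outside_temp_c, preset_temp_c=25.0):
    """Choose a warm or cool color tone for the light before and after
    the node, based on the current season and outside temperature."""
    if season == "summer" or outside_temp_c > preset_temp_c:
        return "cool"
    return "warm"


tone_winter = node_color_tone("winter", 5.0)    # warm light on a cold day
tone_summer = node_color_tone("summer", 30.0)   # cool light in summer heat
```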
Optionally, the light intensity information outside the cockpit is obtained; and controlling the node at which the lighting effect displayed by the lighting display device changes includes: controlling, according to the audio attribute and the light intensity information outside the cockpit, the light brightness before and after the node.
In the foregoing technical solution, in scenarios with low light intensity at night, reducing the light brightness helps improve driving safety and save energy; in scenarios with high light intensity, increasing the light brightness helps improve the user experience.
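The ambient-light-driven brightness control can be sketched as a clamped linear mapping; this is illustrative only, and the lux thresholds and brightness range are assumptions rather than values from the application:

```python
def node_brightness(ambient_lux, low_lux=10.0, high_lux=10000.0,
                    min_brightness=0.2, max_brightness=1.0):
    """Scale the light brightness before and after the node with the
    light intensity outside the cockpit: dim at night, bright in daylight."""
    if ambient_lux <= low_lux:
        return min_brightness
    if ambient_lux >= high_lux:
        return max_brightness
    fraction = (ambient_lux - low_lux) / (high_lux - low_lux)
    return min_brightness + fraction * (max_brightness - min_brightness)


night = node_brightness(5.0)       # dim for night driving
daylight = node_brightness(20000)  # full brightness in strong daylight
```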
According to a second aspect, a method for controlling light display is provided. The method includes: obtaining audio data to be played; determining an audio attribute of the audio data, where the audio attribute includes at least one of a timbre feature, a chord feature, and a harmony feature of the audio data; and controlling, according to the audio attribute, a change in the lighting effect displayed by a lighting display device.
With reference to the second aspect, in some implementations of the second aspect, the timbre feature includes a first timbre feature and a second timbre feature, and controlling the change in the lighting effect displayed by the lighting display device includes: controlling, according to the first timbre feature, a node at which the lighting effect displayed by a first lighting display device changes; and/or controlling, according to the second timbre feature, a node at which the lighting effect displayed by a second lighting display device changes, where the lighting display device includes the first lighting display device and the second lighting display device.
With reference to the second aspect, in some implementations of the second aspect, the timbre feature includes a vocal timbre feature and/or an instrument timbre feature.
With reference to the second aspect, in some implementations of the second aspect, the harmony feature includes a first harmony feature and a second harmony feature, and controlling the change in the lighting effect displayed by the lighting display device includes: controlling, according to the first harmony feature, a node at which the lighting effect displayed by a third lighting display device changes; and/or controlling, according to the second harmony feature, a node at which the lighting effect displayed by a fourth lighting display device changes, where the lighting display device includes the third lighting display device and the fourth lighting display device.
With reference to the second aspect, in some implementations of the second aspect, the chord feature includes a first chord feature and a second chord feature, and controlling the change in the lighting effect displayed by the lighting display device includes: controlling, according to the first chord feature, a node at which the lighting effect displayed by a fifth lighting display device changes; and/or controlling, according to the second chord feature, a node at which the lighting effect displayed by a sixth lighting display device changes, where the lighting display device includes the fifth lighting display device and the sixth lighting display device.
With reference to the second aspect, in some implementations of the second aspect, the audio attribute further includes at least one of a beat feature, a tempo feature, and a melody feature of the audio data; and controlling the change in the lighting effect displayed by the lighting display device includes: controlling, according to the audio attribute, a node at which the lighting effect displayed by the lighting display device changes.
With reference to the second aspect, in some implementations of the second aspect, the audio attribute further includes a rhythm feature of the audio data, and controlling the change in the lighting effect displayed by the lighting display device includes: controlling, according to the rhythm feature, the light color before and after the node and/or the frequency of the node.
With reference to the second aspect, in some implementations of the second aspect, the method further includes: determining delay information, where the delay information includes at least one of the following: a first duration required to determine the audio attribute from the audio data, a first delay incurred in transmitting lighting effect indication information, a second delay incurred in encoding and decoding the lighting effect indication information, and a second duration required to generate a first lighting effect based on the lighting effect indication information, where the lighting effect indication information is generated according to the audio attribute and is used to indicate the first lighting effect displayed by the lighting display device; and controlling the node at which the lighting effect displayed by the lighting display device changes includes: controlling, according to the audio attribute and the delay information, the node at which the lighting effect displayed by the lighting display device changes, where the lighting effect before and after the node includes the first lighting effect.
With reference to the second aspect, in some implementations of the second aspect, the method further includes: obtaining a human body characteristic or identity information of a first user in the cockpit; and controlling the change in the lighting effect displayed by the lighting display device includes: controlling, according to the audio attribute and the human body characteristic or the identity information, the node at which the lighting effect displayed by the lighting display device changes.
With reference to the second aspect, in some implementations of the second aspect, controlling the change in the lighting effect displayed by the lighting display device includes: when the vehicle in which the cockpit is located is in a driving state, the first user is in the driver's seat of the cockpit, and the human body characteristic indicates that the first user is in a fatigued state, increasing the frequency of the node and/or controlling light of a warning color to be displayed before and after the node.
With reference to the second aspect, in some implementations of the second aspect, the method further includes: obtaining temperature information outside the cockpit and/or current season information; and controlling the change in the lighting effect displayed by the lighting display device includes: controlling, according to the audio attribute and the temperature information outside the cockpit and/or the current season information, the light color before and after the node.
With reference to the second aspect, in some implementations of the second aspect, the method further includes: obtaining light intensity information outside the cockpit; and controlling the change in the lighting effect displayed by the lighting display device includes: controlling, according to the audio attribute and the light intensity information outside the cockpit, the light brightness before and after the node.
With reference to the second aspect, in some implementations of the second aspect, the node of lighting effect change is associated with at least one of the following in the audio attribute: a beat point indicated by the beat feature, a point at which the downbeat frequency changes as indicated by the tempo feature, and a point of melody change indicated by the melody feature.
According to a third aspect, an apparatus for controlling light display is provided. The apparatus includes: an obtaining unit configured to obtain audio data to be played; a first determining unit configured to determine an audio attribute of the audio data, where the audio attribute includes at least one of a beat feature, a tempo feature, and a melody feature of the audio data; and a processing unit configured to control, according to the audio attribute, a node at which the lighting effect displayed by a lighting display device changes.
With reference to the third aspect, in some implementations of the third aspect, the audio attribute further includes a rhythm feature of the audio data, and the processing unit is configured to: control, according to the rhythm feature, one or more of the light color before and after the node and the frequency of the node, where the frequency of the node can be understood as the speed at which the lighting effect changes.
With reference to the third aspect, in some implementations of the third aspect, the apparatus further includes a second determining unit configured to determine delay information, where the delay information includes at least one of the following: a first duration required to determine the audio attribute from the audio data, a first delay incurred in transmitting lighting effect indication information, a second delay incurred in encoding and decoding the lighting effect indication information, and a second duration required to generate a first lighting effect based on the lighting effect indication information, where the lighting effect indication information is generated according to the audio attribute and is used to indicate the first lighting effect displayed by the lighting display device; and the processing unit is configured to control, according to the audio attribute and the delay information, the node at which the lighting effect displayed by the lighting display device changes, where the lighting effect before and after the node includes the first lighting effect.
With reference to the third aspect, in some implementations of the third aspect, the audio attribute further includes a timbre feature, the timbre feature includes a first timbre feature and a second timbre feature, and the processing unit includes a first processing unit and a second processing unit, where the first processing unit is configured to: control, according to the first timbre feature, a node at which the lighting effect displayed by a first lighting display device changes; and/or the second processing unit is configured to: control, according to the second timbre feature, a node at which the lighting effect displayed by a second lighting display device changes, where the lighting display device includes the first lighting display device and the second lighting display device.
In some possible implementations, the first processing unit and the second processing unit are the same processing unit.
With reference to the third aspect, in some implementations of the third aspect, the timbre feature includes a vocal timbre feature and/or an instrument timbre feature.
With reference to the third aspect, in some implementations of the third aspect, the audio attribute further includes a harmony feature, the harmony feature includes a first harmony feature and a second harmony feature, and the processing unit includes a third processing unit and a fourth processing unit, where the third processing unit is configured to: control, according to the first harmony feature, a node at which the lighting effect displayed by a third lighting display device changes; and/or the fourth processing unit is configured to: control, according to the second harmony feature, a node at which the lighting effect displayed by a fourth lighting display device changes, where the lighting display device includes the third lighting display device and the fourth lighting display device.
In some possible implementations, the third processing unit and the fourth processing unit are the same processing unit.
With reference to the third aspect, in some implementations of the third aspect, the audio attribute further includes a chord feature, the chord feature includes a first chord feature and a second chord feature, and the processing unit includes a fifth processing unit and a sixth processing unit, where the fifth processing unit is configured to: control, according to the first chord feature, a node at which the lighting effect displayed by a fifth lighting display device changes; and/or the sixth processing unit is configured to: control, according to the second chord feature, a node at which the lighting effect displayed by a sixth lighting display device changes, where the lighting display device includes the fifth lighting display device and the sixth lighting display device.
In some possible implementations, the fifth processing unit and the sixth processing unit are the same processing unit.
With reference to the third aspect, in some implementations of the third aspect, the obtaining unit is further configured to: obtain a human body characteristic or identity information of a first user in the cockpit; and the processing unit is configured to: control, according to the audio attribute and the human body characteristic or the identity information, the node at which the lighting effect displayed by the lighting display device changes.
With reference to the third aspect, in some implementations of the third aspect, the processing unit is configured to: when the vehicle in which the cockpit is located is in a driving state, the first user is in the driver's seat of the cockpit, and the human body characteristic indicates that the first user is in a fatigued state, increase the frequency of the node and/or control light of a warning color to be displayed before and after the node.
With reference to the third aspect, in some implementations of the third aspect, the obtaining unit is further configured to: obtain environmental information outside the cockpit, where the environmental information includes one or more of temperature information, current season information, and light intensity information; and the processing unit is configured to: control, according to the audio attribute and the environmental information, one or more of the light color before and after the node, the frequency of the node, and the light brightness before and after the node.
Optionally, the obtaining unit is further configured to: obtain the temperature information outside the cockpit and/or the current season information; and the processing unit is configured to: control, according to the audio attribute and the temperature information outside the cockpit and/or the current season information, the light color before and after the node.
Optionally, the obtaining unit is further configured to: obtain the light intensity information outside the cockpit; and the processing unit is configured to: control, according to the audio attribute and the light intensity information outside the cockpit, the light brightness before and after the node.
With reference to the third aspect, in some implementations of the third aspect, the node of lighting effect change is associated with at least one of the following in the audio attribute: a beat point indicated by the beat feature, a point at which the downbeat frequency changes as indicated by the tempo feature, and a point of melody change indicated by the melody feature.
第四方面,提供了一种控制灯光显示的装置,包括:获取单元,用于获取待播放的音频数据;第一确定单元,用于确定该音频数据的音频属性,该音频属性包括该音频数据的音色特征、和弦特征、和声特征中的至少一个;处理单元,用于根据该音频属性,控制灯光显示设备所显示的灯效变化。
结合第四方面,在第四方面的某些实现方式中,该音色特征包括第一音色特征和第二音色特征,该处理单元用于:根据该第一音色特征控制第一灯光显示设备所显示的灯效变化的节点;和/或根据该第二音色特征控制第二灯光显示设备所显示的灯效变化的节点;其中,该灯光显示设备包括该第一灯光显示设备和该第二灯光显示设备。
结合第四方面,在第四方面的某些实现方式中,该音色特征包括人声音色特征和/或乐器音色特征。
结合第四方面,在第四方面的某些实现方式中,该和声特征包括第一和声特征和第二和声特征,该处理单元用于:根据该第一和声特征控制第三灯光显示设备所显示的灯效变化的节点;和/或根据该第二和声特征控制第四灯光显示设备所显示的灯效变化的节点;其中,该灯光显示设备包括该第三灯光显示设备和该第四灯光显示设备。
结合第四方面,在第四方面的某些实现方式中,该和弦特征包括第一和弦特征和第二和弦特征,该处理单元用于:根据该第一和弦特征控制第五灯光显示设备所显示的灯效变化的节点;和/或根据该第二和弦特征控制第六灯光显示设备所显示的灯效变化的节点;其中,该灯光显示设备包括该第五灯光显示设备和该第六灯光显示设备。
结合第四方面,在第四方面的某些实现方式中,该音频属性还包括该音频数据的节拍特征、速度特征和曲调特征中的至少一个;该处理单元还用于:根据该音频属性,控制灯光显示设备所显示的灯效变化的节点。
结合第四方面,在第四方面的某些实现方式中,该音频属性还包括该音频数据的节奏特征,该处理单元还用于:根据该节奏特征,控制该节点前后的灯光颜色,和/或该节点的频率。
结合第四方面,在第四方面的某些实现方式中,该装置还包括第二确定单元,用于:确定时延信息,该时延信息包括如下至少一项:根据该音频数据确定该音频属性所需的第一时长,传输灯效指示信息产生的第一时延,对该灯效指示信息进行编解码产生的第二时延,以及根据该灯效指示信息生成第一灯效所需的第二时长;其中,该灯效指示信息根据该音频属性生成,用于指示该灯光显示设备所显示的第一灯效;该处理单元用于根据该音频属性以及该时延信息,控制该灯光显示设备所显示的灯效变化的节点,该节点前后的灯效包括该第一灯效。
结合第四方面,在第四方面的某些实现方式中,该获取单元还用于:获取座舱内第一用户的人体特征或身份信息;该处理单元用于:根据该音频属性,以及该人体特征或该身份信息,控制该灯光显示设备所显示的灯效变化的节点。
结合第四方面,在第四方面的某些实现方式中,该处理单元用于:在该座舱所处的车辆处于行驶状态,该第一用户处于该座舱的主驾驶处,且该人体特征指示该第一用户处于疲劳状态时,提高该节点的频率,和/或控制该节点前后显示警示颜色的灯光。
结合第四方面,在第四方面的某些实现方式中,该获取单元还用于:获取座舱外的温度信息和/或当前所处季节信息;该处理单元还用于:根据该音频属性,以及该座舱外的温度信息和/或当前所处季节信息,控制该节点前后的灯光颜色。
结合第四方面,在第四方面的某些实现方式中,该获取单元还用于:获取座舱外的光强信息;该处理单元还用于:根据该音频属性,以及该座舱外的光强信息,控制该节点前后的灯光亮度。
结合第四方面,在第四方面的某些实现方式中,该灯效变化的节点与该音频属性中的如下至少一项相关联:该节拍特征指示的节拍点,该速度特征指示的强拍出现频率变化的点,该曲调特征指示的曲调变化的点。
第五方面,提供了一种控制灯光显示的装置,该装置包括:存储器,用于存储程序;处理器,用于执行存储器存储的程序,当存储器存储的程序被执行时,处理器用于执行上述第一方面或第二方面中任一种可能实现方式中的方法。
第六方面,提供了一种移动载体,该移动载体包括上述第三方面至第五方面中任一种可能实现方式中的装置,以及该灯光显示设备。其中,该灯光显示设备可以包括氛围灯、呼吸灯、车载显示屏、抬头显示(head-up display,HUD),还可以包括其他能够显示灯光的设备。该灯光显示设备设置于仪表盘、中控区域显示屏、前排座椅头枕后部、前排中央扶手处、副驾驶处的至少一处,还可以设置在该移动载体中的其他位置,本申请实施例对此不作具体限定。
第七方面,提供了一种计算机程序产品,上述计算机程序产品包括:计算机程序代码,当上述计算机程序代码在计算机上运行时,使得计算机执行上述第一方面或第二方面中任一种可能实现方式中的方法。
需要说明的是,上述计算机程序代码可以全部或部分存储在第一存储介质上,其中第一存储介质可以与处理器封装在一起,也可以与处理器单独封装,本申请实施例对此不作具体限定。
第八方面,提供了一种计算机可读介质,上述计算机可读介质存储有程序代码,当上述计算机程序代码在计算机上运行时,使得计算机执行上述第一方面或第二方面中任一种可能实现方式中的方法。
第九方面,提供了一种芯片,该芯片包括处理器,用于调用存储器中存储的计算机程序或计算机指令,以使得该处理器执行上述第一方面或第二方面中任一种可能实现方式中的方法。
结合第九方面,在一种可能的实现方式中,该处理器通过接口与存储器耦合。
结合第九方面,在一种可能的实现方式中,该芯片系统还包括存储器,该存储器中存储有计算机程序或计算机指令。
本申请实施例中,能够根据音频的一个或多个维度的特征进行灯光效果设计,有助于提高灯效与音频之间的配合度,提高灯效对音频的表现力,进而提升用户的驾乘体验。根据音频属性控制灯效变化的节点,使得灯光根据音频的节拍、速度和曲调中的至少一个进行变化,能够使得音频表达的理念或情感通过灯光具象化,有助于提升用户在聆听音频时的体验。在确定灯效变化的节点的基础上,结合音频的节奏特征控制灯光的颜色和/或灯效变化的速度,有助于进一步提高灯光对音频的表现力。例如,使得根据节奏较弱的古典音乐进行显示的灯效更具表现力。进一步地,还可以使灯光显示设备根据音频的音色特征、和声特征、和弦特征中的至少一个进行灯光显示,增加灯光显示设备显示灯光的模式,提高用户在使用灯随音动效果时体验到的科技感。此外,还能够提高灯随音动过程中灯光的动感效果,继而提高用户的使用体验。此外,还能够结合用户偏好、人体特征等确定灯效,能够满足不同年龄段和/或不同偏好的用户的需求,有助于提高用户使用灯光显示设备时的交互体验。进一步地,在驾驶员处于疲劳状态时,通过灯光显示设备显示警示性灯光效果,有助于提高行车安全性。此外,本申请通过灯效时延补偿,能够使得实际灯效变化的节点与根据音频数据的音频属性确定的节点基本重合,减少灯效延迟导致的用户体验不佳的问题。
附图说明
图1是本申请实施例提供的车辆的示意性框图。
图2是本申请实施例提供的一种控制灯光显示的方法的应用场景示意图。
图3是本申请实施例提供的一种控制灯光显示的系统的示意性框图。
图4是本申请实施例提供的一种控制灯光显示的方法的示意性流程图。
图5是本申请实施例提供的一种控制灯光显示的方法的示意性流程图。
图6是本申请实施例提供的一种控制灯光显示的方法的示意性流程图。
图7是本申请实施例提供的一种控制灯光显示的方法的示意性流程图。
图8是本申请实施例提供的一种控制灯光显示的方法的示意性流程图。
图9是本申请实施例提供的一种控制灯光显示的方法的应用场景的示意图。
图10是本申请实施例提供的一种控制灯光显示的装置的示意性框图。
图11是本申请实施例提供的一种控制灯光显示的装置的示意性框图。
具体实施方式
下面将结合附图,对本申请实施例中的技术方案进行描述。
图1是本申请实施例提供的车辆100的一个功能框图示意。车辆100可以包括感知系统120、显示装置130和计算平台150。感知系统120可以包括若干种用于感测车辆100周边环境信息的传感器。例如,感知系统120可以包括定位系统,定位系统可以是全球定位系统(global positioning system,GPS),也可以是北斗系统或者其他定位系统、惯性测量单元(inertial measurement unit,IMU)、激光雷达、毫米波雷达、超声雷达、视觉传感器、声音传感器、转向角传感器以及摄像装置中的一种或者多种。
感知系统120还可以包括安装于智能座舱内部或外部的一个或多个摄像头,用于捕捉舱内或舱外的图像,例如,驾驶员监测系统(driver monitor system,DMS)的摄像头,座舱监测系统(cabin monitor system,CMS)的摄像头,以及行车记录仪(dashcam)的摄像头。其中,用于捕捉舱内和舱外的摄像头可以是同一个摄像头,也可以是不同摄像头。上述一个或多个摄像头可以用于采集车辆中用户的面容信息等。例如,舱内的摄像头和毫米波雷达可以用于采集车辆中用户的动作或手势等,舱内的麦克风可以用于采集车辆中的音频信息等。
本申请实施例中的显示装置130主要包括灯光显示设备,用于显示灯光。示例性地,该灯光显示设备可以是由发光二极管(light emitting diode,LED)灯带构成的呼吸灯或氛围灯,LED灯带(下文简称灯带)中可以包括多个LED灯珠;或者,该灯光显示设备也可以为其他类型的灯;或者,该灯光显示设备也可以设置在车载显示屏,例如中控区域显示屏、后视镜显示屏,或者也可以为设置在前排座椅头枕后部或前排中央扶手处的显示屏;或者,该灯光显示设备还可以包括车载显示屏、HUD,还可以包括其他能够显示灯光的设备。
在一些可能的实现方式中,本申请实施例涉及的灯光显示设备可以包括灯珠或灯带(或称灯条),其安装位置如图2中的(a)所示。示例性地,灯光显示设备可以设置于中控区域显示屏处、仪表盘处,或者还可以设置在可升降摄像头上,或者还可以设置于座舱的其他位置。进一步地,灯光显示设备可以为可升降装置,例如灯光显示设备设置于中控区域显示屏时,则在不需要灯光显示设备时,灯光显示设备可以伸缩到显示屏后方,不进行显示,设置位置可以如图2中的(a)中的a所示。在灯光显示设备进行灯效显示时,其位置可以通过升降控制设置于如图2中的(a)中的b、c处,或也可以悬挂于中控显示屏左上角或右上角。当灯光显示设备设置于仪表盘时,其位置可以为图2中的(a)中所示的e处,即仪表盘中间位置,或者,也可以升至仪表盘的上方,即图2中的(a)中所示的d处。
在一些可能的实现方式中,灯光显示设备可以包括设置在中控显示屏边缘的灯带,该灯带由若干LED灯珠组成;或者,该灯光显示设备也可以由设置在可升降摄像头上的LED灯珠组成,具体如图2中的(b)所示。进一步地,灯光显示设备中的灯珠为LED灯珠,或者也可以为车载显示屏中显示的虚拟灯珠,具体如图2中的(c)所示。或者,如图2中的(c)所示,该灯光显示设备也可以包括设置在座舱车门扶手处的灯带1,或者设置于座舱顶部的灯带2;或者,如图2中的(d)所示,该灯光显示设备也可以包括设置在座舱前排中央扶手处的灯带3,或者设置于中控屏和副驾驶屏幕下方的灯带4,或者设置在前挡风玻璃下方的灯带5;或者,该灯光显示设备还可以包括设置在座舱内的门把手、空调出风口、储物盒、杯架、音箱、卡扣等位置的灯带,本申请实施例对此不作具体限定。
在一些可能的实现方式中,灯光显示设备还可以包括车辆的前灯、尾灯、刹车灯等。
车辆100的部分或所有功能可以由计算平台150控制。计算平台150可包括处理器151至15n(n为正整数),处理器是一种具有信号的处理能力的电路,在一种实现中,处理器可以是具有指令读取与运行能力的电路,例如中央处理单元(central processing unit,CPU)、微处理器、图形处理器(graphics processing unit,GPU)(可以理解为一种微处理器)、或数字信号处理器(digital signal processor,DSP)等;在另一种实现中,处理器可以通过硬件电路的逻辑关系实现一定功能,该硬件电路的逻辑关系是固定的或可以重构的,例如处理器为专用集成电路(application-specific integrated circuit,ASIC)或可编程逻辑器件(programmable logic device,PLD)实现的硬件电路,例如现场可编程逻辑门阵列(field programmable gate array,FPGA)。在可重构的硬件电路中,处理器加载配置文档,实现硬件电路配置的过程,可以理解为处理器加载指令,以实现以上部分或全部单元的功能的过程。此外,还可以是针对人工智能设计的硬件电路,其可以理解为一种ASIC,例如神经网络处理单元(neural network processing unit,NPU)、张量处理单元(tensor processing unit,TPU)、深度学习处理单元(deep learning processing unit,DPU)等。此外,计算平台150还可以包括存储器,存储器用于存储指令,处理器151至15n中的部分或全部处理器可以调用存储器中的指令,执行指令,以实现相应的功能。
在本申请实施例中,处理器可以对音乐的节奏特征、力度特征、速度特征、曲调特征等特征进行识别,并根据识别出的音乐中节奏特征、力度特征、速度特征、曲调特征中的至少一个特征,生成灯效。此外,处理器还可以获取感知***120检测的环境信息、用户人体特征和用户身份信息,并结合上述信息以及音乐特征调整灯光效果。应理解,上述“灯光效果”(以下简称灯效)包括显示的灯光颜色、灯光亮度、灯光形成的动态效果中的至少一个。其中,灯光形成的动态效果包括但不限于:颜色渐变、灯光闪烁、灯光律动。其中,环境信息可以包括车内外环境,例如光强、天气情况等;用户人体特征包括但不限于性别、年龄、情绪等;用户身份信息包括但不限于保存在移动载体中的生物特征信息,以及账号等。其中,生物特征信息包括但不限于指纹、掌纹、面容信息、虹膜信息、步态信息等;账号可以包括登录车机***的账号信息等。在一些可能的实现方式中,上述环境信息、用户人体特征和用户身份信息,还可以以数据的形式存储于计算平台150中的存储器中。在一些可能的实现方式中,处理器可以对上述环境信息、用户人体特征和用户身份信息进行处理,获得参数化指标以指示灯光显示设备进行灯效显示。
应理解,上述操作可以由同一个处理器执行,或者,也可以由一个或多个处理器执行,本申请实施例对此不作具体限定。
以下结合图3详细描述控制灯光显示的系统的工作流程。如图3所示为本申请实施例提供的一种控制灯光显示的系统架构图,该系统中包括音乐选择模块、音乐属性识别模块、个性化灯效特征提取模块、时延预测补偿模块、灯效生成模块和灯效显示模块。示例性地,该音乐选择模块、音乐属性识别模块、个性化灯效特征提取模块、时延预测补偿模块、灯效生成模块可以包括图1中所示的计算平台150中的一个或多个处理器;灯效显示模块可以包括图1中所示的显示装置130中的一个或多个灯光显示设备。其中,音乐选择模块用于根据用户的选择确定用于生成灯效的音频,进而对音频进行分段实时处理。进一步地,音乐选择模块将分段实时处理后的音频输入至音乐属性识别模块,音乐属性识别模块对音乐的力度、节拍、速度、曲调、人声音色、乐器音色、和声、和弦、歌词等特征进行识别,进而根据上述特征确定生成灯效所需属性,例如灯效变化的节点、灯带上渐变点、以及变化的灯带等。可选地,音乐属性识别模块将生成灯效所需属性信息输入个性化灯效特征提取模块,该模块可以根据当前音乐风格,结合环境属性、用户人体特征和/或面容信息等确定灯效风格,进而将灯效风格输入灯效生成模块。灯效生成模块根据音乐属性和灯效风格生成灯效,并控制灯光显示模块显示该灯效。在一些可能的实现方式中,音乐选择模块将分段实时处理后的音频输入至时延预测补偿模块,该模块对算法时延、传输时延和编解码时延进行预测,在灯效生成模块生成的灯效中存在时延的位置加入补偿灯效。
应理解,上述模块及装置仅为一个示例,实际应用中,上述模块和装置有可能根据实际需要添加或删除。一示例中,图3中的个性化灯效特征提取模块和灯效生成模块可以合并为一个模块,即二者的功能由一个模块实现。
示例性地,音乐属性识别模块确定的音频特征与灯效特征之间的对应关系如表1所示。其中,灯效特征可以包括何时进行灯效变化(或称灯效变化的节点),例如灯光闪烁的节点、灯光颜色变化的节点、动画形式变化的节点,灯效特征还可以包括其他特征,例如灯光颜色渐变方向、灯光颜色变化幅度、灯效种类等。示例性地,音频特征与灯效特征之间的对应关系如下所示:
1)力度特征:或称响度特征,示例性地,可以以中音的强弱程度作为力度的大小。在一些可能的实现方式中,可以将音频的力度(响度)变化曲线中的极大值点确定为灯效变化的节点,例如灯光闪烁的节点(或称灯光亮度变化的节点)。
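上述"取响度变化曲线极大值点作为闪烁节点"的处理,可以用如下示意性的Python代码表示(其中函数名与输入形式均为便于说明而假设的,并非对实现方式的限定):

```python
def flash_nodes(loudness):
    """在力度(响度)变化曲线中寻找局部极大值点,作为灯光闪烁节点的索引(示意)。"""
    nodes = []
    for i in range(1, len(loudness) - 1):
        # 当前点严格大于前一点且不小于后一点,视为一个极大值点
        if loudness[i] > loudness[i - 1] and loudness[i] >= loudness[i + 1]:
            nodes.append(i)
    return nodes
```

例如,对响度序列[0, 3, 1, 2, 5, 0],该函数返回索引1和4两个闪烁节点。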
2)节拍特征:节拍(beats)狭义上为乐谱四分音符的间隔,应理解,每小节中节拍可以包括强拍、弱拍;或者,每小节中节拍可以包括强拍、次强拍、弱拍。对于一个节拍,前半拍和后半拍衔接的部分为节拍点。示例性地,可以以节拍点作为灯效变化的节点,例如灯光颜色变化的节点。
一示例中,对于古典音乐等响度变化幅度较小的音乐,可以使用节拍点作为灯效变化的节点。
3)速度特征:即强拍出现频率。一示例中,可以根据速度特征调节灯光颜色变化幅度,例如,强拍出现频率越快,则灯光颜色变化幅度越大。其中,灯光颜色变化幅度可以理解为:同一灯珠相邻两次所显示的灯光颜色的变化大小,相邻两次灯光的颜色RGB值相差越大,则灯光颜色变化幅度越大。又一示例中,速度特征发生变化时,则以该变化的位置的强拍点作为灯效变化的节点,例如灯效动画形式变化的节点,如从点闪烁动画变为光影渐变动画的节点。其中,强拍的前半拍和后半拍衔接的部分为强拍点。示例性地,将强拍出现频率分为慢速、中速和快速,在强拍出现频率由慢速变为中速时,可以将该变化位置的强拍点作为灯效变化的节点。示例性地,其中,强拍出现频率为每分钟14拍以下的为慢速,每分钟14到27拍的为中速,每分钟27拍以上的为快速。应理解,上述对速度特征的划分仅为示例性说明,在具体实现过程中,也可以为其他划分方式。
在一些可能的实现方式中,可以综合考虑力度特征和节拍特征确定灯光颜色变化的节点,示例性地,可以按照将两者加权求和的方式确定灯光颜色变化的节点,两者权重可以根据速度特征确定。
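上述速度档位划分,以及按速度特征为力度与节拍加权的处理,可以用如下示意性的Python代码表示(速度档位阈值取自正文示例;具体权重数值为便于说明而假设的,并非对实现方式的限定):

```python
def tempo_class(strong_beats_per_min):
    """按正文示例划分:每分钟14拍以下为慢速,14到27拍为中速,27拍以上为快速。"""
    if strong_beats_per_min < 14:
        return "slow"
    if strong_beats_per_min <= 27:
        return "medium"
    return "fast"

def node_score(loudness_score, beat_score, tempo):
    """对力度特征与节拍特征加权求和,权重随速度档位变化(权重数值为假设)。"""
    w = {"slow": 0.3, "medium": 0.5, "fast": 0.7}[tempo]
    return w * loudness_score + (1 - w) * beat_score
```

例如,某时刻力度得分为1.0、节拍得分为0.0,在快速档位下加权得分约为0.7,可据此判断是否在该时刻设置灯光颜色变化的节点。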
4)曲调特征:可以根据曲调上行或下行确定灯光渐变的方向,例如,曲调如果上行,可以显示向上或者向用户所处位置方向渐变的灯光设计。应理解,由较低的音级向较高的音级进行叫曲调上行,由较高的音级向较低的音级进行叫曲调下行。例如,由音符do到音符re是曲调上行,由音符re到音符do是曲调下行。
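上述"曲调上行对应向上渐变、下行对应向下渐变"的判断,可以用如下示意性的Python代码表示(以do至si的七个音级为例;函数名与返回值为便于说明而假设的):

```python
NOTE_ORDER = ["do", "re", "mi", "fa", "sol", "la", "si"]  # 音级由低到高

def gradient_direction(prev_note, cur_note):
    """音级升高(曲调上行)返回向上渐变,音级降低(曲调下行)返回向下渐变(示意)。"""
    diff = NOTE_ORDER.index(cur_note) - NOTE_ORDER.index(prev_note)
    if diff > 0:
        return "up"
    if diff < 0:
        return "down"
    return "hold"
```

例如,由音符do到音符re判定为上行,灯光渐变方向为向上;由re到do判定为下行,灯光渐变方向为向下。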
5)人声音色特征:在音乐中包含多个人声时,可以根据性别对人声进行分类,分为男声和女声,进一步地,使用多个灯光显示设备中的一个或一组灯光显示设备,根据男声音色进行灯效显示;使用多个灯光显示设备中的另一个或另一组灯光显示设备,根据女声音色进行灯效显示。
6)乐器音色特征:在音频中识别出多个乐器的音色时,可以使用多个灯光显示设备中的一个或一组灯光显示设备,分别负责根据一个乐器的音频进行灯效显示。
例如,多个灯光显示设备包括主驾驶车门扶手处灯带(或称第一组灯光显示设备)、副驾驶车门扶手处灯带(或称第二组灯光显示设备)、第二排左侧车门扶手处灯带(或称第三组灯光显示设备)、第二排右侧车门扶手处灯带(或称第四组灯光显示设备)、中控屏处灯带(或称第五组灯光显示设备)。某音频中包含吉他、贝斯、钢琴、架子鼓时,则上述第一组至第五组灯光显示设备中的四组灯光显示设备,可以分别负责根据吉他、贝斯、钢琴、架子鼓的音频进行灯效显示。
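上述将多个乐器(以及类似地,人声音色、和声声部、和弦乐音)分配到多组灯光显示设备的处理,可以用如下示意性的Python代码表示(按识别顺序依次分配,分配策略为便于说明而假设的):

```python
def assign_tracks(devices, tracks):
    """将识别出的各轨道(乐器/音色/声部/乐音)依次分配给可用灯光显示设备组(示意)。"""
    if len(tracks) > len(devices):
        raise ValueError("轨道数量超过可用灯光显示设备组数量")
    # zip按顺序逐一配对,返回"轨道 -> 设备组"的映射
    return dict(zip(tracks, devices))
```

例如,五组灯带与四个乐器轨道配对后,吉他由第一组灯带负责、架子鼓由第四组灯带负责,各组灯带分别根据对应乐器的音频进行灯效显示。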
7)和声、和弦特征:使用多灯显示,每个灯或每组灯负责根据和声或和弦中一个特征进行灯效显示。
8)歌词特征:可以根据歌词所表达的情感确定灯效种类和/或变化频率。例如,歌词所表达的情感为快乐的情感或正向积极的情感时,灯效种类较为丰富和/或灯效变化频率高;歌词所表达的情感为悲伤的情感时,灯效种类较为简单和/或灯效变化频率低。
9)基频特征:基频(pitch)即基音频率,可以描述乐声、人声的主要音高。一些可能的实现方式中,可以将基音的最大值点确定为灯效变化的节点,例如灯光闪烁的节点。
表1音频特征与灯效特征之间的对应关系
示例性地,个性化灯效特征提取模块确定的各特征与灯效风格之间的对应关系如表2所示。灯效风格可以包括灯光颜色、灯效变化频率、灯光亮度等。示例性地,各特征与灯效风格之间的对应关系如下所示:
1)音乐风格特征:可以根据音频属性识别算法确定音乐风格,或者还可以根据获取的音频数据携带的音乐标签信息确定音乐风格,或者还可以根据其他方式确定音乐风格,本申请实施例对此不作具体限定。若音乐风格为古典音乐、纯音乐,对应的灯效风格为灯光颜色少,灯效变化频率低;若音乐风格为摇滚、重金属音乐,对应的灯效风格为灯光颜色多,灯效变化频率高;若音乐风格为抒情音乐,对应的灯效风格为灯光颜色适中,灯效变化频率适中。
在一些可能的实现方式中,音乐风格特征包括节奏特征,进而根据节奏特征确定灯光颜色和/或灯效变化频率。其中,节奏(tempo)是指单位时长内节拍的数量(或称音符密度)。示例性地,可以使用音符密度表征节奏的快慢。例如,单位时长内音符密度越大,则音乐节奏越快。进一步地,节奏特征指示音频的节奏越快,则在根据该音频控制显示的灯光变化的颜色越多,和/或灯效变化频率越快。例如,该节奏特征指示该音频节奏变化的越快,则上述确定的灯效变化的节点前后的灯光颜色变化幅度越大;和/或该灯效变化的节点的频率越快。其中,灯光颜色变化幅度可以理解为:同一灯珠相邻两次所显示的灯光颜色的变化大小,相邻两次灯光的颜色RGB值相差越大,则灯光颜色变化幅度越大。
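上述灯光颜色变化幅度的度量,以及"节奏越快、颜色变化幅度越大"的映射,可以用如下示意性的Python代码表示(以RGB欧氏距离度量颜色差异、以线性比例映射音符密度,两者均为便于说明而假设的度量方式):

```python
def color_change_amplitude(rgb_prev, rgb_cur):
    """同一灯珠相邻两次显示颜色的RGB差异:差异越大,颜色变化幅度越大(欧氏距离度量,为假设)。"""
    return sum((a - b) ** 2 for a, b in zip(rgb_prev, rgb_cur)) ** 0.5

def amplitude_for_density(note_density, base=32.0):
    """音符密度(单位时长内节拍数量)越大,允许的颜色变化幅度越大(线性比例与base均为假设)。"""
    return base * note_density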
2)环境属性特征:环境属性特征可以包括光强信息、温度信息、季节信息中的至少一个。示例性地,可以根据环境光强确定灯光的亮度,例如,在环境光强达到10000勒克斯(lux)及以上时,灯光亮度可以设置为灯光显示设备能达到的最大亮度,即100%;在环境光强处于10000lux至500lux之间时,灯光亮度可以设置为灯光显示设备能达到的最大亮度的100%至40%之间,其中,环境光强越大,灯光亮度越高;在环境光强低于500lux时,灯光亮度可以设置为灯光显示设备能达到的最大亮度的40%。
根据季节和/或外界温度确定灯光颜色。例如,外界温度高于预设温度时和/或当前季节为夏季时,灯光颜色可以以冷色调为主。外界温度低于或等于预设温度时和/或当前季节为春、秋、冬季中的任一季节时,灯光颜色可以以暖色调为主。其中,冷色调是指色谱中靠近绿色系的颜色,如绿色、蓝色和紫色等,暖色调是指色谱中靠近红色系的颜色,例如,红紫、红、橙、黄等颜色。示例性地,预设温度可以为20度,或者也可以为25度,或者还可以为其他温度。
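上述"按环境光强分段确定亮度"与"按温度、季节确定冷暖色调"的规则,可以用如下示意性的Python代码表示(分段端点取自正文;500~10000lux之间的过渡假设为线性插值,预设温度以20度为例):

```python
def brightness_percent(lux):
    """按正文分段规则计算灯光亮度占最大亮度的百分比(中间段假设线性插值)。"""
    if lux >= 10000:
        return 100.0  # 达到10000lux及以上,取最大亮度
    if lux < 500:
        return 40.0   # 低于500lux,取最大亮度的40%
    # 500~10000lux之间:光强越大,亮度越高(40%~100%,线性插值为假设)
    return 40.0 + (lux - 500) / 9500 * 60.0

def color_tone(temp_c, season, preset_temp=20):
    """外界温度高于预设温度和/或夏季时以冷色调为主,否则以暖色调为主。"""
    return "cool" if (temp_c > preset_temp or season == "summer") else "warm"
```

例如,环境光强为100lux时亮度取40%,12000lux时取100%;外界温度25度时以冷色调为主,冬季15度时以暖色调为主。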
3)用户人体特征:根据用户年龄段确定灯效风格,例如,在用户年龄大于或等于18岁且小于或等于35岁时,采用灯光颜色较多、灯效变化频率较高的灯效风格;在用户年龄大于或等于36岁时,采用灯光颜色较少、灯效变化频率较低的灯效风格。
在驾驶员处于疲劳状态,采用的灯效变化频率较高的灯效风格,和/或灯光颜色以黄色、橙色等具有警示效果的颜色为主,和/或采用较高的灯光亮度。
在座舱内用户中有孩童时,减少或不使用刺眼颜色,和/或降低灯光闪烁频率。示例性地,刺眼颜色可以包括红色、蓝色、紫色。
表2各特征与灯效风格之间的对应关系
在一些可能的实现方式中,还可以根据用户的身份信息确定其偏好的灯效风格。
在一些可能的实现方式中,在确定灯光颜色和灯效变化频率时,可以综合考虑音乐风格、环境属性、以及用户人体特征和/或身份信息。上述各特征在确定灯光颜色和灯效变化频率时所占权重可以不同,例如上述三个方面所占权重可以分别为0.7、0.1和0.2,或者也可以为其他权重,本申请实施例对此不作具体限定。在一些可能的实现方式中,可以根据车辆的运行状态确定各方面所占权重,例如,在车辆处于驻车状态时,上述三个方面所占权重可以分别为0.9、0.05和0.05;在车辆处于行驶状态时,上述三个方面所占权重可以分别为0.5、0.2和0.3。
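上述"按车辆运行状态为音乐风格、环境属性、用户特征三方面取不同权重"的融合方式,可以用如下示意性的Python代码表示(权重数值取自正文示例,各方面得分的含义与取值范围为便于说明而假设的):

```python
def style_weights(vehicle_state):
    """按车辆运行状态返回(音乐风格, 环境属性, 用户特征)三方面权重(正文示例数值)。"""
    if vehicle_state == "parked":
        return (0.9, 0.05, 0.05)   # 驻车状态
    if vehicle_state == "driving":
        return (0.5, 0.2, 0.3)     # 行驶状态
    return (0.7, 0.1, 0.2)         # 默认权重

def fuse(music, env, user, vehicle_state):
    """对三方面得分按权重加权求和,用于确定灯光颜色和灯效变化频率(示意)。"""
    wm, we, wu = style_weights(vehicle_state)
    return wm * music + we * env + wu * user
```

各状态下三方面权重之和均为1,便于将融合结果保持在与各单项得分相同的量纲上。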
在一些可能的实现方式中,根据表1中所示的音频数据的各种特征,可以确定灯效变化的节点,即灯效变化的时刻;根据表2中所示的一些其他特征,可以确定根据音频数据显示的具体灯效,以及灯光的颜色、亮度等,例如,灯效变化的节点之前和/或之后的灯效,以及灯光的颜色、亮度等。
图4示出了本申请实施例提供的一种控制灯光显示的方法400。该方法400可以应用于图1所示的车辆100中,该方法也可以由图3所示的***执行。该方法400包括:
S401,确定音频数据的第一属性,该第一属性包括该音频数据的节拍特征、速度特征和曲调特征中的至少一个。
示例性地,该方法可以应用于上述实施例中车辆100的座舱,或者也可以应用于智能家居的灯光控制领域。
示例性地,该节拍特征、速度特征和曲调特征可以为上述实施例如表1中所示的特征。
示例性地,可以采用音符起始点检测(onset detection)算法确定音频数据的节拍。例如,通过音符起始点检测算法,将能量较强的低频起始点(如鼓点出现的位置)作为粗略估计的节拍位置;进一步地,通过短时能量估计确定节拍点。其中,短时能量是指信号在一段时长内能量均值的变化。示例性地,该一段时长可以为500毫秒,或者也可以为其他时长。
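上述短时能量的计算可以用如下示意性的Python代码表示(以逐点滑动的窗口求平方和均值;窗口逐点滑动、以样本数表示窗口长度均为便于说明而假设的):

```python
def short_time_energy(signal, win):
    """短时能量:信号在滑动窗口内能量(样本平方和)的均值(示意)。"""
    return [sum(x * x for x in signal[i:i + win]) / win
            for i in range(len(signal) - win + 1)]
```

通过比较相邻窗口短时能量的变化,可以在粗略估计的节拍位置附近确定节拍点。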
示例性地,可以通过节奏检测算法确定强拍出现频率。例如,通过节奏检测算法识别音符密度(或称节拍频率,即单位时长内节拍的数量),进而确定音频的节奏。进一步地,结合短时能量确定强拍的位置及数量,进而确定强拍出现频率。
示例性地,也可以通过短时能量确定响度最大值;可以采用音乐旋律识别算法确定音频数据中的曲调变化情况。应理解,还可以采用其他算法确定音频数据的力度特征、节拍特征、速度特征、曲调特征等,本申请实施例对此不作具体限定。
S402,根据该第一属性控制灯光显示设备所显示的灯效变化的节点,该灯效包括如下至少一项:灯光颜色、灯光亮度、灯光形成的动态效果。
示例性地,灯光形成的动态效果包括但不限于:颜色渐变、灯光闪烁、灯光律动。
示例性地,根据该第一属性确定该座舱的灯光显示设备所显示的灯效变化的节点的方法,可以参考表1及对应部分的描述。
一示例中,根据中音强弱变化生成中音强弱变化曲线,确定曲线上的极大值点为灯光闪烁的节点。
又一示例中,根据节拍确定节拍点,可以将节拍点作为灯光颜色变化的节点。示例性地,节点前后的灯光颜色可以为随机生成的,或者也可以为根据外部参考信息确定的。
再一示例中,确定速度特征变化的点为动态效果变化的节点,例如节点前的动态效果为点闪烁动画,节点后的动态效果为光影渐变动画。或者强拍出现频率由快速变为中速或慢速时,灯效中渐变点跳动幅度由较大变为较小。在一些可能的实现方式中,“渐变点”可以为在前一时刻与当前时刻颜色不同的一个或多个灯珠,“渐变点跳动幅度”可以包括渐变点颜色变化幅度,当渐变点变化前后颜色为高饱和反差颜色时,认为渐变点跳动幅度较大;当渐变点变化前后颜色为较为相近的颜色时,认为渐变点跳动幅度较小。在一些可能的实现方式中,“渐变点”可以为设置在两个颜色反差较大的灯珠之间的一个或多个灯珠,“渐变点跳动幅度”可以包括渐变点颜色变化幅度,在相邻两个灯珠的颜色为高饱和反差颜色时,认为渐变点跳动幅度较大,当相邻两个灯珠颜色为较为相近的颜色时,认为渐变点跳动幅度较小。
再一示例中,确定曲调变化的点为渐变方向变化的节点。曲调如果上行,可以显示向上或者向用户渐变的灯效,曲调如果下行,可以显示向下或者远离用户渐变的灯效。若节点前曲调上行,节点后曲调下行,则灯光渐变方向由向上变为向下(或者由向用户变为远离用户)。
应理解,上述第一属性与灯效变化的节点之间的关系仅为示例性说明,在具体实现过程中,其对应关系也可以为其他形式,例如,确定中音强弱变化曲线上的极大值点为灯光颜色变化的节点,确定节拍点为灯光闪烁的点,确定强拍出现频率变化的点为渐变方向变化的点,确定曲调变化的点为动态效果变化的点。
在确定灯光显示设备所显示的灯光效果时,可以只考虑第一属性中的一个属性,或者也可以综合考虑第一属性中的两个或以上属性。
在一些可能的实现方式中,可以结合音乐风格确定在设计灯光效果时需要考虑的第一属性。一示例中,若音乐风格属于古典风格、纯音乐风格或抒情风格,则可以主要根据曲调特征确定灯效变化的节点。又一示例中,若音乐风格属于摇滚风格或重金属风格,则可以主要根据力度特征确定灯效变化的节点。
在一些可能的实现方式中,该第一属性还包括该音频数据的节奏特征,进而根据该节奏特征,控制灯光显示设备显示的灯光颜色,和/或灯效变化频率。
本申请实施例提供的一种控制灯光显示的方法,能够根据音乐多个维度的特征进行灯光效果设计,有助于提高灯光效果与音乐之间的配合度,提高灯效对音乐的表现力,进而提升用户的驾乘体验。
图5示出了本申请实施例提供的一种控制灯光显示的方法500。该方法500可以应用于图1所示的车辆100中,该方法也可以由图3所示的***执行。该方法500包括:
S501,确定音频数据的第二属性,该第二属性包括该音频数据的音色特征、和弦特征、和声特征中的至少一个。
示例性地,该方法可以应用于上述实施例中车辆100的座舱,或者也可以应用于智能家居的灯光控制领域。
示例性地,可以采用基于递归图的乐器识别算法确定音频数据中的乐器的个数;可以采用基于音级轮廓图(pitch class profile,PCP)和恒Q变换(the constant Q transform,CQT)算法,进行音频数据的和弦识别和/或和声识别,确定和弦乐音的个数和/或和声声部的个数。应理解,还可以采用其他算法确定乐器的个数、和弦乐音的个数、和声声部的个数,本申请实施例对此不作具体限定。
S502,根据该第二属性控制该座舱的第一灯光显示设备进行灯光显示,其中,第一灯光显示设备根据音色特征指示的第一音色进行灯光显示,或者根据和弦特征指示的第一乐音进行灯光显示,或者根据和声特征指示的第一声部进行灯光显示。
示例性地,第一灯光显示设备包括上述实施例中的一个或多个灯光显示设备。
示例性地,灯光显示设备包括灯光显示设备1至灯光显示设备10共十个灯光显示设备。
一示例中,该音色特征指示该音频数据中的乐器的个数为4个,则可以控制上述灯光显示设备1至灯光显示设备10中的四个灯光显示设备,例如,灯光显示设备1至灯光显示设备4,分别根据4个乐器中的一个乐器的音频进行灯效显示。
又一示例中,该和弦特征指示该音频数据中的和弦乐音的个数为3个,则可以控制上述灯光显示设备1至灯光显示设备10中的三个灯光显示设备,例如,灯光显示设备5至灯光显示设备7,分别根据3个乐音中的一个乐音进行灯效显示。
再一示例中,该和声特征指示该音频数据中的和声声部的个数为3个,则可以控制上述灯光显示设备1至灯光显示设备10中的三个灯光显示设备,例如,灯光显示设备8至灯光显示设备10,分别根据3个声部中的一个声部进行灯效显示。
在一些可能的实现方式中,根据乐器进行灯效显示的灯光显示设备,与根据乐音或声部进行灯效显示的灯光显示设备也可以为相同的灯光显示设备,本申请实施例对此不作具体限定。
本申请实施例提供的一种控制灯光显示的方法,能够根据音频数据中的音色特征、和声特征、和弦特征中的至少一个确定进行灯效显示的灯光显示设备,有助于提高灯光显示设备的表现力,进而提高灯随音动过程中灯光的动感效果,提高用户的使用体验。
该方法500可以在方法400之前执行,或者也可以在方法400之后执行,或者也可以与方法400并行执行,本申请实施例对此不作具体限定。
在一些可能的实现方式中,方法500可以与方法400结合,例如根据第一属性和第二属性,分别控制两个及以上灯光显示设备中,每个灯光显示设备所显示的灯效变化的节点。
图6示出了本申请实施例提供的一种控制灯光显示的方法600。该方法600可以应用于图1所示的车辆100中,该方法也可以由图3所示的***执行。该方法600包括:
S601,获取外部参考信息,该外部参考信息包括如下至少一项:座舱内用户的人体特征、该座舱内用户的身份信息、该座舱外部的光强信息、该座舱外部的温度信息、当前所处季节信息。
示例性地,该座舱可以为上述实施例中车辆100的座舱,或者也可以为其他车辆中的座舱,本申请实施例对此不作具体限定。
示例性地,人体特征可以包括上述实施例中的人体特征,身份信息可以包括上述实施例中的身份信息。
一示例中,获取座舱内用户的人体特征,可以包括:获取摄像装置或舱内视觉传感器(例如DMS或CMS)拍摄的用户的面容图像。进一步地,在用户为驾驶员时,可以通过人脸关键点检测算法、疲劳检测算法对面容图像进行处理,得到用户的疲劳程度;或者,可以通过C3AE人脸年龄识别算法对面容图像进行处理确定用户的年龄;或者,可以通过人脸关键点检测算法、情绪识别算法对面容图像进行处理确定用户的情绪状态。
又一示例中,获取该座舱外部的光强包括:获取光敏传感器采集的光强信息;获取该座舱外部的温度包括:获取温度传感器采集的温度信息;获取当前所处季节信息包括:从云端服务器获取当前季节信息。
S602,根据该外部参考信息,控制该座舱的灯光显示设备显示的灯效、灯光的颜色、亮度、以及灯效变化频率中的至少一项。
示例性地,根据外部参考信息确定该座舱的灯光显示设备所显示灯效、灯光的颜色、亮度、以及灯效变化频率的方法,可以参考表2及对应部分的描述。
示例性地,该灯效、灯光的颜色、亮度可以包括:灯效变化的节点之前的灯效、灯光的颜色、亮度,和/或灯效变化的节点之后的灯效、灯光的颜色、亮度。
在一些可能的实现方式中,根据用户的身份信息确定灯光显示设备所显示灯效、灯光的颜色、亮度、以及灯效变化频率,还可以包括:基于身份信息确定用户的灯效风格偏好。
在一些可能的实现方式中,车辆中保存了一个或者多个用户的身份信息,若座舱内用户的身份信息与该一个或者多个用户的身份信息匹配,则可以根据该座舱内用户的身份信息确定灯效风格。
本申请实施例提供的一种控制灯光显示的方法,能够结合用户偏好、座舱外部环境信息等确定灯效,能够满足不同年龄段和/或不同偏好的用户的需求,还能够提高用户的舒适度。
该方法600可以在方法400和/或方法500之前执行,或者也可以在方法400和/或方法500之后执行,或者也可以与方法400和/或方法500并行执行,本申请实施例对此不作具体限定。
在一些可能的实现方式中,方法600可以与方法400结合,例如根据外部参考信息和第一属性,控制灯光显示设备所显示的灯效变化的节点的频率,和/或该节点前后的灯光颜色。其中,灯效变化的节点的频率可以包括上述灯效变化频率。
在一些可能的实现方式中,在执行方法400、方法500、方法600之前,从车辆的音乐播放器处获取音频数据,对音频数据进行分段处理得到一个或多个分段音频数据。示例性地,可以基于自相似矩阵(self-similarity matrix,SSM)算法对音频数据进行分段,例如,对于流行音乐可以根据前奏、副歌、主歌对音频数据进行分段,对于古典音乐可以根据呈现部(exposition),发展部(development)和再现部(recapitulation)进行分段;或者,也可以基于聚类算法对音频数据进行分段,并将重复的音频归为一类;或者,还可以通过其他方法对音频数据进行分段,本申请实施例对此不作具体限定。应理解,通过对音频数据进行分段处理,有助于减少灯效实时生成和显示过程中的时延。
在一些可能的实现方式中,在通过方法400、方法500、方法600中的至少一项确定灯效后,可以基于用户的身份信息或基于该音频数据保存该灯效,以便于该用户下次播放此音频数据时,能够直接控制灯光显示设备显示该灯效。
图7示出了本申请实施例提供的一种控制灯光显示的方法700。该方法700可以应用于图1所示的车辆100中,该方法也可以由图3所示的***执行。该方法700包括:
S701,确定根据音频数据确定音频属性所需的第一时长。
示例性地,该音频数据可以包括上述实施例中的音频数据;该音频属性可以包括上述实施例中的第一属性和/或第二属性。
示例性地,该第一时长可以包括上述音频识别算法在对音频数据进行识别以获得音频属性时,所需的时长。在一些可能的实现方式中,音频数据越复杂,该第一时长越长。例如,音频数据包含的乐器、和声、和弦越多,则该音频数据越复杂,该第一时长越长。
S702,确定传输灯效指示信息产生的第一时延,确定对该灯效指示信息进行编解码产生的第二时延,确定根据该灯效指示信息生成第一灯效所需的第二时长,该灯效指示信息根据该音频属性生成,用于指示灯光显示设备显示第一灯效。
在一些可能的实现方式中,根据车辆中传输该灯效指示信息的网络带宽,以及该灯效指示信息的数据帧长度确定第一时延。其中,网络带宽越小、数据帧长度越长,则第一时延越长。示例性地,根据该灯效指示信息的数据帧长度确定第二时延。其中,数据帧长度越长,则第二时延越长。
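上述"带宽越小、数据帧越长则第一时延越长"以及"数据帧越长则第二时延越长"的关系,可以用如下示意性的Python代码表示(将传输时延简化为帧长除以带宽、将编解码时延简化为与帧长成正比,比例系数为便于说明而假设的):

```python
def tx_delay_ms(frame_bits, bandwidth_bps):
    """第一时延(传输时延):数据帧长度/网络带宽,换算为毫秒(简化模型)。"""
    return frame_bits / bandwidth_bps * 1000.0

def codec_delay_ms(frame_bits, per_bit_us=0.01):
    """第二时延(编解码时延):假设与数据帧长度成正比(每比特耗时per_bit_us微秒,为假设值)。"""
    return frame_bits * per_bit_us / 1000.0
```

例如,1000比特的灯效指示信息帧在1Mbps带宽下的传输时延约为1毫秒。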
S703,根据该第一时长、该第二时长、该第一时延和该第二时延中的至少一项,对该灯光显示设备显示的该第一灯效进行时延补偿。
在一些可能的实现方式中,由于上述第一时长、第一时延和第二时延的存在,使得灯光显示设备显示的灯效滞后于音频设备输出的音频,导致灯效实际变化的节点与根据第一属性确定的节点不一致。根据该第一时长、该第一时延和该第二时延对该灯光显示设备显示的该第一灯效进行时延补偿,能够使得灯效实际变化的节点与根据第一属性确定的节点吻合;或者使得灯效实际变化的节点与根据第一属性确定的节点之间的时间差,小于或等于第一阈值。示例性地,该第一阈值可以为16毫秒,或者也可以为15毫秒,或者也可以为其他高于人眼识别频率的数值,本申请实施例对此不作具体限定。
在一些可能的实现方式中,还可以结合音频采样率和灯光刷新率,对灯光显示设备显示的灯效进行灯效补偿。示例性地,以音频采样率为44100赫兹(hertz,Hz)为例,则每帧音频的播放时间约为23.2毫秒(或者也可以理解为每23.2毫秒更新一帧音频)。假设灯光刷新率为20Hz,即每50毫秒刷新一次。则可以在音频更新和灯光刷新之间存在时延的位置增加补偿灯效,进行灯效平滑,例如增加水波纹(渐白)灯效,做缓冲补偿。
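上述时延补偿可以理解为:将灯效指令的下发时刻提前各项时延之和,并比较音频帧更新周期与灯光刷新周期之差,在差值较大处插入补偿灯效。以下为一段示意性的Python代码(其中每帧1024个采样点为与"44100Hz下约23.2毫秒一帧"相符的假设值):

```python
def schedule_send_time(node_time_ms, algo_ms, tx_ms, codec_ms, gen_ms):
    """将灯效指令提前"第一时长+第一时延+第二时延+第二时长"下发,
    使实际灯效变化的节点与音频节点基本重合(示意)。"""
    return node_time_ms - (algo_ms + tx_ms + codec_ms + gen_ms)

def frame_refresh_gap_ms(sample_rate=44100, frame_samples=1024, refresh_hz=20):
    """灯光刷新周期(20Hz即50毫秒)与音频帧更新周期(44100Hz、每帧1024采样点时约23.2毫秒)之差,
    可据此判断在哪些位置需要插入补偿灯效进行平滑。"""
    return 1000.0 / refresh_hz - frame_samples / sample_rate * 1000.0
```

例如,音频节点位于第1000毫秒、各项时延合计50毫秒时,灯效指令应在第950毫秒下发;默认参数下刷新周期与帧周期相差约26.8毫秒,可在该间隙插入水波纹等缓冲补偿灯效。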
在一些可能的实现方式中,根据音频显示灯光的灯光显示设备可能包括两条及以上链路,其中,每条链路中包括至少一个灯光显示设备,两条及以上链路中每条链路中的灯光刷新率可能不同,则可以分别对每条链路进行时延补偿。
本申请实施例提供的一种控制灯光显示的方法,能够使得实际灯效变化的节点与根据音频数据的第一属性确定的节点基本重合,有助于提升用户使用灯光显示设备的体验。
图8示出了本申请实施例提供的一种控制灯光显示的方法800。该方法800可以应用于图1所示的车辆100中,该方法也可以由图3所示的***执行。该方法800包括:
S801,获取待播放的音频数据。
示例性地,该音频数据可以为上述实施例中的音频数据,或者也可以为其他待播放的音频数据。
S802,确定该音频数据的音频属性,该音频属性包括该音频数据的节拍特征、速度特征和曲调特征中的至少一个。
示例性地,该音频属性可以为上述方法700中音频属性;或者该音频属性可以包括上述实施例中的第一属性,或者还可以包括上述实施例中的第二属性,或者还可以包括其他音频相关的属性。
示例性地,确定音频属性的方法可以参考上述实施例中的描述,在此不再赘述。
S803,根据该音频属性,控制灯光显示设备所显示的灯效变化的节点。
示例性地,根据该音频属性控制灯光显示设备所显示的灯效变化的节点的方法流程,可以参考上述实施例中的描述,在此不再赘述。
在一些可能的实现方式中,控制灯光显示设备所显示的灯效变化的节点,包括如下至少一项:控制灯光显示设备所显示的灯效变化的时刻,和/或灯效变化前后的灯效具体形式。
示例性地,灯效包括如下至少一项:灯光颜色、灯光亮度、灯光形成的动态效果;其中,在灯光显示设备包括灯带时,灯光形成的动态效果包括但不限于:灯带中灯光变化的方向、速度、灯光的流动长度、颜色变化类型(如按色盘变化还是按黑白变化)。此外,灯光效果还可以包括:灯带维度上灯光形成的动态效果,和/或时间维度上灯光形成的动态效果。以下结合图9作具体说明:
如图9所示,灯光显示设备包括灯带910、灯带920、灯带930,其中,每条灯带包括多个灯珠。以灯带910为例说明灯带维度上灯光形成的动态效果,其中,灯带中灯光变化的方向可以包括:从灯珠911向灯珠913的方向变化,或者从灯珠913向灯珠911的方向变化。上述灯光变化可以包括但不限于:灯珠依次亮起、灯珠依次熄灭、灯珠颜色渐变。灯光变化的速度可以从时长上体现,例如,从灯珠911到灯珠913依次亮起所需时长越短,则灯光变化的速度越快。灯光的流动可以通过灯珠依次亮起和/或依次熄灭呈现,灯光的流动长度可以从灯带中进行灯光显示的灯珠个数体现,例如,可以控制从灯珠911到灯珠913均进行灯光显示,或者也可以控制从灯珠911到灯珠912进行灯光显示,其中,前一种情况下灯光的流动长度长于后一种情况下灯光的流动长度。在一些实现方式中,时间维度上灯光形成的动态效果,可以从灯带910至灯带930进行灯光显示的时间体现。一示例中,可以控制灯带910、灯带920和灯带930中的至少两个同时进行灯光显示。又一示例中,可以控制灯带910、灯带920和灯带930依次进行灯光显示,例如,灯带910进行灯光显示之后,再由灯带920进行灯光显示,待灯带920灯光显示结束之后,再由灯带930进行灯光显示。
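上述"灯带维度上灯珠依次亮起"与"时间维度上多条灯带依次显示"的时序安排,可以用如下示意性的Python代码表示(以毫秒时刻表表达,函数名与参数均为便于说明而假设的):

```python
def flow_on_times(num_beads, step_ms, start_ms=0):
    """灯带中各灯珠依次亮起的时刻表:step_ms越短,灯光流动(变化)速度越快(示意)。"""
    return [start_ms + i * step_ms for i in range(num_beads)]

def strip_start_times(strip_durations_ms, start_ms=0):
    """多条灯带依次显示:上一条灯带显示结束后,下一条灯带再开始显示(示意)。"""
    starts, t = [], start_ms
    for duration in strip_durations_ms:
        starts.append(t)
        t += duration
    return starts
```

例如,3个灯珠以50毫秒步进依次亮起的时刻为0、50、100毫秒;三条灯带的显示时长分别为200、300、100毫秒时,各灯带的起始显示时刻为0、200、500毫秒。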
可选地,该音频属性包括该音频数据的音色特征、和弦特征、和声特征中的至少一个;根据该音色特征、和弦特征、和声特征中的至少一个,控制灯光显示设备所显示的灯效变化。例如,根据该第一音色特征控制第一灯光显示设备所显示的灯效变化的节点;和/或根据该第二音色特征控制第二灯光显示设备所显示的灯效变化的节点;其中,该灯光显示设备包括该第一灯光显示设备和该第二灯光显示设备。根据该第一和声特征控制第三灯光显示设备所显示的灯效变化的节点;和/或根据该第二和声特征控制第四灯光显示设备所显示的灯效变化的节点;其中,该灯光显示设备包括该第三灯光显示设备和该第四灯光显示设备。根据该第一和弦特征控制第五灯光显示设备所显示的灯效变化的节点;和/或根据该第二和弦特征控制第六灯光显示设备所显示的灯效变化的节点;其中,该灯光显示设备包括该第五灯光显示设备和该第六灯光显示设备。
示例性地,该第一音色特征用于指示音频数据中的第一音色的相关信息,该第二音色特征用于指示音频数据中的第二音色的相关信息。
示例性地,该第一和声特征用于指示音频数据中的第一声部的相关信息,该第二和声特征用于指示音频数据中的第二声部的相关信息。
示例性地,该第一和弦特征用于指示音频数据中的第一乐音的相关信息,该第二和弦特征用于指示音频数据中的第二乐音的相关信息。
其中,根据该音色特征、和弦特征、和声特征中的至少一个,控制灯光显示设备所显示的灯效变化的具体方法流程,可以参考上述实施例中的描述,在此不再赘述。
可选地,根据音频属性和座舱内第一用户的人体特征或身份信息,控制该灯光显示设备所显示的灯效变化的节点。例如,控制该节点前后的灯光颜色,和/或该节点的频率。
示例性地,该座舱内第一用户的人体特征或身份信息,可以包括上述实施例中外部参考信息中的一项或多项。根据音频属性和座舱内第一用户的人体特征或身份信息,控制该灯光显示设备所显示的灯效变化的节点的具体方法流程,可以参考上述实施例中的描述,在此不再赘述。
可选地,根据音频属性,以及座舱外的温度信息和/或当前所处季节信息,控制该节点前后的灯光颜色。
示例性地,该座舱外的温度信息和/或当前所处季节信息,可以包括上述实施例中外部参考信息中的一项或多项,例如温度信息可以包括外界温度。根据音频属性,以及座舱外的温度信息和/或当前所处季节信息,控制该节点前后的灯光颜色的具体方法流程,可以参考上述实施例中的描述,在此不再赘述。
可选地,根据音频属性,以及座舱外的光强信息,控制该节点前后的灯光亮度。
示例性地,该座舱外的光强信息,可以包括上述实施例中外部参考信息中的一项或多项,例如环境光强。根据音频属性,以及光强信息,控制该节点前后的灯光亮度的具体方法流程,可以参考上述实施例中的描述,在此不再赘述。
本申请实施例提供的一种控制灯光显示的方法,能够根据音乐一个或多个维度的特征进行灯效控制,有助于提高灯效与音乐之间的配合度,提高灯效对音乐的表现力,进而提升用户的驾乘体验。
在本申请的各个实施例中,如果没有特殊说明以及逻辑冲突,各个实施例之间的术语和/或描述具有一致性、且可以相互引用,不同的实施例中的技术特征根据其内在的逻辑关系可以组合形成新的实施例。
上文中结合图4至图9详细说明了本申请实施例提供的方法。下面将结合图10和图11详细说明本申请实施例提供的装置。应理解,装置实施例的描述与方法实施例的描述相互对应,因此,未详细描述的内容可以参见上文方法实施例,为了简洁,这里不再赘述。
图10示出了本申请实施例提供的一种控制灯光显示的装置2000的示意性框图,该装置2000包括获取单元2010、第一确定单元2020和处理单元2030。
该装置2000可以包括用于执行图4至图8中的方法的单元。并且,该装置2000中的各单元和上述其他操作和/或功能分别为了实现图4至图8中的方法实施例的相应流程。
其中,当该装置2000用于执行图8中的方法800时,获取单元2010可用于执行方法800中的S801,第一确定单元2020可用于执行方法800中的S802,处理单元2030可用于执行方法800中的S803。
具体地,该装置2000包括:获取单元2010,用于获取待播放的音频数据;第一确定单元2020,用于确定该音频数据的音频属性,该音频属性包括该音频数据的节拍特征、速度特征和曲调特征中的至少一个;处理单元2030,用于根据该音频属性,控制灯光显示设备所显示的灯效变化的节点。
可选地,该音频属性还包括该音频数据的节奏特征,该处理单元2030用于:根据该节奏特征,控制该节点前后的灯光颜色、该节点的频率中的一个或多个,其中,该节点的频率用于指示该灯效变化的速度。
可选地,该装置还包括第二确定单元,用于:确定时延信息,该时延信息包括如下至少一项:根据该音频数据确定该音频属性所需的第一时长、传输灯效指示信息产生的第一时延、对该灯效指示信息进行编解码产生的第二时延,以及根据该灯效指示信息生成第一灯效所需的第二时长;其中,该灯效指示信息根据该音频属性生成,用于指示该灯光显示设备所显示的第一灯效;该处理单元2030用于根据该音频属性以及该时延信息,控制该灯光显示设备所显示的灯效变化的节点,该节点前后的灯效包括该第一灯效。
在一些可能的实现方式中,第一确定单元2020和第二确定单元为同一确定单元。
可选地,该音频属性还包括音色特征,该音色特征包括第一音色特征和第二音色特征,该处理单元2030包括第一处理单元和第二处理单元,其中,该第一处理单元用于:根据该第一音色特征控制第一灯光显示设备所显示的灯效变化的节点;和/或该第二处理单元用于:根据该第二音色特征控制第二灯光显示设备所显示的灯效变化的节点;其中,该灯光显示设备包括该第一灯光显示设备和该第二灯光显示设备。
在一些可能的实现方式中,第一处理单元与第二处理单元为同一处理单元。
可选地,该音色特征包括人声音色特征和/或乐器音色特征。
可选地,该音频属性还包括和声特征,该和声特征包括第一和声特征和第二和声特征,该处理单元2030包括第三处理单元和第四处理单元,其中,该第三处理单元用于:根据该第一和声特征控制第三灯光显示设备所显示的灯效变化的节点;和/或该第四处理单元用于:根据该第二和声特征控制第四灯光显示设备所显示的灯效变化的节点;其中,该灯光显示设备包括该第三灯光显示设备和该第四灯光显示设备。
在一些可能的实现方式中,第三处理单元与第四处理单元为同一处理单元。
可选地,该音频属性还包括和弦特征,该和弦特征包括第一和弦特征和第二和弦特征,该处理单元2030包括第五处理单元和第六处理单元,其中,该第五处理单元用于:根据该第一和弦特征控制第五灯光显示设备所显示的灯效变化的节点;和/或该第六处理单元用于:根据该第二和弦特征控制第六灯光显示设备所显示的灯效变化的节点;其中,该灯光显示设备包括该第五灯光显示设备和该第六灯光显示设备。
在一些可能的实现方式中,第五处理单元与第六处理单元为同一处理单元。
可选地,该获取单元2010还用于:获取座舱内第一用户的人体特征或身份信息;该处理单元2030用于:根据该音频属性,以及该人体特征或该身份信息,控制该灯光显示设备所显示的灯效变化的节点。
可选地,该处理单元2030用于:在该座舱所处的车辆处于行驶状态,该第一用户处于该座舱的主驾驶处,且该人体特征指示该第一用户处于疲劳状态时,提高该节点的频率,和/或控制该节点前后显示警示颜色的灯光。
可选地,该获取单元2010还用于:获取座舱外的环境信息,该环境信息包括温度信息、当前所处季节信息、光强信息中的一个或多个;该处理单元2030用于:根据该音频属性,以及该环境信息,控制该节点前后的灯光颜色、该节点的频率、该节点前后的灯光亮度中的一个或多个。
可选地,该获取单元2010还用于:获取座舱外的温度信息和/或当前所处季节信息;该处理单元2030用于:根据该音频属性,以及该座舱外的温度信息和/或当前所处季节信息,控制该节点前后的灯光颜色。
可选地,该获取单元2010还用于:获取座舱外的光强信息;该处理单元2030用于:根据该音频属性,以及该座舱外的光强信息,控制该节点前后的灯光亮度。
可选地,该灯效变化的节点与该音频属性中的如下至少一项相关联:该节拍特征指示的节拍点,该速度特征指示的强拍出现频率变化的点,该曲调特征指示的曲调变化的点。
应理解,以上装置中各单元的划分仅是一种逻辑功能的划分,实际实现时可以全部或部分集成到一个物理实体上,也可以物理上分开。此外,装置中的单元可以以处理器调用软件的形式实现;例如装置包括处理器,处理器与存储器连接,存储器中存储有指令,处理器调用存储器中存储的指令,以实现以上任一种方法或实现该装置各单元的功能,其中处理器例如为通用处理器,例如CPU或微处理器,存储器为装置内的存储器或装置外的存储器。或者,装置中的单元可以以硬件电路的形式实现,可以通过对硬件电路的设计实现部分或全部单元的功能,该硬件电路可以理解为一个或多个处理器;例如,在一种实现中,该硬件电路为ASIC,通过对电路内元件逻辑关系的设计,实现以上部分或全部单元的功能;再如,在另一种实现中,该硬件电路可以通过PLD实现,以FPGA为例,其可以包括大量逻辑门电路,通过配置文件来配置逻辑门电路之间的连接关系,从而实现以上部分或全部单元的功能。以上装置的所有单元可以全部通过处理器调用软件的形式实现,或全部通过硬件电路的形式实现,或部分通过处理器调用软件的形式实现,剩余部分通过硬件电路的形式实现。
在本申请实施例中,处理器是一种具有信号的处理能力的电路,在一种实现中,处理器可以是具有指令读取与运行能力的电路,例如CPU、微处理器、GPU、或DSP等;在另一种实现中,处理器可以通过硬件电路的逻辑关系实现一定功能,该硬件电路的逻辑关系是固定的或可以重构的,例如处理器为ASIC或PLD实现的硬件电路,例如FPGA。在可重构的硬件电路中,处理器加载配置文档,实现硬件电路配置的过程,可以理解为处理器加载指令,以实现以上部分或全部单元的功能的过程。此外,还可以是针对人工智能设计的硬件电路,其可以理解为一种ASIC,例如NPU、TPU、DPU等。
可见,以上装置中的各单元可以是被配置成实施以上方法的一个或多个处理器(或处理电路),例如:CPU、GPU、NPU、TPU、DPU、微处理器、DSP、ASIC、FPGA,或这些处理器形式中至少两种的组合。
此外,以上装置中的各单元可以全部或部分集成在一起,或者可以独立实现。在一种实现中,这些单元集成在一起,以片上系统(system-on-a-chip,SOC)的形式实现。该SOC中可以包括至少一个处理器,用于实现以上任一种方法或实现该装置各单元的功能,该至少一个处理器的种类可以不同,例如包括CPU和FPGA,CPU和人工智能处理器,CPU和GPU等。
在具体实现过程中,上述获取单元2010、第一确定单元2020和处理单元2030所执行的各项操作可以由同一个处理器执行,或者,也可以由不同的处理器执行,例如分别由多个处理器执行;上述第一处理单元、第二处理单元、第三处理单元、第四处理单元、第五处理单元、第六处理单元所执行的各项操作可以由同一个处理器执行,或者,也可以由不同的处理器执行,例如分别由多个处理器执行。一示例中,一个或多个处理器可以与图1中的感知系统120中一个或多个传感器相连接,从一个或多个传感器中获取用户所处位置的信息并进行处理;又一示例中,一个或多个处理器还可以与显示装置130中的一个或多个灯光显示设备相连接,进而分别控制一个或多个灯光显示设备中,每个灯光显示设备所显示的灯光效果。示例性地,在具体实现过程中,上述一个或多个处理器可以为设置在车机中的处理器,或者也可以为设置在其他车载终端中的处理器。示例性地,在具体实现过程中,上述装置2000可以为设置在车机或者其他车载终端中的芯片。示例性地,在具体实现过程中,上述装置2000可以为设置在车辆中的如图1所示的计算平台150。
本申请实施例还提供了一种装置,该装置包括处理单元和存储单元,其中存储单元用于存储指令,处理单元执行存储单元所存储的指令,以使该装置执行上述实施例执行的方法或者步骤。
可选地,在具体实现过程中,上述处理单元可以包括图1所示的处理器151-15n。上述获取单元可以为图1所示的感知系统120中某个传感器,或者也可以为图1所示的处理器151-15n。
图11是本申请实施例的一种控制灯光显示的装置的示意性框图。图11所示的控制灯光显示的装置2100可以包括:处理器2110、收发器2120以及存储器2130。其中,处理器2110、收发器2120以及存储器2130通过内部连接通路相连,该存储器2130用于存储指令,该处理器2110用于执行该存储器2130存储的指令,以收发器2120接收/发送部分参数。可选地,存储器2130既可以和处理器2110通过接口耦合,也可以和处理器2110集成在一起。
需要说明的是,上述收发器2120可以包括但不限于输入/输出接口(input/output interface)一类的收发装置,来实现装置2100与其他设备或通信网络之间的通信。
处理器2110可以采用通用的CPU,微处理器,ASIC,GPU或者一个或多个集成电路,用于执行相关程序,以实现本申请方法实施例的控制灯光显示的方法。处理器2110还可以是一种集成电路芯片,具有信号的处理能力。在具体实现过程中,本申请的控制灯光显示的方法的各个步骤可以通过处理器2110中的硬件的集成逻辑电路或者软件形式的指令完成。上述处理器2110还可以是通用处理器、DSP、ASIC、FPGA或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请实施例中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。结合本申请实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器2130,处理器2110读取存储器2130中的信息,结合其硬件执行本申请方法实施例的控制灯光显示的方法。
存储器2130可以是只读存储器(read only memory,ROM),静态存储设备,动态存储设备或者随机存取存储器(random access memory,RAM)。
收发器2120使用例如但不限于收发器一类的收发装置,来实现装置2100与其他设备或通信网络之间的通信。例如,可以通过收发器2120获取用户所处位置的信息。
本申请实施例还提供一种移动载体,该移动载体可以包括上述装置2000,或者上述装置2100。
示例性地,该移动载体可以为上述实施例中的车辆。
本申请实施例还提供了一种计算机程序产品,该计算机程序产品包括:计算机程序代码,当该计算机程序代码在计算机上运行时,使得计算机执行上述图4至图8中的方法。
本申请实施例还提供一种计算机可读存储介质,该计算机可读介质存储有程序代码或指令,当该计算机程序代码或指令被计算机的处理器执行时,使得该处理器实现上述图4至图8中的方法。
本申请实施例还提供一种芯片,包括:至少一个处理器和存储器,该至少一个处理器与该存储器耦合,用于读取并执行该存储器中的指令,以执行上述图4至图8中的方法。
本申请将围绕包括多个设备、组件、模块等的***来呈现各个方面、实施例或特征。应当理解和明白的是,各个***可以包括另外的设备、组件、模块等,并且/或者可以并不包括结合附图讨论的所有设备、组件、模块等。此外,还可以使用这些方案的组合。
另外,在本申请实施例中,“示例的”、“例如”等词用于表示作例子、例证或说明。本申请中被描述为“示例”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用示例的一词旨在以具体方式呈现概念。
本申请实施例中,“相应的(corresponding,relevant)”和“对应的(corresponding)”有时可以混用,应当指出的是,在不强调其区别时,其所要表达的含义是一致的。
在本说明书中描述的参考“一个实施例”或“一些实施例”等意味着在本申请的一个或多个实施例中包括结合该实施例描述的特定特征、结构或特点。由此,在本说明书中的不同之处出现的语句“在一个实施例中”、“在一些实施例中”、“在其他一些实施例中”、“在另外一些实施例中”等不是必然都参考相同的实施例,而是意味着“一个或多个但不是所有的实施例”,除非是以其他方式另外特别强调。术语“包括”、“包含”、“具有”及它们的变形都意味着“包括但不限于”,除非是以其他方式另外特别强调。
本申请中,“至少一个”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:包括单独存在A,同时存在A和B,以及单独存在B的情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指的这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b,或c中的至少一项(个),可以表示:a,b,c,a-b,a-c,b-c,或a-b-c,其中a,b,c可以是单个,也可以是多个。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的***、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的***、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个***,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际 的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、ROM、RAM、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (27)

  1. 一种控制灯光显示的方法,其特征在于,包括:
    获取待播放的音频数据;
    确定所述音频数据的音频属性,所述音频属性包括所述音频数据的节拍特征、速度特征和曲调特征中的至少一个;
    根据所述音频属性,控制灯光显示设备所显示的灯效变化的节点。
  2. 根据权利要求1所述的方法,其特征在于,所述音频属性还包括所述音频数据的节奏特征,所述控制灯光显示设备所显示的灯效变化的节点,包括:
    根据所述节奏特征,控制所述节点前后的灯光颜色、所述节点的频率中的一个或多个,其中,所述节点的频率用于指示所述灯效变化的速度。
  3. 根据权利要求1或2所述的方法,其特征在于,所述方法还包括:
    确定时延信息,所述时延信息包括如下至少一项:
    根据所述音频数据确定所述音频属性所需的第一时长,
    传输灯效指示信息产生的第一时延,
    对所述灯效指示信息进行编解码产生的第二时延,以及
    根据所述灯效指示信息生成第一灯效所需的第二时长;
    其中,所述灯效指示信息根据所述音频属性生成,用于指示所述灯光显示设备所显示的第一灯效;
    所述控制灯光显示设备所显示的灯效变化的节点,包括:
    根据所述音频属性以及所述时延信息,控制所述灯光显示设备所显示的灯效变化的节点,所述节点前后的灯效包括所述第一灯效。
  4. 根据权利要求1至3中任一项所述的方法,其特征在于,所述音频属性还包括音色特征,所述音色特征包括第一音色特征和第二音色特征,所述控制灯光显示设备所显示的灯效变化的节点,包括:
    根据所述第一音色特征控制第一灯光显示设备所显示的灯效变化的节点;和/或
    根据所述第二音色特征控制第二灯光显示设备所显示的灯效变化的节点;
    其中,所述灯光显示设备包括所述第一灯光显示设备和所述第二灯光显示设备。
  5. 根据权利要求4所述的方法,其特征在于,所述音色特征包括人声音色特征和/或乐器音色特征。
  6. 根据权利要求1至5中任一项所述的方法,其特征在于,所述音频属性还包括和声特征,所述和声特征包括第一和声特征和第二和声特征,所述控制灯光显示设备所显示的灯效变化的节点,包括:
    根据所述第一和声特征控制第三灯光显示设备所显示的灯效变化的节点;和/或
    根据所述第二和声特征控制第四灯光显示设备所显示的灯效变化的节点;
    其中,所述灯光显示设备包括所述第三灯光显示设备和所述第四灯光显示设备。
  7. 根据权利要求1至6中任一项所述的方法,其特征在于,所述音频属性还包括和弦特征,所述和弦特征包括第一和弦特征和第二和弦特征,所述控制灯光显示设备所显示的灯效变化的节点,包括:
    根据所述第一和弦特征控制第五灯光显示设备所显示的灯效变化的节点;和/或
    根据所述第二和弦特征控制第六灯光显示设备所显示的灯效变化的节点;
    其中,所述灯光显示设备包括所述第五灯光显示设备和所述第六灯光显示设备。
  8. 根据权利要求1至7中任一项所述的方法,其特征在于,所述方法应用于座舱,所述方法还包括:
    获取所述座舱内的第一用户的人体特征或身份信息;
    所述控制灯光显示设备所显示的灯效变化的节点,包括:
    根据所述音频属性,以及所述人体特征或所述身份信息,控制所述灯光显示设备所显示的灯效变化的节点。
  9. 根据权利要求8所述的方法,其特征在于,所述控制所述灯光显示设备所显示的灯效变化的节点,包括:
    在所述座舱所处的车辆处于行驶状态,所述第一用户处于所述座舱的主驾驶处,且所述人体特征指示所述第一用户处于疲劳状态时,提高所述节点的频率,和/或控制所述节点前后显示警示颜色的灯光。
  10. 根据权利要求1至9中任一项所述的方法,其特征在于,所述方法还包括:
    获取座舱外的环境信息,所述环境信息包括温度信息、当前所处季节信息、光强信息中的一个或多个;
    所述控制灯光显示设备所显示的灯效变化的节点,包括:
    根据所述音频属性,以及所述环境信息,控制所述节点前后的灯光颜色、所述节点的频率、所述节点前后的灯光亮度中的一个或多个。
  11. 根据权利要求1至10中任一项所述的方法,其特征在于,所述灯效变化的节点与所述音频属性中的如下至少一项相关联:
    所述节拍特征指示的节拍点,所述速度特征指示的强拍出现频率变化的点,所述曲调特征指示的曲调变化的点。
  12. 一种控制灯光显示的装置,其特征在于,包括:
    获取单元,用于获取待播放的音频数据;
    第一确定单元,用于确定所述音频数据的音频属性,所述音频属性包括所述音频数据的节拍特征、速度特征和曲调特征中的至少一个;
    处理单元,用于根据所述音频属性,控制灯光显示设备所显示的灯效变化的节点。
  13. 根据权利要求12所述的装置,其特征在于,所述音频属性还包括所述音频数据的节奏特征,所述处理单元用于:
    根据所述节奏特征,控制所述节点前后的灯光颜色、所述节点的频率中的一个或多个,其中,所述节点的频率用于指示所述灯效变化的速度。
  14. 根据权利要求12或13所述的装置,其特征在于,所述装置还包括第二确定单元,用于:
    确定时延信息,所述时延信息包括如下至少一项:
    根据所述音频数据确定所述音频属性所需的第一时长,
    传输灯效指示信息产生的第一时延,
    对所述灯效指示信息进行编解码产生的第二时延,以及
    根据所述灯效指示信息生成第一灯效所需的第二时长;
    其中,所述灯效指示信息根据所述音频属性生成,用于指示所述灯光显示设备所显示的第一灯效;
    所述处理单元用于:根据所述音频属性以及所述时延信息,控制所述灯光显示设备所显示的灯效变化的节点,所述节点前后的灯效包括所述第一灯效。
  15. 根据权利要求12至14中任一项所述的装置,其特征在于,所述音频属性还包括音色特征,所述音色特征包括第一音色特征和第二音色特征,所述处理单元包括第一处理单元和第二处理单元,其中,
    所述第一处理单元用于:根据所述第一音色特征控制第一灯光显示设备所显示的灯效变化的节点;和/或
    所述第二处理单元用于:根据所述第二音色特征控制第二灯光显示设备所显示的灯效变化的节点;
    其中,所述灯光显示设备包括所述第一灯光显示设备和所述第二灯光显示设备。
  16. 根据权利要求15所述的装置,其特征在于,所述音色特征包括人声音色特征和/或乐器音色特征。
  17. 根据权利要求12至16中任一项所述的装置,其特征在于,所述音频属性还包括和声特征,所述和声特征包括第一和声特征和第二和声特征,所述处理单元包括第三处理单元和第四处理单元,其中,
    所述第三处理单元用于:根据所述第一和声特征控制第三灯光显示设备所显示的灯效变化的节点;和/或
    所述第四处理单元用于:根据所述第二和声特征控制第四灯光显示设备所显示的灯效变化的节点;
    其中,所述灯光显示设备包括所述第三灯光显示设备和所述第四灯光显示设备。
  18. 根据权利要求12至17中任一项所述的装置,其特征在于,所述音频属性还包括和弦特征,所述和弦特征包括第一和弦特征和第二和弦特征,所述处理单元包括第五处理单元和第六处理单元,其中,
    所述第五处理单元用于:根据所述第一和弦特征控制第五灯光显示设备所显示的灯效变化的节点;和/或
    所述第六处理单元用于:根据所述第二和弦特征控制第六灯光显示设备所显示的灯效变化的节点;
    其中,所述灯光显示设备包括所述第五灯光显示设备和所述第六灯光显示设备。
  19. 根据权利要求12至18中任一项所述的装置,其特征在于,所述获取单元还用于:获取座舱内第一用户的人体特征或身份信息;
    所述处理单元用于:根据所述音频属性,以及所述人体特征或所述身份信息,控制所述灯光显示设备所显示的灯效变化的节点。
  20. 根据权利要求19所述的装置,其特征在于,所述处理单元用于:
    在所述座舱所处的车辆处于行驶状态,所述第一用户处于所述座舱的主驾驶处,且所述人体特征指示所述第一用户处于疲劳状态时,提高所述节点的频率,和/或控制所述节点前后显示警示颜色的灯光。
  21. 根据权利要求12至20中任一项所述的装置,其特征在于,所述获取单元还用于:获取座舱外的环境信息,所述环境信息包括温度信息、当前所处季节信息、光强信息中的一个或多个;
    所述处理单元用于:根据所述音频属性,以及所述环境信息,控制所述节点前后的灯光颜色、所述节点的频率、所述节点前后的灯光亮度中的一个或多个。
  22. 根据权利要求12至21中任一项所述的装置,其特征在于,所述灯效变化的节点与所述音频属性中的如下至少一项相关联:
    所述节拍特征指示的节拍点,所述速度特征指示的强拍出现频率变化的点,所述曲调特征指示的曲调变化的点。
  23. 一种控制灯光显示的装置,其特征在于,包括:
    存储器,用于存储计算机程序;
    处理器,用于执行所述存储器中存储的计算机程序,以使得所述装置执行如权利要求1至11中任一项所述的方法。
  24. 一种移动载体,其特征在于,包括权利要求12至23中任一项所述的装置。
  25. 根据权利要求24所述的移动载体,其特征在于,所述移动载体为车辆。
  27. 一种计算机可读存储介质,其特征在于,其上存储有计算机程序,所述计算机程序被计算机执行时,使得实现如权利要求1至11中任一项所述的方法。
  27. 一种芯片,其特征在于,所述芯片包括处理器与数据接口,所述处理器通过所述数据接口读取存储器上存储的指令,以执行如权利要求1至11中任一项所述的方法。
PCT/CN2023/107419 2022-09-05 2023-07-14 控制灯光显示的方法和装置 WO2024051347A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211080757.7A CN117693086A (zh) 2022-09-05 2022-09-05 控制灯光显示的方法和装置
CN202211080757.7 2022-09-05

Publications (1)

Publication Number Publication Date
WO2024051347A1 true WO2024051347A1 (zh) 2024-03-14

Family

ID=90125127

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/107419 WO2024051347A1 (zh) 2022-09-05 2023-07-14 控制灯光显示的方法和装置

Country Status (2)

Country Link
CN (1) CN117693086A (zh)
WO (1) WO2024051347A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017000794A1 (zh) * 2015-06-30 2017-01-05 芋头科技(杭州)有限公司 一种音乐灯光律动***及方法
CN107461710A (zh) * 2017-08-14 2017-12-12 广州法锐科技有限公司 便于控制调节光照强度及色彩的车用灯具及其控制方法
CN113613369A (zh) * 2021-08-11 2021-11-05 深圳市智岩科技有限公司 一种灯光效果控制方法、装置、设备及存储介质
WO2022036945A1 (zh) * 2020-08-17 2022-02-24 广州橙行智动汽车科技有限公司 车灯的控制方法、控制装置和存储介质
CN114245528A (zh) * 2021-12-16 2022-03-25 浙江吉利控股集团有限公司 车辆灯光秀控制方法、装置、设备、介质及程序产品
CN114375083A (zh) * 2021-12-17 2022-04-19 广西世纪创新显示电子有限公司 一种灯光律动方法、装置、终端设备及存储介质

Also Published As

Publication number Publication date
CN117693086A (zh) 2024-03-12


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23862048

Country of ref document: EP

Kind code of ref document: A1