CN115826904A - Sound effect configuration method and device, computer equipment and storage medium - Google Patents

Sound effect configuration method and device, computer equipment and storage medium Download PDF

Info

Publication number
CN115826904A
Authority
CN
China
Prior art keywords
sound effect
information
node
target
configuration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211288141.9A
Other languages
Chinese (zh)
Inventor
赵鸿含
张良杰
林俊斌
姚振杰
郭宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202211288141.9A priority Critical patent/CN115826904A/en
Publication of CN115826904A publication Critical patent/CN115826904A/en

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a sound effect configuration method, a sound effect configuration device, computer equipment, and a storage medium. A target UI node of a target application and the UI attribute information corresponding to the target UI node are displayed on a sound effect configuration interface; in response to a touch operation acting on the UI attribute information, the UI attribute information is updated, and sound effect information is acquired according to the updated UI attribute information; sound effect configuration is then performed on the target UI node based on the sound effect information. The sound effect configuration interface allows direct editing of the UI node and its corresponding UI attributes, so that sound effects are hooked directly to UI nodes without developers having to write hooking code, which reduces the operational difficulty of sound effect configuration and improves its efficiency.

Description

Sound effect configuration method and device, computer equipment and storage medium
Technical Field
The application relates to the field of artificial intelligence technology, and in particular to a sound effect configuration method and device, computer equipment, and a storage medium.
Background
In order to improve the experience of game players, different sound effects are often set for different game interaction interfaces (UI, User Interface) in game applications. In existing sound effect configuration workflows, sound effect designers produce the sound effect audio, and program developers then hook the audio to different game interfaces through program coding; this process usually requires multiple iterations until the interface sound effects in the game application achieve audio-visual synchronization and similar effects.
As games grow increasingly large, the game interaction interfaces in game applications become more and more complex, and hooking sound effects to them becomes more and more difficult, requiring program developers to write large amounts of hooking code. As a result, sound effect configuration is difficult and inefficient.
Disclosure of Invention
Accordingly, there is a need to provide a sound effect configuration method, apparatus, computer device, and storage medium that improve the efficiency of sound effect configuration.
In a first aspect, the present application provides a sound effect configuration method, including:
displaying a target UI node of a target application and UI attribute information corresponding to the target UI node on a sound effect configuration interface;
responding to touch operation acted on the UI attribute information, updating the UI attribute information, and acquiring sound effect information according to the updated UI attribute information;
and carrying out sound effect configuration on the target UI node based on the sound effect information.
In a second aspect, the present application provides a sound effect configuration apparatus, comprising:
the display module is used for displaying a target UI node of a target application and UI attribute information corresponding to the target UI node on a sound effect configuration interface;
the operation module is used for responding to touch operation acted on the UI attribute information, updating the UI attribute information and acquiring sound effect information according to the updated UI attribute information;
and the configuration module is used for carrying out sound effect configuration on the target UI node based on the sound effect information.
In a third aspect, the present application further provides a computer device, comprising:
one or more processors;
a memory; and
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors to implement the sound effect configuration method.
In a fourth aspect, the present application further provides a computer readable storage medium, on which a computer program is stored, the computer program being loaded by a processor to perform the steps of the sound effect configuration method.
In a fifth aspect, embodiments of the present application provide a computer program product or a computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided by the first aspect.
According to the sound effect configuration method, the sound effect configuration device, the computer equipment, and the storage medium described above, the target UI node of the target application and the UI attribute information corresponding to the target UI node are displayed on the sound effect configuration interface; in response to a touch operation acting on the UI attribute information, the UI attribute information is updated, and sound effect information is acquired according to the updated UI attribute information; sound effect configuration is then performed on the target UI node based on the sound effect information. The sound effect configuration interface allows direct editing of the UI node and its corresponding UI attributes, so that sound effects are hooked directly to UI nodes without developers having to write hooking code, which reduces the operational difficulty of sound effect configuration and improves its efficiency.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a flow chart illustrating a sound effect configuration method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a sound effect configuration interface in an embodiment of the present application;
FIG. 3 is another diagram of a sound effect configuration interface in an embodiment of the present application;
FIG. 4 is a flowchart illustrating a UI attribute information updating step in an embodiment of the present application;
FIG. 5 is another flowchart illustrating the UI attribute information updating step in the embodiment of the present application;
FIG. 6 is a schematic structural diagram of a sound effect configuration apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a computer device in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
In the description of the present application, the terms "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or as implying a number of the indicated technical features. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the description of the present application, the word "such as" is used to mean "serving as an example, instance, or illustration". Any embodiment described herein as an example is not necessarily to be construed as preferred or advantageous over other embodiments. The following description is presented to enable any person skilled in the art to make and use the invention. In the following description, details are set forth for the purpose of explanation. It will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known structures and processes are not shown in detail to avoid obscuring the description of the invention with unnecessary detail. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
In the existing configuration of interface sound effects, sound effect designers produce the sound effect audio, program developers hook the audio to different game interfaces through program coding, and the designers then adjust the game interfaces and the interface sound effects again; this process requires multiple iterations until the interface sound effects in the game application achieve audio-visual synchronization and similar effects. As games grow increasingly large, game interaction interfaces in game applications become more and more complex, hooking sound effects to different game interfaces becomes more and more difficult, and program developers often have to write large amounts of hooking code to adjust and integrate UI interfaces and interface sound effects, so game development efficiency is low. The embodiments of the present application provide a sound effect configuration method applied to a computer device capable of presenting a sound effect configuration interface. No code needs to be written: the user only inputs touch operations for the configuration parameters on the sound effect configuration interface, the computer device responds to the touch operations and sets the sound effect information corresponding to a UI node, and the sound effect for the UI node is then configured based on that sound effect information. This reduces the developers' workload, makes sound effect configuration more convenient and intuitive, and improves its efficiency.
The computer device may be a terminal, a server, or another type of device with display and processing capabilities, which is not limited in the embodiments of the present application. The sound effect configuration method provided in the embodiments of the present application is explained below with the computer device as the execution subject.
Referring to fig. 1, an embodiment of the present application provides a sound effect configuration method, including steps S110 to S130, as follows:
and step S110, displaying a target UI node of the target application and UI attribute information corresponding to the target UI node on a sound effect configuration interface.
And step S120, responding to the touch operation acted on the UI attribute information, updating the UI attribute information, and acquiring sound effect information according to the updated UI attribute information.
And step S130, performing sound effect configuration on the target UI node based on the sound effect information.
The sound effect configuration interface can be used to display a UI node in an application and the attribute information corresponding to that UI node, and to receive configuration information input by the user, so as to configure sound effects for the UI node. Specifically, the sound effect configuration interface may be a visual interface provided by a graphics processing application, for example the cocostudio application; or a visual interface provided by an audio engine, for example the audio engine Wwise.
The target UI node refers to an interface interaction element in the interactive interface corresponding to the target application that requires sound effect configuration. For example, the target UI node may be a UI control in the interactive interface, such as a button control, a prompt box control, or a popup control; the target UI node may also be a UI animation, such as a countdown animation or a task-success animation on a game interface in a game application. The element shown on the sound effect configuration interface for the target UI node may be the interface interaction element corresponding to that node. For example, taking the UI node as a UI control, the sound effect configuration interface shown in fig. 2 displays the UI control 210 corresponding to the target UI node; as another example, taking the UI node as a UI animation, the sound effect configuration interface shown in fig. 3 opens the UI animation through the canvas to display the UI animation 310 corresponding to the target UI node.
Further, the target UI node may be one or more UI nodes selected from the interface layout information of the interactive interface corresponding to the target application, or may be a UI node corresponding to the loaded UI file.
Specifically, in one embodiment, before the step of displaying the target UI node of the target application and the UI attribute information corresponding to the target UI node on the sound effect configuration interface, the method further includes: displaying a UI node tree of a target application in a sound effect configuration interface; in response to a touch operation acting on the UI node tree, a target UI node is determined in the UI node tree.
The interactive interface of the target application comprises different interface interactive elements (namely UI nodes), and the interactive logic of the whole interactive interface can be realized based on the interface interactive elements; specifically, the terminal can load a UI layout file of the target application, generate a UI node tree according to UI layout information in the UI layout file, and display the UI node tree of the target application in the sound effect configuration interface. It can be understood that the UI node tree is used to represent layout information between different UI nodes in the interactive interface corresponding to the target application, and each node in the UI node tree is a UI node. For convenience of user operation, the UI nodes in the UI node tree are represented by node names, and if a user wants to operate a certain UI node, the UI node name corresponding to the UI node can be operated.
After the UI node tree of the target application is displayed in the sound effect configuration interface, a developer can trigger, through the terminal, a selection operation on any UI node of the UI node tree to select the target UI node. The terminal then obtains the selection operation acting on the UI node tree, determines the UI node it triggered as the target UI node, and updates the sound effect configuration interface to display the target UI node and its UI attribute information.
The UI attribute information refers to attribute information related to the UI node, for example the name of the interactive element corresponding to the UI node, or the track of the UI node. Specifically, to facilitate user operations, the UI attribute information may be displayed on the sound effect configuration interface in the form of a graphical control, which receives the touch operation for changing the UI attribute information. Still taking fig. 2 as an example, the sound effect configuration interface shown in fig. 2 further includes the control name 220 corresponding to the UI control; likewise, the sound effect configuration interface shown in fig. 3 further includes the animation timeline 320 corresponding to the UI animation 310. Displaying the UI nodes and their attribute information on the sound effect configuration interface allows developers to configure sound effects directly for different UI nodes in the interface.
The touch operation on the UI attribute information is used to edit and update the UI attribute information corresponding to the UI node; specifically, it may be a touch-screen operation, a cursor operation, a key operation, or the like. The sound effect information may include information such as the audio file information of the sound effect and its trigger node.
Specifically, a user can trigger touch operation for editing UI attribute information corresponding to a target UI node through a sound effect configuration interface to change the UI attribute information of the target UI node, the terminal updates the UI attribute information based on the touch operation after receiving the touch operation, acquires update information from the updated UI attribute information, acquires corresponding sound effect information according to the update information, and finally performs sound effect configuration on the target UI node according to the sound effect information to generate a sound effect file corresponding to the target UI node.
Further, the terminal can update the UI attribute information in response to a touch operation applied to the UI attribute information through the following two possible implementation manners, and acquire sound effect information according to the updated UI attribute information:
in a first possible implementation manner, as described above, the target UI node may be a UI control in the interactive interface, and the corresponding UI attribute information may refer to a control name of the UI control, and in an embodiment, as shown in fig. 4, the step of updating the UI attribute information in response to a touch operation applied to the UI attribute information, and acquiring sound effect information according to the updated UI attribute information includes:
s410, responding to the editing operation aiming at the control name, and updating the control name;
s420, extracting a sound effect configuration field from the updated control name;
s430, according to the sound effect configuration field, indexing the sound effect information corresponding to the sound effect configuration field.
The terminal can receive the editing operation for the control name input by the user on the sound effect configuration interface, and updates the control name of the UI control based on that operation after receiving it. The editing operation may be a touch-screen operation, a cursor operation, a key operation, or the like.
The sound effect configuration field is a field used to identify different sound effect information; through it, the corresponding sound effect audio and information such as the sound effect trigger node can be found. Specifically, in an embodiment, before the step of extracting the sound effect configuration field from the updated control name, the method may further include: acquiring the audio file information of the sound effect audio; constructing the sound effect information based on the audio file information and the sound effect trigger node of the sound effect audio; and generating a sound effect configuration field for indexing the sound effect information.
The audio file information includes, but is not limited to, the file path and file name of the audio file, through which the sound effect audio can be quickly located. The sound effect trigger node indicates the trigger event that causes the sound effect audio to play. For example, the sound effect trigger node may include a sound effect trigger identifier value: when the value is 1, the audio starts playing at the release (finger-lift) stage of a trigger operation received by the UI control; when the value is 2, the audio starts playing at the press stage of the trigger operation; and when the value is 3, the audio starts playing after the UI control has been fully displayed.
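The trigger identifier convention just described can be captured in a small enumeration. This is a sketch of the mapping stated in the text (1 = release, 2 = press, 3 = after display); the enum and member names are illustrative assumptions.

```python
from enum import IntEnum

class SoundTrigger(IntEnum):
    """Sound effect trigger identifier values as described in the embodiment."""
    ON_RELEASE = 1   # play when the finger lifts from the control
    ON_PRESS = 2     # play when the control is pressed
    ON_SHOWN = 3     # play after the control has been fully displayed
```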
Specifically, the audio file information of the sound effect audio and the sound effect trigger node of that audio are mapped to each other to establish the sound effect information, and a sound effect configuration field is defined for that information. Table 1 below shows different sound effect information entries and their corresponding configuration fields; each field may consist of an identification character and a number, and each sound effect trigger node contains a trigger identifier value. Taking the configuration field "%100" as an example, it consists of the identification character "%" and the number "100"; through it, the sound effect audio "Play_ui_click_1" can be indexed, and the trigger node that triggers this audio carries the identifier value 1, i.e. "Play_ui_click_1" starts playing at the release stage of a trigger operation received by the UI control.
| Sound effect configuration field | File path of sound effect audio | File name of sound effect audio | Sound effect trigger node |
| %100 | Sound\ui_common.bnk | Play_ui_click_1 | 1 |
| %200 | Sound\ui_common.bnk | Play_ui_tab | 2 |
| %300 | Sound\ui_common.bnk | Play_ui_bagopen | 3 |
TABLE 1
Specifically, after the UI control of the target application and the control name corresponding to it are displayed on the sound effect configuration interface, the user can edit the control name through the interface and input a pre-generated sound effect configuration field through the editing operation. After receiving the editing operation, the terminal updates the control name of the UI control, identifies the updated portion of the name to obtain the sound effect configuration field, and indexes the sound effect information corresponding to that field.
Referring to fig. 2, the editing operation may include a click operation in which the user selects the control name 220 in the sound effect configuration interface with the left mouse button, and an input operation in which the user types the sound effect configuration field on the keyboard. The sound effect configuration interface selects the control name 220 in response to the click operation, updates the control name 220 from "PanelBG" to "PanelBG%200" in response to the input operation, identifies the updated portion of the name to obtain the sound effect configuration field "%200", and indexes the corresponding sound effect information according to the field "%200".
After the sound effect information corresponding to the sound effect configuration field is acquired, in an embodiment, the step of performing sound effect configuration on the target UI node based on the sound effect information may further include: acquiring audio file information and a sound effect trigger node in the sound effect information; and writing the audio file information and the sound effect trigger node into a control configuration file of the UI control.
Specifically, after the sound effect configuration field has been resolved to its sound effect information, the audio file information and the sound effect trigger node in that information can be written item by item into the control configuration file of the UI control. While the target application is running, when the UI control receives a touch event, the event is compared with the sound effect trigger node in the control configuration file corresponding to the UI control; when they match, the corresponding sound effect audio is located according to the audio file information in the configuration file and played.
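The runtime matching step can be sketched as below. The touch event names and the `play_audio` callback are illustrative assumptions; only the compare-then-play logic follows the text.

```python
# Map of runtime touch events to trigger identifier values (assumed names).
EVENT_TO_TRIGGER = {"release": 1, "press": 2, "shown": 3}

def on_touch_event(control_config: dict, event: str, play_audio) -> bool:
    """Compare a touch event with the trigger node in the control's
    configuration; play the configured audio on a match."""
    if EVENT_TO_TRIGGER.get(event) == control_config["trigger"]:
        play_audio(control_config["path"], control_config["name"])
        return True
    return False
```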
For example, taking the target application as a gunfight survival game and combining with Table 1: the game interface includes a shooting button corresponding to a virtual firearm. During sound effect configuration for the shooting button, the button and the button name corresponding to it can be displayed on the sound effect configuration interface, and the user can edit the button name through the interface to append the preset sound effect configuration field "%200" as a suffix. After receiving the editing operation, the terminal updates the control name based on it, identifies the updated portion to obtain the sound effect configuration field "%200", and indexes the corresponding sound effect information according to the field value "%200": the sound effect audio "Play_ui_tab", whose trigger node carries the trigger identifier value 2, i.e. the shooting button starts playing "Play_ui_tab" at the press stage of a received trigger operation.
By editing the control name of the UI control to append a sound effect configuration field, and configuring the control through the sound effect information indexed by that field, repeated configuration work for UI controls that recur across different interaction interfaces is avoided, which reduces the complexity of sound effect configuration and improves its efficiency.
Further, in an embodiment, after the sound effect configuration field is obtained, it can be written directly into the control configuration file of the UI control. Meanwhile, the sound effect information, built from the audio file information of the sound effect audio and the sound effect trigger node of that audio, is stored together with the configuration field that indexes it as a control sound effect index table. While the target application is running, when the UI control receives a touch event, it can look up the audio file information and the sound effect trigger node in the control sound effect index table based on the configuration field in its control configuration file, compare the currently received touch event with the trigger node, and, when they match, locate the corresponding sound effect audio according to the audio file information and play it.
In a second possible implementation manner, as described above, the target UI node may be a UI animation in the interactive interface, and the corresponding UI attribute information may refer to an animation timeline of the UI animation, as shown in fig. 5, in an embodiment, in response to a touch operation applied to the UI attribute information, the step of updating the UI attribute information, and acquiring sound effect information according to the updated UI attribute information includes:
s510, in response to the editing operation acted on the animation time shaft, audio frame information is set at a specified time point on the animation time shaft;
and S520, generating sound effect information of the UI animation according to the designated time point and the audio frame information.
The animation timeline refers to the playback track of the UI animation. The terminal can receive an editing operation input by the user on the animation timeline through the sound effect configuration interface and, after receiving it, acquire the sound effect information set on the timeline based on that operation. The editing operation may be a touch-screen operation, a cursor operation, a key operation, or the like.
The specified time point is a time point selected on the animation timeline through an editing operation acting on the timeline. The audio frame information refers to the sound-effect-related information triggered at that time point, including but not limited to the audio file information of the sound effect and playback-related settings, such as the audio insertion mode (e.g. fade-in/fade-out) and the loop setting (e.g. single play or looped play).
Specifically, after a UI animation of the target application and its animation timeline are displayed on the sound effect configuration interface, the user can edit the animation timeline through the interface, select a specified time point on it via an editing operation, and set audio frame information at that point, such as the audio file information to be played and the playback-related settings. After the terminal receives the editing operation, it acquires the specified time point and the audio frame information based on that operation, thereby obtaining the sound effect information of the UI animation.
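Steps S510 and S520 can be sketched as follows. This is a hypothetical Python illustration; the timeline representation and field names are assumptions, not the patent's actual format.

```python
# Hypothetical sketch of steps S510/S520: attaching audio frame
# information to a specified time point on an animation timeline.
# The data layout is illustrative only.

def set_audio_frame(timeline: dict, time_point: float, audio_frame: dict):
    """S510: record audio frame information at the specified time point."""
    timeline.setdefault("audio_frames", {})[time_point] = audio_frame
    return timeline

def build_sound_effect_info(timeline: dict):
    """S520: generate the UI animation's sound effect information from
    the time points and their audio frame information, in playback order."""
    return [
        {"time": t, **frame}
        for t, frame in sorted(timeline.get("audio_frames", {}).items())
    ]
```

A caller would invoke `set_audio_frame` once per editing operation and then `build_sound_effect_info` to assemble the complete sound effect information for the animation.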
Referring to fig. 3, the editing operation may include a left-mouse-button click by the user on the animation timeline 320 in the sound effect configuration interface to specify a time point; the sound effect configuration interface determines the specified time point 321 in response to that click and displays a sound effect menu 330. The user may select a sound effect audio 331 in the sound effect menu 330, and the sound effect configuration interface determines the audio frame information in response to that selection.
After the sound effect information corresponding to the animation timeline is acquired, in one embodiment, the step of performing sound effect configuration on the target UI node based on the sound effect information includes: packaging the specified time point and the audio frame information in the sound effect information into a sound effect configuration file of the UI animation.
Specifically, after the sound effect information is obtained, it can be packaged into a sound effect configuration file and stored under the corresponding path of the UI animation. While the target application runs, it automatically loads the sound effect configuration file and, while playing the UI animation, plays the sound effects according to that file. Setting the audio frame information at a specified time point through an editing operation on the animation timeline of the UI animation avoids repeated sound effect configuration work for UI controls that recur across different interactive interfaces, reduces the complexity of sound effect configuration, and improves its efficiency.
Taking Wwise as an example, the audio frame information is specifically a file in the Event format; integrating the audio frame information with the specified time point yields a sound effect configuration file in the Bnk format. The Bnk file is the audio format finally output by Wwise that the game engine can recognize; a Bnk usually contains multiple Events, and the game engine must load the Bnk file first in order to read and play the Events it contains. Binding the Event file and the Bnk file completes the linkage between the sound effect and the UI animation.
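The load-before-play ordering described for Bnk files and their Events can be modeled with a small sketch. This is a Python model of the constraint only, not the actual Wwise SDK API; the class and event names are assumptions.

```python
class SoundBank:
    """Minimal model of a Bnk file: a container of multiple Events that
    must be loaded before any of its Events can be played."""
    def __init__(self, name, events):
        self.name = name
        self.events = set(events)

class GameEngineAudio:
    """Models the game engine side: tracks which Events have become
    available through bank loading."""
    def __init__(self):
        self._loaded_events = {}

    def load_bank(self, bank: SoundBank):
        # The engine loads the Bnk first, making its Events available.
        for event in bank.events:
            self._loaded_events[event] = bank.name

    def post_event(self, event_name: str) -> bool:
        # Playing an Event fails if its containing bank was never loaded.
        return event_name in self._loaded_events
```

Posting an Event before its bank is loaded fails, mirroring the requirement that the game engine load the Bnk file before reading and playing the Events it contains.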
In the sound effect configuration method above, a target UI node of a target application and the UI attribute information corresponding to it are displayed on a sound effect configuration interface; in response to a touch operation applied to the UI attribute information, the UI attribute information is updated and sound effect information is acquired according to the updated UI attribute information; and sound effect configuration is performed on the target UI node based on the sound effect information. Direct editing of the UI node and of its corresponding UI attributes is achieved through the sound effect configuration interface, and the sound effect is bound directly to the UI node, so developers do not need to write code to link sound effects; this reduces the operational difficulty of sound effect configuration and improves its efficiency.
To better implement the sound effect configuration method provided in the embodiments of the present application, a sound effect configuration device is further provided on the basis of that method. As shown in fig. 6, the sound effect configuration device 600 includes:
the display module 610 is configured to display a target UI node of the target application and UI attribute information corresponding to the target UI node on the sound effect configuration interface;
the operation module 620 is configured to update the UI attribute information in response to a touch operation applied to the UI attribute information, and acquire sound effect information according to the updated UI attribute information;
and a configuration module 630, configured to perform sound effect configuration on the target UI node based on the sound effect information.
In some embodiments of the present application, the target UI node is a UI control, and the UI attribute information is a control name of the UI control; the operation module is configured to update the control name in response to an editing operation directed at the control name; extract a sound effect configuration field from the updated control name; and index the sound effect information corresponding to the sound effect configuration field according to that field.
In some embodiments of the present application, the configuration module is configured to obtain audio file information and a sound effect trigger node in the sound effect information; and writing the audio file information and the sound effect trigger node into a control configuration file of the UI control.
In some embodiments of the present application, the operation module is specifically further configured to acquire audio file information of sound effect audio; and constructing sound effect information based on the audio file information corresponding to the sound effect audio and the sound effect trigger node of the sound effect audio, and generating a sound effect configuration field for indexing the sound effect information.
In some embodiments of the present application, the target UI node is a UI animation, and the UI attribute information is an animation timeline of the UI animation; an operation module for setting audio frame information at a specified time point on the animation timeline in response to an editing operation acting on the animation timeline; and generating sound effect information of the UI animation according to the designated time point and the audio frame information.
In some embodiments of the present application, the configuration module is configured to package the specified time point and the audio frame information in the audio information into an audio configuration file of the UI animation.
In some embodiments of the present application, the display module is configured to display a UI node tree of a target application in a sound effect configuration interface; in response to a touch operation acting on the UI node tree, a target UI node is determined in the UI node tree.
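The display module's node-tree flow above can be sketched as follows. This is a hypothetical Python illustration: the node structure and field names are assumptions, not the patent's actual UI representation.

```python
# Hypothetical sketch of the display module's flow: list a UI node tree
# for the sound effect configuration interface, then resolve a touch on
# a tree entry back to the target UI node.

def flatten_node_tree(node, depth=0):
    """Depth-first listing of the tree, as the sound effect
    configuration interface might display it."""
    rows = [(depth, node["name"])]
    for child in node.get("children", []):
        rows.extend(flatten_node_tree(child, depth + 1))
    return rows

def find_target_node(node, selected_name):
    """Resolve the entry touched by the user to the target UI node."""
    if node["name"] == selected_name:
        return node
    for child in node.get("children", []):
        found = find_target_node(child, selected_name)
        if found is not None:
            return found
    return None
```

The interface would render the rows from `flatten_node_tree` and, on a touch operation, call `find_target_node` to determine the target UI node whose attributes are then displayed for editing.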
For the specific limitations of the sound effect configuration device, reference may be made to the limitations on the sound effect configuration method above, which are not repeated here. The modules in the sound effect configuration device may be implemented wholly or partially in software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor in the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can call them and execute the operations corresponding to each module.
In addition, an embodiment of the present application further provides a computer device. The computer device may be a terminal, such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game machine, a Personal Computer (PC), or a Personal Digital Assistant (PDA). As shown in fig. 7, fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 700 includes a processor 701 having one or more processing cores, a memory 702 having one or more computer-readable storage media, and a computer program stored on the memory 702 and executable on the processor. The processor 701 is electrically connected to the memory 702. Those skilled in the art will appreciate that the configuration illustrated in the figure does not limit the computer device, which may include more or fewer components than illustrated, combine some components, or arrange components differently.
The processor 701 is the control center of the computer device 700; it connects the various parts of the entire device using various interfaces and lines, and performs the functions of the computer device 700 and processes its data by running or loading software programs and/or modules stored in the memory 702 and calling data stored therein, thereby monitoring the computer device 700 as a whole.
In this embodiment, the processor 701 in the computer device 700 loads instructions corresponding to the processes of one or more application programs into the memory 702 and executes the application programs stored in the memory 702, so as to implement the following functions:
displaying a target UI node of the target application and UI attribute information corresponding to the target UI node on a sound effect configuration interface;
responding to touch operation acting on the UI attribute information, updating the UI attribute information, and acquiring sound effect information according to the updated UI attribute information;
and performing sound effect configuration on the target UI node based on the sound effect information.
In some embodiments of the present application, the target UI node is a UI control, and the UI attribute information is a control name of the UI control; execution of the application program stored in the memory 702 by the processor 701 may implement the following functions: responding to the editing operation aiming at the control name and updating the control name; extracting a sound effect configuration field from the updated control name; and indexing the sound effect information corresponding to the sound effect configuration field according to the sound effect configuration field.
In some embodiments of the present application, execution of an application program stored in memory 702 by processor 701 may implement the following functions: acquiring audio file information and a sound effect trigger node in the sound effect information; and writing the audio file information and the sound effect trigger node into a control configuration file of the UI control.
In some embodiments of the present application, the execution of the application program stored in the memory 702 by the processor 701 may implement the following functions: acquiring audio file information of sound effect audio; and constructing sound effect information based on the audio file information corresponding to the sound effect audio and the sound effect trigger node of the sound effect audio, and generating a sound effect configuration field for indexing the sound effect information.
In some embodiments of the present application, the target UI node is a UI animation, and the UI attribute information is an animation timeline of the UI animation; execution of the application program stored in the memory 702 by the processor 701 may implement the following functions: setting audio frame information at a specified time point on the animation timeline in response to an editing operation applied to the animation timeline; and generating sound effect information of the UI animation according to the designated time point and the audio frame information.
In some embodiments of the present application, execution of an application program stored in memory 702 by processor 701 may implement the following functions: and packaging the specified time point and the audio frame information in the sound effect information into a sound effect configuration file of the UI animation.
In some embodiments of the present application, execution of an application program stored in memory 702 by processor 701 may implement the following functions: displaying a UI node tree of a target application in a sound effect configuration interface; in response to a touch operation acting on the UI node tree, a target UI node is determined in the UI node tree.
Details of the above operations may be found in the foregoing embodiments and are not repeated here.
Optionally, as shown in fig. 7, the computer device 700 further includes: a touch display screen 703, a radio frequency circuit 704, an audio circuit 705, an input unit 706, and a power supply 707. The processor 701 is electrically connected to the touch display screen 703, the radio frequency circuit 704, the audio circuit 705, the input unit 706, and the power source 707. Those skilled in the art will appreciate that the computer device configuration illustrated in FIG. 7 does not constitute a limitation of the computer device, and may include more or fewer components than illustrated, or some components may be combined, or a different arrangement of components.
The touch display screen 703 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display screen 703 may include a display panel and a touch panel. The display panel may be used, among other things, to display information entered by or provided to a user and various graphical user interfaces of the computer device, which may be made up of graphics, text, icons, video, and any combination thereof. Alternatively, the Display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. The touch panel may be used to collect touch operations of a user on or near the touch panel (for example, operations of the user on or near the touch panel using any suitable object or accessory such as a finger, a stylus pen, and the like), and generate corresponding operation instructions, and the operation instructions execute corresponding programs. Alternatively, the touch panel may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 701, and can receive and execute commands sent by the processor 701. The touch panel may cover the display panel, and when the touch panel detects a touch operation thereon or nearby, the touch panel transmits the touch operation to the processor 701 to determine the type of the touch event, and then the processor 701 provides a corresponding visual output on the display panel according to the type of the touch event. 
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 703 to realize the input and output functions. However, in some embodiments, the touch panel and the display panel can be implemented as two separate components to perform the input and output functions respectively. That is, the touch display screen 703 can also serve as part of the input unit 706 to implement an input function.
The radio frequency circuit 704 may be used to transmit and receive radio frequency signals so as to establish wireless communication with a network device or another computer device, and to exchange signals with that device.
The audio circuit 705 may be used to provide an audio interface between a user and the computer device through speakers and microphones. On one hand, the audio circuit 705 may transmit the electrical signal converted from received audio data to a speaker, which converts it into a sound signal for output; on the other hand, a microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 705 and converted into audio data; the audio data is then output to the processor 701 for processing and sent, for example, to another computer device via the radio frequency circuit 704, or output to the memory 702 for further processing. The audio circuit 705 may also include an earbud jack to allow a peripheral headset to communicate with the computer device.
The input unit 706 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 707 is used to power the various components of the computer device 700. Optionally, the power supply 707 may be logically connected to the processor 701 through a power management system, so as to implement functions of managing charging, discharging, power consumption management, and the like through the power management system. The power supply 707 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown in fig. 7, the computer device 700 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions or by associated hardware controlled by the instructions, which may be stored in a computer readable storage medium and loaded and executed by a processor.
To this end, the present application provides a computer readable storage medium, in which a plurality of computer programs are stored, and the computer programs can be loaded by a processor to execute the steps in any sound effect configuration method provided by the present application. For example, the computer program may perform the steps of:
displaying a target UI node of the target application and UI attribute information corresponding to the target UI node on a sound effect configuration interface;
responding to touch operation acted on the UI attribute information, updating the UI attribute information, and acquiring sound effect information according to the updated UI attribute information;
and performing sound effect configuration on the target UI node based on the sound effect information.
In some embodiments of the present application, the target UI node is a UI control, and the UI attribute information is a control name of the UI control; the computer program may perform the steps of: updating the control name in response to the editing operation aiming at the control name; extracting a sound effect configuration field from the updated control name; and indexing the sound effect information corresponding to the sound effect configuration field according to the sound effect configuration field.
In some embodiments of the present application, the computer program may perform the steps of: acquiring audio file information and a sound effect trigger node in the sound effect information; and writing the audio file information and the sound effect trigger node into a control configuration file of the UI control.
In some embodiments of the present application, the computer program may perform the steps of: acquiring audio file information of sound effect audio; and constructing sound effect information based on the audio file information corresponding to the sound effect audio and the sound effect trigger node of the sound effect audio, and generating a sound effect configuration field for indexing the sound effect information.
In some embodiments of the present application, the target UI node is a UI animation, and the UI attribute information is an animation timeline of the UI animation; the computer program may perform the steps of: setting audio frame information at a specified time point on the animation timeline in response to an editing operation applied to the animation timeline; and generating sound effect information of the UI animation according to the designated time point and the audio frame information.
In some embodiments of the present application, the computer program may perform the steps of: and packaging the specified time point and the audio frame information in the sound effect information into a sound effect configuration file of the UI animation.
In some embodiments of the present application, the computer program may perform the steps of: displaying a UI node tree of a target application in a sound effect configuration interface; in response to a touch operation acting on the UI node tree, a target UI node is determined in the UI node tree.
Details of the above operations may be found in the foregoing embodiments and are not repeated here.
The storage medium may include: Read-Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the computer program stored in the storage medium can execute the steps of any sound effect configuration method provided in the embodiments of the present application, it can achieve the beneficial effects achievable by any of those methods; see the foregoing embodiments for details, which are not repeated here.
The sound effect configuration method, device, computer device, and storage medium provided by the embodiments of the present application have been introduced in detail above. Specific examples are used herein to explain the principles and implementation of the present application, and the description of the above embodiments is only intended to help in understanding the method and its core idea. Meanwhile, those skilled in the art may, following the ideas of the present application, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A sound effect configuration method is characterized by comprising the following steps:
displaying a target UI node of a target application and UI attribute information corresponding to the target UI node on a sound effect configuration interface;
responding to touch operation acted on the UI attribute information, updating the UI attribute information, and acquiring sound effect information according to the updated UI attribute information;
and performing sound effect configuration on the target UI node based on the sound effect information.
2. The method according to claim 1, wherein the target UI node is a UI control, and the UI property information is a control name of the UI control;
the step of updating the UI attribute information in response to the touch operation acting on the UI attribute information and acquiring sound effect information according to the updated UI attribute information comprises the following steps:
updating the control name in response to an editing operation directed at the control name;
extracting a sound effect configuration field from the updated control name;
and indexing sound effect information corresponding to the sound effect configuration field according to the sound effect configuration field.
3. The method according to claim 2, wherein the step of performing sound effect configuration on the target UI node based on the sound effect information comprises:
acquiring audio file information and a sound effect trigger node in the sound effect information;
and writing the audio file information and the sound effect trigger node into a control configuration file of the UI control.
4. The method according to claim 2, wherein the step of extracting the sound effect configuration field from the updated control name is preceded by the step of:
acquiring audio file information of sound effect audio;
and constructing sound effect information based on the audio file information corresponding to the sound effect audio and the sound effect trigger node of the sound effect audio, and generating a sound effect configuration field for indexing the sound effect information.
5. The method of claim 1, wherein the target UI node is a UI animation, and the UI property information is an animation timeline of the UI animation;
the step of updating the UI attribute information in response to the touch operation acting on the UI attribute information and acquiring sound effect information according to the updated UI attribute information comprises the following steps:
setting audio frame information at a specified time point on the animation timeline in response to an editing operation applied to the animation timeline;
and generating sound effect information of the UI animation according to the designated time point and the audio frame information.
6. The method according to claim 5, wherein the step of performing sound effect configuration on the target UI node based on the sound effect information comprises:
and packaging the specified time point in the sound effect information and the audio frame information into a sound effect configuration file of the UI animation.
7. The method according to any one of claims 1 to 6, wherein the step of displaying the target UI node of the target application and the UI attribute information corresponding to the target UI node on the sound effect configuration interface is preceded by:
displaying a UI node tree of a target application in the sound effect configuration interface;
and responding to the touch operation acted on the UI node tree, and determining a target UI node in the UI node tree.
8. An audio configuration apparatus, the apparatus comprising:
the display module is used for displaying a target UI node of a target application and UI attribute information corresponding to the target UI node on a sound effect configuration interface;
the operation module is used for responding to touch operation acted on the UI attribute information, updating the UI attribute information and acquiring sound effect information according to the updated UI attribute information;
and the configuration module is used for carrying out sound effect configuration on the target UI node based on the sound effect information.
9. A computer device, characterized in that the computer device comprises:
one or more processors;
a memory; and
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the processor to implement the sound-effects configuration method of any of claims 1-7.
10. A computer-readable storage medium, having stored thereon a computer program which is loaded by a processor to perform the steps of the sound-effect configuration method of any of claims 1 to 7.
CN202211288141.9A 2022-10-20 2022-10-20 Sound effect configuration method and device, computer equipment and storage medium Pending CN115826904A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211288141.9A CN115826904A (en) 2022-10-20 2022-10-20 Sound effect configuration method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211288141.9A CN115826904A (en) 2022-10-20 2022-10-20 Sound effect configuration method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115826904A true CN115826904A (en) 2023-03-21

Family

ID=85525117

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211288141.9A Pending CN115826904A (en) 2022-10-20 2022-10-20 Sound effect configuration method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115826904A (en)

Similar Documents

Publication Publication Date Title
CN111282268B (en) Plot showing method, plot showing device, plot showing terminal and storage medium in virtual environment
CN111240777B (en) Dynamic wallpaper generation method and device, storage medium and electronic equipment
CN112870724B (en) Resource management method and device, storage medium and electronic equipment
CN111420399A (en) Virtual character reloading method, device, terminal and storage medium
CN111193960A (en) Video processing method and device, electronic equipment and computer readable storage medium
CN113485617A (en) Animation display method and device, electronic equipment and storage medium
CN112329184A (en) Network architecture configuration information generation method and device, storage medium and electronic equipment
CN112163174B (en) Message display method and device, storage medium and computer equipment
CN112843719A (en) Skill processing method, skill processing device, storage medium and computer equipment
CN116546242A (en) Live broadcast control method and device, computer equipment and storage medium
CN113332718B (en) Interactive element query method and device, electronic equipment and storage medium
CN115826904A (en) Sound effect configuration method and device, computer equipment and storage medium
CN112799754B (en) Information processing method, information processing device, storage medium and computer equipment
CN114432696A (en) Special effect configuration method and device of virtual object, storage medium and electronic equipment
CN114546113A (en) Menu operation method and device, storage medium and electronic equipment
CN112843729A (en) Operation parameter determination method and device, computer equipment and storage medium
CN114416234B (en) Page switching method and device, computer equipment and storage medium
CN112783860B (en) Method, device, storage medium and computer equipment for constructing mirror image database
CN113625968B (en) File authority management method and device, computer equipment and storage medium
CN114146418A (en) Game resource processing method and device, computer equipment and storage medium
CN117101121A (en) Game prop repairing method, device, terminal and storage medium
CN106970814B (en) Processing method, device and system for software upgrading
CN115944923A (en) Instance object editing method and device, electronic equipment and storage medium
CN115364487A (en) Game table processing method and device, computer equipment and storage medium
CN114510254A (en) Resource updating method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination