CN114116105A - Control method and device of dynamic desktop, storage medium and electronic device - Google Patents


Info

Publication number
CN114116105A
Authority
CN
China
Prior art keywords
instruction
software
virtual character
dynamic wallpaper
interaction
Prior art date
Legal status
Pending
Application number
CN202111443572.3A
Other languages
Chinese (zh)
Inventor
孙武
Current Assignee
Perfect World Beijing Software Technology Development Co Ltd
Original Assignee
Perfect World Beijing Software Technology Development Co Ltd
Priority date
Filing date
Publication date
Application filed by Perfect World Beijing Software Technology Development Co Ltd
Priority to CN202111443572.3A
Publication of CN114116105A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/16 Sound input; Sound output
    • G06F3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a control method and apparatus for a dynamic desktop, a storage medium, and an electronic device. The method includes: configuring a synchronous interface of dynamic wallpaper in a desktop starter of a target terminal; loading a three-dimensional model of a first virtual character in a dynamic wallpaper process, and detecting an interaction instruction for the first virtual character through the dynamic wallpaper process; and rendering the character image of the three-dimensional model in real time according to the interaction instruction, sending the interaction instruction to the desktop starter through the synchronous interface, and controlling the target terminal according to the interaction instruction. The invention solves the technical problem that terminal desktops in the related art cannot support human-computer interaction; by using the dynamic wallpaper as the display medium and control center of the target terminal, human-computer interaction efficiency is improved.

Description

Control method and device of dynamic desktop, storage medium and electronic device
Technical Field
The invention relates to the technical field of computers, in particular to a method and a device for controlling a dynamic desktop, a storage medium and an electronic device.
Background
In the related art, mobile phones, computers, and tablets can display desktop and lock-screen wallpaper, which is generally a static picture or an animated image and cannot interact with the user or with programs on the terminal.
In other words, the desktop wallpaper displayed on a terminal has poor interactivity: it is only a static or semi-static picture, cannot support interaction or human-computer interaction within the terminal, and offers little playability.
In view of the above problems in the related art, no effective solution has been found at present.
Disclosure of Invention
The embodiment of the invention provides a control method and device of a dynamic desktop, a storage medium and an electronic device.
According to an embodiment of the present invention, there is provided a method for controlling a dynamic desktop, including: configuring a synchronous interface of dynamic wallpaper in a desktop starter of a target terminal; loading a three-dimensional model of a first virtual character in a dynamic wallpaper process, and detecting an interaction instruction for the first virtual character through the dynamic wallpaper process; and rendering the character image of the three-dimensional model in real time according to the interaction instruction, sending the interaction instruction to the desktop starter through the synchronous interface, and controlling the target terminal according to the interaction instruction.
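The three claimed steps can be sketched as a minimal event loop between a wallpaper process and a desktop starter connected by a "synchronous interface". This is a conceptual illustration only; all class and field names are invented, and the interface is modeled as a plain callback rather than any real platform API.

```python
# Conceptual sketch (names are illustrative, not from the patent): the three
# claimed steps as a minimal flow between a wallpaper process and a desktop
# starter connected by a "synchronous interface" (here, a plain callback).

class DesktopStarter:
    """Receives interaction instructions over the synchronous interface."""
    def __init__(self):
        self.received = []

    def on_instruction(self, instruction):      # the "synchronous interface"
        self.received.append(instruction)
        return f"terminal handled {instruction['type']}"

class DynamicWallpaper:
    def __init__(self, sync_interface):
        self.sync_interface = sync_interface    # step 1: interface configured
        self.model = {"character": "first_virtual_character", "pose": "idle"}

    def load_model(self):                       # step 2: load the 3D model
        self.model["loaded"] = True

    def handle(self, instruction):              # step 3: render + forward
        self.model["pose"] = instruction.get("pose", self.model["pose"])
        return self.sync_interface(instruction)

starter = DesktopStarter()
wallpaper = DynamicWallpaper(starter.on_instruction)
wallpaper.load_model()
result = wallpaper.handle({"type": "touch", "pose": "wave"})
```

The point of the sketch is the dual role of the instruction: it both updates the wallpaper's own model state and is forwarded through the interface so the terminal can be controlled.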
Optionally, the detecting, by the dynamic wallpaper process, the interaction instruction for the first virtual character includes at least one of: detecting, by the dynamic wallpaper process, a voice control instruction for the first virtual character; detecting, by the dynamic wallpaper process, a touch instruction for the first virtual character.
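A toy dispatcher for the two instruction types named above might look as follows. The actual detection (microphone capture, touch-screen events) is platform code and is stubbed out; the event dictionary shape is an assumption made for the sketch.

```python
# Hypothetical classifier for raw events into the two instruction types
# named in the claim: voice control and touch. Event fields are invented.

def detect_instruction(event):
    """Classify a raw event as a voice or touch instruction, or ignore it."""
    if event.get("kind") == "audio" and event.get("transcript"):
        return {"type": "voice", "command": event["transcript"]}
    if event.get("kind") == "touch":
        return {"type": "touch", "position": (event["x"], event["y"])}
    return None   # event is not an interaction instruction

voice = detect_instruction({"kind": "audio", "transcript": "wave hello"})
touch = detect_instruction({"kind": "touch", "x": 10, "y": 20})
```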
Optionally, rendering the character image of the three-dimensional model in real time according to the interaction instruction includes: searching an element set of the three-dimensional model for a target model element that matches the interaction instruction, wherein the three-dimensional model includes a plurality of model elements; and adjusting the target model element in real time according to the interaction instruction so as to update the character image of the first virtual character.
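The element-matching step above can be illustrated as a lookup followed by an in-place adjustment. The element names and the matching rule below are assumptions, not the patent's data model.

```python
# Illustrative only: matching an interaction instruction to a model element
# and adjusting it in real time. Elements and fields are invented.

MODEL_ELEMENTS = {
    "head": {"rotation": 0},
    "left_arm": {"rotation": 0},
    "mouth": {"open": 0.0},
}

def find_target_element(instruction):
    """Look up the model element the instruction maps to, if any."""
    return MODEL_ELEMENTS.get(instruction["target"])

def apply_instruction(instruction):
    element = find_target_element(instruction)
    if element is None:
        return False                       # no matching element in the set
    element.update(instruction["adjustment"])   # real-time adjustment
    return True

applied = apply_instruction({"target": "head", "adjustment": {"rotation": 30}})
```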
Optionally, after detecting the interaction instruction for the first virtual character through the dynamic wallpaper process, the method further includes: in response to the interaction instruction, loading a second virtual character in the dynamic wallpaper process; generating a social instruction between the second virtual character and the first virtual character based on the interaction instruction, wherein the social instruction includes a language social instruction and/or an action social instruction; and controlling the second virtual character and the first virtual character based on the social instruction.
Optionally, rendering the character image of the three-dimensional model in real time according to the interactive instruction includes: searching the wearing materials and the virtual animation matched with the interactive instruction in a preset material library; and adding the wearing materials to the three-dimensional model, and generating a dynamic picture of the first virtual character according to the virtual animation.
Optionally, the detecting, by the dynamic wallpaper process, an interaction instruction for the first virtual character includes: acquiring software data of associated software through the dynamic wallpaper process, wherein the software data comprises at least one of: weather data of weather software, calendar data and travel data of calendar software, session data of communication software, version update notification and prop on-line notification of game software, and system notification data of system software; determining the software data as the interaction instruction.
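The claim above treats data pushed by associated software as an interaction instruction in its own right. A minimal wrapper for that idea, with placeholder source names and payloads, could look like this:

```python
# Sketch of treating associated-software data as an interaction instruction.
# Source names and payload shapes are placeholders, not the patent's schema.

SUPPORTED_SOURCES = {"weather", "calendar", "communication", "game", "system"}

def to_interaction_instruction(source, payload):
    """Wrap software data so the wallpaper's instruction handler can consume it."""
    if source not in SUPPORTED_SOURCES:
        raise ValueError(f"unsupported source: {source}")
    return {"type": "software_data", "source": source, "payload": payload}

instr = to_interaction_instruction("weather", {"condition": "rain", "temp_c": 12})
```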
Optionally, rendering the character image of the three-dimensional model in real time according to the interactive instruction includes: analyzing the software data, and extracting text content or audio content in the software data; and displaying the text content at a preset position of the first virtual character, or synchronously rendering the mouth shape of the first virtual character by adopting a preset phoneme model when the audio content is played.
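The two rendering paths above (showing text near the character, or driving the mouth shape from audio) can be sketched as follows. The phoneme-to-mouth-shape table is a made-up stand-in for the "preset phoneme model"; real lip-sync systems use richer viseme sets.

```python
# Hypothetical sketch of the two paths: display extracted text, or map
# phonemes to mouth shapes frame by frame. The table below is invented.

PHONEME_TO_MOUTH = {"a": "wide_open", "m": "closed", "o": "rounded"}

def render_text(character, text):
    """Attach extracted text content to the character at a display position."""
    return {"character": character, "bubble": text}

def mouth_shapes(phonemes):
    """One mouth shape per phoneme; unknown phonemes fall back to neutral."""
    return [PHONEME_TO_MOUTH.get(p, "neutral") for p in phonemes]

shapes = mouth_shapes(["m", "a", "o", "x"])
```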
Optionally, after the text content is displayed at the predetermined position of the first virtual character, the method further includes: detecting a manipulation instruction at the predetermined position; and in response to the manipulation instruction, starting the associated software through the desktop starter and displaying the text content on a display interface of the associated software.
Optionally, controlling the target terminal according to the interaction instruction includes: in response to the interaction instruction, sorting the software installed on the target terminal by use frequency, and displaying abbreviated identifiers of the software whose use frequency is greater than a preset threshold on the display interface of the target terminal, wherein the dynamic wallpaper is the background picture of the display interface.
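The frequency-based shortcut list described above reduces to a sort plus a threshold filter. Usage counts and the threshold below are illustrative values only.

```python
# Minimal sketch of the frequency-based shortcut list: sort installed
# software by use frequency and keep entries above a preset threshold.

def shortcut_icons(usage_counts, threshold):
    """Return software names sorted by descending use frequency, filtered."""
    ranked = sorted(usage_counts.items(), key=lambda kv: kv[1], reverse=True)
    return [name for name, count in ranked if count > threshold]

icons = shortcut_icons({"chat": 42, "maps": 5, "camera": 17}, threshold=10)
```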
Optionally, before acquiring software data of associated software through the dynamic wallpaper process, the method further includes: configuring a subscription relationship between the associated software and the dynamic wallpaper, wherein the subscription relationship indicates that the dynamic wallpaper has data acquisition permission for the associated software; and creating a data synchronization channel from the associated software to the dynamic wallpaper based on the subscription relationship.
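The subscription relationship acts as a permission gate on the data channel: only subscribed software may push data to the wallpaper. A conceptual sketch, with invented class and method names, might look like this:

```python
# Sketch of the subscription gate: only software with a subscription
# relationship to the wallpaper may push data through the channel.
# Class and method names are assumptions, not the patent's API.

class DataChannel:
    def __init__(self):
        self.subscriptions = set()
        self.inbox = []

    def subscribe(self, software):               # configure the relationship
        self.subscriptions.add(software)

    def push(self, software, data):              # channel enforces permission
        if software not in self.subscriptions:
            return False                          # no acquisition permission
        self.inbox.append((software, data))
        return True

channel = DataChannel()
channel.subscribe("weather_app")
ok = channel.push("weather_app", {"temp_c": 12})
denied = channel.push("unknown_app", {"x": 1})
```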
Optionally, controlling the target terminal according to the interactive instruction includes: positioning an image block to which a touch position of the interactive instruction belongs, wherein the first virtual character comprises a plurality of image blocks; generating a terminal control instruction corresponding to the image block based on a preset mapping relation; and executing the terminal control instruction on the target terminal.
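The touch-to-command path above has two lookups: a touch position resolves to an image block of the character, and a preset mapping turns the block into a terminal control instruction. The block rectangles and the mapping table below are invented for illustration.

```python
# Illustrative mapping from a touch point to a character image block and
# then to a terminal control instruction. All coordinates and commands
# are made up; the real blocks would come from the character model.

IMAGE_BLOCKS = {                 # (x0, y0, x1, y1) per block
    "head": (40, 0, 80, 30),
    "torso": (35, 30, 85, 90),
}
BLOCK_TO_COMMAND = {"head": "open_settings", "torso": "open_app_list"}

def locate_block(x, y):
    """Find the image block containing the touch position, if any."""
    for name, (x0, y0, x1, y1) in IMAGE_BLOCKS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def terminal_command(x, y):
    """Resolve a touch position to a terminal control instruction."""
    block = locate_block(x, y)
    return BLOCK_TO_COMMAND.get(block)

cmd = terminal_command(60, 10)
```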
Optionally, after rendering the character image of the three-dimensional model in real time according to the interaction instruction, the method further includes: detecting state data of the target terminal, wherein the state data indicates the device state of the target terminal and the running state of software; and updating the character image of the three-dimensional model in real time according to the state data.
Optionally, the method further includes: detecting an interaction request sent by a fourth virtual character to the first virtual character, wherein the fourth virtual character is a player-controlled character (PCC) or a non-player-controlled character (NPC); and in response to the interaction request, calling the dynamic wallpaper process to control the fourth virtual character.
Optionally, the interaction instruction is a live-action image acquired by the dynamic wallpaper process, and rendering the character image of the three-dimensional model in real time according to the interaction instruction includes: calling the dynamic wallpaper process to transmit the live-action image to a search engine; reading a retrieval result recalled by the search engine based on the live-action image; extracting rendering parameters from the retrieval result; and rendering the three-dimensional model of the first virtual character using the rendering parameters.
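The live-action-image path can be sketched end to end as: send the captured photo to a search engine, recall a result, and reuse rendering parameters from it. The search call below is faked with a lookup table; everything here is invented to show the data flow only.

```python
# Sketch of the live-action-image path. The "search engine" is a stub
# lookup table standing in for an image retrieval service; parameters
# and image names are placeholders.

FAKE_SEARCH_INDEX = {
    "snowy_street.jpg": {"lighting": "overcast", "palette": "cool"},
}

def search_engine(image_name):
    """Stub recall: return rendering parameters for a known scene image."""
    return FAKE_SEARCH_INDEX.get(image_name, {})

def render_with_scene(model, image_name):
    params = search_engine(image_name)       # recall + extract parameters
    return {**model, **params}               # apply parameters to the model

rendered = render_with_scene({"character": "c1"}, "snowy_street.jpg")
```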
According to another embodiment of the present invention, there is provided a control apparatus for a dynamic desktop, including: a configuration module, configured to configure a synchronous interface of dynamic wallpaper in a desktop starter of a target terminal; a first detection module, configured to load a three-dimensional model of a first virtual character in a dynamic wallpaper process and detect an interaction instruction for the first virtual character through the dynamic wallpaper process; and a first control module, configured to render the character image of the three-dimensional model in real time according to the interaction instruction, send the interaction instruction to the desktop starter through the synchronous interface, and control the target terminal according to the interaction instruction.
Optionally, the first detection module includes at least one of: a first detection unit, configured to detect, through the dynamic wallpaper process, a voice control instruction for the first virtual character; and the second detection unit is used for detecting a touch instruction aiming at the first virtual role through the dynamic wallpaper process.
Optionally, the first control module includes: a searching unit, configured to search, in an element set of the three-dimensional model, for a target model element that matches the interactive instruction, where the three-dimensional model includes a plurality of model elements; and the adjusting unit is used for adjusting the target model element in real time according to the interactive instruction so as to update the role image of the first virtual role.
Optionally, the apparatus further comprises: the loading module is used for responding to the interaction instruction and loading a second virtual character in the dynamic wallpaper process after the first detection module detects the interaction instruction aiming at the first virtual character through the dynamic wallpaper process; a generating module, configured to generate a social instruction between the second virtual character and the first virtual character based on the interaction instruction, where the social instruction includes a language social instruction and/or an action social instruction; a second control module to control the second virtual character and the first virtual character based on the social instructions.
Optionally, the first control module includes: the searching unit is used for searching the wearing materials and the virtual animation matched with the interactive instructions in a preset material library; and the generating unit is used for adding the wearing materials to the three-dimensional model and generating a dynamic picture of the first virtual character according to the virtual animation.
Optionally, the first detecting module includes: an obtaining unit, configured to obtain software data of associated software through the dynamic wallpaper process, where the software data includes at least one of: weather data of weather software, calendar data and travel data of calendar software, session data of communication software, version update notification and prop on-line notification of game software, and system notification data of system software; a determining unit, configured to determine the software data as the interactive instruction.
Optionally, the first control module includes: the extraction unit is used for analyzing the software data and extracting the text content or the audio content in the software data; and the processing unit is used for displaying the text content at a preset position of the first virtual character, or synchronously rendering the mouth shape of the first virtual character by adopting a preset phoneme model when the audio content is played.
Optionally, the first control module further includes: the detection unit is used for detecting the control instruction at a preset position after the processing unit displays the text content at the preset position of the first virtual character; and the display unit is used for responding to the control instruction, starting the associated software through the desktop starter, and displaying the text content on a display interface of the associated software.
Optionally, the first control module includes: and the control unit is used for responding to the interaction instruction, sequencing the plurality of pieces of software installed on the target terminal based on the use frequency, and displaying the abbreviated identifications of the plurality of pieces of software with the use frequency larger than a preset threshold value on a display interface of the target terminal, wherein the dynamic wallpaper is a background picture of the display interface.
Optionally, the apparatus further comprises: an association unit, configured to configure a subscription relationship between the associated software and the dynamic wallpaper before the software data of the associated software is acquired through the dynamic wallpaper process, wherein the subscription relationship indicates that the dynamic wallpaper has data acquisition permission for the associated software; and a creating unit, configured to create a data synchronization channel from the associated software to the dynamic wallpaper based on the subscription relationship.
Optionally, the first control module includes: the positioning unit is used for positioning an image block to which the touch position of the interactive instruction belongs, wherein the first virtual character comprises a plurality of image blocks; the generating unit is used for generating a terminal control instruction corresponding to the image block based on a preset mapping relation; and the execution unit is used for executing the terminal control instruction on the target terminal.
Optionally, the apparatus further comprises: a second detection module, configured to detect state data of the target terminal after the first control module renders the character image of the three-dimensional model in real time according to the interaction instruction, wherein the state data indicates the device state of the target terminal and the running state of software; and an updating module, configured to update the character image of the three-dimensional model in real time according to the state data.
Optionally, the apparatus further comprises: a third detection module, configured to detect an interaction request sent by a fourth virtual character to the first virtual character, wherein the fourth virtual character is a player-controlled character (PCC) or a non-player-controlled character (NPC); and a third control module, configured to call the dynamic wallpaper process to control the fourth virtual character in response to the interaction request.
Optionally, the interactive instruction is a live-action image acquired by the dynamic wallpaper process, and the first control module includes: the calling unit is used for calling the dynamic wallpaper process to transmit the live-action image to a search engine; a reading unit, configured to read a retrieval result recalled by the search engine based on the live-action image; the extraction unit is used for extracting the rendering parameters in the retrieval result; and the rendering unit is used for rendering the three-dimensional model of the first virtual character by adopting the rendering parameters.
According to a further embodiment of the invention, a storage medium is also provided, in which a computer program is stored, wherein the computer program is arranged to perform the steps of any of the method embodiments when executed.
According to yet another embodiment of the present invention, there is also provided an electronic device, comprising a memory in which a computer program is stored and a processor arranged to run the computer program to perform the steps of any of the method embodiments.
According to the invention, a synchronous interface of the dynamic wallpaper is configured in the desktop starter of the target terminal; a three-dimensional model of a first virtual character is loaded in a dynamic wallpaper process, and an interaction instruction for the first virtual character is detected through the dynamic wallpaper process; the character image of the three-dimensional model is rendered in real time according to the interaction instruction, the interaction instruction is sent to the desktop starter through the synchronous interface, and the target terminal is controlled according to the interaction instruction. By configuring the synchronous interface of the dynamic wallpaper and detecting interaction instructions for the first virtual character in the dynamic wallpaper process, both image updates of the first virtual character and terminal control are realized. This solves the technical problem that terminal desktops in the related art cannot support human-computer interaction; the dynamic wallpaper serves as both the display medium and the control center of the target terminal, improving human-computer interaction efficiency.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a block diagram of a hardware structure of a mobile phone for controlling a dynamic desktop according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a method for controlling a dynamic desktop according to an embodiment of the present invention;
FIG. 3 is a pictorial diagram of a first virtual character in an embodiment of the present invention;
FIG. 4 is a distribution diagram of image blocks of a first virtual character according to an embodiment of the present invention;
FIG. 5 is a block diagram of a control device of a dynamic desktop according to an embodiment of the present invention;
fig. 6 is a block diagram of an electronic device according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
The method provided by the first embodiment of the present application may be executed in a mobile phone, a tablet, a server, a computer, or a similar electronic terminal. Taking the operation on a mobile phone as an example, fig. 1 is a block diagram of a hardware structure of a mobile phone for controlling a dynamic desktop according to an embodiment of the present invention. As shown in fig. 1, the handset may include one or more (only one shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA) and a memory 104 for storing data, and optionally may also include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is merely illustrative and not limiting to the structure of the mobile phone. For example, a cell phone may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store a mobile phone program, for example, software programs and modules of application software, such as the program corresponding to the control method of a dynamic desktop in an embodiment of the present invention; the processor 102 executes various functional applications and data processing by running the program stored in the memory 104, thereby implementing the method. The memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile phone over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof. In this embodiment, the processor 102 is configured to control the first virtual character to perform a designated operation to complete a game task in response to human-machine interaction instructions and the game policy. The memory 104 is used for storing program scripts, configuration information, attribute information of virtual characters, and the like.
The transmission device 106 is used to receive or transmit data via a network. The specific example of the network may include a wireless network provided by a communication provider of a mobile phone. In one example, the transmission device 106 includes a Network adapter (NIC) that can be connected to other Network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used to communicate with the internet in a wireless manner.
Optionally, the input/output device 108 further includes a human-computer interaction screen for acquiring human-computer interaction instructions through a human-computer interaction interface and presenting streaming media pictures.
in this embodiment, a method for controlling a dynamic desktop is provided, and fig. 2 is a schematic flowchart of a method for controlling a dynamic desktop according to an embodiment of the present invention, as shown in fig. 2, the flowchart includes the following steps:
step S202, configuring a synchronous interface of the dynamic wallpaper in a desktop starter of a target terminal;
the synchronous interface is used for sending an interactive instruction detected by the dynamic wallpaper to the desktop starter and controlling the target terminal, and the desktop starter can adopt the interactive instruction to control operating system software of the target terminal and other software installed on the target terminal.
Step S204, loading a three-dimensional model of a first virtual character in a dynamic wallpaper process, and detecting an interaction instruction aiming at the first virtual character through the dynamic wallpaper process;
optionally, after loading the three-dimensional model of the first virtual character, an initial character image of the three-dimensional model may also be loaded, and an image parameter of the initial character image may be generated by presetting or by reading state information of the target terminal, and rendered to the three-dimensional model of the first virtual character.
Step S206, rendering the role image of the three-dimensional model in real time according to the interactive instruction, sending the interactive instruction to the desktop starter through the synchronous interface, and controlling the target terminal according to the interactive instruction;
the desktop of the target terminal is used as a display element on the screen, and a user can see the desktop of the mobile phone when starting up, lightening the screen, quitting the software and switching the software, and the desktop is used as a display medium and a control center of the target terminal.
Through the above steps, a synchronous interface of the dynamic wallpaper is configured in the desktop starter of the target terminal; a three-dimensional model of a first virtual character is loaded in a dynamic wallpaper process, and an interaction instruction for the first virtual character is detected through the dynamic wallpaper process; the character image of the three-dimensional model is rendered in real time according to the interaction instruction, the interaction instruction is sent to the desktop starter through the synchronous interface, and the target terminal is controlled according to the interaction instruction. Because the synchronous interface of the dynamic wallpaper is configured and the dynamic wallpaper process detects interaction instructions for the first virtual character, both image updates of the first virtual character and terminal control are realized. This solves the technical problem that terminal desktops in the related art cannot support human-computer interaction; the dynamic wallpaper serves as both the display medium and the control center of the target terminal, improving human-computer interaction efficiency.
In one implementation of this embodiment, detecting the interaction instruction for the first virtual character through the dynamic wallpaper process includes at least one of: detecting a voice control instruction for the first virtual character through the dynamic wallpaper process; and detecting a touch instruction for the first virtual character through the dynamic wallpaper process.
In this embodiment, the dynamic wallpaper serves as a control component of a terminal such as a mobile phone. As long as all or part of the dynamic wallpaper (e.g., the character in it) is not blocked, the wallpaper remains in an activated state, and the user can interact through the phone, for example by voice, without an explicit wake-up, with the voice control instruction captured in real time. Besides voice and touch instructions, body-state features of the user (e.g., gestures, mouth shapes) can also be captured in real time through a front-facing camera.
When detecting the voice control instruction, a wake-up instruction for the first virtual character may first be detected, and the voice control instruction for the first virtual character is then detected only after the wake-up instruction has been detected.
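The two-stage detection above (wake-up first, then voice control) can be sketched as a small gate; the wake word and instruction strings are assumed placeholders for whatever the speech recognizer would emit.

```java
/** Sketch of two-stage voice detection: a wake-up instruction must be seen before
 *  voice control instructions for the first virtual character are accepted. */
class VoiceGate {
    private boolean awake = false;
    private final String wakeWord;

    VoiceGate(String wakeWord) { this.wakeWord = wakeWord; }

    /** Returns the accepted control instruction, or null while still waiting for wake-up. */
    String onUtterance(String utterance) {
        if (!awake) {
            if (wakeWord.equals(utterance)) awake = true;  // wake-up instruction detected
            return null;                                    // not yet a control instruction
        }
        return utterance;  // subsequent utterances are treated as voice control instructions
    }
}
```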
The interaction instructions for the first virtual character include voice control instructions and touch-screen touch instructions. After the first virtual character receives a touch instruction, collision detection and rendering updates can be performed by a physics engine as in a virtual game; adding such a physics animation system improves the precision of desktop rendering and the realism of the animation, completes the simulated collision detection after the touch-screen input is detected, and thereby completes the interaction.
In one implementation of this embodiment, rendering the character image of the three-dimensional model in real time according to the interaction instruction includes: searching an element set of the three-dimensional model for a target model element matching the interaction instruction, where the three-dimensional model includes a plurality of model elements; and adjusting the target model element in real time according to the interaction instruction so as to update the character image of the first virtual character.
In one example, the element set of the three-dimensional model includes N model elements, e.g., N = 10, and there are M interaction instructions, e.g., M = 5, each corresponding to at least one model element. For example, a double-click instruction corresponds to the eyes and mouth of the three-dimensional model; when the detected interaction instruction is a double-click instruction, the eyes and mouth of the three-dimensional model are adjusted to update the character image of the first virtual character.
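The instruction-to-element lookup in this example can be sketched as a preset mapping; the instruction names and element names below are illustrative assumptions (only the double-click/eyes-and-mouth pairing comes from the example above).

```java
import java.util.List;
import java.util.Map;

/** Illustrative mapping from interaction instructions to the model elements they adjust. */
class ElementMatcher {
    static final Map<String, List<String>> INSTRUCTION_TO_ELEMENTS = Map.of(
            "double_click", List.of("eyes", "mouth"),  // the pairing named in the example
            "swipe", List.of("hair"),                  // assumed additional pairings
            "long_press", List.of("left_arm", "right_arm"));

    /** Target model elements matched by an interaction instruction (empty if none match). */
    static List<String> targetElements(String instruction) {
        return INSTRUCTION_TO_ELEMENTS.getOrDefault(instruction, List.of());
    }
}
```

Adjusting each returned element then updates the character image for that instruction.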
Fig. 3 is a schematic diagram of character images of a first virtual character in an embodiment of the present invention. The face model includes 5 model elements, corresponding to the hair style, eyes, nose, ears, and mouth, respectively. Four character images are illustrated in the diagram, corresponding to four interaction instructions; image 1 is the initial image, and when switching from image 1 to image 2, three target model elements (the eyes, nose, and mouth) are adjusted.
In some examples of this embodiment, after detecting the interaction instruction for the first virtual character through the dynamic wallpaper process, the method further includes: loading a second virtual character in the dynamic wallpaper process in response to the interaction instruction; generating a social instruction between the second virtual character and the first virtual character based on the interaction instruction, where the social instruction includes a language social instruction and/or an action social instruction; and controlling the second virtual character and the first virtual character based on the social instruction.
In this embodiment, besides rendering and updating the first virtual character, a new second virtual character may be added to the dynamic wallpaper; the second virtual character is loaded and performs action and language interaction with the first virtual character. Optionally, after the second virtual character and the first virtual character are loaded, further interaction instructions (e.g., a second interaction instruction) may be detected, which also serve as social instructions between the second virtual character and the first virtual character; alternatively, the initially detected interaction instruction is configured directly as the social instruction. The social instruction drives interaction between the second virtual character and the first virtual character within the dynamic wallpaper, such as conversing, or singing and dancing. The second virtual character can be regarded as an enhanced character of the first virtual character, used to assist the first virtual character in completing more numerous and more complex interactive actions and to realize more operations on the target terminal, such as setting, triggering, and starting.
In one implementation of this embodiment, rendering the character image of the three-dimensional model in real time according to the interaction instruction includes: searching a preset material library for wearing materials and a virtual animation matching the interaction instruction; and adding the wearing materials to the three-dimensional model and generating a dynamic picture of the first virtual character according to the virtual animation.
When displaying, updating, and rendering the animation of the virtual character, some action animations and wearing materials are preconfigured for the interaction instructions and updated and rendered accordingly, and the animation of the virtual character is loaded according to the system operation being executed. Generating the dynamic picture of the first virtual character according to the virtual animation includes: loading a preset dynamic picture and playing it as a special effect, where several sets of dynamic pictures are preset for the first virtual character, each set corresponding to one interaction instruction; and reading a driving parameter queue, where the driving parameter queue includes a plurality of groups of driving parameters arranged in time order, transmitting the driving parameter queue to a physics engine, and controlling the physics engine to read the driving parameters from the queue in sequence and render them in real time.
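The driving parameter queue described above can be sketched as follows. The physics engine is stubbed out as a log of render calls, and the frame structure (a timestamp plus a group of bone parameters) is an assumption about what a driving-parameter group would contain.

```java
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

/** Sketch of the driving-parameter queue: groups of parameters arranged in time
 *  order are handed to a (mocked) physics engine, which reads them in sequence. */
class DriveQueue {
    static class Frame {
        final long timestampMs;
        final double[] boneParams;
        Frame(long timestampMs, double[] boneParams) {
            this.timestampMs = timestampMs;
            this.boneParams = boneParams;
        }
    }

    /** Drains the queue in order; a log entry stands in for each real-time render call. */
    static List<String> playback(Deque<Frame> queue) {
        List<String> rendered = new ArrayList<>();
        while (!queue.isEmpty()) {
            Frame f = queue.pollFirst();         // the engine reads parameter groups in time order
            rendered.add("t=" + f.timestampMs);
        }
        return rendered;
    }
}
```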
Optionally, detecting the interaction instruction for the first virtual character through the dynamic wallpaper process includes: acquiring software data of associated software through the dynamic wallpaper process, where the software data includes at least one of the following: weather data of weather software; calendar data and travel data of calendar software; session data of communication software; version update notifications and prop on-line notifications of game software; and system notification data of system software; and determining the software data as the interaction instruction.
With the method of this embodiment, state changes inside an APP (for example, receipt of an important APP notification) are presented through the virtual character: the software data is transmitted to the virtual character process for handling through the synchronous interface of the dynamic wallpaper.
For example, when a certain character in a game has a skin update, a notification can be delivered through the virtual character on the dynamic wallpaper to tell the user that the character's skin has been updated. Notification information of a terminal operating system such as a mobile phone (e.g., missed calls, unread messages), travel information, alarm-clock information, and the like can be obtained in real time and announced and displayed through the first virtual character on the dynamic wallpaper, for example as bubbles or sticky notes, with a bubble dismissed after the user has read it. Different types of notification information can be displayed at different positions of the virtual character in different forms, ordered by priority, time limit, and so on. This changes the traditional pop-up and pinned notification mode: the user sees the desired notification messages simply by unlocking the phone.
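The "ordered by priority, time limit" step above can be sketched as a comparator; the notification fields and the exact tie-breaking rule (higher priority first, then earlier expiry first) are illustrative assumptions, since the text does not fix a concrete ordering.

```java
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

/** Sketch of ordering the notifications shown on the character by priority and time limit. */
class NotificationSorter {
    static class Note {
        final String text; final int priority; final long expiresAtMs;
        Note(String text, int priority, long expiresAtMs) {
            this.text = text; this.priority = priority; this.expiresAtMs = expiresAtMs;
        }
    }

    /** Higher priority first; among equal priorities, the notification expiring sooner first. */
    static List<String> order(List<Note> notes) {
        return notes.stream()
                .sorted(Comparator.comparingInt((Note n) -> -n.priority)
                        .thenComparingLong(n -> n.expiresAtMs))
                .map(n -> n.text)
                .collect(Collectors.toList());
    }
}
```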
In this example, rendering the character image of the three-dimensional model in real time according to the interaction instruction includes: parsing the software data and extracting the text content or audio content in it; and displaying the text content at a predetermined position of the first virtual character, or, while playing the audio content, synchronously rendering the mouth shape of the first virtual character using a preset phoneme model.
In some examples, the first virtual character may further serve as an intermediate bridge for jumping to the associated software. After the text content is displayed at the predetermined position of the first virtual character, the method further includes: detecting a manipulation instruction at the predetermined position; and, in response to the manipulation instruction, starting the associated software through the desktop launcher and displaying the text content on a display interface of the associated software.
Information obtained from the plurality of applications associated with the first virtual character on the desktop can change the UI form of the first virtual character on the terminal desktop. For example, the character's appearance changes: when data obtained from a weather application indicates a low temperature, the first virtual character performs an action of feeling cold; when data obtained from the calendar indicates that today is a holiday, the first virtual character performs a holiday-celebration action. Alternatively, the first virtual character on the desktop directly broadcasts the acquired information by voice: for example, if the user sets the first virtual character to broadcast the news in application A at a fixed time every day, the character broadcasts the news at that time; the first virtual character of the dynamic desktop announces messages received in WeChat by voice; and the first virtual character gives reminders for the schedule times the user has set in the calendar application.
Icons of the apps installed in the mobile phone can be revealed through interaction with the first virtual character of the desktop. For example, only the icons of apps the user uses frequently may be kept on the desktop alongside the first virtual character, and desktop management is implemented by clicking the first virtual character to further display the icons of other apps at other positions in the desktop, and so on.
In one implementation of this embodiment, controlling the target terminal according to the interaction instruction includes: in response to the interaction instruction, sorting the plurality of pieces of software installed on the target terminal by use frequency, and displaying on the display interface of the target terminal the abbreviated identifiers of those pieces of software whose use frequency is greater than a preset threshold, where the dynamic wallpaper is the background picture of the display interface.
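The frequency-threshold filtering and sorting step can be sketched as follows; the app names and the count source are assumptions (a real implementation would obtain use counts from the terminal's usage statistics).

```java
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

/** Sketch of frequency-based sorting: keep apps used more often than a preset
 *  threshold, most frequent first; their abbreviated identifiers would then be
 *  shown over the wallpaper background. */
class AppRanker {
    static List<String> frequentApps(Map<String, Integer> useCounts, int threshold) {
        return useCounts.entrySet().stream()
                .filter(e -> e.getValue() > threshold)  // use frequency greater than threshold
                .sorted(Map.Entry.<String, Integer>comparingByValue(Comparator.reverseOrder()))
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }
}
```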
Optionally, before acquiring the software data of the associated software through the dynamic wallpaper process, the method further includes: configuring a subscription relationship between the associated software and the dynamic wallpaper, where the subscription relationship indicates that the dynamic wallpaper has permission to acquire data from the associated software; and creating a data synchronization channel from the associated software to the dynamic wallpaper based on the subscription relationship.
For example, when a certain application is installed, the user can choose whether to associate it with the dynamic wallpaper; after the association, data in the application can be pulled or called through the dynamic wallpaper process, realizing monitoring and desktop presentation.
As software installed on the mobile phone, the dynamic wallpaper can interact with other software once its process is started. For example, the dynamic wallpaper is associated with other software by account login, subscription authorization, and similar means, obtains control permission over that software, and acquires some of its software data from the server associated with the software; the software data is then displayed through the dynamic wallpaper and presented by the virtual character, achieving efficient notification, for example of WeChat notification messages, software update information, and state information.
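The gatekeeping role of the subscription relationship can be sketched as a small registry: data flows from an associated app to the wallpaper only after the subscription (authorization) has been configured. The string-based API is an illustrative stand-in for a real permission and data-channel mechanism.

```java
import java.util.HashSet;
import java.util.Set;

/** Sketch of the subscription relationship gating the data synchronization channel. */
class SubscriptionRegistry {
    private final Set<String> subscribed = new HashSet<>();

    /** Configure the subscription relationship for an associated app. */
    void subscribe(String app) { subscribed.add(app); }

    /** Returns the software data if a channel exists, otherwise null (no subscription, no channel). */
    String pull(String app, String data) {
        return subscribed.contains(app) ? data : null;
    }
}
```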
In one implementation of this embodiment, controlling the target terminal according to the interaction instruction includes: locating the image block to which the touch position of the interaction instruction belongs, where the first virtual character includes a plurality of image blocks; generating a terminal control instruction corresponding to the image block based on a preset mapping relationship; and executing the terminal control instruction on the target terminal.
Fig. 4 is a distribution diagram of the image blocks of a first virtual character in an embodiment of the present invention, which includes 6 image blocks corresponding respectively to the head, left upper limb, left lower limb, right upper limb, right lower limb, and handheld prop. When the user touches a certain position and thereby generates an interaction instruction, the corresponding terminal control instruction is generated; for example, clicking block 1 generates a pull-down instruction for the notification bar.
Interaction is performed through a certain part or region of the first virtual character in the dynamic desktop: for example, clicking the hand opens software A, clicking a leg switches the display page, and clicking the eyes turns off the screen. Further, the action responses of the virtual character on the desktop, such as conversations, can be controlled based on the user's control commands, and the APP the user wants to open can be brought up from a certain part of the virtual character.
Optionally, the image blocks may correspond one-to-one to the model elements in the above embodiments, or may be configured differently.
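The preset mapping from image blocks to terminal control instructions can be sketched as a lookup table. Only the head/notification-bar, hand/software A, leg/page-switch, and eyes/screen-off pairings come from the examples above; the block names and instruction strings themselves are assumptions.

```java
import java.util.Map;

/** Illustrative block-to-instruction mapping in the spirit of Fig. 4. */
class BlockDispatcher {
    static final Map<String, String> BLOCK_TO_INSTRUCTION = Map.of(
            "head", "pull_down_notification_bar",
            "left_hand", "open_software_a",
            "left_leg", "switch_display_page",
            "eyes", "screen_off");

    /** Terminal control instruction for the touched image block ("none" if unmapped). */
    static String instructionFor(String imageBlock) {
        return BLOCK_TO_INSTRUCTION.getOrDefault(imageBlock, "none");
    }
}
```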
In one implementation of this embodiment, after rendering the character image of the three-dimensional model in real time according to the interaction instruction, the method further includes: detecting state data of the target terminal, where the state data indicates the device state of the target terminal and the running state of software; and updating the character image of the three-dimensional model in real time according to the state data.
When the target terminal is in different states, the virtual character on the mobile phone desktop shows different actions or states. The system is preconfigured with pictures or animations of the virtual character for multiple states and switches among them accordingly: for example, when a song is playing, the first virtual character wears headphones; when the system is silent, the first virtual character sleeps; and when the user is running, the wearing materials of the first virtual character switch to sportswear, and so on.
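The state-to-appearance switching can be sketched as another preset table; the state names below are invented labels for the three examples just given, and a real implementation would read them from the terminal's device and software state.

```java
import java.util.Map;

/** Sketch of switching the character's preconfigured appearance with the terminal state. */
class StateAppearance {
    static final Map<String, String> STATE_TO_APPEARANCE = Map.of(
            "playing_music", "wearing_headphones",
            "silent", "sleeping",
            "user_running", "sportswear");

    /** Appearance preset for a terminal state; falls back to a default image. */
    static String appearanceFor(String terminalState) {
        return STATE_TO_APPEARANCE.getOrDefault(terminalState, "default");
    }
}
```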
In an embodiment of this embodiment, the method further includes: detecting an interaction request sent by a fourth virtual Character to the first virtual Character, wherein the fourth virtual Character is a Player-Controlled Character (PCC) or a Non-Player-Controlled Character (NPC); and responding to the interaction request, and calling the dynamic wallpaper process to control the fourth virtual role.
The fourth virtual character is a PCC controlled by the user, or an NPC, and interacts with the first virtual character (similarly to character interaction in a game). Between the virtual characters, system operation instructions, AI voice companionship, and virtual-character education services can be carried out (the latter can be purchased by module, provide an entry for the corresponding service, and guide the user to the corresponding APP to load content). This realizes interaction between the two virtual characters within the application, with the interactive interface displayed through the dynamic wallpaper or through the application interface where the fourth virtual character resides; for example, the dynamic wallpaper process is controlled to periodically start a game process and control the PCC in that game process according to AI for growth training.
In one application scenario of this embodiment, the interaction instruction is a live-action image acquired by the dynamic wallpaper process, and rendering the character image of the three-dimensional model in real time according to the interaction instruction includes: calling the dynamic wallpaper process to transmit the live-action image to a search engine; reading the retrieval result recalled by the search engine based on the live-action image; extracting rendering parameters from the retrieval result; and rendering the three-dimensional model of the first virtual character using the rendering parameters.
An AR live scene can also be loaded in the virtual character process so that the user interacts with the virtual character within the live scene; image recognition is performed in the AR scene, and the corresponding APP is called to perform system operations such as inputting and retrieving the recognized keywords. For example, when a picture of a piece of clothing is recognized, the virtual character initiates recognition of that picture in the Taobao APP, the Taobao APP is started in the background, and the picture in the retrieval result is loaded; further, the virtual character can load the virtual prop corresponding to that clothing and display it as a try-on.
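The live-action path (transmit image, read recalled result, extract rendering parameters, render) can be sketched end to end with both external stages stubbed out as functions; no real search engine or recognizer is involved, and the string formats are purely illustrative.

```java
import java.util.function.Function;

/** Sketch of the live-action pipeline: image -> search engine -> retrieval result
 *  -> rendering parameters -> rendered model. Both stages are caller-supplied stubs. */
class LiveActionPipeline {
    static String render(String liveActionImage,
                         Function<String, String> searchEngine,      // image -> retrieval result
                         Function<String, String> paramExtractor) {  // result -> rendering parameters
        String result = searchEngine.apply(liveActionImage);  // transmit image, read recalled result
        String params = paramExtractor.apply(result);         // extract rendering parameters
        return "model rendered with " + params;               // render the first virtual character
    }
}
```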
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
In this embodiment, a control device of a dynamic desktop is further provided for implementing the embodiments and the preferred embodiments, which have already been described and are not described again. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 5 is a block diagram illustrating a control apparatus of a dynamic desktop according to an embodiment of the present invention, and as shown in fig. 5, the apparatus includes: a configuration module 50, a first detection module 52, a first control module 54, wherein,
a configuration module 50, configured to configure a synchronous interface of the dynamic wallpaper in the desktop launcher of the target terminal;
a first detection module 52, configured to load a three-dimensional model of a first virtual character in a dynamic wallpaper process, and detect an interaction instruction for the first virtual character through the dynamic wallpaper process;
and the first control module 54 is configured to render the character image of the three-dimensional model in real time according to the interactive instruction, send the interactive instruction to the desktop starter through the synchronous interface, and control the target terminal according to the interactive instruction.
Optionally, the first detection module includes at least one of: a first detection unit, configured to detect, through the dynamic wallpaper process, a voice control instruction for the first virtual character; and the second detection unit is used for detecting a touch instruction aiming at the first virtual role through the dynamic wallpaper process.
Optionally, the first control module includes: a searching unit, configured to search, in an element set of the three-dimensional model, for a target model element that matches the interactive instruction, where the three-dimensional model includes a plurality of model elements; and the adjusting unit is used for adjusting the target model element in real time according to the interactive instruction so as to update the role image of the first virtual role.
Optionally, the apparatus further comprises: the loading module is used for responding to the interaction instruction and loading a second virtual character in the dynamic wallpaper process after the first detection module detects the interaction instruction aiming at the first virtual character through the dynamic wallpaper process; a generating module, configured to generate a social instruction between the second virtual character and the first virtual character based on the interaction instruction, where the social instruction includes a language social instruction and/or an action social instruction; a second control module to control the second virtual character and the first virtual character based on the social instructions.
Optionally, the first control module includes: the searching unit is used for searching the wearing materials and the virtual animation matched with the interactive instructions in a preset material library; and the generating unit is used for adding the wearing materials to the three-dimensional model and generating a dynamic picture of the first virtual character according to the virtual animation.
Optionally, the first detecting module includes: an obtaining unit, configured to obtain software data of associated software through the dynamic wallpaper process, where the software data includes at least one of: weather data of weather software, calendar data and travel data of calendar software, session data of communication software, version update notification and prop on-line notification of game software, and system notification data of system software; a determining unit, configured to determine the software data as the interactive instruction.
Optionally, the first control module includes: the extraction unit is used for analyzing the software data and extracting the text content or the audio content in the software data; and the processing unit is used for displaying the text content at a preset position of the first virtual character, or synchronously rendering the mouth shape of the first virtual character by adopting a preset phoneme model when the audio content is played.
Optionally, the first control module further includes: the detection unit is used for detecting the control instruction at a preset position after the processing unit displays the text content at the preset position of the first virtual character; and the display unit is used for responding to the control instruction, starting the associated software through the desktop starter, and displaying the text content on a display interface of the associated software.
Optionally, the first control module includes: and the control unit is used for responding to the interaction instruction, sequencing the plurality of pieces of software installed on the target terminal based on the use frequency, and displaying the abbreviated identifications of the plurality of pieces of software with the use frequency larger than a preset threshold value on a display interface of the target terminal, wherein the dynamic wallpaper is a background picture of the display interface.
Optionally, the apparatus further comprises: the association unit is used for configuring a signing relationship between the association software and the dynamic wallpaper before the detection of acquiring the software data of the association software through the dynamic wallpaper process, wherein the signing relationship is used for indicating that the dynamic wallpaper has the data acquisition permission of the association software; and the creating unit is used for creating a data synchronization channel from the associated software to the dynamic wallpaper based on the signing relationship.
Optionally, the first control module includes: the positioning unit is used for positioning an image block to which the touch position of the interactive instruction belongs, wherein the first virtual character comprises a plurality of image blocks; the generating unit is used for generating a terminal control instruction corresponding to the image block based on a preset mapping relation; and the execution unit is used for executing the terminal control instruction on the target terminal.
Optionally, the apparatus further comprises: the second detection module is used for detecting the state data of the target terminal after the first control module renders the character image of the three-dimensional model in real time according to the interactive instruction, wherein the state data is used for indicating the equipment state of the target terminal and the running state of software; and the updating module is used for updating the role image of the three-dimensional model in real time according to the state data.
Optionally, the apparatus further comprises: a third detection module, configured to detect an interaction request sent by a fourth virtual character to the first virtual character, where the fourth virtual character is a player-controlled character PCC or a non-player-controlled character NPC; and the third control module is used for responding to the interaction request and calling the dynamic wallpaper process to control the fourth virtual role.
Optionally, the interactive instruction is a live-action image acquired by the dynamic wallpaper process, and the first control module includes: the calling unit is used for calling the dynamic wallpaper process to transmit the live-action image to a search engine; a reading unit, configured to read a retrieval result recalled by the search engine based on the live-action image; the extraction unit is used for extracting the rendering parameters in the retrieval result; and the rendering unit is used for rendering the three-dimensional model of the first virtual character by adopting the rendering parameters.
It should be noted that, the modules may be implemented by software or hardware, and for the latter, the following may be implemented, but is not limited to the following: the modules are all located in the same processor; or, the modules are respectively located in different processors in any combination.
Example 3
Fig. 6 is a structural diagram of an electronic device according to an embodiment of the present invention, and as shown in fig. 6, the electronic device includes a processor 61, a communication interface 62, a memory 63, and a communication bus 64, where the processor 61, the communication interface 62, and the memory 63 complete mutual communication through the communication bus 64, and the memory 63 is used for storing a computer program;
the processor 61 is configured to implement the following steps when executing the program stored in the memory 63: configuring a synchronous interface of dynamic wallpaper in a desktop starter of a target terminal; loading a three-dimensional model of a first virtual character in a dynamic wallpaper process, and detecting an interaction instruction aiming at the first virtual character through the dynamic wallpaper process; and rendering the role image of the three-dimensional model in real time according to the interactive instruction, sending the interactive instruction to the desktop starter through the synchronous interface, and controlling the target terminal according to the interactive instruction.
Optionally, the detecting, by the dynamic wallpaper process, the interaction instruction for the first virtual character includes at least one of: detecting, by the dynamic wallpaper process, a voice control instruction for the first virtual character; detecting, by the dynamic wallpaper process, a touch instruction for the first virtual character.
Optionally, rendering the character image of the three-dimensional model in real time according to the interactive instruction includes: searching a target model element matched with the interactive instruction in an element set of the three-dimensional model, wherein the three-dimensional model comprises a plurality of model elements; and adjusting the target model element in real time according to the interactive instruction so as to update the role image of the first virtual role.
Optionally, after detecting the instruction of interaction for the first virtual character through the dynamic wallpaper process, the method further includes: responding to the interaction instruction, and loading a second virtual role in the dynamic wallpaper process; generating social instructions between the second virtual character and the first virtual character based on the interaction instructions, wherein the social instructions comprise language social instructions and/or action social instructions; controlling the second virtual character and the first virtual character based on the social instructions.
Optionally, rendering the character image of the three-dimensional model in real time according to the interactive instruction includes: searching the wearing materials and the virtual animation matched with the interactive instruction in a preset material library; and adding the wearing materials to the three-dimensional model, and generating a dynamic picture of the first virtual character according to the virtual animation.
Optionally, the detecting, by the dynamic wallpaper process, an interaction instruction for the first virtual character includes: acquiring software data of associated software through the dynamic wallpaper process, wherein the software data comprises at least one of: weather data of weather software, calendar data and travel data of calendar software, session data of communication software, version update notification and prop on-line notification of game software, and system notification data of system software; determining the software data as the interaction instruction.
Optionally, rendering the character image of the three-dimensional model in real time according to the interactive instruction includes: analyzing the software data, and extracting text content or audio content in the software data; and displaying the text content at a preset position of the first virtual character, or synchronously rendering the mouth shape of the first virtual character by adopting a preset phoneme model when the audio content is played.
Optionally, after the text content is displayed at the predetermined position of the first virtual character, the method further includes: detecting the manipulation instruction at the predetermined position; and responding to the control instruction, starting the associated software through the desktop starter, and displaying the text content on a display interface of the associated software.
Optionally, controlling the target terminal according to the interaction instruction includes: in response to the interaction instruction, sorting the software installed on the target terminal by use frequency, and displaying, on the display interface of the target terminal, the abbreviated identifiers of the software whose use frequency is greater than a preset threshold, wherein the dynamic wallpaper is the background picture of the display interface.
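The sort-and-filter logic above is simple enough to sketch directly. The usage-count dictionary and threshold value are hypothetical examples; the application does not say how use frequency is measured.

```python
# Hypothetical sketch: rank installed apps by use frequency and keep only
# those above a preset threshold for display on the launcher.

def icons_to_display(usage, threshold):
    """Return app names sorted by descending use frequency, keeping only
    those whose frequency is strictly greater than the threshold."""
    ranked = sorted(usage.items(), key=lambda kv: kv[1], reverse=True)
    return [app for app, freq in ranked if freq > threshold]
```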
Optionally, before the software data of the associated software is acquired through the dynamic wallpaper process, the method further includes: configuring a subscription relationship between the associated software and the dynamic wallpaper, wherein the subscription relationship indicates that the dynamic wallpaper has permission to acquire data from the associated software; and creating a data synchronization channel from the associated software to the dynamic wallpaper based on the subscription relationship.
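The subscription relationship and synchronization channel can be pictured as a small permissioned publish/subscribe bus. The class and method names below are invented for illustration; on Android this role would typically be played by a platform mechanism such as a content provider or broadcast, which the application does not specify.

```python
# Hypothetical sketch: associated software may push data to the wallpaper
# only after a subscription relationship grants it access.

class WallpaperDataBus:
    """Toy data synchronization channel between apps and the wallpaper."""

    def __init__(self):
        self.subscriptions = set()  # software with a configured subscription
        self.inbox = []             # data delivered to the wallpaper process

    def subscribe(self, software):
        """Configure the subscription relationship for one piece of software."""
        self.subscriptions.add(software)

    def push(self, software, payload):
        """Deliver data over the channel; refused without a subscription."""
        if software not in self.subscriptions:
            raise PermissionError(f"{software} has no subscription with the wallpaper")
        self.inbox.append((software, payload))
```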
Optionally, controlling the target terminal according to the interaction instruction includes: locating the image block to which the touch position of the interaction instruction belongs, wherein the first virtual character consists of a plurality of image blocks; generating the terminal control instruction corresponding to that image block based on a preset mapping relationship; and executing the terminal control instruction on the target terminal.
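The touch-to-block-to-command chain above can be sketched as a hit test followed by a table lookup. All block names, rectangle coordinates, and command strings here are hypothetical; the application only states that a preset mapping relationship exists.

```python
# Hypothetical sketch: map a touch position on the character to a terminal
# control instruction via a preset block-to-command mapping.

BLOCK_TO_COMMAND = {   # the "preset mapping relationship" (illustrative)
    "head":  "open_settings",
    "hand":  "launch_camera",
    "torso": "toggle_music",
}

def locate_block(x, y, blocks):
    """Return the name of the image block containing the touch position.
    blocks maps a name to a (x0, y0, x1, y1) rectangle."""
    for name, (x0, y0, x1, y1) in blocks.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def control_instruction(x, y, blocks):
    """Hit-test the touch, then look up the corresponding command."""
    return BLOCK_TO_COMMAND.get(locate_block(x, y, blocks))
```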
Optionally, after the character image of the three-dimensional model is rendered in real time according to the interaction instruction, the method further includes: detecting state data of the target terminal, wherein the state data indicates the device state of the target terminal and the running state of its software; and updating the character image of the three-dimensional model in real time according to the state data.
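How state data might drive the character image can be shown with a tiny rule function. The state keys, thresholds, and appearance names are all hypothetical choices for illustration; the application does not enumerate them.

```python
# Hypothetical sketch: pick a character appearance from terminal state data
# (device state plus running-software state).

def appearance_for_state(state):
    """Return an appearance label for the given state dictionary."""
    if state.get("battery", 100) < 20:
        return "tired"      # low battery -> weary character
    if state.get("foreground_app") == "game":
        return "excited"    # a game is running -> energetic character
    return "idle"
```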
Optionally, the method further includes: detecting an interaction request sent by a fourth virtual character to the first virtual character, wherein the fourth virtual character is a player-controlled character (PCC) or a non-player-controlled character (NPC); and in response to the interaction request, invoking the dynamic wallpaper process to control the fourth virtual character.
Optionally, the interaction instruction is a live-action image acquired by the dynamic wallpaper process, and rendering the character image of the three-dimensional model in real time according to the interaction instruction includes: invoking the dynamic wallpaper process to transmit the live-action image to a search engine; reading the retrieval result recalled by the search engine based on the live-action image; extracting rendering parameters from the retrieval result; and rendering the three-dimensional model of the first virtual character with the rendering parameters.
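The parameter-extraction step above can be sketched as follows, assuming the search engine returns structured results. The result fields (`ambient_light`, `color_tone`, `weather`) are hypothetical; the application does not define the retrieval-result format or any particular search engine.

```python
# Hypothetical sketch: pull rendering parameters out of image-search results
# recalled for a live-action image; the first occurrence of each key wins.

def extract_rendering_parameters(search_results):
    """Collect known rendering parameters from a list of result dictionaries."""
    params = {}
    for result in search_results:
        for key in ("ambient_light", "color_tone", "weather"):
            if key in result and key not in params:
                params[key] = result[key]
    return params
```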
The communication bus of the terminal may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like, and may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the terminal and other equipment.
The memory may include a Random Access Memory (RAM) or a non-volatile memory, such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In another embodiment provided by the present application, a computer-readable storage medium is further provided, in which instructions are stored; when run on a computer, the instructions cause the computer to execute the control method of the dynamic desktop according to any one of the above embodiments.
In yet another embodiment provided by the present application, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to execute the method for controlling a dynamic desktop according to any of the embodiments.
In the described embodiments, the functionality may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium may be any available medium that a computer can access, or a data storage device, such as a server or data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
The serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the embodiments of the present application, each embodiment is described with its own emphasis; for parts that are not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division of the units is only a division by logical function, and other divisions are possible in actual implementation: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or take other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part of it that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product; the software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is only a preferred embodiment of the present application, and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application; these improvements and modifications should also be considered as falling within the protection scope of the present application.

Claims (17)

1. A control method of a dynamic desktop is characterized by comprising the following steps:
configuring a synchronous interface of dynamic wallpaper in a desktop launcher of a target terminal;
loading a three-dimensional model of a first virtual character in a dynamic wallpaper process, and detecting an interaction instruction for the first virtual character through the dynamic wallpaper process;
and rendering the character image of the three-dimensional model in real time according to the interaction instruction, sending the interaction instruction to the desktop launcher through the synchronous interface, and controlling the target terminal according to the interaction instruction.
2. The method of claim 1, wherein detecting, through the dynamic wallpaper process, the interaction instruction for the first virtual character comprises at least one of:
detecting, by the dynamic wallpaper process, a voice control instruction for the first virtual character;
detecting, by the dynamic wallpaper process, a touch instruction for the first virtual character.
3. The method of claim 1, wherein rendering the character image of the three-dimensional model in real time according to the interaction instruction comprises:
searching an element set of the three-dimensional model for a target model element matched with the interaction instruction, wherein the three-dimensional model comprises a plurality of model elements;
and adjusting the target model element in real time according to the interaction instruction, so as to update the character image of the first virtual character.
4. The method of claim 1, wherein after detecting the interaction instruction for the first virtual character through the dynamic wallpaper process, the method further comprises:
loading a second virtual character in the dynamic wallpaper process in response to the interaction instruction;
generating social instructions between the second virtual character and the first virtual character based on the interaction instruction, wherein the social instructions comprise language social instructions and/or action social instructions;
controlling the second virtual character and the first virtual character based on the social instructions.
5. The method of claim 1, wherein rendering the character image of the three-dimensional model in real time according to the interaction instruction comprises:
searching a preset material library for the wearing materials and the virtual animation matched with the interaction instruction;
and adding the wearing materials to the three-dimensional model, and generating a dynamic picture of the first virtual character according to the virtual animation.
6. The method of claim 1, wherein detecting, through the dynamic wallpaper process, the interaction instruction for the first virtual character comprises:
acquiring software data of associated software through the dynamic wallpaper process, wherein the software data comprises at least one of the following: weather data of weather software, calendar data and schedule data of calendar software, session data of communication software, version update notifications and item launch notifications of game software, and system notification data of system software;
determining the software data as the interaction instruction.
7. The method of claim 6, wherein rendering the character image of the three-dimensional model in real time according to the interaction instruction comprises:
parsing the software data and extracting text content or audio content from it;
and displaying the text content at a predetermined position of the first virtual character, or synchronously rendering the mouth shape of the first virtual character with a preset phoneme model while the audio content is played.
8. The method of claim 7, wherein after displaying the text content at the predetermined position of the first virtual character, the method further comprises:
detecting a manipulation instruction at the predetermined position;
and in response to the manipulation instruction, starting the associated software through the desktop launcher, and displaying the text content on a display interface of the associated software.
9. The method of claim 1, wherein controlling the target terminal according to the interaction instruction comprises:
in response to the interaction instruction, sorting the software installed on the target terminal by use frequency, and displaying, on the display interface of the target terminal, the abbreviated identifiers of the software whose use frequency is greater than a preset threshold, wherein the dynamic wallpaper is the background picture of the display interface.
10. The method of claim 6, wherein before acquiring the software data of the associated software through the dynamic wallpaper process, the method further comprises:
configuring a subscription relationship between the associated software and the dynamic wallpaper, wherein the subscription relationship indicates that the dynamic wallpaper has permission to acquire data from the associated software;
and creating a data synchronization channel from the associated software to the dynamic wallpaper based on the subscription relationship.
11. The method of claim 1, wherein controlling the target terminal according to the interaction instruction comprises:
locating the image block to which the touch position of the interaction instruction belongs, wherein the first virtual character comprises a plurality of image blocks;
generating the terminal control instruction corresponding to the image block based on a preset mapping relationship;
and executing the terminal control instruction on the target terminal.
12. The method of claim 1, wherein after rendering the character image of the three-dimensional model in real time according to the interaction instruction, the method further comprises:
detecting state data of the target terminal, wherein the state data indicates the device state of the target terminal and the running state of its software;
and updating the character image of the three-dimensional model in real time according to the state data.
13. The method of claim 1, further comprising:
detecting an interaction request sent by a fourth virtual character to the first virtual character, wherein the fourth virtual character is a player-controlled character (PCC) or a non-player-controlled character (NPC);
and in response to the interaction request, invoking the dynamic wallpaper process to control the fourth virtual character.
14. The method of claim 1, wherein the interaction instruction is a live-action image acquired by the dynamic wallpaper process, and rendering the character image of the three-dimensional model in real time according to the interaction instruction comprises:
invoking the dynamic wallpaper process to transmit the live-action image to a search engine;
reading the retrieval result recalled by the search engine based on the live-action image;
extracting rendering parameters from the retrieval result;
and rendering the three-dimensional model of the first virtual character with the rendering parameters.
15. A control apparatus for a dynamic desktop, comprising:
a configuration module, configured to configure a synchronous interface of dynamic wallpaper in a desktop launcher of a target terminal;
a first detection module, configured to load a three-dimensional model of a first virtual character in a dynamic wallpaper process and detect an interaction instruction for the first virtual character through the dynamic wallpaper process;
and a first control module, configured to render the character image of the three-dimensional model in real time according to the interaction instruction, send the interaction instruction to the desktop launcher through the synchronous interface, and control the target terminal according to the interaction instruction.
16. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 14 when executed.
17. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 14.
CN202111443572.3A 2021-11-30 2021-11-30 Control method and device of dynamic desktop, storage medium and electronic device Pending CN114116105A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111443572.3A CN114116105A (en) 2021-11-30 2021-11-30 Control method and device of dynamic desktop, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN114116105A true CN114116105A (en) 2022-03-01

Family

ID=80368610

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114296853A (en) * 2021-12-28 2022-04-08 完美世界控股集团有限公司 Control method and device of dynamic desktop, storage medium and electronic device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20060036358A (en) * 2004-10-25 2006-04-28 (주)엔브이엘소프트 System for providing 3d background screen
CN103092485A (en) * 2013-02-07 2013-05-08 广州市久邦数码科技有限公司 Method and system for achieving desktop dynamic theme based on Android equipment
CN103744600A (en) * 2014-01-17 2014-04-23 广州市久邦数码科技有限公司 Method and system for interaction between 3D (three-dimensional) dynamic wallpaper and desktop icon
CN108037859A (en) * 2017-11-17 2018-05-15 珠海市君天电子科技有限公司 A kind of wallpaper control method, device, electronic equipment and storage medium
CN112099683A (en) * 2020-09-03 2020-12-18 维沃移动通信有限公司 Wallpaper display method and device and electronic equipment
CN113238693A (en) * 2021-06-09 2021-08-10 维沃移动通信有限公司 Icon sorting method, icon sorting device and electronic equipment
CN113296665A (en) * 2021-05-19 2021-08-24 武汉长戟科技有限公司 Method for expanding desktop display content
CN114296853A (en) * 2021-12-28 2022-04-08 完美世界控股集团有限公司 Control method and device of dynamic desktop, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination