WO2019019968A1 - Displacement control method and apparatus for a virtual character, and storage medium - Google Patents
Displacement control method and apparatus for a virtual character, and storage medium
- Publication number
- WO2019019968A1 (PCT/CN2018/096646, CN2018096646W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- orientation
- virtual character
- instruction
- displacement
- determining
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0338—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/021—Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Definitions
- the embodiments of the present application relate to the field of computers, and in particular, to a displacement control method, apparatus, and storage medium for a virtual character.
- the embodiment of the present application provides a displacement control method for a virtual character.
- the displacement control method of the virtual character is applied to a computing device and includes: receiving, in a virtual reality scenario, a first instruction, wherein the first instruction is used to indicate that the virtual character generates a target displacement from a first orientation; after receiving the first instruction, determining a second orientation that is offset from the first orientation by the target displacement; and controlling the virtual character to disappear in the first orientation and to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
- the embodiment of the present application further provides a displacement control device for a virtual character.
- the displacement control device of the virtual character includes a processor and a memory coupled to the processor, the memory storing machine-readable instructions executable by the processor, which, when executed, cause the processor to perform the following operations:
- receiving a first instruction, wherein the first instruction is used to indicate that the virtual character generates a target displacement from a first orientation; after receiving the first instruction, determining a second orientation that is offset from the first orientation by the target displacement; and controlling the virtual character to disappear in the first orientation and to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
- the embodiment of the present application further provides a non-transitory computer-readable storage medium, wherein the storage medium stores machine-readable instructions executable by a processor to perform the displacement control method of the virtual character described above.
- FIG. 1 is a schematic diagram of a hardware environment of a method for controlling a displacement of a virtual character according to an embodiment of the present application
- FIG. 2 is a flowchart of a method for controlling displacement of a virtual character according to an embodiment of the present application
- FIG. 3 is a flowchart of a method for controlling a virtual character to disappear in a first orientation and controlling a virtual character to appear in a second orientation according to an embodiment of the present application;
- FIG. 4 is a flow chart of another method for controlling a virtual character to disappear in a first orientation and controlling a virtual character to appear in a second orientation, in accordance with an embodiment of the present application;
- FIG. 5 is a flowchart of a method of determining a second orientation that is offset from a first orientation by a target displacement, according to an embodiment of the present application;
- FIG. 6 is a flowchart of a method of determining an area within a preset distance range of an intersection position as the position indicated by a second orientation, according to an embodiment of the present application;
- FIG. 7 is a flowchart of another method for controlling displacement of a virtual character according to an embodiment of the present application.
- FIG. 8 is a schematic diagram showing a displacement effect display according to an embodiment of the present application.
- FIG. 9 is a schematic diagram showing another displacement special effect display according to an embodiment of the present application.
- FIG. 10a is a schematic diagram of a displacement control apparatus for a virtual character according to an embodiment of the present application.
- FIG. 10b is a schematic diagram of a displacement control apparatus for a virtual character according to an embodiment of the present application.
- FIG. 10c is a schematic diagram of a displacement control device for a virtual character according to an embodiment of the present application.
- FIG. 11 is a structural block diagram of a terminal according to an embodiment of the present application.
- the method for controlling the movement of a virtual character in 3D space can be implemented on three platforms: PC, game console, and mobile phone.
- the PC side is operated by keyboard and mouse by default.
- the button control of the virtual character is highly standardized: the four keys "W", "A", "S", and "D" on the keyboard control forward, backward, strafe left, and strafe right, producing continuous linear movement of the virtual character.
- the front, back, left, and right of the virtual character are determined relative to the perspective of the camera controlled by the mouse, where the mouse controls the orientation of the camera and the orientation can be any angle.
- each console has its own dedicated controller; that is, the control operations of each console are pre-defined, and free movement of the virtual character in the virtual space is generally controlled by the joystick on the controller. There is no big difference between operation on the console side and the PC side: movement is continuous, and the buttons play the same role as the mouse, but the hardware devices are customized.
- smartphones are turning to heavyweight games; that is, more and more 3D games originally designed for other platforms are being ported to mobile phones.
- the original physical button operations have likewise evolved into virtual button operations on the mobile phone.
- the virtual joystick is adopted for the movement control of the virtual character object in the 3D space.
- VR, i.e., virtual reality.
- the HTC Vive under the Vive platform is a virtual reality head-mounted display developed by HTC and Valve Corporation.
- This head-mounted display utilizes "room-scale" technology: sensors transform a room into a three-dimensional space, allowing the user to navigate naturally, move around, and use motion-tracked handheld controllers to manipulate objects in the virtual world, enabling sophisticated interaction, communication, and immersive environments. Since a Vive device can track a certain real space and fully match it to the virtual space, many casual games are designed to be about the same size as the Vive tracking space, so that the user can move completely freely within it. However, the size of the simulated space is limited, and the size of the trackable virtual-space area differs between devices, making it difficult to adapt to different devices.
- the embodiment of the present application provides a displacement control method for a virtual character.
- FIG. 1 is a schematic diagram of a hardware environment of a displacement control method for a virtual character according to an embodiment of the present application.
- the server 102 is connected to the terminal 104 through a network.
- the network includes but is not limited to a wide area network, a metropolitan area network, or a local area network.
- the terminal 104 is not limited to a PC, a mobile phone, a tablet, or the like.
- the displacement control method of the virtual character in the embodiment of the present application may be performed by the computing device, for example, by the server 102, by the terminal 104, or by the server 102 and the terminal 104.
- the method for controlling the displacement of the virtual character by the terminal 104 in the embodiment of the present application may also be performed by a client installed thereon.
- FIG. 2 is a flowchart of a method for controlling displacement of a virtual character according to an embodiment of the present application. This method can be applied to computing devices. As shown in FIG. 2, the method may include the following steps:
- Step S202 in the virtual reality scenario, receiving a first instruction for instructing the virtual character to generate a target displacement from the first orientation.
- a first instruction for instructing the virtual character to generate a target displacement from the first orientation is received.
- virtual reality (VR), also called a virtual environment, integrates the latest developments in computer graphics, computer simulation, artificial intelligence, sensing, display, network parallel processing, and other technologies; it is a computer-aided, high-tech simulation system that uses computer simulation to generate a virtual world in three-dimensional space. It can provide users with simulated sensory experiences such as vision, allowing timely and unrestricted observation of things in three-dimensional space, thereby giving users an immersive feeling.
- the computer can immediately perform complex operations on the information of the virtual character and feed back an accurate three-dimensional space in the form of video, thereby giving the user a sense of presence.
- the virtual reality scenario of the embodiment is a scenario in which the real-world scenario is simulated by using the virtual reality technology to obtain a scenario suitable for a specific application.
- the virtual reality scenario is a virtual reality technology that completely matches the real space into the virtual space.
- the scenario is suitable for a gaming application, wherein the gaming application is a VR gaming application; in some embodiments of the present application, it is an application for controlling the displacement presentation process of a virtual character.
- the user matches the virtual character.
- the user's operation matches the behavior of the virtual character. For example, if the user pushes the joystick of the VR device in any direction, the displacement selection mechanism of the virtual character is triggered.
- the virtual reality scene can provide the user with simulated sensory experiences such as vision, so that the user can observe things in the scene through the virtual character in a timely and unrestricted manner and has the same experience as in a real scene.
- a displacement change process of the virtual character may be implemented; the area that the virtual character may reach can be determined, for example, a legal area in which the virtual character is allowed to be active; the operations the virtual character may perform can be determined, for example, the types of skills the virtual character is allowed to use in combat; and the attributes the virtual character can have, such as increased health or reduced difficulty, can be determined; no limitation is imposed here.
- the virtual reality scenario is a large-scale multiplayer online scenario; that is, the virtual reality scenario includes multiple online virtual characters, and the user corresponding to each virtual character can also observe the behavior change processes of the virtual characters corresponding to other users, for example, the displacement change processes of those virtual characters.
- the foregoing virtual reality scenario is only a preferred example in the embodiments of the present application and does not mean that the virtual reality scenarios of the embodiments of the present application are limited to the foregoing manner. Any virtual reality scene in which the displacement control method of the virtual character can be implemented, and in which the dizziness easily caused to the user by continuous movement of the virtual character can be avoided, is within the protection scope of the present application; examples are not enumerated here.
- in this embodiment, the displacement of the virtual character in the virtual reality scene can be controlled.
- the displacement mechanism of the virtual character is triggered; that is, the displacement selection mechanism of the virtual character in the virtual reality scene is started, and the first instruction for instructing the virtual character to generate the target displacement from the first orientation is received, that is, the target displacement by which the virtual character is to move in the virtual reality scene is selected. The triggering of the first instruction is convenient and flexible.
- the first orientation is a starting orientation of the virtual character before the displacement in the virtual reality scene, including an initial position and an initial orientation in the virtual reality scene, wherein the orientation is the direction in which the virtual character is displaced.
- the first instruction is generated by a physical handle: the joystick can be pushed in any direction to trigger selection of the target displacement, or the touchpad can be pressed at any position to trigger selection of the target displacement, thereby selecting a unique orientation for the virtual character.
- physical buttons that can precisely control the direction are employed, such as a joystick and a touchpad.
- under the Oculus platform, the joystick is pushed in any direction to trigger selection of the displacement position; under the Vive platform, the touchpad is pressed at any position to trigger target displacement selection.
- the Oculus Rift under the Oculus platform is a virtual reality head-mounted display; the software used with it is mainly video games custom-programmed for the Rift.
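The platform-dependent trigger described above, pushing the joystick on an Oculus-style controller or pressing the touchpad on a Vive-style controller, can be sketched in a platform-neutral way. The function name, input representation, and dead-zone threshold below are illustrative assumptions, not any platform's actual API:

```python
# Hypothetical sketch of the displacement-selection trigger. Names and
# the dead-zone threshold are illustrative assumptions.

def selection_triggered(platform, input_state):
    """Return True when the displacement selection mechanism should start."""
    if platform == "oculus":
        # Pushing the joystick in any direction beyond a small dead zone.
        x, y = input_state["stick"]
        return (x * x + y * y) ** 0.5 > 0.1
    if platform == "vive":
        # Pressing the touchpad at any position.
        return input_state["touchpad_pressed"]
    return False
```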
- Step S204: after receiving the first instruction, determining a second orientation that is offset from the first orientation by the target displacement.
- in step S204 of the present application, after the first instruction is received, a second orientation that is offset from the first orientation by the target displacement is determined.
- the displacement determination mechanism is triggered, that is, the virtual character is triggered to be displaced.
- since the triggering of the first instruction is convenient and flexible, the displacement triggering mechanism carries the possibility of false triggering; therefore, this embodiment also provides a displacement cancellation mechanism, in which releasing the joystick or lifting the hand off the large circular touchpad cancels the determined target displacement. For example, on the Oculus platform the user can release the joystick to cancel the selected target displacement, and on the Vive platform the user can lift the hand off the touchpad to cancel the selected target displacement; the subsequent displacement mechanism is then not triggered.
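The trigger/confirm/cancel behaviour described above amounts to a small state machine. A minimal sketch, with hypothetical state and event names, might look like:

```python
# Minimal state-machine sketch of the trigger/confirm/cancel mechanism.
# State and event names are hypothetical.

def update_selection(state, event):
    """Advance the displacement-selection state machine by one event.

    Returns (new_state, outcome), where outcome is None, "cancelled",
    or "confirmed".
    """
    if state == "idle" and event == "push_stick":
        return "selecting", None          # displacement selection starts
    if state == "selecting" and event == "release_stick":
        return "idle", "cancelled"        # releasing the joystick cancels
    if state == "selecting" and event == "press_stick":
        return "idle", "confirmed"        # pressing confirms the displacement
    return state, None
```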
- after the target displacement is determined, it may be previewed, for example with a curve; the extension time of the curve also serves as a buffer during which, if the displacement was triggered by mistake, the user can cancel the determined displacement in time.
- the effect of the curve in the virtual reality scene is a ray slowly emitted from the position of the hand toward the front of the handle; for example, a curve with an arrow slowly extends, taking about a second to bend down to the ground, and the head of the curve indicates the position of the second orientation.
- on the Oculus platform, the user's complete operation flow is: push the joystick to trigger the displacement selection mechanism, adjust the spatial position of the hand to determine the position of the target displacement, adjust the direction of the joystick to determine the direction of the target displacement, and finally press the joystick to confirm and start the target displacement;
- on the Vive platform, the user's complete operation flow is: press the touchpad to trigger the displacement selection mechanism, adjust the spatial position of the hand to determine the position of the target displacement, adjust the position of the touch point of the hand to determine the direction of the target displacement, and finally lift the hand off the touchpad to confirm and start the target displacement. The operation of determining the target displacement is thus simple, convenient, and rapid.
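One plausible way to compute the position indicated by the second orientation from the arrow curve described above is a simple ballistic arc traced from the hand until it meets the ground. All numeric parameters here are illustrative assumptions:

```python
# Illustrative sketch: trace a ray from the hand that bends to the ground
# under gravity; the point where it reaches ground height marks the
# position indicated by the second orientation. Parameter values are
# assumptions, not from the original disclosure.

def arc_ground_hit(origin, direction, speed=5.0, gravity=9.8,
                   ground_y=0.0, dt=0.01, max_t=3.0):
    x, y, z = origin
    dx, dy, dz = direction
    vx, vy, vz = dx * speed, dy * speed, dz * speed
    t = 0.0
    while t < max_t:
        # Simple Euler integration of a projectile path.
        x += vx * dt
        y += vy * dt
        z += vz * dt
        vy -= gravity * dt
        t += dt
        if y <= ground_y:
            return (x, ground_y, z)   # target position on the ground
    return None                        # curve never reached the ground
```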
- Step S206 controlling the virtual character to disappear in the first orientation, and controlling the virtual character to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
- the virtual character is controlled to disappear in the first orientation and is controlled to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
- the virtual character remains stationary in the first orientation while the lens slowly blacks out; then, as the lens slowly resumes display, the virtual character has reached the position and orientation indicated by the second orientation.
- at the position and orientation indicated by the second orientation, the virtual character is immobile; that is, the virtual character remains stationary.
- the virtual character does not appear between the first orientation and the second orientation.
- the virtual character moves instantaneously in the virtual reality scene, and the user does not feel the acceleration and speed caused by continuous movement; instantaneous movement thus replaces continuous movement, avoiding the feeling of dizziness and improving the user experience.
- upon the sudden disappearance at the first orientation and the sudden appearance at the second orientation, a rotating point-particle effect may be displayed at both orientations simultaneously. During the effect, the point particles at the first orientation drift toward the second orientation as if blown by the wind; this "particle wind" prompts other users that the virtual character has moved to the second orientation, and its direction indicates from which place in the virtual reality scene the current virtual character arrived, so that other users in the scene can clearly see the displacement process of the virtual character.
- the operation method adapts to various hardware specifications, and the operation is simple, convenient, and easy to learn; it performs well in large-scale multiplayer real-time online virtual reality scenes.
- steps S202 to S206 include: receiving, in the virtual reality scene, a first instruction for instructing the virtual character to generate a target displacement from a first orientation; after receiving the first instruction, determining a second orientation that is offset from the first orientation by the target displacement; and controlling the virtual character to disappear in the first orientation and appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation. That is, by controlling the virtual character to disappear in the first orientation and appear in the second orientation in the virtual reality scene, continuous movement is replaced by instantaneous movement of the virtual character. This avoids the dizziness that continuous movement easily causes, thereby solving the technical problem in the related art that continuous movement of the virtual character easily makes the user dizzy.
- controlling the virtual character to disappear in the first orientation, and controlling the virtual character to appear in the second orientation comprises: after determining the second orientation, controlling the virtual character to stay in the first orientation and controlling the preset lens to black out; and after the preset lens blacks out, controlling the preset lens to resume display and controlling the virtual character to stay in the second orientation.
- FIG. 3 is a flow chart of a method of controlling a virtual character to disappear in a first orientation and controlling a virtual character to appear in a second orientation, in accordance with an embodiment of the present application. As shown in FIG. 3, the method includes the following steps:
- Step S301: after determining the second orientation, control the virtual character to stay in the first orientation, and control the preset lens to black out.
- in step S301 of the present application, after the second orientation is determined, the virtual character is controlled to stay in the first orientation and the preset lens is controlled to black out, wherein the preset lens is used to display the picture of the virtual reality scene.
- the first orientation is an initial position before the virtual character is displaced in the virtual reality scene.
- controlling the virtual character to stay in the first orientation means keeping the virtual character at the first position.
- the preset lens is blacked out, so that the virtual character disappears from the first orientation.
- the preset lens is used to display a picture of the virtual reality scene, that is, the picture of the virtual reality scene viewed from the perspective of the user corresponding to the virtual character.
- this embodiment achieves a display effect in which the virtual character suddenly disappears from the first orientation by controlling the virtual character to stay in the first orientation and controlling the preset lens to black out.
- Step S302: after the preset lens blacks out, control the preset lens to resume display, and control the virtual character to stay in the second orientation.
- in step S302 of the present application, after the preset lens blacks out, the preset lens is controlled to resume display and the virtual character is controlled to stay in the second orientation.
- the preset lens is controlled to resume the screen display; that is, the preset lens is no longer black.
- the virtual character stays in the virtual reality scene at the second orientation, which is the orientation newly reached by the virtual character after the target displacement from the first orientation; the target displacement is determined by the displacement selection mechanism.
- the virtual character does not appear between the first orientation and the second orientation, thereby circumventing the dizziness feeling caused by the continuous movement of the virtual character to the user corresponding to the virtual character in the virtual reality scene.
- this embodiment controls the preset lens to resume display and controls the virtual character to stay in the second orientation, realizing the display effect that the virtual character suddenly appears at a new position.
- after determining the second orientation, the virtual character is controlled to stay in the first orientation and the preset lens is controlled to black out, the preset lens being used to display the virtual reality scene; after the black-out, the preset lens is controlled to resume display and the virtual character is controlled to stay in the second orientation. This realizes the purpose of controlling the virtual character to disappear in the first orientation and appear in the second orientation, thereby achieving the technical effect of avoiding the vertigo caused by continuous movement of the virtual character.
- controlling the preset lens to black out includes: controlling the preset lens to gradually black out, wherein the virtual character stays in the first orientation while the preset lens gradually blacks out;
- controlling the preset lens to resume display includes: controlling the preset lens to gradually resume display, wherein the virtual character stays in the second orientation while the preset lens gradually resumes display.
- the preset lens is gradually blacked out; that is, while the virtual character is controlled to stay in the first orientation, the preset lens slowly blacks out, so that the virtual character disappears from the first orientation in the virtual reality scene.
- the preset lens is then controlled to gradually resume display; that is, the preset lens slowly resumes displaying the virtual reality scene, and during this recovery the position and orientation of the virtual character at the second orientation are unchanged, i.e., the virtual character remains stationary. During the gradual black-out and gradual recovery of the preset lens, the user corresponding to the virtual character does not feel the acceleration and speed caused by continuous movement, and so does not feel dizzy, which improves the user experience.
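The gradual black-out, hold, and gradual recovery of the preset lens can be modelled as an overlay alpha ramp. The durations below are illustrative assumptions:

```python
# Sketch of the lens black-out transition as an overlay alpha ramp.
# fade_out / hold / fade_in durations are illustrative assumptions; the
# character would be relocated while the screen is fully black.

def fade_alpha(t, fade_out=0.3, hold=0.1, fade_in=0.3):
    """Overlay alpha (0 = clear, 1 = black) at t seconds into the transition."""
    if t < fade_out:                       # gradually black out
        return t / fade_out
    if t < fade_out + hold:                # fully black: move the character
        return 1.0
    if t < fade_out + hold + fade_in:      # gradually resume display
        return 1.0 - (t - fade_out - hold) / fade_in
    return 0.0                             # transition finished
```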
- controlling the virtual character to disappear in the first orientation, and controlling the virtual character to appear in the second orientation comprises: after determining the second orientation, controlling the virtual character to disappear in the first orientation and displaying first indication information in the first orientation; and controlling the virtual character to appear in the second orientation and displaying second indication information in the second orientation.
- FIG. 4 is a flow chart of another method of controlling a virtual character to disappear in a first orientation and controlling a virtual character to appear in a second orientation, in accordance with an embodiment of the present application. As shown in FIG. 4, the method includes the following steps:
- Step S401, after determining the second orientation, control the virtual character to disappear in the first orientation, and display the first indication information in the first orientation.
- in step S401 of the present application, after the second orientation is determined, the virtual character is controlled to disappear in the first orientation, and the first indication information is displayed in the first orientation, where the first indication information is used to indicate that
- the first orientation has motion in the virtual reality scene.
- the first indication information is displayed in the first orientation; it may be a point particle, marking the vanishing point where the virtual character is about to disappear from the first orientation.
- a special effect of point particles rotating around is displayed within a preset range of the first orientation, and is used to indicate that the first orientation in the virtual reality scene has motion, thereby reminding the user to pay attention.
- Step S402, controlling the virtual character to appear in the second orientation, and displaying the second indication information in the second orientation.
- the virtual character is controlled to appear in the second orientation, and
- the second indication information is displayed in the second orientation, wherein the second indication information is used to indicate that the second orientation has motion in the virtual reality scene.
- the second indication information may be a point particle, marking where the virtual character is about to appear in the second orientation.
- a special effect of point particles rotating around is displayed to indicate that the second orientation has motion in the virtual reality scene.
- after determining the second orientation, the virtual character is controlled to disappear in the first orientation, and the first indication information is displayed in the first orientation, wherein the first indication information is used to indicate that the first orientation has motion in the virtual reality scene;
- the virtual character then appears in the second orientation, and the second indication information is displayed in the second orientation, where the second indication information is used to indicate that the second orientation has motion in the virtual reality scene, thereby controlling the virtual character to disappear in the first orientation and to appear in the second orientation.
- in step S401, displaying the first indication information in the first orientation comprises: displaying a moving first point particle in the first orientation, wherein the first indication information comprises the moving first point particle;
- in step S402, displaying the second indication information in the second orientation comprises: displaying a moving second point particle in the second orientation, wherein the second indication information comprises the moving second point particle, and the direction of motion from the first point particle to the second
- point particle is used to represent the change process of the virtual character from the first orientation to the second orientation.
- the moving first point particle is displayed in the first orientation.
- the first point particle may include multiple small particles.
- the plurality of small particles rotate around to prompt other users that there is motion at the first orientation in the virtual reality scene, and may also indicate that this orientation in the virtual reality scene is the position of the virtual character before the displacement.
- the virtual character is controlled to appear in the second orientation, and the second indication information is displayed in the second orientation.
- the moving second point particle is displayed in the second orientation to prompt other users that there is motion at the second orientation in the virtual reality scene, and may also indicate that this orientation in the virtual reality scene is the position of the virtual character after the displacement.
- the second point particle may differ from the first point particle; for example, the number of point particles included in the second point particle differs from that included in the first point particle, and/or the size of the point particles included in the second point particle differs from that included in the first point particle.
- the direction of motion from the first point particle to the second point particle is used to represent the change process from the first orientation to the second orientation; the first point particle may drift toward the second orientation in a disturbance to obtain the second point particle, and
- the special effect exhibited by the change from the first point particle to the second point particle is a particle wind, like the effect of particles blown by the wind.
- the particle wind direction is used to prompt other users in the virtual reality scene of the change process of the displacement of the virtual character, that is,
- the change process from the first orientation to the second orientation; this displacement representation makes other users in the virtual reality scene also understand the change process of the displacement of the virtual character, and is suitable for large-scale multiplayer scenes.
- the display effects of the foregoing first indication information and second indication information are only a preferred embodiment of the present application; this does not mean that the display special effect of the embodiment is limited to a particle effect, and any other effect that can prompt
- the change process of the displacement of the user's virtual character from the first orientation to the second orientation falls within the protection scope of the present application, and examples are not enumerated here.
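As one way to picture the particle-wind hint described above, the direction cue can be derived from the two orientations alone. The following is a sketch under illustrative assumptions (a 2D ground-plane representation and simple unit-vector math, neither of which is specified in the original):

```python
import math

# Sketch of the "particle wind" direction cue: particles near the first
# orientation drift toward the second orientation, so other users can read
# off where the displaced character vanished from and where it appeared.
# The 2D ground-plane coordinates are an assumption for illustration.

def wind_direction(first_pos, second_pos):
    """Unit vector pointing from the vanish point to the appear point."""
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    length = math.hypot(dx, dy)
    if length == 0.0:
        return (0.0, 0.0)  # no displacement, no wind
    return (dx / length, dy / length)

def drift(particle, direction, step=0.5):
    """Move one particle a small step along the wind direction."""
    return (particle[0] + direction[0] * step,
            particle[1] + direction[1] * step)

wind = wind_direction((0.0, 0.0), (3.0, 4.0))
print(wind)                     # (0.6, 0.8)
print(drift((0.0, 0.0), wind))  # (0.3, 0.4)
```

Stepping every particle along this shared direction each frame is what produces the "blown by the wind" appearance the text describes.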
- before step S204 of determining the second orientation that is offset from the first orientation by the target displacement, third indication information for indicating the target displacement is displayed; and in step S204, determining the second orientation that is offset from the first orientation by the target displacement
- includes: determining the second orientation according to the third indication information.
- because the displacement triggering mechanism of the virtual character is convenient and flexible, there is a possibility of erroneously triggering a displacement when selecting the target displacement; it is therefore necessary to provide the user with a buffering time, that is, a regret time, during which the user can cancel an erroneously
- triggered displacement. This embodiment gives the user a buffer time to determine the displacement before determining the second orientation that is offset from the first orientation by the target displacement.
- third indication information for indicating the target displacement is displayed; the third indication information may be displayed as a special effect, thereby previewing the selection result of the target displacement. The second orientation is determined according to the third indication information, for example, the position and orientation indicated by the second orientation in the virtual reality scene are determined according to the third indication information.
- displaying the third indication information for indicating the target displacement comprises: displaying a curve for indicating the target displacement, wherein the third indication information includes the curve.
- the specific display form of the third indication information may be a ray slowly emitted from the position of the user's hand toward the front of the handle; for example, the ray is a blue curve with an arrow, and
- the curve bends and extends toward the ground.
- the curve bends to the ground over about one second, and the display effect of the new position indicated by the curve head then appears, as shown by the cylinder at the end of the curve in FIG. 8.
- the time taken for the extension of this blue curve is the user's buffer time.
- during the buffer time, on the Oculus platform the user may release the joystick, or
- on the Vive platform the user may lift the finger off the large disc trackpad, thereby canceling the displacement selection mechanism and not triggering the subsequent displacement mechanism.
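The buffer-time behavior above (the curve extends for about a second, and releasing the control before the arc completes cancels the selection) can be sketched as a small state update. Class, method, and state names here are illustrative; only the roughly one-second duration and the cancel-on-release rule come from the description above.

```python
# Sketch of the "regret time" buffer: the selection curve extends over
# ~1 second, and releasing the control before it completes cancels the
# displacement selection without triggering the displacement mechanism.
# All names are illustrative, not from the original.

BUFFER_TIME = 1.0  # seconds for the curve to finish extending

class DisplacementSelection:
    def __init__(self):
        self.elapsed = 0.0
        self.state = "selecting"  # "selecting" | "ready" | "cancelled"

    def tick(self, dt):
        """Advance the curve-extension animation by dt seconds."""
        if self.state != "selecting":
            return
        self.elapsed += dt
        if self.elapsed >= BUFFER_TIME:
            # Curve reached the ground: the end-of-curve effect is shown
            # and the displacement can now be confirmed.
            self.state = "ready"

    def release(self):
        """Oculus: joystick released / Vive: finger lifted off trackpad."""
        if self.state == "selecting":
            # Released during the buffer time: cancel, nothing triggers.
            self.state = "cancelled"

early = DisplacementSelection()
early.tick(0.4)
early.release()
print(early.state)  # cancelled

late = DisplacementSelection()
for _ in range(12):
    late.tick(0.1)
late.release()      # too late to cancel: selection already completed
print(late.state)   # ready
```

The single `elapsed` counter is all that is needed to implement the regret time: every cancel path simply checks whether it fired before the counter reached the buffer duration.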
- determining the second orientation according to the third indication information comprises: determining the intersection position of the curve with a preset plane; and determining an area within a preset distance range from the intersection position as the position indicated by the second orientation.
- FIG. 5 is a flow chart of a method of determining a second orientation that is a target displacement from a first orientation, in accordance with an embodiment of the present application. As shown in FIG. 5, the method includes the following steps:
- Step S501, the intersection position of the curve and the preset plane is determined.
- in step S501 of the present application, the intersection position of the curve and the preset plane is determined, wherein the preset plane is used to support the virtual character.
- the second orientation includes the location and orientation of the virtual character in the virtual reality scene.
- the position and orientation indicated by the second orientation are determined when determining the second orientation that is offset from the first orientation by the target displacement, and for this the intersection of the curve and the preset plane is determined.
- the end of the curve can be represented by a cylinder; this cylindrical effect is used to show the position and orientation that the user will have after the displacement.
- the position indicated by the second orientation is the position where a parabola, emitted at a certain speed from in front of the user's hand, intersects the preset plane.
- the preset plane may be a ground, a mountain, or the like in the virtual reality scene, that is,
- any surface used to support the virtual character, which is not limited here.
- Step S502, determining an area within a preset distance range from the intersection position as the position indicated by the second orientation.
- the area within the preset distance range from the intersection position is determined as the position indicated by the second orientation,
- thereby determining the position indicated by the second orientation.
- the intersection position of the curve and the preset plane is determined, and the area within the preset distance range from the intersection position is determined as the position indicated by the second orientation; this realizes determining the second orientation that is offset from the first orientation by the target displacement, and further
- the virtual character is controlled to disappear in the first orientation and to appear in the second orientation, thereby achieving the technical effect of avoiding the user's dizziness caused by continuous movement of the virtual character.
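The intersection of the emitted parabola with the preset plane can be computed in closed form. The sketch below assumes a flat, horizontal ground plane and constant gravity; the function name, the gravity constant, and the launch values are illustrative, not taken from the original.

```python
import math

# Sketch of step S501: find where the parabola emitted from the user's
# hand intersects the preset plane. A flat ground plane at y = ground_y
# and constant gravity are simplifying assumptions for illustration.

GRAVITY = 9.8  # m/s^2, downward

def curve_ground_intersection(origin, velocity, ground_y=0.0):
    """Return the (x, z) position where the arc meets the plane y = ground_y.

    origin:   (x, y, z) position of the hand
    velocity: (vx, vy, vz) launch velocity toward the front of the handle
    """
    x0, y0, z0 = origin
    vx, vy, vz = velocity
    # Solve y0 + vy*t - 0.5*GRAVITY*t^2 = ground_y for the descending root.
    a, b, c = -0.5 * GRAVITY, vy, y0 - ground_y
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the arc never reaches the plane
    t = (-b - math.sqrt(disc)) / (2 * a)  # later (descending) root
    return (x0 + vx * t, z0 + vz * t)

# Hand at 1.225 m, thrown flat and forward: the arc falls for t = 0.5 s.
hit = curve_ground_intersection((0.0, 1.225, 0.0), (4.0, 0.0, 2.0))
print(hit)  # approximately (2.0, 1.0)
```

Because the user controls both `origin` (hand position) and `velocity` direction (hand angle), small hand movements adjust the landing point over a wide range, which matches the wide-range adjustment described for the third instruction.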
- step S502 of determining the area within the preset distance range of the intersection position as the position indicated by the second orientation comprises: detecting whether the second orientation is legal in the virtual reality scene; and if it is detected that the second orientation is legal in the virtual reality scene, determining the area within the preset distance range from the intersection position as the position indicated by the second orientation.
- FIG. 6 is a flowchart of a method of determining an area within a preset distance range from an intersection position as a position indicated by a second orientation, according to an embodiment of the present application. As shown in FIG. 6, the method includes the following steps:
- Step S601 detecting whether the second orientation is legal in the virtual reality scene.
- in step S601 of the present application, it is detected whether the second orientation is legal in the virtual reality scene.
- This embodiment introduces the concept of an illegal transmission area.
- Some virtual areas in a virtual reality scene cannot be reached by a virtual character, but the parabola emitted from the user's hand can reach any virtual area in the virtual reality scene, so it is necessary to impose
- certain limits on such virtual areas. Whether the second orientation is legal in the virtual reality scene is detected, that is, whether the second orientation is a virtual area that the virtual character can reach. If it is detected that the second orientation is a virtual area reachable by the virtual character, it is determined that the second orientation is legal in the virtual reality scene; if it is detected that the second orientation is a virtual area unreachable by the virtual character, it is determined that the second orientation is illegal in the virtual reality scene.
- Step S602, if it is detected that the second orientation is legal in the virtual reality scene, the area within the preset distance range from the intersection position is determined as the position indicated by the second orientation.
- in step S602 of the present application, if it is detected that the second orientation is legal in the virtual reality scene, the region within the preset distance range from the intersection position is determined as the position indicated by the second orientation.
- after detecting whether the second orientation is legal in the virtual reality scene, if it is detected that the second orientation is legal, the area within the preset distance range from the intersection position is determined as the position indicated by the second orientation; this realizes determining the position indicated by the second orientation, so that the virtual character can be controlled to disappear in the first orientation and appear in the second orientation, thereby achieving the technical effect of avoiding vertigo of the user caused by continuous movement of the virtual character.
- Step S603, if it is detected that the second orientation is illegal in the virtual reality scene, preset identifier information is displayed.
- in step S603 of the present application, if it is detected that the second orientation is illegal in the virtual reality scene, preset identifier information is displayed, where the preset identifier information is used to indicate that the second orientation is illegal in the virtual reality scene.
- This embodiment calibrates the reachable area of the virtual reality scene. If it is detected that the second orientation is illegal in the virtual reality scene, that is, the second orientation is a non-reachable area, the parabola still has an intersection after it is emitted, but since the virtual character cannot reach it, the display effects of the curve and the position indicated by the second orientation are set to a conspicuous color, for example red, thereby reminding the user that the virtual character cannot be transmitted to the second orientation. At this time, if the user presses the joystick or lifts the finger off the large disc of the trackpad, the displacement selection is canceled, and the displacement performance scene is not triggered.
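One way to sketch the legality check and its red warning color is a lookup against a calibrated set of reachable areas. The tile-based calibration below is an assumption made for illustration; the original only states that reachable positions are calibrated and illegal targets are shown in a conspicuous color.

```python
# Sketch of steps S601/S603: a calibrated set of reachable ground tiles
# stands in for the scene's legality data; positions outside it are
# "illegal transmission areas". The tiling scheme is an assumption.

REACHABLE_TILES = {(0, 0), (0, 1), (1, 0), (1, 1)}  # calibrated offline
TILE_SIZE = 2.0

def is_legal(position):
    """True if the candidate second orientation lies in a reachable tile."""
    tile = (int(position[0] // TILE_SIZE), int(position[1] // TILE_SIZE))
    return tile in REACHABLE_TILES

def curve_color(position):
    """A conspicuous red warns that the character cannot be transmitted."""
    return "blue" if is_legal(position) else "red"

print(curve_color((1.5, 3.0)))  # blue (tile (0, 1) is calibrated reachable)
print(curve_color((9.0, 9.0)))  # red  (tile (4, 4) is not calibrated)
```

Keeping legality as data (a calibrated set) rather than geometry is what lets the parabola land anywhere while the character's destinations stay restricted.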
- after the third indication information for indicating the target displacement is displayed, a second instruction may be received, wherein the second instruction is used to instruct the virtual character to cancel generating the displacement from the first orientation, the displacement including the target displacement; after the second instruction is received, the virtual character is controlled to cancel generating the target displacement from the first orientation.
- because the displacement triggering mechanism is very convenient and flexible, when the user releases the joystick or lifts the finger off the large disc of the touchpad,
- a second instruction is generated, which triggers the displacement cancellation mechanism.
- on the Oculus platform the user releases the joystick, or
- on the Vive platform the user lifts the finger off the large disc trackpad, to generate the second instruction; the displacement selection mechanism is canceled according to the second instruction and the subsequent displacement mechanism is not triggered.
- determining the second orientation that is offset from the first orientation by the target displacement includes: receiving a third instruction, wherein the third instruction is used to indicate the position indicated by the second orientation; and after receiving the third instruction, determining the position indicated by the second orientation.
- a third instruction is generated by the user's operation, and the third instruction is received to indicate the position of the second orientation, thereby determining the position indicated by the second orientation.
- receiving the third instruction includes: acquiring position information of a first operation object in the real scene, where the first operation object is used to adjust the position indicated by the second orientation, and the position information corresponds to the position indicated by the second orientation; and obtaining the third instruction according to the position information.
- the first operation object may be the user's hand; the user can simply move the position of the hand and rotate the angle of the front of the hand to generate the third instruction, realizing wide-range adjustment of the target displacement position according to the third instruction.
- determining the second orientation that is offset from the first orientation by the target displacement includes: receiving a fourth instruction, wherein the fourth instruction is used to indicate the orientation indicated by the second orientation; and after receiving the fourth instruction, determining the orientation indicated by the second orientation.
- a fourth instruction is generated by the user's operation, and the fourth instruction is received to indicate the orientation of the second orientation, thereby determining the orientation indicated by the second orientation.
- receiving the fourth instruction includes: acquiring angle information of a second operation object in the real scene, where the second operation object is used to adjust the orientation indicated by the second orientation, and the angle information corresponds to the orientation indicated by the second orientation; and obtaining the fourth instruction according to the angle information.
- the second operation object may be a handle, and the angle information may be determined by the direction selection mechanism of the handle.
- the 360-degree direction of the joystick is mapped to the horizontal 360 degrees of the position indicated by the second orientation, so the user only needs to conveniently
- rotate the direction of the joystick to determine the orientation indicated by the second orientation.
- the orientation of the virtual character at the second orientation in the virtual reality scene is displayed by an arrow at the end of the curve to visually display the target displacement, determining the position and orientation of the virtual character after the displacement.
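The joystick-to-orientation mapping described above reduces to converting the stick's 2D deflection into a horizontal facing angle. The axis conventions below (x right, y forward, angle measured clockwise from forward) are assumptions for illustration; the original only specifies the 360-to-360-degree mapping.

```python
import math

# Sketch of the direction selection mechanism: the joystick's 360-degree
# deflection maps directly onto the horizontal 360-degree facing of the
# position indicated by the second orientation. Axis conventions are
# illustrative assumptions.

def joystick_to_yaw(stick_x, stick_y):
    """Map a joystick deflection to a horizontal facing angle in degrees."""
    # atan2 gives the deflection angle; normalize to [0, 360).
    angle = math.degrees(math.atan2(stick_x, stick_y))
    return angle % 360.0

print(joystick_to_yaw(0.0, 1.0))   # pushed forward
print(joystick_to_yaw(1.0, 0.0))   # pushed right
print(joystick_to_yaw(0.0, -1.0))  # pulled back
print(joystick_to_yaw(-1.0, 0.0))  # pushed left
```

The same function serves the Vive trackpad by feeding in the touch-point coordinates instead of the stick deflection, which is why both devices can drive the same direction selection mechanism.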
- receiving the first instruction for instructing the virtual character to generate the target displacement from the first orientation comprises: receiving the first instruction through the joystick, and determining the second orientation that is offset from the first orientation by the target displacement comprises: determining the second orientation through the joystick; or receiving the first instruction comprises: receiving the first instruction through the touchpad, and determining the second orientation comprises: determining the second orientation through the touchpad.
- this embodiment is compatible with the current mainstream Oculus and Vive helmets and their corresponding operating handles on the market.
- the operation of this embodiment is also applicable if new hardware devices become available.
- This embodiment provides a method for controlling the displacement of a virtual character in a virtual reality scene.
- the user operation is simple, convenient, and intuitive.
- the preview effect shown when the displacement is selected tells the user the result of the operation very intuitively; by moving the virtual character
- instantly, the vertigo caused by continuous movement in the virtual reality scene is avoided, and through the direction effect, other users in the scene can see the displacement process of the virtual character.
- the operation method adapts to various hardware specifications, the operation is simple and convenient, and it performs well in large-scale multiplayer real-time online scenes.
- FIG. 7 is a flowchart of another method for controlling displacement of a virtual character according to an embodiment of the present application. As shown in FIG. 7, rounded rectangles represent user input, rectangles represent logical units, diamonds represent control-flow selections, and dashed lines distinguish the displacement selection process from the displacement representation process. The method includes the following steps:
- Step S701, the user pushes the joystick or presses the touchpad.
- in the process of performing displacement control of a virtual character, a displacement mechanism is first triggered when the user pushes the joystick or presses the touchpad.
- the triggering of the displacement is based on the physical handle, and different platforms differ slightly: on the Oculus platform, pushing the joystick in any direction triggers the selection of the displacement position; on the Vive platform, pressing the touchpad at any position triggers the displacement selection.
- although the physical buttons of the Oculus platform and the Vive platform are not the same, the design concept is the same: because the orientation of the displacement needs to be selected, physical buttons that can accurately control the displacement direction are required, and the joystick and the touchpad are the best choices.
- Step S702 starting a displacement selection mechanism.
- the displacement selection mechanism is activated.
- Step S703, the position and orientation of the user's hand in space are adjusted.
- after the displacement selection mechanism is activated, the position and orientation of the user's hand in space are adjusted; the user can simply move the position of the hand in space and rotate the angle of the front of the hand to adjust the hand's orientation in space.
- Step S704 adjusting the position of the target displacement.
- Step S705, the joystick direction is adjusted or the touch point position is adjusted.
- the angle is determined by the direction selection mechanism of the handle.
- the 360-degree direction of the joystick is mapped to the horizontal 360-degree direction of the target position; therefore, the user only needs to rotate the direction of the joystick to conveniently determine the target rotation position, thereby determining the direction of the target displacement. Alternatively, the position of the touch point on the touchpad is adjusted to determine the direction of the target displacement.
- step S706 the direction of the target displacement is adjusted.
- Step S707, waiting for the buffer time.
- the displacement triggering mechanism of this embodiment is very convenient and flexible; however, this also brings the possibility of false triggering, so a cancellation mechanism for canceling the target displacement is required.
- the buffer time is awaited to give the user a regret time when selecting the displacement.
- a ray is slowly emitted from the position of the user's hand toward the front of the handle, for example, a slowly extending curve with a blue arrow, which may also be called a parabola; over about a second,
- the curve bends and extends to the ground, and the display effect of the new position indicated by the curve head appears (the cylinder at the end of the curve as shown in FIG. 8).
- the extension of the above curve gives the user the regret time, that is, the buffer time.
- during the buffer time, on the Oculus platform the user may release the joystick, or
- on the Vive platform the user may lift the finger off the large disc trackpad, thereby canceling the displacement selection mechanism and not triggering subsequent displacement mechanisms. If the curve extends to its end and the display effect appears, the displacement display effect is triggered, and
- step S708 is performed; if the buffer time has not elapsed, the curve has not bent to the ground and the display effect of the new position indicated by the curve head has not appeared; in this case, if the Oculus user releases the joystick, or the Vive user lifts the finger off the large disc trackpad, step S716 is performed.
- Step S708, performing a legality check of the target position.
- the legality of the target position of the target displacement is checked.
- some virtual areas are set to be unreachable by the virtual character, but the parabola emitted from the user's hand can reach any virtual area, so it is necessary to impose certain restrictions on such virtual areas.
- the positions that the virtual character can reach in the virtual reality scene are calibrated.
- if the parabola has an intersection point after being emitted, but the virtual character cannot reach it, the display effects of the parabola and the target position can be set to red to remind the user that the virtual character cannot be transmitted there.
- at this time, the displacement selection is canceled and the displacement performance is not triggered. If the target position is checked to be legal, step S709 is performed; if the target position is checked to be illegal, step S716 is performed.
- Step S709, the target displacement is confirmed. If the target displacement is confirmed,
- step S711 is performed; if the target displacement is not confirmed, step S716 is performed.
- Step S710, the joystick is pressed or the touchpad is released.
- the displacement selection confirmation mechanism is also designed to be very convenient: when the joystick is pressed or the touchpad is released,
- the displacement determination mechanism is triggered, and step S709 is performed.
- Step S711, the displacement mechanism is started.
- after the target displacement is confirmed, the displacement mechanism is activated, that is, the virtual character is triggered to perform the displacement operation.
- Step S712, the lens black screen is controlled.
- the virtual character is kept at the current position, that is, remains stationary, and the lens, which is used to display the virtual reality scene, is controlled to slowly black out.
- Step S713, determining the position and direction of the new orientation according to the target displacement.
- the position and orientation of the new orientation at which the virtual character is to arrive are determined based on the target displacement.
- Step S714, the special effect is played.
- Step S715, the lens is controlled to recover from the black screen.
- the effect of point particles rotating around the new orientation and the original orientation appears, and then the particles at the original orientation drift toward the new orientation in a disturbance, as if blown by the wind.
- the effects at the new orientation and the original orientation are used to remind other users that there is movement at these places, and the particle wind direction is used to prompt other users of the change process of the current user's displacement, that is, from which place it came.
- step S716 the displacement selection mechanism is cancelled.
- if the Oculus user releases the joystick, or the Vive user lifts the finger off the large disc touchpad, or the target position is checked to be illegal, or the target displacement is not confirmed, the displacement selection mechanism is canceled.
- the adjustment of position and angle is effective throughout the process; only at the moment the position-and-angle change process ends are the changed position and angle of the displacement judged as to whether they meet the requirements for performing the next step, for example, checking the legality of the target position, confirming the target displacement when the target position is legal, and canceling the displacement selection mechanism when the target position is illegal.
- the instantaneous movement of the virtual character avoids continuous movement; through the direction effect, other players in the scene can see the process of the virtual character's displacement; the operation method adapts to a variety of hardware specifications; and the operation is simple and easy to learn, performing well in large-scale multiplayer real-time online scenes.
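The control flow of FIG. 7 (trigger, select, buffer, legality check, confirm, perform, cancel) can be condensed into a small transition table. This is a sketch only; the state and event names are illustrative, while the transitions themselves follow the steps S701 to S716 described above.

```python
# Condensed sketch of the FIG. 7 flow: trigger -> select -> wait buffer ->
# legality check -> confirm -> perform, with cancellation (S716) reachable
# from release-during-buffer, an illegal target, or a failed confirmation.
# State and event names are illustrative, not from the original.

TRANSITIONS = {
    ("idle",           "push_or_press"):  "selecting",       # S701 -> S702
    ("selecting",      "buffer_elapsed"): "legality_check",  # S707 -> S708
    ("selecting",      "release"):        "cancelled",       # -> S716
    ("legality_check", "legal"):          "await_confirm",   # -> S709
    ("legality_check", "illegal"):        "cancelled",       # -> S716
    ("await_confirm",  "confirm"):        "performing",      # S710 -> S711
    ("await_confirm",  "no_confirm"):     "cancelled",       # -> S716
}

def run(events):
    """Replay a sequence of user/system events through the flow."""
    state = "idle"
    for event in events:
        state = TRANSITIONS.get((state, event), state)
    return state

print(run(["push_or_press", "buffer_elapsed", "legal", "confirm"]))
# performing
print(run(["push_or_press", "release"]))
# cancelled
```

Expressing the flow as a table makes the claim in the text concrete: every cancel path (early release, illegal target, no confirmation) converges on the same S716 cancellation state, and only the single confirmed path reaches the displacement performance.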
- the application environment of the embodiment of the present application may be, but is not limited to, the application environment described in the foregoing embodiments.
- An embodiment of the present application provides an optional specific application for implementing the virtual character displacement control method, specifically illustrated with a virtual reality scenario of a game application.
- the design of this embodiment is mainly divided into two parts: one part determines the displacement of the virtual character in the virtual reality scene, and the other part represents the displacement of the virtual character in the virtual reality scene.
- before determining the displacement of the virtual character in the virtual reality scene, the displacement mechanism is first triggered.
- the trigger mechanism of the displacement is based on the physical handle, and different platforms differ slightly. For example, on the Oculus platform, pushing the joystick in any direction triggers the selection of the displacement position; on the Vive platform, pressing the touchpad at any position triggers the displacement selection step.
- although the physical buttons of the Oculus platform and the Vive platform are not the same, the design concept is the same: because the orientation of the displacement needs to be selected, a physical button that can precisely control the direction is required, and the joystick and the touchpad are the best choices.
- because the displacement trigger mechanism is very convenient and flexible, there is a possibility of falsely triggering a displacement; therefore, a displacement cancellation mechanism is needed, which provides the game player with a buffer time for determining the displacement of the virtual character in the virtual reality scene.
- the special effect in the virtual reality scene is a ray slowly emitted from the position of the player's hand toward the front of the handle; for example, the curve with an arrow shown in FIG. 8, which is a schematic diagram of a displacement special effect display according to an embodiment of the present application.
- the curve with an arrow may have a color, for example blue, and the curve bends and extends to the ground at about the end of the buffer time.
- the extension of this curve gives the game player regret time, that is, it provides the game player with a buffer time for determining the displacement.
- during this buffer time, on the Oculus platform the game player may release the joystick, or
- on the Vive platform the game player may lift the finger off the large disc trackpad, thereby canceling the displacement selection mechanism and not triggering the subsequent displacement mechanism. If the display effect in which the curve extends to the end position appears, as shown by the cylinder at the end of the curve in FIG. 8, the displacement scene in the virtual reality scene is triggered.
- the displacement selection confirmation mechanism, that is, the displacement determination mechanism, is also designed to be very convenient: when the game player performs the confirmation operation, the displacement determination mechanism is triggered.
- the cylindrical effect at the end of the curve shown in FIG. 8 is used to show the game player the position and orientation of the determined displacement in the virtual reality scene.
- the target position of the displacement is determined by the position where a parabola, emitted from the player's hand in the direction the hand is pointing at a certain speed, intersects the ground. Therefore, the game player only needs to move the position of the hand and rotate the angle at which the hand points to adjust the target position of the displacement over a wide range. The orientation of the displacement is determined by the direction selection mechanism of the handle; for example, on the Oculus platform, the 360-degree direction of the joystick maps to the horizontal 360 degrees of the target orientation, so the game player only needs to rotate the joystick to determine the target rotational orientation.
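The targeting step above — a parabola cast from the hand to find the ground point, plus a joystick direction mapped to a horizontal yaw — can be sketched as a small computation. This is an illustrative sketch only, not the patent's implementation; the gravity constant, the flat ground plane at y = 0, and the function names are assumptions.

```python
import math

GRAVITY = 9.8  # assumed downward acceleration for the illustrative parabola

def parabola_ground_hit(hand_pos, hand_dir, speed):
    """Return the (x, z) ground point where a parabola launched from the
    hand position along hand_dir at the given speed meets the plane y = 0."""
    x0, y0, z0 = hand_pos
    dx, dy, dz = hand_dir
    # Solve y0 + dy*speed*t - 0.5*GRAVITY*t^2 = 0 for the positive root t.
    a, b, c = -0.5 * GRAVITY, dy * speed, y0
    t = (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)
    return (x0 + dx * speed * t, z0 + dz * speed * t)

def joystick_to_yaw(stick_x, stick_y):
    """Map the joystick's 360-degree direction to the horizontal yaw
    (degrees) of the target orientation."""
    return math.degrees(math.atan2(stick_x, stick_y)) % 360.0
```

Moving the hand changes `hand_pos` and `hand_dir` and thus shifts the landing point over a wide range, while rotating the stick only changes the yaw returned by `joystick_to_yaw`.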
- a prominent arrow can be used to clearly show the target rotational orientation. This embodiment thus uses the displacement special effect to visualize, for the game player, the displacement of the virtual character, allowing the game player to know where the virtual character will arrive and in which direction it will face.
- the reachable area of the virtual reality scene is calibrated. For a non-reachable area, even if the emitted parabola intersects a target position within that area, because the virtual character cannot reach it, the parabola and the target position display effect are set to a striking color, for example red, to remind the game player that the area cannot be reached. At this time, if the game player releases the joystick or lifts the finger off the large disc touchpad, the selected displacement is canceled and the displacement performance is not triggered.
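The reachable-area calibration and the red warning colour described above might be sketched like this; the rectangle-based calibration, the colour names, and the function names are assumptions for illustration, not the patent's design.

```python
# Calibrated reachable area as axis-aligned rectangles on the ground plane;
# the single rectangle below is an illustrative placeholder.
REACHABLE_RECTS = [(-10.0, -10.0, 10.0, 10.0)]  # (min_x, min_z, max_x, max_z)

def is_reachable(x, z):
    """True when the ground point lies inside a calibrated reachable rectangle."""
    return any(x0 <= x <= x1 and z0 <= z <= z1
               for x0, z0, x1, z1 in REACHABLE_RECTS)

def effect_color(x, z):
    """Blue parabola/target effect for a legal position, striking red otherwise."""
    return "blue" if is_reachable(x, z) else "red"
```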
- the game player's operation is as follows: on the Oculus platform, the game player pushes the joystick to trigger the displacement selection mechanism, then adjusts the position and orientation of the hand in space to determine the position of the target displacement, then adjusts the joystick direction to determine the orientation of the target displacement, and finally presses the joystick to confirm and start the displacement; on the Vive platform, the game player presses the touchpad to trigger the displacement selection mechanism, then adjusts the position and orientation of the hand in space to determine the position of the target displacement, then adjusts the position of the touch point on the touchpad to determine the orientation of the target displacement, and finally lifts the hand away from the touchpad to confirm and start the displacement.
- the whole operation process can be performed with one hand, which is very convenient and fast; in this way the player determines where, and in which orientation, the virtual character is to be displaced.
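The one-handed flow above (trigger the selection, adjust, then confirm or cancel) can be summarized as a small state machine. This is a hedged sketch; the state and event names are assumptions, with the events mapping per platform to the joystick (Oculus) or touchpad (Vive) actions described above.

```python
class DisplacementFlow:
    """Minimal state machine for the one-handed displacement selection flow."""

    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        if self.state == "idle" and event == "trigger":
            self.state = "selecting"    # curve special effect starts
        elif self.state == "selecting" and event == "confirm":
            self.state = "displacing"   # displacement performance starts
        elif self.state == "selecting" and event == "cancel":
            self.state = "idle"         # selection canceled, nothing triggered
        return self.state
```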
- the virtual character corresponding to the game player is triggered to perform the displacement operation, and then the following steps are triggered:
- for the game player, the corresponding virtual character first stays at its current position in the virtual reality scene, that is, it remains at rest; the camera then slowly fades to a black screen, after which the lens slowly resumes display, with the virtual character placed at the previously selected new position and orientation. During the slow recovery of the lens, the virtual character stays at the new position and orientation, that is, it remains stationary. In this process, the game player does not feel the acceleration and speed that continuous movement of the virtual character would cause, and no feeling of dizziness is produced.
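The fade-to-black transition above can be sketched as a per-frame sequence of (screen opacity, character position) states: the character stays at the old position while the screen darkens, and is already at the new position while the screen brightens. The frame count and linear fade are assumptions for illustration.

```python
def blink_teleport(old_pos, new_pos, frames=10):
    """Yield (screen_opacity, character_position) per frame for the
    fade-out / relocate / fade-in displacement performance."""
    # Character stays put while the screen fades to black...
    for i in range(1, frames + 1):
        yield (1.0 - i / frames, old_pos)
    # ...and is already at the new position while the screen fades back.
    for i in range(1, frames + 1):
        yield (i / frames, new_pos)
```

At no frame does the sequence emit a position between `old_pos` and `new_pos`, which mirrors the requirement that the virtual character does not appear between the first and second orientations.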
- FIG. 9 is a schematic diagram of another displacement special effect display according to an embodiment of the present application, in which the particle points at the original position drift to the new position, as if blown by the wind.
- the particle effects at the new position and at the original position are used to remind other game players that there is movement at this place, and the particle wind direction prompts other game players as to the change process of the current virtual character's displacement, that is, from which place it departed and at which place it arrived.
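The wind-like particle hint could be modeled minimally as particles drifting from the original position toward the new one, so the drift direction itself tells other players where the displacement went; the linear interpolation below is an assumption for illustration.

```python
def particle_drift(origin, target, t):
    """Position of a hint particle at time t in [0, 1] as it drifts,
    wind-like, from the original (x, z) position to the new one."""
    ox, oz = origin
    tx, tz = target
    return (ox + (tx - ox) * t, oz + (tz - oz) * t)
```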
- the operation of the game player is simple, convenient, and intuitive.
- the selected displacement can be previewed, directly showing the game player the result of the operation, and the performance of the entire displacement makes the game player corresponding to the virtual character feel comfortable without causing dizziness; in addition, during the displacement performance, other game players also understand the displacement process of the virtual character, so the scheme is also applicable to large multiplayer scenes.
- the special effect display of the above displacement can be adjusted, but the purpose and effect of the displayed special effects are consistent with those in the above embodiments: the displayed special effects provide the game player with buffer time before the displacement is confirmed, let the game player know the position and orientation the virtual character will reach, and the special effects at the new position and the original position prompt other game players that there is movement, indicating the change process of the virtual character's displacement, that is, from which place it departed and at which place it arrived.
- this embodiment is implemented on and compatible with the two Oculus and Vive helmets and corresponding operating handles currently on the market; if new hardware devices appear, the displacement control method for a virtual character of the embodiment of the present application is also applicable to their operation.
- the technical solution of the embodiments of the present application may, in essence, be embodied in the form of a software product stored in a non-transitory computer readable storage medium (such as a ROM/RAM, a magnetic disk, or a CD), including a plurality of machine readable instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the methods described in the various embodiments of the present application.
- FIG. 10a is a schematic diagram of a displacement control device for a virtual character according to an embodiment of the present application.
- the apparatus may include a receiving unit 10, a determining unit 20, and a control unit 30.
- the receiving unit 10 is configured to receive, in the virtual reality scenario, a first instruction for instructing the virtual character to generate a target displacement from the first orientation.
- the determining unit 20 is configured to determine, after the first instruction is received, a second orientation that is offset from the first orientation by the target displacement.
- the control unit 30 is configured to control the virtual character to disappear in the first orientation, and control the virtual character to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
- the control unit 30 includes: a first control module and a second control module.
- the first control module is configured to control the virtual character to stay in the first orientation after determining the second orientation, and control the preset lens black screen, wherein the preset lens is used to display the screen of the virtual reality scene; the second control module For controlling the preset lens to resume display after controlling the preset lens black screen, and controlling the virtual character to stay in the second orientation.
- the first control module is configured to control the gradual black screen of the preset lens, wherein the virtual character stays in the first orientation during the gradual blackout of the preset lens; and the second control module is configured to control the preset The lens gradually resumes display, wherein the virtual character stays in the second orientation while the preset lens is gradually restored to display.
- control unit 30 includes: a third control module and a fourth control module.
- the third control module is configured to: after determining the second orientation, control the virtual character to disappear in the first orientation, and display the first indication information in the first orientation, where the first indication information is used to indicate that there is movement at the first orientation in the virtual reality scene;
- the fourth control module is configured to control the virtual character to appear in the second orientation, and display the second indication information in the second orientation, where the second indication information is used to indicate that there is movement at the second orientation in the virtual reality scene.
- the third control module includes: a first display sub-module, configured to display the moving first point particles in the first orientation, wherein the first indication information includes the moving first point particles;
- the fourth control module includes: a second display sub-module, configured to display the moving second point particles in the second orientation, wherein the second indication information includes the moving second point particles, and the direction of motion from the first point particles to the second point particles is used to represent the change process of the virtual character from the first orientation to the second orientation.
- the apparatus further includes: a display unit 40, configured to display third indication information for indicating the target displacement before the second orientation that is offset from the first orientation by the target displacement is determined;
- the determining unit 20 includes: a first determining module, configured to determine the second orientation according to the third indication information.
- the display unit 40 includes: a display module for displaying a curve for indicating a target displacement, wherein the third indication information includes a curve.
- the first determining module includes: a first determining sub-module and a second determining sub-module.
- the first determining sub-module is configured to determine an intersection position of the curve and a preset plane, wherein the preset plane is used to support the virtual character; and the second determining sub-module is configured to determine the area within a preset distance range of the intersection position as the position indicated by the second orientation.
- the second determining sub-module is further configured to detect whether the second orientation is legal in the virtual reality scene; when it is detected that the second orientation is legal in the virtual reality scene, determine the area within the preset distance range of the intersection position as the position indicated by the second orientation; and when it is detected that the second orientation is illegal in the virtual reality scene, display preset identifier information, wherein the preset identifier information is used to indicate that the second orientation is not legal in the virtual reality scene.
- the apparatus further includes: a first receiving unit 50 and a canceling unit 60.
- the first receiving unit 50 is configured to receive, after the third indication information for indicating the target displacement is displayed, a second instruction, where the second instruction is used to instruct the virtual character to cancel the displacement from the first orientation, and the displacement includes the target displacement; the canceling unit 60 is configured to control the virtual character to cancel the target displacement from the first orientation after the second instruction is received.
- the determining unit 20 includes: a first receiving module, configured to receive a third instruction, where the third instruction is used to indicate the position indicated by the second orientation; and a second determining module, configured to determine, after the third instruction is received, the position indicated by the second orientation.
- the first receiving module includes: a first acquiring sub-module, configured to acquire location information of the first operating object in a real scene, where the first operating object is used to adjust a position indicated by the second orientation, The location information corresponds to the location indicated by the second orientation; the second acquisition submodule is configured to acquire the third instruction according to the location information.
- the determining unit 20 includes: a second receiving module, configured to receive a fourth instruction, where the fourth instruction is used to indicate the orientation indicated by the second orientation; and a third determining module, configured to determine, after the fourth instruction is received, the orientation indicated by the second orientation.
- the second receiving module includes: a third acquiring sub-module, configured to acquire angle information of the second operating object in the real scene, where the second operating object is used to adjust the orientation of the second orientation indication, The angle information corresponds to the orientation of the second orientation indication; the fourth acquisition submodule is configured to acquire the fourth instruction according to the angle information.
- the receiving unit 10 includes: a third receiving module, configured to receive the first instruction by the joystick; the determining unit 20 includes: a fourth determining module, configured to determine the second orientation by the joystick; or
- the receiving unit 10 includes: a fourth receiving module, configured to receive the first instruction by using the touch panel; and the determining unit 20 includes: a fifth determining module, configured to determine the second orientation by using the touch panel.
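The cooperation of the receiving, determining, and control units described above can be sketched as follows; the method names and data shapes are assumptions for illustration, not the patent's API.

```python
class ReceivingUnit:
    """Receives the first instruction carrying the target displacement."""
    def receive_first_instruction(self, instruction):
        return instruction["target_displacement"]

class DeterminingUnit:
    """Determines the second orientation offset from the first by the displacement."""
    def determine_second_orientation(self, first_orientation, displacement):
        x, z = first_orientation
        dx, dz = displacement
        return (x + dx, z + dz)

class ControlUnit:
    """Moves the character instantaneously: it disappears at the first
    orientation and appears at the second, never appearing in between."""
    def teleport(self, character, second_orientation):
        character["position"] = second_orientation
        return character
```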
- the receiving unit 10 in this embodiment may be used to perform step S202 in the foregoing method embodiment of the present application.
- the determining unit 20 in this embodiment may be used to perform step S204 in the foregoing method embodiment of the present application.
- the control unit 30 in this embodiment may be used to perform step S206 in the foregoing method embodiment of the present application.
- the receiving unit 10 receives, in the virtual reality scenario, a first instruction for instructing the virtual character to generate a target displacement from the first orientation; after the first instruction is received, the determining unit 20 determines a second orientation that is offset from the first orientation by the target displacement; the control unit 30 then controls the virtual character to disappear in the first orientation and appear in the second orientation, and the virtual character does not appear between the first orientation and the second orientation. That is, in the virtual reality scene, the virtual character is controlled to disappear in the first orientation and appear in the second orientation, replacing continuous movement with instantaneous movement of the virtual character, thereby achieving the technical effect of avoiding the dizziness that continuous movement of the virtual character easily causes the user, and further solving the technical problem in the related art that continuous movement of the virtual character is likely to cause user dizziness.
- the examples and application scenarios implemented by the above-mentioned units and modules are the same as those of the steps in the corresponding method embodiments, but are not limited to the contents disclosed in the foregoing method embodiments.
- the foregoing module may be implemented in a hardware environment as shown in FIG. 1 as part of the device, and may be implemented by software or by hardware, where the hardware environment includes a network environment.
- the embodiment of the present application further provides a computing device, such as a server or a terminal, for implementing the above-described displacement control method for a virtual character.
- FIG. 11 is a structural block diagram of a terminal according to an embodiment of the present application.
- the terminal may include one or more processors 111 (only one is shown in the figure), a memory 113, and a transmission device 115.
- the terminal may further include an input/output device 117.
- the memory 113 can be used to store software programs and machine readable instruction modules, such as the program instructions/modules corresponding to the displacement control method and apparatus for a virtual character in the embodiments of the present application; the processor 111 runs the software programs and modules stored in the memory 113, thereby performing various function applications and data processing, that is, implementing the above-described displacement control method for a virtual character.
- Memory 113 may include high speed random access memory, and may also include non-volatile memory such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory.
- memory 113 may further include memory remotely located relative to processor 111, which may be connected to the terminal over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
- the transmission device 115 described above is used to receive or transmit data via a network, and can also be used for data transmission between the processor and the memory. Specific examples of the above network may include a wired network and a wireless network.
- the transmission device 115 includes a Network Interface Controller (NIC) that can be connected to other network devices and routers via a network cable to communicate with the Internet or a local area network.
- transmission device 115 is a Radio Frequency (RF) module for communicating wirelessly with the Internet.
- the memory 113 is used to store an application, that is, machine readable instructions.
- the processor 111 can call and execute the application stored in the memory 113 through the transmission device 115 to perform the following steps:
- control the virtual character to disappear in the first orientation, and control the virtual character to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
- the processor 111 is further configured to: after determining the second orientation, control the virtual character to stay in the first orientation, and control the preset lens black screen, wherein the preset lens is used to display the screen of the virtual reality scene; After the preset lens is black, the preset lens is controlled to resume display, and the virtual character is controlled to stay in the second orientation.
- the processor 111 is further configured to: control the gradual black screen of the preset lens, wherein the virtual character stays in the first orientation during the gradual blackout of the preset lens; and control the preset lens to gradually resume display, wherein the virtual character stays in the second orientation while the preset lens gradually resumes display.
- the processor 111 is further configured to: after determining the second orientation, control the virtual character to disappear in the first orientation, and display the first indication information in the first orientation, wherein the first indication information is used to indicate that there is movement at the first orientation in the virtual reality scene; and control the virtual character to appear in the second orientation, and display the second indication information in the second orientation, wherein the second indication information is used to indicate that there is movement at the second orientation in the virtual reality scene.
- the processor 111 is further configured to: display the moving first point particles in the first orientation, wherein the first indication information includes the moving first point particles; and display the moving second point particles in the second orientation, wherein the second indication information includes the moving second point particles, and the direction of motion from the first point particles to the second point particles is used to indicate the change process of the virtual character from the first orientation to the second orientation.
- the processor 111 is further configured to: display third indication information for indicating the target displacement before determining the second orientation that is offset from the first orientation by the target displacement; and determine the second orientation according to the third indication information.
- the processor 111 is further configured to: display a curve for indicating the target displacement, wherein the third indication information includes the curve; determine an intersection position of the curve with a preset plane, wherein the preset plane is used to support the virtual character; and determine the area within a preset distance range from the intersection position as the position indicated by the second orientation.
- the processor 111 is further configured to: determine whether the second orientation is legal in the virtual reality scene; if it is detected that the second orientation is legal in the virtual reality scene, determine an area within a preset distance range from the intersecting position as the first The position of the two-direction indication; if it is detected that the second orientation is invalid in the virtual reality scene, the preset identification information is displayed, wherein the preset identification information is used to indicate that the second orientation is illegal in the virtual reality scene.
- the processor 111 is further configured to: after displaying the third indication information for indicating the target displacement, receive a second instruction, wherein the second instruction is used to instruct the virtual character to cancel the displacement from the first orientation, and the displacement includes the target displacement; and after receiving the second instruction, control the virtual character to cancel the target displacement from the first orientation.
- the processor 111 is further configured to: receive a third instruction, wherein the third instruction is used to indicate a location of the second orientation indication; and after receiving the third instruction, determine a location of the second orientation indication.
- the processor 111 is further configured to: obtain location information of the first operation object in the real scene, where the first operation object is used to adjust the position indicated by the second orientation, the location information and the location indicated by the second orientation Corresponding; obtaining a third instruction according to the location information.
- the processor 111 is further configured to: receive a fourth instruction, wherein the fourth instruction is used to indicate the orientation of the second orientation indication; after receiving the fourth instruction, determine the orientation of the second orientation indication.
- the processor 111 is further configured to: obtain angle information of the second operation object in the real scene, where the second operation object is used to adjust the orientation of the second orientation indication, the angle information and the orientation of the second orientation indication Corresponding; obtaining the fourth instruction according to the angle information.
- the embodiment of the present application provides a scheme for displacement control of a virtual character, which solves the technical problem that continuous movement is likely to cause user dizziness.
- FIG. 11 is only schematic, and the terminal can be a terminal device such as a smart phone (such as an Android mobile phone or an iOS mobile phone), a tablet computer, a palm computer, or a Mobile Internet Device (MID) such as a PAD.
- FIG. 11 does not limit the structure of the above electronic device.
- the terminal may also include more or less components (such as a network interface, display device, etc.) than shown in FIG. 11, or have a different configuration than that shown in FIG.
- all or part of the steps of the above methods may be completed by a program instructing the relevant hardware of the terminal device, and the program may be stored in a computer readable storage medium, which may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk, and the like.
- Embodiments of the present application also provide a non-transitory computer readable storage medium.
- the foregoing storage medium may be used to store program code of a displacement control method of a virtual character provided by the foregoing method embodiment.
- the foregoing storage medium may be located on at least one of the plurality of network devices in the network shown in the foregoing embodiment.
- the storage medium is arranged to store program code for performing the following steps:
- control the virtual character to disappear in the first orientation, and control the virtual character to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
- the storage medium is further configured to store program code for performing the following steps: after determining the second orientation, controlling the virtual character to stay in the first orientation and controlling the preset shot black screen, wherein the preset The lens is used to display the picture of the virtual reality scene; after controlling the black screen of the preset lens, the preset lens is controlled to resume display, and the virtual character is controlled to stay in the second orientation.
- the storage medium is further configured to store program code for performing a step of: controlling the gradual blackout of the preset lens, wherein the avatar stays in the first orientation during the gradual blackout of the preset lens Controlling the preset lens gradually resumes display, wherein the virtual character stays in the second orientation while the preset lens is gradually restored to display.
- the storage medium is further configured to store program code for performing the following steps: after determining the second orientation, controlling the virtual character to disappear in the first orientation, and displaying the first indication information in the first orientation, wherein the first indication information is used to indicate that there is movement at the first orientation in the virtual reality scene; and controlling the virtual character to appear in the second orientation, and displaying the second indication information in the second orientation, wherein the second indication information is used to indicate that there is movement at the second orientation in the virtual reality scene.
- the storage medium is further configured to store program code for performing the step of displaying the first point particle moving in the first orientation, wherein the first indication information comprises the first point particle moving Displaying the moving second point particle in the second orientation, wherein the second indication information comprises the moving second point particle, and the moving direction of the first point particle to the second point particle is used to indicate the virtual character from the first orientation The process of change to the second orientation.
- the storage medium is further configured to store program code for performing the following steps: displaying third indication information for indicating the target displacement before determining the second orientation that is offset from the first orientation by the target displacement; and determining the second orientation according to the third indication information.
- the storage medium is further configured to store program code for performing the following steps: displaying a curve for indicating the target displacement, wherein the third indication information includes the curve; determining an intersection position of the curve with a preset plane, wherein the preset plane is used to support the virtual character; and determining the area within a preset distance range from the intersection position as the position indicated by the second orientation.
- the storage medium is further configured to store program code for performing the steps of: detecting whether the second orientation is legal in the virtual reality scene; if detecting that the second orientation is legal in the virtual reality scene, the distance is The area within the preset distance range of the intersection position is determined as the position indicated by the second orientation; if it is detected that the second orientation is invalid in the virtual reality scene, the preset identification information is displayed, wherein the preset identification information is used to indicate the second orientation Not legal in virtual reality scenes.
- the storage medium is further configured to store program code for performing the following steps: after displaying the third indication information for indicating the target displacement, receiving the second instruction, wherein the second instruction is for indicating The virtual character cancels the displacement from the first orientation, wherein the displacement includes the target displacement; after receiving the second instruction, the control virtual character cancels the target displacement from the first orientation.
- the storage medium is further configured to store program code for: receiving a third instruction, wherein the third instruction is for indicating a location of the second orientation indication; after receiving the third instruction , determining the position of the second orientation indication.
- the storage medium is further configured to store program code for performing the following steps: acquiring location information of the first operation object in a real scene, wherein the first operation object is for adjusting the second orientation indication a location, the location information corresponding to the location indicated by the second orientation; obtaining a third instruction based on the location information.
- the storage medium is further configured to store program code for: receiving a fourth instruction, wherein the fourth instruction is for indicating an orientation of the second orientation indication; after receiving the fourth instruction , determining the orientation of the second orientation indication.
- the storage medium is further configured to store program code for performing the following steps: acquiring angle information of the second operation object in the real scene, wherein the second operation object is for adjusting the second orientation indication In the orientation, the angle information corresponds to the orientation of the second orientation indication; the fourth instruction is obtained according to the angle information.
- the foregoing storage medium may include, but is not limited to, various media that can store program code, such as a USB flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a mobile hard disk, a magnetic disk, or an optical disk.
- ROM Read-Only Memory
- RAM Random Access Memory
- mobile hard disk a magnetic disk
- magnetic disk a magnetic disk
- optical disk a variety of media that can store program code.
- the integrated unit in the above embodiments, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in the above-described computer-readable storage medium.
- the technical solutions of the embodiments of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product, and the computer software product is stored in the storage medium.
- a plurality of machine-readable instructions are included to cause one or more computer devices (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the various embodiments of the present application.
- the disclosed client may be implemented in other manners.
- the device embodiments described above are merely illustrative.
- the division of units is merely a division by logical function; in actual implementation there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
- the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or in other forms.
- the units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
- each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
- the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
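The instruction handling described in the storage-medium steps above (a first instruction requests a target displacement, a second instruction cancels it, and the actual move is masked by blacking out and restoring the preset lens so the character never appears between the two orientations) can be sketched as follows. This is an illustrative sketch only; the class and method names are assumptions, not terms from the application:

```python
class TeleportController:
    """Sketch of the claimed displacement control: the character disappears
    at the first orientation and reappears at the second, with a lens
    blackout covering the transition (no movement in between)."""

    def __init__(self, character_pos):
        self.pos = character_pos          # current (first) orientation
        self.pending_target = None        # second orientation, once chosen

    def on_first_instruction(self, target_pos):
        # First instruction: the virtual character should produce a
        # target displacement from the first orientation.
        self.pending_target = target_pos

    def on_second_instruction(self):
        # Second instruction: cancel producing the displacement.
        self.pending_target = None

    def commit(self, fade):
        # Fade the preset lens to black while the character stays at the
        # first orientation, reposition, then resume display with the
        # character at the second orientation.
        if self.pending_target is None:
            return False
        fade("out")                       # lens gradually turns to a black screen
        self.pos = self.pending_target    # instantaneous reposition
        fade("in")                        # lens gradually resumes display
        self.pending_target = None
        return True
```

A caller would wire `fade` to whatever camera-fade facility the rendering engine provides; the two-phase fade is what prevents the user from ever seeing the character between the first and second orientations.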
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Architecture (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims (29)
- A method for controlling displacement of a virtual character, applied to a computing device, comprising: in a virtual reality scene, receiving a first instruction, wherein the first instruction is used to instruct a virtual character to produce a target displacement from a first orientation; after receiving the first instruction, determining a second orientation separated from the first orientation by the target displacement; and controlling the virtual character to disappear at the first orientation and controlling the virtual character to appear at the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
- The method according to claim 1, wherein controlling the virtual character to disappear at the first orientation and controlling the virtual character to appear at the second orientation comprises: after determining the second orientation, controlling the virtual character to stay at the first orientation and controlling a preset lens to turn to a black screen, wherein the preset lens is used to display a picture of the virtual reality scene; and after controlling the preset lens to turn to a black screen, controlling the preset lens to resume display and controlling the virtual character to stay at the second orientation.
- The method according to claim 2, wherein controlling the preset lens to turn to a black screen comprises: controlling the preset lens to gradually turn to a black screen, wherein while the preset lens gradually turns to a black screen, the virtual character is kept at the first orientation; and controlling the preset lens to resume display comprises: controlling the preset lens to gradually resume display, wherein while the preset lens gradually resumes display, the virtual character is kept at the second orientation.
- The method according to claim 1, wherein controlling the virtual character to disappear at the first orientation and controlling the virtual character to appear at the second orientation comprises: after determining the second orientation, controlling the virtual character to disappear at the first orientation and displaying first indication information at the first orientation, wherein the first indication information is used to indicate that there is activity at the first orientation in the virtual reality scene; and controlling the virtual character to appear at the second orientation and displaying second indication information at the second orientation, wherein the second indication information is used to indicate that there is activity at the second orientation in the virtual reality scene.
- The method according to claim 4, wherein displaying the first indication information at the first orientation comprises: displaying moving first point particles at the first orientation, wherein the first indication information includes the moving first point particles; and displaying the second indication information at the second orientation comprises: displaying moving second point particles at the second orientation, wherein the second indication information includes the moving second point particles, and the direction of movement from the first point particles to the second point particles is used to represent the process of the virtual character changing from the first orientation to the second orientation.
- The method according to claim 1, wherein before determining the second orientation separated from the first orientation by the target displacement, the method further comprises: displaying third indication information used to indicate the target displacement; and determining the second orientation separated from the first orientation by the target displacement comprises: determining the second orientation according to the third indication information.
- The method according to claim 6, wherein displaying the third indication information used to indicate the target displacement comprises: displaying a curve used to indicate the target displacement, wherein the third indication information includes the curve; and determining the second orientation according to the third indication information comprises: determining an intersection position of the curve and a preset plane, wherein the preset plane is used to support the virtual character; and determining a region within a preset distance range of the intersection position as the location indicated by the second orientation.
- The method according to claim 7, wherein determining a region within a preset distance range of the intersection position as the location indicated by the second orientation comprises: detecting whether the second orientation is valid in the virtual reality scene; if it is detected that the second orientation is valid in the virtual reality scene, determining the region within the preset distance range of the intersection position as the location indicated by the second orientation; and if it is detected that the second orientation is not valid in the virtual reality scene, displaying preset identification information, wherein the preset identification information is used to indicate that the second orientation is not valid in the virtual reality scene.
- The method according to claim 6, wherein after displaying the third indication information used to indicate the target displacement, the method further comprises: receiving a second instruction, wherein the second instruction is used to instruct the virtual character to cancel producing a displacement from the first orientation, the displacement including the target displacement; and after receiving the second instruction, controlling the virtual character to cancel producing the target displacement from the first orientation.
- The method according to claim 1, wherein determining the second orientation separated from the first orientation by the target displacement comprises: receiving a third instruction, wherein the third instruction is used to indicate the location indicated by the second orientation; and after receiving the third instruction, determining the location indicated by the second orientation.
- The method according to claim 10, wherein receiving the third instruction comprises: acquiring location information of a first operation object in a real scene, wherein the first operation object is used to adjust the location indicated by the second orientation, and the location information corresponds to the location indicated by the second orientation; and obtaining the third instruction according to the location information.
- The method according to claim 1, wherein determining the second orientation separated from the first orientation by the target displacement comprises: receiving a fourth instruction, wherein the fourth instruction is used to indicate the facing direction indicated by the second orientation; and after receiving the fourth instruction, determining the facing direction indicated by the second orientation.
- The method according to claim 12, wherein receiving the fourth instruction comprises: acquiring angle information of a second operation object in a real scene, wherein the second operation object is used to adjust the facing direction indicated by the second orientation, and the angle information corresponds to the facing direction indicated by the second orientation; and obtaining the fourth instruction according to the angle information.
- The method according to any one of claims 1 to 13, wherein receiving the first instruction comprises: receiving the first instruction through a joystick, and determining the second orientation separated from the first orientation by the target displacement comprises: determining the second orientation through the joystick; or receiving the first instruction comprises: receiving the first instruction through a touchpad, and determining the second orientation separated from the first orientation by the target displacement comprises: determining the second orientation through the touchpad.
- An apparatus for controlling displacement of a virtual character, comprising: a processor and a memory connected to the processor, wherein the memory stores machine-readable instructions executable by the processor, and the processor executes the machine-readable instructions to perform the following operations: in a virtual reality scene, receiving a first instruction, wherein the first instruction is used to instruct a virtual character to produce a target displacement from a first orientation; after receiving the first instruction, determining a second orientation separated from the first orientation by the target displacement; and controlling the virtual character to disappear at the first orientation and controlling the virtual character to appear at the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
- The apparatus according to claim 15, wherein the processor executes the machine-readable instructions to perform the following operations: after determining the second orientation, controlling the virtual character to stay at the first orientation and controlling a preset lens to turn to a black screen, wherein the preset lens is used to display a picture of the virtual reality scene; and after controlling the preset lens to turn to a black screen, controlling the preset lens to resume display and controlling the virtual character to stay at the second orientation.
- The apparatus according to claim 16, wherein the processor executes the machine-readable instructions to perform the following operations: controlling the preset lens to gradually turn to a black screen, wherein while the preset lens gradually turns to a black screen, the virtual character is kept at the first orientation; and controlling the preset lens to gradually resume display, wherein while the preset lens gradually resumes display, the virtual character is kept at the second orientation.
- The apparatus according to claim 15, wherein the processor executes the machine-readable instructions to perform the following operations: after determining the second orientation, controlling the virtual character to disappear at the first orientation and displaying first indication information at the first orientation, wherein the first indication information is used to indicate that there is activity at the first orientation in the virtual reality scene; and controlling the virtual character to appear at the second orientation and displaying second indication information at the second orientation, wherein the second indication information is used to indicate that there is activity at the second orientation in the virtual reality scene.
- The apparatus according to claim 18, wherein the processor executes the machine-readable instructions to perform the following operations: displaying moving first point particles at the first orientation, wherein the first indication information includes the moving first point particles; and displaying moving second point particles at the second orientation, wherein the second indication information includes the moving second point particles, and the direction of movement from the first point particles to the second point particles is used to represent the process of the virtual character changing from the first orientation to the second orientation.
- The apparatus according to claim 15, wherein before determining the second orientation separated from the first orientation by the target displacement, the processor executes the machine-readable instructions to perform the following operations: displaying third indication information used to indicate the target displacement; and determining the second orientation according to the third indication information.
- The apparatus according to claim 20, wherein the processor executes the machine-readable instructions to perform the following operations: displaying a curve used to indicate the target displacement, wherein the third indication information includes the curve; determining an intersection position of the curve and a preset plane, wherein the preset plane is used to support the virtual character; and determining a region within a preset distance range of the intersection position as the location indicated by the second orientation.
- The apparatus according to claim 21, wherein the processor executes the machine-readable instructions to perform the following operations: detecting whether the second orientation is valid in the virtual reality scene; if it is detected that the second orientation is valid in the virtual reality scene, determining the region within the preset distance range of the intersection position as the location indicated by the second orientation; and if it is detected that the second orientation is not valid in the virtual reality scene, displaying preset identification information, wherein the preset identification information is used to indicate that the second orientation is not valid in the virtual reality scene.
- The apparatus according to claim 20, wherein after displaying the third indication information used to indicate the target displacement, the processor executes the machine-readable instructions to perform the following operations: receiving a second instruction, wherein the second instruction is used to instruct the virtual character to cancel producing a displacement from the first orientation, the displacement including the target displacement; and after receiving the second instruction, controlling the virtual character to cancel producing the target displacement from the first orientation.
- The apparatus according to claim 15, wherein the processor executes the machine-readable instructions to perform the following operations: receiving a third instruction, wherein the third instruction is used to indicate the location indicated by the second orientation; and after receiving the third instruction, determining the location indicated by the second orientation.
- The apparatus according to claim 24, wherein the processor executes the machine-readable instructions to perform the following operations: acquiring location information of a first operation object in a real scene, wherein the first operation object is used to adjust the location indicated by the second orientation, and the location information corresponds to the location indicated by the second orientation; and obtaining the third instruction according to the location information.
- The apparatus according to claim 15, wherein the processor executes the machine-readable instructions to perform the following operations: receiving a fourth instruction, wherein the fourth instruction is used to indicate the facing direction indicated by the second orientation; and after receiving the fourth instruction, determining the facing direction indicated by the second orientation.
- The apparatus according to claim 26, wherein the processor executes the machine-readable instructions to perform the following operations: acquiring angle information of a second operation object in a real scene, wherein the second operation object is used to adjust the facing direction indicated by the second orientation, and the angle information corresponds to the facing direction indicated by the second orientation; and obtaining the fourth instruction according to the angle information.
- The apparatus according to any one of claims 15 to 27, wherein the processor executes the machine-readable instructions to perform the following operations: receiving the first instruction through a joystick and determining the second orientation through the joystick; or receiving the first instruction through a touchpad and determining the second orientation through the touchpad.
- A non-volatile computer-readable storage medium, storing machine-readable instructions that can be executed by a processor to perform the method according to any one of claims 1 to 14.
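Claims 7 and 8 determine the second orientation by intersecting the displayed curve with the preset plane that supports the virtual character and then checking whether the hit is valid in the scene. A minimal Python sketch of that geometry follows; the function names and all numeric parameters (arc speed, gravity, step size, snap radius) are illustrative assumptions, not values from the application:

```python
def arc_ground_intersection(origin, direction, speed=6.0, gravity=9.8,
                            dt=0.02, max_t=3.0):
    """Step a parabolic pointer arc (the curve indicating the target
    displacement) until it crosses the preset supporting plane y = 0;
    return the crossing point, or None if the arc never lands."""
    x, y, z = origin
    vx, vy, vz = (c * speed for c in direction)
    t = 0.0
    while t < max_t:
        nx, ny, nz = x + vx * dt, y + vy * dt, z + vz * dt
        if ny <= 0.0 <= y:            # arc crossed the supporting plane
            return (nx, 0.0, nz)
        x, y, z = nx, ny, nz
        vy -= gravity * dt            # gravity bends the arc downward
        t += dt
    return None


def resolve_second_orientation(hit, is_valid, snap_radius=0.5):
    """Validity check in the spirit of claim 8: if the hit point is valid
    in the scene, the region within a preset distance of the intersection
    becomes the location indicated by the second orientation; otherwise
    return None so the caller can display the 'invalid' marker."""
    if hit is None or not is_valid(hit):
        return None
    return {"center": hit, "radius": snap_radius}
```

The `is_valid` callback stands in for whatever scene query (navigation mesh, collision volume) decides legality; the patent itself does not prescribe one.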
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18837684.2A EP3575928A4 (en) | 2017-07-25 | 2018-07-23 | METHOD AND DEVICE FOR SHIFT CONTROL FOR VIRTUAL CHARACTERS AND STORAGE MEDIUM |
KR1020197029586A KR102574170B1 (ko) | 2017-07-25 | 2018-07-23 | 가상 캐릭터의 배치 제어 방법 및 디바이스와 저장 매체 |
JP2019572111A JP7023991B2 (ja) | 2017-07-25 | 2018-07-23 | 仮想キャラクターの変位制御方法、装置、および記憶媒体 |
US16/538,147 US11049329B2 (en) | 2017-07-25 | 2019-08-12 | Method and apparatus for controlling placement of virtual character and storage medium |
US17/325,187 US11527052B2 (en) | 2017-07-25 | 2021-05-19 | Method and apparatus for controlling placement of virtual character and storage medium |
US17/985,450 US12026847B2 (en) | 2017-07-25 | 2022-11-11 | Method and apparatus for controlling placement of virtual character and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710613071.2A CN107450747B (zh) | 2017-07-25 | 2017-07-25 | 虚拟角色的位移控制方法和装置 |
CN201710613071.2 | 2017-07-25 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/538,147 Continuation US11049329B2 (en) | 2017-07-25 | 2019-08-12 | Method and apparatus for controlling placement of virtual character and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019019968A1 true WO2019019968A1 (zh) | 2019-01-31 |
Family
ID=60487554
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/096646 WO2019019968A1 (zh) | 2017-07-25 | 2018-07-23 | 虚拟角色的位移控制方法、装置和存储介质 |
Country Status (6)
Country | Link |
---|---|
US (3) | US11049329B2 (zh) |
EP (1) | EP3575928A4 (zh) |
JP (1) | JP7023991B2 (zh) |
KR (1) | KR102574170B1 (zh) |
CN (1) | CN107450747B (zh) |
WO (1) | WO2019019968A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111078031A (zh) * | 2019-12-23 | 2020-04-28 | 上海米哈游网络科技股份有限公司 | 一种虚拟人物的位置确定方法、装置、设备及存储介质 |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107450747B (zh) | 2017-07-25 | 2018-09-18 | 腾讯科技(深圳)有限公司 | 虚拟角色的位移控制方法和装置 |
KR102085440B1 (ko) | 2017-12-26 | 2020-03-05 | (주)스코넥엔터테인먼트 | 가상 현실 제어 시스템 |
CN108211342A (zh) * | 2018-01-19 | 2018-06-29 | 腾讯科技(深圳)有限公司 | 视角调整方法和装置、存储介质及电子装置 |
CN108379780B (zh) * | 2018-03-13 | 2020-06-02 | 北京小米移动软件有限公司 | 虚拟跑步场景控制方法和装置、跑步机 |
CN108434741A (zh) * | 2018-03-15 | 2018-08-24 | 网易(杭州)网络有限公司 | 虚拟现实中的移动控制方法及装置 |
CN108427501B (zh) * | 2018-03-19 | 2022-03-22 | 网易(杭州)网络有限公司 | 虚拟现实中移动控制方法和装置 |
CN109316741A (zh) * | 2018-07-17 | 2019-02-12 | 派视觉虚拟现实(深圳)软件技术有限公司 | 一种vr场景中控制角色移动的方法、装置及设备 |
EP3640767A1 (de) * | 2018-10-17 | 2020-04-22 | Siemens Schweiz AG | Verfahren zum bestimmen mindestens eines bereichs in mindestens einem eingabemodell für mindestens ein zu platzierendes element |
CN109814713A (zh) * | 2019-01-10 | 2019-05-28 | 重庆爱奇艺智能科技有限公司 | 一种用于vr用户视角切换的方法与设备 |
CN111054076B (zh) * | 2019-11-21 | 2021-05-04 | 珠海剑心互动娱乐有限公司 | 一种游戏角色行走动画处理的方法、终端及存储介质 |
CN111757081B (zh) * | 2020-05-27 | 2022-07-08 | 海南车智易通信息技术有限公司 | 用于虚拟场景的移动限制方法、客户端、服务器及计算设备 |
CN111729306B (zh) * | 2020-06-24 | 2024-06-04 | 网易(杭州)网络有限公司 | 游戏角色的传送方法、装置、电子设备及存储介质 |
CN111803947A (zh) * | 2020-07-15 | 2020-10-23 | 广州玖的数码科技有限公司 | 虚拟空间中的游戏角色移动控制方法、存储介质和服务器 |
CN111803948A (zh) * | 2020-07-15 | 2020-10-23 | 广州玖的数码科技有限公司 | 联机游戏角色移动处理方法、存储介质和电子设备 |
US20230093585A1 (en) * | 2021-09-21 | 2023-03-23 | Facebook Technologies, Llc | Audio system for spatializing virtual sound sources |
US11847750B2 (en) * | 2022-05-18 | 2023-12-19 | Niantic, Inc. | Smooth object correction for augmented reality devices |
CN115098005B (zh) * | 2022-06-24 | 2023-01-24 | 北京华建云鼎科技股份公司 | 一种控制目标对象移动的数据处理*** |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105094920A (zh) * | 2015-08-14 | 2015-11-25 | 网易(杭州)网络有限公司 | 一种游戏渲染方法和装置 |
JP2016214807A (ja) * | 2015-05-19 | 2016-12-22 | フミソー株式会社 | パズルゲーム |
CN106527722A (zh) * | 2016-11-08 | 2017-03-22 | 网易(杭州)网络有限公司 | 虚拟现实中的交互方法、***及终端设备 |
CN106598465A (zh) * | 2016-12-20 | 2017-04-26 | 上海逗屋网络科技有限公司 | 基于虚拟摇杆的控制方法、装置和设备 |
CN107450747A (zh) * | 2017-07-25 | 2017-12-08 | 腾讯科技(深圳)有限公司 | 虚拟角色的位移控制方法和装置 |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07236769A (ja) * | 1994-02-28 | 1995-09-12 | Sega Enterp Ltd | 画像表示制御方法及びこれを用いた電子遊戯装置 |
JP2995703B1 (ja) * | 1998-10-08 | 1999-12-27 | コナミ株式会社 | 画像作成装置、画像作成装置における表示場面切替方法、画像作成装置における表示場面切替プログラムが記録された可読記録媒体及びビデオゲーム装置 |
ATE349247T1 (de) * | 2000-07-17 | 2007-01-15 | Sony Computer Entertainment Inc | Programmausführungssystem, programmausführungsvorrichtung, aufzeichungsmedium und entsprechendes computerausführbares programm |
CN100428218C (zh) * | 2002-11-13 | 2008-10-22 | 北京航空航天大学 | 一种实现通用虚拟环境漫游引擎的方法 |
JP3924579B2 (ja) * | 2005-03-30 | 2007-06-06 | 株式会社コナミデジタルエンタテインメント | ゲームプログラム、ゲーム装置及びゲーム制御方法 |
JP4971846B2 (ja) * | 2007-03-16 | 2012-07-11 | 株式会社コナミデジタルエンタテインメント | ゲーム装置、ゲーム装置の制御方法及びプログラム |
US8214750B2 (en) * | 2007-10-31 | 2012-07-03 | International Business Machines Corporation | Collapsing areas of a region in a virtual universe to conserve computing resources |
JP2009112631A (ja) * | 2007-11-08 | 2009-05-28 | Koto:Kk | ゲームキャラクタ表示制御システム |
US8259100B2 (en) * | 2008-04-24 | 2012-09-04 | International Business Machines Corporation | Fixed path transitions |
US20130271457A1 (en) * | 2012-04-11 | 2013-10-17 | Myriata, Inc. | System and method for displaying an object within a virtual environment |
US10510189B2 (en) | 2014-04-16 | 2019-12-17 | Sony Interactive Entertainment Inc. | Information processing apparatus, information processing system, and information processing method |
JP5781213B1 (ja) | 2014-12-26 | 2015-09-16 | 株式会社Cygames | ゲーム制御プログラム、ゲーム制御方法及びゲーム制御装置 |
CN104539929B (zh) * | 2015-01-20 | 2016-12-07 | 深圳威阿科技有限公司 | 带有运动预测的立体图像编码方法和编码装置 |
US10373392B2 (en) * | 2015-08-26 | 2019-08-06 | Microsoft Technology Licensing, Llc | Transitioning views of a virtual model |
CN105183296B (zh) * | 2015-09-23 | 2018-05-04 | 腾讯科技(深圳)有限公司 | 交互界面显示方法及装置 |
CN105913497B (zh) * | 2016-05-27 | 2018-09-07 | 杭州映墨科技有限公司 | 用于虚拟看房的虚拟现实空间移动定位***及方法 |
JP6126273B1 (ja) * | 2016-06-10 | 2017-05-10 | 株式会社コロプラ | 仮想空間を提供するための方法、当該方法をコンピュータに実現させるためのプログラム、および仮想空間を提供するためのシステム |
CN106094639A (zh) * | 2016-07-12 | 2016-11-09 | 大连普菲克科技有限公司 | 行走模拟控制装置 |
CN106502395A (zh) * | 2016-10-18 | 2017-03-15 | 深圳市火花幻境虚拟现实技术有限公司 | 一种在虚拟现实应用中避免用户眩晕的方法及装置 |
CN106569609B (zh) * | 2016-11-11 | 2019-05-07 | 上海远鉴信息科技有限公司 | 改进型虚拟现实中用户传送方法及*** |
CN106484123A (zh) * | 2016-11-11 | 2017-03-08 | 上海远鉴信息科技有限公司 | 虚拟现实中用户传送方法及*** |
CN106774872A (zh) * | 2016-12-09 | 2017-05-31 | 网易(杭州)网络有限公司 | 虚拟现实***、虚拟现实交互方法及装置 |
CN106621324A (zh) * | 2016-12-30 | 2017-05-10 | 当家移动绿色互联网技术集团有限公司 | Vr游戏的交互操作方法 |
CN106924970B (zh) * | 2017-03-08 | 2020-07-07 | 网易(杭州)网络有限公司 | 虚拟现实***、基于虚拟现实的信息显示方法及装置 |
JP6257827B1 (ja) * | 2017-06-01 | 2018-01-10 | 株式会社コロプラ | 仮想空間を提供するためにコンピュータで実行される方法、プログラム、および、情報処理装置 |
- 2017
  - 2017-07-25 CN CN201710613071.2A patent/CN107450747B/zh active Active
- 2018
  - 2018-07-23 KR KR1020197029586A patent/KR102574170B1/ko active IP Right Grant
  - 2018-07-23 WO PCT/CN2018/096646 patent/WO2019019968A1/zh unknown
  - 2018-07-23 EP EP18837684.2A patent/EP3575928A4/en not_active Ceased
  - 2018-07-23 JP JP2019572111A patent/JP7023991B2/ja active Active
- 2019
  - 2019-08-12 US US16/538,147 patent/US11049329B2/en active Active
- 2021
  - 2021-05-19 US US17/325,187 patent/US11527052B2/en active Active
- 2022
  - 2022-11-11 US US17/985,450 patent/US12026847B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016214807A (ja) * | 2015-05-19 | 2016-12-22 | フミソー株式会社 | パズルゲーム |
CN105094920A (zh) * | 2015-08-14 | 2015-11-25 | 网易(杭州)网络有限公司 | 一种游戏渲染方法和装置 |
CN106527722A (zh) * | 2016-11-08 | 2017-03-22 | 网易(杭州)网络有限公司 | 虚拟现实中的交互方法、***及终端设备 |
CN106598465A (zh) * | 2016-12-20 | 2017-04-26 | 上海逗屋网络科技有限公司 | 基于虚拟摇杆的控制方法、装置和设备 |
CN107450747A (zh) * | 2017-07-25 | 2017-12-08 | 腾讯科技(深圳)有限公司 | 虚拟角色的位移控制方法和装置 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111078031A (zh) * | 2019-12-23 | 2020-04-28 | 上海米哈游网络科技股份有限公司 | 一种虚拟人物的位置确定方法、装置、设备及存储介质 |
CN111078031B (zh) * | 2019-12-23 | 2023-11-14 | 上海米哈游网络科技股份有限公司 | 一种虚拟人物的位置确定方法、装置、设备及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
US11049329B2 (en) | 2021-06-29 |
KR20190126377A (ko) | 2019-11-11 |
US20230074857A1 (en) | 2023-03-09 |
US20190362564A1 (en) | 2019-11-28 |
JP7023991B2 (ja) | 2022-02-22 |
US12026847B2 (en) | 2024-07-02 |
US11527052B2 (en) | 2022-12-13 |
CN107450747B (zh) | 2018-09-18 |
EP3575928A4 (en) | 2020-02-26 |
JP2020527262A (ja) | 2020-09-03 |
EP3575928A1 (en) | 2019-12-04 |
US20210272380A1 (en) | 2021-09-02 |
CN107450747A (zh) | 2017-12-08 |
KR102574170B1 (ko) | 2023-09-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2019019968A1 (zh) | 虚拟角色的位移控制方法、装置和存储介质 | |
JP7256284B2 (ja) | ゲームキャラクター制御方法、装置、機器および記憶媒体 | |
JP6722252B2 (ja) | 情報処理方法及び装置、記憶媒体、電子機器 | |
EP3469466B1 (en) | Directional interface object | |
US11226722B2 (en) | Information interaction method and apparatus, storage medium, and electronic apparatus | |
TW202004421A (zh) | 用於在hmd環境中利用傳至gpu之預測及後期更新的眼睛追蹤進行快速注視點渲染 | |
WO2018064601A1 (en) | Using a portable device and a head-mounted display to view a shared virtual reality space | |
US11194400B2 (en) | Gesture display method and apparatus for virtual reality scene | |
US20190377473A1 (en) | VR Comfort Zones Used to Inform an In-VR GUI Editor | |
CN111417989B (zh) | 程序、信息处理方法、信息处理***、头戴式显示装置和信息处理装置 | |
US20200241733A1 (en) | Extended on-screen gameplay via augmented reality | |
CN107930114A (zh) | 信息处理方法及装置、存储介质、电子设备 | |
JP7249975B2 (ja) | 位置に基づくゲームプレイコンパニオンアプリケーションへユーザの注目を向ける方法及びシステム | |
KR20220018562A (ko) | 인공 현실 시스템을 위한 모서리-식별 제스처-구동 사용자 인터페이스 요소 게이팅 | |
WO2018116544A1 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
US20230214005A1 (en) | Information processing apparatus, method, program, and information processing system | |
WO2019166005A1 (zh) | 智能终端及其感控方法、具有存储功能的装置 | |
JP2019516180A (ja) | 仮想化環境内にイメージを提示するための方法及び装置 | |
US20180059788A1 (en) | Method for providing virtual reality, program for executing the method on computer, and information processing apparatus | |
US20230310989A1 (en) | Object control method and apparatus in virtual scene, terminal device, computer-readable storage medium, and computer program product | |
US20240160273A1 (en) | Inferring vr body movements including vr torso translational movements from foot sensors on a person whose feet can move but whose torso is stationary | |
CN117742479A (zh) | 人机交互方法、装置、设备和介质 | |
CN117742478A (zh) | 信息显示方法、装置、设备和介质 | |
WO2018234318A1 (en) | REDUCING VIRTUAL DISEASE IN VIRTUAL REALITY APPLICATIONS |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18837684; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2018837684; Country of ref document: EP; Effective date: 20190830 |
ENP | Entry into the national phase | Ref document number: 20197029586; Country of ref document: KR; Kind code of ref document: A |
ENP | Entry into the national phase | Ref document number: 2019572111; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |