WO2019019968A1 - Displacement control method and apparatus for virtual character, and storage medium - Google Patents

Displacement control method and apparatus for virtual character, and storage medium

Info

Publication number
WO2019019968A1
WO2019019968A1 (PCT/CN2018/096646)
Authority
WO
WIPO (PCT)
Prior art keywords
orientation
virtual character
instruction
displacement
determining
Prior art date
Application number
PCT/CN2018/096646
Other languages
English (en)
French (fr)
Inventor
沈超 (Shen Chao)
王学强 (Wang Xueqiang)
Original Assignee
Tencent Technology (Shenzhen) Company Limited (腾讯科技(深圳)有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited (腾讯科技(深圳)有限公司)
Priority to EP18837684.2A (published as EP3575928A4)
Priority to KR1020197029586A (published as KR102574170B1)
Priority to JP2019572111A (published as JP7023991B2)
Publication of WO2019019968A1
Priority to US16/538,147 (published as US11049329B2)
Priority to US17/325,187 (published as US11527052B2)
Priority to US17/985,450 (published as US12026847B2)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005 Input arrangements through a video camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0338 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/021 Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/029 Location-based management or tracking services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • the embodiments of the present application relate to the field of computers, and in particular, to a displacement control method, apparatus, and storage medium for a virtual character.
  • the embodiment of the present application provides a displacement control method for a virtual character.
  • The displacement control method for the virtual character is applied to a computing device and includes: receiving, in a virtual reality scenario, a first instruction, wherein the first instruction is used to indicate that the virtual character generates a target displacement from a first orientation; after receiving the first instruction, determining a second orientation that is offset from the first orientation by the target displacement; and controlling the virtual character to disappear in the first orientation and to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
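The claimed sequence (receive the instruction, compute the offset orientation, then vanish and reappear with no intermediate appearance) can be sketched roughly as follows. This is an illustrative Python sketch, not code from the patent; the class and field names are assumptions.

```python
from dataclasses import dataclass


@dataclass
class Orientation:
    """Position plus facing direction of the virtual character (illustrative)."""
    position: tuple  # (x, y, z) in the virtual scene
    facing: float    # yaw angle in degrees


class VirtualCharacter:
    def __init__(self, orientation):
        self.orientation = orientation
        self.visible = True

    def teleport(self, second_orientation):
        """Disappear at the first orientation and appear at the second,
        without ever being rendered anywhere in between."""
        self.visible = False                    # vanish at the first orientation
        self.orientation = second_orientation   # jump instantly, no interpolation
        self.visible = True                     # reappear at the second orientation


# Hypothetical usage: character jumps from the origin to (3, 0, 4), facing 90°.
char = VirtualCharacter(Orientation((0.0, 0.0, 0.0), 0.0))
target = Orientation((3.0, 0.0, 4.0), 90.0)
char.teleport(target)
```

Because no intermediate positions are ever assigned, a renderer polling `char.orientation` each frame never draws the character between the two orientations.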
  • the embodiment of the present application further provides a displacement control device for a virtual character.
  • The displacement control device of the virtual character includes a processor and a memory coupled to the processor, the memory storing machine-readable instructions executable by the processor; when the processor executes the machine-readable instructions, the following operations are performed:
  • receiving a first instruction, wherein the first instruction is used to indicate that the virtual character generates a target displacement from a first orientation; after receiving the first instruction, determining a second orientation that is offset from the first orientation by the target displacement; and controlling the virtual character to disappear in the first orientation and to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
  • The embodiment of the present application further provides a non-transitory computer-readable storage medium, wherein the storage medium stores machine-readable instructions executable by a processor to perform the displacement control method for the virtual character described above.
  • FIG. 1 is a schematic diagram of a hardware environment of a method for controlling a displacement of a virtual character according to an embodiment of the present application
  • FIG. 2 is a flowchart of a method for controlling displacement of a virtual character according to an embodiment of the present application
  • FIG. 3 is a flowchart of a method for controlling a virtual character to disappear in a first orientation and controlling a virtual character to appear in a second orientation according to an embodiment of the present application;
  • FIG. 4 is a flow chart of another method for controlling a virtual character to disappear in a first orientation and controlling a virtual character to appear in a second orientation, in accordance with an embodiment of the present application;
  • FIG. 5 is a flow chart of a method of determining a second orientation of a target displacement from a first orientation, in accordance with an embodiment of the present application
  • FIG. 6 is a flowchart of a method of determining an area within a preset distance range from an intersection position as the position indicated by the second orientation according to an embodiment of the present application;
  • FIG. 7 is a flowchart of another method for controlling displacement of a virtual character according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a displacement special effect display according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram showing another displacement special effect display according to an embodiment of the present application.
  • FIG. 10a is a schematic diagram of a displacement control apparatus for a virtual character according to an embodiment of the present application.
  • FIG. 10b is a schematic diagram of a displacement control apparatus for a virtual character according to an embodiment of the present application.
  • FIG. 10c is a schematic diagram of a displacement control device for a virtual character according to an embodiment of the present application.
  • FIG. 11 is a structural block diagram of a terminal according to an embodiment of the present application.
  • The method for controlling the movement of a virtual character in a 3D space can be implemented on three types of platforms: PC, game console, and mobile phone.
  • The PC platform is operated by keyboard and mouse by default.
  • Key control of the virtual character is highly standardized: the four keys "W", "A", "S", and "D" on the keyboard correspond to moving forward, moving backward, strafing left, and strafing right, producing continuous linear movement of the virtual character.
  • The virtual character's front, back, left, and right are determined relative to the viewing direction of the camera, which is controlled by the mouse and can face any angle.
  • Each game console has its own dedicated controller; that is, the control scheme of each console is pre-defined, and free movement of the virtual character in the virtual space is generally controlled by the joystick on the controller. Operation on a console does not differ greatly from that on a PC: movement is continuous and the buttons serve the same role as the keyboard and mouse, but the hardware devices are custom-built.
  • Smartphones are increasingly hosting heavyweight games; that is, more and more 3D games originally designed for other platforms are being ported to mobile phones.
  • Operations originally performed with physical buttons have accordingly evolved into virtual on-screen button operations on the mobile phone.
  • A virtual joystick is adopted for movement control of the virtual character in 3D space.
  • VR (Virtual Reality)
  • the HTC Vive under the Vive platform is a virtual reality head-mounted display developed by HTC and Valve Corporation.
  • This head-mounted display uses "room-scale" technology: sensors turn a room into a three-dimensional space in which the user can navigate naturally, walk around, and use motion-tracked handheld controllers to manipulate objects in the virtual world, enabling sophisticated interaction, communication, and immersion. Because a Vive device can track a bounded real space and map it fully onto the virtual space, many casual games are designed to be about the same size as the Vive tracking space, so that the user can move completely freely within it. However, the size of the simulated space is limited, and the size of the trackable virtual-space area differs between devices, making it difficult to adapt to different devices.
  • the embodiment of the present application provides a displacement control method for a virtual character.
  • FIG. 1 is a schematic diagram of a hardware environment of a displacement control method for a virtual character according to an embodiment of the present application.
  • the server 102 is connected to the terminal 104 through a network.
  • the network includes but is not limited to a wide area network, a metropolitan area network, or a local area network.
  • the terminal 104 is not limited to a PC, a mobile phone, a tablet, or the like.
  • The displacement control method for the virtual character in the embodiment of the present application may be performed by a computing device, for example by the server 102, by the terminal 104, or jointly by the server 102 and the terminal 104.
  • When performed by the terminal 104, the displacement control method for the virtual character in the embodiment of the present application may also be performed by a client installed on the terminal.
  • FIG. 2 is a flowchart of a method for controlling displacement of a virtual character according to an embodiment of the present application. This method can be applied to computing devices. As shown in FIG. 2, the method may include the following steps:
  • Step S202: in the virtual reality scenario, receiving a first instruction for instructing the virtual character to generate a target displacement from the first orientation.
  • a first instruction for instructing the virtual character to generate a target displacement from the first orientation is received.
  • Virtual reality, also called a virtual environment, integrates the latest achievements of computer graphics, computer simulation, artificial intelligence, sensing, display, and network parallel-processing technologies. It is a computer-generated, high-tech simulation system that uses computer simulation to create a virtual world in three-dimensional space; it can provide the user with simulated sensory input such as vision, allowing timely and unrestricted observation of things in the three-dimensional space and thereby giving the user an immersive feeling.
  • The computer can immediately perform complex operations on the information of the virtual character and feed back an accurate three-dimensional space in the form of video, thereby giving the user a sense of presence.
  • The virtual reality scenario of this embodiment is a scenario obtained by simulating a real-world scene with virtual reality technology so as to suit a specific application.
  • That is, the virtual reality scenario uses virtual reality technology to map the real space completely onto the virtual space.
  • For example, a scenario suitable for a gaming application, wherein the gaming application is a VR gaming application; in some embodiments of the present application, it is an application for controlling the displacement presentation process of a virtual character.
  • the user matches the virtual character.
  • The user's operation is matched to the behavior of the virtual character; for example, if the user pushes the joystick of the VR device in any direction, the displacement selection mechanism of the virtual character is triggered.
  • The virtual reality scene can provide the user with a simulated scene for senses such as vision, so that the user can observe things in the scene through the virtual character in a timely and unrestricted manner, giving the user the same feeling as in a real scene.
  • In the virtual reality scenario, a displacement change process of the virtual character may be implemented; an area that the virtual character may reach may be determined, for example, a legal area in which the virtual character is allowed to be active; an operation that the virtual character may perform may be determined, for example, the types of skills the virtual character is allowed to use in combat; and the attributes the virtual character may have, such as abilities to increase health or reduce the difficulty of survival, may be determined. No limitation is imposed here.
  • The virtual reality scenario may be a large-scale multi-player online scenario; that is, the virtual reality scenario includes multiple online virtual characters, and the user corresponding to each virtual character can also observe the behavior change processes of the virtual characters corresponding to other users, for example, their displacement change processes.
  • The foregoing virtual reality scenario is only a preferred example in the embodiments of the present application and does not mean that the virtual reality scenario of the embodiments is limited to the foregoing forms. Any virtual reality scene in which the displacement control method for a virtual character can be implemented, and in which the dizziness that continuous movement of the virtual character easily causes the user can thereby be avoided, falls within the protection scope of the present application; examples are not enumerated one by one here.
  • In this embodiment, the displacement of the virtual character in the virtual reality scene can be controlled.
  • The displacement mechanism of the virtual character is triggered; that is, the displacement selection mechanism of the virtual character in the virtual reality scene is started, and the first instruction for instructing the virtual character to generate the target displacement from the first orientation is received. In other words, the target displacement by which the virtual character is to move in the virtual reality scene is selected. The triggering of the first instruction is convenient and flexible.
  • The first orientation is the starting orientation of the virtual character before the displacement in the virtual reality scene, including an initial position and an initial facing direction in the virtual reality scene, wherein the facing direction is the direction in which the virtual character is displaced.
  • The first instruction is generated by a physical controller: the joystick can be pushed in any direction to trigger selection of the target displacement, or the touchpad can be pressed at any position to trigger selection of the target displacement, thereby selecting a unique orientation for the virtual character.
  • physical buttons that can precisely control the direction are employed, such as a joystick and a touchpad.
  • On the Oculus platform, the joystick is pushed in any direction to trigger selection of the displacement position; on the Vive platform, the touchpad is pressed at any position to trigger target displacement selection.
  • The Oculus Rift, of the Oculus platform, is a virtual reality head-mounted display; the software used with it is mainly video games custom-programmed for the Rift.
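The platform-specific triggers just described (joystick push on Oculus, touchpad press on Vive) can be sketched as a small input-mapping function. The event dictionary shape, key names, and the deadzone threshold are illustrative assumptions, not part of the patent.

```python
def displacement_trigger(platform, event):
    """Return True when an input event should start displacement selection.

    The event format is hypothetical:
      Oculus joystick events:  {"type": "joystick", "x": float, "y": float}
      Vive touchpad events:    {"type": "touchpad_press"}
    """
    if platform == "oculus":
        # Pushing the joystick in any direction (beyond a small deadzone)
        # triggers selection of the displacement position.
        return event["type"] == "joystick" and (
            abs(event["x"]) > 0.1 or abs(event["y"]) > 0.1
        )
    if platform == "vive":
        # Pressing the touchpad at any position triggers the selection.
        return event["type"] == "touchpad_press"
    return False
```

A deadzone keeps stick drift from falsely starting the selection mechanism, which matters given the false-triggering concern discussed below.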
  • Step S204: after receiving the first instruction, determining a second orientation that is offset from the first orientation by the target displacement.
  • In step S204 of the present application, after the first instruction is received, a second orientation that is offset from the first orientation by the target displacement is determined.
  • the displacement determination mechanism is triggered, that is, the virtual character is triggered to be displaced.
  • Since the triggering of the first instruction is convenient and flexible, the displacement triggering mechanism carries a possibility of false triggering. This embodiment therefore also provides a displacement cancellation mechanism: releasing the joystick, or lifting the hand off the large circular touchpad, cancels the determined target displacement. For example, on the Oculus platform the user can release the joystick to cancel the selected target displacement, and on the Vive platform the user can lift the hand off the touchpad to cancel the selected target displacement, so that the subsequent displacement selection mechanism is not triggered.
  • After the target displacement is determined, it may be previewed, for example by using a curve to preview the determined target displacement. The time the curve takes to extend also serves as a buffer period for the displacement decision: when the displacement is triggered by mistake, the user can cancel the determined displacement in time.
  • The effect of the curve in the virtual reality scene is that a ray is slowly emitted from the position of the hand toward the front of the controller. For example, a curve tipped with an arrow is slowly cast out; it takes about one second to bend and extend down to the ground, and the head of the curve indicates the position of the second orientation.
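An arc-style preview like the one described can be approximated by sampling a ballistic curve from the hand position until it meets the ground; the head of the curve (the last sample) then marks the candidate position for the second orientation. This is an illustrative sketch assuming a flat ground plane at y = 0; the function name and parameter values are assumptions, not from the patent.

```python
def preview_arc(hand_pos, forward, speed=5.0, gravity=9.8, steps=30):
    """Sample points along a ballistic arc starting at the hand and
    following the controller's forward direction, stopping where the
    arc reaches the ground plane (y == 0). Returns the sampled points;
    the last point is the previewed target position."""
    points = []
    x, y, z = hand_pos
    vx, vy, vz = (forward[0] * speed, forward[1] * speed, forward[2] * speed)
    dt = 1.0 / steps
    for _ in range(steps * 10):          # hard cap so the loop always ends
        points.append((x, y, z))
        x += vx * dt
        z += vz * dt
        y += vy * dt
        vy -= gravity * dt               # gravity bends the curve downward
        if y <= 0.0:
            points.append((x, 0.0, z))   # clamp the head of the curve to the ground
            break
    return points
```

Rendering one additional sample per frame for about a second would reproduce the slow extension described above while doubling as the decision buffer time.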
  • On the Oculus platform, the user's complete operational flow is to push the joystick to trigger the displacement selection mechanism, then adjust the spatial position of the hand to determine the position of the target displacement, then adjust the direction of the joystick to determine the direction of the target displacement, and finally press the joystick to confirm and start the target displacement;
  • On the Vive platform, the user's complete operational flow is to press the touchpad to trigger the displacement selection mechanism, then adjust the spatial position of the hand to determine the position of the target displacement, then adjust the position of the finger's touch point to determine the direction of the target displacement, and finally lift the hand off the touchpad to confirm and start the target displacement. Determining the target displacement is therefore simple, convenient, and fast.
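The trigger / cancel / confirm flow described above amounts to a small state machine. The sketch below is illustrative; the generic method names stand in for the platform-specific inputs (push the stick or press the touchpad to trigger, release early to cancel, press the stick or lift the hand to confirm).

```python
class DisplacementSelector:
    """Minimal state machine for the displacement selection flow:
    idle -> selecting -> confirmed, with a cancel path back to idle."""

    def __init__(self):
        self.state = "idle"

    def trigger(self):
        # Push joystick (Oculus) / press touchpad (Vive): start selection.
        if self.state == "idle":
            self.state = "selecting"

    def cancel(self):
        # Release the joystick / lift off the touchpad before confirming:
        # abort the selected target displacement.
        if self.state == "selecting":
            self.state = "idle"

    def confirm(self):
        # Press the joystick / lift the hand to commit: start the displacement.
        if self.state == "selecting":
            self.state = "confirmed"
```

Keeping the cancel path a single input event is what makes false triggers cheap to recover from.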
  • Step S206: controlling the virtual character to disappear in the first orientation, and controlling the virtual character to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
  • In step S206 of the present application, the virtual character is controlled to disappear in the first orientation, and the virtual character is controlled to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
  • the control virtual character disappears in the first orientation and controls the virtual character to appear in the second orientation.
  • The virtual character remains stationary in the first orientation while the preset lens slowly blacks out; then, when the lens is restored, the virtual character has already reached the position and facing direction indicated by the second orientation.
  • While the lens slowly recovers, the virtual character stays immobile at the position and facing direction indicated by the second orientation; that is, the virtual character remains stationary.
  • the virtual character does not appear between the first orientation and the second orientation.
  • The virtual character moves instantaneously in the virtual reality scene, and the user does not feel the acceleration and speed that continuous movement would cause; instantaneous movement thus takes the place of continuous movement, the feeling of dizziness is avoided, and the user experience is improved.
  • At the moment of the sudden disappearance in the first orientation and the sudden appearance in the second orientation, a particle effect swirling around a point may be displayed simultaneously at both orientations. During the disturbance, the particles at the first orientation drift toward the second orientation, as if blown by the wind; that is, a particle wind is generated to prompt other users that the virtual character has moved to the second orientation, and the direction of the particle wind prompts other users where in the virtual reality scene the current virtual character's displacement started and where it arrived, so that other users in the virtual reality scene can clearly follow the displacement process of the virtual character.
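The particle-wind cue drifts particles from the first orientation toward the second so that observers can read the direction of the jump. A minimal sketch of computing that drift direction, assuming positions are (x, y, z) tuples (the function name is illustrative):

```python
import math


def particle_wind_direction(first_pos, second_pos):
    """Unit vector pointing from the first orientation toward the second,
    used to drift the particle effect so other users can see where the
    character went. Returns the zero vector for a zero-length displacement."""
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    dz = second_pos[2] - first_pos[2]
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    if length == 0:
        return (0.0, 0.0, 0.0)
    return (dx / length, dy / length, dz / length)
```

Each particle's velocity can then be this direction scaled by a wind speed, plus some per-particle jitter for the swirling look.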
  • The operation method adapts to various hardware specifications, and the operation is simple, convenient, and easy to learn; it performs well in large-scale multi-player real-time online virtual reality scenes.
  • Through steps S202 to S206, a first instruction for instructing the virtual character to generate a target displacement from the first orientation is received in the virtual reality scene; after the first instruction is received, a second orientation that is offset from the first orientation by the target displacement is determined; and the virtual character is controlled to disappear in the first orientation and to appear in the second orientation, without appearing between the first orientation and the second orientation. That is, by controlling the virtual character to disappear in the first orientation and appear in the second orientation in the virtual reality scene, continuous movement is replaced by instantaneous movement of the virtual character, which avoids the dizziness that continuous movement of a virtual character easily causes the user, thereby solving the technical problem in the related art that continuous movement of the virtual character tends to make the user dizzy.
  • Controlling the virtual character to disappear in the first orientation and controlling the virtual character to appear in the second orientation comprises: after the second orientation is determined, controlling the virtual character to stay in the first orientation and controlling the preset lens to go to a black screen; and after the black screen, controlling the preset lens to resume display and controlling the virtual character to stay in the second orientation.
  • FIG. 3 is a flow chart of a method of controlling a virtual character to disappear in a first orientation and controlling a virtual character to appear in a second orientation, in accordance with an embodiment of the present application. As shown in FIG. 3, the method includes the following steps:
  • Step S301: after determining the second orientation, control the virtual character to stay in the first orientation, and control the preset lens to go to a black screen.
  • In step S301 of the present application, after the second orientation is determined, the virtual character is controlled to stay in the first orientation, and the preset lens is controlled to go to a black screen, wherein the preset lens is used to display the picture of the virtual reality scene.
  • the first orientation is an initial position before the virtual character is displaced in the virtual reality scene.
  • Controlling the virtual character to stay in the first orientation means keeping the virtual character at the first position.
  • The preset lens goes black, so that the virtual character disappears from view in the first orientation.
  • The preset lens is used to display the picture of the virtual reality scene, that is, the picture of the virtual reality scene as viewed from the perspective of the user corresponding to the virtual character.
  • This embodiment achieves a display effect in which the virtual character suddenly disappears in the first orientation by controlling the virtual character to stay in the first orientation and controlling the preset lens to go to a black screen.
  • Step S302: after the preset lens has gone to a black screen, control the preset lens to resume display, and control the virtual character to stay in the second orientation.
  • In step S302 of the present application, after the preset lens has gone to a black screen, the preset lens is controlled to resume display, and the virtual character is controlled to stay in the second orientation.
  • the preset lens is controlled to resume the screen display, that is, the preset lens is no longer a black screen.
  • The virtual character stays in the second orientation in the virtual reality scene, where the second orientation is the orientation at which the virtual character newly arrives, offset from the first orientation by the target displacement, and the target displacement is determined by the displacement selection mechanism.
  • The virtual character does not appear between the first orientation and the second orientation, thereby avoiding the dizziness that continuous movement of the virtual character in the virtual reality scene would cause the user corresponding to the virtual character.
  • By controlling the preset lens to resume display and controlling the virtual character to stay in the second orientation, this embodiment achieves the realistic effect of the virtual character suddenly appearing at a new position.
  • After the second orientation is determined, the virtual character is controlled to stay in the first orientation and the preset lens is controlled to go to a black screen, the preset lens being used to display the virtual reality scene; after the black screen, the preset lens is controlled to resume display and the virtual character is controlled to stay in the second orientation. This achieves the purpose of controlling the virtual character to disappear in the first orientation and appear in the second orientation, and thereby the technical effect of avoiding the vertigo caused by continuous movement of the virtual character.
  • Controlling the preset lens to go to a black screen includes: controlling the preset lens to gradually black out, wherein the virtual character stays in the first orientation while the preset lens gradually blacks out;
  • Controlling the preset lens to resume display includes: controlling the preset lens to gradually resume display, wherein the virtual character stays in the second orientation while the preset lens gradually resumes display.
• The preset lens is gradually blacked out: while the virtual character is controlled to stay in the first orientation, the preset lens slowly fades to black, so that the virtual character disappears from the first orientation in the virtual reality scene.
• The preset lens then gradually resumes display, that is, it slowly resumes displaying the virtual reality scene, and during this slow recovery the virtual character is in the second orientation.
• Because the position and orientation are unchanged while the preset lens gradually blacks out and gradually resumes display (the virtual character remains stationary), the user corresponding to the virtual character never perceives the acceleration and speed that continuous displacement would produce, is not subjected to dizziness, and thus enjoys an enhanced user experience.
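The fade-out/teleport/fade-in sequence described above can be reduced to a short sketch. The names (`Camera`, `teleport`) and the fade step counts below are illustrative assumptions for exposition, not part of the original disclosure:

```python
class Camera:
    """Minimal stand-in for the preset lens that displays the VR scene."""
    def __init__(self):
        self.opacity = 1.0   # 1.0 = fully visible, 0.0 = black screen

    def fade(self, target, steps):
        """Gradually change opacity toward `target` over `steps` frames."""
        start = self.opacity
        for i in range(1, steps + 1):
            self.opacity = start + (target - start) * i / steps

def teleport(camera, character, second_orientation, fade_steps=30):
    """Move the character with no visible continuous motion.

    The character stays put while the lens fades to black, is relocated
    while the screen is black, then the lens fades back in with the
    character already at the new orientation.
    """
    camera.fade(0.0, fade_steps)                   # gradual black screen; character unmoved
    character["orientation"] = second_orientation  # jump while nothing is shown
    camera.fade(1.0, fade_steps)                   # gradual restore; character stays put
    return character

camera = Camera()
hero = {"orientation": (0.0, 0.0, 0.0)}
teleport(camera, hero, (5.0, 0.0, 2.0))
```

Because the relocation happens only while `opacity` is zero, no intermediate position is ever rendered, which is the mechanism the embodiment relies on to avoid motion-induced vertigo.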
• Controlling the virtual character to disappear in the first orientation and to appear in the second orientation comprises: after the second orientation is determined, controlling the virtual character to disappear in the first orientation and displaying the first indication information in the first orientation; and controlling the virtual character to appear in the second orientation and displaying the second indication information in the second orientation.
  • FIG. 4 is a flow chart of another method of controlling a virtual character to disappear in a first orientation and controlling a virtual character to appear in a second orientation, in accordance with an embodiment of the present application. As shown in FIG. 4, the method includes the following steps:
• Step S401: after determining the second orientation, control the virtual character to disappear in the first orientation, and display the first indication information in the first orientation.
• In step S401 of the present application, after the second orientation is determined, the virtual character is controlled to disappear in the first orientation, and the first indication information is displayed in the first orientation, where the first indication information is used to indicate that the first orientation has motion in the virtual reality scene.
• The first indication information may be a point particle, marking the vanishing point where the virtual character is about to disappear from the first orientation.
• A special effect of point particles rotating around is displayed within a preset range of the first orientation, to indicate that the first orientation has motion in the virtual reality scene, thereby drawing the user's attention.
• Step S402: control the virtual character to appear in the second orientation, and display the second indication information in the second orientation.
• The virtual character is controlled to appear in the second orientation, and the second indication information is displayed in the second orientation, wherein the second indication information is used to indicate that the second orientation has motion in the virtual reality scene.
• The second indication information may likewise be a point particle, marking where the virtual character is about to appear in the second orientation.
• A special effect of point particles rotating around is displayed to indicate that the second orientation has motion in the virtual reality scene.
• Through the above steps, after the second orientation is determined, the virtual character is controlled to disappear in the first orientation and the first indication information is displayed there, the first indication information indicating that the first orientation has motion in the virtual reality scene; the virtual character then appears in the second orientation and the second indication information is displayed there, the second indication information indicating that the second orientation has motion in the virtual reality scene. In this way the virtual character is controlled to disappear in the first orientation and to appear in the second orientation.
• In step S401, displaying the first indication information in the first orientation comprises: displaying the moving first point particle in the first orientation, wherein the first indication information comprises the moving first point particle. In step S402, displaying the second indication information in the second orientation comprises: displaying the moving second point particle in the second orientation, wherein the second indication information comprises the moving second point particle. The direction of motion from the first point particle to the second point particle is used to represent the change process of the virtual character from the first orientation to the second orientation.
• The moving first point particle is displayed in the first orientation. The first point particle may include multiple small particles that rotate around, prompting other users that there is motion at the first orientation in the virtual reality scene and indicating that this orientation is the position of the virtual character before the displacement.
• The virtual character is controlled to appear in the second orientation, and the second indication information is displayed there: the moving second point particle is displayed in the second orientation to prompt other users that there is motion at the second orientation in the virtual reality scene, and to indicate that this orientation is the position of the virtual character after the displacement.
• The second point particle may differ from the first point particle; for example, the number of point particles included in the second point particle is different from the number included in the first point particle, and/or the size of the point particles included in the second point particle is different from the size of those included in the first point particle.
• The moving direction from the first point particle to the second point particle represents the change process from the first orientation to the second orientation; the first point particle may be perturbed and offset toward the second orientation to obtain the second point particle, and the special effect exhibited by this change is a "particle wind", like the effect of wind blowing.
• The direction of the particle wind prompts other users in the virtual reality scene about the change process of the virtual character's displacement from the first orientation to the second orientation. This displacement representation lets other users in the virtual reality scene also understand the change process of the displacement, which is well suited to large multiplayer scenes.
• The above display effects of the first indication information and the second indication information are only a preferred embodiment of the present application; the display special effects are not limited to particle effects, and any other effect that prompts the change process of the user avatar's displacement from the first orientation to the second orientation is within the protection scope of the present application, and no further examples are given here.
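The "particle wind" above is driven by the direction from the vanishing point toward the appearance point. A minimal sketch of that computation follows; the function names and the 2D ground-plane coordinates are assumptions made for illustration:

```python
import math

def wind_direction(first_pos, second_pos):
    """Unit vector (on the ground plane) from the vanishing point toward
    the appearance point; the particle wind drifts along this direction
    so observers can read off where the character went."""
    dx = second_pos[0] - first_pos[0]
    dz = second_pos[1] - first_pos[1]
    length = math.hypot(dx, dz)
    if length == 0:
        return (0.0, 0.0)   # no displacement: no wind
    return (dx / length, dz / length)

def drift(particle, direction, step):
    """Advance one point particle along the wind direction by `step`."""
    return (particle[0] + direction[0] * step,
            particle[1] + direction[1] * step)

direction = wind_direction((0.0, 0.0), (3.0, 4.0))
particle = drift((0.0, 0.0), direction, 5.0)
```

Drifting each first-orientation particle along `direction` produces the wind-blown offset toward the second orientation that the embodiment describes.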
• Before step S204 determines the second orientation that is offset from the first orientation by the target displacement, third indication information indicating the target displacement is displayed; in step S204, determining the second orientation that is offset from the first orientation by the target displacement includes: determining the second orientation according to the third indication information.
• Because the displacement triggering mechanism of the virtual character is convenient and flexible, there is a possibility of erroneously triggering a displacement when selecting the target displacement, so the user must be given a buffer time, that is, a "regret" time, during which a mistakenly triggered displacement can be canceled. This embodiment therefore gives the user such a buffer time before determining the second orientation that is offset from the first orientation by the target displacement.
• Third indication information for indicating the target displacement is displayed; the third indication information may be displayed as a special effect, thereby previewing the selection result of the target displacement. The second orientation is determined according to the third indication information, for example, the position and orientation of the second orientation in the virtual reality scene are determined according to the third indication information.
• Displaying the third indication information for indicating the target displacement comprises: displaying a curve for indicating the target displacement, wherein the third indication information includes the curve.
• A special form of the third indication information may be a ray that is slowly emitted from the position of the user's hand toward the front of the handle; for example, the ray is a blue arrowed curve that bends down to the ground. The curve takes about one second to extend, after which the display effect of the new position indicated by the head of the curve appears, as shown by the cylinder at the end of the curve in FIG. 8. The time taken for this blue curve to extend is the user's buffer time.
• If, before the curve finishes extending, the user on the Oculus platform releases the joystick, or the user on the Vive platform lifts off the large disc trackpad, the displacement selection mechanism is canceled and the subsequent displacement mechanism is not triggered.
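The gradually extending aiming curve can be sketched as a parabola revealed over the buffer time. The launch speed, gravity value, and the one-second extension time below are assumptions for illustration, not values fixed by the disclosure:

```python
GRAVITY = 9.8  # assumed scene gravity, m/s^2

def arc_point(origin, velocity, t):
    """Position on the parabola `t` seconds after leaving the hand."""
    x = origin[0] + velocity[0] * t
    y = origin[1] + velocity[1] * t - 0.5 * GRAVITY * t * t
    z = origin[2] + velocity[2] * t
    return (x, y, z)

def visible_arc(origin, velocity, progress, total_time=1.0, samples=10):
    """Sample the portion of the curve revealed so far.

    `progress` in [0, 1] is how far through the ~1 s buffer time the
    extension has gotten; at 1.0 the whole curve (and the end-of-curve
    marker) is shown and the displacement can then be confirmed.
    """
    t_max = total_time * progress
    return [arc_point(origin, velocity, t_max * i / samples)
            for i in range(samples + 1)]

# Curve launched from hand height 1.5 m, mostly forward with slight lift.
points = visible_arc((0.0, 1.5, 0.0), (4.0, 1.0, 0.0), progress=1.0)
```

Releasing the stick (or lifting off the trackpad) while `progress < 1.0` corresponds to canceling during the regret time, before the end-of-curve marker has appeared.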
• Determining the second orientation according to the third indication information comprises: determining the intersection position of the curve with the preset plane; and determining an area within a preset distance range from the intersection position as the position indicated by the second orientation.
  • FIG. 5 is a flow chart of a method of determining a second orientation that is a target displacement from a first orientation, in accordance with an embodiment of the present application. As shown in FIG. 5, the method includes the following steps:
• Step S501: the intersection position of the curve with the preset plane is determined.
• In step S501 of the present application, the intersection position of the curve with the preset plane is determined, wherein the preset plane is used to support the virtual character.
  • the second orientation includes the location and orientation of the virtual character in the virtual reality scene.
• When determining the second orientation that is offset from the first orientation by the target displacement, both the position and the orientation indicated by the second orientation are determined. First, the intersection position of the curve with the preset plane is determined.
• The end of the curve can be represented by a cylinder, whose display effect shows the position and orientation the user will have after the displacement.
• The position indicated by the second orientation is where the parabola, emitted at a certain speed from the front of the user's hand, intersects the preset plane. The preset plane may be the ground, a mountainside, or any other surface in the virtual reality scene that supports the virtual character; it is not limited here.
• Step S502: determine the area within a preset distance range from the intersection position as the position indicated by the second orientation.
• In step S502 of the present application, the area within the preset distance range from the intersection position is determined as the position indicated by the second orientation, thereby determining the position indicated by the second orientation.
• Through the above steps, the intersection position of the curve with the preset plane is determined, and the area within the preset distance range of that intersection is determined as the position indicated by the second orientation. This realizes the determination of the second orientation that is offset from the first orientation by the target displacement, so that the virtual character can be controlled to disappear in the first orientation and appear in the second orientation, achieving the technical effect of avoiding the user's dizziness due to continuous movement of the virtual character.
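Steps S501 and S502 can be sketched as solving where the parabola crosses the supporting plane, then treating a small region around that crossing as the indicated position. The plane height, launch parameters, and the 0.5 m radius below are assumed values for illustration:

```python
import math

def plane_intersection_time(y0, vy, gravity=9.8, plane_y=0.0):
    """Solve y0 + vy*t - (g/2)*t^2 = plane_y for the later (descending) root."""
    a, b, c = -gravity / 2.0, vy, y0 - plane_y
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                              # curve never reaches the plane
    return (-b - math.sqrt(disc)) / (2 * a)      # larger root, since a < 0

def second_orientation_position(origin, velocity, radius=0.5):
    """Centre and radius of the region indicated by the second orientation."""
    t = plane_intersection_time(origin[1], velocity[1])
    if t is None or t < 0:
        return None
    hit = (origin[0] + velocity[0] * t, origin[2] + velocity[2] * t)
    # Any point within `radius` of the intersection counts as the
    # position indicated by the second orientation (step S502).
    return hit, radius

result = second_orientation_position((0.0, 1.5, 0.0), (4.0, 1.0, 0.0))
```

Only the horizontal ground plane is handled here; an actual scene would intersect the sampled arc against arbitrary supporting geometry such as a mountainside.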
• Step S502 of determining the area within the preset distance range of the intersection position as the position indicated by the second orientation comprises: detecting whether the second orientation is legal in the virtual reality scene; and, if it is detected that the second orientation is legal in the virtual reality scene, determining the area within the preset distance range from the intersection position as the position indicated by the second orientation.
• FIG. 6 is a flowchart of a method of determining an area within a preset distance range from an intersection position as the position indicated by a second orientation, according to an embodiment of the present application. As shown in FIG. 6, the method includes the following steps:
• Step S601: detect whether the second orientation is legal in the virtual reality scene.
• In step S601 of the present application, it is detected whether the second orientation is legal in the virtual reality scene.
• This embodiment introduces the concept of an illegal teleportation area.
• Some virtual areas in a virtual reality scene cannot be reached by a virtual character, but the parabola emitted from the user's hand can reach any virtual area in the scene, so certain limits must be imposed on those areas. Detecting whether the second orientation is legal in the virtual reality scene means detecting whether the second orientation is a virtual area that the virtual character can reach: if the second orientation is detected to be a reachable virtual area, it is determined to be legal in the virtual reality scene; if it is detected to be an unreachable virtual area, it is determined to be illegal in the virtual reality scene.
• Step S602: if it is detected that the second orientation is legal in the virtual reality scene, determine the area within the preset distance range from the intersection position as the position indicated by the second orientation.
• In step S602 of the present application, if the second orientation is detected to be legal in the virtual reality scene, the area within the preset distance range from the intersection position is determined as the position indicated by the second orientation. This determines the position indicated by the second orientation, so that the virtual character can be controlled to disappear in the first orientation and appear in the second orientation, achieving the technical effect of avoiding the user's vertigo due to continuous movement of the virtual character.
• Step S603: if it is detected that the second orientation is illegal in the virtual reality scene, display the preset identifier information.
• In step S603 of the present application, if it is detected that the second orientation is illegal in the virtual reality scene, the preset identifier information is displayed, where the preset identifier information is used to indicate that the second orientation is not legal in the virtual reality scene.
• This embodiment calibrates the reachable areas of the virtual scene. If the second orientation is detected to be illegal in the virtual reality scene, that is, the second orientation is an unreachable area, the parabola still has an intersection after being emitted; but since the virtual character cannot reach it, the display effects of the curve and of the position indicated by the second orientation are set to a conspicuous color, for example red, reminding the user that the virtual character cannot be teleported to the second orientation. At this time, if the user presses the rocker or lifts off the large disc of the touchpad, the displacement selection is canceled, and the displacement performance scene is not triggered.
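The legality check of steps S601–S603 amounts to testing the candidate position against pre-calibrated reachable regions and switching the marker to a conspicuous color when it fails. The region data and helper names below are invented for illustration only:

```python
# Axis-aligned walkable patches (x_min, x_max, z_min, z_max); in a real
# scene these would come from pre-calibrated navigation data.
REACHABLE_REGIONS = [
    (0.0, 10.0, 0.0, 10.0),
    (12.0, 20.0, 0.0, 5.0),
]

def is_legal(position):
    """True if the candidate second-orientation position is reachable."""
    x, z = position
    return any(x0 <= x <= x1 and z0 <= z <= z1
               for (x0, x1, z0, z1) in REACHABLE_REGIONS)

def marker_color(position):
    """Conspicuous red when illegal, as the embodiment describes."""
    return "default" if is_legal(position) else "red"
```

With this split, the same check both gates step S602 (only legal positions become the indicated position) and drives the red warning display of step S603.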
• After the third indication information for indicating the target displacement is displayed, a second instruction may be received, wherein the second instruction is used to instruct the virtual character to cancel generating a displacement from the first orientation, the displacement including the target displacement; after the second instruction is received, the virtual character is controlled to cancel generating the target displacement from the first orientation.
• Because the displacement triggering mechanism is very convenient and flexible, a cancellation path is needed: if the user on the Oculus platform releases the joystick, or the user on the Vive platform lifts off the large disc trackpad, the second instruction is generated; according to the second instruction, the displacement selection mechanism is canceled and the subsequent displacement mechanism is not triggered.
• Determining the second orientation that is offset from the first orientation by the target displacement includes: receiving a third instruction, wherein the third instruction is used to indicate the position indicated by the second orientation; and, after the third instruction is received, determining the position indicated by the second orientation.
• A third instruction is generated by the user's operation; the received third instruction indicates the position of the second orientation, whereby that position is determined.
• Receiving the third instruction includes: acquiring location information of a first operation object in the real scene, where the first operation object is used to adjust the position indicated by the second orientation, and the location information corresponds to that position; and obtaining the third instruction according to the location information.
• The first operation object may be the user's hand: the user can simply move the position of the hand and rotate the angle of the front of the hand to generate the third instruction, realizing a wide-range adjustment of the target displacement position according to the third instruction.
• Determining the second orientation that is offset from the first orientation by the target displacement includes: receiving a fourth instruction, wherein the fourth instruction is used to indicate the orientation indicated by the second orientation; and, after the fourth instruction is received, determining the orientation indicated by the second orientation.
• A fourth instruction is generated by the user's operation; the received fourth instruction indicates the orientation of the second orientation, whereby that orientation is determined.
• Receiving the fourth instruction includes: acquiring angle information of a second operation object in the real scene, where the second operation object is used to adjust the orientation indicated by the second orientation, and the angle information corresponds to that orientation; and obtaining the fourth instruction according to the angle information.
• The second operation object may be a handle, and the angle information may be determined by the direction selection mechanism of the handle: the 360-degree direction of the joystick is mapped onto the horizontal 360 degrees of the position indicated by the second orientation, so the user only needs to rotate the rocker to conveniently determine the orientation indicated by the second orientation.
• The second orientation of the virtual character in the virtual reality scene may be displayed by an arrow at the end of the curve, visually presenting the target displacement and thereby determining the position and orientation of the virtual character after the displacement.
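The one-to-one mapping from the rocker's 360-degree deflection to the horizontal facing of the second orientation can be sketched as below. The convention that pushing the stick straight "up" means a yaw of 0 degrees is an assumption, not fixed by the disclosure:

```python
import math

def stick_to_yaw(stick_x, stick_y):
    """Map a joystick deflection to a horizontal facing in degrees [0, 360).

    Returns None when the stick is centred, since no direction is then
    being selected.
    """
    if stick_x == 0 and stick_y == 0:
        return None
    # atan2(x, y) instead of atan2(y, x): stick "up" -> 0 deg, "right" -> 90 deg
    yaw = math.degrees(math.atan2(stick_x, stick_y))
    return yaw % 360.0
```

The same mapping serves the Vive trackpad by substituting the touch-point coordinates for the stick deflection.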
• Receiving the first instruction for instructing the virtual character to generate the target displacement from the first orientation comprises: receiving the first instruction through the joystick; and determining the second orientation that is offset from the first orientation by the target displacement comprises: determining the second orientation through the joystick. Alternatively, receiving the first instruction for instructing the virtual character to generate the target displacement from the first orientation comprises: receiving the first instruction through the touchpad; and determining the second orientation that is offset from the first orientation by the target displacement comprises: determining the second orientation through the touchpad.
• This embodiment is compatible with the current mainstream Oculus and Vive helmets and their corresponding operating handles on the market, and its operation remains applicable if new hardware devices become available.
• This embodiment provides a method for controlling the displacement of a virtual character in a virtual reality scene. The user operation is simple, convenient, and intuitive, and the preview effect shown while the displacement is being selected tells the user the result of the operation very intuitively. By moving the virtual character instantaneously, the vertigo caused by continuous movement in the virtual reality scene is avoided, and through the directional effect other users in the scene can see the displacement process of the virtual character. The operation method adapts to various hardware specifications, is simple and convenient, and performs well in large multiplayer real-time online scenes.
• FIG. 7 is a flowchart of another method for controlling displacement of a virtual character according to an embodiment of the present application. As shown in FIG. 7, rounded rectangles represent user input, rectangles represent logical units, diamonds represent control-flow selections, and dashed lines separate the displacement selection process from the displacement representation process. The method includes the following steps:
  • step S701 the user pushes the joystick or presses the touchpad.
• In the process of performing displacement control of a virtual character, the displacement mechanism is first triggered: the user pushes the joystick or presses the touchpad.
• The triggering of the displacement is based on the physical handle, and different platforms differ slightly. On the Oculus platform, pushing the joystick in any direction triggers the selection of the displacement position; on the Vive platform, pressing the touchpad at any position triggers the displacement selection.
• Although the physical buttons of the Oculus platform and the Vive platform are not the same, the design concept is the same: because the orientation of the displacement needs to be selected, physical buttons that can accurately control the displacement direction are required, and the joystick and the touchpad are the best choices.
  • Step S702 starting a displacement selection mechanism.
  • the displacement selection mechanism is activated.
  • step S703 the position and orientation of the user's hand in the space are adjusted.
• After the displacement selection mechanism is activated, the position and orientation of the user's hand in space are adjusted: the user can simply move the hand's position in space and rotate the angle of the front of the hand to adjust its orientation.
  • Step S704 adjusting the position of the target displacement.
  • step S705 the joystick direction is adjusted or the touch point position is adjusted.
• The angle is determined by the direction selection mechanism of the handle: the 360-degree direction of the joystick is mapped onto the horizontal 360-degree direction of the target position, so the user only needs to rotate the rocker to conveniently determine the target orientation and thereby the direction of the target displacement; alternatively, the position of the touch point on the touchpad is adjusted to determine the direction of the target displacement.
  • step S706 the direction of the target displacement is adjusted.
  • Step S707 waiting for the buffer time.
• The displacement triggering mechanism of this embodiment is very convenient and flexible; however, this also creates the possibility of false triggering, so a cancellation mechanism for canceling the target displacement is required. The buffer time is awaited to give the user a regret time when selecting the displacement.
• A ray is slowly emitted from the position of the user's hand toward the front of the handle, for example a blue arrowed curve, which may also be called a parabola; over about one second it bends and extends to the ground, and the display effect of the new position indicated by the head of the curve appears (the cylinder at the end of the curve as shown in FIG. 8). The extension of this curve gives the user the regret time, that is, the buffer time.
• If the Oculus user releases the joystick, or the Vive user lifts off the large disc trackpad, the displacement selection mechanism is canceled and the subsequent displacement mechanism is not triggered. If the curve extends all the way to its end and the end-of-curve effect is displayed, the displacement display effect is triggered and step S708 is performed. If the buffer time has not yet elapsed, the curve has not bent to the ground and the display effect of the new position indicated by the head of the curve has not appeared; in that case, if the Oculus user releases the joystick, or the Vive user lifts off the large disc trackpad, step S716 is performed.
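The branch around the buffer time can be reduced to a small decision function. The step labels follow FIG. 7; the flag names are assumptions made for this sketch:

```python
def next_step(buffer_elapsed, input_released_early):
    """Return the step label the flow proceeds to from the wait state.

    `buffer_elapsed`: the curve has fully extended (regret time is over).
    `input_released_early`: the joystick was released / trackpad lifted
    before the curve finished extending.
    """
    if input_released_early and not buffer_elapsed:
        return "S716"   # cancel the displacement selection mechanism
    if buffer_elapsed:
        return "S708"   # go on to the target-position legality check
    return "S707"       # keep waiting out the buffer time
```

This makes the ordering explicit: an early release always cancels, and the legality check is reached only once the buffer time has fully elapsed.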
  • Step S708 performing a legality check of the target location.
• The legality of the target position of the target displacement is checked.
• Some virtual areas are set to be unreachable by the virtual character, but the parabola emitted from the user's hand can reach any virtual area, so certain restrictions must be imposed on those areas. The positions the virtual character can reach in the virtual reality scene are calibrated in advance. If the parabola has an intersection point but the virtual character cannot reach it, the display effects of the parabola and the target position are set to red, reminding the user that the virtual character cannot be teleported there; in that case the displacement selection is canceled and the displacement performance is not triggered.
• If the target position is checked to be legal, step S709 is performed; if it is checked to be illegal, step S716 is performed.
• Step S709: the target displacement is confirmed.
• If the target displacement is confirmed, step S711 is performed; if the target displacement is not confirmed, step S716 is performed.
• Step S710: the joystick is pressed or the touchpad is released.
• The displacement selection confirmation mechanism is likewise designed to be very convenient: when the joystick is pressed or the touchpad is released, the displacement determination mechanism is triggered, and step S709 is performed.
  • step S711 the displacement mechanism is started.
• After the target displacement is confirmed, the displacement mechanism is activated, that is, the virtual character is triggered to perform the displacement operation.
  • step S712 the lens black screen is controlled.
• The virtual character is kept at its current position, that is, remains stationary, while the lens, which is used to display the virtual reality scene, is slowly blacked out.
  • Step S713 determining the position and direction of the new orientation according to the target displacement.
• The position and direction of the new orientation at which the virtual character will arrive are determined based on the target displacement.
  • step S714 the special effect is played.
  • step S715 the black screen of the lens is controlled to be restored.
• A point-particle rotation effect appears around both the new orientation and the original orientation, and the particles at the original orientation are then perturbed and shifted toward the new orientation, as if blown by the wind.
• The effects at the new orientation and the original orientation remind other users that there is movement at these places, and the direction of the particle wind prompts other users where the current user's displacement came from and where it went.
  • step S716 the displacement selection mechanism is cancelled.
• If the Oculus user releases the joystick, if the Vive user lifts off the large disc touchpad, if the target position is checked to be illegal, or if the target displacement is not confirmed, the displacement selection mechanism is canceled.
• Throughout the process, adjustment of the position and angle remains effective; only at the moment the adjustment ends are the resulting position and angle judged against the requirements for the next step, for example checking the legality of the target position, confirming the target displacement when the target position is legal, and canceling the displacement selection mechanism when it is illegal.
• By moving the virtual character instantaneously instead of continuously, and by letting other players in the scene see the virtual character's displacement process through the directional effect, the operation method adapts to a variety of hardware specifications, is simple and easy to learn, and performs well in large multiplayer real-time online scenes.
• The application environment of the embodiments of the present application may be, but is not limited to, the application environment described in the foregoing embodiments.
• An embodiment of the present application provides an optional specific application for implementing the virtual character displacement control method, illustrated specifically with a virtual reality scene of a game application.
• The design of this embodiment is mainly divided into two parts: one part is the selection of the displacement of the virtual character in the virtual reality scene, and the other part is the representation of that displacement in the virtual reality scene.
• Before the displacement of the virtual character in the virtual reality scene is determined, the displacement mechanism is first triggered.
  • The trigger mechanism of the displacement is based on the physical handle and differs slightly between platforms. For example, on the Oculus platform, pushing the joystick in any direction triggers the selection of the displacement position; on the Vive platform, pressing the touchpad at any position triggers the displacement selection step.
  • Although the physical buttons of the Oculus platform and the Vive platform are not the same, the design concept is the same: because the orientation of the displacement needs to be selected, a physical control that can precisely indicate direction is required, and the joystick and the touchpad are the best choices.
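The trigger step above can be sketched as a small input check that maps both platforms' physical inputs to a single "selection triggered" event. This is an illustrative sketch only; the function, parameter names, and dead-zone value are assumptions, not taken from the patent.

```python
def selection_triggered(platform, joystick_xy=None, touchpad_pressed=False):
    """Return True when the displacement selection step should start.

    Oculus: pushing the joystick in any direction triggers selection.
    Vive:   pressing the touchpad at any position triggers selection.
    """
    if platform == "oculus":
        x, y = joystick_xy or (0.0, 0.0)
        # Any deflection beyond a small dead zone counts as "pushed".
        return (x * x + y * y) ** 0.5 > 0.1
    if platform == "vive":
        return touchpad_pressed
    return False
```

Under this sketch, any stick deflection beyond the dead zone starts selection, matching the "push in any direction" behaviour described for the Oculus platform, while the Vive side needs only a press anywhere on the touchpad.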
  • Because the displacement trigger mechanism is very convenient and flexible, there is a possibility of false triggering of the displacement. Therefore, a displacement cancellation mechanism is needed, which provides the game player with buffer time to confirm the displacement before the displacement of the virtual character in the virtual reality scene is determined.
  • The special effect in the virtual reality scene slowly emits a ray from the position of the player's hand toward the front of the handle, for example, the curve with an arrow shown in FIG. 8, which is a schematic diagram of a displacement special effect display according to an embodiment of the present application.
  • The curve with an arrow may have a color, for example blue, and the end of the curve extends toward the ground.
  • The extension of this curve gives the game player time for regret, that is, it provides the game player with buffer time to confirm the displacement.
  • If, during this time, the Oculus game player releases the joystick, or the Vive game player lifts the large disc touchpad, the displacement selection mechanism is cancelled and the subsequent displacement mechanism is not triggered. If the display effect in which the curve extends to the end position appears, as shown by the cylinder at the end of the curve in FIG. 8, the displacement selection in the virtual reality scene is triggered.
  • The displacement selection confirmation mechanism, that is, the displacement determination mechanism, is also designed to be very convenient.
  • After the selection is confirmed, the displacement determination mechanism is triggered.
  • The cylindrical effect at the end of the curve shown in FIG. 8 is used to show the game player the position and orientation of the determined displacement in the virtual reality scene.
  • The target position of the displacement is determined by the position where a parabola, emitted from the front of the player's hand at a certain speed, intersects the ground. Therefore, the game player only needs to move the position of the hand and rotate the angle of the front of the hand to adjust the target position of the displacement over a wide range. The orientation of the displacement is determined by the direction selection mechanism of the handle; for example, on the Oculus platform, the 360-degree direction of the joystick maps to the horizontal 360 degrees at the target position, so the game player only needs to rotate the joystick to determine the target rotational orientation.
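The two selection steps described above, the parabola-ground intersection for the target position and the 360-degree joystick mapping for the target orientation, can be sketched as follows. This is an illustrative reconstruction under assumed parameters (launch speed, gravity, a flat ground plane); none of the names or values come from the patent.

```python
import math

def target_position(hand_pos, hand_dir, speed=6.0, g=9.8, ground_y=0.0):
    """Intersect the ballistic curve launched from the hand with the
    ground plane y == ground_y; return the (x, y, z) target position."""
    px, py, pz = hand_pos
    dx, dy, dz = hand_dir  # unit vector out of the front of the hand
    vx, vy, vz = dx * speed, dy * speed, dz * speed
    # Solve py + vy*t - 0.5*g*t^2 == ground_y for the descending root.
    a, b, c = -0.5 * g, vy, py - ground_y
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # curve never reaches the ground
    t = (-b - math.sqrt(disc)) / (2 * a)  # larger, positive root
    return (px + vx * t, ground_y, pz + vz * t)

def target_yaw(stick_x, stick_y):
    """Map the joystick's 360-degree deflection to a horizontal facing
    angle in degrees at the target position."""
    return math.degrees(math.atan2(stick_x, stick_y)) % 360.0
```

Moving the hand shifts the launch point, and tilting it changes the launch direction, so small hand motions sweep the landing point over a wide range, as the text describes; the stick deflection alone fixes the facing angle.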
  • A prominent arrow can be used to clearly show the target rotational orientation. Thus, this embodiment uses the displacement special effect to visualize the displacement of the virtual character for the game player, so that the game player knows where the virtual character will arrive and which direction it will face.
  • The reachable area of the virtual reality scene is calibrated in advance. For a non-reachable area, the emitted parabola may still intersect a target position within that area, but because the virtual character cannot reach it, the effect of the parabola and the target position display is set to a striking color, for example red, to remind the game player that the area cannot be reached.
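A minimal sketch of the legality check and colour cue described above, assuming the calibrated reachable area is stored as axis-aligned rectangles on the ground plane (a hypothetical representation; the patent does not specify one):

```python
def classify_target(target_xz, reachable_regions):
    """Check the parabola's landing point against pre-calibrated
    reachable rectangles (x0, z0, x1, z1) and pick the effect colour:
    blue when the target is legal, a striking red otherwise."""
    x, z = target_xz
    for (x0, z0, x1, z1) in reachable_regions:
        if x0 <= x <= x1 and z0 <= z <= z1:
            return True, "blue"
    return False, "red"

# Hypothetical calibrated area: one 20 x 20 reachable square.
regions = [(-10.0, -10.0, 10.0, 10.0)]
```

A landing point inside the square yields `(True, "blue")`; one outside yields `(False, "red")`, which would drive the red warning display described in the text.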
  • If the game player releases the joystick or lifts the large disc touchpad at this point, the selected displacement is cancelled and the displacement performance is not triggered.
  • In summary, the game player's operation is as follows. On the Oculus platform, the game player pushes the joystick to trigger the displacement selection mechanism, adjusts the position and orientation of the hand in space to determine the position of the target displacement, adjusts the joystick direction to determine the orientation of the target displacement, and finally presses the joystick to confirm and start the displacement. On the Vive platform, the game player presses the touchpad to trigger the displacement selection mechanism, adjusts the position and orientation of the hand in space to determine the position of the target displacement, adjusts the position of the finger's touch point on the touchpad to determine the orientation of the target displacement, and finally lifts the hand off the touchpad to confirm and start the displacement.
  • The whole operation can be performed with one hand, which is very convenient and fast. In this way, the player determines where the virtual character is to be displaced.
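The one-handed Oculus flow above (push to select, press to confirm, release to cancel) can be summarised as a tiny state machine. This is a hypothetical sketch; the state and event names are assumptions:

```python
IDLE, SELECTING = "idle", "selecting"

def step(state, event):
    """Advance the displacement-selection state; return (new_state, action)."""
    if state == IDLE and event == "stick_pushed":
        return SELECTING, "show_curve"        # trigger the selection effect
    if state == SELECTING and event == "stick_pressed":
        return IDLE, "confirm_displacement"   # start the displacement
    if state == SELECTING and event == "stick_released":
        return IDLE, "cancel_selection"       # abandon without moving
    return state, None                        # ignore anything else
```

The Vive flow would use the same two states with touchpad events (press to select, lift to confirm), differing only in which event plays the confirming role.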
  • After that, the virtual character corresponding to the game player is triggered to perform the displacement operation, and the following steps are performed:
  • For the game player, the corresponding virtual character first stays at its current position in the virtual reality scene, that is, remains at rest. The camera then slowly fades to a black screen; as the lens slowly recovers, the virtual character is already at the previously selected new position and orientation, and during the slow recovery of the lens the virtual character stays at that new position and orientation, that is, it remains stationary. In this process, the game player does not feel the acceleration and speed caused by continuous movement of the virtual character, so no feeling of dizziness is brought about.
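The fade-out / relocate / fade-in presentation above can be sketched frame by frame. In this hypothetical illustration, screen alpha stands in for the camera fade; the character changes position exactly once, while the screen is fully black, and never occupies any intermediate point:

```python
def teleport_frames(start, end, fade_steps=3):
    """Yield (character_position, screen_alpha) per frame: the character
    sits still at `start` while the screen darkens to alpha 1.0, is
    moved exactly once while fully black, then sits still at `end`
    while the screen clears back to alpha 0.0."""
    frames = []
    for i in range(1, fade_steps + 1):        # fade out: alpha rises to 1
        frames.append((start, i / fade_steps))
    for i in range(fade_steps - 1, -1, -1):   # fade in: alpha falls to 0
        frames.append((end, i / fade_steps))
    return frames
```

Because every frame shows the character either at the old or the new position, the viewer never perceives the acceleration of a continuous move, which is the anti-dizziness property the text claims.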
  • FIG. 9 is a schematic diagram of another displacement special effect display according to an embodiment of the present application, in which the particle points at the original position drift to the new position, as if blown by the wind.
  • The particle effects at the new position and at the original position remind other game players that there is movement at these places, and the particle wind direction prompts other game players as to the change process of the current virtual character's displacement, that is, from which place it departs and at which place it arrives.
  • the operation of the game player is simple, convenient, and intuitive.
  • The selected displacement can be previewed, directly showing the game player the result of the operation. Throughout the displacement performance, the game player corresponding to the virtual character feels comfortable and does not become dizzy; in addition, during the displacement performance, other game players also understand the displacement process of the virtual character, so the scheme is also applicable to large-scale multiplayer scenes.
  • The special effect display of the above displacement may be adjusted, but the purpose and effect of the displayed special effects are consistent with those in the above embodiments: the displayed special effects provide the game player with buffer time to confirm the displacement, let the game player know the position and orientation the virtual character will reach, and the special effects at the new position and the original position prompt other game players that there is movement, indicating the change process of the virtual character's displacement, that is, from which place it departs and at which place it arrives.
  • This embodiment implements and is compatible with the two helmets currently on the market, Oculus and Vive, and their corresponding operating handles. If new hardware devices appear, the displacement control method for a virtual character of the embodiments of the present application is also applicable to their operation.
  • The technical solutions of the embodiments of the present application may, in essence, be embodied in the form of a software product stored in a non-transitory computer readable storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc), including a plurality of machine readable instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
  • FIG. 10a is a schematic diagram of a displacement control device for a virtual character according to an embodiment of the present application.
  • the apparatus may include a receiving unit 10, a determining unit 20, and a control unit 30.
  • the receiving unit 10 is configured to receive, in the virtual reality scenario, a first instruction for instructing the virtual character to generate a target displacement from the first orientation.
  • The determining unit 20 is configured to determine, after the first instruction is received, a second orientation that is offset from the first orientation by the target displacement.
  • the control unit 30 is configured to control the virtual character to disappear in the first orientation, and control the virtual character to appear in the second orientation, wherein the virtual character does not appear between the first orientation and the second orientation.
  • the control unit 30 includes: a first control module and a second control module.
  • the first control module is configured to control the virtual character to stay in the first orientation after determining the second orientation, and control the preset lens black screen, wherein the preset lens is used to display the screen of the virtual reality scene; the second control module For controlling the preset lens to resume display after controlling the preset lens black screen, and controlling the virtual character to stay in the second orientation.
  • the first control module is configured to control the gradual black screen of the preset lens, wherein the virtual character stays in the first orientation during the gradual blackout of the preset lens; and the second control module is configured to control the preset The lens gradually resumes display, wherein the virtual character stays in the second orientation while the preset lens is gradually restored to display.
  • control unit 30 includes: a third control module and a fourth control module.
  • The third control module is configured to, after the second orientation is determined, control the virtual character to disappear at the first orientation and display first indication information at the first orientation, where the first indication information is used to indicate that there is movement at the first orientation in the virtual reality scene;
  • the fourth control module is configured to control the virtual character to appear at the second orientation and display second indication information at the second orientation, where the second indication information is used to indicate that there is movement at the second orientation in the virtual reality scene.
  • The third control module includes a first display sub-module, configured to display moving first point particles at the first orientation, where the first indication information includes the moving first point particles;
  • the fourth control module includes a second display sub-module, configured to display moving second point particles at the second orientation, where the second indication information includes the moving second point particles, and the movement direction from the first point particles to the second point particles is used to represent the change process of the virtual character from the first orientation to the second orientation.
  • The apparatus further includes a display unit 40, configured to display third indication information for indicating the target displacement before the second orientation that is offset from the first orientation by the target displacement is determined.
  • the determining unit 20 includes: a first determining module, configured to determine the second orientation according to the third indication information.
  • the display unit 40 includes: a display module for displaying a curve for indicating a target displacement, wherein the third indication information includes a curve.
  • The first determining module includes a first determining sub-module and a second determining sub-module.
  • The first determining sub-module is configured to determine the intersection position of the curve and a preset plane, where the preset plane is used to support the virtual character; the second determining sub-module is configured to determine the area within a preset distance range from the intersection position as the position indicated by the second orientation.
  • the second determining sub-module is further configured to detect whether the second orientation is legal in the virtual reality scene; when detecting that the second orientation is legal in the virtual reality scene, the distance intersecting the position is within a preset distance range The area is determined as the position indicated by the second orientation; and when the second orientation is detected to be invalid in the virtual reality scene, the preset identifier information is displayed, wherein the preset identifier information is used to indicate that the second orientation is not in the virtual reality scene legitimate.
  • the apparatus further includes: a first receiving unit 50 and a canceling unit 60.
  • The first receiving unit 50 is configured to receive a second instruction after the third indication information for indicating the target displacement is displayed, where the second instruction is used to instruct the virtual character to cancel a displacement from the first orientation, and the displacement includes the target displacement; the cancelling unit 60 is configured to control the virtual character to cancel the target displacement from the first orientation after the second instruction is received.
  • The determining unit 20 includes: a first receiving module, configured to receive a third instruction, where the third instruction is used to indicate the position of the second orientation indication; and a second determining module, configured to determine the position of the second orientation indication after the third instruction is received.
  • the first receiving module includes: a first acquiring sub-module, configured to acquire location information of the first operating object in a real scene, where the first operating object is used to adjust a position indicated by the second orientation, The location information corresponds to the location indicated by the second orientation; the second acquisition submodule is configured to acquire the third instruction according to the location information.
  • The determining unit 20 includes: a second receiving module, configured to receive a fourth instruction, where the fourth instruction is used to indicate the orientation of the second orientation indication; and a third determining module, configured to determine the orientation of the second orientation indication after the fourth instruction is received.
  • the second receiving module includes: a third acquiring sub-module, configured to acquire angle information of the second operating object in the real scene, where the second operating object is used to adjust the orientation of the second orientation indication, The angle information corresponds to the orientation of the second orientation indication; the fourth acquisition submodule is configured to acquire the fourth instruction according to the angle information.
  • the receiving unit 10 includes: a third receiving module, configured to receive the first instruction by the joystick; the determining unit 20 includes: a fourth determining module, configured to determine the second orientation by the joystick; or
  • the receiving unit 10 includes: a fourth receiving module, configured to receive the first instruction by using the touch panel; and the determining unit 20 includes: a fifth determining module, configured to determine the second orientation by using the touch panel.
  • the receiving unit 10 in this embodiment may be used to perform step S202 in the foregoing method embodiment of the present application.
  • the determining unit 20 in this embodiment may be used to perform step S204 in the foregoing method embodiment of the present application.
  • the control unit 30 in this embodiment may be used to perform step S206 in the foregoing method embodiment of the present application.
  • In this embodiment, the receiving unit 10 receives, in the virtual reality scene, a first instruction for instructing the virtual character to generate a target displacement from the first orientation; after the first instruction is received, the determining unit 20 determines a second orientation that is offset from the first orientation by the target displacement; and the control unit 30 controls the virtual character to disappear at the first orientation and appear at the second orientation, where the virtual character does not appear between the first orientation and the second orientation. That is, in the virtual reality scene, the virtual character is controlled to disappear at the first orientation and appear at the second orientation, achieving the purpose of avoiding continuous movement by moving the virtual character instantaneously, thereby avoiding the technical problem in the related art that continuous movement of the virtual character easily causes dizziness of the user.
  • the above-mentioned units and modules are the same as the examples and application scenarios implemented by the steps in the corresponding method embodiments, but are not limited to the contents disclosed in the foregoing method embodiments.
  • the foregoing module may be implemented in a hardware environment as shown in FIG. 1 as part of the device, and may be implemented by software or by hardware, where the hardware environment includes a network environment.
  • the embodiment of the present application further provides a computing device, such as a server or a terminal, for implementing the above-described displacement control method for a virtual character.
  • FIG. 11 is a structural block diagram of a terminal according to an embodiment of the present application.
  • The terminal may include one or more processors 111 (only one is shown in the figure), a memory 113, and a transmission device 115.
  • Optionally, the terminal may further include an input/output device 117.
  • The memory 113 may be used to store software programs and machine readable instruction modules, such as the program instructions/modules corresponding to the displacement control method and device for a virtual character in the embodiments of the present application. The processor 111 runs the software programs and modules stored in the memory 113, thereby performing various function applications and data processing, that is, implementing the above displacement control method for a virtual character.
  • Memory 113 may include high speed random access memory, and may also include non-volatile memory such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory.
  • memory 113 may further include memory remotely located relative to processor 111, which may be connected to the terminal over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • the transmission device 115 described above is used to receive or transmit data via a network, and can also be used for data transmission between the processor and the memory. Specific examples of the above network may include a wired network and a wireless network.
  • the transmission device 115 includes a Network Interface Controller (NIC) that can be connected to other network devices and routers via a network cable to communicate with the Internet or a local area network.
  • transmission device 115 is a Radio Frequency (RF) module for communicating wirelessly with the Internet.
  • the memory 113 is used to store an application, that is, machine readable instructions.
  • the processor 111 can call and execute the application stored in the memory 113 through the transmission device 115 to perform the following steps:
  • receive, in a virtual reality scene, a first instruction, where the first instruction is used to instruct the virtual character to generate a target displacement from a first orientation; after the first instruction is received, determine a second orientation that is offset from the first orientation by the target displacement; and control the virtual character to disappear at the first orientation and appear at the second orientation, where the virtual character does not appear between the first orientation and the second orientation.
  • the processor 111 is further configured to: after determining the second orientation, control the virtual character to stay in the first orientation, and control the preset lens black screen, wherein the preset lens is used to display the screen of the virtual reality scene; After the preset lens is black, the preset lens is controlled to resume display, and the virtual character is controlled to stay in the second orientation.
  • The processor 111 is further configured to: control the gradual black screen of the preset lens, where the virtual character stays in the first orientation during the gradual blackout of the preset lens; and control the preset lens to gradually resume display, where the virtual character stays in the second orientation while the preset lens gradually resumes display.
  • the processor 111 is further configured to: after determining the second orientation, control the virtual character to disappear in the first orientation, and display the first indication information in the first orientation, wherein the first indication information is used to indicate the first orientation
  • the virtual reality scene has a motion; the control virtual character appears in the second orientation, and the second indication information is displayed in the second orientation, wherein the second indication information is used to indicate that the second orientation has motion in the virtual reality scene.
  • the processor 111 is further configured to: display the moving first point particle in the first orientation, wherein the first indication information comprises the moving first point particle; and the second orientation shows the moving second point a particle, wherein the second indication information includes the moving second point particle, and the moving direction of the first point particle to the second point particle is used to indicate a change process of the virtual character from the first orientation to the second orientation.
  • The processor 111 is further configured to: display third indication information for indicating the target displacement before the second orientation that is offset from the first orientation by the target displacement is determined; and determine the second orientation according to the third indication information.
  • the processor 111 is further configured to: display a curve for indicating a target displacement, wherein the third indication information includes a curve; determining a intersection position of the curve with the preset plane, wherein the preset plane is used to support the virtual character; The area within the preset distance range from the intersection position is determined as the position indicated by the second orientation.
  • the processor 111 is further configured to: determine whether the second orientation is legal in the virtual reality scene; if it is detected that the second orientation is legal in the virtual reality scene, determine an area within a preset distance range from the intersecting position as the first The position of the two-direction indication; if it is detected that the second orientation is invalid in the virtual reality scene, the preset identification information is displayed, wherein the preset identification information is used to indicate that the second orientation is illegal in the virtual reality scene.
  • The processor 111 is further configured to: receive a second instruction after the third indication information for indicating the target displacement is displayed, where the second instruction is used to instruct the virtual character to cancel a displacement from the first orientation, and the displacement includes the target displacement; and control the virtual character to cancel the target displacement from the first orientation after the second instruction is received.
  • the processor 111 is further configured to: receive a third instruction, wherein the third instruction is used to indicate a location of the second orientation indication; and after receiving the third instruction, determine a location of the second orientation indication.
  • the processor 111 is further configured to: obtain location information of the first operation object in the real scene, where the first operation object is used to adjust the position indicated by the second orientation, the location information and the location indicated by the second orientation Corresponding; obtaining a third instruction according to the location information.
  • the processor 111 is further configured to: receive a fourth instruction, wherein the fourth instruction is used to indicate the orientation of the second orientation indication; after receiving the fourth instruction, determine the orientation of the second orientation indication.
  • the processor 111 is further configured to: obtain angle information of the second operation object in the real scene, where the second operation object is used to adjust the orientation of the second orientation indication, the angle information and the orientation of the second orientation indication Corresponding; obtaining the fourth instruction according to the angle information.
  • The embodiment of the present application provides a scheme for displacement control of a virtual character, thereby solving the technical problem that continuous movement of the virtual character is likely to cause dizziness of the user.
  • A person of ordinary skill in the art may understand that the structure shown in FIG. 11 is only schematic, and the terminal may be a smart phone (such as an Android mobile phone or an iOS mobile phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, or another terminal device.
  • FIG. 11 does not limit the structure of the above electronic device.
  • the terminal may also include more or less components (such as a network interface, display device, etc.) than shown in FIG. 11, or have a different configuration than that shown in FIG.
  • All or part of the steps of the foregoing embodiments may be completed by a program instructing hardware related to the terminal device, and the program may be stored in a computer readable storage medium. The storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
  • Embodiments of the present application also provide a non-transitory computer readable storage medium.
  • the foregoing storage medium may be used to store program code of a displacement control method of a virtual character provided by the foregoing method embodiment.
  • the foregoing storage medium may be located on at least one of the plurality of network devices in the network shown in the foregoing embodiment.
  • the storage medium is arranged to store program code for performing the following steps:
  • receiving, in a virtual reality scene, a first instruction, where the first instruction is used to instruct the virtual character to generate a target displacement from a first orientation; determining, after the first instruction is received, a second orientation that is offset from the first orientation by the target displacement; and controlling the virtual character to disappear at the first orientation and appear at the second orientation, where the virtual character does not appear between the first orientation and the second orientation.
  • the storage medium is further configured to store program code for performing the following steps: after determining the second orientation, controlling the virtual character to stay in the first orientation and controlling the preset shot black screen, wherein the preset The lens is used to display the picture of the virtual reality scene; after controlling the black screen of the preset lens, the preset lens is controlled to resume display, and the virtual character is controlled to stay in the second orientation.
  • the storage medium is further configured to store program code for performing a step of: controlling the gradual blackout of the preset lens, wherein the avatar stays in the first orientation during the gradual blackout of the preset lens Controlling the preset lens gradually resumes display, wherein the virtual character stays in the second orientation while the preset lens is gradually restored to display.
  • the storage medium is further configured to store program code for performing the following steps: after determining the second orientation, controlling the virtual character to disappear in the first orientation, and displaying the first indication information in the first orientation,
  • the first indication information is used to indicate that the first orientation has motion in the virtual reality scene; the control virtual character appears in the second orientation, and the second indication information is displayed in the second orientation, where the second indication information is used to indicate The two directions are moving in the virtual reality scene.
  • the storage medium is further configured to store program code for performing the step of displaying the first point particle moving in the first orientation, wherein the first indication information comprises the first point particle moving Displaying the moving second point particle in the second orientation, wherein the second indication information comprises the moving second point particle, and the moving direction of the first point particle to the second point particle is used to indicate the virtual character from the first orientation The process of change to the second orientation.
  • Optionally, the storage medium is further configured to store program code for performing the following steps: displaying third indication information for indicating the target displacement before the second orientation that is offset from the first orientation by the target displacement is determined; and determining the second orientation according to the third indication information.
  • the storage medium is further configured to store program code for performing the steps of: displaying a curve for indicating a target displacement, wherein the third indication information comprises a curve; determining a intersection of the curve with the preset plane Wherein, the preset plane is used to support the virtual character; and the area within the preset distance range from the intersection position is determined as the position indicated by the second orientation.
  • the storage medium is further configured to store program code for performing the steps of: detecting whether the second orientation is legal in the virtual reality scene; if detecting that the second orientation is legal in the virtual reality scene, the distance is The area within the preset distance range of the intersection position is determined as the position indicated by the second orientation; if it is detected that the second orientation is invalid in the virtual reality scene, the preset identification information is displayed, wherein the preset identification information is used to indicate the second orientation Not legal in virtual reality scenes.
  • the storage medium is further configured to store program code for performing the following steps: after displaying the third indication information used to indicate the target displacement, receiving a second instruction, wherein the second instruction is used to instruct the virtual character to cancel producing a displacement from the first orientation, the displacement comprising the target displacement; and after receiving the second instruction, controlling the virtual character to cancel producing the target displacement from the first orientation.
  • the storage medium is further configured to store program code for performing the steps of: receiving a third instruction, wherein the third instruction is used to indicate the position indicated by the second orientation; and after receiving the third instruction, determining the position indicated by the second orientation.
  • the storage medium is further configured to store program code for performing the following steps: acquiring position information of a first operation object in a real scene, wherein the first operation object is used to adjust the position indicated by the second orientation, the position information corresponding to that position; and obtaining the third instruction according to the position information.
  • the storage medium is further configured to store program code for performing the steps of: receiving a fourth instruction, wherein the fourth instruction is used to indicate the facing direction indicated by the second orientation; and after receiving the fourth instruction, determining the facing direction indicated by the second orientation.
  • the storage medium is further configured to store program code for performing the following steps: acquiring angle information of a second operation object in a real scene, wherein the second operation object is used to adjust the facing direction indicated by the second orientation, the angle information corresponding to that direction; and obtaining the fourth instruction according to the angle information.
  • the foregoing storage medium may include, but is not limited to, various media that can store program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
  • if the integrated unit in the above embodiments is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in the above-described computer-readable storage medium.
  • the technical solutions of the embodiments of the present application, in essence or the part contributing to the prior art, may be embodied in the form of a software product, and the computer software product is stored in the storage medium.
  • a plurality of machine-readable instructions are included to cause one or more computer devices (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods described in the various embodiments of the present application.
  • the disclosed client may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of the units is merely a logical function division, and there may be other division manners in actual implementation.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, units, or modules, and may be in electrical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.


Abstract

A displacement control method and apparatus for a virtual character, and a storage medium. The method includes: in a virtual reality scene, receiving a first instruction, where the first instruction is used to instruct a virtual character to produce a target displacement from a first orientation (S202); after the first instruction is received, determining a second orientation separated from the first orientation by the target displacement (S204); and controlling the virtual character to disappear at the first orientation and to appear at the second orientation, where the virtual character has not appeared between the first orientation and the second orientation (S206).

Description

Displacement control method and apparatus for a virtual character, and storage medium
This application claims priority to Chinese Patent Application No. 201710613071.2, entitled "Displacement control method and apparatus for a virtual character" and filed with the Chinese Patent Office on July 25, 2017, which is incorporated herein by reference in its entirety.
Technical Field
The embodiments of this application relate to the field of computers, and specifically to a displacement control method and apparatus for a virtual character, and a storage medium.
Background
How to control a virtual character to move freely in virtual space has been an important problem ever since three-dimensional (3D) games appeared. In the early days of each device platform, users had not yet formed operating habits, and the methods for controlling the movement of a virtual character were diverse. Later, standardized designs and implementations for moving a virtual character freely in virtual space emerged, became design standards, and in turn became users' operating habits.
Summary
An embodiment of this application provides a displacement control method for a virtual character. The method is applied to a computing device and includes: in a virtual reality scene, receiving a first instruction, where the first instruction is used to instruct a virtual character to produce a target displacement from a first orientation; after the first instruction is received, determining a second orientation separated from the first orientation by the target displacement; and controlling the virtual character to disappear at the first orientation and to appear at the second orientation, where the virtual character has not appeared between the first orientation and the second orientation.
An embodiment of this application further provides a displacement control apparatus for a virtual character. The apparatus includes a processor and a memory connected to the processor, where the memory stores machine-readable instructions executable by the processor, and the processor executes the machine-readable instructions to perform the following operations:
in a virtual reality scene, receiving a first instruction, where the first instruction is used to instruct a virtual character to produce a target displacement from a first orientation; after the first instruction is received, determining a second orientation separated from the first orientation by the target displacement; and controlling the virtual character to disappear at the first orientation and to appear at the second orientation, where the virtual character has not appeared between the first orientation and the second orientation.
An embodiment of this application further provides a non-volatile computer-readable storage medium storing machine-readable instructions, where the machine-readable instructions can be executed by a processor to perform the displacement control method for a virtual character described above.
Brief Description of the Drawings
The drawings described here are provided for a further understanding of the embodiments of this application and constitute a part of this application. The exemplary embodiments of this application and their descriptions are used to explain this application and do not constitute any improper limitation on this application. In the drawings:
FIG. 1 is a schematic diagram of a hardware environment of a displacement control method for a virtual character according to an embodiment of this application;
FIG. 2 is a flowchart of a displacement control method for a virtual character according to an embodiment of this application;
FIG. 3 is a flowchart of a method for controlling a virtual character to disappear at a first orientation and appear at a second orientation according to an embodiment of this application;
FIG. 4 is a flowchart of another method for controlling a virtual character to disappear at a first orientation and appear at a second orientation according to an embodiment of this application;
FIG. 5 is a flowchart of a method for determining a second orientation separated from a first orientation by a target displacement according to an embodiment of this application;
FIG. 6 is a flowchart of a method for determining an area within a preset distance range of an intersection position as the position indicated by a second orientation according to an embodiment of this application;
FIG. 7 is a flowchart of another displacement control method for a virtual character according to an embodiment of this application;
FIG. 8 is a schematic diagram of a displacement special-effect display according to an embodiment of this application;
FIG. 9 is a schematic diagram of another displacement special-effect display according to an embodiment of this application;
FIG. 10a is a schematic diagram of a displacement control apparatus for a virtual character according to an embodiment of this application;
FIG. 10b is a schematic diagram of a displacement control apparatus for a virtual character according to an embodiment of this application;
FIG. 10c is a schematic diagram of a displacement control apparatus for a virtual character according to an embodiment of this application; and
FIG. 11 is a structural block diagram of a terminal according to an embodiment of this application.
Detailed Description
To enable those skilled in the art to better understand the solutions of this application, the technical solutions in the embodiments of this application are described clearly and completely below with reference to the drawings in the embodiments of this application. Obviously, the described embodiments are only some rather than all of the embodiments of this application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of this application without creative effort shall fall within the protection scope of this application.
It should be noted that the terms "first", "second", and the like in the specification, the claims, and the above drawings of this application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way may be interchanged where appropriate, so that the embodiments of this application described here can be implemented in an order other than those illustrated or described here. In addition, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to those steps or units expressly listed, but may include other steps or units not expressly listed or inherent to the process, method, product, or device.
In practical applications, methods for controlling the movement of a virtual character in 3D space can be implemented on three major platforms: PC, game console, and mobile phone.
On a PC, a keyboard and a mouse are the default input devices, and key controls for the virtual character are highly standardized: the "W", "A", "S", and "D" keys on the keyboard are used for forward, backward, left strafe, and right strafe, producing continuous linear movement of the virtual character. The character's front, back, left, and right are determined relative to the viewing angle of the camera controlled by the mouse, where the mouse controls the camera's facing direction, which can be at any angle.
Each game console has its own dedicated gamepad; that is, the control operations of each console are predefined, and the free displacement of the virtual character in virtual space is generally controlled by the thumbstick on the gamepad. Console operation is not very different from PC operation: movement is continuous, and the buttons simulate the same operations as a mouse; only the hardware is customized.
Smartphones have shifted comprehensively toward heavyweight games; that is, more and more traditional 3D games have been ported to mobile phones. The original physical key operations have evolved into virtual key operations on the phone, and the movement of virtual characters in 3D space is controlled by a virtual joystick.
The 3D experienced on a virtual reality (VR) device is all-round; that is, the user no longer perceives a 3D virtual space converted into a 2D picture displayed on a plane, but is placed inside that virtual space. Many flying games, shooting games, and the like currently make wide use of VR, but when the visual perception and the bodily perception are inconsistent, the user becomes very dizzy. Therefore, VR is not suitable for simulating the continuous movement of a first-person virtual character.
The HTC Vive on the Vive platform is a virtual reality head-mounted display jointly developed by HTC and Valve Corporation. Using "room-scale" technology, the headset converts a room into three-dimensional space through sensors, allowing the user to navigate naturally in the virtual world, walk around, and use motion-tracked handheld controllers to vividly manipulate objects, achieving precise interaction, communication, and an immersive environment. Because the Vive device can track a certain amount of real space and fully map it to virtual space, many casual games design their virtual space to be about the same size as the Vive tracking space, so that the user can move completely freely within this space. However, the size of the simulated space is limited, and the size of the virtual space that different devices can track is inconsistent, making it difficult to adapt to different devices.
In view of this, an embodiment of this application provides a displacement control method for a virtual character.
In some embodiments of this application, the above displacement control method for a virtual character can be applied to the hardware environment formed by a server 102 and a terminal 104 shown in FIG. 1. FIG. 1 is a schematic diagram of a hardware environment of a displacement control method for a virtual character according to an embodiment of this application. As shown in FIG. 1, the server 102 is connected to the terminal 104 through a network; the network includes but is not limited to a wide area network, a metropolitan area network, or a local area network, and the terminal 104 is not limited to a PC, a mobile phone, a tablet computer, or the like. The displacement control method for a virtual character of the embodiments of this application may be executed by a computing device, for example by the server 102, by the terminal 104, or jointly by the server 102 and the terminal 104. When the terminal 104 executes the displacement control method of the embodiments of this application, the method may also be executed by a client installed on the terminal.
FIG. 2 is a flowchart of a displacement control method for a virtual character according to an embodiment of this application. The method can be applied to a computing device. As shown in FIG. 2, the method may include the following steps:
Step S202: In a virtual reality scene, receive a first instruction used to instruct a virtual character to produce a target displacement from a first orientation.
In the technical solution provided in the above step S202 of this application, in the virtual reality scene, the first instruction used to instruct the virtual character to produce the target displacement from the first orientation is received.
Virtual reality, also known as virtual environment, uses virtual technology and integrates the latest developments in computer graphics, computer simulation, artificial intelligence, sensing, display, and network parallel processing. It is a high-tech simulation system generated with the aid of computer technology: a computer simulation produces a virtual world in three-dimensional space, providing the user with a simulated environment for senses such as vision and allowing timely, unrestricted observation of things in the three-dimensional space, thereby giving the user an immersive experience. When the virtual character representing the user moves in the virtual environment, the computer can immediately perform complex computations on the virtual character's information and feed back the precise three-dimensional space in the form of video, giving the user a sense of presence.
The virtual reality scene of this embodiment is a scene obtained by simulating a real scene with the above virtual reality technology and suitable for a specific application. For example, the virtual reality scene is a scene suitable for a game application obtained by fully mapping real space into virtual space through virtual reality technology, where the game application is a VR game application and, in some embodiments of this application, an application that controls the displacement presentation process of a virtual character. In this virtual reality scene, the user is matched to the virtual character; in some embodiments of this application, the user's operations are matched to the virtual character's behavior. For example, pushing the thumbstick of the VR device in any direction triggers the displacement selection mechanism of the virtual character, and releasing the thumbstick cancels the displacement selection mechanism. In addition, the virtual reality scene can provide the user with a simulated scene for senses such as vision, so that the user can observe things in the scene through the virtual character in a timely and unrestricted manner, as if in a real scene.
According to some embodiments of this application, in the above virtual reality scene, the displacement change process of the virtual character can be implemented; the area the virtual character can reach can be determined, for example, the legal area in which the virtual character is allowed to move; the operations the virtual character can perform can be determined, for example, the skill types the virtual character is allowed to use in combat; and the attributes of the virtual character can be assigned, for example, how easily the virtual character gains or loses health points, which is not limited here.
According to some embodiments of this application, the virtual reality scene is a massively multiplayer online scene; that is, the virtual reality scene includes multiple online virtual characters, and the user corresponding to each virtual character can also learn about the behavior change processes of the virtual characters corresponding to other users, for example, their displacement change processes.
It should be noted that the above virtual reality scene is only a preferred implementation in the embodiments of this application and does not mean that the virtual reality scene of the embodiments of this application is limited to the above manner. Any virtual reality scene that can implement the displacement control method for a virtual character and avoid the user dizziness easily caused by continuous movement of the virtual character falls within the protection scope of this application, and examples are not given one by one here.
The virtual character of this embodiment moves in the virtual reality scene, and its displacement can be controlled. First, the displacement mechanism of the virtual character is triggered; that is, the displacement selection mechanism of the virtual character in the virtual reality scene is started, and the first instruction used to instruct the virtual character to produce the target displacement from the first orientation is received; that is, the target displacement to be moved by the virtual character in the virtual reality scene is selected. The triggering of the first instruction is convenient and flexible. The first orientation is the starting orientation of the virtual character in the virtual reality scene before displacement, including the initial position and the initial facing direction in the virtual reality scene, where the facing direction is the direction of the virtual character's displacement.
According to some embodiments of this application, the first instruction is generated based on a physical handle: the thumbstick can be pushed in any direction to trigger the selection of the target displacement, or the touchpad can be pressed at any position to trigger the selection of the target displacement, thereby selecting the unique facing direction of the virtual character. Some embodiments of this application use physical keys that can precisely control direction, such as a thumbstick and a touchpad.
According to some embodiments of this application, on the Oculus platform, the thumbstick is pushed in any direction to trigger the selection of the displacement position; on the Vive platform, the touchpad is pressed at any position to trigger the target displacement selection. The Oculus Rift on the Oculus platform is a virtual reality head-mounted display whose software is mainly electronic games, programmed with Rift customization.
Step S204: After the first instruction is received, determine a second orientation separated from the first orientation by the target displacement.
In the technical solution provided in the above step S204 of this application, after the first instruction is received, the second orientation separated from the first orientation by the target displacement is determined.
After the first instruction is received, the second orientation separated from the first orientation by the target displacement is determined, where the second orientation is the orientation to which the virtual character is to move in the virtual reality space. When the thumbstick is pressed or the press on the touchpad is released, the displacement determination mechanism is triggered; that is, the virtual character is triggered to perform the displacement. How the second orientation separated from the first orientation by the target displacement is determined will be described in detail in the following embodiments.
According to some embodiments of this application, because the first instruction is triggered conveniently and flexibly, the displacement trigger mechanism has the possibility of false triggering. This embodiment therefore also provides a displacement cancellation mechanism: the thumbstick can be released, or the large-disc touchpad can be lifted off, to cancel the determined target displacement. For example, on the Oculus platform the user can release the thumbstick to cancel the selected target displacement, and on the Vive platform the user can lift off the touchpad to cancel the selected target displacement, without triggering the subsequent displacement selection mechanism.
According to some embodiments of this application, after the target displacement is determined, it can be previewed, for example with a curve previewing the determined target displacement. The time of the curve's extension process is the buffer time for confirming the displacement, and when the displacement is falsely triggered, the user can cancel the determined displacement in time. The display effect of the curve in the virtual reality scene is a ray slowly shot from the position of the hand straight ahead of the handle, for example a slowly emitted curve with an arrow, which takes about one second to bend and extend to the ground, so that the head of the curve indicates the position of the second orientation.
According to some embodiments of this application, on the Oculus platform, the user's complete operation flow is: push the thumbstick to trigger the displacement selection mechanism, then adjust the spatial position and direction of the hand to determine the position of the target displacement, then adjust the direction of the thumbstick to determine the direction of the target displacement, and finally press the thumbstick to confirm starting the target displacement. On the Vive platform, the complete operation flow is: press the touchpad to trigger the displacement selection mechanism, then adjust the spatial position and direction of the hand to determine the position of the target displacement, then adjust the touch-point position of the hand to determine the direction of the target displacement, and finally lift the hand off the touchpad to confirm starting the target displacement. This makes determining the target displacement simple, convenient, and fast.
Step S206: Control the virtual character to disappear at the first orientation, and control the virtual character to appear at the second orientation, where the virtual character has not appeared between the first orientation and the second orientation.
In the technical solution provided in the above step S206 of this application, the virtual character is controlled to disappear at the first orientation and appear at the second orientation, where the virtual character has not appeared between the first orientation and the second orientation.
After the second orientation separated from the first orientation by the target displacement is determined, the virtual character is controlled to disappear at the first orientation and appear at the second orientation. For the user, the virtual character stays still at the first orientation while the camera slowly fades to black; when the camera recovers, the virtual character is at the position and facing direction indicated by the second orientation, and during the gradual recovery of the camera the virtual character remains still at that position and facing direction. During the process of disappearing from the first orientation and appearing at the second orientation, the virtual character has not appeared between the first orientation and the second orientation. Instantaneous movement of the virtual character in the virtual reality scene is thereby achieved; the user does not perceive the acceleration and velocity that continuous movement would bring, so continuous movement is avoided through instantaneous movement, the feeling of dizziness is avoided, and the user experience is improved.
For other users in the virtual reality scene, the character suddenly disappears at the first orientation and suddenly appears at the second orientation. Special effects with point particles rotating around can appear simultaneously at the first orientation and the second orientation, and the particle points at the first orientation drift toward the second orientation in a disturbance, as if blown over by wind; that is, a particle wind is produced, which prompts other users that there is movement at the first and second orientations. The particle wind direction prompts other users from where to where in the virtual reality scene the current virtual character's displacement change process goes, so that other users in the virtual reality scene can clearly see the character's displacement process. The operation method adapts to multiple hardware specifications, is simple, convenient, and easy to learn, and performs well in massively multiplayer real-time online virtual reality scenes.
Through the above steps S202 to S206, in the virtual reality scene, the first instruction used to instruct the virtual character to produce the target displacement from the first orientation is received; after the first instruction is received, the second orientation separated from the first orientation by the target displacement is determined; and the virtual character is controlled to disappear at the first orientation and appear at the second orientation, where the virtual character has not appeared between the first orientation and the second orientation. That is, by controlling the virtual character to disappear at the first orientation and appear at the second orientation in the virtual reality scene, the purpose of avoiding continuous movement through instantaneous movement of the virtual character is achieved, thereby achieving the technical effect of avoiding the user dizziness easily caused by continuous movement of the virtual character and solving the corresponding technical problem in the related art.
In some other embodiments of this application, step S206 of controlling the virtual character to disappear at the first orientation and appear at the second orientation includes: after the second orientation is determined, controlling the virtual character to stay at the first orientation and controlling a preset camera to go black; and after controlling the preset camera to go black, controlling the preset camera to resume display and controlling the virtual character to stay at the second orientation.
FIG. 3 is a flowchart of a method for controlling a virtual character to disappear at a first orientation and appear at a second orientation according to an embodiment of this application. As shown in FIG. 3, the method includes the following steps:
Step S301: After the second orientation is determined, control the virtual character to stay at the first orientation and control the preset camera to go black.
In the technical solution provided in the above step S301 of this application, after the second orientation is determined, the virtual character is controlled to stay at the first orientation and the preset camera is controlled to go black, where the preset camera is used to display the picture of the virtual reality scene.
The first orientation is the initial position of the virtual character before displacement in the virtual reality scene. After the second orientation separated from the first orientation by the target displacement is determined, the virtual character is controlled to stay at the first orientation, that is, to remain motionless there, and the preset camera goes black, so that the virtual character disappears at the first orientation. The preset camera is used to display the picture of the virtual reality scene, that is, the picture of the virtual reality scene seen from the perspective of the user corresponding to the virtual character.
In this embodiment, the display effect of the virtual character suddenly disappearing at the first orientation is achieved by controlling the virtual character to stay at the first orientation and controlling the preset camera to go black.
Step S302: After controlling the preset camera to go black, control the preset camera to resume display, and control the virtual character to stay at the second orientation.
In the technical solution provided in the above step S302 of this application, after the preset camera is controlled to go black, the preset camera is controlled to resume display and the virtual character is controlled to stay at the second orientation.
After the virtual character is controlled to stay at the first orientation and the preset camera goes black, the preset camera resumes picture display; that is, the screen is no longer black. At this time, the virtual character stays at the second orientation in the virtual reality scene, which is the orientation the character has newly reached, separated from the first orientation by the target displacement determined by the displacement selection mechanism. The virtual character does not appear between the first orientation and the second orientation, thereby avoiding the dizziness that continuous movement of the virtual character in the virtual reality scene would bring the corresponding user.
In this embodiment, by controlling the preset camera to resume display and controlling the virtual character to stay at the second orientation, the effect of the virtual character suddenly appearing at a new position is achieved.
In this embodiment, after the second orientation is determined, the virtual character is controlled to stay at the first orientation and the preset camera is controlled to go black, where the preset camera is used to display the picture of the virtual reality scene; after the blackout, the preset camera is controlled to resume display and the virtual character is controlled to stay at the second orientation. The purpose of controlling the virtual character to disappear at the first orientation and appear at the second orientation is thereby achieved, attaining the technical effect of avoiding the user dizziness caused by continuous movement of the virtual character.
In some other embodiments of this application, in step S301, controlling the preset camera to go black includes: controlling the preset camera to gradually go black, where the virtual character is kept at the first orientation while the preset camera gradually goes black; and in step S302, controlling the preset camera to resume display includes: controlling the preset camera to gradually resume display, where the virtual character is kept at the second orientation while the preset camera gradually resumes display.
When the preset camera is controlled to go black, it is controlled to go black gradually; that is, while the virtual character is controlled to stay at the first orientation, the preset camera slowly fades to black, so that the virtual character disappears at the first orientation in the virtual reality scene. When the preset camera is controlled to resume display, it is controlled to resume gradually; that is, it slowly resumes displaying the picture of the virtual reality scene, and during this gradual recovery, the position and facing direction of the virtual character at the second orientation do not change; the virtual character remains still. During the gradual blackout and the gradual recovery, the user corresponding to the virtual character does not perceive the acceleration and velocity that continuous displacement would bring, so no dizziness is caused and the user experience is improved.
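The fade-out, instant relocation, and fade-in sequence described above can be sketched as follows. This is a minimal illustration only; the class and function names (`Camera`, `Character`, `teleport_with_fade`) and the frame-based fade model are assumptions, not part of the application:

```python
# Minimal sketch of the fade-to-black teleport of steps S301-S302.
# All names are illustrative, not from the application.

class Camera:
    def __init__(self):
        self.fade = 0.0  # 0.0 = fully visible, 1.0 = fully black

class Character:
    def __init__(self, position, facing):
        self.position = position
        self.facing = facing

def teleport_with_fade(camera, character, target_position, target_facing,
                       steps=10):
    """Fade out, relocate the character while the screen is black, fade in.

    The character never occupies any point between the start and target
    positions, matching the 'has not appeared between' requirement.
    Returns a list of (fade, position) frames for inspection.
    """
    frames = []
    # Gradually fade to black; the character stays at the first orientation.
    for i in range(1, steps + 1):
        camera.fade = i / steps
        frames.append((camera.fade, tuple(character.position)))
    # Screen is fully black: relocate instantly.
    character.position = list(target_position)
    character.facing = target_facing
    # Gradually restore the display; the character stays at the second orientation.
    for i in range(steps - 1, -1, -1):
        camera.fade = i / steps
        frames.append((camera.fade, tuple(character.position)))
    return frames
```

Because the relocation happens in a single step while `fade == 1.0`, every rendered frame shows the character either at the start or at the target, never in between.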
In some other embodiments of this application, step S206 of controlling the virtual character to disappear at the first orientation and appear at the second orientation includes: after the second orientation is determined, controlling the virtual character to disappear at the first orientation and displaying first indication information at the first orientation; and controlling the virtual character to appear at the second orientation and displaying second indication information at the second orientation.
FIG. 4 is a flowchart of another method for controlling a virtual character to disappear at a first orientation and appear at a second orientation according to an embodiment of this application. As shown in FIG. 4, the method includes the following steps:
Step S401: After the second orientation is determined, control the virtual character to disappear at the first orientation, and display the first indication information at the first orientation.
In the technical solution provided in the above step S401 of this application, after the second orientation is determined, the virtual character is controlled to disappear at the first orientation and the first indication information is displayed at the first orientation, where the first indication information is used to indicate that there is movement at the first orientation in the virtual reality scene.
After the second orientation separated from the first orientation by the target displacement is determined, when the virtual character is controlled to disappear at the first orientation, the first indication information is displayed at the first orientation. The first indication information may be point particles, the vanishing point where the virtual character is about to disappear at the first orientation. According to some embodiments of this application, a special effect of point particles rotating around is shown within a preset range of the first orientation to indicate that there is movement at the first orientation in the virtual reality scene, thereby reminding users to pay attention.
Step S402: Control the virtual character to appear at the second orientation, and display the second indication information at the second orientation.
In the technical solution provided in the above step S402 of this application, the virtual character is controlled to appear at the second orientation and the second indication information is displayed at the second orientation, where the second indication information is used to indicate that there is movement at the second orientation in the virtual reality scene.
After the virtual character is controlled to disappear at the first orientation, the virtual character is controlled to appear at the second orientation and the second indication information is displayed there at the same time. The second indication information may be point particles, the emergence point where the virtual character is about to appear at the second orientation. According to some embodiments of this application, a special effect of point particles rotating around is shown within a preset range of the second orientation to indicate that there is movement at the second orientation in the virtual reality scene.
In this embodiment, after the second orientation is determined, the virtual character is controlled to disappear at the first orientation and the first indication information is displayed there, where the first indication information is used to indicate that there is movement at the first orientation in the virtual reality scene; the virtual character is controlled to appear at the second orientation and the second indication information is displayed there, where the second indication information is used to indicate that there is movement at the second orientation in the virtual reality scene. Controlling the virtual character to disappear at the first orientation and appear at the second orientation is thereby implemented.
In some other embodiments of this application, in step S401, displaying the first indication information at the first orientation includes: displaying a moving first point particle at the first orientation, where the first indication information includes the moving first point particle; in step S402, displaying the second indication information at the second orientation includes: displaying a moving second point particle at the second orientation, where the second indication information includes the moving second point particle, and the direction of motion from the first point particle to the second point particle represents the change process of the virtual character from the first orientation to the second orientation.
In this embodiment, for other users in the virtual reality scene, when the first indication information is displayed at the first orientation, a moving first point particle is displayed there. The first point particle may include multiple small particles rotating around, prompting other users that there is movement at the first orientation in the virtual reality scene, and may also indicate that this orientation is the position of the virtual character before displacement. When the virtual character is controlled to appear at the second orientation and the second indication information is displayed, a moving second point particle is displayed there, prompting other users that there is movement at the second orientation in the virtual reality scene, and may also indicate that this orientation is the position of the virtual character after displacement. According to some embodiments of this application, the second point particle differs from the first point particle; for example, the number of point particles included in the second point particle differs from that of the first, and/or the sizes of the point particles differ.
The direction of motion from the first point particle to the second point particle represents the change process from the first orientation to the second orientation; the first point particle may drift toward the second orientation in a disturbance, yielding the second point particle. The special effect shown by this change process is particle wind, like the effect of wind blowing past. The particle wind direction prompts other users in the virtual reality scene that the virtual character's displacement change process goes from the first orientation to the second orientation. This displacement presentation process lets other users in the virtual reality scene also learn the character's displacement change process, and it is suitable for massively multiplayer scenes.
It should be noted that the above display effects of the first indication information and the second indication information are only preferred implementations of the embodiments of this application and do not mean that the display effects of the embodiments are limited to particle effects. Other methods that can prompt users that the virtual character's displacement change process goes from the first orientation to the second orientation all fall within the protection scope of this application and are not enumerated here.
In some other embodiments of this application, before step S204 of determining the second orientation separated from the first orientation by the target displacement, third indication information used to indicate the target displacement is displayed; step S204 of determining the second orientation separated from the first orientation by the target displacement includes: determining the second orientation according to the third indication information.
Because the displacement trigger mechanism of the virtual character is convenient and flexible, there is a possibility of false triggering when selecting the target displacement, so the user needs to be provided a buffer time, i.e., regret time, within which a falsely triggered displacement can be cancelled. In this embodiment, before the second orientation separated from the first orientation by the target displacement is determined, the user is given buffer time to confirm the displacement. After the displacement selection mechanism is triggered, the third indication information used to indicate the target displacement is displayed; it can be shown with a special effect, thereby previewing the selection result of the target displacement. The second orientation is determined according to the third indication information, for example, its position and facing direction in the virtual reality scene are determined according to the third indication information.
In some other embodiments of this application, displaying the third indication information used to indicate the target displacement includes: displaying a curve used to indicate the target displacement, where the third indication information includes the curve.
When the third indication information used to indicate the target displacement is displayed, its special-effect form may be a ray slowly shot from the position of the user's hand straight ahead of the handle, for example a blue curve with an arrow. After a certain time, the curve bends and extends to the ground; according to some embodiments of this application, in about one second the curve bends and extends to the ground, and a display effect of the new position indicated by the curve head appears, such as the cylinder at the end of the curve in FIG. 8. The time taken by the extension process of this blue curve is the user's buffer time. On the Oculus platform, the user releases the thumbstick, and on the Vive platform, the user lifts the large-disc touchpad, thereby cancelling the displacement selection mechanism without triggering the subsequent displacement mechanism.
In some other embodiments of this application, determining the second orientation according to the third indication information includes: determining the intersection position of the curve with a preset plane; and determining the area within a preset distance range of the intersection position as the position indicated by the second orientation.
FIG. 5 is a flowchart of a method for determining a second orientation separated from a first orientation by a target displacement according to an embodiment of this application. As shown in FIG. 5, the method includes the following steps:
Step S501: Determine the intersection position of the curve with the preset plane.
In the technical solution provided in the above step S501 of this application, the intersection position of the curve with the preset plane is determined, where the preset plane is used to support the virtual character.
The second orientation includes the final position and facing direction of the virtual character in the virtual reality scene. When the second orientation separated from the first orientation by the target displacement is determined, the position and facing direction indicated by the second orientation are determined. The intersection position of the curve with the preset plane is determined; the end of the curve may be represented by a cylinder, whose special effect shows the user the position and facing direction that will exist after the displacement.
According to some embodiments of this application, the position indicated by the second orientation is the position where a parabola shot straight ahead of the user's hand at a certain speed intersects the preset plane. The preset plane may be the ground, a mountain, or another position in the virtual reality scene used to support the virtual character, which is not limited here.
Step S502: Determine the area within the preset distance range of the intersection position as the position indicated by the second orientation.
In the technical solution provided in the above step S502 of this application, the area within the preset distance range of the intersection position is determined as the position indicated by the second orientation.
After the intersection position of the curve with the preset plane is determined, the area within the preset distance range of the intersection position is determined as the position indicated by the second orientation, thereby determining the position indicated by the second orientation.
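The intersection of the hand-launched parabola with a horizontal supporting plane can be computed in closed form. The following is an illustrative sketch under assumptions not stated in the application (a flat ground plane at height zero, a fixed launch speed, and a gravity-like constant shaping the arc); the function name is hypothetical:

```python
import math

# Illustrative sketch: where does a parabola launched from the hand,
# straight ahead of the controller at a fixed speed, meet the ground y = 0?
# The gravity constant and plane height are assumptions, not from the
# application.

def parabola_ground_hit(origin, direction, speed, gravity=9.8):
    """origin: (x, y, z) hand position; direction: unit vector along the
    controller's forward axis; speed: launch speed.
    Returns the (x, 0.0, z) landing point on the plane y = 0."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    vx, vy, vz = dx * speed, dy * speed, dz * speed
    # Solve oy + vy*t - 0.5*g*t^2 = 0 for the later (positive) root.
    a, b, c = -0.5 * gravity, vy, oy
    disc = b * b - 4 * a * c
    t = (-b - math.sqrt(disc)) / (2 * a)  # positive intersection time
    return (ox + vx * t, 0.0, oz + vz * t)
```

A radius around the returned point would then give the "area within the preset distance range of the intersection position" described in step S502.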
In this embodiment, by determining the intersection position of the curve with the preset plane and determining the area within the preset distance range of the intersection position as the position indicated by the second orientation, determining the second orientation separated from the first orientation by the target displacement is achieved. The virtual character is then controlled to disappear at the first orientation and appear at the second orientation, thereby achieving the technical effect of avoiding the user dizziness easily caused by continuous movement of the virtual character.
In some other embodiments of this application, step S502 of determining the area within the preset distance range of the intersection position as the position indicated by the second orientation includes: detecting whether the second orientation is legal in the virtual reality scene; and if the second orientation is detected to be legal in the virtual reality scene, determining the area within the preset distance range of the intersection position as the position indicated by the second orientation.
FIG. 6 is a flowchart of a method for determining an area within a preset distance range of an intersection position as the position indicated by a second orientation according to an embodiment of this application. As shown in FIG. 6, the method includes the following steps:
Step S601: Detect whether the second orientation is legal in the virtual reality scene.
In the technical solution provided in the above step S601 of this application, whether the second orientation is legal in the virtual reality scene is detected.
This embodiment introduces the concept of an illegal teleport area. Some virtual areas in the virtual reality scene cannot be reached by the virtual character, but the parabola shot from the user's hand can reach any virtual area in the virtual reality scene, so certain restrictions must be imposed on the virtual areas. Detecting whether the second orientation is legal in the virtual reality scene means detecting whether the second orientation is a virtual area the virtual character can reach. If the second orientation is detected to be a virtual area the virtual character can reach, the second orientation is determined to be legal in the virtual reality scene; if the second orientation is detected to be a virtual area the virtual character cannot reach, the second orientation is determined to be illegal in the virtual reality scene.
Step S602: If the second orientation is detected to be legal in the virtual reality scene, determine the area within the preset distance range of the intersection position as the position indicated by the second orientation.
In the technical solution provided in the above step S602 of this application, if the second orientation is detected to be legal in the virtual reality scene, the area within the preset distance range of the intersection position is determined as the position indicated by the second orientation.
After detecting whether the second orientation is legal in the virtual reality scene, if the second orientation is detected to be legal, the area within the preset distance range of the intersection position is determined as the position indicated by the second orientation, thereby determining that position. The virtual character is then controlled to disappear at the first orientation and appear at the second orientation, achieving the technical effect of avoiding the user dizziness easily caused by continuous movement of the virtual character.
Step S603: If the second orientation is detected to be illegal in the virtual reality scene, display the preset identification information.
In the technical solution provided in the above step S603 of this application, if the second orientation is detected to be illegal in the virtual reality scene, the preset identification information is displayed, where the preset identification information is used to indicate that the second orientation is not legal in the virtual reality scene.
This embodiment calibrates the reachable areas of the virtual scene. If the second orientation is detected to be illegal in the virtual reality scene, that is, the second orientation is an unreachable area, the parabola still has an intersection with it after being shot, but because the virtual character cannot reach it, the display effects of both the curve and the position indicated by the second orientation are set to a striking color, for example red, reminding the user that the virtual character cannot be teleported to the second orientation. At this time, if the user presses the thumbstick or lifts the large disc of the touchpad, the displacement selection is cancelled and the displacement presentation scene is not triggered.
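The legality check of steps S601 to S603 can be sketched as follows. How reachable areas are calibrated is not specified in the application; this sketch assumes they are a list of axis-aligned rectangles, and the function and color names are illustrative:

```python
# Illustrative sketch of the legality check (S601-S603). The reachable-area
# representation and all names are assumptions, not from the application.

LEGAL_COLOR, ILLEGAL_COLOR = "blue", "red"

def check_target(hit_point, reachable_areas):
    """hit_point: (x, z) intersection of the parabola with the ground.
    reachable_areas: list of rectangles (x0, z0, x1, z1) calibrated as
    legal teleport destinations. Returns (legal, curve_color)."""
    x, z = hit_point
    for (x0, z0, x1, z1) in reachable_areas:
        if x0 <= x <= x1 and z0 <= z <= z1:
            return True, LEGAL_COLOR
    # Unreachable: highlight the curve and target marker in a striking color.
    return False, ILLEGAL_COLOR
```

The returned color would drive the display effect of the curve and the target cylinder, warning the user before the displacement is confirmed.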
In some other embodiments of this application, after the third indication information used to indicate the target displacement is displayed, a second instruction is received, where the second instruction is used to instruct the virtual character to cancel producing the displacement from the first orientation, the displacement including the target displacement; after the second instruction is received, the virtual character is controlled to cancel producing the target displacement from the first orientation.
Because the displacement trigger mechanism is convenient and flexible, when the user releases the thumbstick or lifts the large-disc touchpad, the second instruction is produced. On the Oculus platform, the user releases the thumbstick, and on the Vive platform, the user lifts the large-disc touchpad, thereby producing the second instruction; according to the second instruction, the displacement selection mechanism is cancelled and the subsequent displacement mechanism is not triggered.
In some other embodiments of this application, determining the second orientation separated from the first orientation by the target displacement includes: receiving a third instruction, where the third instruction is used to indicate the position indicated by the second orientation; and after the third instruction is received, determining the position indicated by the second orientation.
When the second orientation separated from the first orientation by the target displacement is determined, the third instruction is produced by the user's operation and received to indicate the position of the second orientation, and the position of the second orientation is then determined.
In some other embodiments of this application, receiving the third instruction includes: acquiring position information of a first operation object in the real scene, where the first operation object is used to adjust the position indicated by the second orientation, the position information corresponding to that position; and obtaining the third instruction according to the position information.
The first operation object may be the user's hand. The user can simply move the position of the hand and rotate the angle straight ahead of the hand, whereby the third instruction is received, and the target displacement position can be adjusted over a large range according to the third instruction.
In some other embodiments of this application, determining the second orientation separated from the first orientation by the target displacement includes: receiving a fourth instruction, where the fourth instruction is used to indicate the facing direction indicated by the second orientation; and after the fourth instruction is received, determining the facing direction indicated by the second orientation.
When the second orientation separated from the first orientation by the target displacement is determined, the fourth instruction is produced by the user's operation and received to indicate the facing direction of the second orientation, and the facing direction of the second orientation is then determined.
In some other embodiments of this application, receiving the fourth instruction includes: acquiring angle information of a second operation object in the real scene, where the second operation object is used to adjust the facing direction indicated by the second orientation, the angle information corresponding to that direction; and obtaining the fourth instruction according to the angle information.
The second operation object may be the handle, and the angle information may be determined by the handle's direction selection mechanism. On the Oculus platform, the 360-degree direction of the thumbstick is mapped to the horizontal 360 degrees of the position indicated by the second orientation, so the user only needs to conveniently rotate the thumbstick's direction to determine the facing direction indicated by the second orientation. According to some embodiments of this application, the arrow at the end of the curve shows the virtual character's second orientation in the virtual reality scene, thereby visualizing the target displacement and determining the position and facing direction of the virtual character after moving by the target displacement.
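The 360-degree mapping from thumbstick deflection to the horizontal facing direction described above can be implemented with a two-argument arctangent. A minimal sketch (the function name and the 0-degrees-forward, clockwise convention are assumptions, not from the application):

```python
import math

# Minimal sketch of mapping a 2D thumbstick deflection to the character's
# horizontal facing angle. Names and angle convention are illustrative.

def stick_to_facing(stick_x, stick_y):
    """Map a thumbstick deflection (each axis in [-1, 1]) to a facing
    angle in degrees: 0 = forward, increasing clockwise, in [0, 360)."""
    angle = math.degrees(math.atan2(stick_x, stick_y))
    return angle % 360.0
```

Pushing the stick forward yields 0 degrees, right yields 90, back yields 180, and left yields 270, so every stick direction selects a unique facing.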
In some other embodiments of this application, receiving the first instruction used to instruct the virtual character to produce the target displacement from the first orientation includes: receiving the first instruction through a thumbstick; and determining the second orientation separated from the first orientation by the target displacement includes: determining the second orientation through the thumbstick. Alternatively, receiving the first instruction used to instruct the virtual character to produce the target displacement from the first orientation includes: receiving the first instruction through a touchpad; and determining the second orientation separated from the first orientation by the target displacement includes: determining the second orientation through the touchpad.
The implementation of this embodiment is compatible with the two mainstream headsets currently on the market, Oculus and Vive, and their corresponding handles; if new hardware devices come out, the operation of this embodiment applies as well.
This embodiment provides a method for controlling the displacement of a virtual character in a virtual reality scene. The user's operation is simple, convenient, and intuitive; the preview effect when selecting a displacement tells the user the result of the operation very intuitively; the dizziness brought by continuous movement in a virtual reality scene is avoided by implementing instantaneous movement of the virtual character; and the directional special effect lets other users in the scene clearly see the character's displacement process. The operation method adapts to multiple hardware specifications, is simple, convenient, and easy to learn, and performs well in massively multiplayer real-time online scenes.
The technical solution of this application is described below with reference to an embodiment, mainly describing the logical control flow of the displacement control of the virtual character at the program level.
FIG. 7 is a flowchart of another displacement control method for a virtual character according to an embodiment of this application. As shown in FIG. 7, rounded rectangles represent user input, right-angled rectangles represent logical units, diamonds represent control-flow choices, and dashed lines distinguish the implementation logic of the displacement selection process from that of the displacement presentation process. The method includes the following steps:
Step S701: The user pushes the thumbstick or presses the touchpad.
In the displacement control process of the virtual character, the displacement mechanism is triggered first: the user pushes the thumbstick or presses the touchpad. The triggering of displacement is based on the physical handle and differs slightly between platforms. On the Oculus platform, the thumbstick is pushed in any direction to trigger the selection of the displacement position; on the Vive platform, the touchpad is pressed at any position to trigger the displacement selection. Although the physical keys of the Oculus and Vive platforms differ, the design concept is the same; that is, because the facing direction of the displacement needs to be selected, physical keys that can precisely control the displacement direction are needed, and the thumbstick and the touchpad are the best choices.
Step S702: Start the displacement selection mechanism.
After the user pushes the thumbstick or presses the touchpad, the displacement selection mechanism is started.
Step S703: Adjust the position and direction of the user's hand in space.
After the displacement selection mechanism is started, the position and direction of the user's hand in space are adjusted; the user can simply move the position of the hand in space and rotate the angle straight ahead of the hand to adjust the hand's direction in space.
Step S704: Adjust the position of the target displacement.
By adjusting the position and direction of the user's hand in space, large-range adjustment of the position of the target displacement is achieved.
Step S705: Adjust the thumbstick direction or adjust the touch-point position.
After the displacement selection mechanism is started, the angle is determined by the handle's direction selection mechanism. For example, on the Oculus platform, the 360-degree direction of the thumbstick is mapped to the horizontal 360-degree direction of the target position, so the user only needs to rotate the thumbstick's direction to conveniently determine the target rotation position and thus the direction of the target displacement. Alternatively, the position of the touch point on the touchpad is adjusted to determine the direction of the target displacement.
Step S706: Adjust the direction of the target displacement.
After the thumbstick direction or the touch-point position is adjusted, the target displacement direction is adjusted.
Step S707: Wait for the buffer time.
The displacement trigger mechanism of this embodiment is convenient and flexible; however, false triggering is possible, so a cancellation mechanism for the target displacement is needed. After the position and direction of the target displacement are adjusted, the buffer time is waited for, to give the user regret time when selecting the displacement. According to some embodiments of this application, a ray is slowly shot from the position of the user's hand straight ahead of the handle, for example a blue curve with an arrow, which may also be called a parabola; it takes about one second for the curve to bend and extend to the ground, and a display effect of the new position indicated by the curve head appears (the cylinder at the end of the curve shown in FIG. 8). The extension process of the curve is the regret time, i.e., the buffer time, given to the user.
For example, on the Oculus platform the user releases the thumbstick, and on the Vive platform the user lifts the large-disc touchpad, thereby cancelling the displacement selection mechanism without triggering the subsequent displacement mechanism. If the curve extends until the display effect at the end of the curve appears, the displacement display effect is triggered.
If the buffer time is reached, that is, the curve has bent and extended to the ground and the display effect of the new position indicated by the curve head has appeared, step S708 is performed; if the buffer time is not reached, that is, the curve has not bent and extended to the ground and the display effect of the new position has not appeared, for example an Oculus user releases the thumbstick or a Vive user lifts the large-disc touchpad, step S716 is performed.
Step S708: Perform the legality check of the target position.
After the buffer time is reached, the legality of the target position of the target displacement is checked. In the virtual reality scene, some virtual areas are set as unreachable by the virtual character, but the parabola shot from the user's hand can reach any virtual area, so certain restrictions on the virtual areas are needed.
According to some embodiments of this application, the positions reachable by the virtual character in the virtual reality scene are calibrated. For an unreachable area, although the parabola intersects it after being shot, the virtual character cannot reach it, so the special effects of both the parabola and the target position can be set to red, reminding the user that the displacement of the virtual character cannot be teleported there. At this time, if the user presses the thumbstick or lifts the large disc of the touchpad, the displacement selection is cancelled and the displacement presentation is not triggered. If the target position is checked to be legal, step S709 is performed; if the target position is checked to be illegal, step S716 is performed.
Step S709: Confirm the target displacement.
After the legality check of the target position, if the target position is checked to be legal, the target displacement is confirmed. If the target displacement is confirmed, step S711 is performed; if the target displacement is not confirmed, step S716 is performed.
Step S710: Press the thumbstick or release the touchpad.
The displacement selection confirmation mechanism is likewise designed to be very convenient: when the user presses the thumbstick or releases the touchpad, the displacement determination mechanism is triggered, and step S709 is performed.
Step S711: Start the displacement mechanism.
After the target displacement is confirmed, the displacement mechanism is started; that is, the operation of displacing the virtual character is triggered.
Step S712: Control the camera to go black.
After the displacement mechanism is started, the virtual character is kept motionless at the current position, that is, kept still, and the camera is controlled to slowly go black; the camera is used to display the picture of the virtual reality scene.
Step S713: Determine the position and direction of the new orientation according to the target displacement.
The position and direction of the new orientation the virtual character is to reach are determined according to the target displacement.
Step S714: Play the special effect.
After the camera goes black, the virtual character suddenly disappears at the current position, possibly with a playing special effect of point particles rotating around.
Step S715: Control the camera to recover from black.
The camera is controlled to recover from black, and the virtual character reaches the position and facing direction indicated by the new orientation in the virtual reality scene. While the black screen slowly recovers, the virtual character remains motionless at the position and facing direction indicated by the new orientation, that is, stays still. In this process, the user does not perceive the acceleration and velocity brought by continuous movement of the virtual character and therefore feels no dizziness.
When the virtual character suddenly appears at a new orientation, special effects with point particles rotating around appear simultaneously at the new orientation and the original orientation, and then the particle points at the original orientation drift toward the new orientation in a disturbance, as if blown over by wind. The special effects at the new and original orientations are used to prompt other users that there is movement at these places, and the particle wind direction is used to prompt other users from where to where the current user's displacement change process goes.
Step S716: Cancel the displacement selection mechanism.
When the buffer time is not reached, for example after an Oculus user releases the thumbstick or a Vive user lifts the large-disc touchpad, or after the target position is checked to be illegal, or if the target displacement is not confirmed, the displacement selection mechanism is cancelled.
In the virtual character displacement selection process of this embodiment, once the change process of the position and angle of the displacement is started, it remains effective; only at the instant the change process ends is it judged whether the changed position and angle of the displacement meet the requirements, so as to perform the next operation. For example, the legality check of the target position is performed; when the target position is checked to be legal, the target displacement is confirmed, and when it is checked to be illegal, the displacement selection mechanism is cancelled. Continuous movement is avoided through instantaneous movement of the virtual character, and the directional special effect lets other players in the scene clearly see the process of the character's displacement. The operation method adapts to multiple hardware specifications, is simple, convenient, and easy to learn, and performs well in massively multiplayer real-time online scenes.
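The control flow of FIG. 7 (steps S701 to S716) can be compressed into a small state machine. The event names and the simplified model below are illustrative assumptions; the application describes this flow in prose and a flowchart, not code:

```python
# Compressed sketch of the control flow of FIG. 7 (S701-S716).
# Event names and the simplified state machine are assumptions.

def run_flow(events, target_legal=True, buffer_needed=3):
    """events: a sequence of 'tick' (the preview curve extends one step,
    S707), 'release' (cancel input during the buffer time), and 'confirm'
    (press stick / lift touchpad, S710). Returns 'teleported' or 'cancelled'.
    """
    progress = 0                        # S707: curve extension toward ground
    for ev in events:
        if ev == "release" and progress < buffer_needed:
            return "cancelled"          # S716: cancelled during buffer time
        if ev == "tick":
            progress += 1
        if ev == "confirm":
            if progress < buffer_needed:
                return "cancelled"      # buffer time not yet reached
            if not target_legal:
                return "cancelled"      # S708 failed: illegal target
            return "teleported"         # S711-S715: displacement performed
    return "cancelled"                  # S709 never reached: no confirmation
```

The key property of the flow, that the selection remains cancellable until the instant it ends and is validated, corresponds to the single legality check performed only on `confirm`.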
The application environment of the embodiments of this application may, but is not limited to, refer to the application environment in the above embodiments, which is not repeated in this embodiment. This embodiment provides an optional specific application for implementing the displacement control method for a virtual character, illustrated with a virtual reality scene that is a game application.
The design form of this embodiment is mainly divided into two parts: one before the displacement of the virtual character in the virtual reality scene is determined, and one after the displacement is determined, which are introduced separately below.
Before the displacement of the virtual character in the virtual reality scene is determined, the displacement mechanism is first triggered. The displacement trigger mechanism is based on the physical handle and differs slightly between platforms. For example, on the Oculus platform, pushing the thumbstick in any direction triggers the step of selecting the displacement position; on the Vive platform, pressing the touchpad at any position triggers the displacement selection step. Although the physical keys of the Oculus and Vive platforms differ, the design concept is the same; that is, because the facing direction of the displacement needs to be selected, a physical key that can precisely control direction is needed, and the thumbstick and the touchpad are the best choices.
Second, the displacement trigger mechanism is convenient and flexible, which means false triggering of the displacement is possible, so a displacement cancellation mechanism is needed that provides the game player with buffer time for confirming the displacement, giving the player regret time when determining the virtual character's displacement in the virtual reality scene. After the displacement selection mechanism is triggered, the special effect in the virtual reality scene is a ray slowly shot from the position of the player's hand straight ahead of the handle, for example the curve with an arrow shown in FIG. 8. FIG. 8 is a schematic diagram of a displacement special-effect display according to an embodiment of this application. The curve with an arrow may be colored, for example blue; in about one second, the end of the curve bends and extends to the ground, and a display effect of the new position at the head of the curve appears, such as the cylinder at the end of the curve in FIG. 8. The extension process of this curve is the regret time given to the player, that is, the buffer time provided to the player for confirming the displacement.
According to some embodiments of this application, on the Oculus platform the player releases the thumbstick, and on the Vive platform the player lifts the large-disc touchpad, thereby cancelling the displacement selection mechanism without triggering the subsequent displacement mechanism. If the display effect of the curve extending to its end position appears, such as the cylinder at the end of the curve in FIG. 8, the presentation scene of the displacement in the virtual reality scene is triggered.
The displacement selection confirmation mechanism, that is, the displacement determination mechanism, is likewise designed to be very convenient: when the player presses the thumbstick or releases the large disc, the displacement determination mechanism is triggered.
Third, regarding the determination of the position and direction of the displacement: the cylindrical special effect at the end of the curve in FIG. 8 shows the player the position and facing direction at which the determined displacement will exist in the virtual reality scene. According to some embodiments of this application, for the determination of the position of the displacement, the target position is determined by the position where a parabola shot at a certain speed straight ahead of the player's hand intersects the ground; therefore, the player only needs to simply move the position of the hand and rotate the angle straight ahead of the hand to adjust the target position of the displacement over a large range. The angle of the displacement is determined by the handle's direction selection mechanism; for example, on the Oculus platform, the 360-degree direction of the thumbstick is mapped to the horizontal 360 degrees of the target position, so the player only needs to conveniently rotate the thumbstick's direction to determine the target rotation position. For example, the display effect at the end of the curve in FIG. 8 (the cylinder at the end of the curve) may use a prominent arrow to clearly show the target rotation position. This embodiment therefore uses the displacement special effect to show the player the virtual character's displacement in a highly visual manner, letting the player know the position and facing direction the virtual character is to reach.
In addition, in the game's virtual reality scene, some virtual areas are set as unreachable by the game player, but the parabola shot from the player's hand can reach any virtual area in the virtual reality scene, so certain restrictions on the areas the player can reach are needed. According to some embodiments of this application, the reachable areas of the virtual reality scene are calibrated. For an unreachable area, if the parabola intersects a target position in that area after being shot, since the virtual character cannot reach that virtual area, the special effects of both the parabola and the target position are set to a striking color, for example displayed in red, to remind the player that the area cannot be reached. At this time, if the player presses the thumbstick or lifts the large-disc touchpad, the selected displacement is cancelled and the displacement presentation is not triggered.
In the above process, the player's operations are: on the Oculus platform, push the thumbstick to trigger the displacement selection mechanism, then adjust the position and direction of the hand in space to determine the position of the target displacement, then adjust the thumbstick direction to determine the direction of the target displacement, and finally press the thumbstick to confirm starting the displacement; on the Vive platform, press the touchpad to trigger the displacement selection mechanism, then adjust the position and direction of the hand in space to determine the position of the target displacement, then adjust the hand's touch-point position on the touchpad to determine the direction of the target displacement, and finally lift the hand off the touchpad to confirm starting the displacement. The entire operation can be done with one hand and is very convenient and fast, letting the player determine where the virtual character is to be displaced.
After the direction and position of the displacement are confirmed, the operation of displacing the virtual character corresponding to the player is triggered, and the following steps are then triggered:
First, for the player, the corresponding virtual character is kept motionless at its current position in the virtual reality scene, that is, kept still; then the camera slowly goes black, and then the camera is slowly restored, at which point the virtual character is at the previously selected new position and facing direction. While the camera slowly recovers, the virtual character also remains motionless at the new position and facing direction, that is, stays still. In this process, the player does not perceive the acceleration and velocity brought by continuous movement of the virtual character and therefore feels no dizziness.
Second, for other players in the virtual reality scene, the virtual character suddenly disappears at its starting position in the virtual reality scene and suddenly appears at another new position, and special effects with point particles rotating around appear simultaneously at the two places, as shown in FIG. 9. FIG. 9 is a schematic diagram of another displacement special-effect display according to an embodiment of this application; the particle points at the original position drift toward the new position in a disturbance, as if blown over by wind. The point special effects at the new and original positions are used to prompt other players that there is movement at these places, and the particle wind direction is used to prompt other players from where to where the current virtual character's displacement change process goes.
In this embodiment, the player's operation is simple, convenient, and intuitive. When selecting the displacement, the selected displacement can be previewed, telling the player the result of the operation very intuitively. The entire displacement presentation process makes the player corresponding to the displacing virtual character feel comfortable without dizziness. In addition, during the displacement presentation, other players also learn the character's displacement change process, so this solution is also suitable for massively multiplayer scenes.
It should be noted that the special-effect display of the above displacement can be adjusted, but the purpose and effect represented by the displayed special effects are consistent with those in the above embodiments; that is, the displayed special effects provide the player with buffer time for confirming the displacement and let the player know the position and facing direction the virtual character is to reach, the special effects at the new position and the original position are used to prompt other players that there is movement at these places, and they prompt from where to where the virtual character's displacement change process goes.
It should be noted that this embodiment implements and is compatible with only the two mainstream headsets currently on the market, Oculus and Vive, and their corresponding handles; if there are new hardware devices, their operation also applies to the displacement control method for a virtual character of the embodiments of this application.
It should be noted that, for brevity of description, the foregoing method embodiments are all expressed as a series of action combinations, but those skilled in the art should know that the embodiments of this application are not limited by the described order of actions, because according to the embodiments of this application, some steps may be performed in other orders or simultaneously. Second, those skilled in the art should also know that the embodiments described in the specification are all exemplary embodiments, and the actions and modules involved are not necessarily required by this application.
From the description of the above implementations, those skilled in the art can clearly understand that the method according to the above embodiments may be implemented by means of software plus the necessary general-purpose hardware platform, and certainly also by hardware, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the embodiments of this application, in essence or the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a non-volatile computer-readable storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several machine-readable instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the various embodiments of this application.
An embodiment of this application further provides a displacement control apparatus for a virtual character for implementing the above displacement control method for a virtual character. FIG. 10a is a schematic diagram of a displacement control apparatus for a virtual character according to an embodiment of this application. As shown in FIG. 10a, the apparatus may include: a receiving unit 10, a determining unit 20, and a control unit 30.
The receiving unit 10 is configured to receive, in a virtual reality scene, a first instruction used to instruct a virtual character to produce a target displacement from a first orientation.
The determining unit 20 is configured to determine, after the first instruction is received, a second orientation separated from the first orientation by the target displacement.
The control unit 30 is configured to control the virtual character to disappear at the first orientation and to appear at the second orientation, where the virtual character has not appeared between the first orientation and the second orientation.
According to some embodiments of this application, the control unit 30 includes a first control module and a second control module. The first control module is configured to control, after the second orientation is determined, the virtual character to stay at the first orientation and control a preset camera to go black, where the preset camera is used to display the picture of the virtual reality scene; the second control module is configured to control, after the blackout of the preset camera, the preset camera to resume display and control the virtual character to stay at the second orientation.
According to some embodiments of this application, the first control module is configured to control the preset camera to gradually go black, where the virtual character is kept at the first orientation during the gradual blackout of the preset camera; the second control module is configured to control the preset camera to gradually resume display, where the virtual character is kept at the second orientation during the gradual recovery of the preset camera.
According to some embodiments of this application, the control unit 30 includes a third control module and a fourth control module. The third control module is configured to control, after the second orientation is determined, the virtual character to disappear at the first orientation and display first indication information at the first orientation, where the first indication information is used to indicate that there is movement at the first orientation in the virtual reality scene; the fourth control module is configured to control the virtual character to appear at the second orientation and display second indication information at the second orientation, where the second indication information is used to indicate that there is movement at the second orientation in the virtual reality scene.
According to some embodiments of this application, the third control module includes a first display submodule configured to display a moving first point particle at the first orientation, where the first indication information includes the moving first point particle; the fourth control module includes a second display submodule configured to display a moving second point particle at the second orientation, where the second indication information includes the moving second point particle, and the direction of motion from the first point particle to the second point particle represents the change process of the virtual character from the first orientation to the second orientation.
According to some embodiments of this application, as shown in FIG. 10b, the apparatus further includes a display unit 40 configured to display, before the second orientation separated from the first orientation by the target displacement is determined, third indication information used to indicate the target displacement; the determining unit 20 includes a first determining module configured to determine the second orientation according to the third indication information.
According to some embodiments of this application, the display unit 40 includes a display module configured to display a curve used to indicate the target displacement, where the third indication information includes the curve.
The first determining module includes a first determining submodule and a second determining submodule. The first determining submodule is configured to determine the intersection position of the curve with a preset plane, where the preset plane is used to support the virtual character; the second determining submodule is configured to determine the area within a preset distance range of the intersection position as the position indicated by the second orientation.
According to some embodiments of this application, the second determining submodule is further configured to detect whether the second orientation is legal in the virtual reality scene; when the second orientation is detected to be legal in the virtual reality scene, determine the area within the preset distance range of the intersection position as the position indicated by the second orientation; and when the second orientation is detected to be illegal in the virtual reality scene, display preset identification information, where the preset identification information is used to indicate that the second orientation is not legal in the virtual reality scene.
According to some embodiments of this application, as shown in FIG. 10c, the apparatus further includes a first receiving unit 50 and a cancellation unit 60. The first receiving unit 50 is configured to receive, after the third indication information used to indicate the target displacement is displayed, a second instruction used to instruct the virtual character to cancel producing a displacement from the first orientation, where the displacement includes the target displacement; the cancellation unit 60 is configured to control, after the second instruction is received, the virtual character to cancel producing the target displacement from the first orientation.
According to some embodiments of this application, the determining unit 20 includes a first receiving module configured to receive a third instruction used to indicate the position indicated by the second orientation, and a second determining module configured to determine, after the third instruction is received, the position indicated by the second orientation.
According to some embodiments of this application, the first receiving module includes a first acquiring submodule configured to acquire position information of a first operation object in a real scene, where the first operation object is used to adjust the position indicated by the second orientation and the position information corresponds to that position, and a second acquiring submodule configured to obtain the third instruction according to the position information.
According to some embodiments of this application, the determining unit 20 includes a second receiving module configured to receive a fourth instruction used to indicate the facing direction indicated by the second orientation, and a third determining module configured to determine, after the fourth instruction is received, the facing direction indicated by the second orientation.
According to some embodiments of this application, the second receiving module includes a third acquiring submodule configured to acquire angle information of a second operation object in a real scene, where the second operation object is used to adjust the facing direction indicated by the second orientation and the angle information corresponds to that direction, and a fourth acquiring submodule configured to obtain the fourth instruction according to the angle information.
According to some embodiments of this application, the receiving unit 10 includes a third receiving module configured to receive the first instruction through a thumbstick, and the determining unit 20 includes a fourth determining module configured to determine the second orientation through the thumbstick; or
the receiving unit 10 includes a fourth receiving module configured to receive the first instruction through a touchpad, and the determining unit 20 includes a fifth determining module configured to determine the second orientation through the touchpad.
It should be noted that the receiving unit 10 in this embodiment may be configured to perform step S202 in the foregoing method embodiments of this application, the determining unit 20 in this embodiment may be configured to perform step S204, and the control unit 30 in this embodiment may be configured to perform step S206.
In this embodiment, the receiving unit 10 receives, in the virtual reality scene, the first instruction used to instruct the virtual character to produce the target displacement from the first orientation; the determining unit 20 determines, after the first instruction is received, the second orientation separated from the first orientation by the target displacement; and the control unit 30 controls the virtual character to disappear at the first orientation and appear at the second orientation, where the virtual character has not appeared between the first orientation and the second orientation. That is, by controlling the virtual character to disappear at the first orientation and appear at the second orientation in the virtual reality scene, the purpose of avoiding continuous movement through instantaneous movement of the virtual character is achieved, thereby achieving the technical effect of avoiding the user dizziness easily caused by continuous movement of the virtual character and solving the corresponding technical problem in the related art.
It should be noted here that the examples and application scenarios implemented by the above units and modules are the same as those of the steps in the corresponding method embodiments, but are not limited to the content disclosed in the above method embodiments. It should be noted that, as part of the apparatus, the above modules may run in the hardware environment shown in FIG. 1 and may be implemented in software or in hardware, where the hardware environment includes a network environment.
An embodiment of this application further provides a computing device, such as a server or a terminal, for implementing the above displacement control method for a virtual character.
FIG. 11 is a structural block diagram of a terminal according to an embodiment of this application. As shown in FIG. 11, the terminal may include: one or more processors 111 (only one is shown in the figure), a memory 113, and a transmission apparatus 115; as shown in FIG. 11, the terminal may also include an input/output device 117.
The memory 113 may be configured to store software programs and machine-readable instruction modules, such as the program instructions/modules corresponding to the displacement control method and apparatus for a virtual character in the embodiments of this application. By running the software programs and modules stored in the memory 113, the processor 111 performs various functional applications and data processing, that is, implements the above displacement control method for a virtual character. The memory 113 may include a high-speed random access memory and may also include a non-volatile memory, such as one or more magnetic storage apparatuses, flash memory, or other non-volatile solid-state memory. In some examples, the memory 113 may further include memories remotely disposed relative to the processor 111, and these remote memories may be connected to the terminal through a network. Examples of the network include but are not limited to the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The transmission apparatus 115 is configured to receive or send data via a network and may also be used for data transmission between the processor and the memory. Specific examples of the network may include wired and wireless networks. In one example, the transmission apparatus 115 includes a network interface controller (NIC), which can be connected to other network devices and a router through a network cable to communicate with the Internet or a local area network. In one example, the transmission apparatus 115 is a radio frequency (RF) module, which is configured to communicate with the Internet wirelessly.
Specifically, the memory 113 is configured to store an application program, that is, machine-readable instructions.
The processor 111 may invoke, through the transmission apparatus 115, the application program stored in the memory 113 to perform the following steps:
in a virtual reality scene, receiving a first instruction used to instruct a virtual character to produce a target displacement from a first orientation;
after the first instruction is received, determining a second orientation separated from the first orientation by the target displacement; and
controlling the virtual character to disappear at the first orientation and appear at the second orientation, where the virtual character has not appeared between the first orientation and the second orientation.
The processor 111 is further configured to perform the following steps: after the second orientation is determined, controlling the virtual character to stay at the first orientation and controlling a preset camera to go black, where the preset camera is used to display the picture of the virtual reality scene; and after controlling the preset camera to go black, controlling the preset camera to resume display and controlling the virtual character to stay at the second orientation.
The processor 111 is further configured to perform the following steps: controlling the preset camera to gradually go black, where the virtual character is kept at the first orientation during the gradual blackout of the preset camera; and controlling the preset camera to gradually resume display, where the virtual character is kept at the second orientation during the gradual recovery of the preset camera.
The processor 111 is further configured to perform the following steps: after the second orientation is determined, controlling the virtual character to disappear at the first orientation and displaying first indication information at the first orientation, where the first indication information is used to indicate that there is movement at the first orientation in the virtual reality scene; and controlling the virtual character to appear at the second orientation and displaying second indication information at the second orientation, where the second indication information is used to indicate that there is movement at the second orientation in the virtual reality scene.
The processor 111 is further configured to perform the following steps: displaying a moving first point particle at the first orientation, where the first indication information includes the moving first point particle; and displaying a moving second point particle at the second orientation, where the second indication information includes the moving second point particle, and the direction of motion from the first point particle to the second point particle represents the change process of the virtual character from the first orientation to the second orientation.
The processor 111 is further configured to perform the following steps: before the second orientation separated from the first orientation by the target displacement is determined, displaying third indication information used to indicate the target displacement; and determining the second orientation according to the third indication information.
The processor 111 is further configured to perform the following steps: displaying a curve used to indicate the target displacement, where the third indication information includes the curve; determining the intersection position of the curve with a preset plane, where the preset plane is used to support the virtual character; and determining the area within a preset distance range of the intersection position as the position indicated by the second orientation.
The processor 111 is further configured to perform the following steps: detecting whether the second orientation is legal in the virtual reality scene; if the second orientation is detected to be legal in the virtual reality scene, determining the area within the preset distance range of the intersection position as the position indicated by the second orientation; and if the second orientation is detected to be illegal in the virtual reality scene, displaying preset identification information, where the preset identification information is used to indicate that the second orientation is not legal in the virtual reality scene.
The processor 111 is further configured to perform the following steps: after the third indication information used to indicate the target displacement is displayed, receiving a second instruction used to instruct the virtual character to cancel producing a displacement from the first orientation, where the displacement includes the target displacement; and after the second instruction is received, controlling the virtual character to cancel producing the target displacement from the first orientation.
The processor 111 is further configured to perform the following steps: receiving a third instruction used to indicate the position indicated by the second orientation; and after the third instruction is received, determining the position indicated by the second orientation.
The processor 111 is further configured to perform the following steps: acquiring position information of a first operation object in a real scene, where the first operation object is used to adjust the position indicated by the second orientation and the position information corresponds to that position; and obtaining the third instruction according to the position information.
The processor 111 is further configured to perform the following steps: receiving a fourth instruction used to indicate the facing direction indicated by the second orientation; and after the fourth instruction is received, determining the facing direction indicated by the second orientation.
The processor 111 is further configured to perform the following steps: acquiring angle information of a second operation object in a real scene, where the second operation object is used to adjust the facing direction indicated by the second orientation and the angle information corresponds to that direction; and obtaining the fourth instruction according to the angle information.
The embodiments of this application provide a displacement control solution for a virtual character. In the virtual reality scene, the first instruction used to instruct the virtual character to produce the target displacement from the first orientation is received; after the first instruction is received, the second orientation separated from the first orientation by the target displacement is determined; and the virtual character is controlled to disappear at the first orientation and appear at the second orientation, where the virtual character has not appeared between the first orientation and the second orientation. That is, by controlling the virtual character to disappear at the first orientation and appear at the second orientation in the virtual reality scene, the purpose of avoiding continuous movement through instantaneous movement of the virtual character is achieved, thereby achieving the technical effect of avoiding the user dizziness easily caused by continuous movement of the virtual character and solving the corresponding technical problem in the related art.
According to some embodiments of this application, for specific examples of this embodiment, reference may be made to the examples described in the above embodiments, which are not repeated here.
Those of ordinary skill in the art can understand that the structure shown in FIG. 11 is only schematic. The terminal may be a terminal device such as a smartphone (for example, an Android phone or an iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), or a PAD. FIG. 11 does not limit the structure of the above electronic apparatus. For example, the terminal may further include more or fewer components (such as a network interface or a display apparatus) than shown in FIG. 11, or have a configuration different from that shown in FIG. 11.
Those of ordinary skill in the art can understand that all or some of the steps in the various methods of the above embodiments may be completed by a program instructing the hardware related to the terminal device. The program may be stored in a computer-readable storage medium, and the storage medium may include: a flash drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.
本申请的实施例还提供了一种非易失性计算机可读存储介质。在本申请实施例中,上述存储介质可以用于存储前述方法实施例提供的虚拟 角色的位移控制方法的程序代码。
在本申请实施例中,上述存储介质可以位于上述实施例所示的网络中的多个网络设备中的至少一个网络设备上。
在本申请实施例中,存储介质被设置为存储用于执行以下步骤的程序代码:
在虚拟现实场景中,接收用于指示虚拟角色从第一方位产生目标位移的第一指令;
在接收到第一指令之后,确定与第一方位相距目标位移的第二方位;
控制虚拟角色消失在第一方位,并控制虚拟角色出现在第二方位,其中,虚拟角色未出现过在第一方位和第二方位之间。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:在确定第二方位之后,控制虚拟角色停留在第一方位,并控制预设镜头黑屏,其中,预设镜头用于显示虚拟现实场景的画面;在控制预设镜头黑屏之后,控制预设镜头恢复显示,并控制虚拟角色停留在第二方位。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:控制预设镜头逐渐黑屏,其中,在预设镜头逐渐黑屏的过程中,保持虚拟角色停留在第一方位;控制预设镜头逐渐恢复显示,其中,在预设镜头逐渐恢复显示的过程中,保持虚拟角色停留在第二方位。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:在确定第二方位之后,控制虚拟角色消失在第一方位,并在第一方位显示第一指示信息,其中,第一指示信息用于指示第一方位在虚拟现实场景中有动静;控制虚拟角色出现在第二方位,并在第二方位显示第二指示信息,其中,第二指示信息用于指示第二方位在虚拟现实场景中有动静。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:在第一方位显示运动着的第一点粒子,其中,第一指示信息包括运动着的第一点粒子;在第二方位显示运动着的第二点粒子,其中,第二指示信息包括运动着的第二点粒子,第一点粒子到第二点粒子的运动方向用于表示虚拟角色从第一方位到第二方位的变化过程。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:在确定与第一方位相距目标位移的第二方位之前,显示用于指示目标位移的第三指示信息;根据第三指示信息确定第二方位。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:显示用于指示目标位移的曲线,其中,第三指示信息包括曲线;确定曲线与预设平面的相交位置,其中,预设平面用于支撑虚拟角色;将距离相交位置预设距离范围内的区域确定为第二方位指示的位置。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:检测第二方位在虚拟现实场景中是否合法;如果检测到第二方位在虚拟现实场景中合法,将距离相交位置预设距离范围内的区域确定为第二方位指示的位置;如果检测到第二方位在虚拟现实场景中不合法,显示预设标识信息,其中,预设标识信息用于指示第二方位在虚拟现实场景中不合法。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:在显示用于指示目标位移的第三指示信息之后,接收第二指令,其中,第二指令用于指示虚拟角色取消从第一方位产生位移,其中,位移包括目标位移;在接收第二指令之后,控制虚拟角色取消从第一方位产生目标位移。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:接收第三指令,其中,第三指令用于指示第二方位指示的位置;在接收到第三指令之后,确定第二方位指示的位置。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:获取第一操作对象在真实场景中的位置信息,其中,第一操作对象用于调整第二方位指示的位置,该位置信息与第二方位指示的位置相对应;根据该位置信息获取第三指令。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:接收第四指令,其中,第四指令用于指示第二方位指示的朝向;在接收到第四指令之后,确定第二方位指示的朝向。
根据本申请一些实施例,存储介质还被设置为存储用于执行以下步骤的程序代码:获取第二操作对象在真实场景中的角度信息,其中,第二操作对象用于调整第二方位指示的朝向,该角度信息与第二方位指示的朝向相对应;根据该角度信息获取第四指令。
本实施例中的具体示例可以参考前述实施例中所描述的示例,本实施例在此不再赘述。
在本实施例中,上述存储介质可以包括但不限于:U盘、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、移动硬盘、磁碟或者光盘等各种可以存储程序代码的介质。
上述实施例中的集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在上述计算机可读取的存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在存储介质中,包括若干机器可读指令用以使得一台或多台计算机设备(可为个人计算机、服务器或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。
在本申请的上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
在本申请所提供的几个实施例中,应该理解到,所揭露的客户端, 可通过其它的方式实现。其中,以上所描述的装置实施例仅仅是示意性的,例如所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个***,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,单元或模块的间接耦合或通信连接,可以是电性或其它的形式。
The units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software functional unit.
The foregoing descriptions are merely exemplary embodiments of this application. It should be noted that a person of ordinary skill in the art may make several improvements and refinements without departing from the principles of this application, and such improvements and refinements shall also fall within the protection scope of this application.

Claims (29)

  1. A method for controlling displacement of a virtual character, applied to a computing device, comprising:
    receiving a first instruction in a virtual reality scene, wherein the first instruction is used to instruct a virtual character to generate a target displacement from a first orientation;
    after the first instruction is received, determining a second orientation at the target displacement from the first orientation; and
    controlling the virtual character to disappear from the first orientation, and controlling the virtual character to appear at the second orientation, wherein the virtual character has not appeared between the first orientation and the second orientation.
  2. The method according to claim 1, wherein the controlling the virtual character to disappear from the first orientation and controlling the virtual character to appear at the second orientation comprises:
    after the second orientation is determined, controlling the virtual character to stay at the first orientation, and controlling a preset camera to display a black screen, wherein the preset camera is used to display a picture of the virtual reality scene; and
    after controlling the preset camera to display the black screen, controlling the preset camera to resume display, and controlling the virtual character to stay at the second orientation.
  3. The method according to claim 2, wherein the controlling the preset camera to display the black screen comprises: controlling the preset camera to gradually turn to a black screen, wherein the virtual character is kept at the first orientation while the preset camera gradually turns to the black screen; and
    the controlling the preset camera to resume display comprises: controlling the preset camera to gradually resume display, wherein the virtual character is kept at the second orientation while the preset camera gradually resumes display.
  4. The method according to claim 1, wherein the controlling the virtual character to disappear from the first orientation and controlling the virtual character to appear at the second orientation comprises:
    after the second orientation is determined, controlling the virtual character to disappear from the first orientation, and displaying first indication information at the first orientation, wherein the first indication information is used to indicate that there is activity at the first orientation in the virtual reality scene; and
    controlling the virtual character to appear at the second orientation, and displaying second indication information at the second orientation, wherein the second indication information is used to indicate that there is activity at the second orientation in the virtual reality scene.
  5. The method according to claim 4, wherein the displaying the first indication information at the first orientation comprises: displaying moving first point particles at the first orientation, wherein the first indication information comprises the moving first point particles; and
    the displaying the second indication information at the second orientation comprises: displaying moving second point particles at the second orientation, wherein the second indication information comprises the moving second point particles, and a movement direction from the first point particles to the second point particles is used to represent the change process of the virtual character from the first orientation to the second orientation.
  6. The method according to claim 1, wherein before the determining the second orientation at the target displacement from the first orientation, the method further comprises: displaying third indication information used to indicate the target displacement; and
    the determining the second orientation at the target displacement from the first orientation comprises: determining the second orientation according to the third indication information.
  7. The method according to claim 6, wherein the displaying the third indication information used to indicate the target displacement comprises: displaying a curve used to indicate the target displacement, wherein the third indication information comprises the curve; and
    the determining the second orientation according to the third indication information comprises: determining an intersection position of the curve and a preset plane, wherein the preset plane is used to support the virtual character; and determining a region within a preset distance range of the intersection position as the position indicated by the second orientation.
  8. The method according to claim 7, wherein the determining a region within the preset distance range of the intersection position as the position indicated by the second orientation comprises:
    detecting whether the second orientation is valid in the virtual reality scene;
    if it is detected that the second orientation is valid in the virtual reality scene, determining the region within the preset distance range of the intersection position as the position indicated by the second orientation; and
    if it is detected that the second orientation is invalid in the virtual reality scene, displaying preset identification information, wherein the preset identification information is used to indicate that the second orientation is invalid in the virtual reality scene.
  9. The method according to claim 6, wherein after the displaying the third indication information used to indicate the target displacement, the method further comprises:
    receiving a second instruction, wherein the second instruction is used to instruct the virtual character to cancel generating a displacement from the first orientation, the displacement comprising the target displacement; and
    after the second instruction is received, controlling the virtual character to cancel generating the target displacement from the first orientation.
  10. The method according to claim 1, wherein the determining the second orientation at the target displacement from the first orientation comprises:
    receiving a third instruction, wherein the third instruction is used to indicate the position indicated by the second orientation; and
    after the third instruction is received, determining the position indicated by the second orientation.
  11. The method according to claim 10, wherein the receiving the third instruction comprises:
    obtaining position information of a first operation object in a real scene, wherein the first operation object is used to adjust the position indicated by the second orientation, and the position information corresponds to the position indicated by the second orientation; and
    obtaining the third instruction according to the position information.
  12. The method according to claim 1, wherein the determining the second orientation at the target displacement from the first orientation comprises:
    receiving a fourth instruction, wherein the fourth instruction is used to indicate the facing direction indicated by the second orientation; and
    after the fourth instruction is received, determining the facing direction indicated by the second orientation.
  13. The method according to claim 12, wherein the receiving the fourth instruction comprises:
    obtaining angle information of a second operation object in a real scene, wherein the second operation object is used to adjust the facing direction indicated by the second orientation, and the angle information corresponds to the facing direction indicated by the second orientation; and
    obtaining the fourth instruction according to the angle information.
  14. The method according to any one of claims 1 to 13, wherein
    the receiving the first instruction comprises: receiving the first instruction through a joystick, and the determining the second orientation at the target displacement from the first orientation comprises: determining the second orientation through the joystick; or
    the receiving the first instruction comprises: receiving the first instruction through a touchpad, and the determining the second orientation at the target displacement from the first orientation comprises: determining the second orientation through the touchpad.
  15. An apparatus for controlling displacement of a virtual character, comprising:
    a processor and a memory connected to the processor, the memory storing machine-readable instructions executable by the processor, the processor executing the machine-readable instructions to perform the following operations:
    receiving a first instruction in a virtual reality scene, wherein the first instruction is used to instruct a virtual character to generate a target displacement from a first orientation;
    after the first instruction is received, determining a second orientation at the target displacement from the first orientation; and
    controlling the virtual character to disappear from the first orientation, and controlling the virtual character to appear at the second orientation, wherein the virtual character has not appeared between the first orientation and the second orientation.
  16. The apparatus according to claim 15, wherein the processor executes the machine-readable instructions to perform the following operations:
    after the second orientation is determined, controlling the virtual character to stay at the first orientation, and controlling a preset camera to display a black screen, wherein the preset camera is used to display a picture of the virtual reality scene; and
    after controlling the preset camera to display the black screen, controlling the preset camera to resume display, and controlling the virtual character to stay at the second orientation.
  17. The apparatus according to claim 16, wherein the processor executes the machine-readable instructions to perform the following operations:
    controlling the preset camera to gradually turn to a black screen, wherein the virtual character is kept at the first orientation while the preset camera gradually turns to the black screen; and
    controlling the preset camera to gradually resume display, wherein the virtual character is kept at the second orientation while the preset camera gradually resumes display.
  18. The apparatus according to claim 15, wherein the processor executes the machine-readable instructions to perform the following operations:
    after the second orientation is determined, controlling the virtual character to disappear from the first orientation, and displaying first indication information at the first orientation, wherein the first indication information is used to indicate that there is activity at the first orientation in the virtual reality scene; and
    controlling the virtual character to appear at the second orientation, and displaying second indication information at the second orientation, wherein the second indication information is used to indicate that there is activity at the second orientation in the virtual reality scene.
  19. The apparatus according to claim 18, wherein the processor executes the machine-readable instructions to perform the following operations:
    displaying moving first point particles at the first orientation, wherein the first indication information comprises the moving first point particles; and
    displaying moving second point particles at the second orientation, wherein the second indication information comprises the moving second point particles, and a movement direction from the first point particles to the second point particles is used to represent the change process of the virtual character from the first orientation to the second orientation.
  20. The apparatus according to claim 15, wherein before the second orientation at the target displacement from the first orientation is determined, the processor executes the machine-readable instructions to perform the following operations:
    displaying third indication information used to indicate the target displacement; and
    determining the second orientation according to the third indication information.
  21. The apparatus according to claim 20, wherein the processor executes the machine-readable instructions to perform the following operations:
    displaying a curve used to indicate the target displacement, wherein the third indication information comprises the curve;
    determining an intersection position of the curve and a preset plane, wherein the preset plane is used to support the virtual character; and
    determining a region within a preset distance range of the intersection position as the position indicated by the second orientation.
  22. The apparatus according to claim 21, wherein the processor executes the machine-readable instructions to perform the following operations:
    detecting whether the second orientation is valid in the virtual reality scene;
    if it is detected that the second orientation is valid in the virtual reality scene, determining the region within the preset distance range of the intersection position as the position indicated by the second orientation; and
    if it is detected that the second orientation is invalid in the virtual reality scene, displaying preset identification information, wherein the preset identification information is used to indicate that the second orientation is invalid in the virtual reality scene.
  23. The apparatus according to claim 20, wherein after the third indication information used to indicate the target displacement is displayed, the processor executes the machine-readable instructions to perform the following operations:
    receiving a second instruction, wherein the second instruction is used to instruct the virtual character to cancel generating a displacement from the first orientation, the displacement comprising the target displacement; and
    after the second instruction is received, controlling the virtual character to cancel generating the target displacement from the first orientation.
  24. The apparatus according to claim 15, wherein the processor executes the machine-readable instructions to perform the following operations:
    receiving a third instruction, wherein the third instruction is used to indicate the position indicated by the second orientation; and
    after the third instruction is received, determining the position indicated by the second orientation.
  25. The apparatus according to claim 24, wherein the processor executes the machine-readable instructions to perform the following operations:
    obtaining position information of a first operation object in a real scene, wherein the first operation object is used to adjust the position indicated by the second orientation, and the position information corresponds to the position indicated by the second orientation; and
    obtaining the third instruction according to the position information.
  26. The apparatus according to claim 15, wherein the processor executes the machine-readable instructions to perform the following operations:
    receiving a fourth instruction, wherein the fourth instruction is used to indicate the facing direction indicated by the second orientation; and
    after the fourth instruction is received, determining the facing direction indicated by the second orientation.
  27. The apparatus according to claim 26, wherein the processor executes the machine-readable instructions to perform the following operations:
    obtaining angle information of a second operation object in a real scene, wherein the second operation object is used to adjust the facing direction indicated by the second orientation, and the angle information corresponds to the facing direction indicated by the second orientation; and
    obtaining the fourth instruction according to the angle information.
  28. The apparatus according to any one of claims 15 to 27, wherein the processor executes the machine-readable instructions to perform the following operations:
    receiving the first instruction through a joystick, and determining the second orientation through the joystick; or
    receiving the first instruction through a touchpad, and determining the second orientation through the touchpad.
  29. A non-volatile computer-readable storage medium storing machine-readable instructions, the machine-readable instructions being executable by a processor to perform the method according to any one of claims 1 to 14.
PCT/CN2018/096646 2017-07-25 2018-07-23 Displacement control method and apparatus for virtual character, and storage medium WO2019019968A1 (zh)

Priority Applications (6)

Application Number Priority Date Filing Date Title
EP18837684.2A EP3575928A4 (en) 2017-07-25 2018-07-23 METHOD AND DEVICE FOR SHIFT CONTROL FOR VIRTUAL CHARACTERS AND STORAGE MEDIUM
KR1020197029586A KR102574170B1 (ko) 2017-07-25 2018-07-23 Placement control method and device for virtual character, and storage medium
JP2019572111A JP7023991B2 (ja) 2017-07-25 2018-07-23 Displacement control method and apparatus for virtual character, and storage medium
US16/538,147 US11049329B2 (en) 2017-07-25 2019-08-12 Method and apparatus for controlling placement of virtual character and storage medium
US17/325,187 US11527052B2 (en) 2017-07-25 2021-05-19 Method and apparatus for controlling placement of virtual character and storage medium
US17/985,450 US12026847B2 (en) 2017-07-25 2022-11-11 Method and apparatus for controlling placement of virtual character and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710613071.2A CN107450747B (zh) 2017-07-25 2017-07-25 Displacement control method and apparatus for virtual character
CN201710613071.2 2017-07-25

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/538,147 Continuation US11049329B2 (en) 2017-07-25 2019-08-12 Method and apparatus for controlling placement of virtual character and storage medium

Publications (1)

Publication Number Publication Date
WO2019019968A1 true WO2019019968A1 (zh) 2019-01-31

Family

ID=60487554

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/096646 WO2019019968A1 (zh) 2017-07-25 2018-07-23 Displacement control method and apparatus for virtual character, and storage medium

Country Status (6)

Country Link
US (3) US11049329B2 (zh)
EP (1) EP3575928A4 (zh)
JP (1) JP7023991B2 (zh)
KR (1) KR102574170B1 (zh)
CN (1) CN107450747B (zh)
WO (1) WO2019019968A1 (zh)


Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107450747B (zh) 2017-07-25 2018-09-18 腾讯科技(深圳)有限公司 Displacement control method and apparatus for virtual character
KR102085440B1 (ko) 2017-12-26 2020-03-05 (주)스코넥엔터테인먼트 Virtual reality control system
CN108211342A (zh) * 2018-01-19 2018-06-29 腾讯科技(深圳)有限公司 Viewing angle adjustment method and apparatus, storage medium, and electronic apparatus
CN108379780B (zh) * 2018-03-13 2020-06-02 北京小米移动软件有限公司 Virtual running scene control method and apparatus, and treadmill
CN108434741A (zh) * 2018-03-15 2018-08-24 网易(杭州)网络有限公司 Movement control method and apparatus in virtual reality
CN108427501B (zh) * 2018-03-19 2022-03-22 网易(杭州)网络有限公司 Movement control method and apparatus in virtual reality
CN109316741A (zh) * 2018-07-17 2019-02-12 派视觉虚拟现实(深圳)软件技术有限公司 Method, apparatus, and device for controlling character movement in a VR scene
EP3640767A1 (de) * 2018-10-17 2020-04-22 Siemens Schweiz AG Method for determining at least one region in at least one input model for at least one element to be placed
CN109814713A (zh) * 2019-01-10 2019-05-28 重庆爱奇艺智能科技有限公司 Method and device for switching the viewing angle of a VR user
CN111054076B (zh) * 2019-11-21 2021-05-04 珠海剑心互动娱乐有限公司 Method, terminal, and storage medium for processing walking animation of a game character
CN111757081B (zh) * 2020-05-27 2022-07-08 海南车智易通信息技术有限公司 Movement restriction method for a virtual scene, client, server, and computing device
CN111729306B (zh) * 2020-06-24 2024-06-04 网易(杭州)网络有限公司 Teleportation method and apparatus for game character, electronic device, and storage medium
CN111803947A (zh) * 2020-07-15 2020-10-23 广州玖的数码科技有限公司 Game character movement control method in virtual space, storage medium, and server
CN111803948A (zh) * 2020-07-15 2020-10-23 广州玖的数码科技有限公司 Online game character movement processing method, storage medium, and electronic device
US20230093585A1 (en) * 2021-09-21 2023-03-23 Facebook Technologies, Llc Audio system for spatializing virtual sound sources
US11847750B2 (en) * 2022-05-18 2023-12-19 Niantic, Inc. Smooth object correction for augmented reality devices
CN115098005B (zh) * 2022-06-24 2023-01-24 北京华建云鼎科技股份公司 Data processing system for controlling movement of a target object

Citations (5)

Publication number Priority date Publication date Assignee Title
CN105094920A (zh) * 2015-08-14 2015-11-25 网易(杭州)网络有限公司 Game rendering method and apparatus
JP2016214807A (ja) * 2015-05-19 2016-12-22 フミソー株式会社 Puzzle game
CN106527722A (zh) * 2016-11-08 2017-03-22 网易(杭州)网络有限公司 Interaction method, system, and terminal device in virtual reality
CN106598465A (zh) * 2016-12-20 2017-04-26 上海逗屋网络科技有限公司 Virtual-joystick-based control method, apparatus, and device
CN107450747A (zh) * 2017-07-25 2017-12-08 腾讯科技(深圳)有限公司 Displacement control method and apparatus for virtual character

Family Cites Families (25)

Publication number Priority date Publication date Assignee Title
JPH07236769A (ja) * 1994-02-28 1995-09-12 Sega Enterp Ltd Image display control method and electronic game device using the same
JP2995703B1 (ja) * 1998-10-08 1999-12-27 コナミ株式会社 Image creation device, display scene switching method in the image creation device, readable recording medium storing a display scene switching program for the image creation device, and video game device
ATE349247T1 (de) * 2000-07-17 2007-01-15 Sony Computer Entertainment Inc Program execution system, program execution device, recording medium, and corresponding computer-executable program
CN100428218C (zh) * 2002-11-13 2008-10-22 北京航空航天大学 Method for implementing a general-purpose virtual environment roaming engine
JP3924579B2 (ja) * 2005-03-30 2007-06-06 株式会社コナミデジタルエンタテインメント Game program, game device, and game control method
JP4971846B2 (ja) * 2007-03-16 2012-07-11 株式会社コナミデジタルエンタテインメント Game device, game device control method, and program
US8214750B2 (en) * 2007-10-31 2012-07-03 International Business Machines Corporation Collapsing areas of a region in a virtual universe to conserve computing resources
JP2009112631A (ja) * 2007-11-08 2009-05-28 Koto:Kk Game character display control system
US8259100B2 (en) * 2008-04-24 2012-09-04 International Business Machines Corporation Fixed path transitions
US20130271457A1 (en) * 2012-04-11 2013-10-17 Myriata, Inc. System and method for displaying an object within a virtual environment
US10510189B2 (en) 2014-04-16 2019-12-17 Sony Interactive Entertainment Inc. Information processing apparatus, information processing system, and information processing method
JP5781213B1 (ja) 2014-12-26 2015-09-16 株式会社Cygames Game control program, game control method, and game control device
CN104539929B (zh) * 2015-01-20 2016-12-07 深圳威阿科技有限公司 Stereoscopic image encoding method and encoding device with motion prediction
US10373392B2 (en) * 2015-08-26 2019-08-06 Microsoft Technology Licensing, Llc Transitioning views of a virtual model
CN105183296B (zh) * 2015-09-23 2018-05-04 腾讯科技(深圳)有限公司 Interactive interface display method and apparatus
CN105913497B (zh) * 2016-05-27 2018-09-07 杭州映墨科技有限公司 Virtual reality space movement positioning system and method for virtual house viewing
JP6126273B1 (ja) * 2016-06-10 2017-05-10 株式会社コロプラ Method for providing a virtual space, program for causing a computer to implement the method, and system for providing the virtual space
CN106094639A (zh) * 2016-07-12 2016-11-09 大连普菲克科技有限公司 Walking simulation control device
CN106502395A (zh) * 2016-10-18 2017-03-15 深圳市火花幻境虚拟现实技术有限公司 Method and apparatus for preventing user dizziness in virtual reality applications
CN106569609B (zh) * 2016-11-11 2019-05-07 上海远鉴信息科技有限公司 Improved user teleportation method and system in virtual reality
CN106484123A (zh) * 2016-11-11 2017-03-08 上海远鉴信息科技有限公司 User teleportation method and system in virtual reality
CN106774872A (zh) * 2016-12-09 2017-05-31 网易(杭州)网络有限公司 Virtual reality system, and virtual reality interaction method and apparatus
CN106621324A (zh) * 2016-12-30 2017-05-10 当家移动绿色互联网技术集团有限公司 Interactive operation method for VR games
CN106924970B (zh) * 2017-03-08 2020-07-07 网易(杭州)网络有限公司 Virtual reality system, and virtual-reality-based information display method and apparatus
JP6257827B1 (ja) * 2017-06-01 2018-01-10 株式会社コロプラ Computer-executed method, program, and information processing apparatus for providing a virtual space


Cited By (2)

Publication number Priority date Publication date Assignee Title
CN111078031A (zh) * 2019-12-23 2020-04-28 上海米哈游网络科技股份有限公司 Position determination method, apparatus, device, and storage medium for a virtual character
CN111078031B (zh) 2019-12-23 2023-11-14 上海米哈游网络科技股份有限公司 Position determination method, apparatus, device, and storage medium for a virtual character

Also Published As

Publication number Publication date
US11049329B2 (en) 2021-06-29
KR20190126377A (ko) 2019-11-11
US20230074857A1 (en) 2023-03-09
US20190362564A1 (en) 2019-11-28
JP7023991B2 (ja) 2022-02-22
US12026847B2 (en) 2024-07-02
US11527052B2 (en) 2022-12-13
CN107450747B (zh) 2018-09-18
EP3575928A4 (en) 2020-02-26
JP2020527262A (ja) 2020-09-03
EP3575928A1 (en) 2019-12-04
US20210272380A1 (en) 2021-09-02
CN107450747A (zh) 2017-12-08
KR102574170B1 (ko) 2023-09-05


Legal Events

121: The EPO has been informed by WIPO that EP was designated in this application. Ref document number: 18837684; Country of ref document: EP; Kind code of ref document: A1
ENP: Entry into the national phase. Ref document number: 2018837684; Country of ref document: EP; Effective date: 20190830
ENP: Entry into the national phase. Ref document number: 20197029586; Country of ref document: KR; Kind code of ref document: A
ENP: Entry into the national phase. Ref document number: 2019572111; Country of ref document: JP; Kind code of ref document: A
NENP: Non-entry into the national phase. Ref country code: DE