CN108310768A - Display method and apparatus for a virtual scene, storage medium, and electronic device - Google Patents

Display method and apparatus for a virtual scene, storage medium, and electronic device

Info

Publication number
CN108310768A
CN108310768A (application CN201810040140.XA; granted publication CN108310768B)
Authority
CN
China
Prior art keywords
region
area
event
virtual scene
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810040140.XA
Other languages
Chinese (zh)
Other versions
CN108310768B (en)
Inventor
童颜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201810040140.XA
Publication of CN108310768A
Application granted
Publication of CN108310768B
Legal status: Active
Anticipated expiration

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 - Controlling the output signals based on the game progress
    • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/525 - Changing parameters of virtual cameras
    • A63F 13/5255 - Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/426 - Processing input control signals of video game devices by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a display method and apparatus for a virtual scene, a storage medium, and an electronic device. The method includes: detecting a first event, the first event being an event triggered on a client, where the client is used to control a target object in a virtual scene displayed on the client; in response to the first event, determining a first area in the virtual scene, the first area being different from a second area, where the second area is the area in which the target object is located in the virtual scene; and displaying, on the client, the scenes located in the first area and the second area. The invention solves the technical problem in the related art that the scene area displayed by a terminal is small.

Description

Display method and apparatus for a virtual scene, storage medium, and electronic device
Technical field
The present invention relates to the field of the Internet, and in particular to a display method and apparatus for a virtual scene, a storage medium, and an electronic device.
Background
With the development of multimedia technology and the spread of wireless networks, people's entertainment activities have become increasingly rich. For example, users play standalone or online games on handheld networked media devices or on computers. Game types are varied, such as bullet-curtain shooting games, adventure games, simulation games, role-playing games, multiplayer online battle arena (MOBA) games, casual chess and card games, and other games.
In most game types, a player can choose to play with other players. For example, in the currently popular MOBA games, multiple players can participate in a match online at the same time. During the game, a player can view the scene in the area where the player's character is located. Because the screen of a mobile terminal is small, the scene area displayed on the mobile terminal is also small. Since the displayed scene area is small, problems arise: the player cannot discover in time the enemies approaching him or her, which degrades the player's game experience.
For the technical problem in the related art that the scene area displayed by a terminal is small, no effective solution has yet been proposed.
Summary of the invention
Embodiments of the present invention provide a display method and apparatus for a virtual scene, a storage medium, and an electronic device, so as to at least solve the technical problem in the related art that the scene area displayed by a terminal is small.
According to one aspect of the embodiments of the present invention, a display method for a virtual scene is provided, including: detecting a first event, where the first event is an event triggered on a client and the client is used to control a target object in a virtual scene displayed on the client; in response to the first event, determining a first area in the virtual scene, where the first area is different from a second area and the second area is the area in which the target object is located in the virtual scene; and displaying, on the client, the scenes located in the first area and the second area.
According to another aspect of the embodiments of the present invention, a display apparatus for a virtual scene is further provided, including: a detection unit configured to detect a first event, where the first event is an event triggered on a client and the client is used to control a target object in a virtual scene displayed on the client; a determination unit configured to determine, in response to the first event, a first area in the virtual scene, where the first area is different from a second area and the second area is the area in which the target object is located in the virtual scene; and a display unit configured to display, on the client, the scenes located in the first area and the second area.
According to another aspect of the embodiments of the present invention, a storage medium is further provided. The storage medium includes a stored program, and the program, when run, executes the above method.
According to another aspect of the embodiments of the present invention, an electronic device is further provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the above method by means of the computer program.
In the embodiments of the present invention, when a first event is detected, a first area in the virtual scene is determined, where the first area is different from a second area and the second area is the area in which the target object is located in the virtual scene; the scenes located in the first area and the second area are then displayed on the client. Because the first event can extend the range of scene area that the display interface of the client is able to show, the technical problem in the related art that the scene area displayed by a terminal is small can be solved, thereby achieving the technical effect of extending the range of scene area that the terminal can display.
Description of the drawings
The accompanying drawings described herein are used to provide a further understanding of the present invention and constitute a part of this application. The illustrative embodiments of the present invention and their description are used to explain the present invention and do not constitute an improper limitation of the present invention. In the drawings:
Fig. 1 is a schematic diagram of a hardware environment of a display method for a virtual scene according to an embodiment of the present invention;
Fig. 2 is a flowchart of an optional display method for a virtual scene according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of an optional map according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of an optional map according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of an optional map according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of an optional map according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of an optional interface according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of an optional interface according to an embodiment of the present invention;
Fig. 9 is a schematic diagram of an optional interface according to an embodiment of the present invention;
Fig. 10 is a flowchart of an optional display method for a virtual scene according to an embodiment of the present invention;
Fig. 11 is a schematic diagram of an optional display apparatus for a virtual scene according to an embodiment of the present invention; and
Fig. 12 is a structural block diagram of a terminal according to an embodiment of the present invention.
Detailed description of the embodiments
In order to enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
It should be noted that the terms "first", "second", and the like in the description, claims, and accompanying drawings of this specification are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data used in this way may be interchanged where appropriate, so that the embodiments of the present invention described herein can be implemented in orders other than those illustrated or described herein. In addition, the terms "comprising" and "having" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that contains a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units that are not explicitly listed or that are inherent to the process, method, product, or device.
First, some of the terms appearing in the description of the embodiments of the present invention are explained as follows:
MOBA: the abbreviation of Multiplayer Online Battle Arena games, i.e., multiplayer online tactical competitive games.
UI layer: UI is the abbreviation of User Interface; the UI layer contains the icons in the interface.
Operation layer: the battlefield background layer in which roles such as the hero used by the player in the game reside.
According to one aspect of the embodiments of the present invention, a method embodiment of a display method for a virtual scene is provided.
Optionally, in this embodiment, the above display method for a virtual scene can be applied to a hardware environment composed of a server 101 and a terminal 103 as shown in Fig. 1. As shown in Fig. 1, the server 101 is connected to the terminal 103 through a network and can be used to provide services (such as game services, application services, or web services) for the terminal or for a client installed on the terminal. A database 105 can be set up on the server or independently of the server to provide data storage services for the server 101. The above network includes, but is not limited to, a wide area network, a metropolitan area network, or a local area network, and the terminal 103 is not limited to a PC, a mobile phone, a tablet computer, or the like. The display method for a virtual scene of the embodiments of the present invention can be executed by the server 101, by the terminal 103, or jointly by the server 101 and the terminal 103. When the terminal 103 executes the display method for a virtual scene of the embodiments of the present invention, the method can also be executed by a client installed on the terminal:
Step S102: a drag event (i.e., a first event) triggered by the player on the client is detected, such as a two-finger slide, the sliding direction being any one of up, down, left, and right.
Step S104: according to the drag distance of the drag event, a third region B (namely the scene area currently displayed in a first sub-interface of the client) is moved to obtain a first area C; after the movement, an opposing character D may appear in the first area.
Step S106: the scenes located in the first area C and the second area A are displayed on the client; the display of the second area A can remain unchanged, while the scene of the displayed third region is switched to the scene in the first area.
In other words, through the above steps, part of the scene picture on the client can be kept still while another part of the scene is moved, which makes it convenient for the player to view various changes in the scene, such as changes of teammates, opponents, non-player-controlled objects (NPCs), and the game environment. Fig. 1 is only used to schematically illustrate the technical solution of the present application; embodiments of the present application are described in detail below with reference to Fig. 2.
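To make the Fig. 1 flow concrete, the following Python sketch models steps S102 to S106 under assumed conventions (each displayed region is described by a centre point and a half-extent); the class and method names are illustrative and are not taken from the patent.

```python
# Illustrative sketch only: models the Fig. 1 flow (S102-S106) with
# hypothetical names; not the patent's reference implementation.

class SplitSceneView:
    def __init__(self, player_pos, half_extent):
        self.player_pos = player_pos      # centre of second area A (stays fixed)
        self.offset = [0.0, 0.0]          # accumulated shift of the movable region
        self.half_extent = half_extent    # half width/height of each displayed region

    def on_drag(self, dx, dy):
        # S102/S104: a drag (e.g. a two-finger slide) moves the currently
        # displayed third region B by the drag distance, yielding first area C.
        self.offset[0] += dx
        self.offset[1] += dy

    def regions_to_render(self):
        # S106: second area A remains anchored on the player; the other
        # sub-interface now shows first area C, i.e. B shifted by the offset.
        ax, ay = self.player_pos
        cx, cy = ax + self.offset[0], ay + self.offset[1]
        return {"second_area_A": (ax, ay, self.half_extent),
                "first_area_C": (cx, cy, self.half_extent)}


view = SplitSceneView(player_pos=(100.0, 100.0), half_extent=30.0)
view.on_drag(0.0, -12.5)          # a slide that shifts the movable region by (0, -12.5)
print(view.regions_to_render())   # A unchanged, C shifted by the drag
```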
Fig. 2 is a flowchart of an optional display method for a virtual scene according to an embodiment of the present invention. As shown in Fig. 2, the method may include the following steps:
Step S202: a first event is detected, where the first event is an event triggered on the client, and the client is used to control a target object in the virtual scene displayed on the client.
The above client can be the client of a game application, a social application, an instant messaging application, or the like. The client can be installed on terminals such as a mobile terminal, a PC, a notebook computer, or a smart TV. The above first event can be triggered directly on the terminal, for example through the terminal's touch screen, pressure-sensitive screen, game controller, or image acquisition device; it can also be triggered through another terminal connected to the terminal, for example through a mobile phone connected to a PC or a smart TV.
The above virtual scene can be a game scene in a game application, a social scene in a social application, or the like. The target object can be a player character controlled by the player in a game scene, or a user character controlled by the user in a social application.
Step S204: in response to the first event, a first area in the virtual scene is determined, where the first area is different from a second area, and the second area is the area in which the target object is located in the virtual scene.
Optionally, the above first event can be any of multiple types of events (such as a drag event, a slide event, a click event, a press event, or a gesture). When the first area in the virtual scene is determined, event information of the first event, such as the sliding distance, the number of clicks, or the pressure value, can be obtained according to the event type, and the first area in the virtual scene is then determined according to the event information.
Step S206: the scenes located in the first area and the second area are displayed on the client.
It should be noted that, for the client, the display interface of the client can be divided into at least two parts, one part (a first sub-interface) being used to display the scene in the first area and the other part (a second sub-interface) being used to display the scene in the second area. The target object may reside at the center of the entire display interface (that is, straddle the first area and the second area), or may exist only in the second area. Since the screen size is fixed, the range that can be displayed is also fixed, and the position of the target object on the screen is fixed (for example, in the middle of the screen); this amounts to limiting the second area, and hence the displayed region, to an area within a fixed range of the target object, which makes the scene area that the terminal can display small.
When the target object does not move, the picture in the second sub-interface does not change. At this time, the player can trigger a change of the picture in the first sub-interface through the above various types of first events, for example by sliding in any one of the four directions of the screen, so that the scene area that the terminal can display is extended. This also amounts to extending the visible range of the user, which can effectively improve the user experience.
Through the above steps S202 to S206, when a first event is detected, a first area in the virtual scene is determined, the first area being different from a second area, and the second area being the area in which the target object is located in the virtual scene; the scenes located in the first area and the second area are displayed on the client. Because the first event can extend the range of scene area that the display interface of the client is able to show, the technical problem in the related art that the scene area displayed by the terminal is small can be solved, thereby achieving the technical effect of extending the range of scene area that the terminal can display.
In the technical solution provided in step S202, a first event is detected, where the first event is an event triggered on the client, and the client is used to control the target object in the virtual scene displayed on the client.
Optionally, the types of the detected first event include, but are not limited to:
1) a drag event or slide event detected on a device such as a touch screen, a pressure-sensitive screen, a mouse, or an image acquisition device;
2) a click event detected on a device such as a touch screen, a pressure-sensitive screen, a mouse, or an image acquisition device;
3) a press event detected on a device such as a pressure-sensitive screen;
4) a gesture operation detected by a device such as an image acquisition device (for example, a "draw a circle" or "draw a triangle" gesture operation).
In the technical solution provided in step S204, when the first area in the virtual scene is determined, the event information of the first event can be obtained, and the first area in the virtual scene is determined according to the event information.
Optionally, obtaining the event information of the first event includes, but is not limited to, the following cases:
1) Slide event
The slide event of this application can be a single-finger slide or a multi-finger slide (such as a two-finger or three-finger slide).
In the case where the first event is a slide event, the event information of the slide event on the client is obtained, where the event information includes the sliding direction of the slide event and may also include a third distance, the third distance being used to determine a first distance (i.e., the distance by which the displayed scene area moves in the plane).
Optionally, as shown in Fig. 3, when the first area in the virtual scene is determined according to the event information, the position of a third region B in the virtual scene can be moved at least according to the direction indicated by the event information to obtain the first area. For example, each time a slide occurs, the region is translated by a fixed distance in the sliding direction, and the first area is obtained after the position of the third region in the virtual scene is moved, the third region being the region in which the scene displayed by the first sub-interface before the movement is located.
Optionally, moving the position of the third region in the virtual scene at least according to the direction indicated by the event information to obtain the first area may include: obtaining the first distance y indicated by the event information, the first distance being the distance by which the third region needs to move at a first moment, where y = k*x, x denotes the third distance, and k denotes a preset coefficient; and, at the first moment, moving the position of the third region in the virtual scene by the first distance in the direction indicated by the event information to obtain the first area, the third region being the region in which the scene displayed by the first sub-interface at a second moment is located, and the second moment being the moment immediately preceding the first moment. As shown in Fig. 3, after the third region is displaced by the distance y in the sliding direction, this is equivalent to moving region C downward by the distance y so that it appears within the rendering range of the first sub-interface; as shown in Fig. 4, region B then moves out of the rendering range of the first sub-interface, while the scene picture rendered in the second sub-interface does not change.
Optionally, as shown in Fig. 5, the player can also slide left or right. After the third region is displaced by the distance y in the sliding direction, this is equivalent to moving region E (namely the aforementioned first area) to the left by the distance y so that it appears within the rendering range of the first sub-interface; as shown in Fig. 6, region B then moves out of the rendering range of the first sub-interface, while the scene picture rendered in the second sub-interface does not change.
Optionally, the above third distance can be the sliding distance within a fixed time period (such as 0.1 seconds, 0.2 seconds, or 0.5 seconds) during the slide event, which means that a third distance is determined every such fixed time period and the corresponding first distance is then moved; alternatively, the above third distance can be the sliding distance of one complete slide event.
It should be noted that a maximum region allowed to be viewed can also be preset according to the position of the target object. In this case, moving the position of the third region in the virtual scene at least according to the direction indicated by the event information to obtain the first area may include: obtaining the first distance indicated by the event information, where the first distance is the distance by which the third region needs to move at the first moment; in the case where the first distance is less than a second distance, moving the position of the third region in the virtual scene by the first distance in the direction indicated by the event information to obtain the first area, where the second distance is the distance between the third region and the boundary of a fourth region in the direction indicated by the event information, the fourth region is the region of the virtual scene that is allowed to be viewed, and the third region is located within the fourth region; and, in the case where the first distance is not less than the second distance, moving the position of the third region in the virtual scene by the second distance in the direction indicated by the event information to obtain the first area.
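A minimal Python sketch of this clamping rule follows; all function and parameter names are assumptions, and the fourth region is modelled as an axis-aligned rectangle for simplicity. The region moves by the first distance y = k*x unless that would cross the boundary of the allowed fourth region, in which case it moves only by the second distance.

```python
# Sketch of the clamped move described above (y = k * x, limited by the
# boundary of the allowed viewing region); all names are illustrative.

def move_third_region(center, direction, third_distance, k, allowed_rect):
    """center: (x, y) centre of the third region;
    direction: unit vector of the slide, e.g. (0, -1) for 'up' on screen;
    third_distance: finger slide distance x measured for this step;
    k: preset sensitivity coefficient;
    allowed_rect: (xmin, ymin, xmax, ymax) limits for the region centre
    (the fourth region)."""
    first_distance = k * third_distance            # y = k * x

    # Second distance: how far the centre can still travel along `direction`
    # before leaving the allowed rectangle.
    xmin, ymin, xmax, ymax = allowed_rect
    cx, cy = center
    dx, dy = direction
    room_x = (xmax - cx) if dx > 0 else (cx - xmin) if dx < 0 else float("inf")
    room_y = (ymax - cy) if dy > 0 else (cy - ymin) if dy < 0 else float("inf")
    second_distance = min(room_x / abs(dx) if dx else float("inf"),
                          room_y / abs(dy) if dy else float("inf"))

    step = min(first_distance, second_distance)    # clamp at the boundary
    return (cx + dx * step, cy + dy * step)


# Example: slide right by 40 units with k = 0.5, but only 15 units of room left.
new_center = move_third_region((85.0, 50.0), (1.0, 0.0), 40.0, 0.5,
                               allowed_rect=(0.0, 0.0, 100.0, 100.0))
print(new_center)   # (100.0, 50.0): the movement is clamped to the second distance
```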
2) Touch event
In the case where the first event is a touch event, the event information of the touch event on the client is obtained, where the event information includes the touch position of the touch event.
The above touch event can be a click event in a third sub-interface, and the corresponding event information includes the clicked position.
Determining the first area in the virtual scene according to the event information may include: obtaining a first position indicated by the event information; and determining that the region in which the first position is located in the virtual scene is the first area.
Optionally, obtaining the first position indicated by the event information includes: obtaining the first position corresponding to the touch position indicated by the event information, where the touch position is the trigger position of the event information in the third sub-interface, the third sub-interface is used to display a scene map of the virtual scene, and the first position is the position in the scene map to which the touch position is mapped. Determining that the region in which the first position is located in the virtual scene is the first area includes: determining that the region in which the first position is located in the scene map is a fifth region; and taking the part of the virtual scene represented by the fifth region as the first area. The fifth region here has the same size as the third region.
As shown in Fig. 7, the player clicks, in the third sub-interface, the region that needs to be displayed in the first sub-interface, for example the content of the third region B in the map displayed at the current moment (corresponding to the user click position 701). As shown in Fig. 8, after the finger position becomes 703, the region displayed in the first sub-interface is repositioned; specifically, it can be a region centered on the position touched by the finger (i.e., the fifth region).
Similarly, the player can also click in the first sub-interface, and the clicked position will serve as the center of the region (the first area) displayed in the first sub-interface.
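The mapping from a tap in the third sub-interface (the mini-map) to the fifth region can be sketched as follows in Python; the coordinate conventions, rectangle representation, and names are assumptions for illustration only.

```python
# Sketch of mapping a mini-map tap to a viewing region in the scene;
# coordinate conventions and names are assumptions, not the patent's code.

def minimap_tap_to_first_area(tap_px, minimap_rect_px, scene_size, region_size):
    """tap_px: (x, y) touch position in screen pixels;
    minimap_rect_px: (left, top, width, height) of the mini-map on screen;
    scene_size: (width, height) of the full scene map in world units;
    region_size: (width, height) of the third region (the fifth region
    uses the same size)."""
    left, top, w, h = minimap_rect_px
    u = (tap_px[0] - left) / w            # normalised position inside the mini-map
    v = (tap_px[1] - top) / h
    first_pos = (u * scene_size[0], v * scene_size[1])   # position in the scene map

    # Fifth region: a rectangle of the third region's size centred on first_pos.
    rw, rh = region_size
    return (first_pos[0] - rw / 2, first_pos[1] - rh / 2, rw, rh)


# Example: a 150x150 px mini-map near the top-left corner, a 1000x1000 scene map.
area = minimap_tap_to_first_area((90, 60), (10, 10, 150, 150), (1000, 1000), (200, 200))
print(area)   # a 200x200 region centred roughly at (533, 333) in scene coordinates
```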
The above touch event can also be a click event or a press event in the first sub-interface, and the corresponding event information includes the touch position and the number of clicks or the pressing force (pressure value).
Optionally, moving the position of the third region in the virtual scene at least according to the direction indicated by the event information to obtain the first area may include: obtaining the first distance y indicated by the event information, the first distance being the distance by which the third region needs to move at the first moment, where y = k*n, n denotes the number of clicks or the pressing force, and k denotes a preset coefficient; and, at the first moment, moving the position of the third region in the virtual scene by the first distance in the direction indicated by the event information to obtain the first area, the third region being the region in which the scene displayed by the first sub-interface at the second moment is located, and the second moment being the moment immediately preceding the first moment.
The above direction can be determined according to the touch position. For example, the first sub-interface is divided into multiple regions, each region representing one moving direction (such as up, down, left, or right); the moving direction can then be determined according to the region in which the touch position is located.
As shown in Fig. 9, the first sub-interface is divided into four regions 901, 903, 905, and 907, corresponding respectively to the four moving directions down, right, up, and left. When the player clicks region 905, the region D in the scene is moved upward and becomes the first area.
Optionally, the division of the regions and the directions they represent can also be determined in other ways; the above example is only used for schematic illustration.
Similar to the above embodiments, the above number of clicks or pressure value can be the number of clicks or pressure value within a fixed time period (such as 0.1 seconds, 0.2 seconds, or 0.5 seconds) during the touch event, which means that a number of clicks or a pressure value is determined every such fixed time period and the corresponding first distance is then moved; alternatively, it can be the number of clicks or pressure value of one complete touch event (for example, as long as the time interval between two clicks is less than 0.1 seconds, the two clicks can be regarded as belonging to the same touch event).
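A small Python sketch of this tap/press variant is given below; the zone layout loosely follows Fig. 9, but the exact split and all names are assumptions. The zone hit by the tap fixes the moving direction, and the move distance is y = k*n.

```python
# Sketch of the tap/press variant: zone -> direction, distance y = k * n.
# Zone layout and names are assumptions loosely following Fig. 9.

DIRECTIONS = {          # zone id -> unit direction vector (screen y grows downward)
    901: (0, 1),        # down
    903: (1, 0),        # right
    905: (0, -1),       # up
    907: (-1, 0),       # left
}

def zone_of(tap_px, view_rect_px):
    """Rough zone test: split the first sub-interface into four triangles
    meeting at the centre (bottom / right / top / left)."""
    left, top, w, h = view_rect_px
    nx = (tap_px[0] - left) / w - 0.5      # -0.5 .. 0.5
    ny = (tap_px[1] - top) / h - 0.5
    if abs(nx) > abs(ny):
        return 903 if nx > 0 else 907
    return 901 if ny > 0 else 905

def move_for_tap(tap_px, view_rect_px, n, k):
    """n: click count (or pressure value) in the sampling window; k: coefficient."""
    dx, dy = DIRECTIONS[zone_of(tap_px, view_rect_px)]
    y = k * n                               # first distance, y = k * n
    return (dx * y, dy * y)


# Example: three quick taps near the top of the view move the region up by 3*k.
print(move_for_tap((400, 60), (0, 0, 800, 450), n=3, k=10.0))   # (0.0, -30.0)
```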
Optionally, a configuration interface is provided in the client and can be displayed on the client, where the configuration interface is used to configure a sensitivity parameter k, and the first distance is the product of the third distance (or, here, the number of clicks or the pressure value) and the sensitivity parameter.
In the technical solution provided in step S206, the scenes located in the first area and the second area are displayed on the client.
Optionally, displaying the scenes in the first area and the second area on the client includes: displaying the scene in the first area in a first sub-interface of the display interface of the client, and displaying the scene in the second area in a second sub-interface of the display interface, where, before the first event is detected, the scene of the region in which the target object is located in the virtual scene is displayed in the first sub-interface and the second sub-interface.
As shown in Fig. 3, before the first event (i.e., the player's two-finger downward slide) is detected for the first time, the content displayed in the terminal display interface is the scene content in regions A and B (the second area and the third region); this part of the scene content can be centered on the game character in Fig. 3 (i.e., the second area and the third region together form the region in which the target object is located in the virtual scene). When the first event is detected for the first time, the region displayed by the first sub-interface is switched from the third region to the first area C, while the content displayed by the second sub-interface remains the scene in the second area, or the scenes in the second area and the third region are reduced and included in the second sub-interface.
It should be noted that if the first event is detected not for the first time but for the N-th time (N being greater than or equal to 2), the third region B may not be the region in which the target object is located in the virtual scene, but rather the region obtained after the movement triggered by the previous first event. When the first event is detected, the region displayed by the first sub-interface is still switched from the third region to the first area C, while the content displayed by the second sub-interface remains the scene in the second area, or the scenes in the second area and the third region are reduced and then displayed in the second sub-interface.
As an optional implementation, the technical solution of the present application is described below by taking a game on a mobile terminal as an example.
The problem that mobile phone games (such as MOBA games) cannot quickly and promptly view the battlefield situation during combat has long troubled the development of mobile phone games. Therefore, based on the above technical solution, a two-finger camera-moving mode on the right side of the mobile phone screen can be used in mobile phone games to extend the displayable range of the screen, so that the player can keep moving while quickly checking the surrounding battlefield at any moment, solving the problems that moving the camera via the mini-map is unresponsive and uncomfortable to operate.
In any region of the screen (such as the lower-right region), the player slides in the region with two fingers, and the camera is then moved by a certain distance, quickly switching to the target area to check the battlefield state. The player can also configure the camera sensitivity in the settings to suit different players' operating habits.
The camera-moving area can be on the right side of the screen; the area is a black-framed range, and it can always remain hidden (the black frame is not displayed). Camera-moving mode: the player slides in the camera-moving area with two fingers, and the camera moves in the direction of the finger slide; a two-finger slide in the camera-moving area triggers the camera movement, and once the fingers move beyond the red outer frame the maximum range is reached. During the slide, other button interactions in the moving area are not affected. As long as the player does not release the fingers, the camera can stay in the region to which it has been moved; after the player releases, the camera automatically returns to the hero's head. The relation between the camera movement distance and the finger movement distance is: camera movement distance = finger sliding distance * camera sensitivity coefficient k, where k has a configurable range and can be set by the player in the settings.
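The behaviour described above can be sketched as a small controller in Python; the class structure, the instantaneous snap-back, and the circular maximum-range clamp are assumptions rather than details taken from the patent.

```python
# Illustrative camera-lens controller for the two-finger mode described
# above; all names and structure are assumptions, not the patent's code.

class LensController:
    def __init__(self, sensitivity_k, max_offset):
        self.k = sensitivity_k          # player-configured sensitivity coefficient
        self.max_offset = max_offset    # offset at which the "red outer frame" is hit
        self.offset = [0.0, 0.0]        # camera offset from the hero
        self.active = False             # True while two fingers are down in the area

    def on_two_finger_move(self, dx_px, dy_px):
        # The camera moves with the finger slide; distance scaled by k.
        self.active = True
        self.offset[0] += dx_px * self.k
        self.offset[1] += dy_px * self.k
        # Clamp: once the offset passes the outer frame, the maximum range is reached.
        length = (self.offset[0] ** 2 + self.offset[1] ** 2) ** 0.5
        if length > self.max_offset:
            scale = self.max_offset / length
            self.offset = [self.offset[0] * scale, self.offset[1] * scale]

    def on_release(self):
        # As long as the player holds, the lens stays; on release it snaps back.
        self.active = False
        self.offset = [0.0, 0.0]

    def camera_position(self, hero_pos):
        return (hero_pos[0] + self.offset[0], hero_pos[1] + self.offset[1])
```

In an actual engine the snap-back to the hero would more likely be animated over several frames; the text above does not specify that detail.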
The specific flow is shown in Fig. 10:
Step S1002: judge whether the player has enabled two-finger camera movement in the settings; if not, execute step S1004; if so, execute step S1006.
Step S1004: when the player slides in the camera-moving area, camera movement is not triggered.
Step S1006: when the player sets the camera sensitivity coefficient k, the client records the camera sensitivity coefficient k set by the player.
Step S1008: the entire right side of the UI-layer screen becomes the operating area for camera movement; when the player's two fingers slide in the camera-moving area, the client detects the player's slide in the camera-moving area.
Step S1010: responses in the camera-moving area other than camera movement are excluded.
Step S1012: the client records two parameters: the distance x by which the midpoint of the two fingers slides in the camera-moving area, and the direction A in which the midpoint of the two fingers slides in the camera-moving area.
Step S1014: using the coefficient k, calculate the distance y by which the camera moves in the operation layer.
Camera movement distance y = two-finger midpoint sliding distance x * camera sensitivity coefficient k; this gives the distance y by which the camera moves in the operation layer (i.e., the scene layer in which the game character walks about and in which the various UI-layer operations take effect).
Step S1016: control the camera to move by the distance y in direction A in the operation layer.
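The Fig. 10 steps can be summarised roughly as follows (Python sketch; the settings dictionary, touch tuples, and camera interface are invented for illustration). The setting check corresponds to S1002 to S1006, each sample records the midpoint displacement x and direction A (S1012), computes y = x * k (S1014), and moves the operation-layer camera (S1016); S1010, suppressing other UI responses inside the area, is assumed to be handled by the input system and is omitted here.

```python
# Rough sketch of the Fig. 10 flow (S1002-S1016); the event and engine
# interfaces are invented for illustration only.

import math

def handle_two_finger_slide(settings, touches_prev, touches_now, camera):
    # S1002/S1004: do nothing if the two-finger lens-move option is disabled.
    if not settings.get("two_finger_lens_move", False):
        return
    k = settings.get("lens_sensitivity_k", 1.0)      # S1006: recorded coefficient

    # S1008: require exactly two touches inside the lens-moving area.
    if len(touches_now) != 2 or len(touches_prev) != 2:
        return

    # S1012: midpoint displacement x and its direction A.
    mid_prev = ((touches_prev[0][0] + touches_prev[1][0]) / 2,
                (touches_prev[0][1] + touches_prev[1][1]) / 2)
    mid_now = ((touches_now[0][0] + touches_now[1][0]) / 2,
               (touches_now[0][1] + touches_now[1][1]) / 2)
    vx, vy = mid_now[0] - mid_prev[0], mid_now[1] - mid_prev[1]
    x = math.hypot(vx, vy)
    if x == 0:
        return
    direction_a = (vx / x, vy / x)

    # S1014: distance moved in the operation layer, y = x * k.
    y = x * k

    # S1016: move the camera by y along direction A in the operation layer.
    camera.move(direction_a[0] * y, direction_a[1] * y)


class _DummyCamera:
    def move(self, dx, dy):
        print("camera moved by", (round(dx, 1), round(dy, 1)))

handle_two_finger_slide({"two_finger_lens_move": True, "lens_sensitivity_k": 2.0},
                        [(10, 10), (30, 10)], [(10, 25), (30, 25)], _DummyCamera())
# camera moved by (0.0, 30.0)
```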
In the technical solution of the present application, the entire region on the right side of the screen demarcates a hidden camera-moving region. This region overlaps with regions such as the skill-casting region, but the camera-moving operation is responded to only when the region is slid with two fingers, and other operations in the region are then not responded to. Through this technical solution, the mobile game is provided with a fast camera-moving mode; the larger camera-movement response area makes it more convenient to use, and the player can use the camera-moving scheme to quickly check the surrounding situation without stopping the current movement, so that the player of a mobile game can observe distant situations while moving, with the left hand moving the character and the right hand moving the camera, which solves the previous situation in which the battlefield could only be observed through the mini-map and could not be operated on at the same time.
It should be noted that, for the sake of simple description, the foregoing method embodiments are all expressed as a series of action combinations, but those skilled in the art should understand that the present invention is not limited by the described order of actions, because according to the present invention some steps can be performed in other orders or simultaneously. Secondly, those skilled in the art should also understand that the embodiments described in this specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
Through the description of the above embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus the necessary general-purpose hardware platform, and of course also by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to cause a terminal device (which can be a mobile phone, a computer, a server, a network device, or the like) to execute the method described in each embodiment of the present invention.
According to another aspect of the embodiments of the present invention, a display apparatus for a virtual scene for implementing the above display method for a virtual scene is further provided, and the apparatus can be applied to the terminal 103. Fig. 11 is a schematic diagram of an optional display apparatus for a virtual scene according to an embodiment of the present invention. As shown in Fig. 11, the apparatus may include a detection unit 1101, a determination unit 1103, and a display unit 1105.
The detection unit 1101 is configured to detect a first event, where the first event is an event triggered on the client, and the client is used to control the target object in the virtual scene displayed on the client.
The above client can be the client of a game application, a social application, an instant messaging application, or the like. The client can be installed on terminals such as a mobile terminal, a PC, a notebook computer, or a smart TV. The above first event can be triggered directly on the terminal, for example through the terminal's touch screen, pressure-sensitive screen, game controller, or image acquisition device; it can also be triggered through another terminal connected to the terminal, for example through a mobile phone connected to a PC or a smart TV.
The above virtual scene can be a game scene in a game application, a social scene in a social application, or the like. The target object can be a player character controlled by the player in a game scene, or a user character controlled by the user in a social application.
The determination unit 1103 is configured to determine, in response to the first event, a first area in the virtual scene, where the first area is different from a second area, and the second area is the area in which the target object is located in the virtual scene.
Optionally, the above first event can be any of multiple types of events (such as a drag event, a slide event, a click event, a press event, or a gesture). When the first area in the virtual scene is determined, event information of the first event, such as the sliding distance, the number of clicks, or the pressure value, can be obtained according to the event type, and the first area in the virtual scene is then determined according to the event information.
The display unit 1105 is configured to display, on the client, the scenes located in the first area and the second area.
It should be noted that, for the client, the display interface of the client can be divided into at least two parts, one part (a first sub-interface) being used to display the scene in the first area and the other part (a second sub-interface) being used to display the scene in the second area. The target object may reside at the center of the entire display interface (that is, straddle the first area and the second area), or may exist only in the second area. Since the screen size is fixed, the range that can be displayed is also fixed, and the position of the target object on the screen is fixed (for example, in the middle of the screen); this amounts to limiting the second area to a region within a fixed range of the target object, which makes the scene area that the terminal can display small.
When the target object does not move, the picture in the second sub-interface does not change. At this time, the player can trigger a change of the picture in the first sub-interface through the above various types of first events, for example by dragging in any one of the four directions of the screen, so that the scene area that the terminal can display is extended. This also amounts to extending the visible range of the user, which can effectively improve the user experience.
It should be noted that the detection unit 1101 in this embodiment can be used to execute step S202 in the embodiments of the present application, the determination unit 1103 in this embodiment can be used to execute step S204 in the embodiments of the present application, and the display unit 1105 in this embodiment can be used to execute step S206 in the embodiments of the present application.
It should be noted here that the above modules implement the same examples and application scenarios as the corresponding steps, but are not limited to what is disclosed in the above embodiments. It should be noted that the above modules, as part of the apparatus, can run in the hardware environment shown in Fig. 1 and can be implemented by software or by hardware.
Through the above modules, when a first event is detected, a first area in the virtual scene is determined, the first area being different from a second area, and the second area being the area in which the target object is located in the virtual scene; the scenes located in the first area and the second area are displayed on the client. Because the first event can extend the range of scene area that the display interface of the client is able to show, the technical problem in the related art that the scene area displayed by the terminal is small can be solved, thereby achieving the technical effect of extending the range of scene area that the terminal can display.
Optionally, the above display unit can be further configured to: display the scene in the first area in a first sub-interface of the display interface of the client, and display the scene in the second area in a second sub-interface of the display interface, where, before the first event is detected, the scene of the region in which the target object is located in the virtual scene is displayed in the first sub-interface and the second sub-interface.
The determination unit of the present application may include: an acquisition module configured to obtain the event information of the first event; and a determination module configured to determine the first area in the virtual scene according to the event information.
The above determination module can be further configured to: move the position of a third region in the virtual scene at least according to the direction indicated by the event information to obtain the first area, where the first area is obtained after the position of the third region in the virtual scene is moved, and the third region is the region in which the scene displayed by the first sub-interface before the movement is located.
Optionally, when executing the step of moving the position of the third region in the virtual scene at least according to the direction indicated by the event information to obtain the first area, the determination module can obtain the first distance indicated by the event information, where the first distance is the distance by which the third region needs to move at the first moment, and, at the first moment, move the position of the third region in the virtual scene by the first distance in the direction indicated by the event information to obtain the first area, where the third region is the region in which the scene displayed by the first sub-interface at the second moment is located, and the second moment is the moment immediately preceding the first moment.
Optionally, when executing the step of moving the position of the third region in the virtual scene at least according to the direction indicated by the event information to obtain the first area, the determination module can obtain the first distance indicated by the event information, where the first distance is the distance by which the third region needs to move at the first moment; in the case where the first distance is less than a second distance, move the position of the third region in the virtual scene by the first distance in the direction indicated by the event information to obtain the first area, where the second distance is the distance between the third region and the boundary of a fourth region in the direction indicated by the event information, the fourth region is the region of the virtual scene that is allowed to be viewed, and the third region is located within the fourth region; and, in the case where the first distance is not less than the second distance, move the position of the third region in the virtual scene by the second distance in the direction indicated by the event information to obtain the first area.
Optionally, when executing the step of determining the first area in the virtual scene according to the event information, the determination module can obtain the first position indicated by the event information and determine that the region in which the first position is located in the virtual scene is the first area.
The above determination module can also be used to obtain the first position corresponding to the touch position indicated by the event information, where the touch position is the trigger position of the event information in the third sub-interface, the third sub-interface is used to display the scene map of the virtual scene, and the first position is the position in the scene map to which the touch position is mapped; and to determine that the region in which the first position is located in the scene map is a fifth region and take the part of the virtual scene represented by the fifth region as the first area.
The acquisition module of the present application can be further configured to: in the case where the first event is a slide event, obtain the event information of the slide event on the client, where the event information includes the sliding direction of the slide event and a third distance, the third distance being used to determine the first distance; and, in the case where the first event is a touch event, obtain the event information of the touch event on the client, where the event information includes the touch position of the touch event.
In the technical solution of the present application, the entire region on the right side of the screen demarcates a hidden camera-moving region. This region overlaps with regions such as the skill-casting region, but the camera-moving operation is responded to only when the region is slid with two fingers, and other operations in the region are then not responded to. Through this technical solution, the mobile game is provided with a fast camera-moving mode; the larger camera-movement response area makes it more convenient to use, and the player can use the camera-moving scheme to quickly check the surrounding situation without stopping the current movement, so that the player of a mobile game can observe distant situations while moving, with the left hand moving the character and the right hand moving the camera, which solves the previous situation in which the battlefield could only be observed through the mini-map and could not be operated on at the same time.
It should be noted here that the above modules implement the same examples and application scenarios as the corresponding steps, but are not limited to what is disclosed in the above embodiments. It should be noted that the above modules, as part of the apparatus, can run in the hardware environment shown in Fig. 1 and can be implemented by software or by hardware, where the hardware environment includes a network environment.
According to another aspect of the embodiments of the present invention, a server or terminal for implementing the above display method for a virtual scene is further provided.
Fig. 12 is a structural block diagram of a terminal according to an embodiment of the present invention. As shown in Fig. 12, the terminal may include one or more processors 1201 (only one is shown in Fig. 12), a memory 1203, and a transmission device 1205 (such as the sending device in the above embodiments); as shown in Fig. 12, the terminal may further include an input/output device 1207.
The memory 1203 can be used to store software programs and modules, such as the program instructions/modules corresponding to the display method and apparatus for a virtual scene in the embodiments of the present invention. By running the software programs and modules stored in the memory 1203, the processor 1201 executes various functional applications and data processing, that is, realizes the above display method for a virtual scene. The memory 1203 may include a high-speed random access memory and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memories, or other non-volatile solid-state memories. In some examples, the memory 1203 may further include memories remotely located relative to the processor 1201, and these remote memories can be connected to the terminal through a network. Examples of the above network include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The above transmission device 1205 is used to receive or send data via a network, and can also be used for data transmission between the processor and the memory. Specific examples of the above network may include wired networks and wireless networks. In one example, the transmission device 1205 includes a network interface controller (NIC), which can be connected to other network devices and a router via a network cable so as to communicate with the Internet or a local area network. In another example, the transmission device 1205 is a radio frequency (RF) module, which is used to communicate with the Internet wirelessly.
Specifically, the memory 1203 is used to store an application program.
The processor 1201 can call, through the transmission device 1205, the application program stored in the memory 1203 to execute the following steps:
detecting a first event, where the first event is an event triggered on the client, and the client is used to control the target object in the virtual scene displayed on the client;
in response to the first event, determining a first area in the virtual scene, where the first area is different from a second area, and the second area is the area in which the target object is located in the virtual scene;
displaying, on the client, the scenes located in the first area and the second area.
The processor 1201 is further configured to execute the following steps:
obtaining the first distance indicated by the event information, where the first distance is the distance by which the third region needs to move at the first moment;
in the case where the first distance is less than a second distance, moving the position of the third region in the virtual scene by the first distance in the direction indicated by the event information to obtain the first area, where the second distance is the distance between the third region and the boundary of a fourth region in the direction indicated by the event information, the fourth region is the region of the virtual scene that is allowed to be viewed, and the third region is located within the fourth region;
in the case where the first distance is not less than the second distance, moving the position of the third region in the virtual scene by the second distance in the direction indicated by the event information to obtain the first area.
With the embodiments of the present invention, when a first event is detected, a first area in the virtual scene is determined, the first area being different from a second area, and the second area being the area in which the target object is located in the virtual scene; the scenes located in the first area and the second area are displayed on the client. Because the first event can extend the range of scene area that the display interface of the client is able to show, the technical problem in the related art that the scene area displayed by the terminal is small can be solved, thereby achieving the technical effect of extending the range of scene area that the terminal can display.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments, and details are not repeated here.
Those of ordinary skill in the art can understand that the structure shown in Fig. 12 is only illustrative. The terminal can be a smart phone (such as an Android phone or an iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), a PAD, or another terminal device. Fig. 12 does not limit the structure of the above electronic device; for example, the terminal may also include more or fewer components (such as a network interface or a display device) than shown in Fig. 12, or have a configuration different from that shown in Fig. 12.
Those of ordinary skill in the art can understand that all or part of the steps in the various methods of the above embodiments can be completed by a program instructing the hardware related to the terminal device, and the program can be stored in a computer-readable storage medium. The storage medium may include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
The embodiments of the present invention further provide a storage medium. Optionally, in this embodiment, the above storage medium can be used to store program code for executing the display method for a virtual scene.
Optionally, in this embodiment, the above storage medium can be located on at least one of a plurality of network devices in the network shown in the above embodiments.
Optionally, in this embodiment, the storage medium is configured to store program code for executing the following steps:
S12: detecting a first event, where the first event is an event triggered on the client, and the client is used to control the target object in the virtual scene displayed on the client;
S14: in response to the first event, determining a first area in the virtual scene, where the first area is different from a second area, and the second area is the area in which the target object is located in the virtual scene;
S16: displaying, on the client, the scenes located in the first area and the second area.
Optionally, the storage medium is further configured to store program code for executing the following steps:
S22: obtaining the first distance indicated by the event information, where the first distance is the distance by which the third region needs to move at the first moment;
S24: in the case where the first distance is less than a second distance, moving the position of the third region in the virtual scene by the first distance in the direction indicated by the event information to obtain the first area, where the second distance is the distance between the third region and the boundary of a fourth region in the direction indicated by the event information, the fourth region is the region of the virtual scene that is allowed to be viewed, and the third region is located within the fourth region;
S26: in the case where the first distance is not less than the second distance, moving the position of the third region in the virtual scene by the second distance in the direction indicated by the event information to obtain the first area.
Optionally, for specific examples in this embodiment, reference may be made to the examples described in the above embodiments, and details are not repeated here.
Optionally, in this embodiment, the above storage medium may include, but is not limited to, various media that can store program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
The serial numbers of the above embodiments of the present invention are only for description and do not represent the superiority or inferiority of the embodiments.
If the integrated unit in the above embodiments is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in the above computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to cause one or more computer devices (which can be personal computers, servers, network devices, or the like) to execute all or part of the steps of the methods described in the embodiments of the present invention.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis. For a part that is not described in detail in one embodiment, reference may be made to the relevant description of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed client can be implemented in other ways. The apparatus embodiments described above are only illustrative; for example, the division of the units is only a division of logical functions, and there may be other division manners in actual implementation, for instance multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present invention can be integrated into one processing unit, or each unit can exist physically alone, or two or more units can be integrated into one unit. The above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
The above are only preferred embodiments of the present invention. It should be noted that, for those of ordinary skill in the art, various improvements and modifications can be made without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (15)

1. A display method for a virtual scene, comprising:
detecting a first event, wherein the first event is an event triggered on a client, and the client is configured to control a target object in a virtual scene displayed on the client;
determining a first region in the virtual scene in response to the first event, wherein the first region is different from a second region, and the second region is a region in the virtual scene where the target object is located; and
displaying, on the client, scenes located in the first region and the second region.
2. The method according to claim 1, wherein displaying, on the client, the scenes located in the first region and the second region comprises:
displaying, in a first sub-interface of a display interface of the client, the scene located in the first region, and displaying, in a second sub-interface of the display interface, the scene located in the second region.
3. The method according to claim 2, wherein before detecting the first event, the method further comprises:
displaying, in both the first sub-interface and the second sub-interface, the scene of the region in the virtual scene where the target object is located.
4. The method according to any one of claims 1 to 3, wherein determining the first region in the virtual scene comprises:
obtaining event information of the first event; and
determining the first region in the virtual scene according to the event information.
5. The method according to claim 4, wherein determining the first region in the virtual scene according to the event information comprises:
moving a position of a third region in the virtual scene according to at least a direction indicated by the event information, to obtain the first region, wherein the first region is obtained after the position of the third region in the virtual scene is moved, and the third region is a region where a scene displayed in the first sub-interface before the movement is located.
6. The method according to claim 5, wherein moving the position of the third region in the virtual scene according to at least the direction indicated by the event information to obtain the first region comprises:
obtaining a first distance indicated by the event information, wherein the first distance is a distance by which the third region is to be moved at a first moment; and
moving, at the first moment, the position of the third region in the virtual scene by the first distance in the direction indicated by the event information, to obtain the first region, wherein the third region is a region where a scene displayed in the first sub-interface at a second moment is located, and the second moment is the moment immediately preceding the first moment.
7. The method according to claim 5, wherein moving the position of the third region in the virtual scene according to at least the direction indicated by the event information to obtain the first region comprises:
obtaining a first distance indicated by the event information, wherein the first distance is a distance by which the third region is to be moved at a first moment;
in a case where the first distance is less than a second distance, moving the position of the third region in the virtual scene by the first distance in the direction indicated by the event information, to obtain the first region, wherein the second distance is a distance, in the direction indicated by the event information, between the third region and a boundary of a fourth region, the fourth region is a region of the virtual scene that is allowed to be viewed, and the third region is located within the fourth region; and
in a case where the first distance is not less than the second distance, moving the position of the third region in the virtual scene by the second distance in the direction indicated by the event information, to obtain the first region.
8. The method according to claim 4, wherein determining the first region in the virtual scene according to the event information comprises:
obtaining a first position indicated by the event information; and
determining that a region in the virtual scene where the first position is located is the first region.
9. The method according to claim 8, wherein
obtaining the first position indicated by the event information comprises: obtaining the first position corresponding to a touch position indicated by the event information, wherein the touch position is a trigger position of the event information in a third sub-interface, the third sub-interface is configured to display a scene map of the virtual scene, and the first position is a position in the scene map to which the touch position is mapped; and
determining that the region in the virtual scene where the first position is located is the first region comprises: determining that a region in the scene map where the first position is located is a fifth region, and using a part of the virtual scene represented by the fifth region as the first region.
10. The method according to claim 4, wherein obtaining the event information of the first event comprises:
in a case where the first event is a slide event, obtaining event information of the slide event on the client, wherein the event information comprises a sliding direction of the slide event and a third distance, and the third distance is used to determine the first distance; or
in a case where the first event is a touch event, obtaining event information of the touch event on the client, wherein the event information comprises a touch position of the touch event.
11. The method according to claim 10, further comprising:
displaying a configuration interface on the client, wherein the configuration interface is used to configure a sensitivity parameter, and the first distance is a product of the third distance and the sensitivity parameter.
12. A display apparatus for a virtual scene, comprising:
a detection unit, configured to detect a first event, wherein the first event is an event triggered on a client, and the client is configured to control a target object in a virtual scene displayed on the client;
a determination unit, configured to determine a first region in the virtual scene in response to the first event, wherein the first region is different from a second region, and the second region is a region in the virtual scene where the target object is located; and
a display unit, configured to display, on the client, scenes located in the first region and the second region.
13. The apparatus according to claim 12, wherein the determination unit is further configured to:
move a position of a third region in the virtual scene according to at least a direction indicated by event information, to obtain the first region, wherein the first region is obtained after the position of the third region in the virtual scene is moved, and the third region is a region where a scene displayed in the first sub-interface before the movement is located.
14. A storage medium, comprising a stored program, wherein when the program runs, the method according to any one of claims 1 to 11 is performed.
15. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes, by means of the computer program, the method according to any one of claims 1 to 11.
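For the scene-map and sensitivity features recited in claims 9 to 11 above, the following hedged sketch shows one possible reading: a touch on the third sub-interface (the scene map) is mapped linearly to a first position in the virtual scene, the window around that position serves as the first region, and the first distance is the product of the slide distance and a configured sensitivity. All names and the linear mapping are illustrative assumptions only, not the claimed implementation.

```python
from collections import namedtuple

Rect = namedtuple("Rect", "x y w h")   # top-left corner plus size

def map_touch_to_first_region(touch_pos, map_rect, scene_w, scene_h,
                              region_w, region_h):
    """Claim 9, sketched: map a touch inside the third sub-interface (the scene
    map) to a first position in scene coordinates, then take the window around
    that position as the part of the virtual scene used as the first region."""
    tx, ty = touch_pos
    u = (tx - map_rect.x) / map_rect.w   # normalised position on the scene map
    v = (ty - map_rect.y) / map_rect.h
    sx, sy = u * scene_w, v * scene_h    # first position in the scene (linear mapping assumed)
    return Rect(sx - region_w / 2, sy - region_h / 2, region_w, region_h)

def first_distance(third_distance, sensitivity):
    """Claim 11, sketched: the first distance is the product of the slide
    distance carried by the event information and a configured sensitivity."""
    return third_distance * sensitivity

# Example: a touch near the centre of a 200x200 scene map drawn at (20, 20)
# selects a 640x360 window of a 4000x4000 virtual scene.
scene_map = Rect(20, 20, 200, 200)
print(map_touch_to_first_region((120, 120), scene_map, 4000, 4000, 640, 360))
print(first_distance(third_distance=80.0, sensitivity=1.5))   # -> 120.0
```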
CN201810040140.XA 2018-01-16 2018-01-16 Virtual scene display method and device, storage medium and electronic device Active CN108310768B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810040140.XA CN108310768B (en) 2018-01-16 2018-01-16 Virtual scene display method and device, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN108310768A (en) 2018-07-24
CN108310768B CN108310768B (en) 2020-04-07

Family

ID=62893519

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810040140.XA Active CN108310768B (en) 2018-01-16 2018-01-16 Virtual scene display method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN108310768B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1110585A2 (en) * 1999-12-14 2001-06-27 KCEO Inc. A video game apparatus, game image display control method, and readable storage medium
US20030153374A1 (en) * 2002-02-12 2003-08-14 Anell Gilmore Interactive video racing game
CN104765905A (en) * 2015-02-13 2015-07-08 上海同筑信息科技有限公司 (Building Information Modeling) BIM based plan graph and first view-angle split-screen synchronous display method and system
CN105208368A (en) * 2015-09-23 2015-12-30 北京奇虎科技有限公司 Method and device for displaying panoramic data
CN105760076A (en) * 2016-02-03 2016-07-13 网易(杭州)网络有限公司 Game control method and device
CN105808071A (en) * 2016-03-31 2016-07-27 联想(北京)有限公司 Display control method and device and electronic equipment

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109432775A (en) * 2018-11-09 2019-03-08 网易(杭州)网络有限公司 A kind of multi-screen display method and device of map
CN109432775B (en) * 2018-11-09 2022-05-17 网易(杭州)网络有限公司 Split screen display method and device of game map
WO2020143144A1 (en) * 2019-01-10 2020-07-16 网易(杭州)网络有限公司 Method and apparatus for controlling display during game, storage medium, processor, and terminal
US11383165B2 (en) 2019-01-10 2022-07-12 Netease (Hangzhou) Network Co., Ltd. In-game display control method and apparatus, storage medium, processor, and terminal
CN111913674A (en) * 2019-05-07 2020-11-10 广东虚拟现实科技有限公司 Virtual content display method, device, system, terminal equipment and storage medium
CN110860082A (en) * 2019-11-20 2020-03-06 网易(杭州)网络有限公司 Identification method, identification device, electronic equipment and storage medium
CN110860082B (en) * 2019-11-20 2023-04-07 网易(杭州)网络有限公司 Identification method, identification device, electronic equipment and storage medium
CN115981518A (en) * 2023-03-22 2023-04-18 北京同创蓝天云科技有限公司 VR (virtual reality) display user operation method and related equipment

Similar Documents

Publication Publication Date Title
US11806623B2 (en) Apparatus for adapting virtual gaming with real world information
CN108310768A (en) The display methods and device of virtual scene, storage medium, electronic device
CN109675307B (en) In-game display control method, device, storage medium, processor and terminal
CN107617213A (en) Information processing method and device, storage medium, electronic equipment
CN110075522A Control method, device and terminal for virtual weapons in a shooting game
CN107479699A Virtual reality interaction method, apparatus and system
CN108421255B (en) Scene image display method and device, storage medium and electronic device
CN110812838A (en) Method and device for controlling virtual unit in game and electronic equipment
CN108379839A Control response method, device and terminal
CN108579086A Object processing method, device, storage medium and electronic device
CN108159696A (en) Information processing method, device, electronic equipment and storage medium
CN111330268B (en) Control method and device of virtual prop, storage medium and electronic device
CN105980023A (en) Gaming management device, gaming system, program, and recording medium
CN108310771A Task execution method and apparatus, storage medium, and electronic device
JP2023552772A (en) Virtual item switching method, device, terminal and computer program
CN107913516A (en) Information processing method, device, electronic equipment and storage medium
CN108245892A (en) Information processing method, device, electronic equipment and storage medium
KR101404635B1 (en) Method for processing a drag input in online game
CN111318014A (en) Image display method and apparatus, storage medium, and electronic apparatus
CN110448903A Method, apparatus, processor and terminal for determining a control strategy in a game
CN108543308A Method and device for selecting virtual objects in a virtual scene
CN108744499A Object processing method, device, storage medium and electronic device
CN113680047B (en) Terminal operation method, device, electronic equipment and storage medium
CN116212386A (en) Method and device for picking up virtual article in game, electronic equipment and storage medium
CN115671724A (en) Processing method and processing device of virtual prop, computer equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant