WO2020255991A1 - Game program, game method, and information terminal device - Google Patents
- Publication number
- WO2020255991A1 (PCT/JP2020/023691)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- game
- user
- image
- range
- user terminal
Classifications
- G06F3/04886 — GUI interaction techniques using a touch-screen or digitiser, partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
- A63F13/426 — Processing input control signals by mapping them into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
- A63F13/2145 — Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
- A63F13/26 — Output arrangements having at least one additional display device, e.g. on the game controller or outside a game booth
- A63F13/35 — Details of game servers
- A63F13/42 — Processing input control signals by mapping them into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/52 — Controlling the output signals based on the game progress, involving aspects of the displayed game scene
- A63F13/5255 — Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
- A63F13/53 — Additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/533 — Additional visual information for prompting the player, e.g. by displaying a game menu
- A63F13/537 — Additional visual information using indicators, e.g. showing the condition of a game character on screen
- A63F13/54 — Controlling the output signals involving acoustic signals, e.g. for simulating engine sounds in a driving game
- A63F13/55 — Controlling game characters or game objects based on the game progress
- A63F13/69 — Generating or modifying game content by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
- A63F13/792 — Game management involving player-related data for payment purposes, e.g. monthly subscriptions
- A63F13/798 — Game management involving player-related data for assessing skills or for ranking players, e.g. for generating a hall of fame
- A63F13/837 — Shooting of targets
- A63F13/847 — Cooperative playing, e.g. requiring coordinated actions from several players to achieve a common goal
- A63F13/87 — Communicating with other players during game play, e.g. by e-mail or chat
- A63F13/92 — Video game devices specially adapted to be hand-held while playing
- G06F1/1694 — Constructional arrangements of portable computers with an integrated set of motion sensors for pointer control or gesture input
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012 — Head tracking input arrangements
- G06F3/0481 — GUI interaction based on specific properties of the displayed interaction object or a metaphor-based environment
- G06F3/04815 — Interaction with a 3D environment or interaction object, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/0488 — GUI interaction using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883 — GUI interaction using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- A63F2300/1068 — Input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
- A63F2300/1075 — Input arrangements detecting the point of contact using a touch screen
Definitions
- the present invention relates to game programs, game methods, and information terminal devices.
- Non-Patent Document 1 discloses a game in which a virtual pad is fixedly displayed on a screen and the posture and movement direction of a fighter object flying in a virtual space are controlled by operating the virtual pad.
- the present invention has been conceived in view of such circumstances, and an object of the present invention is to provide a game program, a game method, and an information terminal device capable of improving operability.
- a game program according to one aspect is executed by a computer including a processor, a memory, and a touch screen, and causes the processor to execute a step of displaying an operation target image at a predetermined first position on the touch screen, and a step of displaying a range image that allows a predetermined first range including the first position to be identified.
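The two displaying steps correspond to what is commonly implemented as a virtual pad. As a rough sketch only (not code from the publication; the function and parameter names `stick_position`, `center`, `touch`, and `radius` are assumptions, and the first range is assumed circular for illustration), the operation target image can be kept inside the first range by clamping the touch point to the range boundary:

```python
import math

def stick_position(center, touch, radius):
    """Return where to draw the operation target image.

    The touch point is used directly while it stays inside the first
    range (here assumed to be a circle of the given radius around the
    first position); otherwise it is clamped to the nearest point on
    the range boundary.
    """
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist <= radius:
        return touch
    scale = radius / dist  # shrink the offset onto the boundary circle
    return (center[0] + dx * scale, center[1] + dy * scale)
```

The direction and magnitude of the offset between the drawn position and the center would then be mapped to a game command, in the spirit of class A63F13/42.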
- FIG. 1. A diagram showing a specific example of moving image playback. A diagram showing another specific example of moving image playback.
- (A) is a diagram showing an example of a game image displayed on the user terminal,
- (B) is a diagram showing another example of the game image displayed on the user terminal,
- (C) is a diagram showing yet another example of the game image displayed on the user terminal,
- (D) is a diagram showing yet another example of the game image displayed on the user terminal, and
- (E) is a diagram showing yet another example of the game image displayed on the user terminal.
- (A) is a diagram showing an example of a game image displayed on the user terminal,
- (B) is a diagram showing another example of the game image displayed on the user terminal,
- (C) is a diagram showing yet another example of the game image displayed on the user terminal,
- (D) is a diagram showing yet another example of the game image displayed on the user terminal, and
- (E) is a diagram showing yet another example of the game image displayed on the user terminal.
- a flowchart showing an example of the flow of processing executed in the user terminal.
- the system according to the present disclosure is a system for providing a game to a plurality of users.
- the system will be described with reference to the drawings. The present invention is not limited to these examples; it is defined by the scope of the claims, and all modifications within the meaning and scope equivalent to the claims are intended to be included in the present invention. In the following description, the same elements are designated by the same reference numerals, and duplicate description will not be repeated.
- FIG. 1 is a diagram showing an outline of the system 1 according to the present embodiment.
- the system 1 includes a plurality of user terminals 100 (computers), a server 200, a game play terminal 300 (external device, second external device), and a distribution terminal 400 (external device, first external device).
- user terminals 100A to 100C, in other words, three user terminals 100, are shown as an example of the plurality of user terminals 100, but the number of user terminals 100 is not limited to the illustrated example. Further, in the present embodiment, when it is not necessary to distinguish the user terminals 100A to 100C, they are referred to as "user terminal 100".
- the user terminal 100, the game play terminal 300, and the distribution terminal 400 are connected to the server 200 via the network 2.
- the network 2 is composed of the Internet and various mobile communication systems constructed by wireless base stations (not shown). Examples of the mobile communication systems include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and wireless networks that can be connected to the Internet through a predetermined access point (for example, Wi-Fi (registered trademark)).
- the game provided by the system 1 may be any game in which a plurality of users participate, and is not limited to this example.
- (Game play terminal 300) The game play terminal 300 advances the game in response to input operations by the player. In addition, the game play terminal 300 sequentially distributes information generated by the player's game play (hereinafter, game progress information) to the server 200 in real time.
- the server 200 transmits the game progress information (second data) received in real time from the game play terminal 300 to the user terminal 100.
- the server 200 mediates the transmission and reception of various information between the user terminal 100, the game play terminal 300, and the distribution terminal 400.
- the distribution terminal 400 generates operation instruction data (first data) in response to an input operation by the user of the distribution terminal 400, and distributes the operation instruction data to the user terminal 100 via the server 200.
- the operation instruction data is data for playing back a moving image on the user terminal 100, and specifically, is data for operating a character appearing in the moving image.
- the user of the distribution terminal 400 is a player of this game.
- the moving image played on the user terminal 100 based on the operation instruction data is a moving image in which the character operated by the player in the game moves. Here, "moving" means moving at least a part of the character's body, and includes speech. Therefore, the operation instruction data according to the present embodiment includes, for example, voice data for causing the character to speak and motion data for moving the character's body.
- the operation instruction data is transmitted to the user terminal 100 after the end of this game.
- the details of the operation instruction data and the moving image played based on the operation instruction data will be described later.
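As an illustrative sketch only (the publication does not define a concrete data format; the class and field names below are assumptions), the operation instruction data described above could be modeled as a simple container holding the stated voice data and motion data:

```python
from dataclasses import dataclass, field

@dataclass
class OperationInstructionData:
    """Hypothetical container for the operation instruction data (first data)."""
    voice_data: bytes = b""                          # audio for the character's speech
    motion_data: list = field(default_factory=list)  # per-frame body poses (assumed)

    def moves_character(self) -> bool:
        # "Moving" covers both speech and body motion, per the passage above.
        return bool(self.voice_data) or bool(self.motion_data)
```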
- the user terminal 100 receives the game progress information in real time, and generates and displays the game screen using the information. In other words, the user terminal 100 reproduces the game screen of the game being played by the player by real-time rendering. As a result, the user of the user terminal 100 can visually recognize the same game screen as the game screen that the player is viewing while playing the game at substantially the same timing as the player.
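The real-time reproduction described above amounts to a receive-and-render loop. The following is a minimal sketch under assumed names (`progress_queue`, `render`, and the `None` end-of-stream sentinel are all illustrative; the actual rendering pipeline is not specified in the publication):

```python
import queue

def reproduce_game_screen(progress_queue, render):
    """Render each game progress message as it arrives (real-time rendering)."""
    while True:
        info = progress_queue.get()   # game progress information via server 200
        if info is None:              # assumed sentinel: distribution ended
            break
        render(info)                  # reproduce the player's game screen locally
```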
- the user terminal 100 generates information for supporting the progress of the game by the player in response to the input operation by the user, and transmits the information to the game play terminal 300 via the server 200. Details of the information will be described later.
- the user terminal 100 receives the operation instruction data from the distribution terminal 400, and generates and reproduces a moving image (video) using the operation instruction data. In other words, the user terminal 100 renders and reproduces the operation instruction data.
- FIG. 2 is a diagram showing a hardware configuration of the user terminal 100.
- FIG. 3 is a diagram showing a hardware configuration of the server 200.
- FIG. 4 is a diagram showing a hardware configuration of the game play terminal 300.
- FIG. 5 is a diagram showing a hardware configuration of the distribution terminal 400.
- the user terminal 100 is not limited to the smartphone.
- the user terminal 100 may be realized as a feature phone, a tablet computer, a laptop computer (so-called notebook PC), a desktop computer, or the like.
- the user terminal 100 may be a game device suitable for game play.
- the user terminal 100 includes a processor 10, a memory 11, a storage 12, a communication interface (IF) 13, an input/output IF 14, a touch screen 15 (display unit), a camera 17, and a distance measuring sensor 18. These components of the user terminal 100 are electrically connected to each other by a communication bus.
- the user terminal 100 may be provided with an input / output IF 14 to which a display (display unit) configured separately from the user terminal 100 main body can be connected instead of or in addition to the touch screen 15.
- the user terminal 100 may be configured to be communicable with one or more controllers 1020.
- the controller 1020 establishes communication with the user terminal 100 according to a communication standard such as Bluetooth (registered trademark).
- the controller 1020 may have one or more buttons or the like, and transmits an output value based on a user's input operation to the buttons or the like to the user terminal 100.
- the controller 1020 may have various sensors such as an acceleration sensor and an angular velocity sensor, and transmits the output values of the various sensors to the user terminal 100.
- the controller 1020 may have the camera 17 and the distance measuring sensor 18.
- the user terminal 100 may, for example at the start of a game, have the user of the controller 1020 input user identification information such as the user's name or login ID via the controller 1020.
- the user terminal 100 can thereby associate the controller 1020 with the user, and can identify, based on the source (controller 1020) of a received output value, which user that output value belongs to.
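The controller-to-user association described above can be sketched as follows. The class and method names (`ControllerRegistry`, `register`, `resolve`) are illustrative assumptions, not taken from the embodiment:

```python
class ControllerRegistry:
    """Maps each connected controller to the user who registered it.

    Mirrors the association described for the user terminal 100: the user
    enters a name or login ID at game start, and later output values are
    attributed to that user based on their source controller.
    """

    def __init__(self):
        self._owner_by_controller = {}

    def register(self, controller_id, user_id):
        # Called once at game start with the user identification information.
        self._owner_by_controller[controller_id] = user_id

    def resolve(self, controller_id, output_value):
        # Identify which user an output value belongs to from its source.
        user_id = self._owner_by_controller.get(controller_id)
        return user_id, output_value


registry = ControllerRegistry()
registry.register("ctrl-A", "alice")
registry.register("ctrl-B", "bob")
user, value = registry.resolve("ctrl-A", {"button": "jump"})
# user → "alice"
```

An output value from an unregistered controller resolves to no user, which a real terminal could treat as a prompt to register.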
- by having one user terminal 100 communicate with a plurality of controllers 1020, with each user holding his or her own controller 1020, multiplayer can be realized on that one user terminal 100 without communicating with other devices such as the server 200 via the network 2.
- local multiplayer with a plurality of user terminals 100 can also be realized by having the user terminals 100 communicate with each other according to a wireless standard such as a wireless LAN (Local Area Network) standard (that is, with a communication connection that does not go through the server 200).
- the user terminal 100 may further include at least some of the various functions of the server 200 described later.
- the plurality of user terminals 100 may be provided with the various functions of the server 200 described later in a distributed manner.
- the user terminal 100 may communicate with the server 200.
- information indicating a play result, such as a score or the outcome (win or loss) of a certain game, may be associated with user identification information and transmitted to the server 200.
- the controller 1020 may be configured to be detachable from the user terminal 100.
- a coupling portion with the controller 1020 may be provided on at least one surface of the housing of the user terminal 100.
- the user terminal 100 may accept the attachment of a storage medium 1030 such as an external memory card via the input / output IF14. As a result, the user terminal 100 can read the program and data recorded on the storage medium 1030.
- the program recorded on the storage medium 1030 is, for example, a game program.
- the user terminal 100 may store in the memory 11 a game program acquired by communicating with an external device such as the server 200, or may store in the memory 11 a game program acquired by reading from the storage medium 1030.
- the user terminal 100 includes a communication IF 13, an input / output IF 14, a touch screen 15, a camera 17, and a distance measuring sensor 18 as an example of a mechanism for inputting information to the user terminal 100.
- each of the above-mentioned parts serving as an input mechanism can be regarded as an operation unit configured to accept the user's input operations.
- when the operation unit is configured by at least one of the camera 17 and the distance measuring sensor 18, the operation unit detects an object 1010 in the vicinity of the user terminal 100 and identifies an input operation from the detection result.
- for example, the user's hand, a marker having a predetermined shape, or the like is detected as the object 1010, and an input operation is identified based on the color, shape, movement, or type of the object 1010 obtained as the detection result.
- more specifically, the user terminal 100 identifies and accepts a gesture (a series of movements of the user's hand) detected based on the captured image as the user's input operation.
- the captured image may be a still image or a moving image.
- the user terminal 100 identifies and accepts the user's operation performed on the input unit 151 of the touch screen 15 as the user's input operation.
- when the operation unit is composed of the communication IF 13, the user terminal 100 identifies and accepts a signal (for example, an output value) transmitted from the controller 1020 as the user's input operation.
- when the operation unit is composed of the input / output IF 14, a signal output from an input device (not shown) other than the controller 1020, connected to the input / output IF 14, is identified and accepted as the user's input operation.
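The operation-unit configurations described above (camera/distance sensor, touch screen, communication IF, input/output IF) all reduce to producing one kind of input operation. A minimal sketch, with hypothetical names, of normalizing the different sources:

```python
from dataclasses import dataclass


@dataclass
class InputOperation:
    source: str   # "camera", "touch", or "controller"
    payload: dict


def from_gesture(gesture_name):
    # Gesture identified from images captured by the camera 17 / distance sensor 18.
    return InputOperation("camera", {"gesture": gesture_name})


def from_touch(x, y, kind):
    # Operation performed on the input unit 151 of the touch screen 15
    # (kind: "touch", "slide", "swipe", "tap", ...).
    return InputOperation("touch", {"position": (x, y), "kind": kind})


def from_controller(output_value):
    # Signal (output value) received from the controller 1020 via the communication IF 13.
    return InputOperation("controller", {"value": output_value})


op = from_touch(120, 48, "tap")
# op.source → "touch"
```

Downstream game logic can then handle every `InputOperation` uniformly regardless of which input mechanism produced it.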
- the server 200 may be a general-purpose computer such as a workstation or a personal computer.
- the server 200 includes a processor 20, a memory 21, a storage 22, a communication IF 23, and an input / output IF 24. These configurations included in the server 200 are electrically connected to each other by a communication bus.
- Gameplay terminal 300 may be a general-purpose computer such as a personal computer.
- the game play terminal 300 includes a processor 30, a memory 31, a storage 32, a communication IF 33, and an input / output IF 34. These configurations included in the gameplay terminal 300 are electrically connected to each other by a communication bus.
- the game play terminal 300 is included in the HMD (Head Mounted Display) set 1000 as an example. In other words, it can be said that the HMD set 1000 is included in the system 1, and that the player plays the game using the HMD set 1000.
- the device for the player to play the game is not limited to the HMD set 1000.
- the device may be any device that allows the player to experience the game virtually.
- the device may be realized as a smartphone, a feature phone, a tablet computer, a laptop computer (a so-called notebook PC), a desktop computer, or the like. Further, the device may be a game device suitable for game play.
- the HMD set 1000 includes a game play terminal 300, an HMD 500, an HMD sensor 510, a motion sensor 520, a display 530, and a controller 540.
- the HMD 500 includes a monitor 51, a gaze sensor 52, a first camera 53, a second camera 54, a microphone 55, and a speaker 56.
- the controller 540 may include a motion sensor 520.
- the HMD 500 may be mounted on the player's head and provide the player with a virtual space during operation. More specifically, the HMD 500 displays an image for the right eye and an image for the left eye on the monitor 51. When each of the player's eyes visually recognizes its respective image, the player can perceive a three-dimensional image based on the parallax of both eyes.
- the HMD 500 may be either a so-called head-mounted display including a monitor, or a head-mounted device to which a smartphone or other terminal having a monitor can be attached.
- the monitor 51 is realized as, for example, a non-transparent display device.
- the monitor 51 is arranged on the main body of the HMD 500 so as to be located in front of both eyes of the player. Therefore, the player can immerse himself in the virtual space when he / she visually recognizes the three-dimensional image displayed on the monitor 51.
- the virtual space includes, for example, a background, player-operable objects, and player-selectable menu images.
- the monitor 51 can be realized as a liquid crystal monitor or an organic EL (Electro Luminescence) monitor included in a so-called smartphone or other information display terminal.
- the monitor 51 can be realized as a transmissive display device.
- the HMD 500 may be an open type such as a glasses type, not a closed type that covers the player's eyes as shown in FIG.
- the transmissive monitor 51 may be temporarily configured as a non-transparent display device by adjusting its transmittance.
- the monitor 51 may include a configuration that simultaneously displays a part of the image constituting the virtual space and the real space.
- the monitor 51 may display an image of the real space taken by the camera mounted on the HMD 500, or may make the real space visible by setting a part of the transmittance to be high.
- the monitor 51 may include a sub-monitor for displaying an image for the right eye and a sub-monitor for displaying an image for the left eye.
- the monitor 51 may be configured to display the image for the right eye and the image for the left eye as a unit.
- the monitor 51 includes a high-speed shutter. The high-speed shutter operates so that the image for the right eye and the image for the left eye are displayed alternately, such that each image is recognized by only the corresponding eye.
- the HMD 500 includes a plurality of light sources (not shown). Each light source is realized by, for example, an LED (Light Emitting Diode) that emits infrared rays.
- the HMD sensor 510 has a position tracking function for detecting the movement of the HMD 500. More specifically, the HMD sensor 510 reads a plurality of infrared rays emitted by the HMD 500 and detects the position and inclination of the HMD 500 in the real space.
- the HMD sensor 510 may be implemented by a camera.
- the HMD sensor 510 can detect the position and tilt of the HMD 500 by executing the image analysis process using the image information of the HMD 500 output from the camera.
- the HMD 500 may include a sensor (not shown) as a position detector in place of the HMD sensor 510 or in addition to the HMD sensor 510.
- the HMD 500 can use the sensor to detect the position and tilt of the HMD 500 itself.
- the sensor may be, for example, an angular velocity sensor, a geomagnetic sensor, or an acceleration sensor.
- the HMD 500 may use any of these sensors instead of the HMD sensor 510 to detect its position and tilt.
- the angular velocity sensor detects the angular velocity around the three axes of the HMD 500 in real space over time.
- the HMD 500 calculates the temporal change of the angle around the three axes of the HMD 500 based on each angular velocity, and further calculates the inclination of the HMD 500 based on the temporal change of the angle.
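The calculation described above, accumulating per-axis angular velocity over time to obtain the angle and hence the tilt, can be illustrated with a simple numerical integration. This is a sketch under the assumption of a fixed sampling interval, not the HMD 500's actual algorithm:

```python
def integrate_angles(samples, dt):
    """Accumulate angles around three axes from angular-velocity samples.

    samples: sequence of (wx, wy, wz) angular velocities in degrees/second.
    dt: sampling interval in seconds.
    Returns the accumulated (ax, ay, az) angles in degrees, from which the
    tilt can then be derived.
    """
    ax = ay = az = 0.0
    for wx, wy, wz in samples:
        ax += wx * dt
        ay += wy * dt
        az += wz * dt
    return (ax, ay, az)


# Two samples 0.5 s apart: rotation about the x axis, then about x and y.
tilt_angles = integrate_angles([(10.0, 0.0, 0.0), (10.0, 4.0, 0.0)], dt=0.5)
# tilt_angles → (10.0, 2.0, 0.0)
```

A production implementation would additionally correct for sensor drift, for example by fusing the gyroscope output with the geomagnetic or acceleration sensors mentioned above.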
- the gaze sensor 52 detects the direction in which the line of sight of the player's right eye and left eye is directed. That is, the gaze sensor 52 detects the line of sight of the player.
- the detection of the direction of the line of sight is realized by, for example, a known eye tracking function.
- the gaze sensor 52 is realized by a sensor having the eye tracking function.
- the gaze sensor 52 preferably includes a sensor for the right eye and a sensor for the left eye.
- the gaze sensor 52 may be, for example, a sensor that irradiates infrared light to the right eye and left eye of the player and detects the angle of rotation of each eyeball by receiving the reflected light from the cornea and the iris with respect to the irradiation light.
- the gaze sensor 52 can detect the line of sight of the player based on each of the detected rotation angles.
- the first camera 53 photographs the lower part of the player's face. More specifically, the first camera 53 captures the nose and mouth of the player.
- the second camera 54 photographs the eyes and eyebrows of the player.
- the housing on the player side of the HMD 500 is defined as the inside of the HMD 500, and the housing on the side opposite the player is defined as the outside of the HMD 500.
- the first camera 53 may be located outside the HMD 500, and the second camera 54 may be located inside the HMD 500.
- the images generated by the first camera 53 and the second camera 54 are input to the game play terminal 300.
- the first camera 53 and the second camera 54 may be realized as one camera, and the player's face may be photographed by this one camera.
- the microphone 55 converts the player's utterance into a voice signal (electric signal) and outputs it to the game play terminal 300.
- the speaker 56 converts the voice signal into voice and outputs it to the player.
- the HMD 500 may include earphones instead of the speaker 56.
- the controller 540 is connected to the game play terminal 300 by wire or wirelessly.
- the controller 540 receives an input of a command from the player to the game play terminal 300.
- the controller 540 is configured to be grippable by the player.
- the controller 540 is configured to be wearable on a player's body or part of clothing.
- the controller 540 may be configured to output at least one of vibration, sound, and light based on a signal transmitted from the gameplay terminal 300.
- the controller 540 receives from the player an operation for controlling the position and movement of an object arranged in the virtual space.
- the controller 540 includes a plurality of light sources. Each light source is realized by, for example, an LED that emits infrared rays.
- the HMD sensor 510 has a position tracking function. In this case, the HMD sensor 510 reads a plurality of infrared rays emitted by the controller 540 and detects the position and tilt of the controller 540 in the real space.
- the HMD sensor 510 may be implemented by a camera. In this case, the HMD sensor 510 can detect the position and tilt of the controller 540 by executing the image analysis process using the image information of the controller 540 output from the camera.
- in a certain aspect, the motion sensor 520 is attached to the player's hand and detects the movement of the player's hand. For example, the motion sensor 520 detects the rotation speed, the number of rotations, and the like of the hand. The detected signal is sent to the game play terminal 300.
- the motion sensor 520 is provided in the controller 540, for example.
- the motion sensor 520 is provided in, for example, a controller 540 configured to be grippable by the player.
- in that case, the controller 540 is preferably of a type that does not easily fly off when worn on the player's hand, such as a glove type.
- a sensor not attached to the player may detect the movement of the player's hand.
- the signal of the camera that shoots the player may be input to the game play terminal 300 as a signal indicating the operation of the player.
- the motion sensor 520 and the game play terminal 300 are wirelessly connected to each other.
- the communication mode is not particularly limited, and for example, Bluetooth or other known communication method is used.
- the display 530 displays an image similar to the image displayed on the monitor 51. As a result, users other than the player wearing the HMD 500 can view the same image as the player.
- the image displayed on the display 530 does not have to be a three-dimensional image, and may be an image for the right eye or an image for the left eye. Examples of the display 530 include a liquid crystal display and an organic EL monitor.
- the game play terminal 300 operates a character to be operated by the player based on various information acquired from each part of the HMD 500, the controller 540, and the motion sensor 520, and advances the game.
- the "movements" here include moving parts of the body, changing posture, changing facial expressions, moving, speaking, touching and moving objects placed in the virtual space, and grasping and using weapons, tools, and the like held by the character. That is, in this game, when the player moves each part of his or her body, the character moves each part of its body in the same manner. Further, in this game, the character utters the content uttered by the player. In other words, in this game, the character is an avatar object that behaves as the player's alter ego. As an example, at least some of the character's movements may be performed by the player's input to the controller 540.
- the motion sensor 520 is attached to, for example, both hands of the player, both feet of the player, the waist of the player, and the head of the player.
- the motion sensors 520 attached to both hands of the player may be provided in the controller 540 as described above.
- the motion sensor 520 attached to the player's head may be provided in the HMD 500.
- the motion sensor 520 may also be attached to the player's elbows and knees. By increasing the number of motion sensors 520 attached to the player, the player's movements can be reflected in the character more accurately.
- the player may wear a suit to which one or more motion sensors 520 are attached. That is, the method of motion capture is not limited to the example using the motion sensor 520.
- the distribution terminal 400 may be a mobile terminal such as a smartphone, a PDA (Personal Digital Assistant), or a tablet computer. Further, the distribution terminal 400 may be a so-called stationary terminal such as a desktop personal computer.
- the distribution terminal 400 includes a processor 40, a memory 41, a storage 42, a communication IF 43, an input / output IF 44, and a touch screen 45.
- the distribution terminal 400 may be provided with an input / output IF 44 to which a display (display unit) configured separately from the distribution terminal 400 main body can be connected in place of or in addition to the touch screen 45.
- Controller 1021 may have one or more physical input mechanisms such as buttons, levers, sticks, wheels and the like.
- the controller 1021 transmits an output value based on an input operation input to the input mechanism by the operator of the distribution terminal 400 (player in the present embodiment) to the distribution terminal 400.
- the controller 1021 may have various sensors such as an acceleration sensor and an angular velocity sensor, and may transmit the output values of the various sensors to the distribution terminal 400.
- the above-mentioned output value is accepted by the distribution terminal 400 via the communication IF 43.
- the distribution terminal 400 may include a camera and a distance measuring sensor (both not shown).
- the controller 1021 may have a camera and a distance measuring sensor.
- the distribution terminal 400 includes a communication IF 43, an input / output IF 44, and a touch screen 45 as an example of a mechanism for inputting information to the distribution terminal 400.
- Each of the above-mentioned parts serving as an input mechanism can be regarded as an operation unit configured to accept the user's input operations.
- the distribution terminal 400 identifies and accepts the user's operation performed on the input unit 451 of the touch screen 45 as the user's input operation.
- the distribution terminal 400 identifies and accepts a signal (for example, an output value) transmitted from the controller 1021 as an input operation of the user.
- the distribution terminal 400 identifies and accepts a signal output from an input device (not shown) connected to the input / output IF 44 as the user's input operation.
- the processors 10, 20, 30, and 40 control the overall operation of the user terminal 100, the server 200, the game play terminal 300, and the distribution terminal 400, respectively.
- Processors 10, 20, 30, and 40 include a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit).
- Processors 10, 20, 30, and 40 read programs from storages 12, 22, 32, and 42, which will be described later, respectively. Then, the processors 10, 20, 30, and 40 expand the read programs into the memories 11, 21, 31, and 41, which will be described later, respectively.
- Processors 10, 20, 30, and 40 execute the expanded programs.
- the memories 11, 21, 31, and 41 are main storage devices.
- the memories 11, 21, 31, and 41 are composed of storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
- the memory 11 provides a work area to the processor 10 by temporarily storing a program and various data read from the storage 12 described later by the processor 10.
- the memory 11 also temporarily stores various data generated while the processor 10 is operating according to the program.
- the memory 21 provides a work area to the processor 20 by temporarily storing various programs and data read from the storage 22 described later by the processor 20.
- the memory 21 also temporarily stores various data generated while the processor 20 is operating according to the program.
- the memory 31 provides a work area to the processor 30 by temporarily storing various programs and data read from the storage 32 described later by the processor 30.
- the memory 31 also temporarily stores various data generated while the processor 30 is operating according to the program.
- the memory 41 provides a work area to the processor 40 by temporarily storing the program and various data read from the storage 42 described later by the processor 40.
- the memory 41 also temporarily stores various data generated while the processor 40 is operating according to the program.
- the programs executed by the processors 10 and 30 may be the game programs of the present game.
- the program executed by the processor 40 may be a distribution program for realizing distribution of operation instruction data.
- the processor 10 may further execute a viewing program for realizing the reproduction of the moving image.
- the program executed by the processor 20 may be at least one of the above-mentioned game program, distribution program, and viewing program.
- the processor 20 executes at least one of a game program, a distribution program, and a viewing program in response to a request from at least one of the user terminal 100, the game play terminal 300, and the distribution terminal 400.
- the distribution program and the viewing program may be executed in parallel.
- the game program may be a program that realizes the game in cooperation with the user terminal 100, the server 200, and the game play terminal 300.
- the distribution program may be a program that realizes distribution of operation instruction data in collaboration with the server 200 and the distribution terminal 400.
- the viewing program may be a program that realizes the reproduction of a moving image in collaboration with the user terminal 100 and the server 200.
- the storages 12, 22, 32, and 42 are auxiliary storage devices.
- the storages 12, 22, 32, and 42 are composed of a storage device such as a flash memory or an HDD (Hard Disk Drive).
- various data related to the game are stored in the storages 12 and 32.
- Various data related to the distribution of operation instruction data are stored in the storage 42.
- various data related to the reproduction of the moving image are stored in the storage 12.
- the storage 22 may store at least a part of various data related to each of the game, the distribution of the operation instruction data, and the reproduction of the moving image.
- the communication IFs 13, 23, 33, and 43 control the transmission and reception of various data in the user terminal 100, the server 200, the game play terminal 300, and the distribution terminal 400, respectively.
- the communication IFs 13, 23, 33, and 43 control, for example, communication via a wireless LAN (Local Area Network), Internet communication via a wired LAN, a wireless LAN, or a mobile phone network, and communication using short-range wireless communication or the like.
- the input / output IFs 14, 24, 34, and 44 are interfaces for the user terminal 100, the server 200, the game play terminal 300, and the distribution terminal 400 to receive data input and to output data, respectively.
- the input / output IFs 14, 24, 34, and 44 may input / output data via USB (Universal Serial Bus) or the like.
- Input / output IFs 14, 24, 34, 44 may include physical buttons, cameras, microphones, speakers, mice, keyboards, displays, sticks, levers and the like. Further, the input / output IFs 14, 24, 34, and 44 may include a connection portion for transmitting and receiving data to and from a peripheral device.
- the touch screen 15 is an electronic component that combines an input unit 151 and a display unit 152 (display).
- the touch screen 45 is an electronic component that combines an input unit 451 and a display unit 452.
- the input units 151 and 451 are, for example, touch-sensitive devices, and are configured by, for example, a touch pad.
- the display units 152 and 452 are composed of, for example, a liquid crystal display, an organic EL (Electro-Luminescence) display, or the like.
- the input units 151 and 451 have a function of detecting the position on the input surface at which the user's operation (mainly a physical contact operation such as a touch operation, slide operation, swipe operation, or tap operation) is input, and of transmitting information indicating that position as an input signal.
- the input units 151 and 451 may include touch sensing units (not shown).
- the touch sensing unit may adopt any method such as a capacitance method or a resistance film method.
- the user terminal 100 and the distribution terminal 400 may each include one or more sensors for specifying the holding posture of the user terminal 100 and the distribution terminal 400, respectively.
- This sensor may be, for example, an acceleration sensor, an angular velocity sensor, or the like.
- in that case, the processors 10 and 40 can each identify the holding posture of the user terminal 100 or the distribution terminal 400 from the output of the sensor, and perform processing according to the holding posture.
- for example, when the user terminal 100 or the distribution terminal 400 is held vertically, the processors 10 and 40 may display a vertically long image on the display units 152 and 452 (vertical screen display), and when the terminal is held horizontally, may display a horizontally long image (horizontal screen display).
- in this way, the processors 10 and 40 may be able to switch between the vertical screen display and the horizontal screen display according to the holding posture of the user terminal 100 or the distribution terminal 400.
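The posture-dependent display switching could, for example, read the acceleration sensor and compare the gravity components along the device axes. This is a sketch under the assumption that only the static gravity vector is used:

```python
def choose_orientation(accel_x, accel_y):
    """Pick the screen orientation from acceleration-sensor output.

    accel_x, accel_y: acceleration along the device's short and long axes
    (m/s^2). When gravity acts mainly along the long (y) axis, the terminal
    is held vertically, so a vertically long image is displayed.
    """
    return "vertical" if abs(accel_y) >= abs(accel_x) else "horizontal"


# Held upright: gravity mostly along the long axis.
upright = choose_orientation(0.3, 9.8)     # "vertical"
# Held sideways: gravity mostly along the short axis.
sideways = choose_orientation(9.8, 0.3)    # "horizontal"
```

A real terminal would also debounce the switch (e.g., require the posture to persist for a moment) so the display does not flicker near the 45-degree boundary.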
- FIG. 6 is a block diagram showing a functional configuration of a user terminal 100, a server 200, and an HMD set 1000 included in the system 1.
- FIG. 7 is a block diagram showing a functional configuration of the distribution terminal 400 shown in FIG.
- the user terminal 100 has a function as an input device that accepts a user's input operation and a function as an output device that outputs a game image or sound.
- the user terminal 100 functions as a control unit 110 and a storage unit 120 in cooperation with a processor 10, a memory 11, a storage 12, a communication IF 13, an input / output IF 14, a touch screen 15, and the like.
- the server 200 has a function of mediating the transmission / reception of various information between the user terminal 100, the HMD set 1000, and the distribution terminal 400.
- the server 200 functions as a control unit 210 and a storage unit 220 in cooperation with the processor 20, the memory 21, the storage 22, the communication IF23, the input / output IF24, and the like.
- the HMD set 1000 (game play terminal 300) has a function as an input device for receiving the player's input operations, a function as an output device for outputting game images and sounds, and a function of transmitting game progress information to the user terminal 100 in real time via the server 200.
- the HMD set 1000 functions as a control unit 310 and a storage unit 320 in cooperation with the processor 30, the memory 31, the storage 32, the communication IF 33, and the input / output IF 34 of the game play terminal 300, as well as the HMD 500, the HMD sensor 510, the motion sensor 520, the controller 540, and the like.
- the distribution terminal 400 has a function of generating operation instruction data and transmitting the operation instruction data to the user terminal 100 via the server 200.
- the distribution terminal 400 functions as a control unit 410 and a storage unit 420 in cooperation with the processor 40, the memory 41, the storage 42, the communication IF43, the input / output IF44, the touch screen 45, and the like.
- the storage unit 120 stores the game program 131 (program), the game information 132, and the user information 133.
- the storage unit 220 stores the game program 231, the game information 232, the user information 233, and the user list 234.
- the storage unit 320 stores the game program 331, the game information 332, and the user information 333.
- the storage unit 420 stores the user list 421, the motion list 422, and the distribution program 423 (program, second program).
- the game programs 131, 231 and 331 are game programs executed by the user terminal 100, the server 200, and the HMD set 1000, respectively. This game is realized by the cooperative operation of each device based on the game programs 131, 231 and 331.
- the game programs 131 and 331 may be stored in the storage unit 220 and downloaded to the user terminal 100 and the HMD set 1000, respectively.
- the user terminal 100 renders the data received from the distribution terminal 400 based on the game program 131, and reproduces the moving image.
- the game program 131 is also a program for playing back a moving image using the moving image instruction data distributed from the distribution terminal 400.
- the program for playing the moving image may be different from the game program 131.
- the storage unit 120 stores a program for playing the moving image separately from the game program 131.
- the game information 132, 232, and 332 are data that the user terminal 100, the server 200, and the HMD set 1000 refer to when executing the game program, respectively.
- the user information 133, 233, and 333 are data related to the user account of the user terminal 100.
- the game information 232 includes the game information 132 of each user terminal 100 and the game information 332 of the HMD set 1000.
- the user information 233 includes the user information 133 of each user terminal 100 and the user information of the player included in the user information 333.
- the user information 333 includes the user information 133 of each user terminal 100 and the user information of the player.
- the user list 234 and the user list 421 are a list of users who have participated in the game.
- the user list 234 and the user list 421 may include a list of the users who participated in the player's most recent game play, as well as lists of the users who participated in each game play before that.
- the motion list 422 is a list of a plurality of motion data created in advance.
- the motion list 422 is, for example, a list in which motion data is associated with each of the information (for example, a motion name) that identifies each motion.
- the distribution program 423 is a program for realizing distribution of operation instruction data for playing a moving image on the user terminal 100 to the user terminal 100.
- the control unit 210 comprehensively controls the server 200 by executing the game program 231 stored in the storage unit 220.
- the control unit 210 mediates the transmission and reception of various information between the user terminal 100, the HMD set 1000, and the distribution terminal 400.
- the control unit 210 functions as a communication mediation unit 211, a log generation unit 212, and a list generation unit 213 according to the description of the game program 231.
- the control unit 210 can also function as another functional block (not shown) for mediating the transmission and reception of various information related to game play and distribution of operation instruction data, and for supporting the progress of the game.
- the communication mediation unit 211 mediates the transmission and reception of various information between the user terminal 100, the HMD set 1000, and the distribution terminal 400. For example, the communication mediation unit 211 transmits the game progress information received from the HMD set 1000 to the user terminal 100.
- the game progress information includes data indicating the movement of the character operated by the player, the parameters of the character, information on items and weapons possessed by the character, enemy characters, and the like.
- the server 200 transmits the game progress information to the user terminals 100 of all the users participating in the game. In other words, the server 200 transmits common game progress information to the user terminals 100 of all users participating in the game. As a result, the game progresses in the same manner as the HMD set 1000 on each of the user terminals 100 of all the users participating in the game.
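This broadcast of common game progress information can be sketched as follows; the class and field names are hypothetical assumptions, not part of the embodiment:

```python
# Sketch of the server 200 transmitting the same game progress
# information to the user terminals 100 of all participating users,
# so that the game progresses identically on each terminal.
class UserTerminal:
    def __init__(self, user_id):
        self.user_id = user_id
        self.received = []  # game progress information received so far

    def on_game_progress(self, info):
        self.received.append(info)

def broadcast_game_progress(terminals, progress_info):
    """Send the common game progress information to every terminal."""
    for terminal in terminals:
        terminal.on_game_progress(progress_info)

terminals = [UserTerminal(u) for u in ("user_a", "user_b", "user_c")]
broadcast_game_progress(terminals, {"character_move": (1.0, 0.0), "hp": 95})
```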
- the communication mediation unit 211 transmits the information received from any one of the user terminals 100 to support the progress of the game by the player to the other user terminals 100 and the HMD set 1000.
- the information may be item information indicating an item provided to the player (character), that is, an item that helps the player advance the game advantageously.
- the item information includes information (user name, user ID, etc.) indicating the user who provided the item.
- the communication mediation unit 211 may mediate the distribution of the operation instruction data from the distribution terminal 400 to the user terminal 100.
- the log generation unit 212 generates a game progress log based on the game progress information received from the HMD set 1000.
- the list generation unit 213 generates the user list 234 after the end of game play. Although the details will be described later, each user in the user list 234 is associated with a tag indicating the content of the support provided to the player by the user.
- the list generation unit 213 generates a tag based on the game progress log generated by the log generation unit 212, and associates it with the corresponding user.
- the list generation unit 213 may associate, as a tag, the content of the support provided by each user to the player, which is input by the game operator or the like using a terminal device such as a personal computer, with the corresponding user.
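One possible sketch of how the list generation unit 213 derives tags from the game progress log; the log format and tag wording are assumptions for illustration:

```python
# Hypothetical game progress log generated by the log generation unit 212.
game_progress_log = [
    {"user": "alice", "action": "item_provided", "item": "first_aid_kit"},
    {"user": "bob",   "action": "item_provided", "item": "magazine"},
    {"user": "alice", "action": "comment_sent"},
]

def generate_user_list(log):
    """Sketch of the list generation unit 213: associate each user with
    tags indicating the support the user provided to the player."""
    user_list = {}
    for entry in log:
        tags = user_list.setdefault(entry["user"], [])
        if entry["action"] == "item_provided":
            tags.append(f"provided {entry['item']}")
    return user_list

user_list_234 = generate_user_list(game_progress_log)
```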
- the user terminal 100 transmits information indicating the user to the server 200 based on the user's operation. For example, the user terminal 100 transmits the user ID input by the user to the server 200. That is, the server 200 holds information indicating each user for all the users participating in the game.
- the list generation unit 213 may generate the user list 234 using the information.
- the control unit 310 comprehensively controls the HMD set 1000 by executing the game program 331 stored in the storage unit 320. For example, the control unit 310 advances the game according to the game program 331 and the operation of the player. In addition, the control unit 310 communicates with the server 200 to send and receive information as needed while the game is in progress. The control unit 310 may send and receive information directly to and from the user terminal 100 without going through the server 200.
- the control unit 310 functions as an operation reception unit 311, a display control unit 312, a UI control unit 313, an animation generation unit 314, a game progress unit 315, a virtual space control unit 316, and a reaction processing unit 317 according to the description of the game program 331.
- the control unit 310 can also function as other functional blocks (not shown) for controlling characters appearing in the game, depending on the nature of the game to be executed.
- the operation reception unit 311 detects and accepts the input operation of the player.
- the operation reception unit 311 receives signals input from the HMD 500, the motion sensor 520, the controller 540, etc., determines what kind of input operation has been performed, and outputs the result to each element of the control unit 310.
- the UI control unit 313 controls a user interface (hereinafter, UI) image to be displayed on the monitor 51, the display 530, and the like.
- the UI image is a tool for the player to give the HMD set 1000 input necessary for the progress of the game, or to obtain information output by the HMD set 1000 during the progress of the game.
- UI images include, but are not limited to, icons, buttons, lists, and menu screens.
- the animation generation unit 314 generates animations showing the motions of various objects based on the control modes of the various objects. For example, the animation generation unit 314 may generate an animation that expresses how an object (for example, the player's avatar object) moves as if it were actually there, moves its mouth, or changes its facial expression.
- the game progress unit 315 advances the game based on the game program 331, the input operation by the player, the operation of the avatar object in response to the input operation, and the like. For example, the game progress unit 315 performs a predetermined game process when the avatar object performs a predetermined operation. Further, for example, the game progress unit 315 may receive information representing a user's operation on the user terminal 100 and perform game processing based on the user's operation. In addition, the game progress unit 315 generates game progress information according to the progress of the game and transmits it to the server 200. The game progress information is transmitted to the user terminal 100 via the server 200. As a result, the progress of the game in the HMD set 1000 is shared in the user terminal 100. In other words, the progress of the game in the HMD set 1000 and the progress of the game in the user terminal 100 are synchronized.
- the virtual space control unit 316 performs various controls related to the virtual space provided to the player according to the progress of the game. As an example, the virtual space control unit 316 creates various objects and arranges them in the virtual space. Further, the virtual space control unit 316 arranges the virtual camera in the virtual space. In addition, the virtual space control unit 316 operates various objects arranged in the virtual space according to the progress of the game. Further, the virtual space control unit 316 controls the position and inclination of the virtual camera arranged in the virtual space according to the progress of the game.
- the display control unit 312 outputs a game screen on which the processing results executed by each of the above elements are reflected to the monitor 51 and the display 530.
- the display control unit 312 may display an image based on the field of view from the virtual camera arranged in the virtual space on the monitor 51 and the display 530 as a game screen. Further, the display control unit 312 may include the animation generated by the animation generation unit 314 in the game screen. Further, the display control unit 312 may superimpose and draw the above-mentioned UI image controlled by the UI control unit 313 on the game screen.
- the reaction processing unit 317 receives feedback on the reaction of the user of the user terminal 100 to the game play of the player, and outputs this to the player.
- the user terminal 100 can create a comment (message) addressed to the avatar object based on the input operation of the user.
- the reaction processing unit 317 receives the comment data of the comment and outputs it.
- the reaction processing unit 317 may display the text data corresponding to the user's comment on the monitor 51 and the display 530, or may output the voice data corresponding to the user's comment from a speaker (not shown). In the former case, the reaction processing unit 317 may superimpose and draw an image corresponding to the text data (that is, an image including the content of the comment) on the game screen.
- the control unit 110 comprehensively controls the user terminal 100 by executing the game program 131 stored in the storage unit 120. For example, the control unit 110 advances the game according to the game program 131 and the user's operation. In addition, the control unit 110 communicates with the server 200 to send and receive information as needed while the game is in progress. The control unit 110 may send and receive information directly to and from the HMD set 1000 without going through the server 200.
- the control unit 110 functions as an operation reception unit 111, a display control unit 112, a UI control unit 113, an animation generation unit 114, a game progress unit 115, a virtual space control unit 116, and a video playback unit 117 according to the description of the game program 131.
- the control unit 110 can also function as other functional blocks (not shown) for the progress of the game, depending on the nature of the game being executed.
- the operation reception unit 111 detects and accepts a user's input operation with respect to the input unit 151.
- the operation reception unit 111 determines what kind of input operation has been performed from the action exerted by the user on the console via the touch screen 15 and other input / output IF 14, and outputs the result to each element of the control unit 110.
- the operation receiving unit 111 receives an input operation for the input unit 151, detects the coordinates of the input position of the input operation, and specifies the type of the input operation.
- the operation reception unit 111 specifies, for example, a touch operation, a slide operation, a swipe operation, a tap operation, and the like as types of input operations. Further, the operation reception unit 111 detects that the contact input is released from the touch screen 15 when the continuously detected input is interrupted.
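The type determination can be sketched with a simple classifier over the touch-down and touch-up coordinates and the contact duration; the distance and speed thresholds below are illustrative assumptions, not values from the embodiment:

```python
# Sketch of input-operation type detection by the operation reception
# unit 111: decide whether a contact was a tap, a slide, or a swipe.
def classify_input(x0, y0, x1, y1, duration_s,
                   move_threshold=10.0, speed_threshold=500.0):
    """Classify one contact from start (x0, y0) to end (x1, y1).
    Thresholds (pixels, pixels/second) are illustrative assumptions."""
    distance = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if distance < move_threshold:
        return "tap"            # barely moved: a tap operation
    if duration_s > 0 and distance / duration_s >= speed_threshold:
        return "swipe"          # fast movement: a swipe operation
    return "slide"              # slower continuous movement: a slide
```

Release of the contact input (the continuously detected input being interrupted) would be signaled by the touch screen driver before this classification runs.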
- the UI control unit 113 controls a UI image to be displayed on the display unit 152 in order to construct a UI according to at least one of a user's input operation and received game progress information.
- the UI image is a tool for the user to input necessary input for the progress of the game to the user terminal 100, or a tool for obtaining information output from the user terminal 100 during the progress of the game.
- UI images include, but are not limited to, icons, buttons, lists, and menu screens.
- the animation generation unit 114 generates animations showing the motions of various objects based on the control modes of the various objects.
- the game progress unit 115 advances the game based on the game program 131, the received game progress information, the input operation by the user, and the like.
- the game progress unit 115 transmits information about the game process to the HMD set 1000 via the server 200.
- the predetermined game processing is shared in the HMD set 1000.
- the predetermined game process is, for example, a process of providing an item to the avatar object, and in this example, the information related to the game process is the item information described above.
- the virtual space control unit 116 performs various controls related to the virtual space provided to the user according to the progress of the game.
- the virtual space control unit 116 creates various objects and arranges them in the virtual space.
- the virtual space control unit 116 arranges the virtual camera in the virtual space.
- the virtual space control unit 116 operates various objects arranged in the virtual space according to the progress of the game, specifically, the received game progress information.
- the virtual space control unit 116 controls the position and inclination of the virtual camera arranged in the virtual space according to the progress of the game, specifically, the received game progress information.
- the display control unit 112 outputs to the display unit 152 a game screen in which the processing results executed by each of the above elements are reflected.
- the display control unit 112 may display an image based on the field of view from the virtual camera arranged in the virtual space provided to the user on the display unit 152 as a game screen. Further, the display control unit 112 may include the animation generated by the animation generation unit 114 in the game screen. Further, the display control unit 112 may superimpose and draw the above-mentioned UI image controlled by the UI control unit 113 on the game screen.
- the game screen displayed on the display unit 152 is the same game screen as the game screen displayed on the other user terminal 100 and the HMD set 1000.
- the video playback unit 117 analyzes (renders) the operation instruction data received from the distribution terminal 400, and plays back the moving image.
- the control unit 410 comprehensively controls the distribution terminal 400 by executing a program (not shown) stored in the storage unit 420. For example, the control unit 410 generates operation instruction data and distributes it to the user terminal 100 according to the program and the operation of the user of the distribution terminal 400 (the player in this embodiment). In addition, the control unit 410 communicates with the server 200 to send and receive information as needed. The control unit 410 may send and receive information directly to and from the user terminal 100 without going through the server 200.
- the control unit 410 functions as a communication control unit 411, a display control unit 412, an operation reception unit 413, a voice reception unit 414, a motion identification unit 415, and an operation instruction data generation unit 416 according to the description of the program.
- the control unit 410 can also function as other functional blocks (not shown) for the generation and distribution of operation instruction data.
- the communication control unit 411 controls transmission / reception of information to / from the server 200 or the user terminal 100 via the server 200.
- the communication control unit 411 receives the user list 421 from the server 200 as an example. Further, the communication control unit 411 transmits the operation instruction data to the user terminal 100 as an example.
- the display control unit 412 outputs various screens reflecting the processing results executed by each element to the display unit 452. As an example, the display control unit 412 displays a screen including the received user list 234. Further, as an example, the display control unit 412 displays a screen including the motion list 422 for causing the player to select motion data for operating the avatar object, to be included in the operation instruction data to be distributed.
- the operation reception unit 413 detects and accepts a player's input operation with respect to the input unit 451.
- the operation reception unit 413 determines what kind of input operation has been performed from the action exerted by the player on the console via the touch screen 45 and other input / output IF 44, and outputs the result to each element of the control unit 410.
- the operation reception unit 413 receives an input operation for the input unit 451, detects the coordinates of the input position of the input operation, and specifies the type of the input operation.
- the operation reception unit 413 specifies, for example, a touch operation, a slide operation, a swipe operation, a tap operation, and the like as the types of input operations. Further, the operation reception unit 413 detects that the contact input is released from the touch screen 45 when the continuously detected input is interrupted.
- the voice reception unit 414 receives the voice generated around the distribution terminal 400 and generates voice data of the voice.
- the voice receiving unit 414 receives the voice spoken by the player and generates voice data of the voice.
- the motion specifying unit 415 specifies the motion data selected by the player from the motion list 422 according to the input operation of the player.
- the operation instruction data generation unit 416 generates operation instruction data.
- the operation instruction data generation unit 416 generates operation instruction data including the generated voice data and the specified motion data.
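A minimal sketch of the operation instruction data assembled by the operation instruction data generation unit 416 from the voice data (voice reception unit 414) and the specified motion data (motion specifying unit 415); the structure and field names are assumptions:

```python
from dataclasses import dataclass, field

# Hypothetical structure of operation instruction data: voice data of
# the player's speech combined with motion data selected from the
# motion list 422, addressed to one or more user terminals 100.
@dataclass
class OperationInstructionData:
    voice_data: bytes
    motion_data: dict
    destination_user_ids: list = field(default_factory=list)

def generate_operation_instruction_data(voice_data, motion_data, destinations):
    """Sketch of the operation instruction data generation unit 416."""
    return OperationInstructionData(voice_data, motion_data, list(destinations))

data = generate_operation_instruction_data(
    b"\x00\x01", {"name": "wave_hand"}, ["user_a"])
```

The user terminal 100 would later render a moving image of the avatar object from exactly this pair of voice and motion data.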
- the functions of the HMD set 1000, the server 200, and the user terminal 100 shown in FIG. 6 and the functions of the distribution terminal 400 shown in FIG. 7 are merely examples.
- Each device of the HMD set 1000, the server 200, the user terminal 100, and the distribution terminal 400 may include at least a part of the functions provided by the other devices.
- another device other than the HMD set 1000, the server 200, the user terminal 100, and the distribution terminal 400 may be a component of the system 1, and the other device may be made to execute a part of the processing in the system 1.
- the computer that executes the game program in the present embodiment may be any of the HMD set 1000, the server 200, the user terminal 100, the distribution terminal 400, and other devices, or the game program may be realized by a combination of a plurality of these devices.
- FIG. 8 is a flowchart showing an example of a flow of control processing of the virtual space provided to the player and the virtual space provided to the user of the user terminal 100.
- FIG. 9 is a diagram showing a virtual space 600A provided to the player and a field of view image visually recognized by the player according to an embodiment.
- FIG. 10 is a diagram showing a virtual space 600B provided to the user of the user terminal 100 and a field of view image visually recognized by the user according to a certain embodiment.
- when it is not necessary to distinguish between the virtual spaces 600A and 600B, they are collectively described as "virtual space 600".
- in step S1, the processor 30 defines the virtual space 600A shown in FIG. 9 as the virtual space control unit 316.
- the processor 30 defines a virtual space 600A by using virtual space data (not shown).
- the virtual space data may be stored in the game play terminal 300, may be generated by the processor 30 based on the game program 331, or may be acquired by the processor 30 from an external device such as the server 200.
- the virtual space 600 has an all-sky spherical structure that covers the entire 360-degree direction of a point defined as a center.
- only the upper half of the celestial sphere of the virtual space 600 is illustrated, so as not to complicate the explanation.
- in step S2, the processor 30 arranges the avatar object 610 (character) in the virtual space 600A as the virtual space control unit 316.
- the avatar object 610 is an avatar object associated with the player, and operates according to the input operation of the player.
- in step S3, the processor 30 arranges other objects in the virtual space 600A as the virtual space control unit 316.
- the processor 30 arranges objects 631 to 634.
- other objects can include, for example, character objects (so-called non-player characters, NPCs) that operate according to the game program 331, operation objects such as virtual hands, and objects imitating animals, plants, man-made objects, and natural objects that are arranged as the game progresses.
- in step S4, the processor 30 arranges the virtual camera 620A in the virtual space 600A as the virtual space control unit 316. As an example, the processor 30 arranges the virtual camera 620A at the position of the head of the avatar object 610.
- in step S5, the processor 30 displays the field of view image 650 on the monitor 51 and the display 530.
- the processor 30 defines a field of view 640A, which is the field of view from the virtual camera 620A in the virtual space 600A, according to the initial position and tilt of the virtual camera 620A. Then, the processor 30 defines the field of view image 650 corresponding to the field of view area 640A.
- the processor 30 outputs the field of view image 650 to the monitor 51 and the display 530 to display the field of view image 650 on the HMD 500 and the display 530.
- since a part of the object 634 is included in the field of view area 640A, the field of view image 650 includes a part of the object 634 as shown in FIG. 9 (B).
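Whether an object falls inside the field of view area can be sketched with a simplified view-cone test; the yaw-only 2D model and the 90-degree field of view below are simplifying assumptions, not values from the embodiment:

```python
import math

# Sketch of a field of view test: an object is included in the field
# of view image when it lies inside the camera's view cone, defined by
# the camera position, its yaw orientation, and a field-of-view angle.
def in_field_of_view(cam_pos, cam_yaw_deg, obj_pos, fov_deg=90.0):
    dx = obj_pos[0] - cam_pos[0]
    dy = obj_pos[1] - cam_pos[1]
    angle_to_obj = math.degrees(math.atan2(dy, dx))
    # signed angular difference wrapped into (-180, 180]
    diff = (angle_to_obj - cam_yaw_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0
```

With the camera at the origin looking along the +x axis, an object ahead of it is visible while an object behind it is not.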
- in step S6, the processor 30 transmits the initial arrangement information to the user terminal 100 via the server 200.
- the initial placement information is information indicating the initial placement positions of various objects in the virtual space 600A.
- the initial placement information includes the avatar object 610 and the information of the initial placement positions of the objects 631 to 634.
- the initial placement information can also be expressed as one of the game progress information.
- in step S7, the processor 30 controls the virtual camera 620A according to the movement of the HMD 500 as the virtual space control unit 316. Specifically, the processor 30 controls the orientation and tilt of the virtual camera 620A according to the movement of the HMD 500, that is, the posture of the player's head. As will be described later, when the player moves the head (changes the posture of the head), the processor 30 moves the head of the avatar object 610 in accordance with this movement. The processor 30 controls the orientation and tilt of the virtual camera 620A so that, for example, the direction of the line of sight of the avatar object 610 coincides with the direction of the line of sight of the virtual camera 620A. In step S8, the processor 30 updates the field of view image 650 in response to changes in the orientation and tilt of the virtual camera 620A.
- in step S9, the processor 30 moves the avatar object 610 as the virtual space control unit 316 according to the movement of the player.
- the processor 30 moves the avatar object 610 in the virtual space 600A in response to the player moving in the real space.
- the processor 30 moves the head of the avatar object 610 in the virtual space 600A in response to the player moving the head in the real space.
- in step S10, the processor 30 moves the virtual camera 620A as the virtual space control unit 316 so as to follow the avatar object 610. That is, the virtual camera 620A remains at the position of the head of the avatar object 610 even when the avatar object 610 moves.
- the processor 30 updates the field of view image 650 in response to the movement of the virtual camera 620A. That is, the processor 30 updates the field of view area 640A according to the posture of the player's head and the position of the virtual camera 620A in the virtual space 600A. As a result, the field of view image 650 is updated.
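The camera-follow behavior of step S10 and the resulting view update can be sketched as follows; the class shape and head positions are illustrative assumptions:

```python
# Sketch of the virtual camera 620A tracking the head of the avatar
# object 610: whenever the tracked position changes, the field of view
# image must be redrawn.
class VirtualCamera:
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.updates = 0  # count of field-of-view image redraws

    def move_to(self, position):
        if position != self.position:
            self.position = position
            self.updates += 1  # view image update triggered

camera = VirtualCamera()
# the avatar's head moves, then stays still for one frame
for head_pos in [(0.0, 1.6, 0.0), (1.0, 1.6, 0.0), (1.0, 1.6, 0.0)]:
    camera.move_to(head_pos)
```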
- in step S11, the processor 30 transmits the operation instruction data of the avatar object 610 to the user terminal 100 via the server 200.
- the operation instruction data here includes at least one of motion data that captures the player's motion, voice data of the voice spoken by the player, and operation data indicating the content of the input operation to the controller 540 during the virtual experience (for example, during game play).
- the operation instruction data is transmitted to the user terminal 100 as, for example, game progress information.
- steps S7 to S11 are continuously and repeatedly executed while the player is playing the game.
- in step S21, the processor 10 of the user terminal 100 of the user 3 defines the virtual space 600B shown in FIG. 10 as the virtual space control unit 116.
- the processor 10 defines the virtual space 600B by using the virtual space data (not shown).
- the virtual space data may be stored in the user terminal 100, may be generated by the processor 10 based on the game program 131, or may be acquired by the processor 10 from an external device such as the server 200.
- in step S22, the processor 10 receives the initial placement information.
- in step S23, the processor 10 arranges various objects in the virtual space 600B as the virtual space control unit 116 according to the initial arrangement information.
- the various objects are the avatar object 610 and the objects 631 to 634.
- in step S24, the processor 10 arranges the virtual camera 620B in the virtual space 600B as the virtual space control unit 116. As an example, the processor 10 arranges the virtual camera 620B at the position shown in FIG. 10 (A).
- in step S25, the processor 10 displays the field of view image 660 on the display unit 152.
- the processor 10 defines a field of view 640B, which is a field of view from the virtual camera 620B in the virtual space 600B, according to the initial position and tilt of the virtual camera 620B. Then, the processor 10 defines the field of view image 660 corresponding to the field of view area 640B.
- the processor 10 outputs the field of view image 660 to the display unit 152 to display the field of view image 660 on the display unit 152.
- since the avatar object 610 and the object 631 are included in the field of view area 640B, the field of view image 660 includes the avatar object 610 and the object 631 as shown in FIG. 10 (B).
- in step S26, the processor 10 receives the operation instruction data.
- in step S27, the processor 10 moves the avatar object 610 in the virtual space 600B as the virtual space control unit 116 according to the operation instruction data. In other words, the processor 10 plays a video in which the avatar object 610 is operating, by real-time rendering.
- in step S28, the processor 10 controls the virtual camera 620B as the virtual space control unit 116 according to the operation of the user received as the operation reception unit 111.
- in step S29, the processor 10 updates the field of view image 660 in response to changes in the position, orientation, and tilt of the virtual camera 620B in the virtual space 600B.
- the processor 10 may automatically control the virtual camera 620B according to the operation of the avatar object 610, for example, its movement or a change in its orientation.
- the processor 10 may automatically move the virtual camera 620B or change its orientation and tilt so that the avatar object 610 is always photographed from the front.
- further, the processor 10 may automatically move the virtual camera 620B or change its orientation and tilt so as to always shoot the avatar object 610 from the rear according to the movement of the avatar object 610.
- the avatar object 610 operates according to the movement of the player.
- the operation instruction data indicating this operation is transmitted to the user terminal 100.
- the avatar object 610 operates according to the received operation instruction data.
- the avatar object 610 performs the same operation in the virtual space 600A and the virtual space 600B.
- the user 3 can visually recognize the movement of the avatar object 610 according to the movement of the player by using the user terminal 100.
- FIG. 11 is a diagram showing another example of the field of view image displayed on the user terminal 100. Specifically, it shows an example of the game screen of the game (this game) executed by the system 1 and played by the player.
- this game is a game in which an avatar object 610 that operates weapons such as guns and knives and a plurality of enemy objects 671 that are NPCs appear in the virtual space 600, and the user causes the avatar object 610 to fight against the enemy objects 671.
- Various game parameters such as the physical strength of the avatar object 610, the number of magazines that can be used, the number of remaining bullets of the gun, and the remaining number of the enemy object 671 are updated as the game progresses.
- a plurality of stages are prepared in this game, and the player can clear the stage by satisfying a predetermined achievement condition associated with each stage.
- the predetermined achievement condition may include, for example, conditions established by defeating all appearing enemy objects 671, defeating a boss object among the appearing enemy objects 671, acquiring a predetermined item, reaching a predetermined position, and the like.
- the achievement conditions are defined in the game program 131.
- in this game, when the achievement condition is satisfied according to the content of the game, the player clears the stage; in other words, the avatar object 610 wins against the enemy objects 671 (the victory or defeat between the avatar object 610 and the enemy objects 671 is decided).
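The achievement-condition check can be sketched as a predicate over a hypothetical game state; the state keys and the particular conditions chosen are illustrative assumptions:

```python
# Sketch of checking a predetermined achievement condition: the stage
# is cleared when any of several example conditions holds (all enemies
# defeated, a boss defeated, or a predetermined position reached).
def stage_cleared(state):
    conditions = [
        state["remaining_enemies"] == 0,    # all enemy objects 671 defeated
        state.get("boss_defeated", False),  # boss object defeated
        state.get("reached_goal", False),   # predetermined position reached
    ]
    return any(conditions)
```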
- on the other hand, when the game executed by the system 1 is a racing game or the like, the ranking of the avatar object 610 is determined when the condition of reaching the goal is satisfied.
- the game progress information is live-distributed to the plurality of user terminals 100 at predetermined time intervals.
- while the user views the game, the touch screen 15 of the user terminal 100 displays a field of view image of the field of view area defined by the virtual camera 620B corresponding to the user terminal 100.
- in addition, parameter images showing the physical strength of the avatar object 610, the number of magazines that can be used, the number of remaining bullets of the gun, the remaining number of enemy objects 671, and the like are displayed superimposed on the field of view image.
- This field of view image can also be expressed as a game screen.
- the game progress information includes motion data that captures the player's motion, voice data of the voice spoken by the player, and operation data indicating the content of the input operation to the controller 540.
- in other words, these data are information for specifying the position, posture, orientation, and the like of the avatar object 610, information for specifying the position, posture, orientation, and the like of the enemy objects 671, and information for specifying the positions of other objects (for example, obstacle objects 672 and 673).
- the processor 10 identifies the position, posture, orientation, and the like of each object by analyzing (rendering) the game progress information.
- the game information 132 includes data of various objects such as an avatar object 610, an enemy object 671, and obstacle objects 672 and 673.
- the processor 10 updates the position, posture, orientation, and the like of each object by using the data and the analysis result of the game progress information. As a result, the game progresses, and each object in the virtual space 600B moves in the same manner as each object in the virtual space 600A. Specifically, in the virtual space 600B, each object including the avatar object 610 operates based on the game progress information regardless of whether or not the user operates the user terminal 100.
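This update of local objects from the received game progress information, independent of any operation on the user terminal 100, can be sketched as follows; the object identifiers and field names are assumptions:

```python
# Sketch of the user terminal 100 applying analyzed game progress
# information: each object in the virtual space 600B is moved to match
# its counterpart in the virtual space 600A, regardless of whether the
# user operates the terminal.
objects_600B = {
    "avatar_610": {"pos": (0, 0), "yaw": 0.0},
    "enemy_671":  {"pos": (5, 5), "yaw": 180.0},
}

def apply_game_progress(objects, progress_info):
    """Update local object poses from received game progress information."""
    for object_id, pose in progress_info["poses"].items():
        objects[object_id]["pos"] = pose["pos"]
        objects[object_id]["yaw"] = pose["yaw"]

apply_game_progress(
    objects_600B,
    {"poses": {"avatar_610": {"pos": (1, 2), "yaw": 90.0}}},
)
```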
- UI images 701 and 702 are displayed superimposed on the field of view image.
- the UI image 701 is a UI image that accepts an operation for displaying the UI image 711 for displaying the item input operation for supporting the avatar object 610 from the user 3 on the touch screen 15.
- the UI image 702 is a UI image that accepts an operation for displaying, on the touch screen 15, a UI image (described later) for receiving an operation of inputting and transmitting a comment for the avatar object 610 (in other words, the player 4) from the user 3.
- the operation accepted by the UI images 701 and 702 may be, for example, an operation of tapping the UI images 701 and 702.
- the UI image 711 is displayed superimposed on the field of view image.
- the UI image 711 includes, for example, a UI image 711A on which a magazine icon is drawn, a UI image 711B on which a first aid box icon is drawn, a UI image 711C on which a triangular cone icon is drawn, and a UI image on which a barricade icon is drawn.
- the item insertion operation corresponds to, for example, an operation of tapping any UI image.
- one of the obstacle objects 672 and 673 may obstruct the movement of the enemy objects 671 more than the other.
- the processor 10 transmits the item input information indicating that the item input operation has been performed to the server 200.
- the item input information includes at least information for specifying the type of the item specified by the item input operation.
- the item input information may include other information about the item, such as information indicating where the item is placed.
- the item input information is transmitted to the other user terminal 100 and the HMD set 1000 via the server 200.
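The item input information can be sketched as a small record containing at least the item type, and optionally the placement position; the key names are assumptions for illustration:

```python
# Sketch of the item input information transmitted to the server 200
# when the user performs an item input operation.
def make_item_input_info(user_id, item_type, position=None):
    """Build item input information: the item type is mandatory;
    other information, such as the placement position, is optional."""
    info = {"user": user_id, "item_type": item_type}
    if position is not None:
        info["position"] = position  # where the item is to be placed
    return info

info = make_item_input_info("user_a", "barricade", position=(3, 4))
```

The server 200 would then relay this record to the other user terminals 100 and the HMD set 1000.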
- FIG. 12 is a diagram showing another example of the field of view image displayed on the user terminal 100. Specifically, it shows an example of the game screen of this game, and illustrates the communication between the player and the user terminal 100 during game play.
- the user terminal 100 causes the avatar object 610 to execute the utterance 691.
- the user terminal 100 causes the avatar object 610 to execute the utterance 691 according to the voice data included in the game progress information.
- the content of the utterance 691 is "No bullets left!" spoken by the player 4. That is, the utterance 691 informs each user that the number of magazines is 0 and only one bullet remains loaded in the gun, so the means for attacking the enemy object 671 is about to be lost.
- a balloon is used to visually indicate the utterance of the avatar object 610, but in reality, the voice is output from the speaker of the user terminal 100.
- the balloon shown in FIG. 12A (that is, the balloon including the text of the audio content) may be displayed in the visual field image. This also applies to the utterance 692 described later.
- upon receiving the tap operation on the UI image 702, the user terminal 100 superimposes and displays the UI images 705 and 706 (message UI) on the field of view image as shown in FIG. 12(B).
- the UI image 705 is a UI image that displays a comment on the avatar object 610 (in other words, the player).
- the UI image 706 is a UI image that accepts a comment transmission operation from the user 3 in order to transmit the input comment.
- when the user terminal 100 receives a tap operation on the UI image 705, it displays a UI image (not shown; hereinafter simply referred to as the "keyboard") imitating a keyboard on the touch screen 15.
- the user terminal 100 causes the UI image 705 to display text corresponding to the user's input operation on the keyboard.
- the text "Send magazine” is displayed on the UI image 705.
- when, after the text is input, the user terminal 100 receives, as an example, a tap operation on the UI image 706, the user terminal 100 transmits the comment information including the input content (text content) and information indicating the user to the server 200.
- the comment information is transmitted to the other user terminal 100 and the HMD set 1000 via the server 200.
- the UI image 703A is a UI image showing the user name of the user who sent the comment
- the UI image 704A is a UI image showing the content of the comment sent by the user.
- that is, the UI images 703A and 704A indicate that the user whose user name is "BBBBBB" has used his or her own user terminal 100 to transmit comment information with the content "Dangerous!".
- the UI image 703A and the UI image 704A are displayed on the touch screen 15 of all the user terminals 100 participating in this game and the monitor 51 of the HMD 500.
- the UI images 703A and 704A may be one UI image. That is, one UI image may include the user name and the content of the comment.
- the UI image 703B includes the user name "AAAAAA", and the UI image 704B contains the comment "Send magazine!" entered in the example of FIG. 12(B).
- the user "AAAAA” further inputs a tap operation to the UI image 701, displays the UI image 711 on the touch screen 15, and inputs a tap operation to the UI image 711A. It is a later view image 611. That is, as a result of the item input information indicating the magazine being transmitted from the user terminal 100 of the user "AAAAA" to the other user terminal 100 and the HMD set 1000, the user terminal 100 and the HMD set 1000 have the effect object 674 (described later). Is arranged in the virtual space 600. As an example, the user terminal 100 and the HMD set 1000 execute the effect related to the effect object 674 after the elapsed time indicated in the item input information has elapsed, and execute the process of invoking the effect of the item object.
- the number of magazines is increased from 0 to 1 by executing the process of invoking the effect of the item object.
- the player utters "Thank you!” To the user "AAAAA”, and the voice data of the utterance is transmitted to each user terminal 100.
- each user terminal 100 outputs the voice "Thank you!” As the utterance 692 of the avatar object 610.
- communication between the user and the avatar object 610 is realized by outputting the utterance voice of the avatar object 610 based on the utterance of the player and inputting a comment by each user.
- FIG. 13 is a flowchart showing an example of the flow of the game progress process executed by the game play terminal 300.
- step S31 the processor 30 advances the game as the game progress unit 315 based on the game program 331 and the movement of the player.
- step S32 the processor 30 generates game progress information and distributes it to the user terminal 100. Specifically, the processor 30 transmits the generated game progress information to each user terminal 100 via the server 200.
- when the processor 30 receives the item input information in step S33 (YES in S33), the processor 30 arranges the item object in the virtual space 600A based on the item input information in step S34. As an example, the processor 30 arranges the effect object 674 in the virtual space 600A before arranging the item object (see FIG. 11C).
- the effect object 674 may be, for example, an object imitating a present box.
- the processor 30 may execute the effect related to the effect object 674 after the elapsed time indicated in the item input information has elapsed.
- the effect may be, for example, an animation in which the lid of the present box is opened. After executing the animation, the processor 30 executes a process of invoking the effect of the item object. For example, in the example of FIG. 11D, the obstacle object 673 is arranged.
- the processor 30 may arrange the item object corresponding to the tapped UI image in the virtual space 600A. For example, when a tap operation is performed on the UI image 711A, the processor 30 arranges a magazine object indicating a magazine in the virtual space 600A after executing the animation. Further, when the tap operation is performed on the UI image 711B, the processor 30 arranges the first aid kit object indicating the first aid kit in the virtual space 600A after executing the animation.
- the processor 30 may execute a process of invoking the effect of the magazine object or the first aid kit object when the avatar object 610 moves to the position of the magazine object or the first aid kit object, for example.
- the processor 30 continues and repeats the processes of steps S31 to S34 until the game is finished.
- when the game ends, for example, when the player inputs a predetermined input operation for ending the game (YES in step S35), the process shown in FIG. 13 ends.
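- the FIG. 13 flow (steps S31 to S35) executed by the game play terminal 300 can be sketched as a loop over stub objects; everything below — class names, method names, the `max_ticks` stand-in for the player's end operation — is an illustrative assumption, not taken from this description:

```python
class StubServer:
    # Minimal stand-in for the server 200 as seen from the game play terminal.
    def __init__(self, item_events):
        self.sent = []                  # game progress information broadcasts
        self._items = list(item_events) # pending item input information
    def broadcast(self, info):
        self.sent.append(info)
    def poll_item_input(self):
        return self._items.pop(0) if self._items else None

class StubGame:
    # Minimal stand-in for the game state in the virtual space 600A.
    def __init__(self):
        self.placed = []
        self.tick = 0
    def advance(self, motion):
        self.tick += 1
    def progress_info(self):
        return {"tick": self.tick}
    def place_effect_object(self, info):
        self.placed.append(("effect", info))
    def place_item_object(self, info):
        self.placed.append(("item", info))

def gameplay_terminal_loop(server, game, max_ticks):
    for _ in range(max_ticks):                  # stands in for "until the game ends" (S35)
        game.advance(None)                      # S31: advance the game from the player's movement
        server.broadcast(game.progress_info())  # S32: distribute game progress information
        info = server.poll_item_input()         # S33: item input information received?
        if info is not None:
            game.place_effect_object(info)      # S34: effect object 674 is placed first,
            game.place_item_object(info)        #      then the item object takes effect

game, server = StubGame(), StubServer([{"item": "magazine"}])
gameplay_terminal_loop(server, game, max_ticks=3)
```

The ordering inside the `if` mirrors the text: the effect object precedes the item object it announces.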
- FIG. 14 is a flowchart showing an example of the flow of the game progress process executed by the user terminal 100.
- step S41 the processor 10 receives the game progress information.
- step S42 the processor 10 advances the game as the game progress unit 115 based on the game progress information.
- step S43 when the processor 10 accepts the item input operation by the user 3 (YES in step S43), in step S44, the processor 10 consumes the virtual currency and arranges the effect object 674 in the virtual space 600B.
- the virtual currency may be purchased (charged for in the game) by the user 3 performing a predetermined operation on the processor 10 before or during participation in the game, or may be given to the user 3 when a predetermined condition is satisfied.
- the predetermined conditions may be those that require participation in this game, such as clearing a quest in this game, or those that do not require participation in this game, such as answering a questionnaire.
- the amount of virtual currency (owned amount of virtual currency) is stored in the user terminal 100 as game information 132 as an example.
- step S45 the processor 10 transmits the item input information to the server 200.
- the item input information is transmitted to the game play terminal 300 via the server 200.
- the processor 10 arranges the item object in the virtual space 600B when a predetermined time elapses after the arrangement of the effect object 674.
- the obstacle object 673 is arranged. That is, when the user 3 inputs a tap operation to the UI image 711C, a predetermined amount of virtual currency is consumed and the obstacle object 673 is arranged.
- the processor 10 continues and repeats the processes of steps S41 to S45 until the game is finished.
- when the game is finished, for example, when the player performs a predetermined input operation for ending the game, or when the user 3 performs a predetermined input operation for leaving the game partway through (YES in step S46), the process shown in FIG. 14 is completed.
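- steps S43 to S45 on the user terminal 100 — consume virtual currency, place the effect object 674 locally, send the item input information — can be sketched as follows; the function name, the `wallet` dictionary, and the flat `cost` are illustrative assumptions:

```python
def try_item_input(wallet, cost, place_effect, send_item_info):
    # Sketch of S43-S45: a tap on an item UI image spends virtual currency,
    # arranges the effect object 674 in the virtual space 600B, and transmits
    # item input information toward the game play terminal 300 via the server.
    if wallet["coins"] < cost:
        return False              # insufficient virtual currency: no item input
    wallet["coins"] -= cost       # S44: consume virtual currency
    place_effect()                # S44: arrange effect object 674
    send_item_info()              # S45: item input info -> server 200
    return True

events = []
wallet = {"coins": 10}
ok = try_item_input(wallet, 3,
                    lambda: events.append("effect"),
                    lambda: events.append("send"))
```

Checking the balance before any side effect keeps the placement and the network send atomic with respect to the charge.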
- FIG. 15 is a flowchart showing an example of the flow of game progress processing executed on the server 200.
- step S51 the processor 20 receives the game progress information from the game play terminal 300.
- step S52 the processor 20 updates the game progress log (hereinafter, play log) as the log generation unit 212.
- the play log is generated by the processor 20 when the initial arrangement information is received from the game play terminal 300.
- step S53 the processor 20 transmits the received game progress information to each user terminal 100.
- when the item input information is received from any user terminal 100 in step S54 (YES in step S54), the processor 20 updates the play log as the log generation unit 212 in step S55. In step S56, the processor 20 transmits the received item input information to the game play terminal 300.
- the processor 20 continues and repeats the processes of steps S51 to S56 until the game is finished.
- when the game ends, the processor 20, as the list generation unit 213, generates a list of users who participated in the game (user list 234) from the play log.
- the processor 20 stores the generated user list 234 in the server 200.
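- the server-side flow of FIG. 15 (relay the progress information, log each item input, then derive the user list 234 from the log after the game) can be sketched with plain lists; all names and the tuple-based log format here are assumptions for illustration:

```python
def server_step(play_log, progress_info, item_info, user_terminals, gameplay_terminal):
    # One pass through S51-S56 on the server 200.
    play_log.append(("progress", progress_info))   # S52: update the play log
    for terminal in user_terminals:                # S53: relay to every user terminal 100
        terminal.append(progress_info)
    if item_info is not None:
        play_log.append(("item", item_info))       # S55: log the act of support
        gameplay_terminal.append(item_info)        # S56: forward to terminal 300

def build_user_list(play_log):
    # After the game ends, the list generation unit 213 derives the user
    # list 234 from the logged item inputs: one tag per act of support.
    users = {}
    for kind, payload in play_log:
        if kind == "item":
            users.setdefault(payload["user"], []).append(payload["tag"])
    return users

log, term_a, term_b, gameplay = [], [], [], []
server_step(log, {"tick": 1}, {"user": "AAAAAA", "tag": "magazine"},
            [term_a, term_b], gameplay)
user_list = build_user_list(log)
```

Because the list is rebuilt from the log rather than kept live, a user who helped several times naturally accumulates one tag entry per act.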
- FIG. 16 is a diagram showing a specific example of the user list 234.
- information indicating each user who participated in the game (for example, a user name) is stored in the "user" column.
- information (tags) generated based on the support provided by each user to the player is stored in the "tag" column.
- of the tags stored in the "tag" column, those without quotation marks are information automatically generated by the processor 20, and those with quotation marks are information manually entered and stored by the game operator.
- the user "AAAAA” is associated with information such as a magazine, 10F, a boss, and "winning the boss by presenting the magazine”. This indicates that, for example, in the boss battle on the stage of 10F, the user "AAAAA” inserts a magazine, and the avatar object 610 wins the boss with the bullet of the inserted magazine.
- the user "BBBBBB” is associated with information such as a first aid kit, 3rd floor, Zako, and "recovery on the verge of game over”. For example, in a battle with a Zako enemy on the 3rd floor, the user “BBBB” “BBBBBB” throws in the first aid kit, and as a result, it shows that the physical strength of the avatar object 610 has recovered just before it becomes 0 (the game is over).
- the user "CCCCC” is associated with information such as barricade, 5th floor, Zako, and "stopping two zombies at the barricade”. This means that, for example, in a battle with a Zako enemy on the 5th floor, the user "CCCCC” throws in a barricade (obstacle object 672 in FIG. 11), and as a result, succeeds in stopping the two Zako enemies. Shown.
- one act of support is associated with the user name of each user 3, but the user name of a user 3 who has provided support a plurality of times can be associated with a tag for each of those acts of support.
- in the user list 234, each of those tags is distinguished. As a result, after the game is completed, the player who refers to the user list 421 using the distribution terminal 400 can accurately grasp the content of each act of support.
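- the user list 234 described above can be pictured as one record per user, with one tag group per act of support (quoted strings standing for the operator's manually entered notes, the rest auto-generated); the record layout and helper below are illustrative assumptions only:

```python
# Hypothetical in-memory layout for the user list 234: each record pairs a
# user name with a list of tag groups, one group per act of support.
user_list_234 = [
    {"user": "AAAAAA",
     "tags": [["magazine", "10F", "boss", '"win the boss by presenting the magazine"']]},
    {"user": "BBBBBB",
     "tags": [["first aid kit", "3F", "Zako", '"recovery on the verge of game over"']]},
    {"user": "CCCCC",
     "tags": [["barricade", "5F", "Zako", '"stopping two zombies at the barricade"']]},
]

def tags_for(user_list, name):
    # A user who provided support several times would simply have several
    # tag groups here, each one distinguished from the others.
    for record in user_list:
        if record["user"] == name:
            return record["tags"]
    return []
```

Keeping tags grouped per act (rather than flattened) is what lets the player later grasp the content of each individual act of support.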
- FIG. 17 is a flowchart showing an example of the flow of the distribution process executed by the distribution terminal 400.
- FIG. 18 is a diagram showing a specific example of a screen displayed on the distribution terminal 400.
- FIG. 19 is a diagram showing another specific example of the screen displayed on the distribution terminal.
- step S61 the processor 40 receives the first operation for displaying the list of users who participated in the game (user list 234) as the operation reception unit 413.
- the download screen 721 shown in FIG. 18A is a screen for downloading the user list 234 from the server 200 and displaying it on the display unit 452.
- the download screen 721 is a screen displayed immediately after inputting the start operation of the application for executing the distribution process shown in FIG. 17 into the distribution terminal 400.
- the download screen 721 includes UI images 722 and 723 as an example.
- the UI image 722 accepts an operation for downloading the user list 234, that is, the first operation.
- the first operation may be, for example, an operation of tapping the UI image 722.
- the UI image 723 accepts an operation for terminating the application.
- the operation may be, for example, an operation of tapping the UI image 723.
- upon receiving the tap operation on the UI image 722, the processor 40, in step S62, acquires (receives) the user list 234 from the server 200 as the communication control unit 411. In step S63, the processor 40 causes the display unit 452 to display the user list 234 as the display control unit 412. Specifically, the processor 40 causes the display unit 452 to display a user list screen generated based on the user list 234.
- the user list screen may be the user list screen 731 shown in FIG. 18B.
- the user list screen 731 is composed of a record image corresponding to each record in the user list 234.
- in the example of FIG. 18(B), record images 732A to 732C are shown as record images, but the number of record images is not limited to three.
- by inputting an operation of scrolling the screen (for example, a drag operation or a flick operation) to the touch screen 45, the player can display another record image on the display unit 452.
- the record images 732A to 732C include user names 733A to 733C, tag information 734A to 734C, and icons 735A to 735C, respectively.
- hereinafter, when the record images 732A to 732C, the user names 733A to 733C, the tag information 734A to 734C, and the icons 735A to 735C do not need to be distinguished, they are referred to as the "record image 732", "user name 733", "tag information 734", and "icon 735", respectively.
- the user name 733 is information indicating each user who has participated in the game, which is stored in the "user” column in the user list 234.
- the tag information 734 is information indicating a tag associated, in the user list 234, with each piece of information indicating a user who participated in the game.
- for example, the record image 732A includes "AAAAAA" as the user name 733A. Therefore, as the tag information 734A, the record image 732A includes the tags magazine, 10F, boss, and "win the boss by presenting the magazine", which are associated with "AAAAAA" in the user list 234.
- the icon 735 is, for example, an image preset by the user.
- the processor 40 may store the received user list in the distribution terminal 400 (user list 421 in FIG. 7).
- the download screen 721 may include a UI image (not shown) for displaying the user list 421 on the display unit 452.
- in this case, the processor 40 reads the user list 421 instead of downloading the user list 234, generates a user list screen from the user list 421, and displays it on the display unit 452.
- step S64 the processor 40 receives a second operation for selecting any of the users included in the user list screen 731 as the operation reception unit 413.
- the second operation may be an operation of tapping any of the record images 732 on the user list screen 731.
- as an example, the player inputs a tap operation to the record image 732A. That is, the player selects the user "AAAAAA" as the user to whom the operation instruction data is to be distributed.
- upon receiving the tap operation on the record image 732, the processor 40, in step S65, causes the display unit 452 to display the motion list 422 as the display control unit 412. Specifically, the processor 40 causes the display unit 452 to display a motion list screen generated based on the motion list 422.
- the motion list screen may be the motion list screen 741 shown in FIG.
- the motion list screen 741 is composed of a record image corresponding to each record in the motion list 422.
- in the example of FIG. 19, record images 742A to 742C are shown as record images, but the number of record images is not limited to three.
- by inputting an operation of scrolling the screen (for example, a drag operation or a flick operation) to the touch screen 45, the player can display another record image on the display unit 452.
- the record images 742A to 742C include motion names 743A to 743C, motion images 744A to 744C, and UI images 745A to 745C, respectively.
- hereinafter, when the record images 742A to 742C, the motion names 743A to 743C, the motion images 744A to 744C, and the UI images 745A to 745C do not need to be distinguished, they are referred to as the "record image 742", "motion name 743", "motion image 744", and "UI image 745", respectively.
- the motion name 743 is information for identifying the motion stored in the motion list 422.
- the motion image 744 is an image generated from the motion data associated with each motion name in the motion list 422.
- as an example, the processor 40 includes, in the record image 742 as the motion image 744, an image of the avatar object 610 taking the first posture of each piece of motion data.
- the motion image 744 may be a UI image that accepts a predetermined operation (for example, a tap operation on the motion image 744) by the player. When the processor 40 accepts the predetermined operation, the processor 40 may reproduce the motion moving image in which the avatar object 610 operates based on the motion data.
- the processor 40 may automatically redisplay the motion list screen 741 when the motion moving image is finished.
- the record image 742 may include, for example, a UI image including the text "motion reproduction" instead of the motion image 744.
- step S66 the processor 40 receives the third operation of selecting the motion as the operation receiving unit 413.
- the third operation may be a tap operation on the UI image 745. That is, the UI image 745 accepts an operation of selecting motion data corresponding to each record image 742.
- the processor 40 specifies the motion data selected by the player as the motion specifying unit 415.
- in step S67, the processor 40, as the display control unit 412 and the voice reception unit 414, receives the voice input of the player while reproducing the motion moving image in which the avatar object 610 operates based on the selected motion data.
- FIG. 20 is a diagram showing a specific example of voice input by the player 4.
- the player 4 is inputting the utterance voice 820A while playing the motion moving image 810A.
- the utterance voice 820A is an utterance addressed to the user 3 (hereinafter, user 3A) whose user name is "AAAAAA". That is, in the example of FIG. 20, the player 4 selects the user 3A (first user) in step S64 and creates the operation instruction data addressed to the user 3A. It is assumed that the user terminal 100 used by the user 3A is the user terminal 100A.
- since the utterance voice 820A is an utterance addressed to the user 3A, its content is based on the support provided by the user 3A to the avatar object 610 (in other words, the player 4).
- the user 3A inserts a magazine in the boss battle on the stage of 10F, and the avatar object 610 wins the boss with the bullet of the inserted magazine.
- for this reason, the utterance voice 820A has the content "Thank you for giving me the magazine in the boss battle! The timing was perfect! Thanks to AAAAAA-san, I was able to clear it!".
- the uttered voice includes the content of the support provided by the user 3 in the game and the gratitude to the user 3.
- the player 4 creates the utterance content addressed to the user 3 before starting the voice input, that is, before inputting the third operation to the distribution terminal 400.
- the utterance content addressed to the user 3 may be automatically generated by the processor 40. Further, the processor 40 may superimpose and display the tag associated with the user 3 selected by the second operation on the motion moving image 810A.
- the processor 40 converts the received voice into voice data.
- the processor 40 generates the operation instruction data including the voice data and the motion data of the selected motion as the operation instruction data generation unit 416.
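- the generation step just described — bundling the voice data converted from the player's utterance with the selected motion data, addressed to one chosen user — can be sketched as a tiny data structure; the field names are illustrative assumptions, not terms from this description:

```python
from dataclasses import dataclass

@dataclass
class OperationInstructionData:
    # Hypothetical shape of the operation instruction data generated by the
    # operation instruction data generation unit 416.
    destination_user: str   # the user 3 selected by the second operation
    motion_data: list       # the motion chosen from the motion list 422
    voice_data: bytes       # converted from the player's recorded utterance

def generate(user, motion, recorded_voice):
    # The voice and the motion travel together as one unit, so the receiving
    # user terminal 100 can render both in sync.
    return OperationInstructionData(destination_user=user,
                                    motion_data=motion,
                                    voice_data=recorded_voice)

data = generate("AAAAAA", ["pose_1", "pose_2"], b"pcm-bytes")
```

Carrying the destination inside (or alongside) the data is what later lets the server 200 route it to exactly one user terminal.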
- step S69 the processor 40, as the communication control unit 411, distributes the generated operation instruction data to the user terminal 100 (first computer) of the selected user 3 (user 3A in the example of FIG. 20).
- FIG. 21 is a diagram showing still another specific example of the screen displayed on the distribution terminal 400.
- the processor 40 causes the display unit 452 to display the distribution screen as the display control unit 412.
- the distribution screen may be the distribution screen 751 shown in FIG. 21 (A).
- the distribution screen 751 includes a UI image 752 and a motion image 753A. Further, as shown in FIG. 21A, the distribution screen 751 may include information indicating a user to whom the operation instruction data is distributed.
- the UI image 752 accepts an operation for delivering the operation instruction data to the selected user 3.
- the operation may be, for example, a tap operation on the UI image 752.
- the motion image 753A is a UI image that accepts an operation for playing a moving image based on the generated operation instruction data, that is, the operation instruction data generated for the user 3A.
- the operation may be, for example, a tap operation on the motion image 753A.
- the UI image that accepts the operation for playing the generated moving image is not limited to the motion image 753A. For example, it may be a UI image including the text "video playback".
- the processor 40 may automatically redisplay the distribution screen 751 when the moving image is finished.
- the distribution screen 751 preferably further includes a UI image that accepts an operation for returning to the acceptance of voice input.
- the operation may be, for example, a tap operation on the UI image.
- the player 4 can perform voice input again when the voice input fails, for example, when the content to be spoken is mistaken.
- the UI image may be a UI image that accepts an operation for returning to the selection of motion data.
- upon receiving the tap operation on the UI image 752, the processor 40 transmits the operation instruction data to the server 200 together with the information indicating the user 3A.
- the server 200 identifies the user terminal 100 to which the operation instruction data is transmitted based on the information indicating the user 3A, and transmits the operation instruction data to the specified user terminal 100 (that is, the user terminal 100A).
- the processor 40 may display the distribution completion screen 761 shown in FIG. 21B on the display unit 452 as an example.
- the delivery completion screen 761 includes UI images 762 and 763 as an example. Further, the distribution completion screen 761 may include a text indicating that the transmission of the operation instruction data has been completed, as shown in FIG. 21 (B).
- the UI image 762 accepts an operation for starting the creation of operation instruction data addressed to another user 3.
- the operation may be, for example, an operation of tapping the UI image 762.
- when the processor 40 receives the tap operation, it causes the display unit 452 to display the user list screen again. That is, when the tap operation is accepted, the distribution process returns to step S63. At this time, the processor 40 may generate a user list screen based on the user list 421 stored in the distribution terminal 400 and display it on the display unit 452.
- the UI image 763 accepts an operation for terminating the application.
- the operation may be, for example, an operation of tapping the UI image 763.
- the distribution process ends.
- as described above, the distribution terminal 400 transmits the operation instruction data of the moving image addressed to the user 3A (the user 3 whose user name is "AAAAAA") only to the user terminal 100A.
- FIG. 22 is a diagram showing another specific example of voice input by the player 4.
- the player 4 is inputting the utterance voice 820B while playing the motion moving image 810B.
- the utterance voice 820B is an utterance addressed to the user 3 (hereinafter, user 3B) whose user name is "BBBBBB". That is, in the example of FIG. 22, in step S64, the player 4 inputs a tap operation to the record image 732B corresponding to the user 3B, and creates the operation instruction data addressed to the user 3B. It is assumed that the user terminal 100 used by the user 3B is the user terminal 100B.
- since the utterance voice 820B is an utterance addressed to the user 3B, its content is based on the support provided by the user 3B to the avatar object 610 (in other words, the player 4). Specifically, in the battle with the Zako enemy on the 3rd floor, the user "BBBBBB" threw in a first aid kit, and as a result, the physical strength of the avatar object 610 recovered just before it reached 0 (just before the game was over). For this reason, the utterance voice 820B has the content "Thanks to the first aid kit given by BBBBBB-san, the game did not end on the 3rd floor. Thank you very much!".
- FIG. 23 is a diagram showing still another specific example of the screen displayed on the distribution terminal 400.
- the distribution screen 751 shown in FIG. 23A includes a UI image 752 and a motion image 753B.
- when the motion image 753B accepts a tap operation, a moving image based on the operation instruction data generated for the user 3B is reproduced.
- upon receiving the tap operation on the UI image 752, the processor 40 transmits the operation instruction data to the server 200 together with the information indicating the user 3B.
- the server 200 identifies the user terminal 100 to which the operation instruction data is transmitted based on the information indicating the user 3B, and transmits the operation instruction data to the specified user terminal 100 (that is, the user terminal 100B).
- as described above, the distribution terminal 400 transmits the operation instruction data of the moving image addressed to the user 3B (the user 3 whose user name is "BBBBBB") only to the user terminal 100B.
- the content of the voice based on the voice data included in the operation instruction data is based on the content of the support provided to the player 4 by the user 3 in participating in the latest game. Since the content of the support is different for each user 3, the content of the voice is different for each user 3. That is, after the end of the game, operation instruction data including voices having different contents is transmitted to at least a part of the user terminals 100 of the user 3 who participated in the game.
- the motion of the avatar object 610 in the example of FIG. 22 is different from the motion in the example of FIG. 20. That is, in generating the operation instruction data addressed to the user 3B, the player 4 selects motion data different from that selected at the time of generating the operation instruction data addressed to the user 3A. Specifically, in step S66, the player 4 inputs a tap operation to the UI image 745B that selects the motion data corresponding to the record image 742B. In this way, the player 4 can make the motion data included in the operation instruction data different for each user 3.
- unique operation instruction data for each user terminal 100 is transmitted to each of the user terminals 100 of the selected user 3.
- FIG. 24 is a diagram showing an outline of the transmission of game progress information from the game play terminal 300 to the user terminals 100.
- while the operation instruction data for video reproduction on the user terminal 100 is unique for each user terminal 100, as shown in FIG. 24, the game progress information transmitted during game execution to the user terminals 100 of all users 3 participating in the game is common among the user terminals 100. That is, the operation instruction data included in the game progress information is also common among the user terminals 100. As described above, the operation instruction data for playing the moving image and the operation instruction data for advancing the game can be said to be different data from the viewpoints of whether they are the same among the user terminals 100 and of their destinations.
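- the distinction drawn here — game progress information broadcast identically to everyone, operation instruction data addressed to exactly one terminal — can be sketched as a small dispatch routine; the names below are illustrative assumptions:

```python
def distribute(progress_info, instruction_data_by_user, terminals):
    # Game progress information is common: every participating user
    # terminal 100 receives the same payload.
    for terminal in terminals.values():
        terminal.append(("progress", progress_info))
    # Operation instruction data for video reproduction is unique: each
    # payload goes only to the terminal of its addressee.
    for name, data in instruction_data_by_user.items():
        terminals[name].append(("instruction", data))

terminals = {"AAAAAA": [], "BBBBBB": []}
distribute({"tick": 1}, {"AAAAAA": "video-A"}, terminals)
```

After this call, both terminals hold the same progress entry, but only "AAAAAA" holds an instruction entry, mirroring FIG. 24.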
- FIG. 25 is a flowchart showing an example of the flow of the moving image reproduction process executed by the user terminal 100.
- step S71 the processor 10 receives the operation instruction data as the moving image reproduction unit 117.
- step S72 the processor 10 notifies the user 3 of the reception of the operation instruction data as the moving image reproduction unit 117.
- as an example, the processor 10 notifies the user 3 of the reception of the operation instruction data by at least one of displaying a notification image on the display unit 152, reproducing a notification voice from a speaker (not shown), and lighting or blinking a lighting unit (not shown) composed of an LED (light-emitting diode).
- step S73 the processor 10 receives the first reproduction operation for reproducing the moving image as the operation reception unit 111.
- the first reproduction operation may be an operation of tapping the notification image.
- step S74 the processor 10 renders the operation instruction data as the moving image reproduction unit 117 and reproduces the moving image.
- the processor 10 may start the application for playing this game and play the video, or may start an application for playing videos that is different from that application and play the video.
- hereinafter, this video will be referred to as a "thank-you video".
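- the FIG. 25 flow (S71 receive, S72 notify, S73 first reproduction operation, S74 render) can be sketched with callbacks standing in for the UI; every name here is an illustrative assumption:

```python
def play_thank_you_video(instruction_data, notify, wait_for_tap, render):
    # Sketch of steps S71-S74 on the user terminal 100. `instruction_data`
    # stands for the received operation instruction data (motion + voice).
    notify()                              # S72: notification image / sound / LED
    if wait_for_tap():                    # S73: first reproduction operation
        return render(instruction_data)   # S74: render the data as the video
    return None                           # no tap: the video is not played

frames = play_thank_you_video(
    {"motion": ["m1"], "voice": b"v"},
    notify=lambda: None,
    wait_for_tap=lambda: True,
    render=lambda d: len(d["motion"]),    # toy renderer: one frame per keyframe
)
```

Separating notification from rendering matters because the text says playback starts only on the user's explicit first reproduction operation, not on receipt.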
- FIG. 26 is a diagram showing a specific example of playing a thank-you video. Specifically, it shows an example of the reproduction of the thank-you video on the user terminal 100 of the user 3A.
- the avatar object 610 is uttering the voice 920A while executing a certain motion.
- the processor 10 outputs the audio 920A from the speaker (not shown) while playing the thank-you video 910A including the avatar object 610 that executes a certain motion.
- the motion in the thank-you video 910A is based on the motion data selected by the player 4 in generating the operation instruction data addressed to the user 3A, and the voice 920A is based on the voice data generated from the utterance voice 820A input by the player 4 in generating that operation instruction data. That is, the voice 920A is a voice including the content of the support provided by the user 3A in the game and gratitude for that support. In this way, by inputting the first playback operation, the user 3A can watch a thank-you video in which the avatar object 610 utters the content of the support the user provided in the game and gratitude for that support.
- the user terminal 100 may display at least one UI image on the touch screen 15 after the reproduction of the thank-you moving image 910A is completed.
- the UI image may be, for example, a UI image that accepts an operation for playing the thank-you video 910A again, a UI image that accepts an operation for transitioning to another screen, or a UI image that accepts an operation for terminating the application.
- the user terminal 100 may display at least one UI image on the touch screen 15 during playback of the thank-you moving image 910A.
- the UI image may be, for example, a plurality of UI images that accept operations such as temporarily stopping the thank-you video 910A being played, ending it, or changing the scene to be played.
- these UI images, displayed during the playback of the thank-you video 910A and after the playback of the thank-you video 910A is finished, do not include a UI image for making a response to the avatar object 610. That is, the thank-you video 910A according to the present embodiment does not provide a means for responding to the avatar object 610.
- FIG. 27 is a diagram showing another specific example of the reproduction of the thank-you video. Specifically, it shows an example of the reproduction of the thank-you video on the user terminal 100 of the user 3B.
- the avatar object 610 is uttering the voice 920B while executing a certain motion.
- the processor 10 outputs the audio 920B from the speaker (not shown) while playing the thank-you video 910B including the avatar object 610 that executes a certain motion.
- the motion in the thank-you video 910B is based on the motion data selected by the player 4 when generating the motion instruction data addressed to the user 3B, and the voice 920B is based on the voice data generated from the utterance voice 820B input by the player 4 when generating that motion instruction data. Therefore, in the example of FIG. 27, the motion performed by the avatar object 610 is different from the motion in the example of FIG. 26. Further, the voice 920B is a voice including the content of the support provided by the user 3B in the game and gratitude for that support. Therefore, in the example of FIG. 27, the content of the voice 920B is different from the content of the voice 920A in the example of FIG. 26.
- in this way, the thank-you videos received after the end of the game by at least some of the user terminals 100 of the users 3 who participated in the game are videos in which the utterance content of the avatar object 610 differs for each user 3.
- the processor 10 may superimpose and display the UI image 930 including the content prompting participation in the next game on the moving image 910.
- the UI image 930 may be distributed together with the operation instruction data, or may be stored in the user terminal 100 as the game information 132.
- the game that can be provided by the system 1 according to the present embodiment includes a participatory live battle game.
- in the following, an example of the participatory live battle game will be described with reference to FIGS. 28(A) to 28(D) and 29(A) to 29(D).
- the live battle game is composed of a plurality of stages. In each stage, a soldier object 720 operated by each of the plurality of users, an enemy object 730 operated as an NPC by the game program of each of the plurality of user terminals 100, and building objects 740 such as high-rise buildings appear in the virtual space 600B defined by each of the plurality of user terminals 100.
- the soldier object 720 is, for example, an object that imitates a soldier carrying a bazooka gun.
- the enemy object 730 is, for example, an object that imitates a giant spider.
- the soldier object 720 attacks the enemy object 730 by firing bullets from the bazooka cannon.
- the enemy object 730 attacks the soldier object 720 by releasing a thread from its mouth.
- the user terminal 100 of the own user arranges the virtual camera 620B of the own user in the virtual space 600B defined by the user terminal 100.
- the virtual camera 620B is arranged so as to capture the virtual space 600B from behind the soldier object 720 of the own user.
- the user terminal 100 of the own user displays the field of view image 660 representing the field of view area 640B of the virtual camera 620B on the touch screen 15 as a game image, and displays the virtual pads VP1 and VP2 superimposed on the game image. The details of the virtual pads VP1 and VP2 will be described later.
- when the virtual pad VP1 or VP2 is operated, the user terminal 100 of the own user causes the soldier object 720 of the own user to perform an operation corresponding to that operation in the virtual space 600B defined by the user terminal 100. Further, the user terminal 100 of the own user transmits operation information for making the operation identifiable to the user terminals 100 of other users via the server 200.
- the user terminal 100 that has received the operation information operates, in accordance with the operation information, the soldier object 720 that is operated by the user of the transmitting user terminal 100, among the soldier objects 720 existing in the virtual space 600B defined in the receiving user terminal 100.
- the soldier object 720 moves in the virtual space 600B defined by each of the plurality of user terminals 100.
- the soldier object 720 jumps in the virtual space 600B defined by each of the plurality of user terminals 100.
- when the operation is an operation for changing the direction of the bazooka gun, that is, the direction of the soldier object 720, the direction of the bazooka gun, and thus the orientation of the soldier object 720, is changed in the virtual space 600B defined by each of the plurality of user terminals 100.
- when the operation is an operation for firing a bullet from the bazooka cannon, the bullet is fired from the bazooka cannon in the virtual space 600B defined by each of the plurality of user terminals 100.
- when a bullet fired from the bazooka cannon of the own user's soldier object 720 hits the enemy object 730, the user terminal 100 of the own user reduces the HP of the enemy object 730 and transmits the reduced HP to the user terminals 100 of other users via the server 200.
- the user terminal 100 that has received the reduced HP updates the HP of the enemy object 730 existing in the virtual space 600B defined in the user terminal 100 with the reduced HP.
- when the HP of the enemy object 730 decreases to 0, each of the plurality of user terminals 100 eliminates the enemy object 730 from the virtual space 600B defined by the user terminal 100.
- when a bullet fired from a bazooka cannon hits a building object 740, the building object 740 is destroyed.
- when a thread released from the enemy object 730 hits the soldier object 720 of the own user in the virtual space 600B defined by the user terminal 100 of the own user, the user terminal 100 reduces the HP of that soldier object 720. When the HP of the soldier object 720 of the own user decreases to 0, the user terminal 100 of the own user ends the battle of the soldier object 720 and transmits a notification that the battle has ended to the user terminals 100 of other users via the server 200. Upon receiving the notification, each user terminal 100 removes the soldier object 720 whose battle has ended from the virtual space 600B defined by that user terminal 100.
- the virtual pad VP1 is displayed at the lower left position of the touch screen 15.
- the virtual pad VP2 has an outer diameter smaller than the outer diameter of the virtual pad VP1 and is displayed at a position slightly above the lower right of the touch screen 15. That is, the virtual pads VP1 and VP2 are arranged at positions where their respective center positions (reference positions) are deviated in both the horizontal direction and the vertical direction on the touch screen 15.
- the arrangement of the virtual pads VP1 and VP2 is fixed regardless of the user's touch operation.
- the user terminal 100 displays the virtual pads VP1 and VP2 in a predetermined mode (for example, in a mode that gives the user the impression of vibration, by repeatedly moving and displaying them by a predetermined amount in the vertical and horizontal directions).
- the user terminal 100 registers information indicating the touch position on the touch screen 15 in a history information table (not shown). Specifically, when the user terminal 100 detects a touch from a state in which no touch on the touch screen 15 is detected, it determines that the "touch-on state" has been reached, and registers history information indicating the touch position in the history information table as "touch-now state" history information. Further, when the touch on the touch screen 15 is no longer detected, it is determined that the "touch-off state" has been reached, and predetermined history information is registered in the history information table. The user terminal 100 specifies the mode of the touch operation (tap operation, drag operation, etc.) on the touch screen 15 based on the history information.
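The history-table mechanism above can be sketched as follows. This is a minimal illustration, not the embodiment's implementation: the class name, the list-based table, and the tap/drag distance threshold are assumptions; the state names follow the text.

```python
import math

class TouchHistory:
    """Sketch of the history information table: each frame's touch state
    ("touch-on", "touch-now", "touch-off") is recorded, and the mode of the
    touch operation (tap or drag) is specified from the history."""

    TAP_MAX_DISTANCE = 10.0  # assumed: movement in pixels still counted as a tap

    def __init__(self):
        self.history = []  # list of (state, position) entries

    def on_touch_event(self, position):
        """Register one sampling of the touch screen; position=None means
        no touch was detected."""
        touching = bool(self.history) and self.history[-1][0] != "touch-off"
        if position is not None and not touching:
            self.history.append(("touch-on", position))    # touch newly detected
        elif position is not None:
            self.history.append(("touch-now", position))   # touch continuing
        elif touching:
            self.history.append(("touch-off", None))       # touch released

    def classify(self):
        """Specify the mode of the touch operation from the history."""
        points = [p for s, p in self.history if p is not None]
        if not points:
            return None
        moved = math.dist(points[0], points[-1])
        return "tap" if moved <= self.TAP_MAX_DISTANCE else "drag"
```

A short touch that barely moves classifies as a tap; a touch that travels beyond the threshold classifies as a drag.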
- the virtual pad VP1 is composed of an operation body (operation target image) ST1 and a circular outer frame (range image) FR1 larger than the operation body ST1.
- the operating body ST1 is displayed at the center position (reference position) of the range RG1 surrounded by the outer frame FR1.
- when a drag operation is performed, the user terminal 100 moves the operation body ST1 in the direction specified by the drag operation. More specifically, the user terminal 100 does not move the operation body ST1 in response to a drag operation started by a touch on a position other than the position of the operation body ST1, but moves the operation body ST1 according to a drag operation started by a touch on the position of the operation body ST1.
- the operating body ST1 is a circular object having a predetermined radius, whose center position is movable within the outer frame FR1. As a result, when the center position is moved to the vicinity of the outer edge of the outer frame FR1 by a drag operation, the operating body ST1 is displayed slightly protruding outside the outer frame FR1.
- when a drag operation started by a touch on the operation body ST1 is performed, the user terminal 100 specifies a vector whose start point is the center position of the range RG1 and whose end point is the current touch position. If the current touch position is within the range RG1, the user terminal 100 aligns the center position of the operating body ST1 with the current touch position; if the current touch position is outside the range RG1, the user terminal 100 aligns the center position of the operating body ST1 with the intersection of the vector and the outer frame FR1. When the touch position of the drag operation is moved outside the outer frame FR1 in the circumferential direction of the outer frame FR1, that intersection, and thus the operating body ST1, also moves in the circumferential direction.
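The vector-and-intersection rule above amounts to clamping the operating body to the outer frame. A minimal sketch, assuming a circular range with the given center and radius (the function name and tuple-based coordinates are illustrative, not from the embodiment):

```python
import math

def operating_body_position(center, radius, touch):
    """Place the operating body's center along the vector from the range
    center to the current touch position: follow the touch inside the range,
    otherwise use the intersection of that vector with the outer frame."""
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    length = math.hypot(dx, dy)
    if length <= radius:          # touch inside the range: follow it exactly
        return touch
    scale = radius / length       # outside: intersect the vector with the frame
    return (center[0] + dx * scale, center[1] + dy * scale)
```

Moving the touch around outside the frame rotates this intersection point along the frame's circumference, matching the circumferential movement described above.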
- the user terminal 100 moves the soldier object 720 in a direction corresponding to the center position of the range RG1 and the current position of the operating body ST1.
- the direction of the soldier object 720 is not changed by a drag operation on the operation body ST1. Therefore, when the operation body ST1 is moved by a drag operation while the soldier object 720 is displayed as shown in FIG. 28(C), the soldier object 720 operates as follows. At this time, the virtual camera 620B moves following the soldier object 720.
- when the operating body ST1 is moved upward, the soldier object 720 moves forward while facing forward (see FIG. 28(A)); when it is moved downward, the soldier object 720 moves backward while facing forward (see FIG. 28(E)); when it is moved to the right, the soldier object 720 moves to the right while facing forward (see FIG. 28(B)); and when it is moved to the left, the soldier object 720 moves to the left while facing forward (see FIG. 28(D)). When the operating body ST1 is moved diagonally upward to the right, the soldier object 720 moves diagonally forward to the right while facing forward; when it is moved diagonally downward to the left, the soldier object 720 moves diagonally backward to the left while facing forward.
- the user terminal 100 transmits to the server 200 operation information including information that can identify that the operation target and the operation mode are the virtual pad VP1 and the drag operation, respectively, together with the above vector and the user ID.
- when the drag operation is released, the processor 10 of the user terminal 100 performs a display process of returning the operation body ST1 to the center position of the range RG1. The processor 10 may perform a display process that returns the operation body ST1 to the center position at once, or a display process that returns it to the center position at a predetermined moving speed.
- when a tap operation is performed by the user on any position within the range RG1, the user terminal 100 causes the soldier object 720 to jump on the spot regardless of whether the tap operation is an operation on the operation body ST1. However, if the tap operation is performed within a predetermined time (for example, 0.1 seconds) after the drag operation is released, the soldier object 720 may be made to jump in the direction in which it was moving under the released drag operation.
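The optional grace-period rule can be sketched as a small decision function. This is an illustrative assumption-laden sketch: the function name and time representation are invented; the 0.1-second default follows the example in the text.

```python
def jump_direction(tap_time, drag_release_time, last_drag_vector,
                   grace_period=0.1):
    """Decide the jump direction for a tap in the range RG1: a tap within
    the grace period after a drag is released jumps in the drag's movement
    direction; any other tap jumps on the spot."""
    if (drag_release_time is not None
            and 0.0 <= tap_time - drag_release_time <= grace_period):
        return last_drag_vector   # keep moving in the released drag's direction
    return (0.0, 0.0)             # jump on the spot
```

For example, a tap 0.05 seconds after release reuses the drag vector, while a tap one second later jumps in place.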
- the user terminal 100 transmits to the server 200 operation information including information that can identify that the operation target and the operation mode are the virtual pad VP1 and the tap operation, respectively, and the user ID.
- the virtual pad VP2 is composed of an operation body (operation target image) ST2 and a circular outer frame (range image) FR2 larger than the operation body ST2.
- the operation body ST2 is displayed at the center position (reference position) of the range RG2 surrounded by the outer frame FR2, and when a drag operation is performed on the operation body ST2, the operation body ST2 moves in the direction specified by the drag operation.
- the user terminal 100 does not move the operation body ST2 in response to a drag operation started by a touch on a position other than the position of the operation body ST2, but moves the operation body ST2 according to a drag operation started by a touch on the position of the operation body ST2.
- the operating body ST2 is a circular object whose center position of the operating body ST2 can be moved within the outer frame FR2. Therefore, when the center position of the operating body ST2 moves to the vicinity of the outer edge of the outer frame FR2 by the drag operation, the operating body ST2 is displayed slightly protruding outside the outer frame FR2.
- when a drag operation started by a touch on the operation body ST2 is performed, the user terminal 100 specifies a vector whose start point is the center position of the range RG2 and whose end point is the current touch position. If the current touch position is within the range RG2, the user terminal 100 aligns the center position of the operating body ST2 with the current touch position; if the current touch position is outside the range RG2, the user terminal 100 aligns the center position of the operating body ST2 with the intersection of the vector and the outer frame FR2. When the touch position of the drag operation is moved outside the outer frame FR2 in the circumferential direction of the outer frame FR2, that intersection, and thus the operating body ST2, also moves in the circumferential direction.
- the user terminal 100 changes the firing direction of the bazooka cannon and the arrangement of the virtual camera 620B according to the center position of the range RG2 and the current position of the operating body ST2.
- specifically, in the horizontal direction, the user terminal 100 changes the firing direction of the bazooka cannon, that is, the direction of the soldier object 720, clockwise or counterclockwise around the body axis of the soldier object 720 as viewed from above, and changes the position and orientation of the virtual camera 620B so as to capture the soldier object 720 from behind.
- in the vertical direction, the user terminal 100 changes the firing direction of the bazooka cannon around the straight line connecting both shoulders of the soldier object 720, and changes the position and orientation of the virtual camera 620B so as to capture the firing direction.
- the firing direction of the bazooka cannon, that is, the direction of the soldier object 720, is changed counterclockwise around the body axis of the soldier object 720, and the virtual camera 620B follows the movement of the soldier object 720 so as to capture it from behind (see FIG. 29(B)).
- the firing direction of the bazooka cannon, that is, the direction of the soldier object 720, is changed clockwise around the body axis of the soldier object 720, and the virtual camera 620B follows the movement of the soldier object 720 so as to capture it from behind.
- when the operating body ST2 is moved diagonally upward to the right, the firing direction of the bazooka cannon, that is, the soldier object 720, faces diagonally upward to the right; when it is moved diagonally downward to the left, the firing direction of the bazooka cannon, that is, the soldier object 720, faces diagonally downward to the left.
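The mapping from the VP2 drag vector to the two rotations described above (horizontal rotation around the body axis, vertical rotation around the shoulder line) can be sketched as follows. The function name and the sensitivity constant are assumptions for illustration; only the sign conventions follow the text.

```python
def aim_delta(vector, sensitivity=0.5):
    """Map a VP2 drag vector (screen coordinates, y increasing downward)
    to aiming changes: the horizontal component turns the soldier object
    clockwise/counterclockwise around its body axis, and the vertical
    component raises or lowers the firing direction around the line
    connecting both shoulders."""
    dx, dy = vector
    yaw = dx * sensitivity     # > 0: clockwise as viewed from above
    pitch = -dy * sensitivity  # dragging up on screen (dy < 0) aims upward
    return yaw, pitch
```

A drag diagonally up and to the right thus yields positive yaw and positive pitch, matching the "faces diagonally upward to the right" case.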
- the user terminal 100 transmits to the server 200 operation information including information that can identify that the operation target and the operation mode are the virtual pad VP2 and the drag operation, respectively, together with the above vector and the user ID.
- when the drag operation is released, the user terminal 100 returns the operating body ST2 to the center position of the range RG2.
- when a tap operation is performed by the user on any position within the range RG2, the user terminal 100 fires a bullet from the bazooka cannon regardless of whether the tap operation is an operation on the operation body ST2. Further, the user terminal 100 transmits to the server 200 operation information including information that can identify that the operation target and the operation mode are the virtual pad VP2 and the tap operation, respectively, together with the user ID.
- the virtual pads VP1 and VP2 are arranged in a layer above the game image (a layer having a high priority), and are displayed superimposed on the game image. Further, regarding the virtual pads VP1 and VP2, one virtual pad is arranged in a layer (higher priority layer) higher than the other virtual pad. As described above, the image displayed on the touch screen 15 is an image in which a plurality of layers having different priorities are superimposed, and the virtual pads VP1 and VP2 are displayed on different layers, respectively.
- when the live battle game of the present embodiment is played on a small terminal having a display area smaller than the touch screen 15 of the user terminal 100, the virtual pads VP1 and VP2 are expected to partially overlap in the display area. Even in such a case, the small terminal accepts an operation within the overlapping range as an operation on the virtual pad with the higher priority. For example, when the virtual pad VP2 is set in the layer above the virtual pad VP1 and a tap operation is performed on the overlapping range, the small terminal regards it as a tap operation on the virtual pad VP2 and fires a bullet from the bazooka cannon.
- the touch screen 15 displays icons IC1 and IC2 that can be tapped while the game is in progress.
- the icon IC1 is an icon for exerting an action on the virtual space 600B, and by tapping the icon IC1, a virtual object such as an obstacle is thrown into the virtual space 600B.
- the icon IC2 is an icon for inputting a comment of the own user, and by tapping the icon IC2, the comment of the own user is displayed on each touch screen 15 of the plurality of user terminals 100. Both icons IC1 and IC2 are arranged in a layer above the virtual pads VP1 and VP2.
- when a tap operation is performed on a range where the icon IC1 overlaps a virtual pad, the small terminal accepts the tap operation as a tap operation on the icon IC1.
- the operating bodies ST1 and ST2 are returned to the center positions of the ranges RG1 and RG2, respectively, in response to the release of the drag operations on them. Therefore, if a tap operation for jumping or firing a bullet were accepted only when performed on the operation body ST1 or ST2 itself, a tap performed at the release position of the drag operation could not be accepted, because the operating body ST1 or ST2 no longer exists at that position. As a result, a tap operation immediately after releasing a drag operation would be difficult.
- in the present embodiment, the operation body ST1 is returned to the center position of the range RG1 in response to the release of the drag operation on the operation body ST1, and when a tap operation on the range RG1 is performed, game control such as making the soldier object 720 jump is executed regardless of whether the tap operation is an operation on the operation body ST1. Similarly, the operation body ST2 is returned to the center position of the range RG2 in response to the release of the drag operation on the operation body ST2, and when a tap operation on the range RG2 is performed, game control such as firing a bazooka bullet is executed regardless of whether the tap operation is an operation on the operation body ST2.
- as a result, a tap operation immediately after releasing a drag operation is facilitated.
- this improves operability when the movement of the soldier object 720 by a drag operation on the operation body ST1 and the jump of the soldier object 720 by a tap operation within the range RG1 are performed in succession, and when the change of the direction of the bazooka gun by a drag operation on the operation body ST2 and an attack by a tap operation within the range RG2 are performed in succession.
- the virtual pad that accepts operations for changing the firing direction of the bazooka cannon, that is, the direction of the soldier object 720, and for attacking with the bazooka cannon is different from the virtual pad that accepts operations for moving and jumping the soldier object 720. Therefore, the user can change direction and attack while moving or jumping, and the operability can be improved.
- specifically, since the operation of moving the soldier object 720 and the operation of making it jump are accepted by the virtual pad VP1, and the operation of changing the firing direction of the bazooka gun, that is, the direction of the soldier object 720, and the operation of attacking with the bazooka gun are accepted by the virtual pad VP2, it is possible to change direction and attack while moving or jumping. As a result, operability can be improved.
- in step S81, it is determined, based on the input operation on the touch screen 15, whether or not a tap operation has been performed within the range RG1.
- when it is determined that a tap operation has been performed, the process proceeds to step S82, and the soldier object 720 is made to jump.
- in step S83, it is determined based on the history information table whether or not a drag operation started by a touch on the operation body ST1 is in progress. If it is not determined that a drag operation is in progress, the operation body ST1 is placed at the center position (reference position) of the range RG1 in step S84, and the process then returns. On the other hand, when it is determined that a drag operation is in progress, the process proceeds to step S85.
- in step S85, a vector is created with the center position of the range RG1 as the start point and the current touch position as the end point.
- in step S86, the operating body ST1 is moved based on the vector. That is, if the current touch position is within the range RG1, the operating body ST1 is moved to the current touch position; if the current touch position is outside the range RG1, the operating body ST1 is moved to the intersection of the vector and the outer frame FR1.
- in step S87, the soldier object 720 is moved in a direction corresponding to the center position of the range RG1 and the current position of the operating body ST1.
- in step S88, operation information for making the user's operation identifiable is transmitted to the server 200. That is, when the process proceeds from step S82 to step S88, operation information including information that can identify that the operation target and the operation mode are the virtual pad VP1 and the tap operation, respectively, and the user ID is transmitted to the server 200. On the other hand, when the process proceeds from step S87 to step S88, operation information including information that can identify that the operation target and the operation mode are the virtual pad VP1 and the drag operation, respectively, the vector created in step S85, and the user ID of the own user is transmitted to the server 200. When the process of step S88 is completed, the process returns.
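The flow of steps S81 to S88 for the virtual pad VP1 can be sketched as one per-frame handler. This is a hedged sketch, not the embodiment's code: the `VP1State` dataclass, the event list standing in for actual game control, and the dict layout of the operation information are all assumptions.

```python
import math
from dataclasses import dataclass, field

@dataclass
class VP1State:
    """Minimal state for the VP1 flow (field names are assumptions)."""
    center: tuple
    radius: float
    drag_on_body: bool = False     # drag was started on the operation body
    events: list = field(default_factory=list)

    def contains(self, p):
        return math.dist(p, self.center) <= self.radius

def handle_vp1_frame(pad, gesture, touch, user_id):
    """One pass through steps S81 to S88 for the virtual pad VP1."""
    if gesture == "tap" and pad.contains(touch):            # S81 -> S82: jump
        pad.events.append("jump")
        return {"target": "VP1", "mode": "tap", "user": user_id}   # S88
    if gesture == "drag" and pad.drag_on_body:              # S83 -> S85: vector
        vector = (touch[0] - pad.center[0], touch[1] - pad.center[1])
        pad.events.append(("move", vector))                 # S86, S87
        return {"target": "VP1", "mode": "drag",            # S88
                "vector": vector, "user": user_id}
    pad.events.append("reset")                              # S84: recenter ST1
    return None
```

The same shape applies to steps S91 to S98 for VP2, with the jump replaced by firing a bullet and the movement replaced by the aiming change.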
- in step S91, it is determined, based on the input operation on the touch screen 15, whether or not a tap operation has been performed within the range RG2.
- when it is determined that a tap operation has been performed, the process proceeds to step S92, and a bullet is fired from the bazooka cannon.
- in step S93, it is determined based on the history information table whether or not a drag operation started by a touch on the operation body ST2 is in progress. If it is not determined that a drag operation is in progress, the operating body ST2 is placed at the center position (reference position) of the range RG2 in step S94, and the process then returns. On the other hand, when it is determined that a drag operation is in progress, the process proceeds to step S95.
- in step S95, a vector is created with the center position of the range RG2 as the start point and the current touch position as the end point.
- in step S96, the operating body ST2 is moved based on the vector. That is, if the current touch position is within the range RG2, the operating body ST2 is moved to the current touch position; if the current touch position is outside the range RG2, the operating body ST2 is moved to the intersection of the vector and the outer frame FR2.
- in step S97, the firing direction of the bazooka cannon and the arrangement of the virtual camera 620B are changed according to the center position of the range RG2 and the current position of the operating body ST2.
- specifically, in the horizontal direction, the firing direction of the bazooka cannon, that is, the direction of the soldier object 720, is changed clockwise or counterclockwise around the body axis of the soldier object 720 as viewed from above, and the position and orientation of the virtual camera 620B are changed so that the soldier object 720 is captured from behind. In the vertical direction, the firing direction of the bazooka cannon is changed around the straight line connecting both shoulders of the soldier object 720, and the position and orientation of the virtual camera 620B are changed so as to capture the firing direction.
- in step S98, operation information for making the user's operation identifiable is transmitted to the server 200. That is, when the process proceeds from step S92 to step S98, operation information including information that can identify that the operation target and the operation mode are the virtual pad VP2 and the tap operation, respectively, and the user ID is transmitted to the server 200. On the other hand, when the process proceeds from step S97 to step S98, operation information including information that can identify that the operation target and the operation mode are the virtual pad VP2 and the drag operation, respectively, the vector created in step S95, and the user ID of the own user is transmitted to the server 200. When the process of step S98 is completed, the process returns.
- the operating body ST1 is displayed at a predetermined reference position on the touch screen 15, and the outer frame FR1 in which the predetermined range RG1 including the reference position can be specified is displayed.
- when a drag operation is performed on the operation body ST1, the operation body ST1 moves from the reference position to a position within the range RG1 corresponding to the current position of the drag operation.
- game control for moving the soldier object 720 is executed according to the reference position and the display position of the operation body ST1. Further, when a tap operation on the range RG1 is performed, game control for making the soldier object 720 jump is executed regardless of whether the tap operation is an operation on the operation body ST1.
- the operating body ST2 is displayed at a predetermined reference position on the touch screen 15, and the outer frame FR2 in which the predetermined range RG2 including the reference position can be specified is displayed.
- when a drag operation is performed on the operation body ST2, the operation body ST2 moves from the reference position to a position within the range RG2 corresponding to the current position of the drag operation.
- game control for changing the firing direction of the bazooka cannon is executed according to the reference position and the display position of the operating body ST2.
- when a tap operation on the range RG2 is performed, game control for firing a bullet from the bazooka cannon is executed regardless of whether the tap operation is an operation on the operating body ST2.
- when the drag operation on the operating body ST1 is released, the operating body ST1 is returned to the reference position. As a result, the operation position at the start of a drag operation is made uniform, so the operability of the drag operation can be improved.
- the operating body ST1 does not move in response to a drag operation started by a touch on a position other than the display position of the operating body ST1, but moves according to a drag operation started by a touch on the display position of the operating body ST1.
- according to the operation on the virtual pad VP2, game control for changing the direction of the bazooka cannon or firing a bullet from the bazooka cannon is executed.
- the arrangement of the virtual pad VP2 is determined so as to be deviated from the virtual pad VP1 in both the horizontal direction and the vertical direction on the touch screen 15.
- as a result, the virtual pads VP1 and VP2 can be prevented from overlapping as much as possible. That is, for example, even when the game is played while holding a smartphone vertically, the virtual pads VP1 and VP2 can be prevented from overlapping as much as possible. Further, when the virtual pad VP1 is operated with fingers of the left hand and the virtual pad VP2 with fingers of the right hand while the game image is displayed vertically on the smartphone, the left and right fingers can be prevented from colliding with each other.
- the virtual pads VP1 and VP2 are associated with different priorities, respectively.
- when the ranges RG1 and RG2 overlap and a user operation is performed within the overlapping range, the user operation is accepted as an operation on whichever of the ranges RG1 and RG2 has the higher priority.
- when the ranges RG1 and RG2 overlap, an operation on the overlapping range is accepted as an operation on one of the ranges. The user can therefore recognize in advance which game control an operation on the overlapping range performs, so confusing the user can be avoided. Further, in the present embodiment, since the virtual pad VP2 is prioritized over the virtual pad VP1, when a tap operation is performed on the overlapping range, the attack can be prioritized over the jump.
- the virtual pads VP1 and VP2 vibrate up and down, which enhances the entertainment value of the game.
- the soldier object 720 moves in the direction specified by the drag operation.
- the soldier object 720 performs a jump, regardless of the position of the tap operation.
- the action of jumping must be performed with accurate timing, and the jump is triggered by a tap operation anywhere within a relatively wide area such as the range RG1.
- the user can therefore time the tap operation, that is, the jump, while watching the game image rather than the virtual pad VP1. This improves operability.
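The tap-versus-drag decision behind this can be sketched as a small classifier (a hypothetical helper; the movement threshold and function name are assumptions, and a real implementation would typically also consider touch duration):

```python
def handle_release(touch_start, touch_end, range_center, range_radius,
                   moved_threshold=10):
    """Classify a finished touch on range RG1 as a tap (jump) or a drag end.

    Any tap that starts inside RG1 triggers the jump whether or not it
    hit the operating body ST1, so the user can time the jump while
    watching the action instead of the pad.
    """
    dx = touch_end[0] - touch_start[0]
    dy = touch_end[1] - touch_start[1]
    in_range = ((touch_start[0] - range_center[0]) ** 2 +
                (touch_start[1] - range_center[1]) ** 2
                <= range_radius ** 2)
    if in_range and dx * dx + dy * dy < moved_threshold ** 2:
        return "jump"          # tap: exact position inside RG1 is irrelevant
    return "move_end"

# A short touch near the edge of RG1, well away from ST1, still jumps.
assert handle_release((180, 430), (182, 431), (120, 420), 90) == "jump"
```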
- the firing direction of the bazooka cannon changes to the direction specified by the drag operation.
- a tap operation is performed on the range RG2
- a bullet is fired, regardless of the position of the tap operation.
- the user can therefore time the tap operation, that is, the firing of the bullet, while watching the game image rather than the virtual pad VP2. This improves operability.
- the virtual pads VP1 and VP2 are displayed on the touch screen 15.
- the number of virtual pads displayed may vary by stage; for example, only one of the virtual pads VP1 and VP2 may be displayed on one stage while both are displayed on another.
- the game image is displayed vertically.
- the display mode of the game image may be switched according to the stage, for example by displaying the game image vertically on one stage and horizontally on another.
- the display positions of the virtual pads VP1 and VP2 are also switched according to the stage.
- the virtual pad VP1 is displayed on the left side of the touch screen 15, and the virtual pad VP2 is displayed on the right side of the touch screen 15.
- the arrangement of the virtual pads VP1 and VP2 may be switched by the user setting.
- the soldier object 720 plays against the enemy object 730.
- the battle may instead be between a plurality of soldier objects 720 operated by a plurality of users.
- a soldier object 720 operated by another user becomes an enemy object, the battle is performed in a battle-royale format, and the user operating the last surviving soldier object 720 is the winner.
- the enemy object 730 is an NPC operated by a game program.
- the enemy object 730 may be operated by a player (performer).
- both the virtual pads VP1 and VP2 are displayed at predetermined positions.
- either one of the virtual pads VP1 and VP2 may be displayed at a position corresponding to the touch position of the user.
- the operating body ST2 may be displayed at the touch position of the user, and the outer frame FR2 may be displayed so that the position of the operating body ST2 is the center position.
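One way to realize this variation — spawning the operating body at the user's touch position, with the outer frame centered on it — is sketched below (the screen-edge clamping is an added assumption, not stated in the text):

```python
def spawn_pad_at_touch(touch_x, touch_y, screen_w, screen_h, frame_radius):
    """Place the operating body at the touch point and center the outer
    frame on it, clamping so the frame stays fully on screen."""
    cx = min(max(touch_x, frame_radius), screen_w - frame_radius)
    cy = min(max(touch_y, frame_radius), screen_h - frame_radius)
    return {"body": (cx, cy), "frame_center": (cx, cy)}

# A touch too close to the screen edge is nudged inward so the whole
# outer frame remains visible.
pad = spawn_pad_at_touch(30, 700, screen_w=360, screen_h=720, frame_radius=60)
assert pad["frame_center"] == (60, 660)
```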
- the display positions of the virtual pads VP1 and VP2 may be set in advance by user settings.
- the user can operate only one soldier object 720.
- the soldier object 720 operated by the user and another soldier object 720 that acts as an NPC allied with the user according to the game program are displayed, and when a predetermined achievement condition is satisfied (for example, a condition satisfied when the number of defeated enemy objects 730 reaches a predetermined number, or a condition satisfied when a predetermined location is reached), the soldier object 720 being operated may be switched to the other soldier object 720.
- the soldier object 720 before switching may operate as an NPC or may be removed.
- both the virtual pads VP1 and VP2 are displayed in a predetermined mode in a specific situation.
- only one of the virtual pads VP1 and VP2 may be displayed in a predetermined mode, or the virtual pads VP1 and VP2 may be displayed in predetermined modes different from each other.
- the outer frames FR1 and FR2 are perfectly circular, but they may instead be elliptical or cross-shaped, and the shape of the outer frame FR2 may differ from that of the outer frame FR1.
- the respective shapes of the outer frames FR1 and FR2, that is, the respective widths of the ranges RG1 and RG2, are fixed regardless of drag operations on the operating bodies ST1 and ST2.
- the shape of the outer frame FR1, that is, the width of the range RG1, may be changed according to the drag operation on the operating body ST1, and the shape of the outer frame FR2, that is, the width of the range RG2, may be changed according to the drag operation on the operating body ST2.
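As one illustration of such a variable-width range, the outer frame's radius could grow with the drag distance (the proportional rule and cap below are pure assumptions; the text only says the width "may be changed"):

```python
def frame_radius_for_drag(base_radius, drag_distance, grow=0.5, max_radius=140):
    """One possible way to widen a range while its operating body is dragged:
    grow the outer frame in proportion to the drag distance, up to a cap."""
    return min(base_radius + grow * drag_distance, max_radius)

assert frame_radius_for_drag(90, 0) == 90        # no drag: nominal width
assert frame_radius_for_drag(90, 60) == 120      # mid drag: frame widens
assert frame_radius_for_drag(90, 200) == 140     # capped at max_radius
```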
- a game program executed on a computer (user terminal 100) including a processor, a memory, and a touch screen, the game program causing the processor to execute: a step of displaying an operation target image (operating body ST1) at a predetermined first position (reference position) on the touch screen and displaying a range image (outer frame FR1) that makes identifiable a predetermined first range (range RG1) including the first position; a step (S86, S84) of accepting a drag operation on the operation target image from the user and thereby moving the display position of the operation target image to a position within the first range corresponding to the current position of the drag operation; a step (S87) of executing a first game control according to the first position and the display position of the operation target image; and a step (S82) of accepting a tap operation on the first range from the user and thereby executing a second game control regardless of whether or not the tap operation is an operation on the operation target image.
- (Appendix 4) In any of (Appendix 1) to (Appendix 3), the processor is further caused to execute a step of displaying a range image (outer frame FR2) that makes identifiable a predetermined second range (range RG2) of the touch screen, and a step (S92, S97) of executing a third game control in response to an operation from the user within the second range, wherein the first range and the second range are defined so that their respective center positions are offset in both the horizontal and vertical directions of the touch screen.
- (Appendix 6) In any of (Appendix 1) to (Appendix 5), the processor is caused to execute a step of displaying animated behavior of each of the operation target image and the range image in a predetermined mode when a predetermined condition is satisfied according to the progress of the game.
- (Appendix 7) In (Appendix 6), the game based on the game program is a battle game in which a first object operated by the user battles a second object, and the predetermined condition includes a condition satisfied when the first object is attacked by the second object.
- the game based on the game program is a game in which a first object operated by the user appears in a virtual space; the first game control is a control that exerts a first action on the first object according to the first position and the display position of the operation target image, and the second game control is a process that exerts a second action, independent of the position of the tap operation, on the first object.
- a game method performed by a computer (user terminal 100) comprising a processor, a memory, and a touch screen, the method comprising: a step in which the computer displays an operation target image (operating body ST1) at a predetermined first position (reference position) on the touch screen and displays a range image (outer frame FR1) that makes identifiable a predetermined first range (range RG1) including the first position; a step (S86, S84) of accepting a drag operation on the operation target image from the user and thereby moving the display position of the operation target image to a position within the first range corresponding to the current position of the drag operation; a step (S87) of executing a first game control according to the first position and the display position of the operation target image; and a step (S82) of accepting a tap operation on the first range from the user and thereby executing a second game control regardless of whether or not the tap operation is an operation on the operation target image.
- an information terminal device (user terminal 100) comprising a storage unit (storage unit 120) that stores a game program and a control unit (control unit 110) that controls the operation of the information terminal device by executing the game program, wherein the control unit executes: a step of displaying an operation target image (operating body ST1) at a predetermined first position (reference position) on the touch screen and displaying a range image (outer frame FR1) that makes identifiable a predetermined first range (range RG1) including the first position; a step (S86, S84) of accepting a drag operation on the operation target image and moving its display position to a position within the first range corresponding to the current position of the drag operation; a step (S87) of executing a first game control according to the first position and the display position of the operation target image; and a step (S82) of accepting a tap operation on the first range from the user and thereby executing a second game control regardless of whether or not the tap operation is an operation on the operation target image.
- the control blocks (in particular the control units 110, 210, 310, 410) of the user terminal 100, the server 200, the game play terminal 300 (HMD set 1000), and the distribution terminal 400 may be realized by logic circuits (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software.
- the user terminal 100, the server 200, the game play terminal 300 (HMD set 1000), and the distribution terminal 400 include a computer that executes instructions of a program that is software that realizes each function.
- the computer includes, for example, one or more processors and a computer-readable recording medium storing the program; the processor reads the program from the recording medium and executes it, thereby achieving the object of the present invention.
- as the processor, for example, a CPU (Central Processing Unit) can be used.
- as the recording medium, a "non-transitory tangible medium" such as a ROM (Read Only Memory), a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used; a RAM (Random Access Memory) into which the program is loaded may also be provided.
- the program may be supplied to the computer via an arbitrary transmission medium (communication network, broadcast wave, etc.) capable of transmitting the program.
- one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the above program is embodied by electronic transmission.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Computer Hardware Design (AREA)
- Business, Economics & Management (AREA)
- Computer Security & Cryptography (AREA)
- General Business, Economics & Management (AREA)
- Acoustics & Sound (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
FIG. 1 is a diagram showing an overview of a system 1 according to the present embodiment. The system 1 includes a plurality of user terminals 100 (computers), a server 200, a game play terminal 300 (external device, second external device), and a distribution terminal 400 (external device, first external device). In FIG. 1, user terminals 100A to 100C, in other words three user terminals 100, are shown as an example of the plurality of user terminals 100, but the number of user terminals 100 is not limited to the illustrated example. In the present embodiment, when there is no need to distinguish the user terminals 100A to 100C, they are referred to as "user terminal 100". The user terminals 100, the game play terminal 300, and the distribution terminal 400 are connected to the server 200 via a network 2. The network 2 is composed of the Internet and various mobile communication systems built with radio base stations (not shown). Examples of such mobile communication systems include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and wireless networks (for example, Wi-Fi (registered trademark)) that can connect to the Internet via predetermined access points.
In the present embodiment, as an example of the game provided by the system 1 (hereinafter, "the game"), a game mainly played by the user of the game play terminal 300 will be described. Hereinafter, the user of the game play terminal 300 is referred to as the "player". The player (performer), as an example, advances the game by operating characters appearing in the game. In the game, the users of the user terminals 100 play the role of supporting the player's progress through the game. Details of the game will be described later. The game provided by the system 1 may be any game in which a plurality of users participate, and is not limited to this example.
The game play terminal 300 advances the game in response to input operations by the player. The game play terminal 300 also sequentially distributes information generated by the player's game play (hereinafter, game progress information) to the server 200 in real time.
The server 200 transmits the game progress information (second data) received in real time from the game play terminal 300 to the user terminals 100. The server 200 also mediates the transmission and reception of various information among the user terminals 100, the game play terminal 300, and the distribution terminal 400.
The distribution terminal 400 generates motion instruction data (first data) in response to input operations by the user of the distribution terminal 400 and distributes the motion instruction data to the user terminals 100 via the server 200. The motion instruction data is data for playing back a video on the user terminal 100; specifically, it is data for making a character appearing in the video move.
The user terminal 100 receives the game progress information in real time and uses it to generate and display a game screen. In other words, the user terminal 100 reproduces, by real-time rendering, the game screen of the game the player is playing. This allows the user of the user terminal 100 to view the same game screen the player sees while playing, at almost the same timing as the player.
FIG. 2 is a diagram showing the hardware configuration of the user terminal 100. FIG. 3 is a diagram showing the hardware configuration of the server 200. FIG. 4 is a diagram showing the hardware configuration of the game play terminal 300. FIG. 5 is a diagram showing the hardware configuration of the distribution terminal 400.
In the present embodiment, as an example, the user terminal 100 is realized as a smartphone, but the user terminal 100 is not limited to a smartphone. For example, the user terminal 100 may be realized as a feature phone, a tablet computer, a laptop computer (a so-called notebook PC), or a desktop computer. The user terminal 100 may also be a game device suitable for game play.
The server 200 may be, for example, a general-purpose computer such as a workstation or a personal computer. The server 200 includes a processor 20, a memory 21, a storage 22, a communication IF 23, and an input/output IF 24. These components of the server 200 are electrically connected to each other by a communication bus.
The game play terminal 300 may be, for example, a general-purpose computer such as a personal computer. The game play terminal 300 includes a processor 30, a memory 31, a storage 32, a communication IF 33, and an input/output IF 34. These components of the game play terminal 300 are electrically connected to each other by a communication bus.
The distribution terminal 400 may be a mobile terminal such as a smartphone, a PDA (Personal Digital Assistant), or a tablet computer. The distribution terminal 400 may also be a so-called stationary terminal such as a desktop PC.
The processors 10, 20, 30, and 40 control the overall operation of the user terminal 100, the server 200, the game play terminal 300, and the distribution terminal 400, respectively. The processors 10, 20, 30, and 40 include a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit). The processors 10, 20, 30, and 40 each read a program from the storages 12, 22, 32, and 42 described later, and load the read program into the memories 11, 21, 31, and 41 described later. The processors 10, 20, and 30 execute the loaded program.
FIG. 6 is a block diagram showing the functional configuration of the user terminal 100, the server 200, and the HMD set 1000 included in the system 1. FIG. 7 is a block diagram showing the functional configuration of the distribution terminal 400 shown in FIG. 6.
The storage unit 120 stores a game program 131 (program), game information 132, and user information 133. The storage unit 220 stores a game program 231, game information 232, user information 233, and a user list 234. The storage unit 320 stores a game program 331, game information 332, and user information 333. The storage unit 420 stores a user list 421, a motion list 422, and a distribution program 423 (program, second program).
The control unit 210 comprehensively controls the server 200 by executing the game program 231 stored in the storage unit 220. For example, the control unit 210 mediates the transmission and reception of various information among the user terminals 100, the HMD set 1000, and the distribution terminal 400.
The control unit 310 comprehensively controls the HMD set 1000 by executing the game program 331 stored in the storage unit 320. For example, the control unit 310 advances the game in accordance with the game program 331 and the player's operations. While advancing the game, the control unit 310 communicates with the server 200 as necessary to transmit and receive information. The control unit 310 may transmit and receive information directly with the user terminals 100 without going through the server 200.
The control unit 110 comprehensively controls the user terminal 100 by executing the game program 131 stored in the storage unit 120. For example, the control unit 110 advances the game in accordance with the game program 131 and the user's operations. While advancing the game, the control unit 110 communicates with the server 200 as necessary to transmit and receive information. The control unit 110 may transmit and receive information directly with the HMD set 1000 without going through the server 200.
The control unit 410 comprehensively controls the distribution terminal 400 by executing a program (not shown) stored in the storage unit 420. For example, the control unit 410 generates motion instruction data in accordance with the program and the operations of the user of the distribution terminal 400 (the player in the present embodiment) and distributes it to the user terminals 100. The control unit 410 also communicates with the server 200 as necessary to transmit and receive information. The control unit 410 may transmit and receive information directly with the user terminals 100 without going through the server 200.
FIG. 8 is a flowchart showing an example of the flow of control processing for the virtual space provided to the player and the virtual space provided to the user of the user terminal 100. FIG. 9 is a diagram showing a virtual space 600A provided to the player and a field-of-view image viewed by the player, according to an embodiment. FIG. 10 is a diagram showing a virtual space 600B provided to the user of the user terminal 100 and a field-of-view image viewed by the user, according to an embodiment. Hereinafter, when there is no need to distinguish the virtual spaces 600A and 600B, they are referred to as "virtual space 600".
FIG. 11 is a diagram showing another example of the field-of-view image displayed on the user terminal 100. Specifically, it shows an example of a game screen of the game executed by the system 1 (this game) that the player is playing.
FIG. 13 is a flowchart showing an example of the flow of game progress processing executed on the game play terminal 300.
FIG. 14 is a flowchart showing an example of the flow of game progress processing executed on the user terminal 100.
FIG. 15 is a flowchart showing an example of the flow of game progress processing executed on the server 200.
(Distribution processing in the distribution terminal 400)
FIG. 17 is a flowchart showing an example of the flow of distribution processing executed on the distribution terminal 400. FIG. 18 is a diagram showing a specific example of a screen displayed on the distribution terminal 400. FIG. 19 is a diagram showing another specific example of a screen displayed on the distribution terminal.
4, this is information indicating the tag associated with each piece of information indicating each user who participated in the game. For example, the record image 732A includes "AAAAA" as the user name 733A. The record image 732A therefore includes, as tag information 734A, "magazine, 10F, boss, 'won against the boss thanks to the gift of a magazine'", which is associated with "AAAAA" in the user list 234. The icon 735 is, for example, an image set by the user in advance.
FIG. 25 is a flowchart showing an example of the flow of video playback processing executed on the user terminal 100.
Games that can be provided by the system 1 according to the present embodiment include a participatory live battle game. An example of the participatory live battle game will be described below with reference to FIGS. 28(A) to 28(D) and FIGS. 29(A) to 29(D). The live battle game is composed of a plurality of stages. In each stage, soldier objects 720 each operated by one of a plurality of users, enemy objects 730 operated as NPCs by the game program of each of the plurality of user terminals 100, and building objects 740 such as high-rise buildings appear in the virtual space 600B defined by each of the plurality of user terminals 100.
The virtual pad VP1 is displayed in the lower-left area of the touch screen 15. The virtual pad VP2, which has a smaller outer diameter than the virtual pad VP1, is displayed slightly above the lower-right area of the touch screen 15. That is, the virtual pads VP1 and VP2 are arranged so that their respective center positions (reference positions) are offset in both the horizontal and vertical directions of the touch screen 15.
The virtual pad VP1 is composed of an operating body (operation target image) ST1 and a circular outer frame (range image) FR1 larger than the operating body ST1. When the operating body ST1 is not being touched, it is displayed at the center position (reference position) of the range RG1 enclosed by the outer frame FR1. When the user performs a drag operation on the operating body ST1, the user terminal 100 moves the operating body ST1 in the direction specified by the drag operation. More specifically, the user terminal 100 does not move the operating body ST1 in response to a drag operation started by touching a position other than the position of the operating body ST1; it moves the operating body ST1 only in response to a drag operation started by touching the position of the operating body ST1.
The virtual pad VP2 is composed of an operating body (operation target image) ST2 and a circular outer frame (range image) FR2 larger than the operating body ST2. The operating body ST2 is displayed at the center position (reference position) of the range RG2 enclosed by the outer frame FR2, and moves in the direction specified by a drag operation when such an operation is performed on it. Here too, the user terminal 100 does not move the operating body ST2 in response to a drag operation started by touching a position other than the position of the operating body ST2; it moves the operating body ST2 only in response to a drag operation started by touching the position of the operating body ST2. The operating body ST2 is a circular object whose center position can move within the outer frame FR2. Therefore, when a drag operation moves the center of the operating body ST2 near the outer edge of the outer frame FR2, the operating body ST2 is displayed protruding slightly beyond the outer frame FR2.
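The geometry described here — the center of the operating body is confined to the outer frame, so the body itself may protrude by up to its own radius — can be sketched as follows (the function name and coordinate conventions are assumptions, not from the patent):

```python
import math

def clamp_body_center(frame_center, frame_radius, drag_pos):
    """Keep the center of the operating body inside the outer frame.

    Because only the *center* is constrained, a body of radius r dragged
    to the rim is drawn protruding up to r beyond the frame edge.
    """
    dx = drag_pos[0] - frame_center[0]
    dy = drag_pos[1] - frame_center[1]
    dist = math.hypot(dx, dy)
    if dist <= frame_radius:
        return drag_pos
    # Project the drag position back onto the rim of the frame.
    scale = frame_radius / dist
    return (frame_center[0] + dx * scale, frame_center[1] + dy * scale)

# Dragging far beyond FR2 pins the body's center to the rim of the frame.
assert clamp_body_center((260, 380), 60, (260, 200)) == (260, 320)
```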
Of the processing that the user terminal 100 executes based on the game program, the flow of processing in response to operations on the virtual pad VP1 is described using the flowchart shown on the left side of FIG. 30, and the flow of processing in response to operations on the virtual pad VP2 is described using the flowchart shown on the right side of FIG. 30. Part of this processing may be executed on the server 200, with the processing result transmitted to the user terminal 100.
According to the present embodiment, the operating body ST1 is displayed at a predetermined reference position on the touch screen 15, together with the outer frame FR1, which makes identifiable the predetermined range RG1 including that reference position. When a drag operation is performed on the operating body ST1, the operating body ST1 moves from the reference position to a position within the range RG1 corresponding to the current position of the drag operation. The game control that moves the soldier object 720 is executed according to the reference position and the display position of the operating body ST1. When a tap operation is performed on the range RG1, the game control that makes the soldier object 720 jump is executed regardless of whether the tap operation is an operation on the operating body ST1.
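The first game control (S87) derives the soldier's movement from the reference position and the current display position of the operating body; a minimal sketch of that mapping is below (the raw displacement vector is an assumption — the patent does not specify how the displacement maps to direction and speed):

```python
def movement_input(reference_pos, body_pos):
    """Derive the soldier's movement control from the reference position
    and the current display position of the operating body ST1 (S87):
    the displacement gives the direction, and its length can serve as a
    speed factor."""
    dx = body_pos[0] - reference_pos[0]
    dy = body_pos[1] - reference_pos[1]
    return (dx, dy)

# ST1 dragged 30 px right and 40 px up from the reference position
# yields a movement vector pointing up and to the right (screen
# coordinates: y grows downward).
assert movement_input((120, 420), (150, 380)) == (30, -40)
```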
Modifications and the like of the embodiment described above are enumerated below.
The matters described in each of the above embodiments are appended below.
According to an aspect of an embodiment shown in the present disclosure, there is provided a game program executed on a computer (user terminal 100) including a processor, a memory, and a touch screen, the game program causing the processor to execute: a step of displaying an operation target image (operating body ST1) at a predetermined first position (reference position) on the touch screen and displaying a range image (outer frame FR1) that makes identifiable a predetermined first range (range RG1) including the first position; a step (S86, S84) of accepting a drag operation on the operation target image from the user and thereby moving the display position of the operation target image to a position within the first range corresponding to the current position of the drag operation; a step (S87) of executing a first game control according to the first position and the display position of the operation target image; and a step (S82) of accepting a tap operation on the first range from the user and thereby executing a second game control regardless of whether or not the tap operation is an operation on the operation target image.
In (Appendix 1), the moving step returns the display position of the operation target image to the first position when the drag operation is released (S84).
In (Appendix 1) or (Appendix 2), the moving step does not move the display position of the operation target image in response to a drag operation started by an operation on a position other than the display position of the operation target image, and moves the display position of the operation target image in response to a drag operation started by an operation on the display position of the operation target image (S86).
In any of (Appendix 1) to (Appendix 3), the processor is further caused to execute a step of displaying a range image (outer frame FR2) that makes identifiable a predetermined second range (range RG2) of the touch screen, and a step (S92, S97) of executing a third game control in response to an operation from the user within the second range, wherein the first range and the second range are defined so that their respective center positions are offset in both the horizontal and vertical directions of the touch screen.
In (Appendix 4), different priorities are associated with the range image that makes the first range identifiable and the range image that makes the second range identifiable, and the processor is caused to execute a step of, when an operation is performed within the overlapping area in a situation where the two range images overlap, accepting the operation as an operation in whichever of the first range and the second range corresponds to the range image with the higher priority.
In any of (Appendix 1) to (Appendix 5), the processor is caused to execute a step of displaying animated behavior of each of the operation target image and the range image in a predetermined mode when a predetermined condition is satisfied according to the progress of the game.
In (Appendix 6), the game based on the game program is a battle game in which a first object operated by the user battles a second object, and the predetermined condition includes a condition satisfied when the first object is attacked by the second object.
In any of (Appendix 1) to (Appendix 7), the game based on the game program is a game in which a first object operated by the user appears in a virtual space; the first game control is a control that exerts a first action on the first object according to the first position and the display position of the operation target image, and the second game control is a process that exerts a second action, independent of the position of the tap operation, on the first object.
According to an aspect of an embodiment, there is provided a game method performed by a computer (user terminal 100) comprising a processor, a memory, and a touch screen, the method comprising: a step in which the computer displays an operation target image (operating body ST1) at a predetermined first position (reference position) on the touch screen and displays a range image (outer frame FR1) that makes identifiable a predetermined first range (range RG1) including the first position; a step (S86, S84) of accepting a drag operation on the operation target image from the user and thereby moving the display position of the operation target image to a position within the first range corresponding to the current position of the drag operation; a step (S87) of executing a first game control according to the first position and the display position of the operation target image; and a step (S82) of accepting a tap operation on the first range from the user and thereby executing a second game control regardless of whether or not the tap operation is an operation on the operation target image.
According to an aspect of an embodiment, there is provided an information terminal device (user terminal 100) comprising a storage unit (storage unit 120) that stores a game program and a control unit (control unit 110) that controls the operation of the information terminal device by executing the game program, wherein the control unit executes: a step of displaying an operation target image (operating body ST1) at a predetermined first position (reference position) on a touch screen and displaying a range image (outer frame FR1) that makes identifiable a predetermined first range (range RG1) including the first position; a step (S86, S84) of accepting a drag operation on the operation target image from the user and thereby moving the display position of the operation target image to a position within the first range corresponding to the current position of the drag operation; a step (S87) of executing a first game control according to the first position and the display position of the operation target image; and a step (S82) of accepting a tap operation on the first range from the user and thereby executing a second game control regardless of whether or not the tap operation is an operation on the operation target image.
The control blocks (in particular the control units 110, 210, 310, 410) of the user terminal 100, the server 200, the game play terminal 300 (HMD set 1000), and the distribution terminal 400 may be realized by logic circuits (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software.
Claims (10)
- A game program executed on a computer comprising a processor, a memory, and a touch screen,
the game program causing the processor to execute:
a step of displaying an operation target image at a predetermined first position on the touch screen and displaying a range image that makes identifiable a predetermined first range including the first position;
a step of accepting a drag operation on the operation target image from the user and thereby moving the display position of the operation target image to a position within the first range corresponding to the current position of the drag operation;
a step of executing a first game control according to the first position and the display position of the operation target image; and
a step of accepting a tap operation on the first range from the user and thereby executing a second game control regardless of whether or not the tap operation is an operation on the operation target image. - The game program according to claim 1, wherein the moving step returns the display position of the operation target image to the first position when the drag operation is released.
- The game program according to claim 1 or 2, wherein the moving step does not move the display position of the operation target image in response to a drag operation started by an operation on a position other than the display position of the operation target image, and moves the display position of the operation target image in response to a drag operation started by an operation on the display position of the operation target image.
- Further causing the processor to execute:
a step of displaying a range image that makes identifiable a predetermined second range of the touch screen; and
a step of executing a third game control in response to an operation from the user within the second range,
wherein the first range and the second range are defined so that their respective center positions are offset in both the horizontal and vertical directions of the touch screen, the game program according to any one of claims 1 to 3. - Different priorities are associated with the range image that makes the first range identifiable and the range image that makes the second range identifiable, respectively, and
the game program according to claim 4, causing the processor to execute a step of, when an operation is performed within the overlapping area in a situation where the range image making the first range identifiable and the range image making the second range identifiable overlap, accepting the operation as an operation in whichever of the first range and the second range corresponds to the range image with the higher priority. - Causing the processor to execute:
a step of displaying animated behavior of each of the operation target image and the range image in a predetermined mode when a predetermined condition is satisfied according to the progress of the game, the game program according to any one of claims 1 to 5. - The game based on the game program is a battle game in which a first object operated by the user battles a second object, and
the game program according to claim 6, wherein the predetermined condition includes a condition satisfied when the first object is attacked by the second object. - The game based on the game program is a game in which a first object operated by the user appears in a virtual space,
the first game control is a control that exerts a first action on the first object according to the first position and the display position of the operation target image, and
the game program according to any one of claims 1 to 7, wherein the second game control is a process that exerts a second action, independent of the position of the tap operation, on the first object. - A game method performed by a computer comprising a processor, a memory, and a touch screen,
the method comprising, by the computer:
a step of displaying an operation target image at a predetermined first position on the touch screen and displaying a range image that makes identifiable a predetermined first range including the first position;
a step of accepting a drag operation on the operation target image from the user and thereby moving the display position of the operation target image to a position within the first range corresponding to the current position of the drag operation;
a step of executing a first game control according to the first position and the display position of the operation target image; and
a step of accepting a tap operation on the first range from the user and thereby executing a second game control regardless of whether or not the tap operation is an operation on the operation target image. - An information terminal device comprising:
a storage unit that stores a game program; and
a control unit that controls the operation of the information terminal device by executing the game program,
wherein the control unit executes:
a step of displaying an operation target image at a predetermined first position on a touch screen and displaying a range image that makes identifiable a predetermined first range including the first position;
a step of accepting a drag operation on the operation target image from the user and thereby moving the display position of the operation target image to a position within the first range corresponding to the current position of the drag operation;
a step of executing a first game control according to the first position and the display position of the operation target image; and
a step of accepting a tap operation on the first range from the user and thereby executing a second game control regardless of whether or not the tap operation is an operation on the operation target image.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/621,098 US20220347559A1 (en) | 2019-06-20 | 2020-06-17 | Game program, game method, and information terminal device |
CN202080044722.0A CN114007707A (zh) | 2019-06-20 | 2020-06-17 | 游戏程序、游戏方法以及信息终端装置 |
KR1020227001482A KR20220024602A (ko) | 2019-06-20 | 2020-06-17 | 게임 프로그램, 게임 방법, 및 정보 단말 장치 |
EP20825685.9A EP3988186A4 (en) | 2019-06-20 | 2020-06-17 | GAME PROGRAM, GAME METHOD AND INFORMATION TERMINAL DEVICE |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-114368 | 2019-06-20 | ||
JP2019114368A JP6818091B2 (ja) | 2019-06-20 | 2019-06-20 | Game program, game method, and information terminal device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020255991A1 true WO2020255991A1 (ja) | 2020-12-24 |
Family
ID=73993624
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/023691 WO2020255991A1 (ja) | 2020-06-17 | Game program, game method, and information terminal device |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220347559A1 (ja) |
EP (1) | EP3988186A4 (ja) |
JP (1) | JP6818091B2 (ja) |
KR (1) | KR20220024602A (ja) |
CN (1) | CN114007707A (ja) |
WO (1) | WO2020255991A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7333564B2 (ja) * | 2021-10-29 | 2023-08-25 | Gree, Inc. | Information processing system, information processing method, and computer program |
US11989811B2 (en) | 2021-10-29 | 2024-05-21 | Gree, Inc. | Information processing system, information processing method, and computer program |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006340744A (ja) * | 2005-06-07 | 2006-12-21 | Nintendo Co Ltd | Game program and game device |
JP2014045965A (ja) * | 2012-08-31 | 2014-03-17 | Square Enix Co Ltd | Video game processing device and video game processing program |
US20180373406A1 (en) * | 2017-06-21 | 2018-12-27 | Netease (Hangzhou) Network Co.,Ltd. | Information Processing Method, Apparatus, Electronic Device and Storage Medium |
JP2019076721A (ja) * | 2017-10-23 | 2019-05-23 | NetEase (Hangzhou) Network Co., Ltd. | Information processing method and apparatus, storage medium, and electronic device |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5379250B2 (ja) * | 2011-02-10 | 2013-12-25 | Sony Computer Entertainment Inc. | Input device, information processing device, and input value acquisition method |
TW201334843A (zh) * | 2012-02-20 | 2013-09-01 | Fu Li Ye Internat Corp | Game control method with touch panel media and the game media |
JP6072338B1 (ja) * | 2016-07-29 | 2017-02-01 | DeNA Co., Ltd. | Program, system, and method for providing a game |
JP6966836B2 (ja) * | 2016-11-21 | 2021-11-17 | Koei Tecmo Games Co., Ltd. | Game program, recording medium, and game processing method |
-
2019
- 2019-06-20 JP JP2019114368A patent/JP6818091B2/ja active Active
-
2020
- 2020-06-17 US US17/621,098 patent/US20220347559A1/en active Pending
- 2020-06-17 CN CN202080044722.0A patent/CN114007707A/zh active Pending
- 2020-06-17 KR KR1020227001482A patent/KR20220024602A/ko unknown
- 2020-06-17 WO PCT/JP2020/023691 patent/WO2020255991A1/ja active Application Filing
- 2020-06-17 EP EP20825685.9A patent/EP3988186A4/en active Pending
Non-Patent Citations (2)
Title |
---|
See also references of EP3988186A4 |
STAR BATTALION, 30 May 2019 (2019-05-30), Retrieved from the Internet <URL:https://www.youtube.com/watch?v=PhGAIr6KN5o> |
Also Published As
Publication number | Publication date |
---|---|
EP3988186A4 (en) | 2023-07-12 |
US20220347559A1 (en) | 2022-11-03 |
KR20220024602A (ko) | 2022-03-03 |
JP2021000192A (ja) | 2021-01-07 |
CN114007707A (zh) | 2022-02-01 |
EP3988186A1 (en) | 2022-04-27 |
JP6818091B2 (ja) | 2021-01-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6776400B1 (ja) Program, method, and information terminal device | |
JP6776393B2 (ja) Viewing program, viewing method, and information terminal device | |
JP7344189B2 (ja) Viewing program, viewing method, and information terminal device | |
JP2021053181A (ja) Program, method, and viewing terminal | |
WO2022202126A1 (ja) Program, method, and information processing device | |
WO2020255991A1 (ja) Game program, game method, and information terminal device | |
WO2020262331A1 (ja) Game program, game method, and terminal device | |
JP6813618B2 (ja) Viewing program, viewing method, viewing terminal, distribution program, distribution method, and information terminal device | |
JP7361736B2 (ja) Game program, game method, and terminal device | |
WO2020262332A1 (ja) Game program, game method, and information terminal device | |
JP7332562B2 (ja) Program, method, and information terminal device | |
JP6776425B1 (ja) Program, method, and distribution terminal | |
WO2021039346A1 (ja) Program, method, and information processing terminal | |
JP6813617B2 (ja) Game program, game method, and information terminal device | |
JP6952730B2 (ja) Program, method, information processing device, and system | |
JP7440401B2 (ja) Game program, game method, and information terminal device | |
JP7377790B2 (ja) Game program, game method, and information terminal device | |
JP6903701B2 (ja) Game program, game method, and information terminal device | |
JP2021053401A (ja) Program, method, and distribution terminal | |
JP2021053358A (ja) Program, method, and viewing terminal | |
JP2022028694A (ja) Program, method, and information terminal device | |
JP2021049391A (ja) Viewing program, viewing method, viewing terminal, distribution program, distribution method, and information terminal device | |
JP2021058625A (ja) Game program, game method, and information terminal device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20825685 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20227001482 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2020825685 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2020825685 Country of ref document: EP Effective date: 20220120 |