WO2014123129A1 - Information processing apparatus, control method, program, storage medium, and rendering system - Google Patents

Information processing apparatus, control method, program, storage medium, and rendering system

Info

Publication number
WO2014123129A1
Authority
WO
WIPO (PCT)
Prior art keywords
rendering
screen
information processing
avatar
processing apparatus
Prior art date
Application number
PCT/JP2014/052594
Other languages
French (fr)
Inventor
Alex Tait
Original Assignee
Square Enix Holdings Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Square Enix Holdings Co., Ltd. filed Critical Square Enix Holdings Co., Ltd.
Priority to EP14749202.9A priority Critical patent/EP2954496A4/en
Priority to US14/382,409 priority patent/US9636581B2/en
Priority to JP2014530452A priority patent/JP5776954B2/en
Publication of WO2014123129A1 publication Critical patent/WO2014123129A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/30 - Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F 13/35 - Details of game servers
    • A63F 13/355 - Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an encoded video stream for transmitting to a mobile phone or a thin client
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 - Controlling the output signals based on the game progress
    • A63F 13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/40 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F 2300/406 - Transmission via wireless network, e.g. pager or GSM

Definitions

  • the present invention relates to an information processing apparatus, a control method, a program, a storage medium, and a rendering system, and in particular to a screen composing technique for screens of a virtual space in which a plurality of people can simultaneously participate.
  • Video games have become a common source of entertainment for virtually every segment of the population.
  • the video game industry has seen considerable evolution, from the introduction of standalone arcade games, to home-based computer games and the emergence of games made for specialized consoles.
  • content such as video games, in which provided screens are rendered by an apparatus, is not limited to specialized consoles and can be provided for general information processing apparatuses such as PCs and smartphones as well.
  • many common information processing apparatuses, including game consoles, have connectivity functions for connecting to a network such as the Internet, and the aforementioned content can provide services and entertainment using network communication.
  • a massively multiplayer type network game such as an MMORPG (Massively Multiplayer Online Role-Playing Game), for example.
  • provision systems exist in which, rather than screen rendering being executed on a user apparatus, screens are rendered on a connected external apparatus such as a server.
  • a similar provision experience to a case where the user apparatus renders the screens can be realized by the user apparatus receiving and displaying data of screens rendered by the external apparatus.
  • So-called cloud-based rendering systems, in which screen provision by such kinds of external apparatuses is performed via a network such as the Internet to which many users can simultaneously connect, have been getting attention in recent years.
  • there exist cloud-based gaming systems that use such systems in the video game industry as well.
  • a player can utilize an ordinary Internet-enabled device such as a smartphone or tablet to connect to a video game server on the Internet.
  • the video game server starts a session for the player and renders images for the player based on selections (e.g., moves, actions) made by the player and other attributes of the game.
  • the images are delivered to the player's device over the Internet, and are reproduced on the display.
  • players from anywhere in the world can play a video game without the use of specialized video game consoles, software or graphics processing hardware.
  • the present invention provides an information processing apparatus, a control method, a program, a storage medium and a rendering system for improving user experiences in a virtual space in which a plurality of users can simultaneously participate.
  • the present invention in its first aspect provides an information processing apparatus for rendering screens provided to a plurality of apparatuses, comprising: reception means for receiving operation input corresponding to the plurality of apparatuses; first rendering means for rendering a first screen for a particular viewpoint in a virtual space in which it is possible to move an avatar based on the operation input, wherein avatars corresponding to each of the plurality of apparatuses are simultaneously positioned in the virtual space; second rendering means for performing processing on particular content that is not an avatar based on the operation input and rendering a second screen for the content; and management means for managing management information indicating which of the first rendering means and the second rendering means to cause to render a screen to be provided to an apparatus in the plurality of apparatuses, wherein the first rendering means, in a case where an avatar corresponding to an apparatus to which the second screen is provided is included in a rendering scope for the particular viewpoint, renders the first screen, applying the second screen provided to the apparatus corresponding to the avatar to an object related to the avatar in the virtual space.
  • the present invention in its second aspect provides a method of controlling an information processing apparatus for rendering screens provided to a plurality of apparatuses, comprising: a receiving step of reception means of the information processing apparatus receiving operation input corresponding to the plurality of apparatuses; a first rendering step of first rendering means of the information processing apparatus rendering a first screen for a particular viewpoint in a virtual space in which it is possible to move an avatar based on the operation input, wherein avatars corresponding to each of the plurality of apparatuses are simultaneously positioned in the virtual space; a second rendering step of second rendering means of the information processing apparatus performing processing on particular content that is not an avatar based on the operation input and rendering a second screen for the content; and a managing step of management means of the information processing apparatus managing management information indicating which of the first rendering step and the second rendering step to use to render a screen to be provided to an apparatus in the plurality of apparatuses, wherein the first rendering means, in the first rendering step, in a case where an avatar corresponding to an apparatus to which the second screen is provided is included in a rendering scope for the particular viewpoint, renders the first screen, applying the second screen provided to the apparatus corresponding to the avatar to an object related to the avatar in the virtual space.
  • the present invention in its third aspect provides a rendering system comprising an information processing apparatus and a rendering apparatus for rendering screens provided to a plurality of apparatuses connected to the information processing apparatus, wherein the information processing apparatus comprises: reception means for receiving operation input corresponding to the plurality of apparatuses; and instruction means for making a rendering instruction to the rendering apparatus based on the operation input received by the reception means, and wherein the rendering apparatus comprises: first rendering means for rendering a first screen for a particular viewpoint in a virtual space in which it is possible to move an avatar based on the operation input, wherein avatars corresponding to each of the plurality of apparatuses are simultaneously positioned in the virtual space; and second rendering means for performing processing on particular content that is not an avatar based on the operation input and rendering a second screen for the content, and wherein either the information processing apparatus or the rendering apparatus further comprises management means for managing management information indicating which of the first rendering means and the second rendering means to cause to render a screen to be provided to an apparatus in the plurality of apparatuses.
  • Fig. 1 is a block diagram of a video game system architecture, according to a non-limiting embodiment of the present invention.
  • Fig. 2A is a block diagram showing various functional modules of a server system used in the video game system architecture of Fig. 1, according to one non-limiting embodiment of the present invention.
  • Fig. 2B is a block diagram showing various functional modules of a server system used in the video game system architecture of Fig. 1, according to another non-limiting embodiment of the present invention.
  • FIG. 3 is a block diagram showing various functional modules of a client device used in the video game system architecture of Fig. 1, according to one non-limiting embodiment of the present invention.
  • FIG. 4 is a flowchart showing steps in a main processing loop executed by the server system, in accordance with a non-limiting embodiment of the present invention.
  • FIG. 5 is a flowchart showing steps taken by a client device to decode, combine and display received images, in accordance with a non-limiting embodiment of the present invention.
  • FIG. 6 is a flowchart showing steps in a rendering processing executed by the server system, in accordance with a non-limiting embodiment of the present invention.
  • Figs. 7A to 7D show example screens to be provided to client devices.
  • Fig. 1 schematically shows a video game system architecture according to a non-limiting embodiment of the present invention, including a server system (or "server arrangement") 100 for delivering video game services over the Internet 130.
  • Individual client devices 120A, 120B, 120C, 120D, 120E can connect to the server system 100 by communication pathways established over a respective local access network 140A, 140B, 140C, 140D, 140E and the Internet 130.
  • the server system 100 may itself be connected to the Internet 130 by an access network, although to minimize latency, the server system 100 may connect directly to the Internet 130 without the intermediary of an access network.
  • server system 100 and the client devices 120A, 120B, 120C, 120D, 120E may be configured to be connected wirelessly or with wires without going through the Internet 130 or the local access networks 140A, 140B, 140C, 140D, or 140E.
  • the server system 100 may be configured so as to enable users of the client devices 120A, 120B, 120C, 120D, 120E to play a video game, either individually (i.e., a single-player video game) or in groups (i.e., a multiplayer video game).
  • Non-limiting examples of video games may include games that are played for leisure, education and/or sport.
  • a video game may but need not offer participants the possibility of monetary gain.
  • the server system 100 may comprise a single server or a cluster of servers connected through, for example, a virtual private network (VPN) and/or a data center. Individual servers within the cluster may be configured to carry out specialized functions. For example, one or more servers in the server system 100 may be primarily responsible for graphics rendering.
  • Figs. 2A and 2B illustrate two different non- limiting embodiments of the server system 100.
  • the server system 100 includes at least one server 200, 250 with a central processing unit (CPU) 202.
  • the CPU 202 may load service providing programs into a temporary storage area 206 from a persistent memory medium 204 or from a mass storage device 208.
  • the temporary storage area 206 may be a volatile random-access memory (RAM) such as SRAM or DRAM.
  • the persistent memory medium 204 may be a programmable nonvolatile memory such as EEPROM.
  • the persistent memory medium 204 may store other sets of programs (such as an operating system) and/or data required for the operation of various modules of the server 200, 250.
  • the mass storage device 208 may be an internal hard disk drive or a Flash memory detachable from the server 200, 250.
  • the mass storage device 208 may be accessible to the server 200, 250 over the Internet 130.
  • the mass storage device 208 may also serve as a database for storing information about participants involved in the video game, as well as other kinds of information that may be required to generate output for the various participants in the video game.
  • the service providing programs may include instructions for generating screens for various viewpoints in a virtual space in which a plurality of users participate, and in which avatars corresponding to the users can be moved by making operations.
  • the service providing programs may include instructions for generating game screens for a case where a user simply executes a single-player game or a head-to-head game. Data necessary for generation of the screens may be stored in, for example, the mass storage device 208.
  • the rendering of game screens may be executed by invoking one or more specialized processors referred to as graphics processing units (GPUs).
  • the GPUs may be co-located with the CPU 202 within the server 200 or they may be located on a separate server connected to the server 200 over the Internet 130.
  • Fig. 2A pertains to the case where graphics rendering capabilities are located within the same server 200 as the CPU 202 that executes the service providing program.
  • Each video memory (e.g., VRAM) 212A, 212B, 212C may be connected to a respective GPU 210A, 210B, 210C, and may provide temporary storage of picture element (pixel) values for display of a game screen.
  • data for one or more objects to be located in three-dimensional space may be loaded into a cache memory (not shown) of the GPU 210A, 210B, 210C.
  • This data may be rendered by the GPU 210A, 210B, 210C as data in two-dimensional space (e.g., an image), which may be stored in the video memory 212A, 212B, 212C.
  • the image is then output towards a recipient client device over the Internet 130.
  • the server 200 also provides a communication unit 220, which may exchange data with the client devices 120A, 120B, 120C, 120D, 120E over the Internet 130.
  • the communication unit 220 may receive user inputs from the client devices 120A, 120B, 120C, 120D, 120E over the Internet 130 and may transmit data (e.g. screen data) to the client devices 120A, 120B, 120C, 120D, 120E over the Internet 130.
  • the data transmitted to the client devices 120A, 120B, 120C, 120D, 120E may include encoded images of screens or portions thereof.
  • the communication unit 220 may convert data into a format compliant with a suitable communication protocol.
  • Fig. 2B pertains to the case where graphics rendering capabilities are distributed among one or more rendering server (s) 260A, 260B, 260C that are separate from a control server 250.
  • the control server 250 and the rendering servers 260A, 260B, 260C can be connected over a network such as the Internet 130.
  • the communication unit 222 may transmit control instructions to the rendering servers 260A, 260B, 260C over the Internet 130.
  • the rendering server 260A comprises a GPU 210D, a video memory (e.g., VRAM) 212D, a CPU 213D (not shown) and a communication unit 224.
  • the communication unit 224 receives the control instructions from the control server 250 over the Internet 130. Based on these control instructions, the CPU 213D performs necessary processing and causes the GPU 210D to execute screen rendering processing.
  • the rendered screen data are stored in a video memory 212D and then encoded and transmitted towards a recipient client device over the Internet 130 via the communication unit 224.
  • alternatively, the rendered screen data is transmitted via the control server 250 to a recipient client device.
  • the communication unit 224 may convert data into a format compliant with a suitable communication protocol.
  • as for the client devices 120A, 120B, 120C, 120D, 120E, their configuration is not particularly limited.
  • one or more of the client devices 120A, 120B, 120C, 120D, 120E may be, for example, a PC, a home-use game machine (console such as XBOX™, PS3™, Wii™, etc.), a portable game device, a smart television, a set-top box (STB), etc.
  • one or more of the client devices 120A, 120B, 120C, 120D, 120E may be a communication or computing device such as a mobile phone, a PDA, or a tablet.
  • Fig. 3 shows a general client device configuration for explaining explanatory embodiments of the present invention.
  • a client CPU 301 controls operation of blocks comprised in the client device.
  • the client CPU 301 controls operation of the blocks by reading out operation programs for the blocks stored in a client storage medium 302, loading them into a client RAM 303 and executing them.
  • the client storage medium 302 may be an HDD, a non-volatile ROM, or the like. Also, operation programs may be dedicated applications, browsing applications or the like.
  • the client RAM 303 may not just be a program loading area, and may also be used as a storage area for temporarily storing such things as intermediate data output in the operation of any of the blocks.
  • a client communication unit 304 is a communication interface comprised in the client device.
  • the client communication unit 304 receives encoded screen data of the provided service from the server system 100 via the Internet 130. Also, the client communication unit 304 transmits information of operation input by the user on a client device via the Internet 130 to the server system 100.
  • the client decoder 305 decodes encoded screen data received by the client communication unit 304 and generates screen data. The generated screen data is presented to the user of the client device by being output to a client display 306 and displayed. Note, it is not necessary that the client device has the client display 306, and the client display 306 may be an external display apparatus connected to the client device.
  • the client input unit 307 is a user interface comprised in the client device.
  • the client input unit 307 may include input devices (such as a touch screen, a keyboard, a game controller, a joystick, etc.), and detect operation input by the user.
  • for the detected operation input, integrated data may be transmitted via the client communication unit 304 to the server system 100, and may be transmitted as information indicating that a particular operation input was performed after analyzing the operation content.
  • the client input unit 307 may include other sensors (e.g., Kinect™) that include a camera or the like and detect, as operation input, a motion of a particular object or a body motion made by the user.
  • each of the client devices may include a loudspeaker for outputting audio.
  • two types of screens are provided to the client device: i) screens for a virtual space, and ii) game screens for when the user performs game play.
  • the server system provides, as a user communication area, a virtual space that a plurality of client device users can simultaneously participate in (Fig. 7A).
  • This virtual space need not be provided as a game, and may be taken as something having a visual effect that is used as a tool for simply supporting communication or for improving the user experience of communication.
  • Each user can operate and move within the space a corresponding avatar which is positioned in the virtual space.
  • a screen for a viewpoint set in the space is provided to the client device of the user.
  • the viewpoint may be selected from among preset fixed viewpoints, or may be selectively changeable by the user, or be something that is changed in accordance with a movement (rotation) operation on the avatar by the user.
  • the virtual space is assumed to be a place such as an amusement arcade.
  • a user is not only able to cause an avatar to walk around in the space, but is also able, by causing the avatar to move to a position corresponding to an arcade video game machine object arranged in the space, to play a game assigned to that machine (Fig. 7B).
  • game screens for the game being played are provided to the client device of the user.
  • the game screens may be screens provided for a full screen of the client device, for example. While the user is playing a game of the arcade video game machine, operation performed by the user on the client device is recognized as operation for the game and not for movement of the avatar in the virtual space. When the server system receives this operation, it executes processing necessary for the game, updates parameters within the game, and renders a game screen for a next frame.
  • the arcade video game machine object has a part corresponding to a display.
  • a game screen being provided to the user may be applied as a texture to the display unit of the arcade video game machine arranged in front of the avatar, for example.
  • a user being provided screens of the virtual space can see the game play of other users in the space, just like in a real amusement arcade (Fig. 7D).
  • the hands of the avatar of the user may also be moved in accordance with the operation input. With this, a user being provided screens of the virtual space can see not only the screens of the game play of another user but also the operation for the game play.
  • configuration is made such that a user playing a game of an arcade video game machine is capable of stopping or pausing the game from a menu within the game for example. By stopping or pausing, the user once again becomes able to operate the avatar in the virtual space.
  • configuration may be made such that in a case where the game is paused, the user that was playing the game, or a user being provided screens of the virtual space, is able to confirm the paused game screen in the display unit of the arcade video game machine corresponding to the screen in the virtual space.
  • the server system retains the rendered game screen when the pause was performed, for example, as a texture, and subsequently uses it in rendering of the virtual space.
  • a paused game can be resumed when the avatar of the user that paused the game is once again moved to a position corresponding to the arcade video game machine object. Also, configuration may be made so that resuming of the game, continuing from the paused state, is possible when an avatar of another user is moved to the corresponding position, for example.
  • communication mode: the mode in which screens of the virtual space are provided
  • game mode: the mode in which a game is playable and game screens are provided
  • the service providing program executed by the CPU 202 renders screens according to the current mode and transmits them to the client devices 120A, 120B, 120C, 120D, 120E.
  • FIG. 4 conceptually illustrates the steps in a main processing loop (or main game loop) of the service providing program executed by the CPU 202 in the server system 100.
  • the main game loop may be executed for each client device receiving service provision.
  • the main game loop may include steps 410 to 460, which are described below in further detail, in accordance with a non-limiting embodiment of the present invention.
  • the main game loop executes continually at a certain frame rate.
  • the CPU 202 receives information of operation input performed on a client device via the communication unit 220.
  • the operation input may be information of a difference with respect to an input state of a previous frame, for example, and it may be information, detected on the client device, indicating that a particular operation was performed.
  • the inputs may include the set of inputs received from all participants in the game, not just the inputs of the user of the client device in question.
  • step 310 may be omitted.
  • the CPU 202 updates various parameters for screen configuration based on received information of operation input. Also, the CPU 202 updates parameters not based on operation input but rather based on information that changes along with the passing of time.
  • the various parameters updated in this step may be, in a case where the current mode is communication mode, information indicating position or stance information of an avatar positioned in the virtual space, for example, or whether or not a game play state has been entered. Also, in a case where the current mode is game mode, for example, the parameters may be for positions of operation characters and NPCs (Non-Player Characters) in the game, or stance information for them.
  • the CPU 202 determines rendering contents of a screen to be provided to the client device. Specifically, when the current mode is the communication mode, the CPU 202, based on the viewpoint and direction for which the rendering is performed, specifies the objects included in the portion of the virtual space that would be "visible" from the perspective originating at the viewpoint, and determines rendering content.
  • the CPU 202, when the current mode is the game mode, determines a rendering target and rendering content in accordance with the screen configuration defined in the game.
  • in step 440 the CPU 202, based on the rendering content determined in step 430, renders a screen.
  • the communication unit 220 encodes screen data rendered by a GPU and stored in a VRAM.
  • the encoding is not limited to encoding based on a static image encoding standard, and may be based on a moving image encoding standard.
  • the encoding process used to encode a particular image may or may not apply cryptographic encryption.
  • the communication unit 220 transmits the encoded image via the Internet 130 to the client device. In this way, screen creation processing for one frame completes. A non-limiting sketch of this per-frame loop is given below.
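As a hedged illustration of the main game loop of Fig. 4, the following Python sketch paces one per-client iteration through steps 410 to 460. The `session`, `comm_unit`, and `gpu` objects, their method names, and the 30 fps target are assumptions introduced for illustration, not part of the patent.

```python
import time

FRAME_INTERVAL = 1.0 / 30.0  # assumed target frame rate of 30 fps


def main_game_loop(session, comm_unit, gpu, running=lambda: True):
    """One service-providing loop, executed per connected client device."""
    while running():
        start = time.monotonic()
        inputs = comm_unit.receive_inputs(session.client_id)   # step 410
        session.update_parameters(inputs)                      # step 420
        contents = session.determine_rendering_contents()      # step 430
        screen = gpu.render(contents)                          # step 440 (into VRAM)
        encoded = comm_unit.encode(screen)                     # step 450
        comm_unit.transmit(session.client_id, encoded)         # step 460
        # Pace the loop so that it executes continually at the frame rate.
        time.sleep(max(0.0, FRAME_INTERVAL - (time.monotonic() - start)))
```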
  • Fig. 5 shows operation of the client device associated with a given user, which may be any of client devices 120A, 120B, 120C, 120D, 120E.
  • the client CPU 301 receives an encoded image (encoded screen data) from the server system 100 via the client communication unit 304. Any associated audio may also be received at this time.
  • the client decoder 305 decodes the received encoded image.
  • the identity or version of the compression algorithm used to encode the image may be specified in the content of one or more packets that convey the image.
  • the client CPU 301 processes the (decoded) image. This can include placing the decoded image in a buffer (e.g., the client RAM 303), performing error correction, combining multiple successive images, performing alpha blending, interpolating missing portions of the image, and so on. The result can be a final image to be presented to the user, one per frame.
  • the client CPU 301 outputs the final image to the client display 306 and displays it. For example, in the case of video, a composite video frame is displayed on the client display 306. In addition, associated audio may also be played, and any other output synchronized with the video frame may similarly be triggered. A non-limiting sketch of these client-side steps is given below.
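The client-side steps of Fig. 5 could be sketched as follows; the packet fields and the `comm_unit`, `decoder`, `display`, and `frame_buffer` objects are hypothetical stand-ins for the client communication unit 304, client decoder 305, client display 306, and a buffer in the client RAM 303.

```python
def client_frame_step(comm_unit, decoder, display, frame_buffer):
    packet = comm_unit.receive()              # encoded screen data (and audio)
    image = decoder.decode(packet.video)      # client decoder 305
    frame_buffer.append(image)                # buffered in client RAM 303
    # Error correction, combination of successive images, alpha blending,
    # and interpolation of missing portions would be applied here.
    final = frame_buffer[-1]
    display.show(final)                       # client display 306
    if getattr(packet, "audio", None) is not None:
        display.play_audio(packet.audio)      # synchronized audio output
```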
[0057] V. Specific Description of Non-Limiting Embodiments

  • Fig. 6 is a flowchart for exemplifying details of rendering processing for a screen that the server system 100 provides to a client device. This processing is performed for generation of one frame in the screen provision, and is executed simultaneously for each of the client devices connected to the server system 100.
  • in step S601 the CPU 202 determines whether the mode which is set for the client device to which the screen is provided is the communication mode or the game mode.
  • the CPU 202 moves the processing to step S605 in a case where it determines that the client device mode is the communication mode, and moves the processing to step S602 in a case where it determines that it is the game mode.
  • in step S602 the CPU 202, after having executed a game program for the game being played on the client device and performing various processing, causes one of GPU 210A, 210B, 210C, or 210D to render a game screen.
  • in step S603 the CPU 202 reads out the game screen stored in a VRAM in the rendering, transmits it to the communication unit 220, and causes encoding processing to be executed.
  • in step S604 the communication unit 220 transmits the encoded game screen to the client device, and the processing completes.
  • in a case where the communication mode was determined in step S601, the CPU 202, in step S605, identifies rendering target objects from among the objects arranged in the virtual space.
  • step S606 the CPU 202 determines whether or not an avatar of another user performing game play is included in the rendering target objects.
  • the CPU 202 moves the processing on to step S607 in a case where it determines that an avatar of another user performing game play is included, and moves the processing to step S608 in a case where it determines that one is not included.
  • step S607 the CPU 202 performs screen rendering, applying a corresponding game screen as a texture to the display unit of the arcade video game machine object on which the game play is performed in the rendering of the screen of the virtual space caused to be performed by one of GPU 210A, 210B, 210C, or 210D.
  • the GPU performing the rendering, under the control of the CPU 202, reads data of the corresponding game screen from the temporary storage area 206, loads it into a cache memory of the GPU, and uses it in the rendering. Also, in a case where an arcade video game machine object in a state in which game play is paused is included in the rendering target, rendering is performed applying a corresponding game screen as a texture to the display unit of the object.
  • in step S608 the CPU 202 reads out the screen of the virtual space stored in a VRAM in the rendering, transmits it to the communication unit 220, and causes encoding processing to be executed.
  • in step S609 the communication unit 220 transmits the encoded screen to the client device, and the processing completes. A non-limiting sketch of this mode-dependent rendering flow is given below.
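A rough sketch of the Fig. 6 branch between game mode (S601 to S604) and communication mode (S605 to S609) might look like this; all helper functions and attribute names are assumptions made for illustration.

```python
def render_frame_for_client(client, gpu, comm_unit):
    if client.mode == "game":                                     # S601
        screen = run_game_and_render(client, gpu)                 # S602
        comm_unit.transmit(client.id, comm_unit.encode(screen))   # S603, S604
        return
    # Communication mode:
    targets = identify_visible_objects(client.viewpoint)          # S605
    for obj in targets:                                           # S606
        avatar = getattr(obj, "avatar", None)
        if avatar is not None and avatar.in_game_play:            # S607
            # Apply that user's current (or paused) game screen as a
            # texture to the display unit of the arcade machine object.
            obj.machine.display_part.texture = latest_game_screen(avatar.user)
    screen = gpu.render_virtual_space(client.viewpoint, targets)  # S608
    comm_unit.transmit(client.id, comm_unit.encode(screen))       # S609
```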
  • game screens specific to game play can be provided to a user of a client device in game mode, and screens by which it is possible to spectate the game play of another user in the virtual space can be provided to users of client devices in the communication mode.
  • users of client devices in the communication mode can observe that another user in the virtual space is performing game play, and also can observe that game play.
  • note, the game screens of a user performing game play are not limited to forms in which they are provided by simply being applied as textures to objects in the virtual space. Configuration may be made so that a user can receive provision of a more detailed display of the game screens by performing a selection operation on the avatar in the game play or on an arcade video game machine object, for example.
  • in this case, the game screen, and a screen showing the state of the hands of the avatar changing based on the information of operation input from the client device for the game, may be presented to the user arranged within a single new screen.
  • the user being provided the game screen and the user being provided the new screen may be able to interact with each other using at least one of audio input and character input. That is, the audio input or the character input is shared between the users, and the users can then be in communication with each other while viewing the same game screen.
  • note, a texture of a game screen applied to a display unit of an arcade video game machine object does not necessarily have to be used in the rendering of a virtual space screen of the same frame as the frame in which the game screen was generated.
  • that is, the game screens used in the rendering of virtual space screens may be game screens transmitted to the client device in a preceding frame.
  • a plurality of game screens may be accumulated in the Temporary Storage Area 206 and used in order in the virtual space screen rendering, as in the sketch below.
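A minimal sketch of such buffering, assuming a simple per-machine queue held in the temporary storage area 206 (the class, its methods, and the size limit are illustrative assumptions):

```python
from collections import deque


class GameScreenBuffer:
    """Hypothetical per-machine queue of rendered game screens."""

    def __init__(self, maxlen=4):
        self._frames = deque(maxlen=maxlen)
        self._last = None

    def push(self, screen):
        self._frames.append(screen)

    def next_texture(self):
        # Consume queued screens in order; if none is pending, keep
        # showing the most recently used screen.
        if self._frames:
            self._last = self._frames.popleft()
        return self._last
```

Under this sketch, a machine's display texture may lag the frame in which the game screen was generated without stalling the virtual-space rendering.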
  • the information processing apparatus and the method of controlling the information processing apparatus according to the present invention are realizable by a program executing the methods on a computer.
  • the program is providable/distributable by being stored on a computer-readable storage medium or through an electronic communication line.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Information Transfer Between Computers (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An information processing apparatus receives operation input corresponding to a plurality of devices and renders a first screen for a particular viewpoint in a virtual space in which it is possible to move an avatar based on the operation input, wherein avatars corresponding to each of the plurality of devices are simultaneously positioned in the virtual space. Meanwhile, the apparatus also performs processing on particular content that is not an avatar based on the operation input and renders a second screen for the content. The apparatus, in a case where an avatar corresponding to an apparatus to which the second screen is provided is included in a rendering scope for the particular viewpoint, renders the first screen, applying the second screen provided to the apparatus corresponding to the avatar to an object related to the avatar in the virtual space.

Description

DESCRIPTION
TITLE OF INVENTION INFORMATION PROCESSING APPARATUS, CONTROL METHOD, PROGRAM, STORAGE MEDIUM, AND RENDERING SYSTEM
TECHNICAL FIELD
[0001] The present invention relates to an
information processing apparatus, a control method, a program, a storage medium, and a rendering system, and in particular to a screen composing technique for screens of a virtual space in which a plurality of people can simultaneously participate.
BACKGROUND ART
[0002] Video games have become a common source of entertainment for virtually every segment of the population. The video game industry has seen
considerable evolution, from the introduction of standalone arcade games, to home-based computer games and the emergence of games made for specialized consoles.
[0003] Meanwhile, content such as video games, in which provided screens are rendered by an apparatus, is not limited to specialized consoles and can be provided for general information processing apparatuses such as PCs and smartphones as well. In recent years, many common information processing apparatuses, including game consoles, have connectivity functions for connecting to a network such as the Internet, and the aforementioned content can provide services and entertainment using network communication. For example, one kind of service that uses network communication is a massively multiplayer type network game such as an MMORPG (Massively Multiplayer Online Role-Playing Game).
[0004] Also, in recent years, provision systems exist in which, rather than screen rendering being executed on a user apparatus, screens are rendered on a connected external apparatus such as a server. In such a case, a provision experience similar to the case where the user apparatus renders the screens can be realized by the user apparatus receiving and displaying data of screens rendered by the external apparatus.
So-called cloud-based rendering systems, in which screen provision by such kinds of external apparatuses is performed via a network such as the Internet to which many users can simultaneously connect, have been getting attention in recent years. There exist cloud-based gaming systems that use such systems in the video game industry as well.
[0005] In a cloud-based gaming system, a player can utilize an ordinary Internet-enabled device such as a smartphone or tablet to connect to a video game server on the Internet. The video game server starts a session for the player and renders images for the player based on selections (e.g., moves, actions) made by the player and other attributes of the game. The images are delivered to the player's device over the Internet, and are reproduced on the display. In this way, players from anywhere in the world can play a video game without the use of specialized video game consoles, software or graphics processing hardware.
SUMMARY OF INVENTION
[0006] The present invention provides an information processing apparatus, a control method, a program, a storage medium and a rendering system for improving user experiences in a virtual space in which a
plurality of users can simultaneously participate.
[0007] Various non-limiting aspects of the invention are set out in the following clauses:
The present invention in its first aspect provides an information processing apparatus for rendering screens provided to a plurality of apparatuses,
comprising: reception means for receiving operation input corresponding to the plurality of apparatuses; first rendering means for rendering a first screen for a particular viewpoint in a virtual space in which it is possible to move an avatar based on the operation input, wherein avatars corresponding to each of the plurality of apparatuses are simultaneously positioned in the virtual space; second rendering means for performing processing on particular content that is not an avatar based on the operation input and rendering a second screen for the content; and management means for managing management information indicating which of the first rendering means and the second rendering means to cause to render a screen to be provided to an apparatus in the plurality of apparatuses, wherein the first rendering means, in a case where an avatar
corresponding to an apparatus to which the second screen is provided is included in a rendering scope for the particular viewpoint, renders the first screen, applying the second screen provided to the apparatus corresponding to the avatar to an object related to the avatar in the virtual space.
[0008] The present invention in its second aspect provides a method of controlling an information
processing apparatus for rendering screens provided to a plurality of apparatuses, comprising: a receiving step of reception means of the information processing apparatus receiving operation input corresponding to the plurality of apparatuses; a first rendering step of first rendering means of the information processing apparatus rendering a first screen for a particular viewpoint in a virtual space in which it is possible to move an avatar based on the operation input, wherein avatars corresponding to each of the plurality of apparatuses are simultaneously positioned in the virtual space; a second rendering step of second rendering means of the information processing apparatus performing processing on particular content that is not an avatar based on the operation input and rendering a second screen for the content; and a managing step of management means of the information processing
apparatus managing management information indicating which of the first rendering step and the second rendering step to use to render a screen to be provided to an apparatus in the plurality of apparatuses, wherein the first rendering means, in the first
rendering step, in a case where an avatar corresponding to an apparatus to which the second screen is provided is included in a rendering scope for the particular viewpoint, renders the first screen, applying the second screen provided to the apparatus corresponding to the avatar to an object related to the avatar in the virtual space.
[0009] The present invention in its third aspect provides a rendering system comprising an information processing apparatus and a rendering apparatus for rendering screens provided to a plurality of
apparatuses connected to the information processing apparatus, wherein the information processing apparatus comprises: reception means for receiving operation input corresponding to the plurality of apparatuses; and instruction means for making a rendering instruction to the rendering apparatus based on the operation input received by the reception means, and wherein the rendering apparatus comprises: first
rendering means for rendering a first screen for a particular viewpoint in a virtual space in which it is possible to move an avatar based on the operation input, wherein avatars corresponding to each of the plurality of apparatuses are simultaneously positioned in the virtual space; and second rendering means for
performing processing on particular content that is not an avatar based on the operation input and rendering a second screen for the content, and wherein either the information processing apparatus or the rendering apparatus further comprises management means for
managing management information indicating which of the first rendering means and the second rendering means to cause to render a screen to be provided to an apparatus in the plurality of apparatuses, and wherein the first rendering means, in a case where an avatar
corresponding to an apparatus to which the second screen is provided is included in a rendering scope for the particular viewpoint, renders the first screen, applying the second screen, provided to the apparatus corresponding to the avatar, to an object related to the avatar in the virtual space.
[0010] Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
BRIEF DESCRIPTION OF DRAWINGS
[0011] In the accompanying drawings:
Fig. 1 is a block diagram of a video game system architecture, according to a non-limiting embodiment of the present invention.
[0012] Fig. 2A is a block diagram showing various functional modules of a server system used in the video game system architecture of Fig. 1, according to one non-limiting embodiment of the present invention.
[0013] Fig. 2B is a block diagram showing various functional modules of a server system used in the video game system architecture of Fig. 1, according to another non-limiting embodiment of the present
invention.
[0014] Fig. 3 is a block diagram showing various functional modules of a client device used in the video game system architecture of Fig. 1, according to one non-limiting embodiment of the present invention.
[0015] Fig. 4 is a flowchart showing steps in a main processing loop executed by the server system, in accordance with a non-limiting embodiment of the present invention.
[0016] Fig. 5 is a flowchart showing steps taken by a client device to decode, combine and display received images, in accordance with a non-limiting embodiment of the present invention.
[0017] Fig. 6 is a flowchart showing steps in a rendering processing executed by the server system, in accordance with a non-limiting embodiment of the present invention.
[0018] Figs. 7A to 7D show example screens to be provided to client devices.
[0019] It is to be expressly understood that the description and drawings are only for the purpose of illustration of certain embodiments of the invention and are an aid for understanding. They are not
intended to be a definition of the limits of the invention.
DESCRIPTION OF EMBODIMENTS
[0020] Detailed explanation for explanatory embodiments of the present invention will be given with reference to the drawings. Note, the embodiments explained below describe examples of an information processing apparatus and a rendering system wherein the present invention is adopted to a server system capable of rendering game screens or screens for a virtual space provided to a plurality of client devices. However, the present invention can also be adopted to various devices that can render screens provided to a plurality of apparatuses. In other words, the present invention is not limited to an apparatus or system that provides rendered game screens.
[0021] I. Cloud Gaming Architecture
Fig. 1 schematically shows a video game system architecture according to a non-limiting embodiment of the present invention, including a server system (or "server arrangement") 100 for delivering video game services over the Internet 130. Individual client devices 120A, 120B, 120C, 120D, 120E can connect to the server system 100 by communication pathways established over a respective local access network 140A, 140B, 140C, 140D, 140E and the Internet 130. The server system 100 may itself be connected to the Internet 130 by an access network, although to minimize latency, the server system 100 may connect directly to the Internet 130 without the intermediary of an access network.
Alternatively, the server system 100 and the client devices 120A, 120B, 120C, 120D, 120E may be configured to be connected wirelessly or with wires without going through the Internet 130 or the local access networks 140A, 140B, 140C, 140D, or 140E.
[0022] As will become apparent from the description herein below, the server system 100 may be configured so as to enable users of the client devices 120A, 120B, 120C, 120D, 120E to play a video game, either
individually (i.e., a single-player video game) or in groups (i.e., a multiplayer video game). Non-limiting examples of video games may include games that are played for leisure, education and/or sport. A video game may but need not offer participants the
possibility of monetary gain.
[0023] The server system 100 may comprise a single server or a cluster of servers connected through, for example, a virtual private network (VPN) and/or a data center. Individual servers within the cluster may be configured to carry out specialized functions. For example, one or more servers in the server system 100 may be primarily responsible for graphics rendering.
[0024] Figs. 2A and 2B illustrate two different non-limiting embodiments of the server system 100. In both cases, the server system 100 includes at least one server 200, 250 with a central processing unit (CPU) 202. The CPU 202 may load service providing programs into a temporary storage area 206 from a persistent memory medium 204 or from a mass storage device 208. In a non-limiting embodiment, the temporary storage area 206 may be a volatile random-access memory (RAM) such as SRAM or DRAM. In a non-limiting embodiment, the persistent memory medium 204 may be a programmable nonvolatile memory such as EEPROM. Also, in addition to storing the service providing program, the persistent memory medium 204 may store other sets of programs (such as an operating system) and/or data required for the operation of various modules of the server 200, 250. In a non-limiting embodiment, the mass storage device 208 may be an internal hard disk drive or a Flash memory detachable from the server 200, 250. In another
embodiment, the mass storage device 208 may be
accessible to the server 200, 250 over the Internet 130. The mass storage device 208 may also serve as a
database for storing information about participants involved in the video game, as well as other kinds of information that may be required to generate output for the various participants in the video game.
[0025] Once the service providing programs are loaded into the temporary storage area 206, they may be executed by the CPU 202. The service providing programs may include instructions for generating screens for various viewpoints in a virtual space in which a
plurality of users participate, and in which avatars corresponding to the users can be moved by making operations. The service providing programs may include instructions for generating game screens for a case where a user simply executes a single-player game or a head-to-head game. Data necessary for generation of the screens may be stored in, for example, the mass storage device 208. The rendering of game screens may be
executed by invoking one or more specialized processors referred to as graphics processing units (GPUs).
Depending on the embodiment, the GPUs may be co-located with the CPU 202 within the server 200 or they may be located on a separate server connected to the server 200 over the Internet 130.
[0026] Fig. 2A pertains to the case where graphics rendering capabilities are located within the same server 200 as the CPU 202 that executes the service providing program. Each video memory (e.g., VRAM) 212A, 212B, 212C may be connected to a respective GPU 210A, 210B, 210C, and may provide temporary storage of picture element (pixel) values for display of a game screen. When performing rendering, data for one or more objects to be located in three-dimensional space may be loaded into a cache memory (not shown) of the GPU 210A, 210B, 210C. This data may be rendered by the GPU 210A, 210B, 210C as data in two-dimensional space (e.g., an image), which may be stored in the video memory 212A, 212B, 212C. The image is then output towards a
recipient client device over the Internet 130.
[0027] The server 200 also provides a communication unit 220, which may exchange data with the client devices 120A, 120B, 120C, 120D, 120E over the Internet 130. Specifically, the communication unit 220 may receive user inputs from the client devices 120A, 120B, 120C, 120D, 120E over the Internet 130 and may transmit data (e.g. screen data) to the client devices 120A, 120B, 120C, 120D, 120E over the Internet 130. The data transmitted to the client devices 120A, 120B, 120C, 120D, 120E may include encoded images of screens or portions thereof. Where necessary or appropriate, the communication unit 220 may convert data into a format compliant with a suitable communication protocol.
[0028] Fig. 2B pertains to the case where graphics rendering capabilities are distributed among one or more rendering server(s) 260A, 260B, 260C that are separate from a control server 250. The control server 250 and the rendering servers 260A, 260B, 260C can be connected over a network such as the Internet 130. In an embodiment, there may be one or several rendering servers 260A, 260B, 260C, and in the case where there is more than one rendering server, the rendering servers 260A, 260B, 260C may be distributed over any suitable geographic area. In the case of Fig. 2B, the communication unit 222 may transmit control
instructions to the rendering servers 260A, 260B, 260C over the Internet 130.
[0029] The rendering server 260A comprises a GPU 210D, a video memory (e.g., VRAM) 212D, a CPU 213D (not shown) and a communication unit 224. The communication unit 224 receives the control instructions from the control server 250 over the Internet 130. Based on these control instructions, the CPU 213D performs necessary processing and causes the GPU 210D to execute screen rendering processing. The rendered screen data are stored in a video memory 212D and then encoded and transmitted towards a recipient client device over the Internet 130 via the communication unit 224.
Alternatively, the rendered screen data is then
transmitted via the control server 250 to a recipient client device. Similarly to the case of Fig. 2A, the communication unit 224 may convert data into a format compliant with a suitable communication protocol.
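As a hedged sketch of the Fig. 2B split, the control server 250 might forward rendering instructions that a rendering server acts on as below; the instruction format and every object and method name are assumptions, since the patent does not specify a protocol.

```python
import json


def control_server_tick(session, comm_unit_222):
    # Control server 250: send a rendering instruction for this client.
    instruction = {"client": session.client_id,
                   "viewpoint": session.viewpoint,
                   "contents": session.rendering_contents}
    comm_unit_222.send(json.dumps(instruction))


def rendering_server_tick(comm_unit_224, cpu, gpu):
    # Rendering server 260A: act on a received control instruction.
    instruction = json.loads(comm_unit_224.receive())
    scene = cpu.prepare(instruction)        # processing by CPU 213D
    screen = gpu.render(scene)              # GPU 210D renders into VRAM 212D
    encoded = comm_unit_224.encode(screen)
    # Either directly to the client device, or back via the control
    # server 250, as the embodiment allows.
    comm_unit_224.transmit(instruction["client"], encoded)
```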
[0030] Turning now to the client devices 120A, 120B, 120C, 120D, 120E, their configuration is not
particularly limited. In some embodiments, one or more of the client devices 120A, 120B, 120C, 120D, 120E may be, for example, a PC, a home-use game machine (console such as XBOX™, PS3™, Wii™, etc.), a portable game device, a smart television, a set-top box (STB), etc. In other embodiments, one or more of the client devices 120A, 120B, 120C, 120D, 120E may be a communication or computing device such as a mobile phone, a PDA, or a tablet.
[0031] Fig. 3 shows a general client device
configuration for explaining explanatory embodiments of the present invention. A client CPU 301 controls operation of blocks comprised in the client device.
The client CPU 301 controls operation of the blocks by reading out operation programs for the blocks stored in a client storage medium 302, loading them into a client RAM 303 and executing them. The client storage medium 302 may be an HDD, a non-volatile ROM, or the like. Also, operation programs may be dedicated applications, browsing applications or the like. The client RAM 303 may not just be a program loading area, and may also be used as a storage area for temporarily storing such things as intermediate data output in the operation of any of the blocks.
[0032] A client communication unit 304 is a
communication interface comprised in the client device. In the present embodiment, the client communication unit 304 receives encoded screen data of the provided service from the server system 100 via the Internet 130. Also, the client communication unit 304 transmits information of operation input by the user on a client device via the Internet 130 to the server system 100. The client decoder 305 decodes encoded screen data received by the client communication unit 304 and generates screen data. The generated screen data is presented to the user of the client device by being output to a client display 306 and displayed. Note, it is not necessary that the client device has the client display 306, and the client display 306 may be an external display apparatus connected to the client device.
[0033] The client input unit 307 is a user interface comprised in the client device. The client input unit 307 may include input devices (such as a touch screen, a keyboard, a game controller, a joystick, etc.), and detect operation input by the user. For the detected operation input, integrated data may be transmitted via the client communication unit 304 to the server system 100, and may be transmitted as information indicating that a particular operation input was performed after analyzing the operation content. Also, the client input unit 307 may include other sensors (e.g., Kinect™) that include a camera or the like and detect, as operation input, a motion of a particular object or a body motion made by the user. In addition, each of the client devices may include a loudspeaker for outputting audio.
[0034] II. Types of Screens Provided by the Server
System
In the server system of the present embodiment, as explained previously, two types of screens are provided to the client device: i) screens for a virtual space, and ii) game screens for when the user performs game play.
[0035] Here, explanation of an outline of services provided by the server system of the present embodiment to the client device user will be given.
[0036] The server system provides, as a user
communication area, a virtual space that a plurality of client device users can simultaneously participate in (Fig. 7A). This virtual space need not be provided as a game, and may be taken as something having a visual effect that is used as a tool for simply supporting communication or for improving the user experience of communication. Each user can operate and move within the space a corresponding avatar which is positioned in the virtual space.
[0037] When a user operates an avatar in the virtual space, a screen for a viewpoint set in the space is provided to the client device of the user. The
viewpoint may be selected from among preset fixed viewpoints, or may be selectively changeable by the user, or be something that is changed in accordance with a movement (rotation) operation on the avatar by the user.
[0038] In the present embodiment, the virtual space is assumed to be a place such as an amusement arcade. A user is not only able to cause an avatar to walk around in the space, but is also able, by causing the avatar to move to a position corresponding to an arcade video game machine object arranged in the space, to play a game assigned to that machine (Fig. 7B). In this case, in place of screens of the virtual space, game screens for the game being played are provided to the client device of the user.
[0039] As shown in Fig. 7C, the game screens may be screens provided for a full screen of the client device, for example. While the user is playing a game of the arcade video game machine, operation performed by the user on the client device is recognized as operation for the game and not for movement of the avatar in the virtual space. When the server system receives this operation, it executes processing necessary for the game, updates parameters within the game, and renders a game screen for a next frame.
[0040] Note that the arcade video game machine object has a part corresponding to a display. In a case where, in a screen of the virtual space, an avatar of a user is playing a game, the game screen being provided to that user may be applied as a texture to the display unit of the arcade video game machine arranged in front of the avatar, for example. In other words, a user being provided screens of the virtual space can see the game play of other users in the space, just like in a real amusement arcade (Fig. 7D). Because information of operation input from the client device of a user performing game play is being received, the hands of that user's avatar may also be moved in accordance with the operation input. With this, a user being provided screens of the virtual space can see not only the screens of another user's game play but also the operations for that game play.
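A non-limiting sketch of this texture substitution follows; ArcadeMachine, latest_game_screens, and texture_arcade_displays are hypothetical names standing in for the processing described above.

```python
class ArcadeMachine:
    """An arcade video game machine object with a display part."""
    def __init__(self, machine_id):
        self.machine_id = machine_id
        self.display_texture = None  # texture shown on the display part

def texture_arcade_displays(render_targets, latest_game_screens):
    """Before rasterizing the virtual-space screen, apply to each machine's
    display part the game screen most recently rendered for the user
    playing on it (a copy retained when that screen was transmitted)."""
    for obj in render_targets:
        if isinstance(obj, ArcadeMachine):
            screen = latest_game_screens.get(obj.machine_id)
            if screen is not None:
                obj.display_texture = screen
```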
[0041] Also, configuration is made such that a user playing a game on an arcade video game machine is capable of stopping or pausing the game, from a menu within the game, for example. By stopping or pausing, the user once again becomes able to operate the avatar in the virtual space. Note that configuration may be made such that, in a case where the game is paused, the user who was playing the game, or a user being provided screens of the virtual space, is able to confirm the paused game screen on the display unit of the arcade video game machine shown in the virtual-space screen. In other words, the server system retains the game screen rendered when the pause was performed, as a texture for example, and subsequently uses it in rendering the virtual space.
[0042] A paused game can be resumed when the avatar of the user who paused the game is once again moved to a position corresponding to the arcade video game machine object. Configuration may also be made so that resuming the game from the paused state is possible when an avatar of another user is moved to the corresponding position, for example.
[0043] In the following explanation, the previously described mode in which screens of the virtual space are provided is referred to as the communication mode, and the mode in which a game is playable and game screens are provided is referred to as the game mode. Also, users and avatars in the communication mode are referred to as communication users and communication avatars respectively, and users and avatars in the game mode are referred to as playing users and playing avatars respectively.
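Expressed schematically, the server tracks, per client, which of the two modes is active and routes operation input accordingly. The following non-limiting sketch illustrates that bookkeeping; the triggering conditions are simplified and all helper names are hypothetical.

```python
from enum import Enum, auto

class ClientMode(Enum):
    COMMUNICATION = auto()  # operations move the avatar in the virtual space
    GAME = auto()           # operations are recognized as input for the game

client_modes = {}  # client_id -> ClientMode

def apply_game_input(client_id, operation): ...        # hypothetical stub
def move_avatar(client_id, operation): ...             # hypothetical stub
def avatar_at_arcade_machine(client_id): return False  # hypothetical stub

def route_operation(client_id, operation):
    mode = client_modes.get(client_id, ClientMode.COMMUNICATION)
    if mode is ClientMode.GAME:
        if operation in ("stop", "pause"):
            # Stopping or pausing returns avatar control to the user.
            client_modes[client_id] = ClientMode.COMMUNICATION
        else:
            apply_game_input(client_id, operation)
    else:
        move_avatar(client_id, operation)
        if avatar_at_arcade_machine(client_id):
            # Reaching a machine's position starts (or resumes) its game.
            client_modes[client_id] = ClientMode.GAME
```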
[0044] III. Screen Creation by Server System
As mentioned earlier, the service providing program executed by the CPU 202 renders screens according to the current mode and transmits them to the client devices 120A, 120B, 120C, 120D, 120E.
[0045] Reference is now made to Fig. 4, which conceptually illustrates the steps in a main processing loop (or main game loop) of the service providing program executed by the CPU 202 in the server system 100. The main game loop may be executed for each client device receiving service provision. The main game loop may include steps 410 to 460, which are described below in further detail, in accordance with a non-limiting embodiment of the present invention. The main game loop executes continually at a certain frame rate.
[0046] At step 410, the CPU 202 receives information of operation input performed on a client device via the communication unit 220. The operation input may, for example, be information of a difference with respect to the input state of a previous frame, or it may be information, detected on the client device, indicating that a particular operation was performed. In some embodiments, and particularly in a multi-player video game, the inputs may include the set of inputs received from all participants in the game, not just the participant for whom the main game loop is being executed. In the case where no inputs have been provided, step 410 may be omitted.
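A non-limiting sketch of the "difference with respect to the input state of a previous frame" form of operation input follows; the flat dictionary state layout is an illustrative assumption.

```python
def input_diff(prev_state: dict, curr_state: dict) -> dict:
    """Return only the controls whose state changed since the previous
    frame; an empty result means no inputs were provided, in which case
    step 410 may be omitted."""
    return {name: value for name, value in curr_state.items()
            if prev_state.get(name) != value}

# e.g. input_diff({"A": 0, "stick_x": 0.0}, {"A": 1, "stick_x": 0.0})
# yields {"A": 1}.
```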
[0047] At step 420, the CPU 202 updates various parameters for screen configuration based on the received information of operation input. The CPU 202 also updates parameters based not on operation input but rather on information that changes along with the passing of time. In a case where the current mode is the communication mode, the parameters updated in this step may be, for example, information indicating the position or stance of an avatar positioned in the virtual space, or whether or not a game play state has been entered. In a case where the current mode is the game mode, the parameters may be, for example, the positions or stance information of operation characters and NPCs (Non-Player Characters) in the game.
[0048] At step 430, the CPU 202 determines the rendering contents of the screen to be provided to the client device. Specifically, when the current mode is the communication mode, the CPU 202, based on the viewpoint and direction for which the rendering is performed, specifies the objects included in the portion of the virtual space that would be "visible" from a perspective originating at the viewpoint, and determines the rendering content. Alternatively, when the current mode is the game mode, the CPU 202 determines the rendering target and rendering content in accordance with the screen configuration defined in the game.
[0049] At step 440, the CPU 202, based on the rendering content determined in step 430, renders the screen (image) to be provided to the client device with GPU 210A, 210B, 210C, or 210D.
[0050] At step 450, the communication unit 220 encodes screen data rendered by a GPU and stored in a VRAM. The encoding is not limited to encoding based on a static image encoding standard, and may be based on a moving image encoding standard. In addition to data compression, the encoding process used to encode a particular image may or may not apply cryptographic encryption.
[0051] At step 460, the communication unit 220 transmits the encoded image via the Internet 130 to the client device. In this way, screen creation processing for one frame is completed.
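By way of non-limiting illustration, one iteration of the main game loop of steps 410 to 460 might be sketched as follows; every attribute and helper name (receive_inputs, update_parameters, gpu_pool, and so on) is hypothetical and merely stands in for the processing described above.

```python
def main_game_loop_iteration(server, client):
    """One per-client frame of the main game loop (steps 410-460)."""
    # Step 410: receive operation input (possibly a difference from the
    # previous frame; in multi-player games, possibly all participants').
    inputs = server.communication_unit.receive_inputs(client)

    # Step 420: update screen-configuration parameters from the inputs and
    # from information that changes with the passing of time (avatar
    # position/stance in communication mode; characters and NPCs in game mode).
    server.update_parameters(client, inputs)

    # Step 430: determine rendering contents according to the current mode
    # (objects visible from the viewpoint, or the game's screen configuration).
    contents = server.determine_rendering_contents(client)

    # Step 440: render the screen on one of GPUs 210A-210D.
    frame = server.gpu_pool.render(contents)

    # Step 450: encode (still- or moving-image standard; encryption optional).
    encoded = server.communication_unit.encode(frame)

    # Step 460: transmit the encoded image to the client device.
    server.communication_unit.send(client, encoded)
```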
[0052] IV. Screen Reproduction at Client Device
Reference is now made to Fig. 5, which shows operation of the client device associated with a given user, which may be any of client devices 120A, 120B, 120C, 120D, 120E. [0053] At step 510, the client CPU 301 receives an encoded image (encoded screen data) from the server system 100 via the client communication unit 304. Any associated audio may also be received at this time.
[0054] At step 520, the client decoder 305 decodes the received encoded image. The decompression algorithm is complementary to the compression algorithm used in the encoding process (see, e.g., step 450 above). In a non-limiting embodiment, the identity or version of the compression algorithm used to encode the image may be specified in the content of one or more packets that convey the image.
[0055] At step 530, the client CPU 301 processes the (decoded) image. This can include placing the decoded image in a buffer (e.g., the client RAM 303), performing error correction, combining multiple successive images, performing alpha blending, interpolating portions of a missing image, and so on. The result can be one final image per frame to be presented to the user.
[0056] At step 540, the client CPU 301 outputs the final image to the client display 306 and displays it. For example, in the case of video, a composite video frame is displayed on the client display 306. In addition, associated audio may also be played, and any other output synchronized with the video frame may similarly be triggered.
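A non-limiting sketch of one frame of this client-side reproduction (steps 510 to 540) follows; all attribute names are hypothetical.

```python
def client_frame(client):
    """One frame of screen reproduction on the client (steps 510-540)."""
    # Step 510: receive the encoded screen data (and any associated audio).
    packet = client.communication_unit.receive()

    # Step 520: decode with the algorithm complementary to the server-side
    # encoder; the codec identity/version may be carried in the packet.
    image = client.decoder.decode(packet.payload, codec=packet.codec)

    # Step 530: process the decoded image - buffering, error correction,
    # combining successive images, alpha blending, interpolating missing
    # portions - yielding one final image for this frame.
    final_image = client.process(image)

    # Step 540: display the final image and trigger synchronized outputs
    # such as audio.
    client.display.show(final_image)
    client.audio.play(packet.audio)
```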
[0057] V. Specific Description of Non-Limiting Embodiments
A more detailed description of certain non-limiting embodiments of the present invention is now provided.
[0058] Fig. 6 is a flowchart exemplifying details of the rendering processing for a screen that the server system 100 provides to a client device. This processing is performed to generate one frame of the provided screen, and is executed simultaneously for each of the client devices connected to the server system 100.
[0059] In step S601, the CPU 202 determines whether the mode set for the client device to which the screen is provided is the communication mode or the game mode. The CPU 202 moves the processing to step S605 in a case where it determines that the client device's mode is the communication mode, and moves the processing to step S602 in a case where it determines that it is the game mode.
[0060] In step S602, the CPU 202, after executing a game program for the game being played on the client device and performing various processing, causes one of GPU 210A, 210B, 210C, or 210D to render a game screen.
[0061] In step S603, the CPU 202 reads out the game screen stored in a VRAM in the rendering, transmits it to the communication unit 220, and causes encoding processing to be executed. The CPU 202 also copies the game screen read out from the VRAM and stores the copy in the Temporary Storage Area 206.
[0062] In step S604, the communication unit 220 transmits the encoded game screen to the client device, and the processing completes.
[0063] Meanwhile, in a case where the mode set for the client device is determined in step S601 to be the communication mode, the CPU 202, in step S605, identifies rendering target objects from among the objects arranged in the virtual space.
[0064] In step S606, the CPU 202 determines whether or not an avatar of another user performing game play is included in the rendering target objects. The CPU 202 moves the processing to step S607 in a case where it determines that such an avatar is included, and moves the processing to step S608 in a case where it determines that no such avatar is included.
[0065] In step S607, the CPU 202 performs screen rendering, applying the corresponding game screen as a texture to the display unit of the arcade video game machine object on which the game play is performed, within the rendering of the virtual-space screen that it causes one of GPU 210A, 210B, 210C, or 210D to perform. Specifically, the GPU performing the rendering, under the control of the CPU 202, reads the data of the corresponding game screen from the Temporary Storage Area 206, loads it into a cache memory of the GPU, and uses it in the rendering. Also, in a case where an arcade video game machine object whose game play is in a paused state is included in the rendering target, rendering is performed applying the corresponding game screen as a texture to the display unit of that object.
[0066] In step S608, the CPU 202 reads out the screen of the virtual space stored in a VRAM in the rendering, transmits it to the communication unit 220, and causes encoding processing to be executed.
[0067] In step S609, the communication unit 220 transmits the encoded screen to the client device, and the processing completes.
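The per-client dispatch of Fig. 6 (S601 to S609) might be sketched, in a non-limiting manner, as below; all helper names are hypothetical stand-ins for the steps described above.

```python
def render_frame_for_client(server, client):
    """Per-client, per-frame rendering following Fig. 6 (S601-S609)."""
    if server.mode_of(client) == "game":                      # S601
        frame = server.run_game_and_render(client)            # S602
        encoded = server.communication_unit.encode(frame)     # S603
        # Retain a copy so spectators' virtual-space screens can reuse it
        # as a texture (the Temporary Storage Area 206 of the embodiment).
        server.temp_storage.store(client.machine_id, frame)   # S603
        server.communication_unit.send(client, encoded)       # S604
    else:  # communication mode
        targets = server.identify_render_targets(client)      # S605
        if server.playing_avatar_included(targets):           # S606
            # Apply retained game screens as display-unit textures,
            # including those of machines whose game play is paused.
            server.apply_game_screen_textures(targets)        # S607
        frame = server.render_virtual_space(targets)
        encoded = server.communication_unit.encode(frame)     # S608
        server.communication_unit.send(client, encoded)       # S609
```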
[0068] With this, game screens specific to game play can be provided to a user of a client device in the game mode, and screens by which it is possible to spectate the game play of another user in the virtual space can be provided to users of client devices in the communication mode. In other words, users of client devices in the communication mode can observe both that another user in the virtual space is performing game play and that user's game play itself.
[0069] Note that game screens provided to users performing game play are not limited to forms in which they are provided simply by being applied as textures to objects in the virtual space. Configuration may be made so that a user can receive a more detailed display of the game screens by performing a selection operation on an avatar in game play or on an arcade video game machine object, for example. In such cases, the game screen and a screen showing the state of the avatar's hands changing, based on the information of the operation input from the client device for the game, may be presented to the user arranged within a single new screen, for example. In this case, the user being provided the game screen and the user being provided the new screen may be able to interact with each other using at least one of audio input and character input. That is, the audio input or the character input is shared between the users, and the users can then communicate with each other while viewing the same game screen.
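One non-limiting way to arrange the game screen and the avatar-hands view within a single new screen is sketched below using the Pillow imaging library; the side-by-side layout and the function name are illustrative assumptions, as the embodiment does not fix a layout.

```python
from PIL import Image

def compose_detail_screen(game_screen: Image.Image,
                          hands_screen: Image.Image) -> Image.Image:
    """Place the game screen and a rendering of the playing avatar's hands
    side by side in one new screen (layout is an illustrative assumption)."""
    width = game_screen.width + hands_screen.width
    height = max(game_screen.height, hands_screen.height)
    canvas = Image.new("RGB", (width, height))
    canvas.paste(game_screen, (0, 0))
    canvas.paste(hands_screen, (game_screen.width, 0))
    return canvas
```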
[0070] Also, a texture of a game screen applied to the display unit of an arcade video game machine object does not necessarily have to be used in the rendering of the virtual-space screen of the same frame as the frame in which the game screen is transmitted. In other words, the game screens used in the rendering of virtual-space screens may be game screens that were transmitted to the client device in a previous frame. Accordingly, a plurality of game screens may be accumulated in the Temporary Storage Area 206 and used in order in the virtual-space screen rendering. By doing this, in a case where a motion is applied to the avatar of a user performing game play in accordance with operation input, it is possible to interpolate between the operation inputs of the preceding and following frames, for example. In other words, virtual-space screens and the game-screen textures included in them are generated at different timings, so by interpolating the discrete data, the motion resulting from operations can be adjusted to appear smooth.
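A non-limiting sketch of such interpolation follows. The embodiment does not specify the interpolation scheme; linear interpolation over timestamped operation-input samples, with the pose reduced to a single float for simplicity, is an illustrative assumption, and all names are hypothetical.

```python
def interpolate_pose(samples, t):
    """Estimate an avatar hand pose at time t from (time, pose) samples
    taken from the playing user's operation input."""
    samples = sorted(samples)
    earlier = [s for s in samples if s[0] <= t]
    later = [s for s in samples if s[0] > t]
    if not earlier:
        return later[0][1] if later else None
    if not later:
        return earlier[-1][1]
    (t0, p0), (t1, p1) = earlier[-1], later[0]
    # Blend the surrounding discrete samples into continuous motion.
    alpha = (t - t0) / (t1 - t0)
    return p0 + alpha * (p1 - p0)

# e.g. with inputs sampled at frames 10 and 12, the hand pose applied to
# the playing avatar at virtual-space frame 11 is the midpoint of the two.
```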
[0071] It will be appreciated that by allowing users to participate simultaneously in multiple nested games, certain embodiments of the invention may enable a richer gaming experience than has been previously available.
[0072] While the present invention has been
described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such
modifications and equivalent structures and functions. Also, the information processing apparatus and the method of controlling the information processing apparatus according to the present invention are realizable by a program executing the methods on a computer. The program is providable/distributable by being stored on a computer-readable storage medium or through an electronic communication line.
[0073] This application claims the benefit of U.S. Provisional Patent Application No. 61/761,415 filed February 6, 2013, which is hereby incorporated by reference herein in its entirety.

Claims

1. An information processing apparatus for rendering screens provided to a plurality of apparatuses, comprising:
reception means for receiving operation input corresponding to the plurality of apparatuses;
first rendering means for rendering a first screen for a particular viewpoint in a virtual space in which it is possible to move an avatar based on the operation input, wherein avatars corresponding to each of the plurality of apparatuses are simultaneously positioned in the virtual space;
second rendering means for performing processing on particular content that is not an avatar based on the operation input and rendering a second screen for the content; and
management means for managing management
information indicating which of said first rendering means and said second rendering means to cause to render a screen to be provided to an apparatus in the plurality of apparatuses,
wherein said first rendering means, in a case where an avatar corresponding to an apparatus to which the second screen is provided is included in a
rendering scope for the particular viewpoint, renders the first screen, applying the second screen provided to the apparatus corresponding to the avatar to an object related to the avatar in the virtual space.
2. The information processing apparatus according to claim 1 wherein said first rendering means, in the case where the avatar corresponding to the apparatus to which the second screen is provided is included in the rendering scope for the particular viewpoint, causes a state of the avatar to change in accordance with corresponding operation input and renders the first screen.
3. The information processing apparatus according to claim 1 or 2 wherein, said management means,
in a case where the first screen is being provided, updates, in accordance with a first operation input being performed, the management information to change rendering of screens to be provided to a corresponding apparatus to said second rendering means,
and in a case where the second screen is being provided, updates, in accordance with a second
operation input being performed, the management
information to change rendering of screens to be provided to a corresponding apparatus to said first rendering means.
4. The information processing apparatus according to claim 3 wherein said first rendering means, in a case where an avatar corresponding to an apparatus for which rendering of screens to be provided was changed to said first rendering means is included in a rendering scope, renders the first screen applying the second screen, which was provided to the apparatus for which rendering was changed, to an object related to the avatar in the virtual space.
5. The information processing apparatus according to claim 3 or 4 wherein the first operation input is input of an initiation instruction for provision for an object, which is positioned in the virtual space, related to provision of the particular content.
6. The information processing apparatus according to any one of claims 3-5 wherein the second operation input is an instruction to stop the provision of the particular content or is an instruction to pause the provision of the particular content.
7. The information processing apparatus according to any one of claims 1-6 wherein the second screen that said first rendering means applies in the case where the avatar corresponding to the apparatus to which the second screen is provided is included in the rendering scope for the particular viewpoint corresponds to a time that is earlier than a time associated with the first screen that is rendered, and is a screen that was previously provided to the apparatus.
8. The information processing apparatus according to any one of claims 1-7 further comprising third
rendering means for rendering a third screen, that is different from the first screen, comprising the second screen, which is provided to an apparatus corresponding to an avatar included in a rendering scope for the particular viewpoint and a screen indicating an
operation of the avatar in accordance with operation input corresponding to the apparatus, in accordance with the third operation input being performed in a case where the first screen is being provided,
wherein said management means manages management information indicating which of said first rendering means, said second rendering means and said third rendering means to cause to render a screen to be provided to an apparatus in the plurality of
apparatuses.
9. The information processing apparatus according to claim 8 further comprising: acquisition means for acquiring at least one of audio input and character input from an apparatus to which the third screen is being provided and an apparatus to which the second screen, which is included in the third screen, is being provided, and
sharing means for sharing input acquired by said acquisition means with the apparatus to which the third screen is being provided and the apparatus to which the second screen, which is included in the third screen, is being provided.
10. The information processing apparatus according to claim 8 or 9 wherein the third operation input is input of an initiation instruction for sharing of the second screen to an avatar, positioned in the virtual space, corresponding to an apparatus to which the second screen is being provided.
11. The information processing apparatus according to any one of claims 1-10 further comprising transmission means for transmitting a screen rendered by any of said rendering means to each of the plurality of apparatuses based on the management information.
12. A method of controlling an information processing apparatus for rendering screens provided to a plurality of apparatuses, comprising:
a receiving step of reception means of the
information processing apparatus receiving operation input corresponding to the plurality of apparatuses;
a first rendering step of first rendering means of the information processing apparatus rendering a first screen for a particular viewpoint in a virtual space in which it is possible to move an avatar based on the operation input, wherein avatars corresponding to each of the plurality of apparatuses are simultaneously positioned in the virtual space;
a second rendering step of second rendering means of the information processing apparatus performing processing on particular content that is not an avatar based on the operation input and rendering a second screen for the content; and
a managing step of management means of the
information processing apparatus managing management information indicating which of the first
rendering step and the second rendering step to use to render a screen to be provided to an apparatus in the plurality of apparatuses,
wherein said first rendering means in the first rendering step, in a case where an avatar corresponding to an apparatus to which the second screen is provided is included in a rendering scope for the particular viewpoint, renders the first screen, applying the second screen provided to the apparatus corresponding to the avatar to an object related to the avatar in the virtual space.
13. The method of controlling the information processing apparatus according to claim 12 wherein said first rendering means in the first rendering step, in the case where the avatar corresponding to the
apparatus to which the second screen is provided is included in the rendering scope for the particular viewpoint, causes a state of the avatar to change in accordance with corresponding operation input and renders the first screen.
14. The method of controlling the information
processing apparatus according to claim 12 or 13 wherein, said management means in the managing step, in a case where the first screen is being provided, updates, in accordance with a first operation input being performed, the management information to change rendering of screens to be provided to a corresponding apparatus to use the second rendering step,
and in a case where the second screen is being
provided, updates, in accordance with a second
operation input being performed, the management
information to change rendering of screens to be provided to a corresponding apparatus to use the first rendering step.
15. The method of controlling the information
processing apparatus according to claim 14 wherein said first rendering means in the first rendering step, in a case where an avatar corresponding to an apparatus for which rendering of screens to be provided was changed to use the first rendering step is included in a rendering scope, renders the first screen applying the second screen, which was provided to the apparatus for which rendering was changed, to an object related to the avatar in the virtual space.
16. The method of controlling the information
processing apparatus according to claim 14 or 15 wherein the first operation input is input of an initiation instruction for provision for an object, which is positioned in the virtual space, related to provision of the particular content.
17. The method of controlling the information
processing apparatus according to any one of claims 14-
16 wherein the second operation input is an instruction to stop the provision of the particular content or is an instruction to pause the provision of the particular content .
18. The method of controlling the information
processing apparatus according to any one of claims 12-
17 wherein the second screen, that said first rendering means in the first rendering step applies in the case where the avatar corresponding to the apparatus to which the second screen is provided is included in the rendering scope for the particular viewpoint,
corresponds to a time that is earlier than a time associated with the first screen that is rendered, and is a screen that was previously provided to the
apparatus.
19. The method of controlling the information
processing apparatus according to any one of claims 12- 18 further comprising a third rendering step of third rendering means of the information processing apparatus rendering a third screen, that is different from the first screen, comprising the second screen, which is provided to an apparatus corresponding to an avatar included in a rendering scope for the particular viewpoint and a screen indicating an operation of the avatar in accordance with operation input corresponding to the apparatus, in accordance with the third
operation input being performed in a case where the first screen is being provided,
wherein said management means, in the managing step, manages management information indicating which of the first rendering step, the second rendering step and the third rendering step to cause to render a screen to be provided to an apparatus in the plurality of apparatuses.
20. The method of controlling the information
processing apparatus according to claim 19 further comprising:
acquisition means of the information processing apparatus for acquiring at least one of audio input and character input from an apparatus to which the third screen is being provided and an apparatus to which the second screen, which is included in the third screen, is being provided, and
sharing means of the information processing apparatus for sharing input acquired by said
acquisition means with the apparatus to which the third screen is being provided and the apparatus to which the second screen, which is included in the third screen, is being provided.
21. The method of controlling the information
processing apparatus according to claim 19 or 20 wherein the third operation input is input of an initiation instruction for sharing of the second screen to an avatar, positioned in the virtual space,
corresponding to an apparatus to which the second screen is being provided.
22. The method of controlling the information
processing apparatus according to any one of claims 12- 21 further comprising transmission means of the information processing apparatus transmitting a screen rendered by any of said rendering means to each of the plurality of apparatuses based on the management information .
23. A program for causing a computer to execute each step of the method of controlling the information processing apparatus of any one of claims 12 to 22.
24. A storage medium for storing a program for causing a computer to execute each step of the method of controlling the information processing apparatus of any one of claims 12 to 22.
25. A rendering system comprising an information processing apparatus and a rendering apparatus for rendering screens provided to a plurality of
apparatuses connected to the information processing apparatus,
wherein the information processing apparatus comprises :
reception means for receiving operation input corresponding to the plurality of apparatuses; and
instruction means for making a rendering instruction to the rendering apparatus based on the operation input received by the reception means,
and wherein the rendering apparatus comprises: first rendering means for rendering a first screen for a particular viewpoint in a virtual space in which it is possible to move an avatar based on the operation input, wherein avatars corresponding to each of the plurality of apparatuses are simultaneously positioned in the virtual space; and
second rendering means for performing processing on particular content that is not an avatar based on the operation input and rendering a second screen for the content,
and wherein
either the information processing apparatus or the rendering apparatus further comprises management means for managing management information indicating which of said first rendering means and said second rendering means to cause to render a screen to be provided to an apparatus in the plurality of apparatuses,
and wherein
said first rendering means, in a case where an avatar corresponding to an apparatus to which the second screen is provided is included in a rendering scope for the particular viewpoint, renders the first screen, applying the second screen, provided to the apparatus corresponding to the avatar, to an object related to the avatar in the virtual space.
PCT/JP2014/052594 2013-02-06 2014-01-29 Information processing apparatus, control method, program, storage medium, and rendering system WO2014123129A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP14749202.9A EP2954496A4 (en) 2013-02-06 2014-01-29 Information processing apparatus, control method, program, storage medium, and rendering system
US14/382,409 US9636581B2 (en) 2013-02-06 2014-01-29 Information processing apparatus, control method, program, storage medium, and rendering system
JP2014530452A JP5776954B2 (en) 2013-02-06 2014-01-29 Information processing apparatus, control method, program, recording medium, and drawing system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361761415P 2013-02-06 2013-02-06
US61/761,415 2013-02-06

Publications (1)

Publication Number Publication Date
WO2014123129A1 true WO2014123129A1 (en) 2014-08-14

Family

ID=51293330

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/052594 WO2014123129A1 (en) 2013-02-06 2014-01-29 Information processing apparatus, control method, program, storage medium, and rendering system

Country Status (5)

Country Link
US (1) US9636581B2 (en)
EP (1) EP2954496A4 (en)
JP (1) JP5776954B2 (en)
CA (1) CA2831587C (en)
WO (1) WO2014123129A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5545687B1 (en) * 2013-11-07 2014-07-09 株式会社 ディー・エヌ・エー Server and method for providing game
US10099134B1 (en) * 2014-12-16 2018-10-16 Kabam, Inc. System and method to better engage passive users of a virtual space by providing panoramic point of views in real time
JP6518166B2 (en) * 2015-08-07 2019-05-22 ダイコク電機株式会社 Game information display system
CN109671140B (en) * 2018-12-26 2024-02-02 上海赞奇文化科技有限公司 Cloud rendering service processing method adopting micro-service
JP7043558B1 (en) * 2020-09-23 2022-03-29 グリー株式会社 Computer programs, methods, and server equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002248273A (en) * 2001-02-26 2002-09-03 Square Co Ltd Video game device and its control method, program of video game and computer readable storage medium for recording this program
JP2009022365A (en) * 2007-07-17 2009-02-05 Sony Computer Entertainment Inc Game guidance system

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6672961B1 (en) 2000-03-16 2004-01-06 Sony Computer Entertainment America Inc. Computer system and method of displaying images
KR100678994B1 (en) 2004-02-16 2007-02-05 주식회사 와이앤케이 코리아 Game-in-Game Type On-line Game Service System
US20070167239A1 (en) 2006-01-19 2007-07-19 O'rourke Jason Arcade Casino Game
US7841946B2 (en) 2006-06-29 2010-11-30 Spawn Labs, Inc. System for remote game access
US20080307473A1 (en) 2007-06-06 2008-12-11 Virtual Worlds Ppv, Llc Virtual worlds pay-per-view
US8151199B2 (en) * 2009-02-09 2012-04-03 AltEgo, LLC Computational delivery system for avatar and background game content
CA2692064A1 (en) 2010-02-05 2011-08-05 Robert Bruce Method and system for implementing a virtual game
US8944911B2 (en) * 2010-07-27 2015-02-03 Disney Enterprises, Inc. Online parallel play
US20120190453A1 (en) * 2011-01-25 2012-07-26 Bossa Nova Robotics Ip, Inc. System and method for online-offline interactive experience
KR20120119504A (en) * 2011-04-21 2012-10-31 한국전자통신연구원 System for servicing game streaming according to game client device and method
US20130116044A1 (en) * 2011-11-03 2013-05-09 Lawrence Schwartz Network multi-player trivia-based game and contest

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2954496A4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016144820A1 (en) 2015-03-06 2016-09-15 Sony Computer Entertainment America Llc Dynamic adjustment of cloud game data streams to output device and network quality
EP3266198A4 (en) * 2015-03-06 2019-01-16 Sony Interactive Entertainment America LLC Dynamic adjustment of cloud game data streams to output device and network quality
US11648474B2 (en) 2015-03-06 2023-05-16 Sony Interactive Entertainment LLC Dynamic adjustment of cloud game data streams to output device and network quality

Also Published As

Publication number Publication date
US20150038224A1 (en) 2015-02-05
EP2954496A4 (en) 2016-12-21
JP5776954B2 (en) 2015-09-09
JP2015515031A (en) 2015-05-21
CA2831587A1 (en) 2014-08-06
US9636581B2 (en) 2017-05-02
CA2831587C (en) 2021-07-27
EP2954496A1 (en) 2015-12-16

Similar Documents

Publication Publication Date Title
US9858210B2 (en) Information processing apparatus, rendering apparatus, method and program
JP5952406B2 (en) Video game device having remote drawing capability
US9636581B2 (en) Information processing apparatus, control method, program, storage medium, and rendering system
US20160293134A1 (en) Rendering system, control method and storage medium
JP5987060B2 (en) GAME SYSTEM, GAME DEVICE, CONTROL METHOD, PROGRAM, AND RECORDING MEDIUM
US9215276B2 (en) Apparatus and method of data transfer
US11497990B2 (en) Crowd sourced cloud gaming using peer-to-peer streaming
US20160127508A1 (en) Image processing apparatus, image processing system, image processing method and storage medium
US20140195912A1 (en) Method and system for simultaneous display of video content
JP2016202686A (en) System, method, and program for sharing game experience, and recording medium
US20160271495A1 (en) Method and system of creating and encoding video game screen images for transmission over a network
US20230381672A1 (en) Multiplayer video game systems and methods
CA2798066A1 (en) Method and system of creating and encoding video game screen images for transmission over a network
CA2798934A1 (en) Video gaming device with remote rendering capability

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2014530452

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 14382409

Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14749202

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2014749202

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2014749202

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE