WO2008035027A1 - Video game - Google Patents

Video game

Info

Publication number
WO2008035027A1
WO2008035027A1 · PCT/GB2007/002744
Authority
WO
WIPO (PCT)
Prior art keywords
game
primary game
user
sequence
character
Prior art date
Application number
PCT/GB2007/002744
Other languages
French (fr)
Inventor
Matthew Christian Townsend Hart
Tameem Nadi Antoniades
Original Assignee
Sony Computer Entertainment Europe Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe Limited filed Critical Sony Computer Entertainment Europe Limited
Publication of WO2008035027A1 publication Critical patent/WO2008035027A1/en

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/10
    • A63F 13/45 — Controlling the progress of the video game
    • A63F 13/55 — Controlling game characters or game objects based on the game progress
    • A63F 2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device

Definitions

  • This invention relates to video games.
  • a user control such as a so-called "joypad”, which is a handheld unit providing several user-operable switches, buttons, joysticks and the like.
  • the joypad normally used with the Sony® PlayStation 2™ video game machine provides two joysticks, four directional buttons (up, down, left and right), four multi-function buttons (□, △, ○ and ×) and four shoulder buttons (L1, L2, R1, R2).
  • the available user controls are mapped to required game functions.
  • While some conventions have developed within the game industry, the allocation of functions to controls is still arbitrary and must be learned by the player of the game. For example, a function such as punching or kicking an opponent has no intrinsically obvious control button to carry out this function.
  • An example of this is to display, at the time that a user needs to take a particular action, a visual indication of which button needs to be pressed. For example, in the game Tomb Raider™, at the time that a user needs to press (say) the X button, a large X is displayed on the video game's display screen.
  • One drawback is that because an icon representing a particular button is alien to the rest of the game environment, it appears to be in the foreground and can be distracting from the action going on behind it. This is especially a problem in the case of a carefully prepared interactive cut scene using particularly exciting video scenes - it is not desirable for the user's attention to be taken away from the underlying action.
  • a second disadvantage is that the process of displaying a button identification, and the user pressing that button, could tend to disconnect (in the user's mind) the hard-learned association between that button and a particular game action.
  • This invention provides video game apparatus in which a primary game object is displayed within a game environment of a video game, the apparatus having: a user controller; and a game program defining a sequence of one or more target tasks required to be performed with respect to the primary game object by operating the user controller; in which the apparatus displays a further representation of the primary game object displaced, in the game environment, with respect to the displayed primary game character, a display property of the further representation being dependent upon a next target task in the sequence.
  • the invention provides an elegantly simple yet innovative approach to providing user instruction or guidance as to the next required game task by a game object (e.g. a game character, vehicle or the like).
  • a further representation of the game object is displayed, which is displaced from the primary game object (i.e. the one which the user is controlling).
  • a display property such as the position, colour or both of the further representation is used to indicate what the user should do next.
  • the further representation might be displayed ahead of the primary game object (in game time) so that the further representation indicates a movement path that the primary game object must follow - for example to scale a cliff face or to cross a derelict bridge.
  • Figure 1 schematically illustrates the overall system architecture of the PlayStation2
  • Figure 2 schematically illustrates the architecture of an Emotion Engine
  • Figure 3 schematically illustrates the configuration of a Graphics Synthesiser
  • Figure 4 schematically illustrates a game involving successive scenes
  • Figure 5 schematically illustrates an interactive cut scene
  • Figure 6 schematically illustrates a branch scene
  • Figure 7 schematically illustrates a scene definition table
  • Figures 8a to 8c schematically illustrate a ghost game character
  • Figures 9a to 9d schematically illustrate a ghost game object
  • Figure 10 schematically illustrates another possible scene definition table.
  • Figure 1 schematically illustrates the overall system architecture of the PlayStation2.
  • a system unit 10 is provided, with various peripheral devices connectable to the system unit.
  • the system unit 10 comprises: an Emotion Engine 100; a Graphics Synthesiser 200; a sound processor unit 300 having dynamic random access memory (DRAM); a read only memory (ROM) 400; a compact disc (CD) and digital versatile disc (DVD) reader 450; a Rambus Dynamic Random Access Memory (RDRAM) unit 500; an input/output processor (IOP) 700 with dedicated RAM 750.
  • An (optional) external hard disk drive (HDD) 390 may be connected.
  • the input/output processor 700 has two Universal Serial Bus (USB) ports 715 and an iLink or IEEE 1394 port (iLink is the Sony Corporation implementation of the IEEE 1394 standard).
  • the IOP 700 handles all USB, iLink and game controller data traffic.
  • the IOP 700 receives data from the game controller and directs it to the Emotion Engine 100 which updates the current state of the game accordingly.
  • the IOP 700 has a Direct Memory Access (DMA) architecture to facilitate rapid data transfer rates. DMA involves transfer of data from main memory to a device without passing it through the CPU.
  • the USB interface is compatible with Open Host Controller Interface (OHCI) and can handle data transfer rates of between 1.5 Mbps and 12 Mbps. Provision of these interfaces means that the PlayStation2 is potentially compatible with peripheral devices such as video cassette recorders (VCRs), digital cameras, microphones, set-top boxes, printers, keyboard, mouse and joystick.
  • a USB microphone 730 is connected to the USB port. It will be appreciated that the USB microphone 730 may be a hand-held microphone or may form part of a head-set that is worn by the human operator. The advantage of wearing a head-set is that the human operator's hands are free to perform other actions.
  • the microphone includes an analogue-to-digital converter (ADC) and a basic hardware-based real-time data compression and encoding arrangement, so that audio data are transmitted by the microphone 730 to the USB port 715 in an appropriate format, such as 16-bit mono PCM (an uncompressed format) for decoding at the PlayStation 2 system unit 10.
  • two other ports 705, 710 are proprietary sockets allowing the connection of a proprietary non-volatile RAM memory card 720 for storing game-related information, a hand-held game controller 725 or a device (not shown) mimicking a hand-held controller, such as a dance mat.
  • the system unit 10 may be connected to a network adapter 805 that provides an interface (such as an Ethernet interface) to a network.
  • This network may be, for example, a LAN, a WAN or the Internet.
  • the network may be a general network or one that is dedicated to game related communication.
  • the network adapter 805 allows data to be transmitted to and received from other system units 10 that are connected to the same network (the other system units 10 also having corresponding network adapters 805).
  • the Emotion Engine 100 is a 128-bit Central Processing Unit (CPU) that has been specifically designed for efficient simulation of 3 dimensional (3D) graphics for games applications.
  • the Emotion Engine components include a data bus, cache memory and registers, all of which are 128-bit. This facilitates fast processing of large volumes of multimedia data. Conventional PCs, by way of comparison, have a basic 64-bit data structure.
  • the floating point calculation performance of the PlayStation2 is 6.2 GFLOPs.
  • the Emotion Engine also comprises MPEG2 decoder circuitry which allows for simultaneous processing of 3D graphics data and DVD data.
  • the Emotion Engine performs geometrical calculations including mathematical transforms and translations and also performs calculations associated with the physics of simulation objects, for example, calculation of friction between two objects.
  • the image rendering commands are output in the form of display lists.
  • a display list is a sequence of drawing commands that specifies to the Graphics Synthesiser which primitive graphic objects (e.g. points, lines, triangles, sprites) to draw on the screen and at which co-ordinates.
  • a typical display list will comprise commands to draw vertices, commands to shade the faces of polygons, render bitmaps and so on.
  • the Emotion Engine 100 can asynchronously generate multiple display lists.
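  • The display-list mechanism described above can be sketched as a simple data structure: an ordered sequence of drawing commands, each naming a primitive and its screen co-ordinates. This is an illustrative model only; the names `DrawCommand` and `DisplayList` are invented for this sketch and are not part of any PlayStation2 API.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DrawCommand:
    # Which primitive graphic object to draw (e.g. point, line, triangle, sprite)
    primitive: str
    # Screen co-ordinates of the primitive's vertices
    vertices: List[Tuple[int, int]]

@dataclass
class DisplayList:
    # A display list is simply an ordered sequence of drawing commands
    commands: List[DrawCommand] = field(default_factory=list)

    def draw(self, primitive: str, *vertices: Tuple[int, int]) -> None:
        self.commands.append(DrawCommand(primitive, list(vertices)))

# The Emotion Engine would build one or more lists like this and hand
# them to the Graphics Synthesiser for rendering:
dl = DisplayList()
dl.draw("triangle", (0, 0), (10, 0), (5, 8))
dl.draw("line", (0, 0), (10, 10))
```

  In this model, generating multiple display lists asynchronously simply means building several independent `DisplayList` objects and submitting each for rendering.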
  • the Graphics Synthesiser 200 is a video accelerator that performs rendering of the display lists produced by the Emotion Engine 100.
  • the Graphics Synthesiser 200 includes a graphics interface unit (GIF) which handles, tracks and manages the multiple display lists.
  • the rendering function of the Graphics Synthesiser 200 can generate image data that supports several alternative standard output image formats, i.e., NTSC/PAL, High Definition Digital TV and VESA.
  • the rendering capability of graphics systems is defined by the memory bandwidth between a pixel engine and a video memory, each of which is located within the graphics processor.
  • Conventional graphics systems use external Video Random Access Memory (VRAM) connected to the pixel logic via an off-chip bus which tends to restrict available bandwidth.
  • the Graphics Synthesiser 200 of the PlayStation2 provides the pixel logic and the video memory on a single high-performance chip which allows for a comparatively large 38.4 Gigabyte per second memory access bandwidth.
  • the Graphics Synthesiser is theoretically capable of achieving a peak drawing capacity of 75 million polygons per second. Even with a full range of effects such as textures, lighting and transparency, a sustained rate of 20 million polygons per second can be drawn continuously. Accordingly, the Graphics Synthesiser 200 is capable of rendering a film-quality image.
  • the Sound Processor Unit (SPU) 300 is effectively the soundcard of the system which is capable of recognising 3D digital sound such as Digital Theater Surround (DTS®) sound and AC-3 (also known as Dolby Digital) which is the sound format used for DVDs.
  • a display and sound output device 305 such as a video monitor or television set with an associated loudspeaker arrangement 310, is connected to receive video and audio signals from the graphics synthesiser 200 and the sound processing unit 300.
  • the main memory supporting the Emotion Engine 100 is the RDRAM (Rambus Dynamic Random Access Memory) module 500 produced by Rambus Incorporated.
  • This RDRAM memory subsystem comprises RAM, a RAM controller and a bus connecting the RAM to the Emotion Engine 100.
  • FIG. 2 schematically illustrates the architecture of the Emotion Engine 100 of Figure 1.
  • the Emotion Engine 100 comprises: a floating point unit (FPU) 104; a central processing unit (CPU) core 102; vector unit zero (VU0) 106; vector unit one (VU1) 108; a graphics interface unit (GIF) 110; an interrupt controller (INTC) 112; a timer unit 114; a direct memory access controller (DMAC) 116; an image data processor unit (IPU) 118; a dynamic random access memory controller (DRAMC) 120; and a sub-bus interface (SIF) 122; all of these components are connected via a 128-bit main bus 124.
  • the CPU core 102 is a 128-bit processor clocked at 300 MHz.
  • the CPU core has access to 32 MB of main memory via the DRAMC 120.
  • the CPU core 102 instruction set is based on MIPS III RISC with some MIPS IV RISC instructions together with additional multimedia instructions.
  • MIPS III and IV are Reduced Instruction Set Computer (RISC) instruction set architectures proprietary to MIPS Technologies, Inc. Standard instructions are 64-bit, two-way superscalar, which means that two instructions can be executed simultaneously. Multimedia instructions, on the other hand, use 128-bit instructions via two pipelines.
  • the CPU core 102 comprises a 16KB instruction cache, an 8KB data cache and a 16KB scratchpad RAM which is a portion of cache reserved for direct private usage by the CPU.
  • the FPU 104 serves as a first co-processor for the CPU core 102.
  • the FPU 104 comprises a floating point product sum arithmetic logic unit (FMAC) and a floating point division calculator (FDIV). Both the FMAC and FDIV operate on 32-bit values, so when an operation is carried out on a 128-bit value (composed of four 32-bit values) it can be carried out on all four parts concurrently. For example, two vectors can be added together in a single operation.
  • the vector units 106 and 108 perform mathematical operations and are essentially specialised FPUs that are extremely fast at evaluating the multiplication and addition of vector equations. They use Floating-Point Multiply-Adder Calculators (FMACs) for addition and multiplication operations and Floating-Point Dividers (FDIVs) for division and square root operations. They have built-in memory for storing micro-programs and interface with the rest of the system via Vector Interface Units (VIFs). Vector unit zero 106 can work as a coprocessor to the CPU core 102 via a dedicated 128-bit bus so it is essentially a second specialised FPU.
  • Vector unit one 108 has a dedicated bus to the Graphics synthesiser 200 and thus can be considered as a completely separate processor.
  • the inclusion of two vector units allows the software developer to split up the work between different parts of the CPU and the vector units can be used in either serial or parallel connection.
  • Vector unit zero 106 comprises 4 FMACs and 1 FDIV. It is connected to the CPU core 102 via a coprocessor connection. It has 4 KB of vector unit memory for data and 4 KB of micro-memory for instructions. Vector unit zero 106 is useful for performing physics calculations associated with the images for display. It primarily executes non-patterned geometric processing together with the CPU core 102.
  • Vector unit one 108 comprises 5 FMACs and 2 FDIVs. It has no direct path to the CPU core 102, although it does have a direct path to the GIF unit 110. It has 16 KB of vector unit memory for data and 16 KB of micro-memory for instructions. Vector unit one 108 is useful for performing transformations. It primarily executes patterned geometric processing and directly outputs a generated display list to the GIF 110.
  • the GIF 110 is an interface unit to the Graphics Synthesiser 200. It converts data according to a tag specification at the beginning of a display list packet and transfers drawing commands to the Graphics Synthesiser 200 whilst mutually arbitrating multiple transfers.
  • the interrupt controller (INTC) 112 serves to arbitrate interrupts from peripheral devices, except the DMAC 116.
  • the timer unit 114 comprises four independent timers with 16-bit counters. The timers are driven either by the bus clock (at 1/16 or 1/256 intervals) or via an external clock.
  • the DMAC 116 handles data transfers between main memory and peripheral processors or main memory and the scratch pad memory. It arbitrates the main bus 124 at the same time. Performance optimisation of the DMAC 116 is a key way by which to improve Emotion Engine performance.
  • the image processing unit (IPU) 118 is an image data processor that is used to expand compressed animations and texture images. It performs I-PICTURE Macro-Block decoding, colour space conversion and vector quantisation.
  • the sub-bus interface (SIF) 122 is an interface unit to the IOP 700. It has its own memory and bus to control I/O devices such as sound chips and storage devices.
  • FIG. 3 schematically illustrates the configuration of the Graphic Synthesiser 200.
  • the Graphics Synthesiser comprises: a host interface 202; a set-up/rasterizing unit; a pixel pipeline 206; a memory interface 208; a local memory 212 including a frame page buffer 214 and a texture page buffer 216; and a video converter 210.
  • the host interface 202 transfers data with the host (in this case the CPU core 102 of the Emotion Engine 100). Both drawing data and buffer data from the host pass through this interface.
  • the output from the host interface 202 is supplied to the graphics synthesiser 200 which develops the graphics to draw pixels based on vertex information received from the Emotion Engine 100, and calculates information such as RGBA value, depth value (i.e. Z-value), texture value and fog value for each pixel.
  • the RGBA value specifies the red, green, blue (RGB) colour components and the A (Alpha) component represents opacity of an image object.
  • the Alpha value can range from completely transparent to totally opaque.
  • the pixel data is supplied to the pixel pipeline 206 which performs processes such as texture mapping, fogging and Alpha-blending and determines the final drawing colour based on the calculated pixel information.
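  • As a rough illustration of the Alpha component in use, a standard "over" blend combines a source pixel with the destination pixel behind it in proportion to the Alpha value. This is a generic sketch of alpha-blending in general, not of the Graphics Synthesiser's actual fixed-function pipeline:

```python
def alpha_blend(src_rgb, dst_rgb, alpha):
    """Blend a source colour over a destination colour.

    alpha ranges from 0.0 (completely transparent) to 1.0 (totally
    opaque), matching the description of the Alpha component above.
    """
    return tuple(alpha * s + (1.0 - alpha) * d
                 for s, d in zip(src_rgb, dst_rgb))

# A half-transparent red drawn over a blue background gives purple:
result = alpha_blend((255, 0, 0), (0, 0, 255), 0.5)
```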
  • the pixel pipeline 206 comprises 16 pixel engines PE1, PE2, …, PE16 so that it can process a maximum of 16 pixels concurrently.
  • the pixel pipeline 206 runs at 150MHz with 32-bit colour and a 32-bit Z-buffer.
  • the memory interface 208 reads data from and writes data to the local Graphics Synthesiser memory 212. It writes the drawing pixel values (RGBA and Z) to memory at the end of a pixel operation and reads the pixel values of the frame buffer 214 from memory. These pixel values read from the frame buffer 214 are used for pixel test or Alpha-blending.
  • the memory interface 208 also reads from local memory 212 the RGBA values for the current contents of the frame buffer.
  • the local memory 212 is a 32 Mbit (4MB) memory that is built-in to the Graphics Synthesiser 200. It can be organised as a frame buffer 214, texture buffer 216 and a 32-bit Z-buffer 215.
  • the frame buffer 214 is the portion of video memory where pixel data such as colour information is stored.
  • the Graphics Synthesiser uses a 2D to 3D texture mapping process to add visual detail to 3D geometry. Each texture may be wrapped around a 3D image object and is stretched and skewed to give a 3D graphical effect.
  • the texture buffer is used to store the texture information for image objects.
  • the Z-buffer 215 is also known as a depth buffer.
  • Images are constructed from basic building blocks known as graphics primitives or polygons. When a polygon is rendered with Z-buffering, the depth value of each of its pixels is compared with the corresponding value stored in the Z-buffer.
  • If the value stored in the Z-buffer is greater than or equal to the depth of the new pixel, then the new pixel is determined to be visible: it is rendered and the Z-buffer is updated with the new pixel depth. If, however, the Z-buffer depth value is less than the new pixel depth value, the new pixel is behind what has already been drawn and will not be rendered.
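  • The depth-test rule described above can be expressed as a short function. This is a schematic sketch of the comparison only, following the convention in the text that a smaller depth value means a nearer pixel:

```python
def z_test(z_buffer_value, new_pixel_depth):
    """Return (render_pixel, updated_z_value) per the rule above.

    If the stored Z-buffer value is greater than or equal to the new
    pixel's depth, the new pixel is visible: it is rendered and the
    Z-buffer is updated with its depth. Otherwise the new pixel lies
    behind what has already been drawn and is not rendered.
    """
    if z_buffer_value >= new_pixel_depth:
        return True, new_pixel_depth
    return False, z_buffer_value

# A pixel at depth 5 drawn where the buffer holds 10 is visible:
visible, z = z_test(10, 5)
```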
  • the local memory 212 has a 1024-bit read port and a 1024-bit write port for accessing the frame buffer and Z-buffer and a 512-bit port for texture reading.
  • the video converter 210 is operable to display the contents of the frame memory in a specified output format.
  • Figure 4 schematically illustrates the operation of a game involving successive scenes.
  • the functionality of the PlayStation 2 described above has been simplified to a single unit, the "game engine”, which receives user commands from the handheld controller 725 and generates video signals for display and sound signals for audio output.
  • the game engine responds to game information stored on discs and read by the disc reader 450.
  • Game design is a well-established art and, although the process is complicated and lengthy, the skilled person will understand the type of information to be stored as part of a game program and its supporting data. Only the differences relevant to the present invention will be described here.
  • a set of game scenes is defined: scene 1, scene 2 and so on.
  • the general intention is that the game object (character, vehicle etc) moves from scene to scene in a generally predetermined order, although branching points may be provided to allow different routes through the game.
  • a game character may typically move through the scenes: killing or removing opponents, collecting treasure or point-scoring objects, acquiring clues and the like.
  • the scenes may be defined by a bounded game environment within which the character may roam. Within the scene there may be enemies wandering around waiting for a fight and clues or treasure dispersed around the environment for the character to find.
  • a scene may be entirely choreographed in advance. There may still be enemies to fight and items to find, but these are laid out in a predetermined order.
  • a sequence of tasks for the character to complete will have been pre-defined. Successful completion of each task allows the user to attempt the next one; failure at a task generally means that the user has failed and either the game terminates or the user is demoted (e.g. the user has to return to an earlier stage to try again).
  • This type of scene is sometimes referred to as an interactive cut scene.
  • A technique will now be described that uses a further representation of a game object, to be called a ghost character.
  • Such a technique is particularly relevant to the type of pre- choreographed scenes described above, but could also be used in other types of scene.
  • Figure 5 schematically illustrates an interactive cut scene in which a sequence of required user actions (actions 1.1 to 1.5), each with an associated background video clip, enemies, treasure and the like, is arranged in a linear order.
  • If a task is successfully completed, the user is allowed to move on to the next task, and so on through the whole scene. If any task is not successfully completed, the user moves to a failure clip.
  • An example of a failure clip in the case of an action requiring the user's character to climb a cliff face, is a video clip showing the character falling off the cliff face.
  • the failure clips might be shared between actions and can be arranged in a sequence to add more dramatic effect, if required. After the failure clip the user's character can be removed from the game or can be transferred to a previous point in the game to try again.
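  • The linear flow of Figure 5, with a failure clip diverting the user from any unsuccessful task, can be sketched as follows. The function name and the clip text are invented for illustration:

```python
def play_cut_scene(actions, attempt):
    """Step through a linear interactive cut scene.

    `actions` is the ordered list of task names (e.g. "1.1" to "1.5"),
    each with an associated background clip; `attempt` is a callable
    returning True when the user completes a task successfully.
    Returns "scene complete", or the failure clip for the failed task.
    """
    for action in actions:
        if not attempt(action):
            # Any failed task diverts the user to a failure clip,
            # e.g. a clip of the character falling off the cliff face.
            return f"failure clip after action {action}"
    return "scene complete"

# A user who succeeds at actions 1.1 and 1.2 but fails at 1.3:
outcome = play_cut_scene(["1.1", "1.2", "1.3", "1.4", "1.5"],
                         lambda a: a in ("1.1", "1.2"))
```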
  • Figure 6 schematically illustrates a branching scene. Failure clips are not shown but may apply to any of the actions shown in Figure 6. If the user completes action 2.1, the user then attempts action 2.2. The action 2.2 has two successful outcomes: depending on the outcome, the user's character passes either to a sequence of actions 2.3, 2.4 and 2.5 or to a sequence of actions 2.6, 2.7 and 2.8.
  • Figure 7 schematically illustrates a scene definition table. This represents a part of the data associated with a scene.
  • Each action is listed, along with a starting position of the game object (e.g. the character) relevant to that action.
  • the position that the game object will have reached at the end of a successful completion of that action is also stored.
  • For each action, a ghost position is stored. This could represent either a static position of a ghost image, to act as a target for movement of the game character, or an offset with respect to the game character, or a more complicated movement specification.
  • When the user's character reaches the ending position for an action, the ghost ceases to be displayed.
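  • The scene definition table of Figure 7 could be modelled as a simple per-action record: each row stores the action, the starting position, the ending position and the ghost position. The field names and co-ordinates here are illustrative, not taken from the patent:

```python
# One row per action: id, character start position, ending position,
# and the ghost position that acts as the target for movement.
scene_table = [
    {"action": "1.1", "start": (0, 0), "end": (4, 2), "ghost": (4, 2)},
    {"action": "1.2", "start": (4, 2), "end": (8, 3), "ghost": (8, 3)},
]

def ghost_for(action_id, character_pos):
    """Return the ghost position to display for an action, or None once
    the character has reached the action's ending position (at which
    point the ghost ceases to be displayed, as described above)."""
    row = next(r for r in scene_table if r["action"] == action_id)
    if character_pos == row["end"]:
        return None
    return row["ghost"]
```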
  • the skilled person will appreciate that much more information than this is required to define an action or a scene. However, the rest of the information is routine to one skilled in the art.
  • Figures 8a to 8c schematically show a ghost game character in use.
  • a user's character 840 has to leap across three boulders 810, 820, 830.
  • the character 840 starts on the boulder 810.
  • a ghost representation 850 of the character is shown on the boulder 820 indicating that the user's character has to jump forwards and upwards onto the boulder 820. Assuming that the user does this successfully, the ghost representation 850 disappears (or moves - see below) either when the character 840 arrives on the second boulder 820 or just before then, as defined by the data in the scene definition table of Figure 7.
  • a similar process applies for the user's character 840 to leap from the boulder 820 to the third boulder 830. After that successful leap, the ghost character is no longer displayed.
  • the scene can be arranged so that if the user operates a "jump forward and upward" control at the appropriate time, the character 840 will move from one boulder to the next, following the ghost character. In this way, the ghost character can encourage the user to operate a certain control even where the user may not believe that the character 840 can fulfil that action.
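  • The boulder-hopping sequence of Figures 8a to 8c can be sketched as a small state machine: the ghost representation always sits on the next boulder in the sequence, and is no longer displayed once the final boulder is reached. The positions and function name are invented for illustration:

```python
def ghost_position(boulders, character_index):
    """Given the ordered boulder positions and the index of the boulder
    the character currently stands on, return where the ghost
    representation should be drawn, or None once the final boulder in
    the sequence has been reached."""
    next_index = character_index + 1
    if next_index >= len(boulders):
        return None  # after the final leap the ghost is not displayed
    return boulders[next_index]

boulders = [(0, 0), (3, 2), (6, 1)]   # boulders 810, 820 and 830
g = ghost_position(boulders, 0)       # ghost shown on the second boulder
```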
  • the ghost character is a representation of the user's character, but this does not necessarily mean that the ghost character is in the same orientation as the user's character at any time.
  • Figures 9a to 9d are another example schematically illustrating a ghost game object: in this case a ghost automobile 860 which indicates to a user's automobile 870 which is the correct path through a (highly simplified) maze.
  • Figure 10 schematically illustrates another possible scene definition table, in this case applicable to a situation where the ghost character, as well as being displaced within the game environment from the user's character, is displayed with different display properties such as a different colour, texture, transparency or a combination of these, to indicate a required user action.
  • an image "effect" is defined for the ghost character, over a particular time range within the scene.
  • a required user action (such as pressing a certain button) is also defined. Successful completion of the task requires the user to press that button within the defined time range.
  • the display appearance variation can be used in addition to the technique described earlier, where the ghost character's position indicates a required action.
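  • The timed-input scheme of Figure 10 — a required button press within a defined time range, signalled by the ghost character's appearance — can be sketched as a single check. The parameter names and the example effect are illustrative:

```python
def task_succeeded(required_button, time_range, pressed_button, press_time):
    """Per Figure 10: successful completion of the task requires the
    user to press the required button within the defined time range of
    the scene."""
    start, end = time_range
    return pressed_button == required_button and start <= press_time <= end

# Suppose the ghost is shown with a red effect between t=2.0 and t=3.5,
# indicating that the X button must be pressed during that window:
ok = task_succeeded("X", (2.0, 3.5), "X", 2.7)    # within range: success
late = task_succeeded("X", (2.0, 3.5), "X", 4.0)  # too late: failure
```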

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Video game apparatus, in which a primary game object is displayed within a game environment of a video game, comprises a user controller (725); and a game program defining a sequence of one or more target tasks required to be performed with respect to the primary game object by operating the user controller (725); in which the apparatus displays a further representation of the primary game object displaced, in the game environment, from the displayed primary game character, a display property of the further representation being dependent upon a next target task in the sequence.

Description

VIDEO GAME
This invention relates to video games.
Most current video games are played by means of a user operating a user control such as a so-called "joypad", which is a handheld unit providing several user-operable switches, buttons, joysticks and the like. For example, the joypad normally used with the Sony® PlayStation 2™ video game machine provides two joysticks, four directional buttons (up, down, left and right), four multi-function buttons (□, △, ○ and ×) and four shoulder buttons (L1, L2, R1, R2). As part of the game design process, the available user controls are mapped to required game functions. This might be relatively straightforward in the case of defining motion of a game object (such as a game character, a vehicle or the like), where an obvious mapping is that a joystick and/or the directional control buttons are used to control required movements. As regards the other buttons, while some conventions have developed within the game industry, the allocation of functions to controls is still arbitrary and must be learned by the player of the game. For example, a function such as punching or kicking an opponent has no intrinsically obvious control button to carry out this function.
It is known, especially in role-playing and action adventure video games, for a game object to pass through various game scenes, such that a sequence of game tasks must be completed in a current scene for the game object either to enter the next scene at all, or to have a chance of completing the next scene successfully. In many cases these scenes have no predetermined outcome, and the user can cause the game object to move extensively through the game environment and carry out tasks in a flexible order. This requires that the game environment be generated in real time. However, in other instances, for example in so-called "interactive cut scenes", the game designer may wish to provide a more detailed or complicated video sequence through which the game object must pass. In such cases, rather than derive the game environment in real time, it can be set up in advance, almost as a video clip. The game object must perform various tasks to pass through such an interactive cut scene; if a task is not carried out successfully and at the required time within the scene, the game object passes to a "failure" scene. Such scenes are often fast-moving and require a complicated sequence of actions by the user in order to pass successfully through the scene. It is always important to establish a balance between a game being challenging and yet being achievable (eventually) by the user. For this reason, it has been proposed that the user is given some assistance during a fast-moving interactive cut scene.
An example of this is to display, at the time that a user needs to take a particular action, a visual indication of which button needs to be pressed. For example, in the game
Tomb Raider™, at the time that a user needs to press (say) the X button, a large X is displayed on the video game's display screen. However, there are at least two disadvantages with such an arrangement. One drawback is that because an icon representing a particular button is alien to the rest of the game environment, it appears to be in the foreground and can be distracting from the action going on behind it. This is especially a problem in the case of a carefully prepared interactive cut scene using particularly exciting video scenes - it is not desirable for the user's attention to be taken away from the underlying action. A second disadvantage is that the process of displaying a button identification, and the user pressing that button, could tend to disconnect (in the user's mind) the hard-learned association between that button and a particular game action.
This invention provides video game apparatus in which a primary game object is displayed within a game environment of a video game, the apparatus having: a user controller; and a game program defining a sequence of one or more target tasks required to be performed with respect to the primary game object by operating the user controller; in which the apparatus displays a further representation of the primary game object displaced, in the game environment, with respect to the displayed primary game character, a display property of the further representation being dependent upon a next target task in the sequence. The invention provides an elegantly simple yet innovative approach to providing user instruction or guidance as to the next required game task by a game object (e.g. a game character, vehicle or the like). A further representation of the game object is displayed, which is displaced from the primary game object (i.e. the one which the user is controlling). A display property such as the position, colour or both of the further representation is used to indicate what the user should do next.
In a specific example, the further representation might be displayed ahead of the primary game object (in game time) so that the further representation indicates a movement path that the primary game object must follow - for example to scale a cliff face or to cross a derelict bridge.
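The idea of deriving the further representation's display properties from the next target task can be sketched in code. This is purely an illustrative sketch; the names, the colour mapping and the task structure are assumptions, not taken from the specification.

```python
def ghost_display(primary_pos, next_task):
    """Return (position, colour) for the further representation ("ghost").

    The display property depends on the next target task in the sequence:
    here the ghost is placed at the target position the primary object
    must reach, and coloured according to the required action.
    All names and values are hypothetical.
    """
    colours = {"jump": "green", "attack": "red"}  # assumed mapping
    position = next_task["target_pos"]            # ghost displayed ahead of the player
    colour = colours.get(next_task["action"], "white")
    return position, colour

# the ghost appears at the target of the next required movement
pos, colour = ghost_display((0.0, 0.0), {"target_pos": (2.0, 1.5), "action": "jump"})
assert pos == (2.0, 1.5) and colour == "green"
```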
Further respective aspects and features of the invention are defined in the appended claims. Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:
Figure 1 schematically illustrates the overall system architecture of the PlayStation2;
Figure 2 schematically illustrates the architecture of an Emotion Engine;
Figure 3 schematically illustrates the configuration of a Graphics Synthesiser; Figure 4 schematically illustrates a game involving successive scenes;
Figure 5 schematically illustrates an interactive cut scene;
Figure 6 schematically illustrates a branch scene;
Figure 7 schematically illustrates a scene definition table;
Figures 8a to 8c schematically illustrate a ghost game character; Figures 9a to 9d schematically illustrate a ghost game object; and
Figure 10 schematically illustrates another possible scene definition table.
Figure 1 schematically illustrates the overall system architecture of the PlayStation2. A system unit 10 is provided, with various peripheral devices connectable to the system unit.
The system unit 10 comprises: an Emotion Engine 100; a Graphics Synthesiser 200; a sound processor unit 300 having dynamic random access memory (DRAM); a read only memory (ROM) 400; a compact disc (CD) and digital versatile disc (DVD) reader 450; a Rambus Dynamic Random Access Memory (RDRAM) unit 500; an input/output processor (IOP) 700 with dedicated RAM 750. An (optional) external hard disk drive (HDD) 390 may be connected. The input/output processor 700 has two Universal Serial Bus (USB) ports 715 and an iLink or IEEE 1394 port (iLink is the Sony Corporation implementation of the IEEE 1394 standard). The IOP 700 handles all USB, iLink and game controller data traffic. For example, when a user is playing a game, the IOP 700 receives data from the game controller and directs it to the Emotion Engine 100, which updates the current state of the game accordingly. The IOP 700 has a Direct Memory Access (DMA) architecture to facilitate rapid data transfer rates. DMA involves transfer of data from main memory to a device without passing it through the CPU. The USB interface is compatible with Open Host Controller Interface (OHCI) and can handle data transfer rates of between 1.5 Mbps and 12 Mbps. Provision of these interfaces means that the PlayStation2 is potentially compatible with peripheral devices such as video cassette recorders (VCRs), digital cameras, microphones, set-top boxes, printers, keyboards, mice and joysticks. Generally, in order for successful data communication to occur with a peripheral device connected to a USB port 715, an appropriate piece of software such as a device driver should be provided. Device driver technology is very well known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the embodiment described here. In the present embodiment, a USB microphone 730 is connected to the USB port.
It will be appreciated that the USB microphone 730 may be a hand-held microphone or may form part of a head-set that is worn by the human operator. The advantage of wearing a head-set is that the human operator's hands are free to perform other actions. The microphone includes an analogue-to-digital converter (ADC) and a basic hardware-based real-time data compression and encoding arrangement, so that audio data are transmitted by the microphone 730 to the USB port 715 in an appropriate format, such as 16-bit mono PCM (an uncompressed format) for decoding at the PlayStation 2 system unit 10.
Apart from the USB ports, two other ports 705, 710 are proprietary sockets allowing the connection of a proprietary non-volatile RAM memory card 720 for storing game-related information, a hand-held game controller 725 or a device (not shown) mimicking a handheld controller, such as a dance mat.
The system unit 10 may be connected to a network adapter 805 that provides an interface (such as an Ethernet interface) to a network. This network may be, for example, a LAN, a WAN or the Internet. The network may be a general network or one that is dedicated to game related communication. The network adapter 805 allows data to be transmitted to and received from other system units 10 that are connected to the same network (the other system units 10 also having corresponding network adapters 805).
The Emotion Engine 100 is a 128-bit Central Processing Unit (CPU) that has been specifically designed for efficient simulation of 3 dimensional (3D) graphics for games applications. The Emotion Engine components include a data bus, cache memory and registers, all of which are 128-bit. This facilitates fast processing of large volumes of multimedia data. Conventional PCs, by way of comparison, have a basic 64-bit data structure. The floating point calculation performance of the PlayStation2 is 6.2 GFLOPs. The Emotion Engine also comprises MPEG2 decoder circuitry which allows for simultaneous processing of 3D graphics data and DVD data. The Emotion Engine performs geometrical calculations including mathematical transforms and translations and also performs calculations associated with the physics of simulation objects, for example, calculation of friction between two objects. It produces sequences of image rendering commands which are subsequently utilised by the Graphics Synthesiser 200. The image rendering commands are output in the form of display lists. A display list is a sequence of drawing commands that specifies to the Graphics Synthesiser which primitive graphic objects (e.g. points, lines, triangles, sprites) to draw on the screen and at which co-ordinates. Thus a typical display list will comprise commands to draw vertices, commands to shade the faces of polygons, render bitmaps and so on. The Emotion Engine 100 can asynchronously generate multiple display lists.
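A display list, as described above, is simply an ordered sequence of drawing commands consumed by the renderer. The following sketch models that idea in Python; the command names are illustrative only and are not the actual Graphics Synthesiser command set.

```python
# A display list modelled as an ordered sequence of (command, arguments)
# pairs, consumed in order by a renderer. Command names are hypothetical.
display_list = [
    ("vertex", (0, 0)),
    ("vertex", (100, 0)),
    ("vertex", (50, 80)),
    ("shade_triangle", "flat"),
    ("draw_sprite", {"x": 10, "y": 10, "texture": "hud"}),
]

def render(dl):
    """Consume drawing commands in order, as the renderer would."""
    executed = []
    for command, args in dl:
        executed.append(command)  # a real renderer would rasterise here
    return executed

assert render(display_list) == [
    "vertex", "vertex", "vertex", "shade_triangle", "draw_sprite"
]
```

The significant property is only the ordering: the producer (here, the Emotion Engine) can build several such lists asynchronously while the consumer drains them.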
The Graphics Synthesiser 200 is a video accelerator that performs rendering of the display lists produced by the Emotion Engine 100. The Graphics Synthesiser 200 includes a graphics interface unit (GIF) which handles, tracks and manages the multiple display lists. The rendering function of the Graphics Synthesiser 200 can generate image data that supports several alternative standard output image formats, i.e., NTSC/PAL, High Definition Digital TV and VESA. In general, the rendering capability of graphics systems is defined by the memory bandwidth between a pixel engine and a video memory, each of which is located within the graphics processor. Conventional graphics systems use external Video Random Access Memory (VRAM) connected to the pixel logic via an off-chip bus, which tends to restrict available bandwidth. However, the Graphics Synthesiser 200 of the PlayStation2 provides the pixel logic and the video memory on a single high-performance chip which allows for a comparatively large 38.4 Gigabyte per second memory access bandwidth. The Graphics Synthesiser is theoretically capable of achieving a peak drawing capacity of 75 million polygons per second. Even with a full range of effects such as textures, lighting and transparency, a sustained rate of 20 million polygons per second can be drawn continuously. Accordingly, the Graphics Synthesiser 200 is capable of rendering a film-quality image.
The Sound Processor Unit (SPU) 300 is effectively the soundcard of the system which is capable of recognising 3D digital sound such as Digital Theater Surround (DTS®) sound and AC-3 (also known as Dolby Digital) which is the sound format used for DVDs. A display and sound output device 305, such as a video monitor or television set with an associated loudspeaker arrangement 310, is connected to receive video and audio signals from the graphics synthesiser 200 and the sound processing unit 300.
The main memory supporting the Emotion Engine 100 is the RDRAM (Rambus Dynamic Random Access Memory) module 500 produced by Rambus Incorporated. This RDRAM memory subsystem comprises RAM, a RAM controller and a bus connecting the RAM to the Emotion Engine 100.
Figure 2 schematically illustrates the architecture of the Emotion Engine 100 of Figure 1. The Emotion Engine 100 comprises: a floating point unit (FPU) 104; a central processing unit (CPU) core 102; vector unit zero (VU0) 106; vector unit one (VU1) 108; a graphics interface unit (GIF) 110; an interrupt controller (INTC) 112; a timer unit 114; a direct memory access controller 116; an image data processor unit (IPU) 118; a dynamic random access memory controller (DRAMC) 120; and a sub-bus interface (SIF) 122; all of these components are connected via a 128-bit main bus 124. The CPU core 102 is a 128-bit processor clocked at 300 MHz. The CPU core has access to 32 MB of main memory via the DRAMC 120. The CPU core 102 instruction set is based on MIPS III RISC with some MIPS IV RISC instructions together with additional multimedia instructions. MIPS III and IV are Reduced Instruction Set Computer (RISC) instruction set architectures proprietary to MIPS Technologies, Inc. Standard instructions are 64-bit, two-way superscalar, which means that two instructions can be executed simultaneously. Multimedia instructions, on the other hand, use 128-bit instructions via two pipelines. The CPU core 102 comprises a 16 KB instruction cache, an 8 KB data cache and a 16 KB scratchpad RAM which is a portion of cache reserved for direct private usage by the CPU. The FPU 104 serves as a first co-processor for the CPU core 102. The vector unit 106 acts as a second co-processor. The FPU 104 comprises a floating point product sum arithmetic logic unit (FMAC) and a floating point division calculator (FDIV). Both the FMAC and FDIV operate on 32-bit values, so when an operation is carried out on a 128-bit value (composed of four 32-bit values) an operation can be carried out on all four parts concurrently. For example, adding two vectors together can be done at the same time.
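The lane-wise semantics of operating on a 128-bit value as four 32-bit parts can be illustrated in scalar Python. This is only a model of the semantics, not of the hardware: the point is that one conceptual operation applies to all four lanes.

```python
def lane_multiply_add(a, b, c):
    """Model of a lane-wise multiply-add over four 32-bit lanes:
    result[i] = a[i] * b[i] + c[i], conceptually for all lanes at once.
    In hardware this is a single FMAC-style operation; here it is
    written out lane by lane purely to show the semantics.
    """
    return tuple(x * y + z for x, y, z in zip(a, b, c))

# adding two 4-component vectors is the special case b = (1, 1, 1, 1)
assert lane_multiply_add((1, 2, 3, 4), (1, 1, 1, 1), (10, 10, 10, 10)) == (11, 12, 13, 14)
```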
The vector units 106 and 108 perform mathematical operations and are essentially specialised FPUs that are extremely fast at evaluating the multiplication and addition of vector equations. They use Floating-Point Multiply-Adder Calculators (FMACs) for addition and multiplication operations and Floating-Point Dividers (FDIVs) for division and square root operations. They have built-in memory for storing micro-programs and interface with the rest of the system via Vector Interface Units (VIFs). Vector unit zero 106 can work as a coprocessor to the CPU core 102 via a dedicated 128-bit bus so it is essentially a second specialised FPU. Vector unit one 108, on the other hand, has a dedicated bus to the Graphics synthesiser 200 and thus can be considered as a completely separate processor. The inclusion of two vector units allows the software developer to split up the work between different parts of the CPU and the vector units can be used in either serial or parallel connection.
Vector unit zero 106 comprises 4 FMACS and 1 FDIV. It is connected to the CPU core 102 via a coprocessor connection. It has 4 Kb of vector unit memory for data and 4 Kb of micro-memory for instructions. Vector unit zero 106 is useful for performing physics calculations associated with the images for display. It primarily executes non-patterned geometric processing together with the CPU core 102.
Vector unit one 108 comprises 5 FMACS and 2 FDIVs. It has no direct path to the CPU core 102, although it does have a direct path to the GIF unit 110. It has 16 Kb of vector unit memory for data and 16 Kb of micro-memory for instructions. Vector unit one 108 is useful for performing transformations. It primarily executes patterned geometric processing and directly outputs a generated display list to the GIF 110.
The GIF 110 is an interface unit to the Graphics Synthesiser 200. It converts data according to a tag specification at the beginning of a display list packet and transfers drawing commands to the Graphics Synthesiser 200 whilst mutually arbitrating multiple transfers. The interrupt controller (INTC) 112 serves to arbitrate interrupts from peripheral devices, except the DMAC 116.
The timer unit 114 comprises four independent timers with 16-bit counters. The timers are driven either by the bus clock (at 1/16 or 1/256 intervals) or via an external clock. The DMAC 116 handles data transfers between main memory and peripheral processors or main memory and the scratch pad memory. It arbitrates the main bus 124 at the same time. Performance optimisation of the DMAC 116 is a key way by which to improve Emotion Engine performance. The image processing unit (IPU) 118 is an image data processor that is used to expand compressed animations and texture images. It performs I-PICTURE Macro-Block decoding, colour space conversion and vector quantisation. Finally, the sub-bus interface (SIF) 122 is an interface unit to the IOP 700. It has its own memory and bus to control I/O devices such as sound chips and storage devices.
Figure 3 schematically illustrates the configuration of the Graphics Synthesiser 200. The Graphics Synthesiser comprises: a host interface 202; a set-up / rasterizing unit; a pixel pipeline 206; a memory interface 208; a local memory 212 including a frame page buffer 214 and a texture page buffer 216; and a video converter 210.
The host interface 202 transfers data with the host (in this case the CPU core 102 of the Emotion Engine 100). Both drawing data and buffer data from the host pass through this interface. The output from the host interface 202 is supplied to the graphics synthesiser 200 which develops the graphics to draw pixels based on vertex information received from the Emotion Engine 100, and calculates information such as RGBA value, depth value (i.e. Z-value), texture value and fog value for each pixel. The RGBA value specifies the red, green, blue (RGB) colour components and the A (Alpha) component represents opacity of an image object. The Alpha value can range from completely transparent to totally opaque. The pixel data is supplied to the pixel pipeline 206 which performs processes such as texture mapping, fogging and Alpha-blending and determines the final drawing colour based on the calculated pixel information.
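The role of the Alpha component in blending can be illustrated with the conventional "source-over" formula, out = src × a + dst × (1 − a). This is the standard textbook blend rather than a statement of the Graphics Synthesiser's specific blend modes, which are not detailed here.

```python
def alpha_blend(src_rgb, dst_rgb, alpha):
    """Conventional source-over Alpha-blend of one pixel:
    out = src * alpha + dst * (1 - alpha).
    alpha = 0.0 is completely transparent; alpha = 1.0 is totally opaque.
    """
    return tuple(s * alpha + d * (1.0 - alpha) for s, d in zip(src_rgb, dst_rgb))

# fully opaque source replaces the destination...
assert alpha_blend((255, 0, 0), (0, 0, 255), 1.0) == (255.0, 0.0, 0.0)
# ...while a fully transparent source leaves the destination visible
assert alpha_blend((255, 0, 0), (0, 0, 255), 0.0) == (0.0, 0.0, 255.0)
```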
The pixel pipeline 206 comprises 16 pixel engines PE1, PE2, ..., PE16 so that it can process a maximum of 16 pixels concurrently. The pixel pipeline 206 runs at 150 MHz with 32-bit colour and a 32-bit Z-buffer. The memory interface 208 reads data from and writes data to the local Graphics Synthesiser memory 212. It writes the drawing pixel values (RGBA and Z) to memory at the end of a pixel operation and reads the pixel values of the frame buffer 214 from memory. These pixel values read from the frame buffer 214 are used for pixel test or Alpha-blending. The memory interface 208 also reads from local memory 212 the RGBA values for the current contents of the frame buffer. The local memory 212 is a 32 Mbit (4 MB) memory that is built in to the Graphics Synthesiser 200. It can be organised as a frame buffer 214, texture buffer 216 and a 32-bit Z-buffer 215. The frame buffer 214 is the portion of video memory where pixel data such as colour information is stored.
The Graphics Synthesiser uses a 2D to 3D texture mapping process to add visual detail to 3D geometry. Each texture may be wrapped around a 3D image object and is stretched and skewed to give a 3D graphical effect. The texture buffer is used to store the texture information for image objects. The Z-buffer 215 (also known as depth buffer) is the memory available to store the depth information for a pixel. Images are constructed from basic building blocks known as graphics primitives or polygons. When a polygon is rendered with Z-buffering, the depth value of each of its pixels is compared with the corresponding value stored in the Z-buffer. If the value stored in the Z-buffer is greater than or equal to the depth of the new pixel value then this pixel is determined visible so that it should be rendered and the Z-buffer will be updated with the new pixel depth. If however the Z-buffer depth value is less than the new pixel depth value the new pixel value is behind what has already been drawn and will not be rendered.
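The Z-buffer test just described can be sketched directly. The comparison follows the convention stated above: a new pixel is visible when the stored depth is greater than or equal to the new pixel's depth (i.e. smaller values are nearer the viewer). This is an illustrative model, not the chip's implementation.

```python
def z_test(z_buffer, x, y, new_depth):
    """Depth test for one pixel, per the convention described above:
    the pixel is visible when the stored Z-buffer value is greater than
    or equal to its depth, in which case the buffer is updated; otherwise
    the pixel is behind what has already been drawn and is not rendered.
    """
    if z_buffer[y][x] >= new_depth:
        z_buffer[y][x] = new_depth
        return True   # render this pixel
    return False      # occluded: do not render

zbuf = [[float("inf")]]            # cleared buffer: everything initially visible
assert z_test(zbuf, 0, 0, 5.0) is True    # first pixel drawn, depth 5.0 stored
assert z_test(zbuf, 0, 0, 9.0) is False   # deeper pixel is behind, rejected
```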
The local memory 212 has a 1024-bit read port and a 1024-bit write port for accessing the frame buffer and Z-buffer and a 512-bit port for texture reading. The video converter 210 is operable to display the contents of the frame memory in a specified output format. Figure 4 schematically illustrates the operation of a game involving successive scenes. In particular, the functionality of the PlayStation 2 described above has been simplified to a single unit, the "game engine", which receives user commands from the handheld controller 725 and generates video signals for display and sound signals for audio output. The game engine responds to game information stored on discs and read by the disc reader 450. Game design is a well established art and, although complicated and lengthy, the type of information to be stored as part of a game program and its supporting data will be familiar to the skilled person. Only the differences relevant to the present invention will be described here.
In an example game, a set of game scenes is defined: scene 1, scene 2 and so on. The general intention is that the game object (character, vehicle etc) moves from scene to scene in a generally predetermined order, although branching points may be provided to allow different routes through the game. In the case of a role playing, action adventure type game, a game character may typically move through the scenes: killing or removing opponents, collecting treasure or point-scoring objects, acquiring clues and the like. In some cases the scenes may be defined by a bounded game environment within which the character may roam. Within the scene there may be enemies wandering around waiting for a fight and clues or treasure dispersed around the environment for the character to find.
In other cases a scene may be entirely choreographed in advance. There may still be enemies to fight and items to find, but these are laid out in a predetermined order. A sequence of tasks for the character to complete will have been pre-defined. Successful completion of each task allows the user to attempt the next one; failure at a task generally means that the user has failed and either the game terminates or the user is demoted (e.g. the user has to return to an earlier stage to try again). This type of scene is sometimes referred to as an interactive cut scene. A technique of using a further representation of a game object (to be called a ghost character) will now be described. Such a technique is particularly relevant to the type of pre-choreographed scenes described above, but could also be used in other types of scene.
Figure 5 schematically illustrates an interactive cut scene in which a sequence of required user actions (actions 1.1 to 1.5), each with an associated background video clip, enemies, treasure and the like, is arranged in a linear order. As a user successfully completes each task, the user is allowed to move on to the next task, through the whole scene. If any task is not successfully completed, the user moves to a failure clip. An example of a failure clip, in the case of an action requiring the user's character to climb a cliff face, is a video clip showing the character falling off the cliff face. The failure clips might be shared between actions and can be arranged in a sequence to add more dramatic effect, if required. After the failure clip the user's character can be removed from the game or can be transferred to a previous point in the game to try again.
Figure 6 schematically illustrates a branching scene. Failure clips are not shown but may apply to any of the actions shown in Figure 6. If the user completes action 2.1, the user then attempts action 2.2. The action 2.2 has two successful outcomes: depending on the outcome, the user's character passes either to a sequence of actions 2.3, 2.4 and 2.5 or to a sequence of actions 2.6, 2.7 and 2.8.
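Both the linear sequence of Figure 5 and the branching scene of Figure 6 can be modelled as a table mapping each action's possible outcomes to the next action, a failure clip, or the end of the scene. This is a hypothetical sketch: the patent does not mandate any particular data structure, and the outcome names are invented for illustration.

```python
# Hypothetical scene graph after Figure 6: action 2.2 has two successful
# outcomes leading to different follow-on sequences; any action may fail.
scene = {
    "2.1": {"success": "2.2", "fail": "failure_clip"},
    "2.2": {"outcome_a": "2.3", "outcome_b": "2.6", "fail": "failure_clip"},
    "2.3": {"success": "end", "fail": "failure_clip"},
    "2.6": {"success": "end", "fail": "failure_clip"},
}

def play(scene, outcomes):
    """Walk the scene graph given a sequence of task outcomes."""
    action = "2.1"
    for outcome in outcomes:
        action = scene[action][outcome]
        if action in ("failure_clip", "end"):
            break
    return action

assert play(scene, ["success", "outcome_b", "success"]) == "end"
assert play(scene, ["fail"]) == "failure_clip"   # failure at any task diverts to the clip
```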
The use of a ghost character will now be described.
Figure 7 schematically illustrates a scene definition table. This represents a part of the data associated with a scene. Each action is listed, along with a starting position of the game object (e.g. the character) relevant to that action. The position that the game object will have reached at the end of a successful completion of that action is also stored. Finally, a ghost position is stored. This could represent either a static position of a ghost image, to act as a target for movement of the game character, or an offset with respect to the game character, or a more complicated movement specification. When the user's character reaches the ending position for an action, the ghost ceases to be displayed. Of course, the skilled person will appreciate that much more information than this is required to define an action or a scene. However, the rest of the information is routine to one skilled in the art.
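The scene definition table just described could be held as a simple list of per-action records. The field names and values below are illustrative only; as noted, a real scene definition would carry much more information than this.

```python
# One record per action: the game object's starting position, its ending
# position on success, and the ghost specification (a static target
# position, an offset from the character, or a more complex path).
scene_table = [
    {"action": 1, "start": (0, 0), "end": (4, 2), "ghost": ("static", (4, 2))},
    {"action": 2, "start": (4, 2), "end": (9, 3), "ghost": ("offset", (2, 1))},
]

def ghost_visible(record, character_pos):
    """The ghost ceases to be displayed once the character reaches
    the ending position for the current action."""
    return character_pos != record["end"]

assert ghost_visible(scene_table[0], (0, 0)) is True   # still en route: ghost shown
assert ghost_visible(scene_table[0], (4, 2)) is False  # action complete: ghost removed
```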
To illustrate this technique, Figures 8a to 8c schematically show a ghost game character in use. Here, in a short scene a user's character 840 has to leap across three boulders 810, 820, 830.
The character 840 starts on the boulder 810. A ghost representation 850 of the character is shown on the boulder 820 indicating that the user's character has to jump forwards and upwards onto the boulder 820. Assuming that the user does this successfully, the ghost representation 850 disappears (or moves - see below) either when the character 840 arrives on the second boulder 820 or just before then, as defined by the data in the scene definition table of Figure 7.
A similar process applies for the user's character 840 to leap from the boulder 820 to the third boulder 830. After that successful leap, the ghost character is no longer displayed.
It may be that a leap from one of these boulders to the next is beyond the normal "jumping" capabilities of the game character. However, in a pre-choreographed scene such as this, such a limitation does not matter. The scene can be arranged so that if the user operates a "jump forward and upward" control at the appropriate time, the character 840 will move from one boulder to the next, following the ghost character. In this way, the ghost character can encourage the user to operate a certain control even where the user may not believe that the character 840 can fulfil that action.
Note that the ghost character is a representation of the user's character, but this does not necessarily mean that the ghost character is in the same orientation as the user's character at any time.
Figures 9a to 9d are another example schematically illustrating a ghost game object: in this case a ghost automobile 860 which indicates to a user's automobile 870 the correct path through a (highly simplified) maze. Figure 10 schematically illustrates another possible scene definition table, in this case applicable to a situation where the ghost character, as well as being displaced within the game environment from the user's character, is displayed with different display properties such as a different colour, texture, transparency or a combination of these, to indicate a required user action. At each action within a scene, an image "effect" is defined for the ghost character, over a particular time range within the scene. A required user action (such as pressing a certain button) is also defined. Successful completion of the task requires the user to press that button within the defined time range. Of course, the display appearance variation can be used in addition to the technique described earlier where the ghost character's position indicates a required action.
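The Figure 10 arrangement, in which each action defines a ghost display effect, a time range and a required button, reduces to a simple timed-input check. The table entries and field names below are invented for illustration; the patent does not specify particular effects, buttons or timings.

```python
# Hypothetical entries of a Figure-10-style table: per action, a ghost
# display effect, the time range (seconds into the scene) during which
# the required button must be pressed, and that button.
effects_table = [
    {"action": 1, "effect": "red_tint", "window": (2.0, 3.5), "button": "X"},
    {"action": 2, "effect": "pulse",    "window": (5.0, 6.0), "button": "triangle"},
]

def task_completed(entry, pressed_button, press_time):
    """Success requires the defined button within the defined time range."""
    lo, hi = entry["window"]
    return pressed_button == entry["button"] and lo <= press_time <= hi

assert task_completed(effects_table[0], "X", 3.0) is True
assert task_completed(effects_table[0], "X", 4.0) is False        # right button, too late
assert task_completed(effects_table[0], "circle", 3.0) is False   # wrong button
```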

Claims

1. Video game apparatus in which a primary game object is displayed within a game environment of a video game, the apparatus having: a user controller; and a game program defining a sequence of one or more target tasks required to be performed with respect to the primary game object by operating the user controller; in which the apparatus displays a further representation of the primary game object displaced, in the game environment, with respect to the displayed primary game character, a display property of the further representation being dependent upon a next target task in the sequence.
2. Apparatus according to claim 1, in which the display property is at least a display position of the further representation with respect to the display position of the primary game object.
3. Apparatus according to claim 2, in which the direction within the game environment from the displayed primary game character to the further representation is indicative of a required direction of movement by the primary game object in order to fulfil the next target task in the sequence.
4. Apparatus according to any one of claims 1 to 3, in which the display property is at least a display colour of the further representation, the display colour being selected from a group of at least two display colours indicating different respective required user actions.
5. Apparatus according to any one of the preceding claims, in which: the video game has a series of scenes; and the further representation is provided in respect of at least a subset of the scenes; in which: each scene in the subset has an associated sequence of target tasks; and the game program is arranged to select an immediately following scene in dependence upon a degree to which the target tasks in the associated sequence are successfully completed.
6. Apparatus according to any one of the preceding claims, in which the primary game object is a game character.
7. A method of operation of a video game in which a primary game object is displayed within a game environment, the video game defining a sequence of one or more target tasks required to be performed with respect to the primary game object by a user operating a user controller, the method comprising the step of: displaying a further representation of the primary game object displaced, in the game environment, with respect to the displayed primary game character, a display property of the further representation being dependent upon a next target task in the sequence.
8. Computer software having program code which, when run on a computer, causes the computer to carry out a method according to claim 7.
9. A providing medium by which computer software according to claim 8 is provided.
10. A medium according to claim 9, the medium being a storage medium.
11. A medium according to claim 9, the medium being a transmission medium.
PCT/GB2007/002744 2006-09-19 2007-07-19 Video game WO2008035027A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0618406A GB2441975A (en) 2006-09-19 2006-09-19 Video game
GB0618406.3 2006-09-19

Publications (1)

Publication Number Publication Date
WO2008035027A1 true WO2008035027A1 (en) 2008-03-27

Family

ID=37421221

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2007/002744 WO2008035027A1 (en) 2006-09-19 2007-07-19 Video game

Country Status (2)

Country Link
GB (1) GB2441975A (en)
WO (1) WO2008035027A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050171690A1 (en) * 2004-01-30 2005-08-04 Microsoft Corporation Ghost following
WO2005107905A1 (en) * 2004-05-11 2005-11-17 Konami Digital Entertainment Co., Ltd. Game device, game control method, information recording medium, and program
US20060094501A1 (en) * 2004-05-10 2006-05-04 Nintendo Co., Ltd. Video game including time dilation effect and a storage medium storing software for the video game

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
AU5935200A (en) * 1999-07-15 2001-02-05 Midway Games West Inc. System and method of vehicle competition with enhanced ghosting features
JP2006149577A (en) * 2004-11-26 2006-06-15 Sega Corp Image processing device, image processing method, program for executing image processing process and storage medium storing program

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
US20050171690A1 (en) * 2004-01-30 2005-08-04 Microsoft Corporation Ghost following
US20060094501A1 (en) * 2004-05-10 2006-05-04 Nintendo Co., Ltd. Video game including time dilation effect and a storage medium storing software for the video game
WO2005107905A1 (en) * 2004-05-11 2005-11-17 Konami Digital Entertainment Co., Ltd. Game device, game control method, information recording medium, and program
EP1757341A1 (en) * 2004-05-11 2007-02-28 Konami Digital Entertainment Co., Ltd. Game device, game control method, information recording medium, and program

Also Published As

Publication number Publication date
GB0618406D0 (en) 2006-11-01
GB2441975A (en) 2008-03-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07766309

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07766309

Country of ref document: EP

Kind code of ref document: A1