US20040063501A1 - Game device, image processing device and image processing method - Google Patents

Game device, image processing device and image processing method

Info

Publication number
US20040063501A1
Authority
US
United States
Prior art keywords
character
virtual camera
distance
damage
shooting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/441,031
Other languages
English (en)
Inventor
Hitoshi Shimokawa
Yukio Tsuji
Kazutomo Sanbongi
Junji Shibasaki
Takenao Sata
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sega Corp
Original Assignee
Sega Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sega Corp filed Critical Sega Corp
Assigned to KABUSHIKI KAISHA SEGA reassignment KABUSHIKI KAISHA SEGA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SANBONGI, KAZUTOMO, SATA, TAKENAO, SHIBASAKI, JUNJI, SHIMOKAWA, HITOSHI, TSUJI, YUKIO
Publication of US20040063501A1 publication Critical patent/US20040063501A1/en
Abandoned legal-status Critical Current


Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 - Controlling the output signals based on the game progress
    • A63F13/52 - Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 - Changing parameters of virtual cameras
    • A63F13/5258 - Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • A63F13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/577 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 - Special adaptations for executing a specific game genre or game mode
    • A63F13/837 - Shooting of targets
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 - Methods for processing data by generating or executing the game program
    • A63F2300/64 - Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 - Methods for processing data by generating or executing the game program
    • A63F2300/64 - Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/643 - Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car by determining the impact between objects, e.g. collision detection
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 - Methods for processing data by generating or executing the game program
    • A63F2300/66 - Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661 - Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 - Methods for processing data by generating or executing the game program
    • A63F2300/66 - Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661 - Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A63F2300/6684 - Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dynamically adapting its position to keep a game object in its viewing frustrum, e.g. for tracking a character or a ball
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076 - Shooting

Definitions

  • the present invention relates to an image processing device, particularly to a game device.
  • An image processing device defines various kinds of characters in a virtual space formed by the computer and loads the manipulation information of players into the computer through peripheral equipment such as joysticks, thereby realizing image processing for moving characters, etc.
  • images which are viewed from a viewpoint in the three-dimensional virtual space, called a virtual camera, are displayed on a TV monitor for the players to see.
  • One example of the image processing device is a game device in which players compete against each other over shooting characters displayed on the screen (for example, “House of the Dead” available from Sega Enterprises, Ltd.).
  • This game device is composed such that the virtual camera moves along a predetermined course in the three-dimensional space, and a player moves while shooting enemies (zombies).
  • Each enemy has some weak points, and when the player accurately hits a weak point, the enemy incurs a damage point; when the total damage points exceed a predetermined value, the enemy is defeated.
  • a time limit is predetermined for the player to defeat the enemies. Therefore, if the player fails to defeat the enemies within the predetermined time limit, the player is attacked by the enemies and the player incurs damage points.
  • This game device is also composed such that, when the player pulls the trigger of a gun pointing at the screen, the time elapsed for the gun to detect a scanning line on the screen is computed, the coordinates of the location pointed at by the gun are then computed, and thereby a decision is made as to whether or not a bullet has hit the enemy character.
  • the conventional game device has configurations unfavorable to the player: for example, a time to fight the enemies is predetermined, and if the player cannot defeat the enemies within the predetermined time limit, the player is attacked by the enemies and his/her damage points increase.
  • even when the player does defeat the enemies within the time limit, there is no arrangement favorable to the player. For instance, when the player defeats the enemies just in time, or even with time to spare, there are no positive effects on the future development of the game or on the player's game score. Accordingly, there is a problem that players who are skilled in the shooting technique and can defeat the enemies within the time limit lose their fighting spirit.
  • the conventional game device hardly aims for realism in deciding whether a bullet hits an enemy. Therefore, the player cannot develop a strategy by familiarizing himself/herself with the types and characteristics of weapons. Accordingly, the conventional game device does not present enough entertaining characteristics for a gun shooting game.
  • the present invention provides an image processing method for moving a virtual camera located in a three-dimensional virtual space at a predetermined speed and changing the distance between the virtual camera and a character defined in the three-dimensional virtual space, wherein the moving speed of the virtual camera changes based on the distance between the virtual camera and the character.
  • a first character defined in the three-dimensional virtual space and a second character manipulated by the player are displayed, and the moving speed of the virtual camera changes based on the distance between the first and second characters.
  • the present invention also provides an image processing method for directing a virtual camera to a character located in a three-dimensional virtual space, wherein a fixation point of the virtual camera is set on the character in a manner so that the speed of directing the virtual camera to the character changes based on the distance between the virtual camera and the character.
  • the present invention further provides a game device that is composed such that a virtual camera located in a three-dimensional virtual space moves at a predetermined speed and the distance between the virtual camera and a character defined in the virtual space changes, comprising: virtual camera controlling means for changing a moving speed of the virtual camera based on the distance between the virtual camera and the character.
  • it is desirable that the game device display a first character defined in the three-dimensional virtual space and a second character manipulated by the player, and that the virtual camera controlling means change the moving speed of the virtual camera based on the distance between the first and second characters.
  • it is desirable that the virtual camera controlling means control the virtual camera moving speed in such a manner that the shorter the distance between the virtual camera and the character becomes, the more the virtual camera moving speed decreases.
  • in the game device, it is desirable that a plurality of areas be provided in the three-dimensional virtual space with the virtual camera at the center, and that the virtual camera controlling means determine in which area the character closest to the virtual camera exists and control the virtual camera moving speed in accordance with the determined area.
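The area-based camera speed control described in the bullets above can be sketched as follows. This is a hypothetical reconstruction, not the patent's actual implementation: the zone radii, speed factors, and function names are all illustrative assumptions.

```python
import math

def nearest_enemy_distance(camera_pos, enemy_positions):
    """Euclidean distance from the virtual camera to the closest enemy."""
    return min(math.dist(camera_pos, e) for e in enemy_positions)

def camera_speed(base_speed, distance,
                 zones=((5.0, 0.2), (15.0, 0.5), (30.0, 0.8))):
    """Return the camera's moving speed for the area the nearest enemy is in.

    zones is a sequence of (outer_radius, speed_factor) pairs, innermost
    first; the closer the nearest enemy, the more the speed decreases.
    Beyond the outermost area the camera moves at full base_speed.
    """
    for outer_radius, factor in zones:
        if distance <= outer_radius:
            return base_speed * factor
    return base_speed
```

Each frame the game would call `nearest_enemy_distance` and feed the result to `camera_speed`, so the camera slows down automatically as it approaches a fight.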
  • the present invention provides a game device for directing a virtual camera to a character located in a three-dimensional virtual space comprising: fixation point setting means for setting a fixation point of the virtual camera on the character such that the speed of directing the virtual camera to the character changes in accordance with the distance between the virtual camera and the character.
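The distance-dependent fixation-point control described above can be sketched as an easing step applied each frame. The rate schedule below (turn quickly toward near characters, slowly toward far ones) is an assumption for illustration; the patent does not specify concrete values.

```python
def fixation_rate(distance, near=5.0, far=30.0, fast=0.5, slow=0.05):
    """Per-frame interpolation rate for turning the camera toward the
    character: high when the character is near, low when it is far."""
    if distance <= near:
        return fast
    if distance >= far:
        return slow
    t = (distance - near) / (far - near)
    return fast + (slow - fast) * t

def update_fixation_point(fixation, target, distance):
    """Move the fixation point a distance-dependent fraction of the way
    toward the character's position."""
    r = fixation_rate(distance)
    return tuple(f + (t - f) * r for f, t in zip(fixation, target))
```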
  • the present invention also provides a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising: computing means for computing a damage point of the character caused by the player's shooting of it, based on a distance between the virtual camera and the character and the distance between the character and the center of an effective shooting radius that changes in accordance with the distance between the virtual camera and the character.
  • it is desirable that the damage point computing means compute a damage point of the character caused by the player's shooting of it, by multiplying a damage value, which is determined based on the distance between the virtual camera and the character, by a damage rate that is determined based on the distance between the character and the center of the effective shooting radius that changes in accordance with the distance between the virtual camera and the character.
  • it is desirable that the damage value be determined such that the further the distance between the virtual camera and the character is, the smaller the damage value is.
  • it is desirable that the effective shooting radius be determined such that the further the distance between the virtual camera and the character is, the larger the effective shooting radius is.
  • it is desirable that the damage rate be determined such that the further the distance between the character and the center of the effective shooting radius is, the smaller the damage rate is.
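The damage computation described above (damage point = damage value x damage rate, with the effective shooting radius widening and the damage value shrinking as the camera-to-character distance grows) can be sketched numerically. All constants and function names here are illustrative assumptions, not values from the patent.

```python
def effective_radius(distance, base=1.0, spread=0.1):
    """The effective shooting radius widens with camera-to-character
    distance, modelling the spread of the shot."""
    return base + spread * distance

def damage_value(distance, max_damage=100.0, falloff=2.0):
    """Bullet strength drops as the camera-to-character distance grows."""
    return max(0.0, max_damage - falloff * distance)

def damage_rate(offset, radius):
    """Rate from 1.0 at the center of the effective shooting radius down
    to 0.0 at its edge; outside the radius the shot scores nothing."""
    if offset >= radius:
        return 0.0
    return 1.0 - offset / radius

def damage_point(distance, offset):
    """Damage point = damage value x damage rate, per the scheme above."""
    return damage_value(distance) * damage_rate(offset, effective_radius(distance))
```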
  • the present invention provides a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising: computing means for computing a damage point of the character caused by the shooting in accordance with the proportion, to the effective shooting radius, of the overlapping area in which the effective shooting radius and the collision sphere of the enemy overlap.
  • the present invention also provides a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, comprising: computing means for computing a damage point of the character caused by the shooting in accordance with the proportion, to the collision sphere of the enemy, of the overlapping area in which the effective shooting radius and the collision sphere overlap.
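Both variants above score damage by the overlap between the shot's effective area and the enemy's collision sphere. A hypothetical two-dimensional sketch (projecting both onto circles on the screen plane) follows; the standard circle-circle intersection formula and the `relative_to` switch are illustrative assumptions.

```python
import math

def circle_overlap_area(r1, r2, d):
    """Area of intersection of two circles with radii r1 and r2 whose
    centers are d apart (shot circle vs. projected collision sphere)."""
    if d >= r1 + r2:
        return 0.0                                  # no overlap
    if d <= abs(r1 - r2):
        return math.pi * min(r1, r2) ** 2           # full containment
    a1 = r1 * r1 * math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1))
    a2 = r2 * r2 * math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2))
    a3 = 0.5 * math.sqrt((-d + r1 + r2) * (d + r1 - r2)
                         * (d - r1 + r2) * (d + r1 + r2))
    return a1 + a2 - a3

def damage_by_overlap(shot_radius, collision_radius, d,
                      max_damage=100.0, relative_to="shot"):
    """Damage proportional to the overlap, measured against either the
    shot circle (first variant) or the collision sphere (second)."""
    overlap = circle_overlap_area(shot_radius, collision_radius, d)
    ref = shot_radius if relative_to == "shot" else collision_radius
    return max_damage * overlap / (math.pi * ref ** 2)
```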
  • the present invention provides an image processing device that is composed to change a distance between a character and a virtual camera both located in a three-dimensional virtual space, comprising: means for computing the distance between the virtual camera and the character; and virtual camera controlling means for changing a moving speed of the virtual camera in accordance with the computed distance.
  • the present invention provides a controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein a damage point of the character caused by the player's shooting of it is computed on the basis of a distance between the virtual camera and the character and the distance between the character and the center of an effective shooting radius that changes in accordance with the distance between the virtual camera and the character.
  • the present invention provides a controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein a damage point of the character caused by the shooting is computed in accordance with the proportion, to the effective shooting radius, of the overlapping area in which the effective shooting radius and the collision sphere of the enemy overlap.
  • the present invention provides a controlling method of a game device for simulating a player's shooting of a character defined in a three-dimensional virtual space while moving a virtual camera located in the three-dimensional virtual space, wherein a damage point of the character caused by the shooting is computed in accordance with the proportion, to the collision sphere of the enemy, of the overlapping area in which the effective shooting radius and the collision sphere overlap.
  • the present invention provides a game controlling method whereby a game device is controlled such that it determines whether a hit decision area generated in a virtual space in accordance with a player's manipulation overlaps with an object located in the virtual space, and the character is damaged when it is determined that the hit decision area overlaps with the object, the method comprising: a step of obtaining first positional information indicating a position of a character being manipulated by the player in the virtual space; a step of obtaining second positional information indicating a position of the object in the virtual space; a step of computing a distance between the character and the object based on the obtained first and second positional information; a step of changing the size of the hit decision area based on the obtained distance; and a step of computing, when it is determined that the hit decision area and the object overlap with each other, a damage amount for the object based on the obtained distance, and realizing damage to the object based on the obtained damage amount.
  • it is desirable that the hit decision area be made small and the damage amount be made large when the obtained distance is shorter than a predetermined distance.
  • it is desirable that the hit decision area be made large and the damage amount be made small when the computed distance is longer than the predetermined distance.
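The two distance rules above (near: small hit area, strong damage; far: large hit area, weak damage) reduce to a simple threshold lookup. The threshold and the concrete sizes and amounts below are assumptions for illustration only.

```python
def hit_area_and_damage(distance, threshold=10.0,
                        small_radius=1.0, large_radius=4.0,
                        strong_damage=50.0, weak_damage=10.0):
    """Return (hit_decision_radius, damage_amount) for a given
    character-to-object distance, per the near/far rule."""
    if distance < threshold:
        return small_radius, strong_damage   # near: precise but powerful
    return large_radius, weak_damage         # far: forgiving but weak
```

This trade-off rewards closing in on an enemy: hits are harder to land up close only in the sense that the area is smaller, but each hit counts for far more.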
  • the present invention provides a game controlling method whereby a game device is controlled such that it determines whether a hit decision area, generated in a virtual space with a predetermined target point at the center in accordance with a player's manipulation, overlaps with an object located in the virtual space, and the object is damaged when it is determined that the hit decision area overlaps with the object, the method comprising: a step of obtaining positional information indicating a position of the object in the virtual space; a step of computing an area wherein the hit decision area and the object overlap with each other, based on the hit decision area and the obtained positional information; and a step of generating data of a damage amount to be attributed to the object, based on the computed area, and providing damage to the object based on the generated damage amount data.
  • the present invention provides a shooting game controlling method, wherein the shooting game is controlled such that it simulates a player's shooting of a character defined in a three-dimensional virtual space, while a virtual camera located in the three-dimensional virtual space is moving at a predetermined speed, comprising: a step of changing the distance between the character and the virtual camera; a step of changing the moving speed of the virtual camera or the speed of directing the virtual camera to the character, based on the distance between the character and the virtual camera; a step of changing the player's effective shooting radius based on the distance between the virtual camera and the character; a step of determining whether a bullet has hit the character, based on the character's position and the location of the effective shooting radius; and a step of computing, when it is determined that the bullet has hit the character in the determination step, a damage amount caused to the character by the shooting, based on both the distance between the virtual camera and the character as well as the distance between the character and the center of the effective shooting radius.
  • FIG. 1 is a block diagram indicating the general structure of a game device according to one embodiment of the present invention.
  • FIG. 2 is a flow chart of the entire general process performed by the CPU according to the embodiment.
  • FIG. 3 is a flow chart of the process executed in the game mode.
  • FIG. 4 illustrates the relationship between virtual camera movements and the enemies.
  • FIG. 5 is a flow chart of one example for the controlling process of the virtual camera.
  • FIG. 6(A) illustrates the relationship between the enemy sensing distance and the position d of the enemy closest to the virtual camera.
  • FIG. 6(B) shows examples of formulas for computing acceleration.
  • FIG. 7 is a diagram explaining a fixation point for the virtual camera.
  • FIG. 8 is a diagram showing one example of the relationship between the distance to the enemy and the bullet strength.
  • FIG. 9 is a diagram showing one example of the relationship between an effective shooting scope and the bullet strength.
  • FIG. 10 is a diagram showing one example of an effective shotgun radius.
  • FIG. 11 is a flow chart explaining the entire hit decision process.
  • FIG. 12(A) is a flow chart of the hit decision process.
  • FIG. 12(B) is an example in which a collision cone of the shotgun's bullet is divided into 16 sections.
  • FIG. 13(A) is a flow chart explaining the damage process.
  • FIG. 13(B) is an example of a computation for damage rate.
  • FIG. 14 shows an example of the configuration of the damage chart.
  • FIG. 15 shows image examples of objects (enemies) being shot.
  • FIG. 16 is a flow chart of the injury process.
  • FIG. 17 shows an example of the configuration for an injury progression value (damage progression value) chart.
  • FIG. 18 is a diagram explaining a second damage process.
  • FIG. 19 is a diagram explaining a second damage process.
  • FIG. 1 is a block diagram indicating one example of the game device of an arcade type game for playing a gun shooting game, according to the present invention.
  • the basic components of this game device include a game device main body 10 , an input device 11 , a TV monitor 13 , and a speaker 14 .
  • the input device 11 is a weapon such as a gun, shotgun, or a machine gun, for shooting enemies in the game.
  • the weapon is a shotgun used by the game player.
  • a shotgun includes a photoreceptor for reading a scanning spot (a light spot of an electron beam) to determine an impact point on the TV monitor, and a trigger switch equivalent to the trigger of a regular shotgun. Scanning spot detection signals and trigger signals are transmitted to the interface 106 , which will be described hereinafter, via a connecting cord.
  • the TV monitor 13 displays images showing the status of the game development.
  • the TV monitor can be replaced by a projector.
  • the game device main body 10 comprises a central processing unit (CPU) 101 , a ROM 102 , a RAM 103 , a sound device 104 , an input/output interface 106 , a scroll data processor 107 , a coprocessor (auxiliary processor) 108 , a landform contour data ROM 109 , a geometrizer 110 , a form data ROM 111 , a drawing device 112 , a texture data ROM 113 , a texture map RAM 114 , a frame buffer 115 , an image composition device 116 , and a D/A converter 117 .
  • Examples of a storage medium used in this invention as the ROM 102 may include a hard disc, a cartridge-type ROM, a CD-ROM, other well-known media, and communication media (the Internet and other personal computer communication networks).
  • the CPU 101 is connected through a bus-line to: the ROM 102 having a predetermined program stored therein; RAM 103 for storing data; sound device 104 ; input/out interface 106 ; scroll data processor 107 ; coprocessor 108 ; and geometrizer 110 .
  • the RAM 103 is operated as a RAM buffer.
  • Various commands (to display objects, etc) to the geometrizer 110 and matrices obtained by computing the transformation matrix are written on the RAM 103 .
  • the input device 11 (shotgun) is connected to the input/output interface 106 .
  • CPU 101 checks whether the shotgun was fired based on a scanning spot detection signal sent from the shotgun 11 and a trigger signal indicating that the shotgun switch was pulled, and identifies an impact point and the number of shots fired in accordance with the current coordinates (X, Y) of the location of the scanning electron beam on the TV monitor and a location of a target. Then, CPU 101 sets various corresponding flags at predetermined positions in the RAM 103 .
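The light-gun position sensing described above maps the time at which the photoreceptor sees the scanning spot to screen coordinates: the elapsed time since the start of the frame identifies the scan line (Y) and the position along it (X). A hedged sketch follows; the NTSC-like timing constants and the function name are illustrative assumptions.

```python
def beam_position(elapsed_us, line_period_us=63.5,
                  visible_lines=480, pixels_per_line=640):
    """Convert elapsed time (microseconds since vertical sync) at which
    the gun detected the scanning spot into (X, Y) screen coordinates."""
    line = int(elapsed_us // line_period_us)            # Y: which scan line
    t_in_line = elapsed_us - line * line_period_us      # time within the line
    x = int(pixels_per_line * t_in_line / line_period_us)
    y = min(line, visible_lines - 1)
    return x, y
```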
  • the sound device 104 is connected to the speaker 14 through a power amplifier 105 , and audio signals generated by the sound device 104 are amplified in electric power, and then transmitted to the speaker 14 .
  • the CPU 101 reads, on the basis of a program stored in the ROM 102 , the game story development, the landform data in ROM 109 or the form data (three-dimensional data of “objects such as enemy characters” and “the game scenery including landscape, buildings, interiors, and underground passages”) in the ROM 111 , then the CPU 101 determines the situation in the three-dimensional virtual space and executes the shooting process in correspondence with the trigger signals sent from the input device 11 .
  • for the various objects in the virtual game space, their coordinate values in the three-dimensional space are determined; then the transformation matrix for transforming the coordinate values to a viewpoint coordinate system, together with the form data (buildings, landforms, interiors, laboratories, furniture, etc.), is specified to the geometrizer 110 .
  • the coprocessor 108 is connected to the landform data ROM 109 , and accordingly, the landform data for the predetermined movement course of the camera is delivered to the coprocessor 108 (and the CPU 101 ).
  • the coprocessor 108 decides whether a bullet hits a target, computes deviations of objects from the camera's line of sight, executes the process for the movement of the line of sight, and takes on computing floating-points upon such decision and computation. Consequently, the results of the coprocessor's decision as to whether a bullet hit an object and the process for the movement of the line of sight which is movement relative to the position of the objects, are transmitted to the CPU 101 .
  • the geometrizer 110 is connected to the form data ROM 111 and the drawing device 112 .
  • Prestored on the form data ROM 111 is the polygon form data (three-dimensional data of buildings, walls, corridors, interiors, landforms, scenery, the main character, characters on the main character's side, various kinds of enemies (zombies), etc., all being composed of respective vertices).
  • This form data is delivered to the geometrizer 110 .
  • the geometrizer 110 executes the perspective transformation of the form data specified by the transformation matrix sent from the CPU 101 , and obtains the form data in which the coordinate system of the three-dimensional virtual space has been transformed to the coordinate system of a visual field.
  • the drawing device 112 pastes together the transformed form data of the visual field coordinate system and the textures and then outputs it to the frame buffer 115 .
  • the drawing device 112 is connected to the texture data ROM 113 , the texture map RAM 114 , and the frame buffer 115 .
  • Polygon data refers to a data group of relative or absolute coordinates of respective vertices each composing a polygon (mainly a trigon or tetragon), which consists of a set of plural vertices.
  • the polygon data stored in the landform data ROM 109 is set relatively roughly but enough to move the camera in the virtual space along the game storyline.
  • the polygon data stored in the form data ROM 111 is set in more detail concerning the forms, such as enemies and backgrounds, that compose the screen.
  • Scroll data processor 107 processes data such as letters on a scroll screen.
  • the scroll data processor 107 and the frame buffer 115 are connected to the TV monitor 13 via the image composition device 116 and the D/A converter 117 .
  • the polygon screen (a simulation result) for objects (enemies) and landforms (backgrounds) stored temporarily in the frame buffer 115 and the scroll screen for text information (for example, the player's LifeCount value, damage points, etc.) are composed to generate final frame-image data.
  • This frame-image data is converted into analog signals by the D/A converter 117 and transmitted to the TV monitor 13 , thereby, real-time images of the shooting game are displayed.
  • FIG. 2 is a flow chart explaining the outline of the game, and the process flow is broadly classified into a movement mode and a game mode.
  • in the movement mode (S 10 ), the virtual camera moves in the virtual game space created in the computer system in accordance with the pre-programmed game story, and various game status updates are projected onto the screen.
  • after the game mode, it is determined whether the game is over (S 50 ).
  • the player can go back to the game mode of a different status or the exact game mode in which the player lost, if, for example, the main character's damage is not heavy (S 50 ; NO).
  • the game is over (S 50 ; YES) when the set time of each section of the game runs out or when game parameters, such as damage point, satisfy the game termination requirements.
  • FIG. 3 is a flow chart explaining the process in the game mode (S 30 ).
  • the virtual camera moves in the virtual space, and when an enemy appears, an enemy appearance means executes the process to make enemies appear of the type and number preprogrammed for the scene (S 302 ).
  • for this process, well-known techniques such as those of the Japanese Patent Laid-Open Publication No. Hei 10-185547 may be used.
  • a virtual camera controlling means changes the moving speed of the virtual camera in accordance with the distance between the virtual camera (viewpoint) and the enemy (S 304 ).
  • the process to control the moving speed of the virtual camera will be explained hereinafter with reference to FIGS. 4 to 6 .
  • the virtual camera controlling means also changes a fixation point of the virtual camera in accordance with the distance between the virtual camera (viewpoint) and the enemy (S 304 ). This process will be explained hereinafter with reference to FIG. 4.
  • a shooting result determination means determines a shooting result (S 306 ). At first, the shooting result determination means determines whether the bullet has hit the enemy (hit decision), and if the bullet has hit the enemy, a hit flag is set for the shot enemy, and a damage point and an injury progression value incurred by the shot are computed. The hit decision and the computation of a damage point are executed in accordance with the shotgun properties, and the details of this process will be hereinafter explained with reference to FIGS. 11 to 13 .
  • an enemy moving means executes the enemy moving process (S 308 ) for moving another enemy towards a clear space or the place where the previous enemy was shot.
  • For the moving process of the enemies, well-known techniques such as the above-cited Japanese Patent Publication No. Hei 10-165547 may be used.
  • In Step 310 , a determination is made as to whether the fighting will continue. If the fighting is not yet finished (S 310 ; YES), such as in the case that some enemies still remain on the screen, it is determined whether to make further enemies appear based on the program of the game story (S 312 ). If another enemy should appear (S 312 ; YES), the enemy is made to appear (S 302 ). If enemies should no longer appear (S 312 ; NO), the process moves on to the determination of the shooting results for the remaining enemies (S 306 ), and Steps 308 to 310 are repeated. When the fighting is determined to be over (S 310 ; NO), the process returns to the above-mentioned Step 40 . Then, it is determined whether to return to the movement mode that leads to another fight scene (S 40 ), or to finish the game (S 50 ).
  • the virtual camera is a viewpoint located in the three-dimensional virtual space, and images seen from this viewpoint are presented to the player through the monitor.
  • the virtual camera stops its movement. Accordingly, when an enemy appears (i.e. when the virtual camera arrives at a predetermined position), the player stops to shoot the enemy. Further, the moving speed for the virtual camera of a conventional game is constant.
  • the player's speed of defeating the enemies affects the progression of the game.
  • the moving speed of the camera is changed such that the shorter the distance becomes, the slower the moving speed of the camera becomes. Accordingly, the faster the player defeats the approaching enemies (the more the player defeats the enemies far off in the distance), the faster the game progresses, and consequently, the player can obtain a higher score.
  • the more slowly the player defeats the enemies, the more slowly the game progresses; therefore, the player cannot obtain a high score.
  • the speed of defeating the enemies influences the game development and the game results. Further, since the player finds and shoots the enemies while moving toward a destination, the player feels a sense of urgency and thus, an increased enjoyment of the game.
  • the virtual camera moves along a predetermined track with a predetermined speed and angle.
  • There is a certain distance (enemy sensing distance) from the virtual camera wherein the player can sense an enemy and the virtual camera moving speed changes depending on the distance between the virtual camera and the enemy.
  • the virtual camera moving speed decelerates. The closer the enemy approaches the virtual camera, the more the virtual camera moving speed decelerates, and when the enemy reaches a certain proximity, the virtual camera stops its movement.
  • the enemy sensing distance is divided into three areas having the virtual camera at the center: a normal moving speed area in which the virtual camera moves at a normal speed; a low moving speed area in which the virtual camera moves at a low speed; and a non-moving area in which the virtual camera stops its movement. These areas are defined by the enemy's distance from the virtual camera.
  • FIG. 5 is a flow chart explaining the process for controlling the virtual camera (S 304 in FIG. 3).
  • FIG. 6(A) shows the areas of the different moving speeds of camera.
  • FIG. 6(B) explains formulas for computing an acceleration for the camera.
  • a position d of an enemy that is closest to the virtual camera is determined (S 304 a ).
  • the non-moving area of the camera is within the 2.5-meter distance from the virtual camera; the low moving speed area of the camera is within the 10-meter distance from the virtual camera (excluding the non-moving area of the camera); and the normal moving speed area of the camera is an area further than the 10-meter distance from the virtual camera, respectively.
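The three areas above amount to a simple classification by distance. The following is a minimal sketch using the example thresholds of 2.5 m and 10 m given in the text; the function name and return labels are illustrative, not from the patent:

```python
def camera_speed_area(distance_m):
    """Classify the nearest enemy's distance from the virtual camera into
    one of the three movement-speed areas, using the example thresholds
    of 2.5 m and 10 m given in the text."""
    if distance_m <= 2.5:
        return "non-moving"    # the virtual camera stops
    elif distance_m <= 10.0:
        return "low-speed"     # the virtual camera decelerates
    else:
        return "normal-speed"  # the virtual camera moves at its normal speed
```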
  • it is determined whether the enemy's position d is within the non-moving area of the camera (S 304 b ). If it is determined that it is within the non-moving area (S 304 b ; YES), an acceleration in the non-moving area is computed (S 304 c ). The acceleration is obtained by the formula shown in FIG. 6(B).
  • the virtual camera moving speed s is computed based on the above-obtained acceleration (S 304 g ), and it is further determined whether the obtained moving speed s is less than zero (S 304 h ). If the obtained moving speed s is less than zero (S 304 h ; YES), the moving speed s is set to zero (S 304 i ). Conversely, when the obtained moving speed s is not less than zero (S 304 h ; NO), it is determined whether it is more than one, and if it is more than one (S 304 k ; YES), the moving speed s is set to one.
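Steps S 304 g through S 304 k amount to integrating a per-frame acceleration into the speed and clamping the result to the range [0, 1]. A hedged sketch follows; the actual acceleration formulas of FIG. 6(B) are not reproduced in the text, so the sign convention in the comment is an assumption:

```python
def update_camera_speed(speed, acceleration):
    """Integrate the per-frame acceleration into the camera speed and clamp
    the result to [0, 1], mirroring steps S304g through S304k.

    `acceleration` is assumed to be negative when an enemy is near (the
    camera decelerates) and positive otherwise."""
    speed += acceleration  # S304g: compute the new moving speed
    if speed < 0.0:        # S304h/S304i: floor the speed at zero
        speed = 0.0
    elif speed > 1.0:      # S304k: cap the speed at the normal speed of one
        speed = 1.0
    return speed
```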
  • the position d of the enemy (the distance between the virtual camera and the enemy character) is computed. Then, an area that includes the enemy's position d is determined, the acceleration of the virtual camera is computed based on the specified area, and the virtual camera moving speed is computed on the basis of the obtained acceleration.
  • the virtual camera moving speed may be set back to the normal speed. Specifically, it is determined whether the enemy characters in the virtual space have all been defeated. If so determined, the virtual camera moving speed is set back to the normal speed.
  • the virtual camera moving direction and the game story development may be changed (or may follow other preprogrammed branches) in accordance with the time elapsed for enemy characters to appear in the virtual space and to be annihilated. Specifically, the time elapsed is measured; then a virtual camera moving direction is selected and the development of the game story is specified in accordance with the time.
  • the virtual camera moving speed changes in accordance with the distance between the virtual camera and the enemies, and the shorter the distance becomes, the slower the virtual camera moving speed becomes. Accordingly, if the player shoots the enemies from far away, the virtual camera moving speed does not decrease. In other words, the faster the player defeats the enemies, the faster the game advances, thereby the player obtains a higher score.
  • the game device can provide a tense mood. For example, while a target enemy approaches from the back of the screen, the player (the virtual camera) moves toward a certain destination. The moving speed of the player does not change as long as the enemy is far away, therefore, the player feels as if he/she is advancing towards the destination on his/her own. However, as time passes and the enemy approaches the player, the player's moving speed decelerates and the player recognizes that he/she will have a battle with the enemies and so he/she becomes nervous.
  • the player's character stops its movement and remains in that position until the battle with the enemy is over.
  • the player fights with the enemies with a sense of urgency, fearing that he/she might be defeated. Accordingly, the combination of the player's movements and the enemies' movements can provide a real-life tense atmosphere.
  • the virtual camera moves in the three-dimensional virtual space according to the program.
  • the line of sight of the camera is directed to a certain point (a fixation point) in the virtual space and images are generated with the fixation point at the center of the display screen.
  • the fixation point is controlled according to the situation of the enemy which is located in the direction of the virtual camera's line of sight. Control of the fixation point is executed such that the speed with which the fixation point follows the enemy changes based on the distance between the virtual camera and the enemy. More specifically, as the enemy reaches within the enemy sensing distance, the fixation point starts to follow the enemy, and the shorter the distance becomes, the faster the fixation point follows the enemy.
  • the fixation point of the virtual camera is predetermined according to the program.
  • an enemy reaches within the enemy sensing distance (enemy 1 in FIG. 4)
  • the fixation point of the virtual camera starts to follow the enemy, but since the enemy is still far away from the virtual camera, the speed of the fixation point for following the enemy is set to slow.
  • the enemy approaches the virtual camera (enemy 2 in FIG. 4)
  • the enemy following speed of the fixation point is set to fast.
  • the speed of the fixation point for following the enemy will be at the maximum setting.
  • a fixation point setting means sets a fixation point of the virtual camera. Next, it selects an enemy on which the fixation point should be fixed. This enemy is the one within the enemy sensing distance and closest to the virtual camera. Then, the fixation point setting means determines the speed to move the fixation point in accordance with the enemy's position and moves the fixation point at the determined speed.
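The fixation point control described above might be sketched as follows. The linear relation between distance and follow speed, and all numeric constants, are assumptions made for illustration; the patent specifies only that the follow speed increases as the distance shrinks:

```python
def move_fixation_point(fixation, enemy_pos, distance_to_camera,
                        sensing_distance=10.0, max_follow_speed=1.0):
    """Move the camera's fixation point toward the selected enemy, faster
    as the enemy gets closer to the virtual camera. Outside the enemy
    sensing distance the fixation point does not follow the enemy."""
    if distance_to_camera > sensing_distance:
        return fixation  # enemy not sensed: fixation point stays put
    # Follow speed grows linearly as the enemy approaches the camera
    # (an assumed ramp; the patent only states "shorter distance, faster").
    t = max_follow_speed * (1.0 - distance_to_camera / sensing_distance)
    return tuple(f + t * (e - f) for f, e in zip(fixation, enemy_pos))
```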
  • the player's weapon is a shotgun. Accordingly, it is desirable that the shooting results be determined in a manner that effectively demonstrates the shotgun property in which “bullets scatter in a wide radius”. This characteristic of the shotgun means that, if fired at a nearby object, bullets impact a small area with high density, thereby demonstrating their greatest strength. On the other hand, if fired at a distant object, bullets scatter and impact a wide area with low density, so their deadly force cannot be fully realized.
  • damage to be suffered by an enemy is determined based on the following points: the amount of damage changes according to the distance to the enemy; an effective shooting scope (bullet strength) changes in accordance with the above distance; and the damage also changes in accordance with the bullet's impact point within the effective radius.
  • FIG. 8 is a diagram showing one example of the relationship between the distance to the enemy and the bullet strength.
  • the bullet strength and the effective shooting scope are determined to change according to the distance to the enemy. For example, if the distance is 3 meters, the bullet strength is 100 points, but the bullet strength decreases to 60 points when the distance is 5 meters, and to 30 points when the distance is 7 meters. Whereas, if the distance to the enemy is 3 meters, the effective shooting scope is 20 centimeters, and it expands to 60 centimeters when the distance is 5 meters, and to 70 centimeters when the distance is 7 meters.
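Using the example values just given, the relationship can be sketched as a small lookup table. How the game interpolates between the tabulated distances is not stated in the text, so this sketch simply snaps to the next tabulated row; the table name is illustrative:

```python
# Example values from the text: (distance in m, bullet strength in points,
# effective shooting scope in cm).
SHOTGUN_TABLE = [(3.0, 100, 20), (5.0, 60, 60), (7.0, 30, 70)]

def bullet_strength_and_scope(distance_m):
    """Return (bullet strength, effective shooting scope) for a distance by
    picking the nearest tabulated row at or beyond that distance."""
    for d, strength, scope in SHOTGUN_TABLE:
        if distance_m <= d:
            return strength, scope
    # Beyond the last tabulated distance, keep the last row's values.
    return SHOTGUN_TABLE[-1][1], SHOTGUN_TABLE[-1][2]
```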
  • FIG. 9 shows an example of the relationship between the effective shooting scope and the bullet strength.
  • the bullet strength is set in a manner so that the further the impact point is located from the target point of the player (the center of the concentric circle), the more the bullet strength and the damage suffered by the enemy decrease.
  • FIG. 10 shows one example of the relationship between the distance to the enemy and the effective shooting scope.
  • FIG. 11 is a flow chart explaining the process of the shooting results determination process (S 306 in FIG. 3).
  • the hit decision means executes the hit decision process for determining whether the bullet hits the enemy (S 306 a ). This hit decision process will be described hereinafter with reference to FIGS. 12 (A) and 12 (B).
  • a hit flag is set.
  • a damage computing means executes the damage process for computing a damage point caused by the shooting (S 306 b ).
  • the damage process will be described hereinafter with reference to FIGS. 13 (A) and 13 (B).
  • an injury severity computing means executes the injury process for determining the injury severity of the enemies in accordance with the shooting results and expressing the injury visually (S 306 c ). The injury process will be hereinafter described with reference to FIG. 16.
  • FIG. 12(A) is a flow chart explaining the hit determination process flow.
  • the player shoots an enemy S 306 a 1 ; YES
  • the enemy's coordinates are converted into the coordinate system in which the player's position is the origin and the vector of the shooting direction is the Z-axis (S 306 a 2 ).
  • a radius DR, i.e., an effective shooting scope (extent of the scatter shot) at the Z position of the enemy, is computed (S 306 a 3 ) and a distance L between the enemy and the Z-axis is computed (S 306 a 4 ).
  • a radius R of the enemy's collision sphere is computed (S 306 a 6 ) and it is determined whether the bullet has hit the enemy based on the radius DR, the distance L, and the radius R (S 306 a 6 ).
  • the sum of the radius R and the radius DR is greater than or equal to the distance L, it is considered that the bullet has hit the enemy (S 306 a 6 ; YES).
  • the sum is less than the distance L (S 306 a 6 ; NO)
  • the cross section of the collision cone of the shotgun pellets at the enemy's Z position is divided into sections of a predetermined number (for example, 16 sections), and it is determined which sections cover the enemy (S 306 a 8 ).
  • FIG. 12(B) shows the conical cone which is divided into sections 1 to 16 .
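The core comparison of the hit decision (steps S 306 a 2 through S 306 a 6) reduces to a distance test in the shooting coordinate system. A minimal sketch, where the function name and argument layout are assumptions:

```python
import math

def shotgun_hit(enemy_pos, scatter_radius_at_z, enemy_radius):
    """Hit decision of FIG. 12(A): in a coordinate system whose origin is
    the player and whose Z-axis is the shooting direction, the bullet cone
    hits the enemy when the collision-sphere radius R plus the scatter
    radius DR is at least the enemy's distance L from the Z-axis."""
    x, y, _z = enemy_pos                   # already in shooting coordinates
    distance_from_axis = math.hypot(x, y)  # L: distance to the Z-axis
    return enemy_radius + scatter_radius_at_z >= distance_from_axis
```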
  • FIG. 13 is a flow chart explaining the damage process.
  • the body of an enemy is composed of predetermined body sections (for example, head, arms, legs, chest, etc.), and each body section is composed of predetermined body parts (for example, an “arm” has a “shoulder,” “upper arm,” “lower arm,” and “hand”).
  • the presence or absence of a hit flag which is set in the hit decision process explained by FIG. 12(A), tells whether the bullet has hit any body section or body sections of the enemy's body as well as which body section or body sections were hit.
  • the damage rate can be obtained by the formula shown in FIG. 13(B).
  • the minimum damage rate (MIN_DAMAGE_RATE) is a bullet strength percentage at the furthest position from the impact point within the shotgun radius; the minimum damage rate is set to 0.1, for example.
  • the maximum damage rate (MAX_DAMAGE_RADIUS_RATE) in the maximum impact radius is a bullet strength percentage, which determines a radius around the impact point for which the same strength should be applied. In short, the bullet strength at the impact point is maintained within a certain radius from the impact point.
  • the damage radius is a radius wherein the bullet strength at the impact point is effective, and it also represents a range in which bullets scatter (i.e., hit decision area).
  • a distance from the center of the trajectory is the distance between the center of the trajectory and the enemy, and is obtained by subtracting the radius of the enemy's collision sphere from the distance between the center of the trajectory and the center of the enemy's collision sphere.
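Putting the parameters above together, a damage rate function might look like the following. The exact formula of FIG. 13(B) is not reproduced in the text, so the linear fall-off shape and the constant values here are assumptions:

```python
MIN_DAMAGE_RATE = 0.1         # strength fraction at the edge of the damage radius
MAX_DAMAGE_RADIUS_RATE = 0.2  # fraction of the radius that keeps full strength

def damage_rate(dist_from_center, damage_radius):
    """Approximate damage rate: full strength within the maximum-impact
    radius, then a linear fall-off down to MIN_DAMAGE_RATE at the edge of
    the damage radius (linear shape assumed)."""
    full_radius = MAX_DAMAGE_RADIUS_RATE * damage_radius
    if dist_from_center <= full_radius:
        return 1.0            # bullet strength at the impact point is kept
    if dist_from_center >= damage_radius:
        return MIN_DAMAGE_RATE
    t = (dist_from_center - full_radius) / (damage_radius - full_radius)
    return 1.0 - t * (1.0 - MIN_DAMAGE_RATE)
```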
  • FIG. 14 is one example of the configuration of the damage chart.
  • the damage chart stores the damage values that determine a damage point of the enemies that have been shot.
  • the average physical power value of the enemies is set at 200 points.
  • the damage values are set according to the distance between the player and an enemy and a body section that has been shot.
  • the physical power value of an enemy which has been hit is calculated as follows: first, the damage point of the enemy is obtained by multiplying the damage value by the damage rate based on the distance to the impact point (center of the trajectory); then, the computed damage point is subtracted from the physical power value which the enemy had before it was shot.
  • the damage value suffered by the enemy is “30” points.
  • a damage point of the body part is computed by multiplying the damage rate by the damage value (S 306 b 7 ).
  • if damage points have not yet been computed for all the body sections (S 306 b 8 ; NO), damage points of the remaining body sections are computed.
  • the damage points are summed up for the respective body sections and the total damage point to the enemy is obtained (S 306 b 9 ).
  • the sum of the damage points of the respective body sections is the total damage point suffered by the enemy, and this total damage point is subtracted from the physical power value of the enemy. If after the subtraction, the physical power value is less than a predetermined value, the enemy vanishes from the screen.
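The bookkeeping in steps S 306 b 7 through S 306 b 9 and the vanish check can be sketched as follows; the data layout and the vanish threshold value are illustrative assumptions:

```python
def apply_shot(physical_power, hits, vanish_threshold=0):
    """Apply one shot to an enemy. `hits` is a list of
    (damage_value, damage_rate) pairs, one per hit body section.
    Returns the remaining physical power and whether the enemy
    vanishes from the screen (power below the predetermined value)."""
    # S306b7/S306b9: per-section damage points, summed into a total.
    total_damage = sum(value * rate for value, rate in hits)
    physical_power -= total_damage
    return physical_power, physical_power < vanish_threshold
```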
  • FIG. 15 shows image examples of objects (enemies) being shot.
  • FIG. 15 shows two examples wherein the objects were shot at the same impact point but from different distances, the effective shooting scopes being shown with circles of dashed lines, and damage being shown with marks. If the enemy is shot at short range as shown in FIG. 15(A), the bullets scatter around the abdomen, and each body part will be heavily damaged even though there are only a few points of damage. Whereas, if the enemy is shot at long range as shown in FIG. 15(B), the bullets scatter in a wide range throughout the whole body, but the damage to each body part is small.
  • the distance between the virtual camera and an enemy character affects not only the virtual camera's moving speed, but also the amount of damage suffered by the enemy character that was shot. Due to this fact, the player will be conflicted since, on the one hand, shooting at short range demonstrates great bullet strength and enables the player to defeat one enemy in a short time, but the player's moving speed will become slow. On the other hand, if there are many enemies, it may be better to shoot them at long range even with small bullet strength, because the player can damage enemies over a wide range, thereby defeating the enemies more quickly, and the player can move forward in the game. Thus, the entertaining characteristics of the game are enhanced.
  • Each body part on the enemy is provided with damaged body parts that express damage (injury status) corresponding to the predetermined levels.
  • the body part, the “chest,” of an enemy A is provided with damaged body parts that correspond to five levels (0 → 1 → 2 → 3 → 4) of damage.
  • the damaged body parts are composed such that the severity increases at each level.
  • level 0 shows an image of the chest with no damage; at level 1 , a part of the chest is bleeding; at level 2 , a part of the chest is damaged; at level 3 , the entire chest is damaged; and at level 4 , the chest is shattered. Damage levels and their modes of expression may be set differently depending on the enemy types.
  • FIG. 16 is a flow chart explaining the injury process.
  • the presence or absence of a hit flag which is set in the hit decision process explained with reference to FIG. 12, indicates whether a bullet hit a body section of the enemy's body.
  • a hit flag is set for a predetermined body section (S 306 c 1 ), and if the hit flag is set for the body section (S 306 c 1 ; YES), a body part closest to the impact point within the present body section is selected (S 306 c 2 ). Then, the injury progression value chart (damage progression value chart) in FIG. 17 is referred to, and an injury progression value is specified based on the player's distance to the enemy (S 306 c 3 ).
  • FIG. 17 shows one example of the configuration of the injury progression value chart, wherein the injury progression values are set in accordance with the distance to the enemy.
  • the injury progression values of a body part A of a certain body section are set such that the shorter the player's distance to the enemy is, the more the value increases; and the further the distance is, the more the value decreases.
  • the injury progression values only of the body part A are indicated.
  • Other body parts (for example, lower arms) and other body sections (for example, the head) are omitted in FIG. 17, but the injury progression values of those parts are similarly set.
  • the injury progression value that is already stored in a predetermined storage area is added (S 306 c 4 ).
  • the injury progression value of the present impact is added to the injury progression value of the previous impact, thereby increasing the total injury progression value.
  • Parameters of the damaged body parts are referred to based on the accumulated injury progression value, and a damaged body part to be displayed as the shooting result is specified (S 306 c 5 ).
  • the injury progression value of the body part is “7”
  • the damaged body part of level “0” is displayed.
  • the severity of the enemy's injury increases even if impacted only a few times.
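The injury process of steps S 306 c 3 through S 306 c 5 can be sketched as a lookup, an accumulation, and a level selection. The chart values and level thresholds below are illustrative assumptions in the style of FIG. 17, not values from the patent:

```python
# Hypothetical injury progression chart: shorter distance, larger value.
INJURY_CHART = [(3.0, 10), (5.0, 6), (7.0, 3)]  # (distance in m, progression)
LEVEL_THRESHOLDS = [0, 10, 20, 30, 40]          # accumulated value per level 0-4

def add_injury(accumulated, distance_m):
    """Accumulate the injury progression value for one hit (S306c3/S306c4)
    and return the new total plus the damage level to display (S306c5)."""
    value = next((v for d, v in INJURY_CHART if distance_m <= d),
                 INJURY_CHART[-1][1])
    accumulated += value
    # Highest level whose threshold the accumulated value has reached.
    level = max(i for i, t in enumerate(LEVEL_THRESHOLDS) if accumulated >= t)
    return accumulated, level
```

With these assumed values, a single long-range hit accumulates only 3 points, so the level-0 (undamaged) body part is still displayed, matching the behavior described above.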
  • a proportion (“first overlapping proportion”) of the overlapping area (hit determined portion) to the entire effective shooting radius (damage radius) is computed, and in the overlapping area, the damage radius and the enemy's collision sphere overlap.
  • a damage point is then computed based on the obtained proportion. Specifically, the damage point is computed by multiplying a damage value of the damage radius by the first overlapping proportion. Details will be explained with reference to FIG. 18.
  • FIG. 18 explains the second damage process.
  • the proportion of the hit determined portion to the damage radius is 100%.
  • second overlapping proportion of the overlapping area (hit determined portion) to the entire collision sphere of the enemy is computed, and in the overlapping area, the damage radius and the enemy's collision sphere overlap.
  • a damage point is then computed based on the obtained second overlapping proportion. Specifically, the damage point is obtained by multiplying the damage value of the damage radius by the second overlapping proportion. Details will be explained with reference to FIGS. 18 (C) and 18 (D).
  • the proportion of the hit determined portion to the enemy's collision sphere is 100%.
  • the proportion of the hit determined portion to the enemy's collision sphere is 50%. Accordingly, when the damage value provided by the entire damage radius is set to 100, the enemy's damage point is 50 by calculating “damage value (100) ⁇ second overlapping proportion (50%)”. The fact that the proportion of the hit determined portion to the enemy's collision sphere is 50%, means that 50% of the enemy's collision sphere overlaps with the damage radius.
  • the damage radius is divided into lattices of a predetermined size. Then, the number of lattices overlapping the collision sphere is counted. Finally, in the case of the first overlapping proportion, the proportion of the overlapped lattices to all of the damage radius lattices is computed, whereas, in the case of the second overlapping proportion, the proportion of the overlapped lattices to all of the collision sphere lattices is computed.
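The lattice-counting approach can be sketched in a 2D cross-section. The grid resolution and the 2D simplification are assumptions; the patent does not specify either:

```python
import math

def overlap_proportions(damage_center, damage_radius,
                        sphere_center, sphere_radius, cells=60):
    """Estimate the first overlapping proportion (overlap / damage radius)
    and the second (overlap / collision sphere) by dividing space into a
    lattice and counting cells inside each region."""
    # Bounding box covering both circles.
    xmin = min(damage_center[0] - damage_radius, sphere_center[0] - sphere_radius)
    xmax = max(damage_center[0] + damage_radius, sphere_center[0] + sphere_radius)
    ymin = min(damage_center[1] - damage_radius, sphere_center[1] - sphere_radius)
    ymax = max(damage_center[1] + damage_radius, sphere_center[1] + sphere_radius)
    n_damage = n_sphere = n_both = 0
    for i in range(cells):
        for j in range(cells):
            x = xmin + (i + 0.5) * (xmax - xmin) / cells
            y = ymin + (j + 0.5) * (ymax - ymin) / cells
            d = math.hypot(x - damage_center[0], y - damage_center[1]) <= damage_radius
            s = math.hypot(x - sphere_center[0], y - sphere_center[1]) <= sphere_radius
            n_damage += int(d)
            n_sphere += int(s)
            n_both += int(d and s)
    return n_both / n_damage, n_both / n_sphere  # first, second proportion
```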
  • the damage radius and the collision sphere are projected onto a virtual image (not displayed) for the hit decision, and the number of pixels in the overlapped portion of the virtual image is counted.
  • the overlapped area and the value for multiplying the damage do not necessarily correspond exactly.
  • the value may be separated into levels, for example, when “the overlapped portion is 1% or more, but less than 10%”, “the value is 10%”, and when “the overlapped portion is 10% or more, but less than 30%”, “the value is 30%”.
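The level separation in the example can be sketched as follows. Only the two buckets quoted above come from the text; the behavior below 1% and above 30% is an assumption:

```python
def bucketed_proportion(p):
    """Quantize an overlapping proportion into the coarse levels of the
    example in the text."""
    if p < 0.01:
        return 0.0   # below 1%: no multiplier (assumption)
    if p < 0.10:
        return 0.10  # "1% or more, but less than 10%" -> 10%
    if p < 0.30:
        return 0.30  # "10% or more, but less than 30%" -> 30%
    return p         # above 30%: use the raw proportion (assumption)
```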
  • the damage radius is not limited to being circular, but may be oval or polygonal as appropriate. Further, the combination of the damage process of the present invention with other damage processes makes it possible to execute a more fractionalized damage process.
  • FIG. 19 explains the second damage process when an enemy is human-shaped.
  • a proportion of the area for each body part to the damage radius is computed and a damage point is computed based on the total obtained proportion. For example, when each of the arms takes up 10% of the damage radius, the head 10%, the waist 20%, and the chest 30%, the summation of 20 for the arms, 10 for the head, 20 for the waist, and 30 for the chest, i.e. “20+10+20+30”, equals the total damage point of the enemy, i.e., 80.
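The worked example for a human-shaped enemy can be reproduced with a small helper; the dictionary layout and function name are illustrative:

```python
def human_shaped_damage(part_proportions, full_damage_value=100):
    """Sum each body part's share of the damage radius, scaled by the
    damage value of the entire radius, as in the human-shaped example."""
    return sum(round(p * full_damage_value) for p in part_proportions.values())

# The example from the text: two arms at 10% each, head 10%, waist 20%,
# chest 30%, with a damage value of 100 for the entire damage radius.
EXAMPLE_PARTS = {"left_arm": 0.10, "right_arm": 0.10,
                 "head": 0.10, "waist": 0.20, "chest": 0.30}
```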
  • damage can be determined more realistically and fairly. Further, the player can develop his/her skills for the shooting game, for example, “aiming at a target in a manner so that more enemies are included in the effective shooting scope” and “shooting stronger enemies in a manner so that bullets scatter in a wide range”.
  • the present invention has been described as applied to a gun shooting game; however, the present invention is not limited to this application and can be applied to other types of games.
  • an explanation will be given of a game wherein multiple characters are defined in a three-dimensional virtual space, a first character (for example, an enemy character) being manipulated under a predetermined program while a second character (for example, a player's character) is manipulated in accordance with the manipulation information from the player.
  • a position of the player's character is employed instead of the position of the virtual camera and the moving speed of the virtual camera is controlled by the distance between the enemy character and the player's character.
  • the speed of the fixation point to follow enemy characters may be controlled on the basis of the distance between the player's character and the enemy character.
  • an effective attack range of the player is employed instead of the effective shooting radius.
  • damage points of the enemy characters may be computed on the basis of: the distance between the player's character and the enemy character; the effective attack range that changes in accordance with the above distance; and the distance between the enemy character and the center of the effective attack range. The same computing manner is used when the player's character (a character on the player's side) is being attacked.
  • a damage point may be computed based on the proportion of the overlapping area to the entire effective attack range, and in the overlapping area, the effective attack range and the enemy's collision spheres overlap.
  • This damage point computing process may be applied to games in which a player damages objects located in a virtual space. Specifically, the positional information of a character manipulated by the player and the positional information of objects are obtained. Then the distance between the player's character and an object is computed, and the size of a hit decision area is determined based on the distance. When it is determined that the hit decision area overlaps with an object, the damage amount of the object is computed, and then damage based on the damage amount is inflicted on the object.
  • the shooting results are determined in correspondence with the characteristics of the gun, the player can enjoy the game by developing fight strategies using his/her knowledge of the gun properties.
  • the shooting results and the enemies' damage point correspond to each other precisely, and damage can be determined more realistically and fairly.
  • a product invention can be interpreted as a method invention and vice versa.
  • This invention can also be implemented as a program or a recording medium that has a program stored therein for making a computer implement predetermined functions.
  • Examples of the recording medium include a hard disk (HD), a DVD-RAM, a floppy disk (FD), a CD-ROM, and types of memory such as a RAM and a ROM.
  • Examples of the computer include a so-called microcomputer wherein a central processing unit such as a CPU or an MPU interprets programs to execute predetermined processes.
  • a means does not simply imply a physical means; it can also imply a function of the means implemented by software or a hardware circuit.
  • a function of one means may be realized by two or more physical means and functions of two or more means may be realized by one physical means.
  • means in this specification can be implemented by hardware or software, or the combination of both.
  • Implementation by the combination of the hardware and the software is, for example, the implementation by a computer system having a predetermined program therein.
  • a function of one means may be realized by two or more types of hardware or software, or by the combination of both, while two or more functions of one means may also be realized by one type of hardware or software, or by the combination of both.

US10/441,031 2002-05-21 2003-05-20 Game device, image processing device and image processing method Abandoned US20040063501A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2002146900A JP2003334382A (ja) 2002-05-21 2002-05-21 ゲーム装置、画像処理装置及び画像処理方法
JP2002-146900 2002-05-21

Publications (1)

Publication Number Publication Date
US20040063501A1 true US20040063501A1 (en) 2004-04-01

Family

ID=29705737

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/441,031 Abandoned US20040063501A1 (en) 2002-05-21 2003-05-20 Game device, image processing device and image processing method

Country Status (2)

Country Link
US (1) US20040063501A1 (ja)
JP (1) JP2003334382A (ja)


Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006006635 (ja) * 2004-06-25 2006-01-12 Aruze Corp Gaming machine
JP2006006634 (ja) * 2004-06-25 2006-01-12 Aruze Corp Gaming machine
JP5030132B2 (ja) * 2006-01-17 2012-09-19 Nintendo Co., Ltd. Game program and game device
JP4094647B2 (ja) 2006-09-13 2008-06-04 Konami Digital Entertainment Co., Ltd. Game device, game processing method, and program
JP2008119224A (ja) * 2006-11-10 2008-05-29 Namco Bandai Games Inc Program, information storage medium, and game device
JP2009000286A (ja) * 2007-06-21 2009-01-08 Taito Corp Game system and remotely operable game robot
JP5296338B2 (ja) * 2007-07-09 2013-09-25 Nintendo Co., Ltd. Game program and game device
JP5269392B2 (ja) * 2007-11-08 2013-08-21 Capcom Co., Ltd. Program and game system
JP4392446B2 (ja) * 2007-12-21 2010-01-06 Konami Digital Entertainment Co., Ltd. Game device, game processing method, and program
JP4384697B2 (ja) * 2008-03-26 2009-12-16 Konami Digital Entertainment Co., Ltd. Game device, game processing method, and program
JP4218977B2 (ja) * 2008-05-20 2009-02-04 Nintendo Co., Ltd. Game program and game device
JP5498803B2 (ja) * 2010-01-13 2014-05-21 Konami Digital Entertainment Co., Ltd. Game device, game control method, and program
JP5989708B2 (ja) * 2014-05-22 2016-09-07 Colopl Inc. Game program
JP6608171B2 (ja) * 2015-05-22 2019-11-20 Colopl Inc. Game program
CN111265864B (zh) * 2020-01-19 2022-07-01 Tencent Technology (Shenzhen) Company Limited Information display method and apparatus, storage medium, and electronic apparatus
CN112169330B (zh) * 2020-09-25 2021-12-31 Tencent Technology (Shenzhen) Company Limited Picture display method and apparatus for virtual environment, device, and medium


Patent Citations (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3960380A (en) * 1974-09-16 1976-06-01 Nintendo Co., Ltd. Light ray gun and target changing projectors
US4317650A (en) * 1978-09-13 1982-03-02 The Solartron Electronic Group Limited Weapon training systems
USRE35314E (en) * 1986-05-20 1996-08-20 Atari Games Corporation Multi-player, multi-character cooperative play video game with independent player entry and departure
US5382026A (en) * 1991-09-23 1995-01-17 Hughes Aircraft Company Multiple participant moving vehicle shooting gallery
US5988645A (en) * 1994-04-08 1999-11-23 Downing; Dennis L. Moving object monitoring system
US5662523A (en) * 1994-07-08 1997-09-02 Sega Enterprises, Ltd. Game apparatus using a video display device
US5734807A (en) * 1994-07-21 1998-03-31 Kabushiki Kaisha Sega Enterprises Image processing devices and methods
US5880709A (en) * 1994-08-30 1999-03-09 Kabushiki Kaisha Sega Enterprises Image processing devices and methods
US5800265A (en) * 1995-02-24 1998-09-01 Semiconductor Energy Laboratory Co., Ltd. Game machine
US6146278A (en) * 1997-01-10 2000-11-14 Konami Co., Ltd. Shooting video game machine
US6304267B1 (en) * 1997-06-13 2001-10-16 Namco Ltd. Image generating system and information storage medium capable of changing angle of view of virtual camera based on object positional information
US6323895B1 (en) * 1997-06-13 2001-11-27 Namco Ltd. Image generating system and information storage medium capable of changing viewpoint or line-of-sight direction of virtual camera for enabling player to see two objects without interposition
US6972756B1 (en) * 1997-11-25 2005-12-06 Kabushiki Kaisha Sega Enterprises Image generating device
US20020190981A1 (en) * 1997-12-12 2002-12-19 Namco Ltd. Image generation device and information storage medium
US6614436B2 (en) * 1997-12-12 2003-09-02 Namco Ltd Image generation device and information storage medium
US7048632B2 (en) * 1998-03-19 2006-05-23 Konami Co., Ltd. Image processing method, video game apparatus and storage medium
US6763325B1 (en) * 1998-06-19 2004-07-13 Microsoft Corporation Heightened realism for computer-controlled units in real-time activity simulation
US6582299B1 (en) * 1998-12-17 2003-06-24 Konami Corporation Target shooting video game device, and method of displaying result of target shooting video game
US6306033B1 (en) * 1999-03-23 2001-10-23 Square Co., Ltd. Video game item's value being adjusted by using another item's value
US6972734B1 (en) * 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
US6632137B1 (en) * 1999-06-11 2003-10-14 Konami Co., Ltd. Target-game execution method, game machine, and recording medium
US6532015B1 (en) * 1999-08-25 2003-03-11 Namco Ltd. Image generation system and program
US6458034B1 (en) * 1999-08-27 2002-10-01 Namco Ltd. Game system and computer-usable information
US6504539B1 (en) * 1999-09-16 2003-01-07 Sony Computer Entertainment Inc. Method for displaying an object in three-dimensional game
US6821206B1 (en) * 1999-11-25 2004-11-23 Namco Ltd. Game machine, game route selection method, and information storage medium
US6572476B2 (en) * 2000-04-10 2003-06-03 Konami Corporation Game system and computer readable storage medium
US20010029203A1 (en) * 2000-04-10 2001-10-11 Konami Corporation Game system and computer readable storage medium
US6992666B2 (en) * 2000-05-15 2006-01-31 Sony Corporation 3-dimensional-model-processing apparatus, 3-dimensional-model processing method and program-providing medium
US6852032B2 (en) * 2000-12-06 2005-02-08 Nikon Corporation Game machine, method of performing game and computer-readable medium
US20030064764A1 (en) * 2001-10-02 2003-04-03 Konami Corporation Game device, game control method and program
US20070202946A1 (en) * 2004-03-12 2007-08-30 Konami Digital Entertainment Co., Ltd. Shooting Game Device
US20080100531A1 (en) * 2005-03-31 2008-05-01 Sega Corporation Display control program executed in game machine
US7948449B2 (en) * 2005-03-31 2011-05-24 Sega Corporation Display control program executed in game machine

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050164794A1 (en) * 2004-01-28 2005-07-28 Nintendo Co., Ltd. Game system using touch panel input
US7771279B2 (en) 2004-02-23 2010-08-10 Nintendo Co. Ltd. Game program and game machine for game character and target image processing
US20050187023A1 (en) * 2004-02-23 2005-08-25 Nintendo Co., Ltd. Game program and game machine
EP1968350A1 (en) * 2005-12-28 2008-09-10 Konami Digital Entertainment Co., Ltd. Voice processor, voice processing method, program, and information recording medium
CN101347043 (zh) * 2005-12-28 2009-01-14 Konami Digital Entertainment Co., Ltd. Voice processing device, voice processing method, program, and information recording medium
US20090180624A1 (en) * 2005-12-28 2009-07-16 Konami Digital Entertainment Co., Ltd. Voice Processor, Voice Processing Method, Program, and Information Recording Medium
EP1968350A4 (en) * 2005-12-28 2009-11-18 Konami Digital Entertainment Voice processor, voice processing method, program, and information recording medium
US8155324B2 (en) 2005-12-28 2012-04-10 Konami Digital Entertainment Co. Ltd. Voice processor, voice processing method, program, and information recording medium
US20070171221A1 (en) * 2006-01-26 2007-07-26 Nintendo Co., Ltd. Image processing program and image processing device
US7679623B2 (en) * 2006-01-26 2010-03-16 Nintendo Co., Ltd. Image processing program and image processing device
US20070270215A1 (en) * 2006-05-08 2007-11-22 Shigeru Miyamoto Method and apparatus for enhanced virtual camera control within 3d video games or other computer graphics presentations providing intelligent automatic 3d-assist for third person viewpoints
US9327191B2 (en) * 2006-05-08 2016-05-03 Nintendo Co., Ltd. Method and apparatus for enhanced virtual camera control within 3D video games or other computer graphics presentations providing intelligent automatic 3D-assist for third person viewpoints
US20100151943A1 (en) * 2006-11-09 2010-06-17 Kevin Johnson Wagering game with 3d gaming environment using dynamic camera
US8628415B2 (en) * 2006-11-09 2014-01-14 Wms Gaming Inc. Wagering game with 3D gaming environment using dynamic camera
US20080207324A1 (en) * 2007-02-28 2008-08-28 Kabushiki Kaisha Square Enix (Also Trading As Square Enix Co., Ltd.) Game apparatus, virtual camera control method, program and recording medium
US8641523B2 (en) * 2007-02-28 2014-02-04 Kabushiki Kaisha Square Enix Game apparatus, virtual camera control method, program and recording medium
US20100160042A1 (en) * 2007-09-27 2010-06-24 Konami Digital Entertainment Co., Ltd. Game program, game apparatus and game control method
US8241120B2 (en) 2007-09-27 2012-08-14 Konami Digital Entertainment Co., Ltd. Game program, game apparatus and game control method
US20100315415A1 (en) * 2007-11-01 2010-12-16 Konami Digital Entertainment Co., Ltd. Image Processing Device, Method for Processing Image, Information Recording Medium, and Program
EP2087928A3 (en) * 2007-12-21 2015-02-25 Nintendo Co., Ltd. Computer-readable storage medium having game program stored therein and game apparatus
US8740681B2 (en) * 2009-04-20 2014-06-03 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
US20100267451A1 (en) * 2009-04-20 2010-10-21 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
EP2243527A3 (en) * 2009-04-20 2013-12-11 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
US20130122977A1 (en) * 2009-04-20 2013-05-16 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
US20130116019A1 (en) * 2009-04-20 2013-05-09 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
US8740682B2 (en) * 2009-04-20 2014-06-03 Capcom Co., Ltd. Game machine, program for realizing game machine, and method of displaying objects in game
US8665285B2 (en) * 2010-06-09 2014-03-04 Nintendo Co., Ltd. Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method
US20110304620A1 (en) * 2010-06-09 2011-12-15 Nintendo Co., Ltd. Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method
US20130085410A1 (en) * 2011-09-30 2013-04-04 Motorola Mobility, Inc. Method and system for identifying location of a touched body part
US10932728B2 (en) 2011-09-30 2021-03-02 Google Technology Holdings LLC Method and system for identifying location of a touched body part
US9924907B2 (en) * 2011-09-30 2018-03-27 Google Technology Holdings LLC Method and system for identifying location of a touched body part
US8834163B2 (en) * 2011-11-29 2014-09-16 L-3 Communications Corporation Physics-based simulation of warhead and directed energy weapons
US20150243182A1 (en) * 2011-11-29 2015-08-27 L-3 Communications Corporation Physics-based simulation of warhead and directed energy weapons
US20130137066A1 (en) * 2011-11-29 2013-05-30 L-3 Communications Corporation Physics-based simulation of warhead and directed energy weapons
US9333420B2 (en) * 2013-03-12 2016-05-10 Fourthirtythree Inc. Computer readable medium recording shooting game
US20140274239A1 (en) * 2013-03-12 2014-09-18 Fourthirtythree Inc. Computer readable medium recording shooting game
US20230347254A1 (en) * 2013-04-05 2023-11-02 Gree, Inc. Method and apparatus for providing online shooting game
US10589180B2 (en) * 2013-04-05 2020-03-17 Gree, Inc. Method and apparatus for providing online shooting game
US11712634B2 (en) 2013-04-05 2023-08-01 Gree, Inc. Method and apparatus for providing online shooting game
US11192035B2 (en) * 2013-04-05 2021-12-07 Gree, Inc. Method and apparatus for providing online shooting game
US20150314194A1 (en) * 2014-05-01 2015-11-05 Activision Publishing, Inc. Reactive emitters for video games
US10532286B2 (en) * 2014-05-01 2020-01-14 Activision Publishing, Inc. Reactive emitters of a video game effect based on intersection of coverage and detection zones
US11235241B2 (en) 2015-09-30 2022-02-01 Electronic Arts Inc. Route navigation system within a game application environment
US10406437B1 (en) * 2015-09-30 2019-09-10 Electronic Arts Inc. Route navigation system within a game application environment
US11508116B2 (en) * 2017-03-17 2022-11-22 Unity IPR ApS Method and system for automated camera collision and composition preservation
US12005356B2 (en) * 2019-10-31 2024-06-11 Tencent Technology (Shenzhen) Company Limited Virtual prop control method and apparatus, computer-readable storage medium, and electronic device
US11498004B2 (en) * 2020-06-23 2022-11-15 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method
CN114225419A (zh) * 2020-08-27 2022-03-25 Tencent Technology (Shenzhen) Company Limited Virtual prop control method and apparatus, device, storage medium, and program product
US20220254094A1 (en) * 2021-02-09 2022-08-11 Canon Medical Systems Corporation Image rendering apparatus and method
US11688126B2 (en) * 2021-02-09 2023-06-27 Canon Medical Systems Corporation Image rendering apparatus and method
CN113069770A (zh) * 2021-03-29 2021-07-06 Guangzhou Sanqi Huyu Technology Co., Ltd. Game character display method and apparatus, and electronic device
CN117395510A (zh) * 2023-12-12 2024-01-12 Hunan Happy Sunshine Interactive Entertainment Media Co., Ltd. Virtual camera position control method and apparatus

Also Published As

Publication number Publication date
JP2003334382A (ja) 2003-11-25

Similar Documents

Publication Publication Date Title
US20040063501A1 (en) Game device, image processing device and image processing method
JP3745475B2 (ja) Game device and image processing device
US8740681B2 (en) Game machine, program for realizing game machine, and method of displaying objects in game
US6980207B2 (en) Image processing device and information recording medium
US7390254B2 (en) Soccer game method for use in game apparatus, involves recognizing areas pertaining to power of character group, based on calculated arrival times of characters up to sample points
US8556695B2 (en) Information storage medium, image generation device, and image generation method
KR100276549B1 (ko) Image generating device, image generating method, and game machine using the same
JP5234716B2 (ja) Program, information storage medium, and game device
US6972756B1 (en) Image generating device
JP5136742B2 (ja) Electronic game device, control method for electronic game device, and game program
US20100069152A1 (en) Method of generating image using virtual camera, storage medium, and computer device
US20100009734A1 (en) Electronic play device, control method for electronic play device and game program
US20030032484A1 (en) Game apparatus for mixed reality space, image processing method thereof, and program storage medium
EP2394716A2 (en) Image generation system, program product, and image generation method for video games
JP2010068872A (ja) Program, information storage medium, and game device
JP3835005B2 (ja) Game device, game control method, and storage medium
JP4363595B2 (ja) Image generation device and information storage medium
JP4292483B2 (ja) Computer program
JP4117687B2 (ja) Image processing device
JP4114825B2 (ja) Image generation device and information storage medium
JP3736767B2 (ja) Image processing method
JP5161384B2 (ja) Game system, game control method, program, and computer-readable recording medium recording the program
JP2011255114A (ja) Program, information storage medium, and image generation system
JP2005334128A (ja) Program, information storage medium, and game device
JP2012166068A (ja) Game system, game control method, program, and computer-readable recording medium recording the program

Legal Events

Date Code Title Description
AS Assignment

Owner name: KABUSHIKI KAISHA SEGA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHIMOKAWA, HITOSHI;TSUJI, YUKIO;SANBONGI, KAZUTOMO;AND OTHERS;REEL/FRAME:014508/0622

Effective date: 20030916

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION