US20070265074A1 - Game program and game apparatus - Google Patents

Game program and game apparatus

Info

Publication number
US20070265074A1
Authority
US
United States
Prior art keywords
sound
player
player object
action
inputted
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/797,558
Inventor
Eiji Akahori
Shingo Miyata
Toshiharu Izuno
Takuji Hotta
Kentaro Nishimura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nintendo Co Ltd
Original Assignee
Nintendo Co Ltd
Application filed by Nintendo Co., Ltd.
Assigned to NINTENDO CO., LTD. (assignment of assignors interest). Assignors: AKAHORI, EIJI; HOTTA, TAKUJI; IZUNO, TOSHIHARU; MIYATA, SHINGO; NISHIMURA, KENTARO
Publication of US20070265074A1

Classifications

    • A63F13/10
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/424Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/24Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F13/245Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45Controlling the progress of the video game
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • A63F13/57Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1062Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to a type of game, e.g. steering wheel
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/305Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display for providing a graphical or textual hint to the player
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6045Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6063Methods for processing data by generating or executing the game program for sound processing
    • A63F2300/6072Methods for processing data by generating or executing the game program for sound processing of an input signal, e.g. pitch and rhythm extraction, voice recognition
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8047Music games

Definitions

  • the present invention relates to a game program and game apparatus capable of controlling a character displayed on a screen in accordance with a sound input.
  • Patent document 1 discloses a game which controls a character by operating a percussion controller comprised of two congas disposed side-by-side. More specifically, when a player hits a right conga, the character moves to the right. When the player hits a left conga, the character moves to the left. Finally, when the player simultaneously hits both the right and left congas, the character jumps.
  • the percussion controller includes a sound detecting device, and when the sound detecting device detects sound generated by handclaps of the player, for example, the character performs an action so as to toss an item.
  • an object of the present invention is to prevent, in an action game which causes a player character to perform a predetermined action in accordance with a sound input, the player character from performing an action which is not intended by a player, even if the sound input is mistakenly detected.
  • the present invention has the following features to attain the object mentioned above. Note that reference numerals and figure numbers are shown in parentheses below for assisting a reader in finding corresponding components in the figures to facilitate the understanding of the present invention, but they are in no way intended to restrict the scope of the invention.
  • a computer-readable storage medium is a computer-readable storage medium storing a game program instructing a computer (31) of a game apparatus (3), which is connected to sound inputting means (6M) and a display apparatus (2), to function as: display controlling means (S32) for generating a game image including a player object, of a virtual game world, which is operated by a player, and causing the display apparatus to display the game image; movement controlling means (S30) for causing the player object to move in the virtual game world; sound detecting means (S42) for determining whether a sound is inputted through the sound inputting means; object position determining means (S54) for determining whether the player object is positioned in a specific area of the virtual game world; and action controlling means (S56) for causing the player object to perform a specific action, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world.
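  • As a minimal illustration of this configuration, a per-frame routine might combine the four determinations as sketched below; the rectangular area, the volume threshold and every identifier are assumptions made for illustration and are not taken from the disclosure:

        from dataclasses import dataclass

        @dataclass
        class Area:                     # a "specific area", here a simple rectangle on the course
            x_min: float
            x_max: float
            z_min: float
            z_max: float
            def contains(self, x, z):
                return self.x_min <= x <= self.x_max and self.z_min <= z <= self.z_max

        @dataclass
        class PlayerObject:
            x: float = 0.0              # current position
            z: float = 0.0
            vx: float = 0.0             # current speed vector
            vz: float = 1.0
            action: str = "run"

        def process_frame(player, specific_area, mic_level, threshold=0.5):
            # movement controlling means: advance the player object
            player.x += player.vx
            player.z += player.vz
            # sound detecting means: a sound input is a microphone level above the threshold
            sound_inputted = mic_level >= threshold
            # object position determining means + action controlling means:
            # the specific action is performed only when both conditions hold
            if sound_inputted and specific_area.contains(player.x, player.z):
                player.action = "jump"  # the specific action
            return player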
  • the movement controlling means may cause the player object to move in accordance with an instruction of the player, or may allow the computer to automatically move the player object.
  • the movement controlling means may cause the player object not to perform any action in response to the inputted sound, or may cause the player object to perform another action different from the specific action.
  • the action controlling means determines whether or not the player object is positioned in the specific area based on a positional relationship between a current position of the player object and the specific area.
  • the action controlling means may determine whether or not the player object is to be positioned inside the specific area in a foreseeable future (e.g., in a frame immediately following a current frame), taking into consideration the current position of the player object which is already in the specific area, or a current moving direction with respect to the current position of the player object.
  • the specific action may be an action related to the specific area.
  • the specific action may be different from one specific area to another.
  • the object position determining means may determine whether the player object is positioned in the specific area, if the sound detecting means determines that the sound is inputted through the sound inputting means ( FIG. 8 ).
  • the action controlling means may cause the player object to perform an action different from the specific action, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area (S 58 ).
  • the action controlling means may cause the player object to perform: (a) an action which exerts an influence on a movement of the player object, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world; and (b) an action which does not exert any influence on the movement of the player object, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area of the virtual game world.
  • the action which exerts an influence on a movement of the player object indicates an action in which at least one of a position, moving direction and moving speed of the player object is changed accordingly when the action is performed. For example, when the action which exerts an influence on a movement of the player object is performed, movement parameters (e.g., a speed parameter, acceleration parameter, orientation parameter, etc.) of the player object are changed in accordance with the inputted sound. On the other hand, when the action which does not exert any influence on the movement of the player object is performed, the player object is caused to perform a predetermined action on the spot (while continuing to move if the player object is caused to move due to other factors).
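  • To make the contrast concrete, a sketch of the two cases might look as follows; the player object is treated as a plain dictionary and the jump speed is an arbitrary assumed value:

        def on_sound_input(player, in_specific_area, jump_speed=5.0):
            if in_specific_area:
                # (a) an action which exerts an influence on the movement:
                # a movement parameter (here the vertical speed) is changed
                # in accordance with the inputted sound
                player["vy"] = jump_speed
            else:
                # (b) an action which exerts no influence on the movement:
                # only the displayed animation changes, while the position,
                # moving direction and moving speed are left untouched
                player["animation"] = "provocative"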
  • the game program may realize a game providing the player with a specific challenge, and the action controlling means may cause the player object to perform: (a) an action which exerts an influence on a success or failure of the specific challenge, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world; and (b) an action which does not exert any influence on the success or failure of the specific challenge, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area of the virtual game world.
  • the specific challenge includes “to reach a goal as fast as possible”, “to reach a goal faster than a rival character”, “to acquire as high score as possible”, and “to defeat an enemy character”, for example, and is different depending on a genre or type of the game.
  • the sound detecting means may determine whether the sound is inputted through the sound inputting means, if the object position determining means determines that the player object is positioned in the specific area ( FIG. 9 ).
  • the specific action may be a jump action.
  • the specific area may indicate a jump ramp disposed in the virtual game world.
  • the sound detecting means may determine that the sound is inputted through the sound inputting means, when a sound having a predetermined volume level or higher is inputted through the sound inputting means (S 42 ).
  • the game apparatus may be connected to operation means ( 6 R, 6 L), and the movement controlling means may cause the player object to move based on a signal outputted from the operation means.
  • the operation means and the sound inputting means may have a common housing, or may have separate housings.
  • the operation means may be a percussion controller ( 6 ).
  • the game program may instruct the computer to further function as input operation detecting means (S46) for determining whether the player operates the operation means based on the signal outputted from the operation means, and the action controlling means may cause the player object not to perform the specific action, at least when the player operates the operation means. Thus, it becomes possible to ignore noise, other than sound of handclaps, which is inputted through the sound inputting means.
  • the action controlling means may cause the player object not to perform the specific action, when the player operates the operation means and while a predetermined time period has not yet passed after the player finishes operating the operation means (S 48 ). Thus, it becomes possible to ignore the noise, other than the sound of the handclaps, which is inputted through the sound inputting means.
  • the display controlling means may display an operation guiding image for prompting the player to input the sound in a vicinity of the specific area ( FIG. 5 ).
  • the game program may be able to realize a game in which at least a first player and a second player simultaneously play against each other, a first player object which can be operated by the first player and a second player object which can be operated by the second player may exist in the virtual game space, the movement controlling means may cause the first player object and the second player object to individually move in the virtual game world, and the action controlling means may include: first player object action controlling means for causing the first player object to perform the specific action, when the sound is inputted through the sound inputting means and the first player object is positioned in the specific area of the virtual game world; and second player object action controlling means for causing the second player object to perform the specific action, when the sound is inputted through the sound inputting means and the second player object is positioned in the specific area of the virtual game world.
  • the game apparatus is further connected to first operation means ( 6 ) operated by the first player and second operation means ( 7 ) operated by the second player, each of the first operation means and the second operation means includes the sound inputting means ( 6 M, 7 M), the first player object action controlling means may cause the first player object to perform the specific action, when the sound is inputted through the sound inputting means included in the first operation means and the first player object is positioned in the specific area of the virtual game world, and the second player object action controlling means may cause the second player object to perform the specific action, when the sound is inputted through the sound inputting means included in the second operation means and the second player object is positioned in the specific area of the virtual game world.
  • the game program may instruct the computer to further function as: first virtual camera setting means for setting a parameter (e.g., a view point, fixation point, camera orientation, etc.) of a first virtual camera which images the first player object based on a current position of the first player object; second virtual camera setting means for setting a parameter of a second virtual camera which images the second player object based on a current position of the second player object; and display controlling means for causing the display apparatus to simultaneously display, on the display apparatus, a first game image generated by imaging the virtual game space by means of the first virtual camera and a second game image generated by imaging the virtual game space by means of the second virtual camera.
  • a game apparatus comprises: sound inputting means (6M); a display apparatus (2); display controlling means (31, S32) for generating a game image including a player object, of a virtual game world, which is operated by a player, and causing a display apparatus to display the game image; movement controlling means (31, S30) for causing the player object to move in the virtual game world; sound detecting means (31, S42) for determining whether a sound is inputted through the sound inputting means; object position determining means (31, S54) for determining whether the player object is positioned in a specific area of the virtual game world; and action controlling means (31, S56) for causing the player object to perform a specific action, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world.
  • in an action game for causing a player character to perform a predetermined action in accordance with a sound input, it becomes possible to prevent the character from performing an action which is not intended by the player, even if the sound input is mistakenly detected.
  • FIG. 1 is an external view illustrating a configuration of a game system according to an embodiment of the present invention
  • FIG. 2 is a block diagram illustrating an internal configuration of a game apparatus body
  • FIG. 3 shows an exemplary game image displayed on a screen of a television
  • FIG. 4 shows another exemplary game image displayed on the screen of the television
  • FIG. 5 shows still another exemplary game image displayed on the screen of the television
  • FIG. 6 shows a memory map of a work memory
  • FIG. 7 is a flowchart illustrating a flow of a process executed by a CPU
  • FIG. 8 is a flowchart illustrating a flow of a handclap process
  • FIG. 9 is a flowchart illustrating a flow of the handclap process according to a variant
  • FIG. 10 is an external view illustrating the game system obtained when two players simultaneously play a game against each other.
  • FIG. 11 is an exemplary game image obtained when the two players simultaneously play the game against each other.
  • FIG. 1 is an external view illustrating a configuration of the game system according to the embodiment of the present invention.
  • a game system 1 comprises a television 2, a game apparatus body 3, and a conga controller 6, and has mounted thereon a DVD-ROM 4 and a memory card 5.
  • the DVD-ROM 4 and the memory card 5 are mounted on the game apparatus body 3 in a removable manner.
  • the conga controller 6 is connected, by a communication cable, to any of four controller port connectors provided on the game apparatus body 3 .
  • the television 2 is connected to the game apparatus body 3 by an AV cable or the like. Note that the game apparatus body 3 and the controller 6 may communicate with each other by radio communication.
  • the conga controller 6 is provided with a microphone 6 M and three switches: a start button 6 S, a right strike surface 6 R, and a left strike surface 6 L. As described herein below, a player can control a motion of a character in a virtual game world by hitting the right strike surface 6 R or left strike surface 6 L. Instead of the conga controller 6 , any controller including a microphone may be used.
  • the DVD-ROM 4 fixedly stores a game program, game data and the like.
  • the DVD-ROM 4 is mounted on the game apparatus body 3 when the player plays a game.
  • an external storage medium such as a CD-ROM, an MO, a memory card, a ROM cartridge or the like may be used as means for storing the game program and the like.
  • the game apparatus body 3 reads the game program stored in the DVD-ROM 4 , and then performs a process in accordance with the read game program.
  • the television 2 displays, on a screen, image data outputted from the game apparatus body 3 .
  • the memory card 5 has a rewritable storage medium, e.g., a flash memory, as a backup memory for storing data such as saved data of the game.
  • FIG. 2 is a block diagram illustrating an internal configuration of the game apparatus body 3 .
  • each component of the game system 1 will be described in more detail with reference to FIG. 2 .
  • the game apparatus body 3 comprises a CPU 31, a work memory 32, an external memory interface (I/F) 33, a controller interface (I/F) 34, a video RAM (VRAM) 35, a graphics processing unit (GPU) 36, and an optical disc drive 37.
  • the optical disc drive 37 drives the DVD-ROM 4 mounted on the game apparatus body 3 , and then the game program stored in the DVD-ROM 4 is loaded into the work memory 32 .
  • the game starts when the CPU 31 executes the program stored in the work memory 32 .
  • the player plays the game by using the conga controller 6 .
  • the conga controller 6 outputs operation data to the game apparatus body 3 .
  • the operation data outputted from the conga controller 6 is supplied to the CPU 31 via the controller I/F 34 .
  • the CPU 31 performs a game process based on inputted operation data.
  • the GPU 36 is used for image data generation and the like performed in the game process.
  • the GPU 36 performs, for coordinates of a solid model of an object or figure (e.g., an object comprised of polygons) placed in a three-dimensional virtual game world, arithmetic processing (e.g., rotation, scaling and deformation of a three-dimensional model, and coordinate transformation from a world coordinate system to a camera coordinate system or screen coordinate system). Further, the GPU 36 generates a game image by writing, based on a predetermined texture, color data (RGB data) of each pixel of a solid model projected on the screen coordinate system into the VRAM 35. The GPU 36 thus generates the game image to be displayed on the television 2, and outputs the game image to the television 2 as necessary.
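  • As a compact sketch of the coordinate transformations mentioned above (world coordinate system to camera coordinate system to screen coordinate system), the following uses a simple translate-and-project model; the camera rotation is omitted, points in front of the camera (z > 0) are assumed, and all values are assumptions for illustration:

        def world_to_screen(p_world, cam_pos, focal_length=1.0, width=640, height=480):
            # world -> camera: express the point relative to the camera position
            x = p_world[0] - cam_pos[0]
            y = p_world[1] - cam_pos[1]
            z = p_world[2] - cam_pos[2]
            # camera -> screen: perspective projection onto the screen plane
            sx = width / 2 + focal_length * (x / z) * (width / 2)
            sy = height / 2 - focal_length * (y / z) * (height / 2)
            return sx, sy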
  • Although the present embodiment shows a hardware configuration in which a memory dedicated for image processing (the VRAM 35) is separately provided, the present invention is not limited thereto; for example, a UMA (Unified Memory Architecture), in which a single memory is shared for image processing, may be adopted.
  • the work memory 32 stores various programs and pieces of data loaded from the DVD-ROM 4.
  • These pieces of data include, for example, data, which is related to polygons comprising the three-dimensional model placed in the virtual game world, and a texture used for coloring the polygons.
  • FIG. 3 shows an exemplary game image displayed on a screen of the television 2 .
  • Although the present embodiment illustrates an example where the present invention is applied to a racing game, the present invention is not limited thereto; the present invention is applicable to an arbitrary game.
  • On the screen of the television 2, a racecourse set in a virtual game world, a player character operated by the player, and an obstacle and coins disposed on the racecourse are displayed.
  • the player operates the player character by using the conga controller 6 : so as to collide with as few obstacles as possible; so as to acquire as many coins as possible; and so as to reach a goal as fast as possible.
  • the player can input instructions such as an acceleration instruction, a rightward movement instruction, a leftward movement instruction, a deceleration instruction, and a jump instruction.
  • the player can input the acceleration instruction by alternately and continuously hitting the right strike surface 6 R and left strike surface 6 L of the conga controller 6 .
  • When the acceleration instruction is inputted, the character accelerates forward (i.e., in a direction in which the character faces or in a moving direction of the character).
  • the player can input the rightward movement instruction by continuously hitting the right strike surface 6 R of the conga controller 6 .
  • When the rightward movement instruction is inputted, the character moves to the right (i.e., in a rightward direction with respect to the direction in which the character faces or in a rightward direction with respect to the moving direction of the character).
  • The more rapidly the player continuously hits the right strike surface 6R, the more quickly the character moves to the right. At this time, a current direction in which the character faces and a current moving direction of the character may be changed to the right.
  • the player can input the leftward movement instruction by continuously hitting the left strike surface 6 L of the conga controller 6 .
  • When the leftward movement instruction is inputted, the character moves to the left (i.e., in a leftward direction with respect to the direction in which the character faces or in a leftward direction with respect to the moving direction of the character).
  • The more rapidly the player continuously hits the left strike surface 6L, the more quickly the character moves to the left. At this time, the current direction in which the character faces and the current moving direction of the character may be changed to the left.
  • the player can input the deceleration instruction by pressing both the right strike surface 6 R and left strike surface 6 L of the conga controller 6 for a predetermined time period or longer.
  • When the deceleration instruction is inputted, the character decelerates.
  • the player can input the jump instruction by clapping his or her hands in the vicinity of the conga controller 6 . Specifically, sound generated by handclaps of the player is converted into an electric signal by the microphone 6 M, so as to be inputted to the game apparatus body 3 .
  • When the jump instruction is inputted, the character jumps. Note that the character jumps in accordance with the handclaps of the player only when the character is positioned on a jump ramp (see FIG. 4) which is set in the virtual game world.
  • In a case where ambient noise (including voice or handclaps of any person other than the player) is inputted through the microphone 6M when the character is not positioned on the jump ramp, the character does not jump, but performs a provocative action, as shown in FIG. 5.
  • the provocative action does not exert any influence on the movement of the character. Therefore, even if the character performs the provocative action, a movement speed or movement direction of the character never changes according to the provocative action.
  • When any of the above instructions is inputted during the provocative action, the provocative action is immediately released, thereby allowing the character to perform an action (i.e., an acceleration, rightward movement, leftward movement or deceleration) in accordance with the inputted instruction.
  • Although the present embodiment illustrates an example where the character jumps in accordance with the handclaps of the player, the present invention is not limited thereto.
  • When the character is positioned in an acceleration lane as shown in FIG. 11, the character may accelerate in accordance with the handclaps of the player (more rapidly than when the acceleration instruction is inputted). Further, both the jump ramp and the acceleration lane may exist on the racecourse.
  • Hereinafter, an area (e.g., the jump ramp or acceleration lane) in which the character performs a special movement action such as a jump or acceleration in accordance with the handclaps of the player is referred to as a “handclap area”.
  • Also, an action (e.g., the provocative action) performed by the character in accordance with the handclaps of the player when the character is positioned outside the handclap area is referred to as a “performance action”.
  • a handclap image for prompting the player to clap his or her hands is displayed in the vicinity of the jump ramp (handclap area).
  • FIG. 6 shows a memory map of the work memory 32 .
  • the work memory 32 stores a game program 40 , game image data 41 , racecourse data 42 , character controlling data 43 , a sound input flag 44 , and a sound input timer 45 .
  • the game image data 41 is data for generating a game image displayed on the screen of the television 2 , and includes a character image, a background image, and the handclap image.
  • the racecourse data 42 is data showing a shape of the racecourse set in the virtual game world, and includes handclap area information indicating a position of the handclap area.
  • the character controlling data 43 is data for controlling the movement of the character in the virtual game world, and includes current position information and speed information.
  • the current position information is information (coordinate data) indicating a current position of the character
  • the speed information is information (vector data) indicating a movement speed of the character.
  • the sound input flag 44 and the sound input timer 45 are a flag and a timer, respectively, used in the handclap process to be described later.
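  • The contents of the work memory 32 of FIG. 6 can be pictured as the following plain data structures; the field names and types are illustrative assumptions, not taken from the disclosure:

        from dataclasses import dataclass, field
        from typing import List, Tuple

        Vec3 = Tuple[float, float, float]

        @dataclass
        class RacecourseData:                 # racecourse data 42
            course_shape: object              # shape of the racecourse
            handclap_areas: List[object] = field(default_factory=list)  # handclap area information

        @dataclass
        class CharacterControllingData:       # character controlling data 43
            position: Vec3 = (0.0, 0.0, 0.0)  # current position information (coordinate data)
            speed: Vec3 = (0.0, 0.0, 0.0)     # speed information (vector data)

        @dataclass
        class WorkMemory:                     # work memory 32
            game_image_data: object = None    # character image, background image, handclap image (41)
            racecourse: RacecourseData = field(default_factory=lambda: RacecourseData(None))
            character: CharacterControllingData = field(default_factory=CharacterControllingData)
            sound_input_flag: bool = False    # sound input flag 44
            sound_input_timer: int = 0        # sound input timer 45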
  • When the game program 40 starts to be executed, the CPU 31 firstly displays, in step S10, an initial game image. At this time, initial values of the current position and movement speed of the character are set.
  • In step S12, it is determined whether the right strike surface 6R has been continuously hit. For example, in a case where, within a predetermined time period after the right strike surface 6R is hit, the right strike surface 6R is hit again, it is determined that the right strike surface 6R has been continuously hit. When it is determined that the right strike surface 6R has been continuously hit, the process proceeds to step S14. On the other hand, when it is determined that the right strike surface 6R has not been continuously hit, the process proceeds to step S16.
  • In step S14, the speed information is updated (specifically, a direction of a speed vector is changed) such that the character is to turn clockwise (i.e., the character is to move or accelerate to the right). Thereafter, the process proceeds to step S28.
  • In step S16, it is determined whether the left strike surface 6L has been continuously hit. For example, in a case where, within a predetermined time period after the left strike surface 6L is hit, the left strike surface 6L is hit again, it is determined that the left strike surface 6L has been continuously hit. When it is determined that the left strike surface 6L has been continuously hit, the process proceeds to step S18. On the other hand, when it is determined that the left strike surface 6L has not been continuously hit, the process proceeds to step S20.
  • In step S18, the speed information is updated (specifically, the direction of the speed vector is changed) such that the character is to turn counterclockwise (i.e., the character is to move or accelerate to the left). Thereafter, the process proceeds to step S28.
  • In step S20, it is determined whether the right strike surface 6R and the left strike surface 6L have been alternately and continuously hit. For example, in a case where, within a predetermined time period after the left strike surface 6L is hit, the right strike surface 6R is hit, or in a case where, within a predetermined time period after the right strike surface 6R is hit, the left strike surface 6L is hit, it is determined that the right strike surface 6R and the left strike surface 6L have been alternately and continuously hit. When it is determined that the right strike surface 6R and the left strike surface 6L have been alternately and continuously hit, the process proceeds to step S22. On the other hand, when it is determined that the right strike surface 6R and the left strike surface 6L have not been alternately and continuously hit, the process proceeds to step S24.
  • In step S22, the speed information is updated (specifically, a magnitude of the speed vector is changed) such that the character is to accelerate forward. Thereafter, the process proceeds to step S28.
  • In step S24, it is determined whether the right strike surface 6R and the left strike surface 6L have been pressed for the predetermined time period or longer. For example, when both the right strike surface 6R and the left strike surface 6L have been pressed for one second or longer, it is determined that both the right strike surface 6R and the left strike surface 6L have been pressed for the predetermined time period or longer. When it is determined that both the right strike surface 6R and the left strike surface 6L have been pressed for the predetermined time period or longer, the process proceeds to step S26. On the other hand, when both the right strike surface 6R and the left strike surface 6L have not been pressed for the predetermined time period or longer, the process proceeds to step S28.
  • In step S26, the speed information is updated (specifically, the magnitude of the speed vector is changed) such that the character is to decelerate. Thereafter, the process proceeds to step S28.
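  • The branch structure of steps S12 through S26 can be sketched as below; the four boolean flags are assumed to be derived from the conga controller's operation data (hit timing within the predetermined time period), the speed information is a plain dictionary, and the turning angles and scale factors are arbitrary assumed values:

        import math

        def rotate(speed, angle):
            # change only the direction of the speed vector, keeping its magnitude
            vx, vz = speed["vx"], speed["vz"]
            speed["vx"] = vx * math.cos(angle) - vz * math.sin(angle)
            speed["vz"] = vx * math.sin(angle) + vz * math.cos(angle)

        def update_speed_information(speed, right_repeated, left_repeated, alternating, both_held):
            if right_repeated:                   # S12 -> S14: turn clockwise (move/accelerate to the right)
                rotate(speed, -0.1)
            elif left_repeated:                  # S16 -> S18: turn counterclockwise (move/accelerate to the left)
                rotate(speed, +0.1)
            elif alternating:                    # S20 -> S22: accelerate forward (larger magnitude)
                speed["vx"] *= 1.1
                speed["vz"] *= 1.1
            elif both_held:                      # S24 -> S26: decelerate (smaller magnitude)
                speed["vx"] *= 0.9
                speed["vz"] *= 0.9
            # step S28 (handclap process), step S30 (position update) and step S32 (redraw) follow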
  • In step S28, the handclap process is executed. In the handclap process, the character is controlled in accordance with the handclaps of the player.
  • the handclap process will be described in detail with reference to FIG. 8 .
  • the CPU 31 firstly determines, in step S 40 , whether the sound input flag 44 is turned on. Note that the sound input flag 44 is turned off in an initial state. When it is determined that the sound input flag 44 is turned on, the process proceeds to step S 46 . On the other hand, when it is determined that the sound input flag 44 is turned off, the process proceeds to step S 42 .
  • In step S42, it is determined whether sound having a predetermined volume level or higher has been detected by the microphone 6M. When it is determined that the sound having the predetermined volume level or higher has been detected, the process proceeds to step S44. On the other hand, when it is determined that the sound having the predetermined volume level or higher has not been detected, the handclap process is to be finished.
  • In step S44, the sound input flag 44 is turned on, thereby causing the sound input timer 45 to be started. Thereafter, the process proceeds to step S46.
  • In step S46, it is determined whether either the right strike surface 6R or the left strike surface 6L is being pressed. When it is determined that either the right strike surface 6R or the left strike surface 6L is being pressed, the process proceeds to step S50. On the other hand, when it is determined that neither the right strike surface 6R nor the left strike surface 6L is being pressed, the process proceeds to step S48. That is, in step S46, when the player is pressing either of the two strike surfaces (i.e., in this state, it is determined that the sound inputted through the microphone 6M is not the handclaps of the player), the sound inputted through the microphone 6M is to be ignored.
  • In step S48, it is determined whether neither the right strike surface 6R nor the left strike surface 6L has been pressed for a predetermined time period (e.g., a 10-frame period) or longer. When it is determined that neither strike surface has been pressed for the predetermined time period or longer, the process proceeds to step S52. On the other hand, when the predetermined time period has not yet passed after either the right strike surface 6R or the left strike surface 6L is pressed, the process proceeds to step S50.
  • That is, in step S48, when the sound is inputted through the microphone 6M before the predetermined time period has passed after either of the two strike surfaces is pressed (i.e., in this state, it is determined that the sound inputted through the microphone 6M is not the handclaps of the player, because a certain time period is required from when the player removes his or her hand from the pressed strike surface to when the player starts to clap his or her hands), the sound inputted through the microphone 6M is to be ignored.
  • In step S50, the sound input flag 44 is turned off, thereby causing the sound input timer 45 to be reset. Thereafter, the handclap process is to be finished.
  • In step S52, it is determined whether a count value of the sound input timer 45 is a predetermined value (e.g., a 10-frame period) or greater. When it is determined that the count value of the sound input timer 45 is the predetermined value or greater, the process proceeds to step S54. On the other hand, when it is determined that the count value of the sound input timer 45 is less than the predetermined value, the handclap process is to be finished.
  • That is, in step S52, when either of the two strike surfaces is pressed before a predetermined time period (e.g., a 10-frame period) has passed after the sound is inputted through the microphone 6M (i.e., in this state, it is determined that the sound inputted through the microphone 6M is not the handclaps of the player, because a certain time period is required from when the player finishes clapping his or her hands to when the player presses the strike surface), the sound inputted through the microphone 6M is to be ignored.
  • In step S54, by reading the handclap area information of the racecourse data 42 and the current position information of the character controlling data 43, it is determined whether the character is positioned in the handclap area. When it is determined that the character is positioned in the handclap area, the process proceeds to step S56. On the other hand, when it is determined that the character is positioned outside the handclap area, the process proceeds to step S58.
  • In step S56, the speed information is updated such that the character is to perform a special movement action (e.g., a jump). In a case where a plurality of types of handclap areas (e.g., the jump ramp and the acceleration lane) exist on the racecourse, the speed information is changed such that the special movement action corresponding to the type of the handclap area in which the character is positioned is to be performed. Note that, instead of updating the speed information, the character may be caused to perform the special movement action in another manner; for example, the character may be caused to instantaneously move from one place to another on the racecourse.
  • In step S58, an image of the character is changed such that the character is to perform the performance action (e.g., the provocative action).
  • In step S60, the sound input flag 44 is turned off, thereby causing the sound input timer 45 to be reset. Thereafter, the handclap process is to be finished.
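  • Put together, the handclap process of steps S40 through S60 can be sketched as the following per-frame routine; the 10-frame periods, the volume threshold, the dictionary-based character and the way presses are reported (a frames_since_press counter) are assumptions chosen only to mirror the description:

        from dataclasses import dataclass

        CONFIRM_FRAMES = 10      # the predetermined time period / value (a 10-frame period)
        THRESHOLD = 0.5          # the predetermined volume level (arbitrary 0..1 scale)

        @dataclass
        class SoundInputState:
            flag: bool = False   # sound input flag 44
            timer: int = 0       # sound input timer 45

        def handclap_process(state, character, mic_level, surface_pressed, frames_since_press):
            if not state.flag:                              # S40
                if mic_level < THRESHOLD:                   # S42: no sound of sufficient volume
                    return
                state.flag, state.timer = True, 0           # S44: turn the flag on, start the timer
            if surface_pressed or frames_since_press < CONFIRM_FRAMES:   # S46 / S48
                state.flag, state.timer = False, 0          # S50: not a handclap, discard the input
                return
            state.timer += 1                                # the timer counts frames since S44
            if state.timer < CONFIRM_FRAMES:                # S52: keep waiting for confirmation
                return
            if character["in_handclap_area"]:               # S54
                character["action"] = "jump"                # S56: special movement action
            else:
                character["action"] = "provocative"         # S58: performance action
            state.flag, state.timer = False, 0              # S60: reset the flag and the timer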
  • The CPU 31 then updates, in step S30 shown in FIG. 7, the current position information based on the speed information.
  • In step S32, the game image is updated based on the current position information which has been updated in step S30. Thereafter, the process returns to step S12.
  • the game image is sequentially updated such that the character is to move in accordance with an instruction inputted by the player.
  • As described above, the character jumps in accordance with the handclaps of the player only when the character is positioned on the jump ramp, and whether or not the player makes the character jump there influences a game result (e.g., a player ranking for the race, goal time, score, etc.).
  • FIG. 9 shows a detailed variant of the handclap process.
  • the same steps as those shown in FIG. 8 are denoted by the same reference numerals.
  • FIG. 9 is only different from FIG. 8 in that, in FIG. 9, a process for determining whether the character is positioned in the handclap area (step S54) is initially executed, and a process for causing the character to perform the performance action (step S58 shown in FIG. 8) is eliminated.
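  • In that variant, a sketch of the reordered checks might look as follows; because step S54 comes first, the microphone is examined only while the character is on a handclap area, and step S58 disappears (all names are illustrative only):

        def handclap_process_variant(character, mic_level, threshold=0.5):
            if not character["in_handclap_area"]:   # S54 is executed first
                return                              # outside the area the sound input is never checked
            if mic_level >= threshold:              # S42
                character["action"] = "jump"        # S56: special movement action only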
  • the present invention is particularly effective when a plurality of players simultaneously play the game against each other.
  • a case where a first player and a second player simultaneously play the game against each other will be described with reference to FIGS. 10 and 11 .
  • two conga controllers 6 and 7 used by the first player and the second player, respectively, are connected to the game apparatus body 3 .
  • the conga controller 7 used by the second player is also provided with a microphone 7 M and three switches: a start button 7 S, a right strike surface 7 R, and a left strike surface 7 L.
  • FIG. 11 shows an exemplary game image obtained when the first player and the second player simultaneously play the game against each other.
  • the game image is comprised of two image areas: an upper image area and a lower image area.
  • An environment of the virtual game world viewed from a virtual camera imaging a character A is displayed in the upper image area, and another environment of the virtual game world viewed from a virtual camera imaging a character B is displayed in the lower image area.
  • the virtual cameras are mounted in a virtual game space by setting various parameters (e.g., a view point, fixation point, camera orientation, etc. ).
  • the parameters of the virtual camera imaging the character A are updated in accordance with a current position of the character A, and the parameters of the virtual camera imaging the character B are updated in accordance with a current position of the character B.
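  • A sketch of this split-screen camera handling, with each virtual camera following its own character and rendering into its own half of the television screen, might look as follows; the behind-the-character offset and the viewport sizes are assumptions for illustration:

        from dataclasses import dataclass
        from typing import Tuple

        @dataclass
        class VirtualCamera:
            viewport: Tuple[int, int, int, int]                       # (x, y, width, height) on the screen
            position: Tuple[float, float, float] = (0.0, 0.0, 0.0)    # view point
            target:   Tuple[float, float, float] = (0.0, 0.0, 0.0)    # fixation point

        def update_cameras(cam_a, cam_b, pos_a, pos_b, offset=(0.0, 3.0, -6.0)):
            for cam, pos in ((cam_a, pos_a), (cam_b, pos_b)):
                cam.target = pos                                           # look at the character
                cam.position = tuple(p + o for p, o in zip(pos, offset))   # stay behind and above it

        camera_a = VirtualCamera(viewport=(0, 0, 640, 240))      # upper image area, character A
        camera_b = VirtualCamera(viewport=(0, 240, 640, 240))    # lower image area, character B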
  • the character A is operated by the first player, and the character B is operated by the second player.
  • the character A and the character B travel on the same racecourse, and the acceleration lane is disposed on the racecourse.
  • the first player can cause the character A to accelerate, move to the right, move to the left and decelerate, by using the right strike surface 6 R and left strike surface 6 L of the conga type controller 6 . Furthermore, when the first player claps his or her hands, sound generated by handclaps of the first player is inputted to the conga controller 6 via the microphone 6 M, thereby making it possible to cause the character A to substantially accelerate.
  • the second player can cause the character B to accelerate, move to the right, move to the left and decelerate, by using the right strike surface 7 R and left strike surface 7 L of the conga controller 7 . Furthermore, when the second player claps his or her hands, sound generated by handclaps of the second player is inputted to the conga controller 7 via the microphone 7 M, thereby making it possible to cause the character B to substantially accelerate.
  • the sound generated by the handclaps of the second player is inputted not only to the microphone 7 M of the conga controller 7 used by the second player but also to the microphone 6 M of the conga controller 6 used by the first player.
  • However, even when the sound generated by the handclaps of the second player is inputted to the microphone 6M of the conga controller 6 used by the first player, the character A only performs the provocative action unless the character A is positioned in the acceleration lane.
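  • The two-player handling described above can be summarised in a short sketch: each microphone level is paired with the character of its own controller, and a detected handclap only matters while that character is inside the acceleration lane (here the lane is reduced to a range of z coordinates; all names and values are assumed):

        def apply_handclaps(mic_levels, characters, lane_z_range, threshold=0.5):
            # mic_levels and characters are parallel lists, one entry per conga controller
            z_min, z_max = lane_z_range
            for level, ch in zip(mic_levels, characters):
                if level < threshold:
                    continue                              # no sound input from this controller
                if z_min <= ch["z"] <= z_max:             # the character is in the acceleration lane
                    ch["speed"] *= 1.5                    # cause it to substantially accelerate
                else:
                    ch["animation"] = "provocative"       # performance action only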

Abstract

Sound generated by handclaps of a player is converted into an electric signal by a microphone, and the electric signal is inputted to a game apparatus. When a jump instruction is inputted, a player character jumps. However, the character jumps in accordance with the handclaps of the player only when the character is positioned on a jump ramp set in a virtual game world. In a case where ambient noise is inputted through the microphone when the character is not positioned on the jump ramp, the player character does not jump, but performs a provocative action. The provocative action does not exert any influence on a movement of the player character. Thus, in an action game for causing the player character to perform a predetermined action in accordance with a sound input, it becomes possible to prevent the player character from performing an action which is not intended by the player, even if the sound input is mistakenly detected.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • The disclosure of Japanese Patent Application No. 2006-130776, filed May 9, 2006, is incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a game program and game apparatus capable of controlling a character displayed on a screen in accordance with a sound input.
  • 2. Description of the Background Art
  • As a conventional game which causes a character displayed on a screen to move by a sound input, there is a game disclosed in Japanese Laid-Open Patent Publication No. 2005-319041 (hereinafter, referred to as patent document 1). Patent document 1 discloses a game which controls a character by operating a percussion controller comprised of two congas disposed side-by-side. More specifically, when a player hits a right conga, the character moves to the right. When the player hits a left conga, the character moves to the left. Finally, when the player simultaneously hits both the right and left congas, the character jumps. The percussion controller includes a sound detecting device, and when the sound detecting device detects sound generated by handclaps of the player, for example, the character performs an action so as to toss an item.
  • However, in the game disclosed in patent document 1, even when the sound detecting device detects ambient noise around the player, the character performs an arbitrary action despite an intention of the player. In patent document 1, for example, when the player claps his or her hands in the vicinity of the percussion controller, the sound of the handclaps is detected as a sound input. However, the percussion controller can be operated by being hit by the player. Therefore, sound generated by hitting the percussion controller is mistakenly detected as the sound of the handclaps. As a result, there may be a case where the character frequently performs the action so as to toss the item, even when the player attempts to move the character.
  • SUMMARY OF THE INVENTION
  • Therefore, an object of the present invention is to prevent, in an action game which causes a player character to perform a predetermined action in accordance with a sound input, the player character from performing an action which is not intended by a player, even if the sound input is mistakenly detected.
  • The present invention has the following features to attain the object mentioned above. Note that reference numerals and figure numbers are shown in parentheses below for assisting a reader in finding corresponding components in the figures to facilitate the understanding of the present invention, but they are in no way intended to restrict the scope of the invention.
  • A computer-readable storage medium according to the present invention is a computer-readable storage medium storing a game program instructing a computer (31) of a game apparatus (3), which is connected to sound inputting means (6M) and a display apparatus (2), to function as: display controlling means (S32) for generating a game image including a player object, of a virtual game world, which is operated by a player, and causing the display apparatus to display the game image; movement controlling means (S30) for causing the player object to move in the virtual game world; sound detecting means (S42) for determining whether a sound is inputted through the sound inputting means; object position determining means (S54) for determining whether the player object is positioned in a specific area of the virtual game world; and action controlling means (S56) for causing the player object to perform a specific action, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world.
  • Note that “the movement controlling means” may cause the player object to move in accordance with an instruction of the player, or may allow the computer to automatically move the player object. In a case where the sound is inputted through the sound inputting means when the player object is positioned outside the specific area, the movement controlling means may cause the player object not to perform any action in response to the inputted sound, or may cause the player object to perform another action different from the specific action. The action controlling means determines whether or not the player object is positioned in the specific area based on a positional relationship between a current position of the player object and the specific area. For example, the action controlling means may determine whether or not the player object is to be positioned inside the specific area in a foreseeable future (e.g., in a frame immediately following a current frame), taking into consideration the current position of the player object which is already in the specific area, or a current moving direction with respect to the current position of the player object. Also, the specific action may be an action related to the specific area. Furthermore, the specific action may be different from one specific area to another.
  • The object position determining means may determine whether the player object is positioned in the specific area, if the sound detecting means determines that the sound is inputted through the sound inputting means (FIG. 8).
  • The action controlling means may cause the player object to perform an action different from the specific action, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area (S58).
  • The action controlling means may cause the player object to perform: (a) an action which exerts an influence on a movement of the player object, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world; and (b) an action which does not exert any influence on the movement of the player object, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area of the virtual game world.
  • Note that “the action which exerts an influence on a movement of the player object” indicates an action in which at least one of a position, moving direction and moving speed of the player object is changed accordingly when the action is performed. For example, when the action which exerts an influence on a movement of the player object is performed, movement parameters (e.g., a speed parameter, acceleration parameter, orientation parameter, etc.) of the player object are changed in accordance with the inputted sound. On the other hand, when the action which does not exert any influence on the movement of the player object is performed, the player object is caused to perform a predetermined action on the spot (while continuing to move if the player object is caused to move due to other factors).
  • The game program may realize a game providing the player with a specific challenge, and the action controlling means may cause the player object to perform: (a) an action which exerts an influence on a success or failure of the specific challenge, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world; and (b) an action which does not exert any influence on the success or failure of the specific challenge, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area of the virtual game world.
  • Note that “the specific challenge” includes, for example, “to reach a goal as fast as possible”, “to reach a goal faster than a rival character”, “to acquire as high a score as possible”, and “to defeat an enemy character”, and differs depending on the genre or type of the game.
  • The sound detecting means may determine whether the sound is inputted through the sound inputting means, if the object position determining means determines that the player object is positioned in the specific area (FIG. 9).
  • The specific action may be a jump action.
  • The specific area may indicate a jump ramp disposed in the virtual game world.
  • The sound detecting means may determine that the sound is inputted through the sound inputting means, when a sound having a predetermined volume level or higher is inputted through the sound inputting means (S42).
  • The game apparatus may be connected to operation means (6R, 6L), and the movement controlling means may cause the player object to move based on a signal outputted from the operation means.
  • Note that the operation means and the sound inputting means may have a common housing, or may have separate housings.
  • The operation means may be a percussion controller (6).
  • The game program may instruct the computer to further function as input operation detecting means (S46) for determining whether the player operates the operation means based on the signal outputted from the operation means, and the action controlling means may cause the player object not to perform the specific action, at least when the player operates the operation means. Thus, it becomes possible to ignore noise, other than the sound of handclaps, which is inputted through the sound inputting means.
  • The action controlling means may cause the player object not to perform the specific action, when the player operates the operation means and while a predetermined time period has not yet passed after the player finishes operating the operation means (S48). Thus, it becomes possible to ignore the noise, other than the sound of the handclaps, which is inputted through the sound inputting means.
  • The display controlling means may display an operation guiding image for prompting the player to input the sound in a vicinity of the specific area (FIG. 5). Thus, it becomes possible to inform the player of a timing of inputting the sound in an easily understood manner, thereby allowing the player to appropriately perform an operation.
  • The game program may be able to realize a game in which at least a first player and a second player simultaneously play against each other, a first player object which can be operated by the first player and a second player object which can be operated by the second player may exist in the virtual game space, the movement controlling means may cause the first player object and the second player object to individually move in the virtual game world, and the action controlling means may include: first player object action controlling means for causing the first player object to perform the specific action, when the sound is inputted through the sound inputting means and the first player object is positioned in the specific area of the virtual game world; and second player object action controlling means for causing the second player object to perform the specific action, when the sound is inputted through the sound inputting means and the second player object is positioned in the specific area of the virtual game world.
  • The game apparatus is further connected to first operation means (6) operated by the first player and second operation means (7) operated by the second player, each of the first operation means and the second operation means includes the sound inputting means (6M, 7M), the first player object action controlling means may cause the first player object to perform the specific action, when the sound is inputted through the sound inputting means included in the first operation means and the first player object is positioned in the specific area of the virtual game world, and the second player object action controlling means may cause the second player object to perform the specific action, when the sound is inputted through the sound inputting means included in the second operation means and the second player object is positioned in the specific area of the virtual game world.
  • The game program may instruct the computer to further function as: first virtual camera setting means for setting a parameter (e.g., a view point, fixation point, camera orientation, etc.) of a first virtual camera which images the first player object based on a current position of the first player object; second virtual camera setting means for setting a parameter of a second virtual camera which images the second player object based on a current position of the second player object; and display controlling means for causing the display apparatus to simultaneously display, on the display apparatus, a first game image generated by imaging the virtual game space by means of the first virtual camera and a second game image generated by imaging the virtual game space by means of the second virtual camera.
  • A game apparatus according to the present invention comprises: sound inputting means (6M); a display apparatus (2); display controlling means (31, S32) for generating a game image including a player object, of a virtual game world, which is operated by a player, and causing a display apparatus to display the game image; movement controlling means (31, S30) for causing the player object to move in the virtual game world; sound detecting means (31, S42) for determining whether a sound is inputted through the sound inputting means; object position determining means (31, S54) for determining whether the player object is positioned in a specific area of the virtual game world; and action controlling means (31, S56) for causing the player object to perform a specific action, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world.
  • According to the present invention, in an action game for causing a player character to perform a predetermined action in accordance with a sound input, it becomes possible to prevent the character from performing an action which is not intended by the player, even if the sound input is mistakenly detected.
  • These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is an external view illustrating a configuration of a game system according to an embodiment of the present invention;
  • FIG. 2 is a block diagram illustrating an internal configuration of a game apparatus body;
  • FIG. 3 shows an exemplary game image displayed on a screen of a television;
  • FIG. 4 shows another exemplary game image displayed on the screen of the television;
  • FIG. 5 shows still another exemplary game image displayed on the screen of the television;
  • FIG. 6 shows a memory map of a work memory;
  • FIG. 7 is a flowchart illustrating a flow of a process executed by a CPU;
  • FIG. 8 is a flowchart illustrating a flow of a handclap process;
  • FIG. 9 is a flowchart illustrating a flow of the handclap process according to a variant;
  • FIG. 10 is an external view illustrating the game system obtained when two players simultaneously play a game against each other; and
  • FIG. 11 is an exemplary game image obtained when the two players simultaneously play the game against each other.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Hereinafter, a game system according to an embodiment of the present invention will be described with reference to the drawings.
  • FIG. 1 is an external view illustrating a configuration of the game system according to the embodiment of the present invention. As shown in FIG. 1, a game system 1 comprises a television 2, a game apparatus body 3, and a conga controller 6, and has a DVD-ROM 4 and a memory card 5 mounted thereon. The DVD-ROM 4 and the memory card 5 are mounted on the game apparatus body 3 in a removable manner. The conga controller 6 is connected, by a communication cable, to any of four controller port connectors provided on the game apparatus body 3. The television 2 is connected to the game apparatus body 3 by an AV cable or the like. Note that the game apparatus body 3 and the conga controller 6 may communicate with each other by radio communication.
  • The conga controller 6 is provided with a microphone 6M and three switches: a start button 6S, a right strike surface 6R, and a left strike surface 6L. As described below, a player can control a motion of a character in a virtual game world by hitting the right strike surface 6R or the left strike surface 6L. Instead of the conga controller 6, any controller including a microphone may be used.
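  • The patent does not include source code; purely as an illustration, the sketch below shows one way a per-frame snapshot of the conga controller's inputs (start button 6S, strike surfaces 6R and 6L, and the microphone 6M level) might be represented. All type and field names are hypothetical.

```cpp
// Minimal sketch (not from the patent): a per-frame snapshot of the conga
// controller inputs described above. All names are hypothetical.
#include <cstdint>
#include <iostream>

struct CongaInput {
    bool startPressed;        // start button 6S
    bool rightSurfaceHit;     // right strike surface 6R
    bool leftSurfaceHit;      // left strike surface 6L
    std::uint8_t micLevel;    // microphone 6M volume level, 0..255
};

int main() {
    // Example snapshot: the player is hitting the right surface while quiet.
    CongaInput frame{false, true, false, 12};
    std::cout << "right hit: " << frame.rightSurfaceHit
              << ", mic level: " << int(frame.micLevel) << '\n';
    return 0;
}
```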
  • The DVD-ROM 4 fixedly stores a game program, game data and the like. The DVD-ROM 4 is mounted on the game apparatus body 3 when the player plays a game. Here, instead of the DVD-ROM 4, an external storage medium such as a CD-ROM, an MO, a memory card, a ROM cartridge or the like may be used as means for storing the game program and the like.
  • The game apparatus body 3 reads the game program stored in the DVD-ROM 4, and then performs a process in accordance with the read game program.
  • The television 2 displays, on a screen, image data outputted from the game apparatus body 3.
  • The memory card 5 has a rewritable storage medium, e.g., a flash memory, as a backup memory for storing data such as saved data of the game.
  • FIG. 2 is a block diagram illustrating an internal configuration of the game apparatus body 3. Hereinafter, each component of the game system 1 will be described in more detail with reference to FIG. 2.
  • As shown in FIG. 2, the game apparatus body 3 comprises a CPU 31, a work memory 32, an external memory interface (I/F) 33, a controller interface (I/F) 34, a video RAM (VRAM) 35, a graphics processing unit (GPU) 36, and an optical disc drive 37.
  • In order for the game to start, the optical disc drive 37 drives the DVD-ROM 4 mounted on the game apparatus body 3, and then the game program stored in the DVD-ROM 4 is loaded into the work memory 32. The game starts when the CPU 31 executes the program stored in the work memory 32. After the game starts, the player plays the game by using the conga controller 6. In accordance with an operation performed by the player, the conga controller 6 outputs operation data to the game apparatus body 3. The operation data outputted from the conga controller 6 is supplied to the CPU 31 via the controller I/F 34. The CPU 31 performs a game process based on inputted operation data. The GPU 36 is used for image data generation and the like performed in the game process.
  • The GPU 36 performs, for coordinates of a solid model of an object or figure (e.g., an object comprised of polygons) placed in a three-dimensional virtual game world, arithmetic processing (e.g., rotation, scaling and deformation of a three-dimensional model, and coordinate transformation from a world coordinate system to a camera coordinate system or screen coordinate system). Further, the GPU 36 generates a game image by writing, based on a predetermined texture, color data (RGB data) of each pixel of a solid model projected on the screen coordinate system into the VRAM 35. The GPU 36 thus generates the game image to be displayed on the television 2, and outputs the game image to the television 2 as necessary. Although the present embodiment shows a hardware configuration in which a memory dedicated for image processing (VRAM 35) is separately provided, the present invention is not limited thereto. For example, a UMA (Unified Memory Architecture) system, in which a part of the work memory 32 is used as a memory for image processing, may be used.
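  • As a rough illustration of the coordinate transformation mentioned above (not the actual GPU pipeline), the sketch below maps a world-space point to camera space by a pure translation and then to screen coordinates with a pinhole projection; the names, the absence of camera rotation, and the numeric values are simplifying assumptions.

```cpp
// Simplified sketch of the world -> camera -> screen transformation mentioned
// above. Assumptions (not from the patent): the camera is translated but not
// rotated, looks down its local -z axis, and a pinhole projection is used.
#include <iostream>

struct Vec3 { float x, y, z; };
struct Screen { float x, y; };

// World space -> camera space: subtract the camera position.
Vec3 worldToCamera(const Vec3& p, const Vec3& camPos) {
    return { p.x - camPos.x, p.y - camPos.y, p.z - camPos.z };
}

// Camera space -> screen space: perspective divide by the depth along -z,
// then offset to the screen centre (cx, cy).
Screen cameraToScreen(const Vec3& pc, float focal, float cx, float cy) {
    float depth = -pc.z;                    // positive in front of the camera
    return { focal * pc.x / depth + cx, focal * pc.y / depth + cy };
}

int main() {
    Vec3 world{2.0f, 1.0f, -8.0f};          // a polygon vertex in the game world
    Vec3 camera{0.0f, 0.0f, 2.0f};          // the virtual camera position
    Screen s = cameraToScreen(worldToCamera(world, camera), 320.0f, 320.0f, 240.0f);
    std::cout << "screen: (" << s.x << ", " << s.y << ")\n";
    return 0;
}
```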
  • The work memory 32 stores various programs and pieces of data loaded from the DVD-ROM 4. These pieces of data include, for example, data related to the polygons forming the three-dimensional model placed in the virtual game world, and a texture used for coloring the polygons.
  • FIG. 3 shows an exemplary game image displayed on a screen of the television 2. Although the present embodiment illustrates an example where the present invention is applied to a racing game, the present invention is not limited thereto. The present invention is applicable to an arbitrary game.
  • On the screen of the television 2, a racecourse set in a virtual game world, a player character operated by the player, and an obstacle and coins disposed on the racecourse are displayed. The player operates the player character by using the conga controller 6: so as to collide with as few obstacles as possible; so as to acquire as many coins as possible; and so as to reach a goal as fast as possible.
  • By using the conga controller 6, the player can input instructions such as an acceleration instruction, a rightward movement instruction, a leftward movement instruction, a deceleration instruction, and a jump instruction.
  • The player can input the acceleration instruction by alternately and continuously hitting the right strike surface 6R and left strike surface 6L of the conga controller 6. When the acceleration instruction is inputted, the character accelerates forward (i.e., in a direction in which the character faces or in a moving direction of the character).
  • The player can input the rightward movement instruction by continuously hitting the right strike surface 6R of the conga controller 6. When the rightward movement instruction is inputted, the character moves to the right (i.e., in a rightward direction with respect to the direction in which the character faces or in a rightward direction with respect to the moving direction of the character). The more rapidly the player continuously hits the right strike surface 6R, the more quickly the character moves to the right. Instead of moving the character to the right, a current direction in which the character faces and a current moving direction of the character may be changed to the right.
  • The player can input the leftward movement instruction by continuously hitting the left strike surface 6L of the conga controller 6. When the leftward movement instruction is inputted, the character moves to the left (i.e., in a leftward direction with respect to the direction in which the character faces or in a leftward direction with respect to the moving direction of the character). The more rapidly the player continuously hits the left strike surface 6L, the more quickly the character moves to the left. Instead of moving the character to the left, the current direction in which the character faces and the current moving direction of the character may be changed to the left.
  • The player can input the deceleration instruction by pressing both the right strike surface 6R and left strike surface 6L of the conga controller 6 for a predetermined time period or longer. When the deceleration instruction is inputted, the character decelerates.
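  • As a hedged illustration of the four instructions described above (not code from the patent), the sketch below classifies a strike pattern into an acceleration, rightward movement, leftward movement or deceleration instruction; the timing windows and all identifiers are hypothetical.

```cpp
// Hedged sketch (names hypothetical): alternating hits -> acceleration,
// repeated right hits -> rightward movement, repeated left hits -> leftward
// movement, and holding both surfaces long enough -> deceleration.
#include <iostream>

enum class Side { Right, Left };
enum class Instruction { None, Accelerate, MoveRight, MoveLeft, Decelerate };

// windowFrames: the "predetermined time period" within which two hits count
// as continuous; holdFrames: how long both surfaces must be held to decelerate.
Instruction classify(Side prevHit, int framesSincePrevHit, Side currentHit,
                     bool bothHeld, int bothHeldFrames,
                     int windowFrames = 30, int holdFrames = 60) {
    if (bothHeld && bothHeldFrames >= holdFrames)   return Instruction::Decelerate;
    if (framesSincePrevHit > windowFrames)          return Instruction::None;
    if (prevHit != currentHit)                      return Instruction::Accelerate;
    return currentHit == Side::Right ? Instruction::MoveRight : Instruction::MoveLeft;
}

int main() {
    // A right hit 10 frames after a left hit is an alternating pattern, so accelerate.
    Instruction i = classify(Side::Left, 10, Side::Right, false, 0);
    std::cout << "accelerate? " << (i == Instruction::Accelerate) << '\n';
    return 0;
}
```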
  • The player can input the jump instruction by clapping his or her hands in the vicinity of the conga controller 6. Specifically, sound generated by handclaps of the player is converted into an electric signal by the microphone 6M, so as to be inputted to the game apparatus body 3. When the jump instruction is inputted, the character jumps. Note that the character jumps in accordance with the handclaps of the player only when the character is positioned on a jump ramp (see FIG. 4) which is set in the virtual game world.
  • In a case where ambient noise (including voice or handclaps of any person other than the player) is inputted through the microphone 6M when the character is not positioned on the jump ramp, the character does not jump, but performs a provocative action, as shown in FIG. 5. The provocative action does not exert any influence on the movement of the character. Therefore, even if the character performs the provocative action, a movement speed or movement direction of the character never changes according to the provocative action. Furthermore, in a case where any of the acceleration instruction, the rightward movement instruction, the leftward movement instruction and the deceleration instruction is inputted while the character is performing the provocative action, the provocative action is immediately released, thereby allowing the character to perform an action (i.e., an acceleration, rightward movement, leftward movement or deceleration) in accordance with the inputted instruction.
  • As described above, the character jumps in accordance with the handclaps of the player only when the character is positioned on the jump ramp. Thus, it becomes possible to avoid a case where the character unexpectedly jumps in response to the ambient noise, thereby exerting an adverse effect on a game result (e.g., a player ranking for the race, goal time, score, etc.). Although the present embodiment illustrates an example where the character jumps in accordance with the handclaps of the player, the present invention is not limited thereto. For example, when the character is positioned in an acceleration lane as shown in FIG. 11, the character may accelerate in accordance with the handclaps of the player (more rapidly than when the acceleration instruction is inputted). Further, both the jump ramp and the acceleration lane may exist on the racecourse.
  • Hereinafter, an area, e.g., the jump ramp or acceleration lane, in which the character performs a special movement action such as a jump or acceleration in accordance with the handclaps of the player, is referred to as a “handclap area”. Also, an action, e.g., the provocative action, performed by the character in accordance with the handclaps of the player when the character is positioned outside the handclap area, is referred to as a “performance action”.
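  • The core rule just described, namely that a handclap triggers the special movement action only inside a handclap area and otherwise triggers only a performance action, could be summarized as in the following sketch; it is an illustration with assumed names and values, not the patent's implementation.

```cpp
// Illustrative sketch: a handclap changes the character's movement only when
// the character is inside a handclap area; otherwise it triggers a
// performance action that does not affect movement. Names are hypothetical.
#include <iostream>

enum class AreaType { None, JumpRamp, AccelerationLane };

struct Character {
    float x = 0.0f, y = 0.0f;     // position on the racecourse
    float vx = 1.0f, vy = 0.0f;   // speed vector
    bool provoking = false;       // currently playing the performance action
};

void onHandclapDetected(Character& c, AreaType areaUnderCharacter) {
    switch (areaUnderCharacter) {
    case AreaType::JumpRamp:          c.vy += 5.0f;  break;  // jump: changes movement
    case AreaType::AccelerationLane:  c.vx *= 1.5f;  break;  // boost: changes movement
    case AreaType::None:              c.provoking = true;    // performance action only
    }
}

int main() {
    Character c;
    onHandclapDetected(c, AreaType::None);      // ambient noise off the ramp
    std::cout << "vx unchanged: " << c.vx << ", provoking: " << c.provoking << '\n';
    onHandclapDetected(c, AreaType::JumpRamp);  // handclap on the jump ramp
    std::cout << "vy after jump: " << c.vy << '\n';
    return 0;
}
```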
  • As shown in FIG. 4, a handclap image for prompting the player to clap his or her hands is displayed in the vicinity of the jump ramp (handclap area).
  • Note that the player may choose not to input the jump instruction (the sound of handclaps) when the character passes through the jump ramp. In this case, the character passes through the jump ramp without jumping. The same is also true of the acceleration lane.
  • Hereinafter, an operation of the game apparatus body 3 according to the present embodiment will be described in detail.
  • FIG. 6 shows a memory map of the work memory 32. The work memory 32 stores a game program 40, game image data 41, racecourse data 42, character controlling data 43, a sound input flag 44, and a sound input timer 45.
  • The game image data 41 is data for generating a game image displayed on the screen of the television 2, and includes a character image, a background image, and the handclap image.
  • The racecourse data 42 is data showing the shape of the racecourse set in the virtual game world, and includes handclap area information indicating the position of the handclap area.
  • The character controlling data 43 is data for controlling the movement of the character in the virtual game world, and includes current position information and speed information. The current position information is information (coordinate data) indicating a current position of the character, and the speed information is information (vector data) indicating a movement speed of the character.
  • The sound input flag 44 and the sound input timer 45 are a flag or timer, respectively, used in a handclap process to be described later.
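  • For illustration only, the data items listed above might be laid out in the work memory 32 roughly as in the following sketch; the structure, field names and types are assumptions rather than the actual memory map.

```cpp
// Rough sketch of the work memory contents described above (racecourse data 42,
// character controlling data 43, sound input flag 44, sound input timer 45).
// All names and types are assumptions for illustration.
#include <vector>

struct Vec2 { float x, y; };

struct HandclapArea {                 // part of the racecourse data 42
    Vec2 min, max;                    // axis-aligned bounds of the area
    bool isJumpRamp;                  // jump ramp or acceleration lane
};

struct RacecourseData {               // racecourse data 42
    std::vector<Vec2> centerLine;     // shape of the racecourse
    std::vector<HandclapArea> handclapAreas;
};

struct CharacterControllingData {     // character controlling data 43
    Vec2 position;                    // current position information (coordinates)
    Vec2 speed;                       // speed information (vector data)
};

struct WorkMemoryState {
    RacecourseData racecourse;
    CharacterControllingData character;
    bool soundInputFlag = false;      // sound input flag 44
    int soundInputTimer = 0;          // sound input timer 45 (in frames)
};

int main() { WorkMemoryState state; (void)state; return 0; }
```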
  • Hereinafter, a flow of a process executed by the CPU 31 based on the game program 40 will be described with reference to FIGS. 7 and 8.
  • In FIG. 7, when the game program 40 starts to be executed, the CPU 31 firstly displays, in step S10, an initial game image. At this time, initial values of the current position and movement speed of the character are set.
  • In step S12, it is determined whether the right strike surface 6R has been continuously hit. For example, in a case where, within a predetermined time period after the right strike surface 6R is hit, the right strike surface 6R is hit again, it is determined that the right strike surface 6R has been continuously hit. When it is determined that the right strike surface 6R has been continuously hit, the process proceeds to step S14. On the other hand, when it is determined that the right strike surface 6R has not been continuously hit, the process proceeds to step S16.
  • In step S14, the speed information is updated (specifically, a direction of a speed vector is changed) such that the character is to turn clockwise (i.e., the character is to move or accelerate to the right). Thereafter, the process proceeds to step S28.
  • In step S16, it is determined whether the left strike surface 6L has been continuously hit. For example, in a case where, within a predetermined time period after the left strike surface 6L is hit, the left strike surface 6L is hit again, it is determined that the left strike surface 6L has been continuously hit. When it is determined that the left strike surface 6L has been continuously hit, the process proceeds to step S18. On the other hand, when it is determined that the left strike surface 6L has not been continuously hit, the process proceeds to step S20.
  • In step S18, the speed information is updated (specifically, the direction of the speed vector is changed) such that the character is to turn counterclockwise (i.e., the character is to move or accelerate to the left). Thereafter, the process proceeds to step S28.
  • In step S20, it is determined whether the right strike surface 6R and the left strike surface 6L have been alternately and continuously hit. For example, in a case where, within a predetermined time period after the left strike surface 6L is hit, the right strike surface 6R is hit, or in a case where, within a predetermined time period after the right strike surface 6R is hit, the left strike surface 6L is hit, it is determined that the right strike surface 6R and the left strike surface 6L have been alternately and continuously hit. When it is determined that the right strike surface 6R and the left strike surface 6L have been alternately and continuously hit, the process proceeds to step S22. On the other hand, when it is determined that the right strike surface 6R and the left strike surface 6L have not been alternately and continuously hit, the process proceeds to step S24.
  • In step S22, the speed information is updated (specifically, a magnitude of the speed vector is changed) such that the character is to accelerate forward. Thereafter, the process proceeds to step S28.
  • In step S24, it is determined whether the right strike surface 6R and the left strike surface 6L have been pressed for the predetermined time period or longer. For example, when both the right strike surface 6R and the left strike surface 6L have been pressed for one second or longer, it is determined that both the right strike surface 6R and the left strike surface 6L have been pressed for the predetermined time period or longer. When it is determined that both the right strike surface 6R and the left strike surface 6L have been pressed for the predetermined time period or longer, the process proceeds to step S26. On the other hand, when both the right strike surface 6R and the left strike surface 6L have not been pressed for the predetermined time period or longer, the process proceeds to step S28.
  • In step S26, the speed information is updated (specifically, the magnitude of the speed vector is changed) such that the character is to decelerate. Thereafter, the process proceeds to step S28.
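  • The speed updates of steps S12 through S26 could be sketched as follows; the turn angles and scaling factors are placeholder values, and all identifiers are hypothetical rather than taken from the patent.

```cpp
// Hedged sketch of steps S12 to S26: depending on how the strike surfaces are
// being hit, the direction or magnitude of the speed vector is updated.
// The turn angle and scaling factors are illustrative values only.
#include <cmath>
#include <iostream>

struct Vec2 { float x, y; };

Vec2 rotate(const Vec2& v, float radians) {
    float c = std::cos(radians), s = std::sin(radians);
    return { c * v.x - s * v.y, s * v.x + c * v.y };
}

void updateSpeed(Vec2& speed, bool rightContinuous, bool leftContinuous,
                 bool alternating, bool bothHeldLong) {
    if (rightContinuous)      speed = rotate(speed, -0.05f);   // S14: turn clockwise
    else if (leftContinuous)  speed = rotate(speed,  0.05f);   // S18: turn counterclockwise
    else if (alternating) {                                    // S22: accelerate forward
        speed.x *= 1.1f;  speed.y *= 1.1f;
    } else if (bothHeldLong) {                                 // S26: decelerate
        speed.x *= 0.9f;  speed.y *= 0.9f;
    }
}

int main() {
    Vec2 speed{0.0f, 1.0f};
    updateSpeed(speed, false, false, /*alternating=*/true, false);
    std::cout << "speed after acceleration: (" << speed.x << ", " << speed.y << ")\n";
    return 0;
}
```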
  • In step S28, the handclap process is executed. In the handclap process, the character is controlled in accordance with the handclaps of the player. Hereinafter, the handclap process will be described in detail with reference to FIG. 8.
  • In the handclap process, the CPU 31 firstly determines, in step S40, whether the sound input flag 44 is turned on. Note that the sound input flag 44 is turned off in an initial state. When it is determined that the sound input flag 44 is turned on, the process proceeds to step S46. On the other hand, when it is determined that the sound input flag 44 is turned off, the process proceeds to step S42.
  • In step S42, it is determined whether sound having a predetermined volume level or higher has been detected by the microphone 6M. When it is determined that the sound having the predetermined volume level or higher has been detected, the process proceeds to step S44. On the other hand, when it is determined that the sound having the predetermined volume level or higher has not been detected, the handclap process is to be finished.
  • In step S44, the sound input flag 44 is turned on, thereby causing the sound input timer 45 to be started. Thereafter, the process proceeds to step S46.
  • In step S46, it is determined whether either the right strike surface 6R or the left strike surface 6L is being pressed. When it is determined that either the right strike surface 6R or the left strike surface 6L is being pressed, the process proceeds to step S50. On the other hand, when it is determined that neither the right strike surface 6R nor the left strike surface 6L is being pressed, the process proceeds to step S48. That is, in step S46, when the player is pressing either of the two strike surfaces (i.e., in this state, it is determined that the sound inputted through the microphone 6M is not the handclaps of the player), the sound inputted through the microphone 6M is to be ignored.
  • In step S48, it is determined whether neither the right strike surface 6R nor the left strike surface 6L has been pressed for a predetermined time period (e.g., a 10-frame period) or longer. When it is determined that neither the right strike surface 6R nor the left strike surface 6L has been pressed for the predetermined time period or longer, the process proceeds to step S52. On the other hand, when it is determined that the predetermined time period has not yet passed after either the right strike surface 6R or the left strike surface 6L was pressed, the process proceeds to step S50. That is, in step S48, when the sound is inputted through the microphone 6M before the predetermined time period has passed after either of the two strike surfaces is pressed (i.e., in this state, it is determined that the sound inputted through the microphone 6M is not the handclaps of the player, because a certain time period is required from when the player removes his or her hand from the pressed strike surface to when the player starts to clap his or her hands), the sound inputted through the microphone 6M is to be ignored.
  • In step S50, the sound input flag 44 is turned off, thereby causing the sound input timer 45 to be reset. Thereafter, the handclap process is to be finished.
  • In step S52, it is determined whether a count value of the sound input timer 45 is a predetermined value (e.g., a value corresponding to a 10-frame period) or greater. When it is determined that the count value of the sound input timer 45 is the predetermined value or greater, the process proceeds to step S54. On the other hand, when it is determined that the count value of the sound input timer 45 is less than the predetermined value, the handclap process is to be finished. That is, in step S52, when either of the two strike surfaces is pressed before a predetermined time period (e.g., a 10-frame period) has passed after the sound is inputted through the microphone 6M (i.e., in this state, it is determined that the sound inputted through the microphone 6M is not the handclaps of the player, because a certain time period is required from when the player finishes clapping his or her hands to when the player presses the strike surface), the sound inputted through the microphone 6M is to be ignored.
  • In step S54, by reading the handclap area information of the racecourse data 42 and the current position information of the character controlling data 43, it is determined whether the character is positioned in the handclap area. When it is determined that the character is positioned in the handclap area, the process proceeds to step S56. On the other hand, when it is determined that the character is positioned outside the handclap area, the process proceeds to step S58.
  • In step S56, the speed information is updated such that the character is to perform a special movement action (e.g., a jump). Note that when a plurality of types of handclap areas such as the jump ramp or acceleration lane exist on the racecourse, the speed information is changed such that the special movement action corresponding to the type of the handclap area in which the character is positioned is to be performed. The character may also be caused to perform the special movement action by a method other than changing the speed information. By changing the current position information, for example, the character may be caused to instantaneously move from one place to another on the racecourse.
  • In step S58, an image of the character is changed such that the character is to perform the performance action (e.g., provocative action).
  • In step S60, the sound input flag 44 is turned off, thereby causing the sound input timer 45 to be reset. Thereafter, the handclap process is to be finished.
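  • Steps S40 through S60 of the handclap process can be summarized in code form as in the sketch below; the frame counts, thresholds and identifiers are assumptions chosen only to illustrate the flow of FIG. 8.

```cpp
// Hedged sketch (assumed names) of the handclap process of FIG. 8, steps S40
// to S60: a detected sound is accepted as a handclap only if no strike surface
// is pressed while the sound input timer runs, and the character then performs
// the special movement action only when it is inside a handclap area.
#include <iostream>

struct HandclapState {
    bool soundInputFlag = false;    // sound input flag 44
    int  soundInputTimer = 0;       // sound input timer 45, counted in frames
};

enum class Result { Nothing, SpecialMovementAction, PerformanceAction };

Result handclapProcess(HandclapState& st, int micLevel, int volumeThreshold,
                       bool surfacePressed, int framesSinceLastPress,
                       bool characterInHandclapArea,
                       int quietFrames = 10, int confirmFrames = 10) {
    if (!st.soundInputFlag) {                                  // S40
        if (micLevel < volumeThreshold) return Result::Nothing; // S42: no sound detected
        st.soundInputFlag = true;                              // S44: turn flag on,
        st.soundInputTimer = 0;                                //      start the timer
    }
    st.soundInputTimer++;
    if (surfacePressed || framesSinceLastPress < quietFrames) { // S46, S48
        st.soundInputFlag = false;                             // S50: treat as noise
        st.soundInputTimer = 0;
        return Result::Nothing;
    }
    if (st.soundInputTimer < confirmFrames) return Result::Nothing; // S52: keep waiting
    st.soundInputFlag = false;                                 // S60: reset flag and timer
    st.soundInputTimer = 0;
    return characterInHandclapArea ? Result::SpecialMovementAction  // S54 -> S56
                                   : Result::PerformanceAction;     // S54 -> S58
}

int main() {
    HandclapState st;
    Result r = Result::Nothing;
    for (int frame = 0; frame < 12 && r == Result::Nothing; ++frame)
        r = handclapProcess(st, /*micLevel=*/200, /*volumeThreshold=*/100,
                            /*surfacePressed=*/false, /*framesSinceLastPress=*/100,
                            /*characterInHandclapArea=*/true);
    std::cout << "special action triggered: "
              << (r == Result::SpecialMovementAction) << '\n';
    return 0;
}
```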
  • When the handclap process is finished, the CPU 31 updates, in step S30 shown in FIG. 7, the current position information based on the speed information.
  • In step S32, the game image is updated based on the current position information which has been updated in step S30. Thereafter, the process returns to step S12.
  • By repeating steps S12 to S32 mentioned above, the game image is sequentially updated such that the character is to move in accordance with an instruction inputted by the player.
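  • Steps S30 and S32, in which the current position is advanced by the speed vector and the game image is then redrawn each frame, might look like the following minimal sketch (illustrative names and values only).

```cpp
// Small sketch of steps S30 and S32: once per frame the current position
// information is updated from the speed information, and the game image is
// then redrawn at the new position (here simply printed).
#include <iostream>

struct Vec2 { float x, y; };

void stepFrame(Vec2& position, const Vec2& speed) {
    position.x += speed.x;   // S30: current position += speed vector
    position.y += speed.y;
    // S32 would redraw the game image here; we just print the new position.
    std::cout << "character at (" << position.x << ", " << position.y << ")\n";
}

int main() {
    Vec2 pos{0.0f, 0.0f};
    Vec2 speed{0.5f, 1.0f};
    for (int frame = 0; frame < 3; ++frame) stepFrame(pos, speed);
    return 0;
}
```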
  • As described above, according to the present embodiment, the character jumps in accordance with the handclaps of the player only when the character is positioned on the jump ramp. Thus, it becomes possible to avoid a case where the character unexpectedly jumps in response to the ambient noise, thereby exerting an adverse effect on a game result (e.g., a player ranking for the race, goal time, score, etc.).
  • Note that in the handclap process shown in FIG. 8, whether the character is positioned in the handclap area is determined after the sound having the predetermined volume level or higher is detected. Instead, however, the sound may be detected by the microphone 6M only when the character is positioned in the handclap area. FIG. 9 shows this variant of the handclap process. In FIG. 9, the same steps as those shown in FIG. 8 are denoted by the same reference numerals. FIG. 9 differs from FIG. 8 only in that the process for determining whether the character is positioned in the handclap area (step S54) is executed first, and the process for causing the character to perform the performance action (step S58 shown in FIG. 8) is eliminated.
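  • The variant of FIG. 9 could be sketched as below: the area check corresponding to step S54 runs first, and sound detection is attempted only while the character is in a handclap area; the noise-filtering steps of FIG. 8 are omitted here for brevity, and all names are assumptions.

```cpp
// Hedged, simplified sketch of the FIG. 9 variant: the area check (S54) runs
// first, the microphone is consulted only inside a handclap area, and there is
// no performance action branch. Names are assumptions.
#include <iostream>

bool variantHandclapProcess(bool characterInHandclapArea,
                            int micLevel, int volumeThreshold) {
    if (!characterInHandclapArea) return false;   // S54 first: ignore all sound here
    return micLevel >= volumeThreshold;           // S42: detect the handclap;
                                                  // true would lead to S56 (jump)
}

int main() {
    std::cout << "off the ramp, loud noise -> "
              << variantHandclapProcess(false, 250, 100) << '\n';   // ignored
    std::cout << "on the ramp, handclap    -> "
              << variantHandclapProcess(true, 250, 100) << '\n';    // jump
    return 0;
}
```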
  • The present invention is particularly effective when a plurality of players simultaneously play the game against each other. Hereinafter, a case where a first player and a second player simultaneously play the game against each other will be described with reference to FIGS. 10 and 11.
  • In a case where the first player and the second player simultaneously play the game against each other, two conga controllers 6 and 7 used by the first player and the second player, respectively, are connected to the game apparatus body 3. The conga controller 7 used by the second player is also provided with a microphone 7M and three switches: a start button 7S, a right strike surface 7R, and a left strike surface 7L.
  • FIG. 11 shows an exemplary game image obtained when the first player and the second player simultaneously play the game against each other. The game image comprises two image areas: an upper image area and a lower image area. An environment of the virtual game world viewed from a virtual camera imaging a character A is displayed in the upper image area, and another environment of the virtual game world viewed from a virtual camera imaging a character B is displayed in the lower image area. The virtual cameras are placed in the virtual game space by setting various parameters (e.g., a view point, fixation point, camera orientation, etc.). The parameters of the virtual camera imaging the character A are updated in accordance with a current position of the character A, and the parameters of the virtual camera imaging the character B are updated in accordance with a current position of the character B. The character A is operated by the first player, and the character B is operated by the second player. The character A and the character B travel on the same racecourse, and the acceleration lane is disposed on the racecourse.
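  • The per-character camera update described above might be sketched as follows; the offsets and identifiers are illustrative assumptions, not the parameters actually used.

```cpp
// Illustrative sketch (assumed names): each virtual camera's fixation point
// follows its own character, and its view point trails that character by a
// fixed offset, so the two image areas track the two players independently.
#include <iostream>

struct Vec3 { float x, y, z; };

struct VirtualCamera {
    Vec3 viewPoint;       // camera position
    Vec3 fixationPoint;   // point the camera looks at
};

void followCharacter(VirtualCamera& cam, const Vec3& characterPos) {
    cam.fixationPoint = characterPos;
    cam.viewPoint = { characterPos.x, characterPos.y + 3.0f, characterPos.z + 8.0f };
}

int main() {
    VirtualCamera camA{}, camB{};
    Vec3 characterA{10.0f, 0.0f, 40.0f};   // operated by the first player
    Vec3 characterB{12.0f, 0.0f, 35.0f};   // operated by the second player
    followCharacter(camA, characterA);     // renders into the upper image area
    followCharacter(camB, characterB);     // renders into the lower image area
    std::cout << "camera A view point z: " << camA.viewPoint.z
              << ", camera B view point z: " << camB.viewPoint.z << '\n';
    return 0;
}
```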
  • The first player can cause the character A to accelerate, move to the right, move to the left and decelerate, by using the right strike surface 6R and left strike surface 6L of the conga controller 6. Furthermore, when the first player claps his or her hands, sound generated by handclaps of the first player is inputted to the conga controller 6 via the microphone 6M, thereby making it possible to cause the character A to substantially accelerate.
  • Similarly, the second player can cause the character B to accelerate, move to the right, move to the left and decelerate, by using the right strike surface 7R and left strike surface 7L of the conga controller 7. Furthermore, when the second player claps his or her hands, sound generated by handclaps of the second player is inputted to the conga controller 7 via the microphone 7M, thereby making it possible to cause the character B to substantially accelerate.
  • In such a case where the first player and the second player simultaneously play the game against each other, when the second player claps his or her hands so as to cause the character B to substantially accelerate in a state where the character B is positioned in the acceleration lane, the sound generated by the handclaps of the second player is inputted not only to the microphone 7M of the conga controller 7 used by the second player but also to the microphone 6M of the conga controller 6 used by the first player. However, according to the present invention, even when the sound generated by the handclaps of the second player is inputted to the microphone 6M of the conga controller 6 used by the first player, the character A only performs the provocative action unless the character A is positioned in the acceleration lane. Therefore, no adverse effect is exerted on a game result of the first player. The same is also true of a case where the first player claps his or her hands so as to cause the character A to substantially accelerate in a state where the character A is positioned in the acceleration lane.
  • While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is understood that numerous other modifications and variations can be devised without departing from the scope of the invention.

Claims (18)

1. A computer-readable storage medium storing a game program instructing a computer of a game apparatus, which is connected to sound inputting means and a display apparatus, to function as:
display controlling means for generating a game image including a player object, of a virtual game world, which is operated by a player, and causing the display apparatus to display the game image;
movement controlling means for causing the player object to move in the virtual game world;
sound detecting means for determining whether a sound is inputted through the sound inputting means;
object position determining means for determining whether the player object is positioned in a specific area of the virtual game world; and
action controlling means for causing the player object to perform a specific action, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world.
2. The computer-readable storage medium according to claim 1, wherein
the object position determining means determines whether the player object is positioned in the specific area, if the sound detecting means determines that the sound is inputted through the sound inputting means.
3. The computer-readable storage medium according to claim 2, wherein
the action controlling means causes the player object to perform an action different from the specific action, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area.
4. The computer-readable storage medium according to claim 3, wherein
the action controlling means causes the player object to perform:
(a) an action which exerts an influence on a movement of the player object, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world; and
(b) an action which does not exert any influence on the movement of the player object, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area of the virtual game world.
5. The computer-readable storage medium according to claim 3, wherein
the game program realizes a game providing the player with a specific challenge, and
the action controlling means causes the player object to perform:
(a) an action which exerts an influence on a success or failure of the specific challenge, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world; and
(b) an action which does not exert any influence on the success or failure of the specific challenge, when the sound is inputted through the sound inputting means and the player object is positioned outside the specific area of the virtual game world.
6. The computer-readable storage medium according to claim 1, wherein
the sound detecting means determines whether the sound is inputted through the sound inputting means, if the object position determining means determines that the player object is positioned in the specific area.
7. The computer-readable storage medium according to claim 1, wherein
the specific action is a jump action.
8. The computer-readable storage medium according to claim 1, wherein
the specific area indicates a jump ramp disposed in the virtual game world.
9. The computer-readable storage medium according to claim 1, wherein
the sound detecting means determines that the sound is inputted through the sound inputting means, when a sound having a predetermined volume level or higher is inputted through the sound inputting means.
10. The computer-readable storage medium according to claim 1, wherein
the game apparatus is connected to operation means, and
the movement controlling means causes the player object to move based on a signal outputted from the operation means.
11. The computer-readable storage medium according to claim 10, wherein
the operation means is a percussion controller.
12. The computer-readable storage medium according to claim 10, wherein
the game program instructs the computer to further function as input operation detecting means for determining whether the player operates the operation means based on the signal outputted from the operation means, and
the action controlling means causes the player object not to perform the specific action, at least when the player operates the operation means.
13. The computer-readable storage medium according to claim 12, wherein
the action controlling means causes the player object not to perform the specific action, when the player operates the operation means and while a predetermined time period has not yet passed after the player finishes operating the operation means.
14. The computer-readable storage medium according to claim 1, wherein
the display controlling means displays an operation guiding image for prompting the player to input the sound in a vicinity of the specific area.
15. The computer-readable storage medium according to claim 1, wherein
the game program is able to realize a game in which at least a first player and a second player simultaneously play against each other,
a first player object which can be operated by the first player and a second player object which can be operated by the second player exist in the virtual game space,
the movement controlling means causes the first player object and the second player object to individually move in the virtual game world, and
the action controlling means includes:
first player object action controlling means for causing the first player object to perform the specific action, when the sound is inputted through the sound inputting means and the first player object is positioned in the specific area of the virtual game world; and
second player object action controlling means for causing the second player object to perform the specific action, when the sound is inputted through the sound inputting means and the second player object is positioned in the specific area of the virtual game world.
16. The computer-readable storage medium according to claim 15, wherein
the game apparatus is further connected to first operation means operated by the first player and second operation means operated by the second player,
each of the first operation means and the second operation means includes the sound inputting means,
the first player object action controlling means causes the first player object to perform the specific action, when the sound is inputted through the sound inputting means included in the first operation means and the first player object is positioned in the specific area of the virtual game world, and
the second player object action controlling means causes the second player object to perform the specific action, when the sound is inputted through the sound inputting means included in the second operation means and the second player object is positioned in the specific area of the virtual game world.
17. The computer-readable storage medium according to claim 15, wherein
the game program instructs the computer to further function as:
first virtual camera setting means for setting a parameter of a first virtual camera which images the first player object based on a current position of the first player object;
second virtual camera setting means for setting a parameter of a second virtual camera which images the second player object based on a current position of the second player object; and
display controlling means for causing the display apparatus to simultaneously display, on the display apparatus, a first game image generated by imaging the virtual game space by means of the first virtual camera and a second game image generated by imaging the virtual game space by means of the second virtual camera.
18. A game apparatus comprising:
sound inputting means;
a display apparatus;
display controlling means for generating a game image including a player object, of a virtual game world, which is operated by a player, and causing a display apparatus to display the game image;
movement controlling means for causing the player object to move in the virtual game world;
sound detecting means for determining whether a sound is inputted through the sound inputting means;
object position determining means for determining whether the player object is positioned in a specific area of the virtual game world; and
action controlling means for causing the player object to perform a specific action, when the sound is inputted through the sound inputting means and the player object is positioned in the specific area of the virtual game world.
US11/797,558 2006-05-09 2007-05-04 Game program and game apparatus Abandoned US20070265074A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-0130776 2006-05-09
JP2006130776A JP2007301039A (en) 2006-05-09 2006-05-09 Game program and game apparatus

Publications (1)

Publication Number Publication Date
US20070265074A1 true US20070265074A1 (en) 2007-11-15

Family

ID=38685802

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/797,558 Abandoned US20070265074A1 (en) 2006-05-09 2007-05-04 Game program and game apparatus

Country Status (2)

Country Link
US (1) US20070265074A1 (en)
JP (1) JP2007301039A (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050096132A1 (en) * 2003-09-22 2005-05-05 Hiromu Ueshima Music game with strike sounds changing in quality in the progress of music and entertainment music system
US20070265087A1 (en) * 2006-05-09 2007-11-15 Nintendo Co., Ltd. Game program and game apparatus
US20070265042A1 (en) * 2006-05-09 2007-11-15 Nintendo Co., Ltd. Game program and game apparatus
US20120231886A1 (en) * 2009-11-20 2012-09-13 Wms Gaming Inc. Integrating wagering games and environmental conditions
US8390670B1 (en) 2008-11-24 2013-03-05 Shindig, Inc. Multiparty communications systems and methods that optimize communications based on mode and available bandwidth
US8647206B1 (en) 2009-01-15 2014-02-11 Shindig, Inc. Systems and methods for interfacing video games and user communications
US9401937B1 (en) 2008-11-24 2016-07-26 Shindig, Inc. Systems and methods for facilitating communications amongst multiple users
US9711181B2 (en) 2014-07-25 2017-07-18 Shindig. Inc. Systems and methods for creating, editing and publishing recorded videos
US9712579B2 (en) 2009-04-01 2017-07-18 Shindig. Inc. Systems and methods for creating and publishing customizable images from within online events
US9734410B2 (en) 2015-01-23 2017-08-15 Shindig, Inc. Systems and methods for analyzing facial expressions within an online classroom to gauge participant attentiveness
US9733333B2 (en) 2014-05-08 2017-08-15 Shindig, Inc. Systems and methods for monitoring participant attentiveness within events and group assortments
US9779708B2 (en) 2009-04-24 2017-10-03 Shinding, Inc. Networks of portable electronic devices that collectively generate sound
US9947366B2 (en) 2009-04-01 2018-04-17 Shindig, Inc. Group portraits composed using video chat systems
US9952751B2 (en) 2014-04-17 2018-04-24 Shindig, Inc. Systems and methods for forming group communications within an online event
US10133916B2 (en) 2016-09-07 2018-11-20 Steven M. Gottlieb Image and identity validation in video chat events
US10271010B2 (en) 2013-10-31 2019-04-23 Shindig, Inc. Systems and methods for controlling the display of content
US11439903B2 (en) * 2018-12-26 2022-09-13 Nintendo Co., Ltd. Information processing system, storage medium storing information processing program, information processing apparatus, and information processing method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108465238B (en) * 2018-02-12 2021-11-12 网易(杭州)网络有限公司 Information processing method in game, electronic device and storage medium

Citations (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5993318A (en) * 1996-11-07 1999-11-30 Kabushiki Kaisha Sega Enterprises Game device, image sound processing device and recording medium
US6149523A (en) * 1996-03-06 2000-11-21 Namco Ltd. Image synthesis method, games machine and information storage medium with sequence checking
US20010046896A1 (en) * 1995-11-22 2001-11-29 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control due to environmental conditions
US6347998B1 (en) * 1999-06-30 2002-02-19 Konami Co., Ltd. Game system and computer-readable recording medium
US20030069052A1 (en) * 2001-10-10 2003-04-10 Konami Corporation Recording medium storing game screen display program, game screen display program, game screen display method, and video game device
US20030216177A1 (en) * 2002-05-17 2003-11-20 Eiji Aonuma Game system and game program
US20040029640A1 (en) * 1999-10-04 2004-02-12 Nintendo Co., Ltd. Game system and game information storage medium used for same
US20050056997A1 (en) * 2003-09-12 2005-03-17 Nintendo Co., Ltd. Operating apparatus for game machine
US20050085297A1 (en) * 2003-09-12 2005-04-21 Namco Ltd. Program, information storage medium and game system
US20050130740A1 (en) * 2003-09-12 2005-06-16 Namco Ltd. Input device, input determination method, game system, game system control method, program, and information storage medium
US20050245315A1 (en) * 2004-04-30 2005-11-03 Nintendo Co., Ltd. Game system and game program medium
US20050272504A1 (en) * 2001-08-21 2005-12-08 Nintendo Co., Ltd. Method and apparatus for multi-user communications using discrete video game platforms
US20050288099A1 (en) * 2004-05-07 2005-12-29 Takao Shimizu Game system, storage medium storing game program, and game controlling method
US20060025218A1 (en) * 2004-07-29 2006-02-02 Nintendo Co., Ltd. Game apparatus utilizing touch panel and storage medium storing game program
US7073129B1 (en) * 1998-12-18 2006-07-04 Tangis Corporation Automated selection of appropriate information based on a computer user's context
US7071914B1 (en) * 2000-09-01 2006-07-04 Sony Computer Entertainment Inc. User input device and method for interaction with graphic images
US7083519B2 (en) * 2001-10-26 2006-08-01 Konami Corporation Game system and related game machine, control method and program, operable with different interchangeable controllers
US7096079B2 (en) * 1999-10-14 2006-08-22 Sony Computer Entertainment Inc. Audio processing and image generating apparatus, audio processing and image generating method, recording medium and program
US7134960B1 (en) * 2000-08-23 2006-11-14 Nintendo Co., Ltd. External interfaces for a 3D graphics system
US20070265087A1 (en) * 2006-05-09 2007-11-15 Nintendo Co., Ltd. Game program and game apparatus
US20070265042A1 (en) * 2006-05-09 2007-11-15 Nintendo Co., Ltd. Game program and game apparatus
US7320643B1 (en) * 2006-12-04 2008-01-22 Harmonix Music Systems, Inc. Game controller simulating a musical instrument
US7335105B2 (en) * 2001-08-20 2008-02-26 Ssd Company Limited Soccer game apparatus
US7352358B2 (en) * 2002-07-27 2008-04-01 Sony Computer Entertainment America Inc. Method and system for applying gearing effects to acoustical tracking
US7367887B2 (en) * 2000-02-18 2008-05-06 Namco Bandai Games Inc. Game apparatus, storage medium, and computer program that adjust level of game difficulty
US7435169B2 (en) * 2006-03-10 2008-10-14 Nintendo Co., Ltd. Music playing apparatus, storage medium storing a music playing control program and music playing control method
US20080311969A1 (en) * 2007-06-14 2008-12-18 Robert Kay Systems and methods for indicating input actions in a rhythm-action game
US20090082078A1 (en) * 2006-03-29 2009-03-26 Harmonix Music Systems, Inc. Game controller simulating a musical instrument
US7601056B2 (en) * 2003-07-30 2009-10-13 Konami Corporation Music game software and music game machine
US20090258686A1 (en) * 2008-04-15 2009-10-15 Mccauley Jack J System and method for playing a music video game with a drum system game controller
US7682237B2 (en) * 2003-09-22 2010-03-23 Ssd Company Limited Music game with strike sounds changing in quality in the progress of music and entertainment music system
US7782297B2 (en) * 2002-07-27 2010-08-24 Sony Computer Entertainment America Inc. Method and apparatus for use in determining an activity level of a user in relation to a system

Patent Citations (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010046896A1 (en) * 1995-11-22 2001-11-29 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control due to environmental conditions
US6149523A (en) * 1996-03-06 2000-11-21 Namco Ltd. Image synthesis method, games machine and information storage medium with sequence checking
US5993318A (en) * 1996-11-07 1999-11-30 Kabushiki Kaisha Sega Enterprises Game device, image sound processing device and recording medium
US7073129B1 (en) * 1998-12-18 2006-07-04 Tangis Corporation Automated selection of appropriate information based on a computer user's context
US6347998B1 (en) * 1999-06-30 2002-02-19 Konami Co., Ltd. Game system and computer-readable recording medium
US20040029640A1 (en) * 1999-10-04 2004-02-12 Nintendo Co., Ltd. Game system and game information storage medium used for same
US7096079B2 (en) * 1999-10-14 2006-08-22 Sony Computer Entertainment Inc. Audio processing and image generating apparatus, audio processing and image generating method, recording medium and program
US7367887B2 (en) * 2000-02-18 2008-05-06 Namco Bandai Games Inc. Game apparatus, storage medium, and computer program that adjust level of game difficulty
US7134960B1 (en) * 2000-08-23 2006-11-14 Nintendo Co., Ltd. External interfaces for a 3D graphics system
US7071914B1 (en) * 2000-09-01 2006-07-04 Sony Computer Entertainment Inc. User input device and method for interaction with graphic images
US7335105B2 (en) * 2001-08-20 2008-02-26 Ssd Company Limited Soccer game apparatus
US20050272504A1 (en) * 2001-08-21 2005-12-08 Nintendo Co., Ltd. Method and apparatus for multi-user communications using discrete video game platforms
US20030069052A1 (en) * 2001-10-10 2003-04-10 Konami Corporation Recording medium storing game screen display program, game screen display program, game screen display method, and video game device
US7083519B2 (en) * 2001-10-26 2006-08-01 Konami Corporation Game system and related game machine, control method and program, operable with different interchangeable controllers
US20030216177A1 (en) * 2002-05-17 2003-11-20 Eiji Aonuma Game system and game program
US7782297B2 (en) * 2002-07-27 2010-08-24 Sony Computer Entertainment America Inc. Method and apparatus for use in determining an activity level of a user in relation to a system
US7352358B2 (en) * 2002-07-27 2008-04-01 Sony Computer Entertainment America Inc. Method and system for applying gearing effects to acoustical tracking
US7601056B2 (en) * 2003-07-30 2009-10-13 Konami Corporation Music game software and music game machine
US20050056997A1 (en) * 2003-09-12 2005-03-17 Nintendo Co., Ltd. Operating apparatus for game machine
US20050130740A1 (en) * 2003-09-12 2005-06-16 Namco Ltd. Input device, input determination method, game system, game system control method, program, and information storage medium
US20080242412A1 (en) * 2003-09-12 2008-10-02 Nintendo Co., Ltd. Operating apparatus for game machine
US7479064B2 (en) * 2003-09-12 2009-01-20 Nintendo Co., Ltd. Operating apparatus for game machine
US20050085297A1 (en) * 2003-09-12 2005-04-21 Namco Ltd. Program, information storage medium and game system
US7582015B2 (en) * 2003-09-12 2009-09-01 Namco Bandai Games Inc. Program, information storage medium and game system
US7682237B2 (en) * 2003-09-22 2010-03-23 Ssd Company Limited Music game with strike sounds changing in quality in the progress of music and entertainment music system
US20120196681A1 (en) * 2004-04-30 2012-08-02 Nintendo Co., Ltd., Game system and game program medium
US20050245315A1 (en) * 2004-04-30 2005-11-03 Nintendo Co., Ltd. Game system and game program medium
US20050288099A1 (en) * 2004-05-07 2005-12-29 Takao Shimizu Game system, storage medium storing game program, and game controlling method
US7618322B2 (en) * 2004-05-07 2009-11-17 Nintendo Co., Ltd. Game system, storage medium storing game program, and game controlling method
US20060025218A1 (en) * 2004-07-29 2006-02-02 Nintendo Co., Ltd. Game apparatus utilizing touch panel and storage medium storing game program
US7435169B2 (en) * 2006-03-10 2008-10-14 Nintendo Co., Ltd. Music playing apparatus, storage medium storing a music playing control program and music playing control method
US20090082078A1 (en) * 2006-03-29 2009-03-26 Harmonix Music Systems, Inc. Game controller simulating a musical instrument
US20070265042A1 (en) * 2006-05-09 2007-11-15 Nintendo Co., Ltd. Game program and game apparatus
US20070265087A1 (en) * 2006-05-09 2007-11-15 Nintendo Co., Ltd. Game program and game apparatus
US7320643B1 (en) * 2006-12-04 2008-01-22 Harmonix Music Systems, Inc. Game controller simulating a musical instrument
US20090104956A1 (en) * 2007-06-14 2009-04-23 Robert Kay Systems and methods for simulating a rock band experience
US20090075711A1 (en) * 2007-06-14 2009-03-19 Eric Brosius Systems and methods for providing a vocal experience for a player of a rhythm action game
US20080311969A1 (en) * 2007-06-14 2008-12-18 Robert Kay Systems and methods for indicating input actions in a rhythm-action game
US7625284B2 (en) * 2007-06-14 2009-12-01 Harmonix Music Systems, Inc. Systems and methods for indicating input actions in a rhythm-action game
US20100041477A1 (en) * 2007-06-14 2010-02-18 Harmonix Music Systems, Inc. Systems and Methods for Indicating Input Actions in a Rhythm-Action Game
US20090258686A1 (en) * 2008-04-15 2009-10-15 Mccauley Jack J System and method for playing a music video game with a drum system game controller

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"SOCOM: U.S. Navy SEALs Releases," Giant Bomb, 2 pages (printed), accessed 13 Nov. 2012, *
"SOCOM: U.S. Navy SEALs Review," Gamer's Temple, second page, 2 pages (printed), accessed 13 Nov. 2012, *

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050096132A1 (en) * 2003-09-22 2005-05-05 Hiromu Ueshima Music game with strike sounds changing in quality in the progress of music and entertainment music system
US7682237B2 (en) * 2003-09-22 2010-03-23 Ssd Company Limited Music game with strike sounds changing in quality in the progress of music and entertainment music system
US20070265087A1 (en) * 2006-05-09 2007-11-15 Nintendo Co., Ltd. Game program and game apparatus
US20070265042A1 (en) * 2006-05-09 2007-11-15 Nintendo Co., Ltd. Game program and game apparatus
US10525345B2 (en) 2006-05-09 2020-01-07 Nintendo Co., Ltd. Game program and game apparatus
US10092837B2 (en) 2006-05-09 2018-10-09 Nintendo Co., Ltd. Game program and game apparatus
US9550123B2 (en) * 2006-05-09 2017-01-24 Nintendo Co., Ltd. Game program and game apparatus
US8902272B1 (en) 2008-11-24 2014-12-02 Shindig, Inc. Multiparty communications systems and methods that employ composite communications
US10542237B2 (en) 2008-11-24 2020-01-21 Shindig, Inc. Systems and methods for facilitating communications amongst multiple users
US8917310B2 (en) 2008-11-24 2014-12-23 Shindig, Inc. Multiparty communications systems and methods that optimize communications based on mode and available bandwidth
US9782675B2 (en) 2008-11-24 2017-10-10 Shindig, Inc. Systems and methods for interfacing video games and user communications
US9041768B1 (en) 2008-11-24 2015-05-26 Shindig, Inc. Multiparty communications systems and methods that utilize multiple modes of communication
US8390670B1 (en) 2008-11-24 2013-03-05 Shindig, Inc. Multiparty communications systems and methods that optimize communications based on mode and available bandwidth
US9215412B2 (en) 2008-11-24 2015-12-15 Shindig, Inc. Multiparty communications systems and methods that optimize communications based on mode and available bandwidth
US9357169B2 (en) 2008-11-24 2016-05-31 Shindig, Inc. Multiparty communications and methods that utilize multiple modes of communication
US9401937B1 (en) 2008-11-24 2016-07-26 Shindig, Inc. Systems and methods for facilitating communications amongst multiple users
US8405702B1 (en) 2008-11-24 2013-03-26 Shindig, Inc. Multiparty communications systems and methods that utilize multiple modes of communication
US9661270B2 (en) 2008-11-24 2017-05-23 Shindig, Inc. Multiparty communications systems and methods that optimize communications based on mode and available bandwidth
US9124760B2 (en) 2009-01-15 2015-09-01 Shindig, Inc. Systems and methods for interfacing video games and user communications
US9737804B2 (en) 2009-01-15 2017-08-22 Shindig, Inc. Systems and methods for interfacing video games and user communications
US8647206B1 (en) 2009-01-15 2014-02-11 Shindig, Inc. Systems and methods for interfacing video games and user communications
US9712579B2 (en) 2009-04-01 2017-07-18 Shindig, Inc. Systems and methods for creating and publishing customizable images from within online events
US9947366B2 (en) 2009-04-01 2018-04-17 Shindig, Inc. Group portraits composed using video chat systems
US9779708B2 (en) 2009-04-24 2017-10-03 Shindig, Inc. Networks of portable electronic devices that collectively generate sound
US20120231886A1 (en) * 2009-11-20 2012-09-13 Wms Gaming Inc. Integrating wagering games and environmental conditions
US8968092B2 (en) * 2009-11-20 2015-03-03 Wms Gaming, Inc. Integrating wagering games and environmental conditions
US10271010B2 (en) 2013-10-31 2019-04-23 Shindig, Inc. Systems and methods for controlling the display of content
US9952751B2 (en) 2014-04-17 2018-04-24 Shindig, Inc. Systems and methods for forming group communications within an online event
US9733333B2 (en) 2014-05-08 2017-08-15 Shindig, Inc. Systems and methods for monitoring participant attentiveness within events and group assortments
US9711181B2 (en) 2014-07-25 2017-07-18 Shindig, Inc. Systems and methods for creating, editing and publishing recorded videos
US9734410B2 (en) 2015-01-23 2017-08-15 Shindig, Inc. Systems and methods for analyzing facial expressions within an online classroom to gauge participant attentiveness
US10133916B2 (en) 2016-09-07 2018-11-20 Steven M. Gottlieb Image and identity validation in video chat events
US11439903B2 (en) * 2018-12-26 2022-09-13 Nintendo Co., Ltd. Information processing system, storage medium storing information processing program, information processing apparatus, and information processing method

Also Published As

Publication number Publication date
JP2007301039A (en) 2007-11-22

Similar Documents

Publication Publication Date Title
US20070265074A1 (en) Game program and game apparatus
US10525345B2 (en) Game program and game apparatus
US6296570B1 (en) Video game system and video game memory medium
US6612930B2 (en) Video game apparatus and method with enhanced virtual camera control
US7666079B2 (en) Video game processing apparatus, a method and a computer readable medium that stores a program for processing a video game
JP4740644B2 (en) Image processing program and image processing apparatus
US8556718B2 (en) Game machine and movement control method of character
KR100648539B1 (en) Game machine
US6923722B2 (en) Game system and game program for providing multi-player gameplay on individual displays and a common display
JP2010273987A (en) Program, information storage medium and image forming system
US6325717B1 (en) Video game apparatus and method with enhanced virtual camera control
JP4189315B2 (en) GAME DEVICE AND GAME PROGRAM
US8241120B2 (en) Game program, game apparatus and game control method
US6793576B2 (en) Methods and apparatus for causing a character object to overcome an obstacle object
JP2005103154A (en) Game apparatus and game program
US20070265042A1 (en) Game program and game apparatus
JP3981392B2 (en) Video game program, video game apparatus, and video game control method
JP4148868B2 (en) GAME PROGRAM AND GAME DEVICE
JP2005342120A (en) Game device, game controlling method and program
EP1002560A2 (en) Video game apparatus and information storage medium for video game
JP5216927B2 (en) GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
JP7429663B2 (en) Information processing program, information processing device, information processing system, and information processing method
JPH08832A (en) Game device and slip stream generating method therefor
JP3910995B2 (en) Information storage medium and image generation apparatus
JP4353285B2 (en) Video game apparatus, program control method, and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: NINTENDO CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AKAHORI, EIJI;MIYATA, SHINGO;IZUNO, TOSHIHARU;AND OTHERS;REEL/FRAME:019368/0827

Effective date: 20070425

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION