EP2118840A1 - Interactive user controlled avatar animations - Google Patents

Interactive user controlled avatar animations

Info

Publication number
EP2118840A1
Authority
EP
European Patent Office
Prior art keywords
controller
avatar
console
user
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP08726220A
Other languages
German (de)
French (fr)
Other versions
EP2118840A4 (en)
Inventor
Phil Harrison
Scott Waugaman
Gary M. Zalewski
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Sony Interactive Entertainment America LLC
Original Assignee
Sony Computer Entertainment Europe Ltd
Sony Computer Entertainment America LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GBGB0703974.6A (GB0703974D0)
Priority claimed from GB0704225A (GB2447094B)
Priority claimed from GB0704246A (GB2447096B)
Priority claimed from GB0704235A (GB2447095B)
Priority claimed from GB0704227A (GB2447020A)
Priority claimed from US 11/789,202 (US20080215974A1)
Application filed by Sony Computer Entertainment Europe Ltd and Sony Computer Entertainment America LLC
Priority claimed from PCT/US2008/002644 (WO2008106197A1)
Publication of EP2118840A1
Publication of EP2118840A4


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/12
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/23 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F 13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F 13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 13/61 Generating or modifying game content using advertising information
    • A63F 13/70 Game security or game management aspects
    • A63F 13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F 13/795 Game security or game management aspects involving player-related data for finding other players; for building a team; for providing a buddy list
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/131 Protocols for games, networked simulations or virtual reality
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/105 Input arrangements using inertial sensors, e.g. accelerometers, gyroscopes
    • A63F 2300/1081 Input via voice recognition
    • A63F 2300/1087 Input arrangements comprising photodetecting means, e.g. a camera
    • A63F 2300/30 Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 Details of the user interface
    • A63F 2300/40 Features characterised by details of platform network
    • A63F 2300/406 Transmission via wireless network, e.g. pager or GSM
    • A63F 2300/407 Data transfer via internet
    • A63F 2300/50 Features characterized by details of game servers
    • A63F 2300/55 Details of game data or player data management
    • A63F 2300/5506 Details of game data or player data management using advertisements
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/6063 Methods for sound processing
    • A63F 2300/6072 Sound processing of an input signal, e.g. pitch and rhythm extraction, voice recognition
    • A63F 2300/80 Features specially adapted for executing a specific type of game
    • A63F 2300/807 Role playing or strategy games

Definitions

  • the present invention relates generally to interactive multimedia entertainment and, more particularly, to interactive user control and manipulation of representations of users in a virtual space.
  • Example gaming platforms include the Sony PlayStation and Sony PlayStation 2 (PS2), each of which is sold in the form of a game console.
  • the game console is designed to connect to a monitor (usually a television) and enable user interaction through handheld controllers.
  • the game console is designed with specialized processing hardware, including a CPU, a graphics synthesizer for processing intensive graphics operations, a vector unit for performing geometry transformations, and other glue hardware, firmware, and software.
  • the game console is further designed with an optical disc tray for receiving game compact discs for local play through the game console. Online gaming is also possible, where a user can interactively play against or with other users over the Internet.
  • a virtual world is a simulated environment in which users may interact with each other via one or more computer processors. Users may appear on a video screen in the form of representations referred to as avatars.
  • the degree of interaction between the avatars and the simulated environment is implemented by one or more computer applications that govern such interactions as simulated physics, exchange of information between users, and the like.
  • the nature of interactions among users of the virtual world is often limited by the constraints of the system implementing the virtual world.
  • Embodiments defined herein enable computer controlled systems and programs to map interface input to particular aspects of a virtual world animated character, as represented on a screen.
  • specific buttons of a controller (e.g., a game controller) are mapped to specific body parts of an avatar that defines the virtual world animated character.
  • buttons map not only to avatar features, but also to positioning, movement, triggers, placement, and combinations thereof, so that a real-world user can accurately control the avatar represented on the screen.
  • the real-world user is able to control the avatar throughout a virtual world of places and spaces, and cause the interaction with other avatars (that may be controlled by other real-world users or computer controlled bots), or interface with things, objects, environments, and cause communication actions.
  • the communication actions can be controlled by the controller, by way of the translation mapping that is transferred to the avatar in the form of visual, audio, or combinations thereof.
  • mapping specific controller (e.g., game controller, or general computer controlling peripheral) button movements (and combinations) to specific or selected body parts of an avatar, entire body movements of an avatar, body reactions of an avatar, facial reactions of an avatar, emotions of an avatar, and the like.
  • a method is disclosed for controlling an avatar in a virtual space, the virtual space being accessed through a computer network using a console.
  • the method begins by capturing activity of a console controller and processing the captured activity of the console controller to identify input parameters.
  • the next operation of the method is to map selected ones of the input parameters to portions of the avatar, the avatar being a virtual space representation of a user.
  • the capturing, processing and mapping are continuously performed to define a correlation between activity of the console controller and the avatar that is the virtual space representation of the user.
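As a rough sketch of the capture, process, and map cycle described above, the loop below shows one possible shape in Python; all names and fields (ControllerSample, AvatarPose, waist_bend, the use of pitch and yaw as the identified parameters) are illustrative assumptions, not part of the disclosed method.

```python
from dataclasses import dataclass

@dataclass
class ControllerSample:
    """One frame of captured console-controller activity (hypothetical fields)."""
    pitch: float   # degrees of controller pitch
    yaw: float     # degrees of controller yaw

@dataclass
class AvatarPose:
    """Portions of the avatar that controller input is mapped onto."""
    waist_bend: float = 0.0
    body_yaw: float = 0.0

def process(sample):
    """Process the captured activity to identify input parameters."""
    return {"pitch": sample.pitch, "yaw": sample.yaw}

def map_to_avatar(params):
    """Map selected input parameters to portions of the avatar."""
    return AvatarPose(waist_bend=params["pitch"], body_yaw=params["yaw"])

# the capturing, processing, and mapping are performed continuously, frame by frame
pose = AvatarPose()
for sample in (ControllerSample(0.0, 0.0), ControllerSample(45.0, 90.0)):
    pose = map_to_avatar(process(sample))
```

In a real console pipeline each iteration would consume a fresh sample from the controller rather than a fixed list, preserving the continuous correlation the method describes.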
  • a method for interactively controlling an avatar through a computer network using a console begins by providing a console controller and determining a first position of the console controller. The method continues by capturing input to the console controller, the input including detecting movement of the console controller to a second position. Another step is processing input to the console controller and the relative motion of the console controller between the first position and the second position. The next step of the method is mapping the relative motion between the first position and the second position of the console controller to animated body portions of the avatar. The capturing, processing, and mapping are continuously performed to define a correlation between relative motion of the console controller and the avatar. [0010] In yet another embodiment, a computer implemented method for interactively controlling an avatar within a virtual environment is disclosed.
  • a computer program that is executed on at least one computer in a computer network generates the avatar and virtual environment.
  • the method begins by providing a controller interfaced with the computer program and mapping controller input to allow a user to control a select portion of the avatar.
  • the method continues by capturing controller input and controller movement between a first position and a second position.
  • the next step of the method is processing the captured controller input and controller movement and applying the captured movement to interactively animate the select portion of the avatar within the virtual environment. The capturing and processing of controller input and controller movement are continuously performed to define a correlation between controller movement and avatar animation.
  • Figure 1A is a schematic illustrating an avatar control system 100 in accordance with one embodiment of the present invention.
  • Figure 1B illustrates various methods of transmitting motion and position detection between the controller 108 and console 110, in accordance with one embodiment of the present invention.
  • Figure 2 shows an illustration of a user 102a interacting with an avatar controlling system in accordance with one embodiment of the present invention.
  • Figure 3 shows an illustration of a user 102a interacting with an avatar controlling system in accordance with one embodiment of the present invention.
  • Figure 4A is an exemplary illustration of motion capture of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention.
  • Figure 4B is a flow chart illustrating how button selection on the controller can be used to define and supplement an avatar controlling system in accordance with one embodiment of the present invention.
  • Figure 5A is an exemplary illustration of multiple motion captures of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention.
  • Figure 5B is another exemplary illustration of multiple motion captures of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention.
  • Figure 6 illustrates mapping controller buttons to control particular body parts of an avatar, in accordance with one embodiment of the present invention.
  • Figure 7 illustrates controlling various aspects of an avatar during the display of an avatar animation, in accordance with one embodiment of the present invention.
  • Figure 8 illustrates controlling various motions of an avatar's head in accordance with one embodiment of the present invention.
  • Figure 9 schematically illustrates the overall system architecture of the Sony® Playstation 3® entertainment device, a console having controllers for implementing an avatar control system in accordance with one embodiment of the present invention.
  • Figure 10 is a schematic of the Cell processor 928 in accordance with one embodiment of the present invention.
  • An invention for allowing real world users to control motions and actions of avatars within a virtual world.
  • users may interact with a virtual world.
  • virtual world means a representation of a real or fictitious environment having rules of interaction simulated by means of one or more processors that a real user may perceive via one or more display devices and/or may interact with via one or more user interfaces.
  • user interface refers to a real device by which a user may send inputs to or receive outputs from the virtual world.
  • the virtual world may be simulated by one or more processor modules. Multiple processor modules may be linked together via a network.
  • the user may interact with the virtual world via a user interface device that can communicate with the processor modules and other user interface devices via a network.
  • Certain aspects of the virtual world may be presented to the user in graphical form on a graphical display such as a computer monitor, television monitor or similar display.
  • Certain other aspects of the virtual world may be presented to the user in audible form on a speaker, which may be associated with the graphical display.
  • users may be represented by avatars. Each avatar within the virtual world may be uniquely associated with a different user. The name or pseudonym of a user may be displayed next to the avatar so that users may readily identify each other.
  • a particular user's interactions with the virtual world may be represented by one or more corresponding actions of the avatar.
  • An avatar representing a user could have an appearance similar to that of a person, an animal or an object.
  • An avatar in the form of a person may have the same gender as the user or a different gender.
  • the avatar may be shown on the display so that the user can see the avatar along with other objects in the virtual world.
  • the display may show the world from the point of view of the avatar without showing itself.
  • the user's (or avatar's) perspective on the virtual world may be thought of as being the view of a virtual camera.
  • a virtual camera refers to a point of view within the virtual world that may be used for rendering two-dimensional images of a 3D scene within the virtual world.
  • Users may interact with each other through their avatars by means of the chat channels associated with each lobby. Users may enter text for chat with other users via their user interface. The text may then appear over or next to the user's avatar, e.g., in the form of comic-book style dialogue bubbles, sometimes referred to as chat bubbles. Such chat may be facilitated by the use of a canned phrase chat system sometimes referred to as quick chat. With quick chat, a user may select one or more chat phrases from a menu. For further examples, reference may also be made to: (1) United Kingdom patent application no. 0703974.6 entitled “ENTERTAINMENT DEVICE", filed on March 1, 2007; (2) United Kingdom patent application no.
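The quick-chat mechanism above can be pictured as a small menu lookup that turns a selection into chat-bubble text; the phrases and function names below are invented for illustration.

```python
# Hypothetical quick-chat table: a user selects a canned phrase by menu index,
# and the text appears as a chat bubble next to that user's avatar.
QUICK_CHAT_MENU = {1: "Hello!", 2: "Good game.", 3: "Follow me."}

def quick_chat_bubble(menu_index, avatar_name):
    """Return the chat-bubble text for a canned-phrase selection."""
    phrase = QUICK_CHAT_MENU.get(menu_index)
    if phrase is None:
        raise ValueError(f"no canned phrase at menu index {menu_index}")
    return f"{avatar_name}: {phrase}"
```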
  • Figure 1A is a schematic illustrating an avatar control system 100 in accordance with one embodiment of the present invention.
  • a user 102a manipulates a controller 108 that can communicate with a console 110.
  • the console 110 can include a storage medium capable of saving and retrieving data.
  • Exemplary types of storage mediums include, but are not limited to magnetic storage, optical storage, flash memory, random access memory and read only memory.
  • the console 110 can also include a network interface such as Ethernet ports and wireless network capabilities including the multitude of wireless networking standards found under IEEE 802.11.
  • the network interface of the console 110 can enable the user 102a to connect to remote servers capable of providing real-time interactive game play with other console 110 users, software updates, media services, and access to social networking services.
  • the console 110 can also include a central processing unit and graphics processing units. The central processing unit can be used to process instructions retrieved from the storage medium while the graphics processing unit can process and render graphics to be displayed on a screen 106.
  • the console 110 can display a virtual space 104 that includes an avatar 102b.
  • the virtual space 104 can be maintained on remote servers accessed using the network interface of the console 110.
  • portions of the virtual space 104 can be stored on the console 110 while other portions are stored on remote servers.
  • the virtual space 104 is a virtual three-dimensional world displayed on the screen 106.
  • the virtual space 104 can be a virtual representation of the real world that includes geography, weather, flora, fauna, currency, and politics. Similar to the real world, the virtual space 104 can include urban, suburban, and rural areas. However, unlike the real world, the virtual space 104 can have variable laws of physics.
  • the previously discussed aspects of the virtual space 104 are intended to be exemplary and not intended to be limiting or restrictive. As a virtual space, the scope of what can be simulated and modeled can encompass anything within the real world and is only limited by the scope of human imagination.
  • a user can interact with the virtual space 104 using their avatar 102b.
  • the avatar 102b can be rendered three-dimensionally and configured by the user 102a to be a realistic or fanciful representation of the user 102a in the virtual space 104.
  • the user 102a can have complete control over multiple aspects of the avatar including, but not limited to, hair style, head shape, eye shape, eye color, nose shape, nose size, ear size, lip shape, lip color, clothing, footwear and accessories.
  • the user 102a can input a photograph of their actual face that can be mapped onto a three-dimensional wire-frame head and body.
  • Figure 1B illustrates various methods of transmitting motion and position detection between the controller 108 and console 110, in accordance with one embodiment of the present invention.
  • the user 102a can control the avatar 102b within the virtual space 104 using the controller 108.
  • the controller 108 can transmit signals wirelessly to the console 110.
  • interference between individual controllers can be avoided by transmitting wireless signals at particular frequencies or by using radio and communications protocols such as Bluetooth.
  • the controller 108 can include multiple buttons and joysticks that can be manipulated by the user to achieve a variety of effects such as navigating and selecting items from an on screen menu. Similarly, the buttons and joysticks of the controller 108 can be mapped to control aspects of computer programs executed by the console 110.
  • the controller 108 can also include motion sensors capable of detecting translation and rotation in the x-axis, y-axis, and z-axis.
  • the motion sensors are inertial sensors capable of detecting motion, acceleration and deceleration of the controller 108.
  • the motion of the controller 108 can be detected in all axes using gyroscopes.
  • the controller 108 can wirelessly transmit data from the motion sensors to the console 110 for processing, resulting in actions displayed on the screen.
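One simple way such inertial data could be turned into an orientation, offered only as an assumed sketch since the patent does not specify an algorithm, is to integrate per-axis gyroscope rates over time:

```python
def integrate_gyro(orientation, rates, dt):
    """Advance a (pitch, yaw, roll) orientation by per-axis gyro rates in deg/s."""
    return tuple(angle + rate * dt for angle, rate in zip(orientation, rates))

# half a second of a steady -90 deg/s pitch rate sampled at 100 Hz: the
# controller ends up pitched down 45 degrees from its starting orientation
orientation = (0.0, 0.0, 0.0)
for _ in range(50):
    orientation = integrate_gyro(orientation, (-90.0, 0.0, 0.0), 0.01)
```

Real systems would also correct the integrated drift, for example with the camera-based detection described below.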
  • a camera 112 can also be connected to the console 110 to assist in providing visual detection of the controller 108.
  • the camera 112 and LEDs positioned on the controller 108 provide visual detection of the controller 108 to the console 110.
  • the LEDs, capable of emitting light within or outside the visible spectrum, can be integrated into the controller in an array that assists in determining whether the controller is off axis to the camera 112.
  • the LEDs can be modularly attached to the controller 108.
  • the camera 112 can be configured to receive the light emitted from the LEDs while the console 110 can calculate movement of the controller 108 based on changes of the LEDs relative to the camera 112.
  • the LEDs of different controllers can be differentiated from each other using individual blink patterns or frequencies of light.
  • the camera 112 is a depth camera that can help determine a distance between the camera 112 and the controller 108.
  • the depth camera can have a maximum scan depth. In this situation, depth values are only calculated for objects within the maximum scan depth.
  • the camera 112 has a maximum scan depth of Z. As the controller 108 is within the maximum scan depth, the distance between the camera 112 and the controller 108 can be calculated.
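A minimal sketch of the maximum-scan-depth rule follows; the numeric cutoff is an assumption, since the patent only calls it Z.

```python
MAX_SCAN_DEPTH_M = 3.0   # the camera's maximum scan depth "Z" (assumed value)

def controller_distance(raw_depth_m):
    """Return the camera-to-controller distance, or None beyond the scan depth."""
    if raw_depth_m is None or raw_depth_m > MAX_SCAN_DEPTH_M:
        return None          # no depth value is calculated past the maximum
    return raw_depth_m
```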
  • combinations of inertial sensors, LED detection and depth cameras can be used to refine motion and position detection.
  • the camera 112 can be integrated into the console 110. In other embodiments, the camera 112 can be positioned independent of the console 110.
  • Figure 2 shows an illustration of a user 102a interacting with an avatar controlling system in accordance with one embodiment of the present invention.
  • User 102a, holding a controller 108, is shown bending over at the waist. As the user 102a bends at the waist, the controller 108 is pitched down from an initial substantially horizontal position to the position illustrated in Figure 2. The pitching down of the controller 108 can be captured by the motion capture system in operation 120.
  • computer analysis can be performed by the console to map the motion capture of the controller 108 to a particular body part of the avatar 102b.
  • Operation 124 renders an animation of the avatar 102b that can be output from the console to the screen 106.
  • the motion capture of the controller 108 can be mapped to the waist of the avatar 102b.
  • motion capture of controller 108 movements can be mapped to different body parts of the avatar 102b such as legs, arms, hands, and head.
  • motion capture from the controller 108 can be combined with other forms of user input to effectuate changes in the avatar 102b.
  • a microphone and camera system can be used to monitor when the user 102a speaks resulting in animation of the mouth of the avatar 102b.
  • the user 102a can also use buttons on the controller 108 to change and customize reactions and movements of the avatar 102b.
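The Figure 2 mapping of a downward controller pitch onto the avatar's waist could be sketched as follows; the clamp range and sign convention are assumptions for illustration.

```python
def map_pitch_to_waist(controller_pitch_deg):
    """Map a captured downward controller pitch to an avatar waist-bend angle."""
    # negative pitch = controller pitched down; clamp to a plausible bend range
    return max(0.0, min(90.0, -controller_pitch_deg))
```

The same shape of function could drive other body parts, such as mapping controller yaw to the avatar's head or whole body.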
  • Figure 3 shows an illustration of a user 102a interacting with an avatar controlling system in accordance with one embodiment of the present invention.
  • the user 102a is shown pitching the controller 108 down from an initial substantially horizontal position to the position seen in Figure 3. Similar to Figure 2, the downward pitch of the controller can be detected by the motion capture system in operation 120. Operation 122 can perform computer analysis of the motion capture and operation 124 can render the motion capture to the avatar 102b. [0043] Comparing Figure 2 and Figure 3 illustrates that motion capture of relative controller movement can effectuate change in the avatar 102b. In Figure 2, as the user 102a bends at the waist, the motion capture system can detect changes in the controller 108 position. In Figure 3, a wrist movement from the user 102a can pitch the controller 108 down.
  • Figure 4A is an exemplary illustration of motion capture of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention.
  • Avatar 102b represents a before motion capture view.
  • Avatar 102b' illustrates an after motion capture view. Initially, the user 102a, holding the controller 108, is represented by the avatar 102b.
  • Motion capture of the relative motion of the controller 108 is analyzed and applied to the avatar 102b.
  • Motion of the controller 108 is mapped to the entire body of the avatar 102b, so the ninety degree yaw of the controller 108 results in avatar 102b'.
  • FIG. 4B is a flow chart illustrating how button selection on the controller can be used to define and supplement an avatar controlling system in accordance with one embodiment of the present invention.
  • Different aspects of an avatar can be controlled by various relative motions of a controller.
  • To enrich the interactivity and realism of an avatar, it can be beneficial to allow users to control facial expressions, hand gestures, and other traits, expressions, and emotions of their avatar.
  • To accomplish this level of avatar control, supplemental input other than motion capture of relative motion of the controller may be used.
  • Buttons on the controller can be mapped to select, control, and manipulate the possibly endless variations of avatar expressions and emotions.
  • The flow chart in Figure 4B illustrates how button selection on the controller can be used to define and supplement an avatar control system.
  • In operation 500, a motion detection system detects a first controller position. This is followed by operation 502, which detects movements of the controller relative to the first controller position. In operation 504, it is determined if any buttons on the controller have been selected. Computer analysis of the controller button selections and of the relative movements of the controller is completed in operation 506. This is followed by operation 508, where the controller movements and button selections are mapped to the avatar.
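Operations 500 through 508 can be sketched as a single control step. The names and the one-dimensional "position" below are simplifying assumptions; the mapping of LS1 to the avatar's left arm follows the button mapping described later in this document.

```python
# Hypothetical sketch of the flow chart's operations 500-508.
def run_avatar_control_step(controller, avatar):
    first_position = controller["position"]                  # operation 500
    movement = controller["new_position"] - first_position   # operation 502
    buttons = controller["buttons_pressed"]                  # operation 504
    # Operation 506: analyze button selections together with relative movement.
    if "LS1" in buttons:          # LS1 assumed to select the avatar's left arm
        target = "left_arm"
    else:
        target = "whole_body"
    # Operation 508: map the movement and button selections to the avatar.
    avatar[target] = avatar.get(target, 0.0) + movement
    return avatar

avatar = run_avatar_control_step(
    {"position": 0.0, "new_position": 30.0, "buttons_pressed": {"LS1"}}, {})
print(avatar)  # {'left_arm': 30.0}
```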
  • Figure 5A is an exemplary illustration of multiple motion captures of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention.
  • The user 102a performs motion A by imparting a downward pitch to the controller 108.
  • The motion capture of motion A is mapped to the waist of the avatar 102b and results in avatar 102b bending over at the waist.
  • Performing motion B, the user 102a yaws the controller to the user's right while pitching the controller up to a substantially horizontal position and rolling the controller to the user's right.
  • Yawing the controller is mapped to the direction the avatar faces.
  • The yaw to the user's right results in the avatar 102b rotating into the forward facing position seen in avatar 102b'.
  • Pitching the controller 108 is mapped to movement of the waist of the avatar.
  • Pitching up of the controller 108 to a substantially horizontal position brings the avatar from the bent-over position of avatar 102b to the straightened position of avatar 102b'.
  • Rolling the controller 108 is mapped to leaning the avatar at the waist, so that the roll of the controller to the user's right results in the avatar 102b' leaning to the avatar's right.
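The yaw/pitch/roll mappings just described amount to a small table from controller axes to avatar body aspects. The attribute names and angle values below are illustrative assumptions:

```python
# Sketch of the axis-to-body-part mapping described above (assumed names).
AXIS_MAP = {
    "yaw":   "facing_direction",  # yaw turns the avatar
    "pitch": "waist_bend",        # pitch bends/straightens the avatar at the waist
    "roll":  "waist_lean",        # roll leans the avatar sideways at the waist
}

def apply_controller_motion(deltas, pose):
    """Accumulate each axis delta onto the avatar aspect it is mapped to."""
    for axis, delta in deltas.items():
        part = AXIS_MAP[axis]
        pose[part] = pose.get(part, 0.0) + delta
    return pose

# Motion B from Figure 5A: yaw right, pitch up to horizontal, roll right,
# starting from the bent-over pose produced by motion A.
pose = apply_controller_motion({"yaw": 90.0, "pitch": 45.0, "roll": -20.0},
                               {"waist_bend": -45.0})
print(pose)
# {'waist_bend': 0.0, 'facing_direction': 90.0, 'waist_lean': -20.0}
```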
  • FIG. 5B is another exemplary illustration of multiple motion captures of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention.
  • The user performs motion A by imparting a downward pitch to the controller 108.
  • The motion capture of motion A is mapped to the waist of the avatar 102b and results in avatar 102b bending over at the waist.
  • The user 102a' performs motion B.
  • With motion B, the user 102a' rolls the controller to the user's left.
  • Rolling the controller leans the avatar 102b to the avatar's left.
  • The combined motion A and motion B results in the avatar 102b being bent forward at the waist and leaning to the avatar's left.
  • FIG. 6 illustrates mapping controller buttons to control particular body parts of an avatar, in accordance with one embodiment of the present invention.
  • The controller 108 can have a variety of buttons, including a digital control pad represented by DU, DR, DD, and DL.
  • The controller can also have left shoulder buttons 108a that include LS1 and LS2.
  • Right shoulder buttons 108b include RS1 and RS2.
  • Analog sticks AL and AR can be included on the controller 108, where the analog sticks are also capable of acting as buttons when depressed.
  • The controller can also have selection buttons illustrated in Figure 6 as a square, triangle, circle, and "X". While particular names and symbols have been used to describe the controller 108, the names are exemplary and not intended to be limiting. [0049] In one embodiment, the various buttons of the controller 108 can be mapped to activate control of particular body parts of an avatar. As shown in avatar mapping 600, depressing AR can place a user in control of the avatar's head. Depressing RS1 or RS2 allows a user to respectively control the right arm or right leg of the avatar. Similarly, LS1 and LS2 are respectively mapped to control the avatar's left arm and left leg. In addition to being able to control various parts of the avatar, a user can initiate and modify pre-rendered avatar animations. The user can initiate an avatar animation with a single or multiple button presses, single or multiple controller movements, or sequences of button presses in conjunction with controller movements.
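The avatar mapping 600 described above amounts to a lookup table from buttons to body parts. The dictionary below restates it as a sketch; the body-part identifiers are illustrative:

```python
# The button-to-body-part mapping of avatar mapping 600, as a table.
# Button names follow the figure; the body-part names are assumptions.
BUTTON_MAP = {
    "AR":  "head",       # right analog stick button controls the head
    "RS1": "right_arm",  # right shoulder buttons: right arm / right leg
    "RS2": "right_leg",
    "LS1": "left_arm",   # left shoulder buttons: left arm / left leg
    "LS2": "left_leg",
}

def select_body_part(button):
    """Return the avatar body part a button activates, or None if unmapped."""
    return BUTTON_MAP.get(button)

print(select_body_part("RS2"))  # right_leg
```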
  • An avatar animation can be considered a sequence of various states.
  • In state 1 602, the user's avatar is in a rest position, or the position prior to the initiation of the dance animation.
  • State 2 604 can be considered the state of the avatar just after initiation of the dance animation. In this embodiment, the avatar has leaned to its left.
  • In state 3 606, the final state of the dance animation 601, the user's avatar has leaned to its right and raised its right arm.
  • Transition frames between the various states are not shown. It should be apparent to one skilled in the art that additional frames may be required to smoothly animate the avatar between the various states.
  • Avatar animations can contain fewer or additional states, as the dance animation 601 is exemplary and not intended to be limiting.
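The transition frames mentioned above could, for example, be generated by linear interpolation between successive states. The pose fields and values below are made up for illustration; the patent only shows the three states of the dance animation.

```python
# A minimal sketch of generating transition frames between animation states.
def interpolate(state_a, state_b, steps):
    """Yield intermediate poses that smoothly animate from one state to the next."""
    for i in range(1, steps + 1):
        t = i / steps
        yield {k: state_a[k] + t * (state_b[k] - state_a[k]) for k in state_a}

state1 = {"lean": 0.0, "right_arm": 0.0}   # rest position (state 1 602)
state2 = {"lean": -1.0, "right_arm": 0.0}  # leaned to the avatar's left (state 2 604)
frames = list(interpolate(state1, state2, 4))
print(frames[1])  # {'lean': -0.5, 'right_arm': 0.0}
```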
  • The controller 108 can detect acceleration and deceleration of translational and rotational motion in three axes. This allows a user to interactively control the directional movement and the rate of animation of the avatar based on user input, such as actual acceleration and deceleration of translational and rotational movement of the controller.
  • The mapping of controller buttons to activate control of particular body parts of an avatar allows a user to decide which body part, or body parts, of the avatar to interactively animate. This can result in unique avatar animations because the user directly controls the animation of particular body parts of the avatar.
  • Avatar animations that are responsive to direct control from the user are different from the pre-mapped, pre-defined, and pre-rendered avatar animations found in other forms of avatar animation.
  • The disclosed mapping of controller movement, controller input, and controller positioning to particular parts of an avatar enables specific identification of avatar aspects to control, a degree of control, and the resulting application of such control to the animated avatar.
  • The avatar character is not tied to a particular pre-defined game, game scenes, environments, or game-level experiences. For instance, an avatar, as controlled by a real-world user, is able to define locations to visit, things to interact with, things to see, and experiences to enjoy.
  • FIG. 7 illustrates controlling various aspects of an avatar during the display of an avatar animation, in accordance with one embodiment of the present invention.
  • A button press combination using the controller 108 can be used to initiate state 1 602 of an avatar dance animation on the screen 106.
  • The controller 108 is in a position that is substantially horizontal.
  • The user depresses and holds LS1 to control the left arm of the avatar.
  • As the user pitches the controller 108 up, the avatar's left arm is raised into the position seen in state 604a on the screen 106.
  • In state 704, the user continues to hold LS1 while pitching the controller 108 down to a substantially horizontal position.
  • As the controller is pitched down, on the screen 106 the avatar's left arm is lowered into the position seen in state 606a.
  • A user can depress and release a button corresponding to a selected portion of an avatar and continue to control that portion of the avatar until the button is pressed a second time.
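The press-to-lock behavior described above can be sketched as a small toggle. The class and attribute names are hypothetical:

```python
# Sketch of press-to-lock control: a first press of a button selects a body
# part for continued control; a second press of the same button releases it.
class AvatarControl:
    def __init__(self):
        self.locked_part = None  # body part currently under user control

    def on_button_press(self, part):
        if self.locked_part == part:
            self.locked_part = None   # second press releases control
        else:
            self.locked_part = part   # first press grabs control

ctrl = AvatarControl()
ctrl.on_button_press("left_arm")
print(ctrl.locked_part)   # left_arm
ctrl.on_button_press("left_arm")
print(ctrl.locked_part)   # None
```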
  • Figure 8 illustrates controlling various motions of an avatar's head in accordance with one embodiment of the present invention.
  • State 800 illustrates how depressing and holding the right analog stick button, AR, while yawing the controller 108, can turn an avatar's head.
  • A user implementing the avatar control in state 800 would be able to turn their avatar's head in a side-to-side motion to non-verbally convey "no".
  • In state 802, if a user pitches the controller 108 up and down while depressing and holding AR, the user can nod their avatar's head up and down to non-verbally convey "yes".
  • A system unit 900 is provided, with various peripheral devices connectable to the system unit 900.
  • The system unit 900 comprises: a Cell processor 928; a Rambus® dynamic random access memory (XDRAM) unit 926; a Reality Synthesizer graphics unit 930 with a dedicated video random access memory (VRAM) unit 932; and an I/O bridge 934.
  • The system unit 900 also comprises a Blu-Ray® Disk BD-ROM® optical disk reader 940 for reading from a disk 940a, and a removable slot-in hard disk drive (HDD) 936, accessible through the I/O bridge 934.
  • The system unit 900 also comprises a memory card reader 938 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 934.
  • The I/O bridge 934 also connects to six Universal Serial Bus (USB) 2.0 ports 924; a gigabit Ethernet port 922; an IEEE 802.11b/g wireless network (Wi-Fi) port 920; and a Bluetooth® wireless link port 918 capable of supporting up to seven Bluetooth connections.
  • The I/O bridge 934 handles all wireless, USB, and Ethernet data, including data from one or more game controllers 902. For example, when a user is playing a game, the I/O bridge 934 receives data from the game controller 902 via a Bluetooth link and directs it to the Cell processor 928, which updates the current state of the game accordingly.
  • The wireless, USB, and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 902, such as: a remote control 904; a keyboard 906; a mouse 908; a portable entertainment device 910 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 912; and a microphone headset 914.
  • Such peripheral devices may therefore in principle be connected to the system unit 900 wirelessly; for example, the portable entertainment device 910 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 914 may communicate via a Bluetooth link.
  • The Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers, and scanners.
  • A legacy memory card reader 916 may be connected to the system unit via a USB port 924, enabling the reading of memory cards 948 of the kind used by the Playstation® or Playstation 2® devices.
  • The game controller 902 is operable to communicate wirelessly with the system unit 900 via the Bluetooth link.
  • The game controller 902 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the game controller 902.
  • The game controller is sensitive to motion in six degrees of freedom, corresponding to translation and rotation in each axis. Consequently, gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands.
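The six-degrees-of-freedom sensing described above can be modeled as a per-sample record of translational and rotational motion on each axis. The type, field names, and threshold below are illustrative assumptions:

```python
# Illustrative 6-DOF controller sample: translation and rotation per axis.
from dataclasses import dataclass

@dataclass
class ControllerSample:
    tx: float; ty: float; tz: float        # translational motion per axis
    pitch: float; yaw: float; roll: float  # rotational motion per axis

def is_gesture_input(sample, threshold=0.5):
    """Treat any sufficiently large motion on any axis as a gesture input,
    supplementing conventional button or joystick commands.
    The threshold value here is arbitrary."""
    return any(abs(v) > threshold
               for v in (sample.tx, sample.ty, sample.tz,
                         sample.pitch, sample.yaw, sample.roll))

print(is_gesture_input(ControllerSample(0, 0, 0, 0.0, 0.8, 0.0)))  # True
```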
  • Other wirelessly enabled peripheral devices, such as the Playstation Portable device, may be used as a controller.
  • The remote control 904 is also operable to communicate wirelessly with the system unit 900 via a Bluetooth link.
  • The remote control 904 comprises controls suitable for the operation of the Blu-Ray Disk BD-ROM reader 940 and for the navigation of disk content.
  • The Blu-Ray Disk BD-ROM reader 940 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs.
  • The reader 940 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs.
  • The reader 940 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.
  • The system unit 900 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesizer graphics unit 930, through audio and video connectors to a display and sound output device 942, such as a monitor or television set having a display 944 and one or more loudspeakers 946.
  • The audio connectors 950 may include conventional analogue and digital outputs, whilst the video connectors 952 may variously include component video, S-video, composite video, and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition.
  • Audio processing (generation, decoding and so on) is performed by the Cell processor 928.
  • The Playstation 3 device's operating system supports Dolby® 5.1 surround sound, Dolby® Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray® disks.
  • The video camera 912 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus, so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (motion picture expert group) standard for decoding by the system unit 900.
  • The camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 900, for example to signify adverse lighting conditions.
  • Embodiments of the video camera 912 may variously connect to the system unit 900 via a USB, Bluetooth or Wi-Fi communication port.
  • Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data.
  • The CCD may have a resolution suitable for high-definition video capture.
  • Images captured by the video camera may, for example, be incorporated within a game or interpreted as game control inputs.
  • In order for successful data communication to occur with a peripheral device, such as a video camera or remote control, via one of the communication ports of the system unit 900, an appropriate piece of software such as a device driver should be provided.
  • Device driver technology is well-known and will not be described in detail here, except to say that the skilled man will be aware that a device driver or similar software interface may be required in the present embodiment.
  • FIG. 10 is a schematic of the Cell processor 928 in accordance with one embodiment of the present invention.
  • The Cell processor 928 has an architecture comprising four basic components: external input and output structures comprising a memory controller 1060 and a dual bus interface controller 1070A,B; a main processor referred to as the Power Processing Element 1050; eight co-processors referred to as Synergistic Processing Elements (SPEs) 1010A-H; and a circular data bus connecting the above components, referred to as the Element Interconnect Bus 1080.
  • The total floating point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPS of the Playstation 2 device's Emotion Engine.
  • The Power Processing Element (PPE) 1050 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 1055 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache.
  • The PPE 1050 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPS at 3.2 GHz.
  • The primary role of the PPE 1050 is to act as a controller for the Synergistic Processing Elements 1010A-H, which handle most of the computational workload. In operation, the PPE 1050 maintains a job queue, scheduling jobs for the SPEs 1010A-H.
  • Each Synergistic Processing Element (SPE) 1010A-H comprises a respective Synergistic Processing Unit (SPU) 1020A-H, and a respective Memory Flow Controller (MFC) 1040A-H comprising in turn a respective Dynamic Memory Access Controller (DMAC) 1042A-H, a respective Memory Management Unit (MMU) 1044A-H and a bus interface (not shown).
  • Each SPU 1020A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 1030A-H, expandable in principle to 4 GB.
  • Each SPE gives a theoretical 25.6 GFLOPS of single precision performance.
  • An SPU can operate on 4 single precision floating point numbers, 4 32-bit numbers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation.
  • The SPU 1020A-H does not directly access the system memory XDRAM 926; the 64-bit addresses formed by the SPU 1020A-H are passed to the MFC 1040A-H, which instructs its DMA controller 1042A-H to access memory via the Element Interconnect Bus 1080 and the memory controller 1060.
  • The Element Interconnect Bus (EIB) 1080 is a logically circular communication bus internal to the Cell processor 928 which connects the above processor elements, namely the PPE 1050, the memory controller 1060, the dual bus interface 1070A,B and the 8 SPEs 1010A-H, totaling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle.
  • Each SPE 1010A-H comprises a DMAC 1042A-H for scheduling longer read or write sequences.
  • The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently, for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction.
  • The theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 bytes per clock, in the event of full utilization through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
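The bandwidth figures quoted above follow directly from the per-slot rate; a quick arithmetic check:

```python
# Checking the EIB peak bandwidth figures quoted in the text.
bytes_per_slot_per_clock = 8   # each participant reads/writes 8 bytes per clock
slots = 12                     # 12 participants on the ring bus
clock_hz = 3.2e9               # 3.2 GHz

bytes_per_clock = bytes_per_slot_per_clock * slots   # aggregate bytes per clock
peak_gb_per_s = bytes_per_clock * clock_hz / 1e9     # gigabytes per second

print(bytes_per_clock)  # 96
print(peak_gb_per_s)    # 307.2
```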
  • The memory controller 1060 comprises an XDRAM interface 1062, developed by Rambus Incorporated.
  • The memory controller interfaces with the Rambus XDRAM 926 with a theoretical peak bandwidth of 25.6 GB/s.
  • The dual bus interface 1070A,B comprises a Rambus FlexIO® system interface 1072A,B.
  • The interface is organized into 12 channels, each being 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O bridge 934 via controller 1070A and the Reality Synthesizer graphics unit 930 via controller 1070B.
  • Data sent by the Cell processor 928 to the Reality Synthesizer graphics unit 930 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
  • Embodiments may include capturing depth data to better identify the real-world user and to direct activity of an avatar or scene.
  • The object can be something the person is holding or can also be the person's hand.
  • The terms "depth camera" and "three-dimensional camera" refer to any camera that is capable of obtaining distance or depth information as well as two-dimensional pixel information.
  • A depth camera can utilize controlled infrared lighting to obtain distance information.
  • Another exemplary depth camera can be a stereo camera pair, which triangulates distance information using two standard cameras.
  • The term "depth sensing device" refers to any type of device that is capable of obtaining distance information as well as two-dimensional pixel information.
  • Embodiments of the present invention provide real-time interactive gaming experiences for users. For example, users can interact with various computer-generated objects in real-time.
  • Video scenes can be altered in real-time to enhance the user's game experience. For example, computer-generated costumes can be inserted over the user's clothing, and computer-generated light sources can be utilized to project virtual shadows within a video scene.
  • A depth camera captures two-dimensional data for a plurality of pixels that comprise the video image. These values are color values for the pixels, generally red, green, and blue (RGB) values for each pixel. In this manner, objects captured by the camera appear as two-dimensional objects on a monitor.
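Combining the per-pixel RGB data described above with the per-pixel distance a depth camera provides can be sketched as follows. The field names and the depth threshold are illustrative assumptions; the patent describes the data, not a format.

```python
# Sketch of per-pixel data from a depth camera: RGB color plus a z value.
from dataclasses import dataclass

@dataclass
class DepthPixel:
    r: int
    g: int
    b: int
    z: float  # distance from the camera to the surface at this pixel

def foreground_mask(pixels, max_depth):
    """Separate a user from the background by thresholding per-pixel depth,
    one way depth data could help identify the real-world user."""
    return [p.z <= max_depth for p in pixels]

pixels = [DepthPixel(10, 20, 30, 1.2), DepthPixel(5, 5, 5, 4.0)]
print(foreground_mask(pixels, 2.0))  # [True, False]
```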
  • Embodiments of the present invention also contemplate distributed image processing configurations. For example, the invention is not limited to the captured image and display image processing taking place in one or even two locations, such as in the CPU or in the CPU and one other element.
  • The input image processing can just as readily take place in an associated CPU, processor or device that can perform processing; essentially all of the image processing can be distributed throughout the interconnected system.
  • The present invention is not limited to any specific image processing hardware circuitry and/or software.
  • The embodiments described herein are also not limited to any specific combination of general hardware circuitry and/or software, nor to any particular source for the instructions executed by processing components.
  • The invention may employ various computer-implemented operations involving data stored in computer systems. These operations include operations requiring physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. Further, the manipulations performed are often referred to in terms such as producing, identifying, determining, or comparing. [0081]
  • The above described invention may be practiced with other computer system configurations, including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • The invention can also be embodied as computer readable code on a computer readable medium.
  • The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system, including an electromagnetic wave carrier. Examples of the computer readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes, and other optical and non-optical data storage devices.
  • The computer readable medium can also be distributed over a network-coupled computer system so that the computer readable code is stored and executed in a distributed fashion.

Abstract

A method for controlling an avatar in a virtual space, the virtual space accessed through a computer network using a console, is disclosed. The method begins by capturing activity of a console controller and processing the captured activity of the console controller to identify input parameters. Another operation of the method is to map selected ones of the input parameters to portions of the avatar, the avatar being a virtual space representation of a user. Wherein the capturing, processing and mapping are continuously performed to define a correlation between activity of the console controller and the avatar that is the virtual space representation of the user.

Description

INTERACTIVE USER CONTROLLED AVATAR ANIMATIONS
by Inventors
Phil Harrison, Scott Waugaman and Gary Zalewski
BACKGROUND
1. Field of the Invention
[0001] The present invention relates generally to interactive multimedia entertainment and, more particularly, to interactive user control and manipulation of representations of users in a virtual space.
2. Description of the Related Art
[0002] The video game industry has seen many changes over the years. As computing power has expanded, developers of video games have likewise created game software that takes advantage of these increases in computing power. To this end, video game developers have been coding games that incorporate sophisticated operations and mathematics to produce a very realistic game experience.
[0003] Example gaming platforms may be the Sony Playstation or Sony Playstation 2 (PS2), each of which is sold in the form of a game console. As is well known, the game console is designed to connect to a monitor (usually a television) and enable user interaction through handheld controllers. The game console is designed with specialized processing hardware, including a CPU, a graphics synthesizer for processing intensive graphics operations, a vector unit for performing geometry transformations, and other glue hardware, firmware, and software. The game console is further designed with an optical disc tray for receiving game compact discs for local play through the game console. Online gaming is also possible, where a user can interactively play against or with other users over the Internet.
[0004] As game complexity continues to intrigue players, game and hardware manufacturers have continued to innovate to enable additional interactivity and computer programs. Some computer programs define virtual worlds. A virtual world is a simulated environment in which users may interact with each other via one or more computer processors. Users may appear on a video screen in the form of representations referred to as avatars. The degree of interaction between the avatars and the simulated environment is implemented by one or more computer applications that govern such interactions as simulated physics, exchange of information between users, and the like. The nature of interactions among users of the virtual world is often limited by the constraints of the system implementing the virtual world. [0005] It is within this context that embodiments of the invention arise.
SUMMARY
[0006] Embodiments defined herein enable computer controlled systems and programs to map interface input to particular aspects of a virtual world animated character, as represented on a screen. In one embodiment, specific buttons of a controller (e.g., game controller) are mapped to specific body parts of an avatar that defines the virtual world animated character. In some embodiments, not only buttons map to avatar features, but also positioning, movement, triggers, placement and combinations thereof, so that a real-world user can accurately control the avatar that is represented on the screen.
[0007] As will be noted below in more detail, the real-world user is able to control the avatar throughout a virtual world of places and spaces, and cause the interaction with other avatars (that may be controlled by other real-world users or computer controlled bots), or interface with things, objects, environments, and cause communication actions. The communication actions can be controlled by the controller, by way of the translation mapping that is transferred to the avatar in the form of visual, audio, or combinations thereof. Accordingly, the following embodiments shall be viewed broadly as examples of controls that are possible by mapping specific controller (e.g., game controller, or general computer controlling peripherals) buttons, movements (and combinations) to specific or selected body parts of an avatar, entire body movements of an avatar, body reactions of an avatar, facial reactions of avatar, emotions of an avatar, and the like. [0008] In one embodiment, a method for controlling an avatar in a virtual space, the virtual space accessed through a computer network using a console, is disclosed. The method begins by capturing activity of a console controller and processing the captured activity of the console controller to identify input parameters. The next operation of the method is to map selected ones of the input parameters to portions of the avatar, the avatar being a virtual space representation of a user. Wherein the capturing, processing and mapping are continuously performed to define a correlation between activity of the console controller and the avatar that is the virtual space representation of the user.
[0009] In another embodiment, a method for interactively controlling an avatar through a computer network using a console is disclosed. The method begins by providing a console controller and determining a first position of the console controller. The method continues by capturing input to the console controller, the input including detecting movement of the console controller to a second position. Another step is processing input to the console controller and relative motion of the console controller between the first position and the second position. The next step of the method is mapping the relative motion between the first position and the second position of the console controller to animated body portions of the avatar. Wherein the capturing, processing and mapping are continuously performed to define a correlation between relative motion of the console controller and the avatar. [0010] In yet another embodiment, a computer implemented method for interactively controlling an avatar within a virtual environment is disclosed. In this embodiment, a computer program that is executed on at least one computer in a computer network generates the avatar and virtual environment. The method begins by providing a controller interfaced with the computer program and mapping controller input to allow a user to control a select portion of the avatar. The method continues by capturing controller input and controller movement between a first position and a second position. The next step of the method is processing the captured controller input and controller movement and applying the captured movement to interactively animate the select portion of the avatar within the virtual environment. Wherein the capturing and processing of controller input and controller movement is continuously performed to define a correlation between controller movement and avatar animation.
[0011] Other aspects and advantages of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention. BRIEF DESCRIPTION OF THE DRAWINGS
[0012] The invention, together with further advantages thereof, may best be understood by reference to the following description taken in conjunction with the accompanying drawings.

[0013] Figure 1A is a schematic illustrating an avatar control system 100 in accordance with one embodiment of the present invention.
[0014] Figure 1B illustrates various methods of transmitting motion and position detection between the controller 108 and console 110, in accordance with one embodiment of the present invention.

[0015] Figure 2 shows an illustration of a user 102a interacting with an avatar controlling system in accordance with one embodiment of the present invention.
[0016] Figure 3 shows an illustration of a user 102a interacting with an avatar controlling system in accordance with one embodiment of the present invention.

[0017] Figure 4A is an exemplary illustration of motion capture of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention.
[0018] Figure 4B is a flow chart illustrating how button selection on the controller can be used to define and supplement an avatar controlling system in accordance with one embodiment of the present invention.

[0019] Figure 5A is an exemplary illustration of multiple motion captures of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention.
[0020] Figure 5B is another exemplary illustration of multiple motion captures of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention.

[0021] Figure 6 illustrates mapping controller buttons to control particular body parts of an avatar, in accordance with one embodiment of the present invention.

[0022] Figure 7 illustrates controlling various aspects of an avatar during the display of an avatar animation, in accordance with one embodiment of the present invention.

[0023] Figure 8 illustrates controlling various motions of an avatar's head in accordance with one embodiment of the present invention.
[0024] Figure 9 schematically illustrates the overall system architecture of the Sony® Playstation 3® entertainment device, a console having controllers for implementing an avatar control system in accordance with one embodiment of the present invention.

[0025] Figure 10 is a schematic of the Cell processor 928 in accordance with one embodiment of the present invention.
DETAILED DESCRIPTION

[0026] An invention is disclosed for allowing real world users to control motions and actions of avatars within a virtual world. According to an embodiment of the present invention, users may interact with a virtual world. As used herein, the term virtual world means a representation of a real or fictitious environment having rules of interaction simulated by means of one or more processors that a real user may perceive via one or more display devices and/or may interact with via one or more user interfaces. As used herein, the term user interface refers to a real device by which a user may send inputs to or receive outputs from the virtual world. The virtual world may be simulated by one or more processor modules. Multiple processor modules may be linked together via a network. The user may interact with the virtual world via a user interface device that can communicate with the processor modules and other user interface devices via a network. Certain aspects of the virtual world may be presented to the user in graphical form on a graphical display such as a computer monitor, television monitor or similar display. Certain other aspects of the virtual world may be presented to the user in audible form on a speaker, which may be associated with the graphical display.

[0027] Within the virtual world, users may be represented by avatars. Each avatar within the virtual world may be uniquely associated with a different user. The name or pseudonym of a user may be displayed next to the avatar so that users may readily identify each other. A particular user's interactions with the virtual world may be represented by one or more corresponding actions of the avatar. Different users may interact with each other via their avatars. An avatar representing a user could have an appearance similar to that of a person, an animal or an object. An avatar in the form of a person may have the same gender as the user or a different gender.
The avatar may be shown on the display so that the user can see the avatar along with other objects in the virtual world.

[0028] Alternatively, the display may show the world from the point of view of the avatar without showing the avatar itself. The user's (or avatar's) perspective on the virtual world may be thought of as being the view of a virtual camera. As used herein, a virtual camera refers to a point of view within the virtual world that may be used for rendering two-dimensional images of a 3D scene within the virtual world. Users may interact with each other through their avatars by means of the chat channels associated with each lobby. Users may enter text for chat with other users via their user interface. The text may then appear over or next to the user's avatar, e.g., in the form of comic-book style dialogue bubbles, sometimes referred to as chat bubbles. Such chat may be facilitated by the use of a canned phrase chat system sometimes referred to as quick chat. With quick chat, a user may select one or more chat phrases from a menu. For further examples, reference may also be made to: (1) United Kingdom patent application no. 0703974.6 entitled "ENTERTAINMENT DEVICE", filed on March 1, 2007; (2) United Kingdom patent application no. 0704225.2 entitled "ENTERTAINMENT DEVICE AND METHOD", filed on March 5, 2007; (3) United Kingdom patent application no. 0704235.1 entitled "ENTERTAINMENT DEVICE AND METHOD", filed on March 5, 2007; (4) United Kingdom patent application no. 0704227.8 entitled "ENTERTAINMENT DEVICE AND METHOD", filed on March 5, 2007; and (5) United Kingdom patent application no. 0704246.8 entitled "ENTERTAINMENT DEVICE AND METHOD", filed on March 5, 2007, each of which is herein incorporated by reference.

[0029] In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention.
It will be apparent, however, to one skilled in the art that the present invention may be practiced without some or all of these specific details. In other instances, well known process steps have not been described in detail in order not to unnecessarily obscure the present invention.

[0030] Figure 1A is a schematic illustrating an avatar control system 100 in accordance with one embodiment of the present invention. A user 102a manipulates a controller 108 that can communicate with a console 110. In some embodiments, the console 110 can include a storage medium capable of saving and retrieving data. Exemplary types of storage mediums include, but are not limited to, magnetic storage, optical storage, flash memory, random access memory and read only memory.
[0031] The console 110 can also include a network interface such as Ethernet ports and wireless network capabilities including the multitude of wireless networking standards found under IEEE 802.11. The network interface of the console 110 can enable the user 102a to connect to remote servers capable of providing real-time interactive game play with other console 110 users, software updates, media services, and access to social networking services.

[0032] The console 110 can also include a central processing unit and graphics processing units. The central processing unit can be used to process instructions retrieved from the storage medium while the graphics processing unit can process and render graphics to be displayed on a screen 106.
[0033] With the console 110 connected to the screen 106, the console 110 can display a virtual space 104 that includes an avatar 102b. In one embodiment, the virtual space 104 can be maintained on remote servers accessed using the network interface of the console 110. In other embodiments, portions of the virtual space 104 can be stored on the console 110 while other portions are stored on remote servers. In some embodiments, the virtual space 104 is a virtual three-dimensional world displayed on the screen 106. The virtual space 104 can be a virtual representation of the real world that includes geography, weather, flora, fauna, currency, and politics. Similar to the real world, the virtual space 104 can include urban, suburban, and rural areas. However, unlike the real world, the virtual space 104 can have variable laws of physics. The previously discussed aspects of the virtual space 104 are intended to be exemplary and not intended to be limiting or restrictive. As a virtual space, the scope of what can be simulated and modeled can encompass anything within the real world and is only limited by the scope of human imagination.
[0034] A user can interact with the virtual space 104 using their avatar 102b. The avatar 102b can be rendered three-dimensionally and configured by the user 102a to be a realistic or fanciful representation of the user 102a in the virtual space 104. The user 102a can have complete control over multiple aspects of the avatar including, but not limited to, hair style, head shape, eye shape, eye color, nose shape, nose size, ear size, lip shape, lip color, clothing, footwear and accessories. In other embodiments, the user 102a can input a photograph of their actual face that can be mapped onto a three-dimensional wire-frame head and body.

[0035] Figure 1B illustrates various methods of transmitting motion and position detection between the controller 108 and console 110, in accordance with one embodiment of the present invention. The user 102a can control the avatar 102b within the virtual space 104 using the controller 108. In one embodiment, the controller 108 can transmit signals wirelessly to the console 110. As multiple controllers can be in use with a single console 110, interference between individual controllers can be avoided by transmitting wireless signals at particular frequencies or by using radio communication protocols such as Bluetooth. The controller 108 can include multiple buttons and joysticks that can be manipulated by the user to achieve a variety of effects such as navigating and selecting items from an on screen menu. Similarly, the buttons and joysticks of the controller 108 can be mapped to control aspects of computer programs executed by the console 110.

[0036] The controller 108 can also include motion sensors capable of detecting translation and rotation in the x-axis, y-axis, and z-axis. In one embodiment, the motion sensors are inertial sensors capable of detecting motion, acceleration and deceleration of the controller 108. In other embodiments, the motion of the controller 108 can be detected in all axes using gyroscopes.
The controller 108 can wirelessly transmit data from the motion sensors to the console 110 for processing, resulting in actions displayed on a screen.

[0037] A camera 112 can also be connected to the console 110 to assist in providing visual detection of the controller 108. In one embodiment, the camera 112 and LEDs positioned on the controller 108 provide visual detection of the controller 108 to the console 110. The LEDs, capable of emitting light within the visible spectrum or outside the visible spectrum, can be integrated into the controller in an array that assists in determining if the controller is off axis to the camera 112. In other embodiments, the LEDs can be modularly attached to the controller 108.

[0038] The camera 112 can be configured to receive the light emitted from the LEDs while the console 110 can calculate movement of the controller 108 based on changes of the LEDs relative to the camera 112. Furthermore, in embodiments where multiple controllers are associated with a single console 110, the LEDs of different controllers can be differentiated from each other using individual blink patterns or frequencies of light.

[0039] In other embodiments, the camera 112 is a depth camera that can help determine a distance between the camera 112 and the controller 108. In some embodiments, the depth camera can have a maximum scan depth. In this situation, depth values are only calculated for objects within the maximum scan depth. As shown in Figure 1B, the camera 112 has a maximum scan depth of Z. As the controller 108 is within the maximum scan depth, the distance between the camera 112 and the controller 108 can be calculated. In still other embodiments, combinations of inertial sensors, LED detection and depth cameras can be used to refine motion and position detection. In one embodiment, the camera 112 can be integrated into the console 110. In other embodiments, the camera 112 can be positioned independent of the console 110.
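One way the LED-based and depth-based detection in [0037] through [0039] could work is sketched below: the distance to the controller is estimated from the apparent separation of two LEDs under a pinhole-camera model, and a depth value is only reported inside the maximum scan depth Z. The focal length, LED spacing and scan depth constants are assumptions for illustration, not values from the disclosure.

```python
FOCAL_LENGTH_PX = 800.0      # assumed camera focal length, in pixels
LED_SPACING_M = 0.05         # assumed physical spacing between two LEDs
MAX_SCAN_DEPTH_M = 3.0       # the depth camera's maximum scan depth "Z"

def distance_from_leds(pixel_separation):
    """Pinhole model: distance = focal_length * real_size / apparent_size."""
    return FOCAL_LENGTH_PX * LED_SPACING_M / pixel_separation

def depth_value(distance):
    """Depth values are only calculated within the maximum scan depth."""
    return distance if distance <= MAX_SCAN_DEPTH_M else None
```

A controller whose LEDs appear 20 pixels apart would, under these assumed constants, be estimated at 2 meters, inside the scan depth; an object beyond Z yields no depth value, matching the behavior described for the depth camera.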
[0040] Figure 2 shows an illustration of a user 102a interacting with an avatar controlling system in accordance with one embodiment of the present invention. User 102a, holding a controller 108, is shown bending over at the waist. As the user 102a bends at the waist, the controller 108 is pitched down from an initial substantially horizontal position to the position illustrated in Figure 2. The pitching down of the controller 108 can be captured by the motion capture system in operation 120. In operation 122, computer analysis can be performed by the console to map the motion capture of the controller 108 to a particular body part of the avatar 102b. Operation 124 renders an animation of the avatar 102b that can be output from the console to the screen 106.

[0041] As shown in Figure 2, the motion capture of the controller 108 can be mapped to the waist of the avatar 102b. In other embodiments, motion capture of controller 108 movements can be mapped to different body parts of the avatar 102b such as legs, arms, hands, and head. In yet other embodiments, motion capture from the controller 108 can be combined with other forms of user input to effectuate changes in the avatar 102b. For example, a microphone and camera system can be used to monitor when the user 102a speaks, resulting in animation of the mouth of the avatar 102b. The user 102a can also use buttons on the controller 108 to change and customize reactions and movements of the avatar 102b.

[0042] Figure 3 shows an illustration of a user 102a interacting with an avatar controlling system in accordance with one embodiment of the present invention. In this embodiment, the user 102a is shown pitching the controller 108 down from an initial substantially horizontal position to the position seen in Figure 3. Similar to Figure 2, the downward pitch of the controller can be detected by the motion capture system in operation 120.
Operation 122 can perform computer analysis of the motion capture and operation 124 can render the motion capture to the avatar 102b.

[0043] Comparing Figure 2 and Figure 3 illustrates that motion capture of relative controller movement can effectuate change in the avatar 102b. In Figure 2, the user 102a bends at the waist and the motion capture system detects changes in the controller 108 position. In Figure 3, a wrist movement from the user 102a can pitch the controller 108 down. While the user 102a performs different physical motions, the pitching down of the controller 108 is the relative motion captured and analyzed by the console. Thus, when mapped to the same avatar body parts, different physical motions of the user 102a that result in similar relative motions of the controller 108 can result in similar animation for the avatar 102b.

[0044] Figure 4A is an exemplary illustration of motion capture of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention. In this embodiment, avatar 102b represents a before motion capture view and avatar 102b' illustrates an after motion capture view. Initially, the user 102a, holding the controller 108, is represented by the avatar 102b. As the user 102a yaws the controller 108 ninety degrees to the user's 102a right, motion capture of the relative motion of the controller 108 is analyzed and applied to the avatar 102b. In this embodiment, motion of the controller 108 is mapped to the entire body of the avatar 102b so the ninety degree yaw of the controller 108 results in avatar 102b'.
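The whole-body mapping of Figure 4A can be sketched as a single function: a relative yaw of the controller rotates the avatar's facing direction by the same angle, regardless of which physical motion produced that yaw. The function and the degree representation are illustrative assumptions.

```python
def apply_yaw(avatar_heading_deg, controller_yaw_delta_deg):
    """Map relative controller yaw onto the avatar's facing direction,
    wrapping the heading into [0, 360)."""
    return (avatar_heading_deg + controller_yaw_delta_deg) % 360

# The user yaws the controller ninety degrees to the right: the avatar's
# entire body rotates by the same ninety degrees (102b -> 102b').
heading = apply_yaw(0, 90)
```

This also illustrates the point made when comparing Figures 2 and 3: only the relative controller motion enters the function, so different physical motions that produce the same yaw delta produce the same avatar animation.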
[0045] Figure 4B is a flow chart illustrating how button selection on the controller can be used to define and supplement an avatar controlling system in accordance with one embodiment of the present invention. As previously discussed, different aspects of an avatar can be controlled by various relative motions of a controller. To enrich the interactivity and realism of an avatar, it can be beneficial to allow users to control facial expressions, hand gestures and other traits, expressions and emotions of their avatar. To accomplish this level of avatar control, supplemental input other than motion capture of relative motion of the controller may be used. Buttons on the controller can be mapped to select, control, and manipulate the possibly endless variations of avatar expressions and emotions. The flow chart in Figure 4B illustrates how button selection on the controller can be used to define and supplement an avatar control system. In operation 500, a motion detection system detects a first controller position. This is followed by operation 502 that detects movements of the controller relative to the first controller position. In operation 504, it is determined if any buttons on the controller have been selected. Computer analysis of the controller button selections and of the relative movements of the controller is completed in operation 506. This is followed by operation 508 where the controller movements and button selections are mapped to the avatar.
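The five operations of the Figure 4B flow chart could be sketched as one loop iteration. The `read_position`, `read_buttons` and `apply_to_avatar` callables are hypothetical stand-ins for the console's motion detection, controller input and rendering subsystems.

```python
def control_cycle(read_position, read_buttons, apply_to_avatar):
    first = read_position()                  # operation 500: first position
    second = read_position()                 # operation 502: relative movement
    movement = tuple(b - a for a, b in zip(first, second))
    buttons = read_buttons()                 # operation 504: button selections
    analysis = {"movement": movement,        # operation 506: computer analysis
                "buttons": buttons}
    apply_to_avatar(analysis)                # operation 508: map to the avatar
    return analysis
```

In the disclosed system this cycle would run continuously, so each iteration's second position becomes the next iteration's first position.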
[0046] Figure 5A is an exemplary illustration of multiple motion captures of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention. The user 102a performs motion A by imparting a downward pitch to the controller 108. The motion capture of motion A is mapped to the waist of the avatar 102b and results in avatar 102b bending over at the waist. Performing motion B, the user 102a yaws the controller to the user's right while pitching the controller up to a substantially horizontal position and rolling the controller to the user's right. In this embodiment, yawing the controller is mapped to the direction the avatar faces. Thus, the yaw to the user's right results in the avatar 102b rotating into the forward facing position seen in avatar 102b'. As previously discussed, in this embodiment pitching the controller 108 is mapped to movement of the waist of the avatar. Thus, when motion B is performed, pitching up of the controller 108 to a substantially horizontal position brings the avatar from the bent over position of avatar 102b to the straightened position of avatar 102b'. In this embodiment, rolling the controller 108 is mapped to leaning the avatar at the waist so that the roll of the controller to the user's right results in the avatar 102b' leaning to the avatar's right.

[0047] Figure 5B is another exemplary illustration of multiple motion captures of relative controller movements effectuating changes in an avatar, in accordance with one embodiment of the present invention. The user performs motion A by imparting a downward pitch to the controller 108. The motion capture of motion A is mapped to the waist of the avatar 102b and results in avatar 102b bending over at the waist. Without bringing the controller 108 back to a substantially horizontal position, the user 102a' performs motion B. With motion B, the user 102a' rolls the controller to the user's left.
In this embodiment, rolling the controller leans the avatar 102b to the avatar's left. Thus, the combined Motion A and Motion B results in the avatar 102b being bent forward at the waist and leaning to the avatar's left. As the controller 108 includes sensors capable of measuring acceleration and deceleration, the animation of the avatars 102b can correlate to actual movement of the controller 108 by the user 102a/a'.

[0048] Figure 6 illustrates mapping controller buttons to control particular body parts of an avatar, in accordance with one embodiment of the present invention. The controller 108 can have a variety of buttons including a digital control pad represented by DU, DR, DD and DL. The controller can also have left shoulder buttons 108a that include LS1 and LS2. Similarly, right shoulder buttons 108b include RS1 and RS2. Analog sticks AL and AR can be included on the controller 108 where the analog sticks are also capable of acting as buttons when depressed. The controller can also have selection buttons illustrated in Figure 6 as a square, triangle, circle and "X". While particular names and symbols have been used to describe the controller 108, the names are exemplary and not intended to be limiting.

[0049] In one embodiment, the various buttons of the controller 108 can be mapped to activate control of particular body parts of an avatar. As shown in avatar mapping 600, depressing AR can place a user in control of the avatar's head. Depressing RS1 or RS2 allows a user to respectively control the right arm or right leg of the avatar. Similarly, LS1 and LS2 are respectively mapped to control the avatar's left arm and left leg. In addition to being able to control various parts of the avatar, a user can initiate and modify pre-rendered avatar animations. The user can initiate an avatar animation with a single or multiple button presses, single or multiple controller movements, or sequences of button presses in conjunction with controller movements.
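The avatar mapping 600 of [0049] might be represented as a simple lookup table. The button-to-body-part assignments follow the text; the dictionary representation and function name are assumptions for illustration.

```python
# Avatar mapping 600: buttons that activate control of body parts.
AVATAR_MAPPING = {
    "AR":  "head",
    "RS1": "right_arm",
    "RS2": "right_leg",
    "LS1": "left_arm",
    "LS2": "left_leg",
}

def controlled_part(button):
    """Return the avatar body part a depressed button activates, if any."""
    return AVATAR_MAPPING.get(button)
```

Buttons outside the mapping (such as the digital control pad) would simply activate no body part and remain free for other uses, such as initiating pre-rendered animations.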
[0050] As shown in dance animation 601, an avatar animation can be considered a sequence of various states. In one embodiment, in state 1 602, the user's avatar is in a rest position or position prior to the initiation of the dance animation. State 2 604 can be considered the state of the avatar just after initiation of the dance animation. In this embodiment, the avatar has leaned to its left. In state 3 606, the final state of the dance animation 601, the user's avatar has leaned to its right and raised its right arm. As the dance animation 601 is intended to convey various states of an avatar, transition frames between the various states are not shown. It should be apparent to one skilled in the art that additional frames may be required to smoothly animate the avatar between the various states. Other embodiments of avatar animations can contain fewer or additional states, as the dance animation 601 is exemplary and not intended to be limiting.

[0051] As previously discussed, the controller 108 can detect acceleration and deceleration of translational and rotational motion in three axes. This allows a user to interactively control directional movement of the animation and the rate of animation of the avatar based on user input such as actual acceleration and deceleration of translational and rotational movement of the controller. Furthermore, the mapping of controller buttons to activate control of particular body parts of an avatar allows a user to decide which body part, or body parts, of the avatar to interactively animate. This can result in unique avatar animations because the user directly controls the animation of particular body parts of the avatar. Avatar animations that are responsive to direct control from the user are different from the pre-mapped, pre-defined and pre-rendered avatar animations found in other forms of avatar animation.
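An animation treated as a sequence of states, with transition frames generated between them and the playback rate driven by controller movement speed, might be sketched as follows. The lean-angle values and linear interpolation are illustrative assumptions, not the disclosed animation system.

```python
# Dance-animation-style state sequence, e.g. a lean angle per state:
# rest (state 1), leaned left (state 2), leaned right (state 3).
STATES = [0.0, -0.5, 0.5]

def frame(t):
    """Interpolate a transition frame at time t in [0, len(STATES) - 1]."""
    i = min(int(t), len(STATES) - 2)
    f = t - i
    return STATES[i] * (1 - f) + STATES[i + 1] * f

def advance(t, controller_speed, dt=1.0):
    """Faster controller movement advances the animation faster."""
    return min(t + controller_speed * dt, len(STATES) - 1)
```

Here `frame` supplies the transition frames that [0050] notes are needed between states, and `advance` captures the idea in [0051] that the measured rate of controller movement can set the rate of the animation.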
[0052] For instance, although some systems may allow control of an animated character in a game, in one embodiment, the disclosed mapping of controller movement, controller input, and controller positioning to particular parts of an avatar enables specific identification of avatar aspects to control, a degree of control and the resulting application of such control to the animated avatar. Still further, the avatar character is not tied to a particular pre-defined game, game scene, environment or game level experience. For instance, an avatar, as controlled by a real-world user, is able to define locations to visit, things to interact with, things to see, and experiences to enjoy. The experiences of the avatar in the virtual environment and the motions, reactions, and body movements are created on demand of the input defined by the real-world user, as dictated by controller activity.

[0053] Figure 7 illustrates controlling various aspects of an avatar during the display of an avatar animation, in accordance with one embodiment of the present invention. In state 700, a button press combination using the controller 108 can be used to initiate state 1 602 of an avatar dance animation on the screen 106. As shown in state 700, the controller 108 is in a position that is substantially horizontal. In state 702, the user depresses and holds LS1 to control the left arm of the avatar. Thus, when the user pitches the controller 108 up, the avatar's left arm is raised into the position seen in state 604a on the screen 106. Moving to state 704, the user continues to hold LS1 while pitching the controller 108 down to a substantially horizontal position. As the controller is pitched down, on the screen 106, the avatar's left arm is lowered into the position seen in state 606a.
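The hold-and-move interaction of Figure 7 could be sketched as follows: while a shoulder button is held (LS1 for the left arm, per the mapping in the text), controller pitch drives the selected limb; otherwise the limb is unaffected. The angle representation and function name are assumptions for illustration.

```python
def update_limb(arm_angle, held_buttons, controller_pitch_delta):
    """Raise or lower the left arm only while LS1 is depressed and held."""
    if "LS1" in held_buttons:
        return arm_angle + controller_pitch_delta
    return arm_angle

# State 702: LS1 held, controller pitched up -> left arm raised.
raised = update_limb(0, {"LS1"}, 45)
# State 704: LS1 still held, controller pitched back down -> arm lowered.
lowered = update_limb(raised, {"LS1"}, -45)
```

Releasing LS1 would freeze the arm in place, since pitch deltas are ignored when the button is not held.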
[0054] In other embodiments, a user can depress and release a button corresponding to a selected portion of an avatar and continue to control that portion of an avatar until the button is pressed a second time. To assist a user in determining which portion of their avatar they are controlling, it is possible to highlight the controlled portion of the avatar on the screen 106. This highlighting can be displayed only to the user controlling the avatar and may not be visible to other users in the virtual space.
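The press-once-to-latch behavior described in [0054] might be sketched as a toggle over the set of currently controlled body parts; a second press of the same button releases control. The set representation is an assumption, and the mapping argument stands in for a button-to-body-part table like the avatar mapping described earlier.

```python
def toggle(controlled, button, mapping):
    """Latch or unlatch control of the body part mapped to this button.

    controlled: set of body parts currently under user control.
    mapping: dict of button name -> body part (hypothetical layout).
    """
    part = mapping.get(button)
    if part is None:
        return controlled            # button controls no body part
    if part in controlled:
        return controlled - {part}   # second press: release control
    return controlled | {part}       # first press: take control

mapping = {"AR": "head", "LS1": "left_arm"}
controlled = toggle(set(), "AR", mapping)        # now controlling the head
controlled = toggle(controlled, "AR", mapping)   # second press releases it
```

The `controlled` set is also exactly what a renderer would need in order to highlight the controlled portions of the avatar for that user only.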
[0055] Figure 8 illustrates controlling various motions of an avatar's head in accordance with one embodiment of the present invention. State 800 illustrates how depressing and holding the right analog stick button, AR, while yawing the controller 108, can turn an avatar's head. Thus, a user implementing the avatar control in state 800 would be able to turn their avatar's head in a side-to-side motion to non-verbally convey "no". Conversely, in state 802, if a user pitches the controller 108 up and down while depressing and holding AR, the user can nod their avatar's head up and down to non-verbally convey "yes". In state 804, rolling the controller 108 left and right while pressing and holding AR can result in the user's avatar's head tilting to the left and right. It should be apparent to one skilled in the art that an avatar's head could make compound motions based on a combination of controller inputs selected from yaw, pitch and roll. Similarly, the compound motions based on yaw, pitch and roll can be mapped to other aspects of avatar animation.

[0056] Figure 9 schematically illustrates the overall system architecture of the Sony®
Playstation 3® entertainment device, a console having controllers for implementing an avatar control system in accordance with one embodiment of the present invention. A system unit 900 is provided, with various peripheral devices connectable to the system unit 900. The system unit 900 comprises: a Cell processor 928; a Rambus® dynamic random access memory (XDRAM) unit 926; a Reality Synthesizer graphics unit 930 with a dedicated video random access memory (VRAM) unit 932; and an I/O bridge 934. The system unit 900 also comprises a Blu-Ray® Disk BD-ROM® optical disk reader 940 for reading from a disk 940a and a removable slot-in hard disk drive (HDD) 936, accessible through the I/O bridge 934. Optionally the system unit 900 also comprises a memory card reader 938 for reading compact flash memory cards, Memory Stick® memory cards and the like, which is similarly accessible through the I/O bridge 934.
[0057] The I/O bridge 934 also connects to six Universal Serial Bus (USB) 2.0 ports 924; a gigabit Ethernet port 922; an IEEE 802.11b/g wireless network (Wi-Fi) port 920; and a Bluetooth® wireless link port 918 capable of supporting up to seven Bluetooth connections.
[0058] In operation the I/O bridge 934 handles all wireless, USB and Ethernet data, including data from one or more game controllers 902. For example, when a user is playing a game, the I/O bridge 934 receives data from the game controller 902 via a Bluetooth link and directs it to the Cell processor 928, which updates the current state of the game accordingly.

[0059] The wireless, USB and Ethernet ports also provide connectivity for other peripheral devices in addition to game controllers 902, such as: a remote control 904; a keyboard 906; a mouse 908; a portable entertainment device 910 such as a Sony Playstation Portable® entertainment device; a video camera such as an EyeToy® video camera 912; and a microphone headset 914. Such peripheral devices may therefore in principle be connected to the system unit 900 wirelessly; for example the portable entertainment device 910 may communicate via a Wi-Fi ad-hoc connection, whilst the microphone headset 914 may communicate via a Bluetooth link.

[0060] The provision of these interfaces means that the Playstation 3 device is also potentially compatible with other peripheral devices such as digital video recorders (DVRs), set-top boxes, digital cameras, portable media players, Voice over IP telephones, mobile telephones, printers and scanners.

[0061] In addition, a legacy memory card reader 916 may be connected to the system unit via a USB port 924, enabling the reading of memory cards 948 of the kind used by the Playstation® or Playstation 2® devices.
[0062] In the present embodiment, the game controller 902 is operable to communicate wirelessly with the system unit 900 via the Bluetooth link. However, the game controller 902 can instead be connected to a USB port, thereby also providing power by which to charge the battery of the game controller 902. In addition to one or more analog joysticks and conventional control buttons, the game controller is sensitive to motion in six degrees of freedom, corresponding to translation and rotation in each axis. Consequently gestures and movements by the user of the game controller may be translated as inputs to a game in addition to or instead of conventional button or joystick commands. Optionally, other wirelessly enabled peripheral devices such as the Playstation Portable device may be used as a controller. In the case of the Playstation Portable device, additional game or control information (for example, control instructions or number of lives) may be provided on the screen of the device. Other alternative or supplementary control devices may also be used, such as a dance mat (not shown), a light gun (not shown), a steering wheel and pedals (not shown) or bespoke controllers, such as a single or several large buttons for a rapid-response quiz game (also not shown).

[0063] The remote control 904 is also operable to communicate wirelessly with the system unit 900 via a Bluetooth link. The remote control 904 comprises controls suitable for the operation of the Blu-Ray Disk BD-ROM reader 940 and for the navigation of disk content.

[0064] The Blu-Ray Disk BD-ROM reader 940 is operable to read CD-ROMs compatible with the Playstation and PlayStation 2 devices, in addition to conventional pre-recorded and recordable CDs, and so-called Super Audio CDs. The reader 940 is also operable to read DVD-ROMs compatible with the Playstation 2 and PlayStation 3 devices, in addition to conventional pre-recorded and recordable DVDs.
The reader 940 is further operable to read BD-ROMs compatible with the Playstation 3 device, as well as conventional pre-recorded and recordable Blu-Ray Disks.

[0065] The system unit 900 is operable to supply audio and video, either generated or decoded by the Playstation 3 device via the Reality Synthesizer graphics unit 930, through audio and video connectors to a display and sound output device 942 such as a monitor or television set having a display 944 and one or more loudspeakers 946. The audio connectors 950 may include conventional analogue and digital outputs whilst the video connectors 952 may variously include component video, S-video, composite video and one or more High Definition Multimedia Interface (HDMI) outputs. Consequently, video output may be in formats such as PAL or NTSC, or in 720p, 1080i or 1080p high definition.

[0066] Audio processing (generation, decoding and so on) is performed by the Cell processor 928. The Playstation 3 device's operating system supports Dolby® 5.1 surround sound, Dolby® Theatre Surround (DTS), and the decoding of 7.1 surround sound from Blu-Ray® disks.
[0067] In the present embodiment, the video camera 912 comprises a single charge coupled device (CCD), an LED indicator, and hardware-based real-time data compression and encoding apparatus so that compressed video data may be transmitted in an appropriate format such as an intra-image based MPEG (Motion Picture Experts Group) standard for decoding by the system unit 900. The camera LED indicator is arranged to illuminate in response to appropriate control data from the system unit 900, for example to signify adverse lighting conditions. Embodiments of the video camera 912 may variously connect to the system unit 900 via a USB, Bluetooth or Wi-Fi communication port. Embodiments of the video camera may include one or more associated microphones and also be capable of transmitting audio data. In embodiments of the video camera, the CCD may have a resolution suitable for high-definition video capture. In use, images captured by the video camera may for example be incorporated within a game or interpreted as game control inputs. [0068] In general, in order for successful data communication to occur with a peripheral device such as a video camera or remote control via one of the communication ports of the system unit 900, an appropriate piece of software such as a device driver should be provided. Device driver technology is well known and will not be described in detail here, except to say that the skilled person will be aware that a device driver or similar software interface may be required in the present embodiment.
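Paragraph [0067] notes that captured camera images may be interpreted as game control inputs. One very simple way this could work is frame differencing; the following is an illustrative assumption, not the console's actual pipeline, and all names and thresholds are hypothetical:

```python
# Sketch: a naive frame-difference motion detector that turns camera
# frames into a boolean "gesture detected" game input.

def motion_amount(frame_a, frame_b):
    """Sum of absolute per-pixel differences between two grayscale frames."""
    return sum(abs(a - b)
               for row_a, row_b in zip(frame_a, frame_b)
               for a, b in zip(row_a, row_b))


def frames_to_input(frame_a, frame_b, trigger=10):
    """Map detected motion above a trigger level to a game input."""
    return motion_amount(frame_a, frame_b) > trigger


still = [[0, 0], [0, 0]]
moved = [[9, 9], [0, 0]]
print(frames_to_input(still, moved))  # True: enough pixels changed
```

In practice the compressed MPEG stream would be decoded first and the detection would be far more robust (background subtraction, region tracking), but the principle of mapping image change to control input is the same.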
[0069] Figure 10 is a schematic of the Cell processor 928 in accordance with one embodiment of the present invention. The Cell processor 928 has an architecture comprising four basic components: external input and output structures comprising a memory controller 1060 and a dual bus interface controller 1070A,B; a main processor referred to as the Power Processing Element 1050; eight co-processors referred to as Synergistic Processing Elements (SPEs) 1010A-H; and a circular data bus connecting the above components referred to as the Element Interconnect Bus 1080. The total floating point performance of the Cell processor is 218 GFLOPS, compared with the 6.2 GFLOPS of the PlayStation 2 device's Emotion Engine. [0070] The Power Processing Element (PPE) 1050 is based upon a two-way simultaneous multithreading Power 970 compliant PowerPC core (PPU) 1055 running with an internal clock of 3.2 GHz. It comprises a 512 kB level 2 (L2) cache and a 32 kB level 1 (L1) cache. The PPE 1050 is capable of eight single precision operations per clock cycle, translating to 25.6 GFLOPS at 3.2 GHz. The primary role of the PPE 1050 is to act as a controller for the Synergistic Processing Elements 1010A-H, which handle most of the computational workload. In operation the PPE 1050 maintains a job queue, scheduling jobs for the
Synergistic Processing Elements 1010A-H and monitoring their progress. Consequently each Synergistic Processing Element 1010A-H runs a kernel whose role is to fetch a job, execute it and synchronize with the PPE 1050. [0071] Each Synergistic Processing Element (SPE) 1010A-H comprises a respective Synergistic Processing Unit (SPU) 1020A-H, and a respective Memory Flow Controller (MFC) 1040A-H comprising in turn a respective Direct Memory Access Controller (DMAC) 1042A-H, a respective Memory Management Unit (MMU) 1044A-H and a bus interface (not shown). Each SPU 1020A-H is a RISC processor clocked at 3.2 GHz and comprising 256 kB local RAM 1030A-H, expandable in principle to 4 GB. Each SPE gives a theoretical 25.6 GFLOPS of single precision performance. An SPU can operate on 4 single precision floating point numbers, 4 32-bit numbers, 8 16-bit integers, or 16 8-bit integers in a single clock cycle. In the same clock cycle it can also perform a memory operation. The SPU 1020A-H does not directly access the system memory XDRAM 926; the 64-bit addresses formed by the SPU 1020A-H are passed to the MFC 1040A-H, which instructs its DMA controller 1042A-H to access memory via the Element Interconnect Bus 1080 and the memory controller 1060. [0072] The Element Interconnect Bus (EIB) 1080 is a logically circular communication bus internal to the Cell processor 928 which connects the above processor elements, namely the PPE 1050, the memory controller 1060, the dual bus interface 1070A,B and the 8 SPEs 1010A-H, totaling 12 participants. Participants can simultaneously read and write to the bus at a rate of 8 bytes per clock cycle. As noted previously, each SPE 1010A-H comprises a DMAC 1042A-H for scheduling longer read or write sequences. The EIB comprises four channels, two each in clockwise and anti-clockwise directions. Consequently for twelve participants, the longest step-wise data-flow between any two participants is six steps in the appropriate direction.
The theoretical peak instantaneous EIB bandwidth for 12 slots is therefore 96 bytes per clock, in the event of full utilization through arbitration between participants. This equates to a theoretical peak bandwidth of 307.2 GB/s (gigabytes per second) at a clock rate of 3.2 GHz.
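The throughput figures quoted in paragraphs [0070] and [0072] follow from simple arithmetic; the sketch below merely restates the document's numbers:

```python
# Cross-check the PPE and EIB figures quoted above.
participants = 12      # PPE + memory controller + dual bus interface (2 ports) + 8 SPEs
bytes_per_clock = 8    # each participant reads/writes 8 bytes per cycle
clock_hz = 3.2e9       # 3.2 GHz internal clock

eib_bytes_per_clock = participants * bytes_per_clock   # 96 bytes per clock
eib_peak_gb_s = eib_bytes_per_clock * clock_hz / 1e9   # 307.2 GB/s

ppe_ops_per_clock = 8                                  # eight single precision ops per cycle
ppe_gflops = ppe_ops_per_clock * clock_hz / 1e9        # 25.6 GFLOPS

print(eib_bytes_per_clock, eib_peak_gb_s, ppe_gflops)  # 96 307.2 25.6
```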
[0073] The memory controller 1060 comprises an XDRAM interface 1062, developed by Rambus Incorporated. The memory controller interfaces with the Rambus XDRAM 926 with a theoretical peak bandwidth of 25.6 GB/s. [0074] The dual bus interface 1070A,B comprises a Rambus FlexIO® system interface 1072A,B. The interface is organized into 12 channels each being 8 bits wide, with five paths being inbound and seven outbound. This provides a theoretical peak bandwidth of 62.4 GB/s (36.4 GB/s outbound, 26 GB/s inbound) between the Cell processor and the I/O Bridge 700 via controller 1070A and the Reality Simulator graphics unit 930 via controller 1070B. [0075] Data sent by the Cell processor 928 to the Reality Simulator graphics unit 930 will typically comprise display lists, being a sequence of commands to draw vertices, apply textures to polygons, specify lighting conditions, and so on.
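Paragraph [0075] describes display lists as replayable sequences of drawing commands. One way to model such a list is shown below; the command names and handler table are hypothetical and only illustrate the idea of a command sequence consumed by a graphics unit:

```python
# Sketch: a display list as an ordered sequence of (command, arguments)
# pairs, replayed against a table of command handlers.

display_list = [
    ("set_lighting", {"ambient": 0.2, "directional": (0.0, -1.0, 0.0)}),
    ("bind_texture", {"texture_id": 7}),
    ("draw_vertices", {"vertices": [(0, 0, 0), (1, 0, 0), (0, 1, 0)]}),
]


def execute(display_list, handlers):
    """Replay each (command, args) pair against the handler table, in order."""
    for command, args in display_list:
        handlers[command](**args)


log = []  # records what the stand-in "graphics unit" was asked to do
handlers = {
    "set_lighting":  lambda **kw: log.append(("light", kw["ambient"])),
    "bind_texture":  lambda **kw: log.append(("tex", kw["texture_id"])),
    "draw_vertices": lambda **kw: log.append(("draw", len(kw["vertices"]))),
}
execute(display_list, handlers)
print(log)  # [('light', 0.2), ('tex', 7), ('draw', 3)]
```

The essential property, as in the passage above, is that the producer (the Cell processor) emits commands while the consumer (the graphics unit) interprets them in sequence.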
[0076] Embodiments may include capturing depth data to better identify the real world user and to direct activity of an avatar or scene. The tracked object can be something the person is holding or can be the person's hand itself. In this description, the terms "depth camera" and "three-dimensional camera" refer to any camera that is capable of obtaining distance or depth information as well as two-dimensional pixel information. For example, a depth camera can utilize controlled infrared lighting to obtain distance information. Another exemplary depth camera can be a stereo camera pair, which triangulates distance information using two standard cameras. Similarly, the term "depth sensing device" refers to any type of device that is capable of obtaining distance information as well as two-dimensional pixel information. [0077] Recent advances in three-dimensional imagery have opened the door for increased possibilities in real-time interactive computer animation. In particular, new "depth cameras" provide the ability to capture and map the third dimension in addition to normal two-dimensional video imagery. With the new depth data, embodiments of the present invention allow the placement of computer-generated objects in various positions within a video scene in real-time, including behind other objects. [0078] Moreover, embodiments of the present invention provide real-time interactive gaming experiences for users. For example, users can interact with various computer-generated objects in real-time. Furthermore, video scenes can be altered in real-time to enhance the user's game experience. For example, computer generated costumes can be inserted over the user's clothing, and computer generated light sources can be utilized to project virtual shadows within a video scene. Hence, using the embodiments of the present invention and a depth camera, users can experience an interactive game environment within their own living room.
Similar to normal cameras, a depth camera captures two-dimensional data for a plurality of pixels that comprise the video image. These values are color values for the pixels, generally red, green, and blue (RGB) values for each pixel. In this manner, objects captured by the camera appear as two-dimensional objects on a monitor. [0079] Embodiments of the present invention also contemplate distributed image processing configurations. For example, the invention is not limited to the captured image and display image processing taking place in one or even two locations, such as in the CPU or in the CPU and one other element. For example, the input image processing can just as readily take place in an associated CPU, processor or device that can perform processing; essentially all of the image processing can be distributed throughout the interconnected system. Thus, the present invention is not limited to any specific image processing hardware circuitry and/or software. The embodiments described herein are also not limited to any specific combination of general hardware circuitry and/or software, nor to any particular source for the instructions executed by processing components.
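Paragraphs [0076]-[0078] describe placing computer-generated objects behind real objects using per-pixel depth. The core of such compositing can be sketched as a per-pixel depth test; this is purely illustrative, with hypothetical data layouts (nested lists of RGB tuples and depth values):

```python
# Sketch: composite a virtual object into a captured scene, drawing the
# virtual pixel only where it is nearer to the camera than the real scene.

def composite(scene_rgb, scene_depth, obj_rgb, obj_depth):
    """Per-pixel depth test: the nearer surface wins at each pixel.

    obj_rgb entries may be None where the virtual object has no coverage.
    """
    out = []
    for y, row in enumerate(scene_rgb):
        out_row = []
        for x, pixel in enumerate(row):
            if obj_rgb[y][x] is not None and obj_depth[y][x] < scene_depth[y][x]:
                out_row.append(obj_rgb[y][x])   # virtual object is in front
            else:
                out_row.append(pixel)           # real scene occludes the object
        out.append(out_row)
    return out


scene = [[(10, 10, 10), (10, 10, 10)]]
depth = [[2.0, 2.0]]
obj   = [[(255, 0, 0), (255, 0, 0)]]
odep  = [[1.0, 3.0]]   # in front of the scene at x=0, behind it at x=1
print(composite(scene, depth, obj, odep))
# [[(255, 0, 0), (10, 10, 10)]]
```

This is exactly the property a plain RGB camera cannot provide: without the depth channel there is no way to decide, pixel by pixel, whether the inserted object should appear in front of or behind the captured scene.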
[0080] With the above embodiments in mind, it should be understood that the invention may employ various computer-implemented operations involving data stored in computer systems. These operations include operations requiring physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. Further, the manipulations performed are often referred to in terms such as producing, identifying, determining, or comparing. [0081] The above described invention may be practiced with other computer system configurations including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. [0082] The invention can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can be thereafter read by a computer system, including an electromagnetic wave carrier. Examples of the computer readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes, and other optical and non-optical data storage devices. The computer readable medium can also be distributed over a network coupled computer system so that the computer readable code is stored and executed in a distributed fashion. [0083] Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications may be practiced within the scope of the appended claims.
Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims. What is claimed is:

Claims

1. A method for controlling an avatar in a virtual space, the virtual space accessed through a computer network using a console executing a computer program, comprising: capturing activity of a console controller; processing the captured activity of the console controller to identify input parameters; and mapping selected ones of the input parameters to portions of the avatar, the avatar being a virtual space representation of a user; wherein the capturing, processing and mapping are continuously performed to define a correlation between activity of the console controller and the avatar that is the virtual space representation of the user.
2. The method as described in claim 1, wherein activity of the console controller includes selecting of controller buttons.
3. The method as described in claim 1, wherein activity of the console controller includes sensing rotation and translation of the console controller in three-dimensional space.
4. The method as described in claim 3, wherein the input parameters include a rate of change in the rotation and translation velocity of the console controller.
5. The method as described in claim 1, wherein portions of the avatar include specific animated body parts.
6. The method as described in claim 4, wherein mapping selected ones of the input parameters to a portion of the avatar is done in proportion to acceleration and deceleration in the rotation and translation velocity of the console controller.
7. The method as described in claim 1, wherein the input parameters include selecting controller buttons, the controller buttons being mapped to initiate user control of different portions of the avatar animation when selected.
8. The method as described in claim 7, wherein the avatar animation is responsive and proportional to user controlled acceleration and deceleration of translational and rotational motion of the controller in three axes.
9. A method for interactively controlling an avatar through a computer network using a console, comprising: providing a console controller; determining a first position of the console controller; capturing input to the console controller, the input including detecting movement of the console controller to a second position; processing input to the console controller and relative motion of the console controller between the first position and the second position; and mapping the relative motion between the first position and the second position of the console controller to animated body portions of the avatar, wherein the capturing, processing and mapping are continuously performed to define a correlation between relative motion of the console controller and the avatar.
10. The method as described in claim 9, wherein input to the console includes selecting console buttons.
11. The method as described in claim 10, wherein input to the console changes the mapping of relative motion of the console controller to different portions of the avatar.
12. The method as described in claim 9, wherein movement of the console controller can be detected in three axes, including translational and rotational movement in the three axes.
13. The method as described in claim 9, wherein acceleration of the console controller is included as part of capturing input and detecting movement of the console controller.
14. The method as described in claim 9, wherein portions of the avatar include specific animated body parts.
15. The method as described in claim 9, wherein the avatar is a virtual representation of a user in a virtual environment, the virtual environment for the avatar accessed through the computer network and rendered by the console.
16. A computer implemented method for interactively controlling an avatar within a virtual environment, the avatar and virtual environment generated by a computer program that is executed on at least one computer in a computer network, comprising: providing a controller interfaced with the computer program; mapping controller input to allow a user to control a selected portion of the avatar; capturing controller input and controller movement between a first position and a second position; and processing the captured controller input and controller movement and applying the captured movement to interactively animate the selected portion of the avatar within the virtual environment, wherein the capturing and processing of controller input and controller movement is continuously performed to define a correlation between controller movement and avatar animation.
17. The method as described in claim 16, wherein controller movement can be detected in three axes, including translational and rotational movement in all three axes.
18. The method as described in claim 16, wherein capturing controller movement includes capturing acceleration and deceleration of the controller in rotational and translational movements of the controller in six axes.
19. The method as described in claim 16, wherein the controller input includes selecting controller buttons, the controller buttons being mapped to initiate user control of different portions of the avatar animation when selected.
20. The method as described in claim 19, wherein animation of the selected portions of the avatar is responsive and proportional to user controlled acceleration and deceleration of translational and rotational motion of the controller in three axes.
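The capture, process, and map steps recited in claim 1 can be sketched as a simple loop body; all names and data structures below are hypothetical illustrations, not the claimed implementation:

```python
# Sketch of the claim-1 cycle: capture controller activity, identify input
# parameters, and map selected parameters onto portions of the avatar.

def capture_activity(controller_state):
    """Capture the raw button and motion state for one cycle (stubbed)."""
    return controller_state.copy()


def process_activity(activity):
    """Identify input parameters from the captured activity."""
    return {"buttons": activity["buttons"], "dx": activity["motion"][0]}


def map_to_avatar(params, avatar):
    """Map selected input parameters to portions of the avatar."""
    if "wave" in params["buttons"]:
        avatar["right_arm"] = "waving"     # a button selects the body portion
    avatar["torso_lean"] = params["dx"]    # motion drives the animation amount
    return avatar


controller_state = {"buttons": {"wave"}, "motion": (0.3, 0.0, 0.0)}
avatar = {"right_arm": "rest", "torso_lean": 0.0}
avatar = map_to_avatar(process_activity(capture_activity(controller_state)), avatar)
print(avatar)  # {'right_arm': 'waving', 'torso_lean': 0.3}
```

In the claimed method this cycle runs continuously, so the correlation between controller activity and the avatar is maintained frame after frame rather than computed once.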
EP08726220A 2007-03-01 2008-02-27 Interactive user controlled avatar animations Withdrawn EP2118840A4 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US89239707P 2007-03-01 2007-03-01
GBGB0703974.6A GB0703974D0 (en) 2007-03-01 2007-03-01 Entertainment device
GB0704225A GB2447094B (en) 2007-03-01 2007-03-05 Entertainment device and method
GB0704246A GB2447096B (en) 2007-03-01 2007-03-05 Entertainment device and method
GB0704235A GB2447095B (en) 2007-03-01 2007-03-05 Entertainment device and method
GB0704227A GB2447020A (en) 2007-03-01 2007-03-05 Transmitting game data from an entertainment device and rendering that data in a virtual environment of a second entertainment device
US11/789,202 US20080215974A1 (en) 2007-03-01 2007-04-23 Interactive user controlled avatar animations
PCT/US2008/002644 WO2008106197A1 (en) 2007-03-01 2008-02-27 Interactive user controlled avatar animations

Publications (2)

Publication Number Publication Date
EP2118840A1 true EP2118840A1 (en) 2009-11-18
EP2118840A4 EP2118840A4 (en) 2010-11-10

Family

ID=39738577

Family Applications (4)

Application Number Title Priority Date Filing Date
EP08730776A Ceased EP2132650A4 (en) 2007-03-01 2008-02-26 System and method for communicating with a virtual world
EP08726219A Withdrawn EP2118757A4 (en) 2007-03-01 2008-02-27 Virtual world avatar control, interactivity and communication interactive messaging
EP08726207A Ceased EP2126708A4 (en) 2007-03-01 2008-02-27 Virtual world user opinion&response monitoring
EP08726220A Withdrawn EP2118840A4 (en) 2007-03-01 2008-02-27 Interactive user controlled avatar animations


Country Status (3)

Country Link
EP (4) EP2132650A4 (en)
JP (5) JP2010533006A (en)
WO (1) WO2008108965A1 (en)

Family Cites Families (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2141144A1 (en) * 1994-03-31 1995-10-01 Joseph Desimone Electronic game utilizing bio-signals
US5823879A (en) * 1996-01-19 1998-10-20 Sheldon F. Goldberg Network gaming system
JP3274603B2 (en) * 1996-04-18 2002-04-15 エヌイーシーソフト株式会社 Voice aggregation system and voice aggregation method
JP3975511B2 (en) * 1997-07-25 2007-09-12 富士通株式会社 Personal communication distributed control system
JP3757584B2 (en) * 1997-11-20 2006-03-22 株式会社富士通ゼネラル Advertising effect confirmation system
JP3276068B2 (en) * 1997-11-28 2002-04-22 インターナショナル・ビジネス・マシーンズ・コーポレーション Object selection method and system
JP2000187435A (en) * 1998-12-24 2000-07-04 Sony Corp Information processing device, portable apparatus, electronic pet device, recording medium with information processing procedure recorded thereon, and information processing method
JP2000311251A (en) * 1999-02-26 2000-11-07 Toshiba Corp Device and method for generating animation and storage medium
JP4034002B2 (en) * 1999-04-22 2008-01-16 三菱電機株式会社 Distributed virtual space information management transmission method
AU5012600A (en) * 1999-05-14 2000-12-05 Graphic Gems Method and apparatus for a multi-owner, three-dimensional virtual world
WO2000070557A2 (en) * 1999-05-14 2000-11-23 Graphic Gems Method and apparatus for registering lots in a shared virtual world
JP2000325653A (en) * 1999-05-19 2000-11-28 Enix Corp Portable videogame device and storage medium with program stored therein
JP2001154966A (en) * 1999-11-29 2001-06-08 Sony Corp System and method for supporting virtual conversation being participation possible by users in shared virtual space constructed and provided on computer network and medium storing program
JP2001153663A (en) * 1999-11-29 2001-06-08 Canon Inc Discrimination device for moving direction of object, and photographic device, navigation system, suspension system, game system and remote controller system provide with the device
JP3623415B2 (en) * 1999-12-02 2005-02-23 日本電信電話株式会社 Avatar display device, avatar display method and storage medium in virtual space communication system
JP2001236290A (en) * 2000-02-22 2001-08-31 Toshinao Komuro Communication system using avatar
KR100366384B1 (en) * 2000-02-26 2002-12-31 (주) 고미드 Information search system based on communication of users
JP2001325501A (en) * 2000-03-10 2001-11-22 Heart Gift:Kk On-line gift method
JP3458090B2 (en) * 2000-03-15 2003-10-20 コナミ株式会社 GAME SYSTEM HAVING MESSAGE EXCHANGE FUNCTION, GAME DEVICE USED FOR THE GAME SYSTEM, MESSAGE EXCHANGE SYSTEM, AND COMPUTER-READABLE STORAGE MEDIUM
JP2001321568A (en) * 2000-05-18 2001-11-20 Casio Comput Co Ltd Device and method of game and recording medium
TWI221574B (en) * 2000-09-13 2004-10-01 Agi Inc Sentiment sensing method, perception generation method and device thereof and software
JP2002136762A (en) * 2000-11-02 2002-05-14 Taito Corp Adventure game using latent video
JP3641423B2 (en) * 2000-11-17 2005-04-20 Necインフロンティア株式会社 Advertisement information system
AU2002219857A1 (en) * 2000-11-27 2002-06-03 Butterfly.Net, Inc. System and method for synthesizing environments to facilitate distributed, context-sensitive, multi-user interactive applications
EP1216733A3 (en) * 2000-12-20 2004-09-08 Aruze Co., Ltd. Server providing competitive game service, program storage medium for use in the server, and method of providing competitive game service using the server
JP2002197376A (en) * 2000-12-27 2002-07-12 Fujitsu Ltd Method and device for providing virtual world customerized according to user
JP4613295B2 (en) * 2001-02-16 2011-01-12 株式会社アートディンク Virtual reality playback device
US7667705B2 (en) * 2001-05-15 2010-02-23 Nintendo Of America Inc. System and method for controlling animation by tagging objects within a game environment
JP4068542B2 (en) * 2001-05-18 2008-03-26 株式会社ソニー・コンピュータエンタテインメント Entertainment system, communication program, computer-readable recording medium storing communication program, and communication method
JP3425562B2 (en) * 2001-07-12 2003-07-14 コナミ株式会社 Character operation program, character operation method, and video game apparatus
JP3732168B2 (en) * 2001-12-18 2006-01-05 株式会社ソニー・コンピュータエンタテインメント Display device, display system and display method for objects in virtual world, and method for setting land price and advertising fee in virtual world where they can be used
JP2003210834A (en) * 2002-01-17 2003-07-29 Namco Ltd Control information, information storing medium, and game device
JP2003259331A (en) * 2002-03-06 2003-09-12 Nippon Telegraph & Telephone West Corp Three-dimensional contents distribution apparatus, three-dimensional contents distribution program, program recording medium, and three-dimensional contents distribution method
JP2003324522A (en) * 2002-05-02 2003-11-14 Nippon Telegr & Teleph Corp <Ntt> Ip/pstn integrated control apparatus, communication method, program, and recording medium
JP2004021606A (en) * 2002-06-17 2004-01-22 Nec Corp Internet service providing system using virtual space providing server
JP2004046311A (en) * 2002-07-09 2004-02-12 Nippon Telegr & Teleph Corp <Ntt> Method and system for gesture input in three-dimensional virtual space
US20040029625A1 (en) * 2002-08-07 2004-02-12 Ed Annunziata Group behavioral modification using external stimuli
JP3952396B2 (en) * 2002-11-20 2007-08-01 Nintendo Co Ltd Game device and information processing device
JP2004237022A (en) * 2002-12-11 2004-08-26 Sony Corp Information processing device and method, program and recording medium
JP3961419B2 (en) * 2002-12-27 2007-08-22 Bandai Namco Games Inc Game device, game control program, and recording medium containing the program
GB0306875D0 (en) * 2003-03-25 2003-04-30 British Telecomm Apparatus and method for generating behavior in an object
JP4442117B2 (en) * 2003-05-27 2010-03-31 Sony Corp Information registration method, information registration apparatus, and information registration program
US7725419B2 (en) * 2003-09-05 2010-05-25 Samsung Electronics Co., Ltd Proactive user interface including emotional agent
JP2005100053A (en) * 2003-09-24 2005-04-14 Nomura Research Institute Ltd Method, program and device for sending and receiving avatar information
JP2005216004A (en) * 2004-01-29 2005-08-11 Tama Tlo Kk Program and communication method
JP4559092B2 (en) * 2004-01-30 2010-10-06 NTT Docomo Inc Mobile communication terminal and program
US20060013254A1 (en) * 2004-06-07 2006-01-19 Oded Shmueli System and method for routing communication through various communication channel types
JP2006034436A (en) * 2004-07-23 2006-02-09 Smk Corp Virtual game system using exercise apparatus
EP1814292A1 (en) * 2004-10-08 2007-08-01 Sonus Networks, Inc. Call handoff between subscriber's multiple devices associated with multiple networks
US20090005167A1 (en) * 2004-11-29 2009-01-01 Juha Arrasvuori Mobile Gaming with External Devices in Single and Multiplayer Games
JP2006185252A (en) * 2004-12-28 2006-07-13 Univ Of Electro-Communications Interface device
JP2006186893A (en) * 2004-12-28 2006-07-13 Matsushita Electric Ind Co Ltd Voice conversation control apparatus
JP2006211005A (en) * 2005-01-25 2006-08-10 Takashi Uchiyama Television telephone advertising system
JPWO2006080080A1 (en) * 2005-01-28 2008-06-19 Fujitsu Ltd Telephone management system, telephone management method, and telephone management program
JP4322833B2 (en) * 2005-03-16 2009-09-02 Toshiba Corp Wireless communication system
US20060252538A1 (en) * 2005-05-05 2006-11-09 Electronic Arts Inc. Analog stick input replacement for lengthy button push sequences and intuitive input for effecting character actions
JP2006004421A (en) * 2005-06-03 2006-01-05 Sony Corp Data processor
US20070002835A1 (en) * 2005-07-01 2007-01-04 Microsoft Corporation Edge-based communication

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6219033B1 (en) * 1993-07-16 2001-04-17 Immersion Corporation Method and apparatus for controlling force feedback interface systems utilizing a host computer
US5803810A (en) * 1995-03-23 1998-09-08 Perception Systems, Inc. Velocity-based command recognition technology
EP0744199A2 (en) * 1995-05-26 1996-11-27 Kabushiki Kaisha Bandai Game apparatus
WO1999034276A2 (en) * 1997-12-23 1999-07-08 Koninklijke Philips Electronics N.V. System and method for constructing three-dimensional images using camera-based gesture inputs
WO2000063874A1 (en) * 1999-04-20 2000-10-26 John Warren Stringer Human gestural input device with motion and pressure
WO2002069609A2 (en) * 2001-02-27 2002-09-06 Anthrotronix, Inc. Robotic apparatus and wireless communication system
WO2004042545A1 (en) * 2002-11-07 2004-05-21 Personics A/S Adaptive motion detection interface and motion detector
WO2006121896A2 (en) * 2005-05-05 2006-11-16 Sony Computer Entertainment Inc. Microphone array based selective sound source listening and video game control

Non-Patent Citations (1)

Title
See also references of WO2008106197A1 *

Cited By (2)

Publication number Priority date Publication date Assignee Title
US10489795B2 (en) 2007-04-23 2019-11-26 The Nielsen Company (Us), Llc Determining relative effectiveness of media content items
US11222344B2 (en) 2007-04-23 2022-01-11 The Nielsen Company (Us), Llc Determining relative effectiveness of media content items

Also Published As

Publication number Publication date
EP2118840A4 (en) 2010-11-10
JP2010535363A (en) 2010-11-18
EP2126708A1 (en) 2009-12-02
EP2118757A4 (en) 2010-11-03
EP2118757A1 (en) 2009-11-18
JP2014149836A (en) 2014-08-21
JP2010533006A (en) 2010-10-21
WO2008108965A1 (en) 2008-09-12
JP2010535364A (en) 2010-11-18
EP2132650A2 (en) 2009-12-16
JP5756198B2 (en) 2015-07-29
EP2126708A4 (en) 2010-11-17
JP2010535362A (en) 2010-11-18
EP2132650A4 (en) 2010-10-27

Similar Documents

Publication Publication Date Title
JP5756198B2 (en) Interactive user-controlled avatar animation
US20080215974A1 (en) Interactive user controlled avatar animations
US11317076B2 (en) Peripheral device having sensors for capturing changes in spatial position
US10195528B2 (en) Systems for using three-dimensional object as controller in an interactive game
US8601379B2 (en) Methods for interactive communications with real time effects and avatar environment interaction
WO2008106197A1 (en) Interactive user controlled avatar animations
EP2303422B1 (en) Determination of controller three-dimensional location using image analysis and ultrasonic communication
US8221229B2 (en) Spherical ended controller with configurable modes
US20100060662A1 (en) Visual identifiers for virtual world avatars
US8393964B2 (en) Base station for position location
EP2356545B1 (en) Spherical ended controller with configurable modes
WO2010020739A1 (en) Entertainment device and method of interaction
JP2012510856A (en) 3D control by multi-positional controller

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20090915

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

DAX Request for extension of the european patent (deleted)
RIN1 Information on inventor provided before grant (corrected)

Inventor name: WAUGAMAN, SCOTT

Inventor name: HARRISON, PHIL

Inventor name: ZALEWSKI, GRAY, M.

A4 Supplementary search report drawn up and despatched

Effective date: 20101008

RIC1 Information provided on ipc code assigned before grant

Ipc: A63F 13/00 20060101AFI20101004BHEP

Ipc: A63F 13/10 20060101ALI20101004BHEP

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SONY COMPUTER ENTERTAINMENT AMERICA LLC

Owner name: SONY COMPUTER ENTERTAINMENT EUROPE LIMITED

17Q First examination report despatched

Effective date: 20140516

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SONY COMPUTER ENTERTAINMENT AMERICA LLC

Owner name: SONY COMPUTER ENTERTAINMENT EUROPE LIMITED

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: SONY INTERACTIVE ENTERTAINMENT EUROPE LIMITED

Owner name: SONY COMPUTER ENTERTAINMENT AMERICA LLC

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20190403