US20100105479A1 - Determining orientation in an external reference frame - Google Patents

Determining orientation in an external reference frame

Info

Publication number
US20100105479A1
US20100105479A1
Authority
US
United States
Prior art keywords
controller
acceleration
orientation
frame
external
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/256,747
Inventor
Andrew Wilson
Steven Michael Beeman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to US12/256,747
Assigned to MICROSOFT CORPORATION. Assignment of assignors interest (see document for details). Assignors: WILSON, ANDREW; BEEMAN, STEVEN MICHAEL
Priority to US12/490,331 (published as US8282487B2)
Publication of US20100105479A1
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC. Assignment of assignors interest (see document for details). Assignor: MICROSOFT CORPORATION
Status: Abandoned

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F 13/10
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/428 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/45 Controlling the progress of the video game
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G06F 3/0325 Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/105 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals using inertial sensors, e.g. accelerometers, gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1087 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/6045 Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands


Abstract

Orientation in an external reference frame is determined. An external-frame acceleration for a device is determined, the external-frame acceleration being in an external reference frame relative to the device. An internal-frame acceleration for the device is determined, the internal-frame acceleration being in an internal reference frame relative to the device. An orientation of the device is determined based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration.

Description

    BACKGROUND
  • A gyroscope can use angular momentum to assess a relative orientation of a device in a frame of reference that is internal to that device. However, even the most accurate gyroscopes available may accumulate small orientation errors over time.
  • SUMMARY
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
  • Determining orientation in an external reference frame is disclosed herein. An external-frame acceleration for a device is determined, the external-frame acceleration being in an external reference frame relative to the device. An internal-frame acceleration for the device is also determined, the internal-frame acceleration being in an internal reference frame relative to the device. An orientation of the device is determined based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A schematically shows an orientation-determining computing system in accordance with an embodiment of the present disclosure.
  • FIG. 1B schematically shows a position-determining computing system in accordance with another embodiment of the present disclosure.
  • FIG. 2 shows an exemplary configuration of the orientation-determining computing system of FIG. 1A.
  • FIG. 3 shows a comparison of an external-frame acceleration vector and an internal-frame acceleration vector corresponding to the controller orientation of FIG. 2.
  • FIG. 4 shows a process flow of an example method of tracking an orientation of a game controller.
  • DETAILED DESCRIPTION
  • FIG. 1A shows an orientation-determining computing system 10 including a wand 12, a wand monitor 14 and an orientation inferring subsystem 16. Orientation inferring subsystem 16 is configured to determine an orientation of wand 12 in a frame of reference that is external to the wand 12. In particular, the orientation inferring subsystem 16 may infer a coarse orientation of the wand 12 in the external reference frame by comparing acceleration information of the wand 12 in the external reference frame with acceleration information of the wand 12 in an internal reference frame.
  • The acceleration information in the external reference frame may be assessed by wand monitor 14. The wand monitor 14 may be configured to observe the wand 12 as the wand 12 moves relative to the wand monitor 14. Such observations may be translated into an external-frame acceleration for the wand. Any suitable technique may be used by the wand monitor 14 for observing the wand 12. As a nonlimiting example, the wand monitor 14 may be configured to visually observe the wand 12 with stereo cameras. In some embodiments, the wand 12 may include a target 18 that facilitates observation by the wand monitor 14.
  • The acceleration information in the internal reference frame may be assessed by the wand 12. The wand 12 may be configured to sense wand accelerations and report such sensed accelerations to orientation inferring subsystem 16. In some embodiments, the wand may include an acceleration-measuring subsystem 20 for measuring wand accelerations in a frame of reference that is internal to the wand 12.
  • In addition to determining a coarse orientation of the wand 12 by comparing wand accelerations in internal and external reference frames, the orientation inferring subsystem 16 may update the coarse orientation of the wand 12 based on angular motion information observed by the wand 12 itself. As such, the wand 12 may include an angular-motion measuring subsystem 22 for measuring angular motion of the wand 12 in a frame of reference that is internal to the wand. Even when such an angular-motion measuring subsystem 22 is included, the coarse orientation inferred using internal and external-frame accelerations may be used to limit errors that may accumulate if only the angular-motion measuring subsystem 22 is used.
  • The wand may be configured to serve a variety of different functions in different embodiments without departing from the scope of this disclosure. As a nonlimiting example, in some embodiments, computing system 10 may be a game system in which wand 12 is a game controller device for controlling various game functions. It is to be understood that the orientation inferring methods described herein may additionally and/or alternatively be applied to an orientation-determining computing system other than a game system, and the wand need not be a game controller in all embodiments.
  • Furthermore, it is to be understood that the arrangement shown in FIG. 1A is exemplary, and other arrangements are within the scope of this disclosure. As a nonlimiting example, FIG. 1B shows a position-determining computing system 10′ in accordance with another embodiment of the present disclosure. Position-determining computing system 10′ includes a wand 12′, a target monitor 14′ and a position inferring subsystem 16′. Position inferring subsystem 16′ is configured to determine a position of wand 12′ in a frame of reference that is external to the wand 12′. In particular, the position inferring subsystem 16′ may infer a coarse position of the wand 12′ in the external reference frame by comparing orientation information of the wand 12′ in the external reference frame with acceleration information of the wand 12′ in an internal reference frame.
  • In some embodiments, target 18′ may include one or more LEDs (e.g., infrared LEDs) positioned in a fixed location, such as near a television or any other suitable location. In such embodiments, the wand 12′ may include a target monitor 14′ configured to view the target 18′ and deduce an orientation of the wand based upon a relative position of the target 18′ within the target monitor's field of view. Such information may be used in cooperation with acceleration information measured by an acceleration-measuring subsystem 20′ and/or angular motion information measured by an angular-motion measuring subsystem 22′ to infer a coarse position of the wand as discussed below with reference to inferring coarse orientation.
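  • By way of illustration only, the following sketch shows one way a wand-mounted camera's view of a fixed target could be turned into pointing angles. The pinhole-camera model, the field-of-view parameters, and the function name are assumptions introduced for this example and are not taken from the disclosure.

```python
import math

def pointing_angles(px, py, img_w, img_h, fov_h_deg, fov_v_deg):
    """Estimate where the wand is pointing from where a fixed target (e.g. an
    infrared LED near the television) appears in the wand-mounted camera image."""
    # Focal lengths in pixel units, derived from the camera's field of view.
    fx = (img_w / 2.0) / math.tan(math.radians(fov_h_deg) / 2.0)
    fy = (img_h / 2.0) / math.tan(math.radians(fov_v_deg) / 2.0)
    # Angle of the ray toward the target relative to the camera's optical axis.
    # With a stationary target, these angles mirror the wand's own yaw and pitch.
    yaw = math.atan2(px - img_w / 2.0, fx)
    pitch = math.atan2(py - img_h / 2.0, fy)
    return yaw, pitch
```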
  • In yet other embodiments, a wand may include both a target and a target monitor, and/or both a target and a target monitor may be positioned at one or more locations external to the wand. In other words, the arrangements shown in FIGS. 1A and 1B may be at least partially combined, thus enabling direct deduction of both wand position and wand orientation, which may optionally be confirmed/verified with inferred position and inferred orientation, as described herein. Further, it should be understood that the relative positioning of targets, target monitors, wand monitors, and other components described herein may be varied from the specific examples provided herein without departing from the scope of the present disclosure.
  • FIG. 2 shows an example game system 30 including a controller 32, a controller monitor 34 including stereo cameras 36, and a gaming console 38 including an orientation inferring subsystem 40.
  • In such a game system 30, orientation inferring subsystem 40 is configured to infer a coarse orientation of controller 32 in an external reference frame relative to controller 32. In particular, the coarse orientation of the controller 32 in a television's, or other display's, reference frame may be inferred. The orientation inferring subsystem 40 infers the coarse orientation of the controller 32 by comparing acceleration information from an external reference frame relative to the controller 32 with acceleration information from an internal reference frame relative to the controller 32.
  • In the illustrated embodiment, orientation inferring subsystem 40 is configured to determine an external-frame acceleration of controller 32 using time-elapsed position information received from stereo cameras 36. While shown placed near a television, it should be understood that stereo cameras 36, or another wand/target monitor, may be placed in numerous different positions without departing from the scope of this disclosure.
  • The stereo cameras may observe a target 41 in the form of an infrared light on controller 32. The individual position of the target 41 in each camera's field of view may be cooperatively used to determine a three-dimensional position of the target 41, and thus the controller 32, at various times. Visually-observed initial position information and subsequent position information may be used to calculate the external-frame acceleration of the controller 32 using any suitable technique.
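  • As a rough sketch of that triangulation step (assuming an idealized, rectified stereo pair with a known baseline and focal length, none of which are specified in the disclosure), the target's horizontal disparity between the two views yields depth, from which a three-dimensional position follows:

```python
import numpy as np

def triangulate_target(xl, yl, xr, yr, baseline_m, focal_px, cx, cy):
    """3-D target position from its pixel coordinates in a rectified stereo pair.

    xl, yl / xr, yr -- target pixel coordinates in the left/right camera images
    baseline_m      -- distance between the two camera centers (meters)
    focal_px        -- focal length expressed in pixels
    cx, cy          -- principal point (image center) in pixels
    """
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("non-positive disparity: target not resolvable in front of the cameras")
    z = focal_px * baseline_m / disparity      # depth from disparity
    x = (xl - cx) * z / focal_px               # back-project through the left camera
    y = (yl - cy) * z / focal_px
    return np.array([x, y, z])
```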
  • The following technique is a nonlimiting example for using initial position information and subsequent position information to determine an external-frame acceleration of the controller. Taking X0 to be a current position of controller 32 as observed by controller monitor 34 at a time t0, and X−1 to be a previous position of controller 32 as observed by controller monitor 34 at a previous time t−1, an expected position X0* for controller 32 at the current time t0 can be calculated according to the following equation,

  • X0* = X−1 + V(t0 − t−1).
  • Here, the velocity V is calculated from prior position information as follows,
  • V = (X−1 − X−2) / (t−1 − t−2),
  • where X−2 is a more previous position of the controller as observed by the controller monitor at a more previous time t−2.
  • If it is determined that the expected position X0* is not equal to the observed current position X0, then the difference may be a result of acceleration of controller 32. In such a case, the orientation inferring subsystem 40 determines an external-frame acceleration a of controller 32 at the current time t0 to be given by the following,
  • a = 2(X0 − X0*) / (t0 − t−1)² + g,
  • where g is a gravitational acceleration.
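  • Expressed in code, and assuming only that three time-stamped target positions from the cameras are available and that the gravity vector g uses whatever sign convention the accelerometer reports at rest (both assumptions of this sketch, not requirements of the disclosure), the calculation above is:

```python
import numpy as np

def external_frame_acceleration(x0, x1, x2, t0, t1, t2, g):
    """External-frame acceleration of the controller from three observed positions.

    x0, t0 -- current observed position (3-vector) and time
    x1, t1 -- previous observed position and time        (X-1, t-1 in the text)
    x2, t2 -- more previous observed position and time   (X-2, t-2 in the text)
    g      -- gravitational acceleration vector in the external reference frame
    """
    v = (x1 - x2) / (t1 - t2)            # velocity V from the two older samples
    x0_expected = x1 + v * (t0 - t1)     # constant-velocity prediction X0*
    dt = t0 - t1
    # The gap between observed and expected position is attributed to a constant
    # acceleration over the last interval (x = x_expected + 0.5 * a * dt**2).
    return 2.0 * (x0 - x0_expected) / dt ** 2 + g
```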
  • Orientation inferring subsystem 40 is configured to determine an internal-frame acceleration of controller 32 from acceleration information received from controller 32. The controller 32 may obtain the internal-frame acceleration in any suitable manner. For example, the controller may include an acceleration-measuring subsystem configured to report acceleration information to the orientation inferring subsystem 40. In some embodiments, the acceleration-measuring subsystem may be a three-axis accelerometer 42 located proximate to the target 41, as schematically shown in FIG. 2.
  • The orientation inferring subsystem 40 can determine a coarse orientation of controller 32 based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration. FIG. 3 shows an example of such a comparison 50 corresponding to the controller movement shown in FIG. 2. Vector 52 represents the direction of the external-frame acceleration and vector 54 represents the direction of the internal-frame acceleration. The misalignment between the external-frame acceleration and the internal-frame acceleration can be resolved to find any difference between the external reference frame and the internal reference frame. Accordingly, an orientation of the controller 32 can be inferred in the external frame of reference.
  • As a nonlimiting example, if stereo cameras 36 observe controller 32 accelerating due east without changing elevation or moving north/south; and if acceleration-measuring subsystem 20 reports that controller 32 accelerates to the right, without moving up/down or front/back; then orientation inferring subsystem 40 can infer that controller 32 is pointing toward the north. The above is a simplified and somewhat exaggerated scenario. In many usage scenarios, controller 32 will be pointed substantially toward a television or other display, and any relative misalignments between internal and external reference frames will be less severe. Nonetheless, the orientation inferring methods described herein may be used to assess a coarse orientation.
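  • One concrete way to carry out that comparison (an illustrative choice; the disclosure does not prescribe a particular formula) is to compute the rotation that carries the internal-frame acceleration direction onto the external-frame direction, for example with Rodrigues' construction sketched below. A single pair of directions pins the orientation down only up to a twist about that axis, which is consistent with treating the result as a coarse orientation.

```python
import numpy as np

def coarse_orientation(accel_internal, accel_external):
    """Rotation matrix mapping the controller's internal acceleration direction
    onto the externally observed acceleration direction."""
    a = accel_internal / np.linalg.norm(accel_internal)
    b = accel_external / np.linalg.norm(accel_external)
    v = np.cross(a, b)                   # rotation axis (unnormalized)
    s = np.linalg.norm(v)                # sine of the angle between the directions
    c = np.dot(a, b)                     # cosine of the angle
    if s < 1e-9:
        if c > 0:
            return np.eye(3)             # directions already aligned
        # Antiparallel directions: any 180-degree rotation about a perpendicular
        # axis would work; a single acceleration sample cannot resolve which.
        raise ValueError("opposite acceleration directions: orientation ambiguous")
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])  # skew-symmetric cross-product matrix
    # Rodrigues' rotation formula specialized to aligning unit vector a with b.
    return np.eye(3) + vx + vx @ vx * ((1.0 - c) / s ** 2)
```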
  • The assessed external-frame acceleration of controller 32 may differ from the actual controller acceleration due to one or more of the following factors: noise and error in the data visually-observed by stereo cameras 36, noise and error in the accelerometer data, and/or misalignment between the internal reference frame and the external reference frame. However, an inferred coarse orientation of controller 32, which is found as described herein, is absolute, rather than relative, and therefore does not accumulate error over time.
  • In some embodiments, orientation inferring subsystem 40 may be further configured to update the coarse orientation of controller 32 based on angular motion information observed by controller 32. The controller 32 may obtain the angular motion information in any suitable manner. One such suitable manner includes obtaining the angular motion information by means of an angular-motion measuring subsystem 44 configured to report angular motion information to the orientation inferring subsystem 40. In some embodiments, the angular-motion measuring subsystem may include spaced-apart three-axis accelerometers configured to be used in combination to determine the angular motion of controller 32. As shown in FIG. 2, in such embodiments, one three-axis accelerometer 42 may be located at a head end of controller 32 and another three-axis accelerometer 46 may be located at a tail end of controller 32, such that subtracting a head acceleration direction obtained by the head accelerometer 42 from a tail acceleration direction obtained by the tail accelerometer 46 yields an orientation change of controller 32 in the internal reference frame relative to controller 32. In other embodiments, such an angular-motion measuring subsystem 44 may include a three-axis gyroscope 48 which calculates the angular velocity of controller 32, which can then be integrated over time to determine an angular position.
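  • For the gyroscope variant, a minimal sketch of the integration step is shown below: the measured angular velocity is turned into a small rotation for the elapsed time and composed onto the current orientation quaternion. The (w, x, y, z) quaternion layout and the body-frame-rate convention are choices made for this example. Each such update compounds gyroscope noise and bias, which is the relative-orientation drift that the absolute coarse orientation described above is meant to bound.

```python
import numpy as np

def quat_multiply(q, r):
    """Hamilton product of two quaternions stored as (w, x, y, z)."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([w0*w1 - x0*x1 - y0*y1 - z0*z1,
                     w0*x1 + x0*w1 + y0*z1 - z0*y1,
                     w0*y1 - x0*z1 + y0*w1 + z0*x1,
                     w0*z1 + x0*y1 - y0*x1 + z0*w1])

def integrate_gyro(q, omega, dt):
    """Advance orientation quaternion q by body-frame angular velocity omega
    (rad/s) over a time step of dt seconds."""
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return q
    axis = omega / np.linalg.norm(omega)
    dq = np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))
    q_new = quat_multiply(q, dq)             # body-frame rates compose on the right
    return q_new / np.linalg.norm(q_new)     # renormalize to limit numeric drift
```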
  • In between frames where a coarse orientation is available (e.g., if target 41 does not move sufficient distance for detection by stereo cameras 36), measurements from the angular-motion measuring subsystem 44 may accumulate error. A long period of very slow motion, as might well happen when drawing, is the worst-case scenario. However, such a situation is the best-case scenario for smoothing and filtering the accelerometer data, because it is expected that a user will attempt to draw smooth lines and curves.
  • Controller 32 may report acceleration information and/or angular motion information to orientation inferring subsystem 40 by any suitable means. In some embodiments, controller 32 may report acceleration information and/or angular motion information by wirelessly transmitting such information to orientation inferring subsystem 40, as schematically shown in FIG. 2. In other embodiments, controller 32 may be physically connected to orientation inferring subsystem 40.
  • FIG. 4 shows a process flow diagram of an example method 60 of tracking an orientation of a game controller. Method 60 begins at 62 by inferring a coarse orientation of the game controller. At 64, method 60 includes determining an external-frame acceleration for the game controller, the external-frame acceleration being in an external reference frame relative to the game controller. At 66, method 60 includes determining an internal-frame acceleration for the game controller, the internal-frame acceleration being in an internal reference frame relative to the game controller. At 68, method 60 includes determining an orientation of the game controller based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration, as explained above. Upon inferring a coarse orientation of the game controller, method 60 may optionally include, at 70, updating the coarse orientation of the game controller based on angular motion information observed by the game controller.
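  • As a sketch of how the steps of method 60 might be arranged in a per-frame loop (one possible arrangement, reusing the illustrative helpers external_frame_acceleration and coarse_orientation defined above; the record fields and the axis-angle helper below are likewise hypothetical):

```python
import numpy as np

def rotation_from_axis_angle(omega, dt):
    """Rotation matrix for turning at angular velocity omega (rad/s) for dt seconds."""
    angle = np.linalg.norm(omega) * dt
    if angle < 1e-12:
        return np.eye(3)
    k = omega / np.linalg.norm(omega)
    kx = np.array([[0.0, -k[2], k[1]], [k[2], 0.0, -k[0]], [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(angle) * kx + (1.0 - np.cos(angle)) * kx @ kx

def track_orientation(frames, g):
    """Per-frame tracking loop in the spirit of method 60.

    frames -- iterable of records with fields .t, .camera_pos (3-vector, or None
              when the cameras lose the target), .accel and .gyro (controller frame)
    """
    R = np.eye(3)        # orientation estimate (controller frame -> external frame)
    history = []         # recent (position, time) samples from the cameras
    prev_t = None
    for f in frames:
        if prev_t is not None:
            # Step 70: propagate using the controller's own angular motion.
            R = R @ rotation_from_axis_angle(f.gyro, f.t - prev_t)
        if f.camera_pos is not None:
            history = (history + [(f.camera_pos, f.t)])[-3:]
            if len(history) == 3:
                (x2, t2), (x1, t1), (x0, t0) = history
                # Steps 64-68: re-anchor with the acceleration-direction comparison.
                a_ext = external_frame_acceleration(x0, x1, x2, t0, t1, t2, g)
                R = coarse_orientation(f.accel, a_ext)
        prev_t = f.t
        yield R
```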
  • In some embodiments, an unscented Kalman filter may be used to combine three-dimensional position tracking from stereo cameras, angular velocity information from gyroscopes, and acceleration information from accelerometers into a unified estimate of position and absolute orientation of the device. An unscented Kalman filter may be appropriate because of nonlinearities that may be introduced in the observation part of the process model (i.e., using the orientation to correct accelerometers). An extended Kalman filter may alternatively be used.
  • The Kalman filter approach combines the information provided from all sensors and allows the introduction of (Gaussian) noise models for each of the sensors. For example, any noise associated with position estimates from the cameras can be incorporated directly into the model. Similarly, the noise of the gyroscopes and accelerometers may be represented by the model. By tuning each of these separately, the system may favor the more reliable sensors without neglecting less reliable sensors.
  • The Kalman state, state transition, and observation model are described as follows, and the standard Kalman filter equations are used thereafter. At each frame, the state is updated with the state transition model, and predicted sensor values are computed from state estimates given the observation model. After the filter is updated, updated position and orientation information is “read” from the updated state vector.
  • The Kalman state {x, ẋ, ẍ, q, ω} includes information to be represented and carried from frame to frame, and is described as follows:
      • x is a 3D position of the device (3-vector);
      • ẋ is a velocity of the device (3-vector);
      • ẍ is an acceleration of the device (3-vector);
      • q is a device orientation (quaternion); and
      • ω is an angular velocity: change in yaw, pitch and roll in the device coordinate frame (3-vector).
  • Next, a state transition is used to advance the state to the next time step based on process dynamics (velocity, acceleration, etc.). The state transition is described mathematically as follows:

  • x′ = x + ẋ

  • ẋ′ = ẋ + ẍ

  • ẍ′ = ẍ

  • q′ = q·q(ω)
  • where:
      • q(ω) is a quaternion formed from a change in yaw, pitch, and roll.
  • Next the sensed values are “observed” from the state, as follows:
      • z is a 3D position from a stereo camera system (3-vector);
      • gyro are gyroscope values including change in yaw, pitch and roll (3-vector);
      • a is accelerometer values (3-vector);
      • g is a direction of gravity (3-vector);
  • where:
      • z=x;
      • gyro=ω;
      • a = (ẍ − g)R(q)
  • where:
      • R(q) is a rotation matrix formed from the quaternion q.
  • The last equation is the focus, where the accelerometer values are predicted by combining the effects of acceleration due to motion of the device, the effect of gravity, and the absolute orientation of the device. Discrepancies in the predicted values are then propagated back to the state by way of the standard Kalman update equations.
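  • To make the model concrete, here is a compact sketch of the state transition and observation functions written as plain numpy code, in a form that could be handed to an off-the-shelf unscented (or extended) Kalman filter implementation. The one-frame time step implicit in the transition equations, the (w, x, y, z) quaternion layout, the state packing, and the gravity convention are all assumptions of this sketch. The per-sensor Gaussian noise discussed above would then appear as the measurement-covariance entries for z, gyro, and a when these functions are plugged into the filter.

```python
import numpy as np

# State layout, following the text: x (3), x_dot (3), x_ddot (3), q (4), omega (3).
POS, VEL, ACC, QUAT, OMEGA = slice(0, 3), slice(3, 6), slice(6, 9), slice(9, 13), slice(13, 16)

def quat_from_small_rotation(omega):
    """Quaternion q(omega) for a change in yaw/pitch/roll over one frame."""
    angle = np.linalg.norm(omega)
    if angle < 1e-12:
        return np.array([1.0, 0.0, 0.0, 0.0])
    axis = omega / angle
    return np.concatenate(([np.cos(angle / 2.0)], np.sin(angle / 2.0) * axis))

def quat_multiply(q, r):
    """Hamilton product of quaternions stored as (w, x, y, z)."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([w0*w1 - x0*x1 - y0*y1 - z0*z1,
                     w0*x1 + x0*w1 + y0*z1 - z0*y1,
                     w0*y1 - x0*z1 + y0*w1 + z0*x1,
                     w0*z1 + x0*y1 - y0*x1 + z0*w1])

def rotation_matrix(q):
    """R(q): rotation matrix formed from the quaternion q = (w, x, y, z)."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def state_transition(s):
    """x' = x + x_dot, x_dot' = x_dot + x_ddot, x_ddot' = x_ddot, q' = q * q(omega)."""
    s = s.copy()
    s[POS] = s[POS] + s[VEL]
    s[VEL] = s[VEL] + s[ACC]
    s[QUAT] = quat_multiply(s[QUAT], quat_from_small_rotation(s[OMEGA]))
    return s                                   # omega is carried over unchanged

def observation(s, g=np.array([0.0, 0.0, -9.81])):
    """Predicted sensor values: z = x, gyro = omega, a = (x_ddot - g) R(q).

    g is an assumed external-frame gravity vector; its direction and sign must
    match the convention used for the accelerometer readings.
    """
    z = s[POS]
    gyro = s[OMEGA]
    a = (s[ACC] - g) @ rotation_matrix(s[QUAT])
    return np.concatenate([z, gyro, a])
```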
  • It should be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.
  • The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof. Furthermore, U.S. Pat. No. 6,982,697 is hereby incorporated herein by reference for all purposes.

Claims (20)

1. A game system, comprising:
a controller;
a controller monitor; and
an orientation inferring subsystem configured to:
determine an external-frame acceleration of the controller from time-elapsed position information received from the controller monitor, the external-frame acceleration being in an external reference frame relative to the controller;
determine an internal-frame acceleration for the controller from acceleration information received from the controller, the internal-frame acceleration being in an internal reference frame relative to the controller; and
determine a coarse orientation of the controller based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration.
2. The game system of claim 1, where the controller includes an acceleration-measuring subsystem configured to report acceleration information to the orientation inferring subsystem.
3. The game system of claim 1, where the controller includes an angular-motion measuring subsystem configured to report angular motion information to the orientation inferring subsystem.
4. The game system of claim 3, where the angular-motion measuring subsystem includes spaced-apart three-axis accelerometers.
5. The game system of claim 3, where the angular-motion measuring subsystem includes a three-axis gyroscope.
6. The game system of claim 3, where the orientation inferring subsystem is configured to update the coarse orientation based on the angular motion information.
7. The game system of claim 1, where the controller monitor includes stereo cameras.
8. The game system of claim 7, where the controller includes an infrared light and the stereo cameras are configured to view the infrared light.
9. The game system of claim 1, where the orientation inferring subsystem determines the external-frame acceleration as:
2(X0 − X0*) / (t0 − t−1)² + g
where:
X0 is a current position of the controller as observed by the controller monitor at a time t0;
g is a gravitational acceleration;
X0* is X−1 + V(t0 − t−1)
where:
X−1 is a previous position of the controller as observed by the controller monitor at a previous time t−1;
V is (X−1 − X−2) / (t−1 − t−2)
where:
X−2 is a more previous position of the controller as observed by the controller monitor at a more previous time t−2.
10. The game system of claim 1, where the orientation inferring subsystem uses an unscented Kalman filter to determine a unified estimate of position and an absolute orientation of the controller.
11. A method of tracking an orientation of a game controller, the method comprising:
inferring a coarse orientation of the game controller by:
determining an external-frame acceleration for the game controller, the external-frame acceleration being in an external reference frame relative to the game controller;
determining an internal-frame acceleration for the game controller, the internal-frame acceleration being in an internal reference frame relative to the game controller; and
determining an orientation of the game controller based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration; and
updating the coarse orientation of the game controller based on angular motion information observed by the game controller.
12. The method of claim 11, where determining an external-frame acceleration for the game controller includes translating motion information for the game controller that is visually observed by a stereo camera.
13. A method of inferring device orientation in an external reference frame, the method comprising:
determining an external-frame acceleration for the device, the external-frame acceleration being in an external reference frame relative to the device;
determining an internal-frame acceleration for the device, the internal-frame acceleration being in an internal reference frame relative to the device;
determining an orientation of the device based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration.
14. The method of claim 13, where determining an external-frame acceleration for the device includes translating visually-observed motion of the device.
15. The method of claim 14, where a stereo camera is used to visually-observe motion of the device.
16. The method of claim 13, where determining the internal-frame acceleration for the device includes receiving internal-frame acceleration information observed by the device.
17. The method of claim 13, further comprising updating the orientation of the device based on angular motion information observed by the device.
18. The method of claim 13, where determining an external-frame acceleration for the device includes receiving initial position information for the device, the initial position information being in the external reference frame relative to the device; and receiving subsequent position information for the device, the subsequent position information being in the external reference frame relative to the device.
19. The method of claim 13, where determining the external-frame acceleration for the device includes calculating:
2(X0 − X0*) / (t0 − t−1)² + g
where:
X0 is a current position of the device in the external reference frame at a time t0;
g is a gravitational acceleration;
X0* is X−1 + V(t0 − t−1)
where:
X−1 is a previous position of the device in the external reference frame at a previous time t−1;
V is (X−1 − X−2) / (t−1 − t−2)
where:
X−2 is a more previous position of the device in the external reference frame at a more previous time t−2.
20. The method of claim 13, further comprising using an unscented Kalman filter to determine a unified estimate of position and an absolute orientation of the device.
US12/256,747 2008-10-23 2008-10-23 Determining orientation in an external reference frame Abandoned US20100105479A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/256,747 US20100105479A1 (en) 2008-10-23 2008-10-23 Determining orientation in an external reference frame
US12/490,331 US8282487B2 (en) 2008-10-23 2009-06-24 Determining orientation in an external reference frame

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/256,747 US20100105479A1 (en) 2008-10-23 2008-10-23 Determining orientation in an external reference frame

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US12/490,331 Continuation US8282487B2 (en) 2008-10-23 2009-06-24 Determining orientation in an external reference frame

Publications (1)

Publication Number Publication Date
US20100105479A1 true US20100105479A1 (en) 2010-04-29

Family

ID=42117093

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/256,747 Abandoned US20100105479A1 (en) 2008-10-23 2008-10-23 Determining orientation in an external reference frame
US12/490,331 Active 2030-05-18 US8282487B2 (en) 2008-10-23 2009-06-24 Determining orientation in an external reference frame

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/490,331 Active 2030-05-18 US8282487B2 (en) 2008-10-23 2009-06-24 Determining orientation in an external reference frame

Country Status (1)

Country Link
US (2) US20100105479A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060007142A1 (en) * 2003-06-13 2006-01-12 Microsoft Corporation Pointing device and cursor for use in intelligent computing environments
US20080192007A1 (en) * 2002-02-07 2008-08-14 Microsoft Corporation Determining a position of a pointing device
US20100009754A1 (en) * 2008-07-11 2010-01-14 Takayuki Shimamura Game apparatus and game program
US20120086637A1 (en) * 2010-10-08 2012-04-12 Cywee Group Limited System and method utilized for human and machine interface
KR20120121595A (en) * 2011-04-27 2012-11-06 삼성전자주식회사 Position calculation apparatus and method that use acceleration sensor
US20130027341A1 (en) * 2010-04-16 2013-01-31 Mastandrea Nicholas J Wearable motion sensing computing interface
US20130059661A1 (en) * 2011-09-02 2013-03-07 Zeroplus Technology Co., Ltd. Interactive video game console
US10001645B2 (en) * 2014-01-17 2018-06-19 Sony Interactive Entertainment America Llc Using a second screen as a private tracking heads-up display
US10168775B2 (en) * 2012-10-10 2019-01-01 Innovative Devices Inc. Wearable motion sensing computing interface

Families Citing this family (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8745541B2 (en) 2003-03-25 2014-06-03 Microsoft Corporation Architecture for controlling a computer using hand gestures
US9171454B2 (en) 2007-11-14 2015-10-27 Microsoft Technology Licensing, Llc Magic wand
US8952894B2 (en) 2008-05-12 2015-02-10 Microsoft Technology Licensing, Llc Computer vision-based multi-touch sensing using infrared lasers
US8847739B2 (en) * 2008-08-04 2014-09-30 Microsoft Corporation Fusing RFID and vision for surface object tracking
US9401178B2 (en) 2010-08-26 2016-07-26 Blast Motion Inc. Event analysis system
US9261526B2 (en) 2010-08-26 2016-02-16 Blast Motion Inc. Fitting system for sporting equipment
US9418705B2 (en) 2010-08-26 2016-08-16 Blast Motion Inc. Sensor and media event detection system
US9646209B2 (en) 2010-08-26 2017-05-09 Blast Motion Inc. Sensor and media event detection and tagging system
US9320957B2 (en) 2010-08-26 2016-04-26 Blast Motion Inc. Wireless and visual hybrid motion capture system
US9607652B2 (en) 2010-08-26 2017-03-28 Blast Motion Inc. Multi-sensor event detection and tagging system
US8941723B2 (en) 2010-08-26 2015-01-27 Blast Motion Inc. Portable wireless mobile device motion capture and analysis system and method
US9604142B2 (en) 2010-08-26 2017-03-28 Blast Motion Inc. Portable wireless mobile device motion capture data mining system and method
US9619891B2 (en) 2010-08-26 2017-04-11 Blast Motion Inc. Event analysis and tagging system
US9406336B2 (en) 2010-08-26 2016-08-02 Blast Motion Inc. Multi-sensor event detection system
US9940508B2 (en) 2010-08-26 2018-04-10 Blast Motion Inc. Event detection, confirmation and publication system that integrates sensor data and social media
US9247212B2 (en) 2010-08-26 2016-01-26 Blast Motion Inc. Intelligent motion capture element
US9626554B2 (en) 2010-08-26 2017-04-18 Blast Motion Inc. Motion capture system that combines sensors with different measurement ranges
US9396385B2 (en) 2010-08-26 2016-07-19 Blast Motion Inc. Integrated sensor and video motion analysis method
US9076041B2 (en) 2010-08-26 2015-07-07 Blast Motion Inc. Motion event recognition and video synchronization system and method
EP3584682B1 (en) * 2010-12-22 2021-06-30 zSpace, Inc. Three-dimensional tracking of a user control device in a volume
US11911117B2 (en) 2011-06-27 2024-02-27 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US9498231B2 (en) 2011-06-27 2016-11-22 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
CA2840397A1 (en) 2011-06-27 2013-04-11 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
US8944940B2 (en) 2011-08-29 2015-02-03 Icuemotion, Llc Racket sport inertial sensor motion tracking analysis
US8811938B2 (en) 2011-12-16 2014-08-19 Microsoft Corporation Providing a user interface experience based on inferred vehicle state
AT512350B1 (en) * 2011-12-20 2017-06-15 Isiqiri Interface Tech Gmbh COMPUTER PLANT AND CONTROL PROCESS THEREFOR
US10105149B2 (en) 2013-03-15 2018-10-23 Board Of Regents Of The University Of Nebraska On-board tool tracking system and methods of computer assisted surgery
KR102233301B1 (en) 2014-06-12 2021-03-31 순위안 카이화 (베이징) 테크놀로지 컴퍼니 리미티드 Removable motion sensor embedded in a sport instrument
US10668353B2 (en) 2014-08-11 2020-06-02 Icuemotion Llc Codification and cueing system for sport and vocational activities
US9409074B2 (en) 2014-08-27 2016-08-09 Zepp Labs, Inc. Recommending sports instructional content based on motion sensor data
US9068843B1 (en) * 2014-09-26 2015-06-30 Amazon Technologies, Inc. Inertial sensor fusion orientation correction
US9449230B2 (en) 2014-11-26 2016-09-20 Zepp Labs, Inc. Fast object tracking framework for sports video recognition
US10129608B2 (en) 2015-02-24 2018-11-13 Zepp Labs, Inc. Detect sports video highlights based on voice recognition
US10572735B2 (en) 2015-03-31 2020-02-25 Beijing Shunyuan Kaihua Technology Limited Detect sports video highlights for mobile computing devices
US9554160B2 (en) 2015-05-18 2017-01-24 Zepp Labs, Inc. Multi-angle video editing based on cloud video sharing
US11577142B2 (en) 2015-07-16 2023-02-14 Blast Motion Inc. Swing analysis system that calculates a rotational profile
US9694267B1 (en) 2016-07-19 2017-07-04 Blast Motion Inc. Swing analysis method using a swing plane reference frame
US10974121B2 (en) 2015-07-16 2021-04-13 Blast Motion Inc. Swing quality measurement system
US11565163B2 (en) 2015-07-16 2023-01-31 Blast Motion Inc. Equipment fitting system that compares swing metrics
US10124230B2 (en) 2016-07-19 2018-11-13 Blast Motion Inc. Swing analysis method using a sweet spot trajectory
USD785473S1 (en) 2015-08-04 2017-05-02 Zepp Labs, Inc. Motion sensor
USD797666S1 (en) 2015-08-04 2017-09-19 Zepp Labs, Inc. Motion sensor charger
US10854104B2 (en) 2015-08-28 2020-12-01 Icuemotion Llc System for movement skill analysis and skill augmentation and cueing
US9600717B1 (en) 2016-02-25 2017-03-21 Zepp Labs, Inc. Real-time single-view action recognition based on key pose analysis for sports videos
US10265602B2 (en) 2016-03-03 2019-04-23 Blast Motion Inc. Aiming feedback system with inertial sensors
US10097745B2 (en) 2016-04-27 2018-10-09 Zepp Labs, Inc. Head rotation tracking device for video highlights identification
US10786728B2 (en) 2017-05-23 2020-09-29 Blast Motion Inc. Motion mirroring system that incorporates virtual environment constraints
US20180364048A1 (en) * 2017-06-20 2018-12-20 Idhl Holdings, Inc. Methods, architectures, apparatuses, systems directed to device position tracking

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5930741A (en) * 1995-02-28 1999-07-27 Virtual Technologies, Inc. Accurate, rapid, reliable position sensing using multiple sensing technologies
US5990908A (en) * 1997-09-22 1999-11-23 Lamb & Company Method and apparatus for processing full motion computer animation
US6678059B2 (en) * 2001-04-24 2004-01-13 Korean Advanced Institute Of Science And Technology Apparatus for measuring 6-degree-of-freedom motions of rigid body by using three-facet mirror
US6693666B1 (en) * 1996-12-11 2004-02-17 Interval Research Corporation Moving imager camera for track and range capture
US6693284B2 (en) * 2000-10-04 2004-02-17 Nikon Corporation Stage apparatus providing multiple degrees of freedom of movement while exhibiting reduced magnetic disturbance of a charged particle beam
US20060012675A1 (en) * 2004-05-10 2006-01-19 University Of Southern California Three dimensional interaction with autostereoscopic displays
US20060264258A1 (en) * 2002-07-27 2006-11-23 Zalewski Gary M Multi-input game control mixer
US7219033B2 (en) * 2005-02-15 2007-05-15 Magneto Inertial Sensing Technology, Inc. Single/multiple axes six degrees of freedom (6 DOF) inertial motion capture system with initial orientation determination capability
US20070152157A1 (en) * 2005-11-04 2007-07-05 Raydon Corporation Simulation arena entity tracking system
US20070230747A1 (en) * 2006-03-29 2007-10-04 Gregory Dunko Motion sensor character generation for mobile device
US20090209343A1 (en) * 2008-02-15 2009-08-20 Eric Foxlin Motion-tracking game controller

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6720949B1 (en) 1997-08-22 2004-04-13 Timothy R. Pryor Man machine interfaces and applications
US6195104B1 (en) 1997-12-23 2001-02-27 Philips Electronics North America Corp. System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs
US6181343B1 (en) 1997-12-23 2001-01-30 Philips Electronics North America Corp. System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs
US6269172B1 (en) 1998-04-13 2001-07-31 Compaq Computer Corporation Method for tracking the motion of a 3-D figure
US7095401B2 (en) 2000-11-02 2006-08-22 Siemens Corporate Research, Inc. System and method for gesture interface
US6600475B2 (en) 2001-01-22 2003-07-29 Koninklijke Philips Electronics N.V. Single camera system for gesture-based input and target indication
US6804396B2 (en) 2001-03-28 2004-10-12 Honda Giken Kogyo Kabushiki Kaisha Gesture recognition system
US6888960B2 (en) 2001-03-28 2005-05-03 Nec Corporation Fast optimal linear approximation of the images of variably illuminated solid objects for recognition
US7007236B2 (en) 2001-09-14 2006-02-28 Accenture Global Services Gmbh Lab window collaboration
US7340077B2 (en) 2002-02-15 2008-03-04 Canesta, Inc. Gesture recognition system using depth perceptive sensors
US7821541B2 (en) 2002-04-05 2010-10-26 Bruno Delean Remote control apparatus using gesture recognition
US20040001113A1 (en) 2002-06-28 2004-01-01 John Zipperer Method and apparatus for spline-based trajectory classification, gesture detection and localization
US7352358B2 (en) * 2002-07-27 2008-04-01 Sony Computer Entertainment America Inc. Method and system for applying gearing effects to acoustical tracking
US7665041B2 (en) 2003-03-25 2010-02-16 Microsoft Corporation Architecture for controlling a computer using hand gestures
US7372977B2 (en) 2003-05-29 2008-05-13 Honda Motor Co., Ltd. Visual tracking using depth data
US7038661B2 (en) 2003-06-13 2006-05-02 Microsoft Corporation Pointing device and cursor for use in intelligent computing environments
KR100588042B1 (en) 2004-01-14 2006-06-09 한국과학기술연구원 Interactive presentation system
US20050255434A1 (en) 2004-02-27 2005-11-17 University Of Florida Research Foundation, Inc. Interactive virtual characters for training including medical diagnosis training
US20050212753A1 (en) 2004-03-23 2005-09-29 Marvit David L Motion controlled remote controller
CN100573548C (en) 2004-04-15 2009-12-23 格斯图尔泰克股份有限公司 The method and apparatus of tracking bimanual movements
US7593593B2 (en) 2004-06-16 2009-09-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US8560972B2 (en) 2004-08-10 2013-10-15 Microsoft Corporation Surface UI for gesture-based interaction
US8137195B2 (en) 2004-11-23 2012-03-20 Hillcrest Laboratories, Inc. Semantic gaming and application transformation
US7907117B2 (en) 2006-08-08 2011-03-15 Microsoft Corporation Virtual controller for visual displays
US9171454B2 (en) 2007-11-14 2015-10-27 Microsoft Technology Licensing, Llc Magic wand

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5930741A (en) * 1995-02-28 1999-07-27 Virtual Technologies, Inc. Accurate, rapid, reliable position sensing using multiple sensing technologies
US6148280A (en) * 1995-02-28 2000-11-14 Virtual Technologies, Inc. Accurate, rapid, reliable position sensing using multiple sensing technologies
US6693666B1 (en) * 1996-12-11 2004-02-17 Interval Research Corporation Moving imager camera for track and range capture
US5990908A (en) * 1997-09-22 1999-11-23 Lamb & Company Method and apparatus for processing full motion computer animation
US6693284B2 (en) * 2000-10-04 2004-02-17 Nikon Corporation Stage apparatus providing multiple degrees of freedom of movement while exhibiting reduced magnetic disturbance of a charged particle beam
US6678059B2 (en) * 2001-04-24 2004-01-13 Korean Advanced Institute Of Science And Technology Apparatus for measuring 6-degree-of-freedom motions of rigid body by using three-facet mirror
US20060264258A1 (en) * 2002-07-27 2006-11-23 Zalewski Gary M Multi-input game control mixer
US20060012675A1 (en) * 2004-05-10 2006-01-19 University Of Southern California Three dimensional interaction with autostereoscopic displays
US7219033B2 (en) * 2005-02-15 2007-05-15 Magneto Inertial Sensing Technology, Inc. Single/multiple axes six degrees of freedom (6 DOF) inertial motion capture system with initial orientation determination capability
US20070152157A1 (en) * 2005-11-04 2007-07-05 Raydon Corporation Simulation arena entity tracking system
US20070230747A1 (en) * 2006-03-29 2007-10-04 Gregory Dunko Motion sensor character generation for mobile device
US20090209343A1 (en) * 2008-02-15 2009-08-20 Eric Foxlin Motion-tracking game controller

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8456419B2 (en) 2002-02-07 2013-06-04 Microsoft Corporation Determining a position of a pointing device
US9454244B2 (en) 2002-02-07 2016-09-27 Microsoft Technology Licensing, Llc Recognizing a movement of a pointing device
US10331228B2 (en) 2002-02-07 2019-06-25 Microsoft Technology Licensing, Llc System and method for determining 3D orientation of a pointing device
US8707216B2 (en) 2002-02-07 2014-04-22 Microsoft Corporation Controlling objects via gesturing
US20080204411A1 (en) * 2002-02-07 2008-08-28 Microsoft Corporation Recognizing a movement of a pointing device
US20080259055A1 (en) * 2002-02-07 2008-10-23 Microsoft Corporation Manipulating An Object Utilizing A Pointing Device
US20090198354A1 (en) * 2002-02-07 2009-08-06 Microsoft Corporation Controlling objects via gesturing
US20080204410A1 (en) * 2002-02-07 2008-08-28 Microsoft Corporation Recognizing a motion of a pointing device
US20080192007A1 (en) * 2002-02-07 2008-08-14 Microsoft Corporation Determining a position of a pointing device
US10488950B2 (en) 2002-02-07 2019-11-26 Microsoft Technology Licensing, Llc Manipulating an object utilizing a pointing device
US20060007142A1 (en) * 2003-06-13 2006-01-12 Microsoft Corporation Pointing device and cursor for use in intelligent computing environments
US20060007141A1 (en) * 2003-06-13 2006-01-12 Microsoft Corporation Pointing device and cursor for use in intelligent computing environments
US9283482B2 (en) 2008-07-11 2016-03-15 Nintendo Co., Ltd. Game apparatus for performing game processing according to an attitude of an input device and game program
US20100009754A1 (en) * 2008-07-11 2010-01-14 Takayuki Shimamura Game apparatus and game program
US8535132B2 (en) * 2008-07-11 2013-09-17 Nintendo Co., Ltd. Game apparatus for setting a moving direction of an object in a game space according to an attitude of an input device and game program
US8851995B2 (en) 2008-07-11 2014-10-07 Nintendo Co., Ltd. Game apparatus for performing game processing according to an attitude of an input device and game program
US20130027341A1 (en) * 2010-04-16 2013-01-31 Mastandrea Nicholas J Wearable motion sensing computing interface
US9110505B2 (en) * 2010-04-16 2015-08-18 Innovative Devices Inc. Wearable motion sensing computing interface
US20120086637A1 (en) * 2010-10-08 2012-04-12 Cywee Group Limited System and method utilized for human and machine interface
US8555205B2 (en) * 2010-10-08 2013-10-08 Cywee Group Limited System and method utilized for human and machine interface
KR101956186B1 (en) * 2011-04-27 2019-03-11 삼성전자주식회사 Position estimation apparatus and method using acceleration sensor
KR20120121595A (en) * 2011-04-27 2012-11-06 삼성전자주식회사 Position calculation apparatus and method that use acceleration sensor
US20130059661A1 (en) * 2011-09-02 2013-03-07 Zeroplus Technology Co., Ltd. Interactive video game console
US10168775B2 (en) * 2012-10-10 2019-01-01 Innovative Devices Inc. Wearable motion sensing computing interface
US10001645B2 (en) * 2014-01-17 2018-06-19 Sony Interactive Entertainment America Llc Using a second screen as a private tracking heads-up display
RU2661808C2 (en) * 2014-01-17 2018-07-19 СОНИ ИНТЕРЭКТИВ ЭНТЕРТЕЙНМЕНТ АМЕРИКА ЭлЭлСи Using second screen as private tracking heads-up display

Also Published As

Publication number Publication date
US20100103269A1 (en) 2010-04-29
US8282487B2 (en) 2012-10-09

Similar Documents

Publication Publication Date Title
US8282487B2 (en) Determining orientation in an external reference frame
CN107314778B (en) Calibration method, device and system for relative attitude
US9086724B2 (en) Display control system, display control method, computer-readable storage medium having stored thereon display control program, and display control apparatus
US9604101B2 (en) Golf swing analysis device and golf swing analysis method
EP2936060B1 (en) Display of separate computer vision based pose and inertial sensor based pose
EP1596332B1 (en) Information processing method and apparatus for finding position and orientation of targeted object
TWI397671B (en) System and method for locating carrier, estimating carrier posture and building map
JP5948011B2 (en) Motion analysis device
CN106662443B (en) The method and system determined for normal trajectories
US8441438B2 (en) 3D pointing device and method for compensating movement thereof
EP1870670A1 (en) Method and apparatus for space recognition according to the movement of an input device
KR101950359B1 (en) Method for position estimation of hybird motion capture system
CN105937878A (en) Indoor distance measuring method
US20120065926A1 (en) Integrated motion sensing apparatus
US11698687B2 (en) Electronic device for use in motion detection and method for obtaining resultant deviation thereof
CN104280022A (en) Digital helmet display device tracking system of visual-aided inertial measuring unit
JP2017119102A (en) Motion analysis device, method and program
US11112857B2 (en) Information processing apparatus, information processing method, and program
US8708818B2 (en) Display control system, display control method, computer-readable storage medium having stored thereon display control program, and display control apparatus
KR20120059824A (en) A method and system for acquiring real-time motion information using a complex sensor
US10058779B2 (en) Game apparatus, storage medium having game program stored thereon, game system, and game processing method
US20160129344A1 (en) Information processor, control method of information processor, program, and information storage medium
CN115049697A (en) Visual speed measurement method, device, equipment and storage medium
CN108413970B (en) Positioning method, cloud system, electronic equipment and computer program product
EP2140917B1 (en) Orientation calculation apparatus and storage medium having orientation calculation program stored therein

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION,WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WILSON, ANDREW;BEEMAN, STEVEN MICHAEL;SIGNING DATES FROM 20081020 TO 20081022;REEL/FRAME:021908/0329

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034766/0509

Effective date: 20141014