US5625577A - Computer-implemented motion analysis method using dynamics

Info

Publication number
US5625577A
Authority
US
United States
Prior art keywords
segment
motions
motion
segments
dynamics
Prior art date
Legal status
Expired - Lifetime
Application number
US08/182,545
Inventor
Tosiyasu Kunii
Lining Sun
Current Assignee
SHORINJI KEMPO INTELLECTUAL PROPERTY PROTECTION Corp
Shukyohoji Kongo Zen Sohozan Shoriji
Original Assignee
Shukyohojin Kongo Zen Sohonzan Shorinji
Priority date
Filing date
Publication date
Priority claimed from JP2418249A (published as JPH04270372A)
Priority claimed from JP2418250A (published as JPH07313648A)
Priority claimed from JP2418253A (published as JPH04271734A)
Priority claimed from JP2418251A (published as JPH04279979A)
Application filed by Shukyohojin Kongo Zen Sohonzan Shorinji filed Critical Shukyohojin Kongo Zen Sohonzan Shorinji
Priority to US08/182,545
Assigned to SHUKYOHOJI, KONGO ZEN SOHOZAN SHORIJI. Assignment of assignors' interest (see document for details). Assignors: SUN, LINING; KUNII, TOSIYASU
Application granted
Publication of US5625577A
Assigned to SHORINJI KEMPO INTELLECTUAL PROPERTY PROTECTION CORPORATION. Assignment of assignors' interest (see document for details). Assignor: SHUKYOHOJIN KONGO ZEN SOHONZAN SHORINJI
Anticipated expiration
Status: Expired - Lifetime

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/42 Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671 Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/36 Nc in input of data, input key till input tape
    • G05B2219/36171 Edit velocity, motion profile, graphic plot of speed as function of time, position
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40395 Compose movement with primitive movement segments from database
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/45 Nc applications
    • G05B2219/45176 Animation for film scenes, show

Definitions

  • In the ninth step (applying physical constraints), the articulation of the human body and the ranges of movement of the body joints are checked for each of the motions calculated in the eighth step (FIGS. 7(a) and 7(b), 9(a) and 9(b), and 11(a) and 11(b)).
  • The process of applying constraints starts at a segment referred to as the root element, and the position and orientation of each segment in a subclass of the root element are checked sequentially.
  • Two types of checks are performed: whether each subclass element is still connected to its superclass element, and whether the movement of each joint exceeds a specified range. If a subclass element is not connected to its superclass element, as shown in FIGS. 7(a), 9(a) and 11(a), the subclass element is translated until it becomes connected to its superclass element. If the movement of any joint exceeds the specified range, the movement of the joint is adjusted to be within the range by rotation of the corresponding element, thus modifying the positions of the segments to obtain a posture as shown in FIGS. 7(b), 9(b) and 11(b). (A minimal sketch of this constraint pass appears after this list.)
  • In the tenth step (applying inverse dynamics), Lagrange's equations, which describe the relationship between forces and movements, are used to calculate the forces exerted on each joint of the body.
  • The Sun paper proposes a linear recursive algorithm to perform this inverse dynamics with computational complexity O(n), where n is the number of body segments.
  • The seventh to eleventh steps can be repeated, and new motions can be developed in an interactive manner.
  • The center of gravity of each of the body segments, the force and torque exerted on each joint, the center of gravity of the whole body, and the force and torque exerted on the center of gravity of the whole body can be displayed as symbols, such as arrows, superimposed over the human body model, as shown in FIG. 4, or over the human body displayed on a screen, in the same manner as in the fifth step (displaying the result), as shown in FIG. 5.
  • The results of the calculated movements which have been analyzed using inverse dynamics can also be stored in the database for use in the design of additional complicated motions.
  • The present invention thus provides dynamics-based human or animal motion animation using an articulated human or animal body model.
  • The method consists of two stages: analyzing the basic movements of an actual human being or animal, and using the analytic results to produce new motions dynamically.
  • The dynamical simulation proceeds in three steps: dynamics, constraints and inverse dynamics.
  • The dynamics step divides the articulated body into independent rigid segments (for example, 50 rigid segments), and the motion of each segment is calculated independently of the motions of the other segments using the Newton-Euler equations. The idea is to reduce the number of degrees of freedom, and thus the amount of computation required, and to avoid the complexity of dealing with continuum mechanics.
  • In the constraints step, the articulation of the body and the ranges of movement of the body joints are checked and maintained.
  • A segment which violates its constraints is moved by translations and rotations so that it becomes connected to its super-segment or is bent into a natural posture.
  • In the inverse dynamics step, the forces that produced the new motions, as modified by the constraints, are calculated.
  • The total computational complexity is O(n), where n is the number of body segments.
  • The user interacts with the system by specifying the control forces and getting back the motions produced by the dynamics and the constraints, with the forces modified due to the constraints.
  • The present method thus solves the computational problem that existed in dynamics and enables dynamics to produce real-time feedback, so that dynamics is now well suited to computer motion analysis work.
  • With the present invention it is possible to develop a new motion in an interactive manner using a computer, without requiring trial and error or the intuition of the analyst.
  • The method is particularly useful in analyzing and teaching sports motions, as shown in FIGS. 8 and 9.
  • Animation and motion analysis of an animal body can be developed by the method of the present invention in the same manner as described above with respect to a human body, as shown in FIGS. 10-11. Indeed, any articulated system can be analyzed and animated in a similar manner.
  • The motion analysis method comprises the steps of analyzing the basic motions of an actual human or animal body and developing new motions.
  • The analysis of the basic motions of the human or animal body is achieved in three steps: constructing a model of the human or animal body, applying the actual motions of a human or animal to the model, and analyzing the motions of the segments of the model.
  • The development of new motions is achieved in three steps: application of dynamics, application of constraints, and application of inverse dynamics.
  • In the dynamics step, the body is divided into a plurality of independent body segments (50, for example) connected by joints, and the motion of each body segment is calculated independently of the other segments using Newton's equation of motion and Euler's equations.
  • In the step of applying constraints, the articulation of the body and the range of movement of its joints are checked.
  • In the inverse dynamics step, the forces as modified by the constraints, which generate the new motions, are calculated.
  • The whole computational complexity thereby becomes O(n).
  • Because the method according to the present invention eliminates the computational complexity of conventional methods, it permits dynamics to be applied to actual motion analysis and permits feedback in real time using dynamics.
  • An object-oriented paradigm has recently been used in a number of areas, and the object-oriented philosophy leads to a direct manipulation paradigm.
  • In this direct manipulation paradigm, since the images displayed on a screen correspond to objects, the objects can be manipulated directly in object space using the motion analysis method according to the present invention.
Abstract

A motion analyzing method analyzes and displays motions of a human being or an animal using a computer, in an interactive manner, without requiring trial and error or depending on the intuition of an analyst. A human or animal body is divided into a plurality of segments connected by joints, each of the segments acting as a minimal unit of motion. Data for modeling the human or animal body on the basis of physical constraints and the inherent nature of each of the segments is maintained in a database. Motions to be analyzed are input, and the input motions are analyzed using inverse dynamics. The resultant movements and the center of gravity of each of the segments, the force and torque exerted on each of the joints, the movement and the center of gravity of the whole body, and the forces and torques exerted on the centers of gravity are superimposed on the human or animal body model of the database and are displayed on a screen. The new motions thus displayed can be used for the teaching of new skills in the industrial or performing arts, in sports, or in animal training.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
The present application is a continuation-in-part of all of U.S. applications Ser. No. 07/714,304, filed Jun. 12, 1991, now abandoned, Ser. No. 08/139,703, filed Oct. 5, 1993, now abandoned, Ser. No. 07/712,884, filed Jun. 12, 1991, now abandoned, and Ser. No. 07/714,306, filed Jun. 12, 1991, now abandoned, the entire contents of each of which are hereby incorporated by reference. The present invention is also related to applications Ser. No. 08/172,704, filed Dec. 27, 1993, now pending, and Ser. No. 08/178,217, filed Jan. 6, 1994, now pending, all by the same inventors as the present application.
BACKGROUND OF THE INVENTION
The present invention relates to a method for analyzing motions of a human being or an animal and animating the results on a computer screen using a computer. The results of this method can be used, for example, to analyze a physical motion used in performing a sport (hereinafter referred to as a sport motion) in order to develop a curriculum for teaching the motion. It can also be used for analyzing various skills in the industrial and performing arts and for forming curricula for teaching those skills. Such a method can also be used to train an animal, such as a dog or a horse, by analyzing the motions of the animal and deriving a training method.
In order to teach a new movement when performing a sport or a skill, such as in the industrial and performing arts or when training an animal or the like, it is necessary first to analyze the motions of the person or animal involved. In the past, a trained analyst observed the actual motions of a skilled athlete or an expert in the skill to be taught or a skilled animal, and then, on the basis of such observations, a curriculum or a program for teaching or reproducing the motions was formed. However, because such observations are subjective, the results may not be reliable. Furthermore, since the development of a new sport motion or skill is performed by trial and error, it can be very time consuming.
There has recently been proposed a technique wherein the motions of a human being or an animal are analyzed using a computer. See D. I. Miller, "Computer Simulation of Human Motion", in D. W. Grieve et al. (eds.), Techniques for the Analysis of Human Motion, Lepus Books, London (1975); Calvert, T. W. et al., "Composition of Multiple Figure Sequences for Dance and Animation", Proceedings CG International '89, pp. 245-254 (1989); and Nigel, W. J. et al., "Some Methods to Choreograph and Implement Motion in Computer Animation", Proceedings of Computer Animation '89, pp. 125-140 (1989). In analyzing motion using a computer, a motion analyst, such as a teacher or a trainer, analyzes the motions according to his understanding. The teacher or trainer can then use the computer analysis to teach an unskilled person or animal the motion being studied, so that such person or animal can perform the skill involved. Such analysis can also be used to develop new motions. Such computer analysis utilizes kinematics, which describes the motions only in terms of positions, velocities and accelerations, neglecting the forces and torques responsible. Thus, kinematic analysis can only generate a two-dimensional line picture representing the parts constituting a human or animal body, and a three-dimensional model of the human or animal body cannot be displayed realistically on a screen. Accordingly, the display is difficult to understand, and it is also difficult to develop a new motion using the display.
Furthermore, while computer analysis of motions for developing new motions and new skills preferably utilizes a method having real-time response, conventional computer analysis methods have no real-time response because they require ascertaining the content of the actual motions and making fine adjustments based on the results.
Another method of motion analysis uses dynamics. While kinematics provides motion analysis in terms of positions, velocities and accelerations, dynamics analyzes the motions of objects based on the relation between movements and forces. If dynamics is applied to the computer analysis of actual motions, it is possible to generate complex behavioral or animated results with minimal control by the analyst. Furthermore, an animation method utilizing dynamics has the great advantage of avoiding the limitations of methods utilizing kinematics.
However, the computer analysis of motions utilizing dynamics requires data on dynamics parameters such as the moments of inertia, the centers of gravity, joint friction and muscle/ligament elasticity of the moving body being represented by the animation, which parameters are difficult to measure. Without such data, the computer analysis of motions based on dynamics produces unrealistic motions similar to the results of kinematic motion analysis. Furthermore, dynamics motion analysis requires the solution of rather complex dynamics equations. For a human body, an articulated body with 200 degrees of freedom, it is necessary to solve 600 simultaneous differential equations.
Thus, motion analysis methods which have been proposed thus far are not well suited for representing complex motions. Therefore, there is no motion analysis method capable of representing all the motions of a human or animal.
Conventional computer motion analysis methods utilizing dynamics to represent the movements of a human or animal body, for example, involve the following steps:
(1) Constructing a model of the human or animal body;
(2) Applying the actual motions of a human or animal to the model;
(3) Analyzing the motions of the model; and
(4) Reproducing the analyzed motions.
In the fourth step (reproducing the analyzed motions), a method which can exactly solve the dynamics equations using the Gibbs formula is particularly suitable. However, this method has not been used in actual motion analysis systems because of its conceptual and computational complexity: when n is the number of segments constituting the model and forming minimal units of motion, the computational complexity O(f(n)) grows as O(n⁴), and is thus very large. Dynamical simulation methods such as that disclosed in Wilhelms, J. P. et al., "Using Dynamic Analysis to Animate Articulated Bodies such as Humans and Robots", Proceedings, Graphics Interface '85, pp. 97-104 (1985), have therefore not been accepted, mostly because of their conceptual complexity and computational expense.
A motion analyzing method which reduces the computational complexity to O(n) by neglecting rotations about the principal axes has been proposed. Armstrong, W. W. et al., "The Dynamics of Articulated Rigid Bodies for Purposes of Animation", The Visual Computer, 1:231-240 (1985). However, when the rotations of joints about the principal axes cannot be neglected, this method is not applicable. Thus, with this method the forces that produce a motion, or the forces that are produced by a motion, cannot be displayed exactly, and a realistic three-dimensional model of an articulated body cannot be obtained.
In a publication from the department of the present inventors, Sun, L. et al., "An Architectural Design of a Sports Instruction System", First International Conference and Exhibition on Visual Computing for Defense and Government (VISUDA '89), Paris, France (1989) (Technical Report 89-017), a system for inputting and analyzing human body movement is disclosed. The human body is modeled as a set of articulated segments, each of which is essentially a rigid solid. In this model, the human body is represented by 50 segments. The system takes in a set of measurements taken from an actual human body and converts it into size data for each segment, e.g., length, width and diameter. When displaying the model, the system approximates each segment with a polygon and performs smooth shading for better appearance. In addition, the system calculates the center of gravity and the mass of each segment, which will be required in the analysis. A video recorder, preferably several video recorders recording the movement from a plurality of points of view, records the motion of actual humans in a time sequence of video frames. The position of objects in each frame is input to the system to reproduce and analyze the movement. In order to define the position and configuration of the body in space, the position and orientation of each segment are specified. Therefore, for each segment there are six degrees of freedom, three for the position and three for the orientation. Each segment in an articulated body, however, is restricted by attachments to neighboring segments. Therefore, it suffices to specify the position of only one segment (orientation must be specified for all segments), thereby reducing the degrees of freedom considerably. The user specifies the position of each body segment by manipulating the displayed figure using a mouse; the user picks a segment and translates and/or rotates it so that the displayed figure assumes the position of the figure on the videotape. The image of a video frame can be superimposed on each window to facilitate this process.
The body movement which has been input into the computer is then analyzed considering two factors: the movement of the center of gravity and the force exerted on each segment of the body. The position vector G of the center of gravity of a human body is computed by the following formulation:

$$G = \frac{\sum_i \Delta m_i\, g_i}{\sum_i \Delta m_i}$$

where g_i is the position vector of the center of gravity of a segment i and Δm_i is its mass. Since each segment is assumed to be a uniform rigid body, g_i and Δm_i can be obtained prior to this calculation from the measurement data.
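As a concrete illustration of this formulation, the short Python sketch below computes G as the mass-weighted average of the segment centers of gravity. The segment masses and positions are invented numbers, not measurement data from the patent.

    # G = sum(dm_i * g_i) / sum(dm_i), with illustrative segment data:
    # (mass dm_i in kg, center of gravity g_i as an (x, y, z) vector in m)
    segments = [
        (4.5, (0.00, 1.60, 0.05)),   # head
        (30.0, (0.00, 1.20, 0.00)),  # trunk
        (2.1, (0.25, 1.30, 0.00)),   # right upper arm
        (9.5, (0.10, 0.70, 0.00)),   # right thigh
    ]

    total_mass = sum(dm for dm, _ in segments)
    G = tuple(sum(dm * g[k] for dm, g in segments) / total_mass
              for k in range(3))
    print(total_mass, G)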
The center of gravity of each segment is computed in a way similar to the above formulation. Each segment is divided into small wedge-shaped volumes around and along the central axis through the segment, and these volumes are summed up to find the position of the center of gravity.
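The same idea for a single segment can be sketched as follows, assuming for illustration that the segment is a solid of revolution with a known radius profile along its central axis; thin slices stand in for the wedge-shaped volumes of the text.

    import math

    # Center of gravity of one segment by slicing along its central axis
    # and summing the small volumes. Illustrative forearm-like profile.
    def radius(u):                  # u runs from 0 (elbow) to 1 (wrist)
        return 0.045 - 0.015 * u

    N, LENGTH = 1000, 0.26          # number of slices; segment length in metres
    ds = LENGTH / N
    moment = volume = 0.0
    for i in range(N):
        s = (i + 0.5) * ds          # midpoint of the slice on the axis
        r = radius(s / LENGTH)
        dv = math.pi * r * r * ds   # slice volume (uniform density)
        volume += dv
        moment += s * dv            # first moment about the proximal end
    print("center of gravity at", round(moment / volume, 4), "m along the axis")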
The problem of computing the force exerted on a human body from its movement is a problem of inverse dynamics. One starts with the sequence of positions and obtains the velocity and the acceleration of each part of the body, from which the force that has caused such motion is to be calculated.
Solving a problem of dynamics is essentially solving differential equations which describe the relationship between a mass and the force and torque applied to it. A variety of formulations are available for describing the dynamics equations. They all produce the same result, expressed in slightly different terms. One of the most familiar formulations is Lagrange's equation:

$$\frac{d}{dt}\left(\frac{\partial L}{\partial \dot q_i}\right) - \frac{\partial L}{\partial q_i} + \lambda \frac{\partial f}{\partial q_i} = Q_i$$

where L is the Lagrangian given by kinetic energy (T) minus potential energy (P), q_i is the ith coordinate, f is a constraint equation, λ is an undetermined multiplier, and Q_i is the force or torque applied to the ith coordinate.
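As a sanity check of this formulation (an added worked example, not taken from the patent), consider a single unconstrained coordinate: a point mass m at height y under gravity, with no constraint term, so the λ term drops out:

$$L = T - P = \tfrac{1}{2} m \dot y^{2} - m g y, \qquad \frac{d}{dt}\frac{\partial L}{\partial \dot y} - \frac{\partial L}{\partial y} = m\ddot y + m g = Q_y$$

The applied force Q_y must supply both the inertial term and the weight, which is Newton's second law in disguise.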
In a general animation system, the dynamics approach is not yet practical due to the high computational complexity and cost required. The articulated body of the Sun publication model, consisting of 50 segments, requires 300 (=50×6) simultaneous equations to describe it. However, the use of known forces, such as the reactive force between a foot and a floor, simplifies the situation. For each segment, the kinetic energy and the rigid-length constraint take the form

$$T = \tfrac{1}{2} m V_g^{2} + \tfrac{1}{2}\,\omega_g^{\mathsf T} I\,\omega_g, \qquad V_g = \tfrac{1}{2}(\dot X_a + \dot X_b), \qquad f = \lVert X_a - X_b \rVert^{2} - l^{2} = 0$$

where m is the mass of the segment, V_g is the velocity vector of the center of gravity, X_a and X_b are the position vectors of its two ends a and b, Ẋ_a and Ẋ_b are the corresponding velocity vectors, I is the inertia tensor, ω_g is the angular velocity vector around the center of gravity, and l is the distance between the two ends.
Sun proposes letting the left side of the above Lagrange's equation be D. Then

$$D(X_a, X_b, \dot X_a, \dot X_b, \lambda, F_a, F_b) = 0$$
where F_a and F_b are the forces exerted on a and b, respectively. Each segment has six degrees of freedom, and there are six Lagrange's equations describing it. In the above equations D(X_a, X_b, Ẋ_a, Ẋ_b, λ, F_a, F_b), the three components of F_a (x, y, z), the three components of F_b (x, y, z), and λ are unknown. Therefore, there are seven unknown variables in total in the system of six equations. If the force in one direction can be obtained through some means (e.g., by measurement), λ and the five other force components can be computed by solving these equations.
Sun then supposes that a segment i is connected at an end b_i to an end a_j of another segment j. By Newton's Third Law of Motion, the force exerted on the segment i at b_i is equal and opposite to the force exerted on the segment j at a_j. By solving the equations for the segment i, one therefore also obtains a partial solution (three of the seven unknown variables) for the segment j. By repeating this process, solutions can be found for all segments making up the body.
As the starting point of the series of calculations, Sun suggests using the force exerted on a foot by the floor, which is measured in advance, to calculate the force exerted on the joint between the foot and the shin. One then works upward to the head and lower arms through the shins, thighs, hips, waist and torso. In the end, the force exerted on each segment at its ends has been calculated. Note that the force calculated here is the sum of internal and external forces. Internal force is exerted by muscles on attached joints. For segments on which no external force is exerted, the result equals just the internal force; conversely, for segments with no internal force, the result equals the external force.
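The bookkeeping of this upward pass can be sketched in a few lines of Python. This is a deliberately simplified point-mass version (vertical forces only, no torques) with invented masses and accelerations, so the equal-and-opposite hand-off of Newton's third law stands out.

    # Walking up the chain from the foot: given the measured floor reaction,
    # Newton's second law on each segment gives the force at its upper joint,
    # and Newton's third law hands the reaction to the next segment up.
    G = 9.8
    # (name, mass in kg, vertical acceleration of the segment's CG in m/s^2)
    chain = [("foot", 1.0, 0.0), ("shin", 3.5, 0.3), ("thigh", 9.0, 0.5)]
    floor_reaction = 140.0          # measured force from the floor, N (upward)

    f_below = floor_reaction        # force on the current segment from below
    for name, m, a in chain:
        # m*a = f_below + f_above - m*G, so the force from the joint above is:
        f_above = m * a + m * G - f_below
        print(f"{name}: force from segment above = {f_above:+.1f} N")
        f_below = -f_above          # Newton's third law, passed up the chain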
While this system of inputting and analyzing human body movement is useful for computerizing the dynamics of videotaped human movement, it does not solve the problem of how to produce practical animation works suitable for motion analysis and skill training, in a manner which simulates the motion of a human body as closely as possible, with a real time response, without relying on trial and error or an unworkable magnitude of calculations.
SUMMARY OF THE INVENTION
It is an object of this invention to provide a motion analysis method, utilizing dynamics analysis, which can, in an interactive manner, analyze and display the actual motions of a human being or animal in an animated form on a computer screen, without requiring trial and error or the intuition of an analyst.
It is another object of the present invention to provide a motion analysis method which can generate a realistic three-dimensional modeling picture.
It is yet another object of the present invention to provide a motion analysis method which can represent all the motions of a human or animal body.
It is still a further object of the present invention to provide a motion analyzing method for use in developing new skills and sport motions in an interactive manner.
In order to achieve the objects of the present invention, basic motions of a human being or animal body are analyzed to generate data on dynamics parameters including the forces and torques exerted on joints of the body, and this data is stored in a database as knowledge regarding the basic motions. Photographic film or video films of various human or animal motions viewed from various directions are analyzed using the data in the database. The forces and torques exerted on the joints of the human or animal body in these motions are calculated and displayed.
In order to develop a new motion, an analyst or trainer accesses the database and modifies the data. A computer provides the analyst or trainer with feedback on the result of constraints in terms of constrained motions and the result of inverse dynamics in terms of forces. The analyst or trainer can design new motions in an interactive manner by repeating the above processes until satisfactory results are obtained.
The detailed and realistic three-dimensional motions displayed on the computer screen can be manipulated so as to be viewed from any direction and used to train the student, whether an athlete, a dancer, an industrial worker, an animal, etc., to imitate and reproduce the motion being shown on the screen. As the new motions which have been developed on the computer are all based on the actual constraints of the human or animal joints, this is an excellent teaching device.
The computational complexity of the method of the present invention is a function O(n), wherein n is the number of segments, so the computational complexity, and thus the time required for such computation, is much less than with conventional motion analysis and animation methods. Furthermore, the present invention can illustrate the motions of human or animal bodies by smooth three-dimensional modeling pictures without requiring trial and error or the intuition of the programmer.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a flow chart of a motion analysis embodiment according to this invention.
FIG. 2 is a schematic view showing structure of a model of a human body.
FIGS. 3(a) to 3(c) are control graphs showing examples of the forces exerted on a segment joint of a human body.
FIGS. 4(a) and 4(b) are views showing one example of a result of motion analysis according to this invention displayed on a screen.
FIG. 5 is a view showing another example of a result of motion analysis according to this invention displayed on a screen.
FIGS. 6(a) and 6(b) are views showing the step of application of dynamics in which body segments are identified and each segment is calculated independently, neglecting joint constraints.
FIGS. 7(a) and 7(b) are views showing the results of the step of checking and enforcing constraints, FIG. 7(a) being before the application of constraints and FIG. 7(b) being thereafter.
FIGS. 8(a) and 8(b) are views similar to those of FIGS. 6(a) and 6(b) when used for analyzing and teaching sports motions, such as a golf swing.
FIGS. 9(a) and 9(b) are views similar to those of FIGS. 7(a) and 7(b) when used for teaching and analyzing a golf swing.
FIGS. 10(a) and 10(b) are views similar to those of FIGS. 6(a) and 6(b) and those of FIGS. 8(a) and 8(b), when used for training an animal.
FIGS. 11(a) and 11(b) are views similar to those of FIGS. 7(a) and 7(b) and those of FIGS. 9(a) and 9(b), when used for training an animal.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
A preferred embodiment of the present invention will now be described with reference to the drawings.
FIG. 1 shows a flow chart of one embodiment according to this invention, including following steps:
1. Constructing a model of a human body;
2. Maintaining a database;
3. Inputting actual movements of a human being;
4. Analyzing the movements that have been input;
5. Displaying the results;
6. Inputting the results into the database;
7. Designing new motions;
8. Applying dynamics, neglecting physical constraints;
9. Applying physical constraints;
10. Applying inverse dynamics;
11. Displaying the result;
12. Maintaining the result in a database.
In the first step (constructing a model), the human body is divided into a plurality of segments connected by body joints, each of the segments acting as a minimal unit of motion, as shown in FIG. 2. A human body model then is constructed on the basis of the inherent nature of each of the segments and physical constraints such as the inter-relationship of each segment and the range of movements of the joints connecting the segments.
As shown in FIG. 2, the human body is roughly divided into a head 1, a trunk 2 and four limbs, each part being connected to the trunk 2 at the center. The upper limbs, comprising the right and left arms, consist of upper arms 3, 4, forearms 5, 6, and hands, respectively, each hand having five fingers made up of a plurality of segments connected by joints (not shown). The lower limbs, comprising the right and left legs, likewise consist of thighs 7, 8 and lower legs 9, 10.
In this figure, the bones constructing the human body appear as lines, and the joints by which the bones are connected to each other appear as small circles. As shown in the figure, the bones described by lines are minimal units of motion which will not be divided further, and the joints described by small circles determine the mutual relationship of the bones.
The nature of the joints and bones making up the human body, that is, the interconnecting relationships among the segments and the range of movement of each joint, constitutes the physical constraints. Data defining the model is stored in a database on the computer.
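One way to hold such a model in the database can be sketched as follows. The schema, field names and sample rows are assumptions for illustration, not the patent's actual data layout.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class SegmentRecord:
        """One minimal unit of motion and its physical constraints."""
        name: str
        parent: Optional[str]      # the segment it attaches to (None for the root)
        length_m: float
        mass_kg: float
        joint_range_deg: tuple     # allowed rotation at the connecting joint

    MODEL_DB = {
        "trunk":       SegmentRecord("trunk", None, 0.60, 30.0, (0, 0)),
        "rt_upperarm": SegmentRecord("rt_upperarm", "trunk", 0.30, 2.3, (-90, 180)),
        "rt_forearm":  SegmentRecord("rt_forearm", "rt_upperarm", 0.26, 1.5, (0, 150)),
    }

    # The joint ranges are the constants of a given body referred to above.
    print(MODEL_DB["rt_forearm"].joint_range_deg)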
In the third step (inputting actual movements), a film or video is taken of the actual motions of a human body, and for each frame, the positions of the body parts are quantified and input to the computer for the modeling. This data is applied to the model, and the computer calculates the position, velocity and acceleration of each segment of the model. When the human body is filmed simultaneously from a plurality of directions, the analysis in the next step can be executed more concretely. This is all as described in Sun et al. (supra).
In the fourth step (analyzing the movements that have been input), the motion data input in the third step are analyzed using inverse dynamics, based on Lagrange's equations, which describe the relationship between forces and movements. The center of gravity of each of the body segments, the force and torque exerted on each joint, the position of the center of gravity of the whole body, and the forces and torques exerted on the centers of gravity are all calculated and then input into the database.
In the fifth step (displaying the result), the center of gravity of each of the body segments, the force and torque exerted on each joint, the position of the center of gravity of the whole body, and the forces and torques exerted on the centers of gravity, resulting from the fourth step, are displayed by symbols such as arrows on a screen superimposed on a display of the human body model.
In FIG. 4(a), a cross symbol 11, showing the center of gravity of the whole body, and arrow symbols 12-17, showing the vectors of the forces exerted on the center of gravity of each segment, are displayed superimposed on the human body model of the database as the model moves from the pose of FIG. 4(a) to that of FIG. 4(b). This display shows the forces exerted on the right forearm segment 5 as vector 12, on the left upperarm segment 4 as vector 13, on the right thigh segment 7 as vector 14, on the right leg segment 9 as vector 15, on the left thigh segment 8 as vector 16, and on the left leg segment 10 as vector 17.
FIG. 5 shows a display of an example of a skill performed by two persons in Shorinji Kempo, in which the positions of the centers of gravity of the bodies and the forces exerted are superimposed on the model of the filmed human bodies. In this figure, the person MR on the right side seizes the right wrist of the person ML on the left side and applies the force shown by arrow FR to the right arm of ML, thereby producing the force shown by arrow FL at the center of gravity GL of the whole body of ML. The display thus shows the motions concretely enough to be easily understood by anyone.
According to this invention, the motions to be analyzed are calculated on the basis of the physical constraints and the inherent nature of each segment, acting as the minimal unit of movement of a human or animal body, which have been put into a database; the motion and center of gravity of each of the segments, the force and torque exerted on each joint, the motion and center of gravity of the whole body, and the forces and torques exerted on the centers of gravity are thereby obtained. The results are therefore reasonable and realistic, and the motions are easily understood from the arrows showing the directions of movements or forces on the human body model on the screen.
In these steps, the motion analysis of this invention is carried out.
In the sixth step (inputting into the database), the resulting analysis is input into the database to be used further for developing new motions. One way of quantitatively representing the motions is by means of control graphs showing the forces acting on one of the joints of the model as a function of time. FIG. 3 is an example of a control graph of the forces acting on the left elbow of a golfer in the directions of x, y, and z orthogonal axes as a function of time. The data constituting the control graphs are determined from the motions of the model obtained after applying actual motions to the model and analyzing the results, and are stored in the database. Since the two forces exerted on any given joint are equal in magnitude and opposite in direction, this constraint must be maintained by the system. A complicated motion is represented by a plurality of graphs. For example, the control graphs of a human being standing up from a chair and walking constitute a continuous composite motion. Each control graph for active segments is designed in the same manner as for the illustrated control graphs for the left elbow.
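A control graph is, in effect, a sampled force-versus-time curve for one joint along one axis. The Python sketch below, with invented sample values, shows one plausible way to store such a curve and read a force off it; it is an illustration, not the patent's data structure.

    import bisect

    class ControlGraph:
        """Force on one joint, along one axis, sampled over time."""
        def __init__(self, times, forces):
            self.times, self.forces = list(times), list(forces)

        def force_at(self, t):
            # Linear interpolation between samples.
            i = bisect.bisect_left(self.times, t)
            if i == 0:
                return self.forces[0]
            if i == len(self.times):
                return self.forces[-1]
            t0, t1 = self.times[i - 1], self.times[i]
            f0, f1 = self.forces[i - 1], self.forces[i]
            return f0 + (f1 - f0) * (t - t0) / (t1 - t0)

    # x-axis force on the left elbow during a swing (made-up values, N).
    left_elbow_x = ControlGraph([0.0, 0.2, 0.4, 0.6], [0.0, 35.0, -20.0, 0.0])
    print(left_elbow_x.force_at(0.3))   # -> 7.5

    # The paired force on the mating segment must be kept equal in
    # magnitude and opposite in direction by the system:
    mate_force = -left_elbow_x.force_at(0.3)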
In the case of Shorinji Kempo, a Japanese martial art, any kind of motion is based on 12 basic control graphs.
On the basis of the data input into the database, and particularly the control graphs showing all of the forces on each joint over time as the model goes through a particular motion or activity, it is possible to design new motions (the seventh step of the flow chart). Thus, a new motion may be created starting from the information already available in the database.
Motions are designed at two levels: changing the speed of the motions and the forces causing the motions, and composing a complicated motion from the basic motions.
At first, the user chooses the basic motions from the database. Then, the dynamics parameters of the motions chosen are displayed using a two-dimensional control graph, where the abscissa represents time and the ordinate represents the force exerted on each joint of the body (FIG. 2). Since the two forces exerted on the same joint must be equal in magnitude and opposite in direction, this constraint must be maintained by the system. The user may be provided with various facilities for modifying these physical parameters using a mouse.
To facilitate the modification, two control modes are supported (a sketch illustrating both follows this list):
1. Global modification: the same change of the forces is applied to all of the body parts on which the forces are exerted. This modification can involve scaling the x- or y-axis up or down.
2. Local modification: only the force exerted on a specific part of the body is changed.
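The two modes can be pictured as simple operations on the stored control graphs. The sketch below uses hypothetical functions, continuing the ControlGraph example above, with graphs assumed to be a dict mapping joint names to ControlGraph objects:

    def modify_globally(graphs, time_scale=1.0, force_scale=1.0):
        """Global modification: rescale the control graph of every joint,
        stretching the time axis (x) and/or the force axis (y)."""
        for g in graphs.values():
            g.times = [t * time_scale for t in g.times]
            g.forces = [tuple(c * force_scale for c in f) for f in g.forces]

    def modify_locally(graphs, joint, force_scale):
        """Local modification: rescale only the named joint's forces."""
        g = graphs[joint]
        g.forces = [tuple(c * force_scale for c in f) for f in g.forces]

In an interactive system the same operations would be driven from mouse edits of the displayed curves rather than from numeric scale factors.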
Besides modifying the motion itself, the user can also modify the acceleration due to gravity, which is necessary when simulating motion away from the earth.
The joint limits restrict the range of the joint movements, because the joints of a real human body cannot rotate arbitrarily in all directions. The limits are constants of a given human body and have been put into the database.
In the eighth step (application of dynamics), the motion of each body segment is calculated on the basis of the forces corresponding to the basic motions selected by the animator and the dynamics equations governing the movement of each segment. Although the segments of the body are actually connected with one another by joints, as shown in FIG. 2, in order to simplify the calculations it is assumed for the moment that each body segment is independent of the other segments, and the constraints on the articulation of the human body and on the range of movement of its joints are neglected.
Dynamics calculates the path of each body segment independently of the rest, based on the forces specified by the user and the dynamics equations governing the segment movements. In this calculation, each body segment is first treated as a separate rigid body (FIGS. 6(a) and 6(b), 8(a) and 8(b), and 10(a) and 10(b)), and the articulation of the body and the constraints on the range of the movements of the joints are neglected for the present. The idea is to reduce the amount of computation. The calculated motions of every segment are passed to the next step, where the physical constraints of the body are checked and enforced.
For each segment, the Newton-Euler formulation is used to calculate the motion. Newton's equation is used to derive the linear acceleration of the center of gravity as follows:
$$F_x = m\ddot{\rho}_x$$
$$F_y = m(\ddot{\rho}_y - g)$$
$$F_z = m\ddot{\rho}_z$$
where
F is the force exerted on the segment,
m is the mass of the segment,
ρ is the position of the center of gravity of the segment, and
g is the acceleration of gravity.
Euler's equations are used to derive the angular acceleration of the segment about its center of gravity as follows:
$$N_x = I_x\dot{\omega}_x + (I_z - I_y)\,\omega_y\omega_z$$
$$N_y = I_y\dot{\omega}_y + (I_x - I_z)\,\omega_z\omega_x$$
$$N_z = I_z\dot{\omega}_z + (I_y - I_x)\,\omega_x\omega_y$$
where
x,y,z are the directions of the principal axes,
N is the external torque being applied to the segment,
I_x, I_y and I_z are the principal moments of inertia, and
ω is the angular velocity of the segment.
Note that the principal moments of inertia are the diagonal elements of the inertia tensor when it is diagonalized; they are the eigenvalues of the inertia tensor of the segment, and the corresponding principal axes are obtained from the associated eigenvector equations. Thus, before solving Euler's equations, the original coordinates must be transformed to the principal axes. The Euler angles can be used to perform the transformation.
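Taken together, Newton's and Euler's equations make the forward-dynamics step for a single free segment a few lines of arithmetic. The following Python sketch is a minimal illustration, assuming the force and torque are already expressed along the segment's principal axes; the function name and argument layout are hypothetical:

    def segment_accelerations(F, N, m, I, omega, g=-9.81):
        """Newton-Euler forward dynamics for one rigid segment.

        F: external force (Fx, Fy, Fz); N: external torque about the
        principal axes; m: segment mass; I: principal moments (Ix, Iy, Iz);
        omega: angular velocity about the principal axes; g: signed
        acceleration of gravity along y (negative is downward), matching
        the patent's F_y = m(rho_ddot_y - g).
        Returns the linear and angular accelerations of the segment."""
        Fx, Fy, Fz = F
        Nx, Ny, Nz = N
        Ix, Iy, Iz = I
        wx, wy, wz = omega

        # Newton's equation solved for the acceleration of the center of gravity.
        linear = (Fx / m, Fy / m + g, Fz / m)

        # Euler's equations solved for the angular acceleration.
        angular = ((Nx - (Iz - Iy) * wy * wz) / Ix,
                   (Ny - (Ix - Iz) * wz * wx) / Iy,
                   (Nz - (Iy - Ix) * wx * wy) / Iz)
        return linear, angular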
Once new linear and angular accelerations are obtained, they must be integrated to find new velocities and integrated again to find new positions.
The Euler method is the simplest way to do this:

$$v(t + \delta t) = v(t) + a(t)\,\delta t$$
$$p(t + \delta t) = p(t) + v(t)\,\delta t$$

where

a is the acceleration,
v is the velocity,
p is the position,
t is the time, and
δt is the time period.
This method assumes that the acceleration is constant over the time period, so inaccurate results are obtained when the time period is large or the accelerations are changing rapidly. More sophisticated methods may be found in standard references on numerical integration (Press, W. H. et al., "Numerical Recipes", Cambridge Univ. Press, Cambridge, England (1986)).
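A minimal sketch of one such integration step, written to match the Euler update above (the function name is hypothetical):

    def euler_step(p, v, a, dt):
        """Advance position p and velocity v by one forward-Euler step,
        assuming the acceleration a is constant over the period dt."""
        p_new = tuple(pi + vi * dt for pi, vi in zip(p, v))
        v_new = tuple(vi + ai * dt for vi, ai in zip(v, a))
        return p_new, v_new

Repeating this step for each segment, with the accelerations recomputed from the Newton-Euler equations at every step, yields the unconstrained segment trajectories that are handed to the constraint step.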
In the ninth step (application of constraints), the articulation of the human body and the range of the movements of the body joints are checked for each of the motions calculated in the eighth step (FIGS. 7(a) and 7(b), 9(a) and 9(b), and 11(a) and 11(b)). The process of applying constraints starts at a segment referred to as the root element, and the position and the orientation of each segment in a subclass of the root element are checked sequentially. Two types of checks are performed: whether each subclass element remains connected to its superclass element, and whether the movement of each joint exceeds its specified range. If a subclass element is not connected to its superclass element, as shown in FIGS. 7(a), 9(a) and 11(a), the subclass element is translated until it becomes connected to its superclass element. If the movement of any joint exceeds the specified range, the movement of the joint is adjusted to be within the range by rotation of the corresponding element, thus modifying the positions of the segments to obtain a posture as shown in FIGS. 7(b), 9(b) and 11(b).
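The following Python sketch illustrates the two checks under strong simplifying assumptions (segments reduced to dicts with 'proximal' and 'distal' endpoints and a single scalar joint 'angle'; the data layout is hypothetical, and the rotation of the distal end that would accompany the angle clamp is omitted for brevity):

    def clamp(x, lo, hi):
        return max(lo, min(hi, x))

    def apply_constraints(segments, parent_of, joint_limits):
        """Sequentially re-attach each subclass segment to its superclass
        segment and clamp each joint into its stored range.  Assumes the
        dict `segments` is ordered from the root segment outward."""
        for name, seg in segments.items():
            parent = parent_of.get(name)
            if parent is None:
                continue                      # the root element anchors the body
            # Check 1: connectivity -- translate the subclass segment so that
            # its proximal end coincides with its superclass's distal end.
            dx = tuple(p - q for p, q in zip(segments[parent]['distal'],
                                             seg['proximal']))
            for key in ('proximal', 'distal'):
                seg[key] = tuple(c + d for c, d in zip(seg[key], dx))
            # Check 2: joint range -- clamp the joint movement into the
            # limits stored in the database for this joint.
            lo, hi = joint_limits[name]
            seg['angle'] = clamp(seg['angle'], lo, hi)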
In the tenth step (application of inverse dynamics), Lagrange's equations, which describe the relationship between forces and movements, are used to calculate the forces exerted on each joint of the body. The Sun paper (supra) proposes a linear recursive algorithm for performing inverse dynamics with computational complexity O(n), where n is the number of body segments.
Note that inverse dynamics can provide a reasonable and integrated set of forces for dynamics to produce the motions which have been adjusted due to the constraints. Without inverse dynamics, the user could not be expected to find an integrated force specification. In the method of the present invention, the orientation of each body segment is changed when the related joint limit is exceeded, and the position of each body segment is adjusted to the physical constraints of the body.
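To convey the flavor of such a linear-time recursion (this is a simplified, force-only sketch, not the Sun algorithm itself, and the torque terms are omitted): each joint must transmit the force needed to accelerate its segment against gravity plus the reactions of all segments distal to it, which a single root-ward pass accumulates in O(n):

    def joint_forces(segments, children_of, g=(0.0, -9.81, 0.0)):
        """Force-only sketch of a recursive inverse-dynamics pass.
        segments[name] = {'m': mass, 'a': acceleration of the segment's
        center of gravity}; children_of maps a segment name to the names
        of its subclass segments.  Visits each segment once: O(n)."""
        forces = {}

        def visit(name):
            seg = segments[name]
            # Newton: force transmitted at the proximal joint.
            f = [seg['m'] * (ai - gi) for ai, gi in zip(seg['a'], g)]
            for child in children_of.get(name, ()):
                f = [fi + ci for fi, ci in zip(f, visit(child))]
            forces[name] = tuple(f)
            return forces[name]

        visit('root')          # assumes a segment named 'root' exists
        return forces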
If the desired results are not obtained at first, the seventh to eleventh steps can be repeated, so that the new motions can be developed in an interactive manner.
In the eleventh step (displaying the result), the relation between movements and forces obtained from the dynamics calculations, namely the center of gravity of each of the body segments, the force and torque exerted on each joint, the center of gravity of the whole body, and the force and torque exerted on the center of gravity of the whole body, can be displayed as symbols such as arrows superimposed over the human body model, as shown in FIG. 4, or over the human body displayed on a screen, in the same manner as in the fifth step (displaying the result), as shown in FIG. 5.
As a matter of course, the results of the calculated movements that have been analyzed using inverse dynamics can also be stored in the database for use in the design of additional complicated motions.
It can thus be seen that the present invention provides dynamics-analysis-based human or animal motion animation using an articulated human or animal body model. The method consists of two stages: analyzing the basic movements of an actual human being or animal, and using the analytic results to produce new motions dynamically. The dynamical simulation proceeds in three steps: dynamics, constraints and inverse dynamics. The dynamics step divides the articulated body into independent rigid segments (such as 50 rigid segments), and the motion of each segment is calculated independently of the motions of the other segments by using the Newton-Euler equations. The idea is to reduce the number of degrees of freedom, and thus the amount of computation required, and to avoid the complexity of dealing with continuum mechanics. In the constraint step, the articulation of the body and the range of movements of the body joints are checked and maintained. A segment which violates its constraints is moved by translations and rotations so that it becomes connected to its super-segment or is bent into a natural posture. In the inverse dynamics step, the forces that produced the new motions, which have been modified due to the constraints, are calculated. In the present invention, since the sequence is executed by a simple linear feedback algorithm, the total computational complexity is O(n), where n is the number of the body segments. The user interacts with the system by specifying the control forces and obtaining the motions produced by the dynamics and the constraints, the forces being modified due to the constraints. The present method solves the computational problem that existed in dynamics and enables dynamics to produce real-time feedback, so that dynamics is now well suited to computer motion analysis work.
By using inverse dynamics, a reasonable and complete combination of forces can be obtained; without inverse dynamics, the analyst could not find the complete design of forces. In the present invention, if the orientation of a body segment is such that the range of movement of either of its joints is exceeded, the orientation of the segment is changed so that the position of the body segment satisfies the physical constraints of the body. Since the motions of the human or animal body thus obtained are natural motions, in which a subclass segment is always connected to its superclass segment and the movement of each joint does not exceed the specified range, such motions can be displayed realistically using a three-dimensional modeling picture.
Furthermore, according to the present invention, it is possible to develop a new motion in an interactive manner using a computer without requiring trial and error or the intuition of the analyst. The method is particularly useful in analyzing and teaching sports motions as shown in FIGS. 8 and 9.
Animation and motion analysis of an animal body can be developed by the method of the present invention in the same manner as described above with respect to a human body, as shown in FIGS. 10-11. Indeed, any articulated system can be analyzed and animated in the same manner.
As mentioned above, the motion analysis method according to the present invention comprises the steps of analyzing the basic motions of an actual human or animal body and developing new motions. The analysis of the basic motions of the human or animal body is achieved in three steps: constructing a model of the human or animal body, applying the actual motions of a human or animal to the model, and analyzing the motions of the segments of the model. The development of new motions is achieved in three steps: application of dynamics, application of constraints, and application of inverse dynamics. In the step of applying dynamics, the body is divided into a plurality of independent body segments (50, for example) connected by joints, and the motion of each body segment is calculated independently of the other segments using Newton's equation of motion and Euler's equations. In the step of applying constraints, the articulation of the body and the range of movement of its joints are checked. In the step of applying inverse dynamics, the forces that generate the new motions, as modified by the constraints, are calculated. Thus, the whole computational complexity becomes O(n).
Accordingly, the method according to the present invention eliminates the computational complexity of conventional methods, permits dynamics to be applied to actual motion analysis, and permits feedback in real time using dynamics.
Furthermore, since the linear acceleration of the center of gravity of each segment is calculated using Newton's equation of motion and the angular acceleration is calculated using Euler's equations, it is possible to determine and display not only the position of, and the force exerted on, the center of gravity of each segment, but also the position of, and the force exerted on, the center of gravity of the whole body.
It is possible according to the present invention to realistically display the motions of a human or animal body using smooth three-dimensional modeling pictures rather than line drawings. In addition, an analyst can look at the human or animal body model on the screen from various directions, and can translate or rotate various segments of the body in an interactive manner. Thus, the analyst can ascertain the relationship between the picture and the human or animal body model correctly.
In conventional motion analysis methods, basic data on the motions of the human or animal body and the constraints that define the range of movements of individual joints is obtained using the intuition of the analyst. In contrast, in the motion analysis method according to the present invention, actual dynamics parameters are obtained by analyzing the actual motions of the human or animal body. Accordingly, the motions derived from these parameters are reliable and provide realistic motions.
An object-oriented paradigm has recently been used in a number of areas. As a user interface, the object-oriented philosophy leads to a direct manipulation paradigm. In this direct manipulation paradigm, since the images displayed on a screen correspond to objects, by using the motion analysis method according to the present invention, the objects can be manipulated directly in the space of the objects.
The above-described processes for analyzing motions and developing new motions according to the present invention can be applied not only to the study of motions in sports or the performing arts, and to the training of animals, but also to the development of animation and to the programming of industrial equipment such as robot control systems and numerical control systems.
All references cited herein, including journal articles or abstracts, published or corresponding U.S. or foreign patent applications, issued U.S. or foreign patents, or any other references, are entirely incorporated by reference herein, including all data, tables, figures, and text presented in the cited references. Additionally, the entire contents of the references cited within the references cited herein are also entirely incorporated by reference.
The foregoing description of the specific embodiments will so fully reveal the general nature of the invention that others can, by applying knowledge within the skill of the art (including the contents of the references cited herein), readily modify and/or adapt for various applications such specific embodiments, without undue experimentation, without departing from the general concept of the present invention. Therefore, such adaptations and modifications are intended to be within the meaning and range of equivalents of the disclosed embodiments, based on the teaching and guidance presented herein. It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by the skilled artisan in light of the teachings and guidance presented herein in combination with the knowledge of one of ordinary skill in the art.

Claims (2)

What is claimed is:
1. A motion analyzing method using a computer, comprising the steps of:
a) maintaining, in a database, data for modeling a moving body, which body has been divided into a plurality of segments, each of said segments connected by joints and each of said segments acting as a minimal unit of motion, said data relating to an inherent nature of each of said segments and physical constraints including inter-relationship of each segment and range of movements of said joints;
b) observing actual movements of a moving body, including a position of each said segment at predetermined intervals in the course of such actual movement;
c) analyzing the data as to the position of each said segment as observed during actual movements and calculating by inverse dynamics centers of gravity of each of said segments and of the whole body, force and torque exerted on each of said centers of gravity and force and torque exerted on each of said joints;
d) inputting the calculated results from said calculating step into said database;
e) determining a new motion to be designed;
f) calculating by dynamics, and neglecting physical constraints, motion of each body segment, independently of the remaining body segments, based on forces corresponding to the new motion selected in said determining step using the data maintained in said database for previously analyzed motions and dynamic equations governing movement of each segment;
g) applying said physical constraints as stored in said database to check that each segment is articulated to the adjacent segment and that the movement of each joint does not exceed the range specified by said physical constraints, and adjusting the position of each segment until each of said physical constraints is met; and
h) displaying on a screen the resulting motion of the moving body as designed by said steps of calculating by dynamics and applying physical constraints.
2. A motion analyzing method in accordance with claim 1, further including, after said step g), the steps of:
i) analyzing the data as to the position of each said segment as a result of said applying and adjusting step g) and calculating by inverse dynamics centers of gravity of each of said segments and of the whole body, force and torque exerted on each of said centers of gravity and force and torque exerted on each of said joints, to thereby provide a reasonable and integrated set of forces for dynamics to produce the motions which have been adjusted due to the physical constraints; and
j) inputting into said database the calculated results from said analyzing and calculating step (i).
US08/182,545 1990-12-25 1994-01-18 Computer-implemented motion analysis method using dynamics Expired - Lifetime US5625577A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US08/182,545 US5625577A (en) 1990-12-25 1994-01-18 Computer-implemented motion analysis method using dynamics

Applications Claiming Priority (13)

Application Number Priority Date Filing Date Title
JP2418249A JPH04270372A (en) 1990-12-25 1990-12-25 Skill developing method
JP2418250A JPH07313648A (en) 1990-12-25 1990-12-25 Exercise motion development
JP2-418253 1990-12-25
JP2-418251 1990-12-25
JP2418253A JPH04271734A (en) 1990-12-25 1990-12-25 Method for training animal
JP2-418249 1990-12-25
JP2-418250 1990-12-25
JP2418251A JPH04279979A (en) 1990-12-25 1990-12-25 Motion analyzing method
US71430691A 1991-06-12 1991-06-12
US71288491A 1991-06-12 1991-06-12
US71430491A 1991-06-12 1991-06-12
US13970393A 1993-10-05 1993-10-05
US08/182,545 US5625577A (en) 1990-12-25 1994-01-18 Computer-implemented motion analysis method using dynamics

Related Parent Applications (4)

Application Number Title Priority Date Filing Date
US71430691A Continuation-In-Part 1990-12-25 1991-06-12
US71430491A Continuation-In-Part 1990-12-25 1991-06-12
US71288491A Continuation-In-Part 1990-12-25 1991-06-12
US13970393A Continuation-In-Part 1990-12-25 1993-10-05

Publications (1)

Publication Number Publication Date
US5625577A true US5625577A (en) 1997-04-29

Family

ID=27573678

Family Applications (1)

Application Number Title Priority Date Filing Date
US08/182,545 Expired - Lifetime US5625577A (en) 1990-12-25 1994-01-18 Computer-implemented motion analysis method using dynamics

Country Status (1)

Country Link
US (1) US5625577A (en)

Cited By (135)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997040471A1 (en) * 1996-04-04 1997-10-30 Katrix, Inc. Limb coordination system for interactive computer animation of articulated characters with blended motion data
US5889532A (en) * 1996-08-02 1999-03-30 Avid Technology, Inc. Control solutions for the resolution plane of inverse kinematic chains
US5890906A (en) * 1995-01-20 1999-04-06 Vincent J. Macri Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment
WO1999067746A1 (en) * 1998-06-24 1999-12-29 Sports Training Technologies, S.L. Method for capturing, analyzing and representing the movement of bodies and objects
US6014150A (en) * 1997-08-01 2000-01-11 Avid Technology, Inc. System and method of defining and employing behaviors for articulated chains
US6050963A (en) * 1998-06-18 2000-04-18 Innovative Sports Training, Inc. System for analyzing the motion of lifting an object
WO2001073689A2 (en) * 2000-03-27 2001-10-04 Massachusetts General Hospital Method and system for viewing kinematic and kinetic information
US6505096B2 (en) * 1996-12-19 2003-01-07 Honda Giken Kogyo Kabushiki Kaisha Posture control system of legged mobile robot
US6506124B1 (en) 2001-12-21 2003-01-14 Callaway Golf Company Method for predicting a golfer's ball striking performance
EP1282079A1 (en) * 2001-02-13 2003-02-05 Sega Corporation Animation creation program
US20030115031A1 (en) * 2001-10-29 2003-06-19 Behzad Dariush Simulation system, method and computer-readable medium for human augmentation devices
US6584377B2 (en) * 2000-05-15 2003-06-24 Sony Corporation Legged robot and method for teaching motions thereof
US20030118979A1 (en) * 2001-11-15 2003-06-26 Axelrod Glen S. Electronic pet book
US20030123728A1 (en) * 2001-12-28 2003-07-03 Koninklijke Philips Electronics N.V. Interactive video installation and method thereof
US6708142B1 (en) 1999-01-14 2004-03-16 University Of Central Florida Automatic motion modeling of rigid bodies using collision detection
US20040054510A1 (en) * 2002-09-18 2004-03-18 Ulrich Raschke System and method for simulating human movement
EP1406215A1 (en) * 2002-10-02 2004-04-07 Université Libre de Bruxelles Method for the modeling of skeletal kinematics
US6738065B1 (en) 1999-08-10 2004-05-18 Oshri Even-Zohar Customizable animation system
US6778866B1 (en) * 2000-03-16 2004-08-17 Ted S. Bettwy Method and apparatus for learning specific body motion
US20040169656A1 (en) * 2002-11-15 2004-09-02 David Piponi Daniele Paolo Method for motion simulation of an articulated figure using animation input
US6793496B2 (en) * 1999-04-15 2004-09-21 General Electric Company Mathematical model and a method and apparatus for utilizing the model
US20040206456A1 (en) * 2000-12-13 2004-10-21 Takamitsu Tadera Plasma processing apparatus
US20040236550A1 (en) * 2002-02-28 2004-11-25 Edic Peter Michael Mathematical model and a method and apparatus for utilizing the model
US20040243261A1 (en) * 2002-11-13 2004-12-02 Brian King System and method for capturing and analyzing tennis player performances and tendencies
US20050096889A1 (en) * 2003-10-29 2005-05-05 Snecma Moteurs Moving a virtual articulated object in a virtual environment while avoiding collisions between the articulated object and the environment
US20050096890A1 (en) * 2003-10-29 2005-05-05 Snecma Moteurs Moving a virtual articulated object in a virtual environment while avoiding internal collisions between the articulated elements of the articulated object
US20050102111A1 (en) * 2002-09-23 2005-05-12 Behzad Dariush Gravity compensation method in a human assist system and a human assist system with gravity compensation control
US20050114073A1 (en) * 2001-12-05 2005-05-26 William Gobush Performance measurement system with quantum dots for object identification
US20050168578A1 (en) * 2004-02-04 2005-08-04 William Gobush One camera stereo system
WO2005082249A2 (en) * 2004-02-26 2005-09-09 K.U. Leuven Research & Development Time-dependent three-dimensional musculo-skeletal modeling based on dynamic surface measurements
US20050209534A1 (en) * 2001-06-29 2005-09-22 Behzad Dariush System and method of predicting novel motion in a serial chain system
US20050209535A1 (en) * 2001-06-29 2005-09-22 Behzad Dariush System and method of estimating joint loads in a three-dimensional system
US20050209536A1 (en) * 2001-06-29 2005-09-22 Behzad Dariush System and method of estimating joint loads using an approach of closed form dynamics
US20050267615A1 (en) * 2004-03-05 2005-12-01 Lavash Bruce W System and method of virtual representation of folds and pleats
US20050264572A1 (en) * 2004-03-05 2005-12-01 Anast John M Virtual prototyping system and method
US20050272512A1 (en) * 2004-06-07 2005-12-08 Laurent Bissonnette Launch monitor
US20050272514A1 (en) * 2004-06-07 2005-12-08 Laurent Bissonnette Launch monitor
US20050272516A1 (en) * 2004-06-07 2005-12-08 William Gobush Launch monitor
US20050278157A1 (en) * 2004-06-15 2005-12-15 Electronic Data Systems Corporation System and method for simulating human movement using profile paths
US20050282645A1 (en) * 2004-06-07 2005-12-22 Laurent Bissonnette Launch monitor
US20060015186A1 (en) * 2002-09-04 2006-01-19 Graham Isaac Cup assembly of an orthopaedic joint prosthesis
US20060046861A1 (en) * 2004-08-31 2006-03-02 Lastowka Eric J Infrared sensing launch monitor
US7012608B1 (en) 2001-08-02 2006-03-14 Iwao Fujisaki Simulation device
US20060099557A1 (en) * 2002-09-30 2006-05-11 Anders Hyltander Device and method for generating a virtual anatomic environment
US20060247904A1 (en) * 2001-06-29 2006-11-02 Behzad Dariush Exoskeleton controller for a human-exoskeleton system
US20060270950A1 (en) * 2001-06-29 2006-11-30 Behzad Dariush Active control of an ankle-foot orthosis
US20060282022A1 (en) * 2001-06-29 2006-12-14 Behzad Dariush Feedback estimation of joint forces and joint movements
US20060286522A1 (en) * 2005-06-17 2006-12-21 Victor Ng-Thow-Hing System and method for activation-driven muscle deformations for existing character motion
US20060293791A1 (en) * 2005-06-10 2006-12-28 Behzad Dariush Regenerative actuation in motion control
WO2007076118A2 (en) * 2005-12-22 2007-07-05 Honda Motor Co., Ltd. Reconstruction, retargetting, tracking, and estimation of motion for articulated systems
WO2007076119A2 (en) * 2005-12-22 2007-07-05 Honda Motor Co., Ltd. Reconstruction, retargetting, tracking, and estimation of pose of articulated systems
US20070255454A1 (en) * 2006-04-27 2007-11-01 Honda Motor Co., Ltd. Control Of Robots From Human Motion Descriptors
US20080020867A1 (en) * 2003-08-28 2008-01-24 Callaway Golf Company Golfer's impact properties during a golf swing
EP1884897A2 (en) * 2006-07-31 2008-02-06 Avid Technology, Inc. Rigless retargeting for character animation
US7386429B1 (en) * 2001-12-07 2008-06-10 Iwao Fujisaki Wrinkle simulation software
US7390309B2 (en) 2002-09-23 2008-06-24 Honda Motor Co., Ltd. Human assist system using gravity compensation control system and method using multiple feasibility parameters
US7402142B2 (en) 2002-09-23 2008-07-22 Honda Giken Kogyo Kabushiki Kaisha Method and processor for obtaining moments and torques in a biped walking system
US20080183450A1 (en) * 2007-01-30 2008-07-31 Matthew Joseph Macura Determining absorbent article effectiveness
US20080221487A1 (en) * 2007-03-07 2008-09-11 Motek Bv Method for real time interactive visualization of muscle forces and joint torques in the human body
US20090005188A1 (en) * 2007-06-26 2009-01-01 A School Corporation Kansai University Analysis method of golf club
US20090074252A1 (en) * 2007-10-26 2009-03-19 Honda Motor Co., Ltd. Real-time self collision and obstacle avoidance
US20090118863A1 (en) * 2007-11-01 2009-05-07 Honda Motor Co., Ltd. Real-time self collision and obstacle avoidance using weighting matrix
US20090220124A1 (en) * 2008-02-29 2009-09-03 Fred Siegel Automated scoring system for athletics
US20090259450A1 (en) * 2008-04-15 2009-10-15 Cleary Paul William physics-based simulation
US20100131113A1 (en) * 2007-05-03 2010-05-27 Motek Bv Method and system for real time interactive dynamic alignment of prosthetics
US20100168998A1 (en) * 2008-12-26 2010-07-01 Toyota Jidosha Kabushiki Kaisha Driving assistance device and driving assistance method
US20100208945A1 (en) * 2007-10-26 2010-08-19 Koninklijke Philips Electronics N.V. Method and system for selecting the viewing configuration of a rendered figure
US20100331090A1 (en) * 2009-06-24 2010-12-30 Takuhiro Dohta Computer readable storage medium and information processing apparatus
TWI384376B (en) * 2008-09-09 2013-02-01 Univ Nat Chunghsing Method of optimization for editing human body actions
WO2013041446A1 (en) 2011-09-20 2013-03-28 Brian Francis Mooney Apparatus and method for analysing a golf swing
US8500568B2 (en) 2004-06-07 2013-08-06 Acushnet Company Launch monitor
WO2014121365A1 (en) * 2013-02-11 2014-08-14 Mytnik Vyacheslav Georgievich Method for controlling an object
US9129077B2 (en) 2004-09-03 2015-09-08 Siemen Product Lifecycle Management Software Inc. System and method for predicting human posture using a rules-based sequential approach
US20160077716A1 (en) * 2014-09-16 2016-03-17 Disney Enterprises, Inc. Computational Design of Linkage-Based Characters
US20160110593A1 (en) * 2014-10-17 2016-04-21 Microsoft Corporation Image based ground weight distribution determination
US10188890B2 (en) 2013-12-26 2019-01-29 Icon Health & Fitness, Inc. Magnetic resistance mechanism in a cable machine
US10252109B2 (en) 2016-05-13 2019-04-09 Icon Health & Fitness, Inc. Weight platform treadmill
US10258828B2 (en) 2015-01-16 2019-04-16 Icon Health & Fitness, Inc. Controls for an exercise device
US10272317B2 (en) 2016-03-18 2019-04-30 Icon Health & Fitness, Inc. Lighted pace feature in a treadmill
US10279212B2 (en) 2013-03-14 2019-05-07 Icon Health & Fitness, Inc. Strength training apparatus with flywheel and related methods
US10293211B2 (en) 2016-03-18 2019-05-21 Icon Health & Fitness, Inc. Coordinated weight selection
US10343017B2 (en) 2016-11-01 2019-07-09 Icon Health & Fitness, Inc. Distance sensor for console positioning
WO2019147956A1 (en) 2018-01-25 2019-08-01 Ctrl-Labs Corporation Visualization of reconstructed handstate information
US10376736B2 (en) 2016-10-12 2019-08-13 Icon Health & Fitness, Inc. Cooling an exercise device during a dive motor runway condition
US10426989B2 (en) 2014-06-09 2019-10-01 Icon Health & Fitness, Inc. Cable system incorporated into a treadmill
US10433612B2 (en) 2014-03-10 2019-10-08 Icon Health & Fitness, Inc. Pressure sensor to quantify work
US10441844B2 (en) 2016-07-01 2019-10-15 Icon Health & Fitness, Inc. Cooling systems and methods for exercise equipment
US10460455B2 (en) 2018-01-25 2019-10-29 Ctrl-Labs Corporation Real-time processing of handstate representation model estimates
US10471299B2 (en) 2016-07-01 2019-11-12 Icon Health & Fitness, Inc. Systems and methods for cooling internal exercise equipment components
US10489986B2 (en) 2018-01-25 2019-11-26 Ctrl-Labs Corporation User-controlled tuning of handstate representation model parameters
US10496168B2 (en) 2018-01-25 2019-12-03 Ctrl-Labs Corporation Calibration techniques for handstate representation modeling using neuromuscular signals
US10493349B2 (en) 2016-03-18 2019-12-03 Icon Health & Fitness, Inc. Display on exercise device
US10504286B2 (en) 2018-01-25 2019-12-10 Ctrl-Labs Corporation Techniques for anonymizing neuromuscular signal data
US10500473B2 (en) 2016-10-10 2019-12-10 Icon Health & Fitness, Inc. Console positioning
US10543395B2 (en) 2016-12-05 2020-01-28 Icon Health & Fitness, Inc. Offsetting treadmill deck weight during operation
US10561894B2 (en) 2016-03-18 2020-02-18 Icon Health & Fitness, Inc. Treadmill with removable supports
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10625137B2 (en) 2016-03-18 2020-04-21 Icon Health & Fitness, Inc. Coordinated displays in an exercise device
US10656711B2 (en) 2016-07-25 2020-05-19 Facebook Technologies, Llc Methods and apparatus for inferring user intent based on neuromuscular signals
US10661114B2 (en) 2016-11-01 2020-05-26 Icon Health & Fitness, Inc. Body weight lift mechanism on treadmill
CN111260774A (en) * 2020-01-20 2020-06-09 北京百度网讯科技有限公司 Method and device for generating 3D joint point regression model
US10684692B2 (en) 2014-06-19 2020-06-16 Facebook Technologies, Llc Systems, devices, and methods for gesture identification
US10687759B2 (en) 2018-05-29 2020-06-23 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
US10729965B2 (en) 2017-12-22 2020-08-04 Icon Health & Fitness, Inc. Audible belt guide in a treadmill
US10772519B2 (en) 2018-05-25 2020-09-15 Facebook Technologies, Llc Methods and apparatus for providing sub-muscular control
US10817795B2 (en) 2018-01-25 2020-10-27 Facebook Technologies, Llc Handstate reconstruction based on multiple inputs
US10842407B2 (en) 2018-08-31 2020-11-24 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
US10921764B2 (en) 2018-09-26 2021-02-16 Facebook Technologies, Llc Neuromuscular control of physical objects in an environment
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US10953305B2 (en) 2015-08-26 2021-03-23 Icon Health & Fitness, Inc. Strength exercise mechanisms
US10970374B2 (en) 2018-06-14 2021-04-06 Facebook Technologies, Llc User identification and authentication with neuromuscular signatures
US10970936B2 (en) 2018-10-05 2021-04-06 Facebook Technologies, Llc Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
US10990174B2 (en) 2016-07-25 2021-04-27 Facebook Technologies, Llc Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
US11000211B2 (en) 2016-07-25 2021-05-11 Facebook Technologies, Llc Adaptive system for deriving control signals from measurements of neuromuscular activity
US11045137B2 (en) 2018-07-19 2021-06-29 Facebook Technologies, Llc Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
US11079846B2 (en) 2013-11-12 2021-08-03 Facebook Technologies, Llc Systems, articles, and methods for capacitive electromyography sensors
US11179066B2 (en) 2018-08-13 2021-11-23 Facebook Technologies, Llc Real-time spike detection and identification
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US11331045B1 (en) 2018-01-25 2022-05-17 Facebook Technologies, Llc Systems and methods for mitigating neuromuscular signal artifacts
US11337652B2 (en) 2016-07-25 2022-05-24 Facebook Technologies, Llc System and method for measuring the movements of articulated rigid bodies
US11451108B2 (en) 2017-08-16 2022-09-20 Ifit Inc. Systems and methods for axial impact resistance in electric motors
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US11481031B1 (en) 2019-04-30 2022-10-25 Meta Platforms Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US11567573B2 (en) 2018-09-20 2023-01-31 Meta Platforms Technologies, Llc Neuromuscular text entry, writing and drawing in augmented reality systems
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
US11644799B2 (en) 2013-10-04 2023-05-09 Meta Platforms Technologies, Llc Systems, articles and methods for wearable electronic devices employing contact sensors
US11666264B1 (en) 2013-11-27 2023-06-06 Meta Platforms Technologies, Llc Systems, articles, and methods for electromyography sensors
US11797087B2 (en) 2018-11-27 2023-10-24 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
US11893749B1 (en) * 2022-10-17 2024-02-06 Chengdu Tommi Technology Co., Ltd. Focus following method based on motion gravity center, storage medium and photographing system
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments


Patent Citations (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA920687A (en) * 1967-10-06 1973-02-06 Control Image Corporation Automatic generation and display of animated figures
US3510210A (en) * 1967-12-15 1970-05-05 Xerox Corp Computer process character animation
US4834057A (en) * 1980-03-31 1989-05-30 Physical Diagnostics, Inc. Dynamic joint motion analysis technique
US4641251A (en) * 1982-02-16 1987-02-03 Inoue-Japax Research Incorporated Robot
US4631676A (en) * 1983-05-25 1986-12-23 Hospital For Joint Diseases Or Computerized video gait and motion analysis system and method
US4621332A (en) * 1983-06-20 1986-11-04 Hitachi, Ltd. Method and apparatus for controlling a robot utilizing force, position, velocity, spring constant, mass coefficient, and viscosity coefficient
US4603284A (en) * 1984-06-05 1986-07-29 Unimation, Inc. Control system for manipulator apparatus with resolved compliant motion control
US4752836A (en) * 1984-09-07 1988-06-21 Ivex Corporation Method and apparatus for reproducing video images to simulate movement within a multi-dimensional space
US4831547A (en) * 1985-06-06 1989-05-16 Toyota Jidosha Kabushiki Kaisha Multi-joint-robot controller for enabling detection of the spatial relationship between a robot and a rotary cable
US4920500A (en) * 1986-02-25 1990-04-24 Trallfa Robot A/S Method and robot installation for programmed control of a working tool
US4826392A (en) * 1986-03-31 1989-05-02 California Institute Of Technology Method and apparatus for hybrid position/force control of multi-arm cooperating robots
US5184295A (en) * 1986-05-30 1993-02-02 Mann Ralph V System and method for teaching physical skills
US4891748A (en) * 1986-05-30 1990-01-02 Mann Ralph V System and method for teaching physical skills
US4819184A (en) * 1986-09-29 1989-04-04 Asea Aktiebolag Method and a device for optimum control of control parameters in an industrial robot
US4851748A (en) * 1986-11-20 1989-07-25 Westinghouse Electric Corp. Basic digital multi-axis robot control having modular performance expansion capability
US4868474A (en) * 1986-11-20 1989-09-19 Westinghouse Electric Corp. Multiprocessor position/velocity servo control for multiaxis digital robot control system
US4894788A (en) * 1987-05-04 1990-01-16 Siemens Aktiengesellschaft Method for positioning a tool of a multi-joint robot
US4831549A (en) * 1987-07-28 1989-05-16 Brigham Young University Device and method for correction of robot inaccuracy
US4942538A (en) * 1988-01-05 1990-07-17 Spar Aerospace Limited Telerobotic tracker
US4925312A (en) * 1988-03-21 1990-05-15 Staubli International Ag Robot control system having adaptive feedforward torque control for improved accuracy
US5187796A (en) * 1988-03-29 1993-02-16 Computer Motion, Inc. Three-dimensional vector co-processor having I, J, and K register files and I, J, and K execution units
US5025394A (en) * 1988-09-09 1991-06-18 New York Institute Of Technology Method and apparatus for generating animated images
US5099859A (en) * 1988-12-06 1992-03-31 Bell Gene D Method and apparatus for comparative analysis of videofluoroscopic joint motion
US4974210A (en) * 1989-05-01 1990-11-27 General Electric Company Multiple arm robot with force control and inter-arm position accommodation
US5056021A (en) * 1989-06-08 1991-10-08 Carolyn Ausborn Method and apparatus for abstracting concepts from natural language
US5297057A (en) * 1989-06-13 1994-03-22 Schlumberger Technologies, Inc. Method and apparatus for design and optimization for simulation of motion of mechanical linkages
US5253189A (en) * 1989-06-13 1993-10-12 Schlumberger Technologies, Inc. Qualitative kinematics
US5151859A (en) * 1989-06-29 1992-09-29 Honda Giken Kogyo Kabushiki Kaisha Legged walking robot and system for controlling the same
US5159988A (en) * 1989-12-14 1992-11-03 Honda Giken Kogyo Kabushiki Kaisha Articulated structure for legged walking robot
US5255753A (en) * 1989-12-14 1993-10-26 Honda Giken Kogyo Kabushiki Kaisha Foot structure for legged walking robot
US4999553A (en) * 1989-12-28 1991-03-12 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Method and apparatus for configuration control of redundant robots
US5120228A (en) * 1990-03-15 1992-06-09 William Stahl Intrinsic perceptual motor training device
US5249151A (en) * 1990-06-05 1993-09-28 Fmc Corporation Multi-body mechanical system analysis apparatus and method
US5090042A (en) * 1990-12-24 1992-02-18 Bejjani Fadi J Videofluoroscopy system for in vivo motion analysis
US5427531A (en) * 1992-10-20 1995-06-27 Schlumberger Technology Corporation Dynamic simulation of mechanisms

Non-Patent Citations (9)

* Cited by examiner, † Cited by third party
Title
Armstrong, W.W. et al, "The Dynamics of Articulated Rigid Bodies for Purposes of Animation", The Visual Computer, 1:231-240 (1985). *
Calvert, T.W. et al, "Composition of Multiple Figure Sequences for Dance and Animation", Proceedings CG International '89, pp. 245-254 (1989). *
Cramblitt, B., "Computers Capture Moments of Motion", Computer Graphics World, 12(3):50-57 (1989). *
Miller, D.I., "Computer Simulation of Human Motion", in D.W. Grieve et al (eds), Techniques for the Analysis of Human Motion, Lepus Books, London (1975). *
Popov, E., "Modern Robot Engineering", ST Technology Series (1982). *
Nigel, W.J. et al, "Some Methods to Choreograph and Implement Motion in Computer Animation", Proceedings of Computer Animation '89, pp. 125-140 (1989). *
Phillips, C.B. et al, "Interactive Real-time Articulated Figure Manipulation Using Multiple Kinematic Constraints", 1990 Symposium on Interactive 3D Graphics, Computer Graphics, 24(2):242-250 (1990). *
Sun, L. et al, "An Architectural Design of a Sports Instruction System", First International Conference and Exhibition on Visual Computing for Defence and Government (VISUDA '89), Paris, France, Technical Report 89-017 (1989). *
Wilhelms, J.P. et al, "Using Dynamic Analysis to Animate Articulated Bodies such as Humans and Robots", in N. Magnenat-Thalmann et al (eds), Computer-Generated Images, Springer Verlag, Tokyo, pp. 209-229 (1985). *

Cited By (211)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5890906A (en) * 1995-01-20 1999-04-06 Vincent J. Macri Method and apparatus for tutorial, self and assisted instruction directed to simulated preparation, training and competitive play and entertainment
US6183259B1 (en) 1995-01-20 2001-02-06 Vincent J. Macri Simulated training method using processing system images, idiosyncratically controlled in a simulated environment
WO1997040471A1 (en) * 1996-04-04 1997-10-30 Katrix, Inc. Limb coordination system for interactive computer animation of articulated characters with blended motion data
US5889532A (en) * 1996-08-02 1999-03-30 Avid Technology, Inc. Control solutions for the resolution plane of inverse kinematic chains
US6505096B2 (en) * 1996-12-19 2003-01-07 Honda Giken Kogyo Kabushiki Kaisha Posture control system of legged mobile robot
US6014150A (en) * 1997-08-01 2000-01-11 Avid Technology, Inc. System and method of defining and employing behaviors for articulated chains
US6050963A (en) * 1998-06-18 2000-04-18 Innovative Sports Training, Inc. System for analyzing the motion of lifting an object
WO1999067746A1 (en) * 1998-06-24 1999-12-29 Sports Training Technologies, S.L. Method for capturing, analyzing and representing the movement of bodies and objects
US6708142B1 (en) 1999-01-14 2004-03-16 University Of Central Florida Automatic motion modeling of rigid bodies using collision detection
US6793496B2 (en) * 1999-04-15 2004-09-21 General Electric Company Mathematical model and a method and apparatus for utilizing the model
US6738065B1 (en) 1999-08-10 2004-05-18 Oshri Even-Zohar Customizable animation system
US6778866B1 (en) * 2000-03-16 2004-08-17 Ted S. Bettwy Method and apparatus for learning specific body motion
WO2001073689A3 (en) * 2000-03-27 2003-01-16 Massachusetts Gen Hospital Method and system for viewing kinematic and kinetic information
WO2001073689A2 (en) * 2000-03-27 2001-10-04 Massachusetts General Hospital Method and system for viewing kinematic and kinetic information
US6584377B2 (en) * 2000-05-15 2003-06-24 Sony Corporation Legged robot and method for teaching motions thereof
US20040206456A1 (en) * 2000-12-13 2004-10-21 Takamitsu Tadera Plasma processing apparatus
US7106334B2 (en) 2001-02-13 2006-09-12 Sega Corporation Animation creation program
US20030034980A1 (en) * 2001-02-13 2003-02-20 Hirotaka Imagawa Animation creation program
EP1282079A1 (en) * 2001-02-13 2003-02-05 Sega Corporation Animation creation program
EP1282079A4 (en) * 2001-02-13 2005-12-21 Sega Corp Animation creation program
US7650204B2 (en) 2001-06-29 2010-01-19 Honda Motor Co., Ltd. Active control of an ankle-foot orthosis
US20060282022A1 (en) * 2001-06-29 2006-12-14 Behzad Dariush Feedback estimation of joint forces and joint movements
US7469166B2 (en) 2001-06-29 2008-12-23 Honda Motor Co., Ltd. System and method of predicting novel motion in a serial chain system
US7774177B2 (en) 2001-06-29 2010-08-10 Honda Motor Co., Ltd. Exoskeleton controller for a human-exoskeleton system
US7386366B2 (en) 2001-06-29 2008-06-10 Honda Giken Kogyo Kabushiki Kaisha Feedback estimation of joint forces and joint movements
US7684896B2 (en) 2001-06-29 2010-03-23 Honda Motor Co., Ltd. System and method of estimating joint loads using an approach of closed form dynamics
US20050209535A1 (en) * 2001-06-29 2005-09-22 Behzad Dariush System and method of estimating joint loads in a three-dimensional system
US20050209534A1 (en) * 2001-06-29 2005-09-22 Behzad Dariush System and method of predicting novel motion in a serial chain system
US7623944B2 (en) 2001-06-29 2009-11-24 Honda Motor Co., Ltd. System and method of estimating joint loads in a three-dimensional system
US20050209536A1 (en) * 2001-06-29 2005-09-22 Behzad Dariush System and method of estimating joint loads using an approach of closed form dynamics
US20060270950A1 (en) * 2001-06-29 2006-11-30 Behzad Dariush Active control of an ankle-foot orthosis
US20060247904A1 (en) * 2001-06-29 2006-11-02 Behzad Dariush Exoskeleton controller for a human-exoskeleton system
US7012608B1 (en) 2001-08-02 2006-03-14 Iwao Fujisaki Simulation device
US20030115031A1 (en) * 2001-10-29 2003-06-19 Behzad Dariush Simulation system, method and computer-readable medium for human augmentation devices
US7251593B2 (en) * 2001-10-29 2007-07-31 Honda Giken Kogyo Kabushiki Kaisha Simulation system, method and computer-readable medium for human augmentation devices
US6944421B2 (en) * 2001-11-15 2005-09-13 T.F.H. Publications, Inc. Method and apparatus for providing training information regarding a pet
US20030118979A1 (en) * 2001-11-15 2003-06-26 Axelrod Glen S. Electronic pet book
US20050114073A1 (en) * 2001-12-05 2005-05-26 William Gobush Performance measurement system with quantum dots for object identification
US8137210B2 (en) 2001-12-05 2012-03-20 Acushnet Company Performance measurement system with quantum dots for object identification
US8498849B1 (en) 2001-12-07 2013-07-30 Iwao Fujisaki Wrinkle simulation on fabric based on three arm joint angles
US7983882B1 (en) 2001-12-07 2011-07-19 Iwao Fujisaki Joint wrinkle and muscle movement simulating software
US8180613B1 (en) 2001-12-07 2012-05-15 Iwao Fujisaki Wrinkles on fabric software
US7386429B1 (en) * 2001-12-07 2008-06-10 Iwao Fujisaki Wrinkle simulation software
US6506124B1 (en) 2001-12-21 2003-01-14 Callaway Golf Company Method for predicting a golfer's ball striking performance
US6602144B2 (en) * 2001-12-21 2003-08-05 Callaway Golf Company Method for predicting a golfer's ball striking performance
US20030123728A1 (en) * 2001-12-28 2003-07-03 Koninklijke Philips Electronics N.V. Interactive video installation and method thereof
US20040236550A1 (en) * 2002-02-28 2004-11-25 Edic Peter Michael Mathematical model and a method and apparatus for utilizing the model
US20060015186A1 (en) * 2002-09-04 2006-01-19 Graham Isaac Cup assembly of an orthopaedic joint prosthesis
US7771486B2 (en) 2002-09-04 2010-08-10 Depuy International, Ltd. Cup assembly of an orthopaedic joint prosthesis
WO2004027655A2 (en) * 2002-09-18 2004-04-01 Ugs Plm Solutions Inc. System and method for simulating human movement
US20040054510A1 (en) * 2002-09-18 2004-03-18 Ulrich Raschke System and method for simulating human movement
WO2004027655A3 (en) * 2002-09-18 2005-02-17 Ugs Plm Solutions Inc System and method for simulating human movement
US8260593B2 (en) 2002-09-18 2012-09-04 Siemens Product Lifecycle Management Software Inc. System and method for simulating human movement
US7402142B2 (en) 2002-09-23 2008-07-22 Honda Giken Kogyo Kabushiki Kaisha Method and processor for obtaining moments and torques in a biped walking system
US7390309B2 (en) 2002-09-23 2008-06-24 Honda Motor Co., Ltd. Human assist system using gravity compensation control system and method using multiple feasibility parameters
US20050102111A1 (en) * 2002-09-23 2005-05-12 Behzad Dariush Gravity compensation method in a human assist system and a human assist system with gravity compensation control
US7217247B2 (en) 2002-09-23 2007-05-15 Honda Giken Kogyo Kabushiki Kaisha Gravity compensation method in a human assist system and a human assist system with gravity compensation control
US9230452B2 (en) * 2002-09-30 2016-01-05 Surgical Science Sweden Ab Device and method for generating a virtual anatomic environment
US20060099557A1 (en) * 2002-09-30 2006-05-11 Anders Hyltander Device and method for generating a virtual anatomic environment
EP1406215A1 (en) * 2002-10-02 2004-04-07 Université Libre de Bruxelles Method for the modeling of skeletal kinematics
US20040243261A1 (en) * 2002-11-13 2004-12-02 Brian King System and method for capturing and analyzing tennis player performances and tendencies
US20040169656A1 (en) * 2002-11-15 2004-09-02 David Piponi Daniele Paolo Method for motion simulation of an articulated figure using animation input
US20080020867A1 (en) * 2003-08-28 2008-01-24 Callaway Golf Company Golfer's impact properties during a golf swing
US7403880B2 (en) * 2003-10-29 2008-07-22 Snecma Moving a virtual articulated object in a virtual environment while avoiding internal collisions between the articulated elements of the articulated object
US20050096889A1 (en) * 2003-10-29 2005-05-05 Snecma Moteurs Moving a virtual articulated object in a virtual environment while avoiding collisions between the articulated object and the environment
US7457733B2 (en) * 2003-10-29 2008-11-25 Snecma Moving a virtual articulated object in a virtual environment while avoiding collisions between the articulated object and the environment
US20050096890A1 (en) * 2003-10-29 2005-05-05 Snecma Moteurs Moving a virtual articulated object in a virtual environment while avoiding internal collisions between the articulated elements of the articulated object
US20050168578A1 (en) * 2004-02-04 2005-08-04 William Gobush One camera stereo system
US8872914B2 (en) 2004-02-04 2014-10-28 Acushnet Company One camera stereo system
WO2005082249A2 (en) * 2004-02-26 2005-09-09 K.U. Leuven Research & Development Time-dependent three-dimensional musculo-skeletal modeling based on dynamic surface measurements
US7899220B2 (en) 2004-02-26 2011-03-01 Diers International Gmbh Time-dependent three-dimensional musculo-skeletal modeling based on dynamic surface measurements of bodies
WO2005082249A3 (en) * 2004-02-26 2005-11-24 Leuven K U Res & Dev Time-dependent three-dimensional musculo-skeletal modeling based on dynamic surface measurements
US20050264563A1 (en) * 2004-03-05 2005-12-01 Macura Matthew J Method of analysis of comfort for virtual prototyping system
US20050264572A1 (en) * 2004-03-05 2005-12-01 Anast John M Virtual prototyping system and method
US20050267615A1 (en) * 2004-03-05 2005-12-01 Lavash Bruce W System and method of virtual representation of folds and pleats
US20050267614A1 (en) * 2004-03-05 2005-12-01 Looney Michael T System and method of virtual modeling of thin materials
US7634394B2 (en) 2004-03-05 2009-12-15 The Procter & Gamble Company Method of analysis of comfort for virtual prototyping system
US7937253B2 (en) 2004-03-05 2011-05-03 The Procter & Gamble Company Virtual prototyping system and method
US20050264562A1 (en) * 2004-03-05 2005-12-01 Macura Matthew J System and method of virtual representation of thin flexible materials
US8475289B2 (en) 2004-06-07 2013-07-02 Acushnet Company Launch monitor
US20050272512A1 (en) * 2004-06-07 2005-12-08 Laurent Bissonnette Launch monitor
US20050272514A1 (en) * 2004-06-07 2005-12-08 Laurent Bissonnette Launch monitor
US20050272516A1 (en) * 2004-06-07 2005-12-08 William Gobush Launch monitor
US7837572B2 (en) 2004-06-07 2010-11-23 Acushnet Company Launch monitor
US20050282645A1 (en) * 2004-06-07 2005-12-22 Laurent Bissonnette Launch monitor
US8500568B2 (en) 2004-06-07 2013-08-06 Acushnet Company Launch monitor
US8556267B2 (en) 2004-06-07 2013-10-15 Acushnet Company Launch monitor
US8622845B2 (en) 2004-06-07 2014-01-07 Acushnet Company Launch monitor
US20050278157A1 (en) * 2004-06-15 2005-12-15 Electronic Data Systems Corporation System and method for simulating human movement using profile paths
US7959517B2 (en) 2004-08-31 2011-06-14 Acushnet Company Infrared sensing launch monitor
US20060046861A1 (en) * 2004-08-31 2006-03-02 Lastowka Eric J Infrared sensing launch monitor
US9129077B2 (en) 2004-09-03 2015-09-08 Siemens Product Lifecycle Management Software Inc. System and method for predicting human posture using a rules-based sequential approach
US8082062B2 (en) 2005-06-10 2011-12-20 Honda Motor Co., Ltd. Regenerative actuation in motion control
US20060293791A1 (en) * 2005-06-10 2006-12-28 Behzad Dariush Regenerative actuation in motion control
US20060286522A1 (en) * 2005-06-17 2006-12-21 Victor Ng-Thow-Hing System and method for activation-driven muscle deformations for existing character motion
US7573477B2 (en) * 2005-06-17 2009-08-11 Honda Motor Co., Ltd. System and method for activation-driven muscle deformations for existing character motion
WO2007076119A2 (en) * 2005-12-22 2007-07-05 Honda Motor Co., Ltd. Reconstruction, retargetting, tracking, and estimation of pose of articulated systems
US8467904B2 (en) * 2005-12-22 2013-06-18 Honda Motor Co., Ltd. Reconstruction, retargetting, tracking, and estimation of pose of articulated systems
US7859540B2 (en) 2005-12-22 2010-12-28 Honda Motor Co., Ltd. Reconstruction, retargetting, tracking, and estimation of motion for articulated systems
WO2007076118A2 (en) * 2005-12-22 2007-07-05 Honda Motor Co., Ltd. Reconstruction, retargetting, tracking, and estimation of motion for articulated systems
WO2007076118A3 (en) * 2005-12-22 2008-05-08 Honda Motor Co Ltd Reconstruction, retargetting, tracking, and estimation of motion for articulated systems
US20070162164A1 (en) * 2005-12-22 2007-07-12 Behzad Dariush Reconstruction, Retargetting, Tracking, And Estimation Of Pose Of Articulated Systems
WO2007076119A3 (en) * 2005-12-22 2008-08-07 Honda Motor Co Ltd Reconstruction, retargetting, tracking, and estimation of pose of articulated systems
US20070255454A1 (en) * 2006-04-27 2007-11-01 Honda Motor Co., Ltd. Control Of Robots From Human Motion Descriptors
US8924021B2 (en) 2006-04-27 2014-12-30 Honda Motor Co., Ltd. Control of robots from human motion descriptors
EP1884897A3 (en) * 2006-07-31 2009-10-21 Avid Technology, Inc. Rigless retargeting for character animation
EP1884897A2 (en) * 2006-07-31 2008-02-06 Avid Technology, Inc. Rigless retargeting for character animation
US7979256B2 (en) 2007-01-30 2011-07-12 The Procter & Gamble Company Determining absorbent article effectiveness
US20080183450A1 (en) * 2007-01-30 2008-07-31 Matthew Joseph Macura Determining absorbent article effectiveness
US7931604B2 (en) 2007-03-07 2011-04-26 Motek B.V. Method for real time interactive visualization of muscle forces and joint torques in the human body
US20080221487A1 (en) * 2007-03-07 2008-09-11 Motek Bv Method for real time interactive visualization of muscle forces and joint torques in the human body
US20090082701A1 (en) * 2007-03-07 2009-03-26 Motek Bv Method for real time interactive visualization of muscle forces and joint torques in the human body
US8452458B2 (en) 2007-05-03 2013-05-28 Motek Bv Method and system for real time interactive dynamic alignment of prosthetics
US20100131113A1 (en) * 2007-05-03 2010-05-27 Motek Bv Method and system for real time interactive dynamic alignment of prosthetics
US8142300B2 (en) * 2007-06-26 2012-03-27 A School Corporation Kansai University Analysis method of golf club
US20090005188A1 (en) * 2007-06-26 2009-01-01 A School Corporation Kansai University Analysis method of golf club
US8170287B2 (en) 2007-10-26 2012-05-01 Honda Motor Co., Ltd. Real-time self collision and obstacle avoidance
US20100208945A1 (en) * 2007-10-26 2010-08-19 Koninklijke Philips Electronics N.V. Method and system for selecting the viewing configuration of a rendered figure
US9418470B2 (en) * 2007-10-26 2016-08-16 Koninklijke Philips N.V. Method and system for selecting the viewing configuration of a rendered figure
US20090074252A1 (en) * 2007-10-26 2009-03-19 Honda Motor Co., Ltd. Real-time self collision and obstacle avoidance
US20090118863A1 (en) * 2007-11-01 2009-05-07 Honda Motor Co., Ltd. Real-time self collision and obstacle avoidance using weighting matrix
US8396595B2 (en) 2007-11-01 2013-03-12 Honda Motor Co., Ltd. Real-time self collision and obstacle avoidance using weighting matrix
US8175326B2 (en) * 2008-02-29 2012-05-08 Fred Siegel Automated scoring system for athletics
US20090220124A1 (en) * 2008-02-29 2009-09-03 Fred Siegel Automated scoring system for athletics
US20090259450A1 (en) * 2008-04-15 2009-10-15 Cleary Paul William Physics-based simulation
TWI384376B (en) * 2008-09-09 2013-02-01 Univ Nat Chunghsing Method of optimization for editing human body actions
US8838323B2 (en) * 2008-12-26 2014-09-16 Toyota Jidosha Kabushiki Kaisha Driving assistance device and driving assistance method
US20100168998A1 (en) * 2008-12-26 2010-07-01 Toyota Jidosha Kabushiki Kaisha Driving assistance device and driving assistance method
US8771072B2 (en) 2009-06-24 2014-07-08 Nintendo Co., Ltd. Computer readable storage medium and information processing apparatus
US20100331090A1 (en) * 2009-06-24 2010-12-30 Takuhiro Dohta Computer readable storage medium and information processing apparatus
WO2013041446A1 (en) 2011-09-20 2013-03-28 Brian Francis Mooney Apparatus and method for analysing a golf swing
WO2013041444A1 (en) 2011-09-20 2013-03-28 Brian Francis Mooney Apparatus and method for analysing a golf swing
US10307640B2 (en) 2011-09-20 2019-06-04 Brian Francis Mooney Apparatus and method for analyzing a golf swing
WO2013041445A1 (en) 2011-09-20 2013-03-28 Brian Francis Mooney Apparatus and method for analysing a golf swing
WO2014121365A1 (en) * 2013-02-11 2014-08-14 Mytnik Vyacheslav Georgievich Method for controlling an object
US10279212B2 (en) 2013-03-14 2019-05-07 Icon Health & Fitness, Inc. Strength training apparatus with flywheel and related methods
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US11644799B2 (en) 2013-10-04 2023-05-09 Meta Platforms Technologies, Llc Systems, articles and methods for wearable electronic devices employing contact sensors
US11079846B2 (en) 2013-11-12 2021-08-03 Facebook Technologies, Llc Systems, articles, and methods for capacitive electromyography sensors
US11666264B1 (en) 2013-11-27 2023-06-06 Meta Platforms Technologies, Llc Systems, articles, and methods for electromyography sensors
US10188890B2 (en) 2013-12-26 2019-01-29 Icon Health & Fitness, Inc. Magnetic resistance mechanism in a cable machine
US10433612B2 (en) 2014-03-10 2019-10-08 Icon Health & Fitness, Inc. Pressure sensor to quantify work
US10426989B2 (en) 2014-06-09 2019-10-01 Icon Health & Fitness, Inc. Cable system incorporated into a treadmill
US10684692B2 (en) 2014-06-19 2020-06-16 Facebook Technologies, Llc Systems, devices, and methods for gesture identification
US20160077716A1 (en) * 2014-09-16 2016-03-17 Disney Enterprises, Inc. Computational Design of Linkage-Based Characters
US10437940B2 (en) * 2014-09-16 2019-10-08 Disney Enterprises, Inc. Computational design of linkage-based characters
US20160110593A1 (en) * 2014-10-17 2016-04-21 Microsoft Corporation Image based ground weight distribution determination
US10258828B2 (en) 2015-01-16 2019-04-16 Icon Health & Fitness, Inc. Controls for an exercise device
US10953305B2 (en) 2015-08-26 2021-03-23 Icon Health & Fitness, Inc. Strength exercise mechanisms
US10561894B2 (en) 2016-03-18 2020-02-18 Icon Health & Fitness, Inc. Treadmill with removable supports
US10293211B2 (en) 2016-03-18 2019-05-21 Icon Health & Fitness, Inc. Coordinated weight selection
US10625137B2 (en) 2016-03-18 2020-04-21 Icon Health & Fitness, Inc. Coordinated displays in an exercise device
US10272317B2 (en) 2016-03-18 2019-04-30 Icon Health & Fitness, Inc. Lighted pace feature in a treadmill
US10493349B2 (en) 2016-03-18 2019-12-03 Icon Health & Fitness, Inc. Display on exercise device
US10252109B2 (en) 2016-05-13 2019-04-09 Icon Health & Fitness, Inc. Weight platform treadmill
US10471299B2 (en) 2016-07-01 2019-11-12 Icon Health & Fitness, Inc. Systems and methods for cooling internal exercise equipment components
US10441844B2 (en) 2016-07-01 2019-10-15 Icon Health & Fitness, Inc. Cooling systems and methods for exercise equipment
US11000211B2 (en) 2016-07-25 2021-05-11 Facebook Technologies, Llc Adaptive system for deriving control signals from measurements of neuromuscular activity
US11337652B2 (en) 2016-07-25 2022-05-24 Facebook Technologies, Llc System and method for measuring the movements of articulated rigid bodies
US10656711B2 (en) 2016-07-25 2020-05-19 Facebook Technologies, Llc Methods and apparatus for inferring user intent based on neuromuscular signals
US10990174B2 (en) 2016-07-25 2021-04-27 Facebook Technologies, Llc Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
US10500473B2 (en) 2016-10-10 2019-12-10 Icon Health & Fitness, Inc. Console positioning
US10376736B2 (en) 2016-10-12 2019-08-13 Icon Health & Fitness, Inc. Cooling an exercise device during a dive motor runway condition
US10343017B2 (en) 2016-11-01 2019-07-09 Icon Health & Fitness, Inc. Distance sensor for console positioning
US10661114B2 (en) 2016-11-01 2020-05-26 Icon Health & Fitness, Inc. Body weight lift mechanism on treadmill
US10543395B2 (en) 2016-12-05 2020-01-28 Icon Health & Fitness, Inc. Offsetting treadmill deck weight during operation
US11451108B2 (en) 2017-08-16 2022-09-20 Ifit Inc. Systems and methods for axial impact resistance in electric motors
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
US10729965B2 (en) 2017-12-22 2020-08-04 Icon Health & Fitness, Inc. Audible belt guide in a treadmill
US10817795B2 (en) 2018-01-25 2020-10-27 Facebook Technologies, Llc Handstate reconstruction based on multiple inputs
EP3743892A4 (en) * 2018-01-25 2021-03-24 Facebook Technologies, Inc. Visualization of reconstructed handstate information
US11361522B2 (en) 2018-01-25 2022-06-14 Facebook Technologies, Llc User-controlled tuning of handstate representation model parameters
US11069148B2 (en) * 2018-01-25 2021-07-20 Facebook Technologies, Llc Visualization of reconstructed handstate information
US11331045B1 (en) 2018-01-25 2022-05-17 Facebook Technologies, Llc Systems and methods for mitigating neuromuscular signal artifacts
US10950047B2 (en) 2018-01-25 2021-03-16 Facebook Technologies, Llc Techniques for anonymizing neuromuscular signal data
US10460455B2 (en) 2018-01-25 2019-10-29 Ctrl-Labs Corporation Real-time processing of handstate representation model estimates
US11587242B1 (en) 2018-01-25 2023-02-21 Meta Platforms Technologies, Llc Real-time processing of handstate representation model estimates
US11163361B2 (en) 2018-01-25 2021-11-02 Facebook Technologies, Llc Calibration techniques for handstate representation modeling using neuromuscular signals
US11127143B2 (en) 2018-01-25 2021-09-21 Facebook Technologies, Llc Real-time processing of handstate representation model estimates
US10489986B2 (en) 2018-01-25 2019-11-26 Ctrl-Labs Corporation User-controlled tuning of handstate representation model parameters
WO2019147956A1 (en) 2018-01-25 2019-08-01 Ctrl-Labs Corporation Visualization of reconstructed handstate information
US10496168B2 (en) 2018-01-25 2019-12-03 Ctrl-Labs Corporation Calibration techniques for handstate representation modeling using neuromuscular signals
US10504286B2 (en) 2018-01-25 2019-12-10 Ctrl-Labs Corporation Techniques for anonymizing neuromuscular signal data
US11036302B1 (en) 2018-05-08 2021-06-15 Facebook Technologies, Llc Wearable devices and methods for improved speech recognition
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US10772519B2 (en) 2018-05-25 2020-09-15 Facebook Technologies, Llc Methods and apparatus for providing sub-muscular control
US10687759B2 (en) 2018-05-29 2020-06-23 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
US11129569B1 (en) 2018-05-29 2021-09-28 Facebook Technologies, Llc Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods
US10970374B2 (en) 2018-06-14 2021-04-06 Facebook Technologies, Llc User identification and authentication with neuromuscular signatures
US11045137B2 (en) 2018-07-19 2021-06-29 Facebook Technologies, Llc Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device
US11179066B2 (en) 2018-08-13 2021-11-23 Facebook Technologies, Llc Real-time spike detection and identification
US10842407B2 (en) 2018-08-31 2020-11-24 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
US10905350B2 (en) 2018-08-31 2021-02-02 Facebook Technologies, Llc Camera-guided interpretation of neuromuscular signals
US11567573B2 (en) 2018-09-20 2023-01-31 Meta Platforms Technologies, Llc Neuromuscular text entry, writing and drawing in augmented reality systems
US10921764B2 (en) 2018-09-26 2021-02-16 Facebook Technologies, Llc Neuromuscular control of physical objects in an environment
US10970936B2 (en) 2018-10-05 2021-04-06 Facebook Technologies, Llc Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment
US11797087B2 (en) 2018-11-27 2023-10-24 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11941176B1 (en) 2018-11-27 2024-03-26 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US10905383B2 (en) 2019-02-28 2021-02-02 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
US11481031B1 (en) 2019-04-30 2022-10-25 Meta Platforms Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
KR20210093795A (en) * 2020-01-20 2021-07-28 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for generating 3D joint point regression model
US11341718B2 (en) * 2020-01-20 2022-05-24 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for generating 3D joint point regression model
CN111260774A (en) * 2020-01-20 2020-06-09 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and device for generating 3D joint point regression model
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
US11893749B1 (en) * 2022-10-17 2024-02-06 Chengdu Tommi Technology Co., Ltd. Focus following method based on motion gravity center, storage medium and photographing system

Similar Documents

Publication Publication Date Title
US5625577A (en) Computer-implemented motion analysis method using dynamics
US5623428A (en) Method for developing computer animation
US5586224A (en) Robot or numerical control programming method
Wilhelms Using dynamic analysis for realistic animation of articulated bodies
Armstrong et al. The dynamics of articulated rigid bodies for purposes of animation
Chadwick et al. Layered construction for deformable animated characters
Yeadon et al. The simulation of aerial movement—IV. A computer simulation model
EP0520099A1 (en) Applied motion analysis and design
Boulic et al. Inverse kinetics for center of mass position control and posture optimization
Kim et al. Adaptation of human motion capture data to humanoid robots for motion imitation using optimization
JP2937834B2 (en) 3D motion generator
Wooten et al. Simulation of human diving
Kunii et al. Dynamic analysis-based human animation
JPH08329272A (en) Animation generator method
Armstrong et al. Dynamics for animation of characters with deformable surfaces
Huang et al. Interactive human motion control using a closed-form of direct and inverse dynamics
CA2043883C (en) Computer-implemented motion analysis method using dynamics
Aydin et al. Realistic articulated character positioning and balance control in interactive environments
CA2043902A1 (en) Method for developing computer animation
Memişoğlu Human motion control using inverse kinematics
Neff et al. Modeling relaxed hand shape for character animation
Huang Motion control for human animation
Singh et al. Control and coordination of head, eyes, and facial expressions of virtual actors in virtual environments
McKenna et al. Dynamic simulation of a complex human figure model with low level behavior control
Westenhofer et al. Using kinematic clones to control the dynamic simulation of articulated figures

Legal Events

Date Code Title Description
AS Assignment

Owner name: SHUKYOHOJI, KONGO ZEN SOHOZAN SHORIJI, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUNII, TOSIYASU;SUN, LINING;REEL/FRAME:007091/0855;SIGNING DATES FROM 19940519 TO 19940524

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

AS Assignment

Owner name: SHORINJI KEMPO INTELLECTUAL PROPERTY PROTECTION CORPORATION

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHUKYOHOJIN KONGO ZEN SOHONZAN SHORINJI;REEL/FRAME:015083/0373

Effective date: 20040520

FPAY Fee payment

Year of fee payment: 12