US20090023554A1 - Exercise systems in virtual environment - Google Patents
- Publication number
- US20090023554A1 (application US12/216,540)
- Authority
- US
- United States
- Prior art keywords
- exercise
- user
- task
- feature
- module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A63B24/0087—Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B22/00—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
- A63B22/02—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with movable endless bands, e.g. treadmills
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A63B24/0062—Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A63B24/0084—Exercising apparatus with means for competitions, e.g. virtual races
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
- A63B71/0619—Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
- A63B71/0622—Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
- A63F13/245—Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/70—Game security or game management aspects
- A63F13/79—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
- A63F13/795—Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories for finding other players; for building a team; for providing a buddy list
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B24/00—Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
- A63B24/0087—Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load
- A63B2024/0096—Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load using performance related parameters for controlling electronic or video games or avatars
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
- A63B71/0619—Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
- A63B71/0622—Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
- A63B2071/0625—Emitting sound, noise or music
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
- A63B71/0619—Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
- A63B71/0622—Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
- A63B2071/0638—Displaying moving images of recorded environment, e.g. virtual environment
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
- A63B71/0619—Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
- A63B2071/0658—Position or arrangement of display
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
- A63B71/0619—Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
- A63B2071/0658—Position or arrangement of display
- A63B2071/0661—Position or arrangement of display arranged on the user
- A63B2071/0666—Position or arrangement of display arranged on the user worn on the head or face, e.g. combined with goggles or glasses
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B21/00—Exercising apparatus for developing or strengthening the muscles or joints of the body by working against a counterforce, with or without measuring devices
- A63B21/06—User-manipulated weights
- A63B21/062—User-manipulated weights including guide for vertical or non-vertical weights or array of weights to move against gravity forces
- A63B21/0626—User-manipulated weights including guide for vertical or non-vertical weights or array of weights to move against gravity forces with substantially vertical guiding means
- A63B21/0628—User-manipulated weights including guide for vertical or non-vertical weights or array of weights to move against gravity forces with substantially vertical guiding means for vertical array of weights
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B22/00—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
- A63B22/0015—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with an adjustable movement path of the support elements
- A63B22/0023—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with an adjustable movement path of the support elements the inclination of the main axis of the movement path being adjustable, e.g. the inclination of an endless band
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B22/00—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
- A63B22/0076—Rowing machines for conditioning the cardio-vascular system
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B22/00—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
- A63B22/04—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with movable multiple steps, i.e. more than one step per limb, e.g. steps mounted on endless loops, endless ladders
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B22/00—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
- A63B22/06—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with support elements performing a rotating cycling movement, i.e. a closed path movement
- A63B22/0605—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with support elements performing a rotating cycling movement, i.e. a closed path movement performing a circular movement, e.g. ergometers
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B22/00—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
- A63B22/06—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with support elements performing a rotating cycling movement, i.e. a closed path movement
- A63B22/0664—Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with support elements performing a rotating cycling movement, i.e. a closed path movement performing an elliptic movement
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2220/00—Measuring of physical parameters relating to sporting activity
- A63B2220/80—Special sensors, transducers or devices therefor
- A63B2220/806—Video cameras
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2225/00—Miscellaneous features of sport apparatus, devices or equipment
- A63B2225/20—Miscellaneous features of sport apparatus, devices or equipment with means for remote communication, e.g. internet or the like
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2225/00—Miscellaneous features of sport apparatus, devices or equipment
- A63B2225/50—Wireless data transmission, e.g. by radio transmitters or telemetry
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2230/00—Measuring physiological parameters of the user
- A63B2230/04—Measuring physiological parameters of the user heartbeat characteristics, e.g. ECG, blood pressure modulations
- A63B2230/06—Measuring physiological parameters of the user heartbeat characteristics, e.g. ECG, blood pressure modulations heartbeat rate only
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2230/00—Measuring physiological parameters of the user
- A63B2230/30—Measuring physiological parameters of the user blood pressure
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2230/00—Measuring physiological parameters of the user
- A63B2230/40—Measuring physiological parameters of the user respiratory characteristics
- A63B2230/42—Measuring physiological parameters of the user respiratory characteristics rate
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2230/00—Measuring physiological parameters of the user
- A63B2230/50—Measuring physiological parameters of the user temperature
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2230/00—Measuring physiological parameters of the user
- A63B2230/60—Measuring physiological parameters of the user muscle strain, i.e. measured on the user
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B2230/00—Measuring physiological parameters of the user
- A63B2230/65—Measuring physiological parameters of the user skin conductivity
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B69/00—Training appliances or apparatus for special sports
- A63B69/0028—Training appliances or apparatus for special sports for running, jogging or speed-walking
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B69/00—Training appliances or apparatus for special sports
- A63B69/06—Training appliances or apparatus for special sports for rowing or sculling
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B69/00—Training appliances or apparatus for special sports
- A63B69/16—Training appliances or apparatus for special sports for cycling, i.e. arrangements on or for real bicycles
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Physical Education & Sports Medicine (AREA)
- Human Computer Interaction (AREA)
- Epidemiology (AREA)
- Biophysics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Primary Health Care (AREA)
- Public Health (AREA)
- Cardiology (AREA)
- Vascular Medicine (AREA)
- Business, Economics & Management (AREA)
- Computer Security & Cryptography (AREA)
- General Business, Economics & Management (AREA)
- Processing Or Creating Images (AREA)
Abstract
An exercise system includes at least two exercise modules and is arranged to allow multiple users to perform exercises on, with, or against the modules in different locations while performing at least one preset task defined in the context of a story, a scenery, or a video (or computer) game, each in turn preferably defined in a virtual environment. The exercise system generally generates the task of the story, scenery, or game in images of the virtual environment and simulates the exercising users as simulated users in such images, while allowing the users to manipulate the simulated users to attain a goal of the task based on features related to the exercises performed by the users, or allowing the task to manipulate operations of the modules based on those features. The exercise system includes at least two output modules, each including a visual unit to provide the images of the task to each user and an olfactory or tactile unit to provide smell or tactile sensation to the user, respectively.
Description
- The present application relates to various patent applications which have been filed with the U.S. Patent and Trademark Office by the same Applicant, such as, e.g., a U.S. Provisional Patent Application entitled “Local exercise systems with compact visual units,” filed on Jul. 16, 2007, and bearing Serial Number U.S. Ser. No. 60/959,464; a second U.S. Provisional Patent Application entitled “Local exercise systems with full-size visual units,” filed on Jul. 16, 2007, and bearing Serial Number U.S. Ser. No. 60/959,564; and a third U.S. Provisional Patent Application entitled “Global exercise systems and methods,” filed on Aug. ______, 2007, and bearing Serial Number U.S. Ser. No. 60/9______. All of the above Applications will be referred to as the “co-pending Applications” hereinafter, and all of the Applications are incorporated herein in their entirety by reference.
- A recent trend in many developed countries is that their populations tend to become obese. With more nutritious foods available and less time for exercise, excess nutrients are converted into cholesterol and stored in fat cells. Because obesity is related to high blood cholesterol and various diseases, not only individuals but also governments focus on reducing the obese population.
- As a result, it is not uncommon to find people engaging in various exercises. Some people jog, whereas others choose to go to gyms and exercise on, with, or against the various exercise equipment provided there. Such exercise or fitness equipment is generally intended to improve or enhance the muscle tone of a user, to increase the muscle mass and volume of the user, to force or facilitate the user to reduce his or her weight, to increase the physical stamina of the user, and the like.
- Although conventional exercise or fitness equipment is effective in helping the user burn his or her excess energy, it typically requires the user to spend a fair amount of time with it. For example, the user may consume a piece of cake carrying hundreds of calories within a minute, but he or she has to run on a treadmill for at least an hour to burn those excess calories. In addition, the user has to engage in various aerobic and non-aerobic exercises for a prolonged period of time to improve or enhance muscle tone and to increase muscle mass and volume, not to mention to reduce his or her weight. Accordingly, patience and endurance are vital virtues for the users of such exercise or fitness equipment. To help the user exercise for a proper duration, gyms offer a variety of amenities, such as TVs and DVD players, to keep the user from getting bored. Other users choose to carry their radios or audio players and listen to music while exercising. Whatever they may resort to, however, prior art exercise or fitness equipment is of little or at most limited value in eliminating the boredom of the user during exercise.
- Various fitness or exercise equipment has been suggested to alleviate such problems, where examples of such equipment have been disclosed in U.S. Pat. Nos. 5,322,490 and 5,425,691 to M. A. van der Hoeven entitled “Stepping and sliding exerciser,” U.S. Pat. No. 6,013,007 to G. M. Root and F. Hoorn entitled “Athlete's GPS-based performance monitor,” U.S. Pat. No. 6,106,297 to E. Pollak and S. Vaquerizo entitled “Distributed interactive simulation exercise manager system and method,” U.S. Pat. No. 6,159,131 to L. Pfeffer entitled “Fitness triage system and method,” U.S. Pat. No. 6,302,789 B2 to T. Harada and K. Shimizu entitled “Pedometer with game mode,” U.S. Pat. Nos. 6,312,363 B1 and 6,626,799 B2 both issued to S. R. Watterson et al. entitled “System and methods for providing an improved exercise device with motivational programming,” U.S. Pat. No. 6,336,891 B1 to R. Fedrigon et al. entitled “Interactive exercise pad system,” U.S. Pat. No. 6,453,111 B1 to J. H. Sklar et al. entitled “Interactive workstation for creating customized, watch and do physical exercise programs,” U.S. Pat. No. 6,468,086 B1 to S. Brady-Koontz entitled “Method of display of video images of exercises,” U.S. Pat. No. 6,561,952 B2 issued to M. C. Wu entitled “Turning control device for a virtual stationary bike,” U.S. Pat. No. 6,590,536 B1 to C. A. Walton entitled “Body motion detecting system with correction for tilt of accelerometers and remote measurement of body position,” U.S. Pat. No. 6,604,138 B1 issued to L. D. Virine and T. G. Simpson entitled “System and method for providing demographically targeted information,” U.S. Pat. No. 6,620,078 B2 to L. Pfeffer entitled “Fitness triage system and nutrition gets personal,” U.S. Pat. No. 6,635,013 B2 to L. Pfeffer entitled “Fitness triage system and exercise gets personal,” U.S. Pat. No. 6,643,385 B1 to M. J. Bravomalo entitled “System and method for weight-loss goal visualization and planning and business method for use therefor,” U.S. Pat. No. 6,656,091 B1 to K. G. Abelbeck et al. entitled “Exercise device control and billing system,” U.S. Pat. No. 6,613,000 B1 to D. J. Reinkensmeyer et al. entitled “Method and apparatus for mass-delivered movement rehabilitation,” U.S. Pat. No. 6,671,736 B2 to L. D. Virine and T. G. Simpson entitled “System and method for providing demographically targeted information,” U.S. Pat. No. 6,672,991 B2 issued to S. M. O'Malley entitled “Guided instructional cardiovascular exercise with accompaniment,” U.S. Pat. No. 6,749,536 B1 to S. M. Cuskaden and A. G. Evans entitled “Exercising using public communication network,” U.S. Pat. No. 6,749,537 B1 to P. L. Hickman entitled “Method and apparatus for remote interactive exercise and health equipment,” U.S. Pat. No. 6,786,848 B2 issued to A. Yamashita entitled “Exercise assisting method and apparatus implementing such method,” U.S. Pat. No. 6,796,927 B2 issued to M. Toyama entitled “Exercise assistance controlling method and exercise assisting apparatus,” U.S. Pat. No. 6,852,069 B2 to S. H. Park entitled “Method and system for automatically evaluating physical health state using a game,” U.S. Pat. No. 6,898,411 B2 to S. G. Ziv-el et al. entitled “Method and system for online teaching using web pages,” U.S. Pat. No. 6,918,860 B1 to N. H. Nusbaum entitled “Exercise bicycle virtual reality apparatus,” U.S. Pat. No. 6,921,351 B1 to P. L. Hickman and M. L. Gough entitled “Method and apparatus for remote interactive exercise and health equipment,” U.S. Pat. No. 6,997,853 B1 to S. M. Cuskaden and A. G. Evans entitled “Exercising using a public communication network,” U.S. Pat. No. 7,022,048 B1 to John Fernandez and Juan Fernandez entitled “Video fitness machine,” U.S. Pat. No. 7,074,162 B2 to P. Kuo entitled “Exercise device,” U.S. Pat. No. 7,307,241 B1 to P. Kuo entitled “Exercise device,” U.S. Pat. No. 7,335,134 B1 to R. LaVelle entitled “Exercise and game controller apparatus and method,” U.S. Pat. No. 7,347,779 B2 to R. James-Herbert entitled “Computer game controller,” and the like. However, none of the equipment disclosed in the above patents, nor any combination thereof, has completely solved the above problem.
- Therefore, there is a need for an exercise system capable of allowing users to engage in exercises in geographically different locations without getting bored. To this end, there is a need for an exercise system providing images of a task of a story, a scenery, or a video (or computer) game in a preset viewpoint such that the users may engage in the task while performing exercises on, with, or against multiple exercise modules also disposed in different locations. There is a need for an exercise system providing the images of the task of the story, scenery, or game on multiple visual units, each disposed in a single or multiple view angles of the exercising users. There is a need for an exercise system including multiple exercise modules which are disposed in geographically different locations and communicate with each other either directly, indirectly through another module of the system, or through an external provider. There is a need for an exercise system capable of providing auditory, olfactory, and/or tactile features of the task, each synchronized with the images of the task. There is also a need for an exercise system capable of manipulating at least one feature of the task of the story, scenery, or game based upon at least one feature of the exercise or, in the alternative, manipulating at least one feature of at least one operation of the exercise system based upon at least one feature of the task.
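- The two-way coupling called for above (exercise features manipulating the task, and task features manipulating the exercise module) can be illustrated by a minimal sketch. All class and attribute names here are hypothetical, chosen only to show the feedback loop; the patent does not prescribe any implementation.

```python
# Hypothetical sketch: exercise features steer the virtual task,
# and task features steer the exercise module in return.

class ExerciseModule:
    """Stands in for a treadmill-like module whose load the task may adjust."""
    def __init__(self):
        self.incline = 0.0   # task-controlled operation feature (degrees)
        self.speed = 0.0     # user-controlled exercise feature (km/h)

class VirtualTask:
    """Stands in for a story/scenery/game task defining a preset goal."""
    def __init__(self, goal_distance):
        self.goal_distance = goal_distance
        self.distance = 0.0
        self.terrain_grade = 0.0

    def advance(self, user_speed, dt):
        # The user's exercise feature (speed) moves the simulated user.
        self.distance += user_speed * dt
        # A task feature (virtual terrain) that will, in turn, drive the module.
        self.terrain_grade = 5.0 if self.distance > self.goal_distance / 2 else 0.0
        return self.distance >= self.goal_distance

def step(module, task, dt=1.0):
    done = task.advance(module.speed, dt)
    module.incline = task.terrain_grade   # task manipulates the module operation
    return done

module, task = ExerciseModule(), VirtualTask(goal_distance=10.0)
module.speed = 2.0                        # user runs at a constant pace
while not step(module, task):
    pass
print(task.distance, module.incline)      # goal reached on an uphill grade
```

A real system would replace the scalar speed with sensed features (heart rate, cadence, etc.) and the incline assignment with actuator commands, but the control loop keeps this shape.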
- The present invention generally relates to an exercise system providing users who perform exercises on, with, or against exercise modules of the system in different locations with a task which is defined as a story, a scenery, or a video (or computer) game in a virtual environment. More particularly, the present invention relates to an exercise system for generating the task of the story, scenery, or game in images of the virtual environment, simulating the users as simulated users in such images, and allowing the users to manipulate the simulated users to attain a goal of the task based on features related to the exercises or, alternatively, allowing the task to manipulate operations of the exercise modules based upon such features. The system includes at least two visual units for providing such images of the task to each exercising user. The present invention also relates to various methods of and processes for providing such images of the task to the exercising users and allowing such users to manipulate the task, or allowing the task to manipulate the exercise, based on the features.
- The exercise systems of this invention may be provided in various embodiments. For example, the exercise system may be fabricated as an assembly of multiple exercise modules, each disposed in a different location and providing at least one preset exercise; at least one output module for providing the virtual environment (i.e., images and optional sounds, smells, or sensations) of the task of a story, a scenery, or a game to multiple users who perform the same or different exercises in different locations; at least one control module for manipulating operations of the output and exercise modules or the rest of the system; and the like. In another example, the system may be provided as an add-on assembly of the output and control modules operatively coupling with conventional exercise or fitness modules disposed in different locations, providing the virtual environment with the images and optional sounds, smells, or sensations to the users while manipulating the operations of the exercise modules. In yet another example, the exercise system may include such exercise and control modules, where the latter may operatively couple to a prior art audiovisual, olfactory, or tactile output device of the users and use the visual, auditory, olfactory, or tactile capability of the device to display the images, play such sounds, give off smells, or generate sensations for the task, where the device may be a portable or stationary image display device (e.g., a TV, a monitor, a palm device or DVD player with a display panel, a game console with a display panel, a communication device with a display panel, and so on), or a portable or stationary audio device (e.g., a sound-generating device with a speaker, a CD player with a speaker, a communication device with a speaker, a game console with a speaker, and the like).
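- The modular decomposition described above (exercise modules supplying features, output modules rendering the virtual environment, a control module coordinating both) can be sketched as follows. The class names and the callback-based wiring are illustrative assumptions, not terminology from the patent.

```python
# Hypothetical sketch of the exercise/output/control module decomposition.

class OutputModule:
    """One user's output: a required visual unit plus optional
    olfactory and tactile units, each modeled as a callable."""
    def __init__(self, visual, olfactory=None, tactile=None):
        self.visual, self.olfactory, self.tactile = visual, olfactory, tactile

    def render(self, frame):
        self.visual(frame["image"])
        if self.olfactory and "smell" in frame:
            self.olfactory(frame["smell"])
        if self.tactile and "sensation" in frame:
            self.tactile(frame["sensation"])

class ControlModule:
    """Coordinates the rest of the system: reads exercise features,
    builds a shared virtual-environment frame, and fans it out."""
    def __init__(self, exercise_modules, output_modules):
        self.exercise_modules = exercise_modules
        self.output_modules = output_modules

    def tick(self):
        # Gather one feature per exercise module and compose one frame
        # of the shared virtual environment for every user.
        speeds = [m["speed"]() for m in self.exercise_modules]
        frame = {"image": f"scene at speeds {speeds}"}
        for out in self.output_modules:
            out.render(frame)
        return frame

shown = []
ctrl = ControlModule(
    exercise_modules=[{"speed": lambda: 3.0}],   # stand-in sensor reading
    output_modules=[OutputModule(visual=shown.append)],
)
frame = ctrl.tick()
```

In the add-on embodiment, `OutputModule` would simply wrap an existing TV, game console, or audio device rather than dedicated hardware; the control module's role is unchanged.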
- Therefore, a primary objective of the present invention is to provide an exercise system with multiple exercise modules disposed in different (or geographically separate) locations, on, with, or against each of which each user simultaneously performs physical exercises while participating in a preset task of a story, a scenery, or a video (or computer) game, each provided in images (and optional sounds, smells, and/or sensations) of a virtual environment and each defining at least one preset goal. A related objective of this invention is to include in the exercise system multiple compact or full-size visual units which may display the images of the task for each user and to allow the users to view the images on the visual units while simultaneously performing the same or different exercises. Another related objective of this invention is to generate the task of the story, scenery, or game, each incorporating therein at least one simulated user which simulates at least one feature of at least one of the exercising users. Another related objective of this invention is to provide the task of the story, scenery, or game, each defining visual, auditory, olfactory, or tactile features, at least one of which may be manipulated by various features of the users, the exercises performed by the users, or the operations of the system. Another related objective of this invention is to provide a system which can manipulate at least one operation of the exercise modules based on at least one feature of the task of the story, scenery, or game. Another related objective of this invention is to enable the exercise system with at least two exercise modules to transfer at least one feature of the task, users, exercises, or operations from one exercise module to another via a local or global network in order to allow such users to participate in the task while simultaneously performing the same or different exercises.
- Another objective of the present invention is to provide an exercise system on, against, and/or with which the users simultaneously perform physical exercises while simulating each of such users as at least one simulated user and then incorporating the simulated users into a preset task of a story, a scenery or a video (or computer) game generated in images of the virtual environment and defining at least one preset goal. Accordingly, a related objective of this invention is to generate the simulated users by simulating at least one feature of each user such that at least one feature of each simulated user in such images is related to the feature of each user. Another related objective of this invention is to allow such users to manipulate the simulated users based on at least one feature of such users, exercises, or operations of the system so that the users manipulate at least one feature of the task of the story, scenery or game into which the simulated users are included. Another related objective of this invention is to allow the task to manipulate the simulated users so that the system may manipulate at least one of its operations based on at least one feature of such simulated users, thereby requiring the users to perform the exercises at least one feature of which may be decided by at least one of the simulated users. Another related objective of this invention is to arrange the system to manipulate at least one feature of the task between at least two exercise modules so that at least two users may compete with each other in the task while simultaneously performing the same or different exercises with, on, or against each of the exercise modules disposed in the same or different locations.
- Another objective of the present invention is to provide an exercise system on, against, and/or with which multiple users simultaneously perform physical exercises while participating in a task of a story, a scenery or a video (or computer) game and proceeding therethrough so as to attain a preset goal of the task. Accordingly, a related objective of this invention is to provide the story, scenery or game for the task while relating at least one feature of the images for the task to at least one feature of the exercises, users, and/or operation of the exercise system so that the latter may manipulate the feature of the images. Another related objective of this invention is to simulate each of the users into at least one simulated user and include the simulated users as a part of the images while manipulating at least one feature of such simulated users based upon at least one feature of the exercises, users, and/or operations of the system. Another related objective of this invention is to arrange the system to manipulate at least one feature of the task between at least two exercise modules so that at least two users may compete with each other in the task of the game while simultaneously performing the same or different exercises on, with or against each of the exercise modules.
- Another objective of the present invention is to enable multiple users exercising on, with or against multiple exercise modules of an exercise system to participate simultaneously in a preset task of a story, a scenery or a video (or computer) game while manipulating at least one operation of the system based on the task. Therefore, a related objective of this invention is to provide the task while relating at least one feature of such images for the game with at least one feature of the operation of the system such that the former may manipulate at least one operation of the system. Another related objective of this invention is to simulate each user into at least one simulated user and then incorporate such simulated users as a part of the task while manipulating at least one feature of the operations of the system based upon at least one feature of the task. Another related objective of this invention is to arrange the exercise system to manipulate at least one feature of the task between at least two exercise modules so that at least two users compete with each other in the task of the game while simultaneously performing the same or different exercises on, with or against each of the exercise modules.
- Another objective of the present invention is to provide communication between multiple users exercising on, with or against multiple exercise modules while allowing such users to participate in a task of a story, scenery or video (or computer) game and to compete with each other in images (or other features of a virtual environment) for the task. Accordingly, a related objective of this invention is to arrange the system to monitor at least one feature of one of the users, at least one feature of one of such exercises, at least one feature of at least one of the exercise modules, or at least one feature of the task through the local or global network. Another related objective of this invention is to monitor the features of the users, the features of the exercises, the features of operations of such exercise modules or the features of the task through the local or global network. Another related objective of this invention is to transfer the feature monitored in one location to another location without altering the monitored feature (i.e., a simple transfer). Another related objective of this invention is to transfer the feature monitored in one location to another while altering or converting the feature based on a preset relation (i.e., an equivalent conversion), where the relation is defined between at least one feature of the exercises and that of the task, between at least one feature of the exercises and the goal and/or stages of the task, between at least one feature of such users and that of the task, and/or between at least one feature of such users and the goal and/or stages of the task, and where the relation may be defined between at least one feature of the operation and at least one feature of the task, and/or between at least one feature of the operation and the goal and/or stages of the task.
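The distinction between a "simple transfer" and an "equivalent conversion" drawn above can be illustrated with a minimal sketch; the preset relation shown (treadmill speed in km/h converted to an avatar speed in m/s) is purely hypothetical:

```python
def simple_transfer(feature_value):
    """Transfer a monitored feature to another location unchanged."""
    return feature_value

def equivalent_conversion(feature_value, relation):
    """Alter the monitored feature via a preset relation before transfer."""
    return relation(feature_value)

# Hypothetical preset relation: treadmill speed (km/h) -> avatar speed (m/s)
kmh_to_ms = lambda v: v / 3.6

unchanged = simple_transfer(10.8)                     # sent as-is
converted = equivalent_conversion(10.8, kmh_to_ms)    # sent after conversion
```

Any of the relations listed in the paragraph above (exercise-to-task, user-to-task, operation-to-task, and so on) would take the place of the `relation` callable.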
- In all of such objectives, the system may be arranged to provide the virtual environment which not only provides the images for the task but also generates the sounds, smells, or sensations for the task, thereby providing the visual feature as well as the optional auditory, olfactory or tactile features. The system may then manipulate at least one feature of the images, sounds, smells, or sensations for the task of the story, scenery or game at least partly based on other features of the users, exercises, or operation of the exercise modules. Alternatively, the system may manipulate at least one feature of operations of the exercise modules at least partly based on the task, users, or performed exercises.
- Various exercise systems of this invention may be constructed in various arrangements. For example, the exercise system may have at least two exercise modules on, with or against which the users simultaneously perform such same or different exercises. In another example, the system may include a single exercise module in one location and on, with or against which one user performs the exercise, and may then couple with an external exercise module in a different location and on, with or against which another user performs the exercise.
- Various apparatus, method, and process aspects of such exercise systems and embodiments thereof are now enumerated. It is appreciated, however, that the following system, method, and process aspects of the present invention may also be embodied in many other different modes and, therefore, should not be limited to such aspects and their embodiments which are to be set forth herein. Rather, various exemplary aspects and their embodiments set forth herein are provided so that this disclosure is thorough and complete, and fully conveys the scope of the present invention to one of ordinary skill in the relevant art.
- In one aspect of the present invention, an exercise system is arranged to provide at least two users with at least one preset task of a story, a scenery, and/or a video (or computer) game each of which is provided in images of at least one virtual environment, to allow such users to simultaneously perform exercises (or to allow the user to perform such exercises in a delayed mode), and to directly or indirectly manipulate at least one feature of the preset task at least partly based upon at least one feature of at least one of the exercises performed by the users.
- In one exemplary embodiment of this aspect of the invention, an exercise system includes a first standard exercise module, at least one of second and third standard exercise modules, at least one first output module, and at least one first control module. The first standard exercise module may be one of a first exercise module through a ninth exercise module, where the first exercise module is arranged to allow the user to perform exercise on, with, or against at least one portion thereof while consuming energy of its user during the exercise, while the second exercise module is arranged to define at least one preset load, to include at least one actuating part capable of coupling with the load and contacting at least one body part of the user, and then to allow the user to perform the exercise by contacting the actuating part and by moving such a part against the load while consuming energy thereof during the exercise. The third exercise module may be arranged to include at least one track capable of translating in a preset direction and to allow the user to perform the exercise of walking or running over the track while consuming energy thereof during the exercise, while the fourth exercise module is arranged to define at least one rotation axis, to define at least one preset load, to include at least one pedal coupling with the load and rotating about the axis, and to allow the user to perform the exercise of rotating the pedal against such a load while consuming energy thereof during its exercise.
The fifth exercise module may be arranged to include at least one movable weight, and to allow its user to perform its exercise of pivoting, translating, reciprocating, rotating or moving the weight while consuming energy thereof during the exercise, while the sixth exercise module is arranged to define at least one central point, to define at least one preset load, to include at least one lever coupling with the load and pivoting about the point, and to allow the user to perform the exercise of reciprocating, translating, pivoting, rotating, displacing or moving such a lever about the point against the load while consuming energy thereof during the exercise. The seventh exercise module is arranged to include at least one belt capable of enclosing at least one body part of the user therearound, and then to allow the user to perform the exercise of vibrating the body part while consuming energy thereof during the exercise, while the eighth exercise module is arranged to define a preset load, to include at least one pad capable of coupling with the load and moving or deforming in response to energy supplied thereto by the user, and then to allow the user to perform the exercise of translating, reciprocating, rotating, deforming, pivoting, pushing, or pulling at least a portion of the pad against the load while consuming energy thereof during the exercise. The ninth exercise module may be arranged to define at least one preset load, to include at least one handle coupling with the load, and to allow the user to perform the exercise of translating, reciprocating, rotating, pivoting, displacing, or moving the handle against such a load while consuming energy of the user during the exercise. 
Regardless of its configuration and operation, the first standard exercise module is disposed in a first location, while the second standard exercise module is one of the first to ninth exercise modules, but disposed in a second location which is different (or geographically separate or apart) from the first location, whereby different users may simultaneously perform such exercises of different types and of the same, similar or different extents. The third standard exercise module is one of the first through ninth exercise modules and disposed in the second location, whereby different users may simultaneously perform the exercises of the same or similar types and of the same, similar or different extents. The output module may include at least two compact or full-size visual units each of which is provided in a disposition and arrangement for displaying an entire portion of each image in a single view angle of each user and within a viewable distance so that each user may simultaneously view the entire portion of each image displayed on each visual unit while performing each exercise on, with or against each of the above standard exercise modules (to be referred to as the “first output module” hereinafter). The control module is arranged to operatively couple with at least one of the output module and standard exercise modules directly or indirectly, to provide the task in the images of the virtual environment, to display the images on the visual units, to assign at least one preset goal to the task, to monitor at least one feature of the exercises provided by the standard exercise modules, the users simultaneously performing such exercises, and/or at least one operation of the standard exercise modules, and to relate the exercises to each other based on at least one preset relation (to be referred to as the “first control module” hereinafter).
The first control module may also be arranged to manipulate at least one feature of the images at least partly based on the relation and at least partly based upon at least one feature of such exercises, users, or operation, whereby the users simultaneously proceed to attain the task goal while simultaneously performing the exercises on, with, and/or against the standard exercise modules which are provided in the same or different locations and whereby the control module directly or indirectly manipulates the task feature at least partly based on the relation or upon the feature of at least one of such exercises, users, and operation while communicating with the visual units or standard exercise modules via a local or global network encompassing such locations (to be referred to as the “first control functions” hereinafter).
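The nine standard exercise module variants enumerated in this embodiment can be summarized, for illustration only, as a simple enumeration; the member names below are hypothetical shorthand for the claim language and carry no weight of their own:

```python
from enum import Enum

class StandardExerciseModule(Enum):
    """Hypothetical labels for the first through ninth exercise modules."""
    GENERIC   = 1  # exercise on/with/against at least one portion of the module
    ACTUATOR  = 2  # actuating part moved against a preset load
    TREADMILL = 3  # track translating in a preset direction for walking/running
    CYCLE     = 4  # pedal rotating about an axis against a preset load
    WEIGHT    = 5  # movable weight pivoted, translated, reciprocated or rotated
    LEVER     = 6  # lever pivoting about a central point against a preset load
    VIBRATOR  = 7  # belt enclosing and vibrating a body part
    PAD       = 8  # pad moved or deformed against a preset load
    HANDLE    = 9  # handle moved against a preset load

# Any of the nine may serve as the first, second or third standard module.
chosen = StandardExerciseModule.TREADMILL
```

In the embodiments that follow, the "first", "second" and "third" standard exercise modules are each simply one of these nine, distinguished only by location and exercise type.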
- In another exemplary embodiment of this aspect of the invention, another exercise system may include the first standard exercise module and at least one of the second and third standard exercise modules, at least one output module, and the first control module which may perform the first control functions. The output module may include at least two visual units each provided in a disposition and an arrangement for displaying different portions of each of the images in each of multiple view angles of each user but within a viewable distance therefrom so that each user simultaneously views each portion of each image displayed on each visual unit one at a time while performing each exercise on, with or against each standard exercise module (to be referred to as the “second output module”).
- In another exemplary embodiment of this aspect of the invention, another exercise system may include the first standard exercise module and at least one of the second and third standard exercise modules, at least one output module, and the first control module which may be arranged to simulate at least one of the users as at least one simulated user included in the images for such one of the users, and to manipulate at least one feature of the simulated user of the images at least partly based on the relation and also at least partly based upon at least one feature of such exercises, users or operation, whereby the users may proceed along the task for the task goal while simultaneously performing the exercises on, with or against the standard exercise modules to be disposed in the different locations and whereby the control module may directly or indirectly manipulate the feature of the simulated user of the task at least partly based upon the relation and the feature of the exercises, users or operation while communicating with at least one visual unit and/or standard exercise module via a local or global network encompassing the locations (to be referred to as the “second control functions”). The output module may include at least two visual units each of which defines a preset configuration and is provided in an arrangement and disposition for displaying an entire portion of each image in a single view angle of each user and in a viewable distance therefrom due to the configuration, disposition, or arrangement so that each user simultaneously views the entire portion of each image displayed on each visual unit while performing each exercise on, with or against each standard exercise module (to be referred to as the “third output module” hereinafter).
- In another exemplary embodiment of this aspect of the invention, another exercise system may include the first standard exercise module and at least one of the second and third standard exercise modules, at least one output module, and the first control module which performs the second control functions. The output module may include at least two visual units each having a preset configuration and provided in a disposition and an arrangement to display different portions of each image in each of multiple view angles of each user but also in a viewable distance therefrom so that each user may simultaneously view each portion of each image displayed on each visual unit sequentially (or one at a time) due to the configuration, disposition, or arrangement while performing each exercise on, against or with each standard exercise module (to be referred to as the “fourth output module”).
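A minimal sketch of the monitor-and-manipulate loop common to the "first" and "second control functions" above may help fix ideas; it assumes a single numeric exercise feature (say, pedal cadence) driving a single image feature (say, the scrolling speed of the scenery) through a hypothetical linear preset relation:

```python
class FirstControlModule:
    """Sketch: monitor an exercise feature and manipulate a task-image
    feature through a preset relation (hypothetical, illustrative only)."""

    def __init__(self, relation):
        self.relation = relation   # preset relation: exercise -> image feature
        self.image_feature = 0.0   # e.g., scrolling speed of the scenery

    def on_exercise_sample(self, exercise_feature):
        # directly manipulate the image feature from the monitored feature
        self.image_feature = self.relation(exercise_feature)
        return self.image_feature

# Hypothetical relation: 10 rpm of cadence -> 1 unit of scenery speed
ctrl = FirstControlModule(relation=lambda cadence: cadence / 10)
speed = ctrl.on_exercise_sample(80)
```

The "second control functions" differ only in that the manipulated feature belongs to a simulated user embedded in the images rather than to the scenery at large.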
- In another aspect of the present invention, an exercise system is arranged to provide at least two users with at least one preset task of a story, scenery or video (or computer) game in images of at least one virtual environment, to allow the users to simultaneously perform exercises (or to perform such exercises in a delayed mode), and then to directly or indirectly manipulate at least one feature of at least one of such exercises at least partly based on at least one feature of the task.
- In one exemplary embodiment of this aspect of the invention, an exercise system includes the first standard exercise module, at least one of such second and third standard exercise modules, the first output module, and the first control module which is arranged to manipulate at least one feature of the operation at least partly based on the relation and also on at least one feature of the images of the task, whereby the users simultaneously proceed along the task for the task goal while simultaneously performing such exercises on, with or against the standard exercise modules which are disposed in different locations and at least one feature of which is arranged to be manipulated either indirectly or directly by the control module at least partly based on the task performed by at least one of the users while communicating with the visual units or standard exercise modules via a local or global network which is to encompass those locations (to be referred to as the “third control functions”).
- In another exemplary embodiment of this aspect of the invention, another exercise system may include the first standard exercise module and at least one of the second and third standard exercise modules, the second output module, and the first control module performing the third control functions.
- In another exemplary embodiment of this aspect of the invention, another exercise system may include the first standard exercise module and at least one of the second and third standard exercise modules, the third output module, and the first control module. Such a first control module may also be arranged to simulate at least one of the users as at least one simulated user included in the images for such a user, and to manipulate the feature of the operation at least partly based on the relation and on at least one feature of the images for the task, whereby the users simultaneously proceed along the task for the task goal while simultaneously performing the exercises on, with or against the standard exercise modules, at least one feature of which is arranged to be directly or indirectly manipulated by the control module at least partly based on the simulated user while communicating with the visual units or the standard exercise modules via a local or global network which encompasses such locations (to be referred to as the “fourth control functions”).
- In another exemplary embodiment of this aspect of the invention, another exercise system may include the first standard exercise module and at least one of the second and third standard exercise modules, the fourth output module, and the first control module performing the fourth control functions.
- In another aspect of the present invention, an exercise system is arranged to provide at least two users with at least one preset task of a story, scenery or video (or computer) game in images of at least one virtual environment, to allow such users to simultaneously perform exercises defining the same, similar or different types and extents (or to perform exercises in a delayed mode), to relate at least one first feature of the task, exercises, users or operation of the system to at least one second feature of at least another of the task, exercises, users or operation, and to directly or indirectly manipulate one of the first and second features at least partly based on the other thereof.
- In one exemplary embodiment of this aspect of the invention, an exercise system may include the first standard exercise module as well as at least one of the second and third standard exercise modules, where at least one of the first and second (or third) standard exercise modules provides the operation defining such a feature of the system. The system may also include the first output module, and at least one control module which is arranged to operatively couple with at least one of the output module and standard exercise modules directly or indirectly, to provide the task in such images of the virtual environment, to display the images on the visual units, to assign at least one goal to the task, to monitor at least one of the first and second features, to relate one of the first and second features to another thereof at least partly based upon at least one preset relation which may be stored therein, generated thereby, supplied by at least one of the users, and the like (to be referred to as the “second control module”). The second control module may be arranged to manipulate at least one of the first and second features at least partly based on another thereof, whereby the users may simultaneously proceed along the task for the goal while simultaneously performing the exercises on, with or against the standard exercise modules disposed in different locations, whereby the control module directly or indirectly manipulates another task feature or standard exercise modules at least partly based on the relation or monitored feature while communicating with the visual units or standard exercise modules via a local or global network linking those locations (to be referred to as the “fifth control functions”).
- In another exemplary embodiment of this aspect of the invention, another exercise system may include the first standard exercise module and at least one of such second and third standard exercise modules, where at least one of the first and second (or third) standard exercise modules provides the operation which defines such a feature of the system. The system may also have the second output module, and the second control module which may also perform the fifth control functions.
- In another exemplary embodiment of this aspect of the invention, an exercise system has the first standard exercise module and at least one of such second and third standard exercise modules, where at least one of the first and second (or third) standard exercise modules provides the operation defining the feature of the system. The system may include the third output module, and the second control module. The second control module may be arranged to simulate at least one of the users as at least one simulated user in the images for such a user, and to manipulate at least one of the first and second features at least partly based on another thereof, whereby the users may simultaneously proceed along the task for the task goal while simultaneously performing such exercises on, with or against the standard exercise modules incorporated in the different locations and whereby the control module directly or indirectly manipulates another feature of the task or standard exercise modules at least partly based on the preset relation or monitored feature while communicating with at least one of the visual units or standard exercise modules via a local or global network covering such locations (to be referred to as the “sixth control functions”).
- In another exemplary embodiment of this aspect of the invention, another exercise system may include the first standard exercise module and at least one of such second and third standard exercise modules, where at least one of the first and second (or third) standard exercise modules provides the operation defining such a feature of the system. The system may also have the fourth output module, and the second control module which may also perform the sixth control functions.
- In another aspect of the present invention, an exercise system is arranged to connect multiple different locations, to include at least one standard exercise module in each location, to define at least one preset task of a story, a scenery or a video or computer game each provided in images of at least one virtual environment, and then to allow multiple users to simultaneously perform exercises on, with or against the standard exercise modules disposed in the locations (or to perform such exercises in a delayed mode) while competing with each other in the task images at least partly based on such exercises performed by the users.
- In one exemplary embodiment of this aspect of the invention, an exercise system may include the first standard exercise module, at least one of the second and third standard exercise modules, at least one output module, and at least one control module. The output module may include at least two visual units one of which may be provided in the first location, another of which is provided in the second location, and each of which is provided in a disposition and an arrangement for displaying an entire portion of each image in a single view angle of each user and in a viewable distance therefrom so that each user may simultaneously view the entire portion of each image on each visual unit while performing each exercise on, with or against each of such standard exercise modules (which is to be referred to as the “fifth output module”). The control module is arranged to operatively couple with at least one of the output module and the standard exercise modules directly or indirectly, to provide the task in such images, to display the images on the visual units, to assign at least one goal to the task, to be disposed in one of the first and second locations, to monitor at least one feature of such exercises provided by the standard exercise modules, users simultaneously performing the exercises or at least one operation of the standard exercise modules, to relate such exercises with each other based on at least one preset relation (to be referred to as the “third control module”). 
The third control module may manipulate at least one feature of the images at least partly based on the relation and on at least one feature of the exercises, users or operation, whereby such users may simultaneously proceed along the task for the goal while simultaneously performing the exercises on, with or against the standard exercise modules in such locations and whereby the control module directly or indirectly manipulates the task feature at least partly based upon the relation or feature of the exercises, users or operation while communicating with the visual units or standard exercise modules disposed in another location through a local or global network (to be referred to as the “seventh control functions”).
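The "seventh control functions" above amount to a control module in one location aggregating exercise features reported from both locations over a network and updating the shared task accordingly. A minimal sketch, with hypothetical user names and an in-process stand-in for the local or global network:

```python
class NetworkedControl:
    """Sketch: a control module in one location relating exercises
    performed in two locations and ranking users in the shared task."""

    def __init__(self):
        self.positions = {}  # user -> distance covered in the virtual course

    def report(self, user, exercise_extent):
        # reports would arrive via the local or global network in practice
        self.positions[user] = self.positions.get(user, 0.0) + exercise_extent

    def leader(self):
        # the task image would highlight whichever user is ahead
        return max(self.positions, key=self.positions.get)

hub = NetworkedControl()
hub.report("user_in_first_location", 120.0)
hub.report("user_in_second_location", 150.0)
```

The same loop serves the "eighth control functions" once each reported extent is mapped onto a simulated user embedded in the images.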
- In another exemplary embodiment of this aspect of the invention, an exercise system has the first standard exercise module and at least one of the second and third standard exercise modules, at least one output module, and the third control module which performs such seventh control functions. The output module may also include at least two visual units, one provided in the first location, another provided in the second location, and each provided in a disposition and an arrangement for displaying different portions of each image in each view angle of each user but in a viewable distance therefrom so that each user simultaneously views each portion of each image displayed on each of such visual units sequentially (or one at a time) while performing each exercise on, with or against each standard exercise module (to be referred to as the “sixth output module”).
- In another exemplary embodiment of this aspect of the invention, an exercise system has the first standard exercise module and at least one of the second and third standard exercise modules, at least one output module, and the third control module. The output module may have at least two visual units, one provided in the first location, another provided in the second location, and each defining a configuration and provided in an arrangement and a disposition for displaying an entire portion of each image in a single view angle of each user and within a viewable distance due to such a configuration, disposition or arrangement so that each user may simultaneously view the entire portion of the image on each visual unit while performing each exercise on, with, and/or against each standard exercise module (to be referred to as the “seventh output module”). The third control module may be arranged to simulate at least one of the users as at least one simulated user included in the images of the task, to manipulate at least one feature of such images at least partly based on the relation and on at least one feature of the exercises, users or operation, whereby such users may simultaneously proceed along the task for the task goal while simultaneously performing the exercises on, with or against the standard exercise modules in different locations and whereby the control module directly or indirectly manipulates the task feature at least partly based on the relation or such a feature of the exercises, users or operation while communicating with the visual units or standard exercise modules disposed in another location in a local or global network (to be referred to as the “eighth control functions”).
- In another exemplary embodiment of this aspect of the invention, an exercise system has the first standard exercise module and at least one of the second and third standard exercise modules, at least one output module, and the third control module performing such eighth control functions. The output module may include at least two visual units, one provided in the first location, another provided in the second location, and each defining a preset configuration and also provided in a disposition and arrangement for displaying different portions of each image in each view angle of each user but also in a viewable distance therefrom so that each user may simultaneously view each portion of each of the images on each visual unit sequentially due to the configuration, disposition or arrangement while performing each exercise on, with or against each standard exercise module (to be referred to as the “eighth output module”).
- In another aspect of the present invention, an exercise system may be arranged to operatively connect multiple locations through a local network or a global network, to include at least one exercise module in each of the locations, to define at least one preset task of a story, a scenery, a video game, and/or a computer game each defining a preset goal for the task and provided in images for at least one virtual environment, and to allow each of multiple users to simultaneously perform exercises on, with, or against each of the exercise modules disposed in each of the locations while competing with each other in the images for the task goal at least partly based on said exercises performed by the users.
- In one exemplary embodiment of this aspect of the invention, an exercise system may have a first exercise module, a second exercise module, at least one output module, and at least one control module. The first exercise module is arranged to define a first exercise type and a first exercise load and to allow a first user to perform a first exercise while consuming energy thereof during the first exercise. The second exercise module is arranged to define a second exercise type and a second exercise load and to also allow a second user to perform a second exercise while consuming energy thereof during the second exercise, where the second exercise module may be arranged to operatively couple with the first exercise module via the network indirectly or directly, thereby allowing the first and second users to simultaneously perform the exercises while pursuing the goal of the task. The output module includes at least two visual units, one of which is provided in a first location, another of which is provided in a second location, and each of which is provided in a preset disposition as well as in a preset arrangement so as to display such images to each of the exercising users. 
The control module is arranged to operatively couple with the output module and/or exercise modules indirectly or directly, to define the goal of the task, to provide the task in the images, to display such images on the visual units, to monitor the first and/or second loads as well as the extents of the first and second exercises each performed by each of the users, to simulate such users into simulated users included in the images, to relate the type, load, and/or extent of the first exercise with at least one of those of the second exercise based on at least one preset relation, and then to perform manipulation of at least one of such simulated users in the images at least partly based on the types, loads, and/or extents related to each other by such a relation, thereby allowing the users to compete for attaining the task goal while simultaneously performing the exercises in such locations regardless of whether the types of the first and second exercises are identical to each other.
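The control module's role described above can be pictured as a loop that polls each exercise module for its monitored load and extent, applies a relation, and advances each simulated user in the images accordingly. The sketch below is illustrative only; the class names, the `report` interface, and the rule crediting more progress per unit extent at heavier loads are assumptions, not details from this specification.

```python
# Illustrative sketch of the control-module behavior; all names and the
# progress rule are hypothetical.

class ExerciseModule:
    """Stands in for a treadmill, bike, etc., reporting a monitored state."""
    def __init__(self, exercise_type, load):
        self.exercise_type = exercise_type
        self.load = load          # e.g., resistance level
        self.extent = 0.0         # e.g., distance covered so far

    def report(self):
        return {"type": self.exercise_type, "load": self.load,
                "extent": self.extent}

class ControlModule:
    """Monitors both modules and moves each simulated user along the task."""
    def __init__(self, modules):
        self.modules = modules
        self.positions = [0.0] * len(modules)   # simulated-user progress

    def update(self):
        # Manipulate each simulated user based on type, load, and extent.
        for i, m in enumerate(self.modules):
            r = m.report()
            # Assumed rule: heavier load earns more progress per unit extent.
            self.positions[i] = r["extent"] * (1.0 + 0.1 * r["load"])
        return self.positions

a = ExerciseModule("treadmill", load=2)
b = ExerciseModule("bike", load=5)
control = ControlModule([a, b])
a.extent, b.extent = 100.0, 80.0
print(control.update())   # both simulated users end up tied at 120.0
```

Under this assumed rule, a lighter-loaded user must cover more extent to keep pace, which is one way a single relation can let dissimilar efforts compete on equal footing.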
- The types of the first and second exercises may be identical to each other, where the control module performs such manipulation at least substantially based on the loads and/or extents of the first and/or second exercises. Such types of the first and second exercises may be different from each other, where the control module may be arranged to convert at least one of the extents from one unit to another unit and then to perform comparison of such converted extent with another of the extents, thereby performing such manipulation at least substantially based on such a comparison. The control module may be arranged to perform the manipulation by manipulating the simulated user in the images. The control module may instead be arranged to perform such manipulation not only by manipulating the simulated user in such images but also by manipulating the operation of at least one of such exercise modules based on such types, loads, and/or extents.
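The "convert then compare" step for dissimilar exercise types can be sketched as a conversion of each raw extent into a common unit before comparison. The calorie-per-unit factors below are hypothetical placeholders chosen for illustration, not physiological data from the specification.

```python
# Sketch of converting extents to a common unit (assumed: kcal) and
# comparing them; the conversion factors are invented for illustration.

CALORIES_PER_UNIT = {
    "treadmill_km": 60.0,   # assumed kcal per km walked or run
    "bike_km": 30.0,        # assumed kcal per km cycled
    "rower_strokes": 0.5,   # assumed kcal per stroke
}

def to_common_unit(extent, unit):
    """Convert a raw exercise extent into the assumed common unit."""
    return extent * CALORIES_PER_UNIT[unit]

def compare_extents(extent_a, unit_a, extent_b, unit_b):
    """Return +1, -1, or 0 depending on which converted extent is larger."""
    a = to_common_unit(extent_a, unit_a)
    b = to_common_unit(extent_b, unit_b)
    return (a > b) - (a < b)

# 2 km on a treadmill (120 kcal) vs. 5 km on a bike (150 kcal):
print(compare_extents(2.0, "treadmill_km", 5.0, "bike_km"))  # -1
```

The manipulation of the simulated users would then be driven by this comparison rather than by the raw, incommensurable extents.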
- Embodiments of such apparatus aspects of the present invention may include one or more of the following features, while configurational and/or operational variations and/or modifications of the foregoing systems also fall within the scope of the present invention.
- The system may provide the task in the auditory, olfactory or tactile features (i.e., the sounds, smells or sensations, respectively) of the virtual environment with the visual feature (i.e., the images), where each feature may represent, connote or be associated with at least one of multiple elements such as, e.g., a preset object, background, event, geographic region, activity, surrounding, and so on. The task may include the simulated user which is at least one object or background to be included in the images of the task of the story, scenery or game and which may be manipulatable or controllable by at least one of the features of such exercises, users, or operations of the exercise modules. The system (or its control module) may simulate only one of such users into a single simulated user, only one of the users into multiple simulated users, multiple users as a single simulated user, multiple users into multiple simulated users each simulating only one of the users, and the like.
- The exercise may include voluntary or involuntary physical or electrical activities of muscles of the users, leading to improvement or enhancement of a muscle tone, to an increase in a muscle mass or muscle volume, to a reduction in his or her weight, to an increased physical stamina, and the like. The standard exercise modules may be installed in the different locations so as to prevent each user from accessing both of the standard exercise modules without physically moving from one to another location. The standard exercise modules may directly couple to each other or, alternatively, indirectly couple to each other through the control module. At least one of the standard exercise modules may (or may not) be synchronized with another of the modules based on at least one of the features. All (or at least two) of different exercise modules may be disposed in different locations. In the alternative, all (or at least two) of different exercise modules may be disposed in the same location.
- The images for the task may be a single still picture an entire portion of which is displayed on the visual unit, a single still picture with multiple portions each displayed on the visual unit, a series of still pictures, or a video clip. The visual features may include at least one visual aspect such as, e.g., a shape, a size, a content, a color, a brightness, a contrast, a sharpness, a zoom, and a view angle of the image, a distance of portraying the image, temporal or spatial characteristics, distributions, or variations of any of the above, and so on. The visual features may include at least one visual aspect such as, e.g., a shape or size of the object (or background) or variations thereof provided in different view angles or distances, contents carried by the object (or background), color or brightness thereof, contrast or sharpness thereof, zoom and view angles thereof, and so on. The visual unit may acquire the images by storing such and then retrieving such, by receiving the images from the control module, by obtaining the images from an external source such as, e.g., the user, other persons, an internet, a broadcast, an external storage, and/or game console through wire or wirelessly, or by synthesizing (or composing) the images for itself. The visual unit may repeat at least a portion of such images in a preset sequence, randomly or based on another sequence at least partly determined by at least one of such features. The visual unit may define at least one image domain on which such images may be displayed and which may consist of a single portion, a pair of portions or multiple portions. The visual unit may provide the images in black-and-white, multicolor or a mixture. The visual unit may zoom in or out of such images, vary their view angles, rotate the images with respect to a rotation base, and the like. 
The visual unit may form such images by acquiring the object and background simultaneously, by acquiring the object and background independently and superposing one onto the other, by acquiring one of the object and background followed by synthesizing the other and composing the object and background, and the like. The visual unit may form the images by acquiring the simulated user and the rest of such elements simultaneously, by acquiring the simulated user and the rest of such elements independently and then superposing one over the other, by acquiring one of the simulated user and the rest of the elements, synthesizing the other, and then composing the object and background, and the like. The visual unit may be disposed on the exercise module, on the control module, away from the user, to be worn by the user over or around at least one of his or her body parts, or to be carried by the user. The visual unit may be incorporated into or provided as a wearable article such as, e.g., glasses, goggles, or helmets. The visual unit may display the object (or background) in a perspective of the user or in another perspective defined away from the user. The images may portray the object or background as may be perceived by the user or, in the alternative, the images may rather portray the object or background including the simulated user in another perspective of a third party. The control and/or output modules may vary the perspective during the exercise based on at least one of the features. The output and/or control modules may zoom in or out of the images while maintaining or changing the perspective based on at least one of the features. The control module may provide the simulated user by animating an appearance of the user, by adjusting a size of the user, by selecting one of prestored multiple simulated users, and the like.
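The superposition step above, in which an independently acquired object (such as the simulated user) is laid over a background, can be pictured as simple per-pixel compositing. The rows-of-pixels representation and the use of `None` as a transparency marker are simplifications assumed here; real visual units would use alpha channels or chroma keying.

```python
# Minimal sketch of superposing an acquired object over a background.
# Images are rows of pixel values; None in the object layer means
# "transparent, show the background" (an assumption for illustration).

def superpose(background, obj):
    """Overlay obj onto background, pixel by pixel."""
    return [
        [bg if o is None else o for bg, o in zip(bg_row, obj_row)]
        for bg_row, obj_row in zip(background, obj)
    ]

background = [[0, 0, 0],
              [0, 0, 0]]
runner     = [[None, 7, None],    # the simulated user occupies one column
              [None, 7, None]]

print(superpose(background, runner))   # [[0, 7, 0], [0, 7, 0]]
```

The same composition can run in either direction: the background may be synthesized around an acquired user, or a prestored simulated user may be laid over acquired scenery.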
- The output module includes at least one auditory unit to provide the sounds. The system may include at least two auditory units each capable of playing such sounds to each user simultaneously performing the same, similar or different exercises. The auditory features may include, e.g., a volume or loudness of the sounds, their tone, a balance when the auditory unit includes multiple speakers, a frequency of the sounds, their frequency distribution, their direction, temporal or spatial distributions, characteristics or variations of such sounds, and so on. The auditory unit may acquire the sounds by storing and retrieving such, by receiving the sounds from the control module, by obtaining the sounds through external sources including the users, an internet, a broadcast, an external storage, and/or the external game console either by wire or wirelessly, or by synthesizing or composing the sounds. The auditory unit may repeat at least a portion of the sounds in a preset sequence, randomly or based on a sequence at least partly decided by at least one of such features. The sounds may be real sounds, abstract sounds, or their mixture. The sounds may be a voice, a conversation, music, sounds of the animal or plant, sounds from or generated by the object, synthesized or composed sounds, and their mixture. The sounds may be in a mono or stereo mode. The auditory unit may provide the sounds as acquired or retrieved, with or without modifying at least one of the auditory features, by synthesizing or composing the sound, and the like. The auditory unit may include at least one cone-drive speaker, piezoelectric speaker, or electrostatic speaker. The auditory unit may be disposed on the exercise or control module, spaced away from the user, carried by the user, worn by the user over or around at least one of his or her ears or body parts, and the like. 
The auditory unit may be incorporated into or formed as a wearable article such as a helmet, an earphone, a headphone, and the like.
- The output module may also include at least one olfactory unit which may provide the smells. The olfactory feature may include a type of the smells, an intensity of the smells, a temporal or spatial distribution, characteristics, or variations thereof, and the like. The olfactory unit may include at least one storage storing at least one substance for the smells and may dispense the substance to create the virtual environment. The olfactory unit may include multiple storages each storing a substance for preset smell, may dispense a mixture from at least two of the substances for the virtual environment, may dispense the mixture of different substances in a preset order or randomly, and the like. Such an olfactory unit may include at least one dispenser which manipulates the substance to be discharged from the storage, where the dispenser may include therein at least one wick, nozzle or evaporator. The olfactory unit may give off the smells in a preset order, randomly, or based on at least one of the features. The olfactory unit may dispense the smells to a space adjacent to a portion of the body part of the user, where the portion may be an entire area around the nose of the user, a space covering an upper torso of the user, and the like. Such an olfactory unit may be disposed away from the user, carried by the user, or worn by the user near or around his or her nose.
- The output module may include at least one tactile unit which may provide the sensations. The sensations may include a mechanical sensation, an air flow, heat, coldness, electrical sensation, and the like. The tactile unit may repeat the sensations in a preset order, randomly, based on at least one of the features, and the like. The tactile features may include mechanical, thermal, optical, or electrical properties of at least one portion of the exercise module or, alternatively, the properties sensed by the user away from the exercise module. The tactile unit may have at least one actuator which provides different sensations at a contact between the user and exercise module by changing the mechanical property of the exercise module at a point contacting the user, where the mechanical properties may be, e.g., an elasticity, a modulus, a stiffness, a deformability, a bulk structure, a roughness, a surface structure, and the like. The tactile unit may include at least one air pump for generating an air flow to the user, where such mechanical properties may include an air flow rate, an air velocity, temporal or spatial distributions, characteristics, or variations thereof, and the like. The tactile unit may include at least one heater for irradiating heat (or infrared) rays to the user, for heating the part of the exercise module, for heating the air flow, and the like, where the thermal properties may include a temperature of the part of the exercise module or flow of air, a heat flux rate thereof or therethrough, a position of the heater relative to the user, temporal and/or spatial distributions, characteristics, and/or variations thereof, and the like. The tactile unit may be provided to be disposed away from the user, to be worn by the user over or around at least a portion of his or her body, to be carried by the user, and the like.
- The system (or its control module) may allow transfer of the task feature only from one to the other of the standard exercise modules, between the standard exercise modules, and the like. Such a system (or its control module) may allow the transfer wirelessly or by wire, through the control module or between other modules, and the like. The system (or its control module) may perform the transfer without altering the task feature (i.e., a simple transfer) or, in the alternative, by altering or converting the task feature based upon the relation (i.e., an equivalent conversion). The relation may be defined between at least one feature of the exercises and that of the task, between at least one feature of such exercises and the goal and/or stages of the task, between at least one feature of the users and that of the task, between at least one feature of the users and the goal and/or stages of the task, and the like. The relation may be defined between at least one feature of the operation and at least one of the features of the task, between at least one feature of the operation and the goal and/or stages of the task, and the like. The relation may be maintained constant during the exercises, may change during the exercises at least partly based upon at least one of the features, may account for physical fatigue of the users which may be reflected by the duration of the exercises and/or load, and the like.
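A relation that changes during the exercises, such as one accounting for fatigue via duration and load as described above, can be sketched as a time-varying multiplier applied to the raw extent. The linear decay model, its constants, and the 50% floor below are assumptions for illustration, not a formula from the specification.

```python
# Sketch of a fatigue-aware relation between a raw exercise extent and the
# task progress it earns.  The decay model and constants are hypothetical.

def fatigue_factor(duration_min, load, decay=0.01):
    """Effectiveness multiplier that falls as duration times load grows."""
    factor = 1.0 - decay * duration_min * load
    return max(factor, 0.5)   # assume effectiveness never drops below 50%

def related_progress(extent, duration_min, load):
    """Task progress credited for a raw extent, after fatigue adjustment."""
    return extent * fatigue_factor(duration_min, load)

# 10 units of extent after 20 minutes at load 2 -> factor 0.6 -> 6.0 units:
print(related_progress(10.0, 20, 2))
```

A constant relation is the special case `decay=0`; an equivalent conversion would further multiply by a unit-conversion factor before crediting progress.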
- The control module may be disposed in only one of the locations or, in the alternative, may be disposed in a location which may be different from such locations of the standard exercise modules. The control module may communicate with at least one of the standard exercise modules (or at least one of the users) wirelessly or by wire, may also communicate with at least one of such visual units wirelessly or through wire, and the like. The control module may receive at least one of the features from at least one other of the modules of the system and/or at least one of the users through wire or wirelessly, and may transmit at least one of the features to at least one other of the modules of the system and/or at least one of the users by wire or wirelessly. The network may cover the locations of a single city (or different cities) or those of a single country (or different countries). The network may encompass the locations of the same time zone or different time zones.
- The system may operatively couple with at least one external visual unit capable of displaying at least a portion of the images thereon. The external visual unit may operatively couple to the output and/or control modules through wire or wirelessly in order to supplement or replace at least a portion of the modules. The external visual unit may be a visual device including the CRT, LCD, OLED, IOLED, PDP or any screen for displaying the images in the black-and-white or color mode, where examples of the device may include a stationary or portable audiovisual device including at least one screen (e.g., a DVD player, a TV, and the like), a portable data processing device with at least one screen (e.g., a PDA, a data organizer, a laptop computer, and the like), a portable communication device including at least one screen (e.g., a cellular phone), and the like. The system may operatively couple with at least one external game console for providing at least a portion of the task. The external game console may operatively couple with the output and/or control modules by wire or wirelessly in order to supplement or replace at least a portion of such modules. The external game console may provide the task of the game in signals, where the control module may generate the images for the task, or, in the alternative, may provide the task in the images so that the control module may relay the images to the output module. Such an external game console may be a game device including at least one storage and at least one processor, where the storage may store algorithms for the task of the game, while the processor may execute the algorithms to provide the game to the user.
- In another aspect of the invention, a method is provided to participate at least two exercising users in at least one task of a story, a scenery, or a video (or computer) game each defining at least one task goal and provided in images of at least one virtual environment while allowing the exercising users to manipulate at least one feature of the images and to compete with each other for the task goal at least partly based on at least one of at least two exercises performed by the users.
- In one exemplary embodiment of this aspect of the invention, a method includes such steps of: exercising a first user on, with or against a first standard exercise module of an exercise system which is incorporated in a first location and arranged to facilitate the first user to consume energy by performing a first exercise provided by the first standard exercise module (to be referred to as the “first exercising”); exercising a second user simultaneously with the first user on, with or against a second standard exercise module disposed in a second location and also arranged to facilitate the second user to consume energy by performing a second exercise provided by the second standard exercise module, where the second location is different from the first location (to be referred to as the “second exercising”); arranging and disposing at least two visual units each providing an entire portion of the images in a single view angle of each of the users who simultaneously perform such exercises and each disposed within a viewable distance therefrom (to be referred to as the “first arranging”); displaying the images for the task on each of such visual units (which is to be referred to as the “first displaying”); monitoring at least one feature directly or indirectly related to at least one of the first and second exercises (to be referred to as the “first monitoring”); relating the first and second exercises to each other at least partly based upon at least one preset relation (to be referred to as the “first relating”); and manipulating at least one feature of the images at least partly based upon the relation and the monitored feature, thereby allowing at least one of the users to compete with another in such images of the task for the task goal through the exercises which are simultaneously performed by the users (to be referred to as the “first manipulating”). 
The above steps of monitoring to manipulating may be replaced by such steps of: monitoring at least one feature directly or indirectly related to both of the first and second exercises (to be referred to as the “second monitoring”); the first relating; and manipulating at least one feature of the images at least partly based on the relation and the monitored feature, thereby allowing the users to compete with each other in the images for the task goal through the exercises simultaneously performed by the users (to be referred to as the “second manipulating”).
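One pass through the monitoring, relating, and manipulating steps described in this embodiment can be sketched as three small functions chained together. The function names, the difference-based relation, and the "lead distance" image feature are illustrative assumptions.

```python
# Hypothetical sketch of one pass through the monitoring, relating, and
# manipulating steps; names and the difference relation are illustrative.

def monitor(exercise_state):
    """First monitoring: read a feature related to an exercise."""
    return exercise_state["extent"]

def relate(feature_1, feature_2, relation=lambda a, b: a - b):
    """First relating: apply a preset relation between the two exercises."""
    return relation(feature_1, feature_2)

def manipulate(image_feature, related_value, gain=1.0):
    """First manipulating: shift an image feature (e.g., a lead distance)."""
    return image_feature + gain * related_value

user_1 = {"extent": 12.0}
user_2 = {"extent": 9.0}
lead = manipulate(0.0, relate(monitor(user_1), monitor(user_2)))
print(lead)   # 3.0 -> the first user's avatar leads by 3 units
```

The second monitoring and manipulating variants would differ only in reading both exercises jointly before the same relating step, so the pipeline shape is unchanged.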
- In another exemplary embodiment of this aspect of the invention, a method includes the steps of: the first exercising; the second exercising; arranging and disposing at least two visual units each providing different portions of the images in multiple view angles of each of the users simultaneously performing the exercises and each disposed in a viewable distance (to be referred to as the “second arranging”); the first displaying; the first monitoring; the first relating; and the first manipulating. The above steps of monitoring to manipulating may be replaced by the steps of: the second monitoring; the first relating; and the second manipulating.
- In another exemplary embodiment of this aspect of the invention, a method includes the steps of: the first exercising; the second exercising; the first arranging; simulating at least one of the users as at least one simulated user (to be referred to as the “first simulating”); including the simulated user in the images of the virtual environment for at least one of such users (to be referred to as the “first including”); displaying the images including the simulated user on at least one of the visual units (to be referred to as the “second displaying”); the first monitoring; the first relating; and manipulating at least one feature of the simulated user at least partly based on the relation and monitored feature, thereby allowing at least one of the users to compete with another user in the images of the task for its goal during the exercises simultaneously performed by such users (to be referred to as the “third manipulating”). The above steps of simulating to manipulating may be replaced by the steps of: simulating each user as at least one simulated user (to be referred to as the “second simulating”); including the simulated users in the images (which will be referred to as the “second including” hereinafter); displaying such images including the simulated users on the visual units (to be referred to as the “third displaying”); the first monitoring; the first relating; and manipulating at least one feature of the simulated users at least partly based on the relation and monitored feature, thereby allowing the users to compete with each other in the images of the task for the task goal during the exercises simultaneously performed by the users (to be referred to as the “fourth manipulating”).
- In another exemplary embodiment of this aspect of the invention, a method includes the steps of: the first exercising; the second exercising; the second arranging; the first simulating; the first including; the second displaying; the first monitoring; the first relating; and the third manipulating. The above steps of simulating to manipulating may be replaced by the steps of: the second simulating; the second including; the third displaying; the first monitoring; the first relating; and the fourth manipulating.
- In another aspect of the invention, a method is provided to participate at least two exercising users in at least one task of a story, a scenery, or a video (or computer) game each defining at least one preset goal and provided in images of at least one virtual environment while manipulating at least one of multiple exercises performed by the users at least partly based upon at least one of multiple exercises performed by at least one of the users.
- In one exemplary embodiment of this aspect of the invention, a method includes the steps of: the first exercising; the second exercising; the first arranging; the first displaying; monitoring at least one feature directly or indirectly related to the first exercise which is performed by the first user (to be referred to as the “third monitoring”); the first relating; and manipulating at least one feature of the second exercise performed by the second user at least partly based upon the relation and monitored feature, thereby allowing the first user to compete with the second user in such images for the task by the first exercise (to be referred to as the “fifth manipulating”). The above steps of monitoring to manipulating may be replaced by such steps of: monitoring at least one feature directly or indirectly related to each of the exercises performed by each user (to be referred to as the “fourth monitoring”); the first relating; and manipulating at least one feature of the first and second exercises at least partly based on the relation and monitored features of the second and first exercises, respectively, thereby allowing the users to compete with each other in the images of the task for the goal during the exercises simultaneously performed by the users (to be referred to as the “sixth manipulating”).
- In another exemplary embodiment of this aspect of the invention, such a method may include the steps of: the first exercising; the second exercising; the second arranging; the first displaying; the third monitoring; the first relating; and the fifth manipulating. Such steps of monitoring to manipulating may be replaced by the steps of: the fourth monitoring; the first relating; and the sixth manipulating.
- In another exemplary embodiment of this aspect of the invention, a method has the steps of: the first exercising; the second exercising; the first arranging; the first simulating; the first including; the second displaying; the third monitoring; the first relating; and manipulating at least one feature of the second exercise performed by the second user at least partly based on the relation and monitored feature, thereby allowing the first user to compete with the second user through the simulated user in the images through the first exercise (to be referred to as the “seventh manipulating”). Such steps of simulating to manipulating may further be replaced by the steps of: the second simulating; the second including; the third displaying; the fourth monitoring; the first relating; and then manipulating at least one feature of the first and second exercises at least partly based on the relation and monitored features of the second and first exercises, respectively, thereby allowing the users to compete with each other by the simulated users in the images for the task goal through the exercises simultaneously performed by the users (which will be referred to as the “eighth manipulating”).
- In another exemplary embodiment of the same aspect, a method includes the steps of: the first exercising; the second exercising; the second arranging; the first simulating; the first including; the second displaying; the third monitoring; the first relating; and the seventh manipulating. The steps of simulating to manipulating may further be replaced by the steps of: the second simulating; the second including; the third displaying; the fourth monitoring; the first relating; and the eighth manipulating.
- In another aspect of the invention, a method is provided for competing at least two users who perform exercises in different locations with each other for at least one preset goal of at least one task of a story, scenery or video (or computer) game each having the goal and provided in images of a virtual environment at least partly based on at least one feature of at least one of the exercises.
- In one exemplary embodiment of this aspect of the invention, such a method includes the steps of: the first exercising; the second exercising; operatively coupling such standard exercise modules to each other via a local or global network encompassing the locations (to be referred to as the “first coupling”); the first arranging; the first displaying; the first monitoring through such a network; the first relating; and the first manipulating via the network. The steps of monitoring to manipulating may be replaced by the steps of: the second monitoring through the network; the first relating; and the second manipulating via the network. Such steps of monitoring to manipulating may be replaced by the steps of: the third monitoring via the network; the first relating; and the fifth manipulating via the network. Such steps of monitoring to manipulating may be replaced by the steps of: the fourth monitoring via the network; the first relating; and the sixth manipulating through the network.
- In another exemplary embodiment of this aspect of the invention, such a method includes the steps of: the first exercising; the second exercising; the first coupling; the second arranging; the first displaying; the first monitoring through the network; the first relating; and the first manipulating through the network. The above steps of monitoring to manipulating may be replaced by such steps of: the second monitoring via the network; the first relating; and the second manipulating via the network. The above steps of monitoring to manipulating may be replaced by the steps of: the third monitoring through the network; the first relating; and the fifth manipulating through the network. Such steps of monitoring to manipulating may be replaced by the steps of: the fourth monitoring through the network; the first relating; and the sixth manipulating through the network.
- In another exemplary embodiment of this aspect of the invention, a method includes the steps of: the first exercising; the second exercising; the first coupling; the first arranging; the first simulating; the first including; the second displaying; the first monitoring via the network; the first relating; and the third manipulating through the network. Such steps of simulating to manipulating may be replaced by the steps of: the second simulating; the second including; the third displaying; the first monitoring via the network; the first relating; and the fourth manipulating through the network. The above steps of simulating to manipulating may be replaced by the steps of: the first simulating; the first including; the second displaying; the third monitoring via the network; the first relating; and the seventh manipulating through the network. Such steps of simulating to manipulating may further be replaced by the steps of: the second simulating; the second including; the third displaying; the fourth monitoring through the network; the first relating; and then the eighth manipulating through the network.
- In another exemplary embodiment of this aspect of the invention, such a method includes the steps of: the first exercising; the second exercising; the first coupling; the second arranging; the first simulating; the first including; the second displaying; the first monitoring through the network; the first relating; and the third manipulating through the network. The steps of simulating to manipulating may be replaced by the steps of: the second simulating; the second including; the third displaying; the first monitoring through the network; the first relating; and the fourth manipulating through the network. Such steps of simulating to manipulating may be replaced by the steps of: the first simulating; the first including; the second displaying; the third monitoring through the network; the first relating; and the seventh manipulating through the network. Such steps of simulating to manipulating may be replaced by the steps of: the second simulating; the second including; the third displaying; the fourth monitoring through the network; the first relating; and the eighth manipulating via the network.
- Embodiments of such method aspects of the present invention may include one or more of the following features, while configurational or operational variations and/or modifications of the foregoing methods also fall within the scope of the present invention.
- Such defining the goal may include the step of: arranging one of the users to perform the task against another user performing the exercise of the same, similar, or different type and/or extent on, with, and/or against another standard exercise module. The defining the goal may include at least one of the steps of: fighting (or opposing, hiding from) at least one opponent manipulated by the task; overcoming (or opposing, hiding from) at least one obstacle provided by the task; proceeding through the obstacles; seeking at least one preset object hidden in the task; or assembling at least one preset shape from multiple objects provided thereby.
- Such simulating may include one of the steps of: simulating the single user into one or multiple simulated users included in the images for the task; simulating each of at least two users as the single simulated user (or each of at least two simulated users); simulating each of at least two but not all of the users into the single simulated user (or each of at least two simulated users), and the like. Such simulating may include the steps of: forming the simulated user as at least one object or background of the images; and changing at least one visual feature of the object or background at least partly based on at least one of the features of the exercise, user, or operation. The simulating may have the steps of: forming the simulated user as at least one object (or background) in the images; and changing at least one feature of the object (or background) with respect to other objects (or backgrounds) of the images at least partly based on at least one of the features of the exercise, user, or operation while maintaining other visual features. The changing may include at least one of the steps of: changing a shape or size of the simulated user; changing its color; changing its contrast or sharpness; changing its zoom; changing its view angle or distance; and changing its position. The changing may include at least one of the steps of: moving the simulated user with respect to the background based on at least one of the factors; varying a shape or size of the simulated user based on at least one of the factors; or changing an orientation of the simulated user based upon at least one of the factors. The changing may include at least one of the steps of: changing temporal characteristics of the visual feature; and varying its spatial characteristics. 
The simulating may include at least one of the steps of: forming the simulated user as a living organism (e.g., a person, animal, plant, and the like); forming the simulated user as a nonliving object; forming the simulated user as a mixture of the living organism and nonliving object, and the like. The forming may include at least one of the steps of: animating the simulated user after the user; synthesizing the simulated user using a preset program; selecting the simulated user from multiple simulated users, and the like. The simulating may also include one of the steps of: defining the simulated user in a perspective of the user; forming the simulated user in a perspective defined away (or at a preset distance) from the user, and the like. The simulating may include one of the steps of: maintaining a perspective depicting the simulated user throughout the task; changing the perspective during at least a portion of the task; and constantly changing the perspective during the task. The simulating may also include the step of: arranging the simulated user to walk, run, sprint, jump, throw, row, push, pull, turn, bend, rotate, swing, hit, and/or otherwise move based on at least one of the features. The simulating may also include at least one of the steps of: simulating at least one feature of the user into the simulated user manipulated by the user; simulating at least one feature of another user into the simulated user manipulated by the user, and the like.
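The visual-feature changes described above (moving the simulated user with respect to the background, varying its size, changing temporal and spatial characteristics) can be sketched in code. The Python below is purely illustrative; the class, field names, and update rules are assumptions, not part of the specification:

```python
from dataclasses import dataclass

@dataclass
class SimulatedUser:
    """Illustrative avatar whose visual features track the exercise."""
    x: float = 0.0           # position with respect to the background
    scale: float = 1.0       # spatial characteristic (apparent size)
    frame_rate: float = 1.0  # temporal characteristic (animation speed)

def update_avatar(avatar: SimulatedUser, belt_speed_mps: float, dt: float) -> None:
    """Change visual features of the simulated user based on one
    exercise feature (here, a hypothetical treadmill belt speed)."""
    # Move the simulated user with respect to the background.
    avatar.x += belt_speed_mps * dt
    # Faster exercise -> faster animation (temporal characteristic).
    avatar.frame_rate = max(1.0, belt_speed_mps / 2.0)
    # Vary apparent size with distance covered (spatial characteristic).
    avatar.scale = 1.0 / (1.0 + 0.01 * avatar.x)
```

For example, two seconds of running at 3 m/s advances the avatar 6 units, raises its animation rate to 1.5, and slightly shrinks its apparent size.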
- The disposing may also include at least one of the steps of: incorporating at least a portion of the visual unit into the standard exercise module(s); coupling such a portion away from the standard exercise module(s); and disposing such a portion on a structure supporting or enclosing the standard exercise module(s). The disposing may also include at least one of the steps of: incorporating at least a portion of the visual unit into a wearable article such as glasses, a goggle, a helmet, a hat, a cap, a head band, and the like; releasably coupling the wearable article with the user; releasably (or fixedly) coupling the article to a cloth of the user, and the like. The disposing may include one of the steps of: disposing a single visual unit for multiple users; providing each of the users with at least one visual unit, and the like. The disposing may include one of the steps of: disposing the visual units of same shapes or sizes to the user(s); and disposing the visual units of different shapes or sizes thereto.
- Such coupling may have one of the steps of: directly coupling the standard exercise modules; indirectly coupling the standard exercise modules; coupling the standard modules through a provider which is not a part of the system, and the like. Such coupling may include the step of: coupling the standard exercise modules wirelessly or through wire. The method may include the steps of: placing the standard exercise modules in the different (or geographically separate) locations; and preventing each of the users from accessing both of the standard exercise modules without physically moving out from one to another of the locations. Such a method may further include at least one of the steps of: encompassing such locations of a single city (or different cities) by the global network; encompassing such locations of a single country (or different countries) by the global network; encompassing the locations of a single time zone by the global network; and encompassing the locations of different time zones.
- The monitoring may include at least one of the steps of: monitoring at least one feature of the user(s); monitoring at least one feature of the exercise(s); and monitoring at least one feature of the operation of the exercise module(s). Such monitoring the user (or exercise, operation) feature may include at least one of the steps of: sensing at least one factor of at least one type of the user(s) (or exercises, operations); sensing at least one factor of at least one extent of the user(s) (or exercises, operations), and the like. The monitoring may include at least one of the steps of: sensing at least one of the features away from the user(s); sensing at least one of the features by contacting the user(s), and the like.
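A monitoring pass over the three feature classes distinguished above (user, exercise, operation) might be sketched as follows in Python; the sensor functions and key names are placeholders, not part of the specification:

```python
import random

def read_heart_rate_bpm() -> float:
    """Stand-in for a contact sensor (e.g., grip electrodes); hypothetical."""
    return 110.0 + random.random() * 10.0

def read_belt_speed_mps() -> float:
    """Stand-in for a sensor on the actuating part (non-contact); hypothetical."""
    return 2.5

def monitor_features() -> dict:
    """One sampling pass collecting a user feature, an exercise feature,
    and an operation feature, as distinguished above."""
    return {
        "user.heart_rate_bpm": read_heart_rate_bpm(),  # user extent
        "exercise.speed_mps": read_belt_speed_mps(),   # exercise extent
        "operation.load_level": 5.0,                   # operation extent
    }
```

Each sampled value could then be related to a task feature and used by the manipulating steps that follow.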
- The manipulating the task (or its feature) may include at least one of the steps of: changing at least a portion of the images only based on at least one feature of the exercises, operations or users; changing such a portion of such images with respect to the rest of the elements of the images based thereon; and varying the perspective, view angle, or distance related to the images. The manipulating the task (or its feature) may include at least one of the steps of: varying the simulated user only based upon at least one feature of the exercises, operations or users; changing the simulated user relative to the rest of the elements of the images based thereupon; and changing the perspective, view angle, or distance related with the simulated user. The manipulating the exercises (or feature thereof) may include at least one of the steps of: requiring the users to maintain the posture during the exercises (or task); requiring the users to vary the posture during the exercise (or task) based on at least one feature of the users, task, operations, and user input. The manipulating the exercise (or its feature) may include at least one of the steps of: maintaining the load during the exercises (or task); changing the load during the exercises (or task) based on at least one feature of the user, tasks, operation, or user input, and so on. The manipulating the exercise (or feature thereof) may include at least one of the steps of: varying the load based on a fatigue of the users; and varying the load based on at least one feature of the task without considering the fatigue. The manipulating the operation (or its feature) may include at least one of the steps of: controlling a configuration, arrangement or disposition of the actuating part based on at least one feature of the exercises, users or task; controlling the load of the standard exercise modules based on at least one feature thereof, and the like.
The manipulating the operation (or its feature) may include at least one of the steps of: maintaining the load of the standard exercise modules during the exercises (or task); and then varying the load during the exercises. The manipulating the operation (or feature thereof) may include at least one of the steps of: performing the manipulating manually; performing the manipulating based on at least one feature of the task, user, and exercise; and performing the manipulating based on the user input.
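One reading of "varying the load based on a fatigue of the users" is a simple feedback controller that uses heart rate as a fatigue proxy. The sketch below is illustrative only; the function name, target, and gain values are assumptions:

```python
def adjust_load(current_load: float, heart_rate_bpm: float,
                target_bpm: float = 140.0, gain: float = 0.01) -> float:
    """Proportional load controller: raise the load when the user is
    under the target heart rate, lower it when over (fatigue proxy)."""
    error = target_bpm - heart_rate_bpm   # positive when under target
    # Scale the load by the relative error; never go below zero load.
    return max(0.0, current_load * (1.0 + gain * error))
```

A user at 120 bpm against a 140 bpm target sees a load of 100 rise to 120; at 160 bpm it falls to 80. The alternative listed above (varying the load from a task feature "without considering the fatigue") would simply substitute a task-derived quantity for the heart-rate term.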
- The method may include the steps of: providing at least one auditory unit; and playing sounds (or at least one auditory feature) during such exercises. The method may include at least one of the steps of: providing at least one auditory feature (or sounds) related with the task, users, exercises, or operations; providing at least one olfactory feature (or smells) related to the task, users, exercises, or operations; further providing at least one tactile feature (or sensations) related to the task, users, exercises, or operations, and so on. The method may also include one of the steps of: synchronizing the auditory, olfactory, or tactile features with the images; arranging the auditory, olfactory, or tactile feature independent of the images, and the like. The method may also include the steps of: coupling the system to at least one external visual device; and utilizing visual capacity of the device to display at least a portion of the images. The method may include the steps of: coupling the system to at least one external game console; and utilizing game generating capacity of the device to define at least a portion of the task.
- More product-by-process claims may be constructed by modifying the foregoing preambles of the apparatus (or system) claims and/or method claims and by appending thereto such bodies of the apparatus (or system) claims and/or method claims. In addition, such process claims may include one or more of such features of the apparatus (or system) claims and/or method claims of this invention.
- As used herein, the term “exercise equipment” is synonymous with the term “fitness equipment” and refers to various prior art equipment which is primarily intended to improve or enhance a muscle tone of a user, to increase his or her muscle mass or volume, to force or facilitate the user to reduce his or her weight, to increase physical stamina of the user, and the like. To such ends, the “exercise equipment” typically forces or facilitates the user to consume energy by performing physical work on, with or against the equipment or by receiving physical or electrical energy from the equipment in order to twitch his or her muscles based thereon. Therefore, the “exercise equipment” within the scope of the invention does not refer to those prior art devices primarily intended to engage the user in playing physically simulated games or video (or computer) games, although such “exercise equipment” of this invention may be modified to allow the user to engage in such simulated or video (or computer) games while improving or enhancing the muscle tone of the user, increasing the muscle mass or volume of the user, forcing or facilitating the user to reduce his or her weight, or increasing physical stamina of the user. Accordingly, such “exercise equipment” of the present invention may refer to various prior art equipment, examples of which may include, but not be limited to, cardio-exercise equipment, weight training equipment such as, e.g., abdominal machines and stretching machines, and the like.
Examples of the cardio-exercise equipment may include, but not be limited to, treadmills, running machines, stair climbers, exercise cycles and/or bikes, rowing machines, combinations of such, and so on, whereas examples of the weight training equipment may include, but not be limited to, various home gyms, weight machines, curls, extensions, racks such as squat racks, presses, crunches, benches which include incline and decline types, extension benches, bench racks, weight benches, various exercise chairs, leverages, dips, boards, and the like. In addition, such “exercise equipment” may include various prior art devices capable of delivering the physical energy to the user in order to improve or enhance the muscle tone of the user, to increase his or her muscle mass or volume, to force or facilitate the user to reduce his or her weight, or to increase physical stamina of the user. Such “exercise equipment” may also include various prior art devices capable of providing electrical energy to the user in order to improve or enhance the muscle tone of a user, to increase the muscle mass or volume, to force or facilitate the user to reduce his or her weight, or to increase such physical stamina of the user.
- An “exercise module” of an exercise system of this invention is generally similar or identical to the exercise equipment described in the previous paragraph. More particularly, the “exercise module” corresponds to any of the above exercise equipment as well as any modifications thereof according to various teachings as set forth herein. For example, the “exercise module” may refer to any prior art exercise equipment or, in the alternative, may incorporate thereinto one or more of various units of the output and/or control modules of the exercise system as set forth herein. It is appreciated, however, that each “exercise module” always includes at least one actuating part which is typically designed to contact a body part of the user and to receive physical energy from the user therewith and that such an actuating part may be fabricated as a track, a pedal, a weight, a lever, a handle, a belt, a pad, and the like. More particularly, the track is arranged to translate and to allow the user to perform physical exercise of walking or running on such a track while consuming energy during the exercise, while the pedal is arranged to couple with a preset load and to rotate about an axis of rotation against the load when the user performs physical exercise of rotating the pedal while consuming energy during such exercise. The weight is arranged to translate vertically or transversely and to be moved as the user performs physical exercise of lifting the weight while consuming energy during such exercise, while the lever is arranged to couple with a preset load and to pivot about a central point against the load as the user performs physical exercise of displacing, reciprocating or otherwise pivoting the lever while consuming energy during the exercise. 
The belt is arranged to enclose at least one body part of the user thereabout and to translate or reciprocate to allow the body part of the user to perform physical exercise of vibration while consuming energy during such exercise, and the pad is arranged to couple to a preset load and to translate, rotate, pivot, deform, or otherwise move against the load for allowing the user to perform exercise of translation, rotation, pivoting, deformation or other movements during such exercise. The handle is arranged to couple with a preset load and/or the above weight, lever, or pad and to translate, reciprocate, rotate, pivot, deform or otherwise move against such a load in order to allow the user to perform physical exercise of translation, rotation, reciprocation, pivoting or other movements of the handle while consuming energy during the exercise. It is noted that the actuating part may be an electrode through which electrical energy is supplied to the user to twitch the muscle of the user while forcing or facilitating the user to consume the energy.
- The term “exercise” refers to any voluntary or involuntary activities of various muscles of the user which consume energy of the user and which lead to improvement or enhancement of a muscle tone of the user, to an increase in a muscle mass or volume of the user, to reduction in the weight, or to an increased physical stamina of the user. In general, various characteristics of the “exercise” are typically determined by the exercise module or, more specifically, operations of the exercise module.
- As used herein, the term “user inputs” refers to various inputs which are supplied by the user onto various modules of the system of this invention in order to manipulate various operations of such modules such as the exercise, output, and control modules. The user may supply such “user inputs” by applying mechanical inputs to various input units such as, e.g., conventional keys, key pads, touch screens, track pads, track balls, track sticks or rods, mice, handles, joysticks, pedals, and the like. The user may supply the “user inputs” by applying mechanical, thermal, electric or magnetic signals to the modules of the system, by generating movements of his or her body part which are monitored by at least one of such modules of the system, by generating voice, face or body signals which may also be monitored or sensed by at least one of such modules, and the like.
- As used herein, “features of operation” (or “operation features”) include “types of operation” (or “operation types”) and “extents of operation” (or “operation extents”) and are attributed to and/or determined by a specific exercise module of the system. Such “operation types” may be affected or determined by various factors which may include, but not limited to, a shape or size of the actuating part of the exercise module, a position of the actuating part in the exercise module, a contacting mode between a body part of a user and the actuating part during an operation of the exercise module, a movement (i.e., a direction, a displacement, or a sequence) of the actuating part during the operation, and so on. The “operation extents” may also be affected or determined by various factors which may include, but not limited to, a load required for operating the actuating part, a duration of the operation, an amount of energy provided to the user by the exercise module, a mathematical function of the load, duration, and/or amount, and the like.
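The split between operation types and operation extents described above might be modeled as a simple record. The sketch below is illustrative only; the class and field names are assumptions, and `extent_index` is just one of the "mathematical functions of the load, duration, and/or amount" the text allows:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OperationFeatures:
    """Illustrative grouping of the operation features described above."""
    # operation types (qualitative factors)
    actuating_part: str   # e.g., "track", "pedal", "weight", "lever"
    movement: str         # direction, displacement, or sequence
    # operation extents (quantitative factors)
    load: float           # load required to operate the actuating part
    duration_s: float     # duration of the operation
    energy_j: float       # energy provided to the user

    def extent_index(self) -> float:
        """One possible mathematical function of load and duration:
        simply their product."""
        return self.load * self.duration_s
```

A treadmill operation of 50 load units sustained for 60 seconds would then carry an extent index of 3000.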
- As used herein, “features of exercise” (or “exercise features”) include “types of exercise” (or “exercise types”), “extents of exercise” (or “exercise extents”) and generally result from such operation features. The “exercise types” may be affected or determined by various factors which may include, but not be limited to, a posture of a user required for a specific exercise, an orientation of the user therefor, a movement of the user required therefor, a body part of the user required or recruited therefor, a body part of the user contacting the actuating part of the exercise module therefor, a body part of the user to which mechanical or electrical energy is supplied therefor, and so on. The “exercise extents” may be determined or affected by various factors which may include, but not be limited to, the load which is imposed to such exercise by the exercise module, a duration of the exercise, an amount of energy provided to or consumed by the user during the exercise, a number of calories measured or estimated to be consumed by the user, a product of the load and duration, a temporal integration of the load over the duration, a mathematical function of the load, duration, and/or amount, and the like.
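The extent measures listed above include "a product of the load and duration" and "a temporal integration of the load over the duration." The Python sketch below (illustrative only; the function name and sampling scheme are assumptions) computes the latter numerically with the trapezoidal rule:

```python
def exercise_extent(load_samples: list, dt: float) -> float:
    """Approximate the temporal integration of the load over the
    duration from load samples taken every dt seconds (trapezoidal rule)."""
    total = 0.0
    for a, b in zip(load_samples, load_samples[1:]):
        # Area of one trapezoid between consecutive samples.
        total += 0.5 * (a + b) * dt
    return total
```

For a constant load the integral reduces to the product of load and duration: 50 units held for 4 seconds gives 200 unit-seconds either way, while a varying load is captured only by the integral.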
- As used herein, “user features” include “user types” and “user extents” and are related to or affected by a period of a specific exercise performed by a user, thereby reflecting a physical fatigue of the user when desired. The “user types” may be affected or determined by a variety of factors of the user which may include, but not be limited to, a height, a weight, a body fat percent, a sex, an age, a race, a health or disease status, a handicap, a physical and/or physiological condition before, during or after an exercise (i.e., body temperature, systolic, diastolic or other cardiovascular conditions such as a blood flow rate, blood pressure, heart rate, or ECG, a respiratory condition such as a respiratory rate, a respiratory air flow rate, and a pressure along an air way, an EEG, an EMG, and the like), an orientation of the user needed for the exercise, a posture of the user needed therefor, a movement of the user needed therefor, a body part of the user needed therefor, a body part of the user in contact with the exercise module (or its actuating part) therefor, a body part of the user to which mechanical or electrical energy is supplied, and so on. The “user extents” may similarly be affected or determined by various factors which may include, but not limited to, the physical or physiological conditions of the user before, during or after the exercise, a duration of the exercise, an amount of energy consumed by or provided to the user during the exercise, a mathematical function of the load, duration, and/or amount, and the like.
- It is to be understood that the above “load” generally represents resistance to the operation of the actuating part of the exercise module, resistance to the exercise, and the like. Such a “load” may then be represented by various means, examples of which may include, but not be limited to, a mass of at least a portion of the actuating part, a mass of a weight coupling with the actuating part, an angle of the actuating part with respect to the user and/or exercise module, a spring constant or elasticity of at least a portion of the actuating part, an electrical property of an electric element functionally related to the actuating part, a viscosity of at least a portion of the actuating part, a viscosity of a dash pot (or a viscous element) coupling with the actuating part, a speed and/or acceleration of the actuating part, a displacement of the actuating part, and the like. In this context, such a “load” is deemed as a variable and/or a parameter determining an amount of energy which is to be consumed by the user in order to consummate a unit displacement or deformation of the actuating part of the exercise module and/or a specific body part of the user. It is to be understood that this “load” may also be quantified by various amounts of energy which may be required for the user to walk or run a unit distance with respect to the actuating part of the exercise module, which may be required to translate, rotate, pivot, deform or bend at least a portion of the actuating part by a preset linear or angular length, which may be needed to rotate, pivot, bend or deform at least a portion of the actuating part about a preset angle, and so on.
This “load” may be adjusted by various means, examples of which may include, but not be limited to, adjusting the speed or acceleration of the actuating part of the exercise module, adjusting the angle of the actuating part, the mass of the weight or at least a portion of the movable or deformable portion of the actuating part, the spring constant or modulus thereof, the viscosity of at least a portion thereof, a length of such a movable or deformable portion, a curvilinear trajectory of the movable or deformable portion, and the like. In general, the “load” is defined in such a manner that the user has to consume a greater amount of energy as the “load” increases.
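Since the "load" above is defined as energy consumed per unit displacement or deformation of the actuating part, the relation can be sketched in Python. This is illustrative only; the incline model is an assumption, not part of the specification:

```python
import math

def energy_consumed(load: float, displacement: float) -> float:
    """Energy the user must consume: load (energy per unit displacement
    of the actuating part) times the displacement achieved."""
    return load * displacement

def load_with_incline(base_load: float, incline_deg: float) -> float:
    """One adjustment means listed above: raising the angle of the
    actuating part (e.g., a treadmill track) increases the load.
    The sine model here is a hypothetical choice."""
    return base_load * (1.0 + math.sin(math.radians(incline_deg)))
```

Doubling the displacement at a fixed load doubles the energy consumed, consistent with the definition that a greater "load" forces the user to consume more energy per unit of movement; a 30-degree incline raises a base load of 100 to 150 under this model.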
- As used herein, “task features” include “primary task features” and “secondary task features” and are defined by or in a specific task. The “primary task features” include “primary task types” and “primary task extents.” Such “primary task types” may be affected or determined by various factors which may include, but not be limited to, a type of a goal of the task, a number of stages defined therein, levels or skills needed to complete a stage of the task, means of accomplishing the task goal, means of proceeding through such stages, characteristics (i.e., configuration, arrangement, or disposition) of a simulated user (only if the system defines at least one simulated user), means of manipulating such a simulated user, and the like. The “primary task extents” may be affected or determined by various factors which may include, but not limited to, a stage of the task in which the user or simulated user is currently disposed, a level or skill needed in the current stage, a status of the user or simulated user, a duration of the user or simulated user engaged in the current stage or in the task, and the like. The “secondary task features” may similarly include “secondary task types” and “secondary task extents.” The “secondary task types” may be affected or determined by various factors which may include, but not be limited to, a type of images or visual feature (i.e., a single still picture, an entire or only a portion of which is to be displayed, a series of still pictures, a video clip, and the like), a mode of such images or visual feature (i.e., black and white, grey-scale, color-scale), a dimension of such images or visual feature (i.e., two-dimensional or three-dimensional), and so on.
The “secondary task extents” may be affected or determined by various factors which may include, but not be limited to, a portion selected from multiple portions of a still picture, a still picture selected from a series of still pictures or video clip, a sequence of selecting a next portion of the still picture, a sequence of selecting a next still picture in a series of still pictures or video clip, a speed or a gap between displaying the portions or pictures, a viewing area or a zoom, a view angle or perspective angle of the picture or video clip, a basis of the view or perspective angle, and so on. It is understood that the “task features” are to collectively refer to the “primary task features” and the “secondary task features” unless otherwise specified, that the “task types” are to collectively refer to the “primary task types” and the “secondary task types” unless otherwise specified, and that the “task extents” similarly collectively refer to the “primary task extents” and the “secondary task extents” unless otherwise specified.
- The term “location” is to mean a three-dimensional space which includes at least one entrance or door. The space of a “single location” may be open or include at least one partition which does not block a user from getting therearound or thereacross, whereby the user may access every corner of the “location.” Therefore, when multiple exercise modules are disposed in a “single location,” all of the exercise modules are to be physically accessible to the user, without requiring the user to get out of the “location” through one of its doors and then to enter the same “location” through another door in order to access another exercise module disposed in a different corner of that “location.” On the contrary, “different locations” are to mean three-dimensional spaces which do not share any common entrance or door therebetween. Accordingly, when the exercise modules are disposed in “different locations,” the user cannot physically access all exercise modules from one “location.” Rather, he or she has to get out of one “location” and then to enter a different “location” in order to access the exercise module disposed in that different “location.”
- As used herein, a “game” collectively refers to what is conventionally known as computer or video games. The “game” is typically played by a specific program, e.g., by generating two- or three-dimensional images using a visual unit based on the program and then electrically manipulating at least a portion of such images so as to attain a preset goal specifically defined by the “game.” The “game” incorporates in such images at least one (but preferably multiple) animated and/or imaginary objects or backgrounds such that a user of an exercise system of this invention may manipulate at least one of such objects or backgrounds to accomplish the goal of the “game” according to various preset rules defined therefor. The “game” is typically arranged to allow the user to compete with the program or to allow multiple users to compete for the same goal, where examples of such “games” may include, but not limited to, prior art video games in which the user has to fight one or multiple animated or imaginary opponents manipulated by the program or another user, prior art video games in which the user has to proceed through animated or imaginary obstacles provided or manipulated by the programs or another user, conventional video games in which the user has to identify or find preset animated or imaginary objects hidden by the program or another user, prior art computerized card games in which the user plays against the program or another user using animated or imaginary cards, prior art computerized board games in which the user plays against the program or another user on animated or imaginary boards with their animated or imaginary pieces, conventional computerized puzzles in which the user plays against the program or another user using animated or imaginary pieces, and the like. Examples of the “game” may also include computerized equivalents of any other conventional sport games, war games, card games, board games, puzzles, and the like.
In each of the above examples, the user can preferably manipulate at least one object or background incorporated into such images so as to attain the goal of the “game,” where the manipulatable or controllable object or background will be referred to as a “simulated user” hereinafter. When desirable, the “game” may be arranged to synchronize the images with sounds for various purposes such as, e.g., assisting the user, audibly depicting certain stages, events, and/or landmarks of the “game,” enhancing audiovisual quality of the “game,” and the like. When the exercise system of this invention is synchronized with the “game” and the user is to play the “game” while performing the exercise, the “game” is to correspond to the task of the exercise performed by the user, whereas the goal of the “game” is to correspond to the goal of the task, i.e., to accomplish the preset goal of the “game” by manipulating the simulated user of the images according to the preset rules of the “game” against the rest of the animated or imaginary objects or backgrounds manipulated by the program or another user. The “game” or task may include any arbitrary number of stages therein. For example, the “game” may define a single stage in which the user manipulates the simulated user for the task goal, may define multiple stages in which the user may proceed to a next stage only by achieving a preset goal defined in a current stage, and the like. The “game” may define a single or multiple goals and maintain such goals therethrough, through the operation of the exercise module, through exercise of the user, and the like. Alternatively, the “game” may define a single goal and then change the goal therealong, through the operation of the exercise module, through exercise of the user, and the like. Alternatively, the “game” may define multiple goals, select one of such goals, and thereafter select another goal, through the operation of the exercise module, through exercise of the user. 
In another alternative, the “game” may define a single or multiple goals and then manipulate its goal at least partly based upon any of such features. Similarly, such a “game” may define a single or multiple objects (or backgrounds) and employ such objects (or backgrounds) therethrough, through the operation of the exercise module, through exercise of the user, and the like. In the alternative, the “game” may define a preset number of the objects (or backgrounds) and thereafter vary a number or characteristics of the objects (or backgrounds), during the operation of the exercise module, through exercise of the user, and so on. Alternatively, the “game” may define a preset number of objects (or backgrounds), select therefrom a certain number or characteristics of such objects (or backgrounds), and thereafter select another number or characteristics thereof, through the operation of the exercise module, through exercise of the user, and the like. In the alternative, the “game” may define a preset number of objects (or backgrounds) and then manipulate at least one of the objects (or backgrounds) at least partly based on any of the above features.
- It is appreciated that the “game” within the scope of this invention is to provide such images including therein only those objects (or backgrounds) which are either animated or imaginary, but not to provide such images including therein any objects (or backgrounds) which are merely photographs or photographical versions of real objects (or backgrounds) present in a real environment, where the latter images will be explained in greater detail below in conjunction with a “story” or a “scenery.” It is appreciated that the “game” of this invention is to be embodied differently from such prior art video or computer games as commonly seen in prior art game consoles. More particularly, the “game” primarily distinguishes itself from the prior art video or computer games in that the simulated user included in the images of the “game” is to be manipulated at least partly based on at least one of various features of a preset task (i.e., the “game”), a user of the exercise system, at least one operation of an exercise module of the system, or an exercise provided by the exercise module, where all of such features are either directly or indirectly affected or determined by such exercise performed by the user on, with or against the exercise module through applying mechanical energy to the actuating part of the exercise module while allowing the user to consume energy during the exercise. To the contrary, the simulated user included in the images of the prior art video or computer games is manipulated entirely based on user inputs applied thereto by various body parts of the user and, more importantly, not based upon user inputs applied to the actuating part of the exercise module.
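The distinction drawn in this paragraph — a simulated user driven by features of the exercise itself rather than by conventional controller input — can be pictured with a minimal sketch. This is purely illustrative; the function, parameter names, and the particular scaling are invented and do not come from the specification:

```python
# Hypothetical sketch: the simulated user's in-game motion is derived from
# features of the exercise (e.g., treadmill belt speed and incline) supplied
# through the actuating part, not from ordinary game-controller input.

def simulated_user_velocity(belt_speed_mps, incline_pct):
    """Map exercise features to the simulated user's in-game speed.

    belt_speed_mps: speed the user sustains on the actuating part (m/s)
    incline_pct:    treadmill incline in percent; a steeper incline yields
                    a bonus, rewarding greater energy consumption.
    """
    incline_bonus = 1.0 + 0.05 * incline_pct  # arbitrary illustrative scaling
    return belt_speed_mps * incline_bonus

# A faster, steeper run moves the simulated user faster through the game.
flat_jog = simulated_user_velocity(2.0, 0.0)
hill_run = simulated_user_velocity(3.0, 10.0)
```

Under this sketch, the user cannot advance the simulated user without actually applying mechanical energy to the actuating part, which is the contrast with prior art games the paragraph describes.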
- As used herein, a “story” is to collectively refer to images which are provided sequentially and which generally correspond to prior art movies, plays, shows, musicals, concerts, and the like, where the images of each of the “stories” as a whole define artistic (i.e., literary or musical) themes and are arranged in artistic sequences. In general, such a “story” within the scope of the present invention is to be distinguished from the “game” in two major aspects. In one aspect, the “game” always includes therein at least one simulated user to be manipulated based on at least one of various features of the task, user, exercise, or operation of an exercise module, while the “story” is not to include therein any simulated user. Therefore, the user of the “story” cannot manipulate any objects (or backgrounds) of such images, contrary to the user of the “game” who can manipulate at least one of such objects (or backgrounds) as set forth above. It is to be understood that, although the user of the “story” may not manipulate any objects (or backgrounds) in the images, the user may manipulate other features of the images such as, e.g., a view angle of the images, a speed of displaying each image, a temporal gap between displaying such images, a sequence of displaying the images or groups of such images, and the like. In another aspect, the “game” always includes therein such objects (or backgrounds) which are animated or imaginary, while the “story” is to include only photographs or photographical versions of real objects (or backgrounds) present in a real environment. Within the scope of this invention, the images including not only the real object (or background) but also the animated or imaginary object (or background) are to be deemed as the “story” as far as the images define the artistic themes and are arranged in the artistic sequence. 
When desirable, the “story” may be arranged to synchronize such images with sounds for various purposes such as, e.g., assisting the user, audibly depicting certain stages, events or landmarks of the “story,” enhancing audiovisual quality of the “story,” and the like. When the exercise system of this invention is synchronized with the “story” and the user is to view the “story” while performing the exercise, viewing such a “story” is to correspond to the task of his or her exercise, while the goal of the “story” is to correspond to the goal of the task of the user, i.e., to view a preset or entire portion of the “story” by manipulating various features of such images as set forth in this paragraph. The “story” or task may include any arbitrary number of stages therein, where such stages may be formed as chapters, plots or parts, primarily based upon contents or contexts thereof, time, and the like. Accordingly, the “story” may include a single stage in which the user views only a portion or entire portion thereof depending upon various features, may have multiple stages in which the user is allowed to proceed to a next stage only upon achieving a preset goal defined for a stage, and so on. The “story” may also define a single or multiple goals and maintain the goals therethrough, through the operation of the exercise module, through exercise of the user, and so on. Alternatively, the “story” may define a single goal and thereafter vary the goal, through the operation of the exercise module, through the user's exercise, and so on. In the alternative, the “story” may set forth multiple goals, select one of the goals and then select another goal, through the operation of the exercise module, or through exercise of the user. In another alternative, the “story” may define a single or multiple goals and thereafter manipulate its goal at least partly based upon any of the features.
- As used herein, a “scenery” is to collectively refer to images which are provided sequentially and generally correspond to prior art visual archives, documentaries, movies, and the like, where the images of each of such “sceneries” define aesthetic themes as a whole and where such images are arranged in geographic sequences. Similar to the above “story,” a “scenery” within the scope of this invention is to be also distinguished from the “game” in two major aspects. In one aspect, the “game” always includes at least one simulated user to manipulate the rest of the objects (or backgrounds) of such images therewith, while the “scenery” is not to include therein any simulated users. Therefore, the user cannot manipulate any objects (or backgrounds) of the images of the “scenery,” contrary to the user of the “game” who manipulates at least one of such objects (or backgrounds) as described above. It is to be understood that, although the user of the “scenery” may not manipulate any objects (or backgrounds) in the images, the user may manipulate other features of the images such as, e.g., a view angle of the images, a speed of displaying each image, a temporal gap between displaying the images, a sequence of displaying the images or groups of multiple images, and the like. In addition, the “scenery” may incorporate at least one simulated user which may be manipulated based upon various features while not changing any objects (or backgrounds) included in the images. In another aspect, the “game” always includes therein the above objects (or backgrounds) which are either animated or imaginary, while the “scenery” is to include therein only photographs or photographical versions of real objects (or backgrounds) present in a real environment. 
Within the scope of this invention, the images which may include not only the real object (or background) but also the animated or imaginary object (or background) are to be deemed as the “scenery” as long as such images may define the aesthetic theme and are arranged in the geographic sequence. When desired, the “scenery” may be arranged to synchronize the images with sounds for similar purposes such as, e.g., assisting the user, audibly depicting certain stages, events, and/or landmarks of the “scenery,” enhancing audiovisual quality of the “scenery,” and the like. When such an exercise system of this invention is synchronized with the “scenery” and the user is to view the “scenery” while performing the exercise, viewing the “scenery” is to correspond to the task of his or her exercise, while the goal of the “scenery” is to correspond to the goal of the task, i.e., to view a preset or entire portion of the “scenery” by manipulating the above features of such images set forth in this paragraph. The “scenery” or task may include any arbitrary number of stages therein, where the stages may be formed as chapters, plots, and/or parts, primarily based on contents or contexts, time, geography, landmarks, distance defined in the real environment or in the images, and the like. Accordingly, the “scenery” may form a single stage therein in which the user views only a portion or entire portion thereof depending on various features, may include multiple stages in which the user is allowed to proceed to a next stage only by achieving a preset goal which is defined for a current stage, and the like. The “scenery” may further define a single or multiple goals and maintain the goals therethrough, through the operation of the exercise module, through exercise of the user, and so on. Alternatively, the “scenery” may define a single goal and thereafter vary the goal, through the operation of the exercise module, through exercise of the user, and the like. 
In the alternative, the “scenery” may define multiple goals, select one of such goals, and thereafter select another of the goals, through the operation of the exercise module, through exercise of the user. In another alternative, the “scenery” may define a single or multiple goals and thereafter manipulate its goal at least partly based on any of the features. It is noted that the “scenery” may be defined based on various different geographies so that the “scenery” may represent the images of land geography, underwater geography, astronomical geography, and the like, as far as such images define aesthetic themes as a whole and are arranged in geographic sequences.
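The stage-gated progression described for the “game,” “story,” and “scenery” alike — the user proceeds to a next stage only upon achieving the preset goal defined for the current stage — can be sketched as a small state machine. The class, its names, and the numbers below are hypothetical illustrations, not part of the specification:

```python
class StagedTask:
    """Minimal stage-gating: the task advances only once the current
    stage's preset goal has been achieved."""

    def __init__(self, stage_goals):
        self.stage_goals = stage_goals  # e.g., progress required per stage
        self.stage = 0
        self.progress = 0.0

    def record_progress(self, amount):
        """Accumulate progress; cross into later stages only when earned."""
        self.progress += amount
        while (self.stage < len(self.stage_goals)
               and self.progress >= self.stage_goals[self.stage]):
            self.progress -= self.stage_goals[self.stage]
            self.stage += 1

    @property
    def completed(self):
        return self.stage >= len(self.stage_goals)

task = StagedTask([10.0, 20.0])
task.record_progress(10.0)  # meets the goal of stage 0 exactly
task.record_progress(25.0)  # meets stage 1's goal with some progress left over
```

The same gate works whether "progress" means obstacles cleared in a game, chapters viewed in a story, or distance covered through a scenery.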
- As used herein, a “virtual environment” (abbreviated as “VE” hereinafter) is to refer to an environment always including at least one visual feature (or “images”) and also optionally including at least one auditory feature (or “sounds”), at least one olfactory feature (or “smells”), and/or at least one tactile feature (or “sensations”), where each of the above features may represent, connote or be associated with at least one of multiple elements such as a preset object, background, event, activity, surrounding, geographic region, and so on. Therefore, the “VE” may include such features pertaining to at least one of the above elements or, alternatively, may include such features pertaining to at least one of such elements and those features pertaining to at least one of a different object, background, event, activity, surrounding, region, and so on. It is appreciated within the scope of this invention that a mere display of alphanumerals, symbols, or loads imposed by the exercise equipment on the user is deemed not to constitute the “VE,” and that a mere display of a temporal characteristic, distribution, or variation of such loads is deemed not to constitute the “VE” as well. That is, the “VE” in this invention is to preferably include at least one of such visual, auditory, olfactory or tactile features which relate the user with at least one of the elements. Therefore, the “VE” in this invention must include the visual feature and may optionally include at least one of the auditory, olfactory, and tactile features related to the above elements.
- As used herein, “features” refer to various aspects related to or associated with a preset VE, including temporal or spatial characteristics, distributions, or variations of such aspects. For example, the “visual feature” may refer to various visual aspects, examples of which may include, but are not limited to, a shape and a size of an image, contents carried thereby, its color and brightness, its contrast and sharpness, its zoom, its temporal or spatial characteristics, distributions or variations, and so on. The “auditory feature” may refer to various auditory (or audible) aspects, examples of which may include, but are not limited to, a volume or loudness of a sound, its tone, its balance (e.g., in a stereo mode), its frequency distribution, a direction from its source, and temporal or spatial characteristics, distributions or variations of the audible (or auditory) aspects, and so on. The “olfactory feature” may then include various olfactory aspects, examples of which may include, but are not limited to, a type of a smell, its intensity, temporal or spatial characteristics, distributions or variations of such olfactory aspects, and the like. The “tactile feature” may then refer to various tactile aspects, examples of which may include, but are not limited to, mechanical, thermal, and/or electrical properties of various parts of such exercise and/or output modules of the exercise system of this invention.
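Taken together with the preceding definition, these paragraphs imply a simple structural rule: a virtual environment must carry at least one visual feature, while auditory, olfactory, and tactile features are optional. A minimal sketch of that rule follows; the type and field names are invented for illustration only:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualEnvironment:
    """A VE must include at least one visual feature (image); sounds,
    smells, and sensations are optional, per the definitions above."""
    images: list                                     # visual (required)
    sounds: list = field(default_factory=list)       # auditory (optional)
    smells: list = field(default_factory=list)       # olfactory (optional)
    sensations: list = field(default_factory=list)   # tactile (optional)

    def __post_init__(self):
        if not self.images:
            # A mere display of numerals or loads, with no imagery,
            # does not constitute a VE under the definition above.
            raise ValueError("a VE must include at least one visual feature")

ve = VirtualEnvironment(images=["mountain trail"], sounds=["birdsong"])
```

Attempting to build a `VirtualEnvironment` with no images raises an error, mirroring the rule that a bare numeric display is not a VE.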
- Unless otherwise defined in the following specification, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the present invention belongs. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of the present invention, suitable methods and materials are described below. All publications, patent applications, patents, and/or other references mentioned herein are incorporated by reference in their entirety. In case of any conflict, the present specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to limit the scope of the present invention.
- Other features and/or advantages of the present invention will be apparent from the following detailed description, and from the claims as well.
FIG. 1 is a schematic diagram of various modules of an exemplary exercise system according to the present invention; -
FIGS. 2A to 2F are schematic diagrams of exemplary exercise systems incorporating therein a different number of modules; -
FIGS. 3A and 3B are schematic perspective views of exemplary exercise systems according to the present invention; and -
FIGS. 4A and 4B are schematic perspective views of exemplary exercise systems simulating exercising users as simulated users of tasks of video games and allowing the users to compete with each other according to the present invention. - The present invention generally relates to an exercise system which includes multiple exercise modules and which provides a preset task of a story, a scenery, and/or a video (or computer) game to multiple users who simultaneously perform the same or different exercises on, with, and/or against such exercise modules. More particularly, the present invention relates to an exercise system which includes multiple exercise modules disposed in different locations but coupled to each other through a local or global network and then provides multiple users exercising on, with or against the exercise modules with a preset task in a format of a story, scenery, or video (or computer) game each defining at least one preset goal and provided in images of a virtual environment so that the users can compete with each other in the images of the task for the goal at least partly based upon at least one feature of the users, exercises or operation of the exercise modules. To this end, the exercise system may simulate at least one of the users as at least one simulated user which is included in the images (or as at least one element of the images), and manipulate the simulated user at least partly based upon at least one feature of the task, users, exercises or operation of the exercise modules, where examples of such elements include a preset object and background in the images. Thereby, the users can perform the exercises on, with, and/or against such exercise modules while simultaneously pursuing the goal of the task based on at least one feature of the exercises. 
Conversely, the present invention relates to an exercise system which provides the users with such a task and manipulates at least one feature of the operation of at least one of its exercise modules directly or indirectly related to the exercises at least partly based on at least one feature of the task. To this end, such an exercise system simulates at least one of the users as at least one simulated user and manipulates the operation of the system at least partly based on at least one feature of the task with or without allowing the users to manipulate other elements of the task through at least one feature directly or indirectly related to such exercises. Whereby, the users can perform the exercises while pursuing the goal of the task based on progress of the task which is determined at least partly based on the exercises. In all of such embodiments, the system always includes at least one output module which in turn incorporates at least two visual units disposed in different locations and displaying the task images to each exercising user and which also includes at least one olfactory or tactile unit providing respectively smells or sensations related to the task. Therefore, the exercising users can perform the exercise while participating in the task as well as monitoring progress of the task by watching the images, hearing the sounds, smelling the smells or feeling the sensations.
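One way to picture the coupling described above — task progress that is determined at least partly by the exercises of competing users in different locations — is a loop in which a feature measured at each user's exercise module advances that user's simulated user toward the task goal. The names, units, and scaling below are invented for illustration and are not prescribed by the specification:

```python
# Illustrative sketch: two users in different locations pedal against their
# own exercise modules; the mechanical power each one produces advances that
# user's simulated user, and the furthest simulated user leads the task.

GOAL_DISTANCE = 100.0  # arbitrary task goal, in in-game units

def advance(positions, measured_powers, dt=1.0, scale=0.1):
    """Advance each simulated user by an amount derived from the
    mechanical power the corresponding real user supplied (watts)."""
    return {
        user: pos + scale * measured_powers[user] * dt
        for user, pos in positions.items()
    }

positions = {"user_a": 0.0, "user_b": 0.0}
# Pretend one round of telemetry from the two exercise modules:
positions = advance(positions, {"user_a": 150.0, "user_b": 120.0})
leader = max(positions, key=positions.get)
```

In a networked embodiment, each round of `measured_powers` would arrive from the remote exercise modules over the local or global network before the positions are redrawn on every user's visual unit.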
- The present invention also relates to various methods of providing such a task of a scenery, a story or a game to multiple users engaged in exercises on, with or against multiple exercise modules incorporated in different locations via a local or global network and relating at least one feature of the task with at least one feature related to the exercises, or vice versa. More particularly, the present invention relates to various methods of providing communication between multiple exercise modules in different locations, monitoring various features of the task, users, exercises performed by such users or operation of the exercise modules, and then allowing the users to compete in a virtual environment defining images, sounds, smells or sensations while transferring such features between the exercise modules through the local or global network. Based thereon, the present invention relates to various methods of providing the users with such images for the task, those of simulating such users as the simulated users included in the task, and those of manipulating the simulated users in the images for the task at least partly based upon the features of the exercises, users or operations to manipulate at least one feature of the task to accomplish the task goal. Conversely, the present invention relates to various methods of providing such images to the exercising users, those of simulating such users as the simulated users, and those of manipulating at least one feature of the operation of such exercise modules at least partly based on at least one task feature, thereby affecting at least one feature of the exercises for the goal of the task. In all of these embodiments, the present invention relates to various methods of performing the above manipulations when such users simultaneously perform the same or different exercises on, with or against the exercise modules. 
The present invention further relates to various methods of providing a compact or full-size visual unit for each user, those of disposing such visual units in a single or multiple view angles of such users, and those of displaying such images for the users during their exercises. The present invention relates to various methods of providing such users with sounds, smells or sensations for the task, those of synchronizing such sounds, smells or sensations with the images for the task, and the like.
- The present invention further relates to various processes for fabricating an exercise system capable of providing the task of the story, scenery or video (or computer) game to multiple users who engage in the same or different exercises in different locations while competing with each other in such a task. More particularly, the present invention relates to various processes for providing the system which allows multiple users to simultaneously perform the exercises, those of generating the task in a format of the story, scenery or game each defining at least one preset goal and provided in images of a virtual environment therefor, and those of manipulating progress of the task at least partly based on at least one feature directly or indirectly related with the exercises performed by the users. To these ends, the present invention relates to various processes for defining and assessing various features of the exercises performed by the users or provided by the exercise modules, at least one operation of the exercise modules, users engaging in such exercises or the task, those of simulating the users as the simulated users, those of manipulating the simulated users at least partly based on the features of the exercises, users or operation, and the like. Thereby, multiple users can simultaneously perform the same or different exercises while competing with each other in the images of the task and pursuing the task goal based upon the exercises. Conversely, the present invention relates to various processes for providing the exercising users with the task, and those of manipulating at least one feature of the operation of the system directly or indirectly related to the exercises at least partly based on at least one feature of the task. 
To this end, the present invention relates to various processes for defining and then assessing the above features, those of simulating the users as the simulated users, those of manipulating the operation of the exercise modules at least partly based on at least one feature of the task with or without allowing the users to manipulate other elements of such a task based on at least one feature directly or indirectly related to the exercises, and so on. Whereby, the users can perform the exercises while pursuing the preset task goal based on progress of the task which is affected at least partly by the exercises. In either embodiment, the present invention relates to various processes of including in the exercise system multiple visual units which display the images of the task for each user, define compact or full-size configurations, and are in dispositions and arrangements providing a single or multiple view angles to the users who then monitor progress of the task and manipulate such simulated users during the exercises while competing with each other for the task. The present invention further relates to various processes of providing such users with sounds, smells or sensations for the task and those of synchronizing such sounds, smells or sensations with the images for the task.
- Various aspects and/or embodiments of various systems, methods, and/or processes of this invention will now be described more particularly with reference to the accompanying drawings and text, where such aspects and/or embodiments thereof only represent different forms. Such systems, methods, and/or processes of this invention, however, may also be embodied in many other different forms and, accordingly, should not be limited to such aspects and/or embodiments which are set forth herein. Rather, various exemplary aspects and/or embodiments described herein are provided so that this disclosure will be thorough and complete, and fully convey the scope of the present invention to one of ordinary skill in the relevant art.
- Unless otherwise specified, various modules, units, elements, and parts of various exercise systems of this invention are not typically drawn to scale or proportion for ease of illustration. It is appreciated that such modules, units, elements, and parts of the exercise systems of this invention designated by the same and/or similar numerals generally represent the same, similar or functionally equivalent modules, units, elements, and/or parts thereof, respectively.
- In one aspect of the present invention, an exemplary exercise system is fabricated in various arrangements, while incorporating therein various modules and units for providing such exercise and task each defining various features.
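The module arrangement described in this aspect (exercise modules 20 coupled through a control module 40 to an output module 50, as in FIG. 1) can be sketched as three cooperating components. The class names and methods below are hypothetical stand-ins for illustration, not an implementation prescribed by the specification:

```python
class ExerciseModule:
    """Stand-in for an exercise module 20: reports its operation features
    and accepts control signals that manipulate its operation."""
    def __init__(self, name):
        self.name = name
        self.load = 1.0  # current resistance setting (arbitrary units)

    def operation_features(self):
        return {"module": self.name, "load": self.load}

    def set_load(self, load):  # control signal from the control module
        self.load = load

class OutputModule:
    """Stand-in for the output module 50: here it merely records what
    it would display on its visual units."""
    def __init__(self):
        self.frames = []

    def render(self, features):
        self.frames.append(features)

class ControlModule:
    """Stand-in for the control module 40: monitors the exercise modules
    and relays their operation features to the output module."""
    def __init__(self, exercise_modules, output_module):
        self.exercise_modules = exercise_modules
        self.output_module = output_module

    def tick(self):
        for m in self.exercise_modules:
            self.output_module.render(m.operation_features())

out = OutputModule()
ctrl = ControlModule([ExerciseModule("20A"), ExerciseModule("20B")], out)
ctrl.tick()
```

The same wiring runs in the opposite direction when a task feature drives `set_load` on a module, matching the bidirectional coupling the following paragraphs describe.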
FIG. 1 describes a schematic diagram of various modules of an exemplary exercise system according to the present invention, where the system 10 includes therein at least two standard exercise modules or simply exercise modules 20 (or 20A and 20B), at least one control module 40, and at least one output module 50. It is noted that the exercise modules 20 may be disposed in the same location or in different, geographically separate locations. - The
exercise system 10 may be deemed to include all the modules 20, 40, and 50, including the output modules 50, of the system 10. It is appreciated that the classification of the various members of the system 10 is not critical as far as the system 10 performs various functions set forth herein. - Each
exercise module 20 may improve or enhance a muscle tone of the user, increase his or her muscle mass or volume, force or facilitate the user to reduce his or her weight, or increase his or her physical stamina, primarily by performing such exercise. In this context, the exercise modules 20 may include any prior art exercise or fitness equipment as set forth herein. In general, the exercise modules 20 allow the user to exercise by performing voluntary or involuntary activities of various muscles of the user leading to the above results. Accordingly, the exercise module 20 may include not only the prior art exercise or fitness equipment commonly seen in a gymnasium or fitness center but also any conventional equipment supplying electrical energy to the user and inducing involuntary contraction of the muscles. It is appreciated that the exercise modules 20 perform various operations in order to set a specific exercise to be performed by the user and that such operations are characterized by various features (i.e., “features of operation” or “operation features” as defined above). It is also noted that the exercise set by such operations is also characterized by various features (i.e., “features of exercise” or “exercise features” as defined above). - In spite of its various operating mechanisms, each
exercise module 20 may incorporate at least one load as set forth herein and couple the actuating part with the load so that an amount of energy to be supplied by the user to the actuating part per unit exercise may vary depending upon a magnitude of the load. When the exercise module 20 is to provide the electrical energy to the user, it may include at least one actuating part which contacts a body part of the user and supplies electric current therein or electric voltage thereacross, where examples of the actuating part may include, but are not limited to, electrodes, handles, pedals, belts, and so on. Thus, the exercise module 20 of this invention does not include the prior art device which is preferentially intended to engage the user in solely playing the prior art video or computer game, although the exercise module 20 of this invention may be modified to allow the user to engage in any of such games to obtain the aforementioned results. - As set forth herein, the
exercise modules 20 may be disposed in different locations, where the above exercise modules 20 may provide such exercises which define the same or similar types and different extents or, alternatively, which have different types and the same, similar or different extents. More particularly, such exercise modules 20 preferably provide the same, similar or different exercises when the users are to simultaneously perform such exercises on, with or against each exercise module 20. When desirable, such exercise modules 20 may provide the same, similar or different exercises in a delayed mode when the users are to perform the exercises at different times. Based thereupon, each exercise module 20 may include the same, similar or different actuating parts to provide such exercises. The exercise modules 20 may be operatively coupled with each other directly or indirectly by another unit or module or disposed independently of each other. - The
exercise system 10 operatively couples the exercise modules 20 with the control module 40 so as to perform various functions as set forth herein. For example, the exercise modules 20 may receive at least one control signal from the control module 40 so as to manipulate at least one of their operations. Conversely, such exercise modules 20 may transmit various signals to the control module 40 which monitors various variables or parameters of operations of the exercise modules 20 based thereon. The exercise system 10 also couples the exercise modules 20 with the output module 50 by the control module 40 so that at least one feature of the operation of the exercise modules 20 may be reflected by the output module 50 or at least one feature of such images for the task may be reflected by the operation of the exercise modules 20. Alternatively, the system 10 may also allow the exercise modules 20 to directly communicate with the output module 50. In any of such examples, the modules 40, 50 may couple with such exercise modules 20 via a local or global network encompassing the different locations as will be described below. Although the embodiment of FIG. 1 describes the exercise system 10 with a single control module 40, it is appreciated that such a system 10 may also include multiple control modules when the exercise modules 20 are disposed in different locations and it is desired to perform various control functions for each exercise module 20 in each location. - As will be described in detail below, the
exercise modules 20 may couple to or incorporate at least one of various units of the control or output modules 40, 50 so that the exercise modules 20 may directly participate to provide the task of the story, scenery or game in the images, to perform various control functions, to provide the images or other features of the virtual environment, and the like. The exercise modules 20 are preferably provided as separate modules of the system 10 as shown in FIG. 1. The system 10 may instead consist only of the control and output modules 40, 50. - The
output module 50 provides the users with at least one virtual environment which includes at least one visual feature (i.e., images) and may optionally include at least one auditory feature (i.e., sounds), at least one olfactory feature (i.e., smells), and at least one tactile feature (i.e., sensations), each of which is related to the task of the story, scenery or video (or computer) game provided in the images. To this end, such a module 50 includes multiple visual (output) units 51, at least one optional auditory (output) unit 52, olfactory (output) unit 53, tactile (output) unit 54, and display unit 55. - To display such images or provide other features of the virtual environment, the
output module 50 operatively couples with the control module 40 and optionally at least one of the exercise modules 20. As in FIG. 1, the output module 50 may be provided as a module of the system 10 or, alternatively, may be provided as an add-on module coupling with or retrofitted to the system 10 which consists of the exercise and control modules 20, 40. Alternatively, the system 10 may not include any output module but may couple to an external audiovisual device to recruit its pre-existing audiovisual capacity such as, e.g., a CRT, an LCD, an OLED, an IOLED, a PDP or any screen capable of displaying the images in a black-and-white or color mode, where examples of such external audiovisual devices may include, but are not limited to, a stationary or portable audiovisual device including at least one screen (e.g., a DVD player and TV), a portable data processing device including at least one screen (e.g., a PDA, a data organizer or laptop computer), a portable communication device with at least one screen (e.g., a cellular or mobile phone), and the like. Similarly, the system 10 may couple to an external olfactory or tactile device to recruit a pre-existing olfactory or tactile capacity of the device. In addition to various functions to be described in conjunction with such units 51-55, the output module 50 may include at least a portion of the control module 40, may perform at least one function of the control module 40, or may be recruited as at least a portion of the control module 40. - The
exercise system 10 may include a single output module 50 including multiple visual units 51 and one or multiple auditory, olfactory, tactile, and/or display units 52-55 or, alternatively, may include multiple output modules 50 each including a single or multiple of such units 51-55, each of which may define similar or different configurations and perform similar or different functions. Accordingly, the system 10 may display images of a task by a single or multiple output modules 50 to the users simultaneously exercising on, with or against the exercise modules 20 or performing the exercise in a delayed mode. Therefore, the single output module 50 may serve as a "common" output module 50 of the system 10, whereas each output module 50 may function as an "individual" output module 50. The same applies to each unit 51-55 of the output module 50 as well. - As will be disclosed below, the
output module 50 is preferably disposed to allow multiple users to view the images displayed on multiple compact or full-size visual units 51 in a single or multiple view angles and within a viewable distance therefrom, to allow the users to hear the sounds played by the auditory unit 52, to allow the users to smell substances delivered by the olfactory unit 53, to allow the users to feel the sensations generated by the tactile unit 54, and so on. Therefore, the output module 50 (or at least one unit thereof) may be disposed away from the exercise and control modules 20, 40 within a site in which the system 10 is installed. In the alternative, the output module 50 (or at least one unit thereof) may be incorporated into various portable articles carried by the users or various wearable articles to be worn thereby, where examples of such articles may include, e.g., glasses, goggles, a helmet, a hat, a cap, a head band, an earphone, a headphone, an earpiece, a headpiece, a hair band, a ring, a glove, or any other articles for releasably coupling such articles to the users or their clothes. Other configurations of the output module 50 (or at least one of its units) may depend on a specific type of the task provided in the images, and their disposition or arrangement may depend on the type of the task or a number of the exercise modules 20. Although the embodiment of FIG. 1 describes the exercise system 10 including a single output module 50, such a system 10 may also include multiple output modules when the exercise modules 20 are disposed in different locations and it is desired to perform various output functions for each user exercising on, with, and/or against a different exercise module 20 in each location. - It is appreciated that the output module 50 (or at least one of its units) may perform at least one function of the
control module 40 or that at least a portion of the control module 40 may be included in such a module 50. For example, the output module 50 may be arranged to generate at least a portion of the task in such images. In general, the output module 50 (or at least one of its units) is arranged to operate at least partly based on various control signals supplied by the control module 40. The output module 50 (or at least one of its units) may also operate at least partly based on user inputs supplied directly thereto by the user or transmitted thereto through the exercise or control modules 20, 40. - A main function of each
visual unit 51 is to display the images (i.e., the visual feature) of the virtual environment for the task of the story, scenery or game, where the images may be generated thereby, stored in and retrieved therefrom, supplied from or transmitted by the control module 40 wirelessly or by wire, or transmitted through wire or wirelessly from an external source such as, e.g., an external image-supplying device, user(s), other persons, and the like. Each visual unit 51 may preferably embody the task by displaying the images of various types such as, e.g., a still picture of a single portion (i.e., an entire portion of the picture is displayed simultaneously), a still picture of multiple portions a different portion of which is displayed one at a time, a series of the still pictures, a video clip, or a mixture thereof. - At least one of the
visual units 51 of the output module 50 may define a compact configuration so that, when disposed and arranged properly within the viewable distance, one exercising user can view an entire portion of the unit 51 in a single view angle, where details of such a visual unit 51 have been disclosed in the co-pending Applications. At least one of the visual units 51 may have a full-size configuration so that, when disposed and arranged properly in a viewable distance, one exercising user may view different portions of the visual unit 51 in each of such view angles of the user. Details of the full-size visual unit 51 have also been disclosed in the co-pending Applications. - The images displayed on the
visual units 51 may be in black-and-white or multiple colors, and may be two- or three-dimensional. Such images may include various features such as, e.g., at least one object, background, optional simulated user, manipulatable feature, and the like, where the object and background may correspond to a living organism (e.g., a person, an animal, a plant, and the like), a nonliving object or a mixture thereof. When the task is the video or computer game, each object and background is preferably synthesized or, at best, animated to at least partly but not entirely resemble a real appearance of the user(s). The object and background may be synthesized by a program with or without animating the user(s) as well, where the simulated user may correspond to the animated or simulated version of at least one user feature. When the task is the story or scenery, each object and background preferably describes an actual configuration of the living organism, nonliving object, their mixture in the real world, and so on. The story or scenery may not typically include the simulated user therein, although the task thereof may also include at least one manipulatable feature.
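The distinction drawn above between a game task (synthesized or partly animated objects and an optional simulated user) and a story or scenery task (actual configurations, typically with no simulated user) may be sketched as a minimal data model. All names below are illustrative assumptions and not part of the disclosed system:

```python
# Minimal sketch of the image features described above: each task image
# holds at least one object and a background, plus an optional simulated
# user and manipulatable features. Names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TaskImage:
    task_type: str                      # "game", "story", or "scenery"
    objects: List[str] = field(default_factory=list)
    background: str = ""
    simulated_user: Optional[str] = None
    manipulatable: List[str] = field(default_factory=list)

    def validate(self) -> bool:
        # A game may include a partly animated simulated user; a story
        # or scenery typically omits the simulated user entirely.
        if self.task_type in ("story", "scenery"):
            return self.simulated_user is None
        return True

game = TaskImage("game", objects=["opponent"], background="stadium",
                 simulated_user="animated-runner", manipulatable=["speed"])
scenery = TaskImage("scenery", objects=["deer"], background="forest",
                    manipulatable=["view angle"])
```

Note that even the scenery task carries manipulatable features, mirroring the text above: the absence of a simulated user does not preclude manipulation of the task.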
In particular, the background may represent an outdoor or indoor environment, a ground, a desert, a forest or woods, a mountain, a cave, a building, a stadium, a ring, a river, a lake, a waterfall, an ocean, an underwater environment, a sky, a universe, a planet, a moon, a star, or other settings against or in which such an object may be included or portrayed, where the event may include an athletic game, a festival, a combat, a meeting, a contest, an exam, a business, or any other happenings related to the object or background, where the activity may then include walking, running, sprinting, jogging, jumping, rowing, throwing, pushing with or pulling a leg or an arm of the user, bending a leg, an arm, or any joint of the user, or rotating a leg, an arm, a joint, a waist, or a neck of the user, while the geographic region may include any landmark such as, e.g., a city, an urban or rural area, a monument, a theater, a building, a street, a road, a tunnel, a park, a desert, a forest, a mountain, a cave, a river, a lake, woods, a waterfall, an ocean, a sky, a star, a universe, and so on. Each visual unit 51 may then provide the images for the task on a single portion of its image domain, where further division of the images is not plausible. Alternatively, each visual unit 51 may provide the images simultaneously on multiple portions of its image domain, where each of such portions of the image domain may display at least one object or background thereon.
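The single-portion versus multiple-portion image domain described above may be sketched as follows. This is a hypothetical illustration only; the class and method names are not taken from the disclosure:

```python
# Illustrative sketch of an image domain divided into portions, each of
# which may display at least one object or background independently.
class ImageDomain:
    def __init__(self, portions: int = 1):
        if portions < 1:
            raise ValueError("an image domain needs at least one portion")
        # Map each portion index to its currently displayed content.
        self.portions = {i: None for i in range(portions)}

    def display(self, portion: int, content: str) -> str:
        # Each portion shows its content independently; alternatively,
        # the portions may cooperate to form one wider coherent image.
        if portion not in self.portions:
            raise KeyError(portion)
        self.portions[portion] = content
        return content

single = ImageDomain()            # entire image shown on one portion
multi = ImageDomain(portions=3)   # e.g., left/center/right of a panorama
multi.display(0, "background:forest")
multi.display(1, "object:runner")
```

In the single-portion case no further division of the images is possible, whereas the multi-portion domain can show an object on one portion and a background on another simultaneously.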
- The images for the task of the story, scenery or video (or computer) game also define various features which are referred to as "features" of the task or, simply, "task features" as defined above. At least one of the task features is preferably manipulated at least partly based on at least one feature of the exercises engaged by the users (i.e., exercise features), at least one feature of the operations of the exercise modules 20 (i.e., operation features), or at least one feature of the users (i.e., user features) such that the task features may be manipulated directly or indirectly by the exercise features. In general, the
system 10 may manipulate any task feature, although examples of the manipulatable task features may include, but are not limited to, a shape and a size of the object or background, a shape and a size of the simulated user, a number of the object, background or simulated user, their configuration, their arrangement, their disposition, their orientation, their color or hue, their contrast or brightness, their sharpness, view or projection angles thereof, a distance thereto, temporal or spatial characteristics, distributions, or variations of any of the above, and the like. - The task images may be provided or generated by various means. For example, the visual unit 51 (or control module 40) may store a single or multiple images and retrieve one or more images, may generate a single or multiple images by superposing one of the object and background onto the other or by composing such images from one or multiple visual elements, or may receive (or download) one or multiple images from the
control module 40 or at least one external source such as, e.g., the internet, a wired or wireless broadcast, another user of another system, and the like. The visual unit 51 may also acquire one or multiple images by the sensor unit 42. In all of such examples, each image may include at least one object and background as set forth herein. As set forth herein, the visual unit 51 performs any of such functions based upon the control signals (i.e., automatically or adaptively), based on the user inputs (i.e., manually), based on at least one of the features of the user, exercises, or operations of the exercise module 20, and the like. - To provide the images for the task, the visual unit 51 (or control module 40) may preferably be arranged to provide the desired visual feature in a preset viewpoint with respect to the user. To this end, the visual unit 51 (or control module 40) may control various features of the images as described above. The task images may be provided based on a preset view angle or distance with respect to the user, or may be provided to simulate the view angle or distance. When desirable, the
visual unit 51 may zoom in or out of such images, change the view angle of the images, vary the distance to the images, rotate the images with respect to a preset reference point, and the like. - The visual unit 51 (or control module 40) may control the temporal or spatial characteristics or distributions of such images in various modes. For example, the visual unit 51 (or control module 40) may display the still image, a series of still images, a video clip or a mixture thereof, each obtained by the above means or sources, in a preset order, randomly or based on at least one of various features of the task, users, exercises, operations, and the like. In addition, the visual unit 51 (or control module 40) may also perform the control for a preset period of time or a preset duration, where examples of such periods may be a period selected by the user, a period determined based on such features, and the like, and where examples of the durations may be a duration for the user to finish a preset portion of the exercises (or task), a duration until the images proceed to a preset stage, and so on. The
visual unit 51 may display such images by acquiring the object and background simultaneously, by acquiring the object and background independently and superposing one over the other, or by acquiring one of the object and background followed by synthesizing the other and composing the object and background, and the like. The visual unit 51 (or control module 40) may also repeat a preset portion of the images, display the images without any repetition, or display such images in a preset order or randomly. - As set forth above, the
output modules 50 may include one or multiple visual units 51 for each exercise module 20. All visual units 51 may define the same shapes or sizes and display the same or different images in the same or different view angles, at least two of the units 51 may define different shapes or sizes and display the same or different images in the same or different view angles, and the like, where at least one of such units 51 may provide multiple view angles to the user, who may view different portions of the images in each view angle such as, e.g., a panoramic view of such images. - Each
visual unit 51 may be constructed from any prior art display device capable of defining at least one image domain, where the domain may define a single portion or multiple portions related to or independent of such images. The visual unit 51 may include at least one driver or storage to store a single or multiple images therein or to retrieve one or multiple images from various prior art storage media such as, e.g., electric, magnetic or optical tapes or disks, semiconductor or other equivalent memory chips, and the like. Thus, examples of such display devices may include, but not be limited to, a cathode-ray tube (or CRT), a liquid crystal display (or LCD), a display device with an organic light emitting diode (or OLED), an inorganic light emitting diode (or IOLED), a plasma display panel (or PDP), a beam projector with a screen, and the like. When the visual unit 51 defines multiple image domains or instead defines a single image domain with multiple portions, the domains or portions may display the same or different images, where such images may be independent of each other or may cooperate with each other to form a bigger and wider coherent image. Other configurations, arrangements or dispositions of the visual unit 51 may be similar or identical to those of the output module 50 as far as the visual unit 51 may display the images in multiple view angles of the user and within a viewable distance therefrom. - A primary function of the
auditory unit 52 is to provide the sounds (i.e., auditory features) for the task, where the sounds may be generated thereby, stored in and retrieved therefrom, supplied from or transmitted by the control module 40 through wire or wirelessly, or transmitted by wire or wirelessly from various external sources which are generally similar to those for the visual unit 51 such as, e.g., an external sound-supplying device, user(s), other persons, and so on. In general, the auditory unit 52 may embody the task by playing such sounds of various types such as, e.g., voices of (or sounds generated by) the user or others, sounds of (or generated by) an animal or a plant, musical sounds, sounds from (or generated by) a nonliving object, sounds related with (or connoting) preset events or geographic regions, and/or synthetic sounds not related to any of the above, where the sounds may be played in a mono or stereo mode. Depending on their types, such sounds may carry text contents as is the case with conversation, may carry a melody or other musical contents as is the case with music, or may not include any of the contents as is the case with instrumental music. - The
auditory unit 52 is to play such sounds corresponding to the auditory feature of the virtual environment for the task. The sounds may be provided in a mono, stereo or surround mode, and may include various features such as, e.g., a melody, tune, tempo, and optional verse, where the features may reflect or represent a living organism (e.g., a person, animal, or plant), a nonliving object, or their mixture. When the task is the game, the sounds may be synthesized or, at best, animated for at least partly but not entirely resembling real sounds of (or from) the organism or object or, alternatively, may be synthesized with or without animating the users. Therefore, the simulated user may correspond to an animated or simulated version of at least one feature of the sounds. When the task is the story or scenery, however, each of the features of the sounds may preferably describe actual sounds of the living organism, nonliving object or their mixture in a real world. The scenery or story may not include the simulated user, although the task may define at least one manipulatable feature. In either case, the features of the sounds may represent those of the objects or backgrounds of the images. - The sounds for the task of the story, scenery or video (or computer) game also define various features which are referred to as "features" of the task or, simply, "task features" as set forth herein. It is to be understood that at least one of the task features may also be preferably manipulated at least partly based on at least one exercise feature, at least one operation feature, and/or at least one user feature such that the task features may be manipulated either directly or indirectly by such exercise features. In general, the
system 10 may arrange any task feature to be manipulated thereby, although examples of such preferred manipulatable task features may include, but not be limited to, a volume or loudness of such sounds, a frequency or frequency distribution thereof, a balance thereof, temporal or spatial characteristics, distributions, and/or variations thereof, and the like. - Such sounds may be provided or generated by various means. In one example, each auditory unit 52 (or control module 40) may store a single or multiple sounds and retrieve one or more sounds therefrom, generate a single or multiple sounds by superposing one onto the other or by composing such sounds from one or multiple auditory elements, receive or download a single or multiple sounds from the
control module 40 or at least one external source such as, e.g., the internet, a wired broadcast or wireless broadcast, another user, and the like. The auditory unit 52 may acquire a single or multiple sounds by the sensor unit 42. In all of such examples, each sound may include at least one feature representing the object and/or background, each of which is preferably animated or synthesized. As set forth herein, any of such functions of the auditory unit 52 may be decided at least partly based on the control signals (i.e., automatically or adaptively), user inputs (i.e., manually), at least one feature of the user(s), exercises, and/or operations of the exercise module 20, and the like. In each example, the sounds may be related to and/or associated with the event, region, timing, user or another person, and/or object. In general, the auditory unit 52 (or control module 40) may manipulate various aspects of the sounds based upon at least one feature of the task, user(s), exercises, and/or operations such as, e.g., the source of the sounds, type thereof, contents thereof, and the like, each of which may in turn be determined by the control signals, above features, and the like. - The
auditory unit 52 is to play such sounds representing the auditory feature of the task. The sounds may include various features each describing the object, background, optional simulated user, optional manipulatable feature, and the like. To provide the sounds with desired auditory features, the auditory unit 52 (or control module 40) may preferably provide the sounds in a preset viewpoint of the user(s). To this end, the auditory unit 52 (or control module 40) may control various features of such sounds, where examples of the features may include, but are not limited to, a volume or amplitudes of the sounds, a tone or frequency distribution thereof, contents carried by the sounds, a direction thereof, a balance thereof, temporal and/or spatial characteristics, distributions, and/or variations of the above features, and so on. It is appreciated that the sounds may be generated along a preset direction or in a preset distance with respect to the user, or that such sounds may simulate the direction or distance. When desirable, the auditory unit 52 may control various features of the sounds, change the direction of and/or distance to such sounds, and the like. - The auditory unit 52 (or control module 40) may control the temporal or spatial characteristics or distributions of the sounds in various modes. For example, the
auditory unit 52 may play the same sounds, a series of different sounds, or a mixture thereof, each obtained by such means or sources, in a preset order, randomly or based on at least one of such features. In addition, the auditory unit 52 may perform the control for a preset period of time (or duration), where examples of the preset periods are similar to those of the visual unit 51. The auditory unit 52 may play the sounds by acquiring different sounds simultaneously, acquiring different sounds independently and superposing one over another, acquiring one of the sounds, synthesizing the other, and composing mixed sounds therefrom, and the like. It is appreciated that the system 10 may also be arranged to provide the user with the task in the images alone, and/or in such sounds along with the images, where the sounds may also be provided independently of the images or synchronized therewith. - Each
auditory unit 52 may also be fabricated from any conventional device with speakers and, when necessary, at least one driver and at least one storage capable of storing therein such sounds and/or retrieving such sounds from various prior art storage media such as electric, magnetic, and/or optical tapes or disks, semiconductor or other equivalent memory chips, and the like. In general, the auditory unit 52 may include any prior art speakers, examples of which may include, but not be limited to, cone-drive speakers, piezoelectric speakers, electrostatic speakers, and any equivalents of the speakers. The auditory unit 52 may also include a single speaker or multiple speakers which generate the same or different sounds therefrom. The auditory unit 52 may propagate such sounds to a preset area of the user(s), where such areas may be selected similarly to those of the visual unit 51. In addition, such an auditory unit 52 may be provided in dispositions similar to those of the visual unit 51, except that the auditory unit 52 is to be placed closer to ears (rather than eyes) of the user(s). Other configurations, arrangements, and/or dispositions of the auditory unit 52 are similar or identical to those of the overall output module 50 as long as the auditory unit 52 may generate the sounds and deliver such sounds to the exercising user(s). - A main function of the
olfactory unit 53 is to provide the olfactory feature (i.e., smells) of the virtual environment for the task of the story, scenery or game, where such smells may be generated thereby, stored in and retrieved therefrom, and/or delivered from an external source such as, e.g., an external device for generating such smells, users, another system or other persons. In general, the olfactory unit 53 (or control module 40) preferably embodies the task by giving off smells provided (or generated) by various means. For example, the olfactory unit 53 may store a single strand or multiple strands of smells therein and retrieve one or more smells therefrom, may generate a single or multiple smells by mixing or reacting at least one chemical substance with another substance, or may receive a single or multiple smells from a storage or at least one external source connected thereto by various tubing. The olfactory unit 53 may acquire a single or multiple smells using the sensor unit 42. In all of these examples, the smells may be real ones (i.e., those existing in nature) for the task of the story or scenery, synthesized ones (i.e., those not existing in nature) for the task of the game, and/or mixtures thereof. The real smells may be obtained from a living organism (such as a person, an animal, a plant, and the like) or a nonliving object and may correspond to the smells of the objects or backgrounds displayed on the images, the user's smells, the synthesized smells for the virtual environment, and the like. In each of the examples, such smells may be related to or associated with a preset event, geographic location, timing, and the like.
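By way of illustration only, the two smell sources enumerated above, retrieval of stored strands for real smells versus generation of synthesized smells by mixing substances, might be sketched as below. All names and values are hypothetical and not part of the disclosure:

```python
# Hypothetical sketch of the olfactory unit's smell sources: retrieve a
# stored strand (a real smell matching an object or background in the
# images), or generate a synthesized smell by mixing two substances.
class OlfactoryUnit:
    def __init__(self):
        # Stored strands of real smells, e.g., for a story or scenery task.
        self.stored = {"pine": "real", "ocean": "real"}

    def retrieve(self, name):
        # Returns the stored category, or None if no such strand exists.
        return self.stored.get(name)

    def mix(self, substance_a, substance_b):
        # Mixing or reacting chemical substances yields a synthesized
        # smell, e.g., for a game task, not necessarily found in nature.
        blend = f"{substance_a}+{substance_b}"
        self.stored[blend] = "synthesized"
        return blend

unit = OlfactoryUnit()
unit.retrieve("pine")       # a real smell strand
unit.mix("citrus", "musk")  # a synthesized blend
```

A forest scene would thus draw on the "real" strands, while a synthesized blend could accompany a game setting with no real-world counterpart.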
In general, the olfactory unit 53 may be arranged to manipulate various aspects of the smells depending on various factors such as, e.g., the source of the smells, their types, contents thereof, and the like, each of which is decided by various features such as, e.g., user inputs, sensed or command signals, variables and/or parameters sensed by the sensor unit 42, conditions of the user and/or exercise module 20, or features of the exercise module 20 and/or task. - Such smells may be typically provided in a mono, stereo or surround mode and include various features such as, e.g., a type of the chemical substance, an intensity thereof, its direction, temporal or spatial characteristics, distributions, and/or variations thereof, and the like, where such features may reflect or represent a living organism (e.g., a person, an animal, a plant, and the like), a nonliving object or a mixture thereof, and the like. When the task is the game, the features are preferably synthesized smells or, at best, animated smells for at least partly but not entirely resembling such real smells of (or from) the organism or object.
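The dependence of an olfactory feature on sensed signals and manual user inputs described above can be sketched minimally as follows. The function name, speed thresholds and intensity labels are illustrative assumptions, not values taken from the disclosure:

```python
# Hedged sketch: deciding one olfactory feature (intensity) from a
# sensed exercise parameter, as the control module might. All numeric
# thresholds are illustrative only.
def smell_intensity(treadmill_speed_kmh, user_override=None):
    # Manual user inputs take precedence over sensed signals.
    if user_override is not None:
        return user_override
    if treadmill_speed_kmh < 4:
        return "low"
    if treadmill_speed_kmh < 10:
        return "medium"
    return "high"

print(smell_intensity(3))          # low
print(smell_intensity(7))          # medium
print(smell_intensity(12, "low"))  # low (manual input wins)
```

The same pattern would apply to any other manipulatable aspect of the smells, such as their type or direction, each decided by user inputs, sensed signals, or features of the exercise module 20 and/or task.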
- The smells for the task of the story, scenery or video (or computer) game also define various features which are referred to as “features” of the task or, simply, “task features” as set forth herein. It is to be understood that at least one of the task features may also be preferably manipulated at least partly based on at least one exercise feature, at least one operation feature, and/or at least one user feature such that the task features may be manipulated either directly or indirectly by such exercise features. In general, the
system 10 may arrange any task feature to be manipulated thereby, where examples of the manipulatable task features coincide with those enumerated in the above paragraph. - A major function of the
tactile unit 54 is to provide such sensations (i.e., the tactile feature) of the virtual environment, where such sensations may then be generated thereby, retrieved therefrom, generated by the control module 40, retrieved from the control module 40, transmitted by the control module 40, or transmitted by an external source such as, e.g., an external device, user, another system, other persons, and the like. In general, the tactile unit 54 may preferably embody the tactile feature of the virtual environment by providing various mechanical, thermal, electrical, and/or optical sensations to the user(s) who is to sense such sensations by his or her skin of a head, a neck, a hand, an arm, a shoulder, an upper torso, a lower torso, a thigh, a leg, a foot, and the like. The tactile unit 54 may also provide the user(s) with the sensations of various features, e.g., by generating a single sensation of a constant amplitude or time-varying amplitudes, a temporal series of such sensations, a spatial series thereof, or a mixture thereof. Such sensations may be provided and/or generated by various means as well. For example, the tactile unit 54 (or control module 40) may store mechanical, thermal, optical or electrical energy therein and utilize at least a portion of the energy for providing a single or multiple sensations, or may receive at least a portion of such energy through a wire or wirelessly from the storage unit 43 or from at least one external source of the energy such as, e.g., an external energy source, a wireless or wired transmission of such energy, a user, or a user of another device which may not be the exercise module 20. The tactile unit 54 may also preferably generate such sensations related to or associated with a preset object, a preset background, and the like.
tactile unit 54 is to provide such sensations which correspond to the tactile feature of the virtual environment for the task of the story, scenery or video or computer game. Such sensations may be provided in a mono, stereo or surround mode, and include various features such as, e.g., a type of the sensations, an intensity thereof, a body part onto which such sensations are applied, temporal and/or spatial characteristics, distributions, and/or variations thereof, and so on, where such features may reflect or represent a living organism (e.g., a person, an animal, a plant, and the like), a nonliving object or their mixture. When the task is the game, such features are preferably synthesized sensations or, at best, animated sensations for at least partly but not entirely resembling real sensations of (or from) the organism or object. Alternatively, the features of the sensations may be synthesized using a program with or without animating the user. When such a task is the story or scenery, each of such features of the sensations preferably describes actual sensations of (or from) the living organism, nonliving object or their mixture in the real world. In either case, such features of the sensations may represent those represented by the objects or backgrounds of the images. - The sensations for the task of the story, scenery or video (or computer) game define various features which are referred to as “features” of the task or, simply, “task features” as set forth herein. It is to be understood that at least one of the task features may also be preferably manipulated at least partly based on at least one exercise feature, at least one operation feature, and/or at least one user feature so that the task features may be manipulated directly or indirectly by the exercise features. In general, the
system 10 may arrange any task feature to be manipulated thereby, where examples of such manipulatable task features coincide with those of the above paragraph. - To provide such sensations of the virtual environment to the exercising user, the tactile unit 54 (or control module 40) preferably provides desired sensations in the preset viewpoint of the user. To this end, the
tactile unit 54 manipulates various factors of such sensations, where examples of such factors may include, but not be limited to, a type of such sensations, an intensity thereof, a body part of the user to which the sensations are delivered, temporal or spatial characteristics, distributions, or variations of such factors, and the like. It is appreciated that such sensations may be provided in a preset angle of application or direction with respect to the user or, alternatively, the sensations may be provided in order to simulate the angle or direction thereof. When desirable, the tactile unit 54 may vary the angle or direction of the application, change the type of the sensations, apply the sensations to different body parts of the user, and the like. - The tactile unit 54 (or control module 40) may control such temporal or spatial characteristics or distributions of the sensations in various modes. For example, the tactile unit 54 may generate the sensations with various mechanical, thermal, electrical, and/or optical properties with or without using at least one applicator, where examples of such mechanical properties may include, but not be limited to, amplitudes of such mechanical sensations, an area of the body part applied with such sensations, a configuration and/or a number of such applicators, an arrangement of such an applicator, a hardness or softness thereof, elasticity or rigidness thereof, and a surface structure thereof, where examples of such thermal properties may include, but not be limited to, a temperature of the sensations, a temperature of the applicator, an area of the body part applied with the sensations, thermal conductivity of the applicator, a number of the applicators, an arrangement thereof, and so on, and where examples of such electrical properties may include, but not be limited to, amplitudes of electric current or voltage for the sensations, an area of the body part applied with such sensations, electrical conductivity or resistivity of the
applicator, a number of the applicators, and/or an arrangement of such applicators, and where examples of the optical properties may include, but not limited to, amplitudes of the sensations, optical characteristics of lights causing such sensations, an area applied with such sensations, a number of the applicators, an arrangement of the applicators, and the like. It is appreciated that the
tactile unit 54 may be arranged to control various temporal and/or spatial characteristics of such sensations, where examples of such characteristics may include, but not be limited to, a duration of the sensations, a total number thereof, an interval therebetween, a frequency thereof, a sequence thereof, and the like. The tactile unit 54 may provide such sensations of various properties to various body parts of the user in various modes. For example, the tactile unit 54 may be arranged to contact its applicator with the body part of the user and to directly deliver such sensations thereto. It is appreciated that the applicator of such a tactile unit 54 may be deemed as a part of the tactile unit 54 or, alternatively, may be deemed as a part of the exercise and/or control modules 20, 40. The tactile unit 54 may also be arranged to induce such sensations without necessarily including the applicator or without contacting its applicator with the desired body part of the user. To this end, the tactile unit 54 may generate forced convection such as a stream of wind of a desired temperature (e.g., ambient air, heated air, cooled air, and the like), irradiate electromagnetic waves such as infrared rays, and the like. The tactile unit 54 may generate or induce the sensations, each of which may be obtained by the above means or sources, in a preset order, randomly, or based on at least one of such factors. In addition, the tactile unit 54 may perform such control for a preset period of time or a preset duration, where examples of such preset periods may be a period selected by the user, a period determined based on such factors, and the like, and where examples of such preset durations may be a duration until the user(s) may finish the exercise, a duration until the tactile feature of the environment reaches a preset feature, and the like. - It is to be understood that the
tactile unit 54 may be provided to operatively couple with a single exercise module 20. Alternatively, the system 10 may include multiple tactile units 54 at least one of which may serve as a "master" tactile unit 54 controlling the rest thereof. Similarly, multiple systems 10 may include at least one tactile unit 54 which serves as the "master" tactile unit 54 which controls the rest thereof. - The
tactile unit 54 may also be constructed from any conventional mechanical devices capable of generating and/or delivering the mechanical sensations, heating and/or cooling devices capable of generating and/or delivering the thermal sensations, electrical devices for generating and/or delivering the electrical sensations, optical devices for generating and/or delivering such optical sensations, and so on. As described above, the tactile unit 54 preferably includes at least one applicator for providing mechanical and/or electrical sensations. In addition, such a unit 54 may include a conductive wire or an electrode for delivering the electrical energy to the preset body part. For such thermal sensations, the tactile unit 54 may include at least one heating element which may deliver the thermal sensations to the user either through a direct contact, an indirect contact, or through irradiating such electromagnetic waves thereto. The tactile unit 54 may be arranged to deliver such sensations across a preset area which may correspond to a preset portion of the user, where such portions may be an area around the eyes of the user, an area encompassing an upper torso of the user, an area with a height similar to that of the user, or an area capable of receiving and recognizing the mechanical, electrical, thermal, or optical sensations. The tactile unit 54 may also be provided to be disposed away from the user, to be worn by the user over and/or around at least one of his or her eyes, to be worn by the user over or around his or her head, to be carried by or to couple with the user, and the like. Such a tactile unit 54 may also be incorporated into or provided as glasses, goggles, helmets, and the like. When desirable, at least a portion of the tactile unit 54 may be incorporated into the exercise module 20 or, alternatively, at least a portion of the exercise module 20, or an external device which may deliver such sensations, may be recruited as the tactile unit 54 of the exercise system 10.
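The "master" tactile-unit arrangement described above can be sketched as a small master/slave pattern: one unit applies its sensation settings and propagates them to the rest. This is a minimal illustration only; the class and attribute names (`TactileUnit`, `intensity`, `body_part`) are invented for the sketch and do not appear in the specification.

```python
# Hypothetical sketch of a "master" tactile unit 54 controlling the
# rest of the tactile units. All names are illustrative assumptions.

class TactileUnit:
    def __init__(self, name):
        self.name = name
        self.intensity = 0.0
        self.body_part = None

    def apply(self, intensity, body_part):
        # Set the sensation parameters this unit will deliver.
        self.intensity = intensity
        self.body_part = body_part

class MasterTactileUnit(TactileUnit):
    def __init__(self, name, slaves):
        super().__init__(name)
        self.slaves = list(slaves)

    def apply(self, intensity, body_part):
        # The master applies the settings itself, then controls the rest.
        super().apply(intensity, body_part)
        for unit in self.slaves:
            unit.apply(intensity, body_part)

units = [TactileUnit("u1"), TactileUnit("u2")]
master = MasterTactileUnit("master", units)
master.apply(0.5, "forearm")
```

The same pattern would apply at the system level, with one tactile unit of one system 10 acting as master over the tactile units of other systems.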
Other configurational or operational characteristics of the tactile unit 54 may be similar or identical to those of the overall output module 50 as long as the tactile unit 54 may generate the sensations and deliver such to the exercising user. - A major function of the
display unit 55 is to provide the exercising user(s) with various system and/or operation variables and/or parameters visually or audibly. To this end, the display unit 55 may include any prior art audiovisual display elements such as, e.g., display panels, speakers, and the like. The display unit 55 may be disposed in the control module 40 or at least a portion of the unit 55 may be disposed in the exercise and/or output modules 20, 50. At least a portion of the exercise module 20 may also be recruited as the display unit 55 of the system 10 as well. Other configurations, arrangements, and/or dispositions of the display unit 55 are similar or identical to those of the overall output module 50 as long as the display unit 55 may provide the above variables and/or parameters to the exercising user(s) during such exercise(s). - Various units 51-55 of the
output module 50 may operatively couple with each other in various modes. For example, each unit 51-55 may operatively couple with the rest thereof so that each unit 51-55 may receive or transmit various information as electrical or optical signals by wire or wirelessly. In an opposite example, at least one of the units 51-55 may couple with not all of the rest of the units 51-55 so that, e.g., the visual unit 51 may couple with the auditory unit 52 but not with the display unit 55, and the like. Accordingly, the detailed coupling modes of such units 51-55 depend not only upon an overall configuration of the output module 50 but also upon the assigned functions of each unit 51-55. As described above, the output module 50 includes the visual unit 51 but does not necessarily include the other units 52-55. By the same token, the output module 50 may include multiple units of the same type, e.g., by including multiple visual units 51 for displaying different and/or overlapping images, or two or more auditory units 52 for generating different or overlapping sounds. It is appreciated that at least one unit 51-55 of the output module 50 may be incorporated into the exercise modules 20 or the control module 40 and perform the same or similar functions as described herein. In such an embodiment, that unit may be deemed as a part of the output module 50 or may be deemed to form a part of the exercise or control module 20, 40. - The
control module 40 generally includes at least one input unit 41, at least one control unit 44, at least one optional sensor unit 42, and at least one optional storage unit 43, and is arranged to control various operations of the system 10 as a whole, i.e., to generate and manipulate various features of the task of the story, scenery, or game in the images, to relate at least one task feature to at least one feature of the users, exercises, or operation of the exercise modules 20, to relate at least one feature of the operation (or exercises) with at least one feature of the task, users, exercises, or operation, to manipulate at least one feature of the task at least partly based on at least one feature of such users, exercises, and/or operation, to manipulate at least one operation feature at least partly based upon at least one feature of the task, users, or exercises, and the like. The control module 40 may manipulate various operations of the exercise or output modules 20, 50. In addition, the control module 40 may operatively couple with an external audiovisual device or other external devices including the display screens or speakers and supplement (or replace) the visual, auditory, olfactory, tactile, or display units 51-55 of the output module 50, may operatively couple with an external device so as to provide the task and to replace (or supplement) at least a portion of the control module 40, may be operatively coupled to an external device including the processor and replace or supplement at least a portion of itself, and the like.
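The manipulation of a task feature "at least partly based on" a user or exercise feature, as described above, can be sketched as a simple mapping function. The relation below (a simulated-user speed tracking the treadmill speed with a gain that adapts to heart rate) is an invented example with invented figures, not a relation given in the specification.

```python
# A minimal sketch, assuming invented names and figures, of relating
# a task feature (speed of the simulated user) to an exercise feature
# (treadmill speed) and a user feature (heart rate).

def simulated_user_speed(treadmill_speed, heart_rate):
    # Fixed relation: avatar speed is twice the treadmill speed;
    # adaptive variation: the gain is halved once the heart rate is
    # high, easing the task for a straining user.
    gain = 2.0 if heart_rate < 150 else 1.0
    return gain * treadmill_speed

easy = simulated_user_speed(6.0, 120)  # low heart rate, gain 2.0
hard = simulated_user_speed(6.0, 160)  # high heart rate, gain 1.0
```

The reverse direction (manipulating an operation feature such as the mechanical load based on a task feature) would be a mapping of the same shape, with the roles of the arguments exchanged.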
To these ends, the units 41-44 of the control module 40 may perform various functions such as, e.g., generating the task in the images (or optional sounds), receiving the user inputs from the users in order to convert such into the command signals, monitoring variables and/or parameters and generating the sensed signals based thereupon, generating the control signals based on the command or sensed signals or independently thereof, generating the simulated user(s), manipulating at least one feature of the task by manipulating at least one feature of the images or simulated user(s), manipulating at least one feature of the images (or sounds) displayed (or played) by the output module 50, manipulating at least one operation of at least one of such exercise modules 20, and the like. The control module 40 may operate in various modes as will be provided below and may include one or more of at least one of its units 41-44. It is appreciated that the control module 40 may be arranged to allow communication between at least two exercise modules 20 either directly or therethrough via a local or global network, where the exercise modules 20 are disposed in different locations, while denying the user(s) physical access thereto and requiring the users to go out of one location and to enter another for such an access. - In order to manipulate their operations, the
control module 40 may operatively couple with such exercise and output modules 20, 50. The control module 40 may be provided as a module of the exercise system 10. Alternatively, the control module 40 may be provided as an add-on module and couple with the system 10 which then consists of such exercise and output modules 20, 50. When desirable, the system 10 may not include any or all units of the control module 40 but may operatively couple with an external device equipped with at least one processor in order to recruit and use the pre-existing task-providing and/or task-processing capability of the device, examples of which may include, but not be limited to, a laptop computer, a PDA, a data organizer, an external story and/or scenery generator, an external game console, external audiovisual, visual, or communication devices including such processors, and so on. In this embodiment, the system 10 may include a preset program for borrowing such task-generating or task-processing capability of the external devices or include an interface to operatively couple therewith. Alternatively, the external device may be equipped with a preset program which may perform at least one function of the control module 40 as set forth herein. In addition to the various functions described hereinabove, the control module 40 may incorporate therein at least a portion of the exercise and/or output modules 20, 50 or, conversely, at least a portion of the control module 40 may be incorporated into such modules 20, 50. - The
exercise system 10 of this invention may include a single control module 40 including one or multiple input, sensor, storage, and control units 41-44 or, alternatively, may include multiple control modules 40 each including one or multiple of the above units 41-44, each of which may have similar or different configurations and each of which may perform similar or different functions. The system 10 may have a single control module 40 for a single or multiple output modules 50, where such a control module 40 may operate as the common module in the latter case. The system 10 may instead include multiple control modules 40 for a single or multiple output modules 50, where such control modules 40 may correspond to individual control modules 40 for each output module 50 and where at least one of the control modules 40 may serve as a master control module 40. Similarly, the system 10 may include a single control module 40 for a single or multiple exercise modules 20, where the control module 40 may operate as the common module in the latter case. The system 10 may instead include multiple control modules 40 for multiple exercise modules 20, where the control modules 40 may be individual modules for each exercise module 20, and at least one of the control modules 40 may serve as a master module. When the system 10 includes multiple control modules 40, their number may be more or less than the number of the exercise modules 20 or output modules 50. Regardless of its number, such a control module 40 may allow the user to manipulate the exercise and/or output modules 20, 50, various operations of such modules 20, 50 or of the exercise modules 20, and the like. It is to be understood that the exercise modules 20 are to be disposed in different locations and, accordingly, that such a control module 40 may have to communicate with at least one of such exercise modules 20 through the global network when the module 40 is disposed in the location of one exercise module 20.
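The cross-location communication just described can be pictured as a small relay: a control module forwards signals between exercise modules disposed in different locations. The sketch below is a hypothetical illustration; the classes, method names, and signal format are all invented, and the network itself is abstracted away.

```python
# Hypothetical sketch of a control module 40 relaying signals between
# exercise modules 20 disposed in different locations. All names are
# illustrative assumptions; real transport over a global network is
# abstracted into simple method calls.

class ExerciseModule:
    def __init__(self, location):
        self.location = location
        self.inbox = []  # signals received from other modules

    def receive(self, signal):
        self.inbox.append(signal)

class ControlModule:
    """Relays signals among registered exercise modules."""
    def __init__(self):
        self.modules = []

    def register(self, module):
        self.modules.append(module)

    def relay(self, sender, signal):
        # Forward the signal to every module except the sender,
        # regardless of the location in which each is disposed.
        for module in self.modules:
            if module is not sender:
                module.receive(signal)

gym_a = ExerciseModule("location A")
gym_b = ExerciseModule("location B")
hub = ControlModule()
hub.register(gym_a)
hub.register(gym_b)
hub.relay(gym_a, {"speed": 8.0})
```

In the third-location arrangement described next, the `ControlModule` instance would simply reside at neither exercise module's location.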
When the control module 40 is disposed in a third location, such a module 40 communicates with both exercise modules 20 via the global network encompassing those locations of the exercise modules 20 and of the control module 40 itself. - It is appreciated that such a control module 40 (or at least one of its units 41-44) may perform at least one function of the
output module 50 or that at least a portion of the output module 50 may be included in the control module 40. For example, the control module 40 may be arranged to display the images and/or play the sounds. In general, the control module 40 is arranged to generate the control signals supplied to other modules 20, 50, and the control module 40 may transmit the user inputs supplied directly thereto to such other modules 20, 50. The control module 40 may also perform at least one function of the exercise modules 20, or at least a portion of the exercise modules 20 may be incorporated in the control module 40. For example, the control module 40 may directly provide or manipulate the load of the actuating part of such modules 20. Conversely, at least a portion of the control module 40 may be disposed in the exercise or output module 20, 50. For example, the control unit 44 may be disposed in such exercise modules 20 for manipulating their operations, may be disposed in the output module 50 for manipulating various visual elements of such images for the task, for manipulating various auditory elements of the sounds, and the like. - The control module 40 (or at least one unit thereof) may fixedly or releasably couple with both of the exercise and
output modules 20, 50. Alternatively, the control module 40 may be disposed away from such modules 20, 50 or away from the location in which the system 10 is installed as well. The control module 40 (or at least one unit 41-44 thereof) may be incorporated in the portable or wearable articles. Other configurations, arrangements, and/or dispositions of the control module 40 depend on a specific type of the task in the images or sounds, where its disposition and/or arrangement may also depend upon whether the task is visual or audiovisual. - The
control module 40 may include various communication-related units such as, e.g., at least one audio-in unit for acquiring such sounds, at least one audio-out unit for generating signals carrying therealong the sounds, at least one video-in unit for acquiring such images, at least one video-out unit for generating signals carrying therealong such images, at least one receiving unit for receiving such signals, at least one sending unit for transmitting such signals, and the like, where such receiving and sending units may respectively receive and transmit such signals by wire or wirelessly through a global network which covers the different locations in which the exercise modules 20 are disposed and where such reception or transmission may be unilateral or bilateral. Such communication-related units may be provided as separate units or, in the alternative, may be incorporated into one or more of such units 41-44. In particular, the control module 40 may provide real-time communication of such signals between the exercise modules 20 directly or indirectly, may instead provide such real-time communication between the users thereof directly or indirectly, or both, thereby providing a transfer of such signals between the modules 20, between the users of the exercise modules 20, and the like. As will be described below, the control module 40 may also be arranged to transfer such signals without altering the task feature (i.e., a simple transfer) or, in the alternative, to transfer such signals by altering or converting the task feature based upon at least one preset relation (i.e., an equivalent conversion). Accordingly, the control module 40 may perform such transfer sequentially or simultaneously, based on a mode of the users performing the exercises on, with, or against the exercise modules 20.
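The distinction drawn above between a "simple transfer" and an "equivalent conversion" can be sketched in a few lines. The conversion relation used here (halving a speed value when passing it to another module) is an invented example of a "preset relation," not one given in the specification.

```python
# Sketch of the two transfer modes described above. The task-feature
# format and the sample relation are illustrative assumptions.

def simple_transfer(task_feature):
    # Simple transfer: pass the task feature along unchanged.
    return dict(task_feature)

def equivalent_conversion(task_feature, relation):
    # Equivalent conversion: alter the task feature according to a
    # preset relation before passing it to the other exercise module.
    converted = dict(task_feature)
    converted["value"] = relation(task_feature["value"])
    return converted

feature = {"name": "speed", "value": 8.0}
unchanged = simple_transfer(feature)
scaled = equivalent_conversion(feature, lambda v: v * 0.5)
```

An equivalent conversion of this kind would let users on dissimilar exercise modules share one task on comparable terms.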
When desirable, the control module 40 may operatively couple with external communication devices, where examples of such devices may include, but not be limited to, a wired or wireless telephone, a wireless portable or mobile phone, a beeper, a walkie-talkie, and other prior art communication devices. - A major function of the
input unit 41 is to receive user inputs which are supplied by the user(s) and related or associated with desired features of the task (i.e., the story, scenery, or game), user(s), exercise(s), and/or operation of the exercise modules 20. Based upon its operating mechanisms, the input unit 41 may receive the user inputs by or without necessarily contacting the user(s). The input unit 41 then converts the user inputs into the (electric or optical) command signals. Any prior art input device may be used as the input unit 41. Accordingly, the input unit 41 may receive the user inputs by sensing mechanical, electrical, optical, magnetic, or electromagnetic input signals supplied thereto by movements of various body parts of the user(s), compression thereby, or contact therewith, where examples of such body parts may include, but not be limited to, fingers, hands, wrists, arms, toes, feet, thighs, legs, shoulders, neck, head, eyes, back, belly, sides, and the like. To this end, the input unit 41 may be fabricated similar to various prior art input devices, examples of which may include, but not be limited to, a key, key pad, array of keys, button or array of buttons, switch or array of switches, touch screen, mouse, track pad, track ball, track stick, joystick, and the like. It is appreciated that the input unit 41 may define any of such configurations or modifications thereof depending upon the types of the user inputs, the body parts of the user(s) contacting the input unit 41 for applying the user inputs, and the like, that the input unit 41 may move or deform in response to the user inputs, or that the input unit 41 may not move or deform in response thereto. The input unit 41 may receive the user inputs without mechanically contacting any body part of the user(s).
To this end, such an input unit 41 may generate therearound electric or magnetic fields and receive the user inputs by monitoring perturbations of such fields caused by the body part of the user(s) disposed in its vicinity but not contacting such. - The
input unit 41 may be incorporated in various positions around the system 10. For example, the input unit 41 may be provided physically separate from the exercise and output modules 20, 50. At least a portion of the input unit 41 may be disposed on or in such modules 20, 50. The input unit 41 may operatively couple to other units and/or modules of the system 10 wirelessly or by wire, depending on its disposition and configuration. At least a portion of the input unit 41 may be included in the portable or wearable articles so that the users may perform the exercises while providing the user inputs without disengaging themselves from the exercises. At least a portion of the input unit 41 may be worn around other body parts and allow the user(s) to provide the user inputs without using his or her hand and/or stopping such exercises. - It is appreciated that the
system 10 may include a single input unit 41 capable of receiving such user inputs for the exercise modules 20 or, alternatively, may have multiple input units 41 at least one of which operates as a "master" input unit 41 controlling the rest thereof. Similarly, the input unit 41 may receive the user inputs for a single output module 50 or, alternatively, the system 10 may include multiple input units 41 at least one of which serves as the master input unit 41. The input units 41 may be of the same or different types, disposed in the same or different positions, and/or receive the same or different user inputs. The input unit 41 may receive the user inputs from one or more body parts of the user(s) which have to contact the actuating parts of the exercise modules 20 for such exercises. However, the input unit 41 may receive such user inputs from other body parts of the user(s) which may not be necessary for the exercises, which may not contact such actuating parts of the exercise modules 20, and so on. Therefore, the input unit 41 may receive the user inputs through the first body part of the user(s) which is required for such exercises and may additionally receive the user inputs by the second body part of the user(s) which is different from such a first body part and which is not necessarily required to perform the exercise. This latter embodiment may be particularly useful when the task requires multiple user inputs for the user(s) to proceed through the task while performing the exercises. For example, the input unit 41 receives the primary user inputs from the feet of the user running on the treadmill-type exercise module 20 and monitors the exercise or user feature therefrom, and receives the auxiliary user inputs from the hands of the user which are not related to the running exercise but required for manipulating various features of the task.
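The treadmill example above can be sketched as a small input router: primary inputs from the feet update the exercise or user feature, while auxiliary inputs from the hands manipulate task features. The event format and state dictionaries below are invented for the sketch.

```python
# Hypothetical sketch of routing primary (feet) and auxiliary (hands)
# user inputs from the treadmill example. The event and state formats
# are illustrative assumptions, not from the specification.

def route_input(event, exercise_state, task_state):
    # Primary inputs from the feet update the monitored exercise
    # feature; auxiliary inputs update the requested task feature.
    if event["body_part"] == "feet":
        exercise_state["step_count"] = exercise_state.get("step_count", 0) + 1
    else:
        task_state[event["action"]] = True
    return exercise_state, task_state

exercise, task = {}, {}
route_input({"body_part": "feet"}, exercise, task)
route_input({"body_part": "hands", "action": "zoom"}, exercise, task)
```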
Such primary and auxiliary user inputs may generally be differentiated from each other based on an amount of energy associated with or consumed by the related body parts, i.e., the primary input corresponding to the body part delivering a significant amount of the energy of the user. Other configurations, arrangements, or dispositions of the input unit 41 are similar or identical to those of the overall control module 40 as far as the input unit 41 can receive such user inputs. - A primary function of the
optional sensor unit 42 is to monitor various variables or parameters of (or related with) various operations of the exercise and/or output modules 20, 50 or of the user(s) performing the exercises on, with, or against the exercise modules 20, and to convert the monitored features into the sensed signals. Depending upon its configuration or operating characteristics, the sensor unit 42 monitors such features by contacting at least one body part of the users or without necessarily contacting such. The control module 40 (or its control unit 44) receives the sensed signal and converts such into the control signal for various purposes such as a feedback control of other modules 20, 50. - Any conventional sensing device may be used as the
sensor unit 42 of the control module 40. In particular, the sensor unit 42 may monitor various features of the physical or physiological conditions of the user(s), operations, exercises, and/or task. Accordingly, any prior art sensing device capable of measuring such conditions and features may be used as the sensor unit 42, where examples of such conditions and features may include, but not be limited to, a presence or absence of the user(s) (or body part) with respect to a preset landmark of the system 10 or exercise modules 20, a distance from the landmark to the user(s) or body part, a position or posture thereof, a movement (including its direction or displacement) thereof, a temperature thereof, a heart rate or blood pressure thereof, a blood O2 level or sugar concentration thereof, ECG, EEG, EMG, height, weight, or body fat content or percentage. - The
sensor unit 42 may be incorporated into various positions of the exercise system 10. For example, the sensor unit 42 may be formed physically separate from such exercise or output modules 20, 50. The sensor unit 42 may instead be disposed in or on at least one of such modules 20, 50. The sensor unit 42 may operatively couple with other parts of the system 10 by wire or wirelessly, depending on its disposition or structure. In addition, at least a portion of the sensor unit 42 may be included in the portable or wearable articles so that the user(s) may perform the exercise while providing the user inputs without disengaging themselves from such exercises. At least a portion of the sensor unit 42 may also be worn around other body parts of the users to allow the unit 42 to monitor various variables and parameters of other modules 20, 50. A detailed configuration of such a unit 42, however, depends upon such variables or parameters, the feature to be monitored, or the operating mechanism thereof. - The
system 10 may include multiple sensor units 42 monitoring such variables or parameters of the exercise modules 20 and monitoring various features of multiple users simultaneously performing the exercises on, with, and/or against the exercise modules 20. All sensor units 42 may define the same or different configurations, may monitor the same or different variables and/or parameters, may be disposed in the same or different positions, and the like. In addition, at least one of the sensor units 42 may also serve as a "master" sensor unit 42 manipulating the rest thereof. Similarly, the sensor units 42 may monitor the variables and/or parameters for a single output module 50 or, in the alternative, the system 10 may have multiple sensor units 42 at least one of which serves as the master sensor unit 42. Further configurations, arrangements, or dispositions of the sensor units 42 are similar or identical to those of the overall control module 40 as far as the sensor units 42 monitor the features, variables, and/or parameters with or without contacting the user(s). - A main function of the
optional storage unit 43 is to store information required for generating or providing various features of the task of the story, scenery, and/or game in such images (or other features) of the virtual environment or information which may be needed to transfer various features of the task, user(s), exercise(s), and/or operation(s). Accordingly, the storage unit 43 may store an algorithm for generating and proceeding along the task, an algorithm (i.e., relation) for manipulating the features of the task based on various features of the exercises, and the like. Depending on the configuration of the storage unit 43, such information may be stored therein or may be retrieved from any other units of the control module 40. In order to facilitate retrieval of desirable information, the storage unit 43 may include a driver for accessing such information and/or capable of storing such thereinto, where examples of the information may include various features of the images for the task, those of the objects, backgrounds, and/or simulated user, various set points and/or control thresholds for any of such variables, parameters, features, control programs and/or algorithms, and the like. - Any conventional storage device may be used as the
storage unit 43 of the control module 40. Thus, the storage unit 43 may be magnetic tapes or disks, optical disks, or semiconductor data storage devices, in each of which such information may be stored in an analog or digital mode. As far as the storage unit 43 may store such information, the storage unit 43 may be formed by almost any prior art process and in almost any prior art configuration. Such a storage unit 43 may be incorporated into those positions disclosed in conjunction with the sensor unit 42. - A major function of the
control unit 44 is to perform all of the aforementioned functions except those of the other units 41-43 of the control module 40, although the control unit 44 may also perform the functions of those units 41-43 in order to assist or supplement such units 41-43. Most importantly, the control unit 44 may generate the task in the images (or virtual environment) and allow multiple users to perform the same or different exercises while transferring at least one feature of the task, exercises, users, and/or operation of the exercise modules 20 in a preset mode. - The first main function of the
control unit 44 is to provide the task of the story, scenery, and/or video (or computer) game with primary (or 1°) features. Such 1° task features typically consist of 1° task types and 1° task extents, where the task may be provided in such images of a single still picture with a single or multiple portions, a series of the pictures, and/or a video clip, or optionally provided in such auditory, olfactory, and/or tactile features. The 1° task types include various features such as, e.g., a task goal (e.g., viewing or watching the images or other features of the virtual environment for the task, proceeding along such stages of the task, attaining a preset objective by competing against a preset program or another user, and the like), a number of stages or levels required therefor (i.e., a number of portions formed in the still pictures, a number of still pictures in the video clip, and/or a number of parts in the video clip), means to attain the task goal (e.g., performing the exercise, applying the user inputs, and the like), means to proceed through the task stages (e.g., performing the exercise, applying such user inputs, monitoring the variables or parameters of the modules 20, 50, and the like), and so on. - The second main function of the
control unit 44 is to provide the task of the story, scenery, or video (or computer) game in the images (or optionally sounds, smells, sensations) with secondary (or 2°) features. The 2° task features typically consist of 2° task types and 2° task extents, where such a task is provided in such images of a single still picture including a single or multiple portions, a series of such pictures, and/or a video clip, optionally provided in such sounds, and/or optionally provided in such smells and/or sensations. The 2° features of the task defined in the images of a still picture with a single or multiple portions may include, e.g., selecting a preset portion, a direction or a sequence of viewing a next portion, a speed or a temporal gap between viewing different portions, a viewing area or an extent of zoom, a view angle when the images may be rotated, and the like. Such 2° features of the task defined in the images of a series of still pictures and/or a video clip may include, e.g., selecting a preset picture thereof, a direction or sequence of viewing a next picture, a speed or a gap between viewing different pictures, a viewing area or an extent of zoom, a perspective angle of such images, a view angle when the images may be rotated, and the like. - Another main function of the
control unit 44 is to relate at least one feature of one of the task, user, exercise done by the user, and operation of the exercise module 20 with at least one feature of another thereof, whether directly or through at least one simulated user. The control unit 44 may be arranged to perform such relating based upon a fixed relation defined between at least two of such features (i.e., automatically), based on a relation defined between the features and varied according to at least one of such features (i.e., automatically and/or adaptively), based on the command signals (i.e., manually), based on the sensed signals (i.e., automatically and/or adaptively), and the like. - Based upon these major functions, the
control unit 44 performs numerous other functions. For example, the control unit 44 receives the command and sensed signals respectively from the input and sensor units 41, 42 and generates the control signals supplied to the exercise and/or output modules 20, 50. The control unit 44 generally serves to determine various features of the task of the story, scenery, or video (or computer) game defined in such images of the virtual environment, to select desirable images from multiple sets of images stored in the storage unit 43 or supplied thereto from various sources, to generate the images by assembling or composing various features therefor, and the like. In general, the control unit 44 performs such functions based on various features which include the user inputs or command signals derived therefrom, variables and/or parameters monitored by the sensor unit 42 or sensed signals derived therefrom, and the like. Therefore, the features also include the physical or physiological conditions of the user(s) monitored by the sensor unit 42, control programs or algorithms stored therein or supplied thereto by the user(s) or external sources, and so on.
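The signal flow just described (user inputs become command signals, monitored variables become sensed signals, and the control unit derives control signals from either or both) can be sketched with simple dictionary-based signals. The signal format and field names below are invented for the illustration.

```python
# A minimal sketch, assuming dict-based signals with invented field
# names, of the command/sensed/control signal flow described above.

def to_command_signal(user_input):
    # The input unit 41 converts a user input into a command signal.
    return {"kind": "command", "value": user_input}

def to_sensed_signal(variable, reading):
    # The sensor unit 42 converts a monitored variable into a sensed
    # signal.
    return {"kind": "sensed", "variable": variable, "value": reading}

def control_signal(command=None, sensed=None):
    # The control unit 44 generates a control signal based on the
    # command signal, the sensed signal, or both (or neither, i.e.
    # independently thereof).
    signal = {"kind": "control"}
    if command is not None:
        signal["target"] = command["value"]
    if sensed is not None:
        signal["feedback"] = sensed["value"]
    return signal

cmd = to_command_signal("increase_load")
sns = to_sensed_signal("heart_rate", 128)
ctl = control_signal(cmd, sns)
```

A control signal carrying both a target and a feedback value, as in `ctl` here, is the shape a feedback control of the exercise or output modules would take.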
In addition, such features may include various variables and/or parameters related or associated to the exercise modules 20 or the exercises, where examples of such features may include, but are not limited to, a duration of presence or absence of the user(s) on or near the exercise modules 20, a duration of exercises done against or onto the exercise modules 20, a duration of the exercises performed by the user(s), a duration of the exercises (or energy) done on the user(s), the mechanical load presented by and/or set in the exercise modules 20 against the user(s), a product of the load and any of such durations, a variable which may be represented as a mathematical function of the load, a number of calories consumed by the user(s), a work done against or onto the exercise modules 20 by the user(s), a work done on the user(s) by the exercise modules 20, and the like, where the load may be deemed as a variable or a parameter determining an amount of energy consumed by the user(s) in consummating a unit displacement or a unit deformation of a specific part of the exercise modules 20 or a body part of such user(s). It is understood that such a load may be quantified by various means, examples of which may include, but are not limited to, a distance which the user(s) travels with respect to a preset part of the exercise modules 20, a length along which such a part travels or deforms, an angle about which the part bends or deforms, or a weight of such a part which is moved or deformed by the user(s). To such ends, the exercise modules 20 may manipulate such a load by various means, examples of which may include, but are not limited to, adjusting a speed or an angle of a preset part of the exercise modules 20, a modulus of the part, its spring constant or viscosity, its weight, or its length. - The
control unit 44 may acquire various features of the virtual environment and may transfer such images or other features related thereto to other units 41-43 of the control module 40 or, alternatively, to other modules of the system 10; conversely, the control unit 44 may receive such images (or other features) from other units 41-43 of the control module 40 or from other modules of the system 10. To this end, various units of the control unit 44 may perform communication-related functions or, alternatively, the control module 40 may include various communication-related units, both for performing transfer of at least one feature between the exercise modules 20 or between multiple users. - In one example, the
control module 40 may include at least one video-in unit for acquiring such images (including their visual features or elements), or at least a portion of such a control unit 44 may acquire the images. In general, the images are generated by or retrieved from the video-in unit (or the portion of the control unit 44), generated by or retrieved from other units 41-43 of the control module 40, or transmitted thereto by other units 41-43 thereof or by an external source such as the external device or user(s). The images may be acquired as a still picture, a series of the still pictures, a video clip or a mixture thereof, where the images may be in black-and-white or in multiple colors, while each image includes at least one object or background. The images may be formed or generated in various ways. For example, the video-in unit (or a portion of the control unit 44) may store a single or multiple images and retrieve one or more of such therefrom, may provide a single or multiple images by superposing the object onto the background or composing such images from one or multiple visual elements, and/or may receive or download a single or multiple images from the storage unit 43 or from an external source such as, e.g., the internet, a wired or wireless broadcast, a user of another exercise module, a user of another external device, and the like. The video-in unit may acquire a single or multiple images by itself or using the sensor unit 42. In all of such examples, each image may include at least one object or background each respectively representing an animation of a real or abstract object, an animation of a real or abstract background, and the like, where the object may be a living organism (such as a person, an animal, a plant, and the like) or a nonliving object, while the background may be the living organism or nonliving object.
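The superposing of an object onto a background mentioned above can be sketched minimally as follows, treating images as nested lists of pixel values with 0 as transparent; the pixel format and function name are assumptions for illustration, not the disclosed implementation.

```python
# Hypothetical sketch of "superposing the object onto the background":
# images are plain nested lists, and pixel value 0 is treated as transparent.

def superpose(background, obj, top, left):
    """Return a copy of `background` with `obj` pasted at (top, left);
    zero-valued object pixels leave the background visible."""
    out = [row[:] for row in background]       # copy, leave original intact
    for r, row in enumerate(obj):
        for c, pixel in enumerate(row):
            if pixel != 0:                     # skip transparent pixels
                out[top + r][left + c] = pixel
    return out

background = [[1] * 4 for _ in range(3)]       # 3x4 uniform background
obj = [[0, 9],
       [9, 9]]                                 # object with one transparent pixel
composed = superpose(background, obj, 1, 1)
```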
Accordingly, the object may correspond to the user(s) recreated by the animation, a simulated object included in the images, a controllable object included in the background, and the like. In each example, the contents of the images (i.e., the object and background) may be an animated or arbitrary object and/or an arbitrary scene. In addition, the object and/or background may be associated or related to a preset event, geographic location, timing, and the like. In general, the video-in unit may manipulate various features of the images depending upon various factors such as, e.g., the source of the images, the type of the source, the contents of the images, and/or such aspects each determined by various factors such as, e.g., the user inputs, command or sensed signals, control signals, variables and/or parameters monitored by the sensor unit 42, conditions of the user(s), and operations. - To provide the images of desired visual features, the video-in unit may provide desired images from a preset viewpoint of the user. To this end, the video-in unit may manipulate various features of the images, where examples of such features may include, but are not limited to, shapes and/or sizes of such images or their portions, contents or colors thereof, brightness or hues thereof, sharpness or zoom thereof, contrasts thereof, temporal or spatial characteristics, distributions, and/or variations of such aspects, and the like. It is appreciated that the images may be provided based on a preset view angle of or distance to the user or that such images may be provided to simulate the view angle or distance. When desirable, the
control unit 44 may zoom such images in or out, vary the view angle thereof, vary the distance thereto, rotate such images with respect to a preset base, and the like. - To this end, the video-in unit may include any prior art camera, camcorder, and/or other image recording devices, including charge-coupled devices, capable of acquiring such images. The
control unit 44 may include a single camera (or camcorder) or multiple cameras (or camcorders) for acquiring the same or different images therewith. The video-in unit may acquire the images of the body part of the user(s), where the video-in unit may be disposed to preferentially aim at the body part. Accordingly, the preset or target area for the video-in unit may be, e.g., an entire visible area of the user(s), an area around the face thereof, an area covering an upper torso thereof, or an area defining a height similar to that of the user(s). The video-in unit may be disposed in the exercise modules 20 or, alternatively, at least a portion of the exercise modules 20 (or another external audiovisual device) may also be recruited as the video-in unit. The sensor unit 42 may be used to acquire the images as well. - In another example, the
control module 40 may have at least one video-out unit for generating such images (including visual features or elements), or at least a portion of such a control unit 44 may generate the images. Such images may be generated by and/or retrieved from the video-out unit (or a portion of the control unit 44), generated by or retrieved from other units 41-43 of such a module 40 or other modules of the system 10, or transmitted thereto by an external source such as, e.g., an external device or user(s). Depending on the types of sources, the images may include a still picture, a series of multiple pictures, a video clip, a mixture thereof, and so on. The video-out unit is to perform various functions similar or identical to those of the visual unit 51 and, therefore, may be replaced by the visual unit 51. Otherwise, further configurations, arrangements, and/or dispositions of the video-out unit are similar or identical to those of the visual unit 51. - The video-in and/or video-out units (or control unit 44) may acquire and/or generate the images associated or synchronized with other features. In one example, the video-in and video-out units may associate or synchronize at least one feature of such images with at least one feature of the user(s), which may include, but is not limited to, a face, hand or arm, foot or leg, other body parts, appearance, orientation or posture, movement, and physical or physiological condition thereof. In another example, such video-in and/or video-out units may associate or synchronize at least one feature of the images with at least one feature of the exercises as set forth herein, each of which may be determined by various aspects and/or factors as described above. In another example, the video-in and/or video-out units may associate or synchronize at least one feature of the images with at least one feature of the operations, which may also include, but is not limited to, variables and/or parameters of such operations of the
exercise modules 20, a preset control program designed therefor, and the like. - In another example, the
control module 40 may include at least one audio-in unit to acquire the sounds (i.e., auditory features or elements), or at least a portion of the control unit 44 may acquire the sounds. The sounds may be generated by or retrieved from the audio-in unit (or a portion of the unit 44), generated by or retrieved from other units 41-43 of the control module 40 or other modules 20 of the system 10, or transmitted thereto by an external source such as an external device or users. The audio-in unit may acquire the sounds by various means, e.g., by acquiring voices or sounds of the user(s) or other persons, those generated by a plant or animal, those of music, or sounds generated by a nonliving object, where the sounds may be provided in a mono, stereo or surround mode. Based on the types of the sources, the sounds may carry contents (e.g., conversation) or a melody, or may carry no contents (e.g., instrumental music). The sounds may be provided by various means. For example, the audio-in unit may store the sounds and retrieve such therefrom, may synthesize the sounds or superpose at least one sound onto another, or may receive or download the sounds from the storage unit 43 or from at least one external source such as the internet, a wired or wireless broadcast, a user of another exercise module, or a user of another device. The audio-in unit may also acquire the sounds with the sensor unit 42. In such examples, the sounds may include real or synthesized sounds, where each sound may be generated by or represent a source. Accordingly, the sounds may correspond to the sounds of the user(s) in their own voices or to sounds synthesized or simulated by various prior art means. In each example, such sounds may be related to or associated with the preset event, geographic location, timing, person, and/or object.
The audio-in unit may manipulate various aspects of the sounds depending on various factors such as the source of such sounds, the type thereof, and their contents or aspects, each of which may be determined by various factors such as, e.g., the user inputs, command or sensed signals, control signals, variables or parameters sensed by the sensor unit 42, conditions of the user(s), operations, and the like. - To provide such sounds with desired auditory features, the audio-in unit preferably provides the desired sounds from the preset viewpoint of the user. To this end, the audio-in unit may manipulate various features or aspects of the sounds, where examples of such aspects may include, but are not limited to, a volume or tone of the sounds, a content thereof, a frequency distribution thereof, a direction thereof, temporal or spatial characteristics, distributions, and variations of such aspects and features, and the like. It is appreciated that the sounds may be provided based on a preset direction or distance with respect to the user or that the sounds may be provided to simulate the direction or distance. The audio-in unit may also control such aspects or features of the sounds, change the direction of and/or distance to such sounds, and the like. To this end, the audio-in unit may be constructed from any prior art microphones for acquiring such sounds in an audible (or inaudible) frequency range. The audio-in unit may include a single or multiple microphones for acquiring the same or different sounds therefrom. The audio-in unit may acquire such sounds from a preset target area of the user, where the audio-in unit may be disposed near a mouth or other body parts of the user(s). Therefore, the preset or target area may include, e.g., an entire audible area of the user(s), an area around a mouth thereof, an area encompassing an upper torso thereof, an area defining a height similar to that of the user(s), and the like.
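Providing a sound based on a preset direction or distance with respect to the user, as described above, might be sketched as inverse-distance attenuation combined with a constant-power stereo pan. The formulas and names here are illustrative assumptions, not the disclosed implementation.

```python
import math

# Hypothetical sketch of simulating direction and distance for a sound:
# inverse-distance attenuation plus a simple constant-power stereo pan.

def spatialize(sample, distance, azimuth_deg):
    """Scale a mono sample by distance and split it into (left, right)
    gains; azimuth 0 is straight ahead, -90 full left, +90 full right."""
    gain = 1.0 / max(distance, 1.0)                 # attenuate with distance
    pan = math.radians((azimuth_deg + 90) / 2)      # map [-90, 90] -> [0, pi/2]
    return (sample * gain * math.cos(pan),          # left channel
            sample * gain * math.sin(pan))          # right channel

center = spatialize(1.0, 1.0, 0)      # equal power in both channels
far_left = spatialize(1.0, 2.0, -90)  # attenuated, all in the left channel
```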
The audio-in unit may be disposed away from the user(s), worn thereby over or around a mouth or vocal cords thereof, worn thereby over or around a head thereof, portably carried thereby, and the like. When desirable, the audio-in unit may be incorporated into the
exercise modules 20, or the sensor unit 42 of the control module 40 may be recruited therefor as well. - In another example, the
control module 40 may include at least one audio-out unit for generating the sounds (including auditory features or elements), or the control unit 44 may generate the sounds. Such sounds may be generated by or retrieved from the audio-out unit (or a portion of the control unit 44), generated by or retrieved from other units 41-43 of the control module 40 or other modules 20 of the system 10, or transmitted thereto by an external source such as, e.g., an external device or user(s). Depending on the source types, such sounds may be in a mono, stereo or surround mode. The audio-out unit is to perform various functions similar or identical to those of the auditory unit 52 of the output module 50 and, thus, may be replaced by the auditory unit 52. Further configurations, arrangements or dispositions of the audio-out unit are similar or identical to those of the auditory unit 52. - The audio-in and/or audio-out units (or control unit 44) may acquire and generate such sounds associated or synchronized with other features. In one example, such audio-in and/or audio-out units may associate or synchronize at least one feature of the sounds with at least one feature of the user(s) as set forth herein. In another example, such audio-in and/or audio-out units may associate or synchronize at least one feature of the sounds with at least one feature of the exercises as set forth herein, which may be decided by various aspects or factors as set forth herein. In another example, the audio-in and/or audio-out units may associate or synchronize at least one feature of such sounds with at least one feature of the operations such as, e.g., variables or parameters of such operations and a preset control program designed therefor.
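One way to associate a feature of the sounds with a feature of the exercises, as described above, is to tie playback tempo and volume to a monitored cadence. The mapping below is a hypothetical sketch; the constants are invented, not disclosed values.

```python
# Illustrative sketch of synchronizing sound features (tempo, volume)
# with an exercise feature (pedaling cadence); all constants are assumed.

def sync_audio_to_cadence(cadence_rpm, base_tempo_bpm=100.0):
    """Return (tempo_bpm, volume) so music speeds up and gets louder as
    the user pedals faster, with volume clamped to a sensible bound."""
    tempo = base_tempo_bpm * (cadence_rpm / 60.0)   # 60 rpm -> base tempo
    volume = min(1.0, 0.4 + cadence_rpm / 200.0)    # louder with effort
    return tempo, volume

slow = sync_audio_to_cadence(60)     # leisurely cadence
fast = sync_audio_to_cadence(120)    # vigorous cadence
```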
- In another example, the
control module 40 includes at least one receiving unit for receiving the signals wirelessly or by wire, or at least a portion of the control unit 44 may receive the signals. The receiving unit (or the portion of the control unit 44) may receive the signals from the exercise or output modules 20, 50, from other units of the control module 40, or from an external source such as the internet or a wired or wireless broadcast. It is appreciated that the signals may carry various information which includes various features described above. Therefore, the receiving unit (or the portion of the control unit 44) defines a desirable frequency response or sensitivity capable of receiving such information and features with minimal distortion. The receiving unit may be made from any prior art wireless or wired receiver capable of receiving the signals of preset frequency ranges. The receiving unit may receive the information from the storage unit 43 of the control module 40, from a provider of a wired or wireless communication, through a global network, and so on. The receiving unit may operatively couple with the exercise or output modules 20, 50 of the system 10. The system 10 may also include multiple receiving (or control) units, at least one of which serves as a “master” receiving unit and controls the rest thereof. Regardless of its type and number, the receiving unit may be at least partially enclosed by a cover, a divider or a partition when it is desired to enclose the receiving unit inside the system 10. The receiving unit may include various prior art wave guides or paths for enhancing reception of such signals. Similar to other modules, the receiving unit may be incorporated into other units or modules of the system 10. - In another example, the
control module 40 may include at least one sending unit for transmitting various signals wirelessly or through wire, or at least a portion of the control unit 44 may transmit the signals. The signals transmitted by the sending unit (or a portion of the control unit 44) carry various information therealong which includes various features set forth herein and, accordingly, the sending unit preferably includes a desirable frequency response or sensitivity for transmitting the information or features with minimal distortion. The sending unit transmits such information to other modules 20, 50 of the system 10, to the storage unit 43, to the user(s), and the like. - Such communication-related units may be operatively coupled to each other in various modes. For example, each unit may operatively couple with the rest thereof so that each unit may receive or transmit the signals as the electrical or optical signals by wire or wirelessly. In an opposite example, at least one of such units may couple with not all but only some of the rest of the units such that, e.g., the video-in or video-out unit may couple to the audio-in or audio-out unit but not with the rest thereof. Therefore, detailed coupling modes of such units depend not only on a configuration of such units but also on assigned functions of each of such units. As described above, such communication-related units may include multiple units of the same type, e.g., multiple video-in and/or video-out units in order to display different and/or overlapping images, two or more audio-in and/or audio-out units to play different or overlapping sounds, and the like. At least a portion of the communication-related units may be included in the exercise and/or
output modules 20, 50 or the control module 40, or may be deemed as a part of such exercise or control modules 20, 40. - Such a
control unit 44 may be arranged to transfer at least one feature of the task, users, exercises, and/or at least one operation of the exercise modules 20 only from one to another of the exercise modules 20, between the exercise modules 20, and the like. The control unit 44 allows the transfer wirelessly or by wire, without altering any feature (i.e., a simple transfer), or by converting or altering at least one feature based on at least one preset relation (i.e., an equivalent conversion), and the like. To this end, various portions of the control unit 44 may preferably perform various transfer-related functions, or such a control module 40 may include various conversion-related units, both for performing transfer of at least one feature between the users or between such exercise modules 20. It is appreciated that such transfer (including both the simple transfer and the equivalent conversion) is to transfer the task feature from one to another of the exercise or output modules 20, 50. The control unit 44 may perform the transfer directly between the modules 20, 50, and the control unit 44 may transfer at least one feature of the users, exercises, and/or operations between the exercise and/or output modules 20, 50. - Such conversion-related units (or control unit 44) may receive at least one feature of the task, user(s), exercises, and/or operations and convert such into the converted (or control) signals at least partly based on a preset relation. In particular, such conversion-related units (or control unit 44) may generate the converted signals based on the control signals or various features of the task, user(s), exercises, and/or operations, may convert the converted signals into the control signals, and the like, although such units may generate the converted or control signals (to be referred to as the “signals” hereinafter) for manipulating various features of the task provided by the
control module 40 or other external devices defining story-generating, scenery-generating or game-generating capabilities which may or may not be deemed as a part of the system 10. It is appreciated that such conversion-related units generate the signals and deliver such signals to the external device, thereby manipulating the device based on the signals and manipulating at least one task feature based thereon. To these ends, such conversion-related units may include various units such as, e.g., at least one simulator unit, at least one converter unit, and at least one driver unit, where such units may be provided in singular or plural arrangements. As set forth herein, however, at least one of such units may be incorporated into and/or replaced by at least a portion of the control module 40 or other modules of the system 10. At least one of such units may also be incorporated into or replaced by at least a portion of the external device. Accordingly, the exact disposition of various conversion-related units or classifications thereof may not be critical to the scope of this invention as long as the system 10 may provide various functions to be disclosed in conjunction with various conversion-related units. - In one example, such a
control module 40 may include at least one simulator unit for acquiring at least one feature of the task, user(s), exercise(s), and/or operations, or at least a portion of such a control unit 44 may be arranged to acquire the feature. The simulator unit (or the portion of the control unit 44) may passively receive the feature from another module (or unit) of the system 10 or actively monitor and acquire the feature using any prior art sensors or the sensor unit 42. As set forth herein, the simulator unit acquires at least one exercise feature and then converts the acquired feature into the signals, where the simulator unit may acquire the desired feature by directly monitoring such, by estimating the desired feature from at least one other exercise feature, by estimating the desired feature based on at least one of such features of the user(s), tasks, and/or operations, by receiving the desired feature from another module or unit, and the like. It is appreciated that the simulator unit may acquire the desired feature by analyzing various images or sounds provided by other modules or units or by the external device which may or may not be deemed as the part of the system 10. When the simulator unit acquires at least one feature of the task, user(s), and/or operations, such a unit may acquire the feature by monitoring or estimating through various means similar to those of acquiring the exercise feature.
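The simulator unit's strategy of preferring a directly monitored value and falling back to an estimate from another exercise feature could be sketched as follows; the stride-based speed model and all names are purely assumed examples.

```python
# Hypothetical sketch of the simulator unit's acquisition strategy:
# use a directly monitored value when available, otherwise estimate the
# desired feature from another exercise feature. The stride-based model
# is an assumed example, not a disclosed formula.

def acquire_speed(sensed_speed=None, stride_rate=None, stride_length_m=1.2):
    """Return (speed_m_per_s, source): prefer the monitored value,
    fall back to an estimate from stride rate, else report nothing."""
    if sensed_speed is not None:
        return sensed_speed, "monitored"
    if stride_rate is not None:
        # strides/minute -> strides/second, scaled by stride length
        return stride_rate / 60.0 * stride_length_m, "estimated"
    return None, "unavailable"

direct = acquire_speed(sensed_speed=2.5)
estimated = acquire_speed(stride_rate=150)   # 150 strides/min * 1.2 m
```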
The simulator unit may acquire at least one task feature and convert the acquired feature into the converted or control signals, where the simulator unit acquires the desired feature of the task by directly monitoring the task, by estimating the desired feature from at least one other task feature, by estimating the desired feature based on at least one of the features of such user(s), exercise(s), and/or operations, by receiving the desired feature from another module and/or unit, by receiving the desired feature from the external device, by receiving the desired feature from another user of another exercise module, and the like. The simulator unit may acquire the desired feature by analyzing the images and/or sounds generated for the task by the control module 40 or by the external device which may or may not be the part of the system 10. - The simulator unit couples with
such exercise modules 20 or the external device to manipulate at least one operation feature of the modules 20 (or device) at least partly based on at least one feature of the task, users, exercises, and/or operations. The simulator unit may acquire the desired exercise feature by directly coupling to the exercise modules 20, by indirectly coupling thereto through the control module 40, and the like. Similarly, the simulator unit may directly couple with the external device when the system 10 drives the device and manipulates at least one feature of the task provided by the device. The simulator unit may also operatively couple to the external device through at least one of the other modules or units of the system 10. The simulator unit may be constructed from any prior art devices capable of acquiring such desired exercise, task, and/or other features and converting such into the converted and/or control signals. The simulator unit may include at least one receiver to receive such a desired feature from the exercise or control modules 20, 40 or from other modules of the system 10. - In another example, the
control module 40 may include at least one converter unit for assisting the simulator unit (or the portion of the control unit 44) while performing conversion of the acquired feature to the converted or control signals, by providing at least one relation and equivalence respectively for the “simple transfer” and “equivalent conversion,” both of which are to be collectively referred to as the “conversion” hereinafter. At least a portion of the control unit 44 may perform such conversion by providing the relation and equivalence. The converter unit may preferably provide such a relation for associating or synchronizing at least one feature of one of such a task, user(s), exercise(s), and operation of the exercise modules 20 with at least one feature of at least one another thereof so that various features of different types may be related to each other at least partly based on the relation. The converter unit may arrange the relation to account for discrepancies in amounts of energy which may be required for performing a unit of various exercises and which may be attributed to different types of exercises, different loads imposed by different exercise modules 20, differences in physical abilities of the users, and the like. Thus, the system 10 may perform the “simple transfer” when the control module 40 delivers or transmits at least one preset feature between at least two modules 20, 50 without altering the feature, whereas the system 10 may perform the “equivalent conversion” when the control module 40 delivers or transmits at least one preset feature between the modules 20, 50 while converting the feature based on the preset relation. Accordingly, the system 10 may relate at least one feature of the task, users, exercises or operations to a different feature of the same type, to the same feature of a different type, or to a different feature of a different type.
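The “equivalent conversion” described above, which accounts for discrepancies in the energy required by different exercise types, might be sketched by routing every extent through a common energy unit; the per-minute calorie figures below are invented for illustration and are not disclosed values.

```python
# Illustrative sketch of an "equivalent conversion" between exercise
# modules 20: a duration on one module is related to a duration on
# another through a common energy unit. The rates are invented figures.

CALORIES_PER_MINUTE = {          # assumed energy cost per minute of exercise
    "treadmill": 10.0,
    "stationary_bike": 8.0,
}

def equivalent_minutes(minutes, from_module, to_module):
    """Convert a duration on one module into the duration on another
    module that consumes the same energy (the preset relation)."""
    calories = minutes * CALORIES_PER_MINUTE[from_module]
    return calories / CALORIES_PER_MINUTE[to_module]

# 12 treadmill minutes burn 120 kcal under these assumed rates,
# which corresponds to 15 minutes on the stationary bike.
bike_minutes = equivalent_minutes(12, "treadmill", "stationary_bike")
```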
- The converter unit (or the portion of the control unit 44) may perform the simple transfer and/or equivalent conversion based on the preset relation which may be decided at least partly based on at least one feature of various types such as, e.g., the task, user(s), exercise(s), operation, and the like. For example, the converter unit may provide the relation and convert at least one feature of one of the above types defined in a specific unit (e.g., calories, watts, N, N/m, N-m, minutes, meters, and so on) into the same feature of the same type but in a different unit, into the same feature of a different type but in the same unit, into a different feature of the same type but in a different unit, and/or into a different feature of a different type and in a different unit. As a result, the converter unit may synchronize or associate at least one feature of the exercise(s) with at least one feature of the task (or vice versa), may associate or synchronize at least one feature of the user(s) and/or operation(s) with at least one feature of the task (or vice versa), and the like. This function may be of particular importance when the converter unit is to simulate the user(s) as the simulated user such as, e.g., at least one object or background in the images for the task. The converter unit may generate such a relation at least partly based on the control signals generated at least partly based on the user inputs and/or preset program, may retrieve one or more from multiple relations stored in the
system 10, and the like. - The converter unit may keep the relation constant during the exercises or at least one stage of the task. The converter unit may allow the users to manually control the relation during the exercises or stage of the task. The converter unit may determine the relation at least partly based upon various signals provided by the users, a user of another exercise module, or the external device, and so on, with or without any intervention therefrom. The converter unit may vary the relation automatically or adaptively at least partly based on at least one factor of the users, task, exercises, and/or operations. The converter unit may be made of any prior art devices for generating or retrieving the basic relation or equivalence and utilizing such to generate the converted or control signals. The converter unit may include at least one optional receiver to receive such a feature from the exercise or
output module 20, 50 (or external device), at least one optional sensor for monitoring such a feature of the task, users or exercises, at least one processor to generate the relation, and the like. The converter unit may also be provided as software and driven by the control or other modules of the system 10; the converter unit may be incorporated into the control module 40, or at least a portion thereof may be incorporated into the exercise or output modules 20, 50 (or external device) when desirable. - In another example, the
control module 40 may optionally have at least one driver unit capable of providing the converted or control signals and manipulating at least one feature of the task provided by the external device, or at least a portion of the control unit 44 may provide such signals and control the task feature. To this end, the driver unit may have a configuration or arrangement to communicate with the external device of only a certain type or with at least two external devices of different operating types. It is appreciated that detailed configurations or arrangements of the driver unit are not critical to the scope of this invention as long as the driver unit drives the external device. The control module 40 may not incorporate any driver unit when the simulator unit, converter unit or its other units 41-44 may generate the task and manipulate various features thereof or may directly control the external device, which explains why the driver unit is merely an optional unit. The external device may also perform the function of the driver unit when desirable. The driver unit may be provided as a driver of any prior art audiovisual external device capable of communicating with at least one module of the system 10. The driver unit may have at least one receiver and converter, where the receiver may receive various signals from the simulator or converter unit, or from the exercise or output modules 20, 50 of the system 10. Such a driver unit may be incorporated into the control module 40, or at least a portion of the driver unit may be included in the exercise or output modules 20, 50. - Based upon these major functions, the
control unit 44 performs numerous other functions. For example, the control unit 44 receives the command and sensed signals respectively from the input and sensor units 41, 42, and controls the output module 50 based thereon. The control unit 44 may determine the features of the task defined in the images, select desirable images from multiple sets of images stored in the storage unit 43 or supplied thereto from various sources, generate the images by assembling or composing various features, and the like. The control unit 44 performs such functions based on various features including the user inputs or command signals, variables or parameters monitored by the sensor unit 42, sensed signals, and so on. Thus, the control unit 44 may monitor the features such as the physical or physiological conditions of the users monitored by the sensor unit 42, control programs or algorithms stored therein or supplied thereto by the users or external sources, and the like. Such features may include various variables or parameters related or associated to the exercise modules 20 or exercises, where examples of such features may include, but are not limited to, a type of the exercise and an extent thereof attained by the user.
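A minimal sketch of computing such an exercise extent from a monitored load and displacement, with the resulting work converted into calories, might look as follows; the sample values and function names are assumptions for illustration.

```python
# Illustrative computation of an exercise "extent": work as the product
# of load and displacement, and calories from work via the standard
# joule-to-kilocalorie factor. The inputs are hypothetical sample values.

J_PER_KCAL = 4184.0   # joules in one kilocalorie

def exercise_extent(load_newtons, displacement_m, reps):
    """Return (work_joules, kcal) for `reps` repetitions of moving a
    part of the exercise module through `displacement_m` against
    `load_newtons`."""
    work = load_newtons * displacement_m * reps   # W = F * d, per repetition
    return work, work / J_PER_KCAL

work_j, kcal = exercise_extent(load_newtons=200.0, displacement_m=0.5, reps=50)
```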
Such an extent is defined by various criteria, examples of which may include, but not be limited to, a duration of presence or absence of the users on or near the exercise modules 20, a duration of such exercises done against or onto the exercise modules 20, a duration of the exercises performed by the users, an amount of the exercises or energy done by or onto the users, a load presented by or set in the exercise modules 20 against the users, a product of the load and any of such durations, a mathematical function of the load, a number of calories estimated to be consumed by the user, a work done against or onto the exercise modules 20 by the users, a work done on the users by the exercise modules 20, and the like, where such a load may be deemed a variable or a parameter determining an amount of energy consumed by the user in effecting a unit displacement and/or deformation of a specific part (e.g., the actuating part) of the exercise modules 20 or body parts of the users. - As described above, the
control unit 44 may be arranged to manipulate at least one feature of one of the task, users, exercises, and/or operations of the exercise modules 20 at least partly based on at least one feature of another thereof. Such manipulation may be classified into three modes, i.e., manipulating the task based on the exercises, manipulating the exercises based on the task, and doing both. - For example, the
control unit 44 may monitor the users or exercise modules 20, acquire at least one feature of such users, exercises, and/or operation of the exercise modules 20, and manipulate at least one task feature at least partly based on the monitored feature. To this end, the control unit 44 monitors the operation or user feature by the sensor unit 42, generates the task in the images (with or without including the simulated user) of the virtual environment at least partly based on the monitored feature, and manipulates the output module 50 to display the images (with or without the sounds). Therefore, the control module 40 may manipulate the task (i.e., its features) at least partly directly based upon the exercises (i.e., exercise feature) or indirectly based thereon (i.e., user or operation feature), thereby allowing the users to manipulate the task and to proceed through the task at least partly based on the exercises. The control unit 44 may vary the mode of manipulation by various means, e.g., in response to the command signals (i.e., manually), to the sensed signals (i.e., automatically or adaptively), or at least partly based upon at least one of such features of the task, users, exercises, and/or operations (i.e., automatically or adaptively). - In another example, the
control unit 44 may monitor the images for the task (or simulated user therein), acquire at least one feature of the images (or that of the simulated user), and then manipulate at least one operation feature of the exercise modules 20 at least partly based on the monitored task feature. To this end, the control unit 44 may operatively couple to the load of the exercise modules 20, manipulate the load at least partly based on the monitored feature, and then manipulate the output module 50 to display such images (or simulated user) of which the feature is determined at least partly based on the manipulated load of such exercise modules 20. Thus, the control module 40 manipulates the operation feature of the exercise modules 20 at least partly based upon the task, thereby directly affecting the operation of such exercises by the task feature and indirectly affecting the types and/or extents of such exercises to be performed by the users and the physical or physiological conditions of the users which result from such exercises at least partly based on performance of the task. The control unit 44 may change the manipulation mode by various means, e.g., in response to the command signals (i.e., manually), to the sensed signals (i.e., adaptively or automatically), or at least partly based on at least one of the features including those features of the task, users, exercises, and/or operation (i.e., automatically or adaptively). - In another example, the
control unit 44 may perform such manipulations either sequentially (i.e., one after another) or simultaneously. That is, the control unit 44 may manipulate at least one feature of the task at least partly based on the monitored features of the users, exercises, and/or operation, or may manipulate at least one operation feature at least partly based on the monitored task feature. This control unit 44 may change its mode of manipulation based upon the command signals (i.e., manually), sensed signals (i.e., automatically or adaptively), at least one of such features (i.e., automatically or adaptively), and the like. It is appreciated that the sequential manipulation is best suited when a single user performs different exercises on, with or against one exercise module 20 and views the images for the task by the output module 50, although it is not impossible to use the simultaneous manipulation mode therefor. In contrast, the simultaneous manipulation is best suited when multiple users perform the same or different exercises on, with or against multiple exercise modules 20 while viewing such images for the task using individual output modules 50. The sequential or simultaneous manipulation may be performed in real time so that a desired feature may be transferred between the exercise and output modules 20, 50 as the exercise proceeds, or the control unit 44 may store such features of the first exercising user and then transfer them to another user later. In both manipulations, the control unit 44 may transfer the feature by wire or wirelessly and may transfer the feature with or without modifying at least a portion thereof. - When the
control unit 44 is to manipulate various features related to at least one object of such images, such features may include various features associated with the objects, where examples of the features may include, but not be limited to, the type of the object (i.e., an animated or synthesized object), a mode thereof (i.e., black and white, grey-scale, or color-scale), a dimension thereof (i.e., two- or three-dimensional), a configuration thereof, an arrangement and/or disposition thereof, and so on. As the control unit 44 simulates the user(s) into the simulated user, the manipulatable features of the simulated user may similarly include, but not be limited to, the type of the simulated user, its mode, its dimension, its configuration, or its arrangement and/or disposition. Such an object may be selected to be directly manipulated by at least one feature of the user(s), task, exercise, and/or operation or, in the alternative, may be the simulated user simulating or synthesizing the user(s). It is appreciated that the control unit 44 may generate one or multiple simulated users for the sequential manipulation but typically generates multiple simulated users for the simultaneous manipulation, although the reverse is feasible. - The
control unit 44 may manipulate at least one of such features based on at least another thereof while providing the user with the task in the images of the desired features. In one example, the control unit 44 may provide (i.e., generate or select) the images of the task in a preset perspective of the user. For example, a feature associated or related with the users may not be included in the images. The control unit 44 may also include in the images at least one user feature which may be constant or may vary based upon other features. In another example, the control unit 44 may construct the images in an arrangement in which any feature of the images may be varied by the users depending on other features. In particular, the control unit 44 may include in such images at least one object, at least one feature of which is determined at least partially based on at least one other feature. Thus, the control unit 44 may allow the users to manipulate at least one feature of the object, thereby allowing the user to manipulate the images directly or indirectly by manipulating at least one of such features. The control unit 44 may instead construct the images for the task such that the users may not be able to directly control them. In another example, the control unit 44 may include at least one object in such images while simulating at least one characteristic of the users by the object corresponding to the simulated user. Therefore, the control unit 44 may change at least one feature of the images for the task (or its simulated user) based on various features of the users or exercise modules 20 (i.e., their operations). The control unit 44 may thereafter manipulate at least one feature of such images based upon at least one of such features, user inputs, command or sensed signals, conditions of the users, or operations. - The
control unit 44 may arrange such images to simulate the users into at least one object or background of the task, as at least one voice or other auditory feature, and the like. It is appreciated that the control unit 44 may unilaterally manipulate the object, background, voices, sounds, and so on while simulating the users as such, while the users may not manipulate them in a reverse mode. In the alternative, the control unit 44 may be arranged to allow such control by the users. Accordingly, the control unit 44 may manipulate at least one feature of the images for the task based on at least one of such features, user inputs, command or sensed signals, conditions of the users, or operations. - The
control unit 44 may use the images for the task or its features to control various operations of the exercise modules 20. For example, the control unit 44 may provide (i.e., generate or select) such images in the users' (or another) perspective and then manipulate such operations of the exercise modules 20 based upon at least one of the features. In another example, the control unit 44 simulates the users into at least one of such features of the images and controls such operations by manipulating the simulated portion of the feature of such images. In another example, the control unit 44 may allow the users to directly manipulate at least one feature of the images and directly control such operations, or may control the operations indirectly using the manipulatable feature of the images for the task. - Any conventional control device may be used as the
control unit 44 of the control module 40 of the present invention. Accordingly, the control unit 44 may include various electric elements such as a resistor, a capacitor, an inductor, an amplifier, a diode, and the like, details of which are well known to one of ordinary skill in the art of electrical control systems. Such a control unit 44 may be formed on a circuit board, may be fabricated as a microchip, and the like. As long as the control unit 44 is able to perform the various control functions set forth herein, the control unit 44 may be provided by almost any prior art process and in almost any prior art configuration. - The
control unit 44 may be disposed in various positions of the system 10. For example, the control unit 44 may be formed physically separate from the exercise or output modules 20, 50 or may be incorporated into such modules 20, 50. The control unit 44 may operatively couple with other parts of the system 10 by wire or wirelessly, depending on its disposition or structure. In addition, at least a portion of the control unit 44 may be incorporated in portable or wearable articles so that the users may provide the user inputs without stopping the exercises. At least a portion of the control unit 44 may be worn around the users to allow the control unit 44 to perform its functions. Other configurations, arrangements, and/or dispositions of the control unit 44 are the same as or similar to those of the overall control module 40 as long as the control unit 44 performs such functions. - Various units (including those units related to the communication or conversion) of the
control module 40 may operatively couple to each other in various modes. For example, each unit may couple with the rest thereof and receive or transmit various information carried along the electrical or optical signals by wire or wirelessly. Conversely, at least one of such units may couple with one or more but not all of such units, where, e.g., the storage unit 43 couples with the control unit 44 but not with the sensor unit 42. Accordingly, the detailed coupling modes of such units depend not only on an overall configuration of the control module 40 but also on the functions assigned to each unit. The control module 40 requires the control unit 44 but does not necessarily need the input, sensor, or storage units 41-43. The input unit 41 may be replaced by at least one exercise module 20 capable of receiving the user inputs and relaying such inputs (or command signals) to the control unit 44. The sensor unit 42 may not be needed when the control unit 44 itself monitors the variables or parameters, when the control unit 44 does not include a feedback control mechanism, and the like. The control module 40 may not need the storage unit 43 when the control unit 44 provides or generates the task and its features based on various information supplied thereto by the users or from an external source. By the same token, the control module 40 may include multiple units of the same or different types such that, e.g., the control module 40 may include two or more control units 44 performing different or redundant functions, or two or more sensor units 42 monitoring the same or different variables or parameters, each monitoring the same variable or parameter in different positions, and so on. At least a portion of the control module 40 may be disposed in the exercise or output modules 20, 50, may be separate from the rest of the control module 40, or may form a part of the exercise and/or output modules 20, 50. - In another aspect of the present invention, an exemplary exercise system may be embodied in various configurations and/or arrangements.
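The coupling options just described, a mandatory control unit with optional input, sensor, and storage units, can be pictured as a small object graph. The sketch below is illustrative only; the class and method names are hypothetical stand-ins and not part of the disclosure:

```python
class ControlModule:
    """Sketch of a control module whose control unit is required while the
    input, sensor, and storage units are optional (all names illustrative)."""

    def __init__(self, control_unit, input_unit=None, sensor_unit=None,
                 storage_unit=None):
        if control_unit is None:
            raise ValueError("the control unit is required")
        self.control_unit = control_unit
        self.input_unit = input_unit      # may be replaced by an exercise module
        self.sensor_unit = sensor_unit    # omitted when no feedback control is used
        self.storage_unit = storage_unit  # omitted when the task is generated on the fly

    def step(self):
        # The control unit pulls from whichever optional units are present.
        command = self.input_unit.read() if self.input_unit else None
        sensed = self.sensor_unit.read() if self.sensor_unit else None
        stored = self.storage_unit.load() if self.storage_unit else None
        return self.control_unit.update(command, sensed, stored)
```

Wiring only some units mirrors the text's point that the detailed coupling modes follow from which functions each unit is assigned.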
FIGS. 2A to 2F depict schematic diagrams of exemplary exercise systems incorporating therein a different number of modules. It is appreciated that not every module or unit of the system is included in the figures but that the modules and/or units set forth herein may be incorporated into such systems and perform the above functions as set forth hereinabove. - In one exemplary embodiment and as exemplified in
FIG. 2A, an exemplary exercise system 10 includes two exercise modules 20A, 20B, two control modules 40A, 40B, and two output modules 50A, 50B, where one control module 40A manipulates one exercise and output module 20A, 50A while the other control module 40B manipulates another exercise and output module 20B, 50B, and where such modules may be disposed in the same or different locations. Such modules may operatively couple with each other by wire or wirelessly and perform the functions set forth hereinabove. - In another exemplary embodiment and as exemplified in
FIG. 2B, an exemplary exercise system 10 has two exercise modules 20A, 20B and two output modules 50A, 50B but a single control module 40, where the exercise modules 20A, 20B couple with the output modules 50A, 50B, and where the single control module 40 may manipulate the exercise modules 20A, 20B and output modules 50A, 50B. In another exemplary embodiment and as exemplified in FIG. 2C, an exemplary exercise system 10 has two exercise modules 20A, 20B and two control modules 40A, 40B but a single output module 50, where the exercise modules 20A, 20B couple with the control modules 40A, 40B, and where the single output module 50 may provide the images for the exercise modules 20A, 20B as manipulated by the control modules 40A, 40B. - In another exemplary embodiment and as exemplified in
FIG. 2D, an exemplary exercise system 10 includes two exercise modules 20A, 20B, a single output module 50, and a single control module 40, where the exercise modules 20A, 20B couple with the single control module 40, where the single control module 40 may manipulate the exercise modules 20A, 20B and the output module 50, and where the single output module 50 may then display the images for the exercise modules 20A, 20B. - In another exemplary embodiment and as exemplified in
FIG. 2E, an exemplary exercise system 10 may include a single exercise module 20, a single output module 50, and a single control module 40. This system 10 cooperates with another exercise module or another system which is disposed in the same location and includes at least one exercise module, so that a single user may perform different exercises sequentially or multiple users may perform the same or different exercises simultaneously. In a related embodiment and as depicted in FIG. 2F, an exemplary exercise system 10 includes a single exercise module 20, a single control module 40, and a pair of output modules 50A, 50B. This system 10 preferably cooperates with another exercise module or another system which is disposed in the same location and includes at least one exercise module, such that a single user may perform different exercises sequentially or multiple users may perform the same or different exercises simultaneously. - In another aspect of the present invention, such exercise systems may be embodied in various configurations and/or arrangements.
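The six configurations of FIGS. 2A-2F differ only in how many exercise, control, and output modules each pairs together. Read from the descriptions above, they can be tabulated as data (a hedged reading; the dictionary and helper below are illustrative, not part of the disclosure):

```python
# Module counts (exercise, control, output) for each configuration,
# per the descriptions of FIGS. 2A-2F above.
CONFIGURATIONS = {
    "2A": {"exercise": 2, "control": 2, "output": 2},
    "2B": {"exercise": 2, "control": 1, "output": 2},
    "2C": {"exercise": 2, "control": 2, "output": 1},
    "2D": {"exercise": 2, "control": 1, "output": 1},
    "2E": {"exercise": 1, "control": 1, "output": 1},
    "2F": {"exercise": 1, "control": 1, "output": 2},
}

def shares_control(fig):
    """True when two or more exercise modules share a single control module."""
    c = CONFIGURATIONS[fig]
    return c["exercise"] > c["control"]
```

For example, FIGS. 2B and 2D are the configurations in which one control module 40 serves both exercise modules.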
FIGS. 3A and 3B are schematic perspective views of exemplary exercise systems each of which includes two exercise modules according to the present invention. It is appreciated that not every module and/or unit of the systems is shown in the figures but that such modules and/or their units described hereinabove may be incorporated into such systems and perform the above functions as set forth hereinabove. - In one exemplary embodiment of this aspect of the invention and as exemplified in
FIG. 3A, an exemplary exercise system 10 includes two exercise modules 20A, 20B and two output modules 50A, 50B, where each of the output modules 50A, 50B is disposed adjacent to one of the exercise modules 20A, 20B. - The
system 10 includes two exercise modules 20A, 20B, each of which is a treadmill having a frame 21 and a track 22, where each frame 21 defines a basic body of each exercise module 20A, 20B and movably retains each track 22, and where each frame 21 includes a stand 21S extending upward and bifurcating vertically while forming a pair of handles 21H which may assist the user to hold on thereto while engaging in running or walking with or on each exercise module 20A, 20B. Both frames 21 include the couplers 21C, functions of which are to be provided below. Such exercise modules 20A, 20B may include numerous other parts which are not shown in the figure but are commonly seen in the prior art treadmills. Such modules 20A, 20B translate the tracks 22A, 22B so that the users may walk or run on the tracks 22A, 22B at desirable speeds. - Each
output module 50A, 50B includes a visual unit 51 defining an image domain 51M and including an auditory unit 52, an olfactory unit 53, and a tactile unit 54. Each visual unit 51 is disposed in front of each exercise module 20A, 20B, and each visual unit 51 defines a flat image domain 51M. When desirable, the system 10 may include a single visual unit 51 defining multiple portions in its image domain 51M or, in the alternative, such a system 10 may have multiple visual units 51 defining multiple image domains 51M. Each visual unit 51 may be mounted on each exercise module 20A, 20B by the coupler 21C. The couplers 21C may support such visual units 51 in fixed positions or, alternatively, such couplers 21C may movably support the visual units 51. The visual units 51 may instead define an image domain 51M which is asymmetric vertically or horizontally, disposed preferentially to one side of each exercise module 20A, 20B, and the like. Each of such exercise modules 20A, 20B may include at least one auditory unit 52 which may be incorporated into each visual unit 51 or provided separately therefrom. Such auditory units 52 may be incorporated into the system 10 in different arrangements as well, as far as each auditory unit 52 may provide the sounds to the user of each exercise module 20A, 20B. The auditory units 52 may be disposed in any positions of the system 10 as long as such a disposition may not hinder normal operation of the visual and/or auditory units 51, 52. The auditory units 52 may also be provided in numbers different from those of such units 51 or exercise modules 20A, 20B, e.g., where the exercise modules 20A, 20B share the auditory units 52 of the output modules 50A, 50B. - The
output modules 50A, 50B may also include the olfactory units 53, where the olfactory units 53 may be arranged similarly to the auditory units 52 and may be disposed in any positions of the system 10 as long as such a disposition may not hinder normal operation thereof. It is further appreciated that the olfactory units 53 may be omitted from the output modules 50A, 50B when desirable. The output modules 50A, 50B may similarly include the tactile units 54, where the tactile units 54 may be incorporated into the handles 21H of each exercise module 20A, 20B so that the tactile units 54 may deliver tactile stimuli to the users. Such tactile units 54 may be disposed in other positions, may be omitted from the output modules 50A, 50B, and may be provided in various numbers. - The
output modules 50A, 50B may further include display units mounted on such stands 21S of each exercise module 20A, 20B, where the display units may display various features provided by the various modules of the system 10, and where each output module 50A, 50B may include its own display unit for each exercise module 20A, 20B. Other details of the control modules 40A, 40B of the system 10 have been set forth hereinabove. - In operation, multiple (two in this example) users perform the same or different exercises using multiple (two in this example)
exercise modules 20A, 20B. For example, a first user sets a first exercise module 20A or a first control module 40A in order to perform first exercise and to provide desired first images (or a first virtual environment) for the task. For example, the first user selects what kind of first images are generated by the first control module in a desired mode and provides the settings to those modules. The user then turns on the first exercise module 20A, translates its track 22A at a desirable speed, begins walking or running thereon, and so on. Concurrently with the first user, a second user may set a second exercise module 20B and perform second exercise which is identical to, similar to, or different from the first exercise, where the second user also performs the exercise of running or walking on the track 22B in this example. The second user also selects second images for the task, where the task for the second user is typically the same as that for the first user and where the second images may be identical or similar to the first images, may be such first images viewed in a different perspective, or may be different images of the same task. Such first and/or second control modules may monitor at least one feature of the first and second exercises, the first and second users, and/or operation of such first and second exercise modules 20A, 20B. - The
system 10 transfers the task (or another) feature between the first and second exercise modules 20A, 20B, e.g., from the first modules 20A, 40A to the second modules 20B, 40B or vice versa, so that such exercise modules 20A, 20B may share at least one of such features and so that the users of such exercise modules 20A, 20B may perform the common task. - In addition, the first and/or
second control modules 40 may preferably compare the features of the task performed by the first and second users, monitored features of such first and second users, monitored features of such first and second exercises, and/or monitored features of the operations of the first and second exercise modules 20A, 20B. Based on such comparison, the first and/or second control modules 40 may manipulate the task feature of such first and/or second users, may manipulate the operation feature of such first and second exercise modules 20A, 20B, and the like, thereby allowing the users of such exercise modules 20A, 20B to compete with each other in the common task. - In another exemplary embodiment of this aspect of the invention and as exemplified in
FIG. 3B, an exemplary exercise system 10 also includes two exercise modules 20A, 20B and two output modules 50A, 50B, where each of the output modules 50A, 50B is disposed adjacent to one of the exercise modules 20A, 20B. - The
system 10 includes two exercise modules 20A, 20B of FIG. 3B, a first of which is a treadmill similar to that of FIG. 3A and a second of which is conventional weight lift equipment. The second exercise module 20B includes a frame 21C, a pair of handles 21HC, a chair 23, and multiple weights 24, where such a frame 21C defines a basic body of the second exercise module 20B, where such weights 24 are selectively loaded and coupled to the handles 21HC, and where the second user sits on the chair 23 and engages in weight lifting by moving the handles 21HC. More particularly, the frame 21C encloses the chair 23 and movably retains a pair of handles 21HC disposed in locations accessible by a second user when sitting on the chair 23. Multiple weights 24 are stacked behind the chair 23 and arranged to be releasably loaded onto a connector (not shown in this figure) which mechanically couples with the handles 21HC. These handles 21HC are arranged to pivot about centers of rotation (not shown in the figure) and to be disposed at a shoulder level of the second user sitting on the chair 23. Similar to that of FIG. 3A, the second exercise module 20B may include numerous other parts not shown in the figure but commonly seen in the prior art lifting machines as well. Although not included in the figure, the exercise modules 20A, 20B operatively couple with the output modules 50A, 50B. - In operation, multiple (e.g., two in this example) users perform the same or different exercises on, with or against two
exercise modules 20A, 20B of the system 10. For example, a first user sets a first exercise module 20A and/or a first control module 40A to perform first exercise and to provide the first images (or sounds) for the task. The user turns on the first exercise module 20A, translates its track 22A at a preset speed, and begins walking or running thereover. A second user also sets a second exercise module 20B and performs second exercise of lifting the weights 24 by pivoting the handles 21HC. The second user selects second images (and/or second sounds) for the task similarly to that of FIG. 3A. The first and/or second control modules monitor at least one feature of the first and second exercises, first and second users, and/or operation of the first and second exercise modules 20A, 20B. - In addition, the first and/or
second control modules 40 may preferably compare the features of the task performed by the first and second users, monitored features of such first and second users, monitored features of such first and second exercises, and/or monitored features of the operations of the first and second exercise modules 20A, 20B. Based on such comparison, the first and/or second control modules 40 may manipulate the task feature of such first and/or second users, may manipulate the operation feature of such first and second exercise modules 20A, 20B, and the like. - In another aspect of the present invention, a simulating exercise system may simulate a user thereof into at least one simulated user defined in a task and manipulate various features of the user, task, exercise, and/or exercise module at least partly based on at least one feature of the same type or a different type.
FIG. 4A shows a schematic perspective view of an exemplary simulating exercise system including an exercise module and simulating a user of the exercise module into a simulated user which is defined in a task and which is arranged to compete against a preset program stored in and/or provided to the system according to the present invention. It is to be understood that not every module and/or unit of the simulating system may be shown in the figure but that those modules and/or units thereof described hereinabove as well as those of the co-pending Applications may be included in the system for performing various functions as set forth herein and in the co-pending Applications, respectively. It is also appreciated that any of the above units of the control and conversion modules may be incorporated into various exposed and/or hidden locations of the simulating exercise system. It is further appreciated that an upper panel of the figure represents the perspective view of the entire system, while a lower panel of the figure visually explains a preset task defined for a user who may be engaged in exercise also provided by the system. - An exemplary
simulating exercise system 10 is similar to that of FIG. 3A, except that the figure only focuses on one exercise module 20 disposed in one location. Accordingly, it is appreciated that the system 10 includes at least one other exercise module which is not shown in this figure but is disposed in a different location and on, with or against which another user performs the same, similar or different exercise. When desired, the user exercising on the exercise module 20 of this figure may be arranged to compete against a control module (not shown in the figure) of the system 10 in a common task of a story, scenery or game. - As described in the lower panel of
FIG. 4A, the control module (or game console) defines the preset task which is to be performed by the user. In this particular example, the task is similar to the prior art video game known as "Pac Man" or an equivalent thereof, which defines the image domain 51M on which several mobile and/or stationary objects and a stationary background are defined. More particularly, the task defines a two-dimensional background which consists of multiple rectangular blocks which are arranged in rows and columns and spaced away from each other while providing vertical and horizontal routes therebetween. The task defines a simulated user 81, multiple opposing users 82, and multiple credits 83, where the simulated user 81 may preferably be manipulated by the user and travel vertically and/or horizontally along the routes defined between the blocks of the background, while catching the credits 83 and avoiding encounters with the opposing users 82. In addition, the task may define a preset goal such as, e.g., catching all of such credits 83, surviving through the task for a preset period of time without being attacked by such opposing users 82, and the like. In general, the task may dispose the credits 83 in preset locations along such routes as defined by a preset program stored in the control module 40 (or game console), as selected based on the user inputs, as determined by at least one of such features of the user, task, exercise, and/or exercise module 20 (or operations thereof), and the like. Similarly, the task may dispose the opposing users 82 in preset locations along the routes and move the opposing users 82 along such routes in a preset manner which may be determined by a preset program stored in such a control module 40 (or game console), which may be selected based upon the user inputs, which may be determined by at least one of the above features of the user, task, exercise, and/or exercise module 20 (or operations thereof), and the like.
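The blocks-and-routes structure of this Pac Man-like task can be sketched compactly. The following is a toy model only; the grid, symbols, and function are hypothetical and not part of the disclosure. Here '#' cells stand for the background blocks, 'P' for the simulated user 81, 'G' for an opposing user 82, and '.' for a credit 83:

```python
# Hypothetical toy grid for the Pac Man-like task described above.
GRID = [
    "#####",
    "P..G#",
    "#.###",
    "#..G#",
    "#####",
]

def move_simulated_user(grid, pos, direction):
    """Try to move the simulated user one cell along the routes; report
    whether the move was blocked, caught a credit, or met an opposing user."""
    dr, dc = {"up": (-1, 0), "down": (1, 0),
              "left": (0, -1), "right": (0, 1)}[direction]
    r, c = pos[0] + dr, pos[1] + dc
    cell = grid[r][c]
    if cell == "#":          # the background blocks are impassable
        return pos, "blocked"
    if cell == "G":          # encountering an opposing user
        return (r, c), "caught"
    if cell == ".":          # catching a credit
        return (r, c), "credit"
    return (r, c), "moved"
```

In the simulating exercise system, the `direction` argument would come not from a keypad but from the user's exercise, as described in the examples below.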
- The task may be arranged to define multiple stages each of which may be provided to the user in a preset sequence which may be decided at least partly based on a preset program of the
control module 40 or game console, based on the control and/or converted signals, based on the user inputs provided by the user of the exercise module 20 of the same or different system 10, and the like. Each of such stages may also define identical, similar or different objects and/or background therein, may define levels or difficulties of different extents by varying characteristics of the opposing users 82 or credits 83, and the like. The task may also assign various thresholds to those stages such that the simulated user may proceed from one stage to the next one when the simulated user accomplishes a preset goal in that stage. - The control and/or
conversion modules 40, 70 may then be arranged to manipulate at least one of various features of the above task such that the simulated user 81 defined in the image domain 51M may proceed through such stages of the task at least partly based on at least one of the features of the user, exercise, and/or exercise module 20 (i.e., various operations thereof). Therefore, the user of the exercise module 20 may manipulate the simulated user 81 of the task to proceed through such stages of the task while performing exercise thereon, therewith, and/or thereagainst. Alternatively, such control and/or conversion modules 40, 70 may be arranged to manipulate at least one of various features of the exercise and/or exercise module 20 such that the control module 40 may manipulate at least one operation of the exercise module 20 and/or exercise provided by such a module 20 at least partly based on at least one of such features of the task. Accordingly, the user of the control module 40 (or game console) may manipulate the simulated user 81 of the task to proceed through the stages of the task while performing the exercise of which features are determined at least partly based upon at least one of such features of the task. - Other than the Pac Man game exemplified herein, the control module 40 (or game console) may be arranged to provide the user with different audiovisual games each of which may define the above or similar features, while requiring the user to resort to specific means of accomplishing the task goal such as, e.g., by fighting an opposing user manipulated by a preset program and/or another user, by proceeding against opposing users manipulated by the preset program or another user, by arriving at a preset stage or a preset location thereof without or against manipulation by such a preset program or another user, by finding a hidden object without or against such manipulation, and the like.
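Each of the two directions of manipulation described above reduces to a mapping between a monitored feature and a manipulated one. As one hedged illustration of the second direction, manipulating an operation of the exercise module based on a task feature, a virtual-terrain gradient in the task could set the module's resistive load. The function and every constant below are illustrative assumptions, not values from the disclosure:

```python
def load_from_task(gradient_pct, base_load=50.0, load_per_pct=5.0,
                   max_load=200.0):
    """Map a task feature (virtual-terrain gradient, in percent) to an
    operation feature of the exercise module (its resistive load).
    All constants are illustrative and would be calibrated in practice."""
    load = base_load + load_per_pct * gradient_pct
    # Clamp to the module's physically realizable range.
    return min(max_load, max(0.0, load))
```

The first direction would be the mirror image, e.g., mapping a monitored track speed to the speed of the simulated user 81 in the image domain 51M.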
Details of the task provided by the control module 40 (and/or game console), however, may not necessarily be critical to the scope of the present invention, as long as the user may manipulate various task features defined in the
image domain 51M at least partly based on such features of the user, exercise, and/or exercise module 20 while performing such exercise on, with, and/or against the exercise module 20, as long as the exercise module 20 may change at least one of its operations at least partly based on such features of the user, task, and/or exercise offered by such a module 20, and the like. - Depending on the nature of the task, the control module 40 (or game console) may manipulate only a single preset feature or multiple features of the simulated user. For example, the
control module 40 may convert one or more features of the user and/or exercise into such control and/or converted signals and move the simulated user 81 in a single or multiple directions at a preset or variable speed, either directly or in conjunction with the conversion module 70 to incorporate the equivalence between different features of the same or different types. As is manifest in the figure, however, it is preferred that the control module 40 manipulate such a simulated user 81 to move in different directions and/or at different speeds. To this end, the control module 40 may utilize its input unit 41 to receive various user inputs capable of manipulating at least one feature of the simulated user 81 in the image domain 51M. For example, the input unit 41 may be disposed in and/or operatively couple with the track 22 of the exercise module 20, monitor a force applied thereto by the user, a direction of a movement of the user, and/or an acceleration thereof, extract an intended user input therefrom, and manipulate at least one feature of the simulated user 81 at least partly based thereupon. It is appreciated in this example that the user may not only perform the exercise but also provide the intended user input while walking or running on such a track 22 by manipulating various features of his or her exercise, and that a single part of the system 10 may not only receive the energy associated with the exercise but also receive the user input for manipulating the simulated user 81. In another example, such an input unit 41 may be disposed on and/or operatively couple to the handle 21H of the exercise module 20, monitor a force and/or torque applied thereto by the user, extract the intended user input therefrom, and manipulate at least one feature of the simulated user 81 at least partly based thereon.
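The extraction of an intended user input from track-mounted sensors can be pictured with a short sketch. The specification describes this only abstractly, so everything below (function name, field names, and the threshold values) is an illustrative assumption, not the patented method:

```python
# Illustrative sketch only: the patent specifies no algorithm, so the
# thresholds and names here (lateral_threshold_n, sprint_accel_mps2)
# are assumptions chosen for the example.

def extract_user_input(lateral_force_n, belt_speed_mps, accel_mps2,
                       lateral_threshold_n=50.0, sprint_accel_mps2=1.5):
    """Map raw track-sensor readings to a command for the simulated user.

    A sustained side force on the track is read as steering intent,
    and a burst of acceleration is read as a request for faster
    in-game movement.
    """
    command = {"direction": "forward", "speed": belt_speed_mps}
    if lateral_force_n > lateral_threshold_n:
        command["direction"] = "right"
    elif lateral_force_n < -lateral_threshold_n:
        command["direction"] = "left"
    if accel_mps2 > sprint_accel_mps2:
        command["speed"] = belt_speed_mps * 2  # sprint burst in the task
    return command
```

Under this sketch, a single sensing surface serves both roles described above: the belt speed carries the exercise energy, while deviations in force and acceleration carry the user input.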
In another example, the input unit 41 may be portably carried by and/or disposed on the user, monitor or receive the user input, and manipulate at least one feature of the simulated user 81 at least partly based thereon. It is appreciated in the foregoing examples that the system 10 includes at least two parts, i.e., at least one major part for receiving the energy from the user for such exercise and at least one minor part for receiving the user input for manipulating the simulated user 81 in the task, that the track 22 of the exercise module 20 may function as the major part in this embodiment, and that the input unit 41 of the control module 40 may function as the minor part herein. - It is to be understood that the major and minor parts may be provided in various arrangements. For example, the major part may be arranged to receive energy from the user while acquiring the user input at least partly based on a direction of an input force related to and/or associated with such energy, a velocity thereof, an acceleration thereof, a duration thereof, and the like, where the energy supplied to such a minor part and associated with the user input may correspond to at most 5%, 10%, 15%, 20%, 25%, 30%, 35%, 40%, 45%, 50%, 55%, 60%, 65%, 70%, 75%, 80%, 85%, or 90% of another energy supplied to the above major part. The major part may have a configuration and/or may be incorporated into locations capable of contacting a foot or feet of the user, a leg or legs thereof, a thigh or thighs thereof, a back thereof, a belly thereof, a side or sides thereof, a finger or fingers thereof, a hand or hands thereof, an arm or arms thereof, a shoulder or shoulders thereof, a head thereof, and/or a neck thereof.
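The major/minor division above rests on an energy comparison: the major part takes in the larger share of the user's energy, while a minor part stays within some fraction of it. The enumerated percentages come from the text, but the function below is only a hypothetical sketch (its name and structure are not in the specification):

```python
# Hypothetical sketch of the major/minor-part distinction; the patent
# enumerates energy fractions (5% .. 90%) but no algorithm, so this
# classification rule is an assumption for illustration.

def classify_parts(energy_by_part, max_minor_fraction=0.90):
    """Pick the major part (largest energy intake) and list the parts
    whose energy stays within the allowed fraction of the major's."""
    major = max(energy_by_part, key=energy_by_part.get)
    major_energy = energy_by_part[major]
    minors = [name for name, e in energy_by_part.items()
              if name != major and e <= max_minor_fraction * major_energy]
    return major, minors

# e.g. a treadmill track (major) vs. a handle-mounted input unit (minor)
```

With a 500 J intake at the track and 20 J at a handle-mounted input unit, the track is classified as the major part and the input unit as a minor part, matching the embodiment described above.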
The minor part may similarly define a configuration and/or may be incorporated into locations capable of contacting a foot or feet of the user, a leg or legs thereof, a thigh or thighs thereof, a back thereof, a belly thereof, a side or sides thereof, a hand or hands thereof, an arm or arms thereof, a neck thereof, a shoulder or shoulders thereof, a head thereof, a finger or fingers thereof, and the like. The major part may be designed and/or disposed to contact a first body part which may be capable of providing more energy than another energy capable of being provided by a second body part to said minor part. The major and minor parts may be designed and/or disposed to respectively allow the first and second body parts to move and/or to be depressed in the same or different directions. The major and minor parts may be designed and/or disposed to respectively contact such first and second body parts which are spaced away from each other by at least 1 cm, 2 cm, 3 cm, 4 cm, 5 cm, 7 cm, 10 cm, and the like. The minor part may be disposed away from the major part, around the major part, inside the major part, or side by side with respect to the major part, may be disposed at an elevation higher than that of the major part, may be recessed below the major part, may be flush with the major part, and the like. Therefore, the user may provide various input signals to move the
simulated user 81 along a desired direction, to perform a preset function, to cope with the opposing user 82, and the like. - In operation, the user supplies various settings for desired exercise in which the user intends to be engaged and for a desired task or, more particularly, the user selects what kinds of features of a task are to be provided by the
system 10 in a desired mode and provides settings thereof. Thereafter, the user turns on the exercise module 20, moves the track 22 at a desired speed, and begins walking and/or running on the track 22. As the user begins the exercise, the output module provides the user with such images, sounds, and/or virtual environment with intended features. For example, the output module may provide the user with the visual feature of the virtual environment by displaying desired images and/or the visual feature of the environment on the image domain 51M of its visual unit 51, where the images may be a still picture, a series of still pictures, a video clip, a combination thereof, and the like. It is preferred, however, that the output module manipulate the visual unit 51 to display such images and/or visual feature related to and/or associated with a desired type and/or extent of exercise and/or task selected by the user. When commanded by the user, the output module may provide such sounds and/or virtual environment with the auditory feature, where it is preferred that such sounds or auditory feature be related to or associated with the type and/or extent of the exercise and/or task, the images and/or visual feature displayed on the image domain 51M, and the like. The output module may also provide the virtual environment with the olfactory feature by giving off various smells related to and/or associated with such images, sounds, and/or virtual environment so that the user is provided with not only the images and sounds but also the smells related thereto or associated therewith. Upon being instructed, the output module may provide the virtual environment with the tactile feature by generating various sensations.
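The multi-sensory output just described (visual, auditory, olfactory, tactile) can be pictured as a dispatcher that forwards only the features the user has enabled. This is purely an illustrative sketch; the class, its methods, and the sample payloads are assumptions, not part of the specification:

```python
# Illustrative only: the specification names visual, auditory,
# olfactory, and tactile features but defines no concrete interface,
# so this dispatcher is an invented sketch.

class OutputModule:
    """Forward enabled virtual-environment features to their units."""

    def __init__(self, enabled=("visual",)):
        # Visual output is always expected; others are opt-in,
        # mirroring the "when commanded by the user" language above.
        self.enabled = set(enabled)

    def render(self, environment):
        # Keep only the features of the environment the user asked for,
        # e.g. images for the visual unit 51, sounds when commanded.
        return {feature: payload
                for feature, payload in environment.items()
                if feature in self.enabled}

env = {"visual": "track-side scenery", "auditory": "footsteps",
       "olfactory": "pine smell"}
out = OutputModule(enabled=("visual", "auditory")).render(env)
```

A default instance passes through only the visual feature, while a fully enabled one would forward every feature of the virtual environment.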
It is preferred that the output module provide the above features of the virtual environment in the preset perspective with respect to the user as described above and that the output module may vary at least one of the features and/or at least one of temporal and/or spatial characteristics of at least one of such features during the exercise based on various factors which have also been disclosed above. - Depending on a preset mode of operation, the
control module 40 acquires at least one feature of the user, task, exercise, and/or exercise module 20 (i.e., operations thereof) using its input and/or sensor units, converts such features into the simulated user 81, and then displays the simulated user 81 on the image domain 51M of the output unit. The conversion module 70 (with its driver and/or interface units 73, 74) and/or control module 40 (using its control unit 44) may manipulate at least one feature of the simulated user 81 in order to allow the simulated user 81 to proceed through the preset stage of the task for the goal of the task by, e.g., fighting against or avoiding the opposing users 82, collecting such credits 83, and the like. More importantly, such control and/or conversion modules 40, 70 may manipulate at least one of multiple features of the simulated user 81 at least partly based upon at least one of multiple features of the user, exercise, and/or exercise module 20, whereby the user may manipulate various features of the simulated user 81 at least partly based upon various features of the exercise which he or she performs on, with, and/or against the exercise module 20. Conversely, the control and/or conversion modules 40, 70 may manipulate at least one of such features of at least one operation of the exercise module 20 at least partly based upon at least one of such features of the user, task, and/or exercise. Accordingly, the user may perform the exercise which is in turn provided by the exercise module 20 and of which the features may be at least partly dependent upon various user features such as his or her physical or physiological conditions, upon various task features such as a status of the simulated user 81 in the task, and/or upon various exercise features such as the type and/or extent thereof. In each of the examples, various task features may also be manipulated in various modes. For example, the control module 40 may manipulate the task feature when such a module 40 is to provide the task.
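The two-way coupling described above, where exercise features drive the simulated user 81 and task features in turn adjust the exercise module 20, can be sketched as a pair of mappings. The specific scaling rules and field names below are invented for illustration; the specification describes only the abstract coupling:

```python
# Hypothetical two-way mapping between exercise features and task
# features; the speed scaling and incline rule are assumptions made
# for this sketch, not the patented conversion.

def exercise_to_task(exercise):
    """Drive the simulated user 81 from the measured exercise."""
    return {"avatar_speed": exercise["belt_speed"] * 1.2,
            "stamina": max(0.0, 1.0 - exercise["heart_rate"] / 200.0)}

def task_to_exercise(task):
    """Conversely, let the task state adjust the exercise module 20."""
    # Harder stages push back with a steeper incline on the track.
    return {"incline_pct": 2.0 * task["stage"],
            "belt_speed": task.get("chase_speed", 1.5)}

state = exercise_to_task({"belt_speed": 2.0, "heart_rate": 120.0})
settings = task_to_exercise({"stage": 3})
```

Running both directions each frame closes the loop: the harder the user works, the faster the avatar moves, and the further the task advances, the harder the exercise module pushes back.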
In another example, the conversion module 70 may manipulate the task feature when the system 10 is operatively coupled to the game console which may be a part of the system 10 or a console external thereto. In the latter case, the system 10 may be arranged to directly manipulate the task feature or, in the alternative, may generate and then supply the control and/or converted signals with which such a game console may manipulate the task feature. - Such a
system 10 may be arranged to allow the user to manipulate at least one feature of the user, task, exercise, and/or exercise module 20 (i.e., operations thereof) either directly or indirectly. In one example, the user may directly supply the user input to the system 10 which may then change at least one feature of the task at least partly based thereon. In another example, the user may supply the user input to the exercise module 20 which may then deliver the control and/or converted signals to the control and/or conversion modules 40, 70 for manipulating the task feature at least partly based thereon. The system 10 may instead be arranged to allow the user to change the task feature at least partly based upon at least one feature of the user, task, exercise, and/or exercise module 20 with or without accompanying changes in the operations of the exercise module 20. Accordingly, the system 10 may adaptively change at least one feature of the task, exercise, and/or operations of the exercise module 20 based on at least one of such features acquired thereby. - In another aspect of the present invention, a simulating exercise system may simulate a user thereof into at least one simulated user defined in a task and manipulate various features of the user, task, exercise, and/or exercise module at least partly based on at least one feature of the same type or a different type.
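The direct and indirect input paths described above can be sketched as two routes to the same task-feature update: the user input either reaches the system itself, or passes through the exercise module, which converts it first. All class and method names below are assumptions made for illustration:

```python
# Sketch of the direct vs. indirect input paths; this object model is
# invented for the example and is not defined in the specification.

class System10:
    def __init__(self):
        self.task_features = {"difficulty": 1}

    def apply_user_input(self, user_input):
        """Direct path: the user input reaches the system itself."""
        self.task_features.update(user_input)

class ExerciseModule20:
    """Indirect path: the exercise module converts the raw input into
    a control/converted signal and forwards it to the system."""

    def __init__(self, system):
        self.system = system

    def receive(self, raw_input):
        converted = {"difficulty": raw_input["effort_level"]}
        self.system.apply_user_input(converted)

sys10 = System10()
ExerciseModule20(sys10).receive({"effort_level": 4})
```

Either route ends in the same `apply_user_input` call, which is the sense in which the manipulation may be "direct or indirect" while the task feature changes identically.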
FIG. 4B shows a schematic perspective view of an exemplary simulating exercise system including an exercise module and simulating a user of such a module as a simulated user of a task playing a board game against another simulated user of another exercise module according to the present invention. It is appreciated that not every module and/or unit of the simulating system may be incorporated into the figure but that those modules and/or units thereof described hereinabove as well as those of the co-pending Applications may be incorporated into the system so as to perform various functions as set forth herein and in the co-pending Applications, respectively. It is further appreciated that any of the above units of the control and conversion modules may also be incorporated into various exposed and/or hidden locations of the simulating exercise system. It is further appreciated that an upper panel of the figure represents the perspective view of the entire system, while a lower panel of the figure visually explains a preset task defined for a user engaged in exercise which is provided by the system as well. - An exemplary
simulating exercise system 10 is similar to that of FIG. 3B, except that the figure only focuses on one exercise module 20 disposed in one location. Accordingly, it is appreciated that the system 10 includes at least one other exercise module which is not shown in this figure but is disposed in a different location and on, with, or against which another user performs the same, similar, or different exercise. When desired, the user exercising on the exercise module 20 of this figure may be arranged to compete against a control module (not shown in the figure) of the system 10 in a common task of a story, scenery, or game. - As described in the lower panel of
FIG. 4B, the control module (or game console) defines the preset task which is to be performed by the user. In this particular example, the task is similar to the prior art “go” game or an equivalent thereof which defines the image domain 51M on which nineteen horizontal and nineteen vertical lines intersect each other. More particularly, such a task defines a two-dimensional background which consists of multiple points of intersection of such lines arranged in a 19-by-19 matrix. The task defines multiple simulated users 81 and multiple opposing users 82, where the simulated users 81 may be represented by one of black and white marbles, whereas the opposing users 82 may be represented by the other of such marbles. Such simulated users 81 may preferably be manipulated by the user and disposed in any of the above intersections of the background, while competing against the opposing users 82 according to preset rules of the task such as, e.g., creating as much territory as possible while surrounding and capturing such opposing users 82. In general, the task may decide where the opposing users 82 are to be disposed in response to positioning of the simulated users 81 based on a preset program stored in the control module 40 (or game console), as selected based upon the user inputs, as determined by at least one of such features of the user, task, exercise, and/or exercise module 20 (or operations thereof), and the like. - The task may define multiple stages each provided to the user in a preset sequence which may be decided at least partly based on a preset program of the
control module 40 or game console, based on the control and/or converted signals, based on the user inputs provided by the user of the exercise module 20 of the same or different system 10, and the like. Each of such stages may define levels or difficulties of different extents by varying skills of such opposing users 82. - The control and/or
conversion modules 40, 70 may then be arranged to manipulate at least one of various features of the above task such that the simulated user 81 defined in the image domain 51M may be positioned at least partly based upon at least one of the features of the user, exercise, and/or exercise module 20 (i.e., various operations thereof). Therefore, the user of the exercise module 20 may manipulate the simulated users 81 of the task to be disposed in preferable positions of the image domain 51M of such a task while performing exercise thereon, therewith, and/or thereagainst. In the alternative, the control and/or conversion modules 40, 70 may be arranged to manipulate at least one of the features of the exercise and/or exercise module 20 so that the control module 40 may control at least one operation of the exercise module 20 and/or exercise provided by such a module 20 at least partly based on at least one of such features of the task. Accordingly, the user of the control module 40 (or game console) may manipulate the simulated users 81 while performing the exercise of which features are determined at least partly based upon at least one of such features of the task. - Other than the go game exemplified herein, the
control module 40 and/or game console may be arranged to provide the user with different board and/or card games each of which may define similar or different features, while requiring the user to resort to specific means of accomplishing the goal of the task such as, e.g., by positioning multiple simulated users while competing against opposing users, by moving one or more simulated users against one or more opposing users, by collecting preferable simulated users (or cards) from a given set of cards, and the like. Details of the task provided by the control module 40 (and/or game console), however, may not necessarily be critical to the scope of the present invention, as long as the user may manipulate such task features defined in the image domain 51M at least partly based upon such features of the user, exercise, and/or exercise module 20 while performing such exercise on, with, or against the exercise module 20, as long as the exercise module 20 may change at least one of its operations at least partly based on such features of the user, task, and/or exercise offered by such a module 20, and the like. - In operation, the user supplies various settings for desired exercise in which the user intends to be engaged and for a desired task or, more particularly, the user selects what kinds of features of a task are to be provided by the
system 10 in a desired mode and provides settings thereof. Thereafter, the user couples a desired number of the weights 24 with the handle 21H, sits on the chair 23 of the exercise module 20, grabs the handle 21H, and then begins lifting the weights 24 by pivoting and/or reciprocating the handle 21H. As the user begins the exercise, the output module provides the user with such images, sounds, and/or virtual environment with intended features, while providing the user with the images, sounds, and/or virtual environment as disclosed in conjunction with FIG. 4A. - Depending on a preset mode of operation, the
control module 40 acquires at least one feature of the user, task, exercise, and/or exercise module 20 (i.e., operations thereof) using its input and/or sensor units, converts such features into the simulated user 81, and then displays the simulated user 81 on the image domain 51M of the output unit. The conversion module 70 (with its driver and/or interface units 73, 74) and/or control module 40 (using its control unit 44) may manipulate at least one feature of the simulated user 81 in order to allow the simulated user 81 to proceed through the preset stage of the task for the goal of the task by, e.g., fighting against or avoiding the opposing users 82, collecting such credits 83, and the like. More importantly, such control and/or conversion modules 40, 70 may manipulate at least one of multiple features of the simulated user 81 at least partly based upon at least one of multiple features of the user, exercise, and/or exercise module 20, whereby the user may manipulate various features of the simulated user 81 at least partly based upon various features of the exercise which he or she performs on, with, and/or against the exercise module 20. Conversely, the control and/or conversion modules 40, 70 may manipulate at least one of such features of at least one operation of the exercise module 20 at least partly based upon at least one of such features of the user, task, and/or exercise. Accordingly, the user may perform the exercise which is in turn provided by the exercise module 20 and of which the features may be at least partly dependent upon various user features such as his or her physical or physiological conditions, upon various task features such as a status of the simulated user 81 in the task, and/or upon various exercise features such as the type and/or extent thereof. In each of the examples, various task features may also be manipulated in various modes. For example, the control module 40 may manipulate the task feature when such a module 40 is to provide the task.
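The 19-by-19 “go” task described for FIG. 4B can be represented minimally as a matrix of intersections holding the black and white marbles. The sketch below is an illustrative assumption only; it encodes stone placement but none of the capture or territory rules of the task:

```python
# Minimal representation of the go-like task of FIG. 4B: a 19-by-19
# matrix of intersections holding the simulated users 81 (one color)
# and opposing users 82 (the other). Rule handling is omitted.

EMPTY, BLACK, WHITE = ".", "B", "W"

def new_board(size=19):
    return [[EMPTY] * size for _ in range(size)]

def place(board, row, col, stone):
    """Dispose a simulated or opposing user at a free intersection."""
    if board[row][col] != EMPTY:
        raise ValueError("intersection already occupied")
    board[row][col] = stone

board = new_board()
place(board, 3, 3, BLACK)    # the user's simulated user 81
place(board, 15, 15, WHITE)  # an opposing user 82 chosen by the task
```

In the arrangement described above, the row/column chosen for each `place` call would come from the user input or from a converted exercise feature, while the opposing moves would come from the preset program of the control module 40 or game console.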
In another example, the conversion module 70 may manipulate the task feature when the system 10 is operatively coupled to the game console which may be a part of the system 10 or a console external thereto. In the latter case, the system 10 may be arranged to directly manipulate the task feature or, in the alternative, may generate and then supply the control and/or converted signals with which such a game console may manipulate the task feature. Other configurational or operational characteristics of the exercise system 10 of FIG. 4B are similar or identical to those of the system of FIG. 4A. - Configurational and/or operational variations and/or modifications of the above embodiments of various exemplary exercise systems, their modules, or units shown in
FIG. 1, FIGS. 2A to 2F, FIGS. 3A and 3B, and FIGS. 4A and 4B also fall within the scope of this invention. - As shown in conjunction with
FIG. 1, the system typically consists of three types of modules, where the control module includes four units, while the output module includes five units. However, the control and/or output modules may not necessarily include all of the units. Accordingly, the control module may include only the input and control units, whereas the output module may include only the visual unit. In other words, the exact number of those units of the control and output modules may not be critical to the scope of this invention as far as each module performs its intended functions. Similarly, various units of such modules may be deemed to belong to other modules different from those set forth in FIG. 1. For example, the control unit may belong to the output module, while the storage unit may belong to the exercise modules. At least a portion of at least one of the visual, auditory, and/or display units of the output module may be incorporated into the control module. In other words, classifications of such units are not critical to the scope of this invention as far as each unit performs its intended function. - By the same token, such exercise modules may not necessarily be included in the system of this invention. To the contrary, the system may be deemed to include the control and output modules, where the exercise modules may be the external equipment to which the system is operatively coupled. The same applies to other auxiliary modules needed for various operations of the system. For example, a power supply module may be required to power such a system, where the power supply module may or may not be deemed to be a module of the system. The system may also require at least one support which may physically retain various modules, where the support may or may not be deemed as a part thereof. That is, the exercise system requires the control and output modules, the control module requires the control unit, the output module requires the visual unit, and so on.
Other units of the control and output modules set forth in this description, accordingly, may be deemed as optional units of the system.
- The output module may operatively couple to multiple exercise modules and generate the same or different images (and/or sounds) for multiple users simultaneously performing the exercises. Such a control module may perform the same control function for each exercise module while providing the same or different images (or sounds) for the user(s) or, alternatively, may perform at least one different control function for each exercise module. When desirable, the control module may operatively couple to multiple exercise modules of different systems which are disposed in a single location, where this control module is then deemed as a “common” module for multiple exercise systems.
- Each visual unit preferably defines at least one image domain for displaying the images for the task of the story, scenery or game thereon. Such a visual unit may utilize an entire portion of its image domain for displaying the images. When desirable, at least one visual unit may define multiple portions in the image domain, where such portions may define the same or different shapes and/or sizes, may be arranged in rows and/or columns, may be disposed symmetrically or asymmetrically, and the like.
- The visual units may be arranged to manipulate the configurations or dispositions of the images and/or their domains. For example, at least one visual unit may form the image domain, define a preset number of portions therein, and change a shape or size of at least one of the portions while changing shapes or sizes of the rest thereof or, alternatively, while maintaining the shapes or sizes of the rest thereof. At least one visual unit may also define the image domain, define a preset number of portions therein, and change a disposition of at least one of such portions while maintaining dispositions of the rest. At least one visual unit may define the image domain, form a preset number of portions therein while assigning the object and/or background thereto, and change assignments of at least one of the portions so that, e.g., one portion assigned with a single object may be assigned with the background, another object, a combination thereof, and so on. In addition, at least one visual unit may be arranged to change the number of portions defined in the image domain during exercise. At least one visual unit may be arranged to assign the object and background to such portions of the image domain based on various arrangements. For example, the visual unit may assign only one of the object and background into each portion so that the object may be assigned to one of such portions, while the background or another object may be assigned to another thereof. In an opposite example, the visual unit may assign the object and background to one or both of the portions.
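The portion manipulations described above (changing a portion's shape, size, disposition, or assignment while leaving the rest intact) can be sketched with a small layout model. The patent describes these operations abstractly, so the rectangle model and every name below are assumptions for illustration:

```python
# Illustrative sketch of an image domain 51M split into portions; the
# rectangular layout and the "assignment" field are invented for this
# example, not taken from the specification.

class Portion:
    def __init__(self, x, y, w, h, assignment="background"):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.assignment = assignment  # "background" or an object name

class ImageDomain:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.portions = []

    def add_portion(self, portion):
        self.portions.append(portion)

    def reassign(self, index, assignment):
        """Change what one portion displays, leaving the rest intact."""
        self.portions[index].assignment = assignment

domain = ImageDomain(1920, 1080)
domain.add_portion(Portion(0, 0, 960, 1080, "simulated user 81"))
domain.add_portion(Portion(960, 0, 960, 1080))
domain.reassign(1, "opposing user 82")
```

Resizing or moving a portion would, under this model, simply rewrite its `x, y, w, h` fields, and changing the portion count mid-exercise would add or drop entries from `portions`.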
- The visual and/or control units may further be arranged to manipulate the configurations and/or dispositions of the image domain as well as its portions. For example, such unit(s) may first form the image domain, define a preset number of portions therein, and then change a shape and/or size of at least one of such portions while changing shapes and/or sizes of the rest of such portions or, in the alternative, while maintaining the shapes and/or sizes of the rest thereof. Such unit(s) may also first form the image domain, define a preset number of portions therein, and then change a disposition of at least one of the portions while maintaining dispositions of the rest of such portions. Such unit(s) may also first form the image domain, define a preset number of portions therein while assigning the object and/or background thereto, and thereafter change such assignment for at least one of such portions such that, e.g., one portion assigned with a single object may then be assigned with the background, another object, a combination thereof, and the like. In addition, such unit(s) may also be arranged to change the number of portions defined in the image domain during exercise.
- The exercise system may also include at least one input unit capable of receiving various user inputs supplied thereto by the user(s). The system may also receive such user inputs by the exercise or control modules by including various prior art input devices as set forth hereinabove. Such user(s) may supply the user inputs by applying mechanical, thermal, and/or electric signals to various parts of the exercise and/or output modules, by generating body movements which may be monitored by such parts of the exercise, control, and/or output modules, by generating voice signals, face signals, and/or body signals which may similarly be monitored by those parts of such exercise, control, and/or output modules, and the like. The system may then utilize such user inputs for manipulating the task feature, whether or not the body movements of the user(s) required for generating such user inputs may be necessary or commensurate for consuming the energy of the exercising user(s). The control module may further extract the user inputs by monitoring and analyzing the body movements of the user(s), voices thereof, facial expressions thereof, and/or other body signals.
- The task of the story, scenery, and/or game may be provided in such images using the control module alone, by the external story, scenery, or game console alone, by both, and the like, where the control module may provide the images and optional sounds, smells, and/or sensations for the task to the output module for providing such features of the virtual environment. The control module may also manipulate at least one operation feature of at least one of the exercise modules at least partly based on the exercise and/or user features.
- Unless otherwise specified, various features of one embodiment of one aspect of the present invention may apply interchangeably to other embodiments of the same aspect of this invention and/or embodiments of one or more different aspects of the present invention. Accordingly, any module of the system may be equipped with communication capabilities in order to communicate with at least one other module thereof, as long as the part of the system disposed in each of such locations may have such capabilities. In other words, such communication may not have to be performed solely by the control module and may rather be performed by the exercise and/or
output modules. - As described hereinabove, various systems, methods, and/or processes of this invention may be applied to any prior art exercise equipment. For example, such systems, methods, and processes may be applied to the exercise equipment normally requiring its user(s) to perform such physical work thereon or thereagainst. In another example, such systems, methods, and processes may be applied to the exercise equipment providing the user(s) physical and/or electrical energy while forcing and/or facilitating the body of the user(s) to vibrate or twitch the muscles thereof based thereon. In another example, the systems, methods, or processes of this invention may be applied to convert any prior art equipment not intended as the exercise system of this invention. Therefore, any conventional devices primarily intended to engage the user(s) in playing physically simulated games or video games may be converted to the exercise system of this invention which may improve or enhance the muscle tone of the user(s), increase the muscle mass or volume thereof, force and/or facilitate the user to reduce weight, increase the physical stamina of the user, and the like.
- It is to be understood that, while various aspects and embodiments of the present invention have been described in conjunction with the detailed description thereof, the foregoing description is intended to illustrate and not to limit the scope of the invention, which is defined by the scope of the appended claims. Other embodiments, aspects, advantages, and modifications are within the scope of the following claims.
Claims (20)
1. An exercise system which is configured to provide at least two users with at least one preset task of at least one of a story, a scenery, a video game, and a computer game each of which defines a preset task goal and is provided in images of at least one virtual environment, to allow said users to simultaneously perform exercises, to relate at least one first feature of one of said exercises with at least one second feature of another of said exercises, and to one of directly and indirectly manipulate said second feature at least partly based upon said first feature comprising:
a first standard exercise module corresponding to at least one of a first, second, third, fourth, fifth, sixth, seventh, eighth, and ninth exercise module each configured to allow a first user to perform a first exercise at least one of thereon, therewith, and thereagainst while consuming energy thereof during said first exercise, wherein said first exercise module is configured to allow said first user to perform said first exercise at least one of on, with, and against at least one portion thereof, wherein said second exercise module is configured to define at least one preset load, to incorporate therein at least one actuating part operatively coupling to said load and contacting at least one body part of said first user, and to allow said user to perform said first exercise by contacting said actuating part and moving said actuating part against said load while consuming said energy during said first exercise, wherein said third exercise module is configured to include therein at least one track translating along a preset direction and to allow said first user to perform said first exercise of at least one of walking and running on said track while consuming said energy during said first exercise, wherein said fourth exercise module is configured to include at least one rotation axis, to define at least one preset load, to include therein at least one pedal coupling with said load and rotating about said axis, and to allow said first user to perform said first exercise of rotating said pedal against said load while consuming said energy during said first exercise, wherein said fifth exercise module is configured to include therein at least one movable weight, and to allow said first user to perform said first exercise of at least one of translating, reciprocating, pivoting, rotating, and moving said weight while consuming said energy during said first exercise, wherein said sixth exercise module is configured to define at least one central point, to define at 
least one preset load, to include therein at least one lever coupling with said load and pivoting about said point, and to allow said first user to perform said first exercise of at least one of reciprocating, translating, pivoting, rotating, displacing, and moving said lever about said point against said load while consuming said energy during said first exercise, wherein said seventh exercise module is configured to incorporate therein at least one belt capable of enclosing at least one body part of said first user therearound, and to allow said first user to perform said first exercise of vibrating said body part while consuming said energy during said first exercise, wherein said eighth exercise module is configured to define a preset load, to include therein at least one pad capable of coupling to said load and at least one of moving and deforming in response to energy supplied by said first user thereonto, and to allow said first user to perform said first exercise of at least one of translating, reciprocating, rotating, pivoting, deforming, pushing, and pulling at least a portion of said pad against said load while consuming said energy during said first exercise, wherein said ninth exercise module is configured to define at least one preset load, to incorporate therein at least one handle operatively coupling with said load, and to allow said first user to perform said first exercise of at least one of translating, displacing, reciprocating, rotating, pivoting, and moving said handle against said load while consuming said energy during said first exercise, wherein said first exercise defines a first type and a first extent, wherein said first standard exercise module is disposed in a first location, and wherein said first feature is related to at least one of said task, said first user, said first exercise, and at least one operation of said first standard exercise module;
a second standard exercise module corresponding to at least one of said first, second, third, fourth, fifth, sixth, seventh, eighth, and ninth exercise module each of which is configured to allow a second user to perform a second exercise at least one of thereon, therewith, and thereagainst while consuming energy thereof during said second exercise, wherein said second exercise is configured to define a second type and a second extent which are one of the same as, similar to, and different from said first type and extent, respectively, wherein said second exercise module is disposed in a second location which is one of the same as and different from said first location in which said first standard exercise module is disposed, wherein said second feature is related to at least one of said task, said second user, said second exercise, and at least one operation of said second standard exercise module, and wherein said second standard exercise module is configured to be operatively coupled to said first standard exercise module through one of a local network and a global network one of indirectly and directly and to allow said first and second users to simultaneously perform said exercises;
at least one output module which includes at least two visual units and at least one of at least one olfactory unit and at least one tactile unit, wherein each of said visual units is disposed in a preset disposition and arrangement and is configured to display said images to each of said users, wherein said olfactory unit is configured to provide smell of said virtual environment to said user, and wherein said tactile unit is configured to provide sensation of said virtual environment to said user; and
at least one control module which is configured to operatively couple with said output module and at least one of said standard exercise modules one of directly and indirectly, to provide said task in said images of said virtual environment, to display said images on said visual units, to assign at least one preset goal to said task, to monitor at least one of said first feature and second feature, to relate one of said first and second features to another thereof at least partly based upon at least one preset relation one of stored therein, generated thereby, and supplied thereto by at least one of said users, and to perform manipulation of one of said first and second features at least partly based on another of said features, while providing at least one of said smell and sensation,
wherein said control module is configured to provide said relation as well as to perform said manipulation not only when said types and extents of said first and second exercises are identical to each other but also when said types and extents of said first and second exercises are not identical to each other based on at least one of said first and second features, thereby allowing said first and second users to compete for accomplishing said task goal while simultaneously performing said first and second exercises respectively in said first and second locations.
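Claim 1's control module monitors a feature of each user's exercise, relates them through a preset relation, and manipulates one feature based on the other even when the exercise types differ. A minimal sketch of that loop; all names (`ExerciseFeature`, `relate`, `manipulate_task`) and the ratio-based relation are illustrative assumptions, not anything recited in the claims:

```python
from dataclasses import dataclass

@dataclass
class ExerciseFeature:
    """A monitored feature of one user's exercise."""
    exercise_type: str   # e.g. "treadmill", "cycle" (assumed labels)
    extent: float        # measured amount in that exercise's native unit

def relate(first: ExerciseFeature, second: ExerciseFeature) -> float:
    """Preset relation: ratio of the first extent to the second.

    A real system would first convert the extents to a common unit
    when the exercise types differ (compare claim 4); here the extents
    are assumed to be already comparable.
    """
    return first.extent / second.extent if second.extent else 0.0

def manipulate_task(progress: float, relation: float) -> float:
    """Manipulate the second feature (task progress) based on the
    first: advance the leading user's task state proportionally."""
    return progress + relation

first = ExerciseFeature("treadmill", 3.0)
second = ExerciseFeature("cycle", 2.0)
r = relate(first, second)
print(manipulate_task(10.0, r))  # relation 1.5 -> progress 11.5
```

The same loop runs regardless of whether the two exercise types match, which is the point of the final "not only ... but also" limitation.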
2. The system of claim 1 , wherein said relation relates said first feature of a preset characteristic related with at least one of said first user, first exercise, and operation of said first standard exercise module with said second feature of said preset characteristic related with at least one of said second user, second exercise, and operation of said second standard exercise module.
3. The system of claim 1 , wherein said relation is configured to relate said first feature defining one characteristic which is related with at least one of said first user, first exercise, and operation of said first standard exercise module with said second feature defining a different characteristic which is then related with at least one of said second user, second exercise, and operation of said second standard exercise module.
4. The system of claim 3 , wherein said control module is configured to convert said first feature from a first unit into a second different unit and to perform said manipulation based upon said relation which is configured to compare said converted first feature with said second feature.
5. The system of claim 1 , wherein said relation is configured to relate said first feature defining one characteristic which is related with at least one of said first user, first exercise, and operation of said first standard exercise module with said second feature having said characteristic which is then related to at least one of said second user, second exercise, and operation of said second standard exercise module.
6. The system of claim 1 , wherein said control module is further configured to manipulate at least one of said first and second features at least partly based upon another thereof, whereby said users simultaneously proceed along said task for said goal while simultaneously performing said exercises at least one of on, with, and against said standard exercise modules which are incorporated in said locations and whereby said control module manipulates said another feature of one of said task and standard exercise modules one of directly and indirectly at least partly based upon at least one of said relation and monitored feature while communicating with at least one of said visual units and standard exercise module through one of said local and global networks encompassing said locations.
7. The system of claim 1 , wherein said first feature corresponds to said feature of at least one of said first exercise and first user and wherein said second feature corresponds to said feature of said task, whereby said control module is configured to manipulate said task at least partly based upon at least one of said exercises and users.
8. The system of claim 1 , wherein said first feature corresponds to said feature of said task and wherein said second feature corresponds to said feature of said operation, and wherein said control module is configured to manipulate said operation of at least one of said standard exercise modules at least partly based on said task performed by at least one of said users.
9. The system of claim 1 , wherein said control module is also configured to simulate at least one of said users as at least one simulated user, to incorporate said simulated user in said images for said virtual environment, and to allow said at least one of said users to perform said feature of said task by manipulating said simulated user at least partly based on said feature of at least one of said exercises and users.
10. The system of claim 1 , wherein said control module is configured to be disposed in one of said locations and to communicate with at least one of said standard exercise modules disposed in another of said locations for said manipulation one of wirelessly and through wire.
11. An exercise system which is configured to connect a plurality of locations, to incorporate at least one exercise module in each of said locations, to define at least one preset task of at least one of a story, a scenery, a video game, and a computer game each defining a preset goal and provided in images for at least one virtual environment, and to allow each of a plurality of users to simultaneously perform exercises at least one of on, with, and against each of said exercise modules in each of said locations while competing with each other in said images for said goal of said task at least partly based on said exercises performed by said users comprising:
a first exercise module which is disposed in a first location and configured to allow a first user to perform a first exercise while consuming energy thereof during said first exercise, wherein a first feature is related with at least one of said first user, first exercise, and at least one operation of said first exercise module;
a second exercise module which is disposed in a second location different from said first location and configured to allow a second user to perform a second exercise while consuming energy thereof during said second exercise, wherein a second feature is related with at least one of said second user, second exercise, and at least one operation of said second exercise module, and wherein said second exercise module is configured to form an operative coupling to said first exercise module via one of a local network and a global network one of indirectly and directly, thereby allowing said first and second users to simultaneously perform said exercises while maintaining said coupling through said network;
at least one output module which includes at least two visual units and at least one of at least one olfactory unit and at least one tactile unit, wherein each of said visual units is disposed in a preset disposition and arrangement and is configured to display said images to each of said users, wherein said olfactory unit is configured to provide smell of said virtual environment to said user, and wherein said tactile unit is configured to provide sensation of said virtual environment to said user; and
at least one control module which is configured to operatively couple with at least one of said output module and exercise modules one of directly and indirectly, to provide said task in said images, to display said images on said visual units, to assign at least one goal to said task, to be disposed in at least one of said first and second locations, to monitor at least one of said first and second features, to relate one of said first and second exercises to another based on at least one preset relation, and to perform manipulation of at least one of said features at least partly based on at least one other of said features,
wherein said control module is configured to provide said relation as well as to perform said manipulation not only when said types and extents of said first and second exercises are identical to each other but also when said types and extents of said first and second exercises are not identical to each other based on at least one of said first and second features, thereby allowing said first and second users to compete for accomplishing said task goal while simultaneously performing said first and second exercises respectively in said first and second locations.
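Claim 11 couples the two exercise modules operatively through a local or global network while both users exercise. As a sketch only, an in-process queue stands in for that network link; the message layout and function names (`publish_feature`, `receive_feature`) are assumptions, not the patent's protocol:

```python
import json
import queue

# A thread-safe queue stands in for the LAN/WAN coupling the claim
# recites; a real system would use sockets or a message broker.
network = queue.Queue()

def publish_feature(user_id: str, feature: dict) -> None:
    """An exercise module publishes its monitored feature to the link."""
    network.put(json.dumps({"user": user_id, "feature": feature}))

def receive_feature() -> dict:
    """The control module receives a peer module's feature."""
    return json.loads(network.get())

publish_feature("user1", {"type": "treadmill", "extent_km": 1.2})
msg = receive_feature()
print(msg["user"], msg["feature"]["extent_km"])
```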
12. The system of claim 11 , wherein said relation relates said first feature of a first characteristic related with at least one of said first user, first exercise, and operation of said first exercise module with said second feature of said first characteristic which is related with at least one of said second user, second exercise, and operation of said second exercise module.
13. The system of claim 11 , wherein said relation is configured to relate said first feature defining one characteristic related with at least one of said first user, first exercise, and operation of said first exercise module with said second feature defining a different characteristic which is related with at least one of said second user, second exercise, and operation of said second exercise module.
14. The system of claim 13 , wherein said control module is configured to convert said first feature from a first unit into a second different unit and to perform said manipulation based upon said relation which is configured to compare said converted first feature with said second feature.
15. The system of claim 11 , wherein said relation is configured to relate said first feature defining one characteristic which is related with at least one of said first user, first exercise, and operation of said first exercise module with said second feature having said characteristic which is then related to at least one of said second user, second exercise, and operation of said second exercise module.
16. An exercise system which is configured to operatively connect a plurality of locations via at least one of a local network and a global network, to include at least one exercise module in each of said locations, to define at least one preset task of at least one of a story, a scenery, a video game, and a computer game each defining a preset goal for said task and provided in images for at least one virtual environment, and to allow each of a plurality of users to simultaneously perform exercises at least one of on, with, and against each of said exercise modules disposed in each of said locations while competing with each other in said images for said task goal at least partly based on said exercises performed by said users comprising:
a first exercise module which is configured to define a first exercise type and a first exercise load and to allow a first user to perform a first exercise while consuming energy thereof during said first exercise;
a second exercise module which is configured to define a second exercise type and a second exercise load and to also allow a second user to perform a second exercise while consuming energy thereof during said second exercise, wherein said second exercise module is configured to operatively couple with said first exercise module via said network one of indirectly and directly, thereby allowing said first and second users to simultaneously perform said exercises while pursuing said goal of said task;
at least one output module which includes at least two visual units and at least one of at least one olfactory unit and at least one tactile unit, wherein each of said visual units is disposed in a preset disposition and arrangement and is configured to display said images to each of said users, wherein said olfactory unit is configured to provide smell of said virtual environment to said user, and wherein said tactile unit is configured to provide sensation of said virtual environment to said user; and
at least one control module which is configured to operatively couple with at least one of said output module and exercise modules one of directly and indirectly, to define said task goal, to provide said task in said images, to display said images on said visual units, to monitor at least one of said first and second types and loads as well as extents of said first and second exercises each performed by each of said users, to simulate said users into simulated users included in said images, to relate at least one of said type, load, and extent of said first exercise with at least one of those of said second exercise based on at least one preset relation, and then to perform manipulation of at least one of said simulated users in said images at least partly based on at least one of said types, loads, and extents related to each other by said relation, thereby allowing said users to compete for accomplishing said task goal while simultaneously performing said exercises in said locations regardless of whether said types of said first and second exercises are identical to each other.
17. The system of claim 16 , wherein said types of said first and second exercises are identical to each other and wherein said control module performs said manipulation at least substantially based on at least one of said loads and extents of said exercises.
18. The system of claim 16 , wherein said types of said first and second exercises are different from each other and wherein said control module is configured to convert at least one of said extents from one unit to another unit and to perform comparison of said converted extent with another of said extents, thereby performing said manipulation at least substantially based on said comparison.
19. The system of claim 16 , wherein said control module is configured to perform said manipulation by manipulating said simulated user in said images.
20. The system of claim 19 , wherein said control module is configured to perform said manipulation not only by manipulating said simulated user in said images but also by manipulating an operation of at least one of said exercise modules based on at least one of said types, loads, and extents.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/216,540 US20090023554A1 (en) | 2007-07-16 | 2008-07-07 | Exercise systems in virtual environment |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US95946407P | 2007-07-16 | 2007-07-16 | |
US95956407P | 2007-07-16 | 2007-07-16 | |
US12/216,540 US20090023554A1 (en) | 2007-07-16 | 2008-07-07 | Exercise systems in virtual environment |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090023554A1 true US20090023554A1 (en) | 2009-01-22 |
Family
ID=40265318
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/216,539 Abandoned US20090023553A1 (en) | 2007-07-16 | 2008-07-07 | Exercise systems in local or global network |
US12/216,540 Abandoned US20090023554A1 (en) | 2007-07-16 | 2008-07-07 | Exercise systems in virtual environment |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/216,539 Abandoned US20090023553A1 (en) | 2007-07-16 | 2008-07-07 | Exercise systems in local or global network |
Country Status (1)
Country | Link |
---|---|
US (2) | US20090023553A1 (en) |
Cited By (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100062818A1 (en) * | 2008-09-09 | 2010-03-11 | Apple Inc. | Real-time interaction with a virtual competitor while performing an exercise routine |
US20100092935A1 (en) * | 2008-10-15 | 2010-04-15 | Tom Root | Web-based physical fitness monitoring system |
US8005656B1 (en) * | 2008-02-06 | 2011-08-23 | Ankory Ran | Apparatus and method for evaluation of design |
US20120309551A1 (en) * | 2011-06-02 | 2012-12-06 | Cento E Vinte 120 Participacoes E Empreendimentos Ltda | Structured space for the practice of fitness training and a method of fitness training practice management |
US8332544B1 (en) | 2010-03-17 | 2012-12-11 | Mattel, Inc. | Systems, methods, and devices for assisting play |
US20140073481A1 (en) * | 2012-09-11 | 2014-03-13 | Casio Computer Co., Ltd. | Exercise support apparatus, exercise support method and exercise support program |
US20140274564A1 (en) * | 2013-03-15 | 2014-09-18 | Eric A. Greenbaum | Devices, systems and methods for interaction in a virtual environment |
US20140309084A1 (en) * | 2012-02-11 | 2014-10-16 | Icon Health & Fitness, Inc. | Indoor-Outdoor Exercise System |
US20150070271A1 (en) * | 2013-09-11 | 2015-03-12 | International Business Machines Corporation | Techniques for adjusting a position of a display device based on a position of a user |
US9632746B2 (en) | 2015-05-18 | 2017-04-25 | Echostar Technologies L.L.C. | Automatic muting |
US9723393B2 (en) | 2014-03-28 | 2017-08-01 | Echostar Technologies L.L.C. | Methods to conserve remote batteries |
US9729989B2 (en) | 2015-03-27 | 2017-08-08 | Echostar Technologies L.L.C. | Home automation sound detection and positioning |
US9724596B2 (en) | 2015-02-24 | 2017-08-08 | Stephen Vincent Masarik | Video game controller with handlebar clip |
US9769522B2 (en) | 2013-12-16 | 2017-09-19 | Echostar Technologies L.L.C. | Methods and systems for location specific operations |
US9772612B2 (en) | 2013-12-11 | 2017-09-26 | Echostar Technologies International Corporation | Home monitoring and control |
US9798309B2 (en) | 2015-12-18 | 2017-10-24 | Echostar Technologies International Corporation | Home automation control based on individual profiling using audio sensor data |
US9824578B2 (en) | 2014-09-03 | 2017-11-21 | Echostar Technologies International Corporation | Home automation control using context sensitive menus |
US9838736B2 (en) | 2013-12-11 | 2017-12-05 | Echostar Technologies International Corporation | Home automation bubble architecture |
CN107615217A (en) * | 2015-06-12 | 2018-01-19 | 索尼互动娱乐股份有限公司 | Information processor |
US9882736B2 (en) | 2016-06-09 | 2018-01-30 | Echostar Technologies International Corporation | Remote sound generation for a home automation system |
US9946857B2 (en) | 2015-05-12 | 2018-04-17 | Echostar Technologies International Corporation | Restricted access for home automation system |
US9948477B2 (en) | 2015-05-12 | 2018-04-17 | Echostar Technologies International Corporation | Home automation weather detection |
US9960980B2 (en) | 2015-08-21 | 2018-05-01 | Echostar Technologies International Corporation | Location monitor and device cloning |
US9967614B2 (en) | 2014-12-29 | 2018-05-08 | Echostar Technologies International Corporation | Alert suspension for home automation system |
US9977587B2 (en) * | 2014-10-30 | 2018-05-22 | Echostar Technologies International Corporation | Fitness overlay and incorporation for home automation system |
US9983011B2 (en) | 2014-10-30 | 2018-05-29 | Echostar Technologies International Corporation | Mapping and facilitating evacuation routes in emergency situations |
US9989507B2 (en) | 2014-09-25 | 2018-06-05 | Echostar Technologies International Corporation | Detection and prevention of toxic gas |
US9996066B2 (en) | 2015-11-25 | 2018-06-12 | Echostar Technologies International Corporation | System and method for HVAC health monitoring using a television receiver |
US10049515B2 (en) | 2016-08-24 | 2018-08-14 | Echostar Technologies International Corporation | Trusted user identification and management for home automation systems |
US10060644B2 (en) | 2015-12-31 | 2018-08-28 | Echostar Technologies International Corporation | Methods and systems for control of home automation activity based on user preferences |
US10073428B2 (en) | 2015-12-31 | 2018-09-11 | Echostar Technologies International Corporation | Methods and systems for control of home automation activity based on user characteristics |
US10091017B2 (en) | 2015-12-30 | 2018-10-02 | Echostar Technologies International Corporation | Personalized home automation control based on individualized profiling |
US10101717B2 (en) | 2015-12-15 | 2018-10-16 | Echostar Technologies International Corporation | Home automation data storage system and methods |
US10188890B2 (en) | 2013-12-26 | 2019-01-29 | Icon Health & Fitness, Inc. | Magnetic resistance mechanism in a cable machine |
US10204525B1 (en) | 2007-12-14 | 2019-02-12 | JeffRoy H. Tillis | Suggestion-based virtual sessions engaging the mirror neuron system |
US10220259B2 (en) | 2012-01-05 | 2019-03-05 | Icon Health & Fitness, Inc. | System and method for controlling an exercise device |
US10226396B2 (en) | 2014-06-20 | 2019-03-12 | Icon Health & Fitness, Inc. | Post workout massage device |
US10272317B2 (en) | 2016-03-18 | 2019-04-30 | Icon Health & Fitness, Inc. | Lighted pace feature in a treadmill |
US10279212B2 (en) | 2013-03-14 | 2019-05-07 | Icon Health & Fitness, Inc. | Strength training apparatus with flywheel and related methods |
US10294600B2 (en) | 2016-08-05 | 2019-05-21 | Echostar Technologies International Corporation | Remote detection of washer/dryer operation/fault condition |
US10391361B2 (en) | 2015-02-27 | 2019-08-27 | Icon Health & Fitness, Inc. | Simulating real-world terrain on an exercise device |
US20190290181A1 (en) * | 2016-07-11 | 2019-09-26 | Strive Tech Inc. | Analytics System for Detecting Athletic Fatigue, and Associated Methods |
US10426989B2 (en) | 2014-06-09 | 2019-10-01 | Icon Health & Fitness, Inc. | Cable system incorporated into a treadmill |
US10433612B2 (en) | 2014-03-10 | 2019-10-08 | Icon Health & Fitness, Inc. | Pressure sensor to quantify work |
US10437334B2 (en) * | 2013-07-15 | 2019-10-08 | Vr Electronics Limited | Method and wearable apparatus for synchronizing a user with a virtual environment |
US10493349B2 (en) | 2016-03-18 | 2019-12-03 | Icon Health & Fitness, Inc. | Display on exercise device |
US10596371B1 (en) * | 2016-04-19 | 2020-03-24 | Orbital Research Inc. | Galvanic vestibular stimulation (GVS) systems, devices and methods |
US10625137B2 (en) | 2016-03-18 | 2020-04-21 | Icon Health & Fitness, Inc. | Coordinated displays in an exercise device |
US10671705B2 (en) | 2016-09-28 | 2020-06-02 | Icon Health & Fitness, Inc. | Customizing recipe recommendations |
US20210233139A1 (en) * | 2016-07-06 | 2021-07-29 | Suiko TANAKA | Flowerbed sales order system and plant arrangement planning support program |
US11130041B2 (en) * | 2017-11-15 | 2021-09-28 | Jae Hwan Kim | System for providing a virtual exercise place |
US20220111278A1 (en) * | 2018-12-17 | 2022-04-14 | Hdts, A.S. | High-speed skatemill with a movable skatemill belt |
US11452928B2 (en) * | 2019-07-02 | 2022-09-27 | Jae Hwan Kim | System for providing virtual exercising place |
US11465031B2 (en) * | 2020-09-16 | 2022-10-11 | RevolutioNice, Inc. | Ambulation simulation systems, terrain simulation systems, treadmill systems, and related systems and methods |
US20230211205A1 (en) * | 2012-08-31 | 2023-07-06 | Blue Goji Llc | Health - related data collection system for healthcare diagnostics and treatment platforms |
US11771994B2 (en) | 2019-07-05 | 2023-10-03 | Nintendo Co., Ltd. | Storage medium having information processing program stored therein, information processing system, information processing apparatus, and information processing method |
Families Citing this family (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8157706B2 (en) * | 2009-10-19 | 2012-04-17 | Precor Incorporated | Fitness facility equipment usage control system and method |
US8882637B2 (en) | 2003-01-26 | 2014-11-11 | Precor Incorporated | Fitness facility equipment distribution management |
WO2004067101A2 (en) * | 2003-01-26 | 2004-08-12 | Precor Incorporated | Service tracking and alerting system for fitness equipment |
US20110065373A1 (en) * | 2007-12-07 | 2011-03-17 | Richard Goldmann | Apparatus for surrounding an exerciser with cooling air having manual local control of air outlets built into a stationary exercise device |
US20110061840A1 (en) * | 2007-12-07 | 2011-03-17 | Richard Goldmann | Apparatus for cooling an exerciser having convenient centralized control of air outlets built into a stationary exercise device |
US7955219B2 (en) * | 2009-10-02 | 2011-06-07 | Precor Incorporated | Exercise community system |
US8827870B2 (en) * | 2009-10-02 | 2014-09-09 | Precor Incorporated | Exercise guidance system |
US8221292B2 (en) * | 2010-01-25 | 2012-07-17 | Precor Incorporated | User status notification system |
US9367668B2 (en) | 2012-02-28 | 2016-06-14 | Precor Incorporated | Dynamic fitness equipment user interface adjustment |
US9849333B2 (en) * | 2012-08-31 | 2017-12-26 | Blue Goji Llc | Variable-resistance exercise machine with wireless communication for smart device control and virtual reality applications |
US9179232B2 (en) | 2012-09-17 | 2015-11-03 | Nokia Technologies Oy | Method and apparatus for associating audio objects with content and geo-location |
US10258828B2 (en) | 2015-01-16 | 2019-04-16 | Icon Health & Fitness, Inc. | Controls for an exercise device |
US10953305B2 (en) | 2015-08-26 | 2021-03-23 | Icon Health & Fitness, Inc. | Strength exercise mechanisms |
US10212994B2 (en) | 2015-11-02 | 2019-02-26 | Icon Health & Fitness, Inc. | Smart watch band |
US10561894B2 (en) | 2016-03-18 | 2020-02-18 | Icon Health & Fitness, Inc. | Treadmill with removable supports |
US10293211B2 (en) | 2016-03-18 | 2019-05-21 | Icon Health & Fitness, Inc. | Coordinated weight selection |
US10252109B2 (en) | 2016-05-13 | 2019-04-09 | Icon Health & Fitness, Inc. | Weight platform treadmill |
US10441844B2 (en) | 2016-07-01 | 2019-10-15 | Icon Health & Fitness, Inc. | Cooling systems and methods for exercise equipment |
US10471299B2 (en) | 2016-07-01 | 2019-11-12 | Icon Health & Fitness, Inc. | Systems and methods for cooling internal exercise equipment components |
US10252141B2 (en) * | 2016-08-30 | 2019-04-09 | Johnson Health Tech Co., Ltd. | Exercise apparatus with temperature variable handle assembly |
US10500473B2 (en) | 2016-10-10 | 2019-12-10 | Icon Health & Fitness, Inc. | Console positioning |
US10376736B2 (en) | 2016-10-12 | 2019-08-13 | Icon Health & Fitness, Inc. | Cooling an exercise device during a dive motor runway condition |
TWI646997B (en) | 2016-11-01 | 2019-01-11 | 美商愛康運動與健康公司 | Distance sensor for console positioning |
US10661114B2 (en) | 2016-11-01 | 2020-05-26 | Icon Health & Fitness, Inc. | Body weight lift mechanism on treadmill |
TWI680782B (en) | 2016-12-05 | 2020-01-01 | 美商愛康運動與健康公司 | Offsetting treadmill deck weight during operation |
TWI744546B (en) | 2017-08-16 | 2021-11-01 | 美商愛康運動與健康公司 | Systems for providing torque resisting axial impact |
US10729965B2 (en) | 2017-12-22 | 2020-08-04 | Icon Health & Fitness, Inc. | Audible belt guide in a treadmill |
US20200113518A1 (en) * | 2018-10-12 | 2020-04-16 | Joshua Mollohan | System for facilitating monitoring of fitness devices |
CN115670390B (en) * | 2022-12-30 | 2023-04-07 | 广东工业大学 | Parkinson's disease axial symptom severity degree characterization method |
Citations (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020022551A1 (en) * | 1999-07-08 | 2002-02-21 | Watterson Scott R. | Methods and systems for controlling an exercise apparatus using a portable remote device |
US20020055419A1 (en) * | 2000-04-12 | 2002-05-09 | Michael Hinnebusch | System and method to improve fitness training |
US6458060B1 (en) * | 1999-07-08 | 2002-10-01 | Icon Ip, Inc. | Systems and methods for interaction with exercise device |
US20030134714A1 (en) * | 2002-01-11 | 2003-07-17 | Konami Corporation | Exercise assistance apparatus |
US6634992B1 (en) * | 1998-03-09 | 2003-10-21 | Csk Corporation | Training machine, image output processing device and method, and recording medium which stores image outputting programs |
US20050075214A1 (en) * | 2000-04-28 | 2005-04-07 | Brown Michael Wayne | Program and system for managing fitness activity across diverse exercise machines utilizing a portable computer system |
US6902513B1 (en) * | 2002-04-02 | 2005-06-07 | Mcclure Daniel R. | Interactive fitness equipment |
US20050233861A1 (en) * | 2001-10-19 | 2005-10-20 | Hickman Paul L | Mobile systems and methods for heath, exercise and competition |
US20050239601A1 (en) * | 2003-08-14 | 2005-10-27 | Tom Thomas | Virtual exercise system and method |
US20060025282A1 (en) * | 2004-07-28 | 2006-02-02 | Redmann William G | Device and method for exercise prescription, detection of successful performance, and provision of reward therefore |
US20060030458A1 (en) * | 2004-08-09 | 2006-02-09 | Heywood Richard D | Method and apparatus for precision pacing |
US20060205569A1 (en) * | 1999-07-08 | 2006-09-14 | Watterson Scott R | Systems and methods for enabling two-way communication between one or more exercise devices and computer devices and for enabling users of the one or more exercise devices to competitively exercise |
US20070042868A1 (en) * | 2005-05-11 | 2007-02-22 | John Fisher | Cardio-fitness station with virtual-reality capability |
US20070111858A1 (en) * | 2001-03-08 | 2007-05-17 | Dugan Brian M | Systems and methods for using a video game to achieve an exercise objective |
US20070219059A1 (en) * | 2006-03-17 | 2007-09-20 | Schwartz Mark H | Method and system for continuous monitoring and training of exercise |
US20070260483A1 (en) * | 2006-05-08 | 2007-11-08 | Marja-Leena Nurmela | Mobile communication terminal and method |
US20070265138A1 (en) * | 1999-07-08 | 2007-11-15 | Ashby Darren C | Methods and systems for controlling an exercise apparatus using a portable data storage device |
US20070281828A1 (en) * | 2000-03-21 | 2007-12-06 | Rice Michael J P | Games controllers |
US20080269017A1 (en) * | 2007-04-30 | 2008-10-30 | Nike, Inc. | Adaptive Training System |
- 2008-07-07 US US12/216,539 patent/US20090023553A1/en not_active Abandoned
- 2008-07-07 US US12/216,540 patent/US20090023554A1/en not_active Abandoned
Cited By (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10204525B1 (en) | 2007-12-14 | 2019-02-12 | JeffRoy H. Tillis | Suggestion-based virtual sessions engaging the mirror neuron system |
US8005656B1 (en) * | 2008-02-06 | 2011-08-23 | Ankory Ran | Apparatus and method for evaluation of design |
US20100062818A1 (en) * | 2008-09-09 | 2010-03-11 | Apple Inc. | Real-time interaction with a virtual competitor while performing an exercise routine |
US20100092935A1 (en) * | 2008-10-15 | 2010-04-15 | Tom Root | Web-based physical fitness monitoring system |
US8332544B1 (en) | 2010-03-17 | 2012-12-11 | Mattel, Inc. | Systems, methods, and devices for assisting play |
US20120309551A1 (en) * | 2011-06-02 | 2012-12-06 | Cento E Vinte 120 Participacoes E Empreendimentos Ltda | Structured space for the practice of fitness training and a method of fitness training practice management |
US10220259B2 (en) | 2012-01-05 | 2019-03-05 | Icon Health & Fitness, Inc. | System and method for controlling an exercise device |
US8911330B2 (en) | 2012-02-11 | 2014-12-16 | Icon Health & Fitness, Inc. | Indoor-outdoor exercise system |
US8992387B2 (en) | 2012-02-11 | 2015-03-31 | Icon Health & Fitness, Inc. | Indoor-outdoor exercise system |
US9028370B2 (en) * | 2012-02-11 | 2015-05-12 | Icon Health & Fitness, Inc. | Indoor-outdoor exercise system |
US20140309084A1 (en) * | 2012-02-11 | 2014-10-16 | Icon Health & Fitness, Inc. | Indoor-Outdoor Exercise System |
US20230211205A1 (en) * | 2012-08-31 | 2023-07-06 | Blue Goji Llc | Health-related data collection system for healthcare diagnostics and treatment platforms |
US11951355B2 (en) * | 2012-08-31 | 2024-04-09 | Blue Goji Llc | Health-related data collection system for healthcare diagnostics and treatment platforms |
US20140073481A1 (en) * | 2012-09-11 | 2014-03-13 | Casio Computer Co., Ltd. | Exercise support apparatus, exercise support method and exercise support program |
US10279212B2 (en) | 2013-03-14 | 2019-05-07 | Icon Health & Fitness, Inc. | Strength training apparatus with flywheel and related methods |
US20140274564A1 (en) * | 2013-03-15 | 2014-09-18 | Eric A. Greenbaum | Devices, systems and methods for interaction in a virtual environment |
US10437334B2 (en) * | 2013-07-15 | 2019-10-08 | Vr Electronics Limited | Method and wearable apparatus for synchronizing a user with a virtual environment |
US9459691B2 (en) * | 2013-09-11 | 2016-10-04 | Globalfoundries Inc | Techniques for adjusting a position of a display device based on a position of a user |
US20150070271A1 (en) * | 2013-09-11 | 2015-03-12 | International Business Machines Corporation | Techniques for adjusting a position of a display device based on a position of a user |
US10027503B2 (en) | 2013-12-11 | 2018-07-17 | Echostar Technologies International Corporation | Integrated door locking and state detection systems and methods |
US9772612B2 (en) | 2013-12-11 | 2017-09-26 | Echostar Technologies International Corporation | Home monitoring and control |
US9900177B2 (en) | 2013-12-11 | 2018-02-20 | Echostar Technologies International Corporation | Maintaining up-to-date home automation models |
US9838736B2 (en) | 2013-12-11 | 2017-12-05 | Echostar Technologies International Corporation | Home automation bubble architecture |
US9912492B2 (en) | 2013-12-11 | 2018-03-06 | Echostar Technologies International Corporation | Detection and mitigation of water leaks with home automation |
US10200752B2 (en) | 2013-12-16 | 2019-02-05 | DISH Technologies L.L.C. | Methods and systems for location specific operations |
US11109098B2 (en) | 2013-12-16 | 2021-08-31 | DISH Technologies L.L.C. | Methods and systems for location specific operations |
US9769522B2 (en) | 2013-12-16 | 2017-09-19 | Echostar Technologies L.L.C. | Methods and systems for location specific operations |
US10188890B2 (en) | 2013-12-26 | 2019-01-29 | Icon Health & Fitness, Inc. | Magnetic resistance mechanism in a cable machine |
US10433612B2 (en) | 2014-03-10 | 2019-10-08 | Icon Health & Fitness, Inc. | Pressure sensor to quantify work |
US9723393B2 (en) | 2014-03-28 | 2017-08-01 | Echostar Technologies L.L.C. | Methods to conserve remote batteries |
US10426989B2 (en) | 2014-06-09 | 2019-10-01 | Icon Health & Fitness, Inc. | Cable system incorporated into a treadmill |
US10226396B2 (en) | 2014-06-20 | 2019-03-12 | Icon Health & Fitness, Inc. | Post workout massage device |
US9824578B2 (en) | 2014-09-03 | 2017-11-21 | Echostar Technologies International Corporation | Home automation control using context sensitive menus |
US9989507B2 (en) | 2014-09-25 | 2018-06-05 | Echostar Technologies International Corporation | Detection and prevention of toxic gas |
US9983011B2 (en) | 2014-10-30 | 2018-05-29 | Echostar Technologies International Corporation | Mapping and facilitating evacuation routes in emergency situations |
US9977587B2 (en) * | 2014-10-30 | 2018-05-22 | Echostar Technologies International Corporation | Fitness overlay and incorporation for home automation system |
US9967614B2 (en) | 2014-12-29 | 2018-05-08 | Echostar Technologies International Corporation | Alert suspension for home automation system |
US9724596B2 (en) | 2015-02-24 | 2017-08-08 | Stephen Vincent Masarik | Video game controller with handlebar clip |
US10391361B2 (en) | 2015-02-27 | 2019-08-27 | Icon Health & Fitness, Inc. | Simulating real-world terrain on an exercise device |
US9729989B2 (en) | 2015-03-27 | 2017-08-08 | Echostar Technologies L.L.C. | Home automation sound detection and positioning |
US9948477B2 (en) | 2015-05-12 | 2018-04-17 | Echostar Technologies International Corporation | Home automation weather detection |
US9946857B2 (en) | 2015-05-12 | 2018-04-17 | Echostar Technologies International Corporation | Restricted access for home automation system |
US9632746B2 (en) | 2015-05-18 | 2017-04-25 | Echostar Technologies L.L.C. | Automatic muting |
US10525349B2 (en) * | 2015-06-12 | 2020-01-07 | Sony Interactive Entertainment Inc. | Information processing apparatus |
US20180140950A1 (en) * | 2015-06-12 | 2018-05-24 | Sony Interactive Entertainment Inc. | Information processing apparatus |
CN107615217A (en) * | 2015-06-12 | 2018-01-19 | 索尼互动娱乐股份有限公司 | Information processor |
US9960980B2 (en) | 2015-08-21 | 2018-05-01 | Echostar Technologies International Corporation | Location monitor and device cloning |
US9996066B2 (en) | 2015-11-25 | 2018-06-12 | Echostar Technologies International Corporation | System and method for HVAC health monitoring using a television receiver |
US10101717B2 (en) | 2015-12-15 | 2018-10-16 | Echostar Technologies International Corporation | Home automation data storage system and methods |
US9798309B2 (en) | 2015-12-18 | 2017-10-24 | Echostar Technologies International Corporation | Home automation control based on individual profiling using audio sensor data |
US10091017B2 (en) | 2015-12-30 | 2018-10-02 | Echostar Technologies International Corporation | Personalized home automation control based on individualized profiling |
US10073428B2 (en) | 2015-12-31 | 2018-09-11 | Echostar Technologies International Corporation | Methods and systems for control of home automation activity based on user characteristics |
US10060644B2 (en) | 2015-12-31 | 2018-08-28 | Echostar Technologies International Corporation | Methods and systems for control of home automation activity based on user preferences |
US10493349B2 (en) | 2016-03-18 | 2019-12-03 | Icon Health & Fitness, Inc. | Display on exercise device |
US10625137B2 (en) | 2016-03-18 | 2020-04-21 | Icon Health & Fitness, Inc. | Coordinated displays in an exercise device |
US10272317B2 (en) | 2016-03-18 | 2019-04-30 | Icon Health & Fitness, Inc. | Lighted pace feature in a treadmill |
US10596371B1 (en) * | 2016-04-19 | 2020-03-24 | Orbital Research Inc. | Galvanic vestibular stimulation (GVS) systems, devices and methods |
US9882736B2 (en) | 2016-06-09 | 2018-01-30 | Echostar Technologies International Corporation | Remote sound generation for a home automation system |
US20210233139A1 (en) * | 2016-07-06 | 2021-07-29 | Suiko TANAKA | Flowerbed sales order system and plant arrangement planning support program |
US11810173B2 (en) * | 2016-07-06 | 2023-11-07 | Suiko TANAKA | Flowerbed sales order system and plant arrangement planning support program |
US11471085B2 (en) * | 2016-07-11 | 2022-10-18 | Strive Tech Inc. | Algorithms for detecting athletic fatigue, and associated methods |
US20190290181A1 (en) * | 2016-07-11 | 2019-09-26 | Strive Tech Inc. | Analytics System for Detecting Athletic Fatigue, and Associated Methods |
US10294600B2 (en) | 2016-08-05 | 2019-05-21 | Echostar Technologies International Corporation | Remote detection of washer/dryer operation/fault condition |
US10049515B2 (en) | 2016-08-24 | 2018-08-14 | Echostar Technologies International Corporation | Trusted user identification and management for home automation systems |
US10671705B2 (en) | 2016-09-28 | 2020-06-02 | Icon Health & Fitness, Inc. | Customizing recipe recommendations |
US11148034B2 (en) * | 2017-11-15 | 2021-10-19 | Jae Hwan Kim | System for providing a virtual exercise place |
US11130041B2 (en) * | 2017-11-15 | 2021-09-28 | Jae Hwan Kim | System for providing a virtual exercise place |
US20220111278A1 (en) * | 2018-12-17 | 2022-04-14 | Hdts, A.S. | High-speed skatemill with a movable skatemill belt |
US11878226B2 (en) * | 2018-12-17 | 2024-01-23 | Hdts, A.S. | High-speed skatemill with a movable skatemill belt |
US11452928B2 (en) * | 2019-07-02 | 2022-09-27 | Jae Hwan Kim | System for providing virtual exercising place |
US11771994B2 (en) | 2019-07-05 | 2023-10-03 | Nintendo Co., Ltd. | Storage medium having information processing program stored therein, information processing system, information processing apparatus, and information processing method |
US11771995B2 (en) | 2019-07-05 | 2023-10-03 | Nintendo Co., Ltd. | Storage medium having information processing program stored therein, information processing system, information processing apparatus, and information processing method |
US11465031B2 (en) * | 2020-09-16 | 2022-10-11 | RevolutioNice, Inc. | Ambulation simulation systems, terrain simulation systems, treadmill systems, and related systems and methods |
Also Published As
Publication number | Publication date |
---|---|
US20090023553A1 (en) | 2009-01-22 |
Similar Documents
Publication | Title |
---|---|
US20090023554A1 (en) | Exercise systems in virtual environment |
CN113164808B (en) | Body-building apparatus controller based on control sequence | |
US9878206B2 (en) | Method for interactive training and analysis | |
Mueller et al. | Designing sports: a framework for exertion games | |
US9295878B2 (en) | Instructional displays and methods for an exercise machine | |
US8620146B1 (en) | Picture-in-picture video system for virtual exercise, instruction and entertainment | |
Mueller et al. | Exertion games | |
EP0682544B1 (en) | Interactive exercise apparatus | |
Marshall et al. | Expanding exertion gaming | |
Hardy et al. | Framework for personalized and adaptive game-based training programs in health sport | |
US20080191864A1 (en) | Interactive Surface and Display System | |
US20090176581A1 (en) | Instructional gaming methods and apparatus | |
US20110172060A1 (en) | Interactive systems and methods for reactive martial arts fitness training | |
CA3228628A1 (en) | System and method for artificial intelligence (ai) assisted activity training | |
US20060058155A1 (en) | System and a method for providing an environment for organizing interactive x events for users of exercise apparatus | |
Turmo Vidal et al. | Sensory bodystorming for collocated physical training design | |
Höysniemi | Design and evaluation of physically interactive games | |
Carmichael et al. | Investigating a DTV-based physical activity application to facilitate wellbeing in older adults | |
Buchem et al. | Gamification in mixed-reality exergames for older adult patients in a mobile immersive diagnostic center: a pilot study in the BewARe project | |
Shapi'i et al. | Rehabilitation exercise game model for post-stroke using Microsoft Kinect camera | |
KR200328554Y1 (en) | Running Machine | |
Senger et al. | Serious gaming: enhancing the quality of life among the elderly through play with the multimedia platform SilverGame | |
Burns | On the relevance of using virtual humans for motor skills teaching: A case study on karate gestures | |
Keskinen et al. | Schoolchildren’s user experiences on a physical exercise game utilizing lighting and audio | |
JP2018201769A (en) | Golf game system for prevention of care |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |