US20090051652A1 - Control apparatus and method - Google Patents


Info

Publication number
US20090051652A1
Authority
US
United States
Prior art keywords
information
specific point
pointing device
control
image
Legal status: Granted
Application number
US12/171,599
Other versions
US8363011B2
Inventor
Chia Hoang Lee
Jian Liang Lin
Current Assignee
National Chiao Tung University NCTU
Original Assignee
National Chiao Tung University NCTU
Application filed by National Chiao Tung University (NCTU)
Assigned to NATIONAL CHIAO TUNG UNIVERSITY reassignment NATIONAL CHIAO TUNG UNIVERSITY ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LEE, CHIA HOANG, LIN, JIAN LIANG
Publication of US20090051652A1
Application granted
Publication of US8363011B2
Status: Expired - Fee Related

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304: Detection arrangements using opto-electronic means
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on GUIs using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Definitions

  • the above-mentioned first set of specific point information comprises a second specific point and a third specific point adjacent to the second specific point.
  • the pre-defined criterion is that when the difference between the second specific point and the third specific point is smaller than a pre-defined value, the trajectory comprises the third specific point. Accordingly, if the difference between the second specific point and the third specific point exceeds the predefined value, the constructing module 17 can delete the third specific point, and connects the stable second specific points to form the trajectory.
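The pre-defined criterion above amounts to a jitter filter over the tracked specific points: a new point joins the trajectory only if it stays close to the last accepted point. A minimal sketch in Python (the function name, point format, and the `max_jump` threshold are illustrative assumptions, not from the patent):

```python
import math

def build_trajectory(points, max_jump=20.0):
    """Keep a tracked point only when it lies within max_jump of the last
    accepted point; otherwise treat it as noise and discard it.

    `points` is a list of (x, y) centroids; `max_jump` is a hypothetical
    threshold in pixels.
    """
    if not points:
        return []
    trajectory = [points[0]]
    for p in points[1:]:
        last = trajectory[-1]
        if math.hypot(p[0] - last[0], p[1] - last[1]) < max_jump:
            trajectory.append(p)  # stable point: add it to the trajectory
        # else: the difference exceeds the pre-defined value -> delete it
    return trajectory
```

Connecting only the surviving points then forms the trajectory from the stable specific points, as the text describes.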
  • the processing module 19 can analyze the trajectory and generate a control signal to control (such as two-dimensional control or three-dimensional control) the object.
  • the processing module 19 analyzes the trajectory to generate direction information, strength information, rotation angle information, or other suitable information.
  • the direction information can include upward, downward, leftward, or rightward information; and the rotation angle information can include clockwise or counterclockwise information.
  • the processing module 19 further comprises a classifier 191, such as a direction and rotation classifier, which can classify the above-mentioned direction information, strength information, and rotation angle information, so as to generate the control signal.
  • the image capturing module 11 further comprises a vibration reduction module 112 for removing a pre-defined noise, such as the noise generated by the image capturing module 11 or the interference caused by the user's action, from the video sequence, and for generating the N image frames.
  • the vibration reduction module 112 can perform the above-mentioned process with a blurring/smoothing algorithm.
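A blurring/smoothing pass of the kind the vibration reduction module 112 might apply can be as simple as a box filter. A hedged sketch in pure Python, assuming a grayscale frame stored as a list of rows of pixel values (the patent does not specify the filter; this is one common choice):

```python
def box_blur(frame):
    """Smooth a grayscale frame with a 3x3 box filter, averaging each pixel
    with its in-bounds neighbours to suppress high-frequency noise."""
    h, w = len(frame), len(frame[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += frame[ny][nx]
                        count += 1
            out[y][x] = total / count  # average over the in-bounds window
    return out
```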
  • the control apparatus of the invention can be contained in a hand-held electronic apparatus.
  • the control apparatus of the invention can be integrated in a mobile phone or a personal digital assistant (PDA).
  • FIG. 3 shows a flow chart of a control method of an embodiment of the invention.
  • the control method includes the following steps: first of all, in step S71, a video sequence is recorded and N image frames are generated, wherein N is a positive integer. Afterward, in step S73, a pointing device image related to the pointing device is captured from each image frame in accordance with a characteristic of the pointing device, such as the color of the pointing device.
  • in step S75, a specific point, such as the central point, of the pointing device image of each image frame is calculated, and a first set of specific point information is generated.
  • in step S77, a trajectory is generated in accordance with a pre-defined criterion and the first set of specific point information.
  • the first set of specific point information comprises a second specific point and a third specific point adjacent to the second specific point.
  • the pre-defined criterion is that when the difference between the second specific point and the third specific point is smaller than a pre-defined value, the trajectory comprises the third specific point. Accordingly, if the difference between the second specific point and the third specific point exceeds the pre-defined value, the third specific point can be deleted in step S77, and the stable second specific points can be connected to form the trajectory.
  • in step S79, the trajectory is analyzed and a control signal is generated.
  • the method of the invention analyzes the trajectory to generate direction information, strength information, rotation angle information, or other suitable information, and further generates the control signal in accordance with said information.
  • the direction information can include at least upward, downward, leftward, or rightward information; and the rotation angle information can include either clockwise or counterclockwise information.
  • in step S81, the object is controlled according to the control signal.
  • the method of the invention can perform two-dimensional or three-dimensional control.
  • the method of the invention can further include the following step: removing a pre-defined noise from the video sequence and generating the N image frames.
  • the above-mentioned process can be performed with a blurring/smoothing algorithm.
  • the method of the invention can further include step S78, in which it is determined, in accordance with the first set of specific point information, whether the trajectory is finished. If the answer of step S78 is yes, the method proceeds to step S79. Otherwise, the method proceeds to step S83, through which the position of the track point can be confirmed.
  • FIG. 5 shows a flow chart of a control method of an embodiment of the invention.
  • the method of the invention further includes a classifier to classify the above-mentioned direction information, strength information, and rotation angle information, so as to generate the control signal.
  • in step S85, the classifier receives the direction information, strength information, and rotation angle information to generate a current control signal. In step S87, it is determined whether a previous control signal exists; if no, the method proceeds to step S89 and controls the object according to the current control signal. If yes, the method proceeds to step S91 and determines whether an interruption exists; if yes, the method proceeds to step S89 and controls the object according to the current control signal. If the answer of step S91 is no, the method proceeds to step S93 and determines whether the current control signal is the same as the previous control signal; if no, the method proceeds to step S95 and controls the object according to the previous control signal. If the answer of step S93 is yes, the method proceeds to step S97 and determines whether the direction is clear; if yes, the method proceeds to step S99 and updates the strength information. If the answer of step S97 is no, the method returns to step S85.
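The decision flow of steps S87 through S99 can be sketched as a small selector function. All names below are illustrative assumptions; control signals are modeled as plain strings, and the "direction is clear" branch is reduced to reusing the current signal with refreshed strength:

```python
def choose_signal(current, previous=None, interrupted=False):
    """Select which control signal drives the object, per the FIG. 5 flow:
    - no previous signal, or an interruption: apply the current signal;
    - current differs from previous: keep applying the previous signal;
    - current equals previous: keep the signal (its strength is refreshed).
    """
    if previous is None or interrupted:
        return current   # S87 'no' / S91 'yes' -> use current signal (S89)
    if current != previous:
        return previous  # S93 'no' -> keep previous signal (S95)
    return current       # S93 'yes' -> same signal, strength updated (S99)
```

Holding the previous signal when a single divergent reading arrives makes the control feel stable against momentary tracking glitches.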
  • Please refer to FIG. 6A to 6J for a control process of an embodiment of the invention.
  • the control apparatus and the display apparatus are integrated in a mobile phone 3 .
  • the image capturing module of the invention is the photographic module 31 of the mobile phone 3, and the display apparatus is the screen 33 of the mobile phone 3.
  • the object displayed on the screen 33 is an image of the racing car 35 .
  • the photographic module 31 is on the back of the mobile phone 3 , and the photographic module 31 can record the action of a pointing device 37 .
  • the pointing device 37 is located at a first position L1.
  • the photographic module 31 can record a video sequence which contains a series of image frames of the movement of the pointing device 37 .
  • the separating module (not shown) can separate the image of the pointing device 37 from the background image of each image frame according to the color of the pointing device.
  • the positioning module (not shown) can further calculate the central point of the image of the pointing device 37 separated by the separating module.
  • the constructing module (not shown) generates the trajectory of the pointing device 37
  • the processing module (not shown) analyzes the track information of the trajectory.
  • the track information shows that the pointing device 37 moves to the right.
  • the processing module can further generate a control signal according to the track information.
  • the control signal is a right turn signal, so that the processing module controls the image of the racing car 35 to turn right.
  • the photographic module 31 can record a video sequence which contains a series of image frames of the movement of the pointing device 37. Additionally, the processing module controls the image of the racing car 35 so that it continuously makes a right turn.
  • the photographic module 31 can record a video sequence which contains a series of image frames of the movement (from top to bottom) of the pointing device 37 .
  • the processing module of the invention analyzes the track information of the trajectory.
  • the track information is that the pointing device 37 moves downward.
  • the processing module further generates the control signal based on the track information, and in the embodiment, the control signal is a downward turn signal. Therefore, the processing module controls the image of the racing car 35 to turn down.
  • the photographic module 31 can record a video sequence which contains a series of image frames of the movement of the pointing device 37 .
  • the processing module of the invention analyzes the track information of the trajectory.
  • the track information is that the pointing device 37 rotates counterclockwise.
  • the processing module further generates the control signal based on the track information; in this embodiment, the control signal is an enlarge signal. Therefore, the processing module enlarges the image of the racing car 35. In practice, the relationship between the degree of enlargement and the rotation angle of the track information can be set as desired.
  • the track information and the control signal can optionally be set, so as to perform three-dimensional control of the object.
  • the above-mentioned rightward-moving information can be set to correspond to a control signal such as a rightward-moving control signal or a right-turning control signal.
  • the track information of a clockwise rotation of 60° can be set to correspond to a control signal such as a right turn of 10°, 60°, or 90°.
  • the control apparatus enables the user to easily perform three-dimensional control of the object displayed on the display apparatus.
  • control apparatus of the invention can be integrated in a hand-held electronic apparatus, and can perform complicated control, such as game control or virtual control, with the input module of the hand-held electronic apparatus.
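The freely settable correspondence between track information and control signals described above can be modeled as a lookup table plus a gain. The key names, signal names, and gain value below are hypothetical illustrations, not values from the patent:

```python
# Hypothetical mapping from track information to control signals, following
# the patent's examples: a rightward move may map to a right-turn signal,
# and a counterclockwise rotation may map to an enlarge signal.
SIGNAL_MAP = {
    'move_right': 'turn_right',
    'move_down': 'turn_down',
    'rotate_ccw': 'enlarge',
}

# Degrees of object response per degree of rotation; the patent notes this
# relationship is settable (e.g. 60 degrees of rotation -> 10, 60, or 90).
ROTATION_GAIN = 1.0

def to_control_signal(track_info, angle=None):
    """Translate track information (and an optional rotation angle) into a
    control signal, returning None for unknown track information."""
    signal = SIGNAL_MAP.get(track_info)
    if angle is not None and signal is not None:
        return (signal, angle * ROTATION_GAIN)
    return signal
```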

Abstract

The invention discloses a control apparatus capable of providing a user with control of an object displayed on a display apparatus by a pointing device. The control apparatus includes an image capturing module, a separating module, a positioning module, a constructing module, and a processing module. The image capturing module is used to record an image sequence that includes N images. The separating module is applied to capture the pointing device image related to the pointing device from each image. The positioning module is used for calculating a specific point of the pointing device image of each image to generate a first set of specific point information. The constructing module is applied to generate a trajectory in accordance with a pre-defined criterion and the first set of specific point information. Additionally, the processing module is used to analyze the trajectory and generate a control signal to control the object.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates generally to a control apparatus and method, and more particularly, to an apparatus and method for three-dimensional control.
  • 2. Description of the Prior Art
  • Recently, multimedia and entertainment software, such as media players or games, has been widely ported to mobile apparatuses such as mobile phones or PDAs to increase the pleasure of using those apparatuses. However, because most multimedia and entertainment software is developed for personal computers equipped with all kinds of input apparatus, such as a keyboard, mouse, or touch pad, it is not easy to run said software on a mobile apparatus that lacks sufficient input apparatus.
  • There are defects in the input apparatus of the prior art, such as the keyboard, mouse, touch pad, trackball, or track point, and in other input apparatus that have been applied in mobile apparatuses, such as the direction key or point stick. Accordingly, said input apparatus cannot provide a convenient environment for a user to control the mobile apparatus. Furthermore, more and more of the above-mentioned multimedia and entertainment software requires three-dimensional control, which makes it even harder for the input apparatus of the prior art to reach the goal.
  • For example, the trackball applies a ball-shaped wheel to control the position of the cursor to reach the goal of pointing and interaction. Moreover, a user can control the moving speed of the cursor by controlling the moving speed of the trackball. However, the size of the trackball is too large for it to be applied in a mobile apparatus. Additionally, the track point applies a point stick to control the position of the cursor to reach the goal of pointing and interaction. However, the track point cannot provide a 3D control function by itself.
  • For another example, the touch pad applies a touch panel with a sensing apparatus to reach the goal of pointing and interaction. However, the size of the touch pad is too large, and it needs other input devices to provide three-dimensional control. Accordingly, the touch pad is not suitable for a mobile apparatus. Take the multi-directional key, which has been applied in mobile apparatuses, as an example: it can control the directions of an object displayed on the screen of a mobile apparatus through a number of keys. However, the moving speed of the multi-directional key is fixed, and it cannot provide a 3D control function by itself. Furthermore, because the size of the mobile apparatus is small, highly concentrated keys may make it easy for the user to press the wrong key.
  • U.S. Pat. No. 6,847,354, entitled “Three dimensional interactive display,” discloses a three-dimensional interactive display and a method of forming it. The 3D interactive display includes a transparent capaciflector (TC) camera formed on a transparent shield layer on the screen surface to sense the movement of an object, such as a finger, so as to perform the 3D control. However, the cost of a TC camera is too high for it to be applied in a consumer mobile apparatus. Besides, the cameras presently applied in consumer mobile apparatuses cannot provide the same functions as a TC camera.
  • U.S. Pat. No. 6,844,871 entitled “Method and apparatus for computing input using six degrees of freedom” discloses a mouse that uses a camera as its input sensor. A real-time vision algorithm determines the six degree-of-freedom mouse posture, consisting of 2D motion, tilt in the forward/back and left/right axes, rotation of the mouse about its vertical axis, and some limited height sensing. Thus, a familiar 2D device can be extended for three-dimensional manipulation, while remaining suitable for standard 2D Graphical User Interface tasks. However, the size of the apparatus is too large, and the apparatus needs a specific panel. Therefore, the apparatus is not suitable for mobile apparatus.
  • Furthermore, U.S. Pat. No. 6,757,557, entitled “Position location system,” discloses methods and apparatus for locating the position, preferably in three dimensions, of a sensor by generating magnetic fields which are detected by the sensor. The magnetic fields are generated from a plurality of locations and, in one embodiment of the invention, enable both the orientation and location of a single-coil sensor to be determined. However, the magnetic field generator and detector are not suitable for a mobile apparatus.
  • Accordingly, the control apparatus of the prior art cannot meet the requirement of performing three-dimensional control of an object on a mobile apparatus. Therefore, there is a need to develop a control apparatus that solves this problem.
  • SUMMARY OF THE INVENTION
  • Accordingly, one aspect of the present invention is to provide a control apparatus and method. Particularly, the apparatus and method of the invention can be applied for three-dimensional control on a mobile apparatus.
  • According to a preferred embodiment of the invention, the control apparatus can provide a user with the control of an object displayed on a display apparatus by a pointing device. The control apparatus of the invention includes an image capturing module, a separating module, a positioning module, a constructing module, and a processing module.
  • The image capturing module is used to record an image sequence that includes N images, wherein N is a positive integer. The separating module is applied to capture the pointing device image related to the pointing device from each image. The positioning module is used for calculating a specific point of the pointing device image of each image to generate a first set of specific point information. Additionally, the constructing module is applied to generate a trajectory in accordance with a pre-defined criterion and the first set of specific point information. Furthermore, the processing module is used to analyze the trajectory and generate a control signal to control the object.
  • According to another preferred embodiment of the invention, the control method can provide a user with the control of an object displayed on a display apparatus by a pointing device. The method of the invention includes the following steps:
  • (a) recording a video sequence and generating N image frames, wherein N is a positive integer;
  • (b) capturing a pointing device image related to the pointing device from each image frame in accordance with a characteristic of the pointing device;
  • (c) calculating a specific point of the pointing device image of each image frame to generate a first set of specific point information;
  • (d) generating a trajectory in accordance with a pre-defined criterion and the first set of specific point information;
  • (e) analyzing the trajectory and generating a control signal; and
  • (f) controlling the object according to the control signal.
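Steps (a) through (f) form a linear pipeline. A minimal sketch, with each stage supplied as a callable so the skeleton stays independent of any particular camera or display API (all stage and parameter names here are assumptions for illustration, not terms from the claims):

```python
def control_pipeline(record, separate, locate, build, analyze, apply):
    """Run the method's six steps in order, where each argument is a
    callable implementing one stage of the control method."""
    frames = record()                       # (a) record N image frames
    images = [separate(f) for f in frames]  # (b) pointing-device image per frame
    points = [locate(i) for i in images]    # (c) specific point per frame
    trajectory = build(points)              # (d) build the trajectory
    signal = analyze(trajectory)            # (e) analyze -> control signal
    return apply(signal)                    # (f) control the object
```

A usage example with trivial stand-in stages shows the data flow: frames are captured, transformed per frame, reduced to one signal, and applied.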
  • The objective of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment, which is illustrated in the various figures and drawings.
  • BRIEF DESCRIPTION OF THE APPENDED DRAWINGS
  • FIG. 1 illustrates a functional block of a control apparatus of an embodiment of the invention.
  • FIG. 2 illustrates a functional block of a control apparatus of an embodiment of the invention.
  • FIG. 3 shows a flow chart of a control method of an embodiment of the invention.
  • FIG. 4 shows a flow chart of a control method of an embodiment of the invention.
  • FIG. 5 shows a flow chart of a control method of an embodiment of the invention.
  • FIG. 6A to 6J illustrate a control process of an embodiment of the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Please refer to FIG. 1, which shows a functional block of a control apparatus of an embodiment of the invention. The control apparatus 1 can provide a user with control (such as three-dimensional control) of an object displayed on a display apparatus by a pointing device. In practice, the pointing device can be, but is not limited to, a finger of the user, a touch pen, a writing pen, or another suitable device.
  • As shown in FIG. 1, the control apparatus 1 includes an image capturing module 11, a separating module 13, a positioning module 15, a constructing module 17, and a processing module 19.
  • The image capturing module 11, such as a video camera or a camera, can be used to record a video sequence, which contains N image frames, wherein N is a positive integer. Practically, the N image frames include some images of the pointing device. In practice, the video camera can be a digital video camera, whereas the camera can be a digital camera.
  • The separating module 13 can capture a pointing device image related to the pointing device from each image frame in accordance with a characteristic of the pointing device. In practice, the characteristic can be the color of the pointing device or other suitable characteristics such as the shape or the structure of the pointing device.
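As an illustration of this kind of color-based separation, the sketch below thresholds each pixel against an approximate device color. It is a minimal stand-in rather than the patented implementation; the function name, the per-channel distance test, and the tolerance value are all assumptions.

```python
import numpy as np

def separate_pointing_device(frame, target_rgb, tol=40):
    # frame: H x W x 3 uint8 image; target_rgb: approximate device color.
    # A pixel is treated as part of the pointing device image when every
    # color channel lies within `tol` of the target color (a crude
    # color-similarity test).
    diff = np.abs(frame.astype(int) - np.asarray(target_rgb, dtype=int))
    return diff.max(axis=-1) <= tol
```

A practical separating module would more likely threshold in an illumination-robust color space such as HSV, but the principle is the same.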
  • The positioning module 15 can be used to calculate a specific point of the pointing device image of each image frame to generate a first set of specific point information. In practice, the specific point can be, but not limited to, a central point of the pointing device image.
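A central point of this kind can be computed as the centroid of the separated device pixels. The sketch below assumes the separating module yields a boolean mask per frame; the function name is illustrative.

```python
import numpy as np

def central_point(mask):
    # Centroid (row, col) of the True pixels in a binary mask.
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None  # no device pixels detected in this frame
    return (float(ys.mean()), float(xs.mean()))
```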
  • The constructing module 17 can generate a trajectory in accordance with a pre-defined criterion and the first set of specific point information. Practically, the constructing module 17 can further determine, in accordance with the first set of specific point information, whether the trajectory is finished. For example, if the trajectory stops at the edge of the image frame or returns to the center of the image frame, the constructing module 17 determines that the trajectory is finished. Moreover, once the constructing module 17 determines that the trajectory is finished, it transmits the trajectory to the processing module 19.
  • In practice, the constructing module 17 further includes a counter. When the constructing module 17 determines that the trajectory is finished, it triggers the counter; when the counting is finished, the constructing module 17 transmits the trajectory to the processing module 19.
  • Furthermore, in practice, the above-mentioned first set of specific point information comprises a second specific point and a third specific point adjacent to the second specific point. Particularly, the pre-defined criterion is that when the difference between the second specific point and the third specific point is smaller than a pre-defined value, the trajectory comprises the third specific point. Accordingly, if the difference between the second specific point and the third specific point exceeds the pre-defined value, the constructing module 17 can delete the third specific point and connect the remaining stable specific points to form the trajectory.
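The pre-defined criterion described above amounts to an outlier filter: a new specific point joins the trajectory only if it lies close enough to the last accepted point. A minimal sketch, assuming points are (row, col) tuples and using Manhattan distance as the "difference" (the actual metric is not specified in the text):

```python
def build_trajectory(points, max_jump):
    # Keep a specific point only if its distance to the last accepted
    # point is below the pre-defined value; otherwise delete it, so the
    # remaining stable points connect into the trajectory.
    if not points:
        return []
    trajectory = [points[0]]
    for (y, x) in points[1:]:
        ly, lx = trajectory[-1]
        if abs(y - ly) + abs(x - lx) < max_jump:
            trajectory.append((y, x))
    return trajectory
```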
  • The processing module 19 can analyze the trajectory and generate a control signal to control (such as two-dimensional control or three-dimensional control) the object. In practice, the processing module 19 analyzes the trajectory to generate direction information, strength information, rotation angle information, or other suitable information. Furthermore, the direction information can include upward, downward, leftward, or rightward information; and the rotation angle information can include clockwise or counterclockwise information.
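One simple way to derive such information from a trajectory is sketched below: direction and strength from the net displacement, and rotation sense from the sign of the signed area swept by the path. This is an illustration only; the patent does not specify these formulas.

```python
import math

def analyze_trajectory(traj):
    # traj: list of (row, col) points in image coordinates (row grows downward).
    (y0, x0), (y1, x1) = traj[0], traj[-1]
    dy, dx = y1 - y0, x1 - x0
    strength = math.hypot(dy, dx)  # net displacement magnitude
    if abs(dx) >= abs(dy):
        direction = 'rightward' if dx >= 0 else 'leftward'
    else:
        direction = 'downward' if dy >= 0 else 'upward'
    # Shoelace-style signed area: its sign indicates the rotation sense.
    # Which sign maps to which sense depends on the coordinate convention.
    area = sum(xa * yb - xb * ya
               for (ya, xa), (yb, xb) in zip(traj, traj[1:]))
    rotation = 'clockwise' if area < 0 else 'counterclockwise'
    return direction, strength, rotation
```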
  • In the embodiment, the processing module 19 further comprises a classifier 191, such as a direction and rotation classifier, which can classify the above-mentioned direction information, strength information, and rotation angle information, so as to generate the control signal.
  • Please refer to FIG. 2, which illustrates a functional block of a control apparatus of an embodiment of the invention. As shown in FIG. 2, the image capturing module 11 further comprises a vibration reduction module 112 for removing a pre-defined noise, such as the noise generated by the image capturing module 11 or the interference caused by the user's action, from the video sequence, and for generating the N image frames. Practically, the vibration reduction module 112 can perform the above-mentioned process by a blurring/smoothing algorithm.
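As an illustration of the blurring/smoothing idea, a naive box blur over a grayscale frame is sketched below; the kernel size and the choice of a box filter (rather than, say, a Gaussian) are assumptions.

```python
import numpy as np

def box_blur(frame, k=3):
    # Naive k x k box blur for a 2-D (grayscale) frame: each output pixel
    # is the mean of its k x k neighborhood, with edge replication.
    pad = k // 2
    padded = np.pad(frame.astype(float), pad, mode='edge')
    h, w = frame.shape
    out = np.zeros((h, w))
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)
```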
  • In practice, the control apparatus of the invention can be contained in a hand-held electronic apparatus. For example, the control apparatus of the invention can be integrated in a mobile phone or a personal digital assistant (PDA).
  • Please refer to FIG. 3, which shows a flow chart of a control method of an embodiment of the invention.
  • As shown in FIG. 3, the control method includes the following steps: first of all, in step S71, a video sequence is recorded and N image frames are generated, wherein N is a positive integer. Afterward, in step S73, a pointing device image related to the pointing device from each image frame is captured in accordance with a characteristic of the pointing device, such as the color of the pointing device.
  • Furthermore, in step S75, a specific point, such as the central point, of the pointing device image of each image frame is calculated, and a first set of specific point information is generated. Afterward, in step S77, a trajectory is generated in accordance with a pre-defined criterion and the first set of specific point information.
  • In practice, the first set of specific point information comprises a second specific point and a third specific point adjacent to the second specific point. Particularly, the pre-defined criterion is that when the difference between the second specific point and the third specific point is smaller than a pre-defined value, the trajectory comprises the third specific point. Accordingly, if the difference between the second specific point and the third specific point exceeds the pre-defined value, the third specific point can be deleted in step S77, and the stable specific points can be connected to form the trajectory.
  • Furthermore, in step S79, the trajectory is analyzed and a control signal is generated. Practically, in step S79, the method of the invention analyzes the trajectory to generate direction information, strength information, rotation angle information, or other suitable information, and further generates the control signal in accordance with said information. Furthermore, the direction information can include upward, downward, leftward, or rightward information; and the rotation angle information can include either clockwise or counterclockwise information. Finally, in step S81, the object is controlled according to the control signal. In practice, the method of the invention can perform two-dimensional or three-dimensional control.
  • In an embodiment, the method of the invention can further include the following step: removing a pre-defined noise from the video sequence and generating the N image frames. Practically, the above-mentioned process can be performed by a blurring/smoothing algorithm.
  • Please refer to FIG. 4, which shows a flow chart of a control method of an embodiment of the invention. In the embodiment, the method of the invention can further include step S78, which determines, in accordance with the first set of specific point information, whether the trajectory is finished. If the answer of step S78 is yes, the method proceeds to step S79; otherwise, the method proceeds to step S83, in which the position of the track point is confirmed.
  • Please refer to FIG. 5, which shows a flow chart of a control method of an embodiment of the invention. In the embodiment, the method of the invention further includes a classifier to classify the above-mentioned direction information, strength information, and rotation angle information, so as to generate the control signal.
  • As shown in FIG. 5, in step S85, the classifier receives the direction information, strength information, and rotation angle information to generate a current control signal. Furthermore, in step S87, it is determined whether a previous control signal exists; if no, the method proceeds to step S89 and controls the object according to the current control signal. If yes, the method proceeds to step S91 and determines whether an interruption exists; if an interruption exists, the method proceeds to step S89 and controls the object according to the current control signal. If the answer of step S91 is no, the method proceeds to step S93 and determines whether the current control signal is the same as the previous control signal; if no, the method proceeds to step S95 and controls the object according to the previous control signal. If the answer of step S93 is yes, the method proceeds to step S97 and determines whether the direction is clear; if yes, the method proceeds to step S99 and updates the strength information. If the answer of step S97 is no, the method returns to step S85.
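The decision flow of steps S85 to S99 can be condensed into a single dispatch function. The sketch below mirrors the branches described above; the returned labels are illustrative, not taken from the patent.

```python
def decide(current, previous, interrupted, direction_clear):
    # One pass through the FIG. 5 decision flow (step labels in comments).
    if previous is None:                       # S87: no previous signal
        return ('use_current', current)        # S89
    if interrupted:                            # S91: interruption exists
        return ('use_current', current)        # S89
    if current != previous:                    # S93: signals differ
        return ('use_previous', previous)      # S95
    if direction_clear:                        # S97: direction is clear
        return ('update_strength', current)    # S99
    return ('reclassify', None)                # S97 no: back to S85
```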
  • Please refer to FIG. 6A to 6J for a control process of an embodiment of the invention. As shown in FIG. 6A and FIG. 6B, the control apparatus and the display apparatus are integrated in a mobile phone 3. Moreover, the image capturing module of the invention is the photographic module 31 of the mobile phone 3, and the display is the screen 33 of the mobile phone 3. Furthermore, the object displayed on the screen 33 is an image of the racing car 35. Moreover, as shown in FIG. 6B, the photographic module 31 is on the back of the mobile phone 3, and the photographic module 31 can record the action of a pointing device 37. Additionally, the pointing device 37 is located at a first position L1.
  • Furthermore, as shown in FIG. 6C and FIG. 6D, when the pointing device 37 moves from the first position L1 to a second position L2, the photographic module 31 can record a video sequence which contains a series of image frames of the movement of the pointing device 37. Afterward, the separating module (not shown) can separate the image of the pointing device 37 from the background image of each image frame according to the color of the pointing device. The positioning module (not shown) can further calculate the central point of the image of the pointing device 37 separated by the separating module. Moreover, the constructing module (not shown) generates the trajectory of the pointing device 37, and the processing module (not shown) analyzes the track information of the trajectory. Particularly, in the embodiment, the track information shows that the pointing device 37 moves to right. The processing module can further generate a control signal according to the track information. Moreover, in the embodiment, the control signal is a right turn signal, so that the processing module controls the image of the racing car 35 to turn right.
  • Furthermore, as shown in FIG. 6E and FIG. 6F, when the pointing device 37 moves from the second position L2 to a third position L3, the photographic module 31 can record a video sequence which contains a series of image frames of the movement of the pointing device 37. Additionally, the processing module controls the image of the racing car 35 to continue making a right turn.
  • Furthermore, as shown in FIG. 6G and FIG. 6H, when the pointing device 37 moves from a fourth position L4 to a fifth position L5, the photographic module 31 can record a video sequence which contains a series of image frames of the movement (from top to bottom) of the pointing device 37. Moreover, the processing module of the invention analyzes the track information of the trajectory. In the embodiment, the track information shows that the pointing device 37 moves downward. The processing module further generates the control signal based on the track information, and in the embodiment, the control signal is a downward turn signal. Therefore, the processing module controls the image of the racing car 35 to turn downward.
  • Furthermore, as shown in FIG. 6I and FIG. 6J, when the pointing device 37 moves counterclockwise from the fourth position L4 to the sixth position L6, the photographic module 31 can record a video sequence which contains a series of image frames of the movement of the pointing device 37. Moreover, the processing module of the invention analyzes the track information of the trajectory. In the embodiment, the track information shows that the pointing device 37 rotates counterclockwise. The processing module further generates the control signal based on the track information, and in the embodiment, the control signal is an enlargement signal. Therefore, the processing module enlarges the image of the racing car 35. Practically, the relationship between the degree of enlargement and the rotation angle of the track information can be set as desired.
  • Please note that, in practice, the correspondence between the track information and the control signal can be set as desired, so as to perform three-dimensional control of the object. For example, the above-mentioned rightward-moving information can be set to correspond to a control signal such as a right-moving control signal or a right-turning control signal. For another example, track information of a clockwise rotation of 60° can be set to correspond to a control signal such as turning right by 10°, 60°, or 90°.
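Such a configurable correspondence can be as simple as a lookup table. The keys and values below are hypothetical examples of the mappings mentioned in this paragraph and in FIG. 6:

```python
# Hypothetical, user-configurable mapping: track information -> control signal.
TRACK_TO_CONTROL = {
    'move_right': 'turn_right',
    'move_down': 'turn_down',
    'rotate_counterclockwise': 'enlarge',
}

def to_control_signal(track_info):
    # Unknown track information yields no control signal.
    return TRACK_TO_CONTROL.get(track_info)
```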
  • Obviously, the control apparatus can allow the user to easily perform three-dimensional control of the object displayed on the display apparatus. Additionally, the control apparatus of the invention can be integrated in a hand-held electronic apparatus and can perform complicated control, such as game control or virtual control, with the input module of the hand-held electronic apparatus.
  • Although the present invention has been illustrated and described with reference to the preferred embodiment thereof, it should be understood that it is in no way limited to the details of such an embodiment, but is capable of numerous modifications within the scope of the appended claims.

Claims (25)

1. A control apparatus for providing a user with a control over an object displayed in a display apparatus by a pointing device, the control apparatus comprising:
an image capturing module for recording a video sequence and generating N image frames, N being a positive integer;
a separating module for capturing a pointing device image related to the pointing device from each image frame in accordance with a characteristic of the pointing device;
a positioning module for calculating a specific point of the pointing device image of each image frame to generate a first set of specific point information;
a constructing module for generating a trajectory in accordance with a pre-defined criterion and the first set of specific point information; and
a processing module for analyzing the trajectory and generating a control signal to control the object.
2. The control apparatus of claim 1, wherein the specific point of the pointing device image is a central point of the pointing device image.
3. The control apparatus of claim 1, wherein the first set of specific point information comprises a second specific point and a third specific point adjacent to the second specific point, and the pre-defined criterion is that when the difference between the second specific point and the third specific point is smaller than a pre-defined value, the trajectory comprising the third specific point.
4. The control apparatus of claim 1, wherein the processing module performs three-dimensional control of the object according to the control signal.
5. The control apparatus of claim 4, wherein the image capturing module further comprises:
a vibration reduction module for removing a pre-defined noise from the video sequence, and generating N image frames.
6. The control apparatus of claim 5, wherein the vibration reduction module processes the video sequence by blurring/smoothing algorithm to remove the pre-defined noise.
7. The control apparatus of claim 1, wherein the image capturing module further comprises a video camera or a camera.
8. The control apparatus of claim 1, wherein the characteristic is the color of the pointing device.
9. The control apparatus of claim 1, wherein the constructing module further determines, in accordance with the first set of specific point information, whether the trajectory is finished, and if yes, the constructing module transmitting the trajectory to the processing module.
10. The control apparatus of claim 1, wherein the processing module analyzes the trajectory to generate a direction information, a strength information, and a rotation angle information; and the processing module generating the control signal in accordance with the direction information, the strength information, and the rotation angle information.
11. The control apparatus of claim 10, wherein the direction information is an upward information, a downward information, a leftward information, or a rightward information.
12. The control apparatus of claim 10, wherein the rotation angle information is either a clockwise information or a counterclockwise information.
13. The control apparatus of claim 10, wherein the processing module generates the control signal in accordance with the direction information, the strength information, the rotation angle information, and a classifier.
14. A control method for providing a user with the control over an object displayed in a display apparatus by a pointing device, the control method comprising the following steps:
(a) recording a video sequence and generating N image frames, N being a positive integer;
(b) capturing a pointing device image related to the pointing device from each image frame in accordance with a characteristic of the pointing device;
(c) calculating a specific point of the pointing device image of each image frame to generate a first set of specific point information;
(d) generating a trajectory in accordance with a pre-defined criterion and the first set of specific point information;
(e) analyzing the trajectory and generating a control signal; and
(f) controlling the object according to the control signal.
15. The control method of claim 14, wherein the specific point of the pointing device image is a central point of the pointing device image.
16. The control method of claim 14, wherein the first set of specific point information comprises a second specific point and a third specific point adjacent to the second specific point, and the pre-defined criterion is that when the difference between the second specific point and the third specific point is smaller than a pre-defined value, the trajectory comprising the third specific point.
17. The control method of claim 14, wherein step (f) performs three-dimensional control of the object according to the control signal.
18. The control method of claim 14, wherein step (a) further comprises the following step:
(a1) removing a pre-defined noise from the video sequence, and generating N image frames.
19. The control method of claim 18, wherein step (a1) processes the video sequence by blurring/smoothing algorithm to remove the pre-defined noise.
20. The control method of claim 14, wherein the characteristic is the color of the pointing device.
21. The control method of claim 14, wherein step (d) further comprises the following step:
(d1) determining, in accordance with the first set of specific point information, whether the trajectory is finished, and if yes, proceeding step (e).
22. The control method of claim 14, wherein step (e) analyzes the trajectory to generate a direction information, a strength information, and a rotation angle information, and generating the control signal in accordance with the direction information, the strength information, and the rotation angle information.
23. The control method of claim 22, wherein the direction information is an upward information, a downward information, a leftward information, or a rightward information.
24. The control method of claim 22, wherein the rotation angle information is either a clockwise information or a counterclockwise information.
25. The control method of claim 22, wherein step (e) generates the control signal in accordance with the direction information, the strength information, the rotation angle information, and a classifier.
US12/171,599 2007-08-24 2008-07-11 Control apparatus and method Expired - Fee Related US8363011B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
TW96131340A 2007-08-24
CN096131340 2007-08-24
TW096131340A TW200910147A (en) 2007-08-24 2007-08-24 Control apparatus and method

Publications (2)

Publication Number Publication Date
US20090051652A1 true US20090051652A1 (en) 2009-02-26
US8363011B2 US8363011B2 (en) 2013-01-29

Family

ID=40381694

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/171,599 Expired - Fee Related US8363011B2 (en) 2007-08-24 2008-07-11 Control apparatus and method

Country Status (3)

Country Link
US (1) US8363011B2 (en)
JP (1) JP5055156B2 (en)
TW (1) TW200910147A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI476640B (en) 2012-09-28 2015-03-11 Ind Tech Res Inst Smoothing method and apparatus for time data sequences

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010040924A1 (en) * 2000-05-11 2001-11-15 Osamu Hori Object region data describing method and object region data creating apparatus
US20020140667A1 (en) * 2001-04-02 2002-10-03 Toshio Horiki Portable communication terminal, information display device, control input device and control input method
US6501515B1 (en) * 1998-10-13 2002-12-31 Sony Corporation Remote control system
US20030109322A1 (en) * 2001-06-11 2003-06-12 Funk Conley Jack Interactive method and apparatus for tracking and analyzing a golf swing in a limited space with swing position recognition and reinforcement
US6587574B1 (en) * 1999-01-28 2003-07-01 Koninklijke Philips Electronics N.V. System and method for representing trajectories of moving objects for content-based indexing and retrieval of visual animated data
US20040022435A1 (en) * 2002-07-30 2004-02-05 Canon Kabushiki Kaisha Image processing apparatus and method and program storage medium
US6757557B1 (en) * 1992-08-14 2004-06-29 British Telecommunications Position location system
US20040125984A1 (en) * 2002-12-19 2004-07-01 Wataru Ito Object tracking method and object tracking apparatus
US20040141634A1 (en) * 2002-10-25 2004-07-22 Keiichi Yamamoto Hand pattern switch device
US6844871B1 (en) * 1999-11-05 2005-01-18 Microsoft Corporation Method and apparatus for computer input using six degrees of freedom
US6847354B2 (en) * 2000-03-23 2005-01-25 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Three dimensional interactive display
US20060001645A1 (en) * 2004-06-30 2006-01-05 Microsoft Corporation Using a physical object to control an attribute of an interactive display application
US20060040712A1 (en) * 2004-08-23 2006-02-23 Siemens Information And Communication Mobile, Llc Hand-held communication device as pointing device
US20080042981A1 (en) * 2004-03-22 2008-02-21 Itay Katz System and Method for Inputing User Commands to a Processor

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0769185B2 (en) * 1990-09-13 1995-07-26 山武ハネウエル株式会社 Data smoothing method and apparatus
JP2003228458A (en) * 2002-02-05 2003-08-15 Takenaka Komuten Co Ltd Instruction motion recognition device and method
JP2004258837A (en) * 2003-02-25 2004-09-16 Nippon Hoso Kyokai <Nhk> Cursor operation device, method therefor and program therefor
JP4563723B2 (en) * 2004-05-10 2010-10-13 株式会社竹中工務店 Instruction motion recognition device and instruction motion recognition program
JP2007042020A (en) * 2005-08-05 2007-02-15 Nec Corp Portable terminal and program

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110158553A1 (en) * 2009-12-30 2011-06-30 Hon Hai Precision Industry Co., Ltd. Portable device having vibration reduction function and vibration reduction methode thereof
US8165423B2 (en) * 2009-12-30 2012-04-24 Hon Hai Precision Industry Co., Ltd. Portable device having vibration reduction function and vibration reduction methode thereof
TWI416963B (en) * 2009-12-30 2013-11-21 Hon Hai Prec Ind Co Ltd Portable device having vibration reduction function and vibration reduction methode thereof

Also Published As

Publication number Publication date
TW200910147A (en) 2009-03-01
JP5055156B2 (en) 2012-10-24
JP2009054131A (en) 2009-03-12
US8363011B2 (en) 2013-01-29

Similar Documents

Publication Publication Date Title
US11003253B2 (en) Gesture control of gaming applications
US10761612B2 (en) Gesture recognition techniques
CN108885533B (en) Combining virtual reality and augmented reality
KR101247991B1 (en) Camera gestures for user interface control
US7774075B2 (en) Audio-visual three-dimensional input/output
US6594616B2 (en) System and method for providing a mobile input device
JP4768196B2 (en) Apparatus and method for pointing a target by image processing without performing three-dimensional modeling
US20170038850A1 (en) System and method for combining three-dimensional tracking with a three-dimensional display for a user interface
US20110012830A1 (en) Stereo image interaction system
TWI528224B (en) 3d gesture manipulation method and apparatus
US20150058782A1 (en) System and method for creating and interacting with a surface display
TW200941313A (en) Apparatus and methods for a touch user interface using an image sensor
KR20210010930A (en) Method, system and computer program for remote control of a display device via head gestures
JP2017534135A (en) Method for simulating and controlling a virtual ball on a mobile device
WO2020110547A1 (en) Information processing device, information processing method, and program
US8363011B2 (en) Control apparatus and method
Zhang Vision-based interaction with fingers and papers
Sato et al. Video-based tracking of user's motion for augmented desk interface
JP2002259045A (en) Method and device for inputting handwritten data, method and device for inputting movement data, and method and device for authenticating individual
Fan et al. Back-to-Back: A Novel Approach for Real Time 3D Hand Gesture Interaction

Legal Events

Date Code Title Description
AS Assignment

Owner name: NATIONAL CHIAO TUNG UNIVERSITY, TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, CHIA HOANG;LIN, JIAN LIANG;REEL/FRAME:021226/0648

Effective date: 20071210

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20210129