WO2014041548A1 - System and method for controlling behavior of a cursor - Google Patents


Info

Publication number
WO2014041548A1
Authority
WO
WIPO (PCT)
Prior art keywords
cursor
data
electronic device
motion
condition
Prior art date
Application number
PCT/IL2013/050782
Other languages
French (fr)
Inventor
Ori Rimon
Rafi Zachut
Original Assignee
Zrro Technologies (2009) Ltd.
Priority date
Filing date
Publication date
Application filed by Zrro Technologies (2009) Ltd.
Publication of WO2014041548A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention is in the field of computing, more particularly in the field of object tracking devices and pointing devices.
  • One type of such techniques utilizes one or more imagers (cameras) with an appropriate field(s) of view for tracking the object motion at the object location.
  • Another type of the monitoring techniques of the kind specified utilizes proximity sensors for tracking the object's movement in the vicinity of the object.
  • Patent publications WO 2010/084498 and US 2011/0279397 which share the inventors and the assignee of the present patent application, relate to a monitoring unit for use in monitoring a behavior of at least a part of a physical object.
  • the present invention is aimed at a system and a method for controlling the behavior of a cursor so as not to cause the user to suffer from thumb fatigue.
  • an object tracking device comprising a proximity sensor matrix capable of generating measured data relating to the position of an object (e.g. a user's fingertip) in either one or a combination of both of contact and contactless modes.
  • the object tracking device includes a monitoring unit comprising: a data input module configured for receiving measured data indicative of a behavior of at least a part of the physical object in a certain coordinate system associated with a predetermined sensing surface; and a processor configured and operable to be responsive to said measured data for transforming the measured data into an approximate representation of said at least portion of the physical object into a virtual coordinate system such that the transformation maintains a positional relationship between virtual points and corresponding portions within said at least part of the physical object.
  • a fingertip is sensed (and its location in 3D space is calculated) while hovering above the sensor surface and its representation shape (e.g. cursor) on a screen is defined by a function of the fingertip's three-dimensional position with respect to the sensor surface.
  • the user can activate (or drag, rotate, etc.) the virtual object when the fingertip representation reaches the virtual object's location and the fingertip has moved from hovering to touching the proximity sensor matrix. Thus, as long as the finger hovers above the proximity sensor matrix, the virtual object is not activated.
  • a user may need to keep his finger (e.g. thumb) hovering above the sensor until he or she decides to activate a virtual object again. This may lead to "thumb fatigue", i.e. to a fatigue and/or ache in the thumb (and possibly wrist) muscles from keeping the thumb raised above the proximity sensor matrix for long periods.
  • a data flow controller associated with the monitoring unit.
  • a sensor matrix in communication with the monitoring unit generates measured data indicative of the position of an object (e.g. a user's finger).
  • the monitoring unit receives the measured data and uses the measured data to calculate the object's position in a first (real) coordinate system defined by the sensor matrix.
  • the monitoring unit further converts the object's position in the first coordinate system to a representation of the object's position in a second (virtual) coordinate system (e.g. defined by a display), and generates cursor data relating to a position (and optionally to an image) of the object's representation (e.g. cursor).
  • the cursor data can be outputted to an external electronic device for displaying on the electronic device's display.
  • the cursor data may be used by a software utility (application) running on the electronic device to recognize a certain behavior of the cursor, corresponding to certain action defined by the software utility, and execute the certain action.
  • the action may, for example, include activating/manipulating virtual objects on the electronic device's display.
  • the data flow controller is configured for identifying a condition of no-motion of the object or of the object's representation. If the condition of no-motion is maintained for a predetermined time period, the data flow controller consequently prevents the output of at least some cursor data to an electronic device connected to the output of the data flow controller (i.e. deactivates the cursor), or transmits a signal instructing the electronic device to disregard some or all of the operations (e.g. activation or manipulation of a virtual object) associated with the cursor data.
  • When the cursor is deactivated, it either disappears from the display or appears in a different form (e.g., shape, color).
  • the data flow controller operates to prevent the user from performing actions on virtual objects while the sensor matrix continues to generate measured data and possibly the monitoring unit continues tracking the object position. This may be achieved by (i) repressing/filtering out at least some of the cursor data sent to the electronic device, or (ii) instructing the monitoring unit to stop generating at least part of the cursor data, or (iii) instructing the electronic device to disregard some or all of the operations associated with the cursor data, until the cursor is reactivated. In this manner, the behavior of the user's finger cannot be used to activate or manipulate virtual objects.
  • the user may prop his or her thumb on a surface associated with the sensor matrix, and the touch will not be outputted as an instruction to manipulate a virtual object. In this manner, the user may rest his or her thumb rather than keeping it hovering.
  • the data flow controller is configured and operable for recognizing a specific behavior of the finger by which the user instructs the data flow controller to reactivate the cursor.
  • the data flow controller reactivates the cursor by (i) stopping the repression/filtering out of the cursor data, or (ii) instructing the monitoring unit to resume generating all the cursor data, or (iii) ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data.
  • an aspect of some embodiments of the present invention relates to a data flow controller, for controlling data flow from a monitoring unit monitoring a behavior of at least a part of a physical object to an electronic device, the data flow controller comprising a motion recognition module and a control module in communication with the motion recognition module.
  • the motion recognition module is configured for analyzing measured data generated by a sensor matrix and indicative of a behavior of at least a part of a physical object in a first coordinate system associated with a predetermined surface defined by the sensor matrix, and/or for analyzing cursor data generated by the monitoring unit and indicative of an approximate representation of the at least part of the physical object in a second coordinate system. The analysis serves to recognize a condition of no-motion, corresponding to a state in which the at least part of the physical object and/or the representation thereof does not move, and to identify a specific behavior of the at least part of the physical object and/or of the approximate representation.
  • the control module is configured for: (a) preventing an output of at least some of the cursor data to the electronic device and/or instructing the electronic device to disregard some or all of the operations associated with the cursor data, if the condition of no-motion is maintained for a predetermined time period; and (b) re-enabling the output and/or ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data, if the specific behavior is recognized following the predetermined time period.
  • the control module is configured for gradually changing at least one parameter of an image of the approximate representation of the at least part of the object, displayed by a display of the electronic device, over a time interval between the moment at which the condition of no-motion is recognized and a certain time, the time interval being within the predetermined time period.
  • the control module changes the parameter such that the value of the parameter at the certain time is a function of the length of the time interval.
  • the control module is configured for resetting the change in the parameter's value following the recognition of the specific behavior or following an interruption of the condition of no-motion.
  • the parameter may be at least one of an opacity, size, and color of the image of the approximate representation.
  • the parameter may be opacity, and the change of the opacity during the time interval is then a decrease of the opacity.
  • the parameter may be size, and the change of the size during the time interval is then a decrease of the size.
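As a sketch of this gradual change, the parameter's value can be computed as a function of the length of the no-motion interval. The linear profile and all names below are illustrative assumptions; the application only requires the value at a given moment to be a function of the interval's length.

```python
def faded_value(elapsed, t_x, initial=1.0):
    """Illustrative linear fade: the cursor image's opacity (or size)
    decreases with the length `elapsed` of the no-motion interval,
    reaching zero at the predetermined time period t_x.
    The linear profile is an assumption, not taken from the application."""
    if elapsed <= 0:
        return initial
    # Decrease proportionally to elapsed time, clamped at zero.
    return max(0.0, initial * (1.0 - elapsed / t_x))
```

An interruption of the no-motion condition would simply restart the computation with `elapsed = 0`, which models the resetting of the parameter's value described above.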
  • the control module may be configured for causing the image of the approximate representation to disappear between the end of the predetermined time period and a time point at which the specific behavior is recognized, by preventing an output of at least some of the cursor data to the electronic device and/or instructing the electronic device to disregard some or all of the operations associated with the cursor data.
  • control module is further configured for causing the image of the approximate representation to reappear following the time point at which the specific behavior is recognized, by re-enabling the output and/or ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data, if the specific behavior is recognized following the predetermined time period.
  • the control module is configured for detecting the specific behavior by recognizing a touch by the at least part of the object on a certain surface associated with the predetermined surface, and thereafter by recognizing a hover of the at least part of the object over the certain surface.
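The touch-then-hover reactivation behavior can be sketched as a scan over a sequence of sensed heights. The height encoding (0 for touch, positive for hover) and the function name are assumptions for illustration, not taken from the application.

```python
def reactivation_detected(heights):
    """Sketch of the specific reactivation behavior: the object first
    touches the surface and thereafter hovers over it.
    `heights` is an assumed sequence of sensed heights above the
    surface, where 0 denotes touch and a positive value denotes hover."""
    touched = False
    for h in heights:
        if h == 0:
            touched = True       # touch of the surface recognized
        elif touched and h > 0:
            return True          # hover following a touch: reactivate the cursor
    return False
```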
  • the condition of no-motion is defined as a condition in which the distance by which the at least part of the object moves does not exceed a first threshold distance within the first coordinate system, and/or as a condition in which the distance by which the representation moves does not exceed a second threshold distance within the second coordinate system.
  • the monitoring unit comprises a transformation module configured and operable to receive measured data indicative of the behavior in a first coordinate system associated with a predetermined surface defined by a sensor matrix, and be responsive to said measured data for transforming the measured data into cursor data indicative of an approximate representation of said at least portion of the physical object in a second coordinate system, thereby enabling an output of the cursor data to a desired electronic device.
  • the data flow controller comprises a motion recognition module and a control module.
  • the motion recognition module is configured for analyzing the measured data and/or the cursor data to recognize a condition of no-motion corresponding to a state in which the at least part of the physical object and/or of the representation thereof does not move, and to recognize a specific behavior of the at least part of the object.
  • the control module is configured for: (a) preventing an output of at least some of the cursor data to the electronic device and/or instructing the electronic device to disregard some or all of the operations associated with the cursor data, if the condition of no-motion is maintained for a predetermined time period; and (b) re-enabling the output and/or ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data, if the specific behavior is recognized following the predetermined time period.
  • the system comprises the sensor matrix.
  • the sensor matrix is a capacitive sensor array defining a sensing surface.
  • a further aspect of some embodiments of the present invention relates to a method for controlling data flow between a monitoring unit and an electronic device, the monitoring unit being configured for transforming measured data indicative of a behavior of at least part of an object in a first coordinate system associated with a predetermined surface defined by a sensor matrix into cursor data indicative of an approximate representation of said at least portion of the physical object in a second coordinate system, the method comprising:
  • the condition of no-motion is defined as a condition in which the distance by which the at least part of the object moves does not exceed a first threshold distance within the first coordinate system, and/or as a condition in which the distance by which the cursor moves does not exceed a second threshold distance within the second coordinate system.
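The distance-threshold form of the no-motion condition can be sketched directly as a comparison of the Euclidean distance between two successive positions against a threshold. The function name and the use of Euclidean distance are illustrative assumptions; `math.dist` requires Python 3.8 or later.

```python
import math

def no_motion(prev_pos, curr_pos, threshold):
    """No-motion condition: the distance moved between two samples does
    not exceed the threshold, leaving a margin for tremor and other
    slight involuntary motions. Works for 2D (x, y) positions in the
    first coordinate system and 3D (x, y, z) positions alike."""
    return math.dist(prev_pos, curr_pos) <= threshold
```

The same check applies in the second (virtual) coordinate system with the cursor's positions and the second threshold distance.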
  • the method comprises - while the condition of no-motion is maintained within a time interval inside the predetermined threshold time period - gradually changing at least one parameter of an image of the cursor configured for being displayed by a display of the electronic device.
  • the changing of the parameter may comprise changing a value of the parameter, such that the value of the parameter is a function of the length of the time interval.
  • the method comprises - following the identification of the specific behavior or an interruption of the condition of no-motion - resetting the parameter's value change.
  • the parameter comprises at least one of an opacity, a size, and a color of the image of the cursor.
  • the change of the opacity during the time interval may be a decrease of the opacity.
  • the change of the size during the time interval may be a decrease of the size.
  • preventing an output of at least some of the cursor data to the electronic device and/or instructing the electronic device to disregard some or all of the operations associated with the cursor data causes the image of the cursor to disappear and/or causes the operations associated with the cursor data to be disabled, between the end of the threshold period and a time point at which the specific behavior is recognized.
  • re-enabling the output and/or ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data causes the cursor's image to reappear and/or causes the operations associated with the cursor data to be enabled following the time point at which the specific behavior is recognized.
  • Yet another aspect of some embodiments of the present invention relates to a computing system including at least one processor, a memory operatively coupled to the processor, and a software utility that can execute in the processor from the memory and that, when executed by the processor from the memory, causes the computing system to perform the steps of the above-described method.
  • Yet a further aspect of some embodiments of the present invention relates to a medium readable by a computing system and is useful in conjunction with the computing system, the medium storing a software utility which is configured to cause the computing system to perform the steps of the above-described method.
  • Fig. 1 is a block diagram illustrating a system including a data flow controller associated with a monitoring unit for controlling a behavior of a cursor, according to some embodiments of the present invention
  • Fig. 2 is a flowchart illustrating a method for controlling the activation and deactivation of a cursor, according to some embodiments of the present invention
  • Fig. 3 is a flowchart illustrating an example of the method illustrated in Fig. 2;
  • Fig. 4 is a flowchart illustrating a specific example of the implementation of the behavior recognition step of the method illustrated in Fig. 3;
  • Figs. 5a-5d are schematic drawings exemplifying some embodiments of the present invention, in which the cursor gradually fades before being deactivated;
  • Figs. 6a-6d and 7a-7d are schematic drawings exemplifying some embodiments of the present invention, in which the cursor is gradually reduced in size before being deactivated;
  • Figs. 8a and 8b are schematic drawings exemplifying some embodiments of the present invention, in which the cursor changes its color upon being deactivated;
  • Figs. 9a-9b are schematic drawings exemplifying a behavior for reactivating the cursor, according to some embodiments of the present invention.
  • FIG. 1 a block diagram is provided illustrating a system including a data flow controller associated with a monitoring unit for controlling a behavior of a cursor, according to some embodiments of the present invention.
  • the system 100 is configured for tracking the motion of an object or a part thereof with respect to a surface defined by a sensor matrix 108 and for controlling a behavior of an approximate representation of the object or of a part thereof.
  • the system 100 includes a monitoring unit 102 and a data flow controller 104 in wired or wireless communication with each other.
  • the monitoring unit 102 includes a transformation module 102a, configured for receiving measured data 106 indicative of a behavior of an object or a part of an object in a first coordinate system defined by or associated with the sensing surface, and configured for transforming the measured data 106 into cursor data 110 indicative of an approximate representation of the object (or part of the object) in a second (virtual) coordinate system.
  • the second coordinate system may be, for example, a display of an electronic device (such as a computer, a television, etc).
  • the sensing matrix 108 may be part of system 100 or may be an external unit in wired or wireless communication with the object tracking system 100.
  • the cursor data 110 is meant to be transmitted in a wired or wireless fashion to the electronic device, which may be a remote device or a device integral with system 100, so as to enable the electronic device to display an image of the cursor on the electronic device's display and move the image in the display's virtual coordinate system.
  • the cursor data 110 may be directly fed to the electronic device's display, or may need a formatting/processing within the electronic device before being readable by the display.
  • the cursor data may be used by a software utility (application) running on the electronic device to recognize a certain behavior of the cursor, corresponding to certain action defined by the software utility, and execute the certain action.
  • the action may, for example, include activating/manipulating virtual objects on the electronic device's display.
  • the cursor data 110 is transmitted in a wired or wireless fashion to a motion recognition module 104a of the data flow controller 104.
  • the data flow controller 104 is in direct (wired or wireless) communication with the sensor matrix 108, and the motion recognition module 104a receives the measured data 106.
  • the cursor data 110 and/or the measured data 106 are analyzed by the motion recognition module 104a.
  • the motion recognition module 104a is configured and operable for recognizing a no-motion condition, in which the object does not move with respect to the first (real) coordinate system, or the representation of the object does not move with respect to the second (virtual) coordinate system.
  • the motion recognition module 104a is configured for recognizing a specific behavior of the object by analyzing the measured data. It should be noted that the no-motion condition does not necessarily correspond to a condition in which the object or representation is completely stationary. In fact, the no-motion condition can be defined as a condition in which the distance by which the object and/or the object's representation moves does not exceed a certain threshold distance within the respective coordinate system. Thus, an error margin can be allowed for slight involuntary motions, such as a tremor or motion generated by blood flow within the user's fingers.
  • the no-motion condition may be recognized when a difference between the measured data of two successive readings from the sensor matrix 108 is below a predefined threshold.
  • the measured data may be in the form of a signal intensity map in the first coordinate system, while transforming comprises processing the signal intensity map.
  • the sensor matrix, together with its built-in or coupled processor, utilizes appropriate processing utilities (hardware and algorithms) to obtain the proximity/height measurement or map image, as described in WO 2010/084498 and US 2011/0279397, which share the inventors and the assignee of the present patent application.
  • a function describing the dependence of the signal magnitude or potential on the distance/proximity from the sensing surface can generally be known, e.g. determined in a calibration procedure.
  • the difference between the measured data of two successive readings might be estimated as the sum of absolute magnitude differences between successive signals generated by each sensor matrix element.
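This estimate can be sketched in a few lines: the frames are treated as flat sequences of per-element signal magnitudes, and no-motion is recognized when their summed absolute difference stays below a predefined threshold. Function and parameter names are illustrative assumptions.

```python
def frame_difference(prev_frame, curr_frame):
    """Estimate the difference between two successive sensor-matrix
    readings as the sum of absolute magnitude differences between the
    signals generated by each sensor matrix element."""
    return sum(abs(a - b) for a, b in zip(prev_frame, curr_frame))

def no_motion_frames(prev_frame, curr_frame, threshold):
    """The no-motion condition is recognized when the difference between
    two successive readings is below the predefined threshold."""
    return frame_difference(prev_frame, curr_frame) < threshold
```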
  • the data flow controller's control module 104b starts to measure time.
  • the time measurement is performed by a timer, which is an independent application running on the data flow controller 104.
  • time measurement is performed by hardware which incrementally counts the number of cycles in which the no-motion condition is maintained. If the no-motion condition is maintained for a predetermined time period (hereafter referred to as tx), then the control module 104b deactivates the cursor.
  • This may be done by (i) preventing the output of at least some of the cursor data to reach the electronic device, or (ii) instructing the monitoring unit to cease generating or outputting the cursor data 110, or (iii) by generating a signal instructing the electronic device to disregard some or all of the operations associated with the cursor data.
  • the image of the cursor may disappear, and the functionality of the cursor (e.g. the targeting and/or manipulation of a virtual object on the display) will be reduced or even removed until the cursor is re-enabled.
  • this allows the user to rest his/her finger on the surface defined by the sensor matrix 108 without performing an undesired action, such as activating or manipulating a virtual object. If the no-motion condition is interrupted before the end of the predetermined time period tx, then the control module 104b resets the timer.
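The control module's timing behavior can be sketched as a small state machine: the timer starts when the no-motion condition is first recognized, resets whenever motion interrupts the condition, and deactivates the cursor once the condition has been maintained for the period tx. Class and method names are illustrative; the current time is passed in explicitly rather than read from a clock.

```python
class DeactivationTimer:
    """Illustrative sketch of the control module's timing logic
    (names are assumptions, not taken from the application)."""

    def __init__(self, t_x):
        self.t_x = t_x                # predetermined no-motion period
        self.start = None             # time at which no-motion began
        self.cursor_active = True

    def update(self, no_motion, now):
        """Feed one sample; `now` is the current time in seconds."""
        if not no_motion:
            self.start = None                  # condition interrupted: reset the timer
        elif self.start is None:
            self.start = now                   # condition recognized: start measuring
        elif now - self.start >= self.t_x:
            self.cursor_active = False         # maintained for t_x: deactivate cursor
        return self.cursor_active
```

Reactivation is deliberately not handled here, since the application reactivates the cursor only upon the specific behavior (e.g. touch then hover), not upon resumed motion.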
  • the control module 104b re-enables the cursor, for example by enabling the output of the cursor data to the electronic device or by withdrawing the signal instructing the electronic device to disregard some or all of the operations associated with the cursor data.
  • the control module 104b prevents the output of at least some of the cursor data to the electronic device by sending a signal 112 to the monitoring unit 102, instructing the monitoring unit 102 not to generate cursor data or to generate only partial cursor data until instructed otherwise.
  • the output 114 to the electronic device contains either none of or only a portion of the cursor data 110.
  • the motion recognition module 104a cannot use the cursor data 110 to recognize the specific behavior of the object for re-enabling the cursor.
  • the motion recognition module 104a processes the measured data 106 to detect the specific behavior of the object.
  • the configuration in which the monitoring unit 102 ceases or lessens the generation and/or transmission of cursor data may be an energy-saving configuration, since the energy consumption needed by the monitoring unit 102 to process the measured data and/or transmit the cursor data is reduced or even eliminated.
  • control module 104b prevents the output of at least some of the cursor data to the electronic device by filtering the cursor data, so that the output 114 to the electronic device contains only a portion or even none of the cursor data.
  • control module 104b allows the cursor data to be outputted to the electronic device but generates a signal included in the output 114, to instruct the electronic device to disregard some or all of the operations associated with the cursor data. If the control module 104b instructs the electronic device to ignore all of the cursor data, then the cursor's image on the display will disappear and the user will not be able to convey commands to the electronic device via gestures or behavior of the object.
  • control module 104b instructs the electronic device to disregard some or all of the operations associated with the cursor data, while the image of the cursor may still be displayed (for example in a different color). In such a case, the user will still not be able to convey commands to the electronic device via gestures or behavior of the object.
  • This last feature may be used in the case in which - even when the cursor is inactive - it is desirable to display the cursor's image, while suppressing the cursor's functionality to activate or manipulate virtual objects.
  • the data flow controller may effect a change in a parameter of the cursor's image in order to notify the user of the upcoming deactivation.
  • the data indicative of the cursor's image is generated by the monitoring unit.
  • the change in the parameter of the cursor's image may be effected directly by the control module, by altering the data indicative of the cursor's image.
  • the electronic device generates the data indicative of the cursor's image in response to the received cursor data. In this case, the control module transmits a signal to the electronic device, to instruct the electronic device to change the parameter of the cursor's image.
  • the parameter of the cursor's image is then reset when the cursor is reactivated, or in case the no-motion condition is interrupted before the end of the predetermined threshold time period.
  • the monitoring unit 102 and the data flow controller 104 may be physically separate units in wired or wireless communication with each other and having dedicated circuitry for performing their required actions.
  • the monitoring unit 102 and its module and the data flow controller 104 and its module are functional elements of a software package configured for being implemented on one or more electronic circuits (e.g. processors).
  • the monitoring unit 102 and its module and the data flow controller 104 and its module may include some electronic circuits dedicated to individual elements (units or modules), some common electronic circuits for some or all the elements, and some software utilities configured for operating the dedicated and common circuits for performing the required actions.
  • the monitoring unit 102 and its module and the data flow controller 104 and its module may perform their actions only via hardware elements, such as logic circuits, as known in the art.
  • the sensor matrix 108 is able to detect only a two dimensional position of the object.
  • a two-dimensional sensor matrix may be, for example, a touch screen with no hover capabilities.
  • the condition of no-motion corresponds to a substantial absence of motion on the two-dimensional surface defined by the sensor matrix 108.
  • the sensor matrix 108 can be any proximity sensor matrix, i.e. a sensor matrix which is able to detect a location of the object in three dimensions, and thus to distinguish a hover mode (where the object hovers above a predefined surface) and a touch mode (where the object touches the predefined surface).
  • the proximity sensor matrix may include a capacitive sensor array, as described in WO 2010/084498 and US 2011/0279397 of the assignee of the present invention, or an optical sensor array as known in the art. If the sensor matrix is a proximity sensor matrix, the condition of no-motion may correspond to a substantial absence of motion within a three-dimensional space within which the sensor matrix can detect the object.
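The distinction between touch mode and hover mode can be sketched as a simple classification of the sensed height above the predefined surface. The threshold name and default value are illustrative calibration assumptions.

```python
def sensing_mode(height, contact_threshold=0.0):
    """For a proximity sensor matrix: classify the object's state as
    touch mode (object on the predefined surface) or hover mode
    (object above it), from the sensed height above that surface."""
    return "touch" if height <= contact_threshold else "hover"
```

In practice a small positive `contact_threshold` would absorb sensor noise near the surface, rather than requiring an exact zero reading for touch.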
  • the sensor matrix may be a single-touch matrix or a multi-touch matrix.
  • a single-touch matrix allows detection of only one object (or a section thereof) at a given time.
  • the monitoring unit can track only one object (or part thereof) at a given time.
  • a multi-touch matrix allows detection of multiple objects simultaneously, and the monitoring unit can track multiple objects (or multiple parts of an object) at a given time, as described, for example in WO 2010/084498 and US 2011/0279397 of the assignee of the present invention. It should be noted that the structure and operation of the sensor matrix are unimportant for the purposes of the present invention.
  • the touching of the surface defined by the sensor matrix is equivalent to the touching of a second surface associated with the first surface defined by the sensor matrix.
  • the first sensing surface may be protected by a cover representing the second surface, to prevent the object from touching directly the sensing surface.
  • the object can only touch the outer surface of the protective cover.
  • the outer surface of the protective cover is thus a second surface associated with the surface defined by the sensor matrix.
  • a flowchart 200 illustrates a method for controlling the activation and deactivation of a cursor, according to some embodiments of the present invention.
  • the method of the flowchart 200 may be seen as a mode of operation of the data flow controller 104 of Fig. 1.
  • the motion of the object is monitored.
  • the monitoring includes the reception and analysis of data indicative of the object behavior in a certain first coordinate system and/or data indicative of the behavior of a representation of the object (cursor) in a second virtual coordinate system.
  • the data indicative of the object's behavior may be measured data generated by a sensor matrix
  • the data indicative of the behavior of a representation of the object may be the cursor data generated by a monitoring unit in response to the measured data.
  • the monitoring may be performed at a certain rate independently of the process illustrated by the flowchart 200 and while the process illustrated by the flowchart 200 is operated.
  • a check is performed to determine whether the object has moved a distance larger than a certain threshold, so as to recognize a no-motion condition.
  • the check is performed by determining whether the object and/or the object's representation substantially do not move relative to their respective coordinate systems. If the object and/or the object's representation are motionless, a timer is incremented at 306. The timer keeps incrementing as long as the no-motion condition is maintained. If the object has moved, the motion check of 304 is repeated until a condition of no-motion is identified in 304.
  • a parameter of an image of the cursor is changed or gradually changed as time passes after the no-motion condition was first recognized.
  • the changing of the parameter may start as soon as the timer starts counting, or after a certain time interval shorter than the threshold time period tx.
  • the parameter changes more and more as the time period measured by the timer grows.
  • the parameter may be the cursor's opacity, size, or color (Figs. 5a-5d, 6a-6d, 7a-7d, 8a-8b).
  • a check is performed to determine whether the time measured by the timer has reached a predetermined threshold time period tx. If the result is negative, a new check is performed at 312 to determine whether the object and/or the object's representation have moved. If motion was detected, optionally, the parameter change of the cursor is reset to its appropriate level, which may depend on its position, at 314. At 316, the timer is reset, and the process is brought back to the first motion check 304.
  • the cursor is deactivated at 318.
  • the deactivation of the cursor may involve preventing the user from performing actions on virtual objects.
  • its image disappears from the display of the electronic device or changes appearance (e.g. color).
  • the deactivation of the cursor may be implemented in various ways. For example, at least some of the cursor data may be filtered out, the monitoring module may be instructed to cease or lessen the generation/output of cursor data, or the electronic device may be instructed to disregard some or all of the operations associated with the cursor data. Following the cursor's deactivation, the optional change in the parameter of the cursor' s image may stop.
  • a check is made to determine if a reactivation behavior of the object (user's finger) has been identified. If no reactivation behavior is identified, the cursor remains inactive and the check is performed until the reactivation behavior is identified. An example of the reactivation behavior will be discussed below in the descriptions of Figs. 4, 9a-9b. If the reactivation behavior is positively identified, the cursor is reactivated at 322, enabling the user to perform actions on virtual objects on the display. Subsequently, the parameter change of the cursor's image is optionally reset to its appropriate level at 314, the timer is reset at 316, and the process restarts with a motion check at 304.
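The deactivation/reactivation loop of flowchart 200 can be sketched in code. This is an illustrative sketch only: the class name, the thresholds, and the sampling interface are assumptions, not taken from the patent.

```python
class CursorController:
    """Sketch of the flowchart-200 logic: deactivate the cursor after a
    sustained no-motion condition, reactivate on a specific gesture."""

    def __init__(self, threshold_s=2.0, min_move=3.0):
        self.threshold_s = threshold_s   # tx: no-motion period before deactivation
        self.min_move = min_move         # distance below which motion is ignored
        self.idle_s = 0.0                # timer incremented while motionless
        self.active = True
        self._last_pos = None

    def on_sample(self, pos, dt):
        """Feed one cursor-data sample (x, y) taken dt seconds after the last."""
        if not self.active:
            return
        if self._last_pos is None:
            self._last_pos = pos
            return
        moved = sum((a - b) ** 2 for a, b in zip(pos, self._last_pos)) ** 0.5
        if moved > self.min_move:        # motion detected: reset (steps 314/316)
            self.idle_s = 0.0
            self._last_pos = pos
        else:                            # no-motion maintained: increment (step 306)
            self.idle_s += dt
            if self.idle_s >= self.threshold_s:
                self.active = False      # step 318: deactivate the cursor

    def on_reactivation_gesture(self):
        """Step 322: reactivate once the predefined gesture is recognized."""
        self.active = True
        self.idle_s = 0.0
```

Note that the distance is measured against the position where no-motion was last anchored, so small jitter below `min_move` does not restart the timer.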
  • a new object detected by the sensor matrix and monitored by the monitoring unit gets a new cursor, for which the timer and the change of the cursor's parameter are reset.
  • the new object may be an additional object (if the sensor matrix is a multi- touch sensor matrix) or the same object (if the object's detection was interrupted and then restored).
  • each cursor corresponding to a respective object may be independent of the activation/deactivation process of other cursors, or may be dependent on them.
  • when a first cursor corresponding to a first object (e.g. a fingertip) is activated, at least one other cursor is deactivated, and vice versa. This is needed, for example, in the case in which a supporting thumb that should be ignored touches the sensing surface (or a second surface associated therewith), while the other thumb targets and activates virtual objects.
  • the deactivation process of one cursor affects the deactivation process of all other cursors, and the reactivation of one cursor causes the reactivation of all other cursors. This is needed, for example, in a typing scenario: movement of one thumb should reset the timer of the "no-motion" condition of the other thumb.
  • a flowchart 300 illustrates a more specific non-limiting example of a possible implementation of the method illustrated in Fig. 2. This flowchart may be implemented via hardware logic circuits and/or via software.
  • the flowchart 300 is self-evident and need not be explained, since its steps are the same as the steps illustrated in the flowchart 200, but implemented in a slightly different order.
  • steps of the method illustrated by the flowcharts 200 and 300 of Figs. 2 and 3 may be steps configured for being performed by one or more processors operating under the instruction of software readable by a system which includes the processor.
  • the steps of the method illustrated by the flowcharts 200 and 300 of Figs. 2 and 3 may be steps configured for being performed by a computing system having dedicated logic circuits designed to carry out the above method without software instruction.
  • the scope of the present invention extends to a computing system including at least one processor, a memory operatively coupled to the processor, and a software utility that can execute in the processor from the memory and that, when executed by the processor from the memory, causes the computing system to perform the steps described in the flowcharts 200 and 300.
  • the computing system may also include a digital circuit constructed from logic gates in the form of integrated circuits or programmable logic devices.
  • the reactivation behavior includes touching the sensor surface and then hovering above the sensor surface.
  • This behavior is a natural behavior of a user, since the user tends to rest his/her finger while the cursor is inactive (thus, the touch), and simply needs to raise the finger to reactivate the cursor (thus, the hover).
  • if the sensor matrix is a proximity sensor matrix, the user can keep his/her finger hovering in order to navigate (move the cursor's image). If the sensor matrix is a two-dimensional sensor matrix, then after deactivation, a touch of the finger will be ignored until the finger is raised (and a break of contact is sensed). When the finger re-touches the surface, it will not be ignored.
  • the identification of touch could be done via the cursor data 110 (Z axis is 0) or via the measured data 106. In the latter case, and if the sensor matrix is a capacitive proximity sensor matrix, touch is identified if a sensor matrix element sends a signal magnitude corresponding to touch.
  • the identification of hover (or no-touch) could be done via the cursor data 110 (Z axis is bigger than 0) or via the measured data 106.
  • in the latter case, and if the sensor matrix is a capacitive proximity sensor matrix, hover is identified if no sensor matrix element sends a signal magnitude corresponding to touch.
  • the behavior recognition step 320 includes two stages. In the first stage 330, a check is made to determine whether the sensor matrix has detected a touch of the object. If no touch is sensed, the cursor remains deactivated and the check is performed again. If the touch is sensed, progress to the second stage 332 is made. At 332, a check is made to determine whether the sensor matrix has detected the object hovering above it (corresponding to a break of detection, in the case of a two-dimensional sensor matrix). If no hover is sensed, the cursor remains deactivated and the check at 332 is performed again. If hover is sensed, the behavior recognition is complete and the cursor is reactivated.
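The two-stage recognition of steps 330 and 332 can be sketched as a small state machine. The stage names and the z-based touch/hover encoding are illustrative assumptions.

```python
class ReactivationDetector:
    """Two-stage recognizer for the touch-then-hover reactivation gesture
    (steps 330 and 332 of Fig. 4)."""

    def __init__(self):
        self.stage = "await_touch"

    def feed(self, z):
        """z is the sensed height above the surface: z == 0 means touch,
        z > 0 means hover (or, for a 2-D matrix, a break of detection).
        Returns True once the complete gesture has been recognized."""
        if self.stage == "await_touch" and z == 0:
            self.stage = "await_hover"       # stage 330 satisfied: touch sensed
        elif self.stage == "await_hover" and z > 0:
            self.stage = "done"              # stage 332 satisfied: reactivate
        return self.stage == "done"
```

A hover sample arriving before any touch leaves the detector in its first stage, matching the flowchart's requirement that touch must precede the hover.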
  • in Figs. 5a-5d, schematic drawings are provided to exemplify some embodiments of the present invention, in which the cursor gradually fades before being deactivated.
  • These figures illustrate an example in which the opacity of the cursor's image is the parameter that changes gradually as time passes, during the period between the identification of the no-motion condition and the deactivation of the cursor.
  • the object (e.g. a finger 400) hovers above the sensing surface 402 of a proximity sensor matrix at a location A.
  • the first coordinates of the fingertip relative to the sensing surface 402 are XA, YA, ZA.
  • the first fingertip's coordinates XA and YA are converted to the second (virtual) coordinates X'A and Y'A.
  • the coordinate ZA is converted to a certain parameter, such as the size of the cursor 406 (e.g. the size increases as the distance ZA increases) and/or the opacity of the cursor 406 (e.g. the cursor becomes more opaque as the distance ZA decreases).
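The coordinate conversion just described can be illustrated with a short sketch. The linear sensor-to-display scaling and the particular z-to-size and z-to-opacity functions are illustrative assumptions; the patent only requires that these parameters be functions of ZA.

```python
def to_cursor(x, y, z, sensor_wh, display_wh, z_max=50.0):
    """Map a fingertip position measured over the sensor surface (first
    coordinate system) to a cursor on the display (second, virtual one)."""
    sw, sh = sensor_wh
    dw, dh = display_wh
    x_v = x * dw / sw                       # X'A as a function of XA
    y_v = y * dh / sh                       # Y'A as a function of YA
    z = min(max(z, 0.0), z_max)             # clamp hover height to sensing range
    size = 10.0 + 20.0 * z / z_max          # size grows as ZA increases
    opacity = 1.0 - 0.7 * z / z_max         # more opaque as ZA decreases
    return x_v, y_v, size, opacity
```

At touch (z = 0) the sketch yields the smallest, fully opaque cursor; at the top of the sensing range it yields the largest, most transparent one.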
  • the no-motion condition has been identified, and the timer was started.
  • the fingertip's no-motion condition is maintained, thus the time interval measured by the timer has grown, and the cursor's opacity has decreased as a function of the measured time interval.
  • the fingertip's no-motion condition is maintained, and the cursor's opacity has further decreased.
  • the time interval measured by the timer has reached the threshold value (tx in Figs. 2 and 3 above), the cursor's opacity has reached 0, and thus the cursor is no longer visible.
  • the disappearance of the cursor's image from the display 404 informs the user that the cursor has been deactivated.
  • the opacity of the cursor is a decreasing function of the length of the measured time interval. In a non-limiting embodiment of the present invention, the opacity of the cursor is inversely proportional to the length of the measured time interval.
  • the cursor's opacity may be a parameter that changes as a function of the coordinate ZA. If this is the case, the decrease increments of the cursor's opacity during the time in which the no-motion condition is maintained may be calculated as fractions of the original opacity of the cursor at the time point at which the no-motion condition was identified. To illustrate this point, consider a cursor which has an opacity of 70% due to the value of ZA at the time point at which the no-motion condition was identified; the opacity of the cursor is then determined as inversely proportional to the length of the measured time interval.
  • opacity of 100% refers to a condition of complete opacity
  • opacity of 0 refers to a condition of complete transparency.
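As a concrete illustration of the fading scheme, the helper below decreases the opacity from its ZA-dependent starting value to full transparency at the threshold time. The linear ramp is one possible decreasing function, chosen for the sketch; the text above only requires a monotonic decrease.

```python
def faded_opacity(base_opacity, elapsed, t_x):
    """Opacity during the no-motion interval: decreases linearly from the
    opacity the cursor had when no-motion was first identified (itself a
    function of ZA, e.g. 0.7 in the 70% example) down to 0 at time t_x."""
    if elapsed >= t_x:
        return 0.0                            # cursor fully transparent: deactivated
    return base_opacity * (1.0 - elapsed / t_x)
```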
  • in Figs. 6a-6d and 7a-7d, schematic drawings are provided to exemplify some embodiments of the present invention, in which the cursor is gradually reduced in size before being deactivated.
  • the cursor 406 is circular and has a radius RA whose value may change as a function of the coordinate ZA of the tip of the finger 400.
  • the condition of no-motion is detected/identified, and the radius has a certain value RA(t0) (Fig. 6a).
  • the condition of no-motion is maintained, and the extent of the radius RA gradually and monotonically decreases to RA(t1) and RA(t2) as a function of time (Figs. 6b and 6c).
  • when the threshold time tx is reached (Fig. 6d), the radius RA has decreased to 0, and the cursor has disappeared from the display 404.
  • the radius of the cursor decreases from an original value RA(t0) which optionally depends on the coordinate ZA, and shrinks to zero when the threshold time tx is reached.
  • the cursor's radius decreases monotonically as a function of time.
  • the extent of the cursor's radius is inversely proportional to time.
  • the square of the cursor's radius is inversely proportional to time, so that the area of the cursor is inversely proportional to time.
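The shrinking-radius variants above can be sketched with one helper. Here a linear ramp to zero at the threshold stands in for the monotonic decrease, and an `area_linear` flag makes the area (radius squared) rather than the radius decrease linearly; both ramps are illustrative choices, not the patent's prescribed functions.

```python
def shrink_radius(r0, elapsed, t_x, area_linear=False):
    """Cursor radius while the no-motion condition holds: shrinks
    monotonically from its initial value RA(t0) to 0 at the threshold t_x.
    With area_linear=True the *area* decreases linearly instead."""
    if elapsed >= t_x:
        return 0.0                            # cursor vanished: deactivated
    frac = 1.0 - elapsed / t_x
    return r0 * (frac ** 0.5 if area_linear else frac)
```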
  • Figs. 7a-7d are similar to Figs. 6a-6d. However, rather than decreasing the cursor's radius with time, slices of the cursor 406 are removed as time passes. At the time point in which the condition of no-motion is detected, the cursor has a certain size. The size of the cursor decreases by deleting slices of the cursor, such that the remaining portion of the cursor 406 grows smaller with time. At the threshold time tx, all of the cursor's slices have been removed and the cursor no longer appears on the display 404.
  • in Figs. 8a and 8b, schematic drawings are provided to exemplify some embodiments of the present invention, in which the cursor changes its color upon being deactivated.
  • These figures illustrate an example in which the color of the cursor's image is the parameter that changes gradually during the time period between the identification of the no-motion condition and the deactivation of the cursor.
  • a white cursor 406 indicates that the cursor is active. After a condition of no-motion has been detected, the cursor's color gradually changes to black (for example, through successive shades of gray, each darker than the previous). When the threshold time tx is reached, the cursor becomes black (Fig. 8b) to indicate that the cursor has been deactivated.
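The white-to-black transition can be sketched as a grayscale interpolation. Linear interpolation over the timer interval is an illustrative choice for passing through successively darker shades of gray.

```python
def fade_color(elapsed, t_x):
    """Cursor color as an (r, g, b) triple: white while active, darkening
    through shades of gray to black at the threshold time t_x."""
    frac = min(max(elapsed / t_x, 0.0), 1.0)  # 0 at no-motion onset, 1 at t_x
    level = round(255 * (1.0 - frac))
    return (level, level, level)
```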
  • in Figs. 9a-9b, schematic drawings exemplify a behavior of a user's finger which reactivates the cursor, according to some embodiments of the present invention.
  • the finger's reactivating behavior includes touching the surface 402 of the sensor matrix with the object (finger 400) and raising the object (finger 400) to hover above the surface 402.
  • the user has placed his/her finger 400 on the surface 402 of the sensor matrix.
  • this touch occurred after the cursor's deactivation.
  • the touch may have occurred before deactivation, and been maintained after deactivation.
  • the sensor matrix has therefore detected a touch (see step 330 in Fig. 4a). Later, in order to reactivate the cursor, the user raises his/her finger to hover above the surface 402 at a location A.
  • the sensor matrix has thus detected a hover (or in the case of a two-dimensional sensor matrix a break of contact) as illustrated in step 332 of Fig. 4, and has thus detected the complete reactivating behavior.
  • the data flow controller 104 of Fig. 1 can identify the behavior and reactivate the cursor.
  • the reactivated cursor 406 has second virtual coordinates X'A and Y'A on the display 404, where X'A and Y'A are respectively functions of the first coordinates XA and YA defined by the sensor matrix's surface 402.
  • the coordinate Z (if measured) may be converted to a value representing a parameter (e.g. size or opacity) of the cursor 406.
  • the finger's (or object's in general) reactivating behavior may be any predetermined behavior or gesture.
  • the reactivating behavior may include a double tap on the surface defined by the sensor matrix.


Abstract

A data flow controller is presented for controlling data flow from a monitoring unit monitoring a behavior of at least a part of a physical object to an electronic device. The data flow controller comprises a motion recognition module configured for communication with the monitoring unit, and a control module in communication with the motion recognition module. The motion recognition module is configured for receiving and analyzing measured data indicative of a behavior of at least a part of a physical object in a first coordinate system associated with a predetermined sensor matrix, and/or cursor data indicative of an approximate representation of at least a part of the physical object in a second coordinate system.

Description

SYSTEM AND METHOD FOR CONTROLLING BEHAVIOR OF A CURSOR
TECHNOLOGICAL FIELD
The present invention is in the field of computing, more particularly in the field of object tracking devices and pointing devices.
BACKGROUND
Various techniques have been developed for monitoring the behavior of an object. One type of such techniques utilizes one or more imagers (cameras) with an appropriate field(s) of view for tracking the object motion at the object location. Another type of the monitoring techniques of the kind specified utilizes proximity sensors for tracking the object's movement in the vicinity of the object.
Patent publications WO 2010/084498 and US 2011/0279397, which share the inventors and the assignee of the present patent application, relate to a monitoring unit for use in monitoring a behavior of at least a part of a physical object.
GENERAL DESCRIPTION
The present invention is aimed at a system and a method for controlling the behavior of a cursor so as not to cause the user to suffer from thumb fatigue.
In some embodiments of patent publications WO 2010/084498 and US 2011/0279397, an object tracking device is presented, comprising a proximity sensor matrix capable of generating measured data relating to the position of an object (e.g. a user's fingertip) in either one or a combination of both contact and contactless modes. The object tracking device includes a monitoring unit comprising: a data input module configured for receiving measured data indicative of a behavior of at least a part of the physical object in a certain coordinate system associated with a predetermined sensing surface; and a processor configured and operable to be responsive to said measured data for transforming the measured data into an approximate representation of said at least a portion of the physical object in a virtual coordinate system, such that the transformation maintains a positional relationship between virtual points and corresponding portions within said at least part of the physical object. Optionally, a fingertip is sensed (and its location in 3D space is calculated) while hovering above the sensor surface, and its representation shape (e.g. cursor) on a screen is defined by a function of the fingertip's three-dimensional position with respect to the sensor surface. This allows the user to target a virtual object on the screen without activating the virtual object, by moving the fingertip in hover mode. The user can activate (or drag, rotate, etc.) the virtual object when the fingertip representation reaches the virtual object's location and the fingertip has moved from hover to touch the proximity sensor matrix. Thus, as long as the finger hovers above the proximity sensor matrix, the virtual object is not activated.
Using a conventional approach, if a user does not wish to activate a virtual object and stays idle, the user may need to keep his or her finger (e.g. thumb) hovering above the sensor until he or she decides to activate a virtual object again. This may lead to "thumb fatigue", i.e. to fatigue and/or ache in the thumb (and possibly wrist) muscles from keeping the thumb raised above the proximity sensor matrix for long periods.
According to some embodiments of the present invention there is provided a data flow controller associated with the monitoring unit. A sensor matrix in communication with the monitoring unit generates measured data indicative of the position of an object (e.g. a user's finger). The monitoring unit receives the measured data and uses the measured data to calculate the object's position in a first (real) coordinate system defined by the sensor matrix. The monitoring unit further converts the object's position in the first coordinate system to a representation of the object's position in a second (virtual) coordinate system (e.g. defined by a display), and generates cursor data relating to a position (and optionally to an image) of the object's representation (e.g. cursor). The cursor data can be outputted to an external electronic device for displaying on the electronic device's display. Moreover, the cursor data may be used by a software utility (application) running on the electronic device to recognize a certain behavior of the cursor, corresponding to certain action defined by the software utility, and execute the certain action. The action may, for example, include activating/manipulating virtual objects on the electronic device's display.
The data flow controller is configured for identifying a condition of no-motion of the object or of the object's representation, and if the condition of no-motion is maintained for a predetermined time period, as a consequence thereof the data flow controller prevents the output of at least some cursor data (deactivates the cursor) to an electronic device connected to the output of the data flow controller, or transmits a signal to the electronic device instructing the electronic device to disregard some or all of the operations (e.g. activation or manipulation of a virtual object) associated with the cursor data. When the cursor is deactivated, the cursor either disappears from the display or appears in a different form (e.g., shape, color). Moreover, when the cursor is deactivated, the data flow controller operates to prevent the user from performing actions on virtual objects while the sensor matrix continues to generate measured data and possibly the monitoring unit continues tracking the object position. This may be achieved by (i) repressing/filtering out at least some of the cursor data sent to the electronic device, or (ii) instructing the monitoring unit to stop generating at least part of the cursor data, or (iii) instructing the electronic device to disregard some or all of the operations associated with the cursor data, until the cursor is reactivated. In this manner, the behavior of the user's finger cannot be used to activate or manipulate virtual objects. Thus, if the object is the user's thumb, the user may prop his or her thumb on a surface associated with the sensor matrix, and the touch will not be outputted as an instruction to manipulate a virtual object. In this manner, the user may rest his or her thumb rather than keeping it hovering.
In some embodiments, the data flow controller is configured and operable for recognizing a specific behavior of the finger by which the user instructs the data flow controller to reactivate the cursor. Thus, if the user wishes to manipulate virtual objects again, the user's finger's specific behavior is recognized by the data flow controller, and the data flow controller reactivates the cursor by (i) stopping the repression/filtering out of the cursor data, or (ii) instructing the monitoring unit to resume generating all the cursor data, or (iii) ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data.
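One of the mechanisms listed above, repressing/filtering out cursor data while the cursor is inactive, can be sketched as follows. The event representation and field names are hypothetical, introduced only for illustration.

```python
def filter_cursor_data(events, cursor_active):
    """One possible deactivation mechanism: when the cursor is inactive,
    repress the cursor-data events that would activate or manipulate
    virtual objects, while the sensor matrix keeps generating measured data."""
    if cursor_active:
        return list(events)
    # Drop touch/activation events; position updates could also be dropped,
    # depending on whether the cursor image should vanish or merely freeze.
    return [e for e in events if e.get("type") not in ("touch", "activate")]
```

With this filter in place, a resting thumb's touch never reaches the electronic device as an instruction to manipulate a virtual object.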
Therefore, an aspect of some embodiments of the present invention relates to a data flow controller for controlling data flow from a monitoring unit monitoring a behavior of at least a part of a physical object to an electronic device, the data flow controller comprising a motion recognition module and a control module in communication with the motion recognition module. The motion recognition module is configured for analyzing measured data generated by a sensor matrix and indicative of a behavior of at least a part of a physical object in a first coordinate system associated with a predetermined sensor matrix, and/or for analyzing cursor data generated by the monitoring unit and indicative of an approximate representation of the at least part of the physical object in a second coordinate system, in order to recognize a condition of no-motion corresponding to a state in which the at least part of the physical object and/or the representation thereof does not move, and to identify a specific behavior of the at least part of the physical object and/or of the approximate representation. The control module is configured for: (a) preventing an output of at least some of the cursor data to the electronic device and/or instructing the electronic device to disregard some or all of the operations associated with the cursor data, if the condition of no-motion is maintained for a predetermined time period; and (b) re-enabling the output and/or ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data, if the specific behavior is recognized following the predetermined time period.
According to some embodiments of the present invention, the control module is configured for gradually changing at least one parameter of an image of the approximate representation of the at least part of the object configured for being displayed by a display of the electronic device in a time interval between a moment at which the condition of no-motion is recognized and a certain time, the time interval being within the predetermined time period.
Optionally, the control module changes the parameter, such that a value of the parameter at the certain time is a function of a length of the time interval.
Optionally, the control module is configured for resetting the change in the parameter's value following the recognition of the specific behavior or following an interruption of the condition of no-motion.
The parameter may be at least one of an opacity, size, and color of the image of the approximate representation.
In a variant, the parameter is opacity, and the change of the opacity during the time interval is a decrease of the opacity.
In another variant, the parameter is size, and the change of the size during the time interval is a decrease of the size.
The control module may be configured for causing the image of the approximate representation to disappear between the end of the predetermined time period and a time point at which the specific behavior is recognized, by preventing an output of at least some of the cursor data to the electronic device and/or instructing the electronic device to disregard some or all of the operations associated with the cursor data.
Optionally, the control module is further configured for causing the image of the approximate representation to reappear following the time point at which the specific behavior is recognized, by re-enabling the output and/or ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data, if the specific behavior is recognized following the predetermined time period.
In a variant, the control module is configured for detecting the specific behavior by recognizing a touch by the at least part of the object of a certain surface associated with the predetermined surface, and thereafter by recognizing a hover of the at least part of the object over the certain surface.
In another variant, the condition of no-motion is defined as a condition in which a distance by which the at least part of the object moves does not rise over a first threshold distance within the first coordinate system and/or as a condition in which a distance by which the representation moves does not rise over a second threshold distance within the second coordinate system.
Another aspect of some embodiments of the present invention relates to a system for use in monitoring a behavior of at least a part of a physical object, the system comprising a monitoring unit and a data flow controller. The monitoring unit comprises a transformation module configured and operable to receive measured data indicative of the behavior in a first coordinate system associated with a predetermined surface defined by a sensor matrix, and be responsive to said measured data for transforming the measured data into cursor data indicative of an approximate representation of said at least portion of the physical object in a second coordinate system, thereby enabling an output of the cursor data to a desired electronic device. The data flow controller comprises a motion recognition module and a control module. The motion recognition module is configured for analyzing the measured data and/or the cursor data to recognize a condition of no-motion corresponding to a state in which the at least part of the physical object and/or the representation thereof does not move, and to recognize a specific behavior of the at least part of the object. The control module is configured for: (a) preventing an output of at least some of the cursor data to the electronic device and/or instructing the electronic device to disregard some or all of the operations associated with the cursor data, if the condition of no-motion is maintained for a predetermined time period; and (b) re-enabling the output and/or ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data, if the specific behavior is recognized following the predetermined time period.
In a variant, the system comprises the sensor matrix.
Optionally, the sensor matrix is a capacitive sensor array defining a sensing surface.
A further aspect of some embodiments of the present invention relates to a method for controlling data flow between a monitoring unit and an electronic device, the monitoring unit being configured for transforming measured data indicative of a behavior of at least part of an object in a first coordinate system associated with a predetermined surface defined by a sensor matrix into cursor data indicative of an approximate representation of said at least portion of the physical object in a second coordinate system, the method comprising:
analyzing the measured data and/or the approximate representation data to monitor a motion of the at least part of the object and/or the approximate representation; identifying a condition of no-motion corresponding to a state in which the at least part of the physical object and/or of the representation thereof does not move; measuring a time period in which the condition of no-motion is maintained, and restarting the time measuring whenever the motion is resumed within a predetermined threshold time period;
preventing an output of at least some of the cursor data to the electronic device and/or instructing the electronic device to disregard some or all of the operations associated with the cursor data, if the condition of no-motion is maintained for the threshold time period;
identifying a specific behavior of the object; and
upon identifying the specific behavior, re-enabling the output and/or ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data, and restarting the measuring of time.
Optionally, the condition of no-motion is defined as a condition in which a distance by which the at least part of the object moves does not rise over a first threshold distance within the first coordinate system, and/or as a condition in which a distance by which the cursor moves does not rise over a second threshold distance within the second coordinate system.
In a variant, the method comprises - while the condition of no-motion is maintained within a time interval inside the predetermined threshold time period - gradually changing at least one parameter of an image of the cursor configured for being displayed by a display of the electronic device.
The changing of the parameter may comprise changing a value of the parameter, such that the value of the parameter is a function of a length of the time interval.
In another variant, the method comprises - following the identification of the specific behavior or an interruption of the condition of no-motion - resetting the parameter's value change.
Optionally, the parameter comprises at least one of an opacity, a size, and a color of the image of the cursor.
The change of the opacity during the time interval may be a decrease of the opacity.
The change of the size during the time interval may be a decrease of the size.
In a variant, preventing an output of at least some of the cursor data to the electronic device and/or instructing the electronic device to disregard some or all of the operations associated with the cursor data causes the image of the cursor to disappear and/or causes the operations associated with the cursor data to be disabled, between the end of the threshold period and a time point at which the specific behavior is recognized.
Optionally, re-enabling the output and/or ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data causes the image of the cursor (the approximate representation) to reappear and/or causes the operations associated with the cursor data to be enabled following the time point at which the specific behavior is recognized.
Yet another aspect of some embodiments of the present invention relates to a computing system including at least one processor, a memory operatively coupled to the processor, and a software utility that can execute in the processor from the memory and that, when executed by the processor from the memory, causes the computing system to perform the steps of the above-described method.
Yet a further aspect of some embodiments of the present invention relates to a medium readable by a computing system and useful in conjunction with the computing system, the medium storing a software utility which is configured to cause the computing system to perform the steps of the above-described method.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to better understand the subject matter that is disclosed herein and to exemplify how it may be carried out in practice, embodiments will now be described, by way of non-limiting example only, with reference to the accompanying drawings, in which:
Fig. 1 is a block diagram illustrating a system including a data flow controller associated with a monitoring unit for controlling a behavior of a cursor, according to some embodiments of the present invention;
Fig. 2 is a flowchart illustrating a method for controlling the activation and deactivation of a cursor, according to some embodiments of the present invention;
Fig. 3 is a flowchart illustrating an example of the method illustrated in Fig. 2;
Fig. 4 is a flowchart illustrating a specific example of the implementation of the behavior recognition step of the method illustrated in Fig. 3;
Figs. 5a-5d are schematic drawings exemplifying some embodiments of the present invention, in which the cursor gradually fades before being deactivated;
Figs. 6a-6d and 7a-7d are schematic drawings exemplifying some embodiments of the present invention, in which the cursor is gradually reduced in size before being deactivated;
Figs. 8a and 8b are schematic drawings exemplifying some embodiments of the present invention, in which the cursor changes its color upon being deactivated; and
Figs. 9a-9b are schematic drawings exemplifying a behavior for reactivating the cursor, according to some embodiments of the present invention.
DETAILED DESCRIPTION OF EMBODIMENTS
Referring now to Fig. 1, a block diagram is provided illustrating a system including a data flow controller associated with a monitoring unit for controlling a behavior of a cursor, according to some embodiments of the present invention.
The system 100 is configured for tracking the motion of an object or a part thereof with respect to a surface defined by a sensor matrix 108, and for controlling a behavior of an approximate representation of the object or of the part thereof. The system 100 includes a monitoring unit 102 and a data flow controller 104 in wired or wireless communication with each other. The monitoring unit 102 includes a transformation module 102a, configured for receiving measured data 106 indicative of a behavior of an object or a part of an object in a first coordinate system defined by or associated with the sensing surface, and configured for transforming the measured data 106 into cursor data 110 indicative of an approximate representation of the object (or part of the object) in a second (virtual) coordinate system. The second coordinate system may be, for example, that of a display of an electronic device (such as a computer, a television, etc.). The sensor matrix 108 may be part of the system 100, or may be an external unit in wired or wireless communication with the object tracking system 100.
The cursor data 110 is meant to be transmitted in a wired or wireless fashion to the electronic device, which may be a remote device or a device integral with system 100, so as to enable the electronic device to display an image of the cursor on the electronic device's display and move the image in the display's virtual coordinate system. For example, the cursor data 110 may be directly fed to the electronic device's display, or may need a formatting/processing within the electronic device before being readable by the display. Moreover, the cursor data may be used by a software utility (application) running on the electronic device to recognize a certain behavior of the cursor, corresponding to certain action defined by the software utility, and execute the certain action. The action may, for example, include activating/manipulating virtual objects on the electronic device's display.
Before reaching the electronic device, the cursor data 110 is transmitted in a wired or wireless fashion to a motion recognition module 104a of the data flow controller 104. Optionally or additionally, the data flow controller 104 is in direct (wired or wireless) communication with the sensor matrix 108, and the motion recognition module 104a receives the measured data 106. The cursor data 110 and/or the measured data 106 are analyzed by the motion recognition module 104a. The motion recognition module 104a is configured and operable for recognizing a no-motion condition, in which the object does not move with respect to the first (real) coordinate system, or the representation of the object does not move with respect to the second (virtual) coordinate system. The motion recognition module 104a is also configured for recognizing a specific behavior of the object by analyzing the measured data. It should be noticed that the no-motion condition does not necessarily correspond to a condition in which the object or representation is completely stationary. In fact, the no-motion condition can be defined as a condition in which the distance by which the object and/or the object's representation moves does not rise over a certain threshold distance within the respective coordinate system. Thus, an error interval can be allowed for slight involuntary motions, such as a tremor or motion generated by blood flow within the user's fingers. If the sensor matrix 108 is a capacitive sensor matrix, such as a touch pad or a proximity sensor, then the no-motion condition may be recognized when a difference between the measured data of two successive readings from the sensor matrix 108 is below a predefined threshold. The measured data may be in the form of a signal intensity map in the first coordinate system, while the transforming comprises processing the signal intensity map.
The sensor matrix, together with its built-in or coupled processor, utilizes appropriate processing utilities (hardware and algorithms) to obtain the proximity/height measurement or map image as described in WO 2010/084498 and US 2011/0279397, which share the inventors and the assignee of the present patent application. For each sensor matrix element, a function describing the dependence of the signal magnitude or potential on a distance/proximity from the sensing surface can generally be known, e.g. determined in a calibration procedure. The difference between the measured data of two successive readings may be estimated as the sum of absolute magnitude differences between successive signals generated by each sensor matrix element.
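This difference estimate can be sketched as follows (a minimal illustration only, assuming the two readings arrive as flat sequences of per-element signal magnitudes; the function name and threshold are hypothetical):

```python
def is_no_motion(prev_reading, curr_reading, threshold):
    """Recognize the no-motion condition: the sum of absolute
    magnitude differences between two successive readings of the
    sensor matrix elements stays below a predefined threshold."""
    diff = sum(abs(a - b) for a, b in zip(prev_reading, curr_reading))
    return diff < threshold
```

Two nearly identical readings yield a small difference and hence a no-motion decision, while a shifted finger produces a large difference and interrupts the condition.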
When the no-motion condition is recognized, the data flow controller's control module 104b starts to measure time. In a variant, the time measurement is performed by a timer, which is an independent application running on the data flow controller 104. In another variant, the time measurement is performed by hardware which incrementally counts the number of cycles in which the no-motion condition is maintained. If the no-motion condition is maintained for a predetermined time period (hereafter referred to as tx), then the control module 104b deactivates the cursor. This may be done by (i) preventing the output of at least some of the cursor data from reaching the electronic device, or (ii) instructing the monitoring unit to cease generating or outputting the cursor data 110, or (iii) generating a signal instructing the electronic device to disregard some or all of the operations associated with the cursor data. In this manner, the image of the cursor may disappear, and the functionality of the cursor (e.g. the targeting and/or manipulation of a virtual object on the display) will be reduced or even removed until the cursor is re-enabled. As mentioned above, this allows the user to rest his/her finger on the surface defined by the sensor matrix 108 without performing an undesired action, such as activating or manipulating a virtual object. If the no-motion condition is interrupted before the end of the predetermined time period tx, then the control module 104b resets the timer.
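The control module's timing logic can be sketched as follows (a simplification under assumed names; as noted above, the real module may equally be implemented in hardware that counts cycles):

```python
class CursorController:
    """Sketch of the control module's timer logic: deactivate the
    cursor once the no-motion condition holds for t_x seconds."""

    def __init__(self, t_x):
        self.t_x = t_x
        self.elapsed = 0.0
        self.active = True

    def tick(self, moved, dt):
        """Called once per sensing cycle with a motion flag."""
        if moved:
            self.elapsed = 0.0        # motion resumed: reset the timer
        else:
            self.elapsed += dt
            if self.active and self.elapsed >= self.t_x:
                self.active = False   # suppress the cursor data output
        return self.active

    def reactivate(self):
        """Called when the specific (reactivation) behavior is recognized."""
        self.active = True
        self.elapsed = 0.0
```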
When the motion recognition module 104a recognizes a specific behavior of the object through the cursor data 110 following the deactivation of the cursor, the control module 104b re-enables the cursor, for example by enabling the output of the cursor data to the electronic device or by withdrawing the signal instructing the electronic device to disregard some or all of the operations associated with the cursor data.
In a non-limiting example of the present invention, the control module 104b prevents the output of at least some of the cursor data to the electronic device by sending a signal 112 to the monitoring unit 102, instructing the monitoring unit 102 not to generate cursor data or to generate only partial cursor data until instructed otherwise. In this manner, the output 114 to the electronic device contains either none of or only a portion of the cursor data 110. In this instance, however, the motion recognition module 104a cannot use the cursor data 110 to recognize the specific behavior of the object for re-enabling the cursor. Thus, the motion recognition module 104a processes the measured data 106 to detect the specific behavior of the object. It should be noted that the configuration in which the monitoring unit 102 ceases or lessens the generation and/or transmission of cursor data may be an energy-saving configuration, since the energy consumption needed by the monitoring unit 102 to process the measured data and/or transmit the cursor data is reduced or even eliminated.
In another example, the control module 104b prevents the output of at least some of the cursor data to the electronic device by filtering the cursor data, so that the output 114 to the electronic device contains only a portion or even none of the cursor data.
In a further example, the control module 104b allows the cursor data to be outputted to the electronic device but generates a signal included in the output 114, to instruct the electronic device to disregard some or all of the operations associated with the cursor data. If the control module 104b instructs the electronic device to ignore all of the cursor data, then the cursor's image on the display will disappear and the user will not be able to convey commands to the electronic device via gestures or behavior of the object.
In some embodiments, the control module 104b instructs the electronic device to disregard some or all of the operations associated with the cursor data, while the image of the cursor may still be displayed (for example in a different color). In such a case, the user will still not be able to convey commands to the electronic device via gestures or behavior of the object. This last feature may be used in the case in which - even when the cursor is inactive - it is desirable to display the cursor's image, while suppressing the cursor's functionality to activate or manipulate virtual objects.
Optionally, during the time in which the no-motion condition is maintained, up to the deactivation of the cursor, the data flow controller may effect a change in a parameter of the cursor's image in order to notify the user of the upcoming deactivation. In one variant, the data indicative of the cursor's image is generated by the monitoring unit. Thus, the change in the parameter of the cursor's image may be effected directly by the control module, by altering the data indicative of the cursor's image. In another variant, the electronic device generates the data indicative of the cursor's image in response to the received cursor data. In this case, the control module transmits a signal to the electronic device, to instruct the electronic device to change the parameter of the cursor's image. The parameter of the cursor's image is then reset when the cursor is reactivated, or in case the no-motion condition is interrupted before the end of the predetermined threshold time period.
It should be noted that in one variant, the monitoring unit 102 and the data flow controller 104 may be physically separate units in wired or wireless communication with each other and having dedicated circuitry for performing their required actions. In another variant, the monitoring unit 102 and its module and the data flow controller 104 and its modules are functional elements of a software package configured for being implemented on one or more electronic circuits (e.g. processors). In a further variant, the monitoring unit 102 and the data flow controller 104 and their modules may include some electronic circuits dedicated to individual elements (units or modules), some common electronic circuits for some or all of the elements, and some software utilities configured for operating the dedicated and common circuits to perform the required actions. In yet a further variant, the monitoring unit 102 and the data flow controller 104 and their modules may perform their actions only via hardware elements, such as logic circuits, as known in the art.
In some embodiments, the sensor matrix 108 is able to detect only a two-dimensional position of the object. Such a two-dimensional sensor matrix may be, for example, a touch screen with no hover capabilities. In this case, the condition of no-motion corresponds to a substantial absence of motion on the two-dimensional surface defined by the sensor matrix 108.
In another variant, the sensor matrix 108 can be any proximity sensor matrix, i.e. a sensor matrix which is able to detect a location of the object in three dimensions, and thus to distinguish a hover mode (where the object hovers above a predefined surface) and a touch mode (where the object touches the predefined surface). The proximity sensor matrix may include a capacitive sensor array, as described in WO 2010/084498 and US 2011/0279397 of the assignee of the present invention, or an optical sensor array as known in the art. If the sensor matrix is a proximity sensor matrix, the condition of no-motion may correspond to a substantial absence of motion within a three-dimensional space within which the sensor matrix can detect the object.
The sensor matrix may be a single-touch matrix or a multi-touch matrix. A single-touch matrix allows detection of only one object (or a section thereof) at a given time. Thus the monitoring unit can track only one object (or part thereof) at a given time. A multi-touch matrix allows detection of multiple objects simultaneously, and the monitoring unit can track multiple objects (or multiple parts of an object) at a given time, as described, for example in WO 2010/084498 and US 2011/0279397 of the assignee of the present invention. It should be noted that the structure and operation of the sensor matrix are unimportant for the purposes of the present invention.
Moreover, it should be noted that in the case of a proximity sensor matrix, the touching of the surface defined by the sensor matrix is equivalent to the touching of a second surface associated with the first surface defined by the sensor matrix. In the case in which the sensor matrix is capacitive, the first sensing surface may be protected by a cover representing the second surface, to prevent the object from touching directly the sensing surface. In this case, the object can only touch the outer surface of the protective cover. The outer surface of the protective cover is thus a second surface associated with the surface defined by the sensor matrix. It is to be noted that in all of the above-described types of sensor matrices, the condition of no-motion may occur and be recognized while the object touches the surface associated with the sensor matrix.
Referring now to Fig. 2, a flowchart 200 illustrates a method for controlling the activation and deactivation of a cursor, according to some embodiments of the present invention. The method of the flowchart 200 may be seen as a mode of operation of the data flow controller 104 of Fig. 1.
At 302, the motion of the object is monitored. The monitoring includes the reception and analysis of data indicative of the object behavior in a certain first coordinate system and/or data indicative of the behavior of a representation of the object (cursor) in a second virtual coordinate system. As explained above, the data indicative of the object's behavior may be measured data generated by a sensor matrix, while the data indicative of the behavior of a representation of the object may be the cursor data generated by a monitoring unit in response to the measured data. The monitoring may be performed at a certain rate independently of the process illustrated by the flowchart 200 and while the process illustrated by the flowchart 200 is operated.
At 304, a check is performed to determine whether the object has moved a distance larger than a certain threshold, so as to recognize a no-motion condition. The check is performed by determining whether the object and/or the object's representation substantially do not move relative to their respective coordinate systems. If the object and/or the object's representation are motionless, a timer is incremented at 306. The timer keeps incrementing as long as the no-motion condition is maintained. If the object has moved, the motion check of 304 is repeated until a condition of no-motion is identified at 304.
Optionally, at 308, a parameter of an image of the cursor is changed or gradually changed as time passes after the no-motion condition was first recognized. In this manner, the user is notified that if the user maintains the condition of no-motion, the cursor will be disabled. The changing of the parameter may start as soon as the timer starts counting, or after a certain time interval shorter than the threshold time period tx. In a variant, the parameter changes progressively as the time period measured by the timer grows. As will be explained later, the parameter may be the cursor's opacity, size, or color (Figs. 5a-5d, 6a-6d, 7a-7d, 8a-8b). At 310, a check is performed to determine whether the time measured by the timer has reached a predetermined threshold time period tx. If the result is negative, a new check is performed at 312 to determine whether the object and/or the object's representation have moved. If motion was detected, optionally, the parameter change of the cursor is reset to its appropriate level, which may depend on the cursor's position, at 314. At 316, the timer is reset, and the process is brought back to the first motion check 304.
If the time measured by the timer has reached the predetermined threshold time period tx following the check at 310, the cursor is deactivated at 318. As mentioned above, the deactivation of the cursor may involve preventing the user from performing actions on virtual objects. Optionally, when the cursor is deactivated, its image disappears from the display of the electronic device or changes appearance (e.g. color). As explained above, the deactivation of the cursor may be implemented in various ways. For example, at least some of the cursor data may be filtered out, the monitoring module may be instructed to cease or lessen the generation/output of cursor data, or the electronic device may be instructed to disregard some or all of the operations associated with the cursor data. Following the cursor's deactivation, the optional change in the parameter of the cursor's image may stop.
At 320, a check is made to determine if a reactivation behavior of the object (user's finger) has been identified. If no reactivation behavior is identified, the cursor remains inactive and the check is performed until the reactivation behavior is identified. An example of the reactivation behavior will be discussed below in the descriptions of Figs. 4, 9a-9b. If the reactivation behavior is positively identified, the cursor is reactivated at 322, enabling the user to perform actions on virtual objects on the display. Subsequently, the parameter change of the cursor's image is optionally reset to its appropriate level at 314, the timer is reset at 316, and the process restarts with a motion check at 304.
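The steps of flowchart 200 can be sketched together as a single loop (an illustrative simplification; motion detection and reactivation-behavior recognition are abstracted into pre-computed flags, and the helper names are hypothetical):

```python
def run_cursor_loop(frames, t_x, dt, reactivation_seen):
    """Walk through flowchart 200 for a sequence of per-cycle motion
    flags; return the cursor's active/inactive history."""
    elapsed, active, history = 0, True, []
    for i, moved in enumerate(frames):
        if active:
            if moved:
                elapsed = 0              # 312/316: motion detected, reset timer
            else:
                elapsed += dt            # 306: increment the no-motion timer
                if elapsed >= t_x:       # 310: threshold time reached
                    active = False       # 318: deactivate the cursor
        elif reactivation_seen(i):       # 320: reactivation behavior check
            active, elapsed = True, 0    # 322: reactivate, timer reset
        history.append(active)
    return history
```

For example, with two motionless cycles at t_x = 2 and dt = 1, the cursor deactivates on the second cycle and reactivates on the cycle in which the reactivation behavior is identified.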
A new object detected by the sensor matrix and monitored by the monitoring unit gets a new cursor, for which the timer and the parameter change of the cursor's image are reset. The new object may be an additional object (if the sensor matrix is a multi-touch sensor matrix) or the same object (if the object's detection was interrupted and then restored).
It should be noted that, with a multi-touch sensor matrix and corresponding monitoring unit, the activation/deactivation process of each cursor corresponding to a respective object may be independent of the activation/deactivation processes of the other cursors, or may be dependent on them. In the former case, a first cursor corresponding to a first object (e.g. fingertip) may be active while at least one other cursor is deactivated, and vice versa. This is needed, for example, in the case in which a supporting thumb that should be ignored touches the sensing surface (or a second surface associated therewith), while the other thumb targets and activates virtual objects. In the latter case, the deactivation process of one cursor affects the deactivation processes of all other cursors, and the reactivation of one cursor causes the reactivation of all other cursors. This is needed, for example, in a scenario of typing: movement of one thumb should reset the timer of the "no-motion" condition of the other thumb.
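The dependent and independent modes can be sketched as follows (dictionary keys identify cursors; the names are hypothetical and the timers are counted in milliseconds):

```python
def update_no_motion_timers(timers, moved, dt_ms, dependent):
    """Advance per-cursor no-motion timers by dt_ms. In the dependent
    mode, motion of any one object (e.g. a typing thumb) resets the
    timers of all cursors; in the independent mode, each cursor's
    timer is reset only by its own object's motion."""
    any_moved = any(moved.values())
    for cid in timers:
        reset = any_moved if dependent else moved.get(cid, False)
        timers[cid] = 0 if reset else timers[cid] + dt_ms
    return timers
```

In the dependent mode, moving the left thumb resets the right thumb's timer as well; in the independent mode, the motionless right thumb's timer keeps growing toward deactivation.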
Referring now to Fig. 3, a flowchart 300 illustrates a more specific, non-limiting example of a possible implementation of the method illustrated in Fig. 2. This flowchart may be implemented via hardware logic circuits and/or via software.
The flowchart 300 is self-evident and need not be explained, since its steps are the same as the steps illustrated in the flowchart 200, but implemented in a slightly different order.
It should be noted that the steps of the method illustrated by the flowcharts 200 and 300 of Figs. 2 and 3 may be steps configured for being performed by one or more processors operating under the instruction of software readable by a system which includes the processor. The steps of the method illustrated by the flowcharts 200 and 300 of Figs. 2 and 3 may be steps configured for being performed by a computing system having dedicated logic circuits designed to carry out the above method without software instruction.
Moreover, the scope of the present invention extends to a computing system including at least one processor, a memory operatively coupled to the processor, and a software utility that can execute in the processor from the memory and that, when executed by the processor from the memory, causes the computing system to perform the steps described in the flowcharts 200 and 300. The computing system may also include a digital circuit constructed from logic gates in the form of integrated circuits or programmable logic devices.
Referring now to Fig. 4, a flowchart illustrates a specific example of the implementation of the reactivation behavior recognition step 320 of the method illustrated in Figs. 2 and 3. In this example, the reactivation behavior includes touching the sensor surface and then hovering above the sensor surface. This behavior is a natural behavior of a user, since the user tends to rest his/her finger while the cursor is inactive (thus, the touch), and simply needs to raise the finger to reactivate the cursor (thus, the hover). If the sensor matrix is a proximity sensor matrix, the user can keep his/her finger hovering, in order to navigate (move the cursor's image). If the sensor matrix is a two-dimensional sensor matrix, then after deactivation, the touch of the finger will be ignored until the finger is raised once (and a break of contact is sensed). When the finger then re-touches the surface, it will not be ignored.
The identification of touch could be done via the cursor data 110 (Z axis is 0) or via the measured data 106. In the latter case, and if the sensor matrix is a capacitive proximity sensor matrix, touch is identified if a sensor matrix element sends a signal magnitude corresponding to touch.
Similarly, the identification of hover (or no-touch) could be done via the cursor data 110 (the Z axis is bigger than 0) or via the measured data 106. In the latter case, and if the sensor matrix is a capacitive proximity sensor matrix, hover is identified if no sensor matrix element sends a signal magnitude corresponding to touch.
In Fig. 4, the behavior recognition step 320 includes two stages. In the first stage 330, a check is made to determine whether the sensor matrix has detected a touch of the object. If no touch is sensed, the cursor remains deactivated and the check is performed again. If the touch is sensed, progress to the second stage 332 is made. At 332, a check is made to determine whether the sensor matrix has detected the object hovering above it (corresponding to a break of detection, in the case of a two-dimensional sensor matrix). If no hover is sensed, the cursor remains deactivated and the check at 332 is performed again. If hover is sensed, the behavior recognition is complete and the cursor is reactivated.
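The two-stage check (330, 332) can be sketched as a small state machine fed with successive height readings, where z = 0 denotes touch and z > 0 denotes hover (a sketch with hypothetical names):

```python
def make_reactivation_detector():
    """Two-stage recognizer of the reactivation behavior: first a
    touch must be sensed (stage 330), then a hover, i.e. a break of
    contact (stage 332). Returns True once the behavior completes."""
    state = {'touch_seen': False}

    def feed(z):
        if not state['touch_seen']:
            if z == 0:
                state['touch_seen'] = True   # stage 330: touch detected
            return False                     # cursor remains deactivated
        if z > 0:                            # stage 332: hover after touch
            state['touch_seen'] = False      # ready for a future cycle
            return True                      # behavior complete: reactivate
        return False                         # finger still resting
    return feed
```

A finger that merely hovers never completes stage 330; a finger that rests and is then raised completes both stages and reactivates the cursor.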
Referring now to Figs. 5a-5d, schematic drawings are provided to exemplify some embodiments of the present invention, in which the cursor gradually fades before being deactivated. These figures illustrate an example in which the opacity of the cursor's image is the parameter that changes gradually as time passes, during the period between the identification of the no-motion condition and the deactivation of the cursor.
In Fig. 5a, the object (e.g. a finger 400) hovers above the sensing surface 402 of a proximity sensor matrix at a location A. The first coordinates of the fingertip relative to the sensing surface 402 are XA, YA, ZA. On a display 404, the fingertip's first coordinates XA and YA are converted to the second (virtual) coordinates X'A and Y'A. Optionally, the coordinate ZA is converted to a certain parameter, such as the size of the cursor 406 (e.g. the size increases as the distance ZA increases) and/or the opacity of the cursor 406 (e.g. the cursor becomes more opaque as the distance ZA decreases). In Fig. 5a, the no-motion condition has been identified, and the timer was started.
At a later time (Fig. 5b), the fingertip's no-motion condition is maintained, thus the time interval measured by the timer has grown, and the cursor's opacity has decreased as a function of the measured time interval. In Fig. 5c, at an even later time, the fingertip's no-motion condition is maintained, and the cursor's opacity has further decreased. In Fig. 5d, the time interval measured by the timer has reached the threshold value (tx in Figs. 2 and 3 above), the cursor's opacity has reached 0, and thus the cursor is no longer visible. The disappearance of the cursor's image from the display 404 informs the user that the cursor has been deactivated. The opacity of the cursor is a decreasing function of the length of the measured time interval. In a non-limiting embodiment of the present invention, the opacity of the cursor decreases linearly with the length of the measured time interval.
It should be noted that the cursor's opacity may be a parameter that changes as a function of the coordinate ZA. If this is the case, the decrease increments of the cursor's opacity during the time in which the no-motion condition is maintained may be calculated as fractions of the original opacity of the cursor at the time point at which the no-motion condition was identified. To illustrate this point, consider a cursor which has an opacity of 70% due to the value of ZA at the time point at which the no-motion condition was identified, where the opacity of the cursor decreases linearly with the length of the measured time interval. After a time interval equivalent to 1/10 of the threshold time period, the opacity has decreased by 7% (1/10 of 70%) to 63%. After ½ of the threshold time period has passed, the opacity has decreased by 35% (½ of 70%) to 35%. After 4/5 of the threshold time has passed, the opacity has decreased by 56% (4/5 of 70%) to 14%. At the threshold time, the cursor's opacity reaches 0 and the cursor is no longer visible on the display. In this example, an opacity of 100% refers to a condition of complete opacity, and an opacity of 0 refers to a condition of complete transparency.
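The worked example above corresponds to the following computation (a sketch; the base opacity fixed by ZA and the threshold time are parameters, and the function name is hypothetical):

```python
def cursor_opacity(base_opacity, elapsed, t_x):
    """Opacity fades linearly from the base value (fixed by the Z
    coordinate when no-motion was identified) down to 0 at t_x."""
    if elapsed >= t_x:
        return 0.0
    return base_opacity * (1 - elapsed / t_x)
```

With a base opacity of 70% and t_x = 10 time units, this reproduces the figures above: 63% after 1/10 of tx, 35% after one half, 14% after 4/5, and 0 at tx.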
Referring now to Figs. 6a-6d and 7a-7d, schematic drawings are provided to exemplify some embodiments of the present invention, in which the cursor is gradually reduced in size before being deactivated. These figures illustrate an example in which the size of the cursor's image is the parameter that changes gradually during the time period between the identification of the no-motion condition and the deactivation of the cursor.
In Figs. 6a-6d, the cursor 406 is circular and has a radius RA whose value may change as a function of the coordinate ZA of the tip of the finger 400. At t0, the condition of no-motion is detected/identified, and the radius has a certain value RA(t0) (Fig. 6a). At successive time points t1 and t2, the condition of no-motion is maintained, and the radius RA gradually and monotonically decreases to RA(t1) and RA(t2) as a function of time (Figs. 6b and 6c). When the threshold time tx is reached (Fig. 6d), the radius RA has decreased to 0, and the cursor has disappeared from the display 404. Thus, the radius of the cursor decreases from an original value RA(t0), which optionally depends on the coordinate ZA, and shrinks to zero when the threshold time tx is reached. The cursor's radius decreases monotonically as a function of time. Optionally, the cursor's radius decreases linearly with time. Alternatively, the square of the cursor's radius decreases linearly with time, so that the area of the cursor decreases linearly with time.
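Both shrinking variants described above can be sketched in one helper (an illustrative sketch; the function name and the `shrink_area` flag are assumptions, not part of the original disclosure):

```python
import math

def cursor_radius(r0, elapsed, threshold, shrink_area=False):
    """Shrink the cursor's radius to 0 as the no-motion interval grows.

    r0: the radius RA(t0) at the moment the no-motion condition was
    identified (it may depend on the coordinate ZA).
    shrink_area=False: the radius decreases linearly with time.
    shrink_area=True:  the *area* (pi * r^2) decreases linearly with
    time, i.e. the radius follows r0 * sqrt(1 - elapsed/threshold).
    """
    if elapsed >= threshold:
        return 0.0                       # at tx the cursor has vanished
    remaining = 1.0 - elapsed / threshold
    return r0 * (math.sqrt(remaining) if shrink_area else remaining)
```

Note the design difference: a linearly shrinking radius makes the visible area collapse quadratically (fast at the end), whereas a linearly shrinking area keeps the on-screen footprint fading at a constant rate.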
Figs. 7a-7d are similar to Figs. 6a-6d. However, rather than decreasing the cursor's radius with time, slices of the cursor 406 are removed as time passes. At the time point in which the condition of no-motion is detected, the cursor has a certain size. The size of the cursor decreases by deleting slices of the cursor, such that the remaining portion of the cursor 406 grows smaller with time. At the threshold time tx, all of the cursor's slices have been removed and the cursor no longer appears on the display 404.
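The slice-removal variant of Figs. 7a-7d can be sketched similarly (an illustrative sketch; the function name and the idea of dividing the cursor into a fixed number of pie slices are assumptions, not part of the original disclosure):

```python
import math

def visible_slices(total_slices, elapsed, threshold):
    """Return how many slices of the cursor remain visible.

    The cursor is drawn as total_slices pie slices; slices are
    deleted at a constant rate so that none remain at the
    threshold time tx.
    """
    if elapsed >= threshold:
        return 0                          # all slices removed at tx
    return math.ceil(total_slices * (1.0 - elapsed / threshold))
```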
Referring now to Figs. 8a and 8b, schematic drawings are provided to exemplify some embodiments of the present invention, in which the cursor changes its color upon being deactivated. These figures illustrate an example in which the color of the cursor's image is the parameter that changes gradually during the time period between the identification of the no-motion condition and the deactivation of the cursor.
As mentioned above, it may be desirable to display the cursor even when the cursor has been deactivated, while suppressing the cursor's functionality to activate or manipulate virtual objects. In the non-limiting example of Fig. 8a, a white cursor 406 indicates that the cursor is active. After a condition of no-motion has been detected, the cursor's color gradually changes to black (for example, through successive shades of gray, each darker than the previous). When the threshold time tx is reached, the cursor becomes black (Fig. 8b) to indicate that the cursor has been deactivated.
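The white-to-black transition through successive grays can be sketched as a linear interpolation of a gray level (an illustrative sketch; the function name and the 8-bit RGB representation are assumptions, not part of the original disclosure):

```python
def cursor_gray_level(elapsed, threshold):
    """Interpolate the cursor color from white to black.

    Returns an (R, G, B) tuple with equal components, i.e. a shade
    of gray: 255 (white) when the no-motion condition is first
    identified, 0 (black) once the threshold time tx is reached.
    """
    fraction = min(elapsed / threshold, 1.0)   # clamp at tx and beyond
    level = round(255 * (1.0 - fraction))
    return (level, level, level)
```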
Referring now to Figs. 9a-9b, schematic drawings are provided to exemplify a behavior of a user's finger which reactivates the cursor, according to some embodiments of the present invention.
In Figs. 9a and 9b, the finger's reactivating behavior includes touching the surface 402 of the sensor matrix with the object (finger 400) and raising the object (finger 400) to hover above the surface 402. In Fig. 9a, the user has placed his/her finger 400 on the surface 402 of the sensor matrix. In one variant, this touch occurred after the cursor's deactivation. In another variant, the touch may have occurred before deactivation and been maintained after deactivation. The sensor matrix has therefore detected a touch (see step 330 in Fig. 4a). Later, in order to reactivate the cursor, the user raises his/her finger to hover above the surface 402 at a location A. The sensor matrix has thus detected a hover (or, in the case of a two-dimensional sensor matrix, a break of contact), as illustrated in step 332 of Fig. 4a, and has thereby detected the complete reactivating behavior. By monitoring the motion of the finger through the cursor data 110 or through the measured data 106, the data flow controller 104 of Fig. 1 can identify the behavior and reactivate the cursor. The reactivated cursor 406 has second virtual coordinates X'A and Y'A on the display 404, where X'A and Y'A are respectively functions of the first coordinates XA and YA defined by the sensor matrix's surface 402. The coordinate ZA (if measured) may be converted to a value representing a parameter (e.g. size or opacity) of the cursor 406.
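The touch-then-hover reactivating behavior can be sketched as a tiny state machine (an illustrative sketch; the class name, the reading labels "touch"/"hover"/"away", and the reset-on-away choice are assumptions, not part of the original disclosure):

```python
IDLE, TOUCHED = "idle", "touched"

class ReactivationDetector:
    """Recognize the reactivating gesture of Figs. 9a-9b: the user
    touches the sensor surface and then raises the finger to hover.

    update() takes the current proximity reading ("touch", "hover",
    or "away") and returns True once the full gesture has been seen,
    at which point the cursor may be reactivated.
    """

    def __init__(self):
        self.state = IDLE

    def update(self, reading):
        if self.state == IDLE:
            if reading == "touch":
                self.state = TOUCHED      # touch detected (cf. step 330)
            return False
        # state == TOUCHED: wait for the finger to rise to a hover
        if reading == "hover":
            self.state = IDLE             # hover detected (cf. step 332)
            return True                   # complete gesture -> reactivate
        if reading == "away":
            self.state = IDLE             # finger left without hovering; reset
        return False
```

A hover alone does not reactivate the cursor; only the ordered touch-then-hover sequence does, which keeps an idle hovering finger from accidentally waking a deactivated cursor.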
It should be noted that the finger's (or, in general, the object's) reactivating behavior may be any predetermined behavior or gesture. For example, the reactivating behavior may include a double tap on the surface defined by the sensor matrix.

Claims

CLAIMS:
1. A data flow controller, for controlling data flow from a monitoring unit monitoring a behavior of at least a part of a physical object to an electronic device, the data flow controller comprising:
a motion recognition module configured for communication with the monitoring unit for receiving and analyzing at least one of the following: measured data indicative of a behavior of at least a part of a physical object in a first coordinate system associated with a predetermined sensor matrix, and cursor data indicative of an approximate representation of at least a part of the physical object in a second coordinate system; identifying a condition of no-motion corresponding to a state in which at least one of the at least part of the physical object and of the representation thereof does not move; and identifying a specific behavior of at least one of said at least part of the physical object and of the approximate representation thereof; and
a control module in communication with the motion recognition module, the control module being configured for:
in response to data corresponding to the condition of no-motion being maintained for a predetermined time period, carrying out at least one of the following: (i) preventing an output of at least some of the cursor data to the electronic device, and (ii) generating instruction data to the electronic device to disregard some or all of the operations associated with the cursor data; and
in response to data corresponding to the specific behavior being identified following the predetermined time period, carrying out at least one of the following: (a) re-enabling the output of at least some of the cursor data to the electronic device, and (b) ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data.
2. The data flow controller of claim 1, wherein the control module is configured for gradually changing at least one parameter of an image of the approximate representation of the at least part of the object configured for being displayed by a display of the electronic device in a time interval between a moment at which the condition of no-motion is recognized and a certain time, the time interval being within the predetermined time period.
3. The controller of claim 2, wherein the control module changes the parameter, such that a value of the parameter at the certain time is a function of a length of the time interval.
4. The controller of claim 3, wherein the control module is configured for resetting the change in the parameter's value following the recognition of the specific behavior or following an interruption of the condition of no-motion.
5. The controller of claim 4, wherein the parameter is at least one of an opacity, size, and color of the image of the approximate representation.
6. The controller of claim 5, wherein the parameter is opacity, and the change of the opacity during the time interval is a decrease of the opacity.
7. The controller of claim 5, wherein the parameter is size, and the change of the size during the time interval is a decrease of the size.
8. The controller of claim 1, wherein the control module is configured for causing the image of the approximate representation to disappear between the end of the predetermined time period and a time point at which the specific behavior is recognized, by carrying out said at least one of (i) and (ii).
9. The controller of claim 8, wherein the control module is configured for causing the image of the approximate representation to reappear following the time point at which the specific behavior is recognized, by carrying out said at least one of (a) and (b), if the specific behavior is recognized following the predetermined time period.
10. The controller of claim 1, wherein the control module is configured for detecting the specific behavior by recognizing a touch by the at least part of the object of a certain surface associated with the predetermined sensor matrix, and thereafter by recognizing a hover of the at least part of the object over the certain surface.
11. The controller of claim 1, wherein the condition of no-motion is defined as at least one of the following: a condition in which a distance by which the at least part of the object moves substantially does not exceed a first threshold distance within the first coordinate system, and a condition in which a distance by which the representation moves substantially does not exceed a second threshold distance within the second coordinate system.
12. A system for use in monitoring a behavior of at least a part of a physical object, the system comprising: a monitoring unit comprising:
a sensor matrix; and
a transformation module configured and operable to receive measured data indicative of the behavior in a first coordinate system associated with a predetermined surface defined by the sensor matrix, and transform the measured data into cursor data indicative of an approximate representation of said at least portion of the physical object in a second coordinate system, thereby enabling an output of the cursor data to a desired electronic device;
and
the data flow controller of Claim 1 configured for communication with the transformation module.
13. The system of claim 12, wherein the sensor matrix is a capacitive sensor array defining a sensing surface.
14. A method for controlling data flow between a monitoring unit and an electronic device, the monitoring unit being configured for transforming measured data indicative of a behavior of at least part of an object in a first coordinate system associated with a predetermined surface defined by a sensor matrix into cursor data indicative of an approximate representation of said at least portion of the physical object in a second coordinate system, the method comprising:
receiving input data comprising at least one of the following: measured data corresponding to behavior of at least part of an object in a first coordinate system defined by a sensor matrix, and cursor data indicative of an approximate representation of said at least portion of the physical object in a second coordinate system;
analyzing the received data;
upon identifying that a condition of no-motion of at least one of said at least part of the physical object and the approximate representation thereof is maintained for a certain threshold time period, carrying out at least one of the following: preventing an output of at least some of the cursor data to the electronic device, and instructing the electronic device to disregard some or all of the operations associated with the cursor data;
upon identifying a specific behavior of the object, carrying out at least one of the following: re-enabling the output of at least some of the cursor data to the electronic device, and ceasing to instruct the electronic device to disregard some or all of the operations associated with the cursor data.
15. The method of claim 14, wherein the condition of no-motion is defined as at least one of the following: a condition in which a distance by which the at least part of the object moves does not exceed a first threshold distance within the first coordinate system, and a condition in which a distance by which the cursor moves does not exceed a second threshold distance within the second coordinate system.
16. The method of claim 14, comprising, upon identifying that the condition of no-motion is maintained within a time interval inside the threshold time period, gradually changing at least one parameter of an image of the cursor configured for being displayed by a display of the electronic device.
17. The method of claim 16, wherein the changing of the parameter comprises changing a value of the parameter, such that the value of the parameter is a function of a length of the time interval.
18. The method of claim 16, comprising, following the identification of the specific behavior or an interruption of the condition of no-motion, resetting the parameter's value change.
19. The method of claim 16, wherein the parameter comprises at least one of an opacity, a size, and a color of the image of the cursor.
20. The method of claim 19, wherein the change of the opacity during the time interval is a decrease of the opacity.
21. The method of claim 19, wherein the change of the size during the time interval is a decrease of the size.
22. The method of claim 14, wherein said preventing of the output of at least some of the cursor data to the electronic device and said instructing the electronic device to disregard some or all of the operations associated with the cursor data causes, respectively, the image of the cursor to disappear and the operations associated with the cursor data to be disabled, between the end of the threshold period and a time point at which the specific behavior is recognized.
23. A computer-readable storage medium storing instructions which, when executed by a processor, carry out the method of claim 14 for controlling data flow between a monitoring unit and an electronic device.
PCT/IL2013/050782 2012-09-13 2013-09-12 System and method for controlling behavior of a cursor WO2014041548A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261700443P 2012-09-13 2012-09-13
US61/700,443 2012-09-13

Publications (1)

Publication Number Publication Date
WO2014041548A1 true WO2014041548A1 (en) 2014-03-20

Family

ID=50277732

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2013/050782 WO2014041548A1 (en) 2012-09-13 2013-09-12 System and method for controlling behavior of a cursor

Country Status (1)

Country Link
WO (1) WO2014041548A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030048312A1 (en) * 1987-03-17 2003-03-13 Zimmerman Thomas G. Computer data entry and manipulation apparatus and method
US20100238803A1 (en) * 2007-11-01 2010-09-23 Telefonaktiebolaget Lm Ericsson (Publ) Efficient Flow Control in a Radio Network Controller (RNC)
US20110279397A1 (en) * 2009-01-26 2011-11-17 Zrro Technologies (2009) Ltd. Device and method for monitoring the object's behavior


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KIRUPA: "Hiding Mouse Cursor After Some Time.", 20 March 2011 (2011-03-20), Retrieved from the Internet <URL:http://www.kirupa.com/flash/hiding_cursor_after_some_time.htm> [retrieved on 20131227] *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3076272A3 (en) * 2015-03-31 2016-10-19 Fujitsu Limited Content display control method and system based on detecting the reflection of a beam of light
US9857918B2 (en) 2015-03-31 2018-01-02 Fujitsu Limited Content display control method and system
US10287492B2 (en) 2015-08-13 2019-05-14 Osram Opto Semiconductors Gmbh Method for producing a conversion element


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13837257

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13837257

Country of ref document: EP

Kind code of ref document: A1