US20110083089A1 - Monitoring pointer trajectory and modifying display interface - Google Patents
- Publication number: US20110083089A1 (application US12/571,448)
- Authority
- US
- United States
- Prior art keywords
- pointer
- target object
- display
- touch
- trajectory
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04101—2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
Description
- This disclosure relates to methods and apparatus for monitoring a trajectory of a pointer and modifying a display interface accordingly.
- Today's touch-screen-based user interfaces yield a significant number of user input errors, especially in small screen scenarios (e.g. mobile phones). Users frequently tap on the wrong control with their finger or stylus, and are forced to correct these errors after the tap has resulted in a selection on the touch-screen, which consequently reduces efficiency and end-user satisfaction. Granular controls, for instance sliders and knobs, are small and close together relative to the user's finger or a stylus and so are frequently selected or manipulated improperly, requiring correction. As an example, color palettes may include dozens of colors, and it is very hard to select a specific color on a small screen. Similarly, when a user means to type the letter “a” on a virtual QWERTY keyboard, it's common that the system recognizes the letter “s,” because “s” is next to “a” on the keyboard, and the target touch areas for the letters are small relative to the user's fingertip. Users are then forced to press the delete or back button, slowing down task completion time.
- Once the hand-held user input device, or pointer, is in contact with the touch-screen, a cursor may become active on the screen and the user watches the cursor while moving the pointer along the screen. Some applications make it easier to select or manipulate an object as the cursor approaches the object on the screen display. But this is not helpful for a user who intends to select an object by tapping it rather than by sliding a pointer along the display.
- The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
- The present disclosure relates to apparatus and methods for improving touch-screen interface usability and accuracy by predicting the intent of the user and modifying the display accordingly. A system is disclosed that determines a 3-dimensional trajectory of a pointer as the pointer approaches the touch-screen. The system may then predict an object on the display that the user is likely to select next. The system may designate this object as a Designated Target Object, or DTO. The system may modify the appearance of the DTO by, for example, increasing the size of the DTO, or by changing its shape, style, coloring, perspective, positioning, etc.
- As a feature, the system may determine what type of pointer is being used, and change the appearance of the DTO based upon the pointer type as well. For example, the size of the DTO might be enlarged more when the pointer is a finger than when the pointer is a stylus, because of the greater precision of a stylus. The system may also take into account other factors, such as behavior patterns of the user or usage patterns for an application providing the touch-screen display.
- The system may continuously update its DTO as the various factors change over time. The system may change the DTO if the trajectory towards the display changes, or it may deselect a target object if the pointer moves away from the display.
- This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
- FIG. 1 is a block diagram showing an illustrative embodiment of a system for monitoring a pointer and modifying an associated display.
- FIG. 2 is a schematic view showing details of an example of the system of FIG. 1.
- FIG. 3 is a schematic view of a behavior analyzer such as that shown in FIG. 2.
- FIGS. 4A-4C are illustrative displays showing examples of display modification according to FIGS. 1-3.
- FIG. 5 is a flow diagram showing steps performed by an illustrative embodiment of the system of FIGS. 1-3.
- FIGS. 6A-6F are illustrative displays showing further examples of display modification according to FIGS. 1-5.
- Table of elements and reference numbers used in this disclosure:

| Ref. | Element |
| --- | --- |
| 10 | System |
| 12 | Computing device |
| 14 | Processor |
| 16 | Memory |
| 17 | Predictor/adapter |
| 18 | Input/output interface |
| 20 | Pointer |
| 22 | Touch enabled display (touch-screen) |
| 24 | Pointer position detector |
| 26 | Bus |
| 28 | Application |
| 30 | Operating system |
| 104 | Physical position in space |
| 106 | Pointer type data |
| 108 | Position on touch screen |
| 112 | Pointer determination element |
| 116 | x, y, z position in space |
| 118 | Pointer type |
| 120 | x, y position of pointer on touch screen |
| 122 | Behavior analyzer |
| 124 | Behavior data |
| 126 | UI adapter |
| 128 | Display control signal |
| 130 | Touch screen input |
| 202 | Input trajectory element |
| 204 | Input trajectory data |
| 206 | Target object designation element |
| 210 | Stored data on pattern of use for application |
| 212 | Application pattern data |
| 214 | Stored data on past user patterns |
| 216 | User pattern data |
| 250-270 (even numbers) | Display examples |
| 290 | Designated target object |
| 302, 304, 306 | Display examples |
- FIG. 1 is a simplified block diagram according to an illustrative embodiment of a system 10 for monitoring the 3-dimensional trajectory of a user pointer 20 and modifying an associated touch-screen 22 display. This is helpful when a user is using a touch-enabled display, such as a touch-screen tablet connected to a computer or a touch-enabled cell phone screen. As the user approaches the touch-screen with a pointer 20 such as an index finger or a stylus, the display changes to make it easier for the user to select objects on the touch-screen display. For example, if the trajectory of the pointer is tending towards a particular object on the screen, that object may be made larger or change color to indicate to the user that the trajectory is leading to that object, and to make the object easier to select.
- System 10 typically includes, among other elements, a computing device 12 having a processor 14, memory 16, a predictor/adapter 17, an input/output interface 18, a user pointer 20, a user pointer position detector 24, and a touch enabled display 22, such as a touch-screen for a computer or a cell phone. Processor 14, memory 16, and I/O interface 18 are generally linked by a bus 26.
- Position detector (e.g. depth detection camera) 24 detects the 3-dimensional position over time of a pointer 20 (such as a stylus or a user's index finger) as the pointer approaches the touch-enabled display (or touch-screen) 22.
- Predictor/adapter 17 is shown in more detail in FIGS. 2 and 3. Briefly, the path of the pointer 20 can be used to predict the user's intent, for example by predicting the next object on the touch-screen 22 that the user will select (the designated target object, or DTO). For example, the system might determine the 3-dimensional trajectory of the pointer, determine that an object is located nearest to the end of the trajectory, and designate that object as the DTO 290 (see FIGS. 4A-C and 6A-F). Then, the display interface may be modified according to the prediction, for example by increasing the size of the DTO to make the DTO easier to select, and to indicate to the user that the DTO is at the end of the current trajectory, so the user can change the trajectory if the DTO is not the object the user intends to select. Note that the DTO may be a single object or portion of an object on the screen, or could be a group of elements, a section of a display, or an application window.
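- For concreteness, the simplest-case designation described above can be sketched in a few lines of Python. This sketch is not part of the patent text; it assumes straight-line extrapolation in screen-aligned coordinates (x and y parallel to the screen, z the distance from it), and all names are illustrative.

```python
import numpy as np

def predict_dto(position, velocity, objects):
    """Extrapolate a straight-line pointer trajectory to the screen plane
    (z = 0) and return the selectable object nearest the intersection.

    position: (x, y, z) pointer position; z is distance from the screen.
    velocity: (vx, vy, vz) estimated from recent position samples.
    objects:  list of (obj, (cx, cy)) pairs giving each selectable
              object's center on the screen.
    Returns the predicted designated target object (DTO), or None when
    the pointer is not moving toward the screen.
    """
    x, y, z = position
    vx, vy, vz = velocity
    if vz >= 0 or not objects:   # moving away from, or parallel to, the screen
        return None
    t = -z / vz                  # time until the trajectory reaches z = 0
    hit = np.array([x + vx * t, y + vy * t])
    return min(objects, key=lambda o: np.linalg.norm(hit - np.asarray(o[1])))[0]
```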
- While predictor/adapter 17 is shown as a separate software component running from memory 16, those skilled in the art will appreciate that predictor/adapter 17 could be built into an application 28 or the operating system 30 if desired. Or, predictor/adapter 17 could be implemented on a chip.
- In some embodiments, touch-screen 22 is a standard touch-enabled display connected to a computer via a USB and VGA interface or the like. Touch-screen 22 then interfaces with the conventional I/O interface 18 as usual, through control signals from I/O interface 18 to touch-screen 22. The display modifications are accomplished by adjusting the rendering instructions 130 provided by I/O interface 18. In this case, no special cables or connectors are required to implement the system. Examples of touch-screens include HP TouchSmart® PCs, Windows® 7 touch-enabled laptops (e.g. Lenovo® x200), Fujitsu® touch-screens and touch overlays, conductive film overlays, etc.
- Similarly, position detector 24 might comprise a depth-detecting webcam built into the display, or it could comprise a discrete depth detection camera connected to the computer via USB or FireWire or the like. An example of a position detector is the 3DV Systems ZCam™. Position detector 24 might also comprise a transceiver attached to the pointer 20, such as a Wacom Graphire® pen with Bluetooth®.
- FIG. 2 is a schematic view showing possible details of the system of FIG. 1. In the example of FIG. 2, pointer 20 provides three types of input signal. Prior to contact between pointer 20 and touch-screen 22, a pointer position detector 24 (in this case a depth detection camera) detects the physical position 104 of pointer 20 in space. It provides data 116 on the 3-dimensional position of the pointer in space as it approaches the screen but before it touches the screen. For example, the x, y, z position of pointer 20 is provided to behavior analyzer 122 within predictor/adapter 17. The x and y data could be analogous to the x-y position on the screen (up-down and right-left parallel to the screen), while the z data could indicate distance from the screen (perpendicular to the screen). As an alternative, polar coordinates could be provided.
- In the example of FIG. 2, signal 106 regarding the type of pointer 20 is also detected, and the particular pointer type 118 is provided as an input to behavior analyzer 122. Pointer type data 106 may be detected by pointer determination element 112. In the example where the pointer comprises either a finger or a stylus, pointer type data might comprise a signal from a transceiver in the stylus when a stylus is used, and no signal when a finger is used. Thus pointer determination element 112 provides a signal 118 to behavior analyzer 122 indicating whether pointer 20 is a finger or a stylus. Those skilled in the art will appreciate that other types of pointers 20 may also be used (for example two fingers, other body parts, or various drawing or painting devices) and that a variety of methods may be used to indicate the type of device. For example, a user might wear a ring having a transceiver to indicate the pointer is a finger. Or the user might select an icon to manually indicate the pointer 20 prior to using the device. Or, an IR detector might be used to distinguish the pointer 20 type.
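- A minimal sketch of such a pointer determination element follows. It is not from the patent; the beacon identifiers are hypothetical stand-ins for whatever a paired transceiver would report.

```python
def classify_pointer(beacon_id):
    """Map the presence or absence of a transceiver beacon to a pointer
    type, per the finger-versus-stylus example above. beacon_id is the
    identifier reported by a paired device, or None when none is heard."""
    if beacon_id is None:
        return "finger"              # no transceiver signal: assume a bare finger
    if beacon_id.startswith("stylus"):
        return "stylus"
    if beacon_id.startswith("ring"):
        return "finger"              # a ring worn to mark finger input
    return "unknown"                 # e.g. an unrecognized drawing device
```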
- After pointer 20 comes into contact with touch-screen 22, the 2-dimensional position 108 of pointer 20 on touch-screen 22 is provided as, for example, x, y data 120 to I/O interface 18. The application providing the display on touch-screen 22 may control the display at this point, so this x, y data 120 isn't necessarily provided to behavior analyzer 122.
- Behavior analyzer 122 is shown in more detail in FIG. 3. Briefly, behavior analyzer 122 takes data from pointer 20 (such as pointer position 116 and pointer type 118) and combines it with any other data available (such as patterns of usage for the currently running application or stored data about the user) to generate behavior data 124 and provide it to UI adapter 126. UI adapter 126 provides a display control signal 128 to I/O interface 18. I/O interface 18 in turn provides touch screen input 130 to touch-screen 22.
- FIG. 3 is a schematic view of an example of the behavior analyzer 122 of FIG. 2. In the example of FIG. 3, behavior analyzer 122 includes an input trajectory element 202, which computes 3-dimensional trajectory data 204 for pointer 20 as it approaches touch-screen 22. The term "trajectory" is intended to encompass the path taken by the pointer through space over time, and could include curved paths. Trajectory data 204 might include position, velocity, and acceleration.
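- As an illustration (not part of the patent text), trajectory data of this kind can be estimated from the detector's most recent samples by finite differences; a sketch assuming at least three timestamped 3-D samples:

```python
import numpy as np

def estimate_trajectory(samples, timestamps):
    """Estimate current position, velocity, and acceleration by finite
    differences. samples is an (n, 3) array of (x, y, z) readings from
    the position detector (n >= 3); timestamps is the matching length-n
    sequence of sample times in seconds."""
    p = np.asarray(samples, dtype=float)
    t = np.asarray(timestamps, dtype=float)
    v = (p[-1] - p[-2]) / (t[-1] - t[-2])        # latest velocity
    v_prev = (p[-2] - p[-3]) / (t[-2] - t[-3])   # previous velocity
    a = (v - v_prev) / (t[-1] - t[-2])           # acceleration
    return p[-1], v, a
```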
- Target object designation element 206 uses trajectory data 204 (among other potential inputs) to make a determination as to what object on the display of touch-screen 22 will likely be selected next by a user. In the simplest case, the object that would be contacted if the present trajectory continued unchanged becomes a designated target object (DTO), and behavior data 124 indicates the DTO to UI adapter 126.
- However, behavior analyzer 122 may take other input and stored data into account in generating behavior data 124. Pointer type 118 might bias the selection of the DTO. For example, behavior analyzer 122 might indicate the DTO sooner when a stylus is used rather than the user's finger. Or certain objects might not be eligible to be DTOs for certain pointer types. For example, in a painting program, a finger-painting icon might not be selectable, or might be less likely to be selected, if the pointer was a stylus rather than a finger. Or certain small dials and sliders in a music application might not become DTOs when a finger is used, because of its lesser precision.
- Further, behavior analyzer 122 might have stored data 214 on past user behavior indicating that this user skews upward on the display when using a stylus, or that this user tends to select certain objects most frequently. Then user pattern data 216 might be used to bias how the DTO is selected. As another example, behavior analyzer 122 might have stored data 210 indicating that when this application is used, a particular sequence of objects is often selected. Then application pattern data 212 might be used to bias how the DTO is selected. Those skilled in the art will appreciate a number of other ways in which behavior analyzer 122 might influence behavior data 124. Behavior analyzer 122 will generally continue to refine its determinations, so the DTO or other behavior data 124 may change as the pointer approaches the touch-screen.
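- One way (among many) to combine these inputs is a weighted score per candidate object. The following sketch is illustrative only; the scoring form and all names are assumptions, not taken from the patent.

```python
def score_candidates(distances, user_bias, app_bias, ineligible):
    """Rank DTO candidates by trajectory evidence biased by stored data.

    distances:  maps object id -> distance from the predicted touch point.
    user_bias:  maps object id -> weight learned from past user behavior.
    app_bias:   maps object id -> weight from application usage patterns.
    ineligible: object ids not selectable with the current pointer type.
    Returns the highest-scoring candidate, or None if none is eligible."""
    scores = {}
    for obj, d in distances.items():
        if obj in ineligible:
            continue
        score = 1.0 / (1.0 + d)            # closer to the trajectory scores higher
        score *= user_bias.get(obj, 1.0)   # e.g. objects this user selects often
        score *= app_bias.get(obj, 1.0)    # e.g. likely next object in a sequence
        scores[obj] = score
    return max(scores, key=scores.get) if scores else None
```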
- FIGS. 4A, 4B, and 4C are illustrative displays showing a display modification according to the system of FIGS. 1-3. In FIG. 4A, three slide controls for adjusting color balance (red, green, and blue) are shown in a default configuration in the top display 250. In the middle display 252, pointer 20 is on a trajectory toward the bottom slide control, and the bottom slide control has become the designated target object (DTO) 290 and has increased in size where the pointer is aiming. The user has an opportunity to course-correct at this point, if the DTO is not the user's intended target object (see FIG. 4C). In the bottom display 254 of FIG. 4A, the bottom slide control has increased in size even further as the pointer 20 touches the touch-screen 22.
- FIG. 4B shows a second example of a display modified by the system of FIGS. 1-3. In the top-most display 256, an onscreen keyboard is in its default state, with all letters the same size. In the middle display 258, one of the letters has become the DTO 290. It has grown somewhat in size. In this example, the letters on either side of the DTO are also larger than usual, though not as large as the DTO. In the bottom display 260, the DTO reaches its maximum size as it is selected. It also has a heavier black outline to indicate the pointer has touched the screen and selected the object that was the DTO (also called "press state"). In this example, the letters on either side of the DTO letter are somewhat taller than usual, and are compressed horizontally to prevent the on-screen keyboard from being distorted. In this example, the coloring or shading of the DTO letter is also modified to make it more distinct to the user. Those skilled in the art will appreciate that many other visual cues may be used to indicate the DTO 290 to the user and make it easier to select, and, if desired, to indicate when a DTO is selected. For example, shape, style (e.g. flat versus convex, or font style for text), or positioning in z-space (e.g. perspective, or 3-dimensional appearance on the screen) are useful indicators. Other objects or portions of objects besides the DTO may also change in appearance, for example by moving away or shrinking.
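- The keyboard example implies a width budget: the DTO and its neighbors grow while the remaining keys compress so the row keeps roughly its original width. A rough sketch of that bookkeeping follows; the scale factors are illustrative guesses, not values from the patent.

```python
def resize_row(widths, dto_index, dto_grow=1.6, neighbor_grow=1.2):
    """Enlarge the DTO key and its immediate neighbors, then compress the
    remaining keys so the row keeps approximately its original total
    width. widths is a list of key widths in pixels."""
    total = sum(widths)
    out = list(widths)
    out[dto_index] *= dto_grow
    for i in (dto_index - 1, dto_index + 1):
        if 0 <= i < len(out):
            out[i] *= neighbor_grow
    others = [i for i in range(len(out)) if abs(i - dto_index) > 1]
    excess = sum(out) - total              # width gained by the enlargements
    if others and excess > 0:
        shrink = excess / len(others)      # spread the compression evenly
        for i in others:
            out[i] = max(1.0, out[i] - shrink)
    return out
```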
- FIG. 4C shows a third example of a display modified by the system of FIGS. 1-3, wherein the user changes the DTO 290 by changing the trajectory of the pointer. As in FIG. 4A, three slide controls for adjusting color balance (red, green, and blue) are shown in a default configuration in the top display 262. In the second display 264, pointer 20 is on a trajectory toward the bottom slide control, and the bottom slide control has become the DTO 290 and has increased in size. In this case, the user does not want to select the bottom slide control, but rather the middle slide control. So the user changes the trajectory of the pointer such that the trajectory is directed toward the middle slide control. In the third display down, the middle slide control has become the DTO 290. In the bottom display 270 of FIG. 4C, the middle slide control has increased in size even further as the pointer 20 touches the touch-screen 22.
- FIG. 5 is a flow diagram showing steps performed by an illustrative embodiment of the system of FIGS. 1-4. This example shows the process of changing the appearance of a designated target object 290 on touch-screen display 22, as shown in FIGS. 4A-C. Step 302 detects pointer 20 as it approaches touch-screen 22. Pointer determination element 112 (if used) determines what pointer 20 is being used in step 304. Pointer position detector 24 and behavior analyzer 122 determine the 3-dimensional input trajectory 204 in step 306. In step 308, behavior analyzer 122 predicts the target object that will be selected by the user next (the designated target object, or DTO). The appearance of the DTO is changed in step 310 by UI adapter 126, for example by increasing its size. As discussed above, other aspects of a DTO may be changed, such as color or shape. Loop 312 indicates that behavior analyzer 122 may continue to refine the behavior data 124 it generates. For example, if trajectory 204 changes, the DTO may change. If pointer 20 moves away, the DTO appearance may return to default.
step 314,pointer 20 contacts touch-screen 22. Generally, the application providing the display on touch-screen 22 takes control of the display at this point.Optional step 316 further changes the DTO appearance to provide feedback to the user that the DTO has been selected. -
- FIGS. 6A-6F show another example of how a display 302 might be modified according to an embodiment of a display modifying system 10. FIGS. 6A-6F show how a curved trajectory can be used to select an object 306 that is initially hidden behind another object 304. In FIG. 6A, object 306 is hidden behind object 304. For example, a chat window might be hidden behind a drawing application window. A particular trajectory executed by pointer 20 (for example a quick, curved trajectory) might be used to indicate to the system that the user wishes to see both the front window and the rear window. In FIG. 6B, the user can see both windows. For the user, it is as if the display rotated sideways so the original front window 304 is to the left and the original back window 306 is to the right.
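- A "quick, curved trajectory" of this kind could be recognized, for instance, by checking that the sampled path is both fast and noticeably longer than the straight chord between its endpoints. The thresholds below are illustrative guesses, not values from the patent.

```python
import numpy as np

def is_quick_curved_trajectory(points, times, min_speed=0.4, min_bend=1.3):
    """Return True when the sampled path is fast (average speed above
    min_speed, here in screen-widths per second) and bent (path length
    exceeds the endpoint chord by the factor min_bend)."""
    p = np.asarray(points, dtype=float)               # (n, 2) or (n, 3) samples
    seg = np.linalg.norm(np.diff(p, axis=0), axis=1)  # per-segment lengths
    path_len = seg.sum()
    chord = np.linalg.norm(p[-1] - p[0])
    duration = times[-1] - times[0]
    if duration <= 0 or chord == 0:
        return False
    return path_len / duration > min_speed and path_len / chord > min_bend
```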
- FIG. 6C shows window 306 as it becomes the DTO 290. In FIG. 6C this is indicated by a darker outline around the window, and the increased size of the window. In FIG. 6D, the user selects window 306. The press state of window 306 is indicated by an even thicker line around the window.
- FIG. 6E shows window 306 after it has been selected, so it is now the front window. Window 304 is back to its default appearance, but now is the rear window. The user is still manipulating window 306, so it is still increased in size and has a darker outline, as it is still DTO 290. In FIG. 6F, window 306 has been deselected (or un-designated as DTO), so it is back to normal size and appearance. In this example, window 306 is still the front window, as it was the last window used.
- Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/571,448 US8261211B2 (en) | 2009-10-01 | 2009-10-01 | Monitoring pointer trajectory and modifying display interface |
US13/564,478 US20120293439A1 (en) | 2009-10-01 | 2012-08-01 | Monitoring pointer trajectory and modifying display interface |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/571,448 US8261211B2 (en) | 2009-10-01 | 2009-10-01 | Monitoring pointer trajectory and modifying display interface |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/564,478 Continuation US20120293439A1 (en) | 2009-10-01 | 2012-08-01 | Monitoring pointer trajectory and modifying display interface |
Publications (2)
Publication Number | Publication Date |
---|---|
US20110083089A1 true US20110083089A1 (en) | 2011-04-07 |
US8261211B2 US8261211B2 (en) | 2012-09-04 |
Family
ID=43824120
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/571,448 Active 2031-02-20 US8261211B2 (en) | 2009-10-01 | 2009-10-01 | Monitoring pointer trajectory and modifying display interface |
US13/564,478 Abandoned US20120293439A1 (en) | 2009-10-01 | 2012-08-01 | Monitoring pointer trajectory and modifying display interface |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/564,478 Abandoned US20120293439A1 (en) | 2009-10-01 | 2012-08-01 | Monitoring pointer trajectory and modifying display interface |
Country Status (1)
Country | Link |
---|---|
US (2) | US8261211B2 (en) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120056849A1 (en) * | 2010-09-07 | 2012-03-08 | Shunichi Kasahara | Information processing device, information processing method, and computer program |
US20120087545A1 (en) * | 2010-10-12 | 2012-04-12 | New York University & Tactonic Technologies, LLC | Fusing depth and pressure imaging to provide object identification for multi-touch surfaces |
US20120133677A1 (en) * | 2010-11-26 | 2012-05-31 | Sony Corporation | Information processing device, information processing method, and computer program product |
US20120176319A1 (en) * | 2011-01-06 | 2012-07-12 | Ikuyasu Miyako | Haptic input device |
US20120304122A1 (en) * | 2011-05-25 | 2012-11-29 | International Business Machines Corporation | Movement reduction when scrolling for item selection during direct manipulation |
US20130088454A1 (en) * | 2011-10-11 | 2013-04-11 | International Business Machines Corporation | Pointing to a desired object displayed on a touchscreen |
US20130097550A1 (en) * | 2011-10-14 | 2013-04-18 | Tovi Grossman | Enhanced target selection for a touch-based input enabled user interface |
US20130158748A1 (en) * | 2010-09-03 | 2013-06-20 | Aldebaran Robotics | Mobile robot |
WO2013100727A1 (en) * | 2011-12-28 | 2013-07-04 | Samsung Electronics Co., Ltd. | Display apparatus and image representation method using the same |
CN103995638A (en) * | 2013-02-20 | 2014-08-20 | 富士施乐株式会社 | Data processing apparatus, data processing system, and non-transitory computer readable medium |
EP2860614A1 (en) * | 2013-10-10 | 2015-04-15 | ELMOS Semiconductor AG | Method and device for handling graphically displayed data |
US9146656B1 (en) * | 2011-06-27 | 2015-09-29 | Google Inc. | Notifications user interface |
GB2527891A (en) * | 2014-07-04 | 2016-01-06 | Jaguar Land Rover Ltd | Apparatus and method for determining an intended target |
US9372829B1 (en) * | 2011-12-15 | 2016-06-21 | Amazon Technologies, Inc. | Techniques for predicting user input on touch screen devices |
US9405390B2 (en) * | 2013-12-18 | 2016-08-02 | International Business Machines Corporation | Object selection for computer display screen |
US9535512B2 (en) * | 2009-08-12 | 2017-01-03 | Shimane Prefectural Government | Image recognition apparatus, operation determining method and computer-readable medium |
EP3182250A1 (en) * | 2015-12-18 | 2017-06-21 | Delphi Technologies, Inc. | System and method for monitoring 3d space in front of an output unit for the control of the output unit |
US20170308259A1 (en) * | 2014-11-21 | 2017-10-26 | Renault S.A.S. | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element |
US10042445B1 (en) * | 2014-09-24 | 2018-08-07 | Amazon Technologies, Inc. | Adaptive display of user interface elements based on proximity sensing |
EP3425488A1 (en) * | 2017-07-03 | 2019-01-09 | Aptiv Technologies Limited | System and method for predicting a touch position of a pointer on a touch-enabled unit or determining a pointing direction in 3d space |
US10191630B2 (en) * | 2014-11-21 | 2019-01-29 | Renault S.A.S. | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element |
US10353545B2 (en) * | 2014-04-22 | 2019-07-16 | Entit Software Llc | Flow autocomplete |
US10564770B1 (en) * | 2015-06-09 | 2020-02-18 | Apple Inc. | Predictive touch detection |
US10725654B2 (en) * | 2017-02-24 | 2020-07-28 | Kabushiki Kaisha Toshiba | Method of displaying image selected from multiple images on touch screen |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9122390B2 (en) * | 2012-04-24 | 2015-09-01 | Behaviometrics Ab | Method, application and/or service to collect more fine-grained or extra event data from a user sensor device |
US10637933B2 (en) | 2016-05-26 | 2020-04-28 | Logitech Europe S.A. | Method and apparatus for transferring information between electronic devices |
US11010013B2 (en) * | 2018-10-29 | 2021-05-18 | International Business Machines Corporation | Providing visual control guidance on a display |
US11209979B2 (en) | 2019-05-10 | 2021-12-28 | Microsoft Technology Licensing, Llc | Systems and methods for input interfaces promoting obfuscation of user navigation and selections |
US11086514B2 (en) | 2019-05-10 | 2021-08-10 | Microsoft Technology Licensing, Llc | Systems and methods for obfuscating user navigation and selections directed by free-form input |
US20200356263A1 (en) | 2019-05-10 | 2020-11-12 | Microsoft Technology Licensing, Llc | Systems and methods for obscuring touch inputs to interfaces promoting obfuscation of user selections |
US11301056B2 (en) | 2019-05-10 | 2022-04-12 | Microsoft Technology Licensing, Llc | Systems and methods for obfuscating user selections |
US11112881B2 (en) | 2019-05-10 | 2021-09-07 | Microsoft Technology Licensing, Llc. | Systems and methods for identifying user-operated features of input interfaces obfuscating user navigation |
US11526273B2 (en) | 2019-05-10 | 2022-12-13 | Microsoft Technology Licensing, Llc | Systems and methods of selection acknowledgement for interfaces promoting obfuscation of user operations |
US11562638B2 (en) | 2020-08-24 | 2023-01-24 | Logitech Europe S.A. | Electronic system and method for improving human interaction and activities |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7952570B2 (en) * | 2002-06-08 | 2011-05-31 | Power2B, Inc. | Computer navigation |
US7308112B2 (en) * | 2004-05-14 | 2007-12-11 | Honda Motor Co., Ltd. | Sign based human-machine interaction |
US7653883B2 (en) * | 2004-07-30 | 2010-01-26 | Apple Inc. | Proximity detector in handheld device |
US20070067744A1 (en) * | 2005-08-11 | 2007-03-22 | Lane David M | System and method for the anticipation and execution of icon selection in graphical user interfaces |
US7877707B2 (en) * | 2007-01-06 | 2011-01-25 | Apple Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
US20080309621A1 (en) * | 2007-06-15 | 2008-12-18 | Aggarwal Akhil | Proximity based stylus and display screen, and device incorporating same |
US8432365B2 (en) * | 2007-08-30 | 2013-04-30 | Lg Electronics Inc. | Apparatus and method for providing feedback for three-dimensional touchscreen |
US8402391B1 (en) * | 2008-09-25 | 2013-03-19 | Apple, Inc. | Collaboration system |
Legal events:
- 2009-10-01: US application US12/571,448 filed; granted as US8261211B2 (active)
- 2012-08-01: US continuation US13/564,478 filed; published as US20120293439A1 (abandoned)
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060238519A1 (en) * | 1998-01-26 | 2006-10-26 | Fingerworks, Inc. | User interface gestures |
US20080042989A1 (en) * | 1998-01-26 | 2008-02-21 | Apple Inc. | Typing with a touch sensor |
US7358956B2 (en) * | 1998-09-14 | 2008-04-15 | Microsoft Corporation | Method for providing feedback responsive to sensing a physical presence proximate to a control of an electronic device |
US20020196238A1 (en) * | 2001-06-20 | 2002-12-26 | Hitachi, Ltd. | Touch responsive display unit and method |
US20090027334A1 (en) * | 2007-06-01 | 2009-01-29 | Cybernet Systems Corporation | Method for controlling a graphical user interface for touchscreen-enabled computer systems |
US20090006101A1 (en) * | 2007-06-28 | 2009-01-01 | Matsushita Electric Industrial Co., Ltd. | Method to detect and assist user intentions with real time visual feedback based on interaction language constraints and pattern recognition of sensory features |
US20090007001A1 (en) * | 2007-06-28 | 2009-01-01 | Matsushita Electric Industrial Co., Ltd. | Virtual keypad systems and methods |
US20090015559A1 (en) * | 2007-07-13 | 2009-01-15 | Synaptics Incorporated | Input device and method for virtual trackball operation |
US20110138324A1 (en) * | 2009-06-05 | 2011-06-09 | John Sweeney | Predictive target enlargement |
Cited By (54)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9535512B2 (en) * | 2009-08-12 | 2017-01-03 | Shimane Prefectural Government | Image recognition apparatus, operation determining method and computer-readable medium |
US20130158748A1 (en) * | 2010-09-03 | 2013-06-20 | Aldebaran Robotics | Mobile robot |
US9400504B2 (en) * | 2010-09-03 | 2016-07-26 | Aldebaran Robotics | Mobile robot |
US9916046B2 (en) * | 2010-09-07 | 2018-03-13 | Sony Corporation | Controlling movement of displayed objects based on user operation |
US20120056849A1 (en) * | 2010-09-07 | 2012-03-08 | Shunichi Kasahara | Information processing device, information processing method, and computer program |
US20190384432A1 (en) * | 2010-10-12 | 2019-12-19 | New York University | Fusing Depth and Pressure Imaging to Provide Object Identification for Multi-Touch Surfaces |
US11809659B2 (en) * | 2010-10-12 | 2023-11-07 | New York University | Fusing depth and pressure imaging to provide object identification for multi-touch surfaces |
US11301083B2 (en) | 2010-10-12 | 2022-04-12 | New York University | Sensor having a set of plates, and method |
US11249589B2 (en) * | 2010-10-12 | 2022-02-15 | New York University | Fusing depth and pressure imaging to provide object identification for multi-touch surfaces |
US20160364047A1 (en) * | 2010-10-12 | 2016-12-15 | New York University | Fusing Depth and Pressure Imaging to Provide Object Identification for Multi-Touch Surfaces |
US20220308729A1 (en) * | 2010-10-12 | 2022-09-29 | New York University | Fusing Depth and Pressure Imaging to Provide Object Identification for Multi-Touch Surfaces |
US10345984B2 (en) * | 2010-10-12 | 2019-07-09 | New York University | Fusing depth and pressure imaging to provide object identification for multi-touch surfaces |
US9360959B2 (en) * | 2010-10-12 | 2016-06-07 | Tactonic Technologies, Llc | Fusing depth and pressure imaging to provide object identification for multi-touch surfaces |
US20120087545A1 (en) * | 2010-10-12 | 2012-04-12 | New York University & Tactonic Technologies, LLC | Fusing depth and pressure imaging to provide object identification for multi-touch surfaces |
US9678543B2 (en) * | 2010-11-26 | 2017-06-13 | Sony Corporation | Information processing device, information processing method, and computer program product with display inclination features |
US10503218B2 (en) | 2010-11-26 | 2019-12-10 | Sony Corporation | Information processing device and information processing method to control display of image based on inclination information |
US20120133677A1 (en) * | 2010-11-26 | 2012-05-31 | Sony Corporation | Information processing device, information processing method, and computer program product |
US8872769B2 (en) * | 2011-01-06 | 2014-10-28 | Alps Electric Co., Ltd. | Haptic input device |
US20120176319A1 (en) * | 2011-01-06 | 2012-07-12 | Ikuyasu Miyako | Haptic input device |
US9146654B2 (en) * | 2011-05-25 | 2015-09-29 | International Business Machines Corporation | Movement reduction when scrolling for item selection during direct manipulation |
US20120304122A1 (en) * | 2011-05-25 | 2012-11-29 | International Business Machines Corporation | Movement reduction when scrolling for item selection during direct manipulation |
US9146656B1 (en) * | 2011-06-27 | 2015-09-29 | Google Inc. | Notifications user interface |
US20130088454A1 (en) * | 2011-10-11 | 2013-04-11 | International Business Machines Corporation | Pointing to a desired object displayed on a touchscreen |
US8860679B2 (en) * | 2011-10-11 | 2014-10-14 | International Business Machines Corporation | Pointing to a desired object displayed on a touchscreen |
US20130097550A1 (en) * | 2011-10-14 | 2013-04-18 | Tovi Grossman | Enhanced target selection for a touch-based input enabled user interface |
US10684768B2 (en) * | 2011-10-14 | 2020-06-16 | Autodesk, Inc. | Enhanced target selection for a touch-based input enabled user interface |
US10175883B2 (en) * | 2011-12-15 | 2019-01-08 | Amazon Technologies, Inc. | Techniques for predicting user input on touch screen devices |
US20160259547A1 (en) * | 2011-12-15 | 2016-09-08 | Amazon Technologies, Inc. | Techniques for predicting user input on touch screen devices |
US9372829B1 (en) * | 2011-12-15 | 2016-06-21 | Amazon Technologies, Inc. | Techniques for predicting user input on touch screen devices |
US9747002B2 (en) | 2011-12-28 | 2017-08-29 | Samsung Electronics Co., Ltd | Display apparatus and image representation method using the same |
WO2013100727A1 (en) * | 2011-12-28 | 2013-07-04 | Samsung Electronics Co., Ltd. | Display apparatus and image representation method using the same |
US20140237423A1 (en) * | 2013-02-20 | 2014-08-21 | Fuji Xerox Co., Ltd. | Data processing apparatus, data processing system, and non-transitory computer readable medium |
US9619101B2 (en) * | 2013-02-20 | 2017-04-11 | Fuji Xerox Co., Ltd. | Data processing system related to browsing |
CN103995638A (en) * | 2013-02-20 | 2014-08-20 | Fuji Xerox Co., Ltd. | Data processing apparatus, data processing system, and non-transitory computer readable medium |
EP2860614A1 (en) * | 2013-10-10 | 2015-04-15 | ELMOS Semiconductor AG | Method and device for handling graphically displayed data |
US9405390B2 (en) * | 2013-12-18 | 2016-08-02 | International Business Machines Corporation | Object selection for computer display screen |
US10353545B2 (en) * | 2014-04-22 | 2019-07-16 | EntIT Software LLC | Flow autocomplete |
GB2527891A (en) * | 2014-07-04 | 2016-01-06 | Jaguar Land Rover Ltd | Apparatus and method for determining an intended target |
US10719133B2 (en) | 2014-07-04 | 2020-07-21 | Jaguar Land Rover Limited | Apparatus and method for determining an intended target |
GB2528245A (en) * | 2014-07-04 | 2016-01-20 | Jaguar Land Rover Ltd | Apparatus and method for determining an intended target |
GB2530847A (en) * | 2014-07-04 | 2016-04-06 | Jaguar Land Rover Ltd | Apparatus and method for determining an intended target |
US10042445B1 (en) * | 2014-09-24 | 2018-08-07 | Amazon Technologies, Inc. | Adaptive display of user interface elements based on proximity sensing |
US10191630B2 (en) * | 2014-11-21 | 2019-01-29 | Renault S.A.S. | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element |
US10481787B2 (en) * | 2014-11-21 | 2019-11-19 | Renault S.A.S. | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element |
US20170308259A1 (en) * | 2014-11-21 | 2017-10-26 | Renault S.A.S. | Graphical interface and method for managing said graphical interface during the touch-selection of a displayed element |
US10564770B1 (en) * | 2015-06-09 | 2020-02-18 | Apple Inc. | Predictive touch detection |
EP3182250A1 (en) * | 2015-12-18 | 2017-06-21 | Delphi Technologies, Inc. | System and method for monitoring 3d space in front of an output unit for the control of the output unit |
CN106896913A (en) * | 2015-12-18 | 2017-06-27 | Delphi Technologies, Inc. | System and method for monitoring 3D space in front of an output unit for the control of the output unit |
US10031624B2 (en) | 2015-12-18 | 2018-07-24 | Delphi Technologies, Inc. | System and method for monitoring 3D space in front of an output unit for the control of the output unit |
US10725654B2 (en) * | 2017-02-24 | 2020-07-28 | Kabushiki Kaisha Toshiba | Method of displaying image selected from multiple images on touch screen |
US10620752B2 (en) | 2017-07-03 | 2020-04-14 | Delphi Technologies, LLC | System and method for predicting a touch position of a pointer on a touch-enabled unit or determining a pointing direction in 3D space |
CN109213363A (en) * | 2017-07-03 | 2019-01-15 | Delphi Technologies, LLC | System and method for predicting a touch position of a pointer on a touch-enabled unit or determining a pointing direction in 3D space |
EP4224293A1 (en) | 2023-08-09 | Aptiv Technologies Limited | System and method for predicting a touch position of a pointer on a touch-enabled unit or determining a pointing direction in 3D space |
EP3425488A1 (en) * | 2017-07-03 | 2019-01-09 | Aptiv Technologies Limited | System and method for predicting a touch position of a pointer on a touch-enabled unit or determining a pointing direction in 3d space |
Also Published As
Publication number | Publication date |
---|---|
US20120293439A1 (en) | 2012-11-22 |
US8261211B2 (en) | 2012-09-04 |
Similar Documents
Publication | Title |
---|---|
US8261211B2 (en) | Monitoring pointer trajectory and modifying display interface |
US11726630B2 (en) | Relative touch user interface enhancements |
KR101467513B1 (en) | Apparatus for controlling mobile terminal and method thereof |
EP2256614B1 (en) | Display control apparatus, display control method, and computer program |
US7849421B2 (en) | Virtual mouse driving apparatus and method using two-handed gestures |
US20090066659A1 (en) | Computer system with touch screen and separate display screen |
EP2972669B1 (en) | Depth-based user interface gesture control |
US8466934B2 (en) | Touchscreen interface |
EP2686758B1 (en) | Input device user interface enhancements |
KR101085603B1 (en) | Gesturing with a multipoint sensing device |
CN107368191B (en) | System for gaze interaction |
US20160110095A1 (en) | Ergonomic motion detection for receiving character input to electronic devices |
US20100037183A1 (en) | Display Apparatus, Display Method, and Program |
US20100259482A1 (en) | Keyboard gesturing |
US20120092278A1 (en) | Information Processing Apparatus, and Input Control Method and Program of Information Processing Apparatus |
US20110163988A1 (en) | Image object control system, image object control method and image object control program |
US8456433B2 (en) | Signal processing apparatus, signal processing method and selection method of user interface icon for multi-touch panel |
US20120218307A1 (en) | Electronic device with touch control screen and display control method thereof |
US10956030B2 (en) | Multi-touch based drawing input method and apparatus |
US11112965B2 (en) | Advanced methods and systems for text input error correction |
US20120179963A1 (en) | Multi-touch electronic device, graphic display interface thereof and object selection method of multi-touch display |
JP5657866B2 (en) | Input device, pointer display position adjustment method and program |
US20150091803A1 (en) | Multi-touch input method for touch input device |
CN110727388B (en) | Method and device for controlling input method keyboard |
US10175825B2 (en) | Information processing apparatus, information processing method, and program for determining contact on the basis of a change in color of an image |
Legal Events
Code | Title | Description |
---|---|---|
AS | Assignment | Owner name: MICROSOFT CORPORATION, WASHINGTON. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PAHUD, MICHEL;MURILLO, OSCAR E.;KARLSON, AMY K.;AND OTHERS;SIGNING DATES FROM 20090928 TO 20090929;REEL/FRAME:023310/0220 |
STCF | Information on status: patent grant | Free format text: PATENTED CASE |
AS | Assignment | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034564/0001. Effective date: 20141014 |
FPAY | Fee payment | Year of fee payment: 4 |
AS | Assignment | Owner name: MICROSOFT CORPORATION, WASHINGTON. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PAHUD, MICHEL;MURILLO, OSCAR E.;KARLSON, AMY K.;AND OTHERS;SIGNING DATES FROM 20090928 TO 20090929;REEL/FRAME:038297/0721 |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 8 |
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 12 |