Publication number: US 20110202934 A1
Publication type: Application
Application number: US 13/026,097
Publication date: 18 Aug 2011
Filing date: 11 Feb 2011
Priority date: 12 Feb 2010
Also published as: US 20110202889
Inventor: Lester F. Ludwig
Original Assignee: Ludwig Lester F
Window manager input focus control for high dimensional touchpad (HDTP), advanced mice, and other multidimensional user interfaces
US 20110202934 A1
Abstract
A method for routing signals from a user interface device providing traditional user interface device signals and additional user interface signals to an application is described. Traditional user interface device signals and additional user interface signals are received from a user interface device. Routing of traditional user interface device signals and additional user interface signals from such a user interface device to particular applications can be made responsive to input focus control provided by a windowing system, window manager, operating system, or combination. In one approach, input focus control is provided by a single focus control element. In another approach, separate focus control elements are used for traditional user interface device signals and additional user interface signals.
Images (13)
Claims (14)
1. A method for routing signals from a user interface device to an application, the user interface device providing traditional user interface device signals and additional user interface signals, the method comprising:
receiving traditional user interface device signals and additional user interface signals from a user interface device;
routing the traditional user interface device signals to a selected application according to a first input focus selection;
routing the additional user interface signals to the selected application according to a second input focus selection;
wherein the first and second input focus selections are made by at least one focus control element.
2. The method of claim 1 wherein the at least one focus control element comprises a window manager.
3. The method of claim 1 wherein the at least one focus control element comprises a window system.
4. The method of claim 1 wherein the at least one focus control element comprises an operating system.
5. The method of claim 1 wherein both the first and second input focus selections are made by the same focus control element.
6. The method of claim 1 wherein the first input focus selection is made by a first focus control element and the second input focus selection is made by a second focus control element.
7. The method of claim 1 wherein the user input device is a computer mouse comprising a first and second scroll wheel.
8. The method of claim 1 wherein the user input device is a computer mouse comprising a touchpad.
9. The method of claim 1 wherein the user input device is a computer mouse comprising a High Dimensional Touchpad (HDTP).
10. The method of claim 1 wherein the user input device comprises a touch user interface responsive to gestures and the at least one additional user-adjustable input comprises at least one gesture.
11. The method of claim 1 wherein the user input device comprises a touch user interface responsive to the yaw angle of a finger in contact with the touch user interface and the at least one additional user-adjustable input is responsive to a measurement of the yaw angle.
12. The method of claim 1 wherein the user input device comprises a touch user interface responsive to the roll angle of a finger in contact with the touch user interface and the at least one additional user-adjustable input is responsive to a measurement of the roll angle.
13. The method of claim 1 wherein the user input device comprises a touch user interface responsive to the pitch angle of a finger in contact with the touch user interface and the at least one additional user-adjustable input is responsive to a measurement of the pitch angle.
14. The method of claim 1 wherein the user input device comprises a touch user interface responsive to at least two angles of a finger in contact with the touch user interface and the at least one additional user-adjustable input is responsive to a measurement of each of the two angles.
Description
    CROSS-REFERENCE TO RELATED APPLICATIONS
  • [0001]
    Pursuant to 35 U.S.C. 119(e), this application claims benefit of priority from Provisional U.S. Patent Application Ser. No. 61/303,898, filed Feb. 12, 2010, the contents of which are incorporated by reference.
  • COPYRIGHT & TRADEMARK NOTICES
  • [0002]
    Certain marks referenced herein may be common law or registered trademarks of the applicant, the assignee or third parties affiliated or unaffiliated with the applicant or the assignee. Use of these marks is for providing an enabling disclosure by way of example and shall not be construed to exclusively limit the scope of the disclosed subject matter to material associated with such marks.
  • BACKGROUND OF THE INVENTION
  • [0003]
    1. Field of the Invention
  • [0004]
    The invention relates to user interface devices that provide additional user interface control signals beyond those of a traditional mouse, touchpad, or trackball, and also to window manager input focus control, and in particular to the direction of signals from such a user interface device to particular applications responsive to input focus control provided by a windowing system, operating system, or both.
  • [0005]
    2. Overview of the Invention
  • [0006]
    The present invention addresses the routing of additional user interface signals provided by the High Dimensional Touchpad ("HDTP") (for example as taught in the 1999 filings of U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. No. 11/761,978; pending U.S. patent application Ser. Nos. 12/418,605, 12/502,230, 12/541,948; and related pending U.S. patent applications), Advanced Mice (for example as taught in U.S. Pat. No. 7,557,797 and pending U.S. patent application Ser. Nos. 12/619,678, 13/025,129, 13/024,569, and related pending U.S. patent applications), and other multidimensional or rich parameter user interfaces providing additional user interface signals beyond those found in traditional computer mice, touchpads, and trackballs. The routing of signals from such a user interface device to particular applications can be made responsive to input focus control provided by a windowing system, operating system, or both.
  • SUMMARY OF THE INVENTION
  • [0007]
    For purposes of summarizing, certain aspects, advantages, and novel features are described herein. Not all such advantages may be achieved in accordance with any one particular embodiment. Thus, the disclosed subject matter may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages without achieving all advantages as may be taught or suggested herein.
  • [0008]
    In one aspect of the invention, a method for routing signals from a user interface device providing traditional user interface device signals and additional user interface signals to an application includes receiving traditional user interface device signals and additional user interface signals from a user interface device, routing the traditional user interface device signals to a selected application according to a first input focus selection, and routing the additional user interface signals to the selected application according to a second input focus selection. The first and second input focus selections are made by at least one focus control element.
  • [0009]
    Another aspect of the invention is that the at least one focus control element comprises a window manager, a window system, or an operating system. Further, both input focus selections can be made by the same focus control element, or the first input focus selection can be made by a first focus control element and the second input focus selection by a second focus control element.
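    As a concrete illustration of this dual-focus arrangement, the following minimal Python sketch routes the two signal classes according to two independently settable focus selections. All names here (Event, DualFocusRouter, the kind strings) are assumptions introduced for exposition; the patent does not specify an implementation or API.

        from dataclasses import dataclass

        @dataclass
        class Event:
            kind: str     # "traditional" (x/y motion, clicks) or "additional" (yaw, roll, pitch, ...)
            name: str
            value: float

        class DualFocusRouter:
            """Routes the two signal classes per two input focus selections."""
            def __init__(self):
                self.first_focus = None    # set by a focus control element
                self.second_focus = None   # set by the same or another element

            def route(self, event, deliver):
                # Traditional signals follow the first input focus selection;
                # additional signals follow the second, which may name a
                # different application.
                target = (self.first_focus if event.kind == "traditional"
                          else self.second_focus)
                if target is not None:
                    deliver(target, event)

        # Example: one focus control element sets both selections to one app.
        router = DualFocusRouter()
        router.first_focus = router.second_focus = "Application K"
        router.route(Event("additional", "yaw", 0.25),
                     lambda app, ev: print(f"{ev.name}={ev.value} -> {app}"))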
  • [0010]
    The user input device may be a computer mouse comprising a first and second scroll wheel, a touchpad, or a High Dimensional Touchpad (HDTP). When the user input device comprises a touch user interface, the touch user interface is responsive to gestures and the additional user-adjustable input comprises at least one gesture.
  • [0011]
    The touch user interface may be responsive to the yaw angle, roll angle, or pitch angle of a finger in contact with the touch user interface, with the additional user-adjustable input responsive to a measurement of that angle. The touch user interface also may be responsive to at least two angles of a finger in contact with the touch user interface, with the additional user-adjustable input responsive to a measurement of each of the two angles.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0012]
    The above and other aspects, features and advantages of the present invention will become more apparent upon consideration of the following description of preferred embodiments taken in conjunction with the accompanying drawing figures.
  • [0013]
    FIG. 1 depicts a plurality of windows, one or more of which can be a hypermedia application window such as a browser, and hierarchies of visually displayed and other objects within or associated with these windows.
  • [0014]
    FIG. 2 illustrates the side view of a finger lightly touching the surface of a tactile sensor array.
  • [0015]
    FIG. 3 depicts a popularly accepted model of a typical cell phone or PDA capacitive proximity sensor implementation.
  • [0016]
    FIG. 4 is a graphical representation of a tactile image produced by contact of a human finger on a tactile sensor array.
  • [0017]
    FIG. 5 provides a graphical representation of a tactile image produced by contact with multiple human fingers on a tactile sensor array of lesser spatial resolution than that depicted in FIG. 4.
  • [0018]
    FIG. 6 depicts a signal flow in an HDTP implementation.
  • [0019]
    FIGS. 7 a-7 f illustrate the six independently adjustable degrees of freedom of touch from a single finger that can be simultaneously measured by the HDTP technology.
  • [0020]
    FIG. 8 suggests general ways in which two or more of these independently adjustable degrees of freedom can be adjusted simultaneously.
  • [0021]
    FIG. 9 demonstrates a few two-finger multi-touch postures and/or gestures from the many that can be readily recognized by HDTP technology.
  • [0022]
    FIG. 10 shows an example of how raw measurements of the six quantities of FIGS. 7 a-7 f, together with shape recognition for distinguishing contact with various parts of the hand and the touchpad, can be used to create a rich information flux of parameters, rates, and symbols.
  • [0023]
    FIG. 11 shows an approach for incorporating posture recognition, gesture recognition, and other functions to create a rich human/machine tactile interface system capable of additionally supporting or incorporating syntax and grammars.
  • [0024]
    FIGS. 12 a-12 d depict operations acting on various parameters, rates, and symbols to produce other parameters, rates, and symbols, including operations such as sample/hold, interpretation, context, etc.
  • [0025]
    FIG. 13 depicts a user interface input arrangement incorporating one or more HDTPs that provides user interface input event and quantity routing for focus control.
  • [0026]
    FIGS. 14 a-14 g depict a number of arrangements and embodiments employing HDTP technology.
  • [0027]
    FIGS. 15 a-15 e depict various integrations of an HDTP into the back of a conventional computer mouse as taught in U.S. Pat. No. 7,557,797 and pending U.S. patent application Ser. No. 12/619,678.
  • [0028]
    FIGS. 16 a and 16 b illustrate examples of a conventional scroll-wheel mouse provided with an added left-right scroll-wheel as taught in U.S. patent application Ser. No. 13/024,569.
  • [0029]
    FIGS. 17 a-17 c illustrate examples where a single trackball is incorporated into the back of a conventional computer mouse as taught in U.S. Pat. No. 7,557,797.
  • [0030]
    FIGS. 18 a-18 c illustrate examples where two trackballs are incorporated into the back of a conventional computer mouse as taught in U.S. Pat. No. 7,557,797, some of these (FIGS. 18 b-18 c) comprising yet other additional sensors.
  • [0031]
    FIG. 18 d depicts a mouse provided with a trackball and a small touchpad as taught in U.S. Pat. No. 7,557,797.
  • [0032]
    FIG. 18 e depicts a mouse provided with a plurality of slider controls as taught in U.S. Pat. No. 7,557,797.
  • [0033]
    FIGS. 19 a-19 c depict exemplary embodiments providing HDTP technologies with a HID device abstraction for interfacing to applications.
  • [0034]
    FIGS. 20 a-20 d depict arrangements for directing additional user interface parameter signals to applications.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • [0035]
    In the following description, reference is made to the accompanying drawing figures which form a part hereof, and which show by way of illustration specific embodiments of the invention. It is to be understood by those of ordinary skill in this technological field that other embodiments may be utilized, and structural, electrical, as well as procedural changes may be made without departing from the scope of the present invention.
  • [0036]
    In the following, numerous specific details are set forth to provide a thorough description of various embodiments. Certain embodiments may be practiced without these specific details or with some variations in detail. In some instances, certain features are described in less detail so as not to obscure other aspects. The level of detail associated with each of the elements or features should not be construed to qualify the novelty or importance of one feature over the others.
  • Windowing Systems
  • [0037]
    Desktop, laptop, tablet, web, and other types of contemporary computers provide for a plurality of active software applications to share visual display and user input devices by means of some form of windowing system. Windowing systems are well known, with foundational principles dating back decades (see for example F. R. Hopgood, et al., Methodology of Window Management, Springer-Verlag, Berlin, 1986, ISBN 0387161163), and are known at least at an operational level to virtually all users of these devices. Without getting into the many well-known aspects of windowing systems, one skilled in the art is reminded that:
      • A plurality of windows can be displayed simultaneously on the screen of the (desktop, laptop, tablet, or web) computer;
      • Multiple windows can overlap one another;
      • The windowing system also provides a visually-rendered cursor whose position is determined by left-right/forward-back operation provisions of a pointing device (mouse, touchpad, trackball, etc.);
      • Windows are typically selected by “clicking” a discrete-event provision (button operation, touchpad tap, etc.) of the pointing device—windows can also be selected by default in some cases, such as when the initialization of a previously inactive application displays, updates, or pops-up a new window;
      • A selected window remains selected until the user selects a different window or a window is selected by default;
      • User keyboard input and other types of pointing device input are typically directed to aspects of an application associated with the window that is currently selected.
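    The conventions just listed can be condensed into a short Python sketch of a toy window manager: a click hit-tests the topmost window under the cursor, the selection persists, and subsequent input is directed to the selected window. The class and method names are hypothetical and serve only to fix the ideas.

        from dataclasses import dataclass

        @dataclass
        class Window:
            app: str
            x: int
            y: int
            w: int
            h: int

        class WindowManager:
            def __init__(self, windows):
                self.windows = list(windows)   # later entries draw on top
                self.selected = self.windows[-1] if self.windows else None

            def click(self, cx, cy):
                # Topmost window under the cursor becomes selected and is
                # raised; it stays selected until another selection occurs.
                for win in reversed(self.windows):
                    if win.x <= cx < win.x + win.w and win.y <= cy < win.y + win.h:
                        self.windows.remove(win)
                        self.windows.append(win)
                        self.selected = win
                        return

            def key_input(self, key):
                # Keyboard (and pointing device) input goes to the selection.
                if self.selected is not None:
                    print(f"{key!r} -> {self.selected.app}")

        wm = WindowManager([Window("browser", 0, 0, 800, 600),
                            Window("editor", 400, 300, 640, 480)])
        wm.click(50, 50)      # selects the browser window
        wm.key_input("a")     # 'a' -> browser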
  • [0044]
    FIG. 1 depicts a visual display screen area 100 displaying a plurality of representative windows (here 101, 102, 110). In this figure, none of the windows 101, 102, 110 are shown as overlapping so as to streamline the discussion; one skilled in the art will understand the appearance of overlapping of one or more of these windows. One or more of these windows can be a hypermedia window such as a browser (here 110), with hierarchies of objects (111 and 111.1; 112 and 112.1, 112.2) rendered within or superimposed over the display area of the browser window 110. The hypermedia (browser or application) window 110 of FIG. 1 also depicts a toolbar 110.tb as well as a vertical scrollbar 110.vs and a horizontal scrollbar 110.hs. Such vertical and horizontal scrollbars typically appear when the display area within the window is smaller than the vertical and/or horizontal span of the visual content, allowing control of the vignette displayed within the aperture created by the display, this control responsive to the positions of the scrollbar(s) within their degrees of possible travel. The position of a scrollbar is in turn controlled by one or more input aspects from the pointing device.
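    The scrollbar-to-content relationship described above reduces to a simple proportional mapping. The following Python sketch, with illustrative names and units, shows one plausible form of it.

        def content_offset(scroll_pos, travel, content_span, viewport_span):
            """Map a scrollbar position within its travel to the top/left
            offset of the visible vignette within the content; the offset
            collapses to zero when the content fits in the viewport."""
            overflow = max(content_span - viewport_span, 0)
            fraction = scroll_pos / travel if travel else 0.0
            return fraction * overflow

        # Example: a 300-px-tall viewport over 1200 px of content, thumb at
        # mid-travel: the vignette starts 450 px into the content.
        print(content_offset(50, 100, 1200, 300))  # -> 450.0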
  • Traditional User Interface Pointing Devices
  • [0045]
    Turning now to traditional user interface pointing devices, the traditional mouse, trackball, touchpad, and touchscreen are typically used to provide the following user inputs:
  • [0000]
    Traditional
    User Input Traditional Traditional Touchpad or
    Type Mouse Trackball Touchscreen
    Cursor “X” Left-right position Left-right rotation Left-right position
    position of housing of trackball of finger/stylus
    Cursor “Y” Forward-back Forward-back Left-right position
    position position of rotation of of finger/stylus
    housing trackball
    Left click Left button Left button Left button and/or
    tap
    Right click Right button Right button Right button
    Double (left) Double operation Double operation Double operation
    click of Left button of Left button of Left button
    and/or double-tap
  • Contemporary Generation User Interface Pointing Devices
  • [0046]
    More contemporary computer mice additionally provide a scrollwheel along with the traditional components and features of the traditional mouse. Some scrollwheels allow the wheel to be depressed downward to operate a spring-loaded switch, providing a third class of button events. As mentioned above, the scrollwheel provided by a contemporary computer mouse is typically directed solely to the operation of the vertical scrollbar (for example, the scrollbar 110.vs of FIG. 1), if displayed, of the currently selected window. More recently, computer mice provide "2-way scrolling" (sometimes called "4-way scrolling") features wherein the scrollwheel, in addition to conventional forward-back rotation, can be tilted left or right, with the resulting signal directed to the control of the horizontal scrollbar (for example, the scrollbar 110.hs of FIG. 1), if displayed, of the currently selected window.
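    A minimal Python sketch of this wheel-event routing follows. The Scrollbar and SelectedWindow types and the event-kind strings are assumptions introduced for illustration rather than anything specified here.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Scrollbar:
            pos: float = 0.0                     # 0.0 (top/left) .. 1.0 (bottom/right)
            def move(self, delta: float) -> None:
                self.pos = max(0.0, min(1.0, self.pos + delta))

        @dataclass
        class SelectedWindow:
            vscroll: Optional[Scrollbar] = None  # e.g. 110.vs of FIG. 1
            hscroll: Optional[Scrollbar] = None  # e.g. 110.hs of FIG. 1

        def dispatch_wheel(kind: str, delta: float, win: SelectedWindow) -> None:
            # Forward-back rotation drives the vertical scrollbar of the
            # currently selected window; a left/right tilt ("2-way
            # scrolling") drives the horizontal one, when each is displayed.
            bar = win.vscroll if kind == "rotate" else win.hscroll
            if bar is not None:
                bar.move(delta)

        win = SelectedWindow(vscroll=Scrollbar(), hscroll=Scrollbar())
        dispatch_wheel("rotate", 0.1, win)   # vertical scroll
        dispatch_wheel("tilt", -0.05, win)   # horizontal scroll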
  • [0047]
    Providing an additional scroll control to a scrollwheel mouse that can be used to operate the horizontal scrollbar with a left-right operation was taught several years prior to the appearance of such products in the specification of issued U.S. Pat. No. 7,557,797 (priority date Feb. 12, 2004) and is to be addressed in a pending continuation patent application from that specification subject to that same priority date.
  • [0048]
    Additionally, touch screens have recently received tremendous attention with the addition of array tactile imaging capabilities. Such touch screen technologies permit multi-touch sensing, metaphors, and gestures. Although such touch screen technologies have obtained great commercial success from their defining role in the iPhone and subsequent adaptations in PDAs and other types of cell phones and hand-held devices, these were in fact taught in the 1999 filings of U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. No. 11/761,978.
  • [0049]
    These more advanced user interface pointing devices provide additional user control capabilities that can be used in hypermedia applications, and in particular in web-based applications rendered in a browser. A known example of this is the aforementioned use of the scrollwheel in controlling the degree of zoom in the web-based Google Maps application.
  • [0050]
    Further, there remains a wide range of additional control capabilities that can be provided by further enhanced user interface technologies. A number of representative enhanced user interface technologies are described next, specifically:
      • (a) the HDTP taught in the 1999 filings of U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. No. 11/761,978, pending U.S. patent application Ser. Nos. 12/418,605, 12/502,230, 12/541,948, and related pending U.S. patent applications; and
      • (b) the Advanced Mice taught in the 2004 filings of issued U.S. Pat. No. 7,557,797 and related pending U.S. patent applications such as Ser. Nos. 12/619,678, 13/025,129, and 13/024,569.
    The capabilities of these, or to a more limited extent the capabilities of contemporary generation user interface pointing devices, can be used to enhance the capabilities of traditional hypermedia objects (such as the hyperlink, button, rollover, menu, and slider) as well as to define new types of hypermedia objects.
  • HDTP User Interface Technology
  • [0053]
    In an embodiment, a touchpad used as a pointing and data entry device can comprise an array of sensors. The array of sensors is used to create a tactile image of a type associated with the type of sensor and method of contact by the human hand. The tactile image comprises an array of data elements, such as an array of pressure measurements, an array of proximity measurements, an array of reflective optical measurements, etc. Thus the tactile image can be or comprise a pressure image, proximity image, reflective optical image, etc. In an embodiment, each data element comprises a scalar numerical value corresponding to a measurement from an associated sensor. In another embodiment, at least one data element comprises a plurality of scalar numerical values. In an embodiment, each data element comprises one or more scalar values produced from signal processing, image processing, and/or other operations applied to measurements provided by an array of sensors.
  • [0054]
    In one embodiment, the individual sensors in the sensor array are pressure sensors and a direct pressure-sensing tactile image is generated by the sensor array.
  • [0055]
    In another embodiment, the individual sensors in the sensor array are proximity sensors and a direct proximity tactile image is generated by the sensor array. Since the contacting surface of finger or hand tissue typically deforms increasingly as pressure is applied, a sensor array comprised of proximity sensors also provides an indirect pressure-sensing tactile image.
  • [0056]
    In another embodiment, the individual sensors in the sensor array can be optical sensors. In one variation of this, an optical image is generated and an indirect proximity tactile image is generated by the sensor array. In another variation, the optical image can be observed through a transparent or translucent rigid material and, since the contacting surface of finger or hand tissue typically deforms increasingly as pressure is applied, the optical sensor array also provides an indirect pressure-sensing tactile image.
  • [0057]
    In some embodiments, the array of sensors can be transparent or translucent and can be provided with an underlying visual display element such as an alphanumeric and/or graphics and/or image display. The underlying visual display can comprise, for example, an LED array display, a backlit LCD, etc. Such an underlying display can be used to render geometric boundaries or labels for soft-key functionality implemented with the tactile sensor array, to display status information, etc.
  • [0058]
    In an embodiment, the touchpad can comprise a tactile sensor array that obtains or provides individual measurements for every enabled cell in the sensor array and provides these as numerical values. The numerical values can be communicated in a numerical data array, as a sequential data stream, or in other ways. When regarded as a numerical data array with row and column ordering that can be associated with the geometric layout of the individual cells of the sensor array, the numerical data array can be regarded as representing a tactile image.
  • [0059]
    The tactile sensor array should not be confused with the "null/contact" touchpad which, in normal operation, acts as a pair of orthogonally responsive potentiometers. These "null/contact" touchpads do not produce pressure images, proximity images, or other image data but rather, in normal operation, two voltages linearly corresponding to the location of a left-right edge and forward-back edge of a single area of contact. Such "null/contact" touchpads, which are universally found in existing laptop computers, are discussed and differentiated from tactile sensor arrays in issued U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. No. 11/761,978 (pre-grant publication U.S. 2007/0229477). Before leaving this topic, it is pointed out that these "null/contact" touchpads nonetheless can be inexpensively adapted with simple analog electronics to provide at least primitive multi-touch capabilities as taught in U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. No. 11/761,978 (therein, paragraphs [0022]-[0029] of its pre-grant publication U.S. 2007/0229477, for example).
  • [0060]
    One implementation of a tactile sensor array is a pressure sensor array. Pressure sensor arrays are discussed in U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. No. 11/761,978. These typically operate by measuring changes in electrical (resistive, capacitive) or optical properties of an elastic material as the material is compressed. Prominent manufacturers and suppliers of pressure sensor arrays include Tekscan, Inc. (307 West First Street, South Boston, Mass., 02127, www.tekscan.com), Pressure Profile Systems (5757 Century Boulevard, Suite 600, Los Angeles, Calif. 90045, www.pressureprofile.com), Sensor Products, Inc. (300 Madison Avenue, Madison, N.J. 07940 USA, www.sensorprod.com), and Xsensor Technology Corporation (Suite 111, 319-2nd Ave SW, Calgary, Alberta T2P 0C5, Canada, www.xsensor.com).
  • [0061]
    In lieu of a pressure sensor array, a proximity sensor array or effective equivalents (for example, as can be accomplished with a video camera as described in issued U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. No. 11/761,978) can be used as a tactile sensor array. In general, a tactile proximity sensor array suitable for use with the present invention can be implemented in a wide variety of ways using any number of techniques or physical effects. The only requirement is that the tactile proximity sensor array produce a multi-level gradient measurement image as a finger, part of a hand, or other pliable object varies its proximity in the immediate area of the sensor surface.
  • [0062]
    More specifically, FIG. 2 illustrates a representative side view of a finger 1001 lightly touching the surface 1002 of a tactile sensor array. In this example, the finger 1001 contacts the tactile sensor surface in a relatively small area 1003. In this situation, on either side the finger curves away from the region of contact 1003, where the non-contacting yet proximate portions of the finger grow increasingly far 1004 a, 1005 a, 1004 b, 1005 b from the surface of the sensor 1002. These variations in physical proximity of portions of the finger with respect to the sensor surface should cause each sensor element in the tactile proximity sensor array to provide a corresponding proximity measurement varying responsively to the proximity, separation distance, etc. The tactile proximity sensor array advantageously comprises enough spatial resolution to provide a plurality of sensors within the area occupied by the finger (for example, the area comprising width 1006). In this case, as the finger is pressed down, the region of contact 1003 grows as more and more of the pliable surface of the finger conforms to the tactile sensor array surface 1002, and the distances 1004 a, 1005 a, 1004 b, 1005 b contract. If the finger is tilted, for example by rolling counterclockwise in the user viewpoint (which in the depicted end-of-finger viewpoint is clockwise 1007 a), the separation distances on one side of the finger 1004 a, 1005 a will contract while the separation distances on the other side of the finger 1004 b, 1005 b will lengthen. Similarly, if the finger is tilted by rolling clockwise in the user viewpoint (which in the depicted end-of-finger viewpoint is counterclockwise 1007 b), the separation distances on the side of the finger 1004 b, 1005 b will contract while the separation distances on the side of the finger 1004 a, 1005 a will lengthen.
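    This left/right asymmetry is what makes a roll angle recoverable from the tactile image. As a hedged illustration (not the algorithm of the referenced patents), the Python sketch below derives a signed roll indication from the horizontal skew of the measured proximity mass.

        def estimate_roll(frame):
            """Signed roll indication from left/right skew of proximity mass.

            frame is a 2-D list of non-negative proximity measurements. As
            the finger rolls, separation distances shrink on one side and
            grow on the other, skewing measured mass toward the contacted
            side. This is an illustrative heuristic only.
            """
            cols = len(frame[0])
            col_mass = [sum(row[c] for row in frame) for c in range(cols)]
            total = sum(col_mass) or 1.0
            centroid = sum(c * m for c, m in enumerate(col_mass)) / total
            return centroid - (cols - 1) / 2.0   # 0.0 suggests a level finger

        frame = [[0, 1, 3, 1],
                 [0, 2, 5, 2],
                 [0, 1, 3, 1]]
        print(estimate_roll(frame))   # positive: mass skewed to the right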
  • [0063]
    Capacitive proximity sensors can be used in various handheld devices with touch interfaces (see for example, among many, http://electronics.howstuffworks.com/iphone2.htm, http://www.veritasetvisus.com/VVTP-12,%20Walker.pdf). Prominent manufacturers and suppliers include Balda AG (Bergkirchener Str. 228, 32549 Bad Oeynhausen, Germany, www.balda.de), Cypress (198 Champion Ct., San Jose, Calif. 95134, www.cypress.com), and Synaptics (2381 Bering Dr., San Jose, Calif. 95131, www.synaptics.com). In these sensors, the region of finger contact is detected by variations in localized capacitance resulting from capacitive proximity effects induced by a nearly-adjacent finger. More specifically, the electrical field at the intersection of orthogonally-aligned conductive buses is influenced by the vertical distance or gap between the surface of the sensor array and the skin surface of the finger. The capacitive proximity sensor technology is low-cost, reliable, long-life, stable, and can readily be made transparent. FIG. 3 (adapted from http://www.veritasetvisus.com/VVTP-12,%20Walker.pdf with slightly more functional detail added) shows a popularly accepted model of a typical cell phone or PDA capacitive proximity sensor implementation. In some embodiments the present invention can use the same spatial resolution as current capacitive proximity touchscreen sensor arrays. In other embodiments of the present invention, a higher spatial resolution is advantageous. For example, in many contemporary capacitive proximity sensors, the touch of a fingertip can fall within the physical dimensions of one sensor element or one sensor-separation spacing. In higher resolution implementations, the touch of a fingertip can span the physical dimensions of many sensor elements and sensor-separation spacings, for example as in the higher resolution example depicted in (soon to be discussed) FIGS. 4-5.
  • [0064]
    As a first example of an optical array sensor, Forrest M. Mims is credited as showing that a conventional LED can be used as a light detector as well as a light emitter. Recently, light-emitting diodes have been used as a tactile proximity sensor array (for example, as depicted in the video available at http://cs.nyu.edu/˜jhan/ledtouch/index.html). Such tactile proximity array implementations typically need to be operated in a darkened environment (as seen in the video in the above web link). In one embodiment provided for by the invention, each LED in an array of LEDs can be used as a photodetector as well as a light emitter, although a single LED can either transmit or receive at any one time. Each LED in the array can sequentially be selected to be set to receiving mode while others adjacent to it are placed in light-emitting mode. A particular LED in receiving mode can pick up reflected light from the finger, provided by its neighboring illuminating-mode LEDs. The invention provides for additional systems and methods that do not require darkness in the user environment in order to operate an LED array as a tactile proximity sensor. In one embodiment, potential interference from ambient light in the surrounding user environment can be limited by using an opaque pliable and/or elastically deformable surface covering the LED array that is appropriately reflective (directionally, amorphously, etc. as can be advantageous in a particular design) on the side facing the LED array. Such a system and method can be readily implemented in a wide variety of ways as is clear to one skilled in the art. In another embodiment, potential interference from ambient light in the surrounding user environment can be limited by employing amplitude, phase, or pulse-width modulated circuitry and/or software to control the underlying light emission and receiving process. For example, in an implementation the LED array can be configured to emit light modulated at a particular carrier frequency or variation waveform and to respond only to modulated light signal components, extracted from the received light signals, comprising that same carrier frequency or variation waveform. Such a system and method can be readily implemented in a wide variety of ways as is clear to one skilled in the art.
  • [0065]
    As a second example of an optical array sensor, use of video cameras for gathering control information from the human hand in various ways is discussed in U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. No. 11/761,978. In another video camera tactile controller embodiment, a flat or curved translucent panel can be used as a sensor surface. When a finger is placed on the translucent panel, light applied to the opposite side of the translucent panel reflects in a distinctly different manner than in other regions where there is no finger or other tactile contact. The image captured by an associated video camera will provide gradient information responsive to the contact and proximity of the finger with respect to the surface of the translucent panel. For example, the parts of the finger that are in contact with the surface will provide the greatest degree of reflection while parts of the finger that curve away from the surface of the sensor provide less reflection of the light. Gradients of the reflected light captured by the video camera can be arranged to produce a gradient image that appears similar to the multilevel quantized image captured by a pressure sensor. By comparing changes in gradient, changes in the position of the finger and pressure applied by the finger can be detected.
  • [0066]
    In many various embodiments, the tactile sensor array can be connected to interface hardware that sends numerical data responsive to tactile information captured by the tactile sensor array to a processor. In various embodiments, this processor will process the data captured by the tactile sensor array and transform it in various ways, for example into a collection of simplified data, or into a sequence of tactile image "frames" (this sequence akin to a video stream), or into highly refined information responsive to the position and movement of one or more fingers and/or other parts of the hand.
  • [0067]
    As to further representative detail of the latter example, a "frame" can refer to a 2-dimensional list comprising a number of rows and a number of columns forming an array, the array comprising tactile measurement value(s) for every sensor in a tactile sensor array at a given instance. In an embodiment, each data element comprises a scalar numerical value corresponding to a measurement from an associated sensor. In another embodiment, at least one data element comprises a plurality of scalar numerical values. In an embodiment, each data element comprises one or more scalar values produced from signal processing, image processing, and/or other operations applied to measurements provided by an array of sensors. The time interval between one frame and the next depends on the frame rate of the system and the number of frames in a unit time (usually frames per second). FIG. 4 is a graphical representation of a tactile image produced by contact with the bottom surface of the most outward section (between the end of the finger and the most nearby joint) of a human finger on a tactile sensor array. In this example tactile array, there are 24 rows and 24 columns; other realizations can have significantly more (hundreds or thousands of) rows and columns. Tactile measurement values of each cell are indicated by the numbers and shading in each cell. Darker cells represent cells with higher tactile measurement values. Similarly, FIG. 5 provides a graphical representation of an example tactile image produced by contact with multiple human fingers on a tactile sensor array. In other embodiments, there can be a larger or smaller number of pixels for a given image size, resulting in varying resolution. Additionally, there can be a larger or smaller area with respect to the image size, resulting in a greater or lesser potential measurement area for the region of contact to be located in or move about. (Note the sensor array of FIG. 3 has less spatial resolution than that associated with FIG. 5, which in turn has less spatial resolution than that associated with FIG. 4.)
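    A minimal sketch of this frame representation, using synthetic stand-in values rather than any data from the referenced figures, might look as follows.

        import numpy as np

        # A "frame": tactile measurement values for every sensor at one
        # instant, here 24 rows x 24 columns as in the FIG. 4 example
        # (larger arrays work the same way).
        rng = np.random.default_rng(0)
        frame = rng.integers(0, 256, size=(24, 24))

        frame_rate = 100              # frames per second (an assumed rate)
        dt = 1.0 / frame_rate         # time interval between successive frames

        # Cells above a threshold approximate the region of contact; darker
        # cells in FIG. 4 correspond to higher tactile measurement values.
        contact_cells = np.argwhere(frame > 200)
        print(len(contact_cells), "cells in the (synthetic) contact region")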
  • [0068]
    Individual sensor elements in a tactile sensor array can vary sensor-by-sensor when presented with the same stimulus. The invention provides for each sensor to be individually calibrated in implementations where that can be advantageous. Sensor-by-sensor measurement value scaling, offset, and/or nonlinear warpings can be invoked for all or selected sensor elements during data acquisition scans. Similarly, the invention provides for individual noisy or defective sensors to be tagged for omission of their flawed measurements during data acquisition scans and/or post-scan data processing.
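    A hedged sketch of such per-sensor calibration is shown below; the linear gain/offset model and all values are illustrative assumptions, as the text leaves the calibration scheme open.

        import numpy as np

        def calibrate(frame, gain, offset, bad_mask):
            # Per-sensor linear correction (a nonlinear warping could be
            # applied here as well); sensors tagged as noisy or defective
            # are omitted as NaN during data acquisition or post-processing.
            corrected = gain * frame.astype(float) + offset
            return np.where(bad_mask, np.nan, corrected)

        frame = np.array([[10, 12], [11, 250]])          # raw scan (synthetic)
        gain = np.array([[1.0, 0.9], [1.1, 1.0]])        # per-sensor scaling
        offset = np.array([[0.0, 0.5], [-0.2, 0.0]])     # per-sensor offset
        bad = np.array([[False, False], [False, True]])  # one defective sensor
        print(calibrate(frame, gain, offset, bad))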
  • [0069]
    FIG. 6 depicts an example realization wherein a tactile sensor array is provided with real-time or near-real-time data acquisition capabilities. The captured data reflects spatially distributed tactile measurements (such as pressure, proximity, etc.). The tactile sensory array and data acquisition stage provides this real-time or near-real-time tactile measurement data to a specialized image processing arrangement for the production of parameters, rates of change of those parameters, and symbols responsive to aspects of the hand's relationship with the tactile or other type of sensor array. In some applications, these measurements can be used directly. In other situations, the real-time or near-real-time derived parameters can be directed to mathematical mappings (such as scaling, offset, and/or nonlinear warpings) in real-time or near-real-time into real-time or near-real-time application-specific parameters or other representations useful for applications. In some embodiments, general purpose outputs can be assigned to variables defined or expected by the application.
  • [0070]
    FIGS. 7 a-7 f illustrate the six independently adjustable degrees of freedom of touch from a single finger that can be simultaneously measured by the HDTP technology. The depiction in these figures is from the side of the touchpad. FIGS. 7 a-7 c show actions of positional change (amounting to applied pressure in the case of FIG. 7 c) while FIGS. 7 d-7 f show actions of angular change. Each of these can be used to control a user interface parameter, allowing the touch of a single fingertip to control up to six simultaneously-adjustable quantities in an interactive user interface. In more detail:
      • FIG. 7 a depicts variation of the left/right position (“x”) of the finger contact;
      • FIG. 7 b depicts variation of the forward/back position (“y”) of the finger contact;
      • FIG. 7 c depicts variation of the up/down position or downward pressure (“p”) of the finger contact;
      • FIG. 7 d depicts variation of the clockwise/counterclockwise (yaw) angle (“ψ”) of the finger contact;
      • FIG. 7 e depicts variation of the left/right tilt (roll) angle (“φ”) of the finger contact;
      • FIG. 7 f depicts variation of the forward/back (pitch) angle (“θ”) of the finger contact.
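    The six simultaneously measured quantities just enumerated are naturally carried as one record per finger per frame; a minimal hypothetical Python container follows (field names are illustrative).

        from dataclasses import dataclass

        @dataclass
        class FingerPosture:
            """Six independently adjustable DOFs of one finger (FIGS. 7 a-7 f)."""
            x: float      # left/right position
            y: float      # forward/back position
            p: float      # up/down position or downward pressure
            yaw: float    # clockwise/counterclockwise angle, psi
            roll: float   # left/right tilt angle, phi
            pitch: float  # forward/back angle, theta

        posture = FingerPosture(x=0.42, y=0.63, p=0.10,
                                yaw=0.05, roll=-0.12, pitch=0.30)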
  • [0077]
    FIG. 8 suggests general ways in which two or more of these independently adjustable degrees of freedom can be adjusted at once with a single finger 800:
      • left/right position (“x”) of the finger contact 811;
      • forward/back position (“y”) of the finger contact 812;
      • up/down position or downward pressure (“p”) of the finger contact 816;
      • clockwise/counterclockwise (yaw) angle (“ψ”) of the finger contact 815;
      • left/right tilt (roll) angle (“φ”) of the finger contact 813;
      • forward/back (pitch) angle (“θ”) of the finger contact 814.
  • [0084]
    More advanced implementations of the HDTP provide for multi-touch capabilities that can be far more sophisticated than those popularized by the Apple iPhone, NYU, and others.
  • [0085]
    FIG. 9 demonstrates a few representative two-finger multi-touch postures and/or gestures from the hundreds that can be readily recognized by HDTP technology. HDTP technology can also be configured to recognize and measure postures and/or gestures involving three or more fingers, various parts of the hand, the entire hand, multiple hands, etc., as taught for example in U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. Nos. 11/761,978 and 12/418,605.
  • [0086]
    FIG. 10 shows an example of how raw measurements of the six quantities of FIGS. 7 a-7 f, together with shape recognition for distinguishing contact with various parts of the hand and the touchpad, can be used to create a rich information flux of parameters, rates, and symbols, as taught for example in U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. Nos. 11/761,978 and 12/418,605.
  • [0087]
    FIG. 11 shows a representative approach for incorporating posture recognition, gesture recognition, state machines, and parsers to create an even richer human/machine tactile interface system capable of incorporating syntax and grammars, as taught for example in U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. Nos. 11/761,978 and 12/418,605.
  • [0088]
    The HDTP affords and provides for yet further capabilities. For example, a sequence of symbols can be directed to a state machine, as shown in FIG. 12 a, to produce other symbols that serve as interpretations of one or more possible symbol sequences. In an embodiment, one or more symbols can be designated to carry the meaning of an "Enter" key, permitting the sampling of one or more varying parameter, rate, and/or symbol values and the holding of the value(s) until, for example, another "Enter" event, thus producing sustained values as illustrated in FIG. 12 b. In an embodiment, one or more symbols can be designated as setting a context for interpretation or operation and thus control mapping and/or assignment operations on parameter, rate, and/or symbol values as shown in FIG. 12 c. The operations associated with FIGS. 12 a-12 c can be combined to provide yet other capabilities. For example, the example arrangement of FIG. 12 d shows mapping and/or assignment operations that feed an interpretation state machine which in turn controls mapping and/or assignment operations. In implementations where context is involved, such as in arrangements like those depicted in FIGS. 12 b-12 d, the invention provides for both context-oriented and context-free production of parameter, rate, and symbol values. The parallel production of context-oriented and context-free values can be useful to drive multiple applications simultaneously, for data recording, diagnostics, user feedback, and a wide range of other uses.
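    As a hedged illustration of the FIG. 12 b sample/hold idea, the Python sketch below latches parameter values on an "Enter" symbol and sustains them until the next trigger; the class name and the choice of trigger symbol are assumptions.

        class SampleHold:
            """FIG. 12 b-style sample/hold: an 'Enter' symbol latches the
            current parameter values; outputs stay sustained until the next
            'Enter'. Names and trigger symbol are illustrative."""

            def __init__(self, enter_symbol="ENTER"):
                self.enter_symbol = enter_symbol
                self.held = {}

            def step(self, symbols, params):
                if self.enter_symbol in symbols:
                    self.held = dict(params)   # latch current values
                return self.held               # sustained between triggers

        sh = SampleHold()
        sh.step([], {"yaw": 0.1})              # -> {} (nothing latched yet)
        sh.step(["ENTER"], {"yaw": 0.4})       # -> {'yaw': 0.4}
        sh.step([], {"yaw": 0.9})              # -> {'yaw': 0.4} (held)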
  • [0089]
    FIG. 13 depicts a representative user arrangement incorporating one or more HDTP system(s) or subsystem(s) that provide(s) user interface input events and routing of HDTP-produced parameter values, rate values, symbols, etc. to a variety of applications. In an embodiment, these parameter values, rate values, symbols, etc. can be produced for example by utilizing one or more of the individual systems, individual methods, and/or individual signals described above in conjunction with the discussion of FIGS. 10, 11, and 12 a-12 b. As discussed later, such an approach can be used with other rich multiparameter user interface devices in place of the HDTP. An arrangement similar to that of FIG. 13 is also taught in pending U.S. patent application Ser. No. 12/502,230 "Control of Computer Window Systems, Computer Applications, and Web Applications via High Dimensional Touchpad User Interface" by Seung Lim, and FIG. 13 is adapted from FIG. 6 e of that pending application (U.S. patent application Ser. No. 12/502,230) for further expansion here.
  • [0090]
    In an implementation approach or modality of operation for an arrangement such as the one of FIG. 13, the Focus Control element uses a selected subset of the information stream provided by the HDTP or other user interface device providing traditional user-adjustable inputs supplemented by additional user-adjustable inputs. The Focus Control element uses this selected subset to interpret the user's intention for the direction of focus among several windows, applications, etc. The figure shows only applications, but some of these can be replaced with application child windows, operating system, background window, etc. In this example, focus may be controlled by an {x,y} location threshold test and a "select" symbol event, although other information may be used in its place.
  • [0091]
    In an arrangement such as the one of FIG. 13, or in other implementations, at least two parameters are used for navigation of the cursor when the overall interactive user interface system is in a mode recognizing input from cursor control. These can be, for example, the left-right (“x”) parameter and forward/back (“y”) parameter provided by the touchpad. The arrangement of FIG. 13 includes a representative implementation of this.
  • [0092]
    Alternatively, these two cursor-control parameters can be provided by another user interface device, for example another touchpad or a separate or attached mouse (the latter to be discussed shortly in the context of FIGS. 15 a-15 e).
  • [0093]
    In some situations, control of the cursor location can be implemented by more complex means. One example of this is the control of location of a 3D cursor wherein a third parameter must be employed to specify the depth coordinate of the cursor location. For such situations, the arrangement of FIG. 13 would be modified to include a third parameter (for use in specifying this depth coordinate) in addition to the left-right (“x”) parameter and forward/back (“y”) parameter described earlier.
  • [0094]
    In an embodiment, focus control is used to interactively route user interface signals among applications. In most current systems, there is at least some modality wherein the focus is determined by either the current cursor location or a previous cursor location when a selection event was made. In the user experience, this selection event typically involves the user interface providing an event symbol of some type (for example a mouse click, mouse double-click, touchpad tap, touchpad double-tap, etc.). The representative arrangement of FIG. 13 includes an implementation wherein a select event generated by the touchpad system is directed to the focus control element. The focus control element in this arrangement in turn controls a focus selection element that directs all or some of the broader information stream from the HDTP system to the currently selected application. (In FIG. 13, "Application K" has been selected as indicated by the thick-lined box and information-flow arrows.)
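    A minimal Python sketch of this focus-control behavior follows: a "select" symbol event samples the current {x,y} location, hit-tests it against application window rectangles, and switches the routing target for the broader information stream. All names and the rectangle representation are illustrative assumptions.

        class FocusControl:
            """FIG. 13-style focus control: a 'select' event samples {x, y},
            hit-tests it against application windows, and switches the
            target to which the broader HDTP information stream flows."""

            def __init__(self, windows):
                self.windows = windows          # {app_name: (x0, y0, x1, y1)}
                self.focused = None

            def on_event(self, x, y, symbols):
                if "select" in symbols:
                    for app, (x0, y0, x1, y1) in self.windows.items():
                        if x0 <= x < x1 and y0 <= y < y1:
                            self.focused = app
                            break
                return self.focused             # stream routing target

        fc = FocusControl({"Application K": (0, 0, 400, 300),
                           "Application L": (400, 0, 800, 300)})
        print(fc.on_event(120, 80, ["select"]))   # -> Application K
        print(fc.on_event(500, 80, []))           # still Application K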
  • [0095]
    In some embodiments, each application that is a candidate for focus selection provides a window displayed at least in part on the screen, or provides a window that can be deiconified from an icon tray or retrieved from beneath other windows that may be obfuscating it. In some embodiments, if the background window is selected, the focus selection element directs all or some of the broader information stream from the HDTP system to the operating system, window system, and/or features of the background window. In some embodiments, the background window can in fact be regarded as merely one of the applications shown in the right portion of the arrangement of FIG. 13. In other embodiments, the background window can in fact be regarded as being separate from the applications shown in the right portion of the arrangement of FIG. 13. In this case the routing of the broader information stream from the HDTP system to the operating system, window system, and/or features of the background window is not explicitly shown in FIG. 13.
  • Touchscreen and Other Embodiments of the HDTP
  • [0096]
    FIGS. 14 a-14 g and 15 a-15 e depict a number of representative arrangements and embodiments employing the HDTP technology. FIG. 14 a illustrates an HDTP as a peripheral that can be used with a desktop computer (shown) or laptop (not shown). FIG. 14 b depicts an HDTP integrated into a laptop in place of the traditional touchpad pointing device. In FIGS. 14 a-14 b the HDTP tactile sensor can be a stand-alone component or can be integrated over a display so as to form a touchscreen. FIG. 14 c depicts an HDTP integrated into a desktop computer display so as to form a touchscreen. FIG. 14 d shows the HDTP integrated into a laptop computer display so as to form a touchscreen.
  • [0097]
    FIG. 14 e depicts an HDTP integrated into a cell phone, smartphone, PDA, or other hand-held consumer device. FIG. 14 f shows an HDTP integrated into a test instrument, portable service-tracking device, portable service-entry device, field instrument, or other hand-held industrial device. In FIGS. 14 e-14 f the HDTP tactile sensor can be a stand-alone component or can be integrated over a display so as to form a touchscreen.
  • [0098]
    FIG. 14 g depicts an HDTP touchscreen configuration that can be used in a tablet computer, wall-mount computer monitor, digital television, video conferencing screen, kiosk, etc.
  • [0099]
    In at least the arrangements of FIGS. 14 a, 14 c, 14 d, and 14 g, or other sufficiently large tactile sensor implementations of the HDTP, more than one hand can be used and individually recognized as such.
  • Embodiments Incorporating the HDTP into a Traditional or Contemporary Generation Mouse
  • [0100]
    FIGS. 15 a-15 e depict various representative integrations of an HDTP into the back of a conventional computer mouse. In FIGS. 15 a-15 d the HDTP tactile sensor can be a stand-alone component or can be integrated over a display so as to form a touchscreen. Such configurations have recently been popularized by the product release of Apple's "Magic Mouse™", although such combinations of a mouse with a tactile sensor array on its back responsive to multitouch and gestures were taught earlier in pending U.S. patent application Ser. No. 12/619,678 (priority date Feb. 12, 2004) entitled "User Interface Mouse with Touchpad Responsive to Gestures and Multi-Touch."
  • [0101]
    In another embodiment taught in the specification of issued U.S. Pat. No. 7,557,797 and associated pending continuation applications, more than two touchpads can be included in the advanced mouse embodiment, for example as suggested in the arrangement of FIG. 15 e. As with the arrangements of FIGS. 15 a-15 d, one or more of the plurality of HDTP tactile sensors or exposed sensor areas of arrangements such as that of FIG. 15 e can be integrated over a display so as to form a touchscreen.
  • Advanced Mice User Interface Technology
  • [0102]
    The HDTP in the above examples is used to supply more than the traditional two user interface parameters provided by a conventional user interface input device such as a conventional computer mouse, trackball, touchpad, etc. The present invention provides for the use of other user interface input arrangements and devices as alternatives to or in conjunction with one or more HDTPs. In this section the features and capabilities of Advanced Mice are briefly reviewed and set up for their use in embodiments of the invention. Focus control can be implemented in a manner completely or nearly analogous with the arrangements of FIGS. 20 a-20 d, as well as with other approaches (for example as will be presented later in the context of FIGS. 19 a-19 c).
  • [0103]
    In a simple example, the scroll-wheel of a scroll-wheel mouse is used to provide a third simultaneously adjustable user interface parameter. In another example, a second scroll-wheel, or yet more additional scroll-wheels, can be added to a conventional scroll-wheel mouse. The resultant collection of scroll-wheels can be relatively positioned in parallel, oriented at orthogonal angles so as to support a coordinate metaphor, positioned on the sides of the mouse body, etc. FIGS. 16 a and 16 b illustrate examples of a conventional scroll-wheel mouse provided with an added left-right scroll-wheel 1622 as taught in U.S. patent application Ser. No. 13/024,569. Such arrangements can employ a connecting cable, or the device can be wireless.
  • [0104]
    In another example of Advanced Mice, one or more trackballs can be added to a conventional computer mouse, for example on the back of the mouse. FIGS. 17 a-17 c illustrate examples where a single trackball is incorporated into the back of a conventional computer mouse as taught in U.S. Pat. No. 7,557,797. FIGS. 18 a-18 c illustrate examples where two trackballs are incorporated into the back of a conventional computer mouse as taught in U.S. Pat. No. 7,557,797. The trackballs in the arrangements of FIGS. 17 a-17 c and FIGS. 18 a-18 c can be the conventional two-degree-of-freedom type (roll left-right, roll away-towards) or can provide three to six degrees of freedom as taught in U.S. Pat. No. 7,557,797 and U.S. patent application Ser. No. 10/806,694. Such arrangements can employ a connecting cable, or the device can be wireless.
  • [0105]
    Other example Advanced Mice arrangements include the trackball/touchpad/mouse combinations of FIGS. 18 c and 18 d and the multiple-slider configuration of FIG. 18 e, each taught in U.S. Pat. No. 7,557,797. Further example Advanced Mice arrangements include those with two or more scroll wheels (for example as in pending U.S. patent application Ser. No. 13/024,569), a multiple-parameter joystick providing three or more simultaneously adjustable user interface inputs on the back of a mouse (for example as in pending U.S. patent application Ser. No. 13/025,129), and such a multiple-parameter joystick combined with a trackball (for example as also in pending U.S. patent application Ser. No. 13/025,129).
  • [0106]
    Each of these arrangements can employ a connecting cable, or the device can be wireless.
  • Video Control
  • [0107]
    Additionally, images of the human hand as captured by video cameras can be used as an enhanced multiple-parameter interface responsive to hand positions and gestures, for example as taught in pending U.S. patent application Ser. No. 10/683,915, and more specifically in paragraphs [314], [321]-[332], [411], [653], and (in view of paragraph [325]) also paragraphs [241]-[263] of that pending application's pre-grant publication U.S. 2004/0118268.
  • Example use of the Additional Parameters by Applications
  • [0108]
    The types of human-machine geometric interaction between the hand and the HDTP facilitate many useful applications within a visualization environment. A few of these include control of visualization observation viewpoint location, orientation of the visualization, and controlling fixed or selectable ensembles of one or more of viewing parameters, visualization rendering parameters, pre-visualization operations parameters, data selection parameters, simulation control parameters, etc. As one example, the 6D orientation of a finger can be naturally associated with visualization observation viewpoint location and orientation, location and orientation of the visualization graphics, etc. As another example, the 6D orientation of a finger can be naturally associated with a vector field orientation for introducing synthetic measurements in a numerical simulation.
  • [0109]
    As yet another example, at least some aspects of the 6D orientation of a finger can be naturally associated with the orientation of a robotically positioned sensor providing actual measurement data. As another example, the 6D orientation of a finger can be naturally associated with an object location and orientation in a numerical simulation. As another example, the large number of interactive parameters can be abstractly associated with viewing parameters, visualization rendering parameters, pre-visualization operations parameters, data selection parameters, numeric simulation control parameters, etc.
  • [0110]
    In yet another example, the “x” and “y” parameters provided by the HDTP can be used for focus selection and the remaining parameters can be used to control parameters within a selected GUI.
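    (Informal illustration: a minimal Python sketch of this division of labor, with “x” and “y” selecting the focused GUI element and the remaining parameters driving controls within it. The window and control interfaces are illustrative assumptions.)

        def dispatch(windows, x, y, pressure, roll, pitch, yaw):
            # "x" and "y" make the focus selection ...
            focused = next((w for w in windows if w.contains(x, y)), None)
            if focused is not None:
                # ... and the remaining parameters control the selected GUI
                focused.set_controls(pressure=pressure, roll=roll,
                                     pitch=pitch, yaw=yaw)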
  • [0111]
    In still another example, the “x” and “y” parameters provided by the HDTP can be regarded as specifying a position within an underlying base plane and the roll and pitch angles can be regarded as specifying a position within a superimposed parallel plane. In a first example extension of the previous two-plane example, the yaw angle can be regarded as the rotational angle between the base and superimposed planes. In a second example extension of the previous two-plane example, the finger pressure can be employed to determine the distance between the base and superimposed planes. In a variation of the previous two-plane example, the base and superimposed planes need not be fixed as parallel but rather can intersect at an angle associated with the yaw angle of the finger. In each of these, either or both of the two planes can represent an index or indexed data, a position, a pair of parameters, etc. of a viewing aspect, visualization rendering aspect, pre-visualization operation, data selection, numeric simulation control, etc.
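    (Informal illustration: a minimal Python sketch of the two-plane interpretation, combining both example extensions above. Gains and names are illustrative assumptions.)

        import math

        def two_plane_state(x, y, roll, pitch, yaw, pressure,
                            angle_gain=0.02, height_gain=0.1):
            base_point = (x, y)                        # position in base plane
            overlay_point = (angle_gain * roll,        # position in the
                             angle_gain * pitch)       # superimposed plane
            plane_rotation = math.radians(yaw)         # angle between planes
            plane_separation = height_gain * pressure  # distance between planes
            return base_point, overlay_point, plane_rotation, plane_separation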
  • [0112]
    A large number of additional approaches are possible, as will be appreciated by one skilled in the art. These are provided for by the invention.
  • USB HID Device Abstraction
  • [0113]
    The USB HID device class provides an open interface useful for both traditional computer pointing devices such as the standard computer mouse and other user interface devices such as game controllers. The USB HID device class has also been used to interface with the Logitech 3DConnexion SpaceNavigator™. The USB HID device class is specified, at the time of this patent application, by at least the Device Class Definition for HID 1.11, currently available at http://www.usb.org/developers/devclass_docs/HID1_11.pdf. More generally, the invention provides for the USB HID device class to be used for at least the additional user interface signals (user interface parameters) provided by the High Dimensional Touchpad (HDTP), Advanced Mice, and other multidimensional or rich parameter user interfaces that generate additional user interface signals above those found in traditional computer mice, touchpads, and trackballs. This can be done in a number of ways, for example as taught in pending U.S. patent application Ser. No. 61/435,401 and as described below in material adapted from that pending U.S. patent application.
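    (Informal illustration: a minimal Python sketch of unpacking a hypothetical HID input report that carries the traditional pointer fields together with additional HDTP parameters. The report layout is an illustrative assumption, not the layout given by the cited specification or application.)

        import struct

        # Assumed 12-byte report: buttons, dx, dy, wheel, then pressure
        # plus signed roll/pitch/yaw as the additional parameters.
        REPORT_FMT = "<BbbbHhhh"

        def parse_report(report: bytes):
            (buttons, dx, dy, wheel,
             pressure, roll, pitch, yaw) = struct.unpack(REPORT_FMT, report)
            traditional = {"buttons": buttons, "dx": dx,
                           "dy": dy, "wheel": wheel}
            additional = {"pressure": pressure, "roll": roll,
                          "pitch": pitch, "yaw": yaw}
            return traditional, additional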
  • [0114]
    In a first exemplary embodiment, a USB HID device abstraction is employed to connect a computer or other device with an HDTP sensor that is connected to the computer via a USB interface. Here the exemplary HDTP signal processing and HDTP gesture processing are implemented on the computer or other device. The HDTP signal processing and HDTP gesture processing implementation can be realized via one or more of CPU software, GPU software, embedded processor software or firmware, and/or a dedicated integrated circuit. FIG. 19a depicts an exemplary implementation of such an embodiment.
  • [0115]
    In another exemplary embodiment, a USB HID device abstraction is employed to connect a computer or other device with an HDTP sensor and one or more associated processor(s), which in turn is/are connected to the computer via a USB interface. Here the exemplary HDTP signal processing and HDTP gesture detection are implemented on the one or more processor(s) associated with the HDTP sensor. The HDTP signal processing and HDTP gesture processing implementation can be realized via one or more of CPU software, GPU software, embedded processor software or firmware, and/or a dedicated integrated circuit. FIG. 19b depicts an exemplary implementation of such an embodiment.
  • [0116]
    In another exemplary embodiment, a USB HID device abstraction is used as a software interface even though no USB port is actually used. The HDTP signal processing and HDTP gesture processing implementation can be realized via one or more of CPU software, GPU software, embedded processor software or firmware, and/or a dedicated integrated circuit. FIG. 19c depicts an exemplary implementation of such an embodiment. Alternatively, APDs can interface to a computer or other device in yet other ways. For example, a special-purpose interface can be used. As another example, the Bluetooth networking standard can be used.
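    (Informal illustration: a minimal Python sketch suggesting how the application-facing interface can stay the same whether the HDTP signal and gesture processing runs on the host, on processor(s) associated with the sensor, or behind a software-only HID-style abstraction. Class and method names are illustrative assumptions; parse_report refers to the report-parsing sketch above.)

        class HDTPSource:
            def read_parameters(self):
                """Return (traditional, additional) parameter dicts."""
                raise NotImplementedError

        class HostProcessedSource(HDTPSource):
            # Raw sensor frames cross the interface; processing runs on the host.
            def __init__(self, transport, processor):
                self.transport, self.processor = transport, processor
            def read_parameters(self):
                return self.processor(self.transport.read_frame())

        class DeviceProcessedSource(HDTPSource):
            # Processing runs on the sensor's own processor(s); the host merely
            # unpacks finished reports (see parse_report above).
            def __init__(self, transport):
                self.transport = transport
            def read_parameters(self):
                return parse_report(self.transport.read_report())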
  • Support for Additional Parameters via Existing or Extended Window Systems, Window Managers, and Operating Systems
  • [0117]
    There are a number of ways in which conventional window systems, window managers, and operating systems can be used or adapted to support the additional interactively-controlled parameters provided by an APD. A few examples are provided here, and other approaches are anticipated by the invention.
  • [0118]
    The additional interactively-controlled user input parameters provided by an HDTP (such as those taught in the 1999 filings of issued U.S. Pat. No. 6,570,078 and pending U.S. patent application Ser. No. 11/761,978, in pending U.S. patent application Ser. Nos. 12/418,605, 12/502,230, and 12/541,948, and in related pending U.S. patent applications), by Advanced Mice (such as those taught in the 2004 filings of issued U.S. Pat. No. 7,557,797 and related pending U.S. patent applications such as Ser. Nos. 12/619,678, 13/025,129, and 13/024,569), and by other rich multiparameter user interface devices supply more interactively-controlled parameters than the established number supported by conventional window systems and operating systems. Provisions to support the use of the additional interactively-controlled parameters provided by HDTP, Advanced Mice, and other rich multiparameter user interface devices with existing or extended operating systems have been taught in pending U.S. patent application Ser. No. 12/875,128. Some material from pending U.S. patent application Ser. No. 12/875,128 is directly adapted in this section for convenience.
  • [0119]
    More generally, the invention provides for additional user interface parameter signals provided not only by the High Dimensional Touchpad (HDTP) and Advanced Mice, but also by other multidimensional or rich parameter user interfaces providing additional user interface signals above those found in traditional computer mice, touchpads, and trackballs. This fuller collection (HDTP, Advanced Mice, and other multidimensional or rich parameter user interface devices providing additional user interface signals above those found in traditional computer mice, touchpads, and trackballs) will be collectively referred to as Advanced Pointing Devices (APDs).
  • [0120]
    In one approach, the entire (interactively-controlled) information flux provided by an APD is carried over the same framework used to carry the traditional computer mouse/touchpad user interface signals from conventional pointing devices. In one version of this approach, only the driver for the APD need be added and recognized by the window system, window manager, operating system, or combination of these. The window system, window manager, operating system, or combination of these then distributes the entire (interactively-controlled) information flux to the application selected according to focus control implemented by the operating system. For some window systems, window managers, and operating systems, such an approach can be implemented without modification. In other implementations of window systems, window managers, operating systems, or combination of these, such an approach can require a modification to the window and/or operating system(s). Should a particular existing window system, window manager, operating system, or combination of these resident on a computing device require such modification, the invention provides for the modification to be implemented via a downloadable patch or other form of an update (for example, using a data-storage media).
  • [0121]
    FIGS. 19a and 19b depict a representative rendering of this approach. In each figure, the driver for the APD presents traditional computer mouse/touchpad user interface signals of the kind produced by conventional pointing devices (thin straight arrowed lines) to the window system, window manager, operating system, or combination of these, as well as the additional user interface signals (thick straight arrowed lines) from the APD. In each of these approaches, as well as other variations clear to one skilled in the art, the window system, window manager, operating system, or combination of these comprises a focus control functionality used to selectively route the traditional user interface signals and additional user interface signals. The focus control can be responsive to at least the position of a displayed cursor with respect to a displayed application window, the cursor and application window displayed on a display screen. In some approaches or operating modes, merely positioning the cursor within the window of an application makes a focus selection to that application. In other approaches or operating modes, positioning the cursor within the window of an application is not alone sufficient to make a focus selection to that application; instead the focus stays with the last selection until a user-provided selection event is made, for example a mouse click or double click, a touchpad tap or double-tap, a trackball button click or double click, etc.
  • [0122]
    In the suggestive rendering of FIGS. 19a and 19b, focus control (for example, as defined by cursor location with respect to one or more displayed application windows) is responsive to traditional (computer mouse/touchpad/trackball) user interface signals (thin straight arrowed lines). In other arrangements, such as a system employing a 3D desktop, at least one additional parameter can also be directed to focus control and/or cursor location. In the suggestive rendering of FIGS. 19a and 19b, there are a plurality of applications: some designed to accept only traditional computer mouse/touchpad user interface signals (in the upper right of each figure) and others designed to accept these traditional signals as well as one or more of the additional user interface signals provided by the APD (in the lower right of each figure). The applicable portions of the description apply even if there are fewer applications of either or both types, or if there is only one type or only one application overall. In the case of FIG. 19a, the focus control routes only the traditional interface signals to a selected application designed to accept only traditional computer mouse/touchpad user interface signals. In the case of FIG. 19b, the focus control routes a larger collection of signals, including both traditional computer mouse/touchpad user interface signals and at least one additional user interface signal made available by the APD.
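    (Informal illustration: a minimal Python sketch of a single focus-control element that makes the focus selection from the traditional signals and then routes the traditional signals, and, where accepted, the additional signals to the selected application. The focus-policy flag and application interface are illustrative assumptions.)

        class FocusControl:
            def __init__(self, click_to_focus=True):
                self.click_to_focus = click_to_focus
                self.focused = None

            def update(self, windows, cursor_x, cursor_y, clicked):
                # focus selection from cursor position (and optionally a click)
                under = next((w for w in windows
                              if w.contains(cursor_x, cursor_y)), None)
                if under is not None and (clicked or not self.click_to_focus):
                    self.focused = under

            def route(self, traditional, additional):
                if self.focused is None:
                    return
                app = self.focused.app
                app.handle_traditional(traditional)
                if getattr(app, "accepts_additional", False):
                    app.handle_additional(additional)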
  • [0123]
    In another approach, the window system, window manager, operating system, or combination of these distributes only the traditional computer mouse/touchpad user interface signals of conventional pointing devices, and other provisions are used to direct the additional user interface parameter signals provided by the APD to selected applications. This can be implemented in a number of ways. In one example, depicted in FIG. 19c, separate focus controls are used, each responsive to the traditional user interface signals provided by the APD. In another example, depicted in FIG. 19d, the operating system focus control provides signals to the routing element for the additional user interface parameter signals provided by the APD. Other variations are anticipated and are provided for by the invention.
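    (Informal illustration: a minimal Python sketch in the spirit of the second example, where the existing focus control handles the traditional signals unchanged and merely notifies a separate routing element that directs the additional APD signals to the same selected application. Names are illustrative assumptions.)

        class AdditionalSignalRouter:
            def __init__(self):
                self.target = None

            def on_focus_change(self, app):
                # notification from the operating system's focus control
                self.target = app

            def route(self, additional):
                if self.target is None:
                    return
                if getattr(self.target, "accepts_additional", False):
                    self.target.handle_additional(additional)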
  • [0124]
    Once user interface signals are routed to an application, the application itself can utilize or sub-route the user interface signals in various ways. Some applications, such as data visualization, maps, simulations, CAD systems, etc., can beneficially use more than three simultaneously interactively adjustable user inputs directly. Other applications, such as browsers and viewers, can support such applications indirectly, as taught and discussed for example in pending U.S. patent application Ser. No. 12/875,119. Browsers, viewers, and hypermedia documents can also be provided with advanced hypermedia objects that generalize the notion of hyperlinks, rollovers, sliders, buttons, etc. and that are configured to utilize additional user interface signals; such advanced hypermedia objects are taught and discussed for example in pending U.S. Patent Application 61/435,395.
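    (Informal illustration: a minimal Python sketch of an advanced hypermedia object generalizing a slider, consuming additional user interface signals sub-routed to it by its hosting application. The object model and gains are illustrative assumptions.)

        class AdvancedSlider:
            def __init__(self, lo=0.0, hi=1.0):
                self.lo, self.hi, self.value = lo, hi, lo

            def on_additional_signals(self, roll=0.0, pressure=0.0):
                # coarse adjustment from finger roll, fine trim from pressure
                self.value += 0.01 * roll + 0.001 * pressure
                self.value = min(self.hi, max(self.lo, self.value))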
  • [0125]
    While the invention has been described in detail with reference to disclosed embodiments, various modifications within the scope of the invention will be apparent to those of ordinary skill in this technological field. It is to be appreciated that features described with respect to one embodiment typically can be applied to other embodiments.
  • [0126]
    The invention can be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Therefore, the invention properly is to be construed with reference to the claims.
  • [0127]
    Although exemplary embodiments have been provided in detail, it should be understood that various changes, substitutions, and alterations could be made thereto without departing from the spirit and scope of the disclosed subject matter as defined by the appended claims. Variations described for exemplary embodiments may be realized in any combination desirable for each particular application. Thus particular limitations and/or embodiment enhancements described herein, which may have particular advantages for a particular application, need not be used for all applications. Also, not all limitations need be implemented in methods, systems, and/or apparatuses including one or more concepts described with relation to the provided exemplary embodiments.
Patent Citations
Cited Patent | Filing date | Publication date | Applicant | Title
US4748676 * | 14 May 1987 | 31 May 1988 | Fuji Electric Co., Ltd. | Pattern rotation angle detecting system
US4899137 * | 26 Sep 1988 | 6 Feb 1990 | Aeg Olympia Aktiengesellschaft | Arrangement for the input and processing of characters and/or graphic patterns
US5237647 * | 28 Oct 1991 | 17 Aug 1993 | Massachusetts Institute Of Technology | Computer aided drawing in three dimensions
US5270711 * | 30 Apr 1990 | 14 Dec 1993 | U.S. Philips Corporation | Touch sensor array systems and display systems incorporating such
US5292999 * | 13 Aug 1992 | 8 Mar 1994 | Fernandes Co., Ltd. | Electric stringed instrument having a device for sustaining the vibration of the string
US5341133 * | 9 May 1991 | 23 Aug 1994 | The Rowland Institute For Science, Inc. | Keyboard having touch sensor keys for conveying information electronically
US5347295 * | 31 Oct 1990 | 13 Sep 1994 | Go Corporation | Control of a computer through a position-sensed stylus
US5357048 * | 8 Oct 1992 | 18 Oct 1994 | Sgroi John J | MIDI sound designer with randomizer function
US5378850 * | 12 Jan 1993 | 3 Jan 1995 | Fernandes Co., Ltd. | Electric stringed instrument having an arrangement for adjusting the generation of magnetic feedback
US5386219 * | 28 Jul 1993 | 31 Jan 1995 | International Business Machines Corp. | Touch overlay for improved touch sensitivity
US5420936 * | 29 Mar 1994 | 30 May 1995 | International Business Machines Corporation | Method and apparatus for accessing touch screen desktop objects via fingerprint recognition
US5440072 * | 25 Sep 1992 | 8 Aug 1995 | Willis; Raymon A. | System for rejuvenating vintage organs and pianos
US5442168 * | 6 Jan 1993 | 15 Aug 1995 | Interactive Light, Inc. | Dynamically-activated optical instrument for producing control signals having a self-calibration means
US5459282 * | 12 May 1994 | 17 Oct 1995 | Willis; Raymon A. | System for rejuvenating vintage organs and pianos
US5471008 * | 21 Oct 1994 | 28 Nov 1995 | Kabushiki Kaisha Kawai Gakki Seisakusho | MIDI control apparatus
US5475214 * | 20 Jan 1995 | 12 Dec 1995 | Interactive Light, Inc. | Musical sound effects controller having a radiated emission space
US5565641 * | 28 Mar 1994 | 15 Oct 1996 | Gruenbaum; Leon | Relativistic electronic musical instrument
US5585588 * | 3 May 1995 | 17 Dec 1996 | Fernandes Co., Ltd. | Electric stringed instrument having a device for sustaining the vibration of a string and an electromagnetic driver for the device
US5592572 * | 5 Nov 1993 | 7 Jan 1997 | The United States Of America As Represented By The Department Of Health And Human Services | Automated portrait/landscape mode detection on a binary image
US5592752 * | 7 Jun 1995 | 14 Jan 1997 | Industrial Technology Research Institute | Process and an apparatus for producing teas
US5659145 * | 27 Apr 1995 | 19 Aug 1997 | Weil; Robert P. | Foot operated audio signal controller with lighted visual reference
US5659466 * | 2 Nov 1994 | 19 Aug 1997 | Advanced Micro Devices, Inc. | Monolithic PC audio circuit with enhanced digital wavetable audio synthesizer
US5665927 * | 24 Jun 1994 | 9 Sep 1997 | Casio Computer Co., Ltd. | Method and apparatus for inputting musical data without requiring selection of a displayed icon
US5668338 * | 2 Nov 1994 | 16 Sep 1997 | Advanced Micro Devices, Inc. | Wavetable audio synthesizer with low frequency oscillators for tremolo and vibrato effects
US5675100 * | 16 Sep 1996 | 7 Oct 1997 | Hewlett; Walter B. | Method for encoding music printing information in a MIDI message
US5717939 * | 17 Nov 1995 | 10 Feb 1998 | Compaq Computer Corporation | Method and apparatus for entering and manipulating spreadsheet cell data
US5719347 * | 12 Oct 1995 | 17 Feb 1998 | Yamaha Corporation | Keyboard apparatus for electronic musical instrument with key depression detection unit
US5719561 * | 25 Oct 1995 | 17 Feb 1998 | Gilbert R. Gonzales | Tactile communication device and method
US5724985 * | 2 Aug 1995 | 10 Mar 1998 | Pacesetter, Inc. | User interface for an implantable medical device using an integrated digitizer display screen
US5741993 * | 17 Jul 1996 | 21 Apr 1998 | Kabushiki Kaisha Kawai Gakki Seisakusho | Electronic keyboard having a discrete pitch bender
US5748184 * | 28 May 1996 | 5 May 1998 | International Business Machines Corporation | Virtual pointing device for touchscreens
US5763806 * | 15 Jul 1996 | 9 Jun 1998 | Willis; Raymon A. | Method and apparatus for midifying vintage organs and pianos
US5786540 * | 5 Mar 1996 | 28 Jul 1998 | Westlund; Robert L. | Controller apparatus for music sequencer
US5801340 * | 29 Jun 1995 | 1 Sep 1998 | Invotronics Manufacturing | Proximity sensor
US5805137 * | 5 May 1994 | 8 Sep 1998 | Itu Research, Inc. | Touch sensitive input control device
US5824930 * | 5 Jun 1996 | 20 Oct 1998 | Yamaha Corporation | Keyboard musical instrument having key monitor exactly discriminating key motion
US5827989 * | 23 Jun 1997 | 27 Oct 1998 | Microsoft Corporation | System and method for representing a musical event and for converting the musical event into a series of discrete events
US5841428 * | 20 Sep 1996 | 24 Nov 1998 | Intertactile Technologies Corporation | Rotary circuit control devices with changeable graphics
US5850051 * | 15 Aug 1996 | 15 Dec 1998 | Yamaha Corporation | Method and apparatus for creating an automatic accompaniment pattern on the basis of analytic parameters
US5852251 * | 25 Jun 1997 | 22 Dec 1998 | Industrial Technology Research Institute | Method and apparatus for real-time dynamic midi control
US5889236 * | 13 Nov 1995 | 30 Mar 1999 | Synaptics Incorporated | Pressure sensitive scrollbar feature
US5932827 * | 9 Jan 1995 | 3 Aug 1999 | Osborne; Gary T. | Sustainer for a musical instrument
US5969283 * | 17 Jun 1998 | 19 Oct 1999 | Looney Productions, Llc | Music organizer and entertainment center
US5977466 * | 9 Jul 1997 | 2 Nov 1999 | Yamaha Corporation | Keyboard musical instrument equipped with small simple economical key touch generator
US5986224 * | 23 Dec 1998 | 16 Nov 1999 | Elo Touchsystems, Inc. | Acoustic condition sensor employing a plurality of mutually non-orthogonal waves
US6005545 * | 14 Dec 1995 | 21 Dec 1999 | Sega Enterprise, Ltd. | Image processing method and electronic device
US6037937 * | 4 Dec 1997 | 14 Mar 2000 | Nortel Networks Corporation | Navigation tool for graphical user interface
US6047073 * | 2 Nov 1994 | 4 Apr 2000 | Advanced Micro Devices, Inc. | Digital wavetable audio synthesizer with delay-based effects processing
US6051769 * | 25 Nov 1998 | 18 Apr 2000 | Brown, Jr.; Donival | Computerized reading display
US6100461 * | 10 Jun 1998 | 8 Aug 2000 | Advanced Micro Devices, Inc. | Wavetable cache using simplified looping
US6107997 * | 27 Jun 1996 | 22 Aug 2000 | Ure; Michael J. | Touch-sensitive keyboard/mouse and computing device using the same
US6140565 * | 7 Jun 1999 | 31 Oct 2000 | Yamaha Corporation | Method of visualizing music system by combination of scenery picture and player icons
US6204441 * | 25 Mar 1999 | 20 Mar 2001 | Yamaha Corporation | Method and apparatus for effectively displaying musical information with visual display
US6225975 * | 9 Mar 1998 | 1 May 2001 | Alps Electric Co., Ltd. | Control device for game machine
US6285358 * | 30 Sep 1997 | 4 Sep 2001 | Microtouch Systems, Inc. | Method of and apparatus for the elimination of the effects of inertial interference in force measurement systems, including touch-input computer and related displays employing touch force location measurement techniques
US6288317 * | 21 May 1999 | 11 Sep 2001 | Raymon A. Willis | Real time transmission of keyboard musical performance
US6310279 * | 18 Dec 1998 | 30 Oct 2001 | Yamaha Corporation | Device and method for generating a picture and/or tone on the basis of detection of a physical event from performance information
US6310610 * | 4 Dec 1997 | 30 Oct 2001 | Nortel Networks Limited | Intelligent touch display
US6320112 * | 18 May 2001 | 20 Nov 2001 | Martin Lotze | Procedure and device for the automatic selection of musical and/or tonal compositions
US6323846 * | 25 Jan 1999 | 27 Nov 2001 | University Of Delaware | Method and apparatus for integrating manual input
US6360019 * | 30 Jun 1997 | 19 Mar 2002 | Microsoft Corporation | Table-based compression with embedded coding
US6363159 * | 17 Nov 1999 | 26 Mar 2002 | Digimarc Corporation | Consumer audio appliance responsive to watermark data
US6373475 * | 15 Apr 1998 | 16 Apr 2002 | Michael Challis | Converter for resistive touchscreens
US6392636 * | 22 Jan 1998 | 21 May 2002 | Stmicroelectronics, Inc. | Touchpad providing screen cursor/pointer movement control
US6392705 * | 7 Jul 1997 | 21 May 2002 | Microsoft Corporation | Multimedia compression system with additive temporal layers
US6400836 * | 15 May 1998 | 4 Jun 2002 | International Business Machines Corporation | Combined fingerprint acquisition and control device
US6404898 * | 24 Jun 1999 | 11 Jun 2002 | Digimarc Corporation | Method and system for encoding image and audio content
US6408087 * | 13 Jan 1998 | 18 Jun 2002 | Stmicroelectronics, Inc. | Capacitive semiconductor user input device
US6570078 * | 19 Mar 2001 | 27 May 2003 | Lester Frank Ludwig | Tactile, visual, and array controllers for real-time control of music signal processing, mixing, video, and lighting
US6610917 * | 15 May 1999 | 26 Aug 2003 | Lester F. Ludwig | Activity indication, external source, and processing loop provisions for driven vibrating-element environments
US6703552 * | 19 Jul 2001 | 9 Mar 2004 | Lippold Haken | Continuous music keyboard
US6793619 * | 9 Jun 1999 | 21 Sep 2004 | Yaacov Blumental | Computer-implemented method and system for giving a user an impression of tactile feedback
US7030860 * | 8 Oct 1999 | 18 Apr 2006 | Synaptics Incorporated | Flexible transparent touch sensing system for electronic devices
US7348946 * | 31 Dec 2001 | 25 Mar 2008 | Intel Corporation | Energy sensing light emitting diode display
US7408108 * | 10 Oct 2003 | 5 Aug 2008 | Ludwig Lester F | Multiple-paramenter instrument keyboard combining key-surface touch and key-displacement sensor arrays
US20010036299 * | 15 May 1998 | 1 Nov 2001 | Andrew William Senior | Combined fingerprint acquisition and control device
US20020005108 * | 19 Mar 2001 | 17 Jan 2002 | Ludwig Lester Frank | Tactile, visual, and array controllers for real-time control of music signal processing, mixing, video, and lighting
US20020093491 * | 12 Aug 1997 | 18 Jul 2002 | David W. Gillespie | Object position detector with edge motion feature and gesture recognition
US20030003976 * | 12 Jun 2002 | 2 Jan 2003 | Sony Corporation | Memory card, personal digital assistant, information processing method, recording medium, and program
US20030151592 * | 24 Feb 2003 | 14 Aug 2003 | Dieter Ritter | Method for requesting destination information and for navigating in a map view, computer program product and navigation unit
US20040074379 * | 10 Oct 2003 | 22 Apr 2004 | Ludwig Lester F. | Functional extensions of traditional music keyboards
US20040118268 * | 10 Oct 2003 | 24 Jun 2004 | Ludwig Lester F. | Controlling and enhancing electronic musical instruments with video
US20040251402 * | 21 Sep 2002 | 16 Dec 2004 | Gerd Reime | Circuit with an opto-electronic display unit
US20050179651 * | 24 Nov 2004 | 18 Aug 2005 | Ludwig Lester F. | Mouse-based user interface device providing multiple parameters and modalities
US20060001914 * | 30 Jun 2004 | 5 Jan 2006 | Mesmer Ralph M | Color scanner display
US20060252530 * | 21 Jun 2006 | 9 Nov 2006 | Igt | Mobile device for providing filtered casino information based on real time data
US20070044019 * | 19 May 2004 | 22 Feb 2007 | Byung-Ro Moon | Multi-campaign assignment apparatus considering overlapping recommendation problem
US20070063990 * | 20 Sep 2006 | 22 Mar 2007 | Samsung Electronics Co., Ltd. | Touch sensitive display device and driving apparatus thereof, and method of detecting a touch
US20070229477 * | 12 Jun 2007 | 4 Oct 2007 | Ludwig Lester F | High parameter-count touchpad controller
US20080010616 * | 18 Jan 2007 | 10 Jan 2008 | Cherif Atia Algreatly | Spherical coordinates cursor, mouse, and method
US20080034286 * | 19 Jul 2006 | 7 Feb 2008 | Verizon Services Organization Inc. | Intercepting text strings
US20080143690 * | 10 May 2007 | 19 Jun 2008 | Lg.Philips Lcd Co., Ltd. | Display device having multi-touch recognizing function and driving method thereof
US20080150848 * | 12 Jul 2007 | 26 Jun 2008 | Lg. Philips Lcd Co., Ltd. | Organic light-emitting diode panel and touch-screen system including the same
US20080164076 * | 4 Jan 2007 | 10 Jul 2008 | Timothy James Orsley | Capacitive sensing and absolute position mapping in displacement type pointing devices
US20080259053 * | 11 Apr 2008 | 23 Oct 2008 | John Newton | Touch Screen System with Hover and Click Input Methods
US20080297482 * | 30 May 2007 | 4 Dec 2008 | Microsoft Corporation | Recognizing selection regions from multiple simultaneous inputs
US20080300055 * | 29 May 2007 | 4 Dec 2008 | Lutnick Howard W | Game with hand motion control
US20080309634 * | 13 Jun 2007 | 18 Dec 2008 | Apple Inc. | Multi-touch skins spanning three dimensions
US20100110025 * | 29 Jul 2009 | 6 May 2010 | Lim Seung E | Control of computer window systems and applications using high dimensional touchpad user interface
Classifications
U.S. Classification: 719/328
International Classification: G06F9/46
Cooperative Classification: G06F3/04815
Legal Events
Date | Code | Event | Description
9 Jun 2017 | AS | Assignment | Owner name: NRI R&D PATENT LICENSING, LLC, TEXAS; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: LUDWIG, LESTER F; REEL/FRAME: 042745/0063; Effective date: 20170608