CA2543191C - Human interface system - Google Patents
Human interface system
- Publication number
- CA2543191C
- Authority
- CA
- Canada
- Prior art keywords
- input
- hand
- electronic device
- elements
- assembly
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/169—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/0202—Constructional details or processes of manufacture of the input device
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0238—Programmable keyboards
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04809—Textured surface identifying touch areas, e.g. overlay structure for a virtual keyboard
Abstract
A human interface configured to optimize a biomechanical effect of a human user's opposing thumb and fingers by including, on one surface, one or more software configurable input elements manipulatable by a user's thumb(s) or a stylus, and, on another surface, one or more software configurable selection elements manipulatable by a user's finger(s). A selection element may be a pressure sensor pad configurable to represent delineated active areas that are mapped to one or more input functions. Shape changing media may be provided to permit a user to tactilely discriminate between delineated active areas.
Tactile feedback may be provided to a user through palpable detents, vibratory or force producing units. Inputting data may include mapping each selection element to a shift function, mapping each input element to text functions, and using the selection elements to shift between text functions associated with an input element to input a desired text function.
Description
Human Interface System
BACKGROUND
The following description relates to human interface and input systems for electronic devices, particularly hand-held electronic devices, such as cell phones, personal digital assistants ("PDAs"), pocket personal computers, smart phones, hand-held game devices, bar-code readers, remote controls, and other similar input devices having a keypad or one or more input elements.
Electronic devices have become increasingly sophisticated and physically smaller due in part to a decrease in the price of processing power and a concurrent increase in demand by consumers for smaller devices. Such devices, however, tend to be limited in function and utility by the user's ability to interface with the device for data input (e.g., text, numeric, and functional input) and/or device control, which becomes increasingly more difficult to do as the available space on the device's surface for positioning the input elements, which are used for data input and/or device control, continues to decrease.
Various human interface and input systems and techniques for hand-held electronic devices have been developed for data input and device control. These include miniature keyboards and keypads used in combination with chordal input techniques, modal input techniques and/or smart keys; and touch screens used in combination with on-screen keyboard or keypad software or hand-writing recognition software.
Keyboard or Keypad Used With Chordal, Modal and Smart Key Techniques Miniature keyboards and keypads are similar to their standard full-size versions; i.e., a keyboard generally has a full set or substantially full set of numeric, character, and functional input elements, while keypads typically have a reduced set of numeric, character and/or functional input elements compared to keyboards. These miniature input devices typically are designed to fit the available space on one surface of a hand-held electronic device or are designed as small, easily transportable, external plug-in devices. Thus, as hand-held electronic devices become smaller, the size of the input elements typically has been reduced in order for the desired number of input elements to fit on one surface of the electronic device.
For data input and device control, miniature keyboards and keypads typically require one of two input techniques: use of one or more thumbs or fingers to press the desired input elements, or use of a stylus to "peck" the desired input elements (which is usually done where the input element is of smaller size). Various techniques, such as chordal input techniques, modal input techniques and smart keys, have been developed and implemented to improve the efficiency and effectiveness of using miniature keyboards and keypads.
= Chordal Input Techniques Chordal input techniques generally are based upon the principle that characters, symbols, words, phrases or concepts can be represented by a reduced set of input elements.
Thus, by only having to press a reduced combination of input elements, functionality can be increased and quicker and more accurate data input can be realized. Chordal input techniques can be used on any keyboard or keypad configuration or any device having more than one input element, and typically result in fewer input elements or more functions compared to conventional keyboards or keypads. An example of an electronic device using two-handed chordal input techniques is a court reporter or stenographer's typewriter. One chordal input technique using a keypad to decrease the number of actuations to achieve a large number of functions is described in U.S. Patent No. 5,973,621 to Levy, entitled "Compact Keyed Input Device".
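The chordal principle above can be sketched in a few lines: a simultaneous combination ("chord") of a small set of keys maps to one symbol or word, so a handful of input elements can cover a large vocabulary. The key names and chord assignments below are purely illustrative, not taken from the Levy patent or any real device.

```python
# Hypothetical chord table: each frozenset of simultaneously pressed
# keys maps to one output symbol or word fragment.
CHORD_MAP = {
    frozenset({"A"}): "e",
    frozenset({"B"}): "t",
    frozenset({"A", "B"}): "th",
    frozenset({"A", "C"}): "s",
    frozenset({"A", "B", "C"}): "the",
}

def decode_chord(pressed_keys):
    """Return the symbol for the chord, or None if the chord is unmapped."""
    return CHORD_MAP.get(frozenset(pressed_keys))
```

With three keys and chords of up to three keys, up to seven distinct inputs are possible, which is the source of the "fewer input elements, more functions" trade-off described above.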
= Modal Input Techniques Modal input techniques are based on the concept that functions of the electronic device, e.g., text messaging in a cell-phone or PDA, can be accessed by pressing a particular input element (or combination of elements) on a keyboard or keypad. Once that particular input element is pressed, the functionality of all or a portion of the input elements on the keyboard or keypad may change. Modal techniques typically are used in calculators, cell-phones, and PDAs. For example, in cell phones, a modal technique called multi-tap is common, in which individual input elements on the keypad are associated with multiple symbols, such as characters, letters, numbers, icons or other types of symbols, which tends to reduce the number of input elements required to achieve the desired functions, e.g., a twelve-input-element keypad can be used to represent all letters of the English alphabet and the decimal digits. A user can input a desired symbol within a set of symbols associated with a certain input element by tapping on that particular input element with a thumb, finger, or stylus, one or more times to input the desired character. Thus, if a user desires to send a text message, the user may press a functional input element, e.g., a mode key, to access the text messaging functionality of the cell phone and then tap an individual input element one or more times to select the associated symbol for input. The number of taps needed to input a particular symbol may differ depending on the language character set chosen.
For example, Japanese keypad or keyboards typically require a minimum set of 46 characters for text input, while English or American keyboards and keypads usually require a minimum set of 26 characters for text input. These modal input techniques have gained some popularity as users perform more text functions.
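The multi-tap technique described above can be made concrete with a short sketch. The key-to-letter layout below follows the common twelve-element phone keypad assignment; repeated taps on the same key cycle through its symbols.

```python
# Multi-tap decoder for a standard phone keypad: tapping a key N times
# selects the Nth symbol associated with that key, wrapping around.
MULTITAP = {
    "2": "abc2", "3": "def3", "4": "ghi4", "5": "jkl5",
    "6": "mno6", "7": "pqrs7", "8": "tuv8", "9": "wxyz9",
}

def multitap_symbol(key, taps):
    """Return the symbol selected by tapping `key` `taps` times."""
    symbols = MULTITAP[key]
    return symbols[(taps - 1) % len(symbols)]
```

For instance, three taps on the 2 key select "c", which is why the number of taps needed per symbol directly limits input speed, as noted above.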
= Smart Keys Smart keys are typically used on keypads and refer to a single key or combination of keys that, when pressed, predict the user's next logical action. Some implementations work better than others and some applications reduce the number of keystrokes required to complete a function better than others. Word-predictor software, for example, attempts to predict the word or character the user intends to input based upon one or more letters inputted by the user and the likely probabilities within a given language. The probability of the software guessing correctly increases with the length of the word or number of letters or characters inputted. In a device using smart keys on the keypad, a user may tap the keys 2, 2 and 8 in sequence to generate the word "cat" and the device would display that word first because it is usually the most common combination, whereas the word "bat,"
which can be generated by pressing the same keys, would not be displayed first because it is not as common. Also, the word "cat" may be displayed after pressing the 2 key the second time based on a guess by the word-predictor software.
Smart keys also are typically used for Japanese data input where a user phonetically inputs letters representing the sound of the Japanese character (e.g., a Kanji character). Based on the inputted letters, the predictor software guesses the Japanese character. To select the character, a user would press the accept button or use the scrolling function to go to the next character with a similar set of phonetic inputs.
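The word-predictor behavior in the "cat"/"bat" example can be sketched as follows. The tiny lexicon and frequency counts are illustrative only; a real predictor would use a full dictionary with corpus-derived frequencies.

```python
# Word prediction over a phone keypad: each digit stands for several
# letters, and candidate words matching a digit sequence are ranked
# by frequency, so 2-2-8 yields "cat" before "bat".
KEY_LETTERS = {"2": "abc", "8": "tuv"}
LEXICON = {"cat": 120, "bat": 40, "act": 15}  # illustrative counts

def key_sequence(word):
    """Map a word to the digit sequence that produces it."""
    letter_to_key = {l: k for k, ls in KEY_LETTERS.items() for l in ls}
    return "".join(letter_to_key[l] for l in word)

def predict(keys):
    """Words matching the key sequence, most frequent first."""
    matches = [w for w in LEXICON if key_sequence(w) == keys]
    return sorted(matches, key=lambda w: -LEXICON[w])
```

Here `predict("228")` ranks "cat" first because of its higher frequency, matching the behavior described above; less common matches like "bat" appear later in the candidate list.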
Touch Screen Using On-Screen Keyboard or Handwriting Recognition Software Using on-screen keyboard or keypad software with a touch screen offers users the ability to enter data with fingers or thumbs on a screen-sized keyboard or buttons, allowing faster data input without a stylus or physical keyboard or keypad accessory;
while using handwriting recognition software with a touch screen, such as Graffiti® on the Palm operating system, offers users the ability to enter text with a stylus by writing the text directly on the touch screen. Touch screens usually consume more power and are more expensive than non-touch-sensitive screens. This higher power consumption can be a problem for hand-held electronic devices, which typically have limited power resources.
Moreover, touch screens usually require the user to use both hands (e.g., one hand is used to hold and steady the device while the other hand is used to grasp the stylus), which is generally undesirable for interfacing with and controlling one-handed hand-held electronic devices, such as cell phones.
Handwriting recognition software has improved the slowness and awkwardness inherent in stylus, finger or thumb input, but other drawbacks still remain, such as high power consumption, the necessity to use both hands, and lack of tactile feedback to inform a user when an input element has been selected. Moreover, recognition software requires training to use properly, and, even then, still results in a high error rate.
Game Control For game control, many of the above approaches have been used, but in most hand-held electronic devices, a user typically controls game play through the use of some form of input element, such as on a miniature keypad and/or directional pad ("D-pad"), which typically is located on the front surface of the device. Game control on some hand-held electronic devices, such as cell phones, is inherently one-handed or at most two-thumbed because of the size of the device, while game control on other hand-held electronic devices, such as PDAs and conventional game console controllers, is typically two-handed. The input elements associated with game control on these devices are typically digital, even though analog input elements have been used on game controllers for PC and console game systems, such as Microsoft's Xbox or Sony's PlayStation 2.
SUMMARY
The present inventors recognized that conventional human interface and input systems for hand-held electronic devices tended to be relatively inflexible, cumbersome, and inefficient to use, among other reasons, because they were not designed to take advantage of the biomechanics of the human hand, particularly the advantages associated with the opposition of the thumb to the fingers and the beneficial attributes of the thumb, e.g., its large range of motion and ability to impart large sustained forces, and the beneficial attributes of the fingers, e.g., their fine motor control, spatial memory and rapidity of motion.
Game Control
For game control, many of the above approaches have been used, but in most hand-held electronic devices, a user typically controls game play through the use of some form of input element, such as on a miniature keypad and/or directional pad ("D-pad"), which typically is located on the front surface of the device. Game control on some hand-held electronic devices, such as cell phones, is inherently one handed or at most two thumbed because of the size of the device, while game control on other hand-held electronic devices, such as PDAs and conventional game console controllers, is typically two-handed. The input elements associated with game control on these devices are typically digital even though analog input elements have been used on game controllers for PC and console game systems, such as Microsoft's Xbox or Sony's PlayStation 2.
SUMMARY
The present inventors recognized that conventional human interface and input systems for hand-held electronic devices tended to be relatively inflexible, cumbersome, and inefficient to use, among other reasons, because they were not designed to take advantage of the biomechanics of the human hand, particularly the advantages associated with the opposition of the thumb to the fingers and the beneficial attributes of the thumb, e.g., its large range of motion and ability to impart large sustained forces, and the beneficial attributes of the fingers, e.g., their fine motor control, spatial memory and rapidity of motion.
The present inventors also recognized that the input techniques developed to improve the efficiency of data input and device control, such as chordal and modal techniques, were limited by the inefficiencies inherent in conventional input systems. For example, miniature keyboards and keypads used in combination with chordal input techniques not only required the user to memorize numerous input combinations and develop the necessary motor skills to control a reduced number of input elements to provide even more complex functionality compared to typical QWERTY keyboards, but also did not use or allocate input tasks to the fingers and thumb of the human hand effectively. Moreover, miniature keyboards and keypads used in combination with modal input techniques tended to limit the user's ability to efficiently input data depending on the number of taps required to input a particular symbol and how fast the user could tap the input element with his thumb or a stylus to select the particular symbol.
The present inventors also recognized that a user's ability to control game play in such devices was greatly limited. For example, while analog game control has been available to users of PC and console game systems, analog game control generally has not been widely available on hand-held electronic devices, such as cell phones and PDAs.
Moreover, because the game controls for conventional hand-held electronic devices were typically positioned on the front surface of the device, the user's hand typically obscured the user's view of the video screen. Also, the "fast twitch" control (e.g., a trigger) for shooting or activating special purpose game controls, which users have come to expect in console and PC game systems, generally has not been available to users of such hand-held electronic devices due in part to the nature of conventional interface and input systems, which were optimized for data input rather than for game control.
Consequently, the present inventors developed a flexible and efficient human interface and input system and techniques for hand-held electronic devices (whether one-handed or two-handed) that utilize the opposed thumb and finger ergonomics inherent in the hand and the skills already developed for using conventional input techniques to accomplish data input, device control and game control in a timely, efficient, comfortable and intuitive manner.
Thus, no specialized training beyond that normally expected with any newly purchased hand-held device is expected.
Implementations of the human interface and input system for hand-held electronic devices described here may include various combinations of the following features.
The human interface and input system for a hand-held electronic device may be configured to include on a first surface a plurality of input elements that can be manipulated by one or both of a human user's thumbs or a stylus. At least one of the input elements may be configured in software to provide access to a plurality of input functions.
For example, one of the input elements may provide access to the text symbols 5, j, k and l, while another input element may provide access to the text symbols 3, d, e and f, such as is found on a typical cell phone keypad. The human interface and input system also includes on a second surface one or more selection elements that may be manipulated by any of the human user's fingers. The selection elements may be associated with one or more input functions, which may be configured in software. For example, the selection elements may be configured to correspond to a particular shift position. Thus, when a user manipulates a selection element configured to correspond to a third shift position, for example, the input function accessed by a particular input element will be the third input function associated with that input element. In the example provided above, the third input function may be the text symbol "k" for the input element that provides access to the text symbols 5, j, k and l.
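The shift-position scheme described above can be sketched in a few lines. This is an illustrative sketch only; the key names, layout, and lookup function are assumptions, not taken from the patent.

```python
# Hypothetical model of the shift-position scheme: each front-surface
# input element offers several symbols, as on a cell phone keypad,
# and a rear-surface selection element picks which one is produced.
KEYMAP = {
    "key_5": ["5", "j", "k", "l"],  # symbols on the "5" key
    "key_3": ["3", "d", "e", "f"],  # symbols on the "3" key
}

def resolve_symbol(input_element: str, shift_position: int) -> str:
    """Return the symbol produced when a rear-surface selection element
    set to `shift_position` (1-based) is held while the given
    front-surface input element is pressed."""
    symbols = KEYMAP[input_element]
    return symbols[shift_position - 1]

# Holding the third shift position and pressing the "5/j/k/l" key
# yields "k", matching the example in the text.
assert resolve_symbol("key_5", 3) == "k"
```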
One of the selection elements may be a pressure sensor pad that can be configured to represent a plurality of delineated active areas, as well as inactive areas.
These delineated active areas likewise can be configured in software to represent one or more input functions.
A shape changing media also may be provided with the pressure sensor pad so as to permit a human user to tactilely discriminate between the plurality of delineated active areas and/or inactive areas.
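The software-delineated active areas on such a pressure sensor pad amount to a hit test over configurable regions. The sketch below illustrates this under assumed coordinates and area names; none of these specifics come from the patent.

```python
# Illustrative sketch: a pressure sensor pad divided in software into
# delineated active areas. Region bounds and names are assumptions.
AREAS = {
    "shift_1": (0.0, 0.00, 1.0, 0.33),  # (x0, y0, x1, y1) in pad units
    "shift_2": (0.0, 0.33, 1.0, 0.66),
    "shift_3": (0.0, 0.66, 1.0, 1.00),
}

def hit_test(x: float, y: float):
    """Return the name of the active area under a touch, or None if
    the touch lands on an inactive region of the pad."""
    for name, (x0, y0, x1, y1) in AREAS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None
```

Because the area table lives in software, a different application can install a different table, which is what lets the same pad present different delineated areas per application.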
The input elements and/or selection elements also may be associated with a palpable detent, vibratory unit and/or force producing unit, which may provide tactile feedback to the user when the user manipulates the elements or in response to events occurring in a software application running on a processor.
The human interface and input system also may be configured to include a first input assembly and a second input assembly. The first input assembly may include a plurality of input or selection elements situated on one or more surfaces of the electronic device and configured to be easily and comfortably actuated by one or both of a human user's thumbs or a stylus. The second input assembly may include one or more input or selection elements situated on one or more surfaces of the electronic device and configured to be easily and comfortably actuated by one or more of the human user's fingers. The first input and second input assemblies may be disposed on one or more surfaces of the hand-held electronic device to take advantage of the full range of opposition configurations of the thumb and the fingers.
Sensing circuitry, such as an input controller, may be provided to receive signals generated by the elements of the first and/or second input assemblies when the elements are manipulated by the human user and to convert those signals into a form suitable to be received by a processor running application software, which, based on the received signals, can determine the type of input provided by the human user.
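The input controller's role, converting raw element signals into a form the processor can interpret, can be sketched as follows. The signal tuple format and event fields are assumptions chosen for illustration.

```python
# Hedged sketch of the sensing-circuitry role: translate a raw
# (assembly, element, state) signal into a structured event that
# application software can interpret. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class InputEvent:
    element_id: str
    assembly: int    # 1 = thumb-operated front, 2 = finger-operated rear
    pressed: bool

def convert(raw_signal: tuple) -> InputEvent:
    """Convert one raw signal from the sensing circuitry into an
    event suitable for the processor."""
    assembly, element_id, state = raw_signal
    return InputEvent(element_id=element_id,
                      assembly=assembly,
                      pressed=(state == 1))
```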
The first input assembly may be situated on a front surface of the electronic device, while the second input assembly may be situated on the back surface of the electronic device to take advantage of the thumb/finger opposition. As configured, a user may manipulate input elements in the first input assembly with one or both thumbs or a stylus, while manipulating elements in the second input assembly with one or more fingers.
The input function of the input elements of the first and/or the second input assembly may change depending on the application running on the electronic device. When a text application (e.g., e-mail, word processing, or text messaging) is running on the electronic device, the elements of the first and/or second input assembly may be associated with data input keys, such as symbols. When a game application is running on the electronic device, the input elements of the first and/or second input assembly may be associated with game controls, such as a directional control, action buttons, and trigger buttons.
The mapping of one or more of the input elements of the first and/or second input assembly to a software application, i.e., whether one or more of the input elements will operate as data input keys, game controls or device controls, can be customized by the software application developer or the user through downloads or other programming modalities. Moreover, to reduce the cost of manufacturing hand-held electronic devices that will be used in multiple countries, input element sets particular to the language of a desired country can be implemented in software.
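The per-application remapping described above can be modeled as a lookup over mapping profiles. The profile names and function labels below are illustrative assumptions, not part of the patent.

```python
# Sketch of software-configurable mapping: the same physical elements
# serve as text keys in a text application and as game controls in a
# game application. A localized profile swaps in a different symbol
# set with no hardware change. All names are assumptions.
PROFILES = {
    "text":    {"key_1": "symbols:abc",  "rear_1": "shift_1"},
    "game":    {"key_1": "dpad_up",      "rear_1": "weapon_fire"},
    "text_fr": {"key_1": "symbols:abc2", "rear_1": "shift_1"},
}

def active_function(application: str, element: str) -> str:
    """Look up the input function currently mapped to a physical
    element, given the application running on the device."""
    return PROFILES[application][element]
```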
In accordance with one aspect of the invention, there is provided a hand-held electronic device. The device includes a memory configured to store a plurality of applications. Each application is associated with a set of functions. The device also includes a processor configured to process a selected one of the plurality of applications, and a first input assembly having a plurality of input elements on a first surface configured to receive input from a human user through manipulation of the plurality of input elements. At least one of the input elements on the first surface is configured to selectively map to one or more input functions of the set of functions associated with the selected one of the plurality of applications. The device further includes a second input assembly having one or more input elements on a second surface configured to be manipulated by one or more of the human user's fingers. At least one of the input elements on the second surface is further configured to be selectively mapped to one or more input functions of the set of functions corresponding to the selected one of the plurality of applications. Further, the plurality of input elements on the first surface and the one or more input elements on the second surface are arranged so as to substantially optimize a biomechanical effect of the human user's hand. At least one of the input elements of the second input assembly is a selectively configurable sensing surface so as to selectively provide one or more delineated active areas configured based on the selected application.
Manipulation of a delineated active area may cause the input function of one or more input elements of the first input assembly to change.
The hand-held electronic device may further include shape changing media configured relative to the sensor pad so as to permit the human user to tactilely discriminate between the plurality of delineated active areas.
The processor may receive signals generated by the input elements of the first or second input assemblies when manipulated by the human user.
The hand-held electronic device may further include an input controller. The input controller may receive signals generated by the input elements of the first or second input assemblies when manipulated by the human user and may convert the signals into a form suitable to be interpreted by the processor.
At least one of the input elements of the second input assembly may be a rotary sensor.
At least one of the input elements of the second input assembly may be a D-pad.
The hand-held electronic device may further include at least one palpable detent. The detent may be associated with at least one of the input elements of the first or second input assemblies so as to provide tactile feedback to the human user when the human user manipulates the input element associated with the palpable detent.
The hand-held electronic device may further include one or more vibratory or force producing units, and at least one of the vibratory or force producing units may be configured to provide tactile feedback upon the human user's manipulation of at least one of the input elements of the first or second input assemblies.
At least one of the vibratory units may provide tactile feedback in response to events occurring in the selected application running on the processor.
In accordance with another aspect of the invention, there is provided a hand-held electronic device. The device includes a memory configured to store a plurality of applications. Each application is associated with a set of functions. The device also includes a processor configured to process a selected one of the plurality of applications. The set of functions associated with the selected application includes a plurality of text symbol functions and a plurality of shifting functions. The device further includes a first surface having a plurality of input elements configured to receive input from a human user through manipulation of the plurality of input elements. At least one of the input elements of the first surface is further configured to selectively map to more than one text symbol function. The device also includes a second surface having one or more input elements. At least one of the input elements of the second surface has one or more selectable active areas configured to be manipulated by one or more of the human user's fingers, each selectable active area configured to selectively map to a different shifting function. Manipulation of one of the selectable active areas causes the text symbol function of the one or more input elements of the first surface to change. Further, the plurality of input elements of the first surface and the one or more input elements of the second surface are arranged so as to substantially optimize a biomechanical effect of the human user's hand.
The hand-held electronic device may further include a controller. The controller may receive signals generated by the human user's manipulation of the input elements of the first surface or active areas.
The hand-held electronic device may further include a dome cap positioned above at least one input element of the first surface or the second surface and may be capable of providing tactile feedback to the human user when the input element associated with the dome cap is manipulated.
The hand-held electronic device may further include one or more vibratory units capable of providing tactile feedback.
The hand-held electronic device may further include one or more force producing units capable of providing tactile feedback.
In accordance with another aspect of the invention, there is provided a method for configuring a human interface and input system for use with a hand-held electronic device configured to run a plurality of applications, each application associated with a set of functions. The method involves disposing on a first surface a first input assembly having a plurality of input elements configured to receive input from a human user through manipulation of the plurality of input elements. At least one of the input elements of the first input assembly is further configured to map to more than one input function associated with a selected one of the plurality of applications. The method further involves disposing on a second surface a second input assembly having one or more input elements configured to be manipulated by one or more of the human user's fingers. At least one of the input elements of the second input assembly is further configured to selectively map to one or more input functions associated with the selected application. The method also involves arranging the plurality of input elements of the first input assembly and the one or more input elements of the second input assembly to substantially optimize a biomechanical effect of the human user's hand.
The method may further involve physically or electronically labeling at least one input element of the first input assembly or the second input assembly so as to visually indicate an input function that can be selectively accessed by actuating the input element.
The method may further involve connecting a controller to the input elements of the first input assembly or the second input assembly. The controller may be configured to receive signals generated by a manipulation of one or more of the input elements of the first input assembly or the second input assembly.
At least one input element of the second input assembly may have a plurality of active areas configurable by the controller to form a plurality of delineated active areas.
The method may further involve positioning a shape changing media relative to the one input element of the second input assembly having a plurality of active areas so as to permit the human user to tactilely discriminate between the plurality of delineated active areas.
The method may further involve positioning a palpable detent with at least one input element of the first input assembly or the second input assembly so as to provide tactile feedback when manipulated by the human user.
In accordance with another aspect of the invention, there is provided a method for inputting data on a hand-held electronic device having a first surface with a plurality of input elements configured to receive input from a human user through manipulation of the plurality of input elements, and a second surface having one or more selection elements configured to be manipulated by one or more of the human user's fingers. At least one of the input elements is further configured to map to a plurality of symbols in a data input mode. Each of the plurality of symbols is associated with a unique index position identifier, and each selection element corresponds to one of the unique index position identifiers.
Further, the plurality of input elements and the one or more selection elements are arranged to substantially optimize a biomechanical effect of the human user's hand. The method involves executing a selected application from a plurality of applications. The selected application is associated with a set of functions. The method also involves determining the index position identifier of a desired symbol to be inputted based on the functions associated with the selected application, pressing the selection element corresponding to the index position identifier of the desired symbol with any digit or object held in the human user's hand, and pressing the input element configured to map to the desired symbol with any digit or object held in the human user's hand.
Each input element may be physically or electronically labeled indicating each symbol that is mapped to the input element and a positional order in which each symbol may be selectively accessed by actuating the input element.
Determining the index position identifier of the desired symbol to be inputted may involve locating the input element configured to map to the desired symbol, and counting from left to right the number of symbols preceding the desired symbol labeled on the located input element. The index position identifier of the desired symbol may be the number of symbols preceding the desired symbol plus one.
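The counting rule above reduces to a simple computation. The key label below is illustrative, following the keypad example used earlier in the text.

```python
# Worked sketch of the counting rule: the index position identifier
# of a symbol is the number of symbols printed before it on the key's
# label, counted left to right, plus one. The label is illustrative.
def index_position(key_label: list, desired: str) -> int:
    """Return the 1-based index position identifier of `desired`
    among the symbols labeled on a key."""
    preceding = key_label.index(desired)  # symbols before the desired one
    return preceding + 1

# On a key labeled "5 j k l", two symbols precede "k", so its
# index position identifier is 3.
assert index_position(["5", "j", "k", "l"], "k") == 3
```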
At least one of the input elements or selection elements may be further configured to map to a plurality of modes corresponding to the selected application executing on the hand-held electronic device. At least one of the modes may be the data input mode, and the method may further involve enabling the data input mode.
In accordance with another aspect of the invention, there is provided a method for a human user to input data on a hand-held electronic device using an interface and input system involving a plurality of input elements in a thumb-manipulated assembly to substantially optimize a biomechanical effect of the human user's thumb and fingers. At least one input element is mapped to more than one text function, and one or more selection elements in a finger-manipulated input assembly. Each selection element is mapped to a unique shift position. The method involves executing a selected text application from a plurality of applications. The selected application is associated with a set of functions.
The method further involves pressing a desired selection element of the finger-manipulated input assembly with a human finger to select a desired shift position of the selected text application, and pressing a desired input element of the thumb-manipulated input assembly with a human thumb to input a desired text character.
In accordance with another aspect of the invention, there is provided a hand-held electronic device. The device includes a memory configured to store a plurality of applications. Each application is associated with a set of functions. The device also includes a processor configured to process a selected one of the plurality of applications, and a first input assembly disposed on a first surface of the electronic device. The first input assembly includes a plurality of input elements configured to be actuated by a human user's hand. At least one of the input elements of the first input assembly is configured to map to one or more input functions of the set of functions associated with the selected one of the plurality of applications. The device further includes a second input assembly disposed on a second surface so as to substantially optimize a biomechanical effect of the human user's hand. The second input assembly includes one or more input elements configured to be manipulated by one or more of the human user's fingers. At least one of the input elements of the second input assembly is a selectively configurable sensing surface so as to provide a plurality of delineated active areas. Further, one or more of the delineated active areas is mapped to one or more functions associated with the selected application, and the memory is further configured to store for each application a mapping of the selectively configurable sensing surface to the plurality of delineated active areas.
The selected one of the plurality of applications may be a text application, and the one or more input elements on the second surface of the second input assembly may include one or more selection elements. Manipulations of the one or more selection elements may cause the input elements on the first surface of the first input assembly to be selectively mapped from one text function to another text function.
The selected one of the plurality of applications may be a game application, and at least one of the plurality of input elements of the first input assembly and at least one of the input elements of the second input assembly may each be configured to selectively map to one or more game functions.
The hand-held electronic device may further include an input controller. The input controller may receive a plurality of signals generated by the input elements of the first input assembly and the second input assembly when manipulated by the human user, and may convert the plurality of signals into a form suitable to be interpreted by the processor.
At least one input element of the first input assembly or the second input assembly may be configured to map to one or more input functions associated with the selected application that control a cursor on a screen.
The selected one of the plurality of applications may be a game application.
At least one input element of the first input assembly or the second input assembly may be configured to map to one or more input functions associated with the game application that control a game character on a screen.
The input controller may be further configured to interpret a movement of the human user's finger sliding across two or more delineated active areas as a change in the mapped function of the two or more delineated active areas. The mapped function may be at least one of a speed control, a size control, a weapon fire control, and a position control.
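The slide-across-areas interpretation can be sketched as mapping an ordered sequence of crossed areas to a control level, for instance a speed control. The area names and the last-area-wins rule are assumptions for illustration.

```python
# Illustrative sketch of interpreting a finger slide across delineated
# active areas as a continuous control change (e.g., speed level).
# Area ordering and the "last area reached sets the level" rule are
# assumptions, not taken from the patent.
ORDERED_AREAS = ["area_1", "area_2", "area_3"]

def slide_to_level(area_sequence: list) -> int:
    """Interpret a slide across active areas: the last area the
    finger reached determines the new control level (1-based)."""
    return ORDERED_AREAS.index(area_sequence[-1]) + 1

# Sliding from area_1 across area_2 to area_3 sets level 3,
# e.g., maximum speed.
assert slide_to_level(["area_1", "area_2", "area_3"]) == 3
```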
The input controller may be further configured to interpret a pressure applied by the human user's finger on a selected one of the delineated active areas as a change in the mapped function of the selected delineated active area. The mapped function may be at least one of a speed control, a size control, a weapon fire control, and a position control.
At least one of the functions mapped to the input element of the first input assembly may be a game function that is substantially optimized for actuation by the human user's thumb.
The game function that may be substantially optimized for actuation by the human user's thumbs may include a directional control.
At least one of the functions mapped to the delineated active areas may be a game function that may be substantially optimized for actuation by one or more of the human user's fingers.
The game function that may be substantially optimized for actuation by one or more of the human user's fingers may include a weapon fire control.
The game function that may be substantially optimized for actuation by one or more of the human user's fingers may include a game character jump control.
In accordance with another aspect of the invention, there is provided a method for configuring a human interface and input system for use with a hand-held electronic device configured to run a plurality of applications, each application associated with a set of functions. The method involves disposing on a first surface a first input assembly having a plurality of input elements configured to receive input from a human user's hand through manipulation of the plurality of input elements. At least one of the input elements of the first input assembly is further configured to map to more than one input function associated with a selected one of the plurality of applications. The method also involves disposing on a second surface a second input assembly having one or more input elements configured to be manipulated by one or more of the human user's fingers. At least one of the input elements of the second input assembly is further configured to selectively map to one or more input functions associated with the selected application. The method further involves mapping the set of functions of the selected application to the one or more input elements of the first input assembly and the second input assembly to substantially optimize a biomechanical effect of the human user's hand.
The selected application may be at least one of a scrolling application, a text application and a game application.
The method may further involve physically or electronically labeling at least one input element of the first input assembly or the second input assembly so as to visually indicate an input function that may be selectively accessed by actuating the input element.
The method may further involve connecting a controller to the input elements of the first input assembly or the second input assembly. The controller may be configured to receive signals generated by a manipulation of one or more of the input elements of first input assembly or the second input assembly.
At least one input element of the second input assembly having a plurality of active areas may be configurable by the controller to form a plurality of delineated active areas.
The method may further involve positioning a shape changing media relative to the one input element of the second input assembly having a plurality of active areas so as to permit the human user to tactilely discriminate between the plurality of delineated active areas.
The processor may be further configured to be communicatively coupled to a host electronic device.
The controller may be further configured to be communicatively coupled to a host electronic device.
The hand-held electronic device may be further configured to be communicatively coupled to a host electronic device.
The hand-held electronic device may be further configured to interface with a host electronic device.
The processor may be further configured to interface with a host electronic device.
The hand-held electronic device may be configured to interface with a host electronic device.
The selectively configurable sensing surface may be configured to selectively vary a shape of the provided one or more delineated active areas based on the selected application.
The selectively configurable sensing surface may be configured to selectively vary a number of the delineated active areas based on the selected application.
The selectively configurable sensing surface may selectively provide the one or more delineated active areas customized for a user's hand.
The systems and techniques described here may provide one or more of the following advantages. The human interface and input system and associated techniques offer the functionality of a high performance game controller, which can support the most demanding game input requirements, and the speed and accuracy of data input that can be obtained with the use of a conventional standard QWERTY keyboard, but without the large footprint. Also, the human interface and input system and associated techniques can increase the number of functions that may be associated with a given number of input elements without increasing the number of keystrokes or taps that is required. Moreover, it allows the input element size to remain consistent with the ergonomics of the human hand without increasing the time it takes to learn how to use the input system compared to conventional input systems.
Details of one or more implementations are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the description and drawings, and from the claims.
DESCRIPTION OF THE DRAWINGS
Fig. 1 is a block diagram of a typical hand-held electronic device upon which the human interface and input system may be implemented.
Fig. 2 is a block diagram of an implementation of the human interface and input system.
Figs. 3a and 3b show front and back isometric views, respectively, of a hand-held electronic device wherein the second input assembly includes a pressure sensor pad having a plurality of configurable active areas.
Fig. 3c illustrates an exploded view of an example of an input element of the first input assembly.
Fig. 3d depicts one implementation of how the plurality of configurable active areas of the pressure sensor pad of the second input assembly may be configured.
Figs. 4a and 4b depict front and back isometric views, respectively, of a hand-held electronic device wherein the second input assembly includes three touch pads.
Figs. 5a and 5b depict front and back isometric views, respectively, of a hand-held electronic device wherein the second input assembly includes three two-position rockers.
Figs. 6a and 6b illustrate front and back isometric views, respectively, of a hand-held electronic device wherein the second input assembly includes a D-pad and two contact sensors.
Figs. 7a and 7b show a two-handed hand-held electronic device wherein the second input assembly includes two rotary dials.
Fig. 8 is a block diagram of a hand-held electronic device in the context of a communication system that may be used to implement the human interface and input systems and techniques described here.
Like reference symbols in the various drawings indicate like elements.
DETAILED DESCRIPTION
Biomechanics of the Human Hand

The human hand comprises an opposable thumb and four fingers, i.e., the thumb may be used in opposition, in concert, in combination or functional relation with any of the four fingers. Compared to the human fingers, the human thumb may be characterized as having a larger range of motion, stronger sustained force actuation and poorer dexterity. The base joint of the human thumb has three degrees of freedom: side-to-side movement, up and down movement, and rotation about the thumb's long axis. The base joint of each finger has two degrees of freedom: side-to-side movement and up and down movement.
Thus, the thumb typically is considered to have a better range of motion than any of the fingers. Also, because the human thumb has a bigger actuation muscle than any of the fingers, it can provide larger sustaining forces than the fingers. But also because of the larger muscle, the human thumb may suffer from diminished fine motor control and rapidity of motion compared to the fingers. Thus, the human fingers are more suitable for performing tasks that require fine motor coordination or the ability to pinpoint or rapidly repeat actuation.
Hand-Held Electronic Device Hardware Overview

Fig. 1 is a block diagram that illustrates a hand-held electronic device 100, such as a cell-phone, PDA, pocket PC, smart phone or other similar input device, upon which the human interface and input system and associated input techniques described herein may be implemented. Electronic device 100 may include a bus 102 or other communication mechanism for communicating information, and a processor 104, such as an ARM, OMAP or other similar processor, coupled with bus 102 for processing information, such as one or more sequences of one or more instructions, which may be embedded software, firmware, or software applications, such as a text messaging application or video game.
Electronic device 100 also may include a main memory 106, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 102 for storing information and instructions to be executed by processor 104. Main memory 106 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 104. Electronic device 100 further may include a read only memory (ROM) 108 or other static storage device coupled to bus 102 for storing static information and instructions for processor 104. A storage device 110 may be provided and coupled to bus 102 for storing information and instructions. Electronic device 100 may also include a display 112, such as a liquid crystal display (LCD), for displaying information to a user, and a human interface and input system 114 for communicating information and command selections to processor 104.
Electronic device 100 also may include a communication interface 118 coupled to bus 102.
Communication interface 118 provides a two-way data communication coupling to a base station. For example, communication interface 118 may be a wireless link, a modem to provide a data communication connection to a corresponding type of telephone line or any other communication interface known to one of ordinary skill. As another example, communication interface 118 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. In the wireless link implementation, communication interface 118 may send and receive electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
Human Interface and Input System Overview

Fig. 2 is a block diagram illustrating the major subsystems of the human interface and input system 114. The input system 114 may include a first input assembly 206, a second input assembly 208, and an input controller 216. The first input assembly 206 and the second input assembly 208 may include one or more input or selection elements. The input or selection elements, which may be keys, buttons, pressure sensor pads, touch pads or other actuators, are associated with one or more sensors, which produce one or more electrical signals 214 when the input or selection elements are actuated. The input controller 216, which may include one or more processors, receives the one or more electrical signals 214 and converts them into a form suitable to be received and interpreted by processor 104 after passing through bus 102.
One or more context signals 222 are provided to input controller 216 through bus 102 in response to processor 104 executing embedded software, firmware, or software applications, such as a text messaging application. The context signals 222 are received and used by input controller 216 to map input or selection elements in the first input assembly 206 and/or the second input assembly 208 to one or more application input functions and responses. For example, if a text application is being executed by processor 104, then the input controller 216 may map one or more input elements of the first input assembly 206 to one or more symbols, such as characters, letters, numbers, icons, other types of symbols, or combinations of different types of symbols, and map one or more input or selection elements of the second input assembly 208 to a shifting or indexing functionality. If processor 104 is executing a game application, then the input controller 216 may map the input or selection elements of the input assemblies 206, 208 to game functions. The mapping of the input or selection elements to particular input functions for a given software application, whether done by the input controller 216 or processor 104, may be customized by the application developer or the user through downloads or other programming modalities.
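The context-driven remapping described above can be sketched as a small table-driven controller. The following Python sketch is purely illustrative: the element identifiers, application names and function names are invented for this example and are not part of the described system.

```python
# Hypothetical sketch of context-driven input mapping: the input controller
# keeps a table of function maps keyed by the running application, and a
# context signal selects which map is active. All names are illustrative.

FUNCTION_MAPS = {
    "text": {
        "first_assembly_key_2": ["2", "a", "b", "c"],  # multi-symbol key
        "second_assembly_area_1": "index_position_1",  # shifting/indexing
    },
    "game": {
        "first_assembly_key_2": "fire_weapon",
        "second_assembly_area_1": "move_up",
    },
}

class InputController:
    """Stands in for input controller 216 in this sketch."""

    def __init__(self):
        self.active_map = {}

    def on_context_signal(self, application):
        # Remap all input elements when the processor reports a new context.
        self.active_map = FUNCTION_MAPS.get(application, {})

    def resolve(self, element_id):
        # Translate a raw element actuation into an application function.
        return self.active_map.get(element_id)

controller = InputController()
controller.on_context_signal("game")
```

A developer- or user-supplied table of this kind is one way the same physical elements could serve text, scrolling and game applications without hardware changes.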
Moreover, the mapping of the input or selection elements may be done for language key set changes, which may reduce the cost of manufacturing hand-held electronic devices for manufacturers servicing multiple countries.
Alternative implementations of the input system 114 need not have the input controller 216, particularly where cost is of a concern. In those instances, processor 104 may assume the functions of the input controller 216. Thus, processor 104 can perform the mapping function described above.
Human Interface and Input System and Techniques Implementations

Figs. 3a and 3b illustrate front and back isometric views, respectively, of a hand-held electronic device 302 upon which an implementation of the human interface and input system may be implemented. Electronic device 302 may include six planar or contoured surfaces: a front-side surface 312, a back-side surface 314, a left-side surface 316, a right-side surface 318, a top-side surface 320, and a bottom-side surface 322. In other implementations, however, electronic device 302 may have more or fewer planar and/or contoured surfaces. On the front-side surface 312, a display 330, such as an LCD, and a first input assembly 340 are disposed adjacent to each other. Alternatively, display 330 may be on a separate assembly, such as the displays for PDAs and cell phones with a swivel-mounted screen or flip-phone configurations. Also, the first input assembly 340 may be disposed on more than one surface. The first input assembly may be a typical cell-phone keypad, which may include twelve input elements 342, although any number of input elements may be provided. A user's thumb or thumbs or a stylus may actuate the input elements 342.
A second input assembly 350 is disposed on the back-side surface 314, left-side surface 316 and right-side surface 318. Alternatively, the second input assembly may be disposed on one of those surfaces or a combination of those surfaces. In this implementation, the first input assembly 340 is disposed relative to the second input assembly 350 to take advantage of the opposition of the human thumb and finger. The second input assembly 350 includes a pressure sensor pad 354 having a plurality of software configurable active areas, which may be actuated by one or more of the user's fingers. The pressure sensor pad 354 in this implementation may include an actuator, such as an elastomeric material, attached to a force sensitive resistor array, a capacitive mat or array, or other similar pressure sensitive device or grid that can provide multiple outputs corresponding to the pressure readings of the plurality of active areas on the pad's 354 surface. Here, the pressure sensor pad 354 wraps around from the left-side surface 316 across the back-side surface 314 to the right-side surface 318.
It is to be understood that the input elements 342, 354 in this implementation and any other implementation could be analog and/or digital buttons, keys, rockers (which may be one-position, multi-position or analog joystick-type buttons), sliders, dials or touch pads used in combination with pressure sensors (such as force sensitive resistors, piezo resistive sensors, and capacitive sensors), positional sensors (such as rotary encoders, linear potentiometers and the like) or other sensors or a combination of them.
Fig. 3c depicts an exploded view of an input element 342 of the first input assembly 340, which is mapped to represent one or more text functions. Here, the input element is mapped to represent the number 7 and letters p, q, r and s, as is found on a typical keypad of a cell phone. Other input elements 342 may be associated with other letters, numbers and/or icons. For example, one input element may be associated with the number 4 and letters g, h and i, while another input element may be associated with the number 2 and the letters a, b and c.
As shown in Fig. 3d, the pressure sensor pad 354 may be configured in software to represent one or more delineated active areas corresponding to different programmable functions depending on the application. In this case, an inverted U-shaped active area 360 is formed: the vertical sides 362 of the inverted U-shaped active area 360 are on the left-side surface 316 and the right-side surface 318, and the horizontal side 364 of the inverted U-shaped active area 360 is along the top edge of the pressure sensor pad 354 on the back-side surface 314. Below the inverted U-shaped active area 360 on the back-side surface 314 are five oblong-shaped active areas 372 labeled from 1 to 5. On the bottom of both the left-side surface 316 and the right-side surface 318, and stretching to the back-side surface 314 of the pressure sensor pad 354, are rectangular-shaped active areas 374, 376, 378, 380. The remaining area of the pressure sensor pad 354 may be configured to be inactive.
In this implementation, the inverted U-shaped active area 360 may be used for navigation: the vertical sides 362 for y-directional movement and the horizontal side 364 for x-directional movement. The oblong-shaped active areas 372 may be used for shifting or indexing between symbols, such as characters, letters and/or numbers, or for text input. The rectangular-shaped active areas 374, 376, 378, 380 may be used for shifting between modes: two of the active areas 374, 376 for left-handed users and the other two active areas 378, 380 for right-handed users. In another configuration of the pressure sensor pad 354, the entire surface of the pressure sensor pad 354 may be covered by horizontal rectangular active areas interspersed between small rectangular horizontal inactive areas to achieve any desired number of active areas. Other configurations of the pressure sensor pad 354 may be realized depending on the requirements of the desired application.
The delineated active areas of the pressure sensor pad 354 may be actuated by one or more of the user's fingers, such as by applying pressure against the delineated active areas of the pad 354 beyond a pre-defined or user-adjustable threshold pressure.
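One way to model the software-delineated active areas and the actuation threshold just described is as labeled rectangles in pad coordinates. The Python sketch below is an illustration only; the coordinate layout, area names and threshold value are assumptions, not taken from the described device.

```python
# Illustrative model of delineated active areas on the pressure sensor pad:
# each area is a labeled rectangle in normalized pad coordinates, and a
# reading is attributed to an area only if its pressure exceeds a threshold.

ACTIVE_AREAS = {
    # name: (x_min, y_min, x_max, y_max) -- a hypothetical layout of the
    # five oblong-shaped areas on the back-side surface
    "index_1": (0.00, 0.5, 0.18, 0.7),
    "index_2": (0.20, 0.5, 0.38, 0.7),
    "index_3": (0.40, 0.5, 0.58, 0.7),
    "index_4": (0.60, 0.5, 0.78, 0.7),
    "index_5": (0.80, 0.5, 0.98, 0.7),
}

PRESSURE_THRESHOLD = 0.2  # pre-defined or user-adjustable

def hit_test(x, y, pressure):
    """Return the delineated active area actuated at (x, y), or None."""
    if pressure < PRESSURE_THRESHOLD:
        return None
    for name, (x0, y0, x1, y1) in ACTIVE_AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None  # the remaining pad surface is configured inactive
```

Because the table is data, not hardware, the number, size and shape of the areas can be changed per application, per user or per game difficulty level.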
Likewise, the absence of pressure may be used as an actuation event. The pressure sensor pad 354 also may contain or be mounted above or below a shape-changing media such as an electrorheological fluid, shape memory metal array or similar material, which can permit the user to tactilely discriminate between the one or more delineated active areas. Thus, the user will be able to perceive the one or more delineated active areas as if they were physical buttons. Also, a computer graphical representation (not shown) of the configuration of the delineated active areas of the pad 354 may be displayed temporarily (or for some predetermined time) on a portion of the display 330 to visually assist the user in locating where the delineated active areas of the pad 354 are positioned. Moreover, an input element 342 of the first input assembly 340 may be mapped to activate and/or deactivate the displaying of the computer graphical representation.
The input architecture described above, with the first input assembly 340 on the front-side surface 312 and the second input assembly 350 on the back-side surface 314, left-side surface 316 and right-side surface 318, is configured to take advantage of the biomechanics of the hand, whether the user is left-handed or right-handed. This configuration, for example, can reduce the number of thumb taps required to input a particular symbol compared to the number of thumb taps or stylus presses required using only a typical key pad with modal input techniques, such as is found in conventional text input systems.
Moreover, this configuration can permit full keyboard capability with fewer input elements on the first input assembly 340 and with greater spacing between input elements to enhance the ease of input compared to typical keypads for existing cell phones. Also, this configuration can permit full functionality of a high performance game controller, which can support the most demanding game input requirements.
A method to implement full keyboard capability and reduce the number of thumb taps is to map in software the delineated active areas of the pressure sensor pad 354, such as the oblong-shaped active areas 372, to an indexing or shifting functionality to take advantage of the capability of the human finger, i.e., rapidity of motion, spatial memory and fine motor control, and to map in software the input elements 342 of the first input assembly 340 to text functionality to take advantage of the capability of the human thumb, i.e., range of motion and sustained force actuation.
When a text messaging application is running on the electronic device 302 and displayed on the screen 330, the first input assembly 340 and the second input assembly 350 are used together to perform the text messaging functions. Each input element 342 of the first input assembly 340 may represent one or more text functions, e.g., one input element may be associated with the decimal digit 2 and letters a, b and c, while another input element may be associated with the decimal digit 7 and letters p, q, r and s (as shown in Fig. 3c), such as is found on typical keypads.
In this implementation, the input elements 342 are configured the same as a typical keypad on a cell phone. The specific text function inputted by a user for a particular input element 342 is determined by which delineated active area of the pressure sensor pad 354 is pressed. For example, going from left to right, each oblong-shaped active area 372 may be mapped to represent a separate index or shift position such that index position 1 may be assigned to the left-most oblong-shaped active area (labeled 1), index position 2 may be assigned to the adjacent oblong-shaped active area 372 (labeled 2) and so on, where index position 5 may be assigned to the right-most oblong-shaped active area 372 (labeled 5).
Thus, to input the word "yes", the user may press the oblong-shaped active area 372 representing index position 4 with any of his fingers and press the particular input element 342 representing the letter "y" with his thumb; then the user may press the oblong-shaped active area 372 representing index position 3 with any of his fingers and press the input element 342 representing the letter "e" with his thumb; and then the user may press the oblong-shaped active area 372 representing index position 5 with any of his fingers and press the input element 342 representing the letter "s" with his thumb.
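The entry of "yes" just described can be modeled as the thumb selecting a symbol group (a keypad key) while a finger selects an index position within that group. This toy Python model assumes a standard phone keypad layout and an index convention in which position 1 selects the digit and positions 2 through 5 select the letters; the pairing logic is an assumption for illustration.

```python
# Toy model of two-handed text entry: the thumb picks a keypad key (a
# symbol group) while a finger picks an index position within the group.
# Standard phone keypad layout; position 1 is the digit, 2..5 the letters.

KEYPAD = {
    "2": "2abc", "3": "3def", "4": "4ghi", "5": "5jkl",
    "6": "6mno", "7": "7pqrs", "8": "8tuv", "9": "9wxyz",
}

def enter_symbol(key, index_position):
    """Return the symbol selected by a thumb tap plus a finger index."""
    return KEYPAD[key][index_position - 1]

# "yes": y = key 9 at index 4, e = key 3 at index 3, s = key 7 at index 5
word = "".join(enter_symbol(k, i) for k, i in [("9", 4), ("3", 3), ("7", 5)])
```

Note that each letter costs exactly one thumb tap plus one finger press, rather than the repeated taps of conventional multi-tap entry.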
The coordination of finger motions and thumb motions in other than a grasping motion may be difficult for most people. Generally, doing two separate types of motions simultaneously can be difficult. However, the human interface and input system described herein does not require those types of motions due to the flexibility of the system. Generally, it is easier to tap both the fingers and thumbs or leave either the thumb or fingers in contact with an input element or delineated active area while moving the other. For example, a user's finger may press an oblong-shaped active area 372 at the same time or nearly the same time the user's thumb taps an input element 342 in the first input assembly 340.
Also, a user may tap an input element 342 with his thumb while pressing an oblong-shaped active area 372. Pressing or touching an oblong-shaped active area 372 while tapping on an input element 342 in the first input assembly 340 typically is natural, comfortable and easy to do. Likewise, the same holds true where the index finger moves substantially linearly from one oblong-shaped active area 372 to the next, generally a left to right motion or vice versa, while the thumb taps an input element 342 in the first input assembly 340.
Another way to implement finger/thumb coordination would be to permit asynchronous or sequential tapping between the finger tap and the thumb tap.
For example, pressing an input element 342 within a pre-determined time (e.g., one second) after pressing and releasing an oblong-shaped active area 372 would constitute the same action as if both were pressed simultaneously. This time window could be configured by the user to facilitate different proficiencies in typing or different types of applications: for game applications, the time window could be quite short, whereas for text input applications, the time window could be much longer. The time window also could be different for different delineated active areas based on their intended function in a given application.
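The asynchronous pairing rule above reduces to a simple timestamp comparison. In this Python sketch the per-application window lengths are illustrative assumptions; timestamps are in seconds.

```python
# Sketch of asynchronous tap pairing: a finger tap on a delineated active
# area and a thumb tap on a keypad element count as one combined input if
# they fall within a configurable time window. Values are illustrative.

TIME_WINDOWS = {"text": 1.0, "game": 0.2}  # per-application windows (s)

def paired(finger_tap_time, thumb_tap_time, application="text"):
    """True if the two taps form one combined input for the application."""
    return abs(thumb_tap_time - finger_tap_time) <= TIME_WINDOWS[application]
```

A per-area window, as the passage suggests, would simply key the table on (application, area) instead of application alone.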
Another method to implement full keyboard capability and reduce the number of thumb taps is to map in software the delineated active areas of the second input assembly 350 as follows: the left vertical side 362 of the inverted U-shaped active area 360 to be shift position 1;
anywhere along the horizontal side 364 to be shift position 2; the top-left rectangular-shaped active area 378 to be shift position 3; the top-right rectangular-shaped active area 374 to be shift position 4; the bottom-left rectangular-shaped active area 380 to be shift position 5; and, if needed, the bottom-right rectangular-shaped active area 376 to be shift position 6. The input elements 342 of the first input assembly 340 may again be mapped to text functionality.
Thus, to input the word "yes", the user may press the top-right rectangular-shaped active area 374 representing shift position 4 with any of his fingers and press the particular input element 342 representing the letter "y" with his thumb; then the user may press the top-left rectangular-shaped active area 378 representing shift position 3 with any of his fingers and press the input element 342 representing the letter "e" with his thumb; and then the user may press the bottom-left rectangular-shaped active area 380 representing shift position 5 with any of his fingers and press the input element 342 representing the letter "s" with his thumb.
A method of implementing the functionality of a game controller is to assign in software specific game functions to the input elements 342 of the first input assembly 340, to take advantage of the capability of the human thumb, i.e., range of motion and sustained force actuation, and to map in software the delineated active areas of the pressure sensor pad 354 of the second input assembly 350 to analog control, to take advantage of the capability of the human finger, i.e., rapidity of motion, spatial memory and fine motor control. Thus, as a user's index finger or middle finger slides from left to right across the oblong-shaped active areas 372, the horizontal side 364 of the inverted U-shaped active area 360, and/or the rectangular active area 370, the input controller (not shown) may interpret the motion as "increasing" a parameter such as speed, size, position, etc. Alternatively, the input controller may be programmed to interpret different levels of applied pressure to these delineated active areas as the "increasing" parameter, i.e., increased pressure may represent increased speed, size, position, etc.
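The pressure-as-parameter interpretation can be sketched as a scaling function. In this hypothetical Python illustration the threshold, pressure range and speed range are all assumed values, not specified by the description.

```python
# Hypothetical interpretation of applied pressure as an "increasing" analog
# parameter: a normalized pressure reading from a delineated active area is
# scaled into a game speed value. Threshold and ranges are assumptions.

PRESSURE_THRESHOLD = 0.1  # below this, the touch is ignored
MAX_PRESSURE = 1.0
MAX_SPEED = 100

def pressure_to_speed(pressure):
    """Map a normalized pressure reading to a speed parameter (0..100)."""
    if pressure < PRESSURE_THRESHOLD:
        return 0
    scale = (pressure - PRESSURE_THRESHOLD) / (MAX_PRESSURE - PRESSURE_THRESHOLD)
    return round(min(scale, 1.0) * MAX_SPEED)
```

The same shape of function could equally drive size or position, or be replaced by a slide-position reading for the left-to-right motion described above.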
In this implementation, the vertical side 362 of the inverted U-shaped active area 360 may be programmed to represent the y-directional (vertical) movement of control of a character in a game, while the horizontal side 364 of the U-shaped active area 360 may be programmed to represent the x-directional (horizontal) movement. Movement into or out of the field of view may be controlled by the left and right rectangular buttons 374, 376, 378, 380, thereby allowing 3-D control. Rapid firing of weapons may be accomplished by using the input elements 342 of the first input assembly 340 or one of the five oblong-shaped active areas 372, with each one representing a different weapon or action. Complex moves or mode shifts could be accomplished by combining input elements 342 of the first input assembly 340 with any delineated active area of the second input assembly 350. In this way, a game developer may optimize the mapping of delineated active areas based on the best configuration for a particular game. For example, a game developer could set up control configurations for novice users differently than for advanced users, such as mapping different numbers or sizes of delineated active areas, in order to reduce the learning time to be proficient and make game control easier for first time players.
Figs. 4a and 4b illustrate front and back isometric views, respectively, of a hand-held electronic device 402 similar to the device shown in Figs. 3a and 3b, except the second input assembly 450 includes three input or selection elements 454, 456, 458, which may be rectangular-shaped touch pads. Each touch pad 454, 456, 458 may transduce the location of the contact of an object or a user's finger anywhere on its surface. Also each touch pad 454, 456, 458 may correspond to different programmable functions. Here, touch pad 454 may be disposed on the back-side surface 414; touch pad 456 may be disposed on the left-side surface 416; and touch pad 458 may be disposed on the right-side surface 418.
In a hand-held device such as a cell-phone or PDA, the second input assembly may include a touch-pad located diagonally on the back-side surface 414 with another touch-pad on the left-side surface 416 because a right-handed user's index finger typically is placed along a diagonal path on the back-side surface 414 wrapping around to the left-side surface 416. In that case, second input assembly 450 may include touch pad 454 and touch pad 456.
A user's finger may move along the length of the touch-pad strip 454 in order to select the desired function. For example, a far-left portion of touch-pad 454 may be mapped to be index position 1, a far-right portion of touch-pad 454 may be mapped to be index position 5, and portions between the far-left portion and the far-right portion of the touch-pad 454 may be mapped to intervening index positions. Alternatively, index position 1 may be mapped to touch pad 456 for right-handed users and mapped to touch pad 458 for left-handed users.
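The strip-to-index mapping just described amounts to quantizing the contact position along the pad. In this Python sketch, positions are normalized (0.0 = far left, 1.0 = far right) and the even five-way split is an assumption.

```python
# Sketch of linear strip-to-index mapping for touch pad 454: a normalized
# contact position along the strip is quantized into five index positions.
# The even quantization scheme is an illustrative assumption.

NUM_POSITIONS = 5

def strip_to_index(position):
    """Map a normalized contact position along the strip to index 1..5."""
    position = min(max(position, 0.0), 1.0)  # clamp to the pad surface
    return min(int(position * NUM_POSITIONS) + 1, NUM_POSITIONS)
```

A different number of positions, or unequal segment widths tailored to a user's hand, would only change the table, not the approach.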
Thus, in this implementation, text input is similar to that as described with respect to Figs. 3a and 3b. Other configurations of the active areas of the touch pads 454, 456, 458 are possible and can be tailored to specific applications.
Figs. 5a and 5b illustrate front and back isometric views, respectively, of a hand-held electronic device 502 similar to the device shown in Figs. 3a and 3b, except the second input assembly 550 includes three input or selection elements 554, which may be actuated by any of the user's fingers, typically the user's index finger or middle finger or a combination of both. The input elements 554 in this implementation are conventional two-position rockers.
Thus, the second input assembly 550 can provide six index positions at a relatively low cost with passive tactile feedback built in.
Figs. 6a and 6b illustrate front and back isometric views, respectively, of a hand-held electronic device 602 similar to the device shown in Figs. 3a and 3b, except the second input assembly 650 includes three input or selection elements 654, 656, 658. Input element 654 may be a D-pad input device and input elements 656, 658 may be either digital or analog contact sensors. The D-pad 654 may be mounted on the center of the back-side surface 614 and mapped in software to represent one or more index or shift positions. For example, the D-pad 654 may be mapped to represent four index positions with each compass heading of the D-pad (e.g., North, South, East and West) representing a different index position. A fifth index position could be mapped to orthogonal movement of the center of the D-pad 654 into the device 602. Alternatively, the D-pad 654 may be mapped to represent eight index positions, e.g., the compass directions North, South, East, West, Northeast, Northwest, Southeast and Southwest may be mapped. The contact sensors 656, 658 may be used as mode functions, for firing weapons, or any other functionality specified by an application developer.
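The D-pad mappings described above are simple lookup tables. This Python sketch is illustrative; the heading names and the particular position numbers assigned to each heading are hypothetical.

```python
# Illustrative mapping of D-pad 654 headings to index positions: four
# compass headings plus a center press, or eight compass headings.
# The position numbering is a hypothetical choice.

FOUR_WAY = {"N": 1, "S": 2, "E": 3, "W": 4, "CENTER": 5}
EIGHT_WAY = {"N": 1, "NE": 2, "E": 3, "SE": 4,
             "S": 5, "SW": 6, "W": 7, "NW": 8}

def dpad_index(heading, mode="four"):
    """Return the index position for a D-pad heading in the given mode."""
    table = FOUR_WAY if mode == "four" else EIGHT_WAY
    return table[heading]
```

Switching between the four-way and eight-way tables per application is the same context-driven remapping used for the other input assemblies.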
Figs. 7a and 7b illustrate front and back isometric views, respectively, of a two-handed hand-held electronic device 702. A first input assembly 740 including a plurality of input elements 742 is disposed on the front-side surface 712. A second input assembly 750, including two input or selection elements 754, 756, is disposed on the back-side surface 714.
In this implementation, the two input elements 754, 756 are rotary dials.
Alternatively, rotary dial 754 may be disposed on the left-side surface 716 and rotary dial 756 may be disposed on the right-side surface 718. In a one-handed hand-held electronic device, such as a cell-phone, typically one rotary dial is needed if placed on the back-side surface 714 or two rotary dials are needed if placed on the left and right side surfaces 716, 718. Rotation of the rotary dials 754, 756 may be mapped in software to represent one or more index positions.
The rotary dials 754, 756 may be implemented with detents so that the user can distinguish between separate index positions, i.e., tactile feedback may be provided to the user's finger(s).
Fig. 8 is a block diagram that illustrates a hand-held electronic device 800, such as a cell-phone or PDA, upon which the human interface and input system and associated techniques described herein may be implemented in a communication system.
Network link 820 typically provides data communication through one or more networks to other devices.
For example, network link 820 may provide a connection through local network 822 to a host computer 824 or to data equipment operated by an Internet Service Provider (ISP) 826. ISP 826 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the "Internet" 828. Network link 820 also could provide data communication directly to the ISP 826 and Internet 828. Local network 822 and Internet 828 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 820, which carry the digital data to and from electronic device 800, are exemplary forms of carrier waves transporting the information.
Electronic device 800 can send messages and receive data, including program code, which includes one or more sequences of one or more instructions, through the network(s) and network link 820. In the Internet example, a server 830 might transmit a requested code for an application program through Internet 828, ISP 826, local network 822 and network link 820. In one aspect, one such downloaded application may be for software games to be played on electronic device 800, which may obtain application code in the form of a carrier wave.
In any of the above implementations, active and/or passive tactile feedback may be implemented. To provide passive tactile feedback, the input elements of the first and/or second input assemblies may be combined with a palpable detent, such as a dome cap or dome spring so that a user can tactilely perceive, through his fingers or thumbs, activation and/or deactivation of an input element. In one implementation, among others, the palpable detent may be positioned between the actuator and sensor components of the input elements.
To provide active tactile feedback, one or more vibratory units or force producing units may be mounted in the hand-held electronic device and activated to provide tap or index level or other information to a user. The vibratory unit may be an electric motor with an eccentric mass attached to the motor's shaft, a solenoid, a variable reluctance device, a loud speaker or any other vibrator that can provide tactile feedback. A force producing unit may be a solenoid in non-vibratory mode, a motor, non-vibratory actuators or any other actuator that can produce forces. A vibratory unit and/or force producing unit may be provided for each input element. In that case, the vibratory unit and/or force producing unit may be mounted below the input element so that, when activated, it can push out the surface of the electronic device to a different level or position depending on the information to be communicated. Thus, in implementations using a pressure sensor pad or touch-pad as the input element, a stepped array may be configured to indicate higher and higher levels of index positions across the touch pad or pressure sensor pad. The vibratory units and/or force producing units may also be used to provide tactile feedback to indicate the momentary achievement of an objective, such as target lock in game applications. Tactile feedback may also be accomplished by actuators, such as a solenoid, which change the stiffness of the input element electronically or push against the user's hand or fingers to indicate an event of interest in the software application.
The computational aspects described here can be implemented in analog or digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them.
Where appropriate, aspects of these systems and techniques can be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor; and method steps can be performed by a programmable processor executing a program of instructions to perform functions by operating on input data and generating output.
The systems and techniques described above utilize the biomechanics of the thumb and fingers, i.e., they use the function of opposition, the fine motor control of the fingers, and the larger range of motion and stronger actuation provided by the thumb. By using the fingers and thumb in concert, the number of taps and the time needed to accomplish a given function are reduced, the accuracy is improved, and the natural programming inherent in the human hand replaces the training required for other systems.
A number of implementations have been described. Other implementations may include different or additional features. For example, other configurations of the one or more input elements of the first and second input assemblies may be realized. Also, the hand-held electronic devices described herein may have more or fewer than six planar or contoured surfaces. Moreover, the number of input elements in the first and second input assemblies is not limited to the number of input elements described in the implementations above.
Also, the one or more input elements of the first and second input assemblies may be any input or selection type known to one of skill in the art, such as keys, buttons, touch pads, other types of pads, rockers, sliders, dials, contact sensors or other actuators associated with any sensor. Each sensor associated with an actuator may include digital momentary on/off switches or analog sensors, such as pressure sensors (e.g., force sensitive resistors, piezo film sensors, or capacitive sensors), or positional sensors (e.g., rotary or linear potentiometers or encoders), or other analog sensors known to those of ordinary skill, or accelerometers or gyroscopes. The first and second input assemblies may include a combination of these different types of input or selection elements, which may be mounted in the configurations shown in the figures or embedded within the device to permit control through motion of the overall device.
Moreover, the methods to provide data input, device control or game control may be performed in a different order and still achieve desirable results.
Accordingly, other implementations are within the scope of the following claims.
The processor may receive signals generated by the input elements of the first or second input assemblies when manipulated by the human user.
The hand-held electronic device may further include an input controller. The input controller may receive signals generated by the input elements of the first or second input assemblies when manipulated by the human user and may convert the signals into a form suitable to be interpreted by the processor.
At least one of the input elements of the second input assembly may be a rotary sensor.
At least one of the input elements of the second input assembly may be a D-pad.
The hand-held electronic device may further include at least one palpable detent. The detent may be associated with at least one of the input elements of the first or second input assemblies so as to provide tactile feedback to the human user when the human user manipulates the input element associated with the palpable detent.
The hand-held electronic device may further include one or more vibratory or force producing units, and at least one of the vibratory or force producing units may be configured to provide tactile feedback upon the human user's manipulation of at least one of the input elements of the first or second input assemblies.
At least one of the vibratory units may provide tactile feedback in response to events occurring in the selected application running on the processor.
In accordance with another aspect of the invention, there is provided a hand-held electronic device. The device includes a memory configured to store a plurality of applications. Each application is associated with a set of functions. The device also includes a processor configured to process a selected one of the plurality of applications. The set of functions associated with the selected application includes a plurality of text symbol functions and a plurality of shifting functions. The device further includes a first surface having a plurality of input elements configured to receive input from a human user through manipulation of the plurality of input elements. At least one of the input elements of the first surface is further configured to selectively map to more than one text symbol function. The device also includes a second surface having one or more input elements. At least one of the input elements of the second surface has one or more selectable active areas configured to be manipulated by one or more of the human user's fingers, each selectable active area configured to selectively map to a different shifting function. Manipulation of one of the selectable active areas causes the text symbol function of the one or more input elements of the first surface to change. Further, the plurality of input elements of the first surface and the one or more input elements of the second surface are arranged so as to substantially optimize a biomechanical effect of the human user's hand.
The hand-held electronic device may further include a controller. The controller may receive signals generated by the human user's manipulation of the input elements of the first surface or active areas.
The hand-held electronic device may further include a dome cap positioned above at least one input element of the first surface or the second surface and may be capable of providing tactile feedback to the human user when the input element associated with the dome cap is manipulated.
The hand-held electronic device may further include one or more vibratory units capable of providing tactile feedback.
The hand-held electronic device may further include one or more force producing units capable of providing tactile feedback.
In accordance with another aspect of the invention, there is provided a method for configuring a human interface and input system for use with a hand-held electronic device configured to run a plurality of applications, each application associated with a set of functions. The method involves disposing on a first surface a first input assembly having a plurality of input elements configured to receive input from a human user through manipulation of the plurality of input elements. At least one of the input elements of the first input assembly is further configured to map to more than one input function associated with a selected one of the plurality of applications. The method further involves disposing on a second surface a second input assembly having one or more input elements configured to be manipulated by one or more of the human user's fingers. At least one of the input elements of the second input assembly is further configured to selectively map to one or more input functions associated with the selected application. The method also involves arranging the plurality of input elements of the first input assembly and the one or more input elements of the second input assembly to substantially optimize a biomechanical effect of the human user's hand.
The method may further involve physically or electronically labeling at least one input element of the first input assembly or the second input assembly so as to visually indicate an input function that can be selectively accessed by actuating the input element.
The method may further involve connecting a controller to the input elements of the first input assembly or the second input assembly. The controller may be configured to receive signals generated by a manipulation of one or more of the input elements of the first input assembly or the second input assembly.
At least one input element of the second input assembly may have a plurality of active areas configurable by the controller to form a plurality of delineated active areas.
The method may further involve positioning a shape changing media relative to the one input element of the second input assembly having a plurality of active areas so as to permit the human user to tactilely discriminate between the plurality of delineated active areas.
The method may further involve positioning a palpable detent with at least one input element of the first input assembly or the second input assembly so as to provide tactile feedback when manipulated by the human user.
In accordance with another aspect of the invention, there is provided a method for inputting data on a hand-held electronic device having a first surface with a plurality of input elements configured to receive input from a human user through manipulation of the plurality of input elements, and a second surface having one or more selection elements configured to be manipulated by one or more of the human user's fingers. At least one of the input elements is further configured to map to a plurality of symbols in a data input mode. Each of the plurality of symbols is associated with a unique index position identifier, and each selection element corresponds to one of the unique index position identifiers.
Further, the plurality of input elements and the one or more selection elements are arranged to substantially optimize a biomechanical effect of the human user's hand. The method involves executing a selected application from a plurality of applications. The selected application is associated with a set of functions. The method also involves determining the index position identifier of a desired symbol to be inputted based on the functions associated with the selected application, pressing the selection element corresponding to the index position identifier of the desired symbol with any digit or object held in the human user's hand, and pressing the input element configured to map to the desired symbol with any digit or object held in the human user's hand.
Each input element may be physically or electronically labeled indicating each symbol that is mapped to the input element and a positional order in which each symbol may be selectively accessed by actuating the input element.
Determining the index position identifier of the desired symbol to be inputted may involve locating the input element configured to map to the desired symbol, and counting from left to right the number of symbols preceding the desired symbol labeled on the located input element. The index position identifier of the desired symbol may be the number of symbols preceding the desired symbol plus one.
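The counting rule above amounts to a one-based index into the key's label. A minimal sketch, with illustrative label lists that are not taken from the patent:

```python
def index_position_identifier(labels, desired):
    """Count the symbols preceding the desired symbol on the key's
    label, left to right, and add one."""
    return labels.index(desired) + 1
```

For a key labeled "a b c", the symbol 'b' is preceded by one symbol, so its index position identifier is 2.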
At least one of the input elements or selection elements may be further configured to map to a plurality of modes corresponding to the selected application executing on the hand-held electronic device. At least one of the modes may be the data input mode, and the method may further involve enabling the data input mode.
In accordance with another aspect of the invention, there is provided a method for a human user to input data on a hand-held electronic device using an interface and input system that involves a plurality of input elements in a thumb-manipulated assembly, arranged to substantially optimize a biomechanical effect of the human user's thumb and fingers, and one or more selection elements in a finger-manipulated input assembly. At least one input element is mapped to more than one text function, and each selection element is mapped to a unique shift position. The method involves executing a selected text application from a plurality of applications. The selected application is associated with a set of functions.
The method further involves pressing a desired selection element of the finger-manipulated input assembly with a human finger to select a desired shift position of the selected text application, and pressing a desired input element of the thumb-manipulated input assembly with a human thumb to input a desired text character.
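The two-step entry described above — a finger selects a shift position, then a thumb press on a front key yields the character at that position — can be sketched as follows (the key names and labels are hypothetical):

```python
# Hypothetical multi-symbol key labels for a cell-phone-style keypad.
KEY_LABELS = {"key_2": ("a", "b", "c"), "key_3": ("d", "e", "f")}

def enter_character(key, shift_position):
    """Return the character at the finger-selected shift position
    (one-based) of the thumb-pressed key."""
    return KEY_LABELS[key][shift_position - 1]
```

A single finger-plus-thumb chord thus replaces the repeated taps of conventional multi-tap text entry.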
In accordance with another aspect of the invention, there is provided a hand-held electronic device. The device includes a memory configured to store a plurality of applications. Each application is associated with a set of functions. The device also includes a processor configured to process a selected one of the plurality of applications, and a first input assembly disposed on a first surface of the electronic device. The first input assembly includes a plurality of input elements configured to be actuated by a human user's hand. At least one of the input elements of the first input assembly is configured to map to one or more input functions of the set of functions associated with the selected one of the plurality of applications. The device further includes a second input assembly disposed on a second surface so as to substantially optimize a biomechanical effect of the human user's hand. The second input assembly includes one or more input elements configured to be manipulated by one or more of the human user's fingers. At least one of the input elements of the second input assembly is a selectively configurable sensing surface so as to provide a plurality of delineated active areas. Further, one or more of the delineated active areas is mapped to one or more functions associated with the selected application, and the memory is further configured to store for each application a mapping of the selectively configurable sensing surface to the plurality of delineated active areas.
The selected one of the plurality of applications may be a text application, and the one or more input elements on the second surface of the second input assembly may include one or more selection elements. Manipulations of the one or more selection elements may cause the input elements on the first surface of the first input assembly to be selectively mapped from one text function to another text function.
The selected one of the plurality of applications may be a game application, and at least one of the plurality of input elements of the first input assembly and at least one of the input elements of the second input assembly may each be configured to selectively map to one or more game functions.
The hand-held electronic device may further include an input controller. The input controller may receive a plurality of signals generated by the input elements of the first input assembly and the second input assembly when manipulated by the human user, and may convert the plurality of signals into a form suitable to be interpreted by the processor.
At least one input element of the first input assembly or the second input assembly may be configured to map to one or more input functions associated with the selected application that control a cursor on a screen.
The selected one of the plurality of applications may be a game application.
At least one input element of the first input assembly or the second input assembly may be configured to map to one or more input functions associated with the game application that control a game character on a screen.
The input controller may be further configured to interpret a movement of the human user's finger sliding across two or more delineated active areas as a change in the mapped function of the two or more delineated active areas. The mapped function may be at least one of a speed control, a size control, a weapon fire control, and a position control.
The input controller may be further configured to interpret a pressure applied by the human user's finger on a selected one of the delineated active areas as a change in the mapped function of the selected delineated active area. The mapped function may be at least one of a speed control, a size control, a weapon fire control, and a position control.
At least one of the functions mapped to the input element of the first input assembly may be a game function that is substantially optimized for actuation by the human user's thumb.
The game function that may be substantially optimized for actuation by the human user's thumbs may include a directional control.
At least one of the functions mapped to the delineated active areas may be a game function that may be substantially optimized for actuation by one or more of the human user's fingers.
The game function that may be substantially optimized for actuation by one or more of the human user's fingers may include a weapon fire control.
The game function that may be substantially optimized for actuation by one or more of the human user's fingers may include a game character jump control.
In accordance with another aspect of the invention, there is provided a method for configuring a human interface and input system for use with a hand-held electronic device configured to run a plurality of applications, each application associated with a set of functions. The method involves disposing on a first surface a first input assembly having a plurality of input elements configured to receive input from a human user's hand through manipulation of the plurality of input elements. At least one of the input elements of the first input assembly is further configured to map to more than one input function associated with a selected one of the plurality of applications. The method also involves disposing on a second surface a second input assembly having one or more input elements configured to be manipulated by one or more of the human user's fingers. At least one of the input elements of the second input assembly is further configured to selectively map to one or more input functions associated with the selected application. The method further involves mapping the set of functions of the selected application to the one or more input elements of the first input assembly and the second input assembly to substantially optimize a biomechanical effect of the human user's hand.
The selected application may be at least one of a scrolling application, a text application and a game application.
The method may further involve physically or electronically labeling at least one input element of the first input assembly or the second input assembly so as to visually indicate an input function that may be selectively accessed by actuating the input element.
The method may further involve connecting a controller to the input elements of the first input assembly or the second input assembly. The controller may be configured to receive signals generated by a manipulation of one or more of the input elements of the first input assembly or the second input assembly.
At least one input element of the second input assembly having a plurality of active areas may be configurable by the controller to form a plurality of delineated active areas.
The method may further involve positioning a shape changing media relative to the one input element of the second input assembly having a plurality of active areas so as to permit the human user to tactilely discriminate between the plurality of delineated active areas.
The processor may be further configured to be communicatively coupled to a host electronic device.
The controller may be further configured to be communicatively coupled to a host electronic device.
The hand-held electronic device may be further configured to be communicatively coupled to a host electronic device.
The hand-held electronic device may be further configured to interface with a host electronic device.
The processor may be further configured to interface with a host electronic device.
The hand-held electronic device may be configured to interface with a host electronic device.
The selectively configurable sensing surface may be configured to selectively vary a shape of the provided one or more delineated active areas based on the selected application.
The selectively configurable sensing surface may be configured to selectively vary a number of the delineated active areas based on the selected application.
The selectively configurable sensing surface may selectively provide the one or more delineated active areas customized for a user's hand.
The systems and techniques described here may provide one or more of the following advantages. The human interface and input system and associated techniques offer the functionality of a high performance game controller, which can support the most demanding game input requirements, and the speed and accuracy of data input that can be obtained with the use of a conventional standard QWERTY keyboard, but without the large footprint. Also, the human interface and input system and associated techniques can increase the number of functions that may be associated with a given number of input elements without increasing the number of keystrokes or taps that are required. Moreover, they allow the input element size to remain consistent with the ergonomics of the human hand without increasing the time it takes to learn how to use the input system compared to conventional input systems.
Details of one or more implementations are set forth in the accompanying drawings and the description below. Other features and advantages will be apparent from the description and drawings, and from the claims.
DESCRIPTION OF THE DRAWINGS
Fig. 1 is a block diagram of a typical hand-held electronic device upon which the human interface and input system may be implemented.
Fig. 2 is a block diagram of an implementation of the human interface and input system.
Figs. 3a and 3b show front and back isometric views, respectively, of a hand-held electronic device wherein the second input assembly includes a pressure sensor pad having a plurality of configurable active areas.
Fig. 3c illustrates an exploded view of an example of an input element of the first input assembly.
Fig. 3d depicts one implementation of how the plurality of configurable active areas of the pressure sensor pad of the second input assembly may be configured.
Figs. 4a and 4b depict front and back isometric views, respectively, of a hand-held electronic device wherein the second input assembly includes three touch pads.
Figs. 5a and 5b depict front and back isometric views, respectively, of a hand-held electronic device wherein the second input assembly includes three two-position rockers.
Figs. 6a and 6b illustrate front and back isometric views, respectively, of a hand-held electronic device wherein the second input assembly includes a D-pad and two contact sensors.
Figs. 7a and 7b show a two-handed hand-held electronic device wherein the second input assembly includes two rotary dials.
Fig. 8 is a block diagram of a hand-held electronic device in the context of a communication system that may be used to implement the human interface and input systems and techniques described here.
Like reference symbols in the various drawings indicate like elements.
DETAILED DESCRIPTION
Biomechanics of the Human Hand
The human hand comprises an opposable thumb and four fingers, i.e., the thumb may be used in opposition, in concert, in combination or functional relation with any of the four fingers. Compared to the human fingers, the human thumb may be characterized as having a larger range of motion, stronger sustaining force actuation and poorer dexterity. The base joint of the human thumb has three degrees of freedom: side-to-side movement, up and down movement, and rotation about the thumb's long axis. The base joint of the fingers has two degrees of freedom: side-to-side and up and down movement.
Thus, the thumb typically is considered to have better range of motion than any of the fingers. Also, because the human thumb has a bigger actuation muscle than any of the fingers, it can provide larger sustaining forces than the fingers. But also because of the larger muscle, the human thumb offers diminished fine motor control and less rapid motion compared to the fingers. Thus, the human fingers are more suitable for performing tasks that require fine motor coordination or the ability to pinpoint or rapidly repeat actuation.
Hand-Held Electronic Device Hardware Overview
Fig. 1 is a block diagram that illustrates a hand-held electronic device 100, such as a cell-phone, PDA, pocket PC, smart phone, or other similar input device, upon which the human interface and input system and associated input techniques described herein may be implemented. Electronic device 100 may include a bus 102 or other communication mechanism for communicating information, and a processor 104, such as an ARM, OMAP or other similar processor, coupled with bus 102 for processing information, such as one or more sequences of one or more instructions, which may be embedded software, firmware, or software applications, such as a text messaging application or video game.
Electronic device 100 also may include a main memory 106, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 102 for storing information and instructions to be executed by processor 104. Main memory 106 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 104. Electronic device 100 further may include a read only memory (ROM) 108 or other static storage device coupled to bus 102 for storing static information and instructions for processor 104. A storage device 110 may be provided and coupled to bus 102 for storing information and instructions. Electronic device 100 may also include a display 112, such as a liquid crystal display (LCD), for displaying information to a user, and a human interface and input system 114 for communicating information and command selections to processor 104.
Electronic device 100 also may include a communication interface 118 coupled to bus 102.
Communication interface 118 provides a two-way data communication coupling to a base station. For example, communication interface 118 may be a wireless link, a modem to provide a data communication connection to a corresponding type of telephone line or any other communication interface known to one of ordinary skill. As another example, communication interface 118 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. In the wireless link implementation, communication interface 118 may send and receive electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
Human Interface and Input System Overview
Fig. 2 is a block diagram illustrating the major subsystems of the human interface and input system 114. The input system 114 may include a first input assembly 206, a second input assembly 208, and an input controller 216. The first input assembly 206 and the second input assembly 208 may include one or more input or selection elements. The input or selection elements, which may be keys, buttons, pressure sensor pads, touch pads or other actuators associated with one or more sensors, produce one or more electrical signals 214 when the input or selection elements are actuated. The input controller 216, which may include one or more processors, receives the one or more electrical signals 214 and converts them into a form suitable to be received and interpreted by processor 104 after passing through bus 102.
One or more context signals 222 are provided to input controller 216 through bus 102 in response to processor 104 executing embedded software, firmware, or software applications, such as a text messaging application. The context signals 222 are received and used by input controller 216 to map input or selection elements in the first input assembly 206 and/or the second input assembly 208 to one or more application input functions and responses. For example, if a text application is being executed by processor 104, then the input controller 216 may map one or more input elements of the first input assembly 206 to one or more symbols, such as characters, letters, numbers, icons, other types of symbols, or combinations of different types of symbols, and map one or more input or selection elements of the second input assembly 208 to a shifting or indexing functionality. If processor 104 is executing a game application, then the input controller 216 may map the input or selection elements of the input assemblies 206, 208 to game functions. The mapping of the input or selection elements to particular input functions for a given software application, whether done by the input controller 216 or processor 104, may be customized by the application developer or the user through downloads or other programming modalities.
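A sketch of this context-driven remapping follows; the application names, element names, and function maps are illustrative assumptions, not details from the patent:

```python
# Hypothetical per-application function maps, as an input controller
# might hold them: a context signal selects which map is active.
FUNCTION_MAPS = {
    "text": {"key_2": ("a", "b", "c"), "back_area_1": "shift_1"},
    "game": {"key_2": "move_up", "back_area_1": "fire_weapon"},
}

class InputController:
    def __init__(self):
        self.mapping = {}

    def on_context_signal(self, application):
        # Remap the input elements when the running application changes.
        self.mapping = FUNCTION_MAPS[application]

    def translate(self, element):
        # Convert a raw element actuation into an application function.
        return self.mapping[element]
```

The same physical elements thus serve as symbol keys and shift selectors in a text context, and as movement and fire controls in a game context.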
Moreover, the mapping of the input or selection elements may be done for language key set changes, which may reduce the cost of manufacturing hand-held electronic devices for manufacturers servicing multiple countries.
Alternative implementations of the input system 114 need not have the input controller 216, particularly where cost is of a concern. In those instances, processor 104 may assume the functions of the input controller 216. Thus, processor 104 can perform the mapping function described above.
Human Interface and Input System and Techniques Implementations

Figs. 3a and 3b illustrate front and back isometric views, respectively, of a hand-held electronic device 302 upon which an implementation of the human interface and input system may be implemented. Electronic device 302 may include six planar or contoured surfaces: a front-side surface 312, a back-side surface 314, a left-side surface 316, a right-side surface 318, a top-side surface 320, and a bottom-side surface 322. In other implementations, however, electronic device 302 may have more or fewer planar and/or contoured surfaces. On the front-side surface 312, a display 330, such as an LCD, and a first input assembly 340 are disposed adjacent to each other. Alternatively, display 330 may be on a separate assembly, such as the displays for PDAs and cell phones with swivel-mounted screens or flip-phone configurations. Also, the first input assembly 340 may be disposed on more than one surface. The first input assembly may be a typical cell-phone keypad, which may include twelve input elements 342, although any number of input elements may be provided. A user's thumb or thumbs or a stylus may actuate the input elements 342.
A second input assembly 350 is disposed on the back-side surface 314, left-side surface 316 and right-side surface 318. Alternatively, the second input assembly may be disposed on one of those surfaces or a combination of those surfaces. In this implementation, the first input assembly 340 is disposed relative to the second input assembly 350 to take advantage of the opposition of the human thumb and finger. The second input assembly 350 includes a pressure sensor pad 354 having a plurality of software configurable active areas, which may be actuated by one or more of the user's fingers. The pressure sensor pad 354 in this implementation may include an actuator, such as an elastomeric material, attached to a force sensitive resistor array, a capacitive mat or array, or other similar pressure sensitive device or grid that can provide multiple outputs corresponding to the pressure readings of the plurality of active areas on the pad's 354 surface. Here, the pressure sensor pad 354 wraps around from the left-side surface 316 across the back-side surface 314 to the right-side surface 318.
It is to be understood that the input elements 342, 354 in this implementation and any other implementation could be analog and/or digital buttons, keys, rockers (which may be buttons having one or more positions or an analog joystick-type button), sliders, dials or touch pads used in combination with pressure sensors (such as force sensitive resistors, piezo resistive sensors, and capacitive sensors), positional sensors (such as rotary encoders, linear potentiometers and the like) or other sensors or a combination of them.
Fig. 3c depicts an exploded view of an input element 342 of the first input assembly 340, which is mapped to represent one or more text functions. Here, the input element is mapped to represent the number 7 and letters p, q, r and s, as is found on a typical keypad of a cell phone. Other input elements 342 may be associated with other letters, numbers and/or icons. For example, one input element may be associated with the number 4 and letters g, h and i, while another input element may be associated with the number 2 and the letters a, b and c.
As shown in Fig. 3d, the pressure sensor pad 354 may be configured in software to represent one or more delineated active areas corresponding to different programmable functions depending on the application. In this case, an inverted U-shaped region forms active area 360: the vertical sides 362 of the inverted U-shaped active area 360 are on the left-side surface 316 and the right-side surface 318, and the horizontal side 364 of the inverted U-shaped active area 360 runs along the top edge of the pressure sensor pad 354 on the back-side surface 314. Below the inverted U-shaped active area 360 on the back-side surface 314 are five oblong-shaped active areas 372, labeled from 1 to 5. On the bottom of both the left-side surface 316 and the right-side surface 318, and stretching to the back-side surface 314 of the pressure sensor pad 354, are rectangular-shaped active areas 374, 376, 378, 380. The remaining area of the pressure sensor pad 354 may be configured to be inactive.
In this implementation, the inverted U-shaped active area 360 may be used for navigation: the vertical sides 362 for y-directional movement and the horizontal side 364 for x-directional movement. The oblong-shaped active areas 372 may be used for shifting or indexing between symbols, such as characters, letters and/or numbers, or for text input. The rectangular-shaped active areas 374, 376, 378, 380 may be used for shifting between modes: two of the active areas 374, 376 for left-handed users and the other two active areas 378, 380 for right-handed users. In another configuration of the pressure sensor pad 354, the entire surface of the pressure sensor pad 354 may be covered by horizontal rectangular active areas interspersed between small horizontal rectangular inactive areas to achieve any desired number of active areas. Other configurations of the pressure sensor pad 354 may be realized depending on the requirements of the desired application.
The delineated active areas of the pressure sensor pad 354 may be actuated by one or more of the user's fingers, such as by applying pressure against the delineated active areas of the pad 354 beyond a pre-defined or user-adjustable threshold pressure.
Likewise, the absence of pressure may be used as an actuation event. The pressure sensor pad 354 also may contain, or be mounted above or below, a shape-changing media such as an electrorheological fluid, shape-memory metal array or similar material, which can permit the user to tactilely discriminate between the one or more delineated active areas. Thus, the user will be able to perceive the one or more delineated active areas as if they were physical buttons. Also, a computer graphical representation (not shown) of the configuration of the delineated active areas of the pad 354 may be displayed temporarily (or for some predetermined time) on a portion of the display 330 to visually assist the user in locating where the delineated active areas of the pad 354 are positioned. Moreover, an input element 342 of the first input assembly 340 may be mapped to activate and/or deactivate the displaying of the computer graphical representation.
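The threshold-based actuation of software-delineated active areas described above can be sketched as follows. This is an illustrative sketch only: the area names, pad coordinates and threshold value are assumptions, not details taken from the patent.

```python
# Hypothetical delineated active areas on the sensor pad, expressed as
# named rectangles (x0, y0, x1, y1) in normalized pad coordinates.
ACTIVE_AREAS = {
    "index_1": (0.0, 0.5, 0.2, 0.7),
    "index_2": (0.2, 0.5, 0.4, 0.7),
    "nav_horizontal": (0.0, 0.9, 1.0, 1.0),
}

def hit_test(x, y, pressure, threshold=0.3):
    """Return the name of the active area actuated at (x, y), or None
    if the press misses every delineated area or falls below the
    (user-adjustable) threshold pressure."""
    if pressure < threshold:
        return None
    for name, (x0, y0, x1, y1) in ACTIVE_AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```

Any contact outside the named rectangles falls through to `None`, which corresponds to configuring the remaining area of the pad to be inactive.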
The input architecture described above, with the first input assembly 340 on the front-side surface 312 and the second input assembly 350 on the back-side surface 314, left-side surface 316 and right-side surface 318, is configured to take advantage of the biomechanics of the hand, whether the user is left-handed or right-handed. This configuration, for example, can reduce the number of thumb taps required to input a particular symbol compared to the number of thumb taps or stylus presses required using only a typical key pad with modal input techniques, such as is found in conventional text input systems.
Moreover, this configuration can permit full keyboard capability with fewer input elements on the first input assembly 340 and with greater spacing between input elements to enhance the ease of input compared to typical keypads for existing cell phones. Also, this configuration can permit full functionality of a high performance game controller, which can support the most demanding game input requirements.
A method to implement full keyboard capability and reduce the number of thumb taps is to map in software the delineated active areas of the pressure sensor pad 354, such as the oblong-shaped active areas 372, to an indexing or shifting functionality to take advantage of the capability of the human finger, i.e., rapidity of motion, spatial memory and fine motor control, and to map in software the input elements 342 of the first input assembly 340 to text functionality to take advantage of the capability of the human thumb, i.e., range of motion and sustained force actuation.
When a text messaging application is running on the electronic device 302 and displayed on the screen 330, the first input assembly 340 and the second input assembly 350 are used together to perform the text messaging functions. Each input element 342 of the first input assembly 340 may represent one or more text functions, e.g., one input element may be associated with the decimal digit 2 and letters a, b and c, while another input element may be associated with the decimal digit 7 and letters p, q, r and s (as shown in Fig. 3c), such as is found on typical keypads.
In this implementation, the input elements 342 are configured the same as a typical keypad on a cell phone. The specific text function inputted by a user for a particular input element 342 is determined by which delineated active area of the pressure sensor pad 354 is pressed. For example, going from left to right, each oblong-shaped active area 372 may be mapped to represent a separate index or shift position such that index position 1 may be assigned to the left-most oblong-shaped active area (labeled 1), index position 2 may be assigned to the adjacent oblong-shaped active area 372 (labeled 2) and so on, where index position 5 may be assigned to the right-most oblong-shaped active area 372 (labeled 5).
Thus, to input the word "yes", the user may press the oblong-shaped active area 372 representing index position 4 with any of his fingers and press the particular input element 342 representing the letter "y" with his thumb; then the user may press the oblong-shaped active area 372 representing index position 3 with any of his fingers and press the input element 342 representing the letter "e" with his thumb; and then the user may press the oblong-shaped active area 372 representing index position 5 with any of his fingers and press the input element 342 representing the letter "s" with his thumb.
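The index-shifted text entry walked through above can be sketched in software. This is a hypothetical illustration: the keypad layout table and the convention that index position 1 selects the digit are assumptions, chosen so that the "yes" example (index positions 4, 3 and 5) works out as described.

```python
# Hypothetical per-key symbol layouts, digit first, matching a typical
# cell-phone keypad: index position 1 -> digit, positions 2+ -> letters.
KEYPAD = {
    2: "2abc", 3: "3def", 4: "4ghi", 5: "5jkl",
    6: "6mno", 7: "7pqrs", 8: "8tuv", 9: "9wxyz",
}

def symbol_for(key, index_position):
    """Symbol produced when the thumb taps `key` on the first input
    assembly while a finger holds the given index position (1-based)
    on the second input assembly."""
    return KEYPAD[key][index_position - 1]

# Entering "yes": (key, index position) pairs for y, e, s.
word = "".join(symbol_for(k, i) for k, i in [(9, 4), (3, 3), (7, 5)])
```

With this layout, one finger press plus one thumb tap selects each letter directly, rather than cycling through the letters with repeated thumb taps.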
The coordination of finger motions and thumb motions in other than a grasping motion may be difficult for most people. Generally, doing two separate types of motions simultaneously can be difficult. However, the human interface and input system described herein does not require those types of motions due to the flexibility of the system. Generally, it is easier to tap both the fingers and thumbs or leave either the thumb or fingers in contact with an input element or delineated active area while moving the other. For example, a user's finger may press an oblong-shaped active area 372 at the same time or nearly the same time the user's thumb taps an input element 342 in the first input assembly 340.
Also, a user may tap an input element 342 with his thumb while pressing an oblong-shaped active area 372. Pressing or touching an oblong-shaped active area 372 while tapping on an input element 342 in the first input assembly 340 typically is natural, comfortable and easy to do. Likewise, the same holds true where the index finger moves substantially linearly from one oblong-shaped active area 372 to the next, generally a left to right motion or vice versa, while the thumb taps an input element 342 in the first input assembly 340.
Another way to implement finger/thumb coordination would be to permit asynchronous or sequential tapping between the finger tap and the thumb tap.
For example, pressing an input element 342 within a pre-determined time (e.g., one second) after pressing and releasing an oblong-shaped active area 372 would constitute the same action as if both were pressed simultaneously. This time window could be configured by the user to accommodate different typing proficiencies or different types of applications: for game applications, the time window could be quite short, whereas for text input applications, the time window could be much longer. The time window also could be different for different delineated active areas based on their intended function in a given application.
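The asynchronous time-window chording described above reduces to a simple timing check. A minimal sketch, assuming timestamps in seconds and a thumb tap that follows the finger tap:

```python
def chorded(finger_tap_t, thumb_tap_t, window=1.0):
    """True if a thumb tap at thumb_tap_t should be combined with the
    preceding finger tap at finger_tap_t, i.e. it arrives within the
    configurable time window (in seconds)."""
    return 0.0 <= thumb_tap_t - finger_tap_t <= window
```

A short window (e.g. `window=0.2`) would suit game applications, while a longer window suits text input, and the window could be varied per delineated active area.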
Another method to implement full keyboard capability and reduce the number of thumb taps is to map in software the delineated active areas of the second input assembly 350 as follows: the left vertical side 362 of the inverted U-shaped active area 360 to be shift position 1; anywhere along the horizontal side 364 to be shift position 2; the top-left rectangular-shaped active area 378 to be shift position 3; the top-right rectangular-shaped active area 374 to be shift position 4; the bottom-left rectangular-shaped active area 380 to be shift position 5; and, if needed, the bottom-right rectangular-shaped active area 376 to be shift position 6. The input elements 342 of the first input assembly 340 may again be mapped to text functionality.
Thus, to input the word "yes", the user may press the top-right rectangular-shaped active area 374 representing shift position 4 with any of his fingers and press the particular input element 342 representing the letter "y" with his thumb; then the user may press the top-left rectangular-shaped active area 378 representing shift position 3 with any of his fingers and press the input element 342 representing the letter "e" with his thumb; and then the user may press the bottom-left rectangular-shaped active area 380 representing shift position 5 with any of his fingers and press the input element 342 representing the letter "s" with his thumb.
A method of implementing the functionality of a game controller is to assign in software the input elements 342 of the first input assembly 340 to specific game functions, to take advantage of the capability of the human thumb, i.e., range of motion and sustained force actuation, and to map in software the delineated active areas of the pressure sensor pad 354 of the second input assembly 350 to analog control, to take advantage of the capability of the human finger, i.e., rapidity of motion, spatial memory and fine motor control. Thus, as a user's index finger or middle finger slides from left to right across the oblong-shaped active areas 372, the horizontal side 364 of the inverted U-shaped active area 360, and/or the rectangular active area 370, the input controller (not shown) may interpret the motion as "increasing" a parameter such as speed, size, position, etc. Alternatively, the input controller may be programmed to interpret different levels of applied pressure to these delineated active areas as the "increasing" parameter, i.e., increased pressure may represent increased speed, size, position, etc.
In this implementation, the vertical side 362 of the inverted U-shaped active area 360 may be programmed to represent the y-directional (vertical) movement of control of a character in a game, while the horizontal side 364 of the U-shaped active area 360 may be programmed to represent the x-directional (horizontal) movement. Movement into or out of the field of view may be controlled by the left and right rectangular buttons 374, 376, 378, 380, thereby allowing 3-D control. Rapid firing of weapons may be accomplished by using the input elements 342 of the first input assembly 340 or one of the five oblong-shaped active areas 372, with each one representing a different weapon or action. Complex moves or mode shifts could be accomplished by combining input elements 342 of the first input assembly 340 with any delineated active area of the second input assembly 350. In this way, a game developer may optimize the mapping of delineated active areas based on the best configuration for a particular game. For example, a game developer could set up control configurations for novice users differently than for advanced users, such as mapping different numbers or sizes of delineated active areas, in order to reduce the learning time to be proficient and make game control easier for first-time players.
Figs. 4a and 4b illustrate front and back isometric views, respectively, of a hand-held electronic device 402 similar to the device shown in Figs. 3a and 3b, except the second input assembly 450 includes three input or selection elements 454, 456, 458, which may be rectangular-shaped touch pads. Each touch pad 454, 456, 458 may transduce the location of the contact of an object or a user's finger anywhere on its surface. Also, each touch pad 454, 456, 458 may correspond to different programmable functions. Here, touch pad 454 may be disposed on the back-side surface 414; touch pad 456 may be disposed on the left-side surface 416; and touch pad 458 may be disposed on the right-side surface 418.
In a hand-held device such as a cell-phone or PDA, the second input assembly may include a touch-pad located diagonally on the back-side surface 414 with another touch-pad on the left-side surface 416 because a right-handed user's index finger typically is placed along a diagonal path on the back-side surface 414 wrapping around to the left-side surface 416. In that case, second input assembly 450 may include touch pad 454 and touch pad 456.
A user's finger may move along the length of the touch-pad strip 454 in order to select the desired function. For example, a far-left portion of touch-pad 454 may be mapped to be index position 1, a far-right portion of touch-pad 454 may be mapped to be index position 5, and portions between the far-left portion and the far-right portion of the touch-pad 454 may be mapped to intervening index positions. Alternatively, index position 1 may be mapped to touch pad 456 for right-handed users and mapped to touch pad 458 for left-handed users.
Thus, in this implementation, text input is similar to that described with respect to Figs. 3a and 3b. Other configurations of the active areas of the touch pads 454, 456, 458 are possible and can be tailored to specific applications.
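The linear mapping from a contact location on the touch-pad strip to an index position, described above, might be sketched as follows; the normalized coordinate and the five-position default are assumptions for illustration.

```python
def index_position(x, positions=5):
    """Quantize a normalized contact location x in [0, 1] along the
    touch-pad strip into an index position: far left -> 1, far right
    -> the highest position."""
    x = min(max(x, 0.0), 1.0)          # clamp stray readings to the strip
    return min(int(x * positions) + 1, positions)
```

Sliding the finger along the strip then steps the index position through 1 to 5, mirroring the left-to-right motion across the oblong-shaped active areas in the pressure-pad implementation.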
Figs. 5a and 5b illustrate front and back isometric views, respectively, of a hand-held electronic device 502 similar to the device shown in Figs. 3a and 3b, except the second input assembly 550 includes three input or selection elements 554, which may be actuated by any of the user's fingers, typically the user's index finger or middle finger or a combination of both. The input elements 554 in this implementation are conventional two-position rockers.
Thus, the second input assembly 550 can provide six index positions at a relatively low cost with passive tactile feedback built in.
Figs. 6a and 6b illustrate front and back isometric views, respectively, of a hand-held electronic device 602 similar to the device shown in Figs. 3a and 3b, except the second input assembly 650 includes three input or selection elements 654, 656, 658. Input element 654 may be a D-pad input device and input elements 656, 658 may be either digital or analog contact sensors. The D-pad 654 may be mounted on the center of the back-side surface 614 and mapped in software to represent one or more index or shift positions. For example, the D-pad 654 may be mapped to represent four index positions with each compass heading of the D-pad (e.g., North, South, East and West) representing a different index position. A fifth index position could be mapped to orthogonal movement of the center of the D-pad 654 into the device 602. Alternatively, the D-pad 654 may be mapped to represent eight index positions, e.g., the compass directions North, South, East, West, Northeast, Northwest, Southeast and Southwest may be mapped. The contact sensors 656, 658 may be used as mode functions, for firing weapons, or any other functionality specified by an application developer.
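The two D-pad mappings described above (four compass headings plus a center press, or eight compass headings) could be tabulated as in the following sketch; the heading labels and the particular ordering of index positions are illustrative assumptions.

```python
# Hypothetical D-pad heading -> index position tables.
FOUR_WAY = {"N": 1, "S": 2, "E": 3, "W": 4, "PRESS": 5}  # PRESS = center pushed in
EIGHT_WAY = {h: i + 1 for i, h in
             enumerate(["N", "NE", "E", "SE", "S", "SW", "W", "NW"])}

def dpad_index(heading, eight_way=False):
    """Map a D-pad heading to an index or shift position under the
    selected software configuration."""
    table = EIGHT_WAY if eight_way else FOUR_WAY
    return table[heading]
```

Because the table is just software configuration, an application could swap between the four-way and eight-way mappings (or any other assignment) without any hardware change.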
Figs. 7a and 7b illustrate front and back isometric views, respectively, of a two-handed hand-held electronic device 702. A first input assembly 740 including a plurality of input elements 742 is disposed on the front-side surface 712. A second input assembly 750, including two input or selection elements 754, 756, is disposed on the back-side surface 714.
In this implementation, the two input elements 754, 756 are rotary dials.
Alternatively, rotary dial 754 may be disposed on the left-side surface 716 and rotary dial 756 may be disposed on the right-side surface 718. In a one-handed hand-held electronic device, such as a cell-phone, typically one rotary dial is needed if placed on the back-side surface 714 or two rotary dials are needed if placed on the left and right side surfaces 716, 718. Rotation of the rotary dials 754, 756 may be mapped in software to represent one or more index positions.
The rotary dials 754, 756 may be implemented with detents so that the user can distinguish between separate index positions, i.e., tactile feedback may be provided to the user's finger(s).
FIG. 8 is a block diagram that illustrates a hand-held electronic device 800, such as a cell-phone or PDA, upon which the human interface and input system and associated techniques described herein may be implemented in a communication system.
Network link 820 typically provides data communication through one or more networks to other devices.
For example, network link 820 may provide a connection through local network 822 to a host computer 824 or to data equipment operated by an Internet Service Provider (ISP) 826. ISP
826 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the "Internet" 828. Network link 820 also could provide data communication directly to the ISP 826 and Internet 828. Local network 822 and Internet 828 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 820, which carry the digital data to and from electronic device 800, are exemplary forms of carrier waves transporting the information.
Electronic device 800 can send messages and receive data, including program code, which includes one or more sequences of one or more instructions, through the network(s) and network link 820. In the Internet example, a server 830 might transmit a requested code for an application program through Internet 828, ISP 826, local network 822 and network link 820. In one aspect, one such downloaded application may be for software games to be played on electronic device 800, which may obtain application code in the form of a carrier wave.
In any of the above implementations, active and/or passive tactile feedback may be implemented. To provide passive tactile feedback, the input elements of the first and/or second input assemblies may be combined with a palpable detent, such as a dome cap or dome spring so that a user can tactilely perceive, through his fingers or thumbs, activation and/or deactivation of an input element. In one implementation, among others, the palpable detent may be positioned between the actuator and sensor components of the input elements.
To provide active tactile feedback, one or more vibratory units or force producing units may be mounted in the hand-held electronic device and activated to provide tap, index-level or other information to a user. The vibratory unit may be an electric motor with an eccentric mass attached to the motor's shaft, a solenoid, a variable reluctance device, a loudspeaker or any other vibrator that can provide tactile feedback. A force producing unit may be a solenoid in non-vibratory mode, a motor, non-vibratory actuators or any other actuator that can produce forces. A vibratory unit and/or force producing unit may be provided for each input element. In that case, the vibratory unit and/or force producing unit may be mounted below the input element so that when the vibratory unit and/or force producing unit is activated, the vibratory unit and/or force producing unit can push out the surface of the electronic device to a different level or position depending on the information to be communicated. Thus, in implementations using a pressure sensor pad or touch-pad as the input element, a stepped array may be configured to indicate higher and higher levels of index positions across the touch pad or pressure sensor pad. The vibratory units and/or force producing units may also be used to provide tactile feedback to indicate the momentary achievement of an objective, such as target lock in game applications. Tactile feedback may also be accomplished by actuators, such as a solenoid, which change the stiffness of the input element electronically or push against the user's hand or fingers to indicate an event of interest in the software application.
The computational aspects described here can be implemented in analog or digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them.
Where appropriate, aspects of these systems and techniques can be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor; and method steps can be performed by a programmable processor executing a program of instructions to perform functions by operating on input data and generating output.
The systems and techniques described above utilize the biomechanics of the thumb and fingers, i.e., they use the function of opposition, the fine motor control of the fingers, and the larger range of motion and stronger actuation provided by the thumb. By using the fingers and thumb in concert, the number of taps and the time needed to accomplish a given function are reduced, the accuracy is improved, and the natural programming inherent in the human hand replaces the training required for other systems.
A number of implementations have been described. Other implementations may include different or additional features. For example, other configurations of the one or more input elements of the first and second input assemblies may be realized. Also, the hand-held electronic devices described herein may have more or fewer than six planar or contoured surfaces. Moreover, the number of input elements in the first and second input assemblies is not limited to the number of input elements described in the implementations above.
Also, the one or more input elements of the first and second input assemblies may be any input or selection type known to one of skill in the art, such as keys, buttons, touch pads, other types of pads, rockers, sliders, dials, contact sensors or other actuators associated with any sensor. Each sensor associated with an actuator may include digital momentary on/off switches or analog sensors, such as pressure sensors (e.g., force sensitive resistors, piezo film sensors, or capacitive sensors), or positional sensors (e.g., rotary or linear potentiometers or encoders), or other analog sensors known to those of ordinary skill, or accelerometers or gyroscopes. The first and second input assemblies may include a combination of these different types of input or selection elements, which may be mounted in the configurations shown in the figures or embedded within the device to permit control through motion of the overall device.
Moreover, the methods to provide data input, device control or game control may be performed in a different order and still achieve desirable results.
Accordingly, other implementations are within the scope of the following claims.
Claims (56)
1. A hand-held electronic device comprising:
a memory configured to store a plurality of applications, wherein each application is associated with a set of functions;
a processor configured to process a selected one of the plurality of applications;
a first input assembly having a plurality of input elements on a first surface configured to receive input from a human user through manipulation of the plurality of input elements, wherein at least one of the input elements on the first surface is configured to selectively map to one or more input functions of the set of functions associated with the selected one of the plurality of applications; and a second input assembly having one or more input elements on a second surface configured to be manipulated by one or more of the human user's fingers, wherein at least one of the input elements on the second surface is further configured to be selectively mapped to one or more input functions of the set of functions corresponding to the selected one of the plurality of applications, further wherein the plurality of input elements on the first surface and the one or more input elements on the second surface are arranged so as to substantially optimize a biomechanical effect of the human user's hand, wherein at least one of the input elements of the second input assembly is a selectively configurable sensing surface so as to selectively provide one or more delineated active areas configured based on the selected application.
2. The hand-held electronic device of claim 1, wherein manipulation of a delineated active area causes the input function of one or more input elements of the first input assembly to change.
3. The hand-held electronic device of claim 2, further comprising a shape changing media configured relative to the sensor pad so as to permit the human user to tactilely discriminate between the plurality of delineated active areas.
4. The hand-held electronic device of claim 1, wherein the processor receives signals generated by the input elements of first or second input assemblies when manipulated by the human user.
5. The hand-held electronic device of claim 1 further comprising an input controller, wherein the input controller receives signals generated by the input elements of first or second input assemblies when manipulated by the human user and converts the signals into a form suitable to be interpreted by the processor.
6. The hand-held electronic device of claim 1, wherein at least one of the input elements of the second input assembly is a rotary sensor.
7. The hand-held electronic device of claim 1, wherein at least one of the input elements of the second input assembly is a D-pad.
8. The hand-held electronic device of claim 1, further comprising at least one palpable detent, wherein the detent is associated with at least one of the input elements of the first or second input assemblies so as to provide tactile feedback to the human user when the human user manipulates the input element associated with the palpable detent.
9. The hand-held electronic device of claim 1 further comprising one or more vibratory or force producing units, at least one of the vibratory or force producing units configured to provide tactile feedback upon the human user's manipulation of at least one of the input elements of the first or second input assemblies.
10. The hand-held electronic device of claim 9, wherein at least one of the vibratory units provides tactile feedback in response to events occurring in the selected application running on the processor.
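The selective mapping recited in claims 1-10 — the same physical input elements taking on different input functions depending on which application is selected — can be sketched as follows. This is a hypothetical illustration only; names such as `INPUT_MAPS` and `InputController` are invented for the example and do not appear in the patent.

```python
# Illustrative sketch of per-application input-element remapping.
# All identifiers are hypothetical; the patent does not specify an
# implementation.

INPUT_MAPS = {
    # A text application maps keys to symbol groups and a delineated
    # active area to a shifting function.
    "text": {"key_1": "abc", "key_2": "def", "area_1": "shift_1"},
    # A game application maps the same elements to game functions.
    "game": {"key_1": "jump", "key_2": "fire", "area_1": "run"},
}

class InputController:
    """Resolves manipulated input elements to their currently mapped
    functions for the selected application."""

    def __init__(self):
        self.active_map = {}

    def select_application(self, app_name):
        # Remap every input element to the selected application's
        # set of functions.
        self.active_map = INPUT_MAPS[app_name]

    def on_input(self, element_id):
        # Return the function currently mapped to the element, if any.
        return self.active_map.get(element_id)

controller = InputController()
controller.select_application("game")
print(controller.on_input("key_1"))  # -> jump
controller.select_application("text")
print(controller.on_input("key_1"))  # -> abc
```

The same element (`key_1`) thus yields a game function or a text-symbol function depending only on the selected application, as the claims describe.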
11. A hand-held electronic device comprising:
a memory configured to store a plurality of applications, wherein each application is associated with a set of functions;
a processor configured to process a selected one of the plurality of applications, wherein the set of functions associated with the selected application includes a plurality of text symbol functions and a plurality of shifting functions;
a first surface having a plurality of input elements configured to receive input from a human user through manipulation of the plurality of input elements, wherein at least one of the input elements of the first surface is further configured to selectively map to more than one text symbol function; and a second surface having one or more input elements, wherein at least one of the input elements of the second surface has one or more selectable active areas configured to be manipulated by one or more of the human user's fingers, each selectable active area configured to selectively map to a different shifting function, wherein manipulation of one of the selectable active areas causes the text symbol function of the one or more input elements of the first surface to change, further wherein the plurality of input elements of the first surface and the one or more input elements of the second surface are arranged so as to substantially optimize a biomechanical effect of the human user's hand.
12. The hand-held electronic device of claim 11 further comprising a controller, wherein the controller receives signals generated by the human user's manipulation of the input elements of the first surface or the active areas of the second surface.
13. The hand-held electronic device of claim 12 further comprising a dome cap positioned above at least one input element of the first surface or the second surface and capable of providing tactile feedback to the human user when the input element associated with the dome cap is manipulated.
14. The hand-held electronic device of claim 12 further comprising one or more vibratory units capable of providing tactile feedback.
15. The hand-held electronic device of claim 12 further comprising one or more force producing units capable of providing tactile feedback.
16. A method for configuring a human interface and input system for use with a hand-held electronic device configured to run a plurality of applications, each application associated with a set of functions, the method comprising:
disposing on a first surface a first input assembly having a plurality of input elements configured to receive input from a human user through manipulation of the plurality of input elements, wherein at least one of the input elements of the first input assembly is further configured to map to more than one input function associated with a selected one of the plurality of applications;
disposing on a second surface a second input assembly having one or more input elements configured to be manipulated by one or more of the human user's fingers, wherein at least one of the input elements of the second input assembly is further configured to selectively map to one or more input functions associated with the selected application;
and arranging the plurality of input elements of the first input assembly and the one or more input elements of the second input assembly to substantially optimize a biomechanical effect of the human user's hand.
17. The method of claim 16 further comprising: physically or electronically labeling at least one input element of the first input assembly or the second input assembly so as to visually indicate an input function that can be selectively accessed by actuating the input element.
18. The method of claim 16 further comprising connecting a controller to the input elements of the first input assembly or the second input assembly, wherein the controller is configured to receive signals generated by a manipulation of one or more of the input elements of the first input assembly or the second input assembly.
19. The method of claim 18, wherein at least one input element of the second input assembly has a plurality of active areas configurable by the controller to form a plurality of delineated active areas.
20. The method of claim 19 further comprising positioning a shape changing media relative to the one input element of the second input assembly having a plurality of active areas so as to permit the human user to tactilely discriminate between the plurality of delineated active areas.
21. The method of claim 18 further comprising positioning a palpable detent with at least one input element of the first input assembly or the second input assembly so as to provide tactile feedback when manipulated by the human user.
22. A method for inputting data on a hand-held electronic device having a first surface with a plurality of input elements configured to receive input from a human user through manipulation of the plurality of input elements, wherein at least one of the input elements is further configured to map to a plurality of symbols in a data input mode, wherein each of the plurality of symbols is associated with a unique index position identifier, and a second surface having one or more selection elements configured to be manipulated by one or more of the human user's fingers, wherein each selection element corresponds to one of the unique index position identifiers, further wherein the plurality of input elements and the one or more selection elements are arranged to substantially optimize a biomechanical effect of the human user's hand, the method comprising:
executing a selected application from a plurality of applications, wherein the selected application is associated with a set of functions;
determining the index position identifier of a desired symbol to be inputted based on the functions associated with the selected application;
pressing the selection element corresponding to the index position identifier of the desired symbol with any digit or object held in the human user's hand; and pressing the input element configured to map to the desired symbol with any digit or object held in the human user's hand.
23. The method of claim 22, wherein each input element is physically or electronically labeled indicating each symbol that is mapped to by the input element and a positional order in which each symbol can be selectively accessed by actuating the input element.
24. The method of claim 22, wherein determining the index position identifier of the desired symbol to be inputted comprises:
locating the input element configured to map to the desired symbol;
and counting from left to right the number of symbols preceding the desired symbol labeled on the located input element, wherein the index position identifier of the desired symbol is the number of symbols preceding the desired symbol plus one.
25. The method of claim 22, wherein at least one of the input elements or selection elements is further configured to map to a plurality of modes corresponding to the selected application executing on the hand-held electronic device, at least one of the modes is the data input mode, the method further comprising enabling the data input mode.
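The counting rule of claim 24 — the index position identifier of a symbol is the number of symbols labeled before it on the input element, counted left to right, plus one — is a simple algorithm. The helper below is a hypothetical sketch of that rule; the function name and key labels are invented for illustration.

```python
# Hypothetical implementation of the counting rule of claim 24.
# The index position identifier of a symbol on a multi-symbol input
# element is the count of symbols preceding it (left to right) plus one.

def index_position_identifier(key_label, desired_symbol):
    """Return the 1-based position of desired_symbol on key_label."""
    preceding = key_label.index(desired_symbol)  # symbols before it
    return preceding + 1

# On an input element labeled "abc", inputting "b" would require the
# selection element for index position 2, then the "abc" element.
assert index_position_identifier("abc", "a") == 1
assert index_position_identifier("abc", "b") == 2
assert index_position_identifier("abc", "c") == 3
```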
26. A method for a human user to input data on a hand-held electronic device using an interface and input system comprising a plurality of input elements in a thumb-manipulated assembly to substantially optimize a biomechanical effect of the human user's thumb and fingers, wherein at least one input element is mapped to more than one text function, and one or more selection elements in a finger-manipulated input assembly, wherein each selection element is mapped to a unique shift position, the method comprising:
executing a selected text application from a plurality of applications, wherein the selected application is associated with a set of functions;
pressing a desired selection element of the finger-manipulated input assembly with a human finger to select a desired shift position of the selected text application; and pressing a desired input element of the thumb-manipulated input assembly with a human thumb to input a desired text character.
27. A hand-held electronic device comprising:
a memory configured to store a plurality of applications, wherein each application is associated with a set of functions;
a processor configured to process a selected one of the plurality of applications;
a first input assembly disposed on a first surface of the electronic device, wherein the first input assembly comprises a plurality of input elements configured to be actuated by a human user's hand, wherein at least one of the input elements of the first input assembly is configured to map to one or more input functions of the set of functions associated with the selected one of the plurality of applications; and a second input assembly disposed on a second surface so as to substantially optimize a biomechanical effect of the human user's hand, wherein the second input assembly comprises one or more input elements configured to be manipulated by one or more of the human user's fingers, wherein at least one of the input elements of the second input assembly is a selectively configurable sensing surface so as to provide a plurality of delineated active areas, further wherein one or more of the delineated active areas is mapped to one or more functions associated with the selected application, further wherein the memory is further configured to store for each application a mapping of the selectively configurable sensing surface to the plurality of delineated active areas.
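Claim 27 (together with claims 54 and 55) has the memory store, per application, a mapping of the selectively configurable sensing surface to its delineated active areas, whose number and shape may vary with the application. The data structure below is a hypothetical sketch of such stored configurations; all field names and coordinates are illustrative assumptions, not taken from the patent.

```python
# Hypothetical stored configurations of the selectively configurable
# sensing surface: each application defines its own delineated active
# areas (here, bands of a normalized coordinate) and the function
# mapped to each area. All names and values are illustrative.

SENSING_SURFACE_CONFIGS = {
    "text": {
        # Four bands, one per shifting function.
        "areas": [
            {"bounds": (0.00, 0.25), "function": "shift_1"},
            {"bounds": (0.25, 0.50), "function": "shift_2"},
            {"bounds": (0.50, 0.75), "function": "shift_3"},
            {"bounds": (0.75, 1.00), "function": "shift_4"},
        ],
    },
    "game": {
        # Fewer, larger areas mapped to game functions.
        "areas": [
            {"bounds": (0.0, 0.5), "function": "weapon_fire"},
            {"bounds": (0.5, 1.0), "function": "jump"},
        ],
    },
}

def resolve_area(app_name, touch_pos):
    """Map a normalized touch coordinate to the function of the
    delineated active area it falls in, for the selected application."""
    for area in SENSING_SURFACE_CONFIGS[app_name]["areas"]:
        lo, hi = area["bounds"]
        if lo <= touch_pos < hi:
            return area["function"]
    return None

print(resolve_area("game", 0.7))  # -> jump
print(resolve_area("text", 0.7))  # -> shift_3
```

The same physical touch position resolves to different functions, and a different number of areas, depending on which application's stored mapping is active.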
28. The hand-held electronic device of claim 1, wherein the selected one of the plurality of applications is a text application; and the one or more input elements on the second surface of the second input assembly comprises one or more selection elements, wherein manipulation of the one or more selection elements causes the input elements on the first surface of the first input assembly to be selectively mapped from one text function to another text function.
29. The hand-held electronic device of claim 1, wherein the selected one of the plurality of applications is a game application, and at least one of the plurality of input elements of the first input assembly and at least one of the input elements of the second input assembly are each configured to selectively map to one or more game functions.
30. The hand-held electronic device of claim 27, further comprising: an input controller, wherein the input controller receives a plurality of signals generated by the input elements of the first input assembly and the second input assembly when manipulated by the human user, and converts the plurality of signals into a form suitable to be interpreted by the processor.
31. The hand-held electronic device of claim 30, wherein at least one input element of the first input assembly or the second input assembly is configured to map to one or more input functions associated with the selected application that control a cursor on a screen.
32. The hand-held electronic device of claim 30, wherein the selected one of the plurality of applications is a game application.
33. The hand-held electronic device of claim 32, wherein at least one input element of the first input assembly or the second input assembly is configured to map to one or more input functions associated with the game application that control a game character on a screen.
34. The hand-held electronic device of claim 32, wherein the input controller is further configured to interpret a movement of the human user's finger sliding across two or more delineated active areas as a change in the mapped function of the two or more delineated active areas, wherein the mapped function is at least one of a speed control, a size control, a weapon fire control, and a position control.
35. The hand-held electronic device of claim 32, wherein the input controller is further configured to interpret a pressure applied by the human user's finger on a selected one of the delineated active areas as a change in the mapped function of the selected delineated active area, wherein the mapped function is at least one of a speed control, a size control, a weapon fire control, and a position control.
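Claims 34 and 35 describe the input controller interpreting a finger slide across two or more delineated active areas, or a varying pressure on one area, as a change in the mapped control (speed, size, weapon fire, position). The functions below are a hypothetical sketch of such interpretation logic; the threshold value and all names are illustrative assumptions.

```python
# Hypothetical gesture interpretation for the delineated active areas
# (claims 34-35). Thresholds and identifiers are illustrative only.

PRESSURE_THRESHOLD = 0.5  # normalized pressure above which a fire control engages

def interpret_slide(areas_crossed):
    """A slide spanning two or more delineated active areas is read as
    a change in the mapped control, here scaling a speed control by the
    number of areas crossed."""
    if len(areas_crossed) >= 2:
        return {"control": "speed", "level": len(areas_crossed)}
    return None  # a touch within a single area is not a slide gesture

def interpret_pressure(area_function, pressure):
    """Pressure on a selected delineated active area changes its mapped
    control: a fire control engages above a threshold, while other
    controls scale continuously with pressure."""
    if area_function == "weapon_fire":
        return {"control": "weapon_fire",
                "active": pressure > PRESSURE_THRESHOLD}
    return {"control": area_function, "level": pressure}

print(interpret_slide(["area_1", "area_2", "area_3"]))
# -> {'control': 'speed', 'level': 3}
```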
36. The hand-held electronic device of claim 32, wherein at least one of the functions mapped to the input element of the first input assembly is a game function that is substantially optimized for actuation by the human user's thumb.
37. The hand-held electronic device of claim 36, wherein the game function that is substantially optimized for actuation by the human user's thumb comprises a directional control.
38. The hand-held electronic device of claim 32, wherein at least one of the functions mapped to the delineated active areas is a game function that is substantially optimized for actuation by one or more of the human user's fingers.
39. The hand-held electronic device of claim 38, wherein the game function that is substantially optimized for actuation by one or more of the human user's fingers comprises a weapon fire control.
40. The hand-held electronic device of claim 38, wherein the game function that is substantially optimized for actuation by one or more of the human user's fingers comprises a game character jump control.
41. A method for configuring a human interface and input system for use with a hand-held electronic device configured to run a plurality of applications, each application associated with a set of functions, the method comprising:
disposing on a first surface a first input assembly having a plurality of input elements configured to receive input from a human user's hand through manipulation of the plurality of input elements, wherein at least one of the input elements of the first input assembly is further configured to map to more than one input function associated with a selected one of the plurality of applications;
disposing on a second surface a second input assembly having one or more input elements configured to be manipulated by one or more of the human user's fingers, wherein at least one of the input elements of the second input assembly is further configured to selectively map to one or more input functions associated with the selected application;
and mapping the set of functions of the selected application to the one or more input elements of the first input assembly and the second input assembly to substantially optimize a biomechanical effect of the human user's hand.
42. The method of claim 41, wherein the selected application is at least one of a scrolling application, a text application and a game application.
43. The method of claim 42 further comprising: physically or electronically labeling at least one input element of the first input assembly or the second input assembly so as to visually indicate an input function that can be selectively accessed by actuating the input element.
44. The method of claim 42 further comprising connecting a controller to the input elements of the first input assembly or the second input assembly, wherein the controller is configured to receive signals generated by a manipulation of one or more of the input elements of the first input assembly or the second input assembly.
45. The method of claim 44, wherein at least one input element of the second input assembly has a plurality of active areas configurable by the controller to form a plurality of delineated active areas.
46. The method of claim 45 further comprising positioning a shape changing media relative to the one input element of the second input assembly having a plurality of active areas so as to permit the human user to tactilely discriminate between the plurality of delineated active areas.
47. The hand-held electronic device of claim 1, wherein the processor is further configured to be communicatively coupled to a host electronic device.
48. The hand-held electronic device of claim 11, wherein the processor is further configured to be communicatively coupled to a host electronic device.
49. The method of claim 18, wherein the controller is further configured to be communicatively coupled to a host electronic device.
50. The method of claim 22, wherein the hand-held electronic device is further configured to be communicatively coupled to a host electronic device.
51. The method of claim 26, wherein the hand-held electronic device is further configured to interface with a host electronic device.
52. The hand-held electronic device of claim 27, wherein the processor is further configured to interface with a host electronic device.
53. The method of claim 41, wherein the hand-held electronic device is configured to interface with a host electronic device.
54. The hand-held electronic device of claim 1, wherein the selectively configurable sensing surface is configured to selectively vary a shape of the provided one or more delineated active areas based on the selected application.
55. The hand-held electronic device of claim 1, wherein the selectively configurable sensing surface is configured to selectively vary a number of the delineated active areas based on the selected application.
56. The hand-held electronic device of claim 1, wherein the selectively configurable sensing surface selectively provides the one or more delineated active areas customized for a user's hand.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/699,555 US7218313B2 (en) | 2003-10-31 | 2003-10-31 | Human interface system |
US10/699,555 | 2003-10-31 | ||
PCT/US2004/036163 WO2005043371A2 (en) | 2003-10-31 | 2004-10-29 | Human interface system |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2543191A1 CA2543191A1 (en) | 2005-05-12 |
CA2543191C true CA2543191C (en) | 2011-10-25 |
Family
ID=34550998
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2543191A Expired - Fee Related CA2543191C (en) | 2003-10-31 | 2004-10-29 | Human interface system |
Country Status (8)
Country | Link |
---|---|
US (3) | US7218313B2 (en) |
EP (2) | EP2093646A1 (en) |
JP (1) | JP2007510233A (en) |
KR (3) | KR20060096451A (en) |
CN (2) | CN101140481B (en) |
CA (1) | CA2543191C (en) |
HK (1) | HK1115652A1 (en) |
WO (1) | WO2005043371A2 (en) |
Families Citing this family (168)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2495038C (en) * | 2002-09-30 | 2012-11-06 | Microsoft Corporation | System and method for making user interface elements known to an application and user |
US7218313B2 (en) * | 2003-10-31 | 2007-05-15 | Zeetoo, Inc. | Human interface system |
US20050130604A1 (en) * | 2003-11-26 | 2005-06-16 | Nokia Corporation | Electronic device having a physical configuration that is dependent upon its operational status |
WO2005102088A1 (en) * | 2004-04-19 | 2005-11-03 | 4Sight, Inc. | Hand covering features for the manipulation of small devices |
US7321360B1 (en) | 2004-05-24 | 2008-01-22 | Michael Goren | Systems, methods and devices for efficient communication utilizing a reduced number of selectable inputs |
TWI291642B (en) * | 2004-07-15 | 2007-12-21 | N trig ltd | A tracking window for a digitizer system |
JP4795343B2 (en) * | 2004-07-15 | 2011-10-19 | エヌ−トリグ リミテッド | Automatic switching of dual mode digitizer |
US7671836B2 (en) * | 2005-01-03 | 2010-03-02 | Nokia Corporation | Cell phone with shiftable keypad |
EP1920408A2 (en) * | 2005-08-02 | 2008-05-14 | Ipifini, Inc. | Input device having multifunctional keys |
US7669770B2 (en) * | 2005-09-06 | 2010-03-02 | Zeemote, Inc. | Method of remapping the input elements of a hand-held device |
US8142287B2 (en) * | 2005-10-11 | 2012-03-27 | Zeemote Technology Inc. | Universal controller for toys and games |
US7652660B2 (en) * | 2005-10-11 | 2010-01-26 | Fish & Richardson P.C. | Mobile device customizer |
US7280097B2 (en) * | 2005-10-11 | 2007-10-09 | Zeetoo, Inc. | Human interface input acceleration system |
US7649522B2 (en) * | 2005-10-11 | 2010-01-19 | Fish & Richardson P.C. | Human interface input acceleration system |
KR100837162B1 (en) * | 2005-10-28 | 2008-06-11 | 엘지전자 주식회사 | Communication Terminal with Multi-input Device |
US10048860B2 (en) * | 2006-04-06 | 2018-08-14 | Google Technology Holdings LLC | Method and apparatus for user interface adaptation |
JP2007293407A (en) * | 2006-04-21 | 2007-11-08 | E-Lead Electronic Co Ltd | Shouldering type input device |
US20070290992A1 (en) * | 2006-06-16 | 2007-12-20 | Creative Technology Ltd | Control interface for media player |
US9355568B2 (en) * | 2006-11-13 | 2016-05-31 | Joyce S. Stone | Systems and methods for providing an electronic reader having interactive and educational features |
US20080259028A1 (en) * | 2007-04-19 | 2008-10-23 | Brenda Teepell | Hand glove mouse |
US20080282446A1 (en) * | 2007-05-15 | 2008-11-20 | 180S, Inc. | Hand Covering With Tactility Features |
EP2017688B1 (en) * | 2007-06-16 | 2012-06-06 | RAFI GmbH & Co. KG | Device for creating electrically evaluable control signals |
US8081164B2 (en) * | 2007-07-02 | 2011-12-20 | Research In Motion Limited | Controlling user input devices based upon detected attitude of a handheld electronic device |
US20090125848A1 (en) * | 2007-11-14 | 2009-05-14 | Susann Marie Keohane | Touch surface-sensitive edit system |
US9513765B2 (en) * | 2007-12-07 | 2016-12-06 | Sony Corporation | Three-dimensional sliding object arrangement method and system |
US8336119B2 (en) * | 2007-12-09 | 2012-12-25 | 180's. Inc. | Hand covering with conductive portion |
US9003567B2 (en) | 2007-12-09 | 2015-04-14 | 180S, Inc. | Hand covering with tactility features |
US9423875B2 (en) | 2008-01-04 | 2016-08-23 | Tactus Technology, Inc. | Dynamic tactile interface with exhibiting optical dispersion characteristics |
US9760172B2 (en) | 2008-01-04 | 2017-09-12 | Tactus Technology, Inc. | Dynamic tactile interface |
US9557915B2 (en) | 2008-01-04 | 2017-01-31 | Tactus Technology, Inc. | Dynamic tactile interface |
US8456438B2 (en) | 2008-01-04 | 2013-06-04 | Tactus Technology, Inc. | User interface system |
US9274612B2 (en) | 2008-01-04 | 2016-03-01 | Tactus Technology, Inc. | User interface system |
US9063627B2 (en) | 2008-01-04 | 2015-06-23 | Tactus Technology, Inc. | User interface and methods |
US9588683B2 (en) | 2008-01-04 | 2017-03-07 | Tactus Technology, Inc. | Dynamic tactile interface |
US9612659B2 (en) | 2008-01-04 | 2017-04-04 | Tactus Technology, Inc. | User interface system |
US8570295B2 (en) | 2008-01-04 | 2013-10-29 | Tactus Technology, Inc. | User interface system |
US9552065B2 (en) | 2008-01-04 | 2017-01-24 | Tactus Technology, Inc. | Dynamic tactile interface |
US8553005B2 (en) | 2008-01-04 | 2013-10-08 | Tactus Technology, Inc. | User interface system |
US8547339B2 (en) | 2008-01-04 | 2013-10-01 | Tactus Technology, Inc. | System and methods for raised touch screens |
US8922510B2 (en) | 2008-01-04 | 2014-12-30 | Tactus Technology, Inc. | User interface system |
US8154527B2 (en) | 2008-01-04 | 2012-04-10 | Tactus Technology | User interface system |
US9052790B2 (en) | 2008-01-04 | 2015-06-09 | Tactus Technology, Inc. | User interface and methods |
US9720501B2 (en) | 2008-01-04 | 2017-08-01 | Tactus Technology, Inc. | Dynamic tactile interface |
US9298261B2 (en) | 2008-01-04 | 2016-03-29 | Tactus Technology, Inc. | Method for actuating a tactile interface layer |
US8947383B2 (en) | 2008-01-04 | 2015-02-03 | Tactus Technology, Inc. | User interface system and method |
US9367132B2 (en) | 2008-01-04 | 2016-06-14 | Tactus Technology, Inc. | User interface system |
US8243038B2 (en) | 2009-07-03 | 2012-08-14 | Tactus Technologies | Method for adjusting the user interface of a device |
US9430074B2 (en) | 2008-01-04 | 2016-08-30 | Tactus Technology, Inc. | Dynamic tactile interface |
US8004501B2 (en) * | 2008-01-21 | 2011-08-23 | Sony Computer Entertainment America Llc | Hand-held device with touchscreen and digital tactile pixels |
KR101506162B1 (en) * | 2008-01-23 | 2015-03-26 | 삼성전자주식회사 | Portable terminal using qwerty key and method for setting and inputting symbol thereof |
US20090203318A1 (en) * | 2008-02-11 | 2009-08-13 | Haan Ido De | Headset |
BRPI0804355A2 (en) | 2008-03-10 | 2009-11-03 | Lg Electronics Inc | terminal and control method |
KR100956826B1 (en) * | 2008-03-10 | 2010-05-11 | 엘지전자 주식회사 | Terminal and method for controlling the same |
US20090243897A1 (en) * | 2008-04-01 | 2009-10-01 | Davidson Wayne A | Method and apparatus for entering alphanumeric data via keypads or display screens |
JP5203797B2 (en) * | 2008-05-13 | 2013-06-05 | 株式会社エヌ・ティ・ティ・ドコモ | Information processing apparatus and display information editing method for information processing apparatus |
KR100939063B1 (en) * | 2008-06-10 | 2010-01-28 | 한국과학기술원 | Haptic feedback device and haptic feedback providing method thereof |
US8281046B2 (en) | 2008-07-03 | 2012-10-02 | Steelseries Aps | System and method for distributing user interface device configurations |
US7925797B2 (en) | 2008-07-03 | 2011-04-12 | Steelseries Hq | System and method for distributing user interface device configurations |
US8342926B2 (en) | 2008-07-13 | 2013-01-01 | Sony Computer Entertainment America Llc | Game aim assist |
US8498425B2 (en) * | 2008-08-13 | 2013-07-30 | Onvocal Inc | Wearable headset with self-contained vocal feedback and vocal command |
KR101292719B1 (en) * | 2008-09-03 | 2013-08-01 | 에스케이플래닛 주식회사 | Side touch interface device and method |
KR20100039743A (en) * | 2008-10-08 | 2010-04-16 | 삼성전자주식회사 | Display apparatus and method of displaying thereof |
KR20100048090A (en) * | 2008-10-30 | 2010-05-11 | 삼성전자주식회사 | Interface apparatus for generating control command by touch and motion, interface system including the interface apparatus, and interface method using the same |
US20100109915A1 (en) * | 2008-10-31 | 2010-05-06 | Scarboro John E | Rapid Typing System for a Hand-held Electronic Device |
US20100123662A1 (en) * | 2008-11-14 | 2010-05-20 | Sony Ericsson Mobile Communications Ab | Method and apparatus for providing a user interface on a mobile device |
US20100123658A1 (en) * | 2008-11-17 | 2010-05-20 | Sony Ericsson Mobile Communications Ab | Portable communication device having a touch-sensitive input device with non-linear active areas |
JP4633166B2 (en) | 2008-12-22 | 2011-02-16 | 京セラ株式会社 | Input device and control method of input device |
US9588684B2 (en) | 2009-01-05 | 2017-03-07 | Tactus Technology, Inc. | Tactile interface for a computing device |
EP2390768B1 (en) * | 2009-01-20 | 2014-08-20 | Nec Corporation | Input device, information processing device, input method, and program |
JP5233708B2 (en) * | 2009-02-04 | 2013-07-10 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
US8917239B2 (en) | 2012-10-14 | 2014-12-23 | Neonode Inc. | Removable protective cover with embedded proximity sensors |
US8643628B1 (en) | 2012-10-14 | 2014-02-04 | Neonode Inc. | Light-based proximity detection system and user interface |
US9489046B2 (en) * | 2009-05-04 | 2016-11-08 | Immersion Corporation | Method and apparatus for providing haptic feedback to non-input locations |
TW201040788A (en) * | 2009-05-05 | 2010-11-16 | Inventec Appliances Corp | Method to control a handheld electronic device |
US10401961B2 (en) | 2009-06-09 | 2019-09-03 | Immersion Corporation | Method and apparatus for generating haptic effects using actuators |
US9891708B2 (en) | 2009-06-09 | 2018-02-13 | Immersion Corporation | Method and apparatus for generating haptic effects using actuators |
TW201044226A (en) * | 2009-06-10 | 2010-12-16 | Weistech Technology Co Ltd | Integrated wired/wireless virtual unit control apparatus and method |
US8265717B2 (en) | 2009-06-26 | 2012-09-11 | Motorola Mobility Llc | Implementation of touchpad on rear surface of single-axis hinged device |
US9024908B2 (en) * | 2009-06-30 | 2015-05-05 | Microsoft Technology Licensing, Llc | Tactile feedback display screen overlay |
US20100328219A1 (en) * | 2009-06-30 | 2010-12-30 | Motorola, Inc. | Method for Integrating an Imager and Flash into a Keypad on a Portable Device |
CN102483675B (en) | 2009-07-03 | 2015-09-09 | 泰克图斯科技公司 | User interface strengthens system |
US8095191B2 (en) * | 2009-07-06 | 2012-01-10 | Motorola Mobility, Inc. | Detection and function of seven self-supported orientations in a portable device |
US20110014983A1 (en) * | 2009-07-14 | 2011-01-20 | Sony Computer Entertainment America Inc. | Method and apparatus for multi-touch game commands |
US8497884B2 (en) * | 2009-07-20 | 2013-07-30 | Motorola Mobility Llc | Electronic device and method for manipulating graphic user interface elements |
US8462126B2 (en) * | 2009-07-20 | 2013-06-11 | Motorola Mobility Llc | Method for implementing zoom functionality on a portable device with opposing touch sensitive surfaces |
JP4633184B1 (en) * | 2009-07-29 | 2011-02-23 | 京セラ株式会社 | Input device and control method of input device |
US10191547B2 (en) * | 2009-08-27 | 2019-01-29 | Kyocera Corporation | Tactile sensation providing apparatus and control method for tactile sensation providing apparatus |
US9317116B2 (en) * | 2009-09-09 | 2016-04-19 | Immersion Corporation | Systems and methods for haptically-enhanced text interfaces |
WO2011087816A1 (en) | 2009-12-21 | 2011-07-21 | Tactus Technology | User interface system |
EP2517089A4 (en) | 2009-12-21 | 2016-03-09 | Tactus Technology | User interface system |
US20110161809A1 (en) * | 2009-12-30 | 2011-06-30 | Gilmour Daniel A | Hand-held electronic device |
US9239623B2 (en) | 2010-01-05 | 2016-01-19 | Tactus Technology, Inc. | Dynamic tactile interface |
US8619035B2 (en) * | 2010-02-10 | 2013-12-31 | Tactus Technology, Inc. | Method for assisting user input to a device |
US8884790B2 (en) * | 2010-03-03 | 2014-11-11 | Twitch Technologies Llc | Matrix keyboarding system |
US9342241B2 (en) | 2010-03-03 | 2016-05-17 | Twitch Technologies Llc | Matrix keyboarding system |
KR101109313B1 (en) * | 2010-04-14 | 2012-01-31 | 삼성전기주식회사 | Display apparatus having touch screen panel |
WO2011133605A1 (en) | 2010-04-19 | 2011-10-27 | Tactus Technology | Method of actuating a tactile interface layer |
WO2011133604A1 (en) | 2010-04-19 | 2011-10-27 | Tactus Technology | User interface system |
US9542032B2 (en) | 2010-04-23 | 2017-01-10 | Handscape Inc. | Method using a predicted finger location above a touchpad for controlling a computerized system |
US9310905B2 (en) * | 2010-04-23 | 2016-04-12 | Handscape Inc. | Detachable back mounted touchpad for a handheld computerized device |
US9639195B2 (en) | 2010-04-23 | 2017-05-02 | Handscape Inc. | Method using finger force upon a touchpad for controlling a computerized system |
US9311724B2 (en) | 2010-04-23 | 2016-04-12 | Handscape Inc. | Method for user input from alternative touchpads of a handheld computerized device |
US9891820B2 (en) | 2010-04-23 | 2018-02-13 | Handscape Inc. | Method for controlling a virtual keyboard from a touchpad of a computerized device |
US9529523B2 (en) | 2010-04-23 | 2016-12-27 | Handscape Inc. | Method using a finger above a touchpad for controlling a computerized system |
US9891821B2 (en) | 2010-04-23 | 2018-02-13 | Handscape Inc. | Method for controlling a control region of a computerized device from a touchpad |
US9678662B2 (en) | 2010-04-23 | 2017-06-13 | Handscape Inc. | Method for detecting user gestures from alternative touchpads of a handheld computerized device |
US8390573B2 (en) * | 2010-04-26 | 2013-03-05 | Chris Trout | Data processing device |
FR2959838A1 (en) * | 2010-05-04 | 2011-11-11 | Amtek System Co Ltd | Wireless keyboard for use as wireless input device for home computer, has touchpad operated by finger tips of middle and index fingers and controlling cursor similar to computer mouse, where curved wings are provided at sides of board |
US8633907B2 (en) * | 2010-07-06 | 2014-01-21 | Padmanabhan Mahalingam | Touch screen overlay for visually impaired persons |
US8732697B2 (en) | 2010-08-04 | 2014-05-20 | Premkumar Jonnala | System, method and apparatus for managing applications on a device |
WO2012044714A1 (en) * | 2010-10-01 | 2012-04-05 | Imerj LLC | Pinch gesture to swap windows |
WO2012054780A1 (en) | 2010-10-20 | 2012-04-26 | Tactus Technology | User interface system |
CN103124946B (en) | 2010-10-20 | 2016-06-29 | 泰克图斯科技公司 | User interface system and method |
US9618972B2 (en) * | 2011-01-20 | 2017-04-11 | Blackberry Limited | Three-dimensional, multi-depth presentation of icons in association with differing input components of a user interface |
US8493357B2 (en) * | 2011-03-04 | 2013-07-23 | Integrated Device Technology, Inc | Mechanical means for providing haptic feedback in connection with capacitive sensing mechanisms |
JP5563153B2 (en) * | 2011-03-30 | 2014-07-30 | 本田技研工業株式会社 | Operating device |
JP5894380B2 (en) * | 2011-06-15 | 2016-03-30 | 株式会社スクウェア・エニックス | Video game processing apparatus and video game processing program |
US20150277597A1 (en) * | 2011-09-01 | 2015-10-01 | Handscape Inc. | Touchpad hand detector |
EP2570893A1 (en) * | 2011-09-16 | 2013-03-20 | Research In Motion Limited | Electronic device and method of character selection |
US8866747B2 (en) | 2011-09-16 | 2014-10-21 | Blackberry Limited | Electronic device and method of character selection |
US9367085B2 (en) | 2012-01-26 | 2016-06-14 | Google Technology Holdings LLC | Portable electronic device and method for controlling operation thereof taking into account which limb possesses the electronic device |
US9785273B2 (en) * | 2012-01-27 | 2017-10-10 | Visteon Global Technologies, Inc. | Touch surface and microprocessor assembly |
US9778841B2 (en) | 2012-02-10 | 2017-10-03 | Hand Held Products, Inc. | Apparatus having random ordered keypad |
US8803831B1 (en) * | 2012-03-23 | 2014-08-12 | Google Inc. | Chording sheath for computing device |
US9493342B2 (en) | 2012-06-21 | 2016-11-15 | Nextinput, Inc. | Wafer level MEMS force dies |
EP2870445A1 (en) | 2012-07-05 | 2015-05-13 | Ian Campbell | Microelectromechanical load sensor and methods of manufacturing the same |
US9081542B2 (en) | 2012-08-28 | 2015-07-14 | Google Technology Holdings LLC | Systems and methods for a wearable touch-sensitive device |
US9959038B2 (en) | 2012-08-30 | 2018-05-01 | Google Llc | Displaying a graphic keyboard |
US20140078086A1 (en) * | 2012-09-20 | 2014-03-20 | Marvell World Trade Ltd. | Augmented touch control for hand-held devices |
WO2014047656A2 (en) | 2012-09-24 | 2014-03-27 | Tactus Technology, Inc. | Dynamic tactile interface and methods |
US9405417B2 (en) | 2012-09-24 | 2016-08-02 | Tactus Technology, Inc. | Dynamic tactile interface and methods |
US9741184B2 (en) | 2012-10-14 | 2017-08-22 | Neonode Inc. | Door handle with optical proximity sensors |
US10585530B2 (en) | 2014-09-23 | 2020-03-10 | Neonode Inc. | Optical proximity sensor |
US9164625B2 (en) | 2012-10-14 | 2015-10-20 | Neonode Inc. | Proximity sensor for determining two-dimensional coordinates of a proximal object |
US10324565B2 (en) | 2013-05-30 | 2019-06-18 | Neonode Inc. | Optical proximity sensor |
US10282034B2 (en) | 2012-10-14 | 2019-05-07 | Neonode Inc. | Touch sensitive curved and flexible displays |
US9921661B2 (en) | 2012-10-14 | 2018-03-20 | Neonode Inc. | Optical proximity sensor and associated user interface |
PT2735956E (en) * | 2012-11-23 | 2015-10-20 | Ericsson Telefon Ab L M | Adaptable input |
US9041647B2 (en) * | 2013-03-15 | 2015-05-26 | Immersion Corporation | User interface device provided with surface haptic sensations |
WO2014176370A2 (en) | 2013-04-23 | 2014-10-30 | Handscape Inc. | Method for user input from alternative touchpads of a computerized system |
US9215302B2 (en) | 2013-05-10 | 2015-12-15 | Google Technology Holdings LLC | Method and device for determining user handedness and controlling a user interface |
US9557813B2 (en) | 2013-06-28 | 2017-01-31 | Tactus Technology, Inc. | Method for reducing perceived optical distortion |
DE102013012176A1 (en) * | 2013-07-22 | 2015-01-22 | Jungheinrich Aktiengesellschaft | Operating element for an industrial truck |
JP6217274B2 (en) * | 2013-09-20 | 2017-10-25 | カシオ計算機株式会社 | Portable terminal device and program |
US9902611B2 (en) | 2014-01-13 | 2018-02-27 | Nextinput, Inc. | Miniaturized and ruggedized wafer level MEMs force sensors |
US10146330B2 (en) | 2014-06-18 | 2018-12-04 | Matthew Swan Lawrence | Systems and methods for character and command input |
US9971496B2 (en) | 2014-08-04 | 2018-05-15 | Google Technology Holdings LLC | Method and apparatus for adjusting a graphical user interface on an electronic device |
KR101637285B1 (en) * | 2014-11-28 | 2016-07-07 | 현대자동차 주식회사 | Control panel for providing shortcut function |
WO2016201235A1 (en) | 2015-06-10 | 2016-12-15 | Nextinput, Inc. | Ruggedized wafer level mems force sensor with a tolerance trench |
CN105528071A (en) * | 2015-11-30 | 2016-04-27 | 广东欧珀移动通信有限公司 | Control method and electronic device |
WO2017150127A1 (en) * | 2016-03-04 | 2017-09-08 | 株式会社ソニー・インタラクティブエンタテインメント | Control apparatus and control program |
TWI618995B (en) * | 2016-04-18 | 2018-03-21 | Kita Sensor Tech Co Ltd | Pressure sensor and control system |
US10393600B2 (en) | 2016-12-15 | 2019-08-27 | Htc Corporation | Portable electronic device and sensing method thereof |
CN116907693A (en) | 2017-02-09 | 2023-10-20 | 触控解决方案股份有限公司 | Integrated digital force sensor and related manufacturing method |
US11243125B2 (en) | 2017-02-09 | 2022-02-08 | Nextinput, Inc. | Integrated piezoresistive and piezoelectric fusion force sensor |
GB2562758B (en) * | 2017-05-24 | 2021-05-12 | Sony Interactive Entertainment Inc | Input device and method |
CN111448446B (en) | 2017-07-19 | 2022-08-30 | 触控解决方案股份有限公司 | Strain transferring stack in MEMS force sensor |
US11423686B2 (en) | 2017-07-25 | 2022-08-23 | Qorvo Us, Inc. | Integrated fingerprint and force sensor |
DE102017116830A1 (en) * | 2017-07-25 | 2019-01-31 | Liebherr-Hydraulikbagger Gmbh | Operating device for a work machine |
WO2019023552A1 (en) | 2017-07-27 | 2019-01-31 | Nextinput, Inc. | A wafer bonded piezoresistive and piezoelectric force sensor and related methods of manufacture |
US11579028B2 (en) | 2017-10-17 | 2023-02-14 | Nextinput, Inc. | Temperature coefficient of offset compensation for force sensor and strain gauge |
WO2019090057A1 (en) | 2017-11-02 | 2019-05-09 | Nextinput, Inc. | Sealed force sensor with etch stop layer |
WO2019099821A1 (en) | 2017-11-16 | 2019-05-23 | Nextinput, Inc. | Force attenuator for force sensor |
EP3866941A1 (en) * | 2018-10-19 | 2021-08-25 | Hit Box. L.L.C. | Ergonomic game controller and system |
US10962427B2 (en) | 2019-01-10 | 2021-03-30 | Nextinput, Inc. | Slotted MEMS force sensor |
CN110399088B (en) * | 2019-07-31 | 2021-09-14 | 联想(北京)有限公司 | Information processing method and device applied to electronic equipment, electronic equipment and medium |
CN110515319A (en) * | 2019-10-12 | 2019-11-29 | 广州市紫霏洋电子产品有限公司 | A kind of control method of adjusting knob, device and processing terminal |
JP2023504590A (en) | 2019-12-31 | 2023-02-03 | ネオノード インコーポレイテッド | Contactless touch input system |
Family Cites Families (76)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4912462A (en) | 1982-07-29 | 1990-03-27 | Sharp Kabushiki Kaisha | Letter input device for electronic word retrieval device |
US4891777A (en) | 1983-05-11 | 1990-01-02 | The Laitram Corporation | Single hand keyboard arrays providing alphanumeric capabilities from twelve keys |
GB8423427D0 (en) | 1984-09-17 | 1984-10-24 | Jones P S | Music synthesizer |
JPS62258136A (en) | 1986-04-30 | 1987-11-10 | Mazda Motor Corp | Fuel feed control device for engine |
US4823311A (en) | 1986-05-30 | 1989-04-18 | Texas Instruments Incorporated | Calculator keyboard with user definable function keys and with programmably alterable interactive labels for certain function keys |
US4896554A (en) | 1987-11-03 | 1990-01-30 | Culver Craig F | Multifunction tactile manipulatable control |
AU6416390A (en) | 1989-09-28 | 1991-04-28 | Kurt W. Biller | Data input keyboard |
US6005496A (en) | 1991-04-10 | 1999-12-21 | Kinesis Corporation | Ergonomic keyboard apparatus |
US5189416A (en) | 1991-04-29 | 1993-02-23 | Walker-Estes Corporation | Chordal keyboard method and apparatus |
US5365589A (en) | 1992-02-07 | 1994-11-15 | Gutowitz Howard A | Method and apparatus for encryption, decryption and authentication using dynamical systems |
US6343991B1 (en) * | 1997-10-01 | 2002-02-05 | Brad A. Armstrong | Game control with analog pressure sensor |
JP3199130B2 (en) | 1992-03-31 | 2001-08-13 | パイオニア株式会社 | 3D coordinate input device |
AR247303A1 (en) | 1992-08-21 | 1994-11-30 | Gilligan Federico Gustavo Y Fa | New computer keyboard. |
US5824931A (en) | 1993-03-12 | 1998-10-20 | Musacus International Limited | Electronic information aid |
US5432510A (en) | 1993-03-22 | 1995-07-11 | Matthews; Walter S. | Ambidextrous single hand chordic data management device |
US5973621A (en) | 1993-06-03 | 1999-10-26 | Levy; David | Compact keyed input device |
US5612690A (en) | 1993-06-03 | 1997-03-18 | Levy; David | Compact keypad system and method |
US5473325A (en) | 1993-08-11 | 1995-12-05 | Mcalindon; Peter J. | Ergonomic human-computer interface apparatus and method |
US5515305A (en) * | 1993-12-09 | 1996-05-07 | Dell Usa, L.P. | PDA having chord keyboard input device and method of providing data thereto |
WO1995032461A1 (en) | 1994-05-23 | 1995-11-30 | Australian Institute Of Marine Science | A human/machine interface for computing devices |
US5782642A (en) | 1995-12-19 | 1998-07-21 | Goren; Michael | Interactive video and audio display system network interactive monitor module interface |
WO1997027674A1 (en) | 1996-01-26 | 1997-07-31 | Harrison Shelley | Key palette |
GB2314179B (en) * | 1996-06-12 | 1998-05-20 | John Quentin Phillipps | Portable electronic apparatus |
US5859629A (en) | 1996-07-01 | 1999-01-12 | Sun Microsystems, Inc. | Linear touch input device |
US6297752B1 (en) * | 1996-07-25 | 2001-10-02 | Xuan Ni | Backside keyboard for a notebook or gamebox |
US6115028A (en) | 1996-08-22 | 2000-09-05 | Silicon Graphics, Inc. | Three dimensional input system using tilt |
US6232956B1 (en) | 1997-02-27 | 2001-05-15 | Spice Technologies, Inc. | OHAI technology user interface |
DE19718711C1 (en) | 1997-05-02 | 1998-12-03 | Easyphone Gmbh | Mobile device with a reduced key set |
US6084576A (en) | 1997-09-27 | 2000-07-04 | Leu; Neng-Chyang | User friendly keyboard |
TW401598B (en) | 1997-11-27 | 2000-08-11 | United Microelectronics Corp | The manufacture method of hemispherical grain silicon (HSG-Si) |
DE19757933A1 (en) | 1997-12-27 | 1998-10-01 | Lei Sun | Wireless input and display unit for computer |
JPH11239670A (en) | 1998-02-25 | 1999-09-07 | Sony Corp | Portable electronic equipment |
US6919879B2 (en) | 1998-06-26 | 2005-07-19 | Research In Motion Limited | Hand-held electronic device with a keyboard optimized for use with the thumbs |
US6512511B2 (en) | 1998-07-20 | 2003-01-28 | Alphagrip, Inc. | Hand grippable combined keyboard and game controller system |
US6760013B2 (en) | 1998-07-20 | 2004-07-06 | Alphagrip, Inc. | Hand held gaming and data entry system |
US6219731B1 (en) | 1998-12-10 | 2001-04-17 | Eaton: Ergonomics, Inc. | Method and apparatus for improved multi-tap text input |
US6885317B1 (en) | 1998-12-10 | 2005-04-26 | Eatoni Ergonomics, Inc. | Touch-typable devices based on ambiguous codes and methods to design such devices |
US6320942B1 (en) | 1998-12-31 | 2001-11-20 | Keytouch Corporation | Directionally-mapped, keyed alpha-numeric data input/output system |
JP2000267787A (en) * | 1999-03-18 | 2000-09-29 | Canon Inc | Input device and portable information processor |
US6377685B1 (en) | 1999-04-23 | 2002-04-23 | Ravi C. Krishnan | Cluster key arrangement |
US6606486B1 (en) | 1999-07-29 | 2003-08-12 | Ericsson Inc. | Word entry method for mobile originated short messages |
US6909424B2 (en) * | 1999-09-29 | 2005-06-21 | Gateway Inc. | Digital information appliance input device |
US6865718B2 (en) | 1999-09-29 | 2005-03-08 | Microsoft Corp. | Accelerated scrolling |
US6542091B1 (en) | 1999-10-01 | 2003-04-01 | Wayne Allen Rasanen | Method for encoding key assignments for a data input device |
US6498601B1 (en) | 1999-11-29 | 2002-12-24 | Xerox Corporation | Method and apparatus for selecting input modes on a palmtop computer |
US6654733B1 (en) | 2000-01-18 | 2003-11-25 | Microsoft Corporation | Fuzzy keyboard |
US6573844B1 (en) | 2000-01-18 | 2003-06-03 | Microsoft Corporation | Predictive keyboard |
US20030083114A1 (en) | 2000-04-13 | 2003-05-01 | Daniel Lavin | Hardware configuration for a navigation control unit for a wireless computer resource access device, such as a wireless web content access device |
US6741235B1 (en) | 2000-06-13 | 2004-05-25 | Michael Goren | Rapid entry of data and information on a reduced size input area |
US20020023265A1 (en) | 2000-08-08 | 2002-02-21 | Metcalf Darrell J. | Wireless controller with publicly-accessible communications link for controlling the content seen on large-screen systems |
DE10046099A1 (en) * | 2000-09-18 | 2002-04-04 | Siemens Ag | Touch sensitive display with tactile feedback |
US6520699B2 (en) | 2001-02-16 | 2003-02-18 | Toshiyasu Abe | Keyboard |
US6738045B2 (en) | 2001-02-26 | 2004-05-18 | Microsoft Corporation | Method and system for accelerated data navigation |
US20020163504A1 (en) | 2001-03-13 | 2002-11-07 | Pallakoff Matthew G. | Hand-held device that supports fast text typing |
US7012595B2 (en) * | 2001-03-30 | 2006-03-14 | Koninklijke Philips Electronics N.V. | Handheld electronic device with touch pad |
US7072975B2 (en) | 2001-04-24 | 2006-07-04 | Wideray Corporation | Apparatus and method for communicating information to portable computing devices |
US6541715B2 (en) | 2001-05-24 | 2003-04-01 | Philip Swanson | Alphanumeric keyboard for hand-held electronic devices |
JP4336788B2 (en) * | 2001-06-04 | 2009-09-30 | 日本電気株式会社 | Mobile telephone system and mobile telephone |
GB0116083D0 (en) | 2001-06-30 | 2001-08-22 | Koninkl Philips Electronics Nv | Text entry method and device therefor |
WO2003007117A2 (en) | 2001-07-12 | 2003-01-23 | Friedman Gary L | Portable, hand-held electronic input device and combination with a personal digital device |
WO2004019315A1 (en) | 2001-07-17 | 2004-03-04 | Nohr Steven P | System and method for finger held hardware device |
US7092734B2 (en) | 2001-08-06 | 2006-08-15 | Samsung Electronics Co., Ltd. | IOTA software download via auxiliary device |
US20030048205A1 (en) | 2001-08-10 | 2003-03-13 | Junru He | 3D electronic data input device with key mapping card |
DE10144634A1 (en) | 2001-09-11 | 2003-04-10 | Trw Automotive Electron & Comp | operating system |
CA2459043C (en) | 2001-09-20 | 2007-11-20 | Yuvee, Inc. | Universal keyboard |
JP2003099704A (en) | 2001-09-21 | 2003-04-04 | Mitsubishi Electric Corp | Handy terminal device with programmable vibration pattern, and application software for handy terminal device |
FI115861B (en) | 2001-11-12 | 2005-07-29 | Myorigo Oy | Method and apparatus for generating a response |
US8176432B2 (en) | 2001-11-20 | 2012-05-08 | UEI Electronics Inc. | Hand held remote control device having an improved user interface |
WO2003052948A1 (en) | 2001-12-18 | 2003-06-26 | Nokia Corporation | Removable housing cover for a portable radio communication device |
US6947028B2 (en) * | 2001-12-27 | 2005-09-20 | Mark Shkolnikov | Active keyboard for handheld electronic gadgets |
US20030193418A1 (en) | 2002-04-10 | 2003-10-16 | Xiaodong Shi | Method and Apparatus To Input Text On Devices Requiring A Small Keypad |
AU2003237247A1 (en) | 2002-05-23 | 2003-12-12 | Digit Wireless, Llc | Keypads and key switches |
DE10229068B3 (en) | 2002-06-28 | 2004-02-05 | Fujitsu Siemens Computers Gmbh | PDA (Personal Digital Assistant) with touch screen display |
US6998871B2 (en) | 2002-11-29 | 2006-02-14 | Sigmatel, Inc. | Configurable integrated circuit for use in a multi-function handheld device |
US20040208681A1 (en) * | 2003-04-19 | 2004-10-21 | Dechene Joseph Fernand | Computer or input device with back side keyboard |
US7218313B2 (en) | 2003-10-31 | 2007-05-15 | Zeetoo, Inc. | Human interface system |
- 2003
  - 2003-10-31 US US10/699,555 patent/US7218313B2/en not_active Expired - Fee Related
- 2004
  - 2004-10-29 WO PCT/US2004/036163 patent/WO2005043371A2/en active Application Filing
  - 2004-10-29 KR KR1020067010113A patent/KR20060096451A/en not_active Application Discontinuation
  - 2004-10-29 KR KR1020107004921A patent/KR101036532B1/en not_active IP Right Cessation
  - 2004-10-29 CN CN2007101533713A patent/CN101140481B/en not_active Expired - Fee Related
  - 2004-10-29 CN CNA2004800324221A patent/CN1875335A/en active Pending
  - 2004-10-29 EP EP09162387A patent/EP2093646A1/en not_active Withdrawn
  - 2004-10-29 EP EP04810164A patent/EP1678598A2/en not_active Ceased
  - 2004-10-29 JP JP2006538366A patent/JP2007510233A/en active Pending
  - 2004-10-29 CA CA2543191A patent/CA2543191C/en not_active Expired - Fee Related
  - 2004-10-29 KR KR1020087012601A patent/KR101188484B1/en not_active IP Right Cessation
- 2007
  - 2007-05-11 US US11/747,863 patent/US7463245B2/en not_active Expired - Fee Related
- 2008
  - 2008-05-14 HK HK08105314.0A patent/HK1115652A1/en not_active IP Right Cessation
  - 2008-12-05 US US12/329,411 patent/US7667692B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
HK1115652A1 (en) | 2008-12-05 |
WO2005043371A2 (en) | 2005-05-12 |
CA2543191A1 (en) | 2005-05-12 |
US20050093846A1 (en) | 2005-05-05 |
WO2005043371A3 (en) | 2005-06-16 |
KR20080051192A (en) | 2008-06-10 |
KR20100043271A (en) | 2010-04-28 |
US7667692B2 (en) | 2010-02-23 |
EP1678598A2 (en) | 2006-07-12 |
US20070211035A1 (en) | 2007-09-13 |
US20090143142A1 (en) | 2009-06-04 |
CN1875335A (en) | 2006-12-06 |
KR101036532B1 (en) | 2011-05-24 |
CN101140481A (en) | 2008-03-12 |
US7218313B2 (en) | 2007-05-15 |
US7463245B2 (en) | 2008-12-09 |
CN101140481B (en) | 2010-06-09 |
JP2007510233A (en) | 2007-04-19 |
KR101188484B1 (en) | 2012-10-05 |
KR20060096451A (en) | 2006-09-11 |
EP2093646A1 (en) | 2009-08-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2543191C (en) | Human interface system | |
US7280097B2 (en) | Human interface input acceleration system | |
AU2004322755B2 (en) | Active keyboard system for handheld electronic devices | |
JP6740389B2 (en) | Adaptive user interface for handheld electronic devices | |
WO2008041975A1 (en) | Keypad emulation | |
EP1727026A2 (en) | Character entry system and method for electronic devices | |
WO2005041014A1 (en) | Device having a joystick keypad | |
Yoon et al. | Square: 3x3 keypad mapped to geometric elements of a square | |
WO2009052658A1 (en) | Input device and method for inputting characters | |
AU3899301A (en) | A device or component for alphanumeric and direction input |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| EEER | Examination request | |
| MKLA | Lapsed | Effective date: 20181029 |