US20150009165A1 - Counter-tactile keypad - Google Patents

Counter-tactile keypad

Info

Publication number
US20150009165A1
US20150009165A1 (application US14/485,815)
Authority
US
United States
Prior art keywords
electronic device
button
touch
housing
receiving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/485,815
Inventor
Eyal Bychkov
Hagay Katz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Priority to US14/485,815
Publication of US20150009165A1
Legal status: Abandoned

Classifications

    • G06F 1/1626: Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0202: Constructional details or processes of manufacture of the input device
    • G06F 3/0489: Interaction techniques based on graphical user interfaces [GUI] using dedicated keyboard keys or combinations thereof
    • H04M 1/236: Construction or mounting of dials or of equivalent devices, including keys on side or rear faces
    • H04M 2250/22: Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
    • Y02D 30/70: Reducing energy consumption in wireless communication networks

Definitions

  • In accordance with an embodiment of the present invention, the device of FIGS. 2-4 may also be operated in certain cases without the display, in a “button-only” mode.
  • The button-only mode may be activated manually, by a user pressing a “button-only” button, or automatically, when the device is held in an orientation with the screen facing down and the buttons facing the user. Such an orientation may be automatically detected by an orientation sensor within the device.
  • In accordance with an embodiment of the present invention, the buttons are engraved with symbols, such as alphanumeric symbols, which represent their default functions. The default button functions are operational when the device is in the button-only mode.

Abstract

An electronic device, including electronic circuitry contained within a housing, an orientation sensor for detecting an orientation of the housing, a button-based user interface on a first side of the housing including physical buttons for enabling button-press user inputs to the electronic circuitry, and a touch-sensitive screen on a second side of the housing for enabling touch-based user inputs to the electronic circuitry, the second side being opposite the first side, wherein the electronic circuitry is operative (i) to process button-press user inputs and to ignore touch-based user inputs, when the orientation sensor detects an orientation of the housing wherein the second side is facing down and the first side is facing up, and (ii) to process touch-based user inputs and to ignore button-press user inputs, when the orientation sensor detects an orientation of the housing wherein the first side is facing down and the second side is facing up.
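The orientation-gated behavior recited in the abstract can be sketched in runnable form. This is an illustrative reading only, not code from the patent; names such as `Orientation` and `dispatch_input` are assumptions.

```python
# Sketch: process or ignore an input event based on which side of the
# housing the orientation sensor reports as facing up.

from enum import Enum

class Orientation(Enum):
    SCREEN_UP = 1    # second (screen) side facing up, first side down
    BUTTONS_UP = 2   # first (button) side facing up, second side down

def dispatch_input(orientation, event_kind, handler):
    """Call handler only when the event matches the side facing the user.

    event_kind is 'button' or 'touch'; events from the side facing down
    are ignored, as described in the abstract.
    """
    if orientation is Orientation.BUTTONS_UP and event_kind == "button":
        return handler()
    if orientation is Orientation.SCREEN_UP and event_kind == "touch":
        return handler()
    return None  # input from the face-down side is ignored
```

A usage note: with the screen down and buttons up, only button presses reach the handler; flipping the device inverts which input stream is processed.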

Description

    PRIORITY REFERENCE TO RELATED APPLICATIONS
  • This application claims benefit of U.S. Provisional Application No. 60/964,872, entitled COUNTER-TACTILE KEYPAD, filed on Aug. 14, 2007 by inventors Eyal Bychkov and Hagay Katz.
  • FIELD OF THE INVENTION
  • The present invention relates to touch-based user interfaces for electronic devices.
  • BACKGROUND OF THE INVENTION
  • Handheld electronic devices have benefited from touch screen technology. Touch screens are used for both input and output. They enable device manufacturers to reduce the area of the device used for off-screen input controls, such as buttons and keypads, and to enlarge the screen area, thereby enhancing the user experience.
  • For input, users interact with touch screens using visual control elements. Control elements correspond to user commands, are displayed on a screen, and provide areas for a user to press on. Control elements may appear as buttons, scroll bars, slide bars and wheels. Users can press or tap on control elements such as buttons, or drag control elements such as scroll bars, slide bars and wheels to a desired location. Pressing, tapping or dragging control elements activates their corresponding commands.
  • For output, touch screens display graphics, similar to conventional LCD displays.
  • Reference is now made to FIG. 1, which is a prior art illustration of a touch screen. Shown in FIG. 1 is a handheld electronic device 100 having a touch screen 110. Device 100 displays various buttons 120 in touch screen 110, which a user can press in order to enter numbers and commands.
  • An advantage of touch screens is the flexibility of displaying a wide variety of control elements, such as buttons, icons and selection menus, for a corresponding wide variety of modes of operation. Thus, while in a dialer mode of operation, a touch screen may display a numeric keypad, and while in an SMS mode of operation, the touch screen may display an alphabet keypad. Areas on the screen thus produce different actions when pressed, depending on the control elements being displayed therein.
  • A drawback with touch screens is the lack of a tactile feeling, as a result of which many people find them difficult to use. Prior art methods of overcoming this drawback include graphical methods, audio methods, force feedback methods and vibration methods. Graphical methods make control elements appear to be pressed and released, similar to physical button presses, thus creating a perception of a physical button press. Audio methods provide sounds in response to elements being pressed. The TouchSense® system of Immersion Corporation of San Jose, Calif., includes both graphical and audio feedback when touch screens are pressed.
  • Force feedback methods operate by mounting a touch screen on a linear flexure, which allows the screen to bend inwards when pressed. Force feedback for touch screens is described in U.S. Pat. No. 7,113,177 to Franzen. The '177 patent describes a touch-sensitive display with tactile feedback, composed of three layers: a display layer, a layer that includes receptors, and a layer that includes controllable actuators.
  • Vibration methods cause a device to vibrate in response to a control element being pressed, as a tactile feedback. Pantech Group of Seoul, Korea, developed such a touch screen for its dual-LCD sliding phones.
  • SUMMARY OF THE DESCRIPTION
  • The present invention provides a way to generate tactile feedback for screens that display user interface control elements. The present invention uses both front and back sides of an electronic device; one side for housing a screen, and the other side for housing physical buttons. The screen is positioned substantially opposite the buttons. Pressing a button on the device activates a control element that is displayed opposite the button on the other side of the device.
  • Three embodiments of the invention are described. In the first embodiment, the screen is a touch screen and the buttons are not electrically connected to the device; i.e. the buttons are merely used for their tactile feedback. In the second embodiment, the screen is a non-touch screen and the buttons are fully functional. In the third embodiment, the screen is a touch screen and the buttons are fully functional.
  • There is thus provided in accordance with an embodiment of the present invention a touch-based user interface for an electronic device, including a housing including electronic circuitry, a plurality of buttons mounted within a first area on a first side of the housing, and a screen mounted on a second area of a second side of the housing, the second side being opposite to the first side, and the second area being opposite to at least a portion of the first area, wherein the electronic circuitry is operative (i) to display on the screen at least one user interface control element that corresponds respectively to at least one button, each such user interface control element having a command associated therewith, and (ii) to perform the command associated with a designated user interface control element when its corresponding button is pressed.
  • There is additionally provided in accordance with an embodiment of the present invention a method for a touch-based user interface for an electronic device, including receiving notification that a user has pressed a button on a first side of an electronic device, and performing a command associated with a control element displayed on a screen on the electronic device, wherein the screen is located on a second side of the electronic device, the second side being opposite to the first side, and wherein the control element is displayed opposite the location of the button that was pressed.
  • There is moreover provided in accordance with an embodiment of the present invention a computer readable storage medium storing program code for causing an electronic device to receive notification that a user has pressed a button on a first side of the electronic device, and to perform a command associated with a control element displayed on a screen on the electronic device, wherein the screen is located on a second side of the electronic device, the second side being opposite to the first side, and wherein the control element is displayed opposite the location of the button that was pressed.
  • There is further provided in accordance with an embodiment of the present invention a method for touch-based user interface for an electronic device, including receiving a notification that a user has pressed an area of a touch screen on an electronic device where a control element is displayed, verifying that the user has also pressed an off-screen button corresponding to the control element, and performing a command associated with the control element after and only after the receiving and the verifying have been performed.
  • There is yet further provided in accordance with an embodiment of the present invention a computer readable storage medium storing program code for causing an electronic device to receive a notification that a user has pressed an area of a touch screen on the electronic device where a control element is displayed, to verify that the user has also pressed an off-screen button corresponding to the control element, and to perform a command associated with the control element after and only after both the receiving and the verifying have been performed.
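The receive-and-verify method recited above can be sketched in runnable form. This is a hypothetical illustration of the claim language, not the patent's own code; the function and parameter names are assumptions.

```python
# Sketch: a command is performed only after both a touch on the control
# element's screen area and a press of the corresponding off-screen
# button have been received and verified.

def activate_if_verified(touched_element, pressed_button,
                         element_of_button, commands):
    """element_of_button maps button id -> control element name;
    commands maps control element name -> callable command."""
    if touched_element is None or pressed_button is None:
        return None  # one of the two required inputs is missing
    if element_of_button.get(pressed_button) != touched_element:
        return None  # the pressed button does not correspond to the element
    return commands[touched_element]()  # both verified: perform the command

# Illustrative mapping: back-side button 1 sits opposite a "dial" element.
mapping = {1: "dial"}
cmds = {"dial": lambda: "dialing"}
```

The key design point is the "after and only after" gating: neither input alone triggers the command, which is what makes unintended single-finger presses harmless.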
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:
  • FIG. 1 is a prior art illustration of a touch screen;
  • FIG. 2 is an illustration of a touch-based user interface that uses two opposite sides of an electronic device, in accordance with an embodiment of the present invention;
  • FIG. 3 is a simplified cross-sectional front, side and back view of an electronic device that has a touch-based user interface, in accordance with an embodiment of the present invention; and
  • FIG. 4 is an illustration of two opposite sides of an electronic device that has a touch-based user interface, in accordance with an embodiment of the present invention.
  • DETAILED DESCRIPTION
  • The present invention relates to touch-based user interfaces for electronic devices. The present invention uses two opposite sides of the devices; one side for a screen, and the opposite side for physical buttons. The screen is located substantially opposite the buttons.
  • Reference is now made to FIG. 2, which is an illustration of a touch-based user interface that uses two opposite sides of an electronic device, in accordance with an embodiment of the present invention. A user presses a screen on the front side of the device with his thumb, and presses a button on the back side of the device with his index finger. The user's thumb and index finger are aligned substantially opposite one another.
  • Reference is now made to FIG. 3, which is a simplified cross-sectional front, side and back view of an electronic device that has a touch-based user interface, in accordance with an embodiment of the present invention. As shown in FIG. 3, a screen is housed on the front side of the device and physical buttons are positioned on the back side of the device. The side view indicates that the buttons protrude from the housing. It will be appreciated by those skilled in the art, however, that the buttons may be flush with or indented in the back surface of the housing.
  • The screen housed on the front side of the device is positioned in an area that is substantially opposite the area where the buttons are located on the back side of the device.
  • Reference is now made to FIG. 4, which is an illustration of two opposite sides of an electronic device that has a touch-based user interface, in accordance with an embodiment of the present invention. Shown in FIG. 4 are control elements 410 displayed on a screen 400 on the front side of the device, and physical buttons 420 positioned on the back side of the device. Each control element 410 on the front side is positioned substantially opposite a corresponding button 420 on the back side. The shapes of buttons 420 need not be the same as those of the areas of their corresponding control elements 410. In FIG. 4, for example, buttons 420 are oval shaped, and their corresponding control elements 410 are rectangular shaped.
  • When a button 420 on the back side is pressed, a command associated with its corresponding control element 410 on the front side is performed. Thus, to activate a specific control element 410, a user may position his thumb on the control element, and press its corresponding button 420 with his index finger. I.e., pressing on a button 420 on the back side behind the screen corresponds to pressing its corresponding control element 410 on the screen.
  • A motivation for the present invention is the fact that neurologically people are able to accurately align the tips of their thumbs and index fingers. In fact, neurological diagnoses often incorporate patients' accuracy in arranging their two fingers to touch.
  • In a first embodiment of the present invention, screen 400 is a touch screen, and buttons 420 are physically functional but not electrically connected to the device. Buttons 420 serve to supplement the touch screen with a tactile effect, with inexpensive mechanical buttons.
  • In a second embodiment of the present invention, screen 400 is an inexpensive conventional non-touch screen, and buttons 420 are fully functional. Buttons 420 serve to provide the non-touch screen with the flexibility of a touch screen.
  • The following pseudo-code is a simplified description of the second embodiment, in accordance with the present invention.
  • x = 0
    if (button == 1)
    {
    x = Find_control_element(location == 1);
    }
    if (button == 2)
    {
    x = Find_control_element(location == 2);
    }
    .....
    if (x > 0)
    {
    do Control_element_function(x);
    }
  • In a third embodiment of the present invention, screen 400 is a touch screen, and buttons 420 are fully functional. In this embodiment, operation of the device is configured so that a control element is activated only when both the control element is touched on the screen and its corresponding button is pressed. The device thus ensures that a user is using two fingers, which is useful in avoiding unintended presses of the screen, and eliminates the need to lock the screen.
  • The following pseudo-code is a simplified description of the third embodiment, in accordance with the present invention.
  • if (button-only( ) == FALSE)
    {
    if ((screen-control-element) == 1) AND (button == 1))
    {
    do Control_element_function 1
    }
    if ((screen-control-element) == 2) AND (button == 2))
    {
    do Control_element_function 2
    }
    .....
    else
    {
    do nothing;
    }
    }
    else
    {
    if (button == 1)
    {
    do Control_element_function 1
    }
    if (button == 2)
    {
    do Control_element_function 2
    }
    .....
    else
    {
    do nothing;
    }
    }
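The third embodiment's two-finger rule — touch and button press must agree before an element fires — can be sketched in runnable Python as follows; the function name, element identifiers, and actions are illustrative assumptions, not names from the patent.

```python
# Sketch of the third embodiment: in normal mode a control element is
# activated only when its screen area is touched AND its corresponding
# button is pressed; in button-only mode the button press alone suffices.
# All names here are illustrative assumptions.

actions = {1: lambda: "element 1 activated", 2: lambda: "element 2 activated"}

def activate_and(touched_element, pressed_button, button_only):
    if button_only:
        fired = pressed_button          # button press alone activates
    elif touched_element is not None and touched_element == pressed_button:
        fired = pressed_button          # touch AND matching button press
    else:
        fired = None                    # do nothing
    return actions[fired]() if fired in actions else None
```

Requiring both inputs is what lets the device dispense with a screen lock: an accidental touch with no matching button press, or an accidental button press with no touch, falls through to the do-nothing branch.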
  • In a fourth embodiment of the present invention, screen 400 is a touch screen, and buttons 420 are fully operational, as in the third embodiment. In this embodiment, operation of the device is configured so that a control element is activated either when the control element is touched on the screen or when its corresponding button is pressed.
  • The following pseudo-code is a simplified description of the fourth embodiment, in accordance with the present invention.
  • if (button-only() == FALSE)
    {
    if ((screen-control-element == 1) OR (button == 1))
    {
    do Control_element_function 1
    }
    if ((screen-control-element == 2) OR (button == 2))
    {
    do Control_element_function 2
    }
    .....
    else
    {
    do nothing;
    }
    }
    else
    {
    if (button == 1)
    {
    do Control_element_function 1
    }
    if (button == 2)
    {
    do Control_element_function 2
    }
    .....
    else
    {
    do nothing;
    }
    }
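The fourth embodiment's OR rule can be sketched the same way in runnable Python; as before, the function name, element identifiers, and actions are hypothetical illustrations.

```python
# Sketch of the fourth embodiment: a control element is activated when
# its screen area is touched OR its button is pressed; in button-only
# mode, touch input is ignored. Names are illustrative assumptions.

actions = {1: lambda: "element 1 activated", 2: lambda: "element 2 activated"}

def activate_or(touched_element, pressed_button, button_only):
    if button_only:
        fired = pressed_button          # ignore the touch screen entirely
    elif touched_element is not None:
        fired = touched_element         # a touch alone suffices
    else:
        fired = pressed_button          # a button press alone suffices
    return actions[fired]() if fired in actions else None
```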
  • In all four of the above embodiments, graphical and audio feedback may be incorporated to notify a user that his action has been acknowledged.
  • In accordance with an embodiment of the present invention, a controller of the device in FIGS. 2-4 is programmed to map each button 420 to a specific area of screen 400. Control elements have buttons associated therewith, and are displayed by the controller on screen 400 at positions within the screen areas to which their buttons map. Different control elements may be displayed in different modes of operation of the device, but for each mode the control elements are positioned within the screen areas to which their buttons map.
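A minimal sketch of such a controller mapping follows; the screen geometry, mode contents, and all names are assumptions for illustration, since the patent does not specify coordinates or dimensions.

```python
# Sketch: each button maps to a fixed rectangular area of screen 400, and
# each mode's control elements are drawn inside the area of their
# associated button. Geometry and mode contents are hypothetical.

BUTTON_AREAS = {
    1: (0, 0, 160, 120),     # (x, y, width, height) of button 1's area
    2: (160, 0, 160, 120),   # button 2's area
}

def layout_for_mode(mode_elements):
    """Position each mode's control elements at the center of the screen
    area to which their associated button maps."""
    layout = {}
    for button, element in mode_elements.items():
        x, y, w, h = BUTTON_AREAS[button]
        layout[element] = (x + w // 2, y + h // 2)
    return layout

# In a hypothetical media-player mode, different elements occupy the
# same button-mapped areas:
print(layout_for_mode({1: "play", 2: "stop"}))  # -> {'play': (80, 60), 'stop': (240, 60)}
```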
  • In the third embodiment described hereinabove, wherein a control element and its button must both be pressed in order to activate the control element, the controller is programmed to detect both presses before activating the control element.
  • In accordance with an embodiment of the present invention, the device in FIGS. 2-4 may also be operated in certain cases without the display, in a “button-only” mode. The button-only mode may be activated manually, by a user pressing a “button-only” button, or automatically when the device is held in an orientation with the screen facing down and the buttons facing the user. Such an orientation may be automatically detected by an orientation sensor within the device.
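The mode selection just described can be sketched as follows; the sign convention for the orientation-sensor reading is an assumption chosen for this example, as real devices differ.

```python
# Sketch: button-only mode is entered either manually, via a dedicated
# "button-only" button, or automatically when the orientation sensor
# reports the screen facing down and the buttons facing the user.
# Here accel_z > 0 is assumed (a convention chosen for this example)
# to mean screen-down.

def button_only_mode(manual_toggle: bool, accel_z: float) -> bool:
    if manual_toggle:
        return True          # user pressed the "button-only" button
    return accel_z > 0       # screen facing down, buttons facing the user
```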
  • The buttons are engraved with symbols, such as alphanumeric symbols, which represent their default functions. The default button functions are operational when the device is in the button-only mode.
  • In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made to the specific exemplary embodiments without departing from the broader spirit and scope of the invention as set forth in the appended claims. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

Claims (9)

1.-21. (canceled)
22. An electronic device, comprising:
a housing;
electronic circuitry contained within said housing;
an orientation sensor coupled to said electronic circuitry, for detecting an orientation of said housing;
a button-based user interface mounted on a first side of said housing comprising physical buttons for enabling button-press user inputs to said electronic circuitry; and
a touch-sensitive screen mounted on a second side of said housing for enabling touch-based user inputs to said electronic circuitry, the second side being opposite the first side,
wherein said electronic circuitry is operative (i) to process button-press user inputs and to ignore touch-based user inputs, when said orientation sensor detects an orientation of said housing wherein the second side is facing down and the first side is facing up, and (ii) to process touch-based user inputs and to ignore button-press user inputs, when said orientation sensor detects an orientation of said housing wherein the first side is facing down and the second side is facing up.
23. The electronic device of claim 22 wherein said button-based user interface enables substantially the same input commands, at any given time, as the touch-sensitive screen.
24. A method performed by an electronic device, the method comprising:
receiving, by the electronic device, notification that a user has pressed a physical input button on a first side of the electronic device;
further receiving, by the electronic device, notification that a user has touched a touch-sensitive screen on a second side of the electronic device, the second side being opposite the first side;
detecting, by the electronic device, an orientation of the electronic device; and
processing, by the electronic device, said receiving and ignoring, by the electronic device, said further receiving, when said detecting detects an orientation of the electronic device wherein the second side is facing down and the first side is facing up; and
processing, by the electronic device, said further receiving and ignoring, by the electronic device, said receiving, when said detecting detects an orientation of the electronic device wherein the first side is facing down and the second side is facing up.
25. A non-transitory computer readable storage medium storing program code for causing an electronic device to perform a method, the method comprising:
receiving notification that a user has pressed a physical input button on a first side of the electronic device;
further receiving notification that a user has touched a touch-sensitive screen on a second side of the electronic device, the second side being opposite the first side;
detecting an orientation of the electronic device; and
processing said receiving and ignoring said further receiving, when said detecting detects an orientation of the electronic device wherein the second side is facing down and the first side is facing up; and
processing said further receiving and ignoring said receiving, when said detecting detects an orientation of the electronic device wherein the first side is facing down and the second side is facing up.
26. An electronic device, comprising:
a housing;
electronic circuitry contained within said housing;
a button-based user interface mounted on a first side of said housing comprising physical buttons for enabling button-press user inputs to said electronic circuitry;
a touch-sensitive screen mounted on a second side of said housing for enabling touch-based user inputs to said electronic circuitry, the second side being opposite the first side; and
a button-only button mounted on said housing comprising a physical button, for activating and de-activating a button-only state of said electronic circuitry at any time;
wherein said electronic circuitry is operative (i) to process button-press user inputs and to ignore touch-based user inputs, when the button-only state is currently activated, and (ii) to process touch-based user inputs and to ignore button-press user inputs, when the button-only state is currently de-activated.
27. The electronic device of claim 26 wherein said button-based user interface enables substantially the same input commands, at any given time, as the touch-sensitive screen.
28. A method performed by an electronic device, the method comprising:
receiving, by the electronic device, notification that a user has pressed a physical input button on a first side of the electronic device;
further receiving, by the electronic device, notification that a user has touched a touch-sensitive screen on a second side of the electronic device, the second side being opposite the first side;
detecting, by the electronic device, whether a button-only state of the electronic device is currently activated or de-activated; and
processing, by the electronic device, said receiving and ignoring, by the electronic device, said further receiving, when the electronic device detects that the button-only state is currently activated; and
processing, by the electronic device, said further receiving and ignoring, by the electronic device, said receiving, when the electronic device detects that the button-only state is currently de-activated.
29. A non-transitory computer readable storage medium storing program code for causing an electronic device to perform a method, the method comprising:
receiving notification that a user has pressed a physical input button on a first side of the electronic device;
further receiving notification that a user has touched a touch-sensitive screen on a second side of the electronic device, the second side being opposite the first side;
detecting whether a button-only state of the electronic device is currently activated or de-activated; and
processing said receiving and ignoring said further receiving, when the button-only state is currently activated; and
processing said further receiving and ignoring said receiving, when the button-only state is currently de-activated.
US14/485,815 2007-08-14 2014-09-15 Counter-tactile keypad Abandoned US20150009165A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/485,815 US20150009165A1 (en) 2007-08-14 2014-09-15 Counter-tactile keypad

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US96487207P 2007-08-14 2007-08-14
US12/190,599 US8836637B2 (en) 2007-08-14 2008-08-13 Counter-tactile keypad
US14/485,815 US20150009165A1 (en) 2007-08-14 2014-09-15 Counter-tactile keypad

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/190,599 Continuation US8836637B2 (en) 2007-08-14 2008-08-13 Counter-tactile keypad

Publications (1)

Publication Number Publication Date
US20150009165A1 true US20150009165A1 (en) 2015-01-08

Family

ID=39790755

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/190,599 Active 2031-12-20 US8836637B2 (en) 2007-08-14 2008-08-13 Counter-tactile keypad
US14/485,815 Abandoned US20150009165A1 (en) 2007-08-14 2014-09-15 Counter-tactile keypad

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US12/190,599 Active 2031-12-20 US8836637B2 (en) 2007-08-14 2008-08-13 Counter-tactile keypad

Country Status (2)

Country Link
US (2) US8836637B2 (en)
GB (1) GB2451952B (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8836637B2 (en) * 2007-08-14 2014-09-16 Google Inc. Counter-tactile keypad
US20090256809A1 (en) * 2008-04-14 2009-10-15 Sony Ericsson Mobile Communications Ab Three-dimensional touch interface
US8665228B2 (en) 2008-06-19 2014-03-04 Tactile Displays, Llc Energy efficient interactive display with energy regenerative keyboard
US8115745B2 (en) * 2008-06-19 2012-02-14 Tactile Displays, Llc Apparatus and method for interactive display with tactile feedback
US9513705B2 (en) * 2008-06-19 2016-12-06 Tactile Displays, Llc Interactive display with tactile feedback
US20100271231A1 (en) * 2009-04-23 2010-10-28 Mark Gottlieb Two-Sided Handheld Remote Control
US20100295794A1 (en) * 2009-05-20 2010-11-25 Microsoft Corporation Two Sided Slate Device
KR101648747B1 (en) * 2009-10-07 2016-08-17 삼성전자 주식회사 Method for providing user interface using a plurality of touch sensor and mobile terminal using the same
TW201122992A (en) * 2009-12-31 2011-07-01 Askey Computer Corp Cursor touch-control handheld electronic device
US10719131B2 (en) 2010-04-05 2020-07-21 Tactile Displays, Llc Interactive display with tactile feedback
US20200393907A1 (en) 2010-04-13 2020-12-17 Tactile Displays, Llc Interactive display with tactile feedback
US8532563B2 (en) * 2010-07-30 2013-09-10 Motorola Mobility Llc Portable electronic device with configurable operating mode
US8922493B2 (en) * 2010-09-19 2014-12-30 Christine Hana Kim Apparatus and method for automatic enablement of a rear-face entry in a mobile device
KR20130129426A (en) * 2011-01-20 2013-11-28 블랙베리 리미티드 Three-dimensional, multi-depth presentation of icons in association with differing input components of a user interface
US9618972B2 (en) * 2011-01-20 2017-04-11 Blackberry Limited Three-dimensional, multi-depth presentation of icons in association with differing input components of a user interface
US20120290107A1 (en) * 2011-05-12 2012-11-15 John Carlson Apparatus and method for displaying state data of an industrial plant
KR101160681B1 (en) 2011-10-19 2012-06-28 배경덕 Method, mobile communication terminal and computer-readable recording medium for operating specific function when activaing of mobile communication terminal
EP2597775A3 (en) * 2011-11-22 2014-08-20 Sony Mobile Communications AB Key device and electronic equipment
KR101910509B1 (en) * 2012-07-17 2018-10-22 삼성전자주식회사 Method and apparatus for preventing screen off during automatic response system service in electronic device
CN104903335B (en) * 2012-08-02 2018-05-18 霍夫曼-拉罗奇有限公司 For manufacturing new double-iridium-complex of ECL- labels
WO2014083369A1 (en) 2012-11-27 2014-06-05 Thomson Licensing Adaptive virtual keyboard
US10048861B2 (en) 2012-11-27 2018-08-14 Thomson Licensing Adaptive virtual keyboard
JP2014106937A (en) * 2012-11-30 2014-06-09 Toshiba Corp Electronic apparatus


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2330981B (en) 1997-10-31 2002-07-03 Nokia Mobile Phones Ltd A radiotelephone handset
EP2423779A1 (en) 2003-07-28 2012-02-29 NEC Corporation Mobile information terminal
WO2006052175A1 (en) 2004-11-15 2006-05-18 Telefonaktiebolaget Lm Ericsson (Publ) Terminal design with keyboard arranged on the back or side surface of the terminal
EP1832957A1 (en) 2006-03-10 2007-09-12 E-Lead Electronic Co., Ltd. Back-loading input device
US20080058033A1 (en) 2006-09-06 2008-03-06 Sony Ericsson Mobile Communications Ab Portable electronic device

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5893037A (en) * 1994-12-09 1999-04-06 Eastman Kodak Company Combined electronic/silver-halide image capture system with cellular transmission capability
US6297752B1 (en) * 1996-07-25 2001-10-02 Xuan Ni Backside keyboard for a notebook or gamebox
US6492974B1 (en) * 1996-10-08 2002-12-10 Fujitsu Limited Small-sized portable information processing apparatus
US5923325A (en) * 1996-11-14 1999-07-13 International Business Machines Corporation System and method for enhancing conveyed user information relating to symbols in a graphical user interface
US6559830B1 (en) * 1998-09-14 2003-05-06 Microsoft Corporation Method of interacting with a computer using a proximity sensor in a computer input device
US20020118175A1 (en) * 1999-09-29 2002-08-29 Gateway, Inc. Digital information appliance input device
US6640113B1 (en) * 2000-09-08 2003-10-28 Mobigence, Inc. Touch sensitive display integrated with a handheld radiotelephone
US6421234B1 (en) * 2000-10-10 2002-07-16 Juniper Systems, Inc. Handheld electronics device having ergonomic features
US20020090980A1 (en) * 2000-12-05 2002-07-11 Wilcox Russell J. Displays for portable electronic apparatus
US20030048205A1 (en) * 2001-08-10 2003-03-13 Junru He 3D electronic data input device with key mapping card
US7088342B2 (en) * 2002-05-16 2006-08-08 Sony Corporation Input method and input device
US20040208681A1 (en) * 2003-04-19 2004-10-21 Dechene Joseph Fernand Computer or input device with back side keyboard
US7359999B2 (en) * 2003-06-12 2008-04-15 Alps Electric Co., Ltd Inputting method and device with a flat input having sensor capable of coordinate touch input wherein display is switchable based on touch pressure
US20050007339A1 (en) * 2003-06-12 2005-01-13 Tadamitsu Sato Inputting method and input device
US7180502B2 (en) * 2004-03-23 2007-02-20 Fujitsu Limited Handheld device with preferred motion selection
US20060012577A1 (en) * 2004-07-16 2006-01-19 Nokia Corporation Active keypad lock for devices equipped with touch screen
US20060073893A1 (en) * 2004-10-01 2006-04-06 Dahl John M Touchscreen audio feedback in a wagering game system
US20060084482A1 (en) * 2004-10-15 2006-04-20 Nokia Corporation Electronic hand-held device with a back cover keypad and a related method
US20070103454A1 (en) * 2005-04-26 2007-05-10 Apple Computer, Inc. Back-Side Interface for Hand-Held Devices
US20090046076A1 (en) * 2007-08-14 2009-02-19 Modu Ltd. Counter-tactile keypad
US8412281B2 (en) * 2008-06-26 2013-04-02 Kyocera Corporation Portable terminal device
US20100088532A1 (en) * 2008-10-07 2010-04-08 Research In Motion Limited Method and handheld electronic device having a graphic user interface with efficient orientation sensor use
US20100134418A1 (en) * 2008-12-02 2010-06-03 Microsoft Corporation Opposite facing device keypad
US20120139959A1 (en) * 2010-12-06 2012-06-07 Hyung-Soo Kim Display device

Also Published As

Publication number Publication date
US20090046076A1 (en) 2009-02-19
GB2451952A (en) 2009-02-18
GB0814859D0 (en) 2008-09-17
GB2451952B (en) 2012-04-04
US8836637B2 (en) 2014-09-16

Similar Documents

Publication Publication Date Title
US20150009165A1 (en) Counter-tactile keypad
KR101070111B1 (en) Hand held electronic device with multiple touch sensing devices
US8274484B2 (en) Tracking input in a screen-reflective interface environment
JP5642901B2 (en) Electronic device and transmission signal application method
US11656711B2 (en) Method and apparatus for configuring a plurality of virtual buttons on a device
TWI382739B (en) Method for providing a scrolling movement of information,computer program product,electronic device and scrolling multi-function key module
EP2965176B1 (en) Mechanical actuator apparatus and a touchscreen
US20150169059A1 (en) Display apparatus with haptic feedback
WO2011024389A1 (en) Input device
EP2383631A1 (en) Hand-held mobile device and method for operating the hand-held mobile device
US20100026646A1 (en) Overlay film and electronic device with same
JP2011048832A (en) Input device
US20140015785A1 (en) Electronic device
EP3190482B1 (en) Electronic device, character input module and method for selecting characters thereof
JP5676131B2 (en) Portable electronic devices
US20090167715A1 (en) User interface of portable device and operating method thereof
EP2112586A1 (en) Operation method of user interface and computer readable medium and portable device
KR100606406B1 (en) Computer for a blind person
JP2010257163A (en) Input device and input auxiliary implement of touch panel allowing touch typing
US20070262956A1 (en) Input method with a large keyboard table displaying on a small screen
KR20120068416A (en) Apparatus and method for providing visual and haptic information, and terminal for having thereof
CN101794194A (en) Method and device for simulation of input of right mouse button on touch screen
US8760421B2 (en) Method for increased accessibility to a human machine interface
JP2014075022A (en) Input device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION