Publication number: US 20100030549 A1
Publication type: Application
Application number: US 12/183,437
Publication date: 4 Feb 2010
Filing date: 31 Jul 2008
Priority date: 31 Jul 2008
Also published as: US 9535906, US 20150309997, US 20170060853
Inventors: Michael M. Lee, Justin Gregg, Chad G. Seguin
Original Assignee: Lee Michael M, Justin Gregg, Seguin Chad G
Mobile device having human language translation capability with positional feedback
US 20100030549 A1
Abstract
A mobile electronic device has a touch sensitive screen and an accelerometer. A translator is to translate a word or phrase that is in a first human language and that is entered via a first virtual keyboard displayed on the touch sensitive screen, into a second human language. The translator is to cause the touch sensitive screen to display the translated word or phrase and a second virtual keyboard having characters in the second human language, in response to the accelerometer detecting a change in the physical orientation of the device or movement of the device. Other embodiments are also described and claimed.
Images(7)
Claims(18)
1. A mobile electronic device comprising:
a touch sensitive screen;
an accelerometer; and
a translator to translate a word or phrase that is in a first human language and that is entered via a first virtual keyboard displayed on the touch sensitive screen, into a second human language, and wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and a second virtual keyboard having characters in the second human language, in response to the accelerometer detecting a change in the physical orientation of the device.
2. The device of claim 1 wherein the touch sensitive screen is rectangular.
3. The device of claim 2 wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and the second virtual keyboard when the device is rotated by about ninety degrees or more in a plane of the touch sensitive screen.
4. The device of claim 2 wherein the translator is to cause the touch sensitive screen to display the translated word or phrase and the second virtual keyboard when the device is rotated from a vertical orientation to a horizontal orientation.
5. The device of claim 4 wherein the user interface is to display the word or phrase in the first human language in a portrait view while the device is in the vertical orientation, and the translated word or phrase in a landscape view when the device is in the horizontal orientation.
6. The device of claim 1 further comprising an RF-based locator to compute the current geographic location of the mobile electronic device,
wherein the translator causes the user interface to display a graphic that prompts a user of the device to select any one of a plurality of human languages to which the entered word or phrase can be translated by the translator, the plurality of human languages being offered by the translator as a function of the computed current geographic location of the mobile electronic device.
7. The device of claim 6 wherein the translator causes the graphic to be displayed when the accelerometer detects a change in the physical orientation of the device.
8. The device of claim 1 wherein the device has a security function that prevents access to other applications once the translator application has been launched, unless a predetermined pass-code is received from the user.
9. A machine-implemented method for operating a mobile electronic device, comprising:
prompting a user of the device to enter a word or phrase in a first language using an associated first soft keyboard that is being displayed right side up;
translating the entered word or phrase into a second language; and
when the device rotates by a predetermined amount, displaying right side up the translated word or phrase and a second soft keyboard that is associated with the second language.
10. The method of claim 9 wherein said displaying the translated word or phrase replaces the word or phrase in the first language, and the second soft keyboard replaces the first soft keyboard.
11. The method of claim 9 further comprising:
prompting another user of the device to enter a word or phrase in the second language using the second keyboard;
translating the word or phrase in the second language, into the first language; and
when the device rotates by another predetermined amount, displaying right side up the translated word or phrase in the first language and the first keyboard.
12. The method of claim 9 wherein prompting the user comprises displaying a conversation bubble or dialog box in which the word or phrase being entered by the user is simultaneously displayed.
13. The method of claim 9 further comprising:
computing a current geographic location of the mobile device; and
prompting the user to select any one of a plurality of languages to which the entered word or phrase is to be translated, wherein the plurality of languages are offered as a function of the computed current geographic location.
14. The method of claim 9 further comprising:
launching a translator application under which the prompting of the user, the translating and the displaying of the translated word or phrase are performed; and
preventing access to other applications once the translator application has been launched unless the user is again authenticated.
15. A mobile electronic device comprising:
a touch sensitive screen; and
a translator to translate a word or phrase, that has been entered via a virtual keyboard on the touch sensitive screen, into another language, wherein the translator has a first language mode of operation and a second language mode of operation, in the first language mode the touch sensitive screen displays a first virtual keyboard for a first language, in the second language mode the touch sensitive screen displays a second virtual keyboard for a second language, and wherein the translator switches between the first and second language modes in response to the device being rotated or undergoing a translation movement.
16. The device of claim 15 wherein the translator is to switch between the first and second modes in response to the device undergoing a rapid translation movement that is brought about by a user tapping the device.
17. The device of claim 15 wherein the translator is to switch between the first and second modes in response to the device undergoing a rotation by about 90 degrees or more in a plane of the touch screen.
18. A mobile electronic device comprising:
means for displaying information right side up, while the device is in a first physical orientation;
means for displaying said information right side up, while the device is in a second, different physical orientation;
means for receiving a word or phrase in a first human language, from a user of the device while the device is in the first physical orientation;
means for translating the word or phrase into a second human language; and
means for displaying the translated word or phrase right side up, while the device is in the second physical orientation.
Description
  • [0001]
    An embodiment of the invention relates to mobile or portable electronic devices having human language translation capability. Other embodiments are also described.
  • BACKGROUND
  • [0002]
A basic need for most people when traveling in a foreign country is human language translation. One may be riding in a taxi or about to purchase something when there is a sudden and important need to translate a phrase or statement. For example, one may want to ask the cab driver a question about the route he is taking, or wish to ask a salesperson about alternatives to a particular item he is presenting to you. A two-way, portable, electronic language translation device is very useful in such circumstances. Such a device has a display and a keyboard that allows the user to type in a word or phrase in the user's native language. The user then presses a button on the keyboard, and the word or phrase is translated by built-in data processing circuitry of the device into another language before being displayed. With the display showing the translated phrase, the user may then hand the device to the other party, who can read the translated phrase and then respond using the device in a reverse manner but in her own language.
  • SUMMARY
  • [0003]
An embodiment of the invention provides an enhanced translation experience using a mobile electronic device. In particular, the device provides translation “feedback” to its users in response to a change in the position or physical orientation of the device. In one embodiment, the mobile electronic device may be operated as follows. The device prompts a user to enter a word or phrase in a first human language, using a first soft keyboard that is associated with that language. The soft keyboard is displayed on a touch sensitive screen of the device. The entered word or phrase is then translated by the device into a second human language. The translated word or phrase, as well as a second soft keyboard (associated with the second language), is displayed when the device rotates by at least a predetermined amount or is moved in a predetermined manner. Thus, the device provides feedback to its users, as to the different languages that it supports, based at least in part on the movement, position or physical orientation of the device.
  • [0004]
For example, if the device were to have a rectangular shape, in one portrait orientation it may display a soft keyboard and prompt for the entry of a word or phrase in a first human language. When the device has been turned upside down into its other portrait orientation, the previously entered word or phrase is displayed, translated into another language, together with the soft keyboard of that other language. Thus, one user enters a word or phrase in her language, and then turns the device around to an orientation associated with a second language, and hands the device over in that orientation to another user (who is conversant in the second language). At that point, the device is displaying the translated word or phrase, in the second language. The other user can now enter a response in her own language, using the displayed second soft keyboard, and then hands the device back to the first user, turning it back to the original orientation. Thus, two users can communicate with each other via different human languages, by entering their statements into the same device but in different physical orientations, and signaling their intentions to translate simply by handing the device back and forth in the appropriate orientation.
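The back-and-forth flow described above can be sketched as a small state machine. This is a minimal illustrative sketch, not the patent's implementation; the `Translator` class, the orientation constants, and the injected `translate_fn` callback are all assumed names.

```python
# Illustrative sketch of the rotate-to-translate conversation flow:
# each of the two portrait orientations is bound to one language, and
# a change in orientation triggers translation into the other language.

UPRIGHT, UPSIDE_DOWN = "upright", "upside_down"

class Translator:
    """Tracks which language mode is active based on device orientation."""

    def __init__(self, lang_a, lang_b, translate_fn):
        # Each orientation is associated with one of the two languages.
        self.langs = {UPRIGHT: lang_a, UPSIDE_DOWN: lang_b}
        self.translate_fn = translate_fn
        self.orientation = UPRIGHT
        self.pending_text = ""

    def enter_text(self, text):
        """Text entered via the soft keyboard of the current language."""
        self.pending_text = text

    def on_rotate(self, new_orientation):
        """Called when the accelerometer reports a new orientation.
        Returns (translated text, keyboard language) to display right
        side up in the new orientation."""
        src = self.langs[self.orientation]
        dst = self.langs[new_orientation]
        self.orientation = new_orientation
        translated = self.translate_fn(self.pending_text, src, dst)
        self.pending_text = ""
        return translated, dst
```

In use, one user types in the first language, turns the device upside down, and the second user sees the translation plus the second-language keyboard; rotating back reverses the roles.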
  • [0005]
    The above summary does not include an exhaustive list of all aspects of the present invention. Indeed, it is contemplated that the invention includes all systems and methods that can be practiced from all suitable combinations of the various aspects summarized above, as well as those disclosed in the Detailed Description below and particularly pointed out in the claims filed with the application. Such combinations have particular advantages not specifically recited in the above summary.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0006]
    The embodiments of the invention are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings in which like references indicate similar elements. It should be noted that references to “an” or “one” embodiment of the invention in this disclosure are not necessarily to the same embodiment, and they mean at least one.
  • [0007]
    FIG. 1 shows a block diagram of various hardware and software components of a portable multifunction device, in accordance with an embodiment of the invention.
  • [0008]
    FIG. 2 is a top view of the portable multifunction device.
  • [0009]
    FIG. 3 shows the root or main menu of an example graphical user interface running in the device.
  • [0010]
    FIG. 4 shows an example of how the device may be used to translate between two different written languages, assisted by positional feedback.
  • [0011]
    FIG. 5 shows an example of using the device when translating among three different languages, using positional feedback.
  • [0012]
    FIG. 6 is a flow diagram of how the device may be operated to assist in human language translation.
  • DETAILED DESCRIPTION
  • [0013]
    In this section several embodiments of this invention are explained with reference to the appended drawings. Whenever the shapes, relative positions and other aspects of the parts described in the embodiments are not clearly defined, the scope of the invention is not limited only to the parts shown, which are meant merely for the purpose of illustration.
  • [0014]
    Before describing the various translation processes that can be implemented in a portable electronic device, Part I below gives an overview of the relevant components of the device in connection with FIGS. 1-3.
  • I. Overview of an Example Portable Device
  • [0015]
    FIG. 1 is a block diagram of an example portable multifunction device 100 that has the capability for translating human language with positional feedback. The device 100 may be a portable wireless communications device such as a cellular telephone that also contains other functions such as personal digital assistant and digital media (music and/or movie) playback functions. Not all of the functions described here are needed, as the device 100 could alternatively be a dedicated, handheld translator device, without having for instance cellular communications capability. The device 100 has memory 102 which may include random access memory, non-volatile memory such as disk storage, flash memory, and/or other suitable digital storage. Access to the memory 102 by other components of the device such as one or more processors 120 or peripheral interface 118 may be controlled by a memory controller 122. The latter components may be built into the same integrated circuit chip 104, or they may each be part of a separate integrated circuit package.
  • [0016]
    The peripheral interface 118 allows input and output peripherals of the device to communicate with the processors 120 and memory 102. In one example, there are one or more processors 120 that run or execute various software programs or sets of instructions (e.g., applications) that are stored in memory 102, to perform the various functions described below, with the assistance of or through the peripherals.
  • [0017]
The portable multifunction device 100 may have wireless communications capability enabled by radio frequency (RF) circuitry 108 that receives and sends RF signals via an integrated or built-in antenna of the device 100 (not shown). The RF circuitry may include RF transceivers, as well as digital signal processing circuitry that supports cellular network or wireless local area network protocol communications. The RF circuitry 108 may be used to communicate with networks such as the Internet, for example to access the World Wide Web. This may be achieved through either the cellular telephone communications network or a wireless local area network, for example. Different wireless communications standards may be implemented as part of the RF circuitry 108, including global system for mobile communications (GSM), enhanced data GSM environment (EDGE), high speed downlink packet access (HSDPA), code division multiple access (CDMA), Bluetooth, wireless fidelity (Wi-Fi), and Wi-Max.
  • [0018]
The device 100 in this example also includes audio circuitry 110 that provides an interface to acoustic transducers, such as a speaker 111 (a speaker phone, a receiver or a headset) and a microphone 113. These form the audio interface between a user of the device 100 and the various applications that may run in the device 100. The audio circuitry 110 serves to translate digital audio signals produced in the device (e.g., through operation of the processor 120 executing an audio-enabled application) into a format suitable for output to a speaker, and translates audio signals detected by the microphone 113 (e.g., when the user is speaking into the microphone) to digital signals suitable for use by the various applications running in the device. In some embodiments, the audio circuitry may also include a headset jack 212 (see FIG. 2), which enables sound output by a headset worn by the user of the device.
  • [0019]
The device 100 also has an I/O subsystem 106 that serves to communicatively couple various other peripherals in the device to the peripheral interface 118. The I/O subsystem 106 may have a display controller 156 that manages the low level processing of data that is displayed on a touch sensitive display system 112 and generated by a touch screen of the system 112. One or more input controllers 160 may be used to receive or send signals from and to other input control devices 116, such as physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joy sticks, click wheels, and so forth. In other embodiments, the input controller 160 may enable input and output to other types of devices, such as a keyboard, an infrared port, a universal serial bus (USB) port, or a pointer device such as a mouse. Physical buttons may include an up/down button for volume control of the speaker 111 and a sleep or power on/off button of the device. In contrast to these physical peripherals, the touch sensitive display system 112 (also referred to as the touch screen 112) is used to implement virtual or soft buttons and one or more soft keyboards as described below.
  • [0020]
    The touch sensitive screen 112 is part of a larger input interface and output interface between the device 100 and its user. The display controller 156 receives and/or sends electrical signals from/to the touch screen 112. The latter displays visual output to the user, for example, in the form of graphics, text, icons, video, or any combination thereof (collectively termed “graphics” or image objects). The touch screen 112 also has a touch sensitive surface, sensor, or set of sensors that accept input from the user based on haptic and/or tactile contact. These are aligned directly with the visual display, typically directly above the latter. The touch screen 112 and the display controller 156, along with any associated program modules and/or instructions in memory 102, detect contact, movement, and breaking of the contact on the touch sensitive surface. In addition, they convert the detected contact into interaction with user-interface objects (e.g., soft keys, program launch icons, and web pages) whose associated or representative image objects are being simultaneously displayed on the touch screen 112.
  • [0021]
The touch screen 112 may include liquid crystal display technology or light emitting polymer display technology, or other suitable display technology. The touch sensing technology may be capacitive, resistive, infrared, and/or surface acoustic wave. A proximity sensor array may also be used to determine one or more points of contact with the touch screen 112. The touch screen 112 may have a resolution in excess of 100 dpi. The user may make contact with the touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are generally less precise than stylus-based input due to the larger area of contact of a finger. The device in that case translates the rough finger-based input into a precise pointer/cursor position or command for performing the action desired by the user.
  • [0022]
    The device 100 has a power system 162 for supplying electrical power to its various components. The power system 162 may include a power management system, one or more replenishable or rechargeable power sources such as a battery or fuel cell, a replenishing system, a power or failure detection circuit, as well as other types of circuitry including power conversion and other components associated with the generation, management and distribution of electrical power in a portable device.
  • [0023]
    The device 100 may also include an optical sensor 164, shown in FIG. 1 as being communicatively coupled to the controller 158 in the I/O subsystem 106. The optical sensor may include a charge coupled device or complementary metal oxide semiconductor phototransistors, for example arranged in an array, onto which an optical image of a scene that is before the device is formed. Optical components are also provided including a lens, for example, that is built into the device and that is to bend light from the scene to form an image on the sensor 164. In conjunction with an imaging module 143 running in the device (also referred to as a “camera module”), the sensor 164 may capture still images or video of the scene that is before the device 100.
  • [0024]
The device 100 also includes one or more accelerometers 168. The accelerometer 168 is communicatively coupled to the peripheral interface 118 and can be accessed by a module being executed by the processor 120. The accelerometer 168 provides information or data about the physical orientation or position of the device, as well as rotation or movement of the device about an axis. This information may be used to detect that the device is, for example, in a vertical or portrait orientation (in the event the device is rectangular shaped) or in a horizontal or landscape orientation. On that basis, a graphics module 132 and/or a text input module 134 are able to display information “right side up” on the touch screen 112, regardless of whether the device is in any portrait or landscape orientation. The processing of the accelerometer data may be performed by the operating system 126, and in particular by a driver program that translates raw data from the accelerometer 168 into physical orientation information that can be used by various other modules of the device as described below. The operating system 126 may be an embedded operating system such as VxWorks, OS X, or others, and may include software components and/or drivers for controlling and managing the various hardware components of the device, including memory management, power management, and sensor management, as well as facilitating communication between various software components or modules.
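The driver's mapping from raw accelerometer data to a coarse orientation might look like the following sketch. This is an illustrative assumption, not the patent's driver code; the function name, the axis convention (x pointing right and y pointing up along the screen face, readings in g), and the 0.5 g threshold are all chosen here for clarity.

```python
# Hypothetical sketch: classify the gravity vector reported by a
# 3-axis accelerometer into one of four coarse screen orientations.

def classify_orientation(ax, ay, az, threshold=0.5):
    """Map gravity components (in g) to a screen orientation.
    Assumes x points right and y points up along the screen face,
    so an upright device sees gravity mostly along -y."""
    if ay <= -threshold:
        return "portrait"                # top of device pointing up
    if ay >= threshold:
        return "portrait_upside_down"    # device turned 180 degrees
    if ax >= threshold:
        return "landscape_left"
    if ax <= -threshold:
        return "landscape_right"
    return "flat"                        # device lying on a table
```

Higher layers (such as the graphics and text input modules) would consume only the returned orientation label when deciding how to draw content "right side up".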
  • [0025]
The device 100 shown in FIG. 1 may also include a communication module 128 that manages or facilitates communication with external devices over an external port 124. The external port 124 may include a universal serial bus port, a FireWire port, or other suitable technology, adapted for coupling directly to an external device. The external port 124 may include a multi-pin (e.g., a 30 pin) connector and associated circuitry typically used for docking the device 100 with a desktop personal computer.
  • [0026]
Turning now to the modules in more detail, the contact/motion module 130 may detect user initiated contact with the touch screen 112 (in conjunction with the display controller 156), and with other touch sensitive devices, e.g., a touchpad or physical click wheel. The contact/motion module 130 has various software components for performing operations such as determining if contact with the touch screen has occurred or has been broken, and whether there is movement of the contact, and tracking the movement across the touch screen. Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or acceleration of the point of contact. These operations may be applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., multi-touch or multiple finger contacts).
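The speed and velocity computation described above can be sketched from two successive touch samples. This is a minimal sketch under assumed conventions (samples as `(x, y, timestamp)` tuples); the function name is illustrative, not from the patent.

```python
# Hypothetical sketch: derive speed and velocity of a moving touch
# point from two successive (x, y, t) samples, as a contact/motion
# module might do when tracking a finger across the touch screen.
import math

def touch_velocity(p0, p1):
    """p0, p1 are (x, y, t) samples; returns (speed, (vx, vy)).
    Speed is the magnitude of the velocity vector."""
    (x0, y0, t0), (x1, y1, t1) = p0, p1
    dt = t1 - t0
    if dt <= 0:
        # Degenerate or out-of-order samples: report no motion.
        return 0.0, (0.0, 0.0)
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    return math.hypot(vx, vy), (vx, vy)
```

Acceleration could be estimated the same way from two successive velocity values.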
  • [0027]
    The graphics module 132 has various known software components for rendering and displaying graphics on the display of the touch screen 112 including, for example, icons of user interface objects such as soft keys and a soft keyboard. The text input module 134, which may be a component of graphics module 132, provides soft keyboards for entering text in different languages. Such soft keyboards are for use by various applications e.g., the contacts module 137 (address book updating), email client module 140 (composing an email message), browsing module 147 (typing in a web site universal resource locator), and a translation module 141 (for entering words or phrases to be translated).
  • [0028]
The GPS module 135 determines the geographic location of the device and provides this information for display or use by other applications, such as by the telephone module 138 for use in location-based dialing, and by applications that provide location-based services, such as a weather widget, local Yellow Pages widget, or map/navigation widgets (not shown). The widget modules 149 depicted here include a calculator widget which displays a soft keypad of a calculator and enables calculator functions, an alarm clock widget, and a dictionary widget that is associated or tied to the particular human language set in the device 100.
  • [0029]
    Other modules that may be provided in the device 100 include a telephone module 138 that is responsible for managing the placement of outbound calls and the receiving of inbound calls made over a wireless telephone network, e.g. a cellular telecommunications network. A calendar module 148 displays a calendar of events and lets the user define and manage events in her calendar. A music player module 146 may manage the downloading, over the Internet or from a local desktop personal computer, of digital media files, such as music and movie files, which are then played back to the user through the audio circuitry 110 and the touch sensitive display system 112.
  • [0030]
    It should be noted that each of the above-identified modules or applications correspond to a set of instructions to be executed by a machine such as the processor 120, for performing one or more of the functions described above. These modules or instructions need not be implemented as separate programs, but rather may be combined or otherwise rearranged in various combinations. For example, the text input module 134 may be integrated with the graphics module 132.
  • [0031]
    In one embodiment, the device 100 is such that most of its functions are performed exclusively through the touch screen 112 and/or a touchpad. By using the touch screen and/or touchpad as the primary input and output control device, the number of physical input and control devices, such as push buttons, dials, and the like on the device may be reduced. In some embodiments, the touchpad may be referred to as a “menu button”. In other embodiments this menu button may include a physical push button or other physical motion input control device, instead of a touchpad. This case is illustrated in the example of the device 100 shown in FIG. 2 where a home button 204, when pressed by the user, causes the display to show the main, home or root menu of the graphical user interface. This is depicted in FIG. 3 where the current time is displayed (image object 404) together with several widgets (image objects 349_3, 349_4, and 349_5) and icons for the phone, mail, browser, and music modules (image objects 338, 340, 347, and 346).
  • [0032]
    Referring back to FIG. 2, this top view of the example portable multifunction device shows that its touch screen 112 is surrounded by various sensors and peripheral devices, including optical sensor 164, proximity sensor 166, accelerometer 168, speaker 111 and microphone 113. The speaker 111 may be a receiver (earpiece) that is positioned near the top of the rectangular shaped device 100, while the microphone 113 is placed near the bottom, thus facilitating use of the device 100 as a conventional telephony handset when making or receiving wireless telephone calls. The device in this case also has a separate push button 206 for powering the device on and off and/or locking the device or placing the device into a sleep mode, a volume adjustment button 208, a subscriber identity module (SIM) card slot 210, a headset jack 212, and an external port 124 for docking and/or charging of the device.
  • [0033]
    Referring now to FIG. 3, a top view of the device 100 is shown, in the situation where the home button 204 has been actuated and the touch screen 112 is displaying the graphical or image objects associated with the home or root menu. The user interface processes running in the device 100, in this example, produce the following image objects that are displayed: current time 404, battery status indicator 406, wireless communications signal strength 402, and tray 408 containing icons for frequently used applications, in this case being phone 338, mail 340 which may include an indicator 410 of the number of unread email messages, browser 347, and music player 346. Higher up on the touch screen 112 are the widgets for calculator 349_3, alarm clock 349_4, and dictionary 349_5. These are, of course, just an example of what can be displayed in the root menu of the device. In many cases, the user can customize the root menu to display those user interface objects that are most frequently used. In this example, the user has chosen to also display an image object for the translation module 141 (labeled human language translator 341 in FIG. 3). Operation of the translation module in different embodiments of the invention is now described in connection with FIG. 4 and the flow diagram of FIG. 6.
  • II. Translation Capabilities
  • [0034]
    Referring now to FIG. 4, two instances of the same device 100 are shown, in different physical orientations. In the first instance on the left, the device 100 (in this case being rectangular) is held by its user (not shown) in a substantially vertical position, also referred to as portrait orientation. Operation begins with the device 100 being in its home state, depicting the home menu, for example, as shown in FIG. 3. Next, the user launches the translator application (included in translation module 141) by gesturing on image object 341 (block 602 in FIG. 6). This results in the touch screen displaying the objects 404, 408 and 406 as depicted on the left hand side of FIG. 4.
  • [0035]
Object 404 is a prompt box, dialog box or conversation bubble which, when gestured or selected, activates a cursor 405 and allows a first user to enter a word or phrase in a first human language. The first language may be the default or base language of the device 100 (e.g., that was selected when the first user, as the owner of the device, first purchased the device). The first user then gestures letters and punctuation in the first human language, using a first virtual keyboard that is displayed as object 406 (block 604). In addition, there may also be a further object 408 that is a prompt box through which the user selects the target language into which the phrase shown in object 404 will be translated. For example, a drop down list of different human languages may appear when the object 408 is selected or gestured by the user. In this example, the target language that has been selected is Chinese (block 606).
  • [0036]
    The device 100, in particular the translation module 141, may be designed with the capability of translating between several different human languages. This translation capability may be implemented entirely within the device 100, using a local translation database that is part of the translation module 141 and is executed by the processor 120. Such a local translation database (which may be a partial or basic one) may be automatically downloaded into the device 100, for example as part of a device software and firmware installation or a routine update cycle. However, some or all of the language translation capability may be implemented remotely, for example, by a remote server. The device 100 would then access the server over a local area network or a wide area network (e.g., the Internet) for a particular translation task. For instance, the translation module 141 may seek assistance from the remote server when faced with unknown words or phrases. In another embodiment, the entire word or phrase of a particular translation task may be transmitted to the remote server, such that all of the translation function is performed remotely. Connections may be established with the server through a wireless communications channel, such as one over a cellular network or a wireless local area network.
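The hybrid scheme described above — a local phrase table consulted first, with a remote server as fallback for unknown phrases — can be sketched as follows. The table contents and the `fetch_remote_translation()` helper are hypothetical placeholders, not APIs from the patent.

```python
# Local-first translation with remote fallback, as described above.
# LOCAL_TABLE stands in for the partial translation database that may be
# downloaded with a software update; the remote call is a placeholder.

LOCAL_TABLE = {
    ("en", "es"): {"good morning": "buenos días"},
}

def fetch_remote_translation(phrase, src, dst):
    # Placeholder for a network round-trip to a translation server.
    raise ConnectionError("no network in this sketch")

def translate(phrase, src, dst):
    """Try the local lookup first; defer to the remote server on a miss."""
    local = LOCAL_TABLE.get((src, dst), {})
    hit = local.get(phrase.lower())
    if hit is not None:
        return hit
    try:
        return fetch_remote_translation(phrase, src, dst)
    except ConnectionError:
        return None  # untranslatable while offline

print(translate("Good morning", "en", "es"))         # local hit
print(translate("Where is the station?", "en", "es"))  # remote miss -> None
```

A real device would cache remote results into the local table so that repeated phrases resolve offline.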
  • [0037]
    Returning to the user process, once the word or phrase to be translated has been entered in the dialog box, the translation module may then proceed with translating the word or phrase into the target language selected by the user (block 608). The device 100 may prompt the user for a further input (e.g., a “Done” command box is selected or gestured by the user) before translation can be completed. As an alternative, the translation may not be completed until after and in response to the device 100 detecting that it has been rotated by at least a predetermined amount. In either case, the touch screen 112 does not display the translated text until after the device detects that it has been rotated by the predetermined amount, e.g. 180 degrees or turned upside down within the plane of the touch screen 112 as shown (block 610). After performing the translation, the translated word or phrase may be displayed in the same or a similar dialog box 404. Thus, in this case, although the device 100 has been turned upside down, the dialog box 404 displays the translated word or phrase right side up. The translation module 141 may in this case need to interact with the graphics module 132 and the text input module 134, to ensure that the translated word or phrase is displayed right side up when the touch screen 112 has been turned upside down.
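The rotation check in block 610 can be sketched by classifying the device's in-plane orientation from a gravity vector reported by the accelerometer, and releasing the translated text only once the device has turned by the predetermined amount (180 degrees here). The thresholds and axis conventions are assumptions, not values from the patent.

```python
import math

def screen_angle(ax, ay):
    """Angle of gravity within the screen plane, in degrees in [0, 360)."""
    return math.degrees(math.atan2(ax, ay)) % 360

def rotated_enough(start_angle, current_angle, threshold=180, tolerance=20):
    """True once the device has turned by roughly the predetermined amount."""
    delta = abs(current_angle - start_angle) % 360
    delta = min(delta, 360 - delta)  # shortest angular distance
    return abs(delta - threshold) <= tolerance

start = screen_angle(0.0, 9.8)    # portrait, right side up
now = screen_angle(0.0, -9.8)     # turned upside down in the screen plane
print(rotated_enough(start, now)) # True -> display the translated text
```

The tolerance band keeps the check from flickering near the exact threshold as the user's hand settles.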
  • [0038]
    Still referring to FIG. 4, when the device 100 has been rotated by the predetermined amount such that the translation module will cause the translated word or phrase to be displayed right side up, a second soft keyboard that is associated with the second human language into which the word or phrase has been translated is also displayed on the touch screen 112 (block 610). This soft keyboard may replace the original keyboard, in the user interface object 406.
  • [0039]
    At this point, the second user can read the word or phrase of the first user, and then respond by entering a word or phrase, using the second soft keyboard that is depicted by the image object 406 (block 612). This response may be entered in the same dialog box depicted by the image object 404, or it may be in a separate dialog box that contains only the input from the second user. Also, in this two-way translation situation, there may be no need to prompt the second user to select a different target language, because in a two-way translation the “target” language for the second user is in effect the original language used by the first user. Next, the second user turns the device back to its first position or orientation, and the device completes translation of the word or phrase (entered using the second keyboard) into the first language (block 614). The translated word or phrase is then displayed, together with the first keyboard, in the first orientation (block 616).
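The two-way exchange of blocks 604-616 amounts to a small state machine: each rotation hands the conversation to the other language, and the pending text is translated for whichever orientation the device lands in. A minimal sketch, in which the phrasebook and `translate()` stub stand in for the module's real translator:

```python
# Toy phrasebook standing in for the translation module's database.
PHRASEBOOK = {("en", "zh"): {"hello": "你好"}, ("zh", "en"): {"你好": "hello"}}

def translate(phrase, src, dst):
    return PHRASEBOOK[(src, dst)].get(phrase, phrase)

class TwoWaySession:
    """Two users, two orientations, one language pinned to each."""

    def __init__(self, lang_up="en", lang_down="zh"):
        self.langs = {"up": lang_up, "down": lang_down}
        self.orientation = "up"
        self.pending = None

    def enter(self, phrase):
        self.pending = phrase  # blocks 604/612: text typed in current language

    def rotate(self):
        # Blocks 608-616: flipping the device triggers the translation
        # into the language pinned to the new orientation.
        src = self.langs[self.orientation]
        self.orientation = "down" if self.orientation == "up" else "up"
        dst = self.langs[self.orientation]
        return translate(self.pending, src, dst)

s = TwoWaySession()
s.enter("hello")
print(s.rotate())  # 你好, shown right side up with the Chinese keyboard
```

Note that no target-language prompt is needed: as the text above observes, the "target" for each user is simply the language pinned to the other orientation.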
  • [0040]
    Thus, it can be seen that a conversation between two users speaking different languages can be facilitated using the positional feedback provided by the device 100: one user enters her message, rotates the device and hands it over to the other user; the other user then enters her own message, rotates the device back into the original orientation, and hands the device back to the first user.
  • [0041]
    Turning now to FIG. 5, this figure shows how the device 100 may be designed to allow a three-way conversation between three different users, in three different human languages. At the left side of FIG. 5 the device 100 is shown in a vertical orientation (portrait view), where an English speaking user has entered a question into the dialog box 404, using the English virtual keyboard depicted by image object 406 on the touch screen. The user has selected Spanish to be the target language, as depicted by the image object 408. Next, the device 100 is rotated by 90 degrees in a clockwise direction in the plane of its touch screen, resulting in a horizontal orientation (landscape view). This is depicted by the instance of the device 100 in the middle portion of FIG. 5. In this example, the English phrase “Where should we go for dinner tonight?” has been replaced in the dialog box by its Spanish translation “Dónde debemos ir nosotros para cena esta noche?” Further, the horizontal orientation has been detected by the device 100 (in particular, by the translation module 141 processing data output by the accelerometer 168, see FIG. 1), and thus the English keyboard has been replaced within the image object 406 with a Spanish language keyboard. Also, given that this is a three-way conversation, the dialog box represented by image object 408, which prompts the user to select a target language, may be retained, except that now the dialog box prompts the user in Spanish. In this situation, the third user of the device 100 is Chinese speaking, so the second user, who is using the device 100 in its horizontal orientation, selects “Chino” as the target language in the dialog box represented by image object 408. Next, the second user rotates the device 100 another 90 degrees in a clockwise direction, resulting in the vertical orientation shown at the right side of FIG. 5, except that now the device 100 is upside down relative to the original orientation used by the English speaking user. Now, in this orientation used by the Chinese speaking user, the device has replaced the Spanish phrase in the dialog box represented by image object 404 with its Chinese translation. In addition, the dialog box for the target language (image object 408) now prompts the user, in Chinese, to select the target language. A response by the Chinese speaking user may now be entered into the device 100, using the Chinese soft keyboard depicted by image object 406, which has replaced the Spanish soft keyboard. The Chinese speaking user then turns the device back to either the English or Spanish orientation, resulting in the translation of the Chinese phrase into English or Spanish, respectively.
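The three-way handoff of FIG. 5 can be sketched as a cycle through the three orientations, each step selecting the keyboard and prompt language for the new position. The mapping below mirrors the English/Spanish/Chinese example; the prompt strings are illustrative, not taken from the patent.

```python
# The three orientations used in FIG. 5, in clockwise order.
ORIENTATIONS = ["portrait", "landscape", "portrait-inverted"]

# Each orientation carries a keyboard language and a localized prompt
# for the target-language dialog box (image object 408).
LANGUAGE_FOR = {
    "portrait": ("English", "Select target language"),
    "landscape": ("Spanish", "Seleccione el idioma"),
    "portrait-inverted": ("Chinese", "选择目标语言"),
}

def rotate_cw(orientation):
    """Advance one 90-degree clockwise step through the used orientations."""
    i = ORIENTATIONS.index(orientation)
    return ORIENTATIONS[(i + 1) % len(ORIENTATIONS)]

o = "portrait"
for _ in range(2):          # two clockwise quarter-turns, as in FIG. 5
    o = rotate_cw(o)
lang, prompt = LANGUAGE_FOR[o]
print(lang)                 # Chinese
```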
  • [0042]
    In another embodiment, rather than giving the second and third users the option to select the target language, the translation module may be designed so that only the first user (in this case, the owner) of the device 100 may define the target languages of the translation conversation. Thus, the first user could select, for example, English to be the language in the vertical orientation, Spanish for the horizontal orientation, and Chinese for the vertical but upside down orientation. Once set in this manner, there would be no need for any individual user to thereafter manually select a target language, as the translation module has stored the predetermined association between the different orientations and their respective languages. The device would thus automatically translate the phrase entered in one orientation into the language of the next orientation. This means that users would simply type in their response to a given statement that has been translated by the device and, after deciding which language they would like their response to be translated into, rotate the device into the respective orientation. For example, if the first user (owner) of the device 100 would like her English statement to be translated into Chinese, then, based on the association that has been previously stored or defined, she would rotate the device 180 degrees into the vertical but upside down orientation. The device 100 would then automatically display the phrase translated into Chinese, in response to the device taking on that orientation. If the Chinese speaking user would like to make her response known to the Spanish speaking user, she would enter it using the displayed Chinese keyboard and then rotate the device into the horizontal orientation (and hand the device over to the Spanish speaking user). At that point, the Spanish speaking user may decide not to enter any further response, but could simply pass on the Chinese speaking user's message to the English speaking user, by rotating the device back into its original vertical orientation and handing the device back to the English speaking user. The device in that situation would detect the original vertical orientation, which is associated with the English language, and would translate either a Spanish phrase or, if none was entered as in this case, the previously entered Chinese phrase, into English.
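The owner-configured variant above can be sketched as a stored orientation-to-language table: rotating the device into an orientation translates the most recently entered phrase into that orientation's language, even if an intermediate user typed nothing. The `translate()` stub is a stand-in for the device's translator.

```python
def translate(phrase, src, dst):
    # Toy stand-in for the translation module.
    table = {("zh", "en"): {"你好": "hello"}}
    return table.get((src, dst), {}).get(phrase, phrase)

class OrientationTranslator:
    """Owner pins a language to each orientation once, up front."""

    def __init__(self, owner_mapping):
        self.mapping = owner_mapping  # orientation -> language code
        self.last_phrase = None
        self.last_lang = None

    def enter(self, phrase, orientation):
        self.last_phrase = phrase
        self.last_lang = self.mapping[orientation]

    def rotate_to(self, orientation):
        """Translate the last entry into the new orientation's language."""
        if self.last_phrase is None:
            return None
        return translate(self.last_phrase, self.last_lang,
                         self.mapping[orientation])

dev = OrientationTranslator(
    {"portrait": "en", "landscape": "es", "portrait-inverted": "zh"})
dev.enter("你好", "portrait-inverted")
dev.rotate_to("landscape")        # Spanish user reads, types nothing...
print(dev.rotate_to("portrait"))  # ...so the Chinese phrase reaches English
```

Because the source language travels with the phrase rather than with the device's current orientation, a silent pass-through user does not break the chain, matching the scenario described above.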
  • [0043]
    As explained above, various embodiments of a mobile electronic device that have enhanced translation capability have been described. In one embodiment, the device has the following integrated hardware circuit and software components for achieving such capability. First, there is a means for displaying information right side up, while the device is in a first physical orientation, and for displaying the same information right side up in a different, second physical orientation. This may include a graphical user interface using a touch screen and associated display software that can display the same information right side up, in each of two different physical orientations of the device. In other words, despite the different physical orientations, the same information or the same type of information can be displayed right side up while the device is in each orientation. For example, as seen in FIG. 4, the contents of the dialog box which is a word or phrase entered by a user is displayed right side up even though the device is in different physical orientations, respectively. The “information” in this case is the same except that it is being represented in different written human languages. To enable such a capability, the user interface may be viewed as including the graphics module 132 and text input module 134, stored in memory 102, to be executed by one of the processors 120. The translation module 141 would cooperate with the graphics module 132 and the text input module 134, to display the appropriate text in the dialog box (represented by image object 404 in FIG. 4 and FIG. 5).
  • [0044]
    The device 100 also includes a means for receiving a word or phrase in a first human language, from a user of the device while the device is in the first physical orientation. In the above embodiment, the input word or phrase is entered using a virtual (soft) keyboard that is displayed on the touch screen by the text input module 134. The particular language of the keyboard may be selected by the translation module 141, whereas the actual details of displaying and receiving input from the selected keyboard may be governed by the graphics module 132 and the text input module 134. The latter may also be responsible for providing the dialog box into which the word or phrase is entered. Note that this may be the same dialog box or same conversation bubble that displays a word or phrase from a previous user of the device (in a different language). As an alternative or in addition to using the soft keyboard (touch) input, there may be an audio module (not shown) in the memory 102 that when executed by the processor 120 interfaces with the microphone 113, to receive the input word or phrase from the user as audible speech. In other words, the device 100 in this case records the user speaking the word or phrase to be translated, rather than entering it through the touch screen.
  • [0045]
    The device 100 also includes means for translating the word or phrase that has been entered, into a second human language. This may be achieved by automatic translator software that is part of the translation module 141. The translator software may perform an internal or local lookup of a stored table of words and/or phrases in one language, to yield their associated words and/or phrases in another language. Alternatively, or in addition, the translator software could send some or all of the words or phrases to a remote server as explained above, to obtain the translation terms remotely.
  • [0046]
    The device 100 also includes means for displaying the translated word or phrase right side up, while the device is in the second physical orientation. Here, the mechanism would include the user interface, together with the translation module 141 detecting that the accelerometer 168 has signaled a change in the physical orientation of the device into the second predetermined physical orientation. The accelerometer is an example of a means for detecting this change in orientation. In response, the translation module would signal the graphics and text input modules 132, 134 to change the virtual keyboard into the one associated with the second physical orientation, and to display in a dialog box the translated word or phrase right side up (in view of the second physical orientation).
  • [0047]
    There are some subtleties to the manner in which the device transitions between displaying a word or a phrase in one language and a translated version. In one case, the physical orientation of the device must change first, before the translated phrase is displayed. In other words, the device may continue to display in a “first language mode” until it has been rotated around to the second, predetermined orientation. Only then will the display change to a “second language mode”. The device 100 in that case would have a means for displaying the translated word or phrase, for example, in the landscape view, in response to the device taking on a horizontal orientation or position.
  • [0048]
    In yet another embodiment of the invention, the device 100 has a RF-based locator that can compute the current geographic location of the device. The locator may use RF-based triangulation techniques or GPS techniques, to compute the current location. The current location is then mapped to one or more human languages that are believed to be spoken in and around the area of the location. For example, if the user/owner of the device travels with the device to Tibet, then the RF-based locator could automatically detect Tibet as the location. A previously stored look up table in the device 100 would have associated Tibet with both Mandarin and Tibetan languages. Thus, the device, and in particular the translation module 141, would cause the user interface to display a graphic that prompts the user to select either one of Mandarin and Tibetan into which an entered word or phrase will be translated. In other words, the possible target languages that are available for translation would be offered to the user automatically by the device, as a function of the computed current geographic location of the device.
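The location-to-language offering described above reduces to a stored lookup table mapping a computed geographic region to the languages likely spoken there, from which the UI builds its list of candidate target languages. The region names, language lists, and `locate_region()` helper below are illustrative placeholders.

```python
# Stored lookup table: region -> languages believed spoken there.
REGION_LANGUAGES = {
    "Tibet": ["Mandarin", "Tibetan"],
    "Quebec": ["French", "English"],
}

def locate_region():
    # Placeholder for the RF triangulation / GPS fix described above.
    return "Tibet"

def offered_target_languages(default=("English",)):
    """Candidate target languages as a function of current location."""
    region = locate_region()
    return REGION_LANGUAGES.get(region, list(default))

print(offered_target_languages())  # ['Mandarin', 'Tibetan']
```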
  • [0049]
    In another embodiment, the device has a built-in security function that prevents access to other applications once the translator application has been launched, unless a predetermined pass code is received from the user or the user is otherwise re-authenticated. This is because, when the device is operating in one of its language translation modes, it will be handed to at least one other person as part of the positional feedback translation processes described above. The security function prevents another user from accessing any application other than the translation application that is currently running.
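The lockout behavior can be sketched as a simple guard around app launches: once the translator is in the foreground, every attempt to open another application is refused until the owner's pass code is re-entered. The pass code handling here is a toy placeholder, not a real authentication scheme.

```python
class TranslationLock:
    """Restrict the device to the translator app while it changes hands."""

    def __init__(self, passcode):
        self._passcode = passcode
        self.locked = False

    def launch_translator(self):
        self.locked = True  # device may now be handed to another person

    def open_app(self, name, passcode=None):
        if name == "translator" or not self.locked:
            return f"{name} opened"
        if passcode == self._passcode:
            self.locked = False  # owner re-authenticated
            return f"{name} opened"
        return "locked: translator only"

d = TranslationLock(passcode="1234")
d.launch_translator()
print(d.open_app("mail"))          # refused while locked
print(d.open_app("mail", "1234"))  # owner re-authenticates
```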
  • [0050]
    In addition to a multiple user scenario as described above, the device and its positional feedback translation capability may also be used solely by its owner, to teach himself another language. The owner/single user could simply learn the words or phrases that have been translated by the device, in response to that user rotating or moving the device in a manner that causes the translator to respond accordingly. As another alternative, the device may be used by its owner as a personal translator tool, to, for example, translate words and characters that appear on signs or billboards while in a foreign country.
  • [0051]
    The above-described embodiments of the invention in connection with FIGS. 4-6 refer to the device being rotated or turned by its user when switching the device between different language translation modes of operation. In another embodiment, switching between these modes occurs not in response to a rotation or turning movement about a pivot point, but rather in response to a translational movement of the device. In that case, the device need not be rotated or turned about any axis; it may simply be tapped by its user, causing a rapid translational movement of the device that is also detectable by the built-in accelerometer.
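Distinguishing such a tap from a rotation can be sketched with a magnitude check: a tap produces a brief acceleration spike beyond 1 g without any sustained change in the gravity direction, so a threshold over a short sample window suffices. The threshold value is an assumption for illustration.

```python
import math

GRAVITY = 9.8  # m/s^2

def is_tap(samples, spike=4.0):
    """True if any (ax, ay, az) sample deviates sharply from 1 g."""
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if abs(magnitude - GRAVITY) > spike:
            return True
    return False

resting = [(0.0, 0.0, 9.8)] * 5
tapped = resting + [(0.0, 0.0, 18.0)] + resting
print(is_tap(resting), is_tap(tapped))  # False True
```

A slow rotation keeps the magnitude near 1 g throughout, so it never trips this check; only the impulsive translational movement of a tap does.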
  • [0052]
    An embodiment of the invention may be a machine-readable medium having stored thereon instructions which program a processor to perform some of the operations described above. In other embodiments, some of these operations might be performed by specific hardware components that contain hardwired logic. Those operations might alternatively be performed by any combination of programmed computer components and custom hardware components.
  • [0053]
    A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, Compact Disc Read-Only Memory (CD-ROM), Read-Only Memory (ROM), Random Access Memory (RAM), and Erasable Programmable Read-Only Memory (EPROM).
  • [0054]
    The invention is not limited to the specific embodiments described above. For example, although an accelerometer has been described above as the means for detecting a change in the physical orientation or position of the device (that may have been brought about by the user, for example, rotating or turning or tapping the device), other means for detecting movement or changes in the position of the device may alternatively be used. Although the examples depicted in FIGS. 4 and 5 take advantage of the rectangular shape of the example device 100, in that the predetermined orientations of the device are vertical, horizontal, and upside down vertical, orientations that are different from each other by other than 90 degrees may also be feasible depending upon the external shape or peripheral shape of the device 100. Other distinct, predetermined orientations that can be comfortably associated with the shape of the device are possible, e.g. a triangular device could have three distinct orientations each corresponding to a different side of the device being held horizontal by a user. Accordingly, other embodiments are within the scope of the claims.
Patent Citations
Cited PatentFiling datePublication dateApplicantTitle
US2677369 *26 Mar 19524 May 1954Fred L KnowlesApparatus for treatment of the spinal column
US3648691 *24 Feb 197014 Mar 1972Univ Colorado State Res FoundMethod of applying vertebral appliance
US4011602 *6 Oct 197515 Mar 1977Battelle Memorial InstitutePorous expandable device for attachment to bone tissue
US4159536 *8 Apr 197726 Jun 1979Willard E. KehoePortable electronic language translation device
US4257409 *9 Apr 197924 Mar 1981Kazimierz BacalDevice for treatment of spinal curvature
US4573454 *17 May 19844 Mar 1986Hoffman Gregory ASpinal fixation apparatus
US4657550 *16 Jan 198614 Apr 1987Daher Youssef HButtressing device usable in a vertebral prosthesis
US4827918 *14 Aug 19869 May 1989Sven OlerudFixing instrument for use in spinal surgery
US4931055 *1 Jun 19875 Jun 1990John BumpusDistraction rods
US5011484 *10 Oct 198930 Apr 1991Breard Francis HSurgical implant for restricting the relative movement of vertebrae
US5092866 *2 Feb 19903 Mar 1992Breard Francis HFlexible inter-vertebral stabilizer as well as process and apparatus for determining or verifying its tension before installation on the spinal column
US5098433 *12 Apr 198924 Mar 1992Yosef FreedlandWinged compression bolt orthopedic fastener
US5201734 *14 May 199113 Apr 1993Zimmer, Inc.Spinal locking sleeve assembly
US5306275 *31 Dec 199226 Apr 1994Bryan Donald WLumbar spine fixation apparatus and method
US5390683 *21 Feb 199221 Feb 1995Pisharodi; MadhavanSpinal implantation methods utilizing a middle expandable implant
US5395370 *16 Oct 19927 Mar 1995Pina Vertriebs AgVertebral compression clamp for surgical repair to damage to the spine
US5415661 *24 Mar 199316 May 1995University Of MiamiImplantable spinal assist device
US5496318 *18 Aug 19935 Mar 1996Advanced Spine Fixation Systems, Inc.Interspinous segmental spine fixation device
US5518498 *7 Oct 199321 May 1996Angiomed AgStent set
US5609634 *30 Jun 199311 Mar 1997Voydeville; GillesIntervertebral prosthesis making possible rotatory stabilization and flexion/extension stabilization
US5609635 *7 Jun 199511 Mar 1997Michelson; Gary K.Lordotic interbody spinal fusion implants
US5628756 *29 Jul 199613 May 1997Smith & Nephew Richards Inc.Knotted cable attachment apparatus formed of braided polymeric fibers
US5645599 *22 Apr 19968 Jul 1997FixanoInterspinal vertebral implant
US5707390 *5 Jun 199513 Jan 1998General Surgical Innovations, Inc.Arthroscopic retractors
US5716416 *10 Sep 199610 Feb 1998Lin; Chih-IArtificial intervertebral disk and method for implanting the same
US5860977 *27 Oct 199719 Jan 1999Saint Francis Medical Technologies, LlcSpine distraction implant and method
US6022376 *16 Mar 19988 Feb 2000Raymedica, Inc.Percutaneous prosthetic spinal disc nucleus and method of manufacture
US6048342 *27 Oct 199811 Apr 2000St. Francis Medical Technologies, Inc.Spine distraction implant
US6068630 *20 Oct 199830 May 2000St. Francis Medical Technologies, Inc.Spine distraction implant
US6190414 *31 Oct 199620 Feb 2001Surgical Dynamics Inc.Apparatus for fusion of adjacent bone structures
US6214050 *11 May 199910 Apr 2001Donald R. HueneExpandable implant for inter-bone stabilization and adapted to extrude osteogenic material, and a method of stabilizing bones while extruding osteogenic material
US6352537 *17 Sep 19985 Mar 2002Electro-Biology, Inc.Method and apparatus for spinal fixation
US6364883 *23 Feb 20012 Apr 2002Albert N. SantilliSpinous process clamp for spinal fusion and method of operation
US6375682 *6 Aug 200123 Apr 2002Lewis W. FleischmannCollapsible, rotatable and expandable spinal hydraulic prosthetic device
US6402750 *4 Apr 200011 Jun 2002Spinlabs, LlcDevices and methods for the treatment of spinal disorders
US6419704 *8 Oct 199916 Jul 2002Bret FerreeArtificial intervertebral disc replacement methods and apparatus
US6520991 *9 Apr 200118 Feb 2003Donald R. HueneExpandable implant for inter-vertebral stabilization, and a method of stabilizing vertebrae
US6554833 *16 Jul 200129 Apr 2003Expanding Orthopedics, Inc.Expandable orthopedic device
US6582433 *9 Apr 200124 Jun 2003St. Francis Medical Technologies, Inc.Spine fixation device and method
US6582467 *31 Oct 200124 Jun 2003Vertelink CorporationExpandable fusion cage
US6685742 *12 Nov 20023 Feb 2004Roger P. JacksonArticulated anterior expandable spinal fusion cage system
US6695842 *26 Oct 200124 Feb 2004St. Francis Medical Technologies, Inc.Interspinous process distraction system and method with positionable wing and method
US6709435 *28 Mar 200223 Mar 2004A-Spine Holding Group Corp.Three-hooked device for fixing spinal column
US6723126 *1 Nov 200220 Apr 2004Sdgi Holdings, Inc.Laterally expandable cage
US6730126 *12 Feb 20034 May 2004Frank H. Boehm, Jr.Device and method for lumbar interbody fusion
US6733534 *29 Jan 200211 May 2004Sdgi Holdings, Inc.System and method for spine spacing
US6736818 *10 May 200218 May 2004Synthes (U.S.A.)Radially expandable intramedullary nail
US6758863 *12 Dec 20026 Jul 2004Sdgi Holdings, Inc.Vertically expanding intervertebral body fusion device
US6761720 *13 Oct 200013 Jul 2004Spine NextIntervertebral implant
US6905512 *17 Jun 200214 Jun 2005Phoenix Biomedical CorporationSystem for stabilizing the vertebral column including deployment instruments and variable expansion inserts therefore
US6981985 *22 Jan 20023 Jan 2006Boston Scientific Scimed, Inc.Stent bumper struts
US7011685 *5 Jan 200414 Mar 2006Impliant Ltd.Spinal prostheses
US7041136 *23 Apr 20039 May 2006Facet Solutions, Inc.Facet joint replacement
US7048736 *17 May 200223 May 2006Sdgi Holdings, Inc.Device for fixation of spinous processes
US7081120 *12 Dec 200225 Jul 2006Sdgi Holdings, Inc.Instrumentation and method for delivering an implant into a vertebral space
US7162412 *10 Dec 20019 Jan 2007Evidence CorporationMultilingual conversation assist system
US7163558 *28 Nov 200216 Jan 2007Abbott SpineIntervertebral implant with elastically deformable wedge
US7201751 *26 Apr 200110 Apr 2007St. Francis Medical Technologies, Inc.Supplemental spine fixation device
US7217293 *21 Nov 200315 May 2007Warsaw Orthopedic, Inc.Expandable spinal implant
US7238204 *12 Jul 20013 Jul 2007Abbott SpineShock-absorbing intervertebral implant
US7401300 *9 Jan 200415 Jul 2008Nokia CorporationAdaptive user interface input device
US7656393 *23 Jun 20062 Feb 2010Apple Inc.Electronic device having display and surrounding touch sensitive bezel for user interface and control
US7962179 *28 Aug 200714 Jun 2011Htc CorporationHandheld electronic device
US20020021278 *6 Jun 200121 Feb 2002Hinckley Kenneth P.Method and apparatus using multiple sensors in a device with a display
US20030046401 *16 Oct 20016 Mar 2003Abbott Kenneth H.Dynamically determing appropriate computer user interfaces
US20040097931 *14 Oct 200320 May 2004Steve MitchellInterspinous process and sacrum implant and method
US20040127198 *7 Apr 20031 Jul 2004Roskind James A.Automatically changing a mobile device configuration based on environmental condition
US20040133204 *25 Jul 20038 Jul 2004Davies John Bruce ClayfieldExpandable bone nails
US20040133817 *8 Oct 20038 Jul 2004Samsung Electronics Co., Ltd.Portable computer managing power consumption according to display part positions and control method thereof
US20050010293 *20 May 200413 Jan 2005Zucherman James F.Distractible interspinous process implant and method of implantation
US20050049708 *15 Oct 20043 Mar 2005Atkinson Robert E.Devices and methods for the treatment of spinal disorders
US20050165398 *24 Jan 200528 Jul 2005Reiley Mark A.Percutaneous spine distraction implant systems and methods
US20060004447 *30 Jun 20045 Jan 2006Depuy Spine, Inc.Adjustable posterior spinal column positioner
US20060004455 *9 Jun 20055 Jan 2006Alain LeonardMethods and apparatuses for bone restoration
US20060015181 *19 Jul 200419 Jan 2006Biomet Merck France (50% Interest)Interspinous vertebral implant
US20060026535 *18 Jan 20052 Feb 2006Apple Computer Inc.Mode-based graphical user interfaces for touch sensitive input devices
US20060033724 *16 Sep 200516 Feb 2006Apple Computer, Inc.Virtual input device placement on a touch screen user interface
US20060064165 *31 Mar 200523 Mar 2006St. Francis Medical Technologies, Inc.Interspinous process implant including a binder and method of implantation
US20060084983 *20 Oct 200420 Apr 2006The Board Of Trustees Of The Leland Stanford Junior UniversitySystems and methods for posterior dynamic stabilization of the spine
US20060084985 *6 Dec 200420 Apr 2006The Board Of Trustees Of The Leland Stanford Junior UniversitySystems and methods for posterior dynamic stabilization of the spine
US20060084987 *10 Jan 200520 Apr 2006Kim Daniel HSystems and methods for posterior dynamic stabilization of the spine
US20060084988 *10 Mar 200520 Apr 2006The Board Of Trustees Of The Leland Stanford Junior UniversitySystems and methods for posterior dynamic stabilization of the spine
US20060085069 *4 Feb 200520 Apr 2006The Board Of Trustees Of The Leland Stanford Junior UniversitySystems and methods for posterior dynamic stabilization of the spine
US20060089654 *25 Oct 200527 Apr 2006Lins Robert EInterspinous distraction devices and associated methods of insertion
US20060089719 *21 Oct 200427 Apr 2006Trieu Hai HIn situ formation of intervertebral disc implants
US20060106381 *4 Feb 200518 May 2006Ferree Bret AMethods and apparatus for treating spinal stenosis
US20060106397 *2 Dec 200518 May 2006Lins Robert EInterspinous distraction devices and associated methods of insertion
US20060111728 *5 Oct 200525 May 2006Abdou M SDevices and methods for inter-vertebral orthopedic device placement
US20060116690 *20 Jan 20061 Jun 2006Pagano Paul JSurgical instrumentation and method for treatment of a spinal structure
US20060122620 *6 Dec 20048 Jun 2006The Board Of Trustees Of The Leland Stanford Junior UniversitySystems and methods for stabilizing the motion or adjusting the position of the spine
US20060136060 *3 Sep 200322 Jun 2006Jean TaylorPosterior vertebral support assembly
US20070004451 *30 Jun 20054 Jan 2007C Anderson EricControlling functions of a handheld multifunction device
US20070005849 *29 Jun 20054 Jan 2007Microsoft CorporationInput device with audio capablities
US20070061487 *1 Feb 200615 Mar 2007Moore James FSystems and methods for use of structured and unstructured distributed data
US20070136064 *8 Dec 200614 Jun 2007Carroll David WMobile personal computer with movement sensor
US20070151116 *9 Jun 20065 Jul 2007Malandain Hugues FMeasurement instrument for percutaneous surgery
US20080165144 *19 Dec 200710 Jul 2008Scott ForstallPortrait-Landscape Rotation Heuristics for a Portable Multifunction Device
US20090137286 *30 Dec 200828 May 2009Htc CorporationControlling method and system for handheld communication device and recording medium using the same
US20090138736 *26 Jun 200828 May 2009High Tech Computer, Corp.Power management method for handheld electronic device
Referenced by
Citing PatentFiling datePublication dateApplicantTitle
US8248371 * | 19 Dec 2008 | 21 Aug 2012 | Verizon Patent And Licensing Inc. | Accelerometer sensitive soft input panel
US8583373 * | 17 Nov 2008 | 12 Nov 2013 | At&T Services, Inc. | Methods and apparatuses for providing enhanced navigation services
US8692851 * | 28 May 2010 | 8 Apr 2014 | Apple Inc. | Device, method, and graphical user interface with grid transformations during device rotation
US8775156 | 5 Aug 2010 | 8 Jul 2014 | Google Inc. | Translating languages in response to device motion
US8830165 * | 24 Jan 2012 | 9 Sep 2014 | Google Inc. | User interface
US8843359 * | 1 Mar 2010 | 23 Sep 2014 | Andrew Nelthropp Lauder | Language translation employing a combination of machine and human translations
US8869071 * | 29 Mar 2010 | 21 Oct 2014 | Lg Electronics Inc. | Mobile terminal and method of controlling the operation of the mobile terminal based on movement of a main body
US8954314 | 24 Sep 2012 | 10 Feb 2015 | Google Inc. | Providing translation alternatives on mobile devices by usage of mechanic signals
US9053097 * | 4 May 2012 | 9 Jun 2015 | Ortsbo, Inc. | Cross-language communication between proximate mobile devices
US9183199 * | 25 Mar 2011 | 10 Nov 2015 | Ming-Yuan Wu | Communication device for multiple language translation system
US9235272 * | 8 Sep 2014 | 12 Jan 2016 | Google Inc. | User interface
US9292498 * | 21 Mar 2012 | 22 Mar 2016 | Paypal, Inc. | Device orientation based translation system
US9338071 * | 31 Dec 2014 | 10 May 2016 | Google Inc. | Locale profile for a fabric network
US9355094 * | 12 Nov 2013 | 31 May 2016 | Google Inc. | Motion responsive user interface for realtime language translation
US9424252 * | 2 Apr 2012 | 23 Aug 2016 | PIJIN co. Ltd. | Information providing device, information providing method, and computer program
US9430465 * | 12 Jun 2013 | 30 Aug 2016 | Facebook, Inc. | Hybrid, offline/online speech translation system
US9519641 * | 15 Oct 2012 | 13 Dec 2016 | Abbyy Development Llc | Photography recognition translation
US9536049 * | 7 Sep 2012 | 3 Jan 2017 | Next It Corporation | Conversational virtual healthcare assistant
US9552350 | 26 Jun 2014 | 24 Jan 2017 | Next It Corporation | Virtual assistant conversations for ambiguous user input and goals
US9563308 | 8 Dec 2015 | 7 Feb 2017 | Google Inc. | User interface
US9563618 | 4 Aug 2014 | 7 Feb 2017 | Next It Corporation | Wearable-based virtual agents
US9589579 | 11 Jun 2014 | 7 Mar 2017 | Next It Corporation | Regression testing
US9613030 * | 26 Feb 2016 | 4 Apr 2017 | Paypal, Inc. | Device orientation based translation system
US9653000 * | 6 Dec 2012 | 16 May 2017 | Joon Sung Wee | Method for providing foreign language acquisition and learning service based on context awareness using smart device
US9661093 | 22 Feb 2016 | 23 May 2017 | Google Inc. | Device control profile for a fabric network
US9665275 * | 23 Jun 2014 | 30 May 2017 | Google Inc. | Techniques for input of a multi-character compound consonant or vowel and transliteration to another language using a touch computing device
US9668024 | 30 Mar 2016 | 30 May 2017 | Apple Inc. | Intelligent automated assistant for TV user interactions
US9716686 | 6 Jan 2015 | 25 Jul 2017 | Google Inc. | Device description profile for a fabric network
US20090289958 * | 26 May 2009 | 26 Nov 2009 | Samsung Electronics Co., Ltd. | Display mode switching device and method for mobile terminal
US20100082330 * | 29 Sep 2008 | 1 Apr 2010 | Yahoo! Inc. | Multi-lingual maps
US20100125410 * | 17 Nov 2008 | 20 May 2010 | Mary Anne Hicks | Methods and Apparatuses for Providing Enhanced Navigation Services
US20100156798 * | 19 Dec 2008 | 24 Jun 2010 | Verizon Data Services, Llc | Accelerometer Sensitive Soft Input Panel
US20100169074 * | 15 May 2009 | 1 Jul 2010 | Hon Hai Precision Industry Co., Ltd. | Method of configuring user preferences on electronic device
US20100223048 * | 1 Mar 2010 | 2 Sep 2010 | Andrew Nelthropp Lauder | Language translation employing a combination of machine and human translations
US20100293502 * | 13 May 2010 | 18 Nov 2010 | Lg Electronics Inc. | Mobile terminal equipped with multi-view display and method of controlling the mobile terminal
US20110086673 * | 29 Mar 2010 | 14 Apr 2011 | Lg Electronics Inc. | Mobile terminal and method of controlling the operation of the mobile terminal
US20110164056 * | 28 May 2010 | 7 Jul 2011 | Bas Ording | Device, Method, and Graphical User Interface with Grid Transformations During Device Rotation
US20120079532 * | 29 Sep 2010 | 29 Mar 2012 | Sony Corporation | Techniques for developing a television user interface for a secondary device
US20120116751 * | 9 Nov 2010 | 10 May 2012 | International Business Machines Corporation | Providing message text translations
US20120206332 * | 16 Feb 2011 | 16 Aug 2012 | Sony Corporation | Method and apparatus for orientation sensitive button assignment
US20120245920 * | 25 Mar 2011 | 27 Sep 2012 | Ming-Yuan Wu | Communication device for multiple language translation system
US20120284014 * | 4 May 2012 | 8 Nov 2012 | Ortsbo, Inc. | Cross-Language Communication Between Proximate Mobile Devices
US20120289156 * | 9 May 2011 | 15 Nov 2012 | Wesley Boudville | Multiple uses of an e-book reader
US20130035927 * | 18 Jun 2012 | 7 Feb 2013 | Samsung Electronics Co., Ltd. | Display apparatus, control method and server thereof
US20130253900 * | 21 Mar 2012 | 26 Sep 2013 | Ebay, Inc. | Device orientation based translation system
US20130297287 * | 7 May 2012 | 7 Nov 2013 | Google Inc. | Display two keyboards on one tablet computer to allow two users to chat in different languages
US20130326348 * | 4 Mar 2012 | 5 Dec 2013 | Google Inc. | Systems and Methods for Dynamically Providing Fonts Based on Language Settings
US20140074454 * | 7 Sep 2012 | 13 Mar 2014 | Next It Corporation | Conversational Virtual Healthcare Assistant
US20140081619 * | 15 Oct 2012 | 20 Mar 2014 | Abbyy Software Ltd. | Photography Recognition Translation
US20140180671 * | 24 Dec 2012 | 26 Jun 2014 | Maria Osipova | Transferring Language of Communication Information
US20140195218 * | 2 Apr 2012 | 10 Jul 2014 | PIJIN Co., Ltd. | Information Providing Device, Information Providing Method, and Computer Program
US20140297254 * | 20 Feb 2014 | 2 Oct 2014 | Samsung Electronics Co., Ltd. | Text data processing method and electronic device thereof
US20140304640 * | 23 Jun 2014 | 9 Oct 2014 | Google Inc. | Techniques for input of a multi-character compound consonant or vowel and transliteration to another language using a touch computing device
US20140337008 * | 15 Jan 2013 | 13 Nov 2014 | Sharp Kabushiki Kaisha | Image processing apparatus, image forming apparatus, program and storage medium
US20150010889 * | 6 Dec 2012 | 8 Jan 2015 | Joon Sung Wee | Method for providing foreign language acquirement studying service based on context recognition using smart device
US20150051898 * | 12 Nov 2013 | 19 Feb 2015 | Google Inc. | User interface for realtime language translation
US20150066473 * | 21 Aug 2014 | 5 Mar 2015 | Lg Electronics Inc. | Mobile terminal
US20150199083 * | 7 Mar 2013 | 16 Jul 2015 | Google Inc. | Consolidated system tray
US20150234811 * | 16 Feb 2014 | 20 Aug 2015 | International Business Machines Corporation | Context enriched application text translation
US20150370786 * | 30 Mar 2015 | 24 Dec 2015 | Samsung Electronics Co., Ltd. | Device and method for automatic translation
US20160110349 * | 20 Oct 2015 | 21 Apr 2016 | Kimberly Norman-Rosedam | Language Translating Device
US20170085507 * | 17 Sep 2015 | 23 Mar 2017 | International Business Machines Corporation | Adding images to a text based electronic message
USD761813 * | 3 Nov 2014 | 19 Jul 2016 | Chris J. Katopis | Display screen with soccer keyboard graphical user interface
USD764492 * | 4 Nov 2014 | 23 Aug 2016 | Chris J. Katopis | Display screen with baseball keyboard graphical user interface
CN102270197A * | 1 Jun 2010 | 7 Dec 2011 | 英业达股份有限公司 (Inventec Corporation) | Touch-control translation system and method thereof
CN103299361A * | 2 Aug 2011 | 11 Sep 2013 | 谷歌公司 (Google Inc.) | Translating languages
CN103838715A * | 26 Nov 2012 | 4 Jun 2014 | 英业达科技有限公司 (Inventec Technology Co., Ltd.) | Input system with translation prompt function and input method
CN104423582A * | 1 Sep 2014 | 18 Mar 2015 | Lg电子株式会社 (LG Electronics Inc.) | Mobile terminal
CN105117391A * | 2 Aug 2011 | 2 Dec 2015 | 谷歌公司 (Google Inc.) | Translating languages
EP2601596A2 * | 2 Aug 2011 | 12 Jun 2013 | Google, Inc. | Translating languages
EP2601596A4 * | 2 Aug 2011 | 15 Jan 2014 | Google Inc | Translating languages
EP2876549A3 * | 22 Aug 2014 | 14 Oct 2015 | LG Electronics, Inc. | Mobile terminal and method of controlling the same terminal
EP2957990A1 * | 2 Apr 2015 | 23 Dec 2015 | Samsung Electronics Co., Ltd | Device and method for automatic translation
WO2012166282A1 * | 3 May 2012 | 6 Dec 2012 | Ortsbo, Inc. | Inter-language communication devices and methods
WO2013141936A3 * | 28 Dec 2012 | 25 Jun 2015 | Ebay Inc. | Device orientation based translation system
WO2015023365A1 * | 27 Jun 2014 | 19 Feb 2015 | Google Inc. | User interface for realtime language translation
WO2016018195A1 * | 4 Sep 2014 | 4 Feb 2016 | Мыкола Валерийовыч ЦАРЬКОВ | Method for learning linguistic units of a foreign language
Classifications
U.S. Classification: 704/4
International Classification: G06F17/28
Cooperative Classification: G06F3/0482, G06F3/04886, G06F3/017, G06F2200/1637, G06F1/1643, G06F1/1626, G06F1/1694, G06F3/0489, G06F3/0412, G06F3/04842, G06F17/289
European Classification: G06F1/16P9D3, G06F1/16P9P7, G06F17/28U, G06F1/16P3
Legal Events
Date | Code | Event | Description
31 Jul 2008 | AS | Assignment
Owner name: APPLE INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, MICHAEL M.;GREGG, JUSTIN;SEGUIN, CHAD G.;SIGNING DATES FROM 20080728 TO 20080731;REEL/FRAME:021324/0429