US20140281942A1 - System and method for text editor text alignment control
- Publication number: US20140281942A1 (application US 13/837,107)
- Authority: US (United States)
- Prior art keywords: text, block, alignment, control interface, electronic device
- Prior art date
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS › G06—COMPUTING; CALCULATING OR COUNTING › G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/24
- G06F3/0486—Drag-and-drop
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06F40/103—Formatting, i.e. changing of presentation of documents
- G06F40/106—Display of layout of documents; Previewing
- G06F40/166—Editing, e.g. inserting or deleting
Description
- Example embodiments disclosed herein relate generally to text alignment methodologies for electronic devices, such as handheld electronic devices, and more particularly, to systems and methods for modifying the alignment of a block of text based on an alignment control interface.
- Electronic devices such as computers, laptops, ultrabooks, netbooks, tablets, cellular phones, smart phones, personal digital assistants, etc., typically allow a user to input characters into a text editing application, such as a word processor, e-mail application, or HTML editor.
- A text editing application can allow a user to modify the alignment and position of a block of text, such as a sentence, heading, or paragraph. Modifying the alignment or position of text blocks can be a cumbersome task due to, for example, the position or layout of controls in a text editing application, particularly where a user needs to quickly and efficiently modify the alignment or position of text blocks.
- FIG. 1 is an example block diagram illustrating an electronic device, consistent with embodiments disclosed herein.
- FIG. 2 is a flowchart illustrating an example method for modifying the alignment of a block of text based on an alignment control interface, consistent with embodiments disclosed herein.
- FIG. 3 is a flowchart illustrating an example method for modifying the position of a block of text based on a float control interface, consistent with embodiments disclosed herein.
- FIG. 4 illustrates example alignments of a block of text and example positions of an alignment control interface, consistent with embodiments disclosed herein.
- FIGS. 5A, 5B, 5C, 5D, and 5E illustrate example positions of a block of text, consistent with embodiments disclosed herein.
- FIGS. 6A, 6B, and 6C illustrate example alignments and positions of a block of text, consistent with embodiments disclosed herein.
- The present disclosure relates to an electronic device.
- the electronic device can be a mobile or handheld wireless communication device such as a cellular phone, smart phone, wireless organizer, personal digital assistant, wirelessly enabled notebook computer, tablet, or similar device.
- the electronic device can also be an electronic device without wireless communication capabilities, such as a desktop computer, handheld electronic game device, digital photograph album, digital camera, or other device.
- Conventional text editing applications often provide a fixed toolbar in one region of the display that includes an arrangement of several buttons related to text alignment, such as “left alignment,” “center alignment,” “right alignment,” and “justified.” In a text editing application, however, it is often preferable to preserve space for writing and avoid using space for objects such as fixed toolbars and buttons. Additionally, it is often desirable to place editing controls in close proximity to the element that they act upon. This may result in less movement and a higher degree of association between action (e.g., selecting an alignment option) and result (e.g., seeing the result of the selection), while permitting the user to modify the alignment or position of text blocks without diverting attention and visual focus from regions of the display in which the text blocks are displayed.
- Accordingly, example embodiments described herein permit the user of an electronic device to modify the alignment and position of text blocks through user-selectable control interfaces placed in proximity to the text blocks.
- Use of the indefinite article “a” or “an” in the specification and the claims is meant to include one or more than one of the feature that it introduces, unless otherwise indicated. Similarly, use of the definite article “the,” particularly after a feature has been introduced with the indefinite article, is meant to include one or more than one of the feature to which it refers (unless otherwise indicated).
- In one example embodiment, a method for an electronic device having an input device and a display is provided. The method comprises displaying, in proximity of the block of text, an alignment control interface after a precondition is met; detecting a dragging motion associated with the alignment control interface; modifying an alignment of the block of text based, at least in part, on the detected dragging motion; detecting an end of the dragging motion; and displaying, in a predetermined position associated with the alignment of the block of text, the alignment control interface.
- In another example embodiment, a method for an electronic device having an input device and a display is provided. The method comprises displaying a block of text within a text editing application; displaying, in proximity of the block of text, a float control interface after a precondition is met; detecting a dragging motion associated with the float control interface; modifying a position of the block of text based, at least in part, on the detected dragging motion; detecting an end of the dragging motion; and displaying, in a predetermined position associated with the position of the block of text, the float control interface.
- In another example embodiment, an electronic device is provided. The electronic device comprises a display configured to display characters, an input device, a memory storing one or more instructions, and a processor. The processor is configured to execute the one or more instructions to perform: displaying, in proximity of the block of text, an alignment control interface after a precondition is met; detecting a dragging motion associated with the alignment control interface; modifying an alignment of the block of text based, at least in part, on the detected dragging motion; detecting an end of the dragging motion; and displaying, in a predetermined position associated with the alignment of the block of text, the alignment control interface.
- In another example embodiment, an electronic device is provided. The electronic device comprises a display configured to display characters, an input device, a memory storing one or more instructions, and a processor. The processor is configured to execute the one or more instructions to perform: displaying a block of text within a text editing application; displaying, in proximity of the block of text, a float control interface after a precondition is met; detecting a dragging motion associated with the float control interface; modifying a position of the block of text based, at least in part, on the detected dragging motion; detecting an end of the dragging motion; and displaying, in a predetermined position associated with the position of the block of text, the float control interface.
- These example embodiments, in addition to those described below, permit, for example, the user of an electronic device to modify the alignment and position of text blocks through the use of control interfaces placed in proximity to the text blocks, without diverting attention and visual focus from the regions of the display in which the text blocks are positioned. This may result in less movement and a higher degree of association between the user's action and the result, while allowing the user's focus to remain on the regions of the display in which the text blocks are positioned, enhancing efficiency, accuracy, and speed of character input and text block modification.
- FIG. 1 is an example block diagram of an electronic device 100 , consistent with example embodiments disclosed herein.
- Electronic device 100 includes multiple components, such as a processor 102 that controls the overall operation of electronic device 100 .
- Communication functions, including data and voice communications, are performed through a communication subsystem 104 .
- Data received by electronic device 100 is decompressed and decrypted by a decoder 106 .
- The communication subsystem 104 receives messages from and sends messages to a network 150 .
- Network 150 can be any type of network, including, but not limited to, a wired network, a data wireless network, voice wireless network, and dual-mode wireless networks that support both voice and data communications over the same physical base stations.
- Electronic device 100 can be a battery-powered device and include a battery interface 142 for receiving one or more batteries 144 .
- In some embodiments, electronic device 100 can be a computer, laptop, ultrabook, netbook, or tablet device, or another device, and such an electronic device can include all or a subset of the components illustrated in FIG. 1 . The choice of components included in electronic device 100 is not critical to any embodiment.
- Processor 102 is coupled to and can interact with additional subsystems such as a Random Access Memory (RAM) 108 ; a memory 110 , such as a hard drive, CD, DVD, flash memory, or a similar storage device; one or more displays 112 ; one or more actuators 120 ; one or more force sensors 122 ; an auxiliary input/output (I/O) subsystem 124 ; a data port 126 ; one or more speakers 128 ; one or more microphones 130 ; short-range communications 132 ; and other device subsystems 134 .
- Display 112 is coupled to and controlled by processor 102 . Characters, such as text, symbols, images, and other items are displayed on display 112 via processor 102 . Characters can be input into the electronic device 100 using a keyboard (not pictured in FIG. 1 ), such as a physical keyboard having keys that are mechanically actuated, or a virtual keyboard having keys rendered on display 112 .
- Processor 102 can also interact with a positioning system 136 for determining the location of electronic device 100 .
- The location can be determined in any number of ways, such as by a computer, by a Global Positioning System (GPS) (which can be included in electronic device 100 ), through a Wi-Fi network, or by having a location entered manually.
- The location can also be determined based on calendar entries.
- Electronic device 100 uses a Subscriber Identity Module or a Removable User Identity Module (SIM/RUIM) card 138 inserted into a SIM/RUIM interface 140 for communication with a network, such as network 150 .
- User identification information can also be programmed into memory 110 .
- Electronic device 100 also includes an operating system 146 and programs 148 that are executed by processor 102 and are typically stored in memory 110 or RAM 108 . Additional applications can be loaded onto electronic device 100 through network 150 , auxiliary I/O subsystem 124 , data port 126 , short-range communications subsystem 132 , or any other suitable subsystem.
- A received signal, such as a text message, an e-mail message, or a web page download, is processed by communication subsystem 104 . This processed information is then provided to processor 102 . Processor 102 processes the received signal for output to display 112 , to auxiliary I/O subsystem 124 , or a combination of both.
- A user can compose data items, for example e-mail messages, which can be transmitted over network 150 through communication subsystem 104 .
- Speaker 128 outputs audible information converted from electrical signals, and microphone 130 converts audible information into electrical signals for processing.
- FIG. 2 is a flowchart illustrating an example method 200 for modifying the alignment of a block of text based on an alignment control interface, consistent with example embodiments disclosed herein.
- Memory, such as RAM 108 or memory 110 , can include a set of instructions that, when executed by a processor (such as processor 102 ), can be used to modify the alignment of a block of text based on an alignment control interface.
- Method 200 begins at step 210 , where the processor 102 displays a block of text within a text editing application.
- In some embodiments, the processor 102 receives an input reflecting placement of a caret in, or near, a block of text within the text editing application.
- A caret refers to a cursor (e.g., a blinking cursor) or other symbol indicating a position for inserting or removing text.
- A block of text refers to a character, word, sentence, heading, paragraph, or other unit or units of text.
- The processor 102 then checks whether a precondition is met.
- For example, the processor 102 may check whether, during a specified time period, an input is received reflecting placement of a caret in, or near, a block of text; an input is received changing placement of the caret; a character input is received; and/or an input is detected reflecting selection of an application outside of the text editing application.
- A text editing application can include, for example, a word processor, an e-mail application, or an HTML editor.
- In some embodiments, the processor 102 determines that the precondition is met unless any of the above-described instances occur.
- The specified time period can vary in duration; exemplary durations include 300 milliseconds and 500 milliseconds.
- One advantage of implementing such a precondition is the avoidance of a “flicker” effect, in which a component of a user interface is shown and hidden repeatedly, which can have the effect of cluttering the display and/or distracting a user.
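The timing logic behind such a precondition resembles a debounce: the control appears only after the caret has rested in a block of text for the specified quiet period with no intervening input. A minimal sketch of that behavior (class and method names are hypothetical, not from the patent):

```python
class ControlPrecondition:
    """Show a control interface only after input has been quiet for `delay` seconds."""

    def __init__(self, delay=0.3):  # 300 ms, one of the exemplary durations
        self.delay = delay
        self.last_event = None

    def on_input(self, now):
        # Any caret placement or move, character input, or switch to another
        # application restarts the quiet period.
        self.last_event = now

    def is_met(self, now):
        # Met once the quiet period has elapsed with no further input.
        return self.last_event is not None and (now - self.last_event) >= self.delay


pre = ControlPrecondition(delay=0.3)
pre.on_input(0.0)       # caret placed in a block of text
print(pre.is_met(0.1))  # False: still inside the 300 ms quiet period
pre.on_input(0.2)       # caret moved; the timer restarts
print(pre.is_met(0.4))  # False: only 200 ms since the last input
print(pre.is_met(0.6))  # True: 400 ms of quiet, the control can be shown
```

Gating the interface this way avoids the flicker described above, since transient caret movement keeps restarting the timer instead of repeatedly showing and hiding the control.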
- After the precondition is met, the processor 102 displays an alignment control interface in proximity to the block of text.
- The alignment control interface can be a user-selectable visual element.
- In some embodiments, the alignment control interface visually indicates the present alignment of the block of text by, for instance, displaying horizontal lines corresponding to the different alignment options.
- The alignment control interface can also be referred to as a “grip” or simply as a “control interface.”
- The alignment control interface can be displayed adjacent to, above, below, or within (e.g., layered in) a block of text.
- The alignment control interface can be positioned horizontally based on the present alignment of the block of text. If the block of text is left aligned, for example, the horizontal position of the alignment control interface can correspond to the left region of the block of text. If the block of text is centered or right aligned, the alignment control interface can be horizontally positioned within the middle and right regions of the block of text, respectively.
- The alignment control interface can be positioned vertically at the top of the block of text.
- The alignment control interface can appear as shown in the exemplary interfaces 410 , 420 , and 430 of FIG. 4 .
- When the processor 102 detects a dragging motion associated with the alignment control interface, the processor 102 at step 240 modifies the alignment of the block of text based, at least in part, on the dragging motion.
- In some embodiments, the processor modifies the alignment of the block of text, in real time, based on the position of the alignment control interface.
- For example, the alignment of the block of text can be based on the position of the alignment control interface relative to the block of text.
- Suppose the alignment control interface is dragged horizontally across the display. As the alignment control interface is dragged from left to right or from right to left, the alignment of the block of text is set depending on the horizontal position of the alignment control interface.
- When the horizontal position of the alignment control interface is in the left region of the block of text, the alignment of the block of text is set to “left aligned.” Conversely, when the horizontal position of the alignment control interface is in the middle or right regions of the block of text, the alignment of the block of text is set to “centered” and “right aligned,” respectively.
- In some embodiments, “leading” and “trailing” alignments can be triggered by dragging the alignment control interface into either one of the edges of the display (e.g., into the margins in a text editing application).
- Various partitions of the block of text into left, middle, and right regions are possible; the choice of a particular partition is not critical to any embodiment.
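The region-based mapping described above can be sketched as a pure function: the control's horizontal position is tested against the display edges first (for “leading”/“trailing”), then against a partition of the block of text. The function name, the equal-thirds split, and the margin width are illustrative assumptions; as noted, the particular partition is not critical to any embodiment.

```python
def alignment_for_drag(x, block_left, block_width, display_width, margin=8):
    """Map the alignment control's horizontal position x to an alignment."""
    # Dragging into either edge of the display (e.g., into the margins)
    # triggers the "leading" or "trailing" alignment.
    if x <= margin:
        return "leading"
    if x >= display_width - margin:
        return "trailing"
    # Otherwise, partition the block of text into left/middle/right regions;
    # an equal-thirds split is one possible choice.
    rel = (x - block_left) / block_width
    if rel < 1 / 3:
        return "left aligned"
    if rel < 2 / 3:
        return "centered"
    return "right aligned"


# A 600-px-wide block starting at x=100 on an 800-px display:
print(alignment_for_drag(150, 100, 600, 800))  # "left aligned"
print(alignment_for_drag(400, 100, 600, 800))  # "centered"
print(alignment_for_drag(650, 100, 600, 800))  # "right aligned"
print(alignment_for_drag(5, 100, 600, 800))    # "leading": within the left margin
print(alignment_for_drag(795, 100, 600, 800))  # "trailing": within the right margin
```

Because the function is pure, it can be called on every drag event to support the real-time updates described above.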
- The processor 102 then detects an end of the dragging motion. After detecting the end of the dragging motion, the processor 102 at step 260 displays the alignment control interface in a predetermined position associated with the alignment of the block of text. If the alignment of the block of text is set to “right aligned,” for example, the alignment control interface may be displayed within the right region of the display. Conversely, if the alignment of the block of text is set to “centered” or “left aligned,” the alignment control interface may be displayed within the middle and left regions of the display, respectively. This is further demonstrated by the exemplary interfaces 410 , 420 , and 430 , and exemplary text blocks 415 , 425 , and 435 , as illustrated in FIG. 4 . In some embodiments, the alignment control interface animates or “snaps into place” in the appropriate region of the display.
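Displaying the control in a predetermined position once the drag ends amounts to mapping the final alignment back to a parked coordinate. A sketch under illustrative assumptions (a fixed-width control parked at the left edge, center, or right edge of a region; names hypothetical):

```python
def snap_position(alignment, region_left, region_width, control_width=24):
    """Predetermined horizontal position the control snaps to after a drag ends."""
    if alignment == "left aligned":
        return region_left
    if alignment == "centered":
        return region_left + (region_width - control_width) / 2
    if alignment == "right aligned":
        return region_left + region_width - control_width
    raise ValueError(f"no predetermined position for: {alignment}")


# For a 600-px-wide region starting at x=0:
print(snap_position("left aligned", 0, 600))   # 0
print(snap_position("centered", 0, 600))       # 288.0
print(snap_position("right aligned", 0, 600))  # 576
```

Animating the control toward this coordinate gives the “snaps into place” behavior mentioned above.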
- FIG. 3 is a flowchart illustrating an example method 300 for modifying the position of a block of text based on a float control interface, consistent with example embodiments disclosed herein.
- Memory, such as RAM 108 or memory 110 , can include a set of instructions that, when executed by a processor (such as processor 102 ), can be used to modify the position of a block of text based on a float control interface.
- Method 300 begins with the processor 102 displaying a block of text within a text editing application.
- In some embodiments, the processor 102 receives an input reflecting placement of a caret in, or near, a block of text within the text editing application.
- The processor 102 then checks whether a precondition is met.
- As in method 200 , the processor 102 can check whether, during a specified time period, an input is received reflecting placement of a caret in, or near, a block of text; an input is received changing placement of the caret; a character input is received; and/or an input is detected reflecting selection of an application outside of the text editing application. In some embodiments, the processor 102 determines that the precondition is met unless any of the above-described instances occur.
- The specified time period can vary in duration; exemplary durations include 300 milliseconds and 500 milliseconds.
- After the precondition is met, the processor 102 displays a float control interface in proximity of the block of text.
- Like the alignment control interface, the float control interface can be a user-selectable visual element.
- The float control interface can be displayed adjacent to, above, below, or within (e.g., layered in) a block of text.
- The processor 102 detects a dragging motion associated with the float control interface at step 330 .
- A dragging motion can include, for example, a click and drag motion, or another action, in which the float control interface is incrementally repositioned on the display.
- In some embodiments, the float control interface is not repositioned on the display until the user completes the click and drag motion described above, or otherwise releases control over the float control interface.
- The dragging motion can be executed, for example, through use of a mouse or other mechanical input device, as well as through touch motions (e.g., a swiping motion) on a touch-sensitive display.
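The deferred behavior described above, in which nothing is repositioned until the drag completes, can be modeled by tracking a pending value that is committed only on release. A minimal sketch (class and method names are hypothetical):

```python
class FloatControlDrag:
    """Commit-on-release drag handling for a float control interface."""

    def __init__(self, position):
        self.position = position   # committed, displayed position
        self._pending = position   # position tracked during the drag

    def on_drag_move(self, x):
        # Track the pointer, but do not reposition the control yet.
        self._pending = x

    def on_drag_end(self):
        # Commit once, when the user releases control over the interface.
        self.position = self._pending
        return self.position


drag = FloatControlDrag(position=0)
drag.on_drag_move(120)
drag.on_drag_move(240)
print(drag.position)       # 0: unchanged while the drag is in progress
print(drag.on_drag_end())  # 240: committed on release
```

The real-time variant would instead assign `self.position` inside `on_drag_move`; the two strategies trade immediate visual feedback against display stability during the drag.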
- The processor 102 at step 340 modifies the position of the block of text based, at least in part, on the dragging motion.
- In some embodiments, the processor modifies the position of the block of text, in real time, based on the position of the float control interface.
- For example, the block of text can be positioned according to the horizontal position of the float control interface.
- Suppose the float control interface is dragged horizontally across the display. As the float control interface is dragged, the position of the block of text is set depending on the horizontal position of the float control interface.
- When the horizontal position of the float control interface is in the left region of the display, the block of text is also positioned in the left region of the display. Conversely, when the horizontal position of the float control interface is in the middle or right regions of the display, the block of text is positioned in the middle and right regions of the display, respectively.
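The float mapping is analogous to the alignment mapping, but measured relative to the display rather than the block of text. A sketch, again assuming an equal-thirds partition (illustrative only; the function name is hypothetical):

```python
def float_for_drag(x, display_width):
    """Map the float control's horizontal position x to a float position."""
    rel = x / display_width
    if rel < 1 / 3:
        return "float left"
    if rel < 2 / 3:
        return "float center"   # also the default position
    return "float right"


print(float_for_drag(100, 900))  # "float left"
print(float_for_drag(450, 900))  # "float center"
print(float_for_drag(800, 900))  # "float right"
```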
- FIG. 4 illustrates example alignments of a block of text and example positions of an alignment control interface, consistent with embodiments disclosed herein.
- FIG. 4 includes alignment control interfaces 410 , 420 , and 430 , and text blocks 415 , 425 , and 435 .
- Alignment control interface 410 is displayed at the top left of text block 415 , which corresponds to the left alignment of text block 415 .
- The visual appearance of alignment control interface 410 depicts the left alignment of text block 415 .
- As alignment control interface 410 is dragged from left to right (or from right to left), the alignment of the text block is also modified.
- Alignment control interface 420 is displayed in the middle of text block 425 , which corresponds to the center alignment of text block 425 (also visually depicted in alignment control interface 420 ).
- Alignment control interface 430 is displayed at the top right of text block 435 , which corresponds to the right alignment of text block 435 (also visually depicted in alignment control interface 430 ).
- FIGS. 5A, 5B, and 5C illustrate example positions of a block of text, consistent with embodiments disclosed herein.
- More specifically, FIGS. 5A, 5B, and 5C illustrate various positions of the text block after the position of the text block has been modified using the methods described in connection with FIG. 3 , above.
- Text block 510 is positioned as “float center.” In some embodiments, this can also be referred to as a default position.
- Text block 520 of FIG. 5B is positioned as “float left,” whereas text block 530 of FIG. 5C is positioned as “float right.”
- As FIGS. 5B and 5C illustrate, positioning a text block as “float left” or “float right” can result in neighboring text blocks being repositioned relative to the original text block.
- FIGS. 5A, 5D, and 5E illustrate additional example positions of a block of text, consistent with embodiments disclosed herein. More specifically, FIGS. 5A, 5D, and 5E illustrate various positions of the text block after the alignment of the text block has been modified using the methods described in connection with FIG. 2 , above.
- Text blocks 510 , 540 , and 550 of FIGS. 5A, 5D, and 5E are positioned as “float center,” or a default position, as described above. Text block 510 has been set to “left aligned,” text block 540 has been set to “center aligned,” and text block 550 has been set to “right aligned.”
- FIGS. 6A, 6B, and 6C show example alignments and positions of a block of text, consistent with embodiments disclosed herein.
- More specifically, FIGS. 6A, 6B, and 6C illustrate various alignments of the text block after the alignment of the text block has been modified using the methods described in connection with FIG. 2 , above.
- Text blocks 610 , 620 , and 630 of FIGS. 6A, 6B, and 6C are positioned as “float right,” which, as described above, indicates that the neighboring text blocks have been repositioned relative to text blocks 610 , 620 , and 630 . Text block 610 has been set to “center aligned,” text block 620 has been set to “left aligned,” and text block 630 has been set to “right aligned.”
- Together, FIGS. 6A, 6B, and 6C demonstrate the joint use of alignment control and float control on a given block of text.
Abstract
Description
- Example embodiments disclosed herein relate generally to text alignment methodologies for electronic devices, such as handheld electronic devices, and more particularly, to systems and methods for modifying the alignment of a block of text based on an alignment control interface.
- Electronic devices, such as computers, laptops, ultrabooks, netbooks, tablets, cellular phones, smart phones, personal digital assistants, etc., typically allow a user to input characters into a text editing application, such as a word processor, e-mail application, or HTML editor. A text editing application can allow a user to modify the alignment and position of a block of text, such as a sentence, heading, or paragraph. Modifying the alignment or position of text blocks can be a cumbersome task, due to, for example, the position or layout of controls in a text editing application, particularly where a user needs to quickly and efficiently modify the alignment or position of text blocks.
-
FIG. 1 is an example block diagram illustrating an electronic device, consistent with embodiments disclosed herein. -
FIG. 2 is a flowchart illustrating an example method for modifying the alignment of a block of text based on an alignment control interface, consistent with embodiments disclosed herein. -
FIG. 3 is a flowchart illustrating an example method for modifying the position of a block of text based on a float control interface, consistent with embodiments disclosed herein. -
FIG. 4 illustrates example alignments of a block of text and example positions of an alignment control interface, consistent with embodiments disclosed herein. -
FIGS. 5A , 5B, 5C, 5D, and 5E illustrate example positions of a block of text, consistent with embodiments disclosed herein. -
FIGS. 6A , 6B, and 6C illustrate example alignments and positions of a block of text, consistent with embodiments disclosed herein. - Reference will now be made in detail to various embodiments, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
- The present disclosure relates to an electronic device. The electronic device can be a mobile or handheld wireless communication device such as a cellular phone, smart phone, wireless organizer, personal digital assistant, wirelessly enabled notebook computer, tablet, or similar device. The electronic device can also be an electronic device without wireless communication capabilities, such as a desktop computer, handheld electronic game device, digital photograph album, digital camera, or other device.
- Conventional text editing applications often provide a fixed toolbar in one region of the display that includes an arrangement of several buttons related to text alignment, such as “left alignment,” “center alignment,” “right alignment,” and “justified.” In a text editing application, however, it is often preferable to preserve space for writing and avoid using space for objects such as fixed toolbars and buttons. Additionally, it is often desirable to place editing controls in close proximity to the element that they act upon. This may result in less movement and a higher degree of association between action (e.g., selecting an alignment option) and result (e.g., seeing the result of the selection), while permitting the user to modify the alignment or position of text blocks without diverting attention and visual focus from regions of the display in which the text blocks are displayed.
- Accordingly, example embodiments described herein permit the user of an electronic device to modify the alignment and position of text blocks through user-selectable control interfaces placed in proximity to the text blocks.
- Use of the indefinite article “a” or “an” in the specification and the claims is meant to include one or more than one of the feature that it introduces, unless otherwise indicated. Similarly, use of the definite article “the,” particularly after a feature has been introduced with the indefinite article, is meant to include one or more than one of the feature to which it refers (unless otherwise indicated).
- In one example embodiment, a method for an electronic device having an input device and a display is provided. The method comprises displaying, in proximity of the block of text, an alignment control interface after a precondition is met, detecting a dragging motion associated with the alignment control interface, modifying an alignment of the block of text based, at least in part, on the detected dragging motion, detecting an end of the dragging motion, and, displaying, in a predetermined position associated with the alignment of the block of text, the alignment control interface.
- In another example embodiment, a method for an electronic device having an input device and a display is provided. The method comprises displaying a block of text within a text editing application, displaying, in proximity of the block of text, a float control interface after a precondition is met, detecting a dragging motion associated with the float control interface, modifying a position of the block of text based, at least in part, on the detected dragging motion, detecting an end of the dragging motion, and, displaying, in a predetermined position associated with the position of the block of text, the float control interface.
- In another example embodiment, an electronic device is provided. The electronic device comprises a display configured to display characters, an input device, a memory storing one or more instructions, and a processor. The processor is configured to execute the one or more instructions to perform: displaying, in proximity of the block of text, an alignment control interface after a precondition is met, detecting a dragging motion associated with the alignment control interface, modifying an alignment of the block of text based, at least in part, on the detected dragging motion, detecting an end of the dragging motion, and, displaying, in a predetermined position associated with the alignment of the block of text, the alignment control interface.
- In another example embodiment, an electronic device is provided. The electronic device comprises a display configured to display characters, an input device, a memory storing one or more instructions, and a processor. The processor is configured to execute the one or more instructions to perform: displaying a block of text within a text editing application, displaying, in proximity of the block of text, a float control interface after a precondition is met, detecting a dragging motion associated with the float control interface, modifying a position of the block of text based, at least in part, on the detected dragging motion, detecting an end of the dragging motion, and, displaying, in a predetermined position associated with the position of the block of text, the float control interface.
- These example embodiments, in addition to those described below, permit, for example, the user of an electronic device to modify the alignment and position of text blocks through the use of control interfaces placed in proximity to the text blocks, without diverting attention and visual focus from the regions of the display in which the text blocks are positioned. This may result in less movement and a higher degree of association between the user's action and the result, while allowing the user's focus to remain on the regions of the display in which the text blocks are positioned, enhancing efficiency, accuracy, and speed of character input and text block modification.
-
FIG. 1 is an example block diagram of an electronic device 100, consistent with example embodiments disclosed herein. Electronic device 100 includes multiple components, such as a processor 102 that controls the overall operation of electronic device 100. Communication functions, including data and voice communications, are performed through a communication subsystem 104. Data received by electronic device 100 is decompressed and decrypted by a decoder 106. The communication subsystem 104 receives messages from and sends messages to a network 150. Network 150 can be any type of network, including, but not limited to, a wired network, a wireless data network, a wireless voice network, and a dual-mode wireless network that supports both voice and data communications over the same physical base stations. Electronic device 100 can be a battery-powered device and include a battery interface 142 for receiving one or more batteries 144. In some embodiments, electronic device 100 can be a computer, laptop, ultrabook, netbook, tablet device, or another device, and such an electronic device can include all or a subset of the components illustrated in FIG. 1. The choice of components included in electronic device 100 is not critical to any embodiment. -
Processor 102 is coupled to and can interact with additional subsystems such as a Random Access Memory (RAM) 108; a memory 110, such as a hard drive, CD, DVD, flash memory, or a similar storage device; one or more displays 112; one or more actuators 120; one or more force sensors 122; an auxiliary input/output (I/O) subsystem 124; a data port 126; one or more speakers 128; one or more microphones 130; short-range communications 132; and other device subsystems 134. -
Display 112 is coupled to and controlled by processor 102. Characters, such as text, symbols, images, and other items, are displayed on display 112 via processor 102. Characters can be input into the electronic device 100 using a keyboard (not pictured in FIG. 1), such as a physical keyboard having keys that are mechanically actuated, or a virtual keyboard having keys rendered on display 112. -
Processor 102 can also interact with a positioning system 136 for determining the location of electronic device 100. The location can be determined in any number of ways, such as by a computer, by a Global Positioning System (GPS) (which can be included in electronic device 100), through a Wi-Fi network, or by having a location entered manually. The location can also be determined based on calendar entries. - In some embodiments, to identify a subscriber for network access,
electronic device 100 uses a Subscriber Identity Module or a Removable User Identity Module (SIM/RUIM) card 138 inserted into a SIM/RUIM interface 140 for communication with a network, such as network 150. Alternatively, user identification information can be programmed into memory 110. -
Electronic device 100 also includes an operating system 146 and programs 148 that are executed by processor 102 and are typically stored in memory 110 or RAM 108. Additional applications can be loaded onto electronic device 100 through network 150, auxiliary I/O subsystem 124, data port 126, short-range communications subsystem 132, or any other suitable subsystem. - A received signal such as a text message, an e-mail message, or a web page download is processed by
communication subsystem 104. This processed information is then provided to processor 102. Processor 102 processes the received signal for output to display 112, to auxiliary I/O subsystem 124, or a combination of both. A user can compose data items, for example e-mail messages, which can be transmitted over network 150 through communication subsystem 104. For voice communications, the overall operation of electronic device 100 is similar. Speaker 128 outputs audible information converted from electrical signals, and microphone 130 converts audible information into electrical signals for processing. -
FIG. 2 is an example flowchart illustrating an example method 200 for modifying the alignment of a block of text based on an alignment control interface, consistent with example embodiments disclosed herein. Memory (such as memory 110 or RAM 108) can include a set of instructions that, when executed by a processor (such as processor 102), can be used to modify the alignment of a block of text based on an alignment control interface. - Referring back to
FIG. 2, method 200 begins at step 210, where the processor 102 displays a block of text within a text editing application. In some embodiments, the processor 102 receives an input reflecting placement of a caret in, or near, a block of text within a text editing application. A caret refers to a cursor (e.g., a blinking cursor) or other symbol indicating a position for inserting or removing text. A block of text refers to a character, word, sentence, heading, paragraph, or other unit or units of text. At step 215, the processor 102 checks whether a precondition is met. For example, the processor 102 may check whether, during a specified time period, an input is received reflecting placement of a caret in, or near, a block of text, an input is received changing placement of the caret, a character input is received, and/or an input is detected reflecting selection of an application outside of the text editing application. A text editing application can include, for example, a word processor, an e-mail application, or an HTML editor. In some embodiments, the processor 102 determines that the precondition is met unless any of the above-described instances occur. The specified time period can vary in duration; exemplary durations include 300 milliseconds and 500 milliseconds. One advantage of implementing such a precondition is the avoidance of a “flicker” effect, in which a component of a user interface is shown and hidden repeatedly, which can clutter the display and/or distract a user. Returning back to FIG. 2, if at step 215 the processor 102 determines that the precondition is not met, the method returns to step 210. Otherwise, the processor 102 continues on to step 220. - At
step 220, the processor 102 displays an alignment control interface in proximity to the block of text. The alignment control interface can be a user-selectable visual element. In some embodiments, the alignment control interface visually indicates the present alignment of the block of text by, for instance, displaying horizontal lines corresponding to the different alignment options. The alignment control interface can also be referred to as a “grip” or simply as a “control interface.” As an example, the alignment control interface can be displayed adjacent to, above, below, or within (e.g., layered in) a block of text. In some embodiments, the alignment control interface can be positioned horizontally based on the present alignment of the block of text. For example, if the present alignment of the block of text is “left aligned,” the horizontal position of the alignment control interface can correspond to the left region of the block of text. Conversely, if the present alignment of the block of text is “centered” or “right aligned,” the alignment control interface can be horizontally positioned within the middle and right regions of the block of text, respectively. In some embodiments, the alignment control interface can be positioned vertically according to the top of the block of text. The alignment control interface can appear as shown in the exemplary interfaces of FIG. 4. - The
processor 102 detects a dragging motion associated with the alignment control interface at step 230. A dragging motion can include, for example, a click of the alignment control interface followed by a drag motion, or another action in which the position of the alignment control interface is incrementally repositioned on the display. In some embodiments, the alignment control interface is not repositioned on the display until the user completes the click and drag motion described above, or otherwise releases control over the alignment control interface. The dragging motion can be executed, for example, through use of a mouse or other mechanical input device, as well as through touch motions (e.g., a swiping motion) on a touch-sensitive display. After detecting the dragging motion at step 230, the processor 102 at step 240 modifies the alignment of the block of text based, at least in part, on the dragging motion. In some embodiments, the processor modifies the alignment of the block of text, in real time, based on the position of the alignment control interface. For example, the alignment of the block of text can be based on the position of the alignment control interface relative to the block of text. In some embodiments, the alignment control interface is dragged horizontally across the display. As the alignment control interface is dragged from left to right or from right to left, the alignment of the block of text is set depending on the horizontal position of the alignment control interface. When the horizontal position of the alignment control interface is in the left region of the block of text, for example, the alignment of the block of text is set to “left aligned.” Conversely, when the horizontal position of the alignment control interface is in the middle or right regions of the block of text, the alignment of the block of text is set to “centered” and “right aligned,” respectively. This is further demonstrated by the exemplary interfaces of FIG. 4.
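The mapping of step 240 from the control's horizontal position to an alignment can be sketched as follows. This Python sketch is illustrative only and not part of the disclosure; the function name and the equal-thirds partition are assumptions (as noted below, the choice of partition is not critical to any embodiment).

```python
def alignment_for_position(control_x, block_left, block_width):
    """Map the alignment control interface's horizontal position,
    relative to the block of text, to an alignment setting.

    Assumes an equal-thirds partition of the block into left,
    middle, and right regions; other partitions are possible.
    """
    relative = (control_x - block_left) / block_width
    if relative < 1 / 3:
        return "left aligned"   # control is in the left region
    if relative < 2 / 3:
        return "centered"       # control is in the middle region
    return "right aligned"      # control is in the right region
```

As the control is dragged from left to right, the returned alignment changes from “left aligned” through “centered” to “right aligned,” tracking the drag in real time as described above.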
In some embodiments, “leading” and “trailing” alignments can be triggered by dragging the alignment control interface to either edge of the display (e.g., into the margins in a text editing application). Various partitions of the block of text into left, middle, and right regions are possible; the choice of a particular partition is not critical to any embodiment. - At
step 250, the processor 102 detects an end of the dragging motion. After detecting the end of the dragging motion, the processor 102 at step 260 displays the alignment control interface in a predetermined position associated with the alignment of the block of text. If the alignment of the block of text is set to “right aligned,” for example, the alignment control interface may be displayed within the right region of the display. Conversely, if the alignment of the block of text is set to “centered” or “left aligned,” the alignment control interface may be displayed within the middle and left regions of the display, respectively. This is further demonstrated by the exemplary interfaces of FIG. 4. In some embodiments, the alignment control interface animates or “snaps into place” in the appropriate region of the display. -
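The “snap into place” behavior of step 260 can be sketched as computing a predetermined resting coordinate from the final alignment. This is a hypothetical Python sketch; the fractional offsets and the function name are illustrative assumptions, not taken from the disclosure.

```python
def snap_position(alignment, block_left, block_width):
    """Return a predetermined x coordinate at which the alignment
    control interface comes to rest once the drag ends.

    The offsets below (left edge, middle, right edge of the block)
    are one illustrative choice of predetermined positions.
    """
    offsets = {
        "left aligned": 0.0,   # snap to the left edge of the block
        "centered": 0.5,       # snap to the middle of the block
        "right aligned": 1.0,  # snap to the right edge of the block
    }
    return block_left + offsets[alignment] * block_width
```

A user interface could animate the control from its drop point to this coordinate to produce the snapping effect described above.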
FIG. 3 is an example flowchart illustrating an example method 300 in accordance with some example embodiments. Memory (such as memory 110 or RAM 108) can include a set of instructions that, when executed by a processor (such as processor 102), can be used to implement the steps of method 300. At step 310, the processor 102 displays a block of text within a text editing application. In some embodiments, the processor 102 receives an input reflecting placement of a caret in, or near, a block of text within a text editing application. At step 315, the processor 102 checks whether a precondition is met. For example, the processor 102 can check whether, during a specified time period, an input is received reflecting placement of a caret in, or near, a block of text, an input is received changing placement of the caret, a character input is received, and/or an input is detected reflecting selection of an application outside of the text editing application. In some embodiments, the processor 102 determines that the precondition is met unless any of the above-described instances occur. The specified time period can vary in duration; exemplary durations include 300 milliseconds and 500 milliseconds. Returning back to FIG. 3, if at step 315 the processor 102 determines that the precondition is not met, the method returns to step 310. Otherwise, the processor 102 continues on to step 320. - At
step 320, the processor 102 displays a float control interface in proximity of the block of text. The float control interface can be a user-selectable visual element. For example, the float control interface can be displayed adjacent to, above, below, or within (e.g., layered in) a block of text. The processor 102 detects a dragging motion associated with the float control interface at step 330. A dragging motion can include, for example, a click and drag motion, or another action in which the position of the float control interface is incrementally repositioned on the display. In some embodiments, the float control interface is not repositioned on the display until the user completes the click and drag motion described above, or otherwise releases control over the float control interface. The dragging motion can be executed, for example, through use of a mouse or other mechanical input device, as well as through touch motions (e.g., a swiping motion) on a touch-sensitive display. After detecting the dragging motion at step 330, the processor 102 at step 340 modifies the position of the block of text based, at least in part, on the dragging motion. In some embodiments, the processor modifies the position of the block of text, in real time, based on the position of the float control interface. For example, the block of text can be positioned according to the horizontal position of the float control interface. In some embodiments, the float control interface is dragged horizontally across the display. As the float control interface is dragged from left to right or from right to left, the position of the block of text is set depending on the horizontal position of the float control interface. When the horizontal position of the float control interface is in the left region of the display, for example, the block of text is also positioned in the left region of the display.
Conversely, when the horizontal position of the float control interface is in the middle or right regions of the display, the block of text is positioned in the middle and right regions of the display, respectively. - At
step 350, the processor 102 detects an end of the dragging motion. After detecting the end of the dragging motion, the processor 102 at step 360 displays the float control interface in a predetermined position associated with the position of the block of text. If the block of text is positioned in the left region of the display, for example, the float control interface may be displayed within the left region of the display. Conversely, if the block of text is positioned in the middle or right regions of the display, the float control interface may be displayed within the middle and right regions of the display, respectively. In some embodiments, the float control interface animates or “snaps into place” in the appropriate region of the display. -
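The float positioning described above can be sketched in the same spirit: the float control interface's position on the display determines the float setting of the block of text, and the control ultimately rests in the same region. This Python sketch is illustrative; the equal-thirds display regions and the function name are assumptions, not part of the disclosure.

```python
def float_for_position(control_x, display_width):
    """Map the float control interface's horizontal position on the
    display to a float position for the block of text.

    Because the control snaps into the region associated with the
    resulting position, the same mapping also yields the control's
    final region.
    """
    relative = control_x / display_width
    if relative < 1 / 3:
        return "float left"
    if relative < 2 / 3:
        return "float center"   # also described as the default position
    return "float right"
```

Dragging the control across the display thus moves the block of text (and, in turn, any repositioned neighboring text blocks) between the left, center, and right regions.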
FIG. 4 illustrates example alignments of a block of text and example positions of an alignment control interface, consistent with embodiments disclosed herein. FIG. 4 includes alignment control interfaces 410, 420, and 430. As shown in FIG. 4, alignment control interface 410 is displayed at the top left of text block 415, which corresponds to the left alignment of text block 415. In addition, the visual appearance of alignment control interface 410 depicts the left alignment of text block 415. As demonstrated in FIG. 4, when alignment control interface 410 is dragged from left to right (or from right to left), the alignment of the text block is also modified. Alignment control interface 420 is displayed in the middle of text block 425, which corresponds to the center alignment of text block 425 (also visually depicted in alignment control interface 420). And alignment control interface 430 is displayed at the top right of text block 435, which corresponds to the right alignment of text block 435 (also visually depicted in alignment control interface 430). -
FIGS. 5A, 5B, and 5C illustrate example positions of a block of text, consistent with embodiments disclosed herein. FIGS. 5A, 5B, and 5C illustrate various positions of the text block after the position of the text block has been modified using the methods described in connection with FIG. 3, above. As shown in FIG. 5A, for example, text block 510 is positioned as “float center.” In some embodiments, this can also be referred to as a default position. In contrast, text block 520 of FIG. 5B is positioned as “float left,” whereas text block 530 of FIG. 5C is positioned as “float right.” As FIGS. 5B and 5C illustrate, positioning a text block as “float left” or “float right” can result in neighboring text blocks being repositioned relative to the original text block. - Similarly,
FIGS. 5A, 5D, and 5E illustrate additional example positions of a block of text, consistent with embodiments disclosed herein. More specifically, FIGS. 5A, 5D, and 5E illustrate various positions of the text block after the position of the text block has been modified using the methods described in connection with FIG. 2, above. Text blocks 510, 540, and 550 of FIGS. 5A, 5D, and 5E are positioned as “float center,” or a default position, as described above. Furthermore, in these examples, text block 510 has been set to “left aligned,” text block 540 has been set to “center aligned,” and text block 550 has been set to “right aligned.” -
FIGS. 6A, 6B, and 6C show example alignments and positions of a block of text, consistent with embodiments disclosed herein. FIGS. 6A, 6B, and 6C illustrate various alignments of the text block after the alignment of the text block has been modified using the methods described in connection with FIG. 2, above. In addition, text blocks 610, 620, and 630 of FIGS. 6A, 6B, and 6C are positioned as “float right,” which, as described above, indicates that the neighboring text blocks have been repositioned relative to text blocks 610, 620, and 630. Furthermore, in these examples, text block 610 has been set to “center aligned,” text block 620 has been set to “left aligned,” and text block 630 has been set to “right aligned.” FIGS. 6A, 6B, and 6C demonstrate the joint use of alignment control and float control on a given block of text. - Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as examples only, with a true scope and spirit of the invention being indicated by the following claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/837,107 US9176940B2 (en) | 2013-03-15 | 2013-03-15 | System and method for text editor text alignment control |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/837,107 US9176940B2 (en) | 2013-03-15 | 2013-03-15 | System and method for text editor text alignment control |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140281942A1 true US20140281942A1 (en) | 2014-09-18 |
US9176940B2 US9176940B2 (en) | 2015-11-03 |
Family
ID=51534333
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/837,107 Active 2034-01-24 US9176940B2 (en) | 2013-03-15 | 2013-03-15 | System and method for text editor text alignment control |
Country Status (1)
Country | Link |
---|---|
US (1) | US9176940B2 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105808079A (en) * | 2014-12-29 | 2016-07-27 | 鸿合科技有限公司 | Method and device for quickly aligning object by means of gesture |
CN110929041A (en) * | 2019-11-20 | 2020-03-27 | 北京邮电大学 | Entity alignment method and system based on layered attention mechanism |
Citations (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5760773A (en) * | 1995-01-06 | 1998-06-02 | Microsoft Corporation | Methods and apparatus for interacting with data objects using action handles |
US5778404A (en) * | 1995-08-07 | 1998-07-07 | Apple Computer, Inc. | String inserter for pen-based computer systems and method for providing same |
US20020032018A1 (en) * | 2000-07-31 | 2002-03-14 | Motient Communications Inc. | Communication system with wireless electronic mail or messaging integrated and/or associated with application program residing on remote computing device |
US20020032705A1 (en) * | 1998-06-17 | 2002-03-14 | Nobuya Higashiyama | Method and system for placing an insertion point in an electronic document |
US6535615B1 (en) * | 1999-03-31 | 2003-03-18 | Acuson Corp. | Method and system for facilitating interaction between image and non-image sections displayed on an image review station such as an ultrasound image review station |
US20050177786A1 (en) * | 2000-09-25 | 2005-08-11 | Adobe Systems Incorporated, A Delaware Corporation | Text composition spacing amount setting device with icon indicators |
US20060036946A1 (en) * | 2004-08-16 | 2006-02-16 | Microsoft Corporation | Floating command object |
US20060055700A1 (en) * | 2004-04-16 | 2006-03-16 | Niles Gregory E | User interface for controlling animation of an object |
WO2006036887A2 (en) * | 2004-09-28 | 2006-04-06 | Yost David A | Improved system of gui text cursor, caret, and selection |
US7046848B1 (en) * | 2001-08-22 | 2006-05-16 | Olcott Peter L | Method and system for recognizing machine generated character glyphs and icons in graphic images |
US20070176922A1 (en) * | 2006-01-27 | 2007-08-02 | Sony Corporation | Information display apparatus, information display method, information display program, graphical user interface, music reproduction apparatus, and music reproduction program |
US20090319879A1 (en) * | 2007-10-19 | 2009-12-24 | Jeffrey Scott | Double click inline edit / single click action |
US20110239153A1 (en) * | 2010-03-24 | 2011-09-29 | Microsoft Corporation | Pointer tool with touch-enabled precise placement |
US20130007606A1 (en) * | 2011-06-30 | 2013-01-03 | Nokia Corporation | Text deletion |
US20130127703A1 (en) * | 2011-08-31 | 2013-05-23 | Max A. Wendt | Methods and Apparatus for Modifying Typographic Attributes |
US9043349B1 (en) * | 2012-11-29 | 2015-05-26 | A9.Com, Inc. | Image-based character recognition |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5867144A (en) | 1991-11-19 | 1999-02-02 | Microsoft Corporation | Method and system for the direct manipulation of information, including non-default drag and drop operation |
US6088027A (en) | 1998-01-08 | 2000-07-11 | Macromedia, Inc. | Method and apparatus for screen object manipulation |
US7254787B2 (en) | 2001-02-15 | 2007-08-07 | Denny Jaeger | Method for formatting text by hand drawn inputs |
US7703036B2 (en) | 2004-08-16 | 2010-04-20 | Microsoft Corporation | User interface for displaying selectable software functionality controls that are relevant to a selected object |
US8201109B2 (en) | 2008-03-04 | 2012-06-12 | Apple Inc. | Methods and graphical user interfaces for editing on a portable multifunction device |
US8117556B2 (en) | 2008-03-31 | 2012-02-14 | Vistaprint Technologies Limited | Target-alignment-and-drop control for editing electronic documents |
KR101524616B1 (en) | 2008-07-07 | 2015-06-02 | 엘지전자 주식회사 | Controlling a Mobile Terminal with a Gyro-Sensor |
US9875013B2 (en) | 2009-03-16 | 2018-01-23 | Apple Inc. | Methods and graphical user interfaces for editing on a multifunction device with a touch screen display |
KR101484826B1 (en) | 2009-08-25 | 2015-01-20 | 구글 잉크. | Direct manipulation gestures |
US20120159318A1 (en) | 2010-12-15 | 2012-06-21 | Microsoft Corporation | Full screen view reading and editing user interface |
KR101842457B1 (en) | 2011-03-09 | 2018-03-27 | 엘지전자 주식회사 | Mobile twrminal and text cusor operating method thereof |
Non-Patent Citations (2)
Title |
---|
Dummies.com, "How to Select Text for Editing on the Samsung Galaxy Tab," Sep 1, 2012, pp 1-2, http://www.dummies.com/how-to/content/how-to-select-text-for-editing-on-the-s *
Stackoverflow.com, "setTimeout() and setting parameters," Mar 1, 2013, pp 1-5, http://stackoverflow.com/questions/5169020/settimeout-and-setting-parameters *
Also Published As
Publication number | Publication date |
---|---|
US9176940B2 (en) | 2015-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11316805B2 (en) | Method for transmitting message and electronic device thereof | |
US10180780B2 (en) | Portable electronic device including touch-sensitive display and method of controlling selection of information | |
KR101952177B1 (en) | Mobile terminal and control method thereof | |
KR20140051719A (en) | Mobile terminal and control method thereof | |
KR20150009204A (en) | Mobile terminal and method for controlling the same | |
CN105745612B (en) | For showing the readjustment size technology of content | |
JP2018535462A (en) | Touch heat map | |
US10429946B2 (en) | Electronic device and method for rendering secondary characters | |
US20150138192A1 (en) | Method for processing 3d object and electronic device thereof | |
JP6625312B2 (en) | Touch information recognition method and electronic device | |
JP2017525076A (en) | Character identification method, apparatus, program, and recording medium | |
US9239647B2 (en) | Electronic device and method for changing an object according to a bending state | |
US20140181734A1 (en) | Method and apparatus for displaying screen in electronic device | |
US20170139584A1 (en) | User account switching interface | |
US9176940B2 (en) | System and method for text editor text alignment control | |
KR20140062747A (en) | Method and apparatus for selecting display information in an electronic device | |
EP2778873B1 (en) | System and method for text editor text alignment control | |
KR20130091181A (en) | Mobile terminal and control method thereof | |
EP2669779B1 (en) | Portable electronic device including touch-sensitive display and method of controlling same | |
US11973723B2 (en) | Method for transmitting message and electronic device thereof | |
KR20130091184A (en) | Mobile terminal and docking system thereof | |
KR101729981B1 (en) | Method for content control and mobile terminal using this method | |
KR101968524B1 (en) | Mobile terminal and control method thereof | |
US20140157146A1 (en) | Method for retrieving file and electronic device thereof | |
KR20130097371A (en) | Mobile terminal and controlling method thereof, and recording medium thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: RESEARCH IN MOTION LIMITED, CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DENT, TERRILL MARK;REEL/FRAME:030069/0861 Effective date: 20130322 |
|
AS | Assignment |
Owner name: BLACKBERRY LIMITED, ONTARIO Free format text: CHANGE OF NAME;ASSIGNOR:RESEARCH IN MOTION LIMITED;REEL/FRAME:033987/0576 Effective date: 20130709 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |
|
AS | Assignment |
Owner name: MALIKIE INNOVATIONS LIMITED, IRELAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BLACKBERRY LIMITED;REEL/FRAME:064104/0103 Effective date: 20230511 |
|
AS | Assignment |
Owner name: MALIKIE INNOVATIONS LIMITED, IRELAND Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:BLACKBERRY LIMITED;REEL/FRAME:064271/0199 Effective date: 20230511 |