US20140298247A1 - Display device for executing plurality of applications and method of controlling the same - Google Patents
- Publication number
- US20140298247A1 (Application No. US 14/183,726)
- Authority
- US
- United States
- Prior art keywords
- window
- command
- input
- application
- display device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
Abstract
In one aspect, a method is provided for controlling a display device, comprising: displaying, on a touchscreen display, a first window executing a first application and a second window executing a second application; receiving, at the touchscreen display, a first command input to the first window and a second command input to the second window; determining whether the first command and the second command are received simultaneously; dispatching, by a processor, the first command; and dispatching, by the processor, the second command.
Description
- This application claims priority under 35 U.S.C. §119(a) to Korean Application Serial No. 10-2013-0034042, which was filed in the Korean Intellectual Property Office on Mar. 29, 2013, the entire content of which is hereby incorporated by reference.
- 1. Technical Field
- The present disclosure relates generally to a display device for executing a plurality of applications and a method of controlling the same, and more particularly, to a display device for controlling a display of a window in which a plurality of applications is executed and a method of controlling the same.
- 2. Description of the Related Art
- A desktop computer may have multiple display devices (e.g., multiple monitors), while a mobile device may have only one display device (e.g., a touch screen). A user of a desktop computer may divide a screen of the desktop computer's display device according to a working environment (for example, horizontally or vertically divide the screen while displaying a plurality of windows) and use the divided screens. When a web browser is executed, the user can move a web page displayed by the web browser in an up or down direction by using a page up button or a page down button arranged in a keyboard. When the user uses a mouse instead of the keyboard, the user can move the web page in the up or down direction by selecting a scroll bar located in a side part of the web page by using a cursor, or by selecting a top button displayed as a text or an icon located in a bottom part of the web page.
- In contrast to desktop computers, mobile devices tend to have smaller screen sizes. The smaller screen sizes make it more difficult to divide portable device screens among multiple applications. Although various applications which stimulate consumers' curiosity and satisfy consumers' demands may be provided on a given mobile device, the limited screen size and User Interface (UI) of that device may make it inconvenient to execute a plurality of applications in the mobile device at the same time. For example, when one application is executed in a given mobile device, the application may be displayed in the entire display area. When the user desires to execute another application, the user should first end the currently executed application and then select an execution key for executing the desired application. That is, in order to execute several applications in the mobile device, processes of executing and ending the respective applications may need to be repeated, which in turn could make the user feel inconvenienced.
- Furthermore, mobile devices have more limited User Interface (UI) facilities than desktop computers. For example, when a plurality of applications are displayed on a desktop computer, commands may be simultaneously input into different applications from the plurality. By contrast, display controls of mobile devices may be configured to route all commands input into the mobile devices' touchscreen to only one application—namely, the application whose window has the highest activity order. Accordingly, the user of a given mobile device cannot simultaneously use all windows that are displayed on the device's screen.
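The single-focus behavior described above can be sketched as follows. This is an illustrative model (all class and function names are invented, not taken from the patent) of a dispatcher that routes every command to the window with the highest activity order, so a command aimed at a lower window never reaches its application.

```python
# Sketch of conventional single-focus dispatch: every touch command is routed
# to the window with the highest activity (Z) order, regardless of where on
# the screen the command lands. Names and values are illustrative.

class Window:
    def __init__(self, app_name, z_order):
        self.app_name = app_name
        self.z_order = z_order
        self.received = []          # commands this window has handled

def dispatch_to_top(windows, command):
    """Route the command to the top-most window only."""
    top = max(windows, key=lambda w: w.z_order)
    top.received.append(command)
    return top.app_name

browser = Window("browser", z_order=2)
player  = Window("player",  z_order=1)

# Both taps go to the browser, even the one aimed at the player's window.
dispatch_to_top([browser, player], "tap@(40, 40)")
dispatch_to_top([browser, player], "tap@(400, 40)")
print(browser.received)   # both commands
print(player.received)    # nothing
```

The player's window never receives input, which is exactly the limitation the following aspects address.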
- The above limitation may become more noticeable when a plurality of users need to control different windows on the same display device (e.g., touch screen), as it could prevent each of the users from interacting with his or her respective application(s) freely. When the plurality of users input commands into their respective windows, the window having the highest activity order receives all the commands, so that only one of the applications can be controlled, to the exclusion of all others.
- The present invention has been made to solve the above-mentioned problems and provides additional advantages by providing a display device that, when commands are input into a plurality of windows, respectively, inputs an individual command into each of the windows, and a method of controlling the same.
- In one aspect, a method is provided for controlling a display device, comprising: displaying, on a touchscreen display, a first window executing a first application and a second window executing a second application; receiving, at the touchscreen display, a first command input to the first window and a second command input to the second window; determining whether the first command and the second command are received simultaneously; dispatching, by a processor, the first command; and dispatching, by the processor, the second command.
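A minimal sketch of this dispatch step, under the assumption that each window occupies an axis-aligned rectangle and each command carries the screen coordinates at which it was input; the window names and geometry below are illustrative, not taken from the patent.

```python
# Hedged sketch: each simultaneously received command is matched to the
# window whose bounds contain the command's input position, then delivered
# to that window rather than to a single focused window.

class Window:
    def __init__(self, app_name, x, y, width, height):
        self.app_name = app_name
        self.bounds = (x, y, x + width, y + height)
        self.events = []

    def contains(self, px, py):
        x1, y1, x2, y2 = self.bounds
        return x1 <= px < x2 and y1 <= py < y2

def dispatch(windows, commands):
    """Match each command's input position to a window and deliver it there."""
    delivered = {}
    for (px, py, name) in commands:
        for w in windows:
            if w.contains(px, py):
                w.events.append(name)
                delivered[name] = w.app_name
                break
    return delivered

# Two side-by-side windows, two commands received simultaneously.
first  = Window("first_app",  0,   0, 360, 640)
second = Window("second_app", 360, 0, 360, 640)
result = dispatch([first, second], [(100, 300, "scroll"), (500, 300, "tap")])
print(result)   # {'scroll': 'first_app', 'tap': 'second_app'}
```

Each command reaches its own window, so both applications can be controlled at once.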
- In another aspect, a display device for executing an application is provided, the display device comprising a touch screen coupled to a controller. The touch screen is configured to display a first window executing a first application and a second window executing a second application, and receive a first command input to the first window and a second command input to the second window, the first command and the second command being received simultaneously. The controller is configured to match a position of the first window with an input position of the first command, match a position of the second window with an input position of the second command, dispatch the first command based on whether the position of the first window is determined to match the input position of the first command, and dispatch the second command based on whether the position of the second window is determined to match the input position of the second command.
- In yet another aspect, a method is provided for controlling a display device, the method comprising: displaying a plurality of windows, each window executing a different application; receiving a plurality of commands that are input into the plurality of windows, each command being received at a different window, wherein the commands are received simultaneously; and dispatching, by a processor, each of the commands to a different one of the plurality of windows.
- In yet another aspect, a display device for executing applications is provided, the display device comprising a touchscreen coupled to a controller. The touchscreen is configured to display a plurality of windows executing applications. The controller is configured to receive a plurality of commands that are input into the plurality of windows, each command being received at a different window, wherein the commands are received simultaneously; and dispatch each of the commands to a different one of the plurality of windows.
- In yet another aspect, a method for controlling a display device is provided, the method comprising: receiving commands directed to a first application and a second application, the first application being executed in a first window and the second application being executed in a second window; determining, by a processor, a position of the first window and a position of the second window, the determining being performed by the kernel of an operating system executed by the processor; displaying the first window and the second window; receiving a first command input to the first window and a second command input to the second window, the first command and the second command being received simultaneously; matching an input position of the first command with the position of the first window, the matching being performed by the kernel of the operating system; matching an input position of the second command with the position of the second window, the matching being performed by the kernel of the operating system; outputting a first event corresponding to the first command based on the matching; and outputting a second event corresponding to the second command based on the matching.
- In yet another aspect, a method for controlling a display device is provided, the method comprising: receiving commands directed to a first application and a second application, the first application being executed in a first window and the second application being executed in a second window; determining, by a processor, a position of the first window and a position of the second window, the determining being performed at a platform level; displaying the first window and the second window; receiving a first command input to the first window and a second command input to the second window, the first command and the second command being received simultaneously; matching an input position of the first command with the position of the first window, the matching being performed at the platform level; matching an input position of the second command with the position of the second window, the matching being performed at the platform level; outputting a first event corresponding to the first command based on the matching; and outputting a second event corresponding to the second command based on the matching.
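The two method variants above differ only in where the position matching runs (the operating-system kernel versus the platform level). The shared pipeline can be sketched as follows; the simultaneity threshold and all names are invented for illustration and do not appear in the patent.

```python
# Hedged sketch of the shared pipeline: record each window's position, decide
# whether two touches arrived simultaneously (within an assumed threshold),
# match each touch's position to its window, and emit one event per window.

SIMULTANEITY_WINDOW_MS = 50   # illustrative threshold, not from the patent

def is_simultaneous(t1_ms, t2_ms):
    return abs(t1_ms - t2_ms) <= SIMULTANEITY_WINDOW_MS

def match_touch(window_rects, touch):
    """Return the id of the window whose rect contains the touch point."""
    px, py = touch["pos"]
    for win_id, (x1, y1, x2, y2) in window_rects.items():
        if x1 <= px < x2 and y1 <= py < y2:
            return win_id
    return None

def process(window_rects, touch_a, touch_b):
    """Emit (window_id, command) events for a simultaneous pair of touches."""
    events = []
    if is_simultaneous(touch_a["t"], touch_b["t"]):
        for touch in (touch_a, touch_b):
            win_id = match_touch(window_rects, touch)
            if win_id is not None:
                events.append((win_id, touch["cmd"]))
    return events

# Window positions determined beforehand (kernel- or platform-level).
rects = {"win1": (0, 0, 400, 600), "win2": (400, 0, 800, 600)}
out = process(rects,
              {"t": 100, "pos": (50, 50),  "cmd": "first"},
              {"t": 120, "pos": (450, 50), "cmd": "second"})
print(out)   # [('win1', 'first'), ('win2', 'second')]
```

Running the matching below the application layer means no single application needs focus for both events to be delivered.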
- The above and other aspects, features, and advantages of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
-
FIG. 1 is a block diagram of a display device according to aspects of the disclosure; -
FIG. 2A is a schematic diagram of the display device in accordance with aspects of the disclosure; -
FIG. 2B is a schematic diagram illustrating a Z-order arrangement of active application windows; -
FIG. 3A is a diagram of a window displaying screen presenting a plurality of applications in a freestyle mode according to aspects of the disclosure; -
FIG. 3B is a diagram of a multi-window framework in accordance with aspects of the disclosure; -
FIG. 3C and FIG. 3D are diagrams illustrating a technique for screen order change in accordance with aspects of the disclosure; -
FIG. 4A and FIG. 4B are diagrams depicting the operation of a display device in a split mode in accordance with some aspects of the disclosure; -
FIG. 5 is a diagram illustrating the operation of a display device in a freestyle mode in accordance with aspects of the disclosure; -
FIG. 6A is a flowchart of a method for controlling a display device according to aspects of the disclosure; -
FIG. 6B is a flowchart of another method for controlling a display device according to aspects of the disclosure; -
FIG. 7 is a flowchart of yet another method for controlling a display device according to aspects of the disclosure; -
FIG. 8A and FIG. 8B are diagrams of systems for receiving user commands that are simultaneously input into a display device according to aspects of the disclosure; and -
FIG. 9A, FIG. 9B, and FIG. 9C are schematic diagrams of display devices in accordance with aspects of the disclosure. - Hereinafter, aspects of the disclosure will be described with reference to the accompanying drawings. However, the disclosure is not limited or restricted by the examples presented therein. The same reference numerals shown in the respective drawings indicate elements that perform substantially the same function.
-
FIG. 1 depicts a block diagram of a display device 100 according to aspects of the disclosure. The display device 100 may be a mobile device, such as a smart phone or a tablet computer (with or without cellular capability); a non-mobile device, such as a desktop computer or a TV display; or any other suitable type of device. The display device 100 may be connected to an external device (not shown) by using a mobile communication module 120, a sub communication module 130, and a connector 165. The “external device” may include a device (not shown) different from the display device 100, such as a mobile phone (not shown), a smart phone (not shown), a tablet PC (not shown), or a server (not shown). - The
display device 100 may include a touch screen 190 and a touch screen controller 195. Also, the display device 100 may include a controller 110, a mobile communication module 120, the sub communication module 130, a multimedia module 140, a camera module 150, a GPS module 155, an input/output module 160, a sensor module 170, a storage unit 175, and a power supply 180. - The
controller 110 may include a CPU 111, a read-only memory (ROM) 112 for storing a control program for controlling the display device 100, and a random access memory (RAM) 113. The RAM 113 may be used to store a signal or data that is input into the display device 100 and/or intermediate data generated as a result of operations performed in the display device 100. The CPU 111 may include a single core, a dual core, a triple core, a quad core, and/or any other suitable type of CPU. The CPU 111, the ROM 112, and the RAM 113 may be connected through an internal bus. The controller 110 may control the mobile communication module 120, the sub communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, the storage unit 175, the power supply 180, the touch screen 190, and the touch screen controller 195. - The
mobile communication module 120 may connect the display device 100 with the external device through mobile communication by using one or more antennas (not shown). The mobile communication module 120 may transmit/receive wireless signals for voice phone communications, video phone communications, Short Message Service (SMS) communications, Multimedia Message Service (MMS) communications, and/or any other suitable type of communications with another similar device. - The
sub communication module 130 may include at least one of the wireless LAN module 131 and the near field communication module 132. For example, the sub communication module 130 may include only the wireless LAN module 131, only the near field communication module 132, or both the wireless LAN module 131 and the near field communication module 132. The wireless LAN module 131 may be connected to the Internet via a wireless Access Point (AP) (not shown). The wireless LAN module 131 may support a wireless LAN standard (IEEE 802.11x) of the Institute of Electrical and Electronics Engineers (IEEE). The near field communication module 132 may wirelessly perform near field communications between the display device 100 and an image forming device (not shown). Near field communication techniques may include Bluetooth, Infrared Data Association (IrDA), and the like. - The
display device 100 may include one or more of the mobile communication module 120, the wireless LAN module 131, and the near field communication module 132. For example, the display device 100 may include a combination of the mobile communication module 120, the wireless LAN module 131, and the near field communication module 132. - The
multimedia module 140 may include one or more of the broadcasting communication module 141, the audio reproduction module 142, and the video reproduction module 143. The broadcasting communication module 141 may receive a broadcasting signal (for example, a TV broadcasting signal, a radio broadcasting signal, or a data broadcasting signal). In addition, the broadcasting communication module 141 may receive broadcasting additional information (for example, an Electric Program Guide (EPG) or an Electric Service Guide (ESG)) broadcasted from a broadcasting station through a broadcasting communication antenna (not shown) according to a control of the controller 110. The audio reproduction module 142 may reproduce a digital audio file (for example, a file having an extension of mp3, wma, ogg, or wav) stored or received according to a control of the controller 110. The video reproduction module 143 may reproduce a digital video file (for example, a file having an extension of mpeg, mpg, mp4, avi, mov, or mkv) stored or received according to a control of the controller 110. The video reproduction module 143 may also reproduce the digital audio file. - The
camera module 150 may include at least one of the first camera 151 and the second camera 152. Although not shown, the camera module 150 may also include an auxiliary light source (for example, a flash (not shown)) for providing an amount of light required for photographing. By way of example, the first camera 151 may be disposed in a front surface of the display device 100, and the second camera 152 may be disposed in a rear surface of the display device 100. As another example, the first camera 151 and the second camera 152 may be disposed adjacent to each other, so as to permit the capture of three-dimensional images or video. - The
GPS module 155 may receive radio waves from a plurality of GPS satellites (not shown) in Earth orbit and calculate a position of the display device 100 by using the Time of Arrival from the GPS satellites (not shown) to the display device 100. - The input/
output module 160 may include at least one of the plurality of buttons 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, and the keypad 166. The buttons 161 may be formed in a front surface, a side surface, or a rear surface of the housing of the display device 100, and may include at least one of a power/lock button (not shown), a volume button (not shown), a menu button, a home button, a back button, and a search button 161. The microphone 162 may receive voice or other audio and generate an electrical signal according to a control of the controller 110. The speaker 163 may include one or more speakers disposed in a proper position or positions in the housing of the display device 100. Any suitable type or number of speakers may be used. The vibration motor 164 may include one or more vibration motors disposed within the housing of the display device 100. Each vibration motor may convert an electrical signal to a mechanical vibration. For example, when the display device 100 receives an incoming call, the vibration motor 164 may activate so as to notify the user of the call. As another example, the vibration motor 164 may operate in response to a touch action of the user on the touch screen 190 and continuous motions of the touch on the touch screen 190. - The
connector 165 may include an interface (e.g., a USB interface or any other suitable type of interface) for connecting the display device 100 with an external device (not shown) or a power source (not shown). The connector 165 may transmit data stored in the storage unit 175 of the display device 100 to the external device (not shown) through a wired cable connected to the connector 165, or receive data from the external device (not shown). Power may be input, or a battery (not shown) may be charged, from the power source (not shown) through the wired cable connected to the connector 165. - The
keypad 166 may include a physical keypad (not shown) formed in the display device 100 and/or a virtual keypad (not shown) displayed on the touch screen 190. In this example, the keypad 166 may be a virtual keypad. The sensor module 170 may include at least one sensor for detecting a state of the display device 100. For example, the sensor module 170 may include a proximity sensor for detecting whether the user is close to the display device 100, an illumination sensor (not shown) for detecting an amount of light adjacent to the display device 100, or a motion sensor (not shown) for detecting an operation of the display device 100 (for example, a rotation of the display device 100, or an acceleration or vibration applied to the display device 100). At least one sensor may detect a state of the display device, generate a signal corresponding to the detection, and transmit the generated signal to the controller 110. The sensor module 170 may include any suitable type and/or number of sensors. - The
storage unit 175 may store signals or data input/output as a result of the operation of the mobile communication module 120, the sub communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the input/output module 160, the sensor module 170, and the touch screen 190 according to a control of the controller 110. The storage unit 175 may store a control program (executed by the controller 110) for controlling the display device 100, as well as applications. The storage unit 175 may include a memory card (not shown) (for example, an SD card or a memory stick) mounted to the display device 100, the ROM 112 or the RAM 113 within the controller 110, nonvolatile memory, volatile memory, a Hard Disk Drive (HDD), or a Solid State Drive (SSD). - The
power supply 180 may include a battery and/or other circuitry for supplying power to the display device 100. Further, the power supply 180 may receive power from an external power source (not shown) through the wired cable connected to the connector 165. - The
touch screen 190 may be a resistive type of touch screen, a capacitive type of touch screen, an infrared type of touch screen, an acoustic wave type of touch screen, and/or any other suitable type of touch screen. The touch screen 190 may provide a user interface for accessing various services, such as telephony services, data transmission, data broadcasting, and camera services. The touch screen 190 may transmit an analog signal corresponding to at least one touch input into the user interface to the touch screen controller 195. The touch screen 190 may receive at least one touch through a body part of the user (for example, fingers including a thumb) or a touchable input means. Also, the touch screen 190 may receive continuous motions as input and may transmit an analog signal corresponding to the continuous motions to the touch screen controller 195. - The touch according to the present invention is not limited to a touch between the
touch screen 190 and the body part of the user or the touchable input means, but may include a non-touch (for example, a case where a detectable interval between the touch screen 190 and the body part of the user or the touchable input means is equal to or smaller than 1 mm). The detectable interval of the touch screen 190 may vary according to the capabilities or structure of the display device 100. - The
touch screen controller 195 may include circuitry for converting an analog signal received from the touch screen 190 to a digital signal (for example, X and Y coordinates). The digital signal may be provided to the controller 110, which may in turn change the state of the touch screen 190 based on the digital signal. For example, the controller 110 may allow a shortcut execution icon (not shown) displayed on the touch screen 190 to be selected or executed in response to the touch. In some implementations, the touch screen controller 195 may be integrated into the controller 110. -
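The conversion of an analog touch reading into digital X and Y coordinates can be illustrated with a simple scaling step; the ADC depth and screen resolution below are assumptions for the sketch, not values from the patent.

```python
# Hedged sketch of the touch screen controller's conversion step: a raw
# per-axis reading (here modeled as a 12-bit ADC count) is scaled into
# integer pixel coordinates before being handed to the main controller.

ADC_MAX = 4095                 # assumed 12-bit ADC full-scale count
SCREEN_W, SCREEN_H = 1280, 800 # assumed screen resolution in pixels

def to_screen_coords(raw_x, raw_y):
    """Map raw per-axis ADC counts to integer pixel coordinates."""
    x = raw_x * (SCREEN_W - 1) // ADC_MAX
    y = raw_y * (SCREEN_H - 1) // ADC_MAX
    return x, y

print(to_screen_coords(0, 0))          # (0, 0)
print(to_screen_coords(4095, 4095))    # (1279, 799)
print(to_screen_coords(2048, 2048))    # roughly mid-screen
```

The resulting (X, Y) pair is the digital signal the controller 110 uses to hit-test icons and windows.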
FIG. 2A is a schematic diagram of the device 100 in accordance with aspects of the disclosure. Referring to FIG. 2A, the touch screen 190 is disposed in a center of a front surface 100a of the display device 100, covering a substantial portion of the area of the front surface 100a of the display device 100. The first camera 151 and an illumination sensor 170a may be disposed in an edge of the front surface 100a of the display device 100. A power/reset button 161a, a volume button 161b, the speaker 163, a terrestrial DMB antenna 141a, the microphone (not shown), the connector (not shown), and the like may be disposed on the side surface 100b of the display device 100, and the second camera 152 (not shown) may be disposed on the rear surface (not shown) of the display device 100. - The
touch screen 190 may include a main screen 210 and a lower bar 220. In the example of FIG. 2A, the touch screen 190 is horizontally arranged, and as such, the display device 100 and the touch screen 190 may have a horizontal length larger than a vertical length. In other examples, however, the touch screen 190 may be vertically arranged. - In some aspects, the
main screen 210 may include an area where one application or a plurality of applications are executed. FIG. 2A shows an example where a home screen is displayed on the touch screen 190. The home screen may be the first screen displayed on the touch screen 190 when the display device 100 is turned on. Execution keys 212 for executing a plurality of applications stored in the display device 100 may be arranged on the home screen in rows and columns. The execution keys 212 may be formed as icons, buttons, texts, or the like. When an execution key 212 is touched, an application corresponding to the touched execution key 212 may be executed and then displayed on the main screen 210. - The
lower bar 220 may stretch along the lower end of the touch screen 190 and may include a home screen button 222, a back button 224, a multi-view mode button 226, and a mode switching button 228. In one aspect, pressing the home screen button 222 may cause the home screen to be displayed on the main screen 210. In another aspect, pressing the back button 224 may cause the screen executed just before the currently executed screen to be presented on the main screen. Additionally or alternatively, pressing the back button 224 may cause the most recently used application to be terminated. In yet another aspect, pressing the multi-view mode button 226 may cause a plurality of applications to be displayed on the main screen 210 in a multi-view mode. In yet another aspect, pressing the mode switching button 228 may change the mode in which currently executed applications are displayed on the main screen 210. For example, when the mode switching button 228 is touched, a switch is performed between a freestyle mode and a split mode. The freestyle and split modes are discussed further below.
lower bar 220 may be omitted, in which case the main screen 210 may occupy the entire area of the touch screen 190. Furthermore, in some implementations, the lower bar 220 and the upper bar (not shown) may be translucently displayed overlapping the main screen 210. -
FIG. 2B is a schematic diagram illustrating a Z-order arrangement of active application windows. As illustrated, under the Z-order arrangement, the screen may be divided into N layers, such that the Nth layer is associated with a higher rank than the (N−1)th layer. Each layer may have a corresponding window, and each application may be executed in its corresponding window. For example, when a first application is executed, the first application may be executed in the window on a first layer. Similarly, when a second application is executed, the second application may be executed in the window on a second layer. And when a third application is executed, the third application may be executed in the window on a third layer. The layers on which applications are generated may be hierarchically organized, permitting a plurality of windows (first to fourth windows) to be displayed overlapping one another on the main screen 210. More specifically, a first window 1 may be displayed on top of a second window 2; the second window 2 may be displayed on top of a third window 3; and the third window 3 may be displayed on top of a fourth window 4. Thus, when the plurality of windows 1 to 4 overlap (at least partially), they may have a display order with respect to the z-axis, herein referred to as a Z-order, which determines which window(s) are displayed on top of the rest. A layer viewer 5 may be a screen in which the Z-order hierarchy is visualized. -
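The layered arrangement just described can be sketched as a simple stack, with each newly executed application placed on the topmost layer. This is an illustrative sketch only; the class and method names are assumptions, not part of the disclosure.

```python
class LayerStack:
    """Illustrative Z-order stack: index 0 is the bottom (first) layer,
    and the last index is the topmost layer, as in FIG. 2B."""

    def __init__(self):
        self.layers = []  # one window name per layer

    def open_window(self, window):
        # A newly executed application's window goes on the topmost layer.
        self.layers.append(window)

    def z_order(self):
        # Draw priority: topmost window first.
        return list(reversed(self.layers))


stack = LayerStack()
# Windows opened in this order end up on layers 1 through 4.
for name in ["window 4", "window 3", "window 2", "window 1"]:
    stack.open_window(name)

# "window 1" was opened last, so it is displayed on top of the rest.
print(stack.z_order())  # ['window 1', 'window 2', 'window 3', 'window 4']
```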
FIG. 3A is a diagram of a window displaying screen presenting a plurality of applications in a freestyle mode according to aspects of the disclosure. In some aspects, the freestyle mode may be a display mode in which a plurality of windows is permitted to overlap. As illustrated in FIG. 3A, a display device 300 may include a touch screen 310. A plurality of windows 311 and 312 may be displayed on the touch screen 310. Further, a lower bar 320 may be displayed at the lower end of the touch screen 310. Each of the windows 311 and 312 may include an execution screen of a particular application, a title bar for the executed application, and/or a control area. Objects related to the application may be displayed on the execution screen of the application. The objects may include text, an image, a button, a check box, a picture, a video, a web page, a map, and the like. When a user touches an object, a function or event associated with that object may be performed in the object's corresponding application. The object may be called a "view" according to an operating system. The title bar may include at least one control key for controlling a display of the window. For example, the control key may be a window display minimizing button, a window display maximizing button, or a window ending button. - Meanwhile, applications are programs independently implemented by a manufacturer of the
display device 300 or an application developer. Accordingly, the prior execution of one application is not required in order to execute another application, and even when one application ends, another application can continue to be executed. - The applications are distinguished from a complex function application (or dual application), which is generated by adding some functions provided by another application (e.g., a memo function or a message transmission/reception function) to the functions of one application, in that the applications are independently implemented programs. The complex function application is a single application newly produced to have various functions, and thus differs from conventional applications: it provides only limited versions of the added functions rather than the full range of functions provided by the conventional applications. Further, users bear the burden of separately purchasing such a new complex function application.
- The
controller 110 may control the display of the windows 311 and 312. For example, the controller 110 can set a display rank for each of the windows 311 and 312, such as a first display rank for the window 311 and a second display rank for the window 312. Moreover, the controller 110 may cause windows having a relatively higher display rank, such as window 311, to be superimposed on windows that have a lower display rank, such as window 312. - In some aspects, the
controller 110 may assign the display rank of the windows 311 and 312 based on the order in which control events are input to the windows. For example, the controller 110 may give the highest display rank to the window in which a control event was last input. As another example, when the user touches the window 311, the controller 110 may give the highest display rank to the window 311. Afterwards, when the user touches the window 312, the controller 110 may give the highest display rank to the window 312. Stated succinctly, in some aspects, the rank may be a number, a string, and/or another type of indication that is assigned to application windows by the controller 110 and used to determine the Z-order in which the application windows are displayed. -
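The rank reassignment just described can be sketched as follows. The function name and the list representation are assumptions made for illustration, with index 0 standing for the highest display rank.

```python
def touch_window(display_order, window):
    """Give the touched window the highest display rank and demote
    the windows that were above it by one level each."""
    display_order.remove(window)
    display_order.insert(0, window)  # index 0 = highest rank
    return display_order


order = ["window 311", "window 312"]   # window 311 currently on top
touch_window(order, "window 312")      # user touches window 312
print(order)  # ['window 312', 'window 311']
touch_window(order, "window 311")      # user then touches window 311
print(order)  # ['window 311', 'window 312']
```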
FIG. 3B is a diagram of a multi-window framework in accordance with aspects of the disclosure. In some aspects, the multi-window framework may be used to simultaneously display two or more applications. -
FIG. 3B , in aplatform 270, anactivity manager 291, awindow manager 292, and aview system 294 may interact with amulti window platform 400 via one or more Application Program Interface (API) calls. - An
activity manager 291 serves to activate applications such that a plurality of applications can be performed simultaneously. The window manager 292 draws and controls a plurality of windows, for example, in response to inputs that touch, move, or resize the windows. A content provider 293 may enable an application to access data from another application or to share its own data. A view system 294 serves to process the layout, border, and buttons of a single window and redraws the entire screen. A package manager 275 serves to process and manage applications. A telephony manager 276 serves to process and manage telephone communication. A resource manager 277 provides access to non-code resources, such as localized character strings, graphics, layout files, and the like. A location manager 278 serves to process and manage location information using a GPS. A notification manager 279 serves to process and manage events generated in the system, for example, an alarm, a battery event, or a network connection. - In some aspects, each of the
activity manager 291, the window manager 292, the view system 294, and the platform 400 may be implemented in software that is executed by the controller 110 and/or another processor. The window manager 292 may be operable to generate a title bar for each window. Further, the window manager 292 may search for the Z-order of each window and determine the Z-order between the windows according to the search. The multi window platform 400 may include a multi window manager 410 and a multi window service 420. The multi window manager 410 provides the functions of the multi window service 420 to the user in an API form, and the Manager/Service structure may operate based on IPC (inter-process communication). The multi window service 420 may trace the lifecycles of applications executed with the multi window and manage the state of each application, such as its size, position, or the like. - The called API can manage the size, position, and visibility of each window.
- As described above, the framework according to the present disclosure may be implemented by providing an independent multi window framework whose functions are then invoked through API calls.
- Also, the
application layer 260 may directly call the API from the multi window manager 410. That is, a developer can use the API provided by the multi window manager 410 even when developing a new application. -
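The Manager/Service split described above can be sketched as follows. All of the names here (MultiWindowService, MultiWindowManager, resize, move, set_visibility) are hypothetical stand-ins for the API the text leaves unspecified; the point is only that the manager exposes, in API form, state that the service keeps per application.

```python
class MultiWindowService:
    """Traces the lifecycle state of each multi-window application,
    such as its size, position, and visibility."""

    def __init__(self):
        self.state = {}

    def set_state(self, app, **attrs):
        self.state.setdefault(app, {}).update(attrs)


class MultiWindowManager:
    """Exposes the service's functions to applications in API form."""

    def __init__(self, service):
        self._service = service

    def resize(self, app, width, height):
        self._service.set_state(app, size=(width, height))

    def move(self, app, x, y):
        self._service.set_state(app, position=(x, y))

    def set_visibility(self, app, visible):
        self._service.set_state(app, visible=visible)


service = MultiWindowService()
manager = MultiWindowManager(service)
manager.resize("web browser", 670, 800)
manager.move("web browser", 230, 0)
manager.set_visibility("web browser", True)
print(service.state["web browser"])
```

In the platform described by the text, the manager and service would communicate over IPC rather than through a direct method call as in this sketch.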
FIGS. 3C and 3D are diagrams illustrating a technique for screen order change in accordance with aspects of the disclosure. In the example of FIG. 3C, a user 302 may touch the window 312. The controller 110 may detect the touch and, in turn, give the highest display order to the window 312. Further, the controller 110 gives the next highest order to the window 311, the window that previously had the highest display order. That is, the controller 110 reduces the display order of the window 311 by one level when the window 312 is touched. Thus, in some aspects, the controller 110 may assign and reassign the display order of the windows 311 and 312 in response to touch input. FIG. 3D illustrates the screen displaying the windows 311 and 312 after their display order has been changed. As illustrated in FIG. 3D, the window 312, having the highest display order, may be at least partially superimposed on the window 311. FIGS. 4A and 4B are diagrams depicting the operation of a display device in a split mode in accordance with some aspects of the disclosure. In the example of FIG. 4A, two applications are displayed in a main screen 410 in a split mode. In the split mode, a first window 440 and a second window 450 may be displayed so as not to overlap with each other on the main screen 410. For example, as illustrated in FIG. 4A, the main screen 410 may be divided into two parts, with the first window 440 displayed in one part of the main screen 410 and the second window 450 in the other part. The first window 440 and the second window 450 may be arranged right next to each other so that they share a common boundary 470 without overlapping. - Referring to
FIG. 4B, an example is shown where a web browser application is executed in the window 440 while a messaging application is executed in the window 450. According to this example, a user may search for a desired restaurant through the web browser application executed in the window 440 on one touch screen 420 while making an appointment with a friend to have dinner at that restaurant. As illustrated in FIG. 4B, the user can search for information on the Internet by touching objects on the first window 440. Further, the user can talk to a friend through a message service by touching objects on the second window 450. -
FIG. 5 is a diagram illustrating the operation of the display device in a freestyle mode in accordance with aspects of the disclosure. A first user 10 can input a first touch 11 into an application execution screen 511 of the first window. Further, a second user 20 can simultaneously input a second touch 12 into an application execution screen 512 of the second window. In some aspects, the input of two touches may be considered simultaneous when the difference between an input time of the first touch 11 and an input time of the second touch 12 is smaller than a predetermined threshold (e.g., 0.5 seconds). Meanwhile, as described above, when two touches are simultaneously input to a conventional display device, the window having the highest activity order receives both the first touch 11 and the second touch 12. FIG. 6A is a flowchart of a method for controlling the display device according to aspects of the disclosure. The display device can display a plurality of windows, each displaying an application, in step S601. Meanwhile, the user can simultaneously input commands into two or more windows among the plurality of windows in step S603. When it is determined that the commands are simultaneously input to two or more windows among the plurality of windows (S603-Y), the display device can dispatch each of the inputs to its corresponding application. FIG. 6B is a flowchart illustrating a method of controlling the display device according to aspects of the disclosure. The display device can display the first window in which the first application is executed and the second window in which the second application is executed in step S611. The display device can determine whether the first input to the first window and the second input to the second window are simultaneously input in step S613. As described above, simultaneous input may mean that a difference between the input times of the first input and the second input is smaller than a preset threshold.
When the first input and the second input are simultaneously input (S613-Y), the display device may output a first event corresponding to the first input and also output a second event corresponding to the second input in step S615. -
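The simultaneity test of step S613 can be sketched directly from the definition above: two inputs are simultaneous when their timestamps differ by less than a preset threshold. The 0.5-second value mirrors the example threshold mentioned earlier; both it and the function name are illustrative assumptions.

```python
THRESHOLD_SECONDS = 0.5  # example threshold value from the text

def is_simultaneous(first_input_time, second_input_time,
                    threshold=THRESHOLD_SECONDS):
    """Return True when the two input times differ by less than the
    preset threshold (step S613)."""
    return abs(first_input_time - second_input_time) < threshold


# Touches 0.2 s apart: treated as simultaneous input to two windows.
print(is_simultaneous(10.0, 10.2))  # True
# Touches 0.9 s apart: handled as ordinary sequential touches.
print(is_simultaneous(10.0, 10.9))  # False
```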
FIG. 7 is a flowchart illustrating a method of controlling the display device according to aspects of the disclosure. The display device (or a processor thereof) can display a plurality of windows, each executing an application, in step S701. The display device can identify and manage a position of each of the plurality of windows in step S703. More specifically, the multi window manager 410 may maintain a record of the positions of the windows corresponding to the executed applications. In some implementations, the record may be stored in memory and, for each application screen, may identify a plurality of coordinates occupied by that screen. For example, the display device may maintain the record shown in Table 1. -
TABLE 1

Window index | Range of x-axis coordinate values | Range of y-axis coordinate values
---|---|---
1 | 0~230 | 0~800
2 | 230~900 | 0~800
3 | 900~1280 | 0~800

- As shown in Table 1, for example, the record may indicate that the first window has x-axis coordinate values ranging from 0 to 230 and y-axis coordinate values ranging from 0 to 800; the second window has x-axis coordinate values ranging from 230 to 900 and y-axis coordinate values ranging from 0 to 800; and the third window has x-axis coordinate values ranging from 900 to 1280 and y-axis coordinate values ranging from 0 to 800. In this example, the first window executes a first application, the second window executes a second application, and the third window executes a third application. Although in this example the record includes coordinate values, in other examples any suitable indication of window location may be used.
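A record like Table 1 can be sketched as a mapping from window index to coordinate ranges. The dictionary layout and helper name are assumptions; the coordinate values are taken from the table.

```python
# Coordinate record corresponding to Table 1.
window_record = {
    1: {"x": (0, 230),    "y": (0, 800)},
    2: {"x": (230, 900),  "y": (0, 800)},
    3: {"x": (900, 1280), "y": (0, 800)},
}

def update_window(record, index, x_range=None, y_range=None):
    """Apply a window size or position change and keep the stored
    coordinate record in sync, as described for the multi window
    manager 410."""
    if x_range is not None:
        record[index]["x"] = x_range
    if y_range is not None:
        record[index]["y"] = y_range

# Example: the user widens window 1, moving the shared boundary
# between windows 1 and 2 from x=230 to x=300.
update_window(window_record, 1, x_range=(0, 300))
update_window(window_record, 2, x_range=(300, 900))
print(window_record[1]["x"])  # (0, 300)
```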
- For example, in a situation where the
multi window manager 410 receives a command for displaying a plurality of windows, the multi window manager 410 may assign the coordinate values shown in Table 1 to each of the plurality of windows and cause the display device to display the windows at the locations indicated by those values. Further, the multi window manager 410 can store and manage the assigned coordinate values. For example, when a window size change or a window position change is additionally input by the user, the multi window manager 410 can change at least one of the size and the position of the window in accordance with the corresponding command and then display the changed window. As the size and/or the position of the window is changed, the multi window manager 410 can store and manage the coordinate values of the changed window. - Furthermore, user(s) can simultaneously input commands into two or more of the windows in step S705. For example, the user(s) can simultaneously input the two commands shown in Table 2. If two commands are not simultaneously input, individual touch inputs to each of the windows are performed.
-
TABLE 2

Command index | Input position | Command type
---|---|---
1 | (23, 89) | Left direction drag gesture
2 | (520, 700) | Right direction flick gesture

- In response to the commands being input, the display device (or a processor thereof) can determine a plurality of input positions corresponding to the plurality of commands in step S707. More specifically, the display device can identify that the input position of the first command is (23, 89) and the input position of the second command is (520, 700). The display device can then output each event to the window corresponding to its input position in step S709. For example, the display device can identify that the coordinates (23, 89) corresponding to the input position of the first command are included in the display range of the first window, and that the coordinates (520, 700) corresponding to the input position of the second command are included in the display range of the second window. Afterwards, the display device can cause the event corresponding to the left direction drag gesture (the first command) to be dispatched to the first application and the event corresponding to the right direction flick gesture (the second command) to be dispatched to the second application.
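Steps S707 through S709 can be sketched by testing each command's input position against the coordinate ranges of Table 1. The function name and the half-open interval convention (a shared boundary value such as x=230 belongs to the window whose range begins there) are illustrative assumptions.

```python
# Window coordinate ranges from Table 1, keyed by the executing application.
window_record = {
    "first application":  {"x": (0, 230),    "y": (0, 800)},
    "second application": {"x": (230, 900),  "y": (0, 800)},
    "third application":  {"x": (900, 1280), "y": (0, 800)},
}

def dispatch_target(record, position):
    """Return the application whose window contains the input position
    (steps S707 and S709), or None if no window contains it."""
    px, py = position
    for app, ranges in record.items():
        (x0, x1), (y0, y1) = ranges["x"], ranges["y"]
        if x0 <= px < x1 and y0 <= py < y1:
            return app
    return None

# The two simultaneous commands of Table 2:
print(dispatch_target(window_record, (23, 89)))    # first application
print(dispatch_target(window_record, (520, 700)))  # second application
```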
-
FIGS. 8A and 8B are diagrams of systems for receiving user commands that are simultaneously input into a display device according to aspects of the disclosure. In the example of FIG. 8A, execution commands for the first application and the second application are input. A kernel 800 may manage the position of each of the first window executing the first application and the second window executing the second application. Additionally or alternatively, the kernel 800 may maintain window information associated with each window. In some aspects, the window information may include coordinate information identifying an area on a touch panel 191 where the window is displayed. The kernel 800 may use this information, together with the command input position, to determine the application to which an input command is to be dispatched. A first command 801 and a second command 802 may be input to the touch panel 191. The touch panel 191 may output the two input commands 801 and 802 to the kernel 800. The kernel 800 may match the input positions of the first command 801 and the second command 802 with the positions of the first window and the second window. In some aspects, the input position of a command may include one or more coordinates indicating where on the touch panel 191 the command has been entered. - The
kernel 800 may determine to dispatch the first command 801 to the first application 261 and dispatch the second command 802 to the second application 262 based on a result of the matching. The kernel 800 may output the first command 801 and the second command 802 to a multi window platform 400 together with information identifying the application to which each command is to be dispatched. The multi window platform 400 may dispatch the first command 801 received from the kernel 800 to the first application 261 and dispatch the second command 802 to the second application 262. That is, in the example of FIG. 8A, the kernel 800 may recognize the position information of each window displayed on the display device (for example, coordinate values) through communication with the multi window platform 400 and determine the destination application based on the input position of the input command. When a touch event occurs on the display device, the kernel checks the location where the touch event occurred and can determine whether the touch event occurred on the first window or the second window by comparing that location with the position information of the displayed windows. In some aspects, the kernel may include software that is executed by a controller, such as the controller 110. - In the example of
FIG. 8B, the multi window platform 400 can manage the position of each of the first window executing the first application and the second window executing the second application. A first command 811 and a second command 812 may be input to the touch panel 191. The touch panel 191 may then provide the two input commands 811 and 812 to the kernel 800. The kernel 800 may then provide the first command 811 and the second command 812 to the multi window platform 400. The multi window platform 400 may manage window information for the executed applications. The window information may include coordinate information and/or another indication of the area on the touch panel 191 that is displaying each window. The multi window platform 400 may then determine the application to which an input command is to be dispatched based on the window information and the command input position. More specifically, the multi window platform 400 may be configured to match the input positions of the first command 811 and the second command 812 with the positions of the first window and the second window. - The
multi window platform 400 can determine to dispatch the first command 811 to the first application 261 and dispatch the second command 812 to the second application 262 based on a result of the matching. The multi window platform 400 may then dispatch the first command 811 received from the kernel 800 to the first application 261 and dispatch the second command 812 to the second application 262. That is, in the example of FIG. 8B, the multi window platform 400 can manage the window information and determine the destination application based on the input position of the input command. -
FIGS. 9A, 9B and 9C are schematic diagrams of display devices in accordance with aspects of the disclosure. In the example of FIG. 9A, the display device 100 displays a first window 901 executing a web browser and a second window 902 executing a gallery in the split mode. The first window 901 may include a search word input window object. The second window 902 may display a first image 911. A first user 10 may desire to input a particular search word into the search word input window. Accordingly, the first user 10 can first input a touch gesture 921 to the search word input window object in order to activate the search word input window. At the same time, a second user 20 may desire to change the first image 911 to another image and view the changed image. Accordingly, the second user 20 may input a left direction flick gesture 922 for changing the displayed image. Meanwhile, the touch gesture 921 and the left direction flick gesture 922 may be simultaneously input. - In the example of
FIG. 9B, a display device 900 provides both the touch gesture 921 and the left direction flick gesture 922 to the application having the highest activity order. In the example of FIG. 9B, it is assumed that the activity order of the gallery application is higher than the activity order of the web browser application. Accordingly, the display device 900 according to this comparative example dispatches both the touch gesture 921 and the left direction flick gesture 922 to the gallery application. The gallery application may recognize the touch gesture 921 and the left direction flick gesture 922 as a pinch-in gesture. Accordingly, the controller (not shown) may cause the first image 911 displayed in the gallery application to be reduced and then displayed. - In the example of
FIG. 9C, the controller 110 may dispatch the touch gesture 921 to the web browser application and dispatch the left direction flick gesture 922 to the gallery application. The controller 110 can output an event corresponding to the touch gesture 921 to the first window and output an event corresponding to the left direction flick gesture 922 to the second window. More specifically, in accordance with the touch gesture 921, the controller 110 may display a cursor 903 on the search word input window 902 and additionally display a character input board 904. Further, the controller 110 may change the first image 911 to a second image 912 and then display the second image 912 in accordance with the left direction flick gesture 922. According to the above description, even when commands are simultaneously input with respect to a plurality of windows, the commands can be processed independently of one another. Meanwhile, outputting an event in a window may mean changing the execution screen of the corresponding application. Alternatively, outputting an event in a window may mean producing an output from the display device other than a change to the execution screen of the application. For example, as will be appreciated by those skilled in the art, such outputs may include sound, light, or vibration from the display device, or other types of output such as data transmission.
- The above-described embodiments according to the present disclosure can be implemented in hardware, in firmware, or via the execution of software or computer code that is stored on a recording medium such as a CD-ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or of computer code originally stored on a remote recording medium or a non-transitory machine-readable medium and downloaded over a network to be stored on a local recording medium, so that the methods described herein can be rendered via such software using a general-purpose computer, a special processor, or programmable or dedicated hardware such as an ASIC or FPGA. As would be understood in the art, the computer, the processor, the microprocessor or controller, or the programmable hardware includes memory components, e.g., RAM, ROM, Flash, etc., that may store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the processing methods described herein. In addition, it would be recognized that when a general-purpose computer accesses code for implementing the processing shown herein, the execution of the code transforms the general-purpose computer into a special-purpose computer for executing the processing shown herein.
- While the disclosure has presented certain specific examples, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims.
Claims (22)
1. A method for controlling a display device, the method comprising:
displaying, on a touchscreen display, a first window executing a first application and a second window executing a second application;
receiving, at the touchscreen display, a first command input to the first window and a second command input to the second window;
determining whether the first command and the second command are received simultaneously;
outputting a first event corresponding to the first command; and
outputting a second event corresponding to the second command.
2. The method of claim 1, wherein displaying the first window and the second window comprises:
determining positions of the first window and the second window;
storing the determined positions of the first window and the second window in a memory; and
displaying the first window and the second window based on the determined positions.
3. The method of claim 2, further comprising matching the stored position of the first window with an input position of the first command and matching the position of the second window with an input position of the second command.
4. The method of claim 3 , further comprising dispatching the first command to the first application and dispatching the second command to the second application based on the matching outcome.
5. The method of claim 1, wherein the first command and the second command are received simultaneously when a difference between a time when the first command is received and a time when the second command is received is smaller than a threshold.
6. The method of claim 1, wherein outputting the first command includes:
dispatching the first command based on the first event when it is determined that the first command and the second command are received simultaneously; and
otherwise, dispatching the first command based on display rankings of the first window and the second window, when it is determined that the first command and the second command are not received simultaneously.
7. A display device for executing an application, the display device comprising:
a touch screen configured to:
display a first window executing a first application and a second window executing a second application,
receive a first command input to the first window and a second command input to the second window, the first command and the second command being received simultaneously; and
a controller configured to output a first event corresponding to the first command and a second event corresponding to the second command.
8. The display device of claim 7, wherein the controller is configured to determine positions of the first window and the second window, store the determined positions of the first window and the second window, and display the first window and the second window based on the determined positions of the first window and the second window.
9. The display device of claim 7, wherein the controller matches the stored position of the first window with an input position of the first command and matches the position of the second window with an input position of the second command.
10. The display device of claim 9, wherein the controller is further configured to:
dispatch the first command to the first application only when the position of the first window is determined to match the first command; and
dispatch the second command to the second application only when the position of the second window is determined to match the second command.
11. The display device of claim 7, wherein the controller is further configured to determine whether the first command and the second command are input simultaneously, the determining being based on whether a difference between a time when the first command is input and a time when the second command is input is smaller than a threshold.
12. The display device of claim 7, wherein the first command is dispatched based on whether a position of the first window is determined to match the first command only when the first command and the second command are determined to have been received simultaneously.
13. The display device of claim 8, wherein the controller is further configured to store in a memory a record indicating the positions of the first window and the second window.
14. The display device of claim 8, wherein the controller is further configured to:
match a position of the first window with an input position of the first command;
match a position of the second window with an input position of the second command;
dispatch the first command based on whether the position of the first window is determined to match the first command; and
dispatch the second command based on whether the position of the second window is determined to match the second command.
15. A method for controlling a display device, the method comprising:
displaying a plurality of windows, each window executing a different application;
receiving a plurality of commands that are simultaneously input into two or more windows among the plurality of windows, each command being received at a different window; and
processing each of the commands in the window at which it was received.
16. The method of claim 15, wherein processing the commands includes outputting events corresponding to the commands to applications executed in the windows.
17. The method of claim 15, wherein processing the commands includes, for each command, comparing an input location associated with that command with locations associated with each one of the plurality of windows.
18. A display device for executing applications, the display device comprising:
a touchscreen configured to display a plurality of windows executing applications; and
a controller coupled to the touchscreen, the controller configured to:
receive a plurality of commands that are input into the plurality of windows, each command being received at a different window, wherein the commands are received simultaneously; and
output each of the commands to a different one of the plurality of windows.
19. The display device of claim 18, wherein outputting the commands includes outputting events corresponding to the commands to applications executed in the windows.
20. The display device of claim 18, wherein outputting the commands includes, for each command, comparing an input location associated with that command with locations associated with each one of the plurality of windows.
21. A method for controlling a display device, the method comprising:
receiving commands directed to a first application and a second application, the first application being executed in a first window and the second application being executed in a second window;
determining, by a processor, a position of the first window and a position of the second window, the determining being performed by a kernel of an operating system executed by the processor;
displaying the first window and the second window;
simultaneously receiving a first command input to the first window and a second command input to the second window;
matching an input position of the first command with the position of the first window and matching an input position of the second command with the position of the second window at the kernel level; and
outputting a first event corresponding to the first command and a second event corresponding to the second command based on the matching outcome.
22. A method for controlling a display device, the method comprising:
receiving commands directed to a first application and a second application, the first application being executed in a first window and the second application being executed in a second window;
determining, by a processor, a position of the first window and a position of the second window, the determining being performed at a platform level;
displaying the first window and the second window;
simultaneously receiving a first command input to the first window and a second command input to the second window;
matching an input position of the first command with the position of the first window and matching an input position of the second command with the position of the second window at the platform level; and
outputting a first event corresponding to the first command and a second event corresponding to the second command based on the matching outcome.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/610,989 US9996252B2 (en) | 2013-03-29 | 2017-06-01 | Display device for executing plurality of applications and method of controlling the same |
US16/003,284 US10747420B2 (en) | 2013-03-29 | 2018-06-08 | Display device for executing plurality of applications and method of controlling the same |
US16/993,328 US20200371658A1 (en) | 2013-03-29 | 2020-08-14 | Display device for executing plurality of applications and method of controlling the same |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2013-0034042 | 2013-03-29 | ||
KR1020130034042A KR102102157B1 (en) | 2013-03-29 | 2013-03-29 | Display apparatus for executing plurality of applications and method for controlling thereof |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/610,989 Division US9996252B2 (en) | 2013-03-29 | 2017-06-01 | Display device for executing plurality of applications and method of controlling the same |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140298247A1 true US20140298247A1 (en) | 2014-10-02 |
Family
ID=51622132
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/183,726 Abandoned US20140298247A1 (en) | 2013-03-29 | 2014-02-19 | Display device for executing plurality of applications and method of controlling the same |
US15/610,989 Active US9996252B2 (en) | 2013-03-29 | 2017-06-01 | Display device for executing plurality of applications and method of controlling the same |
US16/003,284 Active 2034-02-26 US10747420B2 (en) | 2013-03-29 | 2018-06-08 | Display device for executing plurality of applications and method of controlling the same |
US16/993,328 Abandoned US20200371658A1 (en) | 2013-03-29 | 2020-08-14 | Display device for executing plurality of applications and method of controlling the same |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/610,989 Active US9996252B2 (en) | 2013-03-29 | 2017-06-01 | Display device for executing plurality of applications and method of controlling the same |
US16/003,284 Active 2034-02-26 US10747420B2 (en) | 2013-03-29 | 2018-06-08 | Display device for executing plurality of applications and method of controlling the same |
US16/993,328 Abandoned US20200371658A1 (en) | 2013-03-29 | 2020-08-14 | Display device for executing plurality of applications and method of controlling the same |
Country Status (2)
Country | Link |
---|---|
US (4) | US20140298247A1 (en) |
KR (1) | KR102102157B1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160006854A1 (en) * | 2014-07-07 | 2016-01-07 | Canon Kabushiki Kaisha | Information processing apparatus, display control method and recording medium |
US20160357388A1 (en) * | 2015-06-05 | 2016-12-08 | Apple Inc. | Devices and Methods for Processing Touch Inputs Over Multiple Regions of a Touch-Sensitive Surface |
US20180307387A1 (en) * | 2014-01-07 | 2018-10-25 | Samsung Electronics Co., Ltd. | Electronic device and method for operating the electronic device |
US10275150B2 (en) * | 2016-05-12 | 2019-04-30 | Canon Kabushiki Kaisha | Display control apparatus and method of controlling the same |
US20200019366A1 (en) * | 2017-01-26 | 2020-01-16 | Huawei Technologies Co., Ltd. | Data Processing Method and Mobile Device |
US10839354B1 (en) | 2015-05-04 | 2020-11-17 | State Farm Mutual Automobile Insurance Company | Home screen agent and insurance card widget |
USD926814S1 (en) * | 2019-07-08 | 2021-08-03 | UAB “Kūrybinis žingsnis” | Computer screen with graphical user interface simulating a layout |
US11314388B2 (en) * | 2016-06-30 | 2022-04-26 | Huawei Technologies Co., Ltd. | Method for viewing application program, graphical user interface, and terminal |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102416071B1 (en) * | 2014-12-12 | 2022-07-06 | 삼성전자주식회사 | Electronic device for charging and method for controlling power in electronic device for charging |
KR20180080629A (en) * | 2017-01-04 | 2018-07-12 | 삼성전자주식회사 | Electronic device and method for displaying history of executed application thereof |
US11036390B2 (en) * | 2018-05-25 | 2021-06-15 | Mpi Corporation | Display method of display apparatus |
CN111208925A (en) * | 2019-09-30 | 2020-05-29 | 华为技术有限公司 | Method for establishing application combination and electronic equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5694150A (en) * | 1995-09-21 | 1997-12-02 | Elo Touchsystems, Inc. | Multiuser/multi pointing device graphical user interface system |
US6426762B1 (en) * | 1998-07-17 | 2002-07-30 | Xsides Corporation | Secondary user interface |
US20070226636A1 (en) * | 2006-03-21 | 2007-09-27 | Microsoft Corporation | Simultaneous input across multiple applications |
US20120030568A1 (en) * | 2010-07-30 | 2012-02-02 | Migos Charles J | Device, Method, and Graphical User Interface for Copying User Interface Objects Between Content Regions |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7855718B2 (en) | 2007-01-03 | 2010-12-21 | Apple Inc. | Multi-touch input discrimination |
US8201109B2 (en) | 2008-03-04 | 2012-06-12 | Apple Inc. | Methods and graphical user interfaces for editing on a portable multifunction device |
US8645827B2 (en) | 2008-03-04 | 2014-02-04 | Apple Inc. | Touch event model |
US20100107116A1 (en) * | 2008-10-27 | 2010-04-29 | Nokia Corporation | Input on touch user interfaces |
US8326637B2 (en) | 2009-02-20 | 2012-12-04 | Voicebox Technologies, Inc. | System and method for processing multi-modal device interactions in a natural language voice services environment |
US9274699B2 (en) * | 2009-09-03 | 2016-03-01 | Obscura Digital | User interface for a large scale multi-user, multi-touch system |
US20110087988A1 (en) * | 2009-10-12 | 2011-04-14 | Johnson Controls Technology Company | Graphical control elements for building management systems |
US20120026077A1 (en) * | 2010-07-28 | 2012-02-02 | Google Inc. | Mapping trackpad operations to touchscreen events |
EP2490113B1 (en) * | 2011-02-15 | 2016-11-23 | Lg Electronics Inc. | Display device and method of controlling operation thereof |
US8745525B1 (en) * | 2011-03-29 | 2014-06-03 | Google Inc. | Presenting graphical windows on a device |
US9026946B2 (en) * | 2011-08-08 | 2015-05-05 | Blackberry Limited | Method and apparatus for displaying an image |
US8572515B2 (en) * | 2011-11-30 | 2013-10-29 | Google Inc. | Turning on and off full screen mode on a touchscreen |
US9032292B2 (en) * | 2012-01-19 | 2015-05-12 | Blackberry Limited | Simultaneous display of multiple maximized applications on touch screen electronic devices |
US20140157128A1 (en) * | 2012-11-30 | 2014-06-05 | Emo2 Inc. | Systems and methods for processing simultaneously received user inputs |
- 2013-03-29 KR KR1020130034042A patent/KR102102157B1/en active IP Right Grant
- 2014-02-19 US US14/183,726 patent/US20140298247A1/en not_active Abandoned
- 2017-06-01 US US15/610,989 patent/US9996252B2/en active Active
- 2018-06-08 US US16/003,284 patent/US10747420B2/en active Active
- 2020-08-14 US US16/993,328 patent/US20200371658A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5694150A (en) * | 1995-09-21 | 1997-12-02 | Elo Touchsystems, Inc. | Multiuser/multi pointing device graphical user interface system |
US6426762B1 (en) * | 1998-07-17 | 2002-07-30 | Xsides Corporation | Secondary user interface |
US20070226636A1 (en) * | 2006-03-21 | 2007-09-27 | Microsoft Corporation | Simultaneous input across multiple applications |
US20120030568A1 (en) * | 2010-07-30 | 2012-02-02 | Migos Charles J | Device, Method, and Graphical User Interface for Copying User Interface Objects Between Content Regions |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180307387A1 (en) * | 2014-01-07 | 2018-10-25 | Samsung Electronics Co., Ltd. | Electronic device and method for operating the electronic device |
US9521234B2 (en) * | 2014-07-07 | 2016-12-13 | Canon Kabushiki Kaisha | Information processing apparatus, display control method and recording medium |
US20160006854A1 (en) * | 2014-07-07 | 2016-01-07 | Canon Kabushiki Kaisha | Information processing apparatus, display control method and recording medium |
US10839354B1 (en) | 2015-05-04 | 2020-11-17 | State Farm Mutual Automobile Insurance Company | Home screen agent and insurance card widget |
US11615379B1 (en) | 2015-05-04 | 2023-03-28 | State Farm Mutual Automobile Insurance Company | Home screen agent and insurance card widget |
US11036353B1 (en) * | 2015-05-04 | 2021-06-15 | State Farm Mutual Automobile Insurance Company | Home screen agent and insurance card widget |
CN107690619A (en) * | 2015-06-05 | 2018-02-13 | 苹果公司 | For handling the apparatus and method of touch input on the multiple regions of touch sensitive surface |
US10474350B2 (en) | 2015-06-05 | 2019-11-12 | Apple Inc. | Devices and methods for processing touch inputs over multiple regions of a touch-sensitive surface |
EP3447622A1 (en) * | 2015-06-05 | 2019-02-27 | Apple Inc. | Devices and methods for processing touch inputs over multiple regions of a touch-sensitive surface |
US9846535B2 (en) * | 2015-06-05 | 2017-12-19 | Apple Inc. | Devices and methods for processing touch inputs over multiple regions of a touch-sensitive surface |
US20160357388A1 (en) * | 2015-06-05 | 2016-12-08 | Apple Inc. | Devices and Methods for Processing Touch Inputs Over Multiple Regions of a Touch-Sensitive Surface |
US10275150B2 (en) * | 2016-05-12 | 2019-04-30 | Canon Kabushiki Kaisha | Display control apparatus and method of controlling the same |
US11314388B2 (en) * | 2016-06-30 | 2022-04-26 | Huawei Technologies Co., Ltd. | Method for viewing application program, graphical user interface, and terminal |
US20200019366A1 (en) * | 2017-01-26 | 2020-01-16 | Huawei Technologies Co., Ltd. | Data Processing Method and Mobile Device |
US10908868B2 (en) * | 2017-01-26 | 2021-02-02 | Huawei Technologies Co., Ltd. | Data processing method and mobile device |
US11567725B2 (en) | 2017-01-26 | 2023-01-31 | Huawei Technologies Co., Ltd. | Data processing method and mobile device |
USD926814S1 (en) * | 2019-07-08 | 2021-08-03 | UAB “Kūrybinis žingsnis” | Computer screen with graphical user interface simulating a layout |
USD936698S1 (en) * | 2019-07-08 | 2021-11-23 | UAB “Kūrybinis žingsnis” | Computer screen with graphical user interface simulating a layout |
Also Published As
Publication number | Publication date |
---|---|
US20170269813A1 (en) | 2017-09-21 |
KR102102157B1 (en) | 2020-04-21 |
US9996252B2 (en) | 2018-06-12 |
KR20140118338A (en) | 2014-10-08 |
US10747420B2 (en) | 2020-08-18 |
US20200371658A1 (en) | 2020-11-26 |
US20180292968A1 (en) | 2018-10-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200371658A1 (en) | Display device for executing plurality of applications and method of controlling the same | |
US11853523B2 (en) | Display device and method of indicating an active region in a multi-window display | |
US10671282B2 (en) | Display device including button configured according to displayed windows and control method therefor | |
US11256389B2 (en) | Display device for executing a plurality of applications and method for controlling the same | |
US10088991B2 (en) | Display device for executing multiple applications and method for controlling the same | |
US20130300684A1 (en) | Apparatus and method for executing multi applications | |
US10386992B2 (en) | Display device for executing a plurality of applications and method for controlling the same | |
US10282088B2 (en) | Configuration of application execution spaces and sub-spaces for sharing data on a mobile tough screen device | |
EP2595043B1 (en) | Mobile device for executing multiple applications and method thereof | |
KR102016975B1 (en) | Display apparatus and method for controlling thereof | |
KR102089707B1 (en) | Display apparatus and method for controlling thereof | |
KR20140068573A (en) | Display apparatus and method for controlling thereof | |
US11604580B2 (en) | Configuration of application execution spaces and sub-spaces for sharing data on a mobile touch screen device | |
KR20150094492A (en) | User terminal device and method for displaying thereof | |
KR20130126428A (en) | Apparatus for processing multiple applications and method thereof | |
KR102084548B1 (en) | Display apparatus and method for controlling thereof | |
KR20140087480A (en) | Display apparatus for excuting plurality of applications and method for controlling thereof | |
KR20140076395A (en) | Display apparatus for excuting applications and method for controlling thereof | |
KR20140028352A (en) | Apparatus for processing multiple applications and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | AS | Assignment | Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: CHO, SUNG-JAE; REEL/FRAME: 032243/0095; Effective date: 20140205 |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |