WO2014035123A1 - User terminal apparatus and control method thereof - Google Patents

User terminal apparatus and control method thereof

Info

Publication number
WO2014035123A1
Authority
WO
WIPO (PCT)
Prior art keywords
area
window
application window
terminal apparatus
user
Prior art date
Application number
PCT/KR2013/007695
Other languages
French (fr)
Inventor
Kang-Tae Kim
Eun-Young Kim
Duck-Hyun Kim
Chul-Joo Kim
Kwang-Won Sun
Jae-Yeol Lee
Original Assignee
Samsung Electronics Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Publication of WO2014035123A1 publication Critical patent/WO2014035123A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Abstract

A user terminal apparatus is disclosed. The user terminal apparatus includes a display which displays an application window on a screen; a user interface unit which receives an input of a user command; and a controller which controls the display so that a title area is displayed on one area of the application window when a first user command for displaying the title area on the application window is input, and so that the title area automatically disappears when a predetermined event occurs.

Description

USER TERMINAL APPARATUS AND CONTROL METHOD THEREOF
Methods and apparatuses consistent with the exemplary embodiments relate to a user terminal apparatus and control method thereof, and more particularly, to a touch based user terminal apparatus and a control method thereof.
Due to the development of electronic technologies, various types of display apparatuses are being developed. Especially, display apparatuses such as TVs, PCs, laptop computers, tablet PCs, mobile phones, and MP3 players are widely used in most households.
Recently, in order to satisfy the needs of users who want new and various functions, mobile terminals such as tablet PCs and mobile phones also provide multi-tasking environments which execute a plurality of applications at the same time.
There is therefore a need to provide application windows in such a multi-tasking environment in an appropriate manner.
The purpose of the present invention is to provide a user terminal apparatus which automatically adjusts an application window to satisfy a user’s needs in a multi-tasking environment, and a control method thereof.
According to an exemplary embodiment of the present invention, a user terminal apparatus includes a display which displays an application window on a screen; a user interface unit which receives an input of a user command; and a controller which controls the display so that a title area is displayed on one area of the application window when a first user command for displaying the title area on the application window is input, and so that the title area automatically disappears when a predetermined event occurs.
Herein, the predetermined event may be an event where a predetermined time passes from a display point of the title area.
In addition, the controller may display the title area, when the first user command is input with the application window displayed on an entire area of the screen.
In this case, the first user command may be a user manipulation of touching a predetermined area of the application window.
Herein, the one area of the application window may be an area adjacent to the predetermined area.
In addition, the controller may control to enter into a multi window mode for displaying a plurality of application windows on the screen at the same time, when a second user command which manipulates the title area is input with the title area displayed on the application window.
In addition, the controller may reduce the application window to a predetermined size in the multi window mode, and display the title area to be regularly included in one area of the reduced application window.
In this case, the second user command may be a touch manipulation regarding the title area.
In addition, the user terminal apparatus may control to enter into a multi window mode for displaying a plurality of application windows on the screen at the same time, when a double tap manipulation regarding a predetermined area of the application window or a touch and flick manipulation regarding the predetermined area is input.
Herein, the user terminal apparatus may be a touch based mobile terminal.
Meanwhile, according to an exemplary embodiment of the present invention, a control method of a user terminal apparatus may include displaying an application window on a screen; displaying a title area on one area of the application window, when a first user command for displaying the title area on the application window is input; and controlling so that the title area automatically disappears when a predetermined event occurs.
Herein, the predetermined event may be an event where a predetermined time passes from a display point of the title area.
In addition, the displaying may display the title area, when the first user command is input with the application window displayed on an entire area of the screen.
In this case, the first user command may be a user manipulation of touching a predetermined area of the application window.
In addition, the control method of a user terminal apparatus may further include controlling to enter into a multi window mode for displaying a plurality of application windows on the screen at the same time, when a second user command which manipulates the title area is input with the title area displayed on the application window.
According to the present invention as mentioned above, it becomes possible to efficiently manage a display space of an application window in a multi-tasking environment, thereby improving user convenience.
The above and/or other aspects of the present disclosure will be more apparent by describing certain exemplary embodiments of the present disclosure with reference to the accompanying drawings, in which:
FIG. 1 is a view illustrating a user terminal apparatus according to an exemplary embodiment of the present disclosure;
FIG. 2 is a block diagram illustrating a detailed configuration of a user terminal apparatus illustrated in FIG. 1;
FIGS. 3a and 3b are views for explaining a software configuration stored in a storage unit;
FIG. 4 is a view for explaining a method of displaying an application which supports a multi window mode according to an exemplary embodiment of the present disclosure;
FIGS. 5a and 5b are views for explaining an application window display format in a case of executing an application in a multi window mode and a normal mode according to another exemplary embodiment of the present disclosure;
FIGS. 6a and 6b are views for explaining a method of moving an application window in a multi window mode according to another exemplary embodiment of the present disclosure;
FIG. 7 is a view for explaining a method of adjusting a size of an application window in a multi window mode according to another exemplary embodiment of the present disclosure;
FIGS. 8 to 10 are views for explaining a method of displaying a title area in an application window according to various exemplary embodiments of the present disclosure; and
FIG. 11 is a view for explaining a control method of a user terminal apparatus according to an exemplary embodiment of the present disclosure.
Certain exemplary embodiments are described in higher detail below with reference to the accompanying drawings.
FIG. 1 is a view illustrating a user terminal apparatus according to an exemplary embodiment of the present disclosure.
FIG. 1(a) is a schematic diagram for explaining an embodiment example of a user terminal apparatus according to an exemplary embodiment of the present disclosure.
As illustrated in FIG. 1(a), the user terminal apparatus 100 may be embodied as a tablet PC, but is not limited thereto, and thus may be embodied as various types of portable apparatuses having display functions, such as smart phones, PMPs, and PDAs. In addition, the user terminal apparatus 100 includes a touch screen, and thus may be embodied to execute a program using a finger or a pen.
Especially, a tablet PC, which is an embodiment example of the user terminal apparatus 100 of the present disclosure, is an apparatus where the portability of a PDA and the functions of a notebook computer are combined. A tablet PC may have the functions of a desktop and at the same time use wireless internet; the main input apparatus may be a touch screen, but an existing keyboard or mouse may also be connected and used. In addition, a tablet PC may have a function of recognizing a user's handwriting and storing it as data.
Meanwhile, the user terminal apparatus 100 may provide a multi-tasking environment which executes a plurality of applications at the same time. Hereinbelow is an explanation of a method of providing an application window in such a multi-tasking environment according to various exemplary embodiments of the present disclosure.
FIG. 1(b) is a block diagram illustrating a configuration of a user terminal apparatus according to an exemplary embodiment of the present disclosure.
According to FIG. 1(b), the user terminal apparatus 100 includes a display 110, user interface unit 120 and controller 130.
The display 110 displays a screen. Herein, the screen may include an application execution screen, GUI (Graphic User Interface) screen etc. which includes various objects such as an image, video and text etc.
Especially, the display 110 may display a plurality of application execution screens, that is, a plurality of application windows, at the same time. Such a screen display mode is called a multi window mode hereinbelow.
An application window provided in such a multi window mode may be provided in a format where position moving, size adjustment, and a pin-up function etc. are possible. To this end, the application window may include a title area (or title bar) which includes menu items for providing the corresponding functions. More specifically, a maximizing button, an end button, a pin-up button etc. may be provided on the title area. Accordingly, through a manipulation of each button, it is possible to receive a window maximizing command, a window end command, a window pin-up command etc. Meanwhile, the application window provided in the multi window mode may be embodied to have a small size which occupies a partial area of the entire screen of the display 110, to enable easy location moving and size adjustment. Hereinbelow, this window mode is called a mini mode.
In addition, the display 110 may display one application window to have a size maximized to the entirety of the screen. Hereinbelow, such a screen display mode is called a normal mode. The application window provided in the normal mode may be provided in a format which does not include a title area, so as to provide an application execution screen as wide as possible. Meanwhile, the application window provided in the normal mode is hereinbelow also called a maximization mode, in that it has a maximized size which occupies the entire screen area.
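The patent itself contains no code, but the relationship between these display modes and the title area can be sketched as a small state model. All class and attribute names below are hypothetical, not from the patent:

```python
from enum import Enum, auto

class WindowMode(Enum):
    """Display modes described for the application window (names hypothetical)."""
    MAXIMIZED = auto()  # normal/maximization mode: window fills the screen
    MINI = auto()       # mini mode: reduced window used in the multi window mode

class AppWindow:
    """Sketch of the title-area visibility rules per mode."""
    def __init__(self):
        self.mode = WindowMode.MAXIMIZED
        self.title_area_visible = False  # hidden by default when maximized

    def restore_to_mini(self):
        """Reduce to a mini window; the title area is regularly included."""
        self.mode = WindowMode.MINI
        self.title_area_visible = True

    def maximize(self):
        """Fill the screen; the title area appears only on demand."""
        self.mode = WindowMode.MAXIMIZED
        self.title_area_visible = False

w = AppWindow()
w.restore_to_mini()
assert w.mode is WindowMode.MINI and w.title_area_visible
```

This mirrors the distinction drawn later in the description: in the maximization mode the title area is shown only in response to a user command, while in the mini mode it is a regular part of the window.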
Meanwhile, the display 110 may be embodied as an LCD (Liquid Crystal Display) panel, an OLED (Organic Light Emitting Diode) display etc., but is not limited thereto. Especially, the display 110 may be embodied in a touch screen format having a mutual layer structure with a touch pad. In this case, the display 110 may be used not only as an output device but also as the user interface unit 120 to be explained hereinbelow. Herein, the touch screen may be formed to detect not only an input location and area but also touch input pressure.
The user interface unit 120 receives various user commands.
Especially, the user interface unit 120 may receive a user command for displaying a title area on the application window.
More specifically, the user interface unit 120 may receive a user command for displaying a title area on the application window with the application window displayed in a maximization mode occupying the entire area of the screen. Herein, the user command may be a user manipulation of touching a predetermined area of the application window, especially a one-tap manipulation, a flick manipulation etc. In addition, the predetermined area may be an information display area (or menu display area) provided in a top end area of the application window, but is not limited thereto.
Meanwhile, on the title area displayed according to a user command with the application window maximized, a restoration button for restoring the window to a mini mode and an end button for ending the window may be provided. That is, as aforementioned, a user command for restoring the maximization mode to a mini mode may be received not only through a touch manipulation on an area of the title area excluding the areas where buttons are provided, but also through a touch manipulation on the restoration button provided on the title area.
In addition, the user interface unit 120 may receive a user command for restoring the application window to a mini mode, reduced to a predetermined size, with the title area displayed on the application window. Herein, the user command may be a touch manipulation on an area of the title area excluding the areas where buttons are provided, or a touch manipulation on a window return button provided on the title area.
Meanwhile, the user interface unit 120 may also receive a user command for restoring the window into a mini mode with the application window maximized and with the title area not displayed. Herein, the user command may be a two-tap manipulation, flick-down manipulation etc. regarding an information display area.
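The touch manipulations described so far map onto a small set of window commands. A minimal dispatch-table sketch follows; the area names, gesture names, and command strings are all hypothetical labels chosen for illustration, not identifiers from the patent:

```python
# Hypothetical mapping from (touched area, gesture) to the window command
# described in the text: one-tap/flick on the information display area shows
# the title area; two-tap/flick-down there, or a touch on the title area
# outside its buttons, restores the window to the mini mode.
GESTURE_COMMANDS = {
    ("info_area", "one_tap"): "show_title_area",
    ("info_area", "flick"): "show_title_area",
    ("info_area", "two_tap"): "restore_mini_mode",
    ("info_area", "flick_down"): "restore_mini_mode",
    ("title_area", "touch"): "restore_mini_mode",
}

def dispatch(area, gesture):
    """Return the window command for a gesture, or None if unrecognized."""
    return GESTURE_COMMANDS.get((area, gesture))

assert dispatch("info_area", "one_tap") == "show_title_area"
assert dispatch("info_area", "flick_down") == "restore_mini_mode"
assert dispatch("info_area", "long_press") is None
```

A table-driven dispatcher like this keeps the gesture-to-command policy in one place, which matches how the description enumerates alternative manipulations for the same command.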
In addition, the user interface unit 120 may receive a user command for moving the location of the application window in a multi window mode, a user command for adjusting a size of the application window, and a user command for pinning up the application window. Herein, a pin-up command may be embodied to toggle pin-up ON/OFF through a tap operation regarding the pin-up button. That is, during pin ON, the application window may be displayed in a regular (fixed) state, and during pin OFF, it may be embodied such that location moving and size adjustment are possible.
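The pin-up ON/OFF behavior amounts to a toggle that gates move and resize operations. A sketch under that reading, with all identifiers hypothetical:

```python
class PinnableWindow:
    """Sketch of a window whose pin-up state blocks moving and resizing."""
    def __init__(self):
        self.pinned = False  # pin OFF by default: moving/resizing allowed
        self.x, self.y = 0, 0

    def tap_pin_button(self):
        """A tap on the pin-up button toggles between pin ON and pin OFF."""
        self.pinned = not self.pinned

    def move(self, dx, dy):
        """Location moving succeeds only while the window is not pinned."""
        if self.pinned:
            return False  # pin ON: window stays in a regular (fixed) state
        self.x += dx
        self.y += dy
        return True

w = PinnableWindow()
assert w.move(10, 5)          # pin OFF: move succeeds
w.tap_pin_button()            # pin ON
assert not w.move(1, 1)       # move is rejected
assert (w.x, w.y) == (10, 5)  # position unchanged while pinned
```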
The controller 130 controls the overall operations of the user terminal apparatus 100.
Especially, when a user command for displaying a title area on an application window is input, the controller 130 displays a title area on one area of the application window, and when a predetermined event occurs, the controller 130 may control so that the title area automatically disappears. Herein, the predetermined event may be an event where a predetermined time passes, but not limited thereto.
Herein, the controller 130 may receive a corresponding user command with the application window displayed in a maximization mode. In this case, on the title area which is displayed in the maximization mode, a return button for returning to a mini mode and an end button for ending the window may be provided, unlike the mini mode.
Meanwhile, the title area may be displayed on a top end area of the application window. For example, when a user manipulation of touching an information display area provided on the top end area of the application window is input, a title area is displayed on the upper area adjacent to the information display area, and after a predetermined time passes, the title area may automatically disappear. Herein, the predetermined time may be, for example, within 3 seconds, but is not limited thereto.
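The auto-hide behavior, where the title area disappears once the predetermined time has passed since it was displayed, can be sketched with a monotonic-clock check rather than a real UI timer. The 3-second value is the example given above; the class and method names are hypothetical:

```python
import time

class TitleArea:
    """Title area that automatically disappears after a timeout (sketch)."""
    def __init__(self, timeout=3.0):
        self.timeout = timeout   # predetermined time, in seconds
        self._shown_at = None    # display point, or None when hidden

    def show(self):
        """First user command: display the title area and note the time."""
        self._shown_at = time.monotonic()

    @property
    def visible(self):
        """Visible only until the predetermined time passes from display."""
        if self._shown_at is None:
            return False
        if time.monotonic() - self._shown_at >= self.timeout:
            self._shown_at = None  # predetermined event occurred: auto-hide
            return False
        return True

t = TitleArea(timeout=0.05)  # short timeout so the example runs quickly
t.show()
assert t.visible
time.sleep(0.06)
assert not t.visible  # predetermined time passed; title area disappeared
```

In a real UI toolkit this would be driven by a one-shot timer callback rather than polling, but the state transitions are the same.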
In addition, when a user command for manipulating the title area is input with the title area displayed on the application window, the controller 130 may control to enter into the multi window mode for displaying a plurality of application windows on the screen. Herein, the user command may be a flick manipulation, especially a flick-down manipulation, of the title area, but is not limited thereto, and may also be embodied as a one-tap manipulation or a drag manipulation.
In addition, the controller 130 may control so that the application window is displayed in a mini mode reduced to a predetermined size. In this case, the title area may be displayed regularly on one area of the mini window. That is, in the maximization mode where the application window is displayed on the entire screen, the title area is not displayed regularly but is displayed according to the user command, whereas in the multi window mode, the title area may be regularly displayed.
Meanwhile, in the aforementioned exemplary embodiment, it is explained that the title area is displayed with the application window displayed on the entire screen, that is, in a maximization mode, and then automatically disappears, but the disclosure is not limited thereto. That is, depending on embodiment methods, the technology may also be applied to the mini window format.
In addition, the controller 130 may control to display a guide GUI for guiding location moving and size adjusting of the application window according to the user command in the multi window mode. In this case, the controller 130 may control to provide a haptic feedback with the guide GUI displayed.
In addition, the controller 130 may provide various menu items for executing the multi window mode and the normal mode.
FIG. 2 is a block diagram illustrating a detailed configuration according to an exemplary embodiment of the user terminal apparatus illustrated in FIG. 1. According to FIG. 2, the user terminal apparatus 100 includes a display 110, user interface unit 120, controller 130, storage unit 140, application driver 150, feedback provider 160, communication unit 170, audio processor 180, video processor 185, speaker 190, button 191, USB port 192, camera 193, and mike 194. Of the configurative elements illustrated in FIG. 2, detailed explanation of those repetitive of FIG. 1 is omitted.
Operations of the aforementioned controller 130 may be performed by programs stored in the storage unit 140. The storage unit 140 stores an O/S (Operating System) software module, various applications for driving the user terminal apparatus 100, and various data and contents that are input or predetermined during execution. Especially, applications stored in the storage unit 140 may be differentiated into applications which can and cannot support the multi window mode.
In addition, the storage unit 140 may store various formats of templates where the layout for arranging a plurality of application windows on the screen in the multi window mode is predefined.
Besides, the various software modules stored in the storage unit 140 are explained hereinbelow with reference to FIGS. 3a and 3b.
The application driver 150 performs a function of driving and executing an application which may be provided in the user terminal apparatus 100. Herein, an application is an application program which may be executed by itself, and may include various multimedia contents. Herein, the term ‘multimedia contents’ includes text, audio, still images, animation, video, interactivity contents, EPG (Electronic Program Guide) contents from contents providers, electronic messages received from users, and information on current events, but is not limited thereto.
The feedback provider 160 performs a function of providing various feedback according to the function executed in the user terminal apparatus 100.
Especially, the feedback provider 160 may provide a haptic feedback regarding the GUI displayed on the screen. Herein, haptic feedback is a technology of enabling the user to sense touch by generating vibration, force, or impact in the user terminal apparatus 100, and may also be called computer touch technology.
More specifically, the feedback provider 160 may provide haptic feedback regarding the corresponding guide GUI when a guide GUI for guiding the location moving and size adjusting of the application window is displayed according to the user command in the multi window mode.
In this case, the feedback provider 160 may apply different vibration conditions (for example, vibration frequency, vibration length, vibration intensity, vibration wave, vibration location etc.) according to the control by the controller 130, and provide various feedbacks. Herein, a method of applying vibration conditions differently and generating various haptic feedbacks is a conventional art, and thus detailed explanation is omitted.
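The vibration conditions listed above can be grouped into a parameter object that the feedback provider varies per GUI event. A sketch follows; the field names, units, and preset values are all assumptions for illustration, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VibrationCondition:
    """Parameters a feedback provider could vary to produce distinct haptic
    feedbacks, mirroring the list in the text (names/units hypothetical)."""
    frequency_hz: float  # vibration frequency
    length_ms: int       # vibration length
    intensity: float     # vibration intensity, e.g. 0.0 to 1.0
    location: str        # vibration location, e.g. "top", "bottom", "full"

# Hypothetical presets: a stronger, longer pattern for the size-adjust guide
# GUI than for the location-move guide GUI, so the two feel different.
MOVE_GUIDE = VibrationCondition(frequency_hz=150.0, length_ms=30,
                                intensity=0.3, location="full")
RESIZE_GUIDE = VibrationCondition(frequency_hz=200.0, length_ms=60,
                                  intensity=0.6, location="full")

assert RESIZE_GUIDE.intensity > MOVE_GUIDE.intensity
assert RESIZE_GUIDE.length_ms > MOVE_GUIDE.length_ms
```

Keeping the conditions in immutable value objects makes it easy for a controller to select and pass a complete vibration profile per event, which is the division of labor the paragraph describes.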
Meanwhile, in the aforementioned exemplary embodiment, it has been explained that the feedback provider 160 uses a vibration sensor to provide haptic feedback, but this is merely an exemplary embodiment, and thus the feedback provider 160 may instead use a piezo sensor to provide haptic feedback.
The communication unit 170 is a configuration for performing communication with various types of external devices according to various types of communication methods. The communication unit 170 includes a wifi chip 171, a Bluetooth chip 172, a wireless communication chip 173 etc.
The wifi chip 171 and the Bluetooth chip 172 perform communication in a wifi method and a Bluetooth method, respectively. The wireless communication chip 173 refers to a chip which performs communication according to various communication standards such as IEEE, Zigbee, 3G (3rd Generation), 3GPP (3rd Generation Partnership Project), and LTE (Long Term Evolution). Besides the above, the communication unit 170 may further include an NFC chip which operates in an NFC (Near Field Communication) method using the 13.56MHz band from among various RF-ID frequency bands such as 135kHz, 13.56MHz, 433MHz, 860~960MHz, 2.45GHz etc.
The audio processor 180 is a configurative element which performs processing of audio data. In the audio processor 180, various processing such as decoding, amplifying, and noise filtering etc. may be performed on audio data.
The video processor 185 is a configurative element which performs processing of video data. The video processor 185 may perform various image processing such as decoding, scaling, noise filtering, frame rate conversion, and resolution conversion etc. on video data.
The speaker 190 is a configurative element which outputs not only various audio data processed in the audio processor 180 but also various notice sound and voice messages etc.
The button 191 may be a button of various types such as a mechanical button, a touch pad, a wheel etc. formed on an arbitrary area of the exterior of the main body, such as the front surface, side surface, or rear surface. For example, a button for turning ON/OFF the power of the user terminal apparatus 100 may be provided.
The USB port 192 may perform communication with various external apparatuses or charging with various external apparatuses through a USB cable.
The camera 193 is a configuration for photographing a still image or video according to a control by a user. A plurality of cameras 193 may be provided, such as a front camera and a rear camera.
The mike 194 is a configuration for receiving user voice or other sound and converting the received user voice or other sound into audio data. The controller 130 may use user voice input through the mike 194 during a call process, or convert the input voice into audio data and store it in the storage unit 140.
In a case where the camera 193 and the mike 194 are provided, the controller 130 may perform control operations according to a user voice input through the mike 194 or a user motion perceived by the camera 193. That is, the user terminal apparatus 100 may operate in a motion control mode or a voice control mode. In a case of operating in the motion control mode, the controller 130 activates the camera 193 to photograph the user, and tracks changes of the user’s motion to perform a control operation corresponding thereto. In a case of operating in the voice control mode, the controller 130 may analyze the user voice input through the mike, and operate in a voice recognition mode for performing a control operation according to the analyzed user voice.
Besides, various external input ports for connecting with various external terminals such as a headset, mouse, and LAN etc. may be further included.
Meanwhile, the controller 130 uses various programs stored in the storage unit 140 to control the overall operations of the user terminal apparatus 100.
For example, the controller 130 may execute an application stored in the storage unit 140 and configure and display its execution screen, and may also reproduce various contents stored in the storage unit 140. In addition, the controller 130 may perform communication with external devices through the communication unit 170.
More specifically, the controller 130 includes a RAM 131, ROM 132, main CPU 133, graphic processor 134, 1st to nth interfaces 135-1~135-n, and bus 136.
The RAM 131, ROM 132, main CPU 133, graphic processor 134, and 1st to nth interfaces 135-1~135-n etc. may be connected to one another through a bus 136.
The 1st to nth interfaces 135-1 to 135-n are connected to the various configurative elements aforementioned. One of the interfaces may be a network interface which is connected to an external device through a network.
The main CPU 133 accesses the storage unit 140, and performs booting using the O/S stored in the storage unit 140. In addition, the main CPU 133 uses various programs, contents, and data etc. stored in the storage unit 140 to perform various operations.
In the ROM 132, a command set etc. for system booting is stored. When a turn on command is input and power is provided, the main CPU 133 copies the O/S stored in the storage unit 140 to the RAM 131 according to the command stored in the ROM 132, and executes the O/S to boot the system. When the booting is completed, the main CPU 133 copies various application programs stored in the storage unit 140 to the RAM 131, and executes the application program copied to the RAM 131 to perform various operations.
The graphic processor 134 uses a calculator (not illustrated) and a renderer (not illustrated) to generate a screen which includes various objects such as icons, images, and text. The calculator uses a control command received from the input apparatus to calculate feature values such as the coordinate value, shape, size, and color with which each object is to be displayed according to the layout of the screen. The renderer generates screens of various layouts which include the objects, based on the feature values calculated by the calculator. The screen generated by the renderer is displayed within the display area of the display 110.
Meanwhile, although not illustrated in the views, the user terminal apparatus 100 may include a sensor(not illustrated).
The sensor(not illustrated) may sense various operations such as a touch, rotation, inclination, pressure, and approach etc. regarding the user terminal apparatus 100.
In particular, the sensor (not illustrated) may include a touch sensor which senses a touch. The touch sensor may be embodied as a capacitive or a pressure-sensitive touch sensor. A capacitive touch sensor uses a dielectric coated on the surface of the display 110 to sense the micro electric current conducted through the user's body when a part of the user's body touches the surface of the display 110, and calculates a touch coordinate therefrom. A pressure-sensitive touch sensor includes two electrode panels; when the user touches the screen, it senses the current flowing as the upper and lower panels come into contact at the touched point, and calculates a touch coordinate. As aforementioned, the touch sensor may be embodied in various formats. Besides the above, the sensor may further include a geomagnetic sensor for sensing the rotation state and moving direction of the user terminal apparatus 100, and an acceleration sensor for sensing the degree of inclination of the user terminal apparatus 100.
Meanwhile, FIG. 2 is an example of a detailed configuration included in the user terminal apparatus 100, and thus depending on exemplary embodiments, part of the configurative elements illustrated in FIG. 2 may be omitted or changed, or other configurative elements may be further added. For example, a GPS receiver(not illustrated) for receiving a GPS signal from a GPS(Global Positioning System) satellite and calculating a present location of the user terminal apparatus 100, and a DMB receiver(not illustrated) etc. for receiving and processing a DMB(Digital Multimedia Broadcasting) signal may be further included.
FIG. 3A is a view for explaining a configuration of software stored in the storage unit 140.
According to FIG. 3A, in the storage unit 140, software which includes a base module 141, sensing module 142, communication module 143, presentation module 144, web browser module 145, and service module 146 may be stored.
The base module 141 refers to a basic module which processes signals transmitted from each piece of hardware included in the user terminal apparatus 100 and transmits the processed signals to an upper layer module. The base module 141 includes a storage module 141-1, a security module 141-2, and a network module 141-3. The storage module 141-1 is a program module which manages a database (DB) or registry. The main CPU 133 may use the storage module 141-1 to access the database within the storage unit 140 and read various data. The security module 141-2 is a program module which supports certification, permission, secure storage, etc. regarding hardware, and the network module 141-3 is a module for supporting network connection, and includes a DNET module, a UPnP module, etc.
The sensing module 142 is a module which collects information from various sensors, and analyzes and manages the collected information. The sensing module 142 may include a face recognition module, voice recognition module, motion recognition module, and NFC recognition module etc.
The communication module 143 is a module for performing communication with the outside. The communication module 143 may include a messaging module 143-1 such as a messenger program, SMS(Short Message Service) & MMS(Multimedia Message Service) program, and email program etc. and a telephone module 143-2 which includes a Call Info Aggregator program module and VoIP module etc.
A presentation module 144 is a module for configuring a display screen. The presentation module 144 includes a multimedia module 144-1 for reproducing and outputting multimedia contents, and a UI rendering module 144-2 which performs UI and graphic processing. The multimedia module 144-1 may include a player module, camcorder module, and sound processing module etc. Accordingly, it performs an operation of reproducing various multimedia contents and generating a screen and sound. The UI rendering module 144-2 may include an Image Compositor module which combines images, a coordinate combination module which combines coordinates on the screen for displaying an image, an X11 module which receives various events from hardware, and a 2D/3D UI toolkit which provides a tool for configuring a UI of a 2D or 3D format.
The web browser module 145 refers to a module which performs web browsing and accesses the web server. The web browser module 145 may include various modules such as a web view module which configures a web page, a download agent module which performs downloading, a bookmark module, and webkit module etc.
The service module 146 is a module which includes various applications for providing various services. More specifically, the service module 146 may include various program modules such as a navigation program, contents reproducing program, game program, e-book program, calendar program, alarm management program, and other widget program etc.
In FIG. 3A, various program modules have been illustrated, but some of the illustrated program modules may of course be omitted, changed, or added depending on the type and characteristics of the user terminal apparatus 100. For example, the user terminal apparatus 100 may be embodied in a format which further includes a location based module which interworks with hardware such as a GPS chip and supports a location based service.
Referring to FIG. 3B, an example is provided of a multi-window framework architecture on a system (e.g., an Android® system) used by the user terminal apparatus 100 according to an exemplary embodiment of the present invention in order to display a plurality of application windows on a screen. The framework architecture of FIG. 3B may be one component of the software illustrated in FIG. 3A; however, it is explained with reference to a separate drawing for convenience of explanation.
The multi-window framework architecture may include an application framework 310 and a multi-window framework 320. In this case, the multi-window framework 320 may operate separately from the application framework 310.
The application framework 310 may include an activity manager 311, a window manager 312, and a view system 313. The multi-window framework 320 may include a multi-window manager 321.
As an application is executed, the activity manager 311 may request information corresponding to the executing window of the executed application from the multi-window framework. The activity manager 311 may receive, from the multi-window framework, information regarding the display mode, size, and position of the application executing window based on the life cycle of the application executing window. The activity manager 311 may call for information regarding the display mode, size, and position of the application executing window during the creation phase of the window's life cycle.
Further, the window manager 312 may identify the application executing window corresponding to a touch input by a user. The window manager 312 may provide position information on the display corresponding to the user's touch input to the multi-window framework, and receive from the multi-window framework information on the application executing window corresponding to the touch input, as determined by the multi-window framework.
According to exemplary embodiments of the present invention, in response to a touch input of a user, the window manager 312 may receive information regarding the position and size of each application executing window from the multi-window framework, and determine the application executing window corresponding to the user's touch input based on the received positions and sizes.
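The hit-test just described — mapping a touch coordinate to an application window using the positions and sizes received from the multi-window framework — can be sketched as follows. The `Window` type, the window names, and the bottom-to-top ordering convention are illustrative assumptions, not the framework's actual data structures.

```python
# Minimal sketch of touch-to-window hit-testing (assumed types and names).
from typing import NamedTuple, Optional

class Window(NamedTuple):
    name: str
    x: int; y: int; width: int; height: int

def window_at(windows, tx, ty) -> Optional[Window]:
    """Return the topmost window containing (tx, ty).

    Windows are assumed to be ordered bottom-to-top, so a later
    (higher) window wins when several overlap the touch point."""
    hit = None
    for w in windows:
        if w.x <= tx < w.x + w.width and w.y <= ty < w.y + w.height:
            hit = w
    return hit
```

For example, with a full-screen window behind a smaller memo window, a touch inside the memo's rectangle resolves to the memo window, while a touch outside it resolves to the background window.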
The view system 313 may confirm the positions and sizes of a widget window and a pop-up window. In this case, the multi-window framework 320 may determine the sizes and positions of the widget window and pop-up window, and the view system 313 may receive information regarding these sizes and positions from the multi-window framework.
The multi-window manager 321 included in the multi-window framework manages various operations regarding the multi-window functions provided by the user terminal apparatus 100, and provides various Application Programming Interfaces (APIs) regarding the multi-window functions. Further, the multi-window service may store various APIs regarding the multi-window functions. An API regarding functions common to single window and multi-window operation may be implemented as a common class, and an API regarding functions applied only in multi-window operation may be implemented so as to be divided according to display mode. The application framework 310 may further include a content provider 314, a package manager 315, a telephone manager 316, a resource manager 317, a position manager 318, a notice manager 319, and the like.
The multi-window framework 320 may also include a multi-window service 322; its operation is not explained separately, because the service it provides is described above.
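The API organisation described above — a common class for functions shared by single- and multi-window operation, with multi-window-only functions divided per display mode — can be sketched as a small class hierarchy. Every class and method name here is an illustrative assumption; the patent does not disclose the actual API.

```python
# Sketch of the common-class / per-display-mode API split (assumed names).
class CommonWindowApi:
    """Functions shared by single-window and multi-window operation."""
    def show(self, window):
        return f"show {window}"
    def close(self, window):
        return f"close {window}"

class FreestyleModeApi(CommonWindowApi):
    """Multi-window-only functions for a freely positioned (mini) mode."""
    def move(self, window, x, y):
        return f"move {window} to ({x},{y})"

class SplitModeApi(CommonWindowApi):
    """Multi-window-only functions for a fixed split layout mode."""
    def swap(self, a, b):
        return f"swap {a} and {b}"
```

The design point is that callers of a mode-specific API still get the shared operations through inheritance, while each display mode exposes only the extra operations that make sense for it.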
Hereinbelow, various screen configurations provided according to various exemplary embodiments of the present disclosure are explained.
FIG. 4 is a view for explaining a method of displaying an application which supports a multi window mode according to an exemplary embodiment of the present disclosure.
As illustrated in FIG. 4(a), various menu items may be displayed in an icon interface format on the initial screen according to an exemplary embodiment of the present disclosure.
More specifically, a first menu item 10 may be displayed on a central bottom area of the screen, and a second menu item 20 may be displayed on an upper right area of the screen.
The first menu item 10 plays a function of displaying an application which is capable of supporting a multi window mode on a particular area of a screen.
The second menu item 20 plays a function of displaying all applications which may be provided by the user terminal apparatus 100 on the entire area of the screen.
In the view illustrated in FIG. 4(a), when the first menu item 10 is selected, application icons (hereinafter referred to as applications) which may support the multi window mode are displayed in a row on the bottom of the screen. Hereinbelow, the corresponding area is referred to as a "mini tray" for convenience of explanation. In addition, on the left and right sides of the first menu item 10, a third menu item 30 and a fourth menu item 40 may be displayed. Herein, the third menu item 30 plays a function of providing templates of various formats which predefine a layout for arranging a plurality of application windows in the multi window mode. The fourth menu item 40 plays a function of providing a list of the applications currently executing in the multi window mode.
Next, when a user manipulation of touching the mini tray 420 area and dragging in a particular direction is input, the applications move in the corresponding drag direction, as in FIG. 4(c). Next, when the user's touch manipulation is released, as illustrated in FIG. 4(d), the movement of the applications stops, and the display is maintained in the state at the point where the touch manipulation was released. For example, with the 1st to 7th applications 421 to 427 displayed on the mini tray 420, when the user touches a particular application 425 and drags it in the left direction, the application 425 moves by the dragged distance, and the other icons on the mini tray 420 also move in the same direction. In addition, when the user manipulation is released with the 3rd to 9th icons 423 to 429 displayed on the mini tray 420, the movement of the applications on the mini tray 420 stops as displayed.
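The tray-scrolling behaviour of FIG. 4 can be sketched as a simple offset calculation: dragging shifts the whole icon row, and the icons visible in the tray are those that fall within the viewport after the shift. The 80-pixel slot width, the viewport width, and the function name are assumed values for illustration, not figures from the patent.

```python
# Sketch of mini-tray scrolling (assumed slot width and viewport size).
ICON_WIDTH = 80  # assumed pixels per icon slot

def visible_icons(icons, offset_px, viewport_px):
    """Icons visible after the tray has been dragged left by offset_px.

    A drag of one full slot width shifts the visible range by one icon."""
    first = offset_px // ICON_WIDTH
    count = viewport_px // ICON_WIDTH
    return icons[first:first + count]
```

With nine applications and a viewport wide enough for seven slots, the tray initially shows the 1st to 7th applications; after a leftward drag of two slot widths, it shows the 3rd to 9th, matching the example in the text.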
FIG. 5 is a view for explaining an application window display format when an application is executed in a multi window mode or general mode according to other exemplary embodiments of the present disclosure.
As illustrated in FIG. 5A, when the first menu item 10 is selected, the applications which may support the multi window mode may be displayed on the bottom of the screen. Next, when a particular application 510 is selected, the corresponding application window 510 may be displayed in the mini mode format. In this case, a title area 511 may be displayed on the application window 510. On the title area 511, various menu items for supporting the multi window mode may be displayed. For example, as illustrated, a maximization button 511-1, an end button 511-2, and a pinup button 511-3 may be displayed. The functions thereof have been aforementioned, and thus their description is omitted.
On the other hand, as illustrated in FIG. 5B, when the second menu item 20 is selected, all applications which may be provided by the user terminal apparatus may be displayed on the entire area of the screen. Next, when a particular application is selected, the corresponding application window 520 may be displayed in the maximization mode format. In this case, the title area is not displayed on the application window 520.
That is, even for the same application, the application window may be displayed in different formats according to the execution intention of the user. When the user executes an application in the multi window mode, it may be displayed in a mini mode where movement of the window location, adjustment of its size, the pinup function, etc. are possible; when the user executes an application in the general mode, the window may be displayed in the maximization mode. The display thereby reflects the user's intention.
FIGS. 6A and 6B are views for explaining a method of moving the application window in the multi window mode according to other exemplary embodiments of the present disclosure.
According to an exemplary embodiment, as illustrated in FIG. 6A, when the title area 611 of the application window 610 displayed in the mini mode is touched for a predetermined time or longer in the multi window mode, a guide GUI 612 for guiding window movement and size adjustment may be displayed along the circumference of the application window 610. In this case, the guide GUI 612 may be highlighted to differentiate the circumference of the application window 610, or be displayed in an identifiable color.
In addition, in some cases, it is possible to provide haptic feedback on the area where the guide GUI 612 is displayed.
Next, when a user manipulation of touching the title area 611 and dragging in a particular direction is input, the guide GUI 612 may move in the corresponding direction and be displayed. In this case, the application window 610 may remain displayed, without moving, in its originally displayed area.
Next, when the user's touch manipulation is released, the application window 610 may be moved to, and displayed in, the area where the guide GUI 612 is displayed at the point when the touch manipulation is released.
Meanwhile, according to other exemplary embodiments, as illustrated in FIG. 6B, when a user manipulation of touching the title area 611 of the application window 610 displayed in the mini mode in the multi window mode and dragging in a particular direction is input, the application window 610 itself may be moved in the drag direction and be displayed.
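The two movement behaviours of FIGS. 6A and 6B can be sketched side by side: in the first, only a guide rectangle tracks the drag and the window jumps to the guide's position on release; in the second, the window itself tracks the drag. The `Rect` type and function names are illustrative assumptions.

```python
# Sketch of the two window-move behaviours (assumed Rect type and names).
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Rect:
    x: int; y: int; w: int; h: int

def drag_with_guide(window: Rect, dx: int, dy: int):
    """FIG. 6A: while dragging, the window stays put and only the
    guide GUI follows the drag; returns (window, guide)."""
    guide = replace(window, x=window.x + dx, y=window.y + dy)
    return window, guide

def release(window: Rect, guide: Rect) -> Rect:
    """On release, the window moves to where the guide is displayed."""
    return guide

def drag_directly(window: Rect, dx: int, dy: int) -> Rect:
    """FIG. 6B: the window itself moves in the drag direction."""
    return replace(window, x=window.x + dx, y=window.y + dy)
```

Both variants end at the same final position; they differ only in what the user sees moving during the drag, which is exactly the distinction the two figures draw.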
FIG. 7 is a view for explaining a method of adjusting the size of the application window in the multi window mode according to other exemplary embodiments of the present disclosure.
As illustrated in FIG. 7(a), when the title area 711 of the application window 710 displayed in the mini mode in the multi window mode is touched for the predetermined time or longer, as illustrated in FIG. 7(b), a guide GUI 712 for guiding movement and size adjustment may be displayed along the circumference of the application window 710.
Next, as illustrated in FIGS. 7(c) and 7(d), when a particular area of the guide GUI 712 is touched and dragged in a particular direction, or an inside area of the guide GUI is touched and dragged, the guide GUI 712 may be expanded in the drag direction and be displayed. In this case, as illustrated, only the portion of the guide GUI 712 subject to the user manipulation may be expanded and moved, while the portion of the guide GUI 712 not subject to the user manipulation remains fixed along the circumference of the application window.
Next, as illustrated in FIG. 7(e), when a touch regarding the guide GUI 712 is released, the size of the application window may be expanded by a size corresponding to the guide GUI 712 and be displayed.
In the aforementioned exemplary embodiment, the case of expanding the application window was described as an example, but the same method may be applied to the case of reducing the application window.
Meanwhile, although not illustrated in the views, the present disclosure may also be embodied so that the guide GUI is not expanded in the drag direction, but the application window itself is expanded in the drag direction and displayed according to the user's drag manipulation.
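The resize behaviour of FIG. 7 reduces to one rule: the dragged edge moves by the drag distance while the opposite edge stays fixed at the window's original circumference, and on release the window adopts the guide's geometry. The following sketch assumes a `Rect` type and edge names for illustration; negative deltas cover the reduction case mentioned above.

```python
# Sketch of one-edge resizing via the guide GUI (assumed Rect and edge names).
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: int; y: int; w: int; h: int

def drag_edge(window: Rect, edge: str, delta: int) -> Rect:
    """Return the guide rect after dragging one edge outward by delta
    (a negative delta shrinks). The opposite edge does not move."""
    if edge == "right":
        return Rect(window.x, window.y, window.w + delta, window.h)
    if edge == "left":
        return Rect(window.x - delta, window.y, window.w + delta, window.h)
    if edge == "bottom":
        return Rect(window.x, window.y, window.w, window.h + delta)
    if edge == "top":
        return Rect(window.x, window.y - delta, window.w, window.h + delta)
    raise ValueError(f"unknown edge: {edge}")
```

For example, dragging the left edge outward by 50 moves the window origin left by 50 and widens it by 50, so the right edge (the unmanipulated portion) stays exactly where it was.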
FIGS. 8 to 10 are views for explaining a method of displaying the title area on the application window according to various exemplary embodiments of the present disclosure.
As illustrated in FIG. 8, with the application window 810 displayed in the maximization mode, when there is a first user manipulation regarding the information display area 813 provided at the top end of the application window 810, the title area 811 may be displayed in an upper area adjacent to the information display area 813. Herein, the first user manipulation may be a touch manipulation regarding the information display area 813, for example a tap manipulation, but it is not limited thereto; the first user manipulation may also be a flick-down, flick-up, drag-down, or drag-up manipulation. In this case, a return button 811-3 and an end button 811-5 may be included in the title area.
Next, when a predetermined event occurs, the displayed title area 811 may disappear from the application window 810. Herein, the predetermined event may be an event where a predetermined time passes.
That is, when no further event regarding the title area 811 occurs after the title area 811 is displayed, the title area 811 can be made to disappear, so as to provide the application window to the user in as large a size as possible.
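The show-then-auto-hide behaviour of FIG. 8 can be sketched with explicit timestamps instead of a real timer, so the logic is easy to follow. The 3-second timeout and the class name are assumptions for illustration; the patent says only "a predetermined time."

```python
# Sketch of title-area auto-hide (assumed timeout value and names).
HIDE_AFTER = 3.0  # assumed seconds; the patent says "a predetermined time"

class TitleArea:
    def __init__(self):
        self.visible = False
        self.shown_at = None

    def on_first_manipulation(self, now):
        # e.g. a tap on the information display area shows the title area
        self.visible = True
        self.shown_at = now

    def tick(self, now):
        # the predetermined event: the predetermined time has passed
        # since the display point of the title area
        if self.visible and now - self.shown_at >= HIDE_AFTER:
            self.visible = False
```

Each new manipulation restarts the clock, so the title area only disappears when no further event regarding it occurs within the timeout.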
As illustrated in FIG. 9, when there is a second user manipulation regarding the information display area 913 with the application window 910 displayed in a maximization mode, the title area 911 may be displayed as the application window 910 is changed into mini mode.
As illustrated in FIG. 9(a), when a user touch manipulation regarding the maximization button 911-1 provided in the title area 911 of the application window 910 displayed in the mini mode is input, the application window 910 is displayed in a maximization mode.
Next, when a user manipulation of double tapping the information display area 913 of the application window 910 is input, the title area 911 may be displayed as the application window 910 is restored to the mini mode.
Alternatively, as illustrated in FIG. 9(b), when a user manipulation of flicking down or dragging down the information display area 913 of the application window 910 is input with the application window 910 displayed in the maximization mode, the title area may be displayed as the application window 910 is restored to the mini mode. Herein, the flick-down or drag-down manipulation may be a manipulation of touching the information display area 913 and then flicking or dragging it downward.
Meanwhile, as illustrated in FIG. 10(a), when there is a user manipulation regarding the title area 1011 with the title area 1011 displayed on the application window 1010 displayed in the maximization mode, the apparatus enters the multi window mode as the application window 1010 is restored from the maximization mode to the mini mode. Herein, the user manipulation may be a flick-down manipulation regarding the title area 1011, but is not limited thereto; it may also be a tap, flick-up, drag-down, or drag-up manipulation.
Alternatively, as illustrated in FIG. 10(b), when there is a user manipulation regarding the return button 1011-1 provided on the title area 1011 with the title area 1011 displayed on the application window 1010 displayed in the maximization mode, the application window may be restored to the mini mode and displayed. Herein, the user manipulation may be a tap manipulation regarding the return button 1011-1.
Meanwhile, in the multi window mode, the title area 1011 of the application window 1010 may be displayed persistently rather than disappearing dynamically. This enables easy movement and size adjustment of the application window in the multi window mode, which displays a plurality of application windows at the same time. However, depending on the exemplary embodiment, the aforementioned behavior may also be applied in the multi window mode, that is, when the application window 1010 is in the mini mode. In other words, the present disclosure may be embodied in such a manner that the title area is not displayed in the mini mode, but is displayed for a predetermined time only when there is a user manipulation, and then disappears dynamically.
FIG. 11 is a view for explaining a method of controlling a user terminal apparatus according to an exemplary embodiment of the present disclosure.
According to the method of controlling the user terminal apparatus illustrated in FIG. 11, an application window is displayed on a screen (S1110).
Next, a first user command for displaying a title area on the application window is received (S1120).
When the first user command is received, the title area is controlled to be displayed on one area of the application window, and to automatically disappear when a predetermined event occurs (S1130). Herein, the predetermined event may be an event where a predetermined time passes from the display point of the title area, but is not limited thereto.
In particular, the title area may be displayed when the first user command is input in a state where the application window is displayed on the entire area of the screen, that is, in the maximization mode. When the application window is in the mini mode, the title area may be displayed regularly on the application window.
Herein, the first user command may be a user manipulation of touching the predetermined area of the application window. Herein, the touch manipulation may be a tap manipulation, but is not limited thereto.
In addition, the one area of the application window where the title area is displayed may be an area adjacent to the predetermined area which receives the first user command.
In addition, when a second user command is input with the title area displayed on the application window, the apparatus may be controlled to enter a multi window mode for displaying a plurality of application windows on the screen at the same time. Herein, the second user command may be a touch manipulation regarding the title area, but is not limited thereto.
In addition, the application window may be displayed in a mini mode of a predetermined size in the multi window mode, and the title area may be displayed so as to be regularly included in one area of the application window displayed in the mini mode.
In addition, when a double tap manipulation regarding the predetermined area of the application window, or a touch and flick manipulation regarding the predetermined area, is input, the apparatus may be controlled to enter a multi window mode for displaying a plurality of application windows on the screen at the same time. That is, depending on the exemplary embodiment, even when the title area is not displayed, the application window may be changed from the maximization mode to the mini mode with one predetermined user manipulation.
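The steps S1110 to S1130, together with the multi-window entry just described, can be tied together in a small state machine. The state names, method names, and the treatment of the second command are illustrative assumptions sketching the control flow, not the actual implementation.

```python
# Sketch of the control method of FIG. 11 as a state machine (assumed names).
class WindowController:
    def __init__(self):
        self.mode = "maximized"    # S1110: the window is shown full-screen
        self.title_visible = False

    def first_command(self):
        # S1120: e.g. a tap on the window's information display area
        self.title_visible = True  # S1130: the title area is displayed

    def predetermined_event(self):
        # S1130: e.g. the predetermined time elapses; title auto-hides
        self.title_visible = False

    def second_command(self):
        # a manipulation of the displayed title area enters the multi
        # window (mini) mode, where the title area is shown regularly
        if self.title_visible:
            self.mode = "mini"
```

Note that in this sketch the second command has no effect once the title area has auto-hidden, reflecting that it is defined as a manipulation of the displayed title area.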
Meanwhile, a controlling method according to the aforementioned various exemplary embodiments may be embodied as a program and be provided in the user terminal apparatus.
For example, there may be provided a non-transitory computer readable medium which stores a program which performs displaying the application window on a screen, displaying a title area on one area of the application window when a first user command for displaying the title area on the application window is input, and controlling so that the title area automatically disappears when a predetermined event occurs.
A non-transitory computer readable medium refers to a medium which stores data semi-permanently, rather than for a short period of time as in a register, cache, or memory. More specifically, the aforementioned various applications or programs may be stored and provided in a non-transitory computer readable medium such as a CD, DVD, hard disk, Blu-ray disc, USB memory, memory card, or ROM.
Although a few embodiments of the present invention have been shown and described, it would be appreciated by those skilled in the art that changes may be made in this embodiment without departing from the principles and spirit of the invention, the scope of which is defined in the claims and their equivalents.

Claims (15)

  1. A user terminal apparatus comprising:
    a display which displays an application window on a screen;
    a user interface unit which receives an input of a user command; and
    a controller which controls so that a title area is displayed on one area of the application window when a first user command for displaying the title area on the application window is input, and so that the title area automatically disappears when a predetermined event occurs.
  2. The user terminal apparatus according to claim 1, wherein the predetermined event is an event where a predetermined time passes from a display point of the title area.
  3. The user terminal apparatus according to claim 1 or claim 2, wherein the controller displays the title area, when the first user command is input with the application window displayed on an entire area of the screen.
  4. The user terminal apparatus according to claim 1 or claim 2, wherein the first user command is a user manipulation of touching a predetermined area of the application window.
  5. The user terminal apparatus according to claim 4, wherein the one area of the application window is an area adjacent to the predetermined area.
  6. The user terminal apparatus according to claim 1, wherein the controller controls to enter into a multi window mode for displaying a plurality of application windows on the screen at the same time, when a second user command which manipulates the title area is input with the title area displayed on the application window.
  7. The user terminal apparatus according to claim 6, wherein the controller reduces the application window to a predetermined size in the multi window mode, and displays the title area to be regularly included in one area of the reduced application window.
  8. The user terminal apparatus according to claim 6, wherein the second user command is a touch manipulation regarding the title area.
  9. The user terminal apparatus according to claim 1, which controls to enter into a multi window mode for displaying a plurality of application windows on the screen at the same time, when a double tap manipulation regarding a predetermined area of the application window or a touch and flick manipulation regarding the predetermined area is input.
  10. The user terminal apparatus according to claim 1, wherein the user terminal apparatus is a touch based mobile terminal.
  11. A control method of a user terminal apparatus, the control method comprising:
    displaying an application window on a screen;
    displaying a title area on one area of the application window, when a first user command for displaying the title area on the application window is input; and
    controlling so that the title area automatically disappears when a predetermined event occurs.
  12. The control method of a user terminal apparatus according to claim 11, wherein the predetermined event is an event where a predetermined time passes from a display point of the title area.
  13. The control method of a user terminal apparatus according to claim 11 or claim 12, wherein the displaying displays the title area, when the first user command is input with the application window displayed on an entire area of the screen.
  14. The control method of a user terminal apparatus according to claim 11 or claim 12, wherein the first user command is a user manipulation of touching a predetermined area of the application window, and the one area of the application window is an area adjacent to the predetermined area.
  15. The control method of a user terminal apparatus according to claim 11, further comprising controlling to enter into a multi window mode for displaying a plurality of application windows on the screen at the same time, when a second user command which manipulates the title area is input with the title area displayed on the application window.
PCT/KR2013/007695 2012-08-28 2013-08-28 User terminal apparatus and contol method thereof WO2014035123A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020120094502A KR20140028383A (en) 2012-08-28 2012-08-28 User terminal apparatus and contol method thereof
KR10-2012-0094502 2012-08-28

Publications (1)

Publication Number Publication Date
WO2014035123A1 true WO2014035123A1 (en) 2014-03-06

Family

ID=50183869

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/007695 WO2014035123A1 (en) 2012-08-28 2013-08-28 User terminal apparatus and contol method thereof

Country Status (2)

Country Link
KR (1) KR20140028383A (en)
WO (1) WO2014035123A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101525882B1 (en) * 2014-05-02 2015-06-04 에스코어 주식회사 Method of providing multi display which computer-executable, apparatus performing the same and storage media storing the same

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5473745A (en) * 1994-12-14 1995-12-05 International Business Machines Corporation Exposing and hiding a title bar behind its window using a visual cue
US6304261B1 (en) * 1997-06-11 2001-10-16 Microsoft Corporation Operating system for handheld computing device having program icon auto hide
US20050183017A1 (en) * 2001-01-31 2005-08-18 Microsoft Corporation Seekbar in taskbar player visualization mode
US20030237043A1 (en) * 2002-06-21 2003-12-25 Microsoft Corporation User interface for media player program
EP2346220A1 (en) * 2010-01-15 2011-07-20 Research In Motion Limited Method and portable electronic device for processing images

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106411651A (en) * 2016-10-31 2017-02-15 安徽汇顿电子科技有限公司 Smart home debugging system based on network communication
CN107179840A (en) * 2017-05-25 2017-09-19 上海传英信息技术有限公司 Controller and its control method
CN108259803A (en) * 2017-07-20 2018-07-06 青岛海信电器股份有限公司 Electronic terminal device, television terminal, signal input circuit and method
CN108259803B (en) * 2017-07-20 2021-02-02 海信视像科技股份有限公司 Electronic terminal device, television terminal, signal input circuit and method
US11232057B2 (en) 2017-07-20 2022-01-25 Hisense Visual Technology Co., Ltd. Terminal device and control method thereof

Also Published As

Publication number Publication date
KR20140028383A (en) 2014-03-10

Similar Documents

Publication Publication Date Title
WO2014035147A1 (en) User terminal apparatus and controlling method thereof
WO2015119485A1 (en) User terminal device and displaying method thereof
WO2014069750A1 (en) User terminal apparatus and controlling method thereof
WO2014088355A1 (en) User terminal apparatus and method of controlling the same
EP3105649A1 (en) User terminal device and displaying method thereof
WO2015119463A1 (en) User terminal device and displaying method thereof
WO2015119480A1 (en) User terminal device and displaying method thereof
WO2017095040A1 (en) User terminal device and displaying method thereof
WO2014017722A1 (en) Display device for executing multiple applications and method for controlling the same
WO2016167503A1 (en) Display apparatus and method for displaying
WO2016060514A1 (en) Method for sharing screen between devices and device using the same
WO2016052940A1 (en) User terminal device and method for controlling the user terminal device thereof
WO2014175692A1 (en) User terminal device with pen and controlling method thereof
WO2013180454A1 (en) Method for displaying item in terminal and terminal using the same
WO2014182086A1 (en) Display apparatus and user interface screen providing method thereof
WO2014017790A1 (en) Display device and control method thereof
WO2014107011A1 (en) Method and mobile device for displaying image
WO2015178677A1 (en) User terminal device, method for controlling user terminal device, and multimedia system thereof
WO2014098539A1 (en) User terminal apparatus and control method thereof
WO2014035123A1 (en) User terminal apparatus and contol method thereof
WO2016072678A1 (en) User terminal device and method for controlling user terminal device thereof
WO2015020288A1 (en) Display apparatus and the method thereof
WO2015099300A1 (en) Method and apparatus for processing object provided through display
WO2016167610A1 (en) Portable terminal capable of controlling brightness thereof, and brightness control method for same
WO2015178691A1 (en) Display apparatus and controlling method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13833011

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13833011

Country of ref document: EP

Kind code of ref document: A1