US20100149120A1 - Main image processing apparatus, sub image processing apparatus and control method thereof


Info

Publication number
US20100149120A1
US 2010/0149120 A1 (application Ser. No. 12/616,541)
Authority
US
United States
Prior art keywords
image processing
processing apparatus
contents
main image
touch input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/616,541
Inventor
Chang-Soo Lee
Yong-hwan Kwon
Joon-Hwan Kim
Heui-jin Kwon
Jeong-yeon Lee
Victor SZILAGYI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. Assignors: KIM, JOON-HWAN; KWON, HEUI-JIN; KWON, YONG-HWAN; LEE, CHANG-SOO; LEE, JEONG-YEON; SZILAGYI, VICTOR (see document for details).
Publication of US20100149120A1
Priority claimed by later applications US 14/636,738 (US 10,965,980 B2) and US 17/175,127 (US 11,375,262 B2)
Legal status: Abandoned



Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/45: Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
    • H04N 21/4222: Remote control device emulator integrated into a non-television apparatus, e.g. a PDA, media center or smart toy
    • H04N 21/41265: The peripheral being portable, e.g. PDAs or mobile phones, having a remote control device for bidirectional communication between the remote control device and client device
    • H04N 21/42204: User interfaces specially adapted for controlling a client device through a remote control device; remote control devices therefor
    • H04N 21/42209: Display device provided on the remote control for displaying non-command information, e.g. electronic program guide [EPG], e-mail, messages or a second television channel
    • H04N 21/42224: Touch pad or touch panel provided on the remote control
    • H04N 21/47: End-user applications
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/0486: Drag-and-drop
    • G06F 3/04883: Touch-screen or digitiser gesture input for inputting data by handwriting, e.g. gesture or text
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units

Definitions

  • Apparatuses and methods consistent with the present invention relate to a main image processing apparatus, a sub image processing apparatus and a control method thereof, and more particularly, to a main image processing apparatus, a sub image processing apparatus and a control method thereof in which the sub image processing apparatus is used to transmit a control command to the main image processing apparatus.
  • An image processing apparatus receives a video signal from a broadcasting station or from an external device such as a digital versatile disc (DVD) player, and processes it to be displayed as an image.
  • The image processing apparatus processes not only the broadcasting signal from the broadcasting station but also various contents such as game applications, still images (photographs), and moving pictures of digital data.
  • The dual image processing system includes two or more image processing apparatuses and displays contents in various ways according to a user's request.
  • A general dual image processing system receives various control commands from a user through a user input unit provided in the image processing apparatus or through a separate external input device.
  • An aspect of the present invention provides a control method of a sub image processing apparatus connectable with a main image processing apparatus, the control method including: receiving a user's touch input through the sub image processing apparatus; determining which contents correspond to a touched area where the touch input occurs; sensing a change in location of the touch input; and transmitting a control command for the contents to the main image processing apparatus in response to the sensed change in the location of the touch input.
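Although the patent specifies no implementation, the claimed sequence above (receive a touch, resolve it to a content item, sense movement, transmit a command) could be sketched as follows; all function names, the data shapes, and the rectangular hit test are illustrative assumptions, not from the document:

```python
# Hypothetical sketch of the claimed control method of the sub image
# processing apparatus: resolve the touched area to a content item,
# then transmit a command when the touch location changes.

def hit_test(touch_xy, content_areas):
    """Return the content whose on-screen area contains the touch point."""
    x, y = touch_xy
    for content_id, (left, top, right, bottom) in content_areas.items():
        if left <= x <= right and top <= y <= bottom:
            return content_id
    return None

def handle_touch(start_xy, end_xy, content_areas, send):
    """If the touch moved over a content item, send a control command."""
    content = hit_test(start_xy, content_areas)
    if content is None:
        return None
    dx, dy = end_xy[0] - start_xy[0], end_xy[1] - start_xy[1]
    if (dx, dy) != (0, 0):  # a change in touch location was sensed
        command = {"content": content, "delta": (dx, dy)}
        send(command)       # transmit to the main image processing apparatus
        return command
    return None
```

In this sketch `send` stands in for whatever transport the first communication unit provides; the command payload is an arbitrary placeholder.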
  • The control method of the sub image processing apparatus may further include displaying the contents changed corresponding to the sensed change in the location of the touch input.
  • The control method of the sub image processing apparatus may further include transmitting information about the contents to the main image processing apparatus if a flicking input, in which the touch input moves by a predetermined distance or more in a predetermined direction, is sensed in the sensing of the change in the location of the touch input.
  • The transmitting of the control command to the main image processing apparatus includes transmitting a control command that makes the contents displayed in the main image processing apparatus be rotated at a predetermined angle and in a predetermined direction if a circle input, in which the touch input rotates at the angle and in the direction, is sensed in the sensing of the change in the location of the touch input.
  • The transmitting of the control command to the main image processing apparatus may include transmitting a channel-switching command for the contents.
  • The control method of the sub image processing apparatus may further include displaying a control panel for the contents.
  • The control method of the sub image processing apparatus may further include closing the displayed control panel upon receiving information about a power-off state of the main image processing apparatus.
  • The control method of the sub image processing apparatus may further include: receiving information about a power-off state of the main image processing apparatus; and transmitting a power-on command to the main image processing apparatus.
  • The control method of the sub image processing apparatus may further include animating at least one of the menu item selected corresponding to the touch input and the contents corresponding to the menu item.
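As an illustration only, the flicking and circle inputs described above could be distinguished from a recorded touch path roughly as follows; the distance threshold, the turning-angle criterion, and all names are assumptions, since the patent leaves the detection method unspecified:

```python
import math

def classify_gesture(points, flick_distance=50.0):
    """Classify a touch path as 'flick', 'circle', or 'tap' (hypothetical thresholds).

    A flick is a net movement of at least `flick_distance` in one direction;
    a circle is a path whose accumulated turning angle approaches a full turn.
    """
    if len(points) < 2:
        return "tap"
    # Net displacement, used for flick detection.
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    # Accumulated signed turning angle between successive segments.
    total_turn = 0.0
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        turn = (a2 - a1 + math.pi) % (2 * math.pi) - math.pi
        total_turn += turn
    if abs(total_turn) >= 1.5 * math.pi:  # roughly a full loop
        return "circle"
    if math.hypot(dx, dy) >= flick_distance:
        return "flick"
    return "tap"
```

The sign of `total_turn` could additionally supply the rotation direction for the circle-input control command mentioned above.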
  • Another aspect of the present invention is to provide a control method of a main image processing apparatus connectable with a sub image processing apparatus, the control method including: receiving a control command for contents from the sub image processing apparatus; and displaying the contents changed on the basis of the received control command.
  • The displaying of the contents changed depending on the control command may include switching a channel to correspond to the contents.
  • The control method of the main image processing apparatus may further include turning on the main image processing apparatus if the main image processing apparatus is in a power-off state.
  • The control command may include information about rotation at a predetermined angle and in a predetermined direction with regard to the contents, and the displaying of the contents changed depending on the control command includes displaying the contents rotated at the angle and in the direction.
  • Still another aspect of the present invention is to provide a sub image processing apparatus connectable with a main image processing apparatus, the sub image processing apparatus including: a communication unit which communicates with the main image processing apparatus; an image processing unit which processes contents; a display unit which displays the processed contents; a user input unit which receives a user's touch input; and a controller which determines which contents correspond to a touched area where the touch input occurs, senses a change in location of the touch input, and controls the communication unit to transmit a control command for the contents to the main image processing apparatus in response to the sensed change in the location of the touch input.
  • The display unit may display the contents changed corresponding to the sensed change in the location of the touch input.
  • The controller may control the communication unit to transmit information about the contents to the main image processing apparatus if a flicking input, in which the touch input moves by a predetermined distance or more in a predetermined direction, is sensed.
  • The controller may control the communication unit to transmit the control command that makes the contents displayed in the main image processing apparatus be rotated at a predetermined angle and in a predetermined direction if a circle input, in which the touch input rotates at the angle and in the direction, is sensed. Further, the controller may control the display unit to close a control panel for the contents upon receiving information about a power-off state of the main image processing apparatus.
  • The controller may control the communication unit to transmit a power-on command to the main image processing apparatus upon receiving information about a power-off state of the main image processing apparatus.
  • Yet another aspect of the present invention is to provide a main image processing apparatus connectable with a sub image processing apparatus, the main image processing apparatus including: a communication unit which communicates with the sub image processing apparatus; an image processing unit which processes contents; a display unit which displays the processed contents; and a controller which receives a control command for the contents from the sub image processing apparatus, and controls the image processing unit to display the contents changed on the basis of the received control command.
  • The controller may turn on the main image processing apparatus when the main image processing apparatus is in a power-off state, and control the image processing unit to display the contents.
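Purely as an illustrative model of the main-apparatus behavior just described (the state representation, message format, and command names are assumptions, and only a subset of the commands is shown):

```python
# Hypothetical reaction of the main image processing apparatus to a
# received control command: wake up if off, then apply the command.

def handle_control_command(state, message):
    """Apply one received control command to a dict-based apparatus state."""
    if state.get("power") == "off":
        state["power"] = "on"            # turn on before displaying contents
    command = message.get("command")
    if command == "switch_channel":
        state["channel"] = message["channel"]
    elif command == "display_contents":
        state["displayed"] = message["contents"]
    elif command == "rotate":
        # rotate the displayed contents by the requested signed angle
        state["rotation"] = state.get("rotation", 0) + message["angle"]
    return state
```

A real second controller would of course dispatch to the image processing unit and display unit rather than mutate a dictionary; the dictionary simply makes the claimed behavior concrete.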
  • FIGS. 1A and 1B are block diagrams of dual image processing systems according to exemplary embodiments of the present invention.
  • FIGS. 2 and 3 illustrate control of a main image processing apparatus as a user touches a sub image processing apparatus for input.
  • FIGS. 4A to 6B show control screens displayed on the sub image processing apparatus according to an exemplary embodiment of the present invention.
  • FIGS. 7A and 7B illustrate control of the main image processing apparatus on the basis of touch input according to an exemplary embodiment of the present invention.
  • FIG. 8 is a flowchart of a control method of a sub image processing apparatus according to an exemplary embodiment of the present invention.
  • FIG. 9 is a flowchart of a control method of a main image processing apparatus according to an exemplary embodiment of the present invention.
  • FIG. 1A is a block diagram of a dual image processing system 10 according to an exemplary embodiment of the present invention.
  • The dual image processing system 10 includes a sub image processing apparatus 100 and a main image processing apparatus 200 that receives a control command from the sub image processing apparatus 100.
  • The sub image processing apparatus 100 is a portable media player capable of processing an image of various contents such as a still image (photograph), a moving picture, music, etc., and the main image processing apparatus 200 may be achieved by a digital television (TV), a set-top box, etc.
  • The sub image processing apparatus 100 transmits a control command to the main image processing apparatus 200 in order to control operation of the main image processing apparatus 200.
  • The sub image processing apparatus 100 includes a first image receiving unit 110, a first image processing unit 120, a first display unit 130 a, a first user input unit 140 a, a first storage unit 150, a first communication unit 160, and a first controller 170.
  • The first image receiving unit 110 receives a video signal from a broadcasting station or from an external device such as a DVD player or the like.
  • The first image receiving unit 110 includes an antenna, a tuner, etc. to receive a broadcasting signal.
  • The sub image processing apparatus 100 may receive a video signal from the outside via a network.
  • The first image processing unit 120 processes the video signal received by the first image receiving unit 110 into a signal having a format displayable on the first display unit 130 a.
  • The first display unit 130 a displays contents processed by the first image processing unit 120.
  • The first display unit 130 a may change and display the contents corresponding to a change in location of touch input on the first user input unit 140 a.
  • The first display unit 130 a includes a display panel (not shown) where the contents are displayed, and a panel driver (not shown) to drive the video signal output from the first image processing unit 120 to be displayed on the display panel.
  • The display panel may include a liquid crystal display (LCD) panel or a plasma display panel (PDP) by way of example.
  • The first user input unit 140 a may be a control panel, which includes at least one button and is provided in the sub image processing apparatus 100, as a user interface (UI) to receive a user's instruction.
  • The control panel may include specific keys such as a menu key, an arrow key, etc.
  • The first controller 170 determines that a user gives instructions when a button of the control panel is pressed.
  • The first user input unit 140 a may further include a touch panel (hereinafter, also referred to as a "touch pad" or "touch screen") which is in the first display unit 130 b and receives a user's touch input ( FIG. 1B ).
  • The touch panel may include a graphic user interface (GUI) presented by executing a predetermined application and displayed on the first display unit 130 b as a touch area enabling a user's touch input.
  • The touch input is an input based on a user's touch, which includes not only a touch, a tap, or the like, but also at least one of tap-and-hold, drag-and-drop, flicking, and circle as a directional gesture input.
  • The touch input will be explained in more detail in the exemplary embodiments to be described.
  • The first user input unit 140 a, 140 b displays a plurality of menu items 21 (refer to FIG. 5A ) in a menu navigation displayed in an area of the first display unit 130 a, 130 b.
  • The first display unit 130 a, 130 b includes a view zone 22 (refer to FIG. 5A ) to display thumbnail contents corresponding to a menu item 21 selected in the menu navigation, in which the view zone 22 may receive the touch input of a user.
  • The first user input unit 140 a, 140 b receives a user's instruction for controlling the main image processing apparatus 200.
  • The first user input unit 140 a, 140 b includes a TV control panel 24 (refer to FIG. 5B ), presented to control the contents displayed on the main image processing apparatus 200 and displayed on the first display unit 130 a, 130 b. If the touch input is received through the TV control panel 24, the first controller 170 controls the first communication unit 160 to transmit a control command corresponding to the touch input to the main image processing apparatus 200.
  • The first storage unit 150 stores the contents received from the outside.
  • An image stored in the first storage unit 150 may include not only a broadcasting signal transmitted from the broadcasting station and received by the first image receiving unit 110, but also contents such as a game application, a still image (photograph), and a moving picture of digital data received from various external sources such as a DVD player, an MP3 player, a digital camera, etc.
  • The first storage unit 150 may include an internal storage medium such as a flash memory, an erasable programmable read-only memory (EPROM), or a hard disk drive (HDD), or a portable storage medium such as a universal serial bus (USB) memory or a memory card (a memory stick, a compact flash card, or a multi-media card (MMC)).
  • the first communication unit 160 performs wire/wireless communication with the outside according to a predetermined communication protocol. Specifically, the first communication unit 160 transmits a control command corresponding to the touch input of a user to the main image processing apparatus 200 , and transmits the contents stored in the first storage unit 150 to the main image processing apparatus 200 . Further, the first communication unit 160 may receive the contents from the main image processing apparatus 200 .
  • control command is a signal for giving various commands of a user such as power on/off of the main image processing apparatus 200 , switch of a channel, synchronization between the sub image processing apparatus 100 and the main image processing apparatus 200 , adjustment of screen and volume, and recording reservation, etc.
  • the first communication unit 160 may communicate with not only the main image processing apparatus 200 but also various connectable external apparatuses to transmit and receive the contents.
  • the first communication unit 160 may include a wired/wireless communication module connectable with the outside locally or through a network based on a predetermined protocol, a USB port connectable with a portable storage medium such as a USB memory, etc.
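The patent does not define a wire format for these control commands, but one plausible encoding of the command set named above could look like this; the JSON framing, the command vocabulary, and the function names are all assumptions made for illustration:

```python
import json

# Hypothetical wire format for the control command signal; the actual
# protocol used between the two apparatuses is not specified.
KNOWN_COMMANDS = {"power_on", "power_off", "switch_channel",
                  "sync", "adjust_screen", "adjust_volume",
                  "reserve_recording"}

def encode_command(name, **params):
    """Encode one control command and its parameters as a JSON message."""
    if name not in KNOWN_COMMANDS:
        raise ValueError(f"unknown command: {name}")
    return json.dumps({"command": name, "params": params})

def decode_command(message):
    """Decode a received control command message into (name, params)."""
    payload = json.loads(message)
    return payload["command"], payload["params"]
```

Any serialization with a command name and parameters would serve equally well; JSON is used here only to keep the sketch self-contained.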
  • The first controller 170 performs general control of the sub image processing apparatus 100.
  • The first controller 170 determines the contents corresponding to a touched area and controls the first communication unit 160 to transmit a control command about the determined contents to the main image processing apparatus 200.
  • The control command may be given as a control command that makes the same contents as displayed on the first display unit 130 a, 130 b be displayed on a second display unit 230 of the main image processing apparatus 200.
  • The first controller 170 determines the contents corresponding to the touched area where the touch input is performed, and, upon sensing a location change of the touch input by a touch input having directionality (e.g., flicking), transmits the control command about the contents to the main image processing apparatus 200 in response to the sensed location change.
  • The touched area may include the view zone 22 where the contents are displayed.
  • The transmitted control command may include a power-on command for the main image processing apparatus 200 when the main image processing apparatus 200 has been turned off.
  • The first controller 170 transmits the power-on command to the main image processing apparatus 200 when receiving information about the power-off state of the main image processing apparatus 200 while the touch input is given for transmitting the contents displayed on the first display unit 130 a, 130 b to the main image processing apparatus 200.
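The power-on behavior just described can be sketched as follows; the state values, message dictionaries, and function name are hypothetical:

```python
# Illustrative sketch: when the user hands contents over while the main
# apparatus reports a power-off state, send a power-on command first.

def transmit_contents(main_power_state, contents, send):
    """Send the contents, preceded by power-on if the main apparatus is off."""
    messages = []
    if main_power_state == "off":
        messages.append({"command": "power_on"})       # wake the main apparatus
    messages.append({"command": "display_contents",    # then hand over contents
                     "contents": contents})
    for message in messages:
        send(message)
    return messages
```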
  • The first controller 170 controls the first communication unit 160 to transmit the contents displayed on the first display unit 130 a, 130 b to the main image processing apparatus 200, and to transmit the control command for the transmitted contents to be displayed on the second display unit 230 of the main image processing apparatus 200.
  • The first controller 170 may control the first communication unit 160 to transmit a channel-switching command for the determined contents.
  • The first controller 170 may be achieved by a relevant software program and a processor, such as a microcomputer or a central processing unit (CPU), to load and execute this program.
  • The main image processing apparatus 200 may include a second image receiving unit 210, a second image processing unit 220, a second display unit 230, a second user input unit 240, a second storage unit 250, a second communication unit 260, and a second controller 270.
  • The second image receiving unit 210 receives a video signal from a broadcasting station or from an external device such as a DVD player or the like.
  • The second image receiving unit 210 includes an antenna, a tuner, etc. to receive a broadcasting signal.
  • The second image processing unit 220 processes the video signal received by the second image receiving unit 210 into a signal having a format displayable on the second display unit 230.
  • The second display unit 230 displays contents processed by the second image processing unit 220.
  • The second display unit 230 includes a display panel (not shown) where the contents are displayed, and a panel driver (not shown) to drive the video signal output from the second image processing unit 220 to be displayed on the display panel.
  • The display panel may include a liquid crystal display (LCD) panel or a plasma display panel (PDP) by way of example.
  • The second user input unit 240 may be achieved by a control panel, which includes at least one button and is provided in the main image processing apparatus 200, as a user interface (UI) to receive a user's instruction.
  • The control panel may include specific keys such as a menu key, an arrow key, etc.
  • The second controller 270 determines that a user gives instructions when a button of the control panel is pressed.
  • The second user input unit 240 may further include a graphic user interface (GUI) presented by executing a predetermined application and displayed on the second display unit 230 as a touch area enabling a user's touch input.
  • The second storage unit 250 stores the contents received from the outside.
  • An image stored in the second storage unit 250 may include not only a broadcasting signal transmitted from the broadcasting station and received by the second image receiving unit 210, but also an image received from the sub image processing apparatus 100 through the second communication unit 260, and contents such as a game application, a still image (photograph), and a moving picture of digital data received from various external sources such as a DVD player, an MP3 player, a digital camera, etc.
  • The second storage unit 250 may include an internal storage medium such as a flash memory, an erasable programmable read-only memory (EPROM), or a hard disk drive (HDD), or a portable storage medium such as a universal serial bus (USB) memory or a memory card (a memory stick, a compact flash card, or a multi-media card (MMC)).
  • The second communication unit 260 performs wire/wireless communication with the outside according to a predetermined communication protocol. Specifically, the second communication unit 260 receives a control command from the sub image processing apparatus 100.
  • Further, the second communication unit 260 may receive contents from the sub image processing apparatus 100.
  • The sub image processing apparatus 100 may decode (or trans-code) the contents into a predetermined format before transmitting them to the main image processing apparatus 200.
  • Here, the control command is a signal for giving various user commands such as power on/off of the main image processing apparatus 200, channel switching, synchronization between the sub image processing apparatus 100 and the main image processing apparatus 200, adjustment of screen and volume, recording reservation, etc.
  • Further, the second communication unit 260 may communicate with not only the sub image processing apparatus 100 but also various connectable external apparatuses to transmit and receive the contents.
  • The second communication unit 260 may include a wired/wireless communication module connectable with the outside locally or through a network based on a predetermined protocol, a USB port connectable with a portable storage medium such as a USB memory, etc.
  • The second controller 270 performs general control of the main image processing apparatus 200. In more detail, if receiving a control command from the sub image processing apparatus 100 through the second communication unit 260, the second controller 270 controls the second image processing unit 220 to control the contents displayed on the second display unit 230 in response to the received control command.
  • Here, the received control command may be given as a control command that makes the same contents as displayed on the first display unit 130 a, 130 b of the sub image processing apparatus 100 be displayed on the second display unit 230.
  • Further, the second controller 270 may turn on the main image processing apparatus 200 and control the second image processing unit 220 to display the contents corresponding to the relevant control command, when receiving the control command in the state that the main image processing apparatus 200 has been turned off.
  • The second controller 270 may be achieved by a relevant software program, and a processor such as a microcomputer, a central processing unit (CPU), or the like to load and execute this program.
  • FIGS. 2 and 3 illustrate control of the main image processing apparatus 200 as a user touches the sub image processing apparatus 100 for input.
  • The sub image processing apparatus 100 may receive a flicking input 11 as the touch input while displaying predetermined contents.
  • The flicking input 11 means a touch input based on tap & hold, in which a finger or the like (e.g., a stylus) moves by a predetermined distance or more in a predetermined direction (e.g., toward the main image processing apparatus 200 or upward on the touch screen) while being in contact with a touching area corresponding to the predetermined contents.
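The flicking classification described above (a movement of at least a predetermined distance in a predetermined direction while in contact with the screen) can be sketched as follows. This is a minimal illustration only, not part of the disclosed embodiment; the threshold value and function names are hypothetical.

```python
# Sketch of flicking-input detection: a touch that moves at least a
# minimum distance is classified by its dominant direction.
import math

FLICK_MIN_DISTANCE = 50  # hypothetical threshold, in pixels

def classify_flick(start, end, min_distance=FLICK_MIN_DISTANCE):
    """Return the dominant direction ('up', 'down', 'left', 'right')
    if the touch moved far enough, otherwise None."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if math.hypot(dx, dy) < min_distance:
        return None  # too short to count as a flicking input
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy < 0 else "down"  # screen y grows downward
```

An upward flick over the view zone, for example, would then be mapped to the control command that sends the displayed contents to the main image processing apparatus.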
  • When receiving the flicking input 11 toward the main image processing apparatus 200 for the contents displayed on the first display unit 130 a, 130 b, the first controller 170 transmits the control command to the main image processing apparatus 200 so that the same contents as displayed on the first display unit 130 a, 130 b can be displayed on the second display unit 230.
  • The sub image processing apparatus 100 may transmit information about relevant contents to the main image processing apparatus 200.
  • The main image processing apparatus 200 displays the relevant contents on the second display unit 230 on the basis of the information about the received contents.
  • The first controller 170 receives information about the power-off state of the main image processing apparatus 200 and transmits a power-on command to the main image processing apparatus 200 on the basis of the received information about the power-off state.
  • The second controller 270 turns on the main image processing apparatus 200 on the basis of the received power-on command, and controls the second image processing unit 220 so that the same contents as displayed on the first display unit 130 a, 130 b can be displayed on the second display unit 230 on the basis of the control command corresponding to the flicking input 11.
  • Here, the main image processing apparatus 200 in a "power-off" state may not be entirely "off" in that the main image processing apparatus 200 may be in a low power consumption setting so that it can receive and process a power-on command.
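The power-on handshake described above can be sketched as follows. This is a hypothetical illustration, not the patented implementation: if the main apparatus reports a power-off (standby) state, the sub apparatus sends a power-on command before the content-display command.

```python
# Sketch of the power-on handshake between sub and main apparatuses.
class MainApparatus:
    """Stand-in for the main image processing apparatus 200."""
    def __init__(self):
        self.powered = False
        self.displayed = None

    def state(self):
        return "on" if self.powered else "off"

    def handle(self, command, payload=None):
        if command == "power-on":
            self.powered = True  # standby logic: accepted even while "off"
        elif command == "display" and self.powered:
            self.displayed = payload

def send_contents(contents, main):
    """Sub-apparatus side: power the main apparatus on first if needed."""
    if main.state() == "off":
        main.handle("power-on")
    main.handle("display", contents)
```

The key design point mirrored here is that the "off" state is a low-power listening state, so the power-on command can still be received and processed.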
  • The first controller 170 may control the first communication unit 160 to transmit a channel-switching command to the main image processing apparatus 200.
  • The main image processing apparatus 200 displays contents corresponding to a channel changed by the received channel-switching command.
  • The second controller 270 receives the power-on command from the sub image processing apparatus 100 and turns on the main image processing apparatus 200 in response to the power-on command, thereby controlling the second image processing unit 220 to display contents corresponding to the channel-switching command.
  • FIGS. 4A to 6B show control screens displayed on the first display unit 130 a, 130 b of the sub image processing apparatus 100 according to an exemplary embodiment of the present invention.
  • The first display unit 130 a, 130 b displays the menu navigation including the plurality of menu items 21, and the view zone 22 where the thumbnail contents are displayed corresponding to the menu item 21 selected in the menu navigation.
  • Here, the area where the menu item 21 selected in the menu navigation is displayed is a target zone.
  • The first display unit 130 a, 130 b displays the thumbnail contents corresponding to the selected menu item 23 on the view zone 22, as shown in FIG. 4B.
  • The sub image processing apparatus 100 senses change in location due to the flicking input 11 and transmits a control command about the contents to the main image processing apparatus 200.
  • The main image processing apparatus 200 displays the contents corresponding to the flicking input 11.
  • The first display unit 130 a, 130 b may animate at least one of the menu item selected corresponding to the received touch input and the contents displayed on the view zone 22 corresponding to the menu item.
  • The animation in this exemplary embodiment may include the menu item or the contents being displayed while moving in sequence.
  • The first controller 170 may control the first display unit 130 a, 130 b to display the control panel 24 for controlling the contents displayed on the main image processing apparatus 200.
  • A user may give a control command for the contents to the main image processing apparatus 200 through the control panel 24.
  • The sub image processing apparatus 100 receives the information about the power-off state of the main image processing apparatus 200 and closes the control panel 24, as shown in FIG. 5C.
  • The sub image processing apparatus 100 may display the contents of the view zone 22 in a full screen display, as shown in FIG. 6A.
  • The sub image processing apparatus 100 may receive a circle input 12 as the touch input for the contents.
  • The circle input 12 means a touch input based on rotation at a predetermined angle and in a predetermined direction.
  • The first controller 170 controls the first image processing unit 120 to rotate the contents by the angle and direction of the received circle input 12, as shown in FIG. 6B, and display the rotated contents on the first display unit 130 a, 130 b. Further, the first controller 170 controls the first communication unit 160 to transmit a control command for rotating the contents displayed on the main image processing apparatus 200 by the received angle and direction.
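The angle and direction of a circle input can be reduced to a single signed rotation value. The following is a sketch under assumed conventions (positive = counterclockwise, pivot point given explicitly); the function name is hypothetical and not from the disclosure.

```python
# Sketch: derive a signed rotation angle from two touch positions
# swept around a pivot point, as a circle input might be quantified.
import math

def rotation_from_touch(center, p0, p1):
    """Signed angle in degrees swept from p0 to p1 around center."""
    a0 = math.atan2(p0[1] - center[1], p0[0] - center[0])
    a1 = math.atan2(p1[1] - center[1], p1[0] - center[0])
    deg = math.degrees(a1 - a0)
    # normalize to the range (-180, 180]
    while deg <= -180:
        deg += 360
    while deg > 180:
        deg -= 360
    return deg
```

The resulting angle and its sign (direction) would then be carried in the control command so the main apparatus can rotate the same contents identically.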
  • FIGS. 7A and 7B illustrate control of the main image processing apparatus 200 on the basis of touch input according to an exemplary embodiment of the present invention.
  • The main image processing apparatus 200 receives the control command corresponding to the flicking input and displays the contents changed according to the control command.
  • Here, the user's flicking input is toward a physical location of the main image processing apparatus 200.
  • The main image processing apparatus 200 receives the control command corresponding to the circle input and displays the contents rotated by the sensed angle and direction.
  • In the above exemplary embodiments, the touch input having the directionality is described with respect to the flicking input and the circle input, but is not limited thereto.
  • FIG. 8 is a flowchart of the control method of the sub image processing apparatus 100 according to an exemplary embodiment of the present invention.
  • The sub image processing apparatus 100 receives a user's touch input through the first user input unit 140 a, 140 b.
  • The first controller 170 determines which contents correspond to the touched area in response to the touch input.
  • The first controller 170 senses the change in location of the touch input received in the operation S 110.
  • Here, the sensed change in the location may include the flicking input in which the location of the touch input moves by a predetermined distance or more in a predetermined direction, or the circle input in which the location of the touch input rotates at a predetermined angle and in a predetermined direction.
  • The first controller 170 controls the first image processing unit 120 to change the contents determined in the operation S 120 in response to the change of the location sensed in the operation S 130, and display it on the first display unit 130 a, 130 b.
  • The first controller 170 controls the first communication unit 160 to send the main image processing apparatus 200 a control command for the contents in response to the change in the location sensed in the operation S 130.
  • Here, the control command may be given as a control command that makes the same contents as displayed in the operation S 140 be displayed on the main image processing apparatus 200.
  • Further, the control command in the operation S 140 may include a channel-switching command, a power-on command for the main image processing apparatus 200, etc.
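The FIG. 8 flow (receive touch, determine contents, sense change in location, update the local display, transmit a control command) can be sketched end to end. This is an illustrative reduction only; the data shapes and names are hypothetical, not the claimed method.

```python
# Sketch of the sub-apparatus control flow of FIG. 8:
# receive touch -> determine contents -> sense change
# -> update local display -> transmit control command.
def sub_control_flow(touch_event, content_map, comm_log, screen):
    area = touch_event["area"]            # S 110: receive touch input
    contents = content_map.get(area)      # S 120: determine touched contents
    change = touch_event.get("change")    # S 130: sensed change ('flick'/'circle')
    if contents is None or change is None:
        return None                       # no directional gesture: nothing to send
    screen.append((contents, change))     # S 140: local display updated
    command = {"contents": contents, "change": change}
    comm_log.append(command)              # S 150-like step: sent to main apparatus
    return command
```

The same command dictionary would also carry a power-on or channel-switching request when the sensed gesture calls for one.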
  • FIG. 9 is a flowchart of the control method of the main image processing apparatus 200 according to an exemplary embodiment of the present invention.
  • The main image processing apparatus 200 receives the control command for the contents from the sub image processing apparatus 100.
  • The main image processing apparatus 200 is turned on in response to the received control command at operation S 230.
  • The main image processing apparatus 200 displays the contents corresponding to the control command received in the operation S 210.
  • For example, the main image processing apparatus 200 displays the contents changed (e.g., rotated) or channel-switched on the basis of the received control command.
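The FIG. 9 handling on the main-apparatus side can be sketched as follows, assuming a small command vocabulary (the field names are illustrative): power on if currently off, then apply the requested change.

```python
# Sketch of the main-apparatus control flow of FIG. 9:
# receive a control command, power on if needed, apply the change.
def main_control_flow(state, command):
    if not state["powered"]:
        state["powered"] = True  # turn on when a command arrives while off
    kind = command["kind"]
    if kind == "rotate":
        state["rotation"] = (state.get("rotation", 0) + command["angle"]) % 360
    elif kind == "switch-channel":
        state["channel"] = command["channel"]
    elif kind == "display":
        state["contents"] = command["contents"]
    return state
```

Keeping the power-on step first matches the description above: a single flicking input can both wake the main apparatus and put the same contents on its screen.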
  • As described above, the main image processing apparatus 200 is controlled through the sub image processing apparatus 100 supporting the touch input, and thus it is more convenient for a user to use the dual image processing system 10.
  • In the above exemplary embodiments, the sub image processing apparatus 100 is used for transmitting the control command to the main image processing apparatus 200, but the present invention is not limited thereto.
  • For example, the main image processing apparatus 200 may be used for controlling the sub image processing apparatus 100, or the control command may be transmitted and received between the main and sub image processing apparatuses 200 and 100.

Abstract

Disclosed are a main image processing apparatus, a sub image processing apparatus and a control method thereof. The control method of the sub image processing apparatus includes receiving a user's touch input through the sub image processing apparatus; determining which contents correspond to a touched area where the touch input occurs; sensing change in location of the touch input; and transmitting a control command for the contents to the main image processing apparatus in response to the sensed change in the location of the touch input. With this, it is more convenient for a user to control the main image processing apparatus through the sub image processing apparatus supporting the touch input.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims priority from Korean Patent Application No. 10-2008-0125783, filed on Dec. 11, 2008 in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference.
  • BACKGROUND OF INVENTION
  • 1. Field of Invention
  • Apparatuses and methods consistent with the present invention relate to a main image processing apparatus, a sub image processing apparatus and a control method thereof, and more particularly, to a main image processing apparatus, a sub image processing apparatus and a control method thereof, in which the sub image processing apparatus is used in transmitting a control command to the main image processing apparatus.
  • 2. Description of the Related Art
  • An image processing apparatus receives a video signal from a broadcasting station or a video signal from an external device such as a digital versatile disc (DVD) player or the like, and processes it to be displayed as an image. The image processing apparatus processes not only the broadcasting signal from the broadcasting station but also various contents such as a game application, a still image (photograph), and a moving picture of digital data.
  • Recently, a demand for a dual image processing system has been on the rise, in which the dual image processing system includes two or more image processing apparatuses and displays contents in various methods according to a user's request.
  • However, a general dual image processing system receives various control commands from a user through a user input unit provided in the image processing apparatus or an external input device provided separately.
  • Thus, there is a growing need for a dual image processing system in which one image processing apparatus can be used to transmit a control command to another image processing apparatus.
  • SUMMARY OF THE INVENTION
  • The foregoing and/or other aspects of the present invention can be achieved by providing a control method of a sub image processing apparatus connectable with a main image processing apparatus, the control method including: receiving a user's touch input through the sub image processing apparatus; determining which contents correspond to a touched area where the touch input occurs; sensing change in location of the touch input; and transmitting a control command for the contents to the main image processing apparatus in response to the sensed change in the location of the touch input.
  • The control method of the sub image processing apparatus may further include displaying the contents changed corresponding to the sensed change in the location of the touch input.
  • The control method of the sub image processing apparatus may further include transmitting information about the contents to the main image processing apparatus if a flicking input in which the touch input moves by a predetermined distance or more in a predetermined direction is sensed in the sensing the change in the location of the touch input.
  • The transmitting the control command to the main image processing apparatus includes transmitting the control command that makes the contents displayed in the main image processing apparatus be rotated at a predetermined angle and in a predetermined direction if a circle input in which the touch input rotates at the angle and in the direction is sensed in the sensing the change in the location of the touch input.
  • The transmitting the control command to the main image processing apparatus may include transmitting a channel-switching command for the contents.
  • The control method of the sub image processing apparatus may further include displaying a control panel for the contents. Here, the control method of the sub image processing apparatus may further include closing the displayed control panel if receiving information about a power-off state of the main image processing apparatus.
  • The control method of the sub image processing apparatus may further include: receiving information about a power-off state of the main image processing apparatus; and transmitting a power-on command to the main image processing apparatus.
  • The control method of the sub image processing apparatus may further include animating at least one of the menu item selected corresponding to the touch input and the contents corresponding to the menu item.
  • Another aspect of the present invention is to provide a control method of a main image processing apparatus connectable with a sub image processing apparatus, the control method including: receiving a control command for contents from the sub image processing apparatus; and displaying the contents changed on the basis of the received control command.
  • The displaying the contents changed depending on the control command may include switching a channel to correspond to the contents.
  • The control method of the main image processing apparatus may further include turning on the main image processing apparatus if the main image processing apparatus has been turned off.
  • The control command may include information about rotation at a predetermined angle and in a predetermined direction with regard to the contents, and the displaying the contents changed depending on the control command includes displaying the contents rotated at the angle and in the direction.
  • Still another aspect of the present invention is to provide a sub image processing apparatus connectable with a main image processing apparatus, the sub image processing apparatus including: a communication unit which communicates with the main image processing apparatus; an image processing unit which processes contents; a display unit which displays the processed contents; a user input unit which receives a user's touch input; and a controller which determines which contents correspond to a touched area where the touch input occurs, sensing change in location of the touch input, and controlling the communication unit to transmit a control command for the contents to the main image processing apparatus in response to the sensed change in the location of the touch input.
  • The display unit may display the contents changed corresponding to the sensed change in the location of the touch input.
  • The controller may control the communication unit to transmit information about the contents to the main image processing apparatus if a flicking input in which the touch input moves by a predetermined distance or more in a predetermined direction is sensed.
  • The controller may control the communication unit to transmit the control command that makes the contents displayed in the main image processing apparatus be rotated at a predetermined angle and in a predetermined direction if a circle input in which the touch input rotates at the angle and in the direction is sensed. Further, the controller may control the display unit to close a control panel for the contents if receiving information about a power-off state of the main image processing apparatus.
  • The controller may control the communication unit to transmit a power-on command to the main image processing apparatus if receiving information about a power-off state of the main image processing apparatus.
  • Yet another aspect of the present invention is to provide a main image processing apparatus connectable with a sub image processing apparatus, the main image processing apparatus including: a communication unit which communicates with the sub image processing apparatus; an image processing unit which processes contents; a display unit which displays the processed contents; and a controller which receives a control command for the contents from the sub image processing apparatus, and controls the image processing unit to display the contents changed on the basis of the received control command.
  • The controller may turn on the main image processing apparatus when the main image processing apparatus has been turned off, and control the image processing unit to display the contents.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and/or other aspects of the present invention will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings, in which:
  • FIGS. 1A and 1B are block diagrams of dual image processing systems according to exemplary embodiments of the present invention;
  • FIGS. 2 and 3 illustrate control of a main image processing apparatus as a user touches a sub image processing apparatus for input;
  • FIGS. 4A to 6B show control screens displayed on the sub image processing apparatus according to an exemplary embodiment of the present invention;
  • FIGS. 7A and 7B illustrate control of the main image processing apparatus on the basis of touch input according to an exemplary embodiment of the present invention;
  • FIG. 8 is a flowchart of a control method of a sub image processing apparatus according to an exemplary embodiment of the present invention; and
  • FIG. 9 is a flowchart of a control method of a main image processing apparatus according to an exemplary embodiment of the present invention.
  • DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS OF THE INVENTION
  • Below, exemplary embodiments of the present invention will be described in detail with reference to accompanying drawings.
  • FIG. 1A is a block diagram of a dual image processing system 10 according to an exemplary embodiment of the present invention. The dual image processing system 10 includes a sub image processing apparatus 100, and a main image processing apparatus 200 that receives a control command from the sub image processing apparatus 100.
  • In this exemplary embodiment, the sub image processing apparatus 100 is a portable media player capable of processing an image about various contents such as a still image (photograph), a moving picture, music, etc., and the main image processing apparatus 200 may be achieved by a digital television (TV), a set-top box, etc. The sub image processing apparatus 100 transmits a control command to the main image processing apparatus 200 in order to control operation of the main image processing apparatus 200.
  • As shown in FIG. 1A, the sub image processing apparatus 100 includes a first image receiving unit 110, a first image processing unit 120, a first display unit 130 a, a first user input unit 140 a, a first storage unit 150, a first communication unit 160, and a first controller 170.
  • The first image receiving unit 110 receives a video signal from a broadcasting station or from an external device such as a DVD player or the like. The first image receiving unit 110 includes an antenna, a tuner, etc. to receive a broadcasting signal. Meanwhile, the sub image processing apparatus 100 according to this embodiment may receive a video signal from the outside via a network.
  • The first image processing unit 120 processes the video signal received by the first image receiving unit 110 into a signal having a format displayable in the first display unit 130 a.
  • The first display unit 130 a displays contents processed by the first image processing unit 120. The first display unit 130 a may change and display the contents corresponding to change in location of touch input on the first user input unit 140 a.
  • The first display unit 130 a includes a display panel (not shown) where the contents are displayed, and a panel driver (not shown) to drive the video signal output from the first image processing unit 120 to be displayed on the display panel. In this exemplary embodiment, the display panel (not shown) may include a liquid crystal display (LCD) panel or a plasma display panel (PDP) by way of example.
  • The first user input unit 140 a may be a control panel, which includes at least one button and is provided in the sub image processing apparatus 100, as a user interface (UI) to receive a user's instruction. The control panel may include specific keys such as a menu key, an arrow key, etc. The first controller 170 determines that a user gives instructions when a button of the control panel is pressed.
  • In another exemplary embodiment, as a first user input unit 140 b, the first user input unit 140 a may further include a touch panel (hereinafter, it will be also referred to as a “touch pad” or “touch screen”) which is in the first display unit 130 b and receives a user's touch input (FIG. 1B). The touch panel may include a graphic user interface (GUI) presented by executing a predetermined application and displayed on the first display unit 130 b as a touch area enabling a user's touch input.
  • Here, the touch input is an input based on a touch of a user, which not only includes a touch, a tap or the like, but also includes at least one of tap and hold, drag and drop, flicking and circle as a directional gesture input. Below, the touch input will be explained in more detail with exemplary embodiments to be described.
  • Further, the first user input unit 140 a, 140 b displays a plurality of menu items 21 (refer to FIG. 5A), and a menu navigation displayed in an area of the first display unit 130 a, 130 b. The first display unit 130 a, 130 b includes a view zone 22 (refer to FIG. 5A) to display thumbnail contents corresponding to a menu item 21 selected in the menu navigation, in which the view zone 22 may receive the touch input of a user.
  • Meanwhile, the first user input unit 140 a, 140 b receives a user's instruction for controlling the main image processing apparatus 200. Specifically, the first user input unit 140 a, 140 b includes a TV control panel 24 (refer to FIG. 5B) presented to control the contents displayed on the main image processing apparatus 200 and displayed on the first display unit 130 a, 130 b. If the touch input is received through the TV control panel 24, the first controller 170 controls the first communication unit 160 to transmit a control command corresponding to the touch input to the main image processing apparatus 200.
  • The first storage unit 150 stores the contents received from the outside. Here, an image stored in the first storage unit 150 may include not only a broadcasting signal transmitted from the broadcasting station and received by the first image receiving unit 110, but also contents such as a game application, a still image (photograph), and a moving picture of digital data received from various external sources such as a DVD player, an MP3 player, a digital camera, etc.
  • The first storage unit 150 may include an internal storage medium such as a flash memory, an erasable programmable read only memory (EPROM) and a hard disk drive (HDD), or a portable storage medium such as a universal serial bus (USB) memory and a memory card (a memory stick, a compact flash card, and a multi-media card (MMC)).
  • The first communication unit 160 performs wire/wireless communication with the outside according to a predetermined communication protocol. Specifically, the first communication unit 160 transmits a control command corresponding to the touch input of a user to the main image processing apparatus 200, and transmits the contents stored in the first storage unit 150 to the main image processing apparatus 200. Further, the first communication unit 160 may receive the contents from the main image processing apparatus 200.
  • Here, the control command is a signal for giving various commands of a user such as power on/off of the main image processing apparatus 200, switch of a channel, synchronization between the sub image processing apparatus 100 and the main image processing apparatus 200, adjustment of screen and volume, and recording reservation, etc.
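As an illustration only, the command vocabulary enumerated above could be modeled as a simple enumeration. The names and string values below are hypothetical and not part of the disclosure.

```python
# Sketch of the control-command vocabulary carried between the sub and
# main image processing apparatuses (names illustrative).
from enum import Enum

class ControlCommand(Enum):
    POWER_ON = "power-on"
    POWER_OFF = "power-off"
    SWITCH_CHANNEL = "switch-channel"
    SYNCHRONIZE = "synchronize"
    ADJUST_SCREEN = "adjust-screen"
    ADJUST_VOLUME = "adjust-volume"
    RESERVE_RECORDING = "reserve-recording"
```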
  • Further, the first communication unit 160 may communicate with not only the main image processing apparatus 200 but also various connectable external apparatuses to transmit and receive the contents.
  • The first communication unit 160 may include a wired/wireless communication module connectable with the outside locally or through a network based on a predetermined protocol, a USB port connectable with a portable storage medium such as a USB memory, etc.
  • The first controller 170 performs general control of the sub image processing apparatus 100. In more detail, if a user touches the first user input unit 140 a, 140 b for input, the first controller 170 determines contents corresponding to a touched area and controls the first communication unit 160 to transmit a control command about the determined contents to the main image processing apparatus 200.
  • Here, the control command may be given as a control command that makes the same contents as displayed on the first display unit 130 a, 130 b be displayed on a second display unit 230 of the main image processing apparatus 200.
  • In more detail, the first controller 170 determines the contents corresponding to the touched area where the touch input is performed, and, if sensing a location change of the touch input by a touch input having directionality (e.g., flicking), transmits the control command about the contents to the main image processing apparatus 200 in response to the sensed location change. Here, the touched area may include the view zone 22 where the contents are displayed.
  • Further, the transmitted control command may include a power-on command for the main image processing apparatus 200 in the state that the main image processing apparatus 200 has been turned off. For example, the first controller 170 transmits the power-on command to the main image processing apparatus 200 when receiving information about the power-off state of the main image processing apparatus 200 in the state that the touch input is given for transmitting the contents displayed on the first display unit 130 a, 130 b to the main image processing apparatus 200. Then, the first controller 170 controls the first communication unit 160 to transmit the contents displayed on the first display unit 130 a, 130 b to the main image processing apparatus 200, and transmit the control command for displaying the transmitted contents to be displayed on the second display unit 230 of the main image processing apparatus 200.
  • Further, the first controller 170 may control the first communication unit 160 to transmit a channel-switching command for the determined contents.
  • The first controller 170 may be achieved by a relevant software program, and a processor such as a microcomputer, a central processing unit (CPU), or the like to load and execute this program.
  • Meanwhile, as shown in FIGS. 1A and 1B, the main image processing apparatus 200 according to an exemplary embodiment of the present invention may include a second image receiving unit 210, a second image processing unit 220, a second display unit 230, a second user input unit 240, a second storage unit 250, a second communication unit 260, and a second controller 270.
  • The second image receiving unit 210 receives a video signal from a broadcasting station or from an external device such as a DVD player or the like. The second image receiving unit 210 includes an antenna, a tuner, etc. to receive a broadcasting signal.
  • The second image processing unit 220 processes the video signal received by the second image receiving unit 210 into a signal having a format displayable in the second display unit 230.
  • The second display unit 230 displays contents processed by the second image processing unit 220.
  • The second display unit 230 includes a display panel (not shown) where the contents are displayed, and a panel driver (not shown) to drive the video signal output from the second image processing unit 220 to be displayed on the display panel. In this embodiment, the display panel (not shown) may include a liquid crystal display (LCD) panel or a plasma display panel (PDP) by way of example.
  • The second user input unit 240 may be achieved by a control panel, which includes at least one button and is provided in the main image processing apparatus 200, as a user interface (UI) to receive a user's instruction. The control panel may include specific keys such as a menu key, an arrow key, etc. The second controller 270 determines that a user gives instructions when a button of the control panel is pressed.
  • Further, the second user input unit 240 may further include a graphic user interface (GUI) presented by executing a predetermined application and displayed on the second display unit 230 as a touch area enabling a user's touch input.
  • The second storage unit 250 stores the contents received from the outside. Here, an image stored in the second storage unit 250 may include not only a broadcasting signal transmitted from the broadcasting station and received by the second image receiving unit 210, but also an image received from the sub image processing apparatus 100 through the second communication unit 260 and contents such as a game application, a still image (photograph), and a moving picture of digital data received from various external sources such as a DVD player, an MP3 player, a digital camera, etc.
  • The second storage unit 250 may include an internal storage medium such as a flash memory, an erasable programmable read only memory (EPROM) and a hard disk drive (HDD), or a portable storage medium such as a universal serial bus (USB) memory and a memory card (a memory stick, a compact flash card, and a multi-media card (MMC)).
  • The second communication unit 260 performs wire/wireless communication with the outside according to a predetermined communication protocol. Specifically, the second communication unit 260 receives a control command from the sub image processing apparatus 100.
  • Further, the second communication unit may receive contents from the sub image processing apparatus 100. Here, the sub image processing apparatus 100 may decode (or transcode) the contents into a predetermined format before transmitting them to the main image processing apparatus 200.
  • Here, the control command is a signal for giving various user commands, such as power on/off of the main image processing apparatus 200, switching of a channel, synchronization between the sub image processing apparatus 100 and the main image processing apparatus 200, adjustment of screen and volume, and recording reservation.
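Purely as an illustration (the patent does not specify a wire format), the command kinds listed above could be modeled as a small enumerated payload; all names here are hypothetical:

```python
# Hypothetical control-command payload covering the command kinds named in
# the description: power on/off, channel switching, synchronization,
# screen/volume adjustment, and recording reservation.
from dataclasses import dataclass, field
from enum import Enum, auto

class Command(Enum):
    POWER_ON = auto()
    POWER_OFF = auto()
    SWITCH_CHANNEL = auto()
    SYNC = auto()
    ADJUST_SCREEN = auto()
    ADJUST_VOLUME = auto()
    RESERVE_RECORDING = auto()

@dataclass
class ControlCommand:
    kind: Command
    params: dict = field(default_factory=dict)  # command-specific arguments

cmd = ControlCommand(Command.SWITCH_CHANNEL, {"channel": 7})
print(cmd.kind.name, cmd.params["channel"])
```

A real implementation would serialize such a structure over the wire/wireless protocol the communication units share.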
  • Further, the second communication unit 260 may communicate with not only the sub image processing apparatus 100 but also various connectable external apparatuses to transmit and receive the contents.
  • The second communication unit 260 may include a wired/wireless communication module connectable with the outside locally or through a network based on a predetermined protocol, a USB port connectable with a portable storage medium such as a USB memory, etc.
  • The second controller 270 performs general control of the main image processing apparatus 200. In more detail, if receiving a control command from the sub image processing apparatus 100 through the second communication unit 260, the second controller 270 controls the second image processing unit 220 to control the contents displayed on the second display unit 230 in response to the received control command.
  • Here, the received control command may be given as a control command that causes the same contents as displayed on the first display unit 130 a, 130 b of the sub image processing apparatus 100 to be displayed on the second display unit 230.
  • Further, the second controller 270 may turn on the main image processing apparatus 200 and control the second image processing unit 220 to display the contents corresponding to the relevant control command, when receiving the control command in the state that the main image processing apparatus 200 has been turned off.
  • The second controller 270 may be achieved by a relevant software program and a processor, such as a microcomputer or a central processing unit (CPU), that loads and executes the program.
  • Below, exemplary embodiments of controlling the main image processing apparatus 200 through the sub image processing apparatus 100 will be described in more detail with reference to accompanying drawings.
  • FIGS. 2 and 3 illustrate control of the main image processing apparatus 200 as a user touches the sub image processing apparatus 100 for input.
  • As shown in FIG. 2, the sub image processing apparatus 100 may receive a flicking input 11 as the touch input while displaying predetermined contents. In this embodiment, the flicking input 11 means a touch input based on tap & hold in which a finger or the like (e.g., a stylus) moves in a predetermined direction (e.g., toward the main image processing apparatus 200 or upward on the touch screen) by a predetermined distance or more while remaining in contact with a touching area corresponding to the predetermined contents.
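As an illustrative sketch only (the distance threshold and sampling scheme are assumptions, not taken from the patent), the flick classification just described could look like this:

```python
# Sketch of classifying a tap-&-hold-then-move touch sequence as a flick:
# the touch must travel at least a threshold distance while in contact,
# and its dominant direction is reported. The threshold is illustrative.
import math

def classify_flick(points, min_distance=50.0):
    """points: [(x, y), ...] sampled while the finger stays in contact.
    Returns 'up', 'down', 'left', or 'right' if the net motion meets
    min_distance, else None (treated as a plain touch)."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < min_distance:
        return None
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy < 0 else "down"  # screen y grows downward

print(classify_flick([(100, 300), (102, 240), (101, 180)]))  # → up
```

An upward flick (or one aimed toward the main apparatus) would then be mapped to the "send contents" command.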
  • When receiving the flicking input 11 toward the main image processing apparatus 200 for the contents displayed on the first display unit 130 a, 130 b, the first controller 170 transmits the control command to the main image processing apparatus 200 so that the same contents as displayed on the first display unit 130 a, 130 b can be displayed on the second display unit 230. To this end, the sub image processing apparatus 100 may transmit information about relevant contents to the main image processing apparatus 200. The main image processing apparatus 200 displays the relevant contents on the second display unit 230 on the basis of the information about the received contents.
  • If the main image processing apparatus 200 has been turned off as shown in FIG. 2, the first controller 170 receives information about the power-off state of the main image processing apparatus 200 and transmits a power-on command to the main image processing apparatus 200 on the basis of the received information about the power-off state. The second controller 270 turns on the main image processing apparatus 200 on the basis of the received power-on command, and controls the second image processing unit 220 so that the same contents as displayed on the first display unit 130 a, 130 b can be displayed on the second display unit 230 on the basis of the control command corresponding to the flicking input 11. In an exemplary embodiment, the main image processing apparatus 200 in a “power-off” state may not be entirely “off” in that the image processing apparatus 200 may be in a low power consumption setting so that it can receive and process a power-on command.
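The power-on sequence above can be sketched as follows; the transport object and its method names are hypothetical stand-ins for the first communication unit 160:

```python
# Sketch: if the main apparatus reports a power-off state, send POWER_ON
# first, then the contents with a display command. Names are illustrative.

def send_contents(link, contents):
    if link.query_power_state() == "off":
        link.send("POWER_ON")
    link.send(("SHOW_CONTENTS", contents))

class FakeLink:
    """Stand-in for the wire/wireless link; records what was sent."""
    def __init__(self, power_state):
        self.power_state = power_state
        self.sent = []
    def query_power_state(self):
        return self.power_state
    def send(self, message):
        self.sent.append(message)

link = FakeLink("off")
send_contents(link, "channel_7_thumbnail")
print(link.sent)  # POWER_ON precedes the contents
```

This matches the description's note that "power-off" may really be a low-power standby state in which the apparatus can still receive the power-on command.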
  • As shown in FIG. 3, if the contents displayed on the first display unit 130 a, 130 b of the sub image processing apparatus 100 include channel information, the first controller 170 may control the first communication unit 160 to transmit a channel-switching command to the main image processing apparatus 200. Thus, the main image processing apparatus 200 displays contents corresponding to a channel changed by the received channel-switching command. Here, if the channel-switching command is received while the main image processing apparatus 200 has been turned off, the second controller 270 receives the power-on command from the sub image processing apparatus 100 and turns on the main image processing apparatus 200 in response to the power-on command, thereby controlling the second image processing unit 220 to display contents corresponding to the channel-switching command.
  • FIGS. 4A to 6B show control screens displayed on the first display unit 130 a, 130 b of the sub image processing apparatus 100 according to an exemplary embodiment of the present invention.
  • As shown in FIG. 4A, an area of the first display unit 130 a, 130 b displays the menu navigation including the plurality of menu items 21, and the view zone 22 where the thumbnail contents are displayed corresponding to the menu item 21 selected in the menu navigation. Here, the area where the menu item 21 selected in the menu navigation is displayed is a target zone.
  • In FIG. 4A, if a user selects another menu item 23 instead of the currently displayed contents through the touch input, the first display unit 130 a, 130 b displays the thumbnail contents corresponding to the selected menu item 23 on the view zone 22, as shown in FIG. 4B.
  • As shown in FIG. 5A, if there is the flicking input 11 from a user while the view zone 22 displays predetermined contents, the sub image processing apparatus 100 senses change in location due to the flicking input 11 and transmits a control command about the contents to the main image processing apparatus 200. Thus, the main image processing apparatus 200 displays the contents corresponding to the flicking input 11.
  • Here, the first display unit 130 a, 130 b may animate at least one of the menu item selected corresponding to the received touch input and the contents displayed on the view zone 22 corresponding to the menu item. For example, the animation in this exemplary embodiment may include the menu item or the contents being displayed while moving in sequence.
  • As shown in FIG. 5B, the first controller 170 may control the first display unit 130 a, 130 b to display the control panel 24 for controlling the contents displayed on the main image processing apparatus 200. A user may give a control command for the contents to the main image processing apparatus 200 through the control panel 24.
  • If the main image processing apparatus 200 is turned off, the sub image processing apparatus 100 receives the information about the power-off state of the main image processing apparatus 200 and closes the control panel 24 as shown in FIG. 5C.
  • Meanwhile, the sub image processing apparatus 100 may display the contents of the view zone 22 in a full screen display as shown in FIG. 6A.
  • As shown in FIG. 6A, the sub image processing apparatus 100 may receive a circle input 12 as the touch input for the contents. Here, the circle input 12 means a touch input based on rotation at a predetermined angle and in a predetermined direction. When sensing the circle input 12, the first controller 170 controls the first image processing unit 120 to rotate the contents by the angle and direction of the received circle input 12 as shown in FIG. 6B and display the rotated contents on the first display unit 130 a, 130 b. Further, the first controller 170 controls the first communication unit 160 to transmit a control command for rotating the contents displayed on the main image processing apparatus 200 by the received angle and direction.
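As a minimal sketch (the center point and sign convention are assumptions for illustration), the angle and direction of a circle input can be derived from the touch samples like so:

```python
# Sketch of deriving a rotation from a circle input: the signed angle
# swept around a center point from the first to the last touch sample.
import math

def circle_rotation(points, center):
    """Signed sweep angle in degrees (positive = counter-clockwise in
    math coordinates) from the first to the last touch sample."""
    cx, cy = center
    def angle(p):
        return math.atan2(p[1] - cy, p[0] - cx)
    sweep = angle(points[-1]) - angle(points[0])
    # normalize to [-180, 180) degrees
    sweep = (sweep + math.pi) % (2 * math.pi) - math.pi
    return math.degrees(sweep)

# quarter-circle gesture around (0, 0): from (1, 0) to (0, 1)
print(round(circle_rotation([(1, 0), (0.7, 0.7), (0, 1)], (0, 0))))  # → 90
```

The resulting angle and its sign (direction) would be applied locally and also sent in the rotation control command to the main apparatus.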
  • FIGS. 7A and 7B illustrate control of the main image processing apparatus 200 on the basis of touch input according to an exemplary embodiment of the present invention.
  • As shown in FIG. 7A, if the sub image processing apparatus 100 senses a user's flicking input that moves by a predetermined distance or more in a direction toward the main image processing apparatus 200, the main image processing apparatus 200 receives the control command corresponding to the flicking input and displays the contents changed according to the control command. In an exemplary embodiment, the user's flicking input is toward a physical location of the main image processing apparatus 200.
  • As shown in FIG. 7B, if the sub image processing apparatus 100 senses a user's circle input that rotates by a predetermined angle and a predetermined direction, the main image processing apparatus 200 receives the control command corresponding to the circle input and displays the contents rotated by the sensed angle and direction.
  • In the foregoing exemplary embodiment, the touch input having the directionality is described with respect to the flicking input and the circle input, but is not limited thereto.
  • The control methods of the main and sub image processing apparatuses 200 and 100 with these configurations will be described below with reference to FIGS. 8 and 9.
  • FIG. 8 is a flowchart of the control method of the sub image processing apparatus 100 according to an exemplary embodiment of the present invention.
  • Referring to FIG. 8, at operation S110 the sub image processing apparatus 100 receives a user's touch input through the first user input unit 140 a, 140 b.
  • At operation S120, the first controller 170 determines which contents correspond to the touched area in response to the touch input.
  • At operation S130, the first controller 170 senses the change in location of the touch input received in the operation S110. Here, the sensed change in the location may include the flicking input in which the location of the touch input moves by a predetermined distance or more in a predetermined direction, or the circle input in which the location of the touch input rotates at a predetermined angle and in a predetermined direction.
  • At operation S140, the first controller 170 controls the first image processing unit 120 to change the contents determined in the operation S120 in response to the change in location sensed in the operation S130, and display them on the first display unit 130 a, 130 b.
  • At operation S150, the first controller 170 controls the first communication unit 160 to send the main image processing apparatus 200 a control command for the contents in response to the change in the location sensed in the operation S130. Here, the control command may be given as a control command that causes the same contents as displayed in the operation S140 to be displayed on the main image processing apparatus 200. Further, the control command may include a channel-switching command, a power-on command for the main image processing apparatus 200, etc.
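The operations S110 through S150 above can be sketched end to end as follows; the stubbed hit test, threshold, and event labels are illustrative assumptions, not the claimed method:

```python
# Sketch of operations S110–S150 on the sub apparatus: receive a touch,
# resolve the contents, sense the location change, update the local
# display, and queue the outgoing control command.

def sub_control_flow(touch_points, outbox):
    """Walks S110–S150; appends the outgoing command to outbox."""
    log = ["S110:touch_received"]                    # S110: user's touch input
    contents = "channel_7_thumbnail"                 # S120: hit-tested contents (stub)
    (x0, y0), (x1, y1) = touch_points[0], touch_points[-1]
    if abs(x1 - x0) + abs(y1 - y0) >= 50:            # S130: location change sensed
        log.append("S140:display_updated")           # S140: update first display unit
        outbox.append(("S150:command", contents))    # S150: send control command
    return log

outbox = []
print(sub_control_flow([(0, 0), (0, 80)], outbox), outbox)
```

A touch that does not move far enough produces no display change and no outgoing command.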
  • FIG. 9 is a flowchart of the control method of the main image processing apparatus 200 according to an exemplary embodiment of the present invention.
  • Referring to FIG. 9, at operation S210 the main image processing apparatus 200 receives the control command for the contents from the sub image processing apparatus 100.
  • Here, if the main image processing apparatus 200 has been turned off at operation S220, the main image processing apparatus 200 is turned on in response to the received control command at operation S230.
  • At operation S240, the main image processing apparatus 200 displays the contents corresponding to the control command received in the operation S210. Here, the main image processing apparatus 200 displays the contents changed (e.g., rotated) or channel-switched on the basis of the received control command.
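The operations S210 through S240 can be sketched on the receiving side as follows; the state and command names are hypothetical placeholders:

```python
# Sketch of operations S210–S240 on the main apparatus: on receiving a
# control command while off, power on first (S220–S230), then apply the
# change and display the contents (S240).

class MainApparatus:
    def __init__(self):
        self.power = "off"
        self.displayed = None

    def handle(self, command, contents):             # S210: command received
        if self.power == "off":                      # S220: power-off state?
            self.power = "on"                        # S230: turn on
        if command == "rotate":
            self.displayed = f"{contents} (rotated)" # S240: changed contents
        elif command == "switch_channel":
            self.displayed = contents                # S240: channel-switched
        return self.displayed

tv = MainApparatus()
print(tv.handle("switch_channel", "channel 7"), tv.power)
```

This mirrors the standby behavior noted earlier: the "off" apparatus still processes the incoming command, powering on before displaying.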
  • In the dual image processing system 10 according to the above described exemplary embodiments of the present invention, the main image processing apparatus 200 is controlled through the sub image processing apparatus 100 supporting the touch input, and thus it is more convenient for a user to use the dual image processing system 10.
  • In the foregoing embodiments, the sub image processing apparatus 100 is used for transmitting the control command to the main image processing apparatus 200, but the invention is not limited thereto. Alternatively, the main image processing apparatus 200 may be used for controlling the sub image processing apparatus 100, or the control command may be transmitted and received between the main and sub image processing apparatuses 200 and 100.
  • Although a few exemplary embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (21)

1. A control method of a sub image processing apparatus in communication with a main image processing apparatus, the control method comprising:
receiving a user touch input at the sub image processing apparatus;
determining which at least one of contents correspond to a touched area of the user touch input;
sensing a change in location of the user touch input; and
transmitting a control command for the at least one of contents to the main image processing apparatus in response to the sensed change in the location of the user touch input.
2. The control method of the sub image processing apparatus according to claim 1, further comprising displaying at least one of the contents changed corresponding to the sensed change in the location of the user touch input.
3. The control method of the sub image processing apparatus according to claim 1, further comprising transmitting information about the at least one of the contents to the main image processing apparatus if the sensing the change in the location of the user touch input senses a flicking input in which the user touch input moves by a predetermined distance or more in a predetermined direction.
4. The control method of the sub image processing apparatus according to claim 1, wherein the transmitting the control command for the at least one of the contents to the main image processing apparatus comprises transmitting the control command that rotates the at least one of the contents displayed in the main image processing apparatus at a predetermined angle and in a predetermined direction, if the sensing the change in location of the user touch input senses a circle input in which the user touch input rotates at the predetermined angle and in the predetermined direction.
5. The control method of the sub image processing apparatus according to claim 1, wherein the transmitting the control command for the at least one of the contents to the main image processing apparatus comprises transmitting a channel-switching command for the at least one of the contents.
6. The control method of the sub image processing apparatus according to claim 1, further comprising displaying a control panel for the contents.
7. The control method of the sub image processing apparatus according to claim 6, further comprising closing the displayed control panel if receiving information about a power-off state of the main image processing apparatus.
8. The control method of the sub image processing apparatus according to claim 1, further comprising:
receiving information about a power-off state of the main image processing apparatus; and
transmitting a power-on command to the main image processing apparatus.
9. The control method of the sub image processing apparatus according to claim 1, further comprising animating at least one of a menu item selected corresponding to the user touch input and the contents corresponding to the at least one of the menu item.
10. A control method of a main image processing apparatus connectable with a sub image processing apparatus, the control method comprising:
receiving a control command for at least one of contents from the sub image processing apparatus; and
displaying the at least one of the contents changed based on the received control command.
11. The control method of the main image processing apparatus according to claim 10, wherein the displaying the at least one of the contents changed based on the control command comprises switching a channel to correspond to the at least one of the contents.
12. The control method of the main image processing apparatus according to claim 10, further comprising turning on the main image processing apparatus if the image processing apparatus is off.
13. The control method of the main image processing apparatus according to claim 10, wherein the control command comprises information about rotation at a predetermined angle and in a predetermined direction for the at least one of the contents, and
the displaying the at least one of the contents changed based on the control command comprises displaying the at least one of the contents rotated at the predetermined angle and in the predetermined direction.
14. A sub image processing apparatus communicating with a main image processing apparatus, the sub image processing apparatus comprising:
a communication unit which communicates with the main image processing apparatus;
an image processing unit which processes contents;
a display unit which displays the processed contents;
a user input unit which receives a user touch input; and
a controller which determines which at least one of contents corresponds to a touched area of the user touch input, senses a change in location of the user touch input, and controls the communication unit to transmit a control command for the at least one of the contents to the main image processing apparatus in response to the sensed change in the location of the user touch input.
15. The sub image processing apparatus according to claim 14, wherein the display unit displays the at least one of the contents changed corresponding to the sensed change in the location of the user touch input.
16. The sub image processing apparatus according to claim 14, wherein the controller controls the communication unit to transmit information about the at least one of the contents to the main image processing apparatus if a flicking input in which the user touch input moves by a predetermined distance or more in a predetermined direction is sensed.
17. The sub image processing apparatus according to claim 14, wherein the controller controls the communication unit to transmit the control command that makes the at least one of the contents displayed in the main image processing apparatus be rotated at a predetermined angle and in a predetermined direction if a circle input in which the user touch input rotates at the predetermined angle and in the predetermined direction is sensed.
18. The sub image processing apparatus according to claim 14, wherein the controller controls the display unit to close a control panel for the contents if information about a power-off state of the main image processing apparatus is received.
19. The sub image processing apparatus according to claim 14, wherein the controller controls the communication unit to transmit a power-on command to the main image processing apparatus if information about a power-off state of the main image processing apparatus is received.
20. A main image processing apparatus communicating with a sub image processing apparatus, the main image processing apparatus comprising:
a communication unit which communicates with the sub image processing apparatus;
an image processing unit which processes contents;
a display unit which displays the processed contents; and
a controller which receives a control command for at least one of the contents from the sub image processing apparatus, and controls the image processing unit to display the at least one of the contents changed based on the received control command.
21. The main image processing apparatus according to claim 20, wherein the controller turns on the main image processing apparatus when the main image processing apparatus is off, and controls the image processing unit to display the at least one of the contents.
US12/616,541 2008-12-11 2009-11-11 Main image processing apparatus, sub image processing apparatus and control method thereof Abandoned US20100149120A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/636,738 US10965980B2 (en) 2008-12-11 2015-03-03 Main image processing apparatus, sub image processing apparatus and control method thereof
US17/175,127 US11375262B2 (en) 2008-12-11 2021-02-12 Main image processing apparatus, sub image processing apparatus and control method thereof

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020080125783A KR101635640B1 (en) 2008-12-11 2008-12-11 Display apparatus, display system and control method thereof
KR10-2008-0125783 2008-12-11

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/636,738 Continuation US10965980B2 (en) 2008-12-11 2015-03-03 Main image processing apparatus, sub image processing apparatus and control method thereof

Publications (1)

Publication Number Publication Date
US20100149120A1 true US20100149120A1 (en) 2010-06-17

Family

ID=41566135

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/616,541 Abandoned US20100149120A1 (en) 2008-12-11 2009-11-11 Main image processing apparatus, sub image processing apparatus and control method thereof
US14/636,738 Active 2030-03-19 US10965980B2 (en) 2008-12-11 2015-03-03 Main image processing apparatus, sub image processing apparatus and control method thereof
US17/175,127 Active US11375262B2 (en) 2008-12-11 2021-02-12 Main image processing apparatus, sub image processing apparatus and control method thereof

Family Applications After (2)

Application Number Title Priority Date Filing Date
US14/636,738 Active 2030-03-19 US10965980B2 (en) 2008-12-11 2015-03-03 Main image processing apparatus, sub image processing apparatus and control method thereof
US17/175,127 Active US11375262B2 (en) 2008-12-11 2021-02-12 Main image processing apparatus, sub image processing apparatus and control method thereof

Country Status (3)

Country Link
US (3) US20100149120A1 (en)
EP (3) EP3570535A1 (en)
KR (1) KR101635640B1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100167781A1 (en) * 2008-12-30 2010-07-01 Samsung Electronics Co., Ltd. Method for display of dual standby portable terminal and apparatus thereof
US20110187748A1 (en) * 2010-01-29 2011-08-04 Samsung Electronics Co. Ltd. Apparatus and method for rotating output image in mobile terminal
US20120086774A1 (en) * 2009-06-16 2012-04-12 Sangwook Nam 3d display device and selective image display method thereof
US20120139948A1 (en) * 2010-12-07 2012-06-07 Moriya Kinuko Display processing apparatus and display processing method
US20130169625A1 (en) * 2011-12-28 2013-07-04 Samsung Electronics Co., Ltd. Image processing apparatus, upgrade apparatus, display system including the same, and control method thereof
US20130198334A1 (en) * 2010-10-19 2013-08-01 Sony Computer Entertainment Inc. Information processing system, information processing method, information processing program, computer-readable recording medium on which information processing program is stored
US20140218326A1 (en) * 2011-11-08 2014-08-07 Sony Corporation Transmitting device, display control device, content transmitting method, recording medium, and program
US20160073098A1 (en) * 2014-09-10 2016-03-10 Continental Automotive Systems, Inc. Head-up display system using auto-stereoscopy 3d transparent electronic display
CN105472277A (en) * 2011-12-28 2016-04-06 三星电子株式会社 Image processing apparatus, upgrade apparatus, display system including the same, and control method thereof
USD772253S1 (en) * 2013-02-19 2016-11-22 Sony Computer Entertainment Inc. Display panel or screen with an animated graphical user interface
US20160343350A1 (en) * 2015-05-19 2016-11-24 Microsoft Technology Licensing, Llc Gesture for task transfer
US20220299763A1 (en) * 2021-03-19 2022-09-22 Seiko Epson Corporation Display system and display device

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101635640B1 (en) * 2008-12-11 2016-07-05 삼성전자 주식회사 Display apparatus, display system and control method thereof
KR101984462B1 (en) * 2010-12-31 2019-05-30 이베이 인크. Methods and systems for displaying content on multiple networked devices with a simple command
JP5241941B2 (en) * 2011-09-28 2013-07-17 日立コンシューマエレクトロニクス株式会社 Portable terminal, system, information processing method and program
KR102110779B1 (en) * 2013-06-27 2020-05-14 삼성전자 주식회사 Method and apparatus for managing page display mode in application of an user device
KR102166777B1 (en) * 2013-09-30 2020-11-04 삼성전자주식회사 Display apparatus and method for controlling the same
US20160173563A1 (en) * 2014-12-12 2016-06-16 Microsoft Technology Licensing, Llc Rotation Control of an External Display Device
US10073599B2 (en) 2015-01-07 2018-09-11 Microsoft Technology Licensing, Llc Automatic home screen determination based on display device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6396523B1 (en) * 1999-07-29 2002-05-28 Interlink Electronics, Inc. Home entertainment device remote control
US6496927B1 (en) * 1999-06-09 2002-12-17 Amx Corporation Method and configuring a user interface for controlling a controlled device based upon a device class
US20040158854A1 (en) * 2003-02-10 2004-08-12 Shinnosuke Nagasawa Interactive remote control unit
US20040257337A1 (en) * 2003-04-04 2004-12-23 Canon Kabushiki Kaisha Display control device and method, and display system
US20070036128A1 (en) * 2004-02-09 2007-02-15 Matsushita Electric Industrial Co., Ltd. Communication terminal and communication method
US20070146347A1 (en) * 2005-04-22 2007-06-28 Outland Research, Llc Flick-gesture interface for handheld computing devices
US20080196068A1 (en) * 2007-02-09 2008-08-14 Mitac International Corporation Portable multimedia device

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0879847A (en) * 1994-09-05 1996-03-22 Hitachi Ltd Information system, av equipment constituting the system and remote control
US5956025A (en) * 1997-06-09 1999-09-21 Philips Electronics North America Corporation Remote with 3D organized GUI for a home entertainment system
US7831930B2 (en) * 2001-11-20 2010-11-09 Universal Electronics Inc. System and method for displaying a user interface for a remote control application
US20020000370A1 (en) * 1999-08-04 2002-01-03 Richard J. Pommer Ion processing of a substrate
US6970127B2 (en) * 2000-01-14 2005-11-29 Terayon Communication Systems, Inc. Remote control for wireless control of system and displaying of compressed video on a display on the remote
US20020002707A1 (en) * 2000-06-29 2002-01-03 Ekel Sylvain G. System and method to display remote content
US20030025738A1 (en) * 2001-07-31 2003-02-06 Eastman Kodak Company User interface including portable display for use with multiple electronic devices
JP3925297B2 (en) * 2002-05-13 2007-06-06 ソニー株式会社 Video display system and video display control device
JP3780982B2 (en) * 2002-07-05 2006-05-31 ソニー株式会社 Video display system, video display method, and display device
JP3991799B2 (en) * 2002-07-15 2007-10-17 株式会社日立製作所 Information processing terminal and recording / reproducing apparatus
US20050024488A1 (en) * 2002-12-20 2005-02-03 Borg Andrew S. Distributed immersive entertainment system
EP1538829B1 (en) * 2003-09-12 2015-11-04 Panasonic Corporation Image displaying apparatus and method
US9053754B2 (en) * 2004-07-28 2015-06-09 Microsoft Technology Licensing, Llc Thumbnail generation and presentation for recorded TV programs
US8456534B2 (en) * 2004-10-25 2013-06-04 I-Interactive Llc Multi-directional remote control system and method
US7461343B2 (en) * 2004-11-08 2008-12-02 Lawrence Kates Touch-screen remote control for multimedia equipment
US7722289B2 (en) * 2004-12-08 2010-05-25 Casella Waste Systems, Inc. Systems and methods for underground storage of biogas
JP4385995B2 (en) * 2005-05-23 2009-12-16 ソニー株式会社 Content display / playback system, content display / playback method, recording medium recording content display / playback program, and operation control apparatus
US20070003612A1 (en) * 2005-06-30 2007-01-04 Microsoft Corporation Capsule
KR100717691B1 (en) 2005-10-08 2007-05-14 삼성전자주식회사 Display Apparatus and Channel Navigation Method Thereof
KR100857508B1 (en) * 2007-04-24 2008-09-08 (주)비욘위즈 Method and apparatus for digital broadcating set-top box controller and digital broadcasting system
KR101536750B1 (en) * 2007-11-08 2015-07-15 삼성전자주식회사 Remote controller for setting mode according to state of broadcast receiving apparatus
US20090303097A1 (en) * 2008-06-09 2009-12-10 Echostar Technologies Llc Systems, methods and apparatus for changing an operational mode of a remote control
US9355554B2 (en) * 2008-11-21 2016-05-31 Lenovo (Singapore) Pte. Ltd. System and method for identifying media and providing additional media content
KR101635640B1 (en) * 2008-12-11 2016-07-05 삼성전자 주식회사 Display apparatus, display system and control method thereof

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6496927B1 (en) * 1999-06-09 2002-12-17 Amx Corporation Method and configuring a user interface for controlling a controlled device based upon a device class
US6396523B1 (en) * 1999-07-29 2002-05-28 Interlink Electronics, Inc. Home entertainment device remote control
US20040158854A1 (en) * 2003-02-10 2004-08-12 Shinnosuke Nagasawa Interactive remote control unit
US20040257337A1 (en) * 2003-04-04 2004-12-23 Canon Kabushiki Kaisha Display control device and method, and display system
US20070036128A1 (en) * 2004-02-09 2007-02-15 Matsushita Electric Industrial Co., Ltd. Communication terminal and communication method
US20070146347A1 (en) * 2005-04-22 2007-06-28 Outland Research, Llc Flick-gesture interface for handheld computing devices
US20080196068A1 (en) * 2007-02-09 2008-08-14 Mitac International Corporation Portable multimedia device

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8494578B2 (en) * 2008-12-30 2013-07-23 Samsung Electronics Co., Ltd. Method for display of dual standby portable terminal and apparatus thereof
US20100167781A1 (en) * 2008-12-30 2010-07-01 Samsung Electronics Co., Ltd. Method for display of dual standby portable terminal and apparatus thereof
US9097903B2 (en) * 2009-06-16 2015-08-04 Lg Electronics Inc. 3D display device and selective image display method thereof
US9869875B2 (en) * 2009-06-16 2018-01-16 Lg Electronics Inc. 3D display device and selective image display method thereof
US20120086774A1 (en) * 2009-06-16 2012-04-12 Sangwook Nam 3d display device and selective image display method thereof
US20150326849A1 (en) * 2009-06-16 2015-11-12 Lg Electronics Inc. 3d display device and selective image display method thereof
US20110187748A1 (en) * 2010-01-29 2011-08-04 Samsung Electronics Co. Ltd. Apparatus and method for rotating output image in mobile terminal
US9272218B2 (en) * 2010-10-19 2016-03-01 Sony Corporation Information processing system, information processing method, information processing program, computer-readable recording medium on which information processing program is stored
US20130198334A1 (en) * 2010-10-19 2013-08-01 Sony Computer Entertainment Inc. Information processing system, information processing method, information processing program, computer-readable recording medium on which information processing program is stored
US8947461B2 (en) * 2010-12-07 2015-02-03 Sharp Kabushiki Kaisha Display processing apparatus and display processing method
US20120139948A1 (en) * 2010-12-07 2012-06-07 Moriya Kinuko Display processing apparatus and display processing method
US20140218326A1 (en) * 2011-11-08 2014-08-07 Sony Corporation Transmitting device, display control device, content transmitting method, recording medium, and program
US9436289B2 (en) * 2011-11-08 2016-09-06 Sony Corporation Transmitting device, display control device, content transmitting method, recording medium, and program
US20130222400A1 (en) * 2011-12-28 2013-08-29 Samsung Electronics Co., Ltd. Image processing apparatus, upgrade apparatus, display system including the same, and control method thereof
US20130169625A1 (en) * 2011-12-28 2013-07-04 Samsung Electronics Co., Ltd. Image processing apparatus, upgrade apparatus, display system including the same, and control method thereof
CN105472277A (en) * 2011-12-28 2016-04-06 三星电子株式会社 Image processing apparatus, upgrade apparatus, display system including the same, and control method thereof
US9367890B2 (en) * 2011-12-28 2016-06-14 Samsung Electronics Co., Ltd. Image processing apparatus, upgrade apparatus, display system including the same, and control method thereof
US9396511B2 (en) * 2011-12-28 2016-07-19 Samsung Electronics Co., Ltd. Image processing apparatus, upgrade apparatus, display system including the same, and control method thereof
USD772253S1 (en) * 2013-02-19 2016-11-22 Sony Computer Entertainment Inc. Display panel or screen with an animated graphical user interface
US20160073098A1 (en) * 2014-09-10 2016-03-10 Continental Automotive Systems, Inc. Head-up display system using auto-stereoscopy 3d transparent electronic display
US20160343350A1 (en) * 2015-05-19 2016-11-24 Microsoft Technology Licensing, Llc Gesture for task transfer
US10102824B2 (en) * 2015-05-19 2018-10-16 Microsoft Technology Licensing, Llc Gesture for task transfer
US20220299763A1 (en) * 2021-03-19 2022-09-22 Seiko Epson Corporation Display system and display device

Also Published As

Publication number Publication date
EP2785049A1 (en) 2014-10-01
KR101635640B1 (en) 2016-07-05
EP2200279A1 (en) 2010-06-23
EP3570535A1 (en) 2019-11-20
US11375262B2 (en) 2022-06-28
US10965980B2 (en) 2021-03-30
US20210168427A1 (en) 2021-06-03
KR20100067296A (en) 2010-06-21
US20150181276A1 (en) 2015-06-25

Similar Documents

Publication Publication Date Title
US11375262B2 (en) Main image processing apparatus, sub image processing apparatus and control method thereof
US11749151B2 (en) Display apparatus and method for displaying
KR102420043B1 (en) Display apparatus and method for displaying
US9621434B2 (en) Display apparatus, remote control apparatus, and method for providing user interface using the same
US8988342B2 (en) Display apparatus, remote controlling apparatus and control method thereof
US20150193036A1 (en) User terminal apparatus and control method thereof
US20160231885A1 (en) Image display apparatus and method
US9811303B2 (en) Display apparatus, multi display system including the same, and control method thereof
US10810789B2 (en) Image display apparatus, mobile device, and methods of operating the same
US11367258B2 (en) Display device, user terminal device, display system including the same and control method thereof
US10284909B2 (en) Display apparatus, user terminal apparatus, system, and controlling method thereof
US20200186641A1 (en) Electronic device and control method therefor
EP2998838B1 (en) Display apparatus and method for controlling the same
KR20150144641A (en) user terminal apparatus and control method thereof
CN107852425B (en) Image display device and operation method thereof
US20160124606A1 (en) Display apparatus, system, and controlling method thereof
US20170180777A1 (en) Display apparatus, remote control apparatus, and control method thereof
US20130024792A1 (en) Information processing device, information processing method, and program
US9990106B2 (en) Electronic device, menu display method and storage medium
KR20140087787A (en) display apparatus and method for controlling the display apparatus therof
US20160364202A1 (en) Apparatus for outputting audio, driving method of apparatus for outputting audio, and non-transitory computer readable recording medium
TWI404343B (en) Signal processing system and button reset method
KR20190054397A (en) Display apparatus and the control method thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, CHANG-SOO;KWON, YONG-HWAN;KIM, JOON-HWAN;AND OTHERS;SIGNING DATES FROM 20091014 TO 20091028;REEL/FRAME:023503/0332

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION