WO2014202819A1 - An apparatus for a 3-d stylus-actuable graphical user interface and associated methods - Google Patents


Info

Publication number
WO2014202819A1
Authority
WO
WIPO (PCT)
Prior art keywords
stylus
application
information
display screen
user interface
Application number
PCT/FI2013/050658
Other languages
French (fr)
Inventor
Jari Olavi SAUKKO
Risto-Matti KUNNAS
Marja Pauliina Salmimaa
Original Assignee
Nokia Corporation
Application filed by Nokia Corporation filed Critical Nokia Corporation
Priority to PCT/FI2013/050658 priority Critical patent/WO2014202819A1/en
Publication of WO2014202819A1 publication Critical patent/WO2014202819A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An apparatus, the apparatus comprising at least one processor, and at least one memory including computer program code, the at least one memory and the computer program code configured, with the at least one processor, to cause the apparatus to perform at least the following: for a 3-D stylus-actuable graphical user interface element displayed on a display screen, enable the revealing of information associated with the 3-D stylus-actuable graphical user interface element based on 3-D stylus-actuation comprising one or more of: a determined angle of the 3-D stylus with respect to the display screen; and a determined size of the 3-D stylus. [Figure 5d]

Description

An Apparatus for a 3-D Stylus-Actuable Graphical User Interface and Associated Methods
Technical Field
The present disclosure relates to user interfaces, associated methods, computer programs and apparatus. Certain disclosed examples may relate to portable electronic devices, for example so-called hand-portable electronic devices which may be hand-held in use (although they may be placed in a cradle in use). Such hand-portable electronic devices include so-called Personal Digital Assistants (PDAs), mobile telephones, smartphones and other smart devices, and tablet PCs.
The portable electronic devices/apparatus according to one or more disclosed examples may provide one or more audio/text/video communication functions (e.g. tele-communication, video-communication, and/or text transmission (Short Message Service (SMS)/Multimedia Message Service (MMS)/e-mailing) functions), interactive/non-interactive viewing functions (e.g. web-browsing, navigation, TV/program viewing functions), music recording/playing functions (e.g. MP3 or other format and/or (FM/AM) radio broadcast recording/playing), downloading/sending of data functions, image capture function (e.g. using a (e.g. in-built) digital camera), and gaming functions.
Background
Electronic devices may allow users to interact with displayed objects in different ways. For example, a user may touch a touch sensitive display screen over a displayed object to interact with it. The interaction may cause, for example, an application associated with the displayed object to open, content associated with the displayed object to be displayed, or another function to be performed in relation to the displayed object.
The listing or discussion of a prior-published document or any background in this specification should not necessarily be taken as an acknowledgement that the document or background is part of the state of the art or is common general knowledge. One or more examples of the present disclosure may or may not address one or more of the background issues.
Summary
In a first example there is provided an apparatus comprising at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: for a 3-D stylus-actuable graphical user interface element displayed on a display screen, enable the revealing of information associated with the 3-D stylus-actuable graphical user interface element based on 3-D stylus-actuation comprising one or more of: a determined angle of the 3-D stylus with respect to the display screen; and a determined size of the 3-D stylus. For example, a user may hold a finger over (in touching contact or hovering over) a displayed application icon to reveal particular information about the associated application, and may hold two fingers over the application icon to reveal different particular information associated with the application. The application icon is a 3-D stylus-actuable graphical user interface element, and the user's finger(s) acts as a 3-D stylus. This is an example of information being revealed about a 3-D stylus-actuable graphical user interface element based on a determined size (in this example, the volume) of the 3-D stylus.
As another example, a user may hold a pen over (in touching contact or hovering over) a displayed contact entry in an address book, at a first angle with respect to the display screen, to reveal information about the particular contact (for example, the time and date of the last text-based communication with the contact). The user may hold the pen over (in touching or hovering over) the contact entry at a second different angle with respect to the display screen to reveal additional or alternative information about the particular contact (for example, the first line and/or subject of the last text-based communication with the contact as well as the time and date of transmission). The contact entry can be considered to be a 3-D stylus-actuable graphical user interface element, and the pen can be considered to be a 3-D stylus. This is an example of information being revealed associated with a 3-D stylus-actuable graphical user interface element based on a determined angle of the 3-D stylus with respect to the plane of the display screen.
The determined angle of the 3-D stylus with respect to the display screen may be one or more of: an angle of rotation within the plane of the display screen; and an angle of tilt with respect to the plane of the display screen. Thus, for example, the user may activate a graphical user interface element with a 3-D stylus. The user may vary the angle of rotation of the 3-D stylus within a plane parallel to that of the display screen, and/or vary the angle of tilt between the 3-D stylus and the plane of the display screen, to reveal particular information associated with the 3-D stylus-actuated graphical user interface element.
The 3-D stylus-actuable graphical user interface element may be configured to have an associated plurality of types of information available for revealing, and the apparatus may be configured such that the number of revealed particular types of information changes (e.g., increases or decreases) as one or more of: the determined angle of the 3-D stylus with respect to the display screen changes (e.g., increases or decreases); and the determined size of the 3-D stylus changes (e.g., increases or decreases). For example, a photograph may be displayed as a 3-D stylus-actuable graphical user interface element. If the user actuates the photograph using one finger (the 3-D stylus) then the time and date when the photograph was taken may be revealed. If the user uses two fingers to actuate the image then the time and date, as well as the location of the photograph capture, may be revealed because the volume (size) of the 3-D stylus has increased. If the user uses three fingers to actuate the image then the time and date, the location of the photograph capture, and the names of people tagged in the photograph may be revealed because the volume of the 3-D stylus has further increased. Thus in this example more types of information about the photograph (the time and date type, the location type, and the tagged people type) are progressively revealed as the size (i.e., the volume) of the 3-D stylus increases.
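The progressive reveal in the photograph example above could be sketched as follows. This is a minimal illustration, not part of the disclosure: the function name, the metadata labels, and the use of a finger count as a proxy for stylus size are all assumptions.

```python
# Illustrative sketch (not from the patent text): progressively reveal
# photograph metadata as the detected stylus size, approximated here by
# a finger count, increases (one/two/three-finger example above).
PHOTO_METADATA_LEVELS = [
    ["time_and_date"],                               # one finger
    ["time_and_date", "location"],                   # two fingers
    ["time_and_date", "location", "tagged_people"],  # three or more fingers
]

def revealed_types(finger_count):
    """Return the list of metadata types to reveal for a given stylus size."""
    # Clamp the finger count to the defined range of reveal levels.
    index = min(max(finger_count, 1), len(PHOTO_METADATA_LEVELS)) - 1
    return PHOTO_METADATA_LEVELS[index]
```

A one-finger actuation would reveal only the time and date, while any actuation of three or more fingers would reveal all three defined types.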
The 3-D stylus-actuable graphical user interface element may be configured to have an associated plurality of types of information available for revealing, and the apparatus may be configured such that a particular type of information is revealed according to one or more of: the particular determined angle of the 3-D stylus with respect to the display screen; and the particular determined size of the 3-D stylus. For example, actuating a photograph using a 3-D stylus held at an angle of 10° ± 5° from the plane of the display screen may reveal the time and date when the photograph was taken, and actuating a photograph using the 3-D stylus held at an angle of 30° ± 5° from the plane of the display screen may reveal the location where the photograph was taken. In this example a particular type of information is revealed according to the angle of tilt of the 3-D stylus with respect to the plane of the display screen. The particular determined angle may be considered to be a particular angular range in certain examples (for example, between 0° and 30° tilt angle from the plane of the display screen).
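The angle-band behaviour described above (10° ± 5° revealing one type, 30° ± 5° another) could be sketched like this; the band table and function names are illustrative assumptions only.

```python
# Illustrative sketch: map a determined tilt angle (degrees from the
# plane of the display screen) to a particular type of information,
# using the 10 deg +/- 5 deg and 30 deg +/- 5 deg ranges from the text.
ANGLE_BANDS = [
    ((5.0, 15.0), "time_and_date"),  # 10 deg +/- 5 deg
    ((25.0, 35.0), "location"),      # 30 deg +/- 5 deg
]

def metadata_for_tilt(tilt_deg):
    """Return the information type for a tilt angle, or None when the
    angle falls outside every defined band (nothing is revealed)."""
    for (low, high), info_type in ANGLE_BANDS:
        if low <= tilt_deg <= high:
            return info_type
    return None
```

Tilting the stylus from roughly 10° to roughly 30° would thus switch the revealed information from the capture time and date to the capture location.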
The apparatus may be configured such that the amount of a particular type of revealed information associated with the 3-D stylus-actuable graphical user interface element changes (e.g., increases or decreases) as one or more of: the determined angle of the 3-D stylus with respect to the display screen changes (e.g., increases or decreases); and the determined size of the 3-D stylus changes (e.g., increases or decreases). Thus, for example, as the tilt angle of the 3-D stylus decreases, with respect to the plane of the display screen, information revealed about an e-mail may be increased from a subject line, to a subject line plus the first sentence in the e-mail message, to a subject line plus the first paragraph in the e-mail message.
The determined size of the 3-D stylus may be one or more of: a detected footprint area of the 3-D stylus; and a detected volume of the 3-D stylus. Thus in some examples the 3-D volume of the 3-D stylus may be detected, and in some examples the effective 2-D footprint/shadow cast by the 3-D stylus may be detected. This detection may be done by the apparatus or by a different apparatus/device. One or more of the angle of the 3-D stylus with respect to the display screen and the size of the 3-D stylus may be determined by a 3-D capacitive touch sensor. One or more of the angle of the 3-D stylus with respect to the display screen and the size of the 3-D stylus may be determined by a visible camera, an infra-red camera, or a heat sensor.
The 3-D stylus may be at least one of: a pen, a finger, two fingers, three fingers, four fingers, and a thumb. The 3-D stylus may be at least one of: five fingers, six fingers, seven fingers, eight fingers, and two thumbs. For example, if a large display screen (such as a 10 inch / 25 cm sensing display screen) is used to detect a stylus, then a user may be able to use, for example, eight fingers and both thumbs as a stylus.
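The e-mail example, where a shallower tilt reveals a larger amount of one type of information, could be sketched as below. The threshold angles and the three preview levels are assumptions chosen for illustration; the disclosure does not specify particular values.

```python
# Illustrative sketch: reveal a growing amount of one type of information
# (an e-mail preview) as the tilt angle decreases toward the plane of the
# display screen. Threshold angles are illustrative assumptions.
def email_preview(subject, first_sentence, first_paragraph, tilt_deg):
    if tilt_deg > 60.0:   # steep tilt: subject line only
        return subject
    if tilt_deg > 30.0:   # moderate tilt: subject plus first sentence
        return subject + ": " + first_sentence
    # shallow tilt: subject plus first paragraph
    return subject + ": " + first_paragraph
```

Lowering the pen from near-vertical toward the screen plane would therefore expand the preview from the subject line alone to the subject line plus the opening paragraph.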
The size of the stylus may be limited by the detecting range / area of the stylus detector (such as the camera or 3-D capacitive sensing element).
A combination of different types of stylus may be used as a stylus. For example, a user may be able to use a plastic pen and a finger together as a stylus.
The revealed information associated with the 3-D stylus-actuable graphical user interface element may comprise one or more of:
metadata of a particular application associated with the 3-D stylus-actuable graphical user interface element;
historical information; and
information about a future event.
The metadata of a particular application associated with the 3-D stylus-actuable graphical user interface element may be associated with one or more of: a music player application, an image application, a movie application, a file manager application, an internet application, a communications application, an application icon, and an application widget. Metadata may be considered to be "data about data", or data that provides information about other data. Examples include the size of a file, the number of calls/messages received from a contact, and the time and date when a file was recorded.
The historical information may be associated with one or more of: an e-mailing application, a calling application, a messaging application, a chat application, a social media application, a news feed application, an image application, a movie application, a calendar application, an internet application, an application icon, and an application widget. Historical information may be considered to be information relating to a past event. Examples include the content of a received e-mail message, the recipient details and the date and time of transmission of a transmitted social media status, and the image recorded in a photograph.
The information about a future event may be associated with one or more of: a calendar application, an alarm application, a social media application, an application icon, and an application widget. Examples of information about a future event include the content of a calendar appointment set in the future, or the time of an alarm set for a future time.
The revealed information may be based on data received from a further source. The data may be, for example, geographical location data. The further data source may be a global positioning system (GPS) receiver.
For example, a user may be located in a particular main street of a city. The user may look at an electronic map currently displaying that main street. Buildings on the displayed map may be considered 3-D stylus-actuable graphical user interface elements. While located in the particular main street, the user may be able to reveal certain information relating to places shown on the displayed map which are also on the same particular street. That is, because the user is located in a place which is displayed on a map, the user may reveal certain information about places located on that street on the map. Such information may not be available for revealing in relation to other streets where the user is not currently located, for example. It may be that a first level of information is available for revealing for all locations, but that a second greater level of information is available for revealing in relation to locations matching a user's current location. The user's location may be determined for example using global positioning system (GPS) technology. The user's location may be determined to within a particular area, such as, for example, a building, a street, a group of streets, a region of a town, a town, or a country.
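The two-level, location-gated reveal described above could be sketched as follows; the function name, the street-level granularity, and the numeric levels are illustrative assumptions rather than anything the disclosure prescribes.

```python
# Illustrative sketch: gate the level of revealable information on
# whether a map element's street matches the user's current (e.g.
# GPS-derived) street. Two-level scheme is an assumption.
def reveal_level(element_street, user_street):
    """Return 2 (greater level of information) when the map element lies
    on the street where the user is currently located, otherwise 1
    (first level, available for all locations)."""
    if user_street is not None and element_street == user_street:
        return 2
    return 1
```

The same check could equally be made at a coarser granularity (a region, a town, or a country) by comparing those fields instead.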
The apparatus may be one or more of: a portable electronic device, a mobile phone, a smartphone, a tablet computer, a surface computer, a laptop computer, a personal digital assistant, a graphics tablet, a pen-based computer, a digital camera, a watch, a non-portable electronic device, a desktop computer, a monitor/display, a household appliance, a server, or a module for one or more of the same.
According to a further example there is provided a computer readable medium comprising computer program code stored thereon, the computer readable medium and computer program code being configured to, when run on at least one processor, perform at least the following:
for a 3-D stylus-actuable graphical user interface element displayed on a display screen, enable the revealing of information associated with the 3-D stylus-actuable graphical user interface element based on 3-D stylus-actuation comprising one or more of:
a determined angle of the 3-D stylus with respect to the display screen; and
a determined size of the 3-D stylus.
A computer program may be stored on a storage medium (e.g. on a CD, a DVD, a memory stick or other non-transitory medium). A computer program may be configured to run on a device or apparatus as an application. An application may be run by a device or apparatus via an operating system. A computer program may form part of a computer program product. Corresponding computer programs for implementing one or more of the methods disclosed are also within the present disclosure and encompassed by one or more of the described examples.
According to a further example, there is provided a method, the method comprising:
for a 3-D stylus-actuable graphical user interface element displayed on a display screen, enabling the revealing of information associated with the 3-D stylus-actuable graphical user interface element based on 3-D stylus-actuation comprising one or more of:
a determined angle of the 3-D stylus with respect to the display screen; and
a determined size of the 3-D stylus.
According to a further example there is provided an apparatus comprising: means for enabling, for a 3-D stylus-actuable graphical user interface element displayed on a display screen, the revealing of information associated with the 3-D stylus-actuable graphical user interface element based on 3-D stylus-actuation comprising one or more of:
a determined angle of the 3-D stylus with respect to the display screen; and
a determined size of the 3-D stylus.
The present disclosure includes one or more corresponding aspects, examples or features in isolation or in various combinations whether or not specifically stated (including claimed) in that combination or in isolation. Corresponding means and corresponding functional units (e.g., 3-D stylus position determiner, 3-D stylus angle determiner, 3-D stylus size determiner, information revealer) for performing one or more of the discussed functions are also within the present disclosure. The above summary is intended to be merely exemplary and non-limiting.
Brief Description of the Figures
A description is now given, by way of example only, with reference to the accompanying drawings, in which:
figure 1 illustrates an example apparatus comprising a number of electronic components, including memory and a processor, according to one example of the present disclosure;
figure 2 illustrates an example apparatus comprising a number of electronic components, including memory, a processor and a communication unit, according to another example of the present disclosure;
figure 3 illustrates an example apparatus comprising a number of electronic components, including memory and a processor, according to another example of the present disclosure;
figures 4a-4d illustrate an example of revealing information based on the angle of rotation and the stylus size, according to examples of the present disclosure;
figures 5a-5d illustrate an example of revealing different types and amounts of information based on the angle of tilt of a 3-D actuating stylus according to examples of the present disclosure;
figure 6 illustrates detection of a 3-D stylus according to examples of the present disclosure;
figures 7a-7b each illustrate an apparatus in communication with a remote computing element;
figure 8 illustrates a flowchart according to an example method of the present disclosure; and
figure 9 illustrates schematically a computer readable medium providing a program.
Description of Example Aspects
Electronic devices may allow users to interact with displayed objects in different ways. For example, a user may touch a touch sensitive display screen over a displayed application icon to interact with the icon and open the associated application. As another example, a user may be able to hover over a missed call message and return the call.
A person may wish to view information about a displayed object. For example, if a user wishes to view information about an application, the user may open that application and locate the required information. A user may be able to view information about an application without being required to open the application, for example by highlighting or selecting an application icon. In this case, the information displayed about the application would be a predetermined amount of information covering one or more predetermined types of information.
As an example, if a user wanted to view information relating to an e-mail application without opening the application, the user may be able to highlight/select the e-mail application icon, for example by touching or hovering over the e-mail application icon, or by clicking on it with a mouse pointer. The user may then be presented with particular information such as the number of unread e-mails, for example. If the user wanted to see any other details about these unread e-mails or other e-mails, then the user may be required to open the e-mail application and locate the information of interest. This may be inconvenient and time consuming for the user.
As another example, if a user wanted to view information about a playing movie without being interrupted while viewing the movie (for example by stopping the movie or navigating within the movie application), the user may be able to, for example, move a pointer over the playing movie. The user may then be presented with particular predetermined types and amounts of information, such as the current time position in the running length of the movie, the entire running time of the movie, the movie name, the location of the stored movie, the names of the main actors, and possibly other information. If the user only wanted to know the running time of the movie, then the provision of all the other additional information may be distracting and may prevent the user from concentrating on viewing the movie.
Examples discussed herein may be considered to, for a 3-D stylus-actuable graphical user interface element displayed on a display screen, enable the revealing of information associated with the 3-D stylus-actuable graphical user interface element based on 3-D stylus-actuation comprising one or more of: a determined angle of the 3-D stylus with respect to the display screen; and a determined size of the 3-D stylus.
Advantageously, a user may be able to determine/control what types of information, and/or the amount of a particular type or types of information, are revealed in association with a 3-D stylus-actuable graphical user interface element, depending on the particular way in which the user actuates the graphical user interface element with a 3-D stylus. In particular, by actuating the graphical user interface element with a 3-D stylus having a particular size and/or a particular angular orientation, the user can readily reveal the types of information of particular interest, at a level of detail of interest, simply by changing, for example, the angle of rotation, the tilt angle, or the volume of the 3-D stylus used.
Examples disclosed herein may provide a more intuitive and more rapid way for a user to reveal hidden content at a level of detail chosen by the user. If the 3-D stylus is the user's finger, then merely using two fingers rather than one may result in more information being revealed to the user. If the 3-D stylus is a pen, then merely by changing the rotation or tilt angle of the pen, more or less information may be revealed to the user depending on the user's particular requirements. The user need not, for example, navigate within an application or settings menu to select the type and amount of information to be revealed. The user need not, for example, select a particular icon or option according to the level of information he wishes to reveal. Thus, examples disclosed herein allow the user to intuitively interact with any displayed portion of an object of interest to reveal information about that object, at a level of detail chosen via the angle and size of the stylus used.
Other examples depicted in the figures have been provided with reference numerals that correspond to similar features of earlier described examples. For example, feature number 100 can also correspond to numbers 200, 300 etc. These numbered features may appear in the figures but may not have been directly referred to within the description of these particular examples. These have still been provided in the figures to aid understanding of the further examples, particularly in relation to the features of similar earlier described examples.
Figure 1 shows an apparatus 100 comprising memory 107, a processor 108, input I and output O. In this example only one processor and one memory are shown but it will be appreciated that other examples may utilise more than one processor and/or more than one memory (e.g. same or different processor/memory types).
In this example the apparatus 100 is an Application Specific Integrated Circuit (ASIC) for a portable electronic device with a touch sensitive display. In other examples the apparatus 100 can be a module for such a device, or may be the device itself, wherein the processor 108 is a general purpose CPU of the device and the memory 107 is general purpose memory comprised by the device. The display, in other examples, may not be touch sensitive.
The input I allows for receipt of signalling to the apparatus 100 from further components, such as components of a portable electronic device (like a touch-sensitive or hover-sensitive display) or the like. The output O allows for onward provision of signalling from within the apparatus 100 to further components such as a display screen, speaker, or vibration module. In this example the input I and output O are part of a connection bus that allows for connection of the apparatus 100 to further components.
The processor 108 is a general purpose processor dedicated to executing/processing information received via the input I in accordance with instructions stored in the form of computer program code on the memory 107. The output signalling generated by such operations from the processor 108 is provided onwards to further components via the output O.
The memory 107 (not necessarily a single memory unit) is a computer readable medium (solid state memory in this example, but may be other types of memory such as a hard drive, ROM, RAM, Flash or the like) that stores computer program code. This computer program code stores instructions that are executable by the processor 108, when the program code is run on the processor 108. The internal connections between the memory 107 and the processor 108 can be understood to, in one or more examples, provide an active coupling between the processor 108 and the memory 107 to allow the processor 108 to access the computer program code stored on the memory 107.
In this example the input I, output O, processor 108 and memory 107 are all electrically connected to one another internally to allow for electrical communication between the respective components I, O, 107, 108. In this example the components are all located proximate to one another so as to be formed together as an ASIC, in other words, so as to be integrated together as a single chip/circuit that can be installed into an electronic device. In other examples one or more or all of the components may be located separately from one another.
Figure 2 depicts an apparatus 200 of a further example, such as a mobile phone. In other examples, the apparatus 200 may comprise a module for a mobile phone (or PDA or audio/video player), and may just comprise a suitably configured memory 207 and processor 208. The example of figure 2 comprises a display device 204 such as, for example, a liquid crystal display (LCD), e-Ink or touch-screen user interface. The apparatus 200 of figure 2 is configured such that it may receive, include, and/or otherwise access data. For example, this example apparatus 200 comprises a communications unit 203, such as a receiver, transmitter, and/or transceiver, in communication with an antenna 202 for connecting to a wireless network and/or a port (not shown) for accepting a physical connection to a network, such that data may be received via one or more types of networks. This example comprises a memory 207 that stores data, possibly after being received via antenna 202 or port or after being generated at the user interface 205. The processor 208 may receive data from the user interface 205, from the memory 207, or from the communication unit 203. It will be appreciated that, in certain examples, the display device 204 may incorporate the user interface 205. Regardless of the origin of the data, these data may be outputted to a user of apparatus 200 via the display device 204, and/or any other output devices provided with the apparatus. The processor 208 may also store the data for later use in the memory 207. The memory 207 may store computer program code and/or applications which may be used to instruct/enable the processor 208 to perform functions (e.g. read, write, delete, edit or process data).
Figure 3 depicts a further example of an electronic device 300 comprising the apparatus 100 of figure 1. The apparatus 100 can be provided as a module for device 300, or even as a processor/memory for the device 300 or a processor/memory for a module for such a device 300. The device 300 comprises a processor 308 and a storage medium 307, which are connected (e.g. electrically and/or wirelessly) by a data bus 380. This data bus 380 can provide an active coupling between the processor 308 and the storage medium 307 to allow the processor 308 to access the computer program code. It will be appreciated that the components (e.g. memory, processor) of the device/apparatus may be linked via cloud computing architecture. For example, the storage device may be a remote server accessed via the internet by the processor.
The apparatus 100 in figure 3 is connected (e.g. electrically and/or wirelessly) to an input/output interface 370 that receives the output from the apparatus 100 and transmits this to the device 300 via data bus 380. Interface 370 can be connected via the data bus 380 to a display 304 (touch-sensitive or otherwise) that provides information from the apparatus 100 to a user. The display 304 can be part of the device 300 or can be separate. The device 300 also comprises a processor 308 configured for general control of the apparatus 100 as well as the device 300 by providing signalling to, and receiving signalling from, other device components to manage their operation.
The storage medium 307 is configured to store computer code configured to perform, control or enable the operation of the apparatus 100. The storage medium 307 may be configured to store settings for the other device components. The processor 308 may access the storage medium 307 to retrieve the component settings in order to manage the operation of the other device components. The storage medium 307 may be a temporary storage medium such as a volatile random access memory. The storage medium 307 may also be a permanent storage medium such as a hard disk drive, a flash memory, a remote server (such as cloud storage) or a non-volatile random access memory. The storage medium 307 could be composed of different combinations of the same or different memory types.

Figures 4a-4d illustrate examples of an apparatus/device 400 in use. Figure 4a shows the apparatus/device 400 before any user input is performed. The display screen 402 is displaying a series of icons/tiles 404 each corresponding to/associated with a particular application. In this example the user is interested in viewing information relating to the messaging application represented by the messaging icon/tile 406.
Between figures 4b and 4c, the angle of rotation φ of the stylus (user's finger) in the plane of the display screen is varied to vary the number of types of information revealed. Between figures 4c and 4d, the size of the stylus (user's finger(s)) is varied to vary the number of types of information revealed. It will be appreciated that in different examples, only one criterion, two criteria, or three criteria (from the stylus size, the stylus angle of rotation φ and the stylus angle of tilt θ) may be considered in determining the information to be revealed.
The apparatus/device 400 is configured to enable the revealing of information associated with a 3-D stylus-actuable graphical user interface element 404, 406 based on 3-D stylus-actuation, for a 3-D stylus-actuable graphical user interface element 404, 406 displayed on a display screen 402. The revealing of information is based on one or more of a determined angle of rotation of the 3-D stylus within the plane of the display screen; and a determined size of the 3-D stylus. In figure 4b, the user is holding the apparatus/device 400 and is holding his thumb 408 over the messaging tile/icon 406. The user's thumb is a 3-D stylus. The particular rotation angle φ of the user's thumb 408 is at approximately 90° to the currently vertical edge 410 of the apparatus/device 400. This particular angle of rotation of the 3-D stylus 408 within the plane of the display screen 402 is associated with revealing the metadata information of the number of unread messages available for viewing 412. The particular angle may be determined within a predefined tolerance range of angles (for example, the user may position his 3-D stylus within ±10° from the 90° angle and the angle may be detected as being associated with revealing the metadata indicating the number of unread messages 412). In this example there are currently five unread messages. In this example the particular user input made by the user's thumb in this position also causes the associated tile/icon to be magnified.
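By way of illustration, the association between the determined rotation angle φ and the information to be revealed may be sketched as a simple banded lookup. The angle bands, the ±10° tolerance and the "level" numbering below are assumptions made for this sketch, not values mandated by the disclosure:

```python
def info_level_for_rotation(phi_degrees, tolerance=10.0):
    """Map a stylus rotation angle (within the display plane) to a reveal level.

    Level 1: metadata only (e.g. the unread-message count), near 90 degrees.
    Level 2: metadata plus historical information, in the 0-30 degree band.
    Level 0: no reveal (angle outside any recognised band).
    """
    phi = phi_degrees % 180  # a rotation of 180 degrees is equivalent to 0
    if abs(phi - 90) <= tolerance:
        return 1
    if 0 <= phi <= 30:
        return 2
    return 0
```

Under this sketch, the thumb of figure 4b (roughly 90° to the edge) would map to level 1, while the finger(s) of figures 4c and 4d (in the 0°-30° band) would map to level 2.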
In figure 4c, the user is holding one finger 414 over the messaging tile/icon 406. The user's finger is a 3-D stylus. The particular rotation angle φ of the user's finger 414 is between 0° and 30° to the currently vertical edge 410 of the apparatus/device 400. This angle of rotation within the plane of the display screen 402 of the 3-D stylus 414 in this particular rotation angular range is associated with revealing both metadata 416 and historical information 418, 420. The metadata information revealed is the number of unread messages available for viewing 416 (in this example, five unread messages are indicated). The historical information revealed is the name of the sender of each unread message 418, and the first few words of each unread message 420. It may be considered that between figures 4b and 4c, the angle of rotation φ of the stylus has been varied so as to increase the number of types of information revealed about the messaging application tile/icon 406.

In figure 4d, the user is holding two fingers 424 over the messaging tile/icon 406. The user's two fingers form a 3-D stylus. The particular rotation angle φ of the user's fingers is the same as that shown in figure 4c, namely between 0° and 30° to the currently vertical edge 410 of the apparatus/device 400. This angle of rotation within the plane of the display screen 402 of the 3-D stylus 424 in this particular rotation angular range is associated with revealing both metadata 416 and historical information 418, 420. The metadata information revealed, as before, is the number of unread messages available for viewing 416.
The size of the 3-D stylus between figures 4c and 4d has increased from one finger to two fingers. This increase in the 3-D stylus size causes an increase in the number of types of historical information revealed due to the 3-D stylus actuation of the messaging tile/icon 406. The historical information revealed includes the name of the sender of each unread message 418, and the first few words of each unread message 420, as when one finger is used as a 3-D stylus. Additionally, due to the increased 3-D stylus size, further types of historical information are revealed, namely the particular type of message received 426 (such as SMS, MMS, or chat message) and the time and day when the message was received 428.
It may be considered that between figures 4c and 4d, the size of the stylus has been varied so as to increase the number of types of information revealed about the messaging application tile/icon 406. In the examples of figures 4c and 4d, the historical information 418, 420, 426, 428 of only the first three unread messages is revealed. The user is able to scroll down the pop-up list to see the information of the other two unread messages (using a scroll arrow 422). In other examples the information for all five unread messages may be revealed with no need for scrolling. It will be appreciated that in some examples, only the angle of rotation φ may be considered regardless of the stylus size/volume, or indeed the angle of tilt θ of the stylus (discussed in relation to figures 5a-5d). Also, in some examples, only the stylus size (the footprint area and/or the volume) may be considered, regardless of the angle of rotation φ of the stylus or indeed the angle of tilt θ of the stylus. The increase in the number of types of historical information revealed due to the increase in 3-D stylus size (from one finger to two fingers) may be considered to be an increase in the number of revealed particular types of information as the size of the 3-D stylus increases. For example, if the user uses three fingers as a 3-D stylus over the messaging tile/icon 406, then more types of historical information may be revealed, such as an avatar for each contact, for example. This change/increase can occur in discrete steps or progressively as the determined size and/or angle changes.
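A minimal sketch of the size criterion, assuming the stylus size is reported as a finger count and that each additional finger reveals two further historical-information fields; the field names and the two-per-finger rule are illustrative assumptions, not part of the disclosure:

```python
FIELDS = ["sender", "preview", "message type", "time received", "avatar"]

def revealed_fields(finger_count):
    """Return the historical-information fields revealed for a stylus size.

    One finger reveals the first two fields; each extra finger adds two more,
    capped at the full list. This models the discrete-step variant of the
    behaviour described above; a progressive variant could instead scale the
    count continuously with the determined footprint area or volume.
    """
    count = min(2 * finger_count, len(FIELDS))
    return FIELDS[:count]
```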
In some examples the revealed information may be displayed for the duration of the 3-D stylus actuation of the associated 3-D stylus-actuable graphical user interface element. After the 3-D stylus actuation is terminated (for example, the user's finger/pen is removed from the display screen and is no longer detected) then the revealed information may be removed from display. In some examples the revealed information may be displayed for the duration of the 3-D stylus actuation of the associated 3-D stylus-actuable graphical user interface and for a predetermined period of time after the termination of the 3-D stylus actuation. For example, a user may be able to reveal information using a 3-D stylus actuation, and may be able to then remove the 3-D stylus actuating input and further interact with the revealed information for a predetermined period of time (for example three seconds). Such interactions may for example allow the user to scroll through a displayed list using a scroll arrow 422, or to select an item in a revealed list to further interact with it. The further interaction may be, for example, to view a revealed unread message in full, or to open the associated application to allow the user to read a particular message and reply to it.
Figures 5a-5d illustrate examples of an apparatus/device 500 in use. The display screen 502 is showing a photo gallery application displaying previews of three photographs 504, 506 in an album. A central photograph is shown as a large preview image 504 and two photographs 506 either side of the central photograph are shown as thumbnail images. In this example the user is interested in viewing information relating to the central photograph 504.
The central photograph 504 in this example is a 3-D stylus-actuable graphical user interface element, and is configured to have an associated plurality of types of information available for revealing. The apparatus/device 500 is configured to enable the revealing of information associated with the 3-D stylus-actuable graphical user interface element 504 displayed on a display screen 502 based on 3-D stylus-actuation. The revealing of information in this example is based on a determined angle of tilt between the 3-D stylus 550, 552 and the plane of the display screen 502.
Figure 5a shows a user holding a finger 550 (a 3-D stylus) at an angle θ1 to the plane of the display screen 502. At this angle θ1 the finger is approximately normal to the plane of the display screen 502, within a tolerance (for example, a tolerance of ±10°). At this tilt angle the apparatus/device 500 is configured to reveal the metadata of the date when the photograph was recorded 508.
Figure 5b shows a user holding his finger 550 at a shallower angle θ2 to the plane of the display screen 502. This angle θ2 is a smaller angle than the angle θ1 shown in figure 5a. At this decreased tilt angle the apparatus/device 500 is configured to reveal the metadata of the date when the photograph was recorded 508 as well as the additional metadata of the location where the photograph was recorded 510.
Figure 5c shows a user holding a finger 550 at a shallower angle still, θ3, to the plane of the display screen 502. This angle θ3 is a smaller angle than the angle θ2 shown in figure 5b. At this further decreased tilt angle the apparatus/device 500 is configured to reveal the metadata of the date 508, location 510, and also the additional metadata of the weather conditions 512 at the photograph location when the photograph 504 was captured. Thus in this example it may be considered that the 3-D stylus-actuable graphical user interface element 504 is configured to have an associated plurality of types of information 508, 510, 512 available for revealing, and the apparatus/device 500 is configured such that the number of revealed particular types of information progressively increases as the determined angle θ1, θ2, θ3 between the 3-D stylus 550 and the display screen 502 decreases.
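The progressive tilt-based reveal of figures 5a-5c may be sketched as follows; the threshold angles (80° and 45°) are illustrative assumptions standing in for the boundaries between θ1, θ2 and θ3, which the disclosure does not quantify:

```python
METADATA = ["date recorded", "location", "weather conditions"]

def metadata_for_tilt(theta_degrees):
    """Return the metadata revealed for a given tilt angle to the screen plane.

    A stylus roughly normal to the screen reveals one field; shallower tilt
    angles progressively reveal more, as in figures 5a-5c.
    """
    if theta_degrees >= 80:  # approximately normal to the display screen
        return METADATA[:1]
    if theta_degrees >= 45:
        return METADATA[:2]
    return METADATA[:3]
```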
In figure 5d, the user is holding two fingers 552 at the angle θ3 to the plane of the display screen 502. Because the size of the 3-D stylus (the user's two fingers) has increased, compared with the one-finger 3-D stylus used in the example of figure 5c, the apparatus/device 500 is configured to reveal an increased amount of information relating to some types of metadata. Further location information 520 is revealed (the country (Sweden) and the latitude and longitude coordinates) and further weather condition 522 information is revealed (the minimum and maximum temperatures recorded in the location where the photograph was recorded on the day the photograph was recorded). The apparatus/device 500 is also configured in this example to display further types of metadata due to the increased size of the 3-D stylus at the particular tilt angle θ3 of the 3-D stylus. The further metadata types revealed are the name of the person who took the photograph 514 (Matt Harman), the name of the album 516 in which the photograph is located ("Christmas visit to Lund"), and the storage medium 518 in which the photograph is stored (the "data cloud").
Thus in this example it may be considered that the apparatus/device 500 is configured such that the amount of a particular type of revealed information 510, 512 associated with the 3-D stylus-actuable graphical user interface element 504 increases as the determined size of the 3-D stylus 550, 552 increases.
Also, in figure 5d, the angle of tilt θ3 of the user's fingers 552 and the size of the user's two fingers 552 as a 3-D stylus has been detected and the apparatus/device 500 has used the detected stylus tilt angle θ3 and size to reveal a series of three actuable buttons 524, 526, 528. In this example the user is able to interact with the revealed information, in this example to: share the associated photograph 504 by selecting the "share" button 524; to delete the associated photograph 504 by selecting the "delete" button 526; and to view more options for interacting with the associated photograph 504 by selecting the "more" button 528.
In this example the revealed information 508, 510, 512, 514, 516, 518, 520, 522, 524, 526, 528 remains displayed for a predetermined delay period (for example, of five seconds) after removal of the 3-D stylus 550, 552 so that the user can make a further user input and select a selectable option to interact with the displayed content and/or revealed information/option buttons. In some examples, it may be that, if the revealed content is for display purposes only and cannot be interacted with, then this information is removed from display when the actuating 3-D stylus input is removed; and that the revealed content remains displayed for a period of time after removal of the 3-D stylus actuating user input if the revealed content can be interacted with (such as the content including a selectable button or a message which may be opened in full if selected).
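The removal policy described above (display-only content removed as soon as the actuating input ends, interactive content persisting for a delay period) may be sketched as follows; the five-second grace period matches the example given, while the class structure itself is an assumption made for this sketch:

```python
class RevealedContent:
    """Tracks whether revealed content should remain on display."""

    GRACE_SECONDS = 5.0

    def __init__(self, interactive):
        self.interactive = interactive
        self.stylus_removed_at = None  # None while the stylus still actuates

    def on_stylus_removed(self, now):
        self.stylus_removed_at = now

    def should_display(self, now):
        if self.stylus_removed_at is None:
            return True   # 3-D stylus actuation is still in progress
        if not self.interactive:
            return False  # display-only content is removed immediately
        return (now - self.stylus_removed_at) < self.GRACE_SECONDS
```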
In this example the user need not perform an interaction with the revealed content 524, 526, 528 by using a stylus 550, 552 at the same tilt angle and having the same size as that used to initially reveal the content. For example, after revealing the content as in figure 5d, the user may simply tap a revealed button 524, 526, 528 with one or more fingers, or a pen, for example, at any stylus angle, to interact with that button. In other examples, the user may be required to use the same angle and/or size of stylus to interact with the revealed content as those used to reveal the content.
It will be appreciated that the above examples illustrate particular ways of revealing particular information using a 3-D stylus, but that any combination of varying stylus angle and/or size, and particular information, for a wide variety of different 3-D stylus actuable graphical user interface elements, may be achieved.
As an example, an apparatus/device with a display screen may display a music video and output the accompanying music. The music video can be considered a 3-D stylus-actuable graphical user interface element. A user may hold a finger (a 3-D stylus) parallel to the plane of the display screen at approximately a right angle to an edge of a display screen (e.g., as per figure 4b) while the video is playing to view, for example, the name of the song being played (the song name is an example of metadata associated with the video file). As the user gradually changes the orientation of his finger by rotating it, further information may be displayed. Thus the revealed song name may be displayed along with the band name, gradually followed by the album name, the year of recording the album, the names of the band members, and the album artwork, as the user's finger rotates. If the user gradually rotates his finger back to the starting position the revealed information may gradually be removed from display.

As another example, an apparatus/device with a display screen may display a widget for a social media application. If the user hovers a finger over the widget, then the most recently received social media status update may be revealed (for example, "Janet: I had a great time at the concert with Jake last night!"). This status update is an example of historical content, as it is the content of a social media message/status update which has been received in the past. As the user's finger gradually changes tilt angle with respect to the plane of the display screen, forming a shallower tilt angle, progressively older social media updates may also, or alternatively, be revealed. Also in this example, if the user uses two fingers as a 3-D stylus, then both the content of the social media message(s) and the time when the message(s) were received may be revealed to the user.
This additional time of receipt information is revealed due to the increased size of the 3-D stylus (two fingers rather than one finger).
As another example, an apparatus/device may display a calendar application (or an icon/widget associated with a calendar application). If the user hovers a pen over the calendar application icon/widget, then the number of upcoming appointments scheduled for the week ahead may be displayed to the user. If the user then uses a finger with a greater size (e.g., larger volume) than the pen, then the details of the next immediately upcoming appointment may be revealed. Then, if the user uses two fingers as a 3-D stylus, having a greater volume than one finger, then the next two upcoming appointments may be revealed. Also in this example, it may be that if the user's finger(s) are oriented to lie within one particular range of rotation angles within the plane of the display screen, then the time and location of the calendar meeting(s) are revealed. If the user's finger(s) are rotated to lie within a different particular range of rotation angles, then the time, location, meeting title, and names of other attendees of the meeting(s) are revealed. Calendar information such as the location, time, title and attendees may be considered to be future information, as it relates to an event which is scheduled for a time in the future. The number of calendar appointments scheduled for the week ahead may be considered to be metadata, as this is information about the information stored in the calendar.
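The calendar example's rotation-band behaviour may be sketched as follows; the specific 0°-30° and 60°-90° bands and the field sets are assumptions made for illustration, since the disclosure names the fields but not the angle ranges:

```python
def calendar_fields(phi_degrees):
    """Return the appointment fields revealed for a given rotation band."""
    if 0 <= phi_degrees <= 30:
        return ["time", "location"]
    if 60 <= phi_degrees <= 90:
        return ["time", "location", "title", "attendees"]
    return []  # no recognised rotation band: reveal nothing extra
```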
The above examples relate to revealing metadata relating to a communications (messaging) application, an image application, a movie application, a social media application and a calendar application. Of course, the metadata may be revealed for a wide variety of applications, including other communications applications (e-mail, chat, news feed, and calling (telephonic and internet-based)), music player applications, file manager/explorer applications, internet and internet browsing applications, and marketplace applications (such as online retail and application stores).
The above examples relate to revealing historical information associated with a messaging application, an image application, and a social media application. Of course, the historical information may be associated with a wide variety of applications, such as e-mailing applications, calling applications, chat applications, news feed applications, movie applications, calendar applications, and internet browsing applications.
The above examples relate to revealing information about a future event associated with a calendar application, but of course other such applications relating to future events include alarm applications and social media applications for which information may be revealed. The user may be able to use a 3-D stylus to interact with, for example, an application icon, an application widget, a displayed image or movie, a list entry, a text string, or other graphical user interface element.
The apparatus/device illustrated is a portable electronic device. In other examples, the apparatus/device may be one or more of: a mobile phone, a smartphone, a tablet computer, a surface computer, a laptop computer, a personal digital assistant, a graphics tablet, a pen-based computer, a digital camera, a watch, a non-portable electronic device, a desktop computer, a monitor/display, a household appliance, a server, or a module for one or more of the same.
Figure 6 illustrates detection of 3-D stylus actuation according to examples of the present disclosure. The display screen 602 of an apparatus/device 600 may be (or be overlaid by) a 3-D capacitive touch sensor layer which is able to detect both touch user inputs (whereby a user physically contacts the screen with a finger, pen, or other stylus) and hover user inputs (whereby a stylus is held within a sensing field around the 3-D capacitive touch sensor without necessarily making contact with the layer). The capacitive touch sensor is able to generate a capacitive field, which may be understood as a virtual mesh 604. The capacitive field/virtual mesh 604 is generated in the area surrounding the display screen 602 up to a distance from the screen 602 of, for example, 5 cm. The capacitive field/virtual mesh 604 may extend past the edges of the display screen 602 in the plane of the display screen 602.
The 3-D capacitive touch sensor is able to detect the size of an object 606, and the orientation (rotation and/or tilt angle) of an object 606 within the virtual mesh 604. In some examples the size of an object/3-D stylus 606 may be determined as a volume, by determining a virtual deformation of the virtual mesh due to an object 606 being positioned within the mesh 604. In some examples the size of an object/3-D stylus 606 may be determined as a footprint area, by determining a 2-D area of the sensor over which an object 606 is positioned. This may be performed by calculating an effective footprint of an object 606 located over the layer. The 3-D capacitive touch sensor may be able to determine the shape, location, movements and speed of movement of the object 606 based on detection of the object within the virtual mesh 604.
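A footprint-area determination of the kind described may be sketched as a thresholded cell count over the sensor grid; the grid representation, normalised readings, threshold and per-cell area below are all assumptions made for illustration:

```python
def footprint_area(readings, cell_area_mm2=1.0, threshold=0.5):
    """Estimate the 2-D footprint area (in mm^2) of an object over the sensor.

    `readings` is a 2-D list of normalised capacitance deviations in [0, 1];
    cells whose deviation exceeds the threshold are counted as covered.
    """
    covered = sum(1 for row in readings for value in row if value > threshold)
    return covered * cell_area_mm2
```

A volume-style determination could proceed analogously by summing the deviation values themselves (a proxy for mesh deformation) rather than counting covered cells.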
A 3-D capacitive touch sensor may be able to detect styluses which are not in physical contact with the sensor, and may be able to detect styluses of different materials in contact with the sensor. Thus, for example, the user may be able to operate the sensor/device even if he is wearing gloves, or he is wearing a plaster/bandage on his finger (the stylus).
In other examples, the size and/or angle of a 3-D stylus 606 located over a display screen 602 may be determined using a visible camera, an infra-red camera, or a heat sensor. For example, a front-facing camera facing in the same direction as the display screen 602 may record images of a 3-D stylus 606 over the display screen 602 and from these images, determine a 3-D stylus size and/or angular orientation. As another example, an infra-red camera or a heat sensor may be able to detect the presence of a user's fingers 606 and/or thumb over the display screen 602 and determine a size and/or angular orientation of the user's fingers/thumb 606. Two or more detection apparatus may be used in combination in some examples.
Different regions of a finger/thumb may be used as a stylus, such as a fingernail, finger pad, or knuckle. In some examples the angular orientation of the user's finger may be determined by a fingernail algorithm which can use the determined position of a detected fingernail to calculate the angular orientation of the user's finger. This may be performed if the user's finger (or fingernail) is hovering without contacting a sensor/display screen, and/or if the user's finger (or fingernail) is in physical contact with the sensor/display screen. In some examples the angular orientation of the user's finger may be determined by an algorithm which can use the determined positions of detected finger joints to calculate the angular orientation of the user's finger joints. Thus the user may be able to reveal information based on, for example, the angles of the first and second joints of a finger.
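One way such a fingernail algorithm might derive the in-plane rotation angle is from the vector between the contact-area centroid and the detected fingernail position; this geometric approach is purely illustrative and is not a method specified by the disclosure:

```python
import math

def finger_rotation(centroid, nail):
    """Return the in-plane rotation angle (degrees, 0-360) of the finger axis.

    `centroid` is the (x, y) centre of the detected contact/hover area and
    `nail` is the detected (x, y) fingernail position; the finger is assumed
    to point from the centroid towards the nail.
    """
    dx = nail[0] - centroid[0]
    dy = nail[1] - centroid[1]
    return math.degrees(math.atan2(dy, dx)) % 360
```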
The size and/or angle of a 3-D stylus 606 may be determined using a sensing layer, such as a 3-D capacitive sensing layer. In some examples the sensing layer may be located overlying a display screen. In some examples the sensing layer may not be located over a display screen but may be remote from a display screen. For example, the sensing layer may be a wearable sensing element such as an element incorporated into a jacket sleeve or other clothing/accessory, or may be a remote sensing element/device such as a touchpad.
The apparatus/device disclosed herein may be one or more of: a portable electronic device, a mobile phone, a smartphone, a tablet computer, a surface computer, a laptop computer, a personal digital assistant, a graphics tablet, a pen-based computer, a digital camera, a watch, a non-portable electronic device, a desktop computer, a monitor/display, a household appliance, a server, or a module for one or more of the same. In some examples, the apparatus/device may be a deformable apparatus/device, such as a rollable, flexible and/or bendable apparatus/device. In some examples the apparatus/device may have more than one display screen.
Figure 7a shows an example of an apparatus 700 in communication 706 with a remote server 704. Figure 7b shows an example of an apparatus 700 in communication 706 with a "cloud" 710 for cloud computing. In figures 7a and 7b, apparatus 700 (which may be apparatus 100, 200 or 300) is also in communication 708 with a further apparatus 702. The further apparatus 702 may be a 3-D capacitive touch sensor, or a camera, for example. In other examples, the apparatus 700 and further apparatus 702 may both be comprised within a device such as a portable communications device or computer. Communication may be via a communications unit, for example.
Figure 7a shows the remote computing element to be a remote server 704, with which the apparatus 700 may be in wired or wireless communication (e.g. via the internet, Bluetooth, NFC, a USB connection, or any other suitable connection as known to one skilled in the art). In figure 7b, the apparatus 700 is in communication with a remote cloud 710 (which may, for example, be the Internet, or a system of remote computers configured for cloud computing).
For example, the further apparatus 702 may be a 3-D capacitive touch sensor and may detect distortions in its surrounding field caused by a proximal object such as a 3-D stylus. The measurements may be transmitted via the apparatus 700 to a remote server 704 for processing and the processed results, indicating an on-screen volume/orientation of the 3-D stylus, may be transmitted to the apparatus 700. As another example, the further apparatus 702 may be a camera and may capture images of a user's finger positions in front of the camera. The images may be transmitted via the apparatus 700 to a cloud 710 for (e.g., temporary) recordal and processing. The processed results, indicating an angular position of the user's finger(s), may be transmitted back to the apparatus 700. In some examples, information such as metadata, historical information (e.g., message content and sender information) and/or information about a future event (e.g., time and location information about an alarm) which can be revealed by a 3-D stylus actuating a 3-D stylus-actuable graphical user interface element may be stored remotely. In other examples the second apparatus 702 may also be in direct communication with the remote server 704 or cloud 710, for example to transmit stylus position and/or size information for processing.
Figure 8 illustrates a method 800 according to an example of the present disclosure. The method comprises, for a 3-D stylus-actuable graphical user interface element displayed on a display screen, enabling the revealing of information associated with the 3-D stylus-actuable graphical user interface element based on 3-D stylus-actuation comprising one or more of: a determined angle of the 3-D stylus with respect to the display screen; and a determined size of the 3-D stylus.
Figure 9 illustrates schematically a computer/processor readable medium 900 providing a program according to an example of this disclosure. In this example, the computer/processor readable medium is a disc such as a Digital Versatile Disc (DVD) or a compact disc (CD). In other examples, the computer readable medium may be any medium that has been programmed in such a way as to carry out the functionality herein described. The computer program code may be distributed between multiple memories of the same type, or multiple memories of different types, such as ROM, RAM, flash, hard disk, solid state, etc.
Any mentioned apparatus/device/server and/or other features of particular mentioned apparatus/device/server may be provided by apparatus arranged such that they become configured to carry out the desired operations only when enabled, e.g. switched on, or the like. In such cases, they may not necessarily have the appropriate software loaded into the active memory in the non-enabled (e.g. switched-off) state and may only load the appropriate software in the enabled (e.g. switched-on) state. The apparatus may comprise hardware circuitry and/or firmware. The apparatus may comprise software loaded onto memory. Such software/computer programs may be recorded on the same memory/processor/functional units and/or on one or more memories/processors/functional units.
In some examples, a particular mentioned apparatus/device/server may be pre-programmed with the appropriate software to carry out desired operations, and wherein the appropriate software can be enabled for use by a user downloading a "key", for example, to unlock/enable the software and its associated functionality. Advantages associated with such examples can include a reduced requirement to download data when further functionality is required for a device, and this can be useful in examples where a device is perceived to have sufficient capacity to store such pre-programmed software for functionality that may not be enabled by a user.
Any mentioned apparatus/circuitry/elements/processor may have other functions in addition to the mentioned functions, and these functions may be performed by the same apparatus/circuitry/elements/processor. One or more disclosed aspects may encompass the electronic distribution of associated computer programs and computer programs (which may be source/transport encoded) recorded on an appropriate carrier (e.g. memory, signal).
Any "computer" described herein can comprise a collection of one or more individual processors/processing elements that may or may not be located on the same circuit board, or the same region/position of a circuit board or even the same device. In some examples one or more of any mentioned processors may be distributed over a plurality of devices. The same or different processor/processing elements may perform one or more functions described herein.
The term "signalling" may refer to one or more signals transmitted as a series of transmitted and/or received electrical/optical signals. The series of signals may comprise one, two, three, four or even more individual signal components or distinct signals to make up said signalling. Some or all of these individual signals may be transmitted/received by wireless or wired communication simultaneously, in sequence, and/or such that they temporally overlap one another.
With reference to any discussion of any mentioned computer and/or processor and memory (e.g. including ROM, CD-ROM etc), these may comprise a computer processor, Application Specific Integrated Circuit (ASIC), field-programmable gate array (FPGA), and/or other hardware components that have been programmed in such a way as to carry out the inventive function.

The applicant hereby discloses in isolation each individual feature described herein and any combination of two or more such features, to the extent that such features or combinations are capable of being carried out based on the present specification as a whole, in the light of the common general knowledge of a person skilled in the art, irrespective of whether such features or combinations of features solve any problems disclosed herein, and without limitation to the scope of the claims. The applicant indicates that the disclosed aspects/examples may consist of any such individual feature or combination of features. In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the disclosure.
While there have been shown and described and pointed out fundamental novel features as applied to examples thereof, it will be understood that various omissions and substitutions and changes in the form and details of the devices and methods described may be made by those skilled in the art without departing from the scope of the disclosure. For example, it is expressly intended that all combinations of those elements and/or method steps which perform substantially the same function in substantially the same way to achieve the same results are within the scope of the disclosure. Moreover, it should be recognized that structures and/or elements and/or method steps shown and/or described in connection with any disclosed form or examples may be incorporated in any other disclosed or described or suggested form or example as a general matter of design choice. Furthermore, in the claims, means-plus-function clauses are intended to cover the structures described herein as performing the recited function, and not only structural equivalents but also equivalent structures. Thus, although a nail and a screw may not be structural equivalents in that a nail employs a cylindrical surface to secure wooden parts together, whereas a screw employs a helical surface, in the environment of fastening wooden parts, a nail and a screw may be equivalent structures.

Claims

1. An apparatus comprising:
at least one processor; and
at least one memory including computer program code,
the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following:
for a 3-D stylus-actuable graphical user interface element displayed on a display screen, enable the revealing of information associated with the 3-D stylus-actuable graphical user interface element based on 3-D stylus-actuation comprising one or more of:
a determined angle of the 3-D stylus with respect to the display screen; and
a determined size of the 3-D stylus.
2. The apparatus according to claim 1, wherein the determined angle between the 3-D stylus and the display screen is one or more of: an angle of rotation within the plane of the display screen; and an angle of tilt with respect to the plane of the display screen.
3. The apparatus according to claim 1, wherein the 3-D stylus-actuable graphical user interface element is configured to have an associated plurality of types of information available for revealing, and the apparatus is configured such that the number of revealed particular types of information changes as one or more of:
the determined angle between the 3-D stylus and the display screen changes; and
the determined size of the 3-D stylus changes.
4. The apparatus according to claim 1, wherein the 3-D stylus-actuable graphical user interface element is configured to have an associated plurality of types of information available for revealing, and the apparatus is configured such that a particular type of information is revealed according to one or more of:
the particular determined angle between the 3-D stylus and the display screen; and
the particular determined size of the 3-D stylus.
5. The apparatus according to claim 1, wherein the apparatus is configured such that the amount of a particular type of revealed information associated with the 3-D stylus-actuable graphical user interface element changes as one or more of:
the determined angle between the 3-D stylus and the display screen changes; and
the determined size of the 3-D stylus changes.
6. The apparatus according to claim 1, wherein the determined size of the 3-D stylus is one or more of: a detected footprint area of the 3-D stylus; and a detected volume of the 3-D stylus.
7. The apparatus according to claim 1, wherein one or more of the angle between the 3-D stylus and the display screen and the volume of the 3-D stylus is determined by a 3-D capacitive touch sensor.
8. The apparatus according to claim 1, wherein the 3-D stylus is at least one of: a pen, a finger, two fingers, three fingers, four fingers, and a thumb.
9. The apparatus according to claim 1, wherein the revealed information associated with the 3-D stylus-actuable graphical user interface element comprises one or more of:
metadata of a particular application associated with the 3-D stylus-actuable graphical user interface element;
historical information; and
information about a future event.
10. The apparatus according to claim 9, wherein the metadata of a particular application associated with the 3-D stylus-actuable graphical user interface element is associated with one or more of: a music player application, an image application, a movie application, a file manager application, an internet application, a communications application, an application icon, and an application widget.
11. The apparatus according to claim 9, wherein the historical information is associated with one or more of: an e-mailing application, a calling application, a messaging application, a chat application, a social media application, a news feed application, an image application, a movie application, a calendar application, an internet application, an application icon, and an application widget.
12. The apparatus according to claim 9, wherein the information about a future event is associated with one or more of: a calendar application, an alarm application, a social media application, an application icon, and an application widget.
13. The apparatus according to claim 1, wherein the apparatus is one or more of: a portable electronic device, a mobile phone, a smartphone, a tablet computer, a surface computer, a laptop computer, a personal digital assistant, a graphics tablet, a pen-based computer, a digital camera, a watch, a non-portable electronic device, a desktop computer, a monitor/display, a household appliance, a server, or a module for one or more of the same.
14. A method comprising:
for a 3-D stylus-actuable graphical user interface element displayed on a display screen, enabling the revealing of information associated with the 3-D stylus-actuable graphical user interface element based on 3-D stylus-actuation comprising one or more of:
a determined angle of the 3-D stylus with respect to the display screen; and
a determined size of the 3-D stylus.
15. A computer readable medium comprising computer program code stored thereon, the computer readable medium and computer program code being configured to, when run on at least one processor, perform at least the following: for a 3-D stylus-actuable graphical user interface element displayed on a display screen, enable the revealing of information associated with the 3-D stylus-actuable graphical user interface element based on 3-D stylus-actuation comprising one or more of:
a determined angle of the 3-D stylus with respect to the display screen; and
a determined size of the 3-D stylus.
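Claims 3 and 5 above couple the number of revealed information types to the determined stylus angle and the amount revealed within a type to the determined stylus size. The following sketch is purely illustrative and forms no part of the claims or description: all function names, thresholds, and data are invented to show one way such a mapping could behave, assuming the angle and footprint have already been determined (e.g. by a 3-D capacitive touch sensor as in claim 7).

```python
# Hypothetical sketch of the claimed behaviour. None of these names,
# thresholds, or values appear in the patent; they are illustrative only.

from dataclasses import dataclass

@dataclass
class StylusState:
    tilt_deg: float        # determined angle of tilt with respect to the display plane
    footprint_mm2: float   # determined footprint area of the 3-D stylus

# Hypothetical information types associated with a UI element (cf. claim 9)
INFO_TYPES = ["metadata", "historical", "future_event"]

def types_to_reveal(state: StylusState) -> list:
    """More types of information are revealed as the tilt angle grows
    (cf. claim 3: the number of revealed types changes with the angle)."""
    if state.tilt_deg < 15:
        count = 1
    elif state.tilt_deg < 45:
        count = 2
    else:
        count = 3
    return INFO_TYPES[:count]

def amount_to_reveal(state: StylusState, items: list) -> list:
    """A larger footprint reveals more items of a given type
    (cf. claim 5: the amount changes with the determined size)."""
    if state.footprint_mm2 < 50:
        n = 1
    elif state.footprint_mm2 < 150:
        n = 3
    else:
        n = len(items)
    return items[:n]

# Example: a shallow, narrow stylus contact reveals little; a steep,
# broad contact (e.g. several fingers, cf. claim 8) reveals more.
history = ["call from Alice", "call from Bob", "missed call", "voicemail"]
shallow = StylusState(tilt_deg=10, footprint_mm2=40)
steep = StylusState(tilt_deg=60, footprint_mm2=200)

print(types_to_reveal(shallow))            # ['metadata']
print(types_to_reveal(steep))              # ['metadata', 'historical', 'future_event']
print(amount_to_reveal(shallow, history))  # ['call from Alice']
print(amount_to_reveal(steep, history))    # all four items
```

The thresholds here are arbitrary; the claims deliberately leave the particular angle/size-to-information mapping open, requiring only that the revealed types or amounts change as the determined angle or size changes.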
Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/FI2013/050658 WO2014202819A1 (en) 2013-06-17 2013-06-17 An apparatus for a 3-d stylus-actuable graphical user interface and associated methods

Publications (1)

Publication Number Publication Date
WO2014202819A1 (en) 2014-12-24

Family

ID=48808388


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007037809A1 (en) * 2005-09-16 2007-04-05 Apple, Inc. Operation of a computer with touch screen interface
US20100044121A1 (en) * 2008-08-15 2010-02-25 Simon Steven H Sensors, algorithms and applications for a high dimensional touchpad
US20120068941A1 (en) * 2010-09-22 2012-03-22 Nokia Corporation Apparatus And Method For Proximity Based Input


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 13739468; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 13739468; Country of ref document: EP; Kind code of ref document: A1)