CA2922985A1 - Extendable blade sequence along pannable canvas direction - Google Patents

Extendable blade sequence along pannable canvas direction

Info

Publication number
CA2922985A1
Authority
CA
Canada
Prior art keywords
blade
canvas
user
user interface
journey
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA2922985A
Other languages
French (fr)
Inventor
Stephen Michael Danton
Vishal R. Joshi
Karandeep Singh Anand
William J. Staples
Nafisa BHOJAWALA
Brendyn ALEXANDER
Brad Olenick
Jonah Bush STERLING
Leon Ezequiel Welicki
Madhur Joshi
Jon Harris
Justin Beckwith
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Microsoft Technology Licensing LLC
Publication of CA2922985A1
Status: Abandoned


Classifications

    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 16/3329: Natural language query formulation or dialogue systems
    • G06F 16/2428: Query predicate definition using graphical user interfaces, including menus and forms
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/0485: Scrolling or panning
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 40/117: Tagging; Marking up; Designating a block; Setting of attributes
    • G06F 8/34: Graphical or visual programming
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 9/452: Remote windowing, e.g. X-Window System, desktop virtualisation
    • G06F 9/4856: Task life-cycle resumption being on a different machine, e.g. task migration, virtual machine migration

Abstract

A user interface that includes a canvas that extends in one direction. An activation mechanism may be used to generate an initial blade in the canvas. A blade is a user interface element that occupies a portion of the canvas in the dimension along which the canvas extends. The blade includes multiple selectable elements that each have a corresponding blade. The user interface includes an extension mechanism configured to present a corresponding subsequent blade on the canvas perhaps adjacent to the prior blade when a selected element from a prior blade is selected, the subsequent blade also including multiple selectable elements that may be hierarchically structured. In this manner, blade chains may be created representative of a journey that the user has taken since initiating the first blade.

Description

EXTENDABLE BLADE SEQUENCE ALONG PANNABLE CANVAS
DIRECTION
BACKGROUND
[0001] A current paradigm for navigating through various information contexts is windows based. A classic example of this is the web browser experience. A user might begin with a home page that occupies the entire browser space. The user might then select a hyperlink, whereupon a new window appears. However, the previous window either disappears or, in the case of exercising an option to open the new page in a new window, the previous window is fully, or at least partially, hidden.
BRIEF SUMMARY
[0002] At least some embodiments described herein relate to a user interface that includes a canvas that extends in one direction. An activation mechanism may be used to generate an initial blade in the canvas. A blade is a user interface element that occupies a portion of the canvas in the dimension along which the canvas extends. For instance, the blade might occupy a majority or even all of the canvas in the portion to which it is assigned. The blade includes multiple selectable elements that each have a corresponding blade. If the selectable element is selected, then the corresponding blade is also presented on the canvas. For instance, the new blade might be displayed adjacent the first blade in the extendable direction of the canvas. The elements in the first blade might be hierarchically structured so that a selectable element in the first blade might actually contain one or more other selectable elements.
[0003] The user interface might more generally include an extension mechanism configured to present a corresponding subsequent blade on the canvas when a selected element from a prior blade is selected, the subsequent blade also including multiple selectable elements that may be hierarchically structured. In this manner, long chains of blades may be created representative of a journey that the user has taken since initiating the first blade.
[0004] This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] In order to describe the manner in which the above-recited and other advantages and features can be obtained, a more particular description of various embodiments will be rendered by reference to the appended drawings.
Understanding that these drawings depict only sample embodiments and are not therefore to be considered to be limiting of the scope of the invention, the embodiments will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
[0006] Figure 1 abstractly illustrates a computing system in which some embodiments described herein may be employed;
[0007] Figure 2A abstractly illustrates an environment in which a portion of a pannable canvas is displayed on a displayable area of a display, and a portion of the canvas is not displayed in the displayable area of the display;
[0008] Figure 2B illustrates the user interface of Figure 2A in a state in which the user has selected an activation control to activate an initial blade in a journey;
[0009] Figure 2C illustrates the user interface of Figure 2B in a state in which the user has selected a selectable element of the initial blade to add an additional blade;
[0010] Figure 3 illustrates a supporting architecture for a user interface of Figures 2A through 2C;
[0011] Figure 4 illustrates a flowchart of a method for generating a user interface that has a pannable canvas and that presents a sequence of blades representing a user journey through information;
[0012] Figure 5 illustrates a user interface in which a portion of a canvas is displayed;
[0013] Figure 6 illustrates a user interface resulting from the interaction of the user with an activation control;
[0014] Figure 7 illustrates a user interface that shows how information that is global may still be accessed without having to leave the context of the journey;
[0015] Figure 8 illustrates a user interface that is similar to that of Figure 6, except that a command window is activated;
[0016] Figure 9 illustrates a user interface having a single blade in the journey, but in which that single blade has a command space activated;
[0017] Figure 10 illustrates a user interface having a single blade in the journey, in a state in which the user is able to select a selectable element from that single blade;
[0018] Figure 11 illustrates a user interface that results from the user selecting the selectable element, and in which a second blade is correspondingly appended to form a sequence of blades in the journey;
[0019] Figure 12 illustrates the second blade of Figure 11 only in full representation mode;
[0020] Figure 13 illustrates a journey view in which a journey may be viewed in its entirety, saved, pinned, shared, or closed;
[0021] Figure 14 illustrates a flow associated with presenting a selectable item or part, which then issues commands to underlying information resources;
[0022] Figure 15 illustrates a part presented in different sizes and shapes, exposing different information in different forms depending on the size and shape;
[0023] Figure 16 illustrates a notification as it might appear if not associated with any of the activation controls in the favorites area;
[0024] Figure 17 illustrates the activation of a temporary area and an initial blade in response to selection of the notification;
[0025] Figure 18 illustrates an initial state in which the user selects to create a website;
[0026] Figure 19 illustrates a creation pane in which the user enters information about the website and is given options for adding onto the original resource requested;
[0027] Figure 20 illustrates a creation pane in which the user selects a database to be associated with the website, and the system automatically selects an additional add-on of a connection string between the website and the database;
[0028] Figure 21 illustrates the creation pane in which the user enters information about the database; and
[0029] Figure 22 illustrates the creation pane in which the user enters information about the connection string between the website and the database.
DETAILED DESCRIPTION
[0030] At least some embodiments described herein relate to a user interface that includes a canvas that extends in one direction along a pannable direction. An activation mechanism may be used to generate an initial blade in the canvas. A blade is a user interface element that occupies a portion of the canvas in the pannable dimension along which the canvas extends. For instance, the blade might occupy a majority or even all of the canvas in the portion to which it is assigned. The blade includes multiple selectable elements that each have a corresponding blade. If the selectable element is selected, then the corresponding blade is also presented on the canvas. For instance, the new blade might be displayed adjacent the first blade in the extendable direction of the canvas. The elements in the first blade might be hierarchically structured so that a selectable element in the first blade might actually contain one or more other selectable elements.
[0031] The user interface might more generally include an extension mechanism configured to present a corresponding subsequent blade on the canvas when a selected element from a prior blade is selected, the subsequent blade also including multiple selectable elements that may be hierarchically structured. In this manner, long chains of blades may be created representative of a journey that the user has taken since initiating the first blade.
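Purely as an illustration (the patent itself defines no code), the canvas, blade, and journey concepts described in the preceding two paragraphs might be modeled along the following lines in TypeScript; every name here (Canvas, Blade, SelectableElement, activate, extend) is hypothetical.

```typescript
// Hypothetical model of a pannable canvas holding a journey of blades.
interface SelectableElement {
  id: string;
  // The blade that would be appended to the journey if this element were selected.
  correspondingBlade: () => Blade;
  // Selectable elements may be hierarchically structured.
  children?: SelectableElement[];
}

interface Blade {
  title: string;
  elements: SelectableElement[];
}

class Canvas {
  // The populated portion of the canvas: the ordered chain of blades
  // built up through user interaction (the "journey").
  readonly journey: Blade[] = [];

  // Activation mechanism: present the initial blade on the canvas.
  activate(initial: Blade): void {
    this.journey.length = 0;
    this.journey.push(initial);
  }

  // Extension mechanism: selecting an element of a prior blade appends
  // that element's corresponding blade in the extendable direction
  // (modeled here as the end of the array).
  extend(selected: SelectableElement): Blade {
    const next = selected.correspondingBlade();
    this.journey.push(next);
    return next;
  }
}
```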
[0032] Some introductory discussion of a computing system will be described with respect to Figure 1. Then, example user interfaces, methods and supporting architectures will be described with respect to subsequent figures.
[0033] Computing systems are now increasingly taking a wide variety of forms.
Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered a computing system. In this description and in the claims, the term "computing system" is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions that may be executed by the processor. The memory may take any form and may depend on the nature and form of the computing system. A computing system may be distributed over a network environment and may include multiple constituent computing systems.
[0034] As illustrated in Figure 1, in its most basic configuration, a computing system 100 typically includes at least one processing unit 102 and memory 104. The memory 104 may be physical system memory, which may be volatile, non-volatile, or some combination of the two. The term "memory" may also be used herein to refer to non-volatile mass storage such as physical storage media. If the computing system is distributed, the processing, memory and/or storage capability may be distributed as well.
As used herein, the term "executable module" or "executable component" can refer to software objects, routines, or methods that may be executed on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads).
[0035] In the description that follows, embodiments are described with reference to acts that are performed by one or more computing systems. If such acts are implemented in software, one or more processors of the associated computing system that performs the act direct the operation of the computing system in response to having executed computer-executable instructions. For example, such computer-executable instructions may be embodied on one or more computer-readable media that form a computer program product. An example of such an operation involves the manipulation of data.
The computer-executable instructions (and the manipulated data) may be stored in the memory 104 of the computing system 100. Computing system 100 may also contain communication channels 108 that allow the computing system 100 to communicate with other message processors over, for example, network 110.
[0036] The computing system 100 also includes a display 112 on which a user interface, such as the user interfaces described herein, may be rendered. Such user interfaces may be generated in computer hardware or other computer-represented form prior to rendering. The presentation and/or rendering of such user interfaces may be performed by the computing system 100 by having the processing unit(s) 102 execute one or more computer-executable instructions that are embodied on one or more computer-readable media. Such computer-readable media may form all or a part of a computer program product.
[0037] Embodiments described herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below.
Embodiments described herein also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
[0038] Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other tangible medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
[0039] A "network" is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium.

Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
Combinations of the above should also be included within the scope of computer-readable media.
[0040] Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a "NIC"), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
[0041] Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
[0042] Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
In a distributed system environment, program modules may be located in both local and remote memory storage devices.
[0043] Figure 2A abstractly illustrates an environment 200A in which a portion 201A of a canvas 202 is displayed on a displayable area 210 of a display, and a portion 201B of the canvas is not displayed in the displayable area 210 of the display. As an example, the display might be, for instance, the display 112 of the computing system 100 of Figure 1. The canvas 202 extends in an extendable direction 221 along a single dimension 222. For instance, in Figures 2A through 2C, the extendable direction 221 is rightward, and the single dimension 222 is horizontal.
[0044] The user interface might include a panning control 231 that the user may manipulate in order to pan the canvas 202 along the single dimension 222 so as to gain a view of a desired portion of at least a populated portion of the canvas 202.
Figure 3 illustrates a supporting architecture 300 for a user interface 301. For instance, the user interface 301 is an example of the user interface of Figures 2A through 2C.
The supporting architecture 300 is illustrated as including a panning mechanism 311 which operates to pan the canvas 202 in response to user direction.
[0045] In some embodiments, the single dimension 222 may be configurable to be vertical or horizontal according to the tastes of the user. Also, the extendable direction 221 may be configurable. For instance, if the dimension 222 is vertical, the user might configure the canvas to extend downward, or upward, according to the user's preference.
If the dimension 222 is horizontal, the user might configure the canvas to extend leftward, or rightward, according to the user's preference. The panning control 231 might not activate until the populated portion of the canvas extends to a degree that the populated portion of the canvas cannot be displayed all at once. The supporting architecture 300 illustrates a configuration mechanism 312 that is configured to respond to such user preference configuration.
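A rough sketch of how the configuration mechanism 312 and the conditional activation of the panning control 231 might be expressed is given below; the names, types, and units are invented for illustration and are not part of the disclosure.

```typescript
// Hypothetical configuration of the pannable dimension and extendable
// direction, plus the condition under which the panning control activates.
type Dimension = "horizontal" | "vertical";
type Direction = "leftward" | "rightward" | "upward" | "downward";

interface CanvasConfig {
  dimension: Dimension; // the single pannable dimension (222)
  direction: Direction; // the extendable direction (221)
}

// The panning control need not activate until the populated portion of
// the canvas no longer fits within the displayable area.
function panningEnabled(populatedExtent: number, displayableExtent: number): boolean {
  return populatedExtent > displayableExtent;
}

// Example: a horizontally panning canvas that grows rightward.
const config: CanvasConfig = { dimension: "horizontal", direction: "rightward" };
console.log(config, panningEnabled(2400, 1280)); // panning needed once blades overflow the display
```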
[0046] As used herein, a "populated" portion of the canvas means that portion of the canvas that contains the sequence of one or more blades built up through user interaction with the canvas. In accordance with the principles described herein, as the user interfaces with the canvas, one or more and potentially many blades may be added to the canvas.
Some or all of these blades may be added contiguously. As such, the "extendable direction" of the canvas refers to the direction in which new blades are added to the prior sequence of one or more contiguous blades upon interacting with selectable elements from the prior blade.
[0047] In this description and in the claims, a "blade" is a user interface element that occupies a position within the canvas. In this sense, a "position" is a range within the single dimension of the canvas. For instance, if the pannable dimension 222 is the horizontal dimension, as in the more specific examples to follow, the blade would occupy the canvas 202 from a left position to a right position. The blade "occupies the position"
within the canvas in the sense that between the two boundaries of the blade, the blade occupies at least a majority of the canvas within those two boundaries for each of a majority of points between the two boundaries including at least the two boundaries themselves. In some embodiments, and in the more concrete examples to follow, the blade occupies all of the canvas between the two boundaries, which are left and right boundaries since the dimension is the horizontal dimension in the concrete examples to follow.
[0048] The user interface 210 also includes an activation control 232 that may be activated by a user to present an initial blade on the canvas 202. Each blade may include one or multiple selectable elements. Each selectable element has a corresponding blade such that if the user selects the corresponding selectable element, the corresponding blade appears on the canvas as a next blade on the canvas. For instance, in Figure 2B, the environment 200B is similar to that of environment 200A, except that the selection of activation control 232 causes an initial blade 241 to appear in the canvas 202. The initial blade includes multiple selectable controls 251A, 251B and 251C. Figure 3 illustrates that the supporting architecture includes an activation mechanism 313 that activates the initial blade 241 in response to the user interfacing with the activation control 232. Furthermore, the blade correlator 314 maintains a correlation between a selectable element and a blade. Accordingly, the architecture 300 has knowledge of which blades correspond to each of the selectable controls 251A, 251B, and 251C.
[0049] The user interface 210 also includes an extension mechanism 315, which causes one or more additional blades to be spawned off of the initial blade 241 in response to selection of one of the selectable elements 251A, 251B and 251C. For instance, suppose that the user selects selectable element 251A; the extension mechanism 315 might then reference the corresponding blade identified by the correlator 314, and cause the corresponding blade to appear on the canvas 202. For instance, in Figure 2C, the environment 200C shows that the new blade 242 is caused to appear next to the first blade 241. The new blade 242 likewise has selectable controls 252A and 252B.
Accordingly, this process may be repeated, with the user interfacing with selectable controls from a prior blade and thereby causing a new blade to be added to the sequence of blades. In this sense, the sequence of blades may be thought of as a journey that the user has taken beginning with the first blade.
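The division of labor among the panning mechanism 311, configuration mechanism 312, activation mechanism 313, blade correlator 314, and extension mechanism 315 just described could be sketched as a set of interfaces such as the following; the shapes are assumptions made for illustration, not the patent's definitions.

```typescript
// Hypothetical interfaces for the components of the supporting
// architecture 300; the reference numerals appear only in comments.
interface Blade {
  title: string;
  elementIds: string[];
}

interface PanningMechanism {                 // 311
  pan(offset: number): void;
}

interface ConfigurationMechanism {           // 312
  setDimension(dimension: "horizontal" | "vertical"): void;
}

interface ActivationMechanism {              // 313
  activate(activationControlId: string): Blade; // presents the initial blade
}

interface BladeCorrelator {                  // 314
  bladeFor(selectableElementId: string): Blade; // element -> corresponding blade
}

interface ExtensionMechanism {               // 315
  extend(journey: Blade[], selectableElementId: string): Blade;
}

// A correlator might be little more than a lookup table.
const correlations = new Map<string, Blade>([
  ["251A", { title: "blade 242", elementIds: ["252A", "252B"] }],
]);
const correlator: BladeCorrelator = {
  bladeFor: (id) => correlations.get(id)!,
};
```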
[0050] In some embodiments, each blade in the sequence is associated with an information context. The user may thus have at their view a sequence of blades representing a sequence of information contexts that the user has navigated through. Thus, the canvas represents an area that includes visual representations of various information contexts the user has reviewed as part of an information journey.
[0051] In some embodiments, the selectable elements within a particular blade are hierarchically structured. For instance, one selectable element might have one or more child selectable elements. In addition, those child selectable elements might have further child selectable elements, and so forth. Each selectable element has a corresponding blade representation. Accordingly, when choosing a selectable element to select in a given prior blade, the user has fine-grained control over what kind of blade is to show next. Examples of this hierarchical structuring of selectable elements will be described in subsequent concrete examples.
[0052] Figure 4 illustrates a flowchart of a method 400 for generating a user interface. As the method 400 may be performed by the architecture 300 of Figure 3 in the environments of Figures 2A through 2C, the method 400 will now be described with frequent reference to Figures 2A through 2C and Figure 3, as well as, of course, Figure 4.
[0053] The method 400 includes providing a canvas that extends in a pannable direction in the user interface (act 401). For instance, in Figures 2A through 2C, the canvas 202 is provided with a pannable dimension 222 and extends in an extendable direction 221. If an initial blade is not activated through user interaction with the activation control 232 ("No" in decision block 402), the canvas 202 might remain in the state of Figure 2A with no blades populating the canvas.
[0054] However, upon detecting activation of the activation control 232 ("Yes" in decision block 402), the initial blade corresponding to the activation is presented on the canvas (act 403). For instance, in Figure 2B, the initial blade 241 is presented in response to activation of activation control 232. If none of the selectable controls of the initial blade are selected ("No" in decision block 404), then the sequence of blades might extend no further. On the other hand, if the user interacts with one of the selectable controls of the prior blade ("Yes" in decision block 404), then the corresponding blade is found (act 405), and presented (act 406). For instance, in Figure 2C, the additional blade 242 is presented.
As each newly added blade also has selectable elements, this process may repeat to build up a large sequence of blades, each representing a different information context. A more concrete example of the user interface will now be described with respect to Figure 5 and numerous subsequent figures.
[0055] Figure 5 illustrates a user interface 500 in which a portion of a canvas 501 is displayed. The user interface 500 is an example of what a user interface might look like after completing act 401 of Figure 4. In this example, the panning dimension is the horizontal dimension. In other words, the canvas may be panned left and right within at least the populated portion of the canvas. Also, in this example, the extendable direction is towards the right. In other words, at least a majority (and in this example all) of the new blades that are to be appended to the sequence of one or more existing blades in the journey are appended at the right end of the sequence.
[0056] The canvas includes a collection 511 of activation controls. The collection might be referred to hereinafter as "favorites". In the illustrated case, there are eight activation controls. Each of the activation controls in the collection 511 is associated with an information context. An information context might be, for example, a web site, a project, a patent tool, or any other context in which information regarding a subject might be provided.
[0057] The user initiates a journey by selecting one of the activation controls. For instance, the user might select the activation control 512 using cursor 513, or by contacting the activation control 512 in the case of a touch screen. This would result in decision block 402 of Figure 4 branching on the "Yes" branch.
[0058] Figure 6 illustrates a user interface 600 resulting from the interaction of the user with the activation control 512. Note that a blade 601 has been presented (as an example of act 403). The blade 601 represents the beginning of a sequence of blades (or a journey) that the user has just embarked upon. The blade 601 is associated with the information context that was associated with the activation control 512 used to spawn the blade 601. In other words, the blade 601 provides a view into data in the information context. The activation control 512 might be visually highlighted so as to give the user the contextual reference that the blade 601 was generated by activation of the activation control 512, versus any of the other activation controls 511.
[0059] In one embodiment, the blade 601 (or any other blade in the journey, for that matter) is scrollable in the direction perpendicular to the panning dimension. For instance, in Figure 6, if the blade 601 were taller than the canvas, then the blade 601 might be scrolled upward and downward to reveal the vertical extent of the blade 601.
[0060] The blade 601 includes a hierarchy of selectable elements, each selectable element having a corresponding blade representation (not shown in Figure 6) which is displayed if the selectable element is selected. At the top of the hierarchy is what is called a "lens" in this example. If, for example, the information context were a web site, the lens might present at a high level the information that might be normally presented when selecting a tab of the web site. For instance, the blade 601 is illustrated as including two lenses 611 and 612.
[0061] At the next level in the hierarchy, each lens includes one or more parts. For instance, lens 611 includes parts 621 through 624, and lens 612 includes parts 625 through 627. Each part represents a view on an aspect of the data, and may be defined by a collection of underlying data and a view definition that defines how that data is to be exposed on the user interface.
[0062] At the lowest level of the hierarchy of selectable elements is an item. Each part may have one or more selectable items. To avoid unnecessarily over-labeling Figure 6, only one selectable item is labeled. For instance, selectable part 621 is illustrated as having selectable item 631.
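One hypothetical way to express the lens, part, and item hierarchy of paragraphs [0060] through [0062] as data types is sketched below; the type names and fields are invented for illustration.

```typescript
// Hypothetical types for the three-level hierarchy of selectable elements.
interface Item {
  kind: "item";
  label: string;
}

interface Part {
  kind: "part";
  label: string;
  data: unknown;            // the underlying data the part views
  viewDefinition: string;   // how that data is exposed, e.g. "chart" or "list"
  items: Item[];
}

interface Lens {
  kind: "lens";
  label: string;
  parts: Part[];
}

// A blade is composed of one or more lenses; any lens, part, or item
// can be selected to spawn the next blade in the journey.
interface Blade {
  title: string;
  lenses: Lens[];
}

type Selectable = Lens | Part | Item;

const exampleLens: Lens = {
  kind: "lens",
  label: "Monitoring",
  parts: [{ kind: "part", label: "Events this week", data: [], viewDefinition: "chart", items: [] }],
};
```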
[0063] A new blade might be launched within the journey by the user selecting any of the lenses 611 and 612, any of the selectable parts 621 through 627, or any of the selectable items within any of the parts. Thus, the user has a large number of options for progressing in the journey through information and the canvas.
[0064] For instance, suppose that the information context of the blade 601 was a patent tool associated with a particular patent application. In that case, the selectable lenses, parts, and items of the blade might allow the user to see summary information in the blade 601 regarding the initial disclosure documents, legal notes, status of the application, actual patent application text, which countries or regions the patent application is being pursued in, and so forth. If the user wanted to see any further details about any of those aspects, the user might then select the element, spawning a new blade that shows further details and selectable items.
[0065] Figure 7 illustrates a user interface 700 that shows how information that is global may still be accessed without having to leave the context of the journey. In particular, if the user selects control 701, a navigation pane 702 appears.
Thus, the user might, for instance, see what other team members are doing (a global task) while remaining within a word processing experience (a journey context). If desired, the user might select an item within the navigation pane 702 to start a new journey, thus enabling rapid switching between journeys.
[0066] Commanding may be performed in a consistent way across all artifacts in the user interface. For instance, if a command window is activated, context is used to determine what the prospective target for commands is. For instance, if the cursor is above a particular lens, the command window may then appear and show commands tailored towards operation on the lens given the type of the resource or item represented by the lens. In accordance with embodiments described herein, the command window shell might appear the same regardless of what the prospective target is for the commands in the command window, though the commands themselves may differ as appropriate given the prospective target. For instance, the command window might be the same regardless of whether the prospective target is the background canvas, a particular blade, a particular lens, a particular part, or a particular item.
[0067] Figure 8 illustrates a user interface 800 that is similar to that of Figure 6, except that a command window 801 is activated. The command window 801 has a primary commands section 811, a secondary commands section 812, and an ecosystem peek section 813. For instance, the primary commands section 811 might include the most frequently used commands given a type of the prospective target. The secondary commands section 812 might include other non-primary commands.
[0068] The ecosystem peek section might include information items that are not commands, but rather other information (such as notifications or messages) related to the prospective target. For instance, the ecosystem peek section might include an identification of when the prospective target was last updated, and whether anyone else is editing the prospective target. Furthermore, the ecosystem peek information may include help information that provides a quick insight into how to use the prospective target and where to get more information about the prospective target.
[0069] The end user can interface with a command in the command space or a command window, and see the command implementation as a declarative model. The user might then extend the declarative model using the same means used to implement the declarative model. Thus, users may author their own commands or extend other commands.
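A minimal sketch of the command window of paragraphs [0066] through [0069] follows, assuming a declarative command description keyed by target type; the field names and the split between primary and secondary commands are placeholders, not the disclosed design.

```typescript
// Hypothetical declarative command model and command window layout.
interface CommandDeclaration {
  id: string;
  label: string;
  appliesTo: string[];                  // resource/item types this command targets
  execute: (targetId: string) => void;
}

interface CommandWindow {
  primary: CommandDeclaration[];        // most frequently used for the target type
  secondary: CommandDeclaration[];      // remaining, non-primary commands
  ecosystemPeek: string[];              // non-command information, e.g. "last updated ..."
}

// The window shell is the same for every prospective target; only the
// commands differ according to the target's type. The slice below is a
// naive stand-in for "most frequently used".
function buildCommandWindow(targetType: string, all: CommandDeclaration[]): CommandWindow {
  const relevant = all.filter((c) => c.appliesTo.includes(targetType));
  return { primary: relevant.slice(0, 3), secondary: relevant.slice(3), ecosystemPeek: [] };
}

// Because commands are declared as data, a user-authored command is just
// another declaration added to the same collection.
```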
[0070] Figure 9 illustrates a user interface 900 that might appear if the user selects the control 632 of the blade 601 in Figure 6. The control may be present in each blade and may be used to activate a command space within and corresponding to that blade, with commands tailored towards the type of the resource or item represented by the blade. For instance, in Figure 9, upon selecting control 632 of the blade 601, the command space 901 appears, allowing the user to execute commands on the resource or item represented by the blade 601.
[0071] Figure 10 illustrates a user interface 1000 that is the same as the user interface 700 of Figure 7, except to show that the user is about to select the part 622 of the blade 601. The selection of part 622 is an instance of the method 400 travelling along the "Yes" branch in decision block 404. In response, the corresponding blade of the part 622 is found (act 405), and presented (act 406). For instance, the user interface 1100 of Figure 11 might appear. Here, the entire pannable canvas has shifted to the left, aligned with the left side of the previous blade 601, and the new blade 1101 is caused to appear. The new blade 1101 also has a number of selectable elements, thereby allowing the user an opportunity to further extend the sequence of blades if so desired.
[0072] When a new blade is launched, that new blade may be positioned by default on the canvas adjacent the last blade in a sequence of one or more blades that constitute the journey up to that point. However, default positions may vary by blade. For instance, the blade might be pinned to the right or left portion of the display (if the canvas pans horizontally), where panning of the canvas does not affect the position of the pinned blade.
The blade might also be caused to float, in which case the blade is stationary in some portion of the display despite panning of the underlying canvas. Having blades be pinned or floating while the canvas is pannable allows the blades to be compared to other blades throughout the journey.
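The positioning behavior just described (default placement in the sequence, pinning to an edge, or floating independent of panning) might be captured by a discriminated union such as the following hypothetical sketch; all names are assumptions.

```typescript
// Hypothetical placement states for a blade.
type Placement =
  | { mode: "sequence"; index: number }          // default: adjacent to the prior blade
  | { mode: "pinned"; edge: "left" | "right" }   // fixed to a display edge
  | { mode: "floating"; x: number; y: number };  // stationary despite panning

interface PositionedBlade {
  title: string;
  placement: Placement;
}

// Only blades that remain in the sequence move when the canvas pans;
// pinned and floating blades stay put so they can be compared against
// any other blade in the journey.
function movesWithCanvas(blade: PositionedBlade): boolean {
  return blade.placement.mode === "sequence";
}
```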
[0073] The default position of the blade might be changed due to some user interaction. For instance, if the user interacts with a certain blade that was originally appended to the sequence of blades prior to that blade in the journey, that certain blade may later be pinned to the left or right of the display, or caused to float, depending on the user interaction. Later, that certain blade might return to its default positioning in the sequence of blades.
[0074] Furthermore, when a part is selected from a prior blade, perhaps more than one blade is launched and appended to the sequence of blades to that point. For instance, if the user selects a monitoring part, they might want to see an overview of the monitoring events, all events that happened this week and all events that happened this month. These three sets might instead be displayed in separate blades that are siblings to each other.

Such sibling blades might be reordered in the sequence of blades. The selection of some parts might result in multiple blades automatically, while other parts may permit multiple blades one at a time, upon each instance of the user selecting the part.
[0075] The use of a sequence of blades representing a user's journey through information allows for visualization of cause and effect of certain actions.
For instance, when a user edits in the context of a certain part, that certain part may be visually highlighted to reflect the editing. For instance, suppose that a part within blade 1101 were edited; that part might be highlighted. However, the entire chain leading to that part in cause and effect is also highlighted. For instance, part 622, the selection of which caused blade 1101 to be launched in the first place, may likewise be visually highlighted to show editing. Furthermore, the activation control 512 that was selected in order to launch blade 601 might also be highlighted to reflect that editing has occurred.
If the user saves the changes, then the highlighting showing that editing cause-and-effect chain may be removed. On the other hand, if the user exits that journey without saving, the activation control 512 remains highlighted, reminding the user that there is something along that chain that has unsaved edits. There might be a command that allows the user to select that activation control 512 in a certain way so as to quickly return to the edited part within blade 1101, so that the editing may be completed and/or saved.
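A simplified sketch of this cause-and-effect highlighting, assuming each blade or part keeps a reference to whatever spawned it, is shown below; the structure and names are invented for illustration.

```typescript
// Hypothetical cause-and-effect chain: each blade or part records what
// spawned it, so an unsaved edit (or a pending confirmation or error)
// can be highlighted all the way back to the activation control.
interface JourneyNode {
  id: string;
  spawnedBy?: JourneyNode;  // the part or control whose selection created this node
  highlighted: boolean;
}

function highlightChain(edited: JourneyNode): void {
  for (let node: JourneyNode | undefined = edited; node; node = node.spawnedBy) {
    node.highlighted = true;
  }
}

// Saving the edit clears the same chain.
function clearChain(edited: JourneyNode): void {
  for (let node: JourneyNode | undefined = edited; node; node = node.spawnedBy) {
    node.highlighted = false;
  }
}
```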
[0076] This visualized cause and effect between parts and blades may be utilized for other actions as well. For instance, consider confirmations. The conventional confirmation model (e.g., "Do you want to delete this file?") is modal, which means the model forces the user into a decision when they may or may not have all of the information needed to answer the question. As an example, the user may not know whether someone else is using the file. The user does have the option of cancelling out of the operation until they know how to answer it. However, this lessens the likelihood that the user will remember to return to the action that caused the confirmation in the first place. The model also forces the user to operate in a one-at-a-time fashion. For instance, the user is unable to work through a set of websites, selecting specific ones for a Stop Running operation, and then execute all of the operations at once from a single location. Lastly, working with contexts of different types (e.g. websites and databases) is even more difficult for users - despite this being a very common scenario.
[0077] In accordance with embodiments described herein, confirmations are not modal. Instead, when a confirmation occurs in the context of a particular part, the causal chain from the activation control, all the way through all of the parts and blades, to the part that generated the confirmation will be highlighted to reflect that there is a pending confirmation. The user may now exit out of that journey, and evaluate whatever information is desired in order to determine how to respond to the confirmation. The user might then interface with the source of the confirmation (e.g., the activation control) to quickly return to the part that generated the confirmation. The user may then answer the confirmation with more knowledge and security.
[0078] These highlighted paths may also be used to expose errors. For instance, if an error occurs in the context of a part, the user may step back and evaluate the error using other resources. Once the user is satisfied that the error is being handled, the user might close out the error message. Alternatively, the error message may remain until someone handles the problem that caused the error.
[0079] A highlighted path might also be provided for availability messages that inform the user of available services or resources that may be used to enhance an experience or performance associated with a part.
[0080] Up until this point in the description, each new blade has defaulted to be less than the entire displayable area. For instance, the blade 601 is illustrated as occupying the entire vertical extent of the canvas, but just one third of the horizontal extent of the visible portion of the canvas. Such might be considered a "browse mode" of the blade, as it allows the panning action to more easily reveal a larger number of blades, letting the user quickly browse the information contexts through which the user has navigated as part of the journey. Blade 1101 has likewise been opened in browse mode.
[0081] There might be some interaction that might cause the corresponding blade to have some other display mode. For instance, upon interacting with the part 621 of blade 601 as shown in Figure 11, the user interface 1200 of Figure 12 might appear.
Here, the user interface 1200 includes a full representation 601' of the blade, the full representation occupying much of the displayed canvas. The full mode may contain a different number or combination of lenses and parts than the browse mode. Furthermore, any parts that are in common may in fact be differently sized, and thus present different granularities of detail on information, as described further below.
[0082] In some embodiments, if the full representation of the blade is arrived at by interacting with a browse-mode representation of the blade, the full representation of the blade may simply be added to the sequence of blades with the prior blade being the browse-mode representation. The canvas does, however, automatically pan in order to display the entire full representation of the blade. However, the user retains the ability to pan the canvas despite the display of the full representation of the blade.
Further interaction with the full representation may allow for a return to the browse-mode view, perhaps removing the full representation of the blade from the sequence of blades on the canvas.
[0083] In some embodiments, for some selectable elements, when selected, the part might default to a full representation, rather than the browse representation.
The default setting for a selectable element might be a function of the underlying type of the resource or item represented by the selectable element. The user might also configure certain types to display in different ways by default, or have a chance to change the default setting. The blade might have size control mechanisms that allow adjustment of the size of the blade to other sizes as well. In any case, if the blade being resized is not the last blade in the sequence of blades representing the journey, the position of the later blades may be adjusted so as to remain adjacent to and unobscured by the blade that has been readjusted.
Thus, blades can be maximized, but that does not mean they grow over top of existing blades. Instead, the blade grows width-wise (or more generally along the pannable direction of the canvas), pushing adjacent blades to the left and right. A blade might also be collapsed. For instance, the blade might be made very thin, with little or no information being displayed, but with a control that allows the blade to be expanded again.
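The resizing behavior, where a maximized or collapsed blade pushes its neighbors along the pannable direction rather than overlapping them, reduces to a contiguous layout pass; the following is a hypothetical sketch with invented names.

```typescript
// Hypothetical layout pass: blades are placed contiguously along the
// pannable dimension, so resizing one simply shifts the later blades
// rather than overlapping them.
interface SizedBlade {
  title: string;
  width: number; // extent along the pannable dimension
}

function layout(blades: SizedBlade[]): number[] {
  const offsets: number[] = [];
  let x = 0;
  for (const blade of blades) {
    offsets.push(x);
    x += blade.width;
  }
  return offsets;
}

// Maximizing the second blade pushes the third further along the canvas.
const blades = [
  { title: "A", width: 400 },
  { title: "B", width: 400 },
  { title: "C", width: 400 },
];
blades[1].width = 900;
console.log(layout(blades)); // [0, 400, 1300]
```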
[0084] Up until this point, the user interface associated with navigation of a canvas has been described. There might also be a journey view that allows the user to see the entire journey at a glance. This zoomed-out view is a semantic representation of the journey, which means that information of importance is maintained and made easier to view, while finer-grained details are dropped away. In addition to being able to see more of a navigation context, this view allows users to quickly jump from one blade to another, and to do so with an understanding of what they are going to see, rather than having to remember what the target blade contains.
[0085] In addition, the journey view may allow the user to see all of their open journeys, perhaps by scrolling in the direction perpendicular to the direction in which the canvas pans. This ability to see all open journeys not only makes switching between journeys easy, but also makes it possible to compare journeys. The journey view also enables applying a concept called a light path (described later) across journeys.
[0086] The user might also have access to journey controls that allow the user to perform journey commands (such as saving the journey, sharing the journey, pinning the journey as a tile, or closing the journey) while viewing the semantics of the journey at a glance. For instance, at any point in the journey navigation, the user might select a title bar (or any other journey view activation control) to cause the user interface to exit the navigation interface and enter the journey view interface. The journey view might also have a pop-out command that allows the journey to be displayed in a window, thus allowing the user to use the remaining portions of the display for other purposes. If there is some change to the underlying data that causes a change in the journey, that change may be reflected in the pop-out window of the journey.
[0087] Figure 13 illustrates a user interface 1300 showing a journey view corresponding to the journey built up through Figure 12. The journey view 1300 includes a journey bar 1301 that spans the extent of the entire blade sequence to that point. The journey bar 1301 includes a command section 1302 that allows the user to pin the journey to a certain area of the screen, share the journey with others (via e-mail, social media, collaborative work environments, and the like), save the journey, or close the journey. For instance, saving a journey might be quite helpful if the journey is to be re-performed frequently, such as a journey that navigates to a particular database.
Sharing a journey might be helpful, for instance, where a tester finds a specific bug and then sends the journey they went on to find that bug to a developer. The developer could then quickly understand not just the context of the bug, but also the set of insights that led to the bug.
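One possible way to realize the save and share commands is sketched below in TypeScript; the functions saveJourney, shareJourney, and reopenJourney, and the JSON serialization shown, are illustrative assumptions rather than the disclosed implementation.

interface JourneyState { title: string; bladeIds: string[]; }

// Saving serializes the blade sequence so the journey can be re-performed later.
function saveJourney(journey: JourneyState, store: Map<string, string>): void {
  store.set(journey.title, JSON.stringify(journey));
}

// Sharing sends the same serialized journey (e.g. a bug-repro path) to someone else.
function shareJourney(journey: JourneyState, send: (payload: string) => void): void {
  send(JSON.stringify(journey)); // e-mail, social media, collaborative tools, etc.
}

// Reopening restores the saved blade sequence.
function reopenJourney(store: Map<string, string>, title: string): JourneyState | undefined {
  const raw = store.get(title);
  return raw ? (JSON.parse(raw) as JourneyState) : undefined;
}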
[0088] Figure 14 illustrates a general system flow associated with any selectable item or any selectable part. The system includes information resources 1401. The transforms 1402 associated with the selectable item or part filter through a portion of the total resources 1401 that are relevant for a particular information context. For instance, the transforms 1402 might provide data associated with a website. The selectable part or item 1403 provides the data associated with the particular item or part to a presenter 1404. The presenter is configured to offer up to the display an appropriate view of the data. For instance, the presenter 1404 might present a chart view of the data. Because the part or item is selectable, certain commands may be issued by the part or item depending on its type. These commands may then be executed, as represented by arrow 1405, on the underlying information resources 1401. This would potentially affect not only the experience of the item or part 1403, but might also affect any other item or part that uses that altered data.
[0089] This architecture affords developers significant opportunities. For instance, developers can use existing parts and items developed by others. In addition, the developers can create their own selectable items by defining a new data source and/or defining a new presenter 1404. The developer might define a type for that item, and then define commands 1405 for that item, which would be exposed to the user if the user were to select the command window in the context of that new type. The developers can also create their own selectable parts by using existing items or creating new items to define a new data source 1403 and/or presenter 1404 for that selectable part.
Also, the developer might define a type for that part, and then define commands 1405 for that type, which would be exposed to the user if the user were to select the command window in the context of that new type.
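The flow of Figure 14 and the developer extension points described above might be expressed along the lines of the following TypeScript sketch; the interfaces Resources, Presenter, and PartDefinition and the registry functions are hypothetical names introduced for illustration.

// Transforms filter the information resources down to the data for one context,
// a presenter renders that data, and type-specific commands execute back against
// the underlying resources (arrow 1405), so other parts bound to the same data
// see the change.

type PartSize = "mini" | "small" | "normal" | "large" | "wide" | "tall";

interface Resources {
  query(context: string): unknown[];
  execute(command: string, args: unknown): void;
}

type Transform = (resources: Resources) => unknown[];

interface Presenter {
  render(data: unknown[], size: PartSize): string; // e.g. a chart view of the data
}

interface PartDefinition {
  type: string;                 // e.g. a hypothetical "website-traffic" part type
  transform: Transform;         // filters resources for this information context
  presenter: Presenter;
  commands: Record<string, (resources: Resources, args: unknown) => void>;
}

// Developers can reuse existing parts or register new ones by supplying a data
// source (transform), a presenter, a type, and the type's commands.
const registry = new Map<string, PartDefinition>();

function registerPart(def: PartDefinition): void {
  registry.set(def.type, def);
}

function renderPart(type: string, resources: Resources, size: PartSize): string {
  const def = registry.get(type);
  if (!def) throw new Error(`unknown part type: ${type}`);
  return def.presenter.render(def.transform(resources), size);
}

function executeCommand(type: string, name: string, resources: Resources, args: unknown): void {
  const command = registry.get(type)?.commands[name];
  if (command) command(resources, args); // acts on the shared underlying resources
}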
[0090] The developer might access pre-defined lenses, each comprising a combination of parts, depending on the goal to be accomplished. For instance, there might be lenses for presenting overviews, usage/billing, monitoring, configuration, linked resources, team views, authoring, and quick starts, among others. Developers may further construct their own lenses by gathering pre-existing and/or newly developed parts and positioning them with respect to each other.
[0091] Figure 15 illustrates a single part in six different sizes and shapes. The larger versions of the part intelligently offer up more information to the user. The size of the part is actually taken into consideration by the presenter 1404 (see Figure 14) when considering what information to present, and how that information should be presented.
The intelligence may be applied by the developer of the part when defining the part. In this example, there is a mini-size part version 1501, a small part version 1502, a normal part version 1503, a large part version 1504, a wide part version 1505, and a tall part version 1506. Parts may appear within blades of a journey, and may also serve as activation controls within the set of activation controls 511. In certain contexts, the user might have the option to change the size of the part in accordance with the sizes available for that part. For instance, within the activation controls 511, the user might change a part size to be smaller or larger. The user might also move parts within the activation controls 511 to achieve a layout the user prefers. By having a discrete number of pre-defined part sizes and shapes, the designers of the part may more intelligently design what information is to be displayed in the part, and how that information is to be displayed, depending on what they judge as most appropriate given the size and shape of the part.
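A short, hypothetical TypeScript sketch of a presenter that varies its output with the discrete part size follows; the Metrics type and presentMetrics function are assumptions made for illustration only.

type PartSize = "mini" | "small" | "normal" | "large" | "wide" | "tall";

interface Metrics {
  total: number;
  daily: number[];
  byRegion: Record<string, number>;
}

// The designer decides exactly what each pre-defined size shows: larger versions
// intelligently offer up more information.
function presentMetrics(m: Metrics, size: PartSize): string {
  switch (size) {
    case "mini":
      return `${m.total}`;                                           // headline number only
    case "small":
    case "normal":
      return `${m.total} (last 7 days: ${m.daily.slice(-7).join(", ")})`;
    default:                                                         // large, wide, tall
      return [
        `Total: ${m.total}`,
        `Daily: ${m.daily.join(", ")}`,
        `By region: ${Object.entries(m.byRegion)
          .map(([region, value]) => `${region}=${value}`)
          .join(", ")}`,
      ].join("\n");
  }
}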
[0092] Various items may be pinned to the activation controls 511 area as a part that may be activated. For instance, an item from the navigation pane 702 may be pinned as a part in the activation controls 511. Commands from a command window or space may be pinned as a part in the activation controls 511. Likewise, component parts of a lens within a blade may be pinned in the activation controls 511. A pin-all function might also be provided that causes all parts of the blade to be pinned within the activation controls 511 (perhaps as a nested collection within another activation control). Resources, collections, feeds, curated content (such as help topics), and search or query results may likewise be pinned as parts in the activation controls 511. For instance, queries and searches may be performed over a wide variety of information contexts accessible to the computer system.
The user can construct queries that go across multiple contexts and then turn the result of that query into a part. That part represents the result of the query and can then be sized and organized as per the part concepts defined above. That part might be placed into each blade that has a context in which the query results are desired.
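The query-to-part idea might be sketched as follows in TypeScript; the QueryPart type and the makeQueryPart and pin helpers are illustrative assumptions.

// A cross-context query becomes a pinnable part: it represents the query result,
// can be sized like any other part, and can be placed into the activation
// controls or into any blade whose context wants those results.
interface QueryPart {
  id: string;
  query: string;                          // e.g. a search spanning several contexts
  size: "small" | "normal" | "large";
  run: () => unknown[];                   // re-evaluates the query when the part renders
}

function makeQueryPart(id: string, query: string, run: () => unknown[]): QueryPart {
  return { id, query, size: "normal", run };
}

function pin(part: QueryPart, target: QueryPart[]): void {
  target.push(part);                      // activation controls, navigation pane, or a blade
}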
[0093] Figure 16 illustrates a user interface 1600 in which no journeys have been started yet, but in which the activation controls are illustrated. Note that the activation controls have a variety of different sizes, including six mini-size parts, four small-size parts, four normal-size parts, and one large-size part. The notification 1601 appears. Had the notification been tied to one of the activation controls, it might have appeared within that activation control. However, the notification 1601 was not tied to an activation control, and thus appeared separately.
[0094] Upon selecting the notification 1601, the user interface 1701 of Figure 17 may appear. Here, in response to the selection of the notification 1601, the canvas has automatically panned to show a temporary area 1702 just to the right of the activation controls, automatically populated the temporary area 1702 with a part 1703 corresponding to the notification 1601, and automatically selected the part 1703 so that the blade 1704 appears.
Such a response might be performed whenever an initiating blade of a journey is launched by means other than selection of one of the activation controls 511. If the user were to close the blade 1704, the temporary area 1702 with the temporary part 1703 may remain, giving the user the option of pinning that part into the activation controls 511. In this description, anything described as being pinned to the activation controls 511 may likewise be pinned to the navigation pane 702 or to a blade. For instance, a part may be moved from one blade to another.
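The notification-launched behavior described above might be modeled roughly as in the following TypeScript sketch; the NotificationLauncher class and its members are hypothetical names used only for illustration.

interface TemporaryPart {
  id: string;
  sourceNotification: string;
  pinned: boolean;
}

class NotificationLauncher {
  temporaryArea: TemporaryPart[] = [];

  constructor(
    private panTo: (region: "temporary-area") => void,
    private openBlade: (partId: string) => void,
  ) {}

  // When a journey is launched by something other than an activation control,
  // the canvas pans to a temporary area, a temporary part is placed there, and
  // the part is selected so the corresponding blade opens immediately.
  onNotificationSelected(notificationId: string): TemporaryPart {
    const part: TemporaryPart = {
      id: `part-for-${notificationId}`,
      sourceNotification: notificationId,
      pinned: false,
    };
    this.temporaryArea.push(part);   // populate the temporary area
    this.panTo("temporary-area");    // auto-pan so the area is visible
    this.openBlade(part.id);         // auto-select, so the blade appears
    return part;                     // remains after the blade closes, ready to pin
  }

  pinToActivationControls(part: TemporaryPart): void {
    part.pinned = true;
  }
}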
[0095] Users often want to create one artifact and then bundle it together with others.
For instance, the user might want to create a website and then bundle the website with a database. Conventionally, this is often done in a one-at-a-time fashion. The user creates a website, then the database, and then links them together. Or, perhaps the user picks a website template that automatically has a website and database in it. If a new component comes along, each template that is to incorporate the component needs to be updated, or the user is burdened with one more thing to link together.
[0096] In accordance with embodiments described herein, the user may create an item, and then ask the system which other items are commonly added along with the item they are creating. The user can then select those items, choosing to bundle them together. Bundling means they will automatically be created in the right order (e.g., the database has to be created before a caching service can be linked to it) and then automatically linked together. This saves the user a tremendous amount of time, and also gives the user the opportunity to improve the overall functionality of the resource.
An example walkthrough of this will be described with respect to Figures 18 through 22.
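Before turning to the walkthrough, the ordering aspect of bundling can be illustrated with a short TypeScript sketch that derives a creation order from declared dependencies, after which the items would be linked together; the BundleItem type and creationOrder function are illustrative assumptions, not the disclosed implementation.

interface BundleItem {
  name: string;
  dependsOn: string[];
}

// Bundled items are created in dependency order (e.g. the database before the
// caching service that links to it) rather than one at a time by the user.
function creationOrder(items: BundleItem[]): string[] {
  const remaining = new Map(items.map(i => [i.name, new Set(i.dependsOn)]));
  const order: string[] = [];
  while (remaining.size > 0) {
    const ready = [...remaining].filter(([, deps]) => deps.size === 0).map(([name]) => name);
    if (ready.length === 0) throw new Error("circular dependency in bundle");
    for (const name of ready) {
      order.push(name);
      remaining.delete(name);
      for (const deps of remaining.values()) deps.delete(name);
    }
  }
  return order;
}

// Example: creationOrder([
//   { name: "website", dependsOn: ["database"] },
//   { name: "database", dependsOn: [] },
//   { name: "cache", dependsOn: ["database"] },
// ]) yields ["database", "website", "cache"].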
[0097] Figure 18 illustrates a user interface 1800 that includes a number of activation controls. Suppose that the user selects the activation control 1801 in order to create a new website. The user interface 1900 of Figure 19 might then appear. A create website pane 1901 appears that allows the user to enter values for the website.
However, the create website pane 1901 also presents add options 1902 for possible resources that could help augment the performance of the website. Figure 20 illustrates a user interface 2000 in which the user has selected a database option. The system is aware that a database and a website will need some kind of connection string, and thus automatically selects an option 2001. A progress bar 2002 then appears showing that the user has completed one of three steps in the total package. If the user selects next, the user interface 2100 of Figure 21 appears. Here, the user enters information associated with the database. Upon selecting next, the user interface 2200 of Figure 22 appears. The user enters information associated with the link, and completes the application. The system automatically builds the application.
[0098] Accordingly, the principles described herein provide a robust user interface in which the user takes an intuitive journey through information, and can quickly understand the context in which he or she is working.
[0099] The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description.
All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (10)

1. A computer program product comprising one or more computer-readable storage media having thereon computer-executable instructions that are structured such that, when executed by one or more processors of a computing system, cause the computing system to operate a user interface program that is configured to display a user interface on a display of the computing system, the user interface comprising:
a canvas that extends in an extendable direction on the user interface; and an activation mechanism that may be activated to generate a first blade on the canvas, the first blade including a plurality of elements, each of at least some of the plurality of elements having at least one corresponding blade such that if the corresponding element is selected, the corresponding blade appears on the canvas as a second blade on the canvas.
2. The computer program product in accordance with Claim 1, wherein the second blade is contiguous with the first blade in the panning direction of the canvas.
3. The computer program product in accordance with Claim 1, wherein the first blade is scrollable in a direction perpendicular to the extendable direction.
4. The computer program product in accordance with Claim 1, wherein the plurality of elements are hierarchically structured and include at least a first selectable element, a parent element, that contains a plurality of child elements, at least one of which is selectable, such that a blade corresponding to the parent element appears on the canvas if the parent element is selected, and such that a blade corresponding to the child element appears on the canvas if the selectable child element is selected.
5. A method for generating a user interface, the method comprising:
an act of providing a canvas that extends in a pannable direction in the user interface;
an act of presenting a first blade in a canvas at a first position of the canvas in the pannable direction, the first blade including a first plurality of elements, each of at least some of the first plurality of elements having a corresponding blade such that if the corresponding element is selected, the corresponding blade appears on the canvas as a second blade on the canvas;

an act of detecting a selection of one of the first plurality of elements; and in response to the detection of the selection of the element in the first plurality of elements, an act of presenting a second blade in a canvas at a second position of the canvas in the pannable direction.
6. The method in accordance with Claim 5, wherein the second blade includes a second plurality of elements, each of at least some of the second plurality of elements having a corresponding blade such that if the corresponding element is selected, the corresponding blade appears on the canvas as a third blade on the canvas.
7. The method in accordance with Claim 6, further comprising:
in response to user interaction, an act of presenting a viewer that allows an entire sequence of blades including at least the first, second, and third blades to be viewed on the display.
8. The method in accordance with Claim 6, further comprising:
in response to user interaction, an act of saving an entire sequence of blades including at least the first blade, the second blade, and the third blade.
9. The method in accordance with Claim 6, the first blade occupying a majority of the canvas at the first position, and the second blade occupying a majority of the canvas at the second position.
10. The method in accordance with Claim 6, wherein the first blade occupies all of the canvas at the first position, and the second blade occupies all of the canvas at the second position.
CA2922985A 2013-09-30 2014-09-29 Extendable blade sequence along pannable canvas direction Abandoned CA2922985A1 (en)

Applications Claiming Priority (25)

Application Number Priority Date Filing Date Title
US201361884743P 2013-09-30 2013-09-30
US61/884,743 2013-09-30
US201361905105P 2013-11-15 2013-11-15
US201361905116P 2013-11-15 2013-11-15
US201361905129P 2013-11-15 2013-11-15
US201361905119P 2013-11-15 2013-11-15
US201361905101P 2013-11-15 2013-11-15
US201361905128P 2013-11-15 2013-11-15
US201361905114P 2013-11-15 2013-11-15
US201361905111P 2013-11-15 2013-11-15
US61/905,128 2013-11-15
US61/905,119 2013-11-15
US61/905,116 2013-11-15
US61/905,114 2013-11-15
US61/905,105 2013-11-15
US61/905,101 2013-11-15
US61/905,111 2013-11-15
US61/905,129 2013-11-15
US201361905247P 2013-11-17 2013-11-17
US201361905243P 2013-11-17 2013-11-17
US61/905,247 2013-11-17
US61/905,243 2013-11-17
US14/231,846 2014-04-01
US14/231,846 US20150095842A1 (en) 2013-09-30 2014-04-01 Extendable blade sequence along pannable canvas direction
PCT/US2014/057938 WO2015048600A1 (en) 2013-09-30 2014-09-29 Extendable blade sequence along pannable canvas direction

Publications (1)

Publication Number Publication Date
CA2922985A1 true CA2922985A1 (en) 2015-04-02

Family

ID=52741177

Family Applications (2)

Application Number Title Priority Date Filing Date
CA2922725A Abandoned CA2922725A1 (en) 2013-09-30 2014-09-29 Pan and selection gesture detection
CA2922985A Abandoned CA2922985A1 (en) 2013-09-30 2014-09-29 Extendable blade sequence along pannable canvas direction

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CA2922725A Abandoned CA2922725A1 (en) 2013-09-30 2014-09-29 Pan and selection gesture detection

Country Status (17)

Country Link
US (11) US20150095849A1 (en)
EP (6) EP3053027A1 (en)
JP (2) JP6465870B2 (en)
KR (3) KR102186865B1 (en)
CN (6) CN105593813B (en)
AU (2) AU2014324618A1 (en)
BR (1) BR112016004551A8 (en)
CA (2) CA2922725A1 (en)
CL (1) CL2016000729A1 (en)
HK (1) HK1222731A1 (en)
IL (1) IL244368A0 (en)
MX (2) MX2016003946A (en)
PH (1) PH12016500256A1 (en)
RU (2) RU2686822C2 (en)
SG (2) SG11201601888UA (en)
TW (4) TW201528106A (en)
WO (7) WO2015048203A1 (en)

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD750112S1 (en) * 2013-01-04 2016-02-23 Samsung Electronics Co., Ltd. Portable electronic device with graphical user interface
US20150095849A1 (en) * 2013-09-30 2015-04-02 Microsoft Corporation Dialogs positioned with action visualization
USD745877S1 (en) * 2013-10-17 2015-12-22 Microsoft Corporation Display screen with graphical user interface
US10176218B2 (en) * 2014-11-07 2019-01-08 Sap Se OData custom query composer
US10452750B2 (en) 2015-08-04 2019-10-22 Google Llc Systems and methods for interactively presenting a visible portion of a rendering surface on a user device
US10990258B2 (en) * 2015-08-04 2021-04-27 Google Llc Interactively presenting a visible portion of a rendering surface on a user device
JP6812639B2 (en) * 2016-02-03 2021-01-13 セイコーエプソン株式会社 Electronic devices, control programs for electronic devices
US10289297B2 (en) * 2016-08-26 2019-05-14 Google Llc Animating an image to indicate that the image is pannable
US9871911B1 (en) * 2016-09-30 2018-01-16 Microsoft Technology Licensing, Llc Visualizations for interactions with external computing logic
KR102605332B1 (en) * 2016-11-02 2023-11-23 주식회사 넥슨코리아 Device and method to provide content
US10796088B2 (en) * 2017-04-21 2020-10-06 International Business Machines Corporation Specifying a conversational computer agent and its outcome with a grammar
WO2018208047A1 (en) 2017-05-09 2018-11-15 Samsung Electronics Co., Ltd. Method and system for managing and displaying application
US10827319B2 (en) * 2017-06-02 2020-11-03 Apple Inc. Messaging system interacting with dynamic extension app
CN110019718B (en) * 2017-12-15 2021-04-09 上海智臻智能网络科技股份有限公司 Method for modifying multi-turn question-answering system, terminal equipment and storage medium
CN110019717B (en) * 2017-12-15 2021-06-29 上海智臻智能网络科技股份有限公司 Device for modifying multi-turn question-answering system
US11341422B2 (en) 2017-12-15 2022-05-24 SHANGHAI XIAOl ROBOT TECHNOLOGY CO., LTD. Multi-round questioning and answering methods, methods for generating a multi-round questioning and answering system, and methods for modifying the system
US11379252B1 (en) * 2018-01-31 2022-07-05 Parallels International Gmbh System and method for providing layouts for a remote desktop session
US11659003B2 (en) * 2018-08-30 2023-05-23 International Business Machines Corporation Safe shell container facilitating inspection of a virtual container
US10902045B2 (en) * 2018-09-18 2021-01-26 Tableau Software, Inc. Natural language interface for building data visualizations, including cascading edits to filter expressions
US11048871B2 (en) * 2018-09-18 2021-06-29 Tableau Software, Inc. Analyzing natural language expressions in a data visualization user interface
CN109542563B (en) * 2018-11-09 2022-06-07 优信数享(北京)信息技术有限公司 Multi-state integrated android page management method, device and system
US11385766B2 (en) 2019-01-07 2022-07-12 AppEsteem Corporation Technologies for indicating deceptive and trustworthy resources
EP3764210A1 (en) 2019-07-08 2021-01-13 dSPACE digital signal processing and control engineering GmbH Display of display areas on a desktop
US11089050B1 (en) * 2019-08-26 2021-08-10 Ca, Inc. Isolating an iframe of a webpage
US11455339B1 (en) 2019-09-06 2022-09-27 Tableau Software, LLC Incremental updates to natural language expressions in a data visualization user interface
US11474975B2 (en) 2019-09-18 2022-10-18 Microsoft Technology Licensing, Llc Identity represented assets in a content management system
US11199955B2 (en) * 2019-10-02 2021-12-14 Palantir Technologies Inc. Enhanced techniques for building user interfaces
CN110825766A (en) * 2019-11-13 2020-02-21 恩亿科(北京)数据科技有限公司 Query condition generation method and device, server and readable storage medium
CN110995942B (en) * 2019-12-06 2021-08-06 科大国创软件股份有限公司 Soft switch automatic calling method and system based on interface visualization
CN111177455A (en) * 2019-12-31 2020-05-19 精英数智科技股份有限公司 Method, device and equipment for determining cutting tooth load type of coal mining machine and storage medium
CN111610912B (en) * 2020-04-24 2023-10-10 北京小米移动软件有限公司 Application display method, application display device and storage medium
WO2022049220A2 (en) 2020-09-02 2022-03-10 Genmab A/S Antibody therapy
US11698933B1 (en) 2020-09-18 2023-07-11 Tableau Software, LLC Using dynamic entity search during entry of natural language commands for visual data analysis
US11301631B1 (en) 2020-10-05 2022-04-12 Tableau Software, LLC Visually correlating individual terms in natural language input to respective structured phrases representing the natural language input
CN112732243A (en) * 2021-01-11 2021-04-30 京东数字科技控股股份有限公司 Data processing method and device for generating functional component
US11363050B1 (en) 2021-03-25 2022-06-14 Bank Of America Corporation Information security system and method for incompliance detection in data transmission

Family Cites Families (182)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6362033A (en) * 1986-09-02 1988-03-18 Nec Corp Display device for relative information
WO1994024657A1 (en) * 1993-04-20 1994-10-27 Apple Computer Inc. Interactive user interface
US5625763A (en) 1995-05-05 1997-04-29 Apple Computer, Inc. Method and apparatus for automatically generating focus ordering in a dialog on a computer system
JPH09245188A (en) * 1996-03-12 1997-09-19 Fujitsu Ltd Graphic displaying method and its device
US5845299A (en) * 1996-07-29 1998-12-01 Rae Technology Llc Draw-based editor for web pages
US6049812A (en) 1996-11-18 2000-04-11 International Business Machines Corp. Browser and plural active URL manager for network computers
US6128632A (en) * 1997-03-06 2000-10-03 Apple Computer, Inc. Methods for applying rubi annotation characters over base text characters
US6091415A (en) * 1997-05-02 2000-07-18 Inventec Corporation System and method for displaying multiple dialog boxes in a window display
US5886694A (en) 1997-07-14 1999-03-23 Microsoft Corporation Method for automatically laying out controls in a dialog window
US5995101A (en) * 1997-10-29 1999-11-30 Adobe Systems Incorporated Multi-level tool tip
US6236400B1 (en) * 1998-04-02 2001-05-22 Sun Microsystems, Inc. Method and apparatus for controlling the display of hierarchical information
US6473102B1 (en) 1998-05-11 2002-10-29 Apple Computer, Inc. Method and system for automatically resizing and repositioning windows in response to changes in display
US7801913B2 (en) 1998-12-07 2010-09-21 Oracle International Corporation System and method for querying data for implicit hierarchies
US6460060B1 (en) 1999-01-26 2002-10-01 International Business Machines Corporation Method and system for searching web browser history
JP2000331020A (en) * 1999-05-21 2000-11-30 Nippon Telegr & Teleph Corp <Ntt> Method and device for information reference and storage medium with information reference program stored
US6701513B1 (en) 2000-01-14 2004-03-02 Measurement Computing Corporation Program-development environment for use in generating application programs
US7243335B1 (en) 2000-02-17 2007-07-10 Microsoft Corporation Method and system for reducing coding complexity by providing intelligent manipulable defaults
US6681383B1 (en) 2000-04-04 2004-01-20 Sosy, Inc. Automatic software production system
US6473891B1 (en) 2000-05-03 2002-10-29 Lsi Logic Corporation Wire routing to control skew
US7062475B1 (en) 2000-05-30 2006-06-13 Alberti Anemometer Llc Personalized multi-service computer environment
US6750887B1 (en) 2000-06-02 2004-06-15 Sun Microsystems, Inc. Graphical user interface layout manager
US7171455B1 (en) * 2000-08-22 2007-01-30 International Business Machines Corporation Object oriented based, business class methodology for generating quasi-static web pages at periodic intervals
US6919890B2 (en) 2000-09-28 2005-07-19 Curl Corporation Grid and table layout using elastics
US6640655B1 (en) * 2000-10-03 2003-11-04 Varco I/P, Inc. Self tracking sensor suspension mechanism
US6950198B1 (en) 2000-10-18 2005-09-27 Eastman Kodak Company Effective transfer of images from a user to a service provider
US7370040B1 (en) 2000-11-21 2008-05-06 Microsoft Corporation Searching with adaptively configurable user interface and extensible query language
WO2002046878A2 (en) 2000-12-06 2002-06-13 American Express Travel Related Services Company, Inc. Layout generator system and method
US6760128B2 (en) 2000-12-06 2004-07-06 Eastman Kodak Company Providing a payment schedule for utilizing stored images using a designated date
JP2002182812A (en) * 2000-12-14 2002-06-28 Smg Kk Site map display system
US7233998B2 (en) * 2001-03-22 2007-06-19 Sony Computer Entertainment Inc. Computer architecture and software cells for broadband networks
US7203678B1 (en) 2001-03-27 2007-04-10 Bea Systems, Inc. Reconfigurable query generation system for web browsers
US20020147963A1 (en) 2001-04-09 2002-10-10 Lee Rusty Shawn Method and apparatus for generating machine control instructions
US20020180811A1 (en) 2001-05-31 2002-12-05 Chu Sing Yun Systems, methods, and articles of manufacture for providing a user interface with selection and scrolling
US20030011638A1 (en) 2001-07-10 2003-01-16 Sun-Woo Chung Pop-up menu system
US6950993B2 (en) 2001-08-02 2005-09-27 Microsoft Corporation System and method for automatic and dynamic layout of resizable dialog type windows
US6944829B2 (en) 2001-09-25 2005-09-13 Wind River Systems, Inc. Configurable user-interface component management system
US7480864B2 (en) 2001-10-12 2009-01-20 Canon Kabushiki Kaisha Zoom editor
US7620908B2 (en) * 2001-12-28 2009-11-17 Sap Ag Managing a user interface
US20050066037A1 (en) * 2002-04-10 2005-03-24 Yu Song Browser session mobility system for multi-platform applications
CA2385224C (en) 2002-05-07 2012-10-02 Corel Corporation Dockable drop-down dialogs
US7065707B2 (en) * 2002-06-24 2006-06-20 Microsoft Corporation Segmenting and indexing web pages using function-based object models
US7293024B2 (en) 2002-11-14 2007-11-06 Seisint, Inc. Method for sorting and distributing data among a plurality of nodes
US20080177994A1 (en) * 2003-01-12 2008-07-24 Yaron Mayer System and method for improving the efficiency, comfort, and/or reliability in Operating Systems, such as for example Windows
US7000184B2 (en) 2003-01-24 2006-02-14 The Cobalt Group, Inc. Remote web site editing in a standard web browser without external software
US20040165009A1 (en) * 2003-02-20 2004-08-26 International Business Machines Corporation Expansion of interactive user interface components
US7823077B2 (en) 2003-03-24 2010-10-26 Microsoft Corporation System and method for user modification of metadata in a shell browser
US7769794B2 (en) * 2003-03-24 2010-08-03 Microsoft Corporation User interface for a file system shell
US7720616B2 (en) 2003-05-07 2010-05-18 Sureprep, Llc Multi-stage, multi-user engagement submission and tracking process
US7417644B2 (en) 2003-05-12 2008-08-26 Microsoft Corporation Dynamic pluggable user interface layout
US7669140B2 (en) 2003-08-21 2010-02-23 Microsoft Corporation System and method for providing rich minimized applications
US8230366B2 (en) 2003-10-23 2012-07-24 Apple Inc. Dynamically changing cursor for user interface
US8037420B2 (en) * 2003-12-04 2011-10-11 International Business Machines Corporation Maintaining browser navigation relationships and for choosing a browser window for new documents
US7711742B2 (en) 2003-12-11 2010-05-04 International Business Machines Corporation Intelligent data query builder
US20080109785A1 (en) 2004-01-16 2008-05-08 Bailey Bendrix L Graphical Program Having Graphical and/or Textual Specification of Event Handler Procedures for Program Objects
GB2411331A (en) * 2004-02-19 2005-08-24 Trigenix Ltd Rendering user interface using actor attributes
US7577938B2 (en) 2004-02-20 2009-08-18 Microsoft Corporation Data association
US7536672B1 (en) 2004-03-05 2009-05-19 Adobe Systems Incorporated Management of user interaction history with software applications
US7694233B1 (en) * 2004-04-30 2010-04-06 Apple Inc. User interface presentation of information in reconfigured or overlapping containers
CN100343802C (en) * 2004-05-10 2007-10-17 华为技术有限公司 Method and system for unifying users'interface
US8453065B2 (en) * 2004-06-25 2013-05-28 Apple Inc. Preview and installation of user interface elements in a display environment
US8046712B2 (en) 2004-06-29 2011-10-25 Acd Systems International Inc. Management of multiple window panels with a graphical user interface
US8117542B2 (en) * 2004-08-16 2012-02-14 Microsoft Corporation User interface for displaying selectable software functionality controls that are contextually relevant to a selected object
EP1779216A1 (en) 2004-08-20 2007-05-02 Rhoderick John Kennedy Pugh Server authentication
US7434173B2 (en) 2004-08-30 2008-10-07 Microsoft Corporation Scrolling web pages using direct interaction
US7720867B2 (en) 2004-09-08 2010-05-18 Oracle International Corporation Natural language query construction using purpose-driven template
US8819569B2 (en) 2005-02-18 2014-08-26 Zumobi, Inc Single-handed approach for navigation of application tiles using panning and zooming
US7728825B2 (en) * 2005-03-22 2010-06-01 Microsoft Corporation Targeting in a stylus-based user interface
US20060224951A1 (en) * 2005-03-30 2006-10-05 Yahoo! Inc. Multiple window browser interface and system and method of generating multiple window browser interface
US20060236264A1 (en) * 2005-04-18 2006-10-19 Microsoft Corporation Automatic window resize behavior and optimizations
US8195646B2 (en) 2005-04-22 2012-06-05 Microsoft Corporation Systems, methods, and user interfaces for storing, searching, navigating, and retrieving electronic information
US7721225B2 (en) 2005-05-03 2010-05-18 Novell, Inc. System and method for creating and presenting modal dialog boxes in server-side component web applications
US7730418B2 (en) 2005-05-04 2010-06-01 Workman Nydegger Size to content windows for computer graphics
US20070024646A1 (en) 2005-05-23 2007-02-01 Kalle Saarinen Portable electronic apparatus and associated method
US20060282771A1 (en) 2005-06-10 2006-12-14 Tad Vinci Verifying document compliance to a subsidiary standard
US20070033522A1 (en) 2005-08-02 2007-02-08 Lin Frank L System and method for dynamic resizing of web-based GUIs
US7933632B2 (en) 2005-09-16 2011-04-26 Microsoft Corporation Tile space user interface for mobile devices
US8543824B2 (en) 2005-10-27 2013-09-24 Apple Inc. Safe distribution and use of content
US7954064B2 (en) 2005-10-27 2011-05-31 Apple Inc. Multiple dashboards
US8434021B2 (en) * 2005-11-30 2013-04-30 Microsoft Corporation Centralized user interface for displaying contextually driven business content and business related functionality
US7836303B2 (en) 2005-12-09 2010-11-16 University Of Washington Web browser operating system
US8898203B2 (en) 2005-12-27 2014-11-25 International Business Machines Corporation Generating a separable query design object and database schema through visual view editing
JP4635894B2 (en) * 2006-02-13 2011-02-23 ソニー株式会社 Information processing apparatus and method, and program
JP4415961B2 (en) 2006-03-15 2010-02-17 ブラザー工業株式会社 Removable media device and data control program
US20070233854A1 (en) 2006-03-31 2007-10-04 Microsoft Corporation Management status summaries
US20070234195A1 (en) * 2006-04-03 2007-10-04 National Instruments Corporation Simultaneous update of a plurality of user interface elements displayed in a web browser
US7685519B1 (en) * 2006-07-18 2010-03-23 Intuit Inc. Process and apparatus for providing a customizable content tooltip
US20080018665A1 (en) * 2006-07-24 2008-01-24 Jay Behr System and method for visualizing drawing style layer combinations
US8869027B2 (en) 2006-08-04 2014-10-21 Apple Inc. Management and generation of dashboards
US20080065974A1 (en) 2006-09-08 2008-03-13 Tom Campbell Template-based electronic presence management
US7890957B2 (en) 2006-09-08 2011-02-15 Easyonme, Inc. Remote management of an electronic presence
US20080109714A1 (en) 2006-11-03 2008-05-08 Sap Ag Capturing screen information
US8082539B1 (en) 2006-12-11 2011-12-20 Parallels Holdings, Ltd. System and method for managing web-based forms and dynamic content of website
JP5031353B2 (en) 2006-12-15 2012-09-19 キヤノン株式会社 Display device, control method, and program
CN101004685A (en) * 2007-01-08 2007-07-25 叶炜 Method for realizing graphical user interface
US9032329B2 (en) 2007-03-23 2015-05-12 Siemens Product Lifecycle Management Software Inc. System and method for dialog position management
US8321847B1 (en) 2007-05-17 2012-11-27 The Mathworks, Inc. Dynamic function wizard
US20080306933A1 (en) * 2007-06-08 2008-12-11 Microsoft Corporation Display of search-engine results and list
US10019570B2 (en) * 2007-06-14 2018-07-10 Microsoft Technology Licensing, Llc Protection and communication abstractions for web browsers
US8065628B2 (en) 2007-06-25 2011-11-22 Microsoft Corporation Dynamic user interface for previewing live content
KR20090000507A (en) * 2007-06-28 2009-01-07 삼성전자주식회사 Method and apparatus of displaying information
US8762880B2 (en) * 2007-06-29 2014-06-24 Microsoft Corporation Exposing non-authoring features through document status information in an out-space user interface
US8422550B2 (en) * 2007-07-27 2013-04-16 Lagavulin Limited Apparatuses, methods, and systems for a portable, automated contractual image dealer and transmitter
US9009181B2 (en) 2007-08-23 2015-04-14 International Business Machines Corporation Accessing objects in a service registry and repository
US8126840B2 (en) 2007-10-22 2012-02-28 Noria Corporation Lubrication program management system and methods
US8046353B2 (en) 2007-11-02 2011-10-25 Citrix Online Llc Method and apparatus for searching a hierarchical database and an unstructured database with a single search query
CN101499004A (en) * 2008-01-31 2009-08-05 株式会社日立制作所 System and method for connecting virtual machine and user interface
JP2009193423A (en) * 2008-02-15 2009-08-27 Panasonic Corp Input device for electronic equipment
US20090254822A1 (en) 2008-04-04 2009-10-08 International Business Machines Corporation Hi-efficiency wizard framework system and method
US8219385B2 (en) * 2008-04-08 2012-07-10 Incentive Targeting, Inc. Computer-implemented method and system for conducting a search of electronically stored information
JP4171770B1 (en) * 2008-04-24 2008-10-29 任天堂株式会社 Object display order changing program and apparatus
US8375336B2 (en) * 2008-05-23 2013-02-12 Microsoft Corporation Panning content utilizing a drag operation
US8156445B2 (en) 2008-06-20 2012-04-10 Microsoft Corporation Controlled interaction with heterogeneous data
US20100005053A1 (en) 2008-07-04 2010-01-07 Estes Philip F Method for enabling discrete back/forward actions within a dynamic web application
US8345014B2 (en) * 2008-07-12 2013-01-01 Lester F. Ludwig Control of the operating system on a computing device via finger angle using a high dimensional touchpad (HDTP) touch user interface
US8566741B2 (en) * 2008-08-29 2013-10-22 Microsoft Corporation Internal scroll activation and cursor adornment
US8402381B2 (en) * 2008-09-23 2013-03-19 International Business Machines Corporation Automatically arranging widgets of a model within a canvas using iterative region based widget relative adjustments
KR20100049474A (en) 2008-11-03 2010-05-12 삼성전자주식회사 A method for remote user interface session migration to other device
US8095412B1 (en) 2008-11-03 2012-01-10 Intuit Inc. Method and system for evaluating expansion of a business
EP2370872A4 (en) 2008-11-26 2013-04-10 Lila Aps Ahead Dynamic network browser
US7962547B2 (en) 2009-01-08 2011-06-14 International Business Machines Corporation Method for server-side logging of client browser state through markup language
US20100229115A1 (en) * 2009-03-05 2010-09-09 Microsoft Corporation Zoomable user interface data generation
US8806371B2 (en) * 2009-03-26 2014-08-12 Apple Inc. Interface navigation tools
US8819570B2 (en) * 2009-03-27 2014-08-26 Zumobi, Inc Systems, methods, and computer program products displaying interactive elements on a canvas
US20100251143A1 (en) * 2009-03-27 2010-09-30 The Ransom Group, Inc. Method, system and computer program for creating and editing a website
US8819597B2 (en) 2009-04-10 2014-08-26 Google Inc. Glyph entry on computing device
US9213541B2 (en) 2009-04-17 2015-12-15 ArtinSoft Corporation, S.A. Creation, generation, distribution and application of self-contained modifications to source code
US20100287530A1 (en) * 2009-05-05 2010-11-11 Borland Software Corporation Requirements definition using interactive prototyping
US8269737B2 (en) * 2009-08-20 2012-09-18 Hewlett-Packard Development Company, L.P. Method and apparatus for interpreting input movement on a computing device interface as a one- or two-dimensional input
CA2774728C (en) 2009-11-13 2019-02-12 Irdeto Canada Corporation System and method to protect java bytecode code against static and dynamic attacks within hostile execution environments
US8782562B2 (en) * 2009-12-02 2014-07-15 Dell Products L.P. Identifying content via items of a navigation system
US8407598B2 (en) 2009-12-09 2013-03-26 Ralph Lee Burton Dynamic web control generation facilitator
JP5523090B2 (en) * 2009-12-25 2014-06-18 キヤノン株式会社 INPUT DEVICE, CONTROL METHOD FOR INPUT DEVICE, PROGRAM, AND STORAGE MEDIUM
US8533667B2 (en) 2009-12-30 2013-09-10 International Business Machines Corporation Call wizard for information management system (IMS) applications
CN101763218A (en) * 2010-01-06 2010-06-30 广东欧珀移动通信有限公司 Input method for handheld equipment
US20110173537A1 (en) 2010-01-11 2011-07-14 Everspeech, Inc. Integrated data processing and transcription service
WO2011116160A1 (en) 2010-03-19 2011-09-22 Siemens Healthcare Diagnostics Inc. Modular diagnostic instrument workstation architecture and method
US8316323B2 (en) * 2010-03-26 2012-11-20 Microsoft Corporation Breadcrumb navigation through heirarchical structures
US8631350B2 (en) * 2010-04-23 2014-01-14 Blackberry Limited Graphical context short menu
US20120089914A1 (en) * 2010-04-27 2012-04-12 Surfwax Inc. User interfaces for navigating structured content
US20110271184A1 (en) * 2010-04-28 2011-11-03 Microsoft Corporation Client application and web page integration
US9160756B2 (en) 2010-05-19 2015-10-13 International Business Machines Corporation Method and apparatus for protecting markup language document against cross-site scripting attack
JP5674779B2 (en) * 2010-06-03 2015-02-25 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Scroll device, scroll method, scroll program, and integrated circuit
CN102270125A (en) * 2010-06-04 2011-12-07 中兴通讯股份有限公司 Device and method for developing Web application
US20110314415A1 (en) * 2010-06-21 2011-12-22 George Fitzmaurice Method and System for Providing Custom Tooltip Messages
US8706854B2 (en) 2010-06-30 2014-04-22 Raytheon Company System and method for organizing, managing and running enterprise-wide scans
US8544027B2 (en) 2010-07-30 2013-09-24 Sap Ag Logical data model abstraction in a physically distributed environment
US8630462B2 (en) * 2010-08-31 2014-01-14 Activate Systems, Inc. Methods and apparatus for improved motion capture
JP2012069065A (en) * 2010-09-27 2012-04-05 Nintendo Co Ltd Information processing program, and information processing device and method
US8612366B2 (en) 2010-09-29 2013-12-17 Moresteam.Com Llc Systems and methods for performing design of experiments
US8990199B1 (en) 2010-09-30 2015-03-24 Amazon Technologies, Inc. Content search with category-aware visual similarity
US20120124555A1 (en) 2010-11-11 2012-05-17 Codekko Software, Inc. Optimization of Compiled Control Objects
US9069577B2 (en) * 2010-11-23 2015-06-30 Apple Inc. Grouping and browsing open windows
CN102023749A (en) * 2010-12-02 2011-04-20 广东宝莱特医用科技股份有限公司 Area dragging treating method of list type control on touch screen interface of medical equipment
US8612874B2 (en) 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US20120191508A1 (en) 2011-01-20 2012-07-26 John Nicholas Gross System & Method For Predicting Outcome Of An Intellectual Property Rights Proceeding/Challenge
JP2012168790A (en) 2011-02-15 2012-09-06 Brother Ind Ltd Display program and display device
US9384183B2 (en) 2011-03-31 2016-07-05 Infosys Limited Method and system for reporting web standard non-compliance of web pages
US9152616B2 (en) 2011-04-28 2015-10-06 Flipboard, Inc. Template-based page layout for web content
US9753699B2 (en) * 2011-06-16 2017-09-05 Microsoft Technology Licensing, Llc Live browser tooling in an integrated development environment
US8566100B2 (en) 2011-06-21 2013-10-22 Verna Ip Holdings, Llc Automated method and system for obtaining user-selected real-time information on a mobile communication device
US8799862B2 (en) * 2011-06-24 2014-08-05 Alcatel Lucent Application testing using sandboxes
CN102253841B (en) * 2011-08-09 2014-07-23 东莞兆田数码科技有限公司 Small-scale graphical user interface system
US20130080913A1 (en) * 2011-09-22 2013-03-28 Microsoft Corporation Multi-column notebook interaction
US8836654B2 (en) 2011-10-04 2014-09-16 Qualcomm Incorporated Application window position and size control in (multi-fold) multi-display devices
JP5553812B2 (en) * 2011-10-26 2014-07-16 株式会社ソニー・コンピュータエンタテインメント Scroll control device, terminal device, and scroll control method
KR101888457B1 (en) 2011-11-16 2018-08-16 삼성전자주식회사 Apparatus having a touch screen processing plurality of apllications and method for controlling thereof
US8799780B2 (en) 2011-11-28 2014-08-05 International Business Machines Corporation Installation wizard with multidimensional views
US8799988B2 (en) 2012-01-25 2014-08-05 Microsoft Corporation Document communication runtime interfaces
US20150058709A1 (en) 2012-01-26 2015-02-26 Michael Edward Zaletel Method of creating a media composition and apparatus therefore
US10185703B2 (en) 2012-02-20 2019-01-22 Wix.Com Ltd. Web site design system integrating dynamic layout and dynamic content
KR101892567B1 (en) * 2012-02-24 2018-08-28 삼성전자 주식회사 Method and apparatus for moving contents on screen in terminal
US9389872B2 (en) 2012-03-16 2016-07-12 Vmware, Inc. Software wizard implementation framework
EP2665042A1 (en) * 2012-05-14 2013-11-20 Crytek GmbH Visual processing based on interactive rendering
US9043722B1 (en) * 2012-06-19 2015-05-26 Surfwax, Inc. User interfaces for displaying relationships between cells in a grid
US20140096042A1 (en) * 2012-07-09 2014-04-03 Aaron Tyler Travis Method and system for generating and storing a collection of interactive browsers within a navigation plane
US9195477B1 (en) 2012-10-09 2015-11-24 Sencha, Inc. Device profiles, deep linking, and browser history support for web applications
US9244971B1 (en) 2013-03-07 2016-01-26 Amazon Technologies, Inc. Data retrieval from heterogeneous storage systems
US9158518B2 (en) 2013-03-11 2015-10-13 Blackberry Limited Collaborative application development environment using a connected device
WO2014157908A1 (en) 2013-03-27 2014-10-02 Samsung Electronics Co., Ltd. Device and method for displaying execution result of application
US10410003B2 (en) 2013-06-07 2019-09-10 Apple Inc. Multiple containers assigned to an application
US20150095849A1 (en) * 2013-09-30 2015-04-02 Microsoft Corporation Dialogs positioned with action visualization
US9875116B2 (en) 2013-11-26 2018-01-23 Cellco Partnership Sharing of a user input interface of an application session of one application between two or more applications

Also Published As

Publication number Publication date
US20150095813A1 (en) 2015-04-02
KR20160062225A (en) 2016-06-01
US20150095849A1 (en) 2015-04-02
WO2015048205A1 (en) 2015-04-02
KR20160064115A (en) 2016-06-07
SG11201601888UA (en) 2016-04-28
RU2016111604A (en) 2017-10-02
JP6446038B2 (en) 2018-12-26
CN105683909A (en) 2016-06-15
RU2686822C2 (en) 2019-04-30
WO2015048206A1 (en) 2015-04-02
JP2016533556A (en) 2016-10-27
CN105659199A (en) 2016-06-08
BR112016004551A8 (en) 2020-02-11
CN105593813A (en) 2016-05-18
US20150095851A1 (en) 2015-04-02
CN105593813B (en) 2019-09-24
IL244368A0 (en) 2016-04-21
RU2016111604A3 (en) 2018-07-12
WO2015048600A1 (en) 2015-04-02
CN105683907B (en) 2019-06-28
EP3053028A1 (en) 2016-08-10
TW201516834A (en) 2015-05-01
MX2016004113A (en) 2016-06-06
TW201528103A (en) 2015-07-16
EP3053017A1 (en) 2016-08-10
EP3053031A1 (en) 2016-08-10
JP6465870B2 (en) 2019-02-06
EP3053030A1 (en) 2016-08-10
CN105683907A (en) 2016-06-15
RU2016111610A3 (en) 2018-07-05
US9483549B2 (en) 2016-11-01
CL2016000729A1 (en) 2016-11-18
US20150095811A1 (en) 2015-04-02
WO2015048601A1 (en) 2015-04-02
US20150095842A1 (en) 2015-04-02
KR102186865B1 (en) 2020-12-04
CN105683908B (en) 2019-11-19
AU2014324618A1 (en) 2016-02-25
KR20160063340A (en) 2016-06-03
CN105683908A (en) 2016-06-15
US9792354B2 (en) 2017-10-17
PH12016500256A1 (en) 2016-05-16
US20150095854A1 (en) 2015-04-02
CN105683909B (en) 2019-06-25
WO2015048203A1 (en) 2015-04-02
US9727636B2 (en) 2017-08-08
EP3053028B1 (en) 2018-12-12
US20150095759A1 (en) 2015-04-02
CN105593812A (en) 2016-05-18
TW201528106A (en) 2015-07-16
CA2922725A1 (en) 2015-04-02
WO2015048204A1 (en) 2015-04-02
US20150095365A1 (en) 2015-04-02
US20150095791A1 (en) 2015-04-02
TW201528108A (en) 2015-07-16
MX2016003946A (en) 2016-06-17
US20150095846A1 (en) 2015-04-02
SG10201802632SA (en) 2018-05-30
JP2016532924A (en) 2016-10-20
EP3053027A1 (en) 2016-08-10
RU2679540C2 (en) 2019-02-11
US20150095812A1 (en) 2015-04-02
US9805114B2 (en) 2017-10-31
HK1222731A1 (en) 2017-07-07
US9754018B2 (en) 2017-09-05
WO2015048602A1 (en) 2015-04-02
AU2014324620A1 (en) 2016-02-25
EP3053029A1 (en) 2016-08-10
RU2016111610A (en) 2017-10-02
US9672276B2 (en) 2017-06-06

Similar Documents

Publication Publication Date Title
US20150095842A1 (en) Extendable blade sequence along pannable canvas direction
US11544257B2 (en) Interactive table-based query construction using contextual forms
US10990419B2 (en) Dynamic multi monitor display and flexible tile display
US11442924B2 (en) Selective filtered summary graph
US11544248B2 (en) Selective query loading across query interfaces
US7739622B2 (en) Dynamic thumbnails for document navigation
US10261660B2 (en) Orbit visualization animation
US20190155802A1 (en) Supplementing events displayed in a table format
US20170329483A1 (en) Viewport for multi application user interface
US20130332865A1 (en) Activity initiation and notification user interface
US8572500B2 (en) Application screen design allowing interaction
EP3127013B1 (en) Service gallery user interface presentation
US20230359442A1 (en) Code context assembly
US11366571B2 (en) Visualization components including sliding bars
US20110234637A1 (en) Smart gestures for diagram state transitions

Legal Events

Date Code Title Description
FZDE Discontinued

Effective date: 20200930