US20150095812A1 - Extensible and context-aware commanding infrastructure - Google Patents

Extensible and context-aware commanding infrastructure

Info

Publication number
US20150095812A1
Authority
US
United States
Prior art keywords
context
commands
command
sensitive
user interface
Legal status
Abandoned
Application number
US14/231,873
Inventor
Andrew Birck
Brad Olenick
Leon Ezequiel Welicki
Nafisa Bhojawala
Stephen Michael Danton
Jonathan Lucero
Dina-Marie Ledonna Supino
Jesse David Francisco
Vishal R. Joshi
Karandeep Singh Anand
William J. Staples
Madhur Joshi
Julio O. Casal
Jonah Bush Sterling
Current Assignee
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Technology Licensing LLC
Application filed by Microsoft Technology Licensing LLC
Priority to US14/231,873
Assigned to MICROSOFT CORPORATION: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LUCERO, Jonathan, DANTON, STEPHEN MICHAEL, STAPLES, WILLIAM J., STERLING, Jonah Bush, JOSHI, VISHAL R., FRANCISCO, JESSE DAVID, BHOJAWALA, Nafisa, CASAL, Julio O., BIRCK, Andrew, OLENICK, BRAD, JOSHI, MADHUR, ANAND, KARANDEEP SINGH, WELICKI, LEON EZEQUIEL, SUPINO, Dina-Marie Ledonna
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION
Publication of US20150095812A1


Classifications

    • G06F16/3329 Natural language query formulation or dialogue systems
    • G06F16/2428 Query predicate definition using graphical user interfaces, including menus and forms
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/0485 Scrolling or panning
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F40/117 Tagging; Marking up; Designating a block; Setting of attributes
    • G06F8/34 Graphical or visual programming
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G06F9/452 Remote windowing, e.g. X-Window System, desktop virtualisation
    • G06F9/4856 Task life-cycle resumption being on a different machine, e.g. task migration, virtual machine migration

Definitions

  • the user interface may be rich in allowing different user interface elements to be presented in entirely different contexts.
  • the user interface might include different contexts such as a favorites area, a blade, and hubs. These represent different places where user interface elements (also called herein “parts”) can be displayed.
  • the commands associated with a given asset can be available in all of these different contexts, enabling the user to take action on a resource in whatever way is most convenient (and, in addition, new commands can be added to accommodate the specifics of each context if needed).
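  • As an illustrative, non-authoritative sketch of that idea (the names UiContext, Asset, and getCommandsFor below are hypothetical, not taken from the patent), the same core commands could follow an asset into every context, with optional context-specific additions appended:

```typescript
// Hypothetical sketch: the same core (non-context-sensitive) commands follow an
// asset into every user interface context; context-specific commands are appended.
type UiContext = "blade" | "favorites" | "hub" | "grid" | "searchResult";

interface Command {
  id: string;
  label: string;
  execute: () => Promise<void>;
}

interface Asset {
  id: string;
  coreCommands: Command[];                                // available everywhere
  contextCommands: Partial<Record<UiContext, Command[]>>; // optional extras per context
}

function getCommandsFor(asset: Asset, context: UiContext): Command[] {
  return [...asset.coreCommands, ...(asset.contextCommands[context] ?? [])];
}
```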
  • Commands can be associated with different resources in the system.
  • the resource might be a portion of the user interface itself, such as a part.
  • a portion of the user interface may be what will be referred to herein as a “blade”.
  • a blade is a user interface element that may be placed on a canvas that extends in an extendible direction (such as horizontally). For substantially all of a particular range of the canvas along the extendible direction, the blade may occupy substantially all of the canvas in the dimension perpendicular to that extendible direction.
  • the resource might also be associated with an actual asset in the system, such as a website, database, virtual machine, and so forth. Blades or other parts can also be associated with assets, creating a transitive relationship between the commands and their container if so desired.
  • FIG. 2 illustrates a user interface element 200 in the form of a blade.
  • the blade is associated with an asset in the form of a website (called “thisisanothersite1” in the user interface element 200 of FIG. 2 ).
  • Commands are presented in a command bar 210 at the top of the blade in this context.
  • the command bar 210 represents a context-sensitive mechanism for visualizing commands when the commands are displayed in the context of a blade.
  • the command bar 210 is illustrated as visualizing three non-context-sensitive commands including the start command 211 , the stop command 212 , and the restart command 213 .
  • the start command 211 is deemphasized as not selectable since the web site is already running, as evidenced within the status window 230 .
  • the commands 211 , 212 and 213 are “non-context-sensitive” in that regardless of the user interface context in which they are displayed, at least the selectable non-context-sensitive commands (in this case the stop command 212 and the restart command 213 ) will still be displayed.
  • the command bar 210 also includes an overflow control 221 (also called hereinafter a “command bar expansion control 221 ”) that is presented when there are more commands associated with the blade than the blade can display in the available space.
  • FIG. 3 illustrates a user interface 300 that represents modifications to the user interface 200 that would occur if the user selects the command bar expansion control 221 .
  • the command bar 210 is augmented to be an augmented command bar 310 that shows a second row of commands. For instance, a non-context-sensitive delete command 214 is illustrated in the second row.
  • the selectable non-context-sensitive commands 212 through 214 may be presented regardless of the user interface context in which the commands appear.
  • the non-context sensitive commands may be basic commands associated with the resource that the user might like to initiate regardless of the user interface context in which the resource is presented. For instance, a user might like to start, stop, restart, or delete a web site from any one of a number of different user interface contexts.
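  • A minimal sketch of such core website commands follows, assuming a hypothetical WebsiteState shape and illustrative command identifiers; selectability is driven by the persisted running state of the resource:

```typescript
// Hypothetical sketch of the core (non-context-sensitive) commands for a website
// asset; whether "start" or "stop" is selectable depends on the persisted state.
interface WebsiteState {
  name: string;
  running: boolean;
}

interface CoreCommand {
  id: "start" | "stop" | "restart" | "delete";
  label: string;
  isSelectable: (site: WebsiteState) => boolean;
}

const websiteCoreCommands: CoreCommand[] = [
  { id: "start",   label: "Start",   isSelectable: s => !s.running },
  { id: "stop",    label: "Stop",    isSelectable: s => s.running },
  { id: "restart", label: "Restart", isSelectable: s => s.running },
  { id: "delete",  label: "Delete",  isSelectable: () => true },
];
```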
  • the command bars 210 and 310 also include context-sensitive commands.
  • one such context-sensitive command is an overflow indicator 221 , hereinafter referred to as a “command bar expand command” 221 .
  • another context-sensitive command is a browse command 222 , which is related to the underlying asset (e.g., the web site), but which is specific to a presentation in a particular user interface context. For instance, a user might like to browse to the web site when the web site is associated with the blade (since there is more space available to usefully browse), but the user might not be so interested in browsing if they are working in the context of a smaller user interface portion that is associated with that web site.
  • Another context-sensitive command is illustrated as the reset publish profile command 223 in FIG. 3 .
  • FIG. 3 illustrates another context-sensitive command in the form of a command bar collapse control 224 , which when selected returns the user interface 300 to that of the user interface 200 of FIG. 2 .
  • there are thus several kinds of context-sensitive commands, including one or more of 1) non-selectable non-context-sensitive commands (such as the start command 211 ), 2) commands that are associated with an underlying resource, but which are not to be performed in every user interface context (such as the browse command 222 and the reset publish profile command 223 ), and 3) commands that are associated with the user interface element itself, but not the underlying resource (such as the overflow indicator 221 or command bar expansion command).
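  • The three kinds of context-sensitive commands listed above can be sketched, purely for illustration, as a discriminated union (the kind names below are assumptions, not terms from the patent):

```typescript
// Hypothetical sketch: the three kinds of context-sensitive commands described
// above, modeled as a discriminated union (names are illustrative only).
type ContextSensitiveCommand =
  | { kind: "deemphasizedCore"; commandId: string }                               // 1) core command not selectable in the current state (e.g., "start")
  | { kind: "resourceSpecificToContext"; commandId: string; contexts: string[] }  // 2) e.g., "browse", "reset publish profile"
  | { kind: "uiElementCommand"; commandId: string };                              // 3) e.g., command bar expand/collapse

// Example value: a browse command that only appears in the blade context.
const browse: ContextSensitiveCommand = {
  kind: "resourceSpecificToContext",
  commandId: "browse",
  contexts: ["blade"],
};
```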
  • FIGS. 2 and 3 illustrate user interface elements when web commands are associated with a blade, which is one example of a user interface context.
  • web commands may be displayed in other user interface contexts. For instance, suppose that the web commands are in a smaller user interface part that is within the favorites area, within an activity pane, or within a grid. In those user interface contexts, the active non-context-sensitive commands may be displayed in a context menu.
  • FIG. 4 illustrates an example context menu 400 .
  • the context menu 400 again visualizes the active non-context-sensitive commands, including the stop command 412 (corresponding to the stop command 212 of FIG. 2 ), the restart command 413 (corresponding to the restart command 213 of FIG. 2 ), and the delete command 414 (corresponding to the delete command 214 of FIG. 3 ). Because the state of the underlying resource (i.e., the web site) was persisted, the system recognized that the start command is not a selectable command given the current state of the resource. Thus, when accessing commands for that same web site via another user interface context, the active non-context-sensitive commands are again displayed.
  • the context menu 400 also includes a single context-sensitive command in the form of an unpin command 421 , which would remove the associated user interface element from the user interface context.
  • the blade user interface element 200 is one example of a user interface context with the command bar 210 being an associated context-sensitive visualization for the commands.
  • the context menu 400 is another example of the associated context-sensitive visualization for the commands, which is associated with other user interface contexts (such as smaller parts, favorites areas, grids, activity panes, and so forth).
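  • A hedged sketch of this pairing between user interface contexts and visualization mechanisms follows. The mapping entries are assumptions for illustration; the description above only states that blades use a command bar and that smaller parts, favorites areas, grids, and activity panes use a context menu:

```typescript
// Hypothetical sketch: each user interface context maps to one context-sensitive
// visualization mechanism, so the same commands render consistently per context.
type Visualization = "commandBar" | "contextMenu" | "extendedContextMenu";

const visualizationFor: Record<string, Visualization> = {
  blade: "commandBar",
  favorites: "contextMenu",
  grid: "contextMenu",
  activityPane: "contextMenu",
  searchResult: "extendedContextMenu", // assumption for illustration only
};

function render(commands: string[], context: string): string {
  const mechanism = visualizationFor[context] ?? "contextMenu";
  return `${mechanism}: ${commands.join(", ")}`;
}
```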
  • FIG. 5 illustrates another user interface element 500 that represents a more extended context menu.
  • This user interface element 500 might appear when accessing commands to operate on the web site from yet another user interface context. Accordingly, the user interface element 500 represents yet a third example context-sensitive mechanism for visualizing the web commands.
  • the user interface element 500 again displays the non-context-sensitive commands, including the stop command 512 (corresponding to the stop commands 212 and 412 in FIGS. 2 and 4 , respectively), the restart command 513 (corresponding to the restart commands 213 and 413 in FIGS. 2 and 4 , respectively), and the delete command 514 (corresponding to the delete commands 214 and 414 in FIGS. 3 and 4 , respectively).
  • the underlying resource is the web site, and the state of the web site has been preserved. Accordingly, the start command 511 (corresponding to the start command 211 of FIG. 2 ) is displayed, but in deemphasized form.
  • the browse command 522 (corresponding to the browse command 222 of FIG. 2 ) and the reset publish profile command 523 (corresponding to the reset publish profile command 223 of FIG. 2 ) are also displayed, even though they are context-sensitive commands.
  • the commands 511 through 514 , 522 and 523 are application commands 510 (also referred to herein as “extrinsic commands”) being offered by application developers and not the underlying system.
  • the extended context menu 500 also includes system commands 530 , such as an unpin command 531 , and size selection commands 522 through 524 .
  • system commands 530 are offered by the system regardless of the underlying resource, so long as the commands were selected within the given user interface context that generated the extended context menu 500 .
  • the built-in commands provide general infrastructure services (pin/unpin parts, resizing parts, restoring layout, and so forth) and are general in that they apply across all usage domains.
  • Commands provided by application developers are domain specific. For instance, an example set of extrinsic commands for a web site application might include “start”, “stop”, “delete website” and so forth.
  • Commands are authored by application developers by leveraging a set of artifacts (interfaces and base classes provided by the system) that expose the command contract to the application developers. This allows the application developer to provide the actual behavior of the command (what happens when the command is executed), provide dialogs (which are optional) that will display at different moments of the command's life cycle, and influence the command life-cycle.
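  • A non-authoritative sketch of what such a command contract could look like follows. The interface and member names are invented for illustration; only the concepts of behavior, optional dialogs, and life-cycle influence come from the description above:

```typescript
// Hypothetical sketch of a command contract exposed to application developers.
// The developer supplies the behavior and optional dialogs; surfacing the command
// and driving its life-cycle remain owned by the system.
type LifecycleStage = "pending" | "executing" | "succeeded" | "failed";

interface DialogSpec {
  template: "confirmation" | "progress" | "success" | "failure";
}

interface CommandContract {
  id: string;
  label: string;
  // Actual behavior: what happens when the command is executed.
  execute(resourceId: string): Promise<void>;
  // Optional dialogs keyed by the life-cycle stage at which they should appear.
  dialogs?: Partial<Record<LifecycleStage, DialogSpec>>;
  // Optional hook letting the command influence its own life-cycle.
  canProceed?(stage: LifecycleStage): boolean;
}

// Illustrative extrinsic command: stopping a website (service call omitted).
const stopWebsite: CommandContract = {
  id: "website.stop",
  label: "Stop",
  execute: async (_resourceId) => { /* call the website service here */ },
  dialogs: { pending: { template: "confirmation" } },
};
```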
  • the non-context-sensitive commands can follow a resource in multiple contexts.
  • commands associated with a website can be present in the website's blade (see FIG. 6A ), in the website startboard part (see FIG. 6B ), when the website is displayed in a grid (see FIG. 6C ), in the notifications panel, when the website is part of a search result (see FIG. 6D ) or anywhere the website is surfaced.
  • the status of the underlying resource is considered in each of FIGS. 6A through 6D , in that a start command is not offered given that the web site has already started.
  • the stop command, the restart command, and the delete command are offered regardless of the user interface context in which the commands and associated resource are visualized.
  • FIGS. 6A through 6D illustrate the non-context-sensitive commands being displayed via a different context-sensitive mechanism as a result of being in a different user interface context. Even though the user experience for displaying the commands and the context where the command is displayed may be different, the actual command is the same, as illustrated in FIG. 7 .
  • FIG. 8 illustrates a life-cycle 800 that the system may be aware of for all commands, whether built-in or extrinsic.
  • the life-cycle 800 may be tracked by, for example, a command state tracking module, which may be a single module or a collection of modules.
  • the developer can specify whether or not constrained user interface elements (or dialogs) are to appear at each of the transitions 811 through 814 for each command. Accordingly, when making a transition 811 through 814 , the system can check to determine whether a dialog is to appear as part of the transition. For instance, such dialogs could ask users for confirmations, inform of progress, or inform of the result of an operation, all depending on which transition 811 through 814 is being made, and what the resource associated with the command is.
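  • The following sketch, under stated assumptions, shows how a command state tracking module could walk a command through such a life-cycle and check for a developer-specified dialog at each transition. The state names beyond those mentioned above and the exact transition graph are assumptions; the reference numerals from FIG. 8 are only echoed in comments:

```typescript
// Hypothetical sketch of a command state tracking module. Only states 801 and 802
// and transitions 811-814 are identified in the description; the remaining names
// and the transition graph below are assumptions for illustration.
type State = "none" | "pending" | "executing" | "succeeded" | "failed";

interface TrackedCommand {
  commandId: string;
  resourceName: string;
  // Developer-specified dialogs, keyed by the state being entered.
  dialogOnEnter?: Partial<Record<State, (resourceName: string) => void>>;
}

const transitions: Array<{ from: State; to: State }> = [
  { from: "none", to: "pending" },        // cf. transition 811
  { from: "pending", to: "executing" },   // cf. transition 812 (assumed)
  { from: "executing", to: "succeeded" }, // cf. transition 813 (assumed)
  { from: "executing", to: "failed" },    // cf. transition 814 (assumed)
];

function advance(cmd: TrackedCommand, current: State, to: State): State {
  const legal = transitions.some(t => t.from === current && t.to === to);
  if (!legal) return current;                  // ignore moves not in the life-cycle
  cmd.dialogOnEnter?.[to]?.(cmd.resourceName); // surface a dialog if the developer asked for one
  return to;
}
```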
  • FIG. 9 illustrates an example of a dialog 900 .
  • the system is aware that the user has selected a stop command. The system may then track the overall lifecycle 800 of the stop command even though the system might not be aware of all that is involved in stopping the web site.
  • the system is also aware of the resource type being operated upon (i.e., a web site) as well as an identifier for that resource (“Wandering”).
  • the stop command begins transitioning (as represented by transition 811 ) from the Non state 801 to the Pending state 802 .
  • as part of this transition, the system verifies whether the browser developer has indicated that a dialog is to appear at this point for this type of resource (e.g., a website), and/or whether the web site developer has indicated that a dialog is to appear at this point for that particular resource (e.g., the “Wandering” website).
  • the browser developer and/or the website author may also specify a dialog template in cases in which there are multiple templates that could be used for that transition and resource type.
  • in this example, the system verifies that a dialog is to appear, and thus presents the dialog 900 .
  • the dialog 900 may be generated knowing nothing more than which transition is involved (and potentially also which dialog template to use, which may also be specified by the developer).
  • the system may then populate the dialog template using the name of the resource (e.g., the “Wandering” website), and then present the dialog 900 to the user.
  • the presentation of dialogs may be consistent throughout the system regardless of the command being executed, or the resource being operated upon, even without the system knowing the specifics of the underlying operations that support the command.
  • dialogs are data-driven and extremely constrained to provide a uniform user experience across applications. Dialog types include confirmation (with yes/no buttons), progress (deterministic and non-deterministic), success, and failure (with a retry button). Application developers can configure the command to surface the dialogs at certain points in the lifecycle of the operation.
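  • As a sketch of this constrained, data-driven dialog set (the type and field names are hypothetical), the four dialog shapes and a template populated from the resource name could be modeled as:

```typescript
// Hypothetical sketch of the constrained dialog set: confirmation (yes/no),
// progress (deterministic or non-deterministic), success, and failure (with retry).
type Dialog =
  | { type: "confirmation"; message: string }
  | { type: "progress"; message: string; percent?: number } // omit percent when non-deterministic
  | { type: "success"; message: string }
  | { type: "failure"; message: string; retry: () => void };

// The system can populate a template from the command label and resource name alone;
// the wording below is illustrative, not taken from the patent.
function confirmationFor(commandLabel: string, resourceName: string): Dialog {
  return { type: "confirmation", message: `${commandLabel} website '${resourceName}'?` };
}
```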
  • each stage in the life cycle can surface a different dialog, with the application developer indicating whether the corresponding dialog is to appear at each stage in the state machine.
  • the portal can provide abstractions that application developers can use to create intrinsic commands that the system will recognize (at least to the point of being able to track the state machine of FIG. 10 ).
  • An example can be extensible abstractions for “Save” and “Discard” commands that when used in forms are subject to the validation state and changes to the underlying form.
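  • A minimal sketch of such “Save”/“Discard” abstractions, assuming a hypothetical FormState with dirty and validation flags, might gate enablement as follows:

```typescript
// Hypothetical sketch of built-in "Save"/"Discard" command abstractions whose
// enablement is derived from the form's dirty and validation state.
interface FormState {
  isDirty: boolean;  // the underlying form has changes
  isValid: boolean;  // validation passed
}

const saveCommand = {
  id: "form.save",
  isEnabled: (f: FormState) => f.isDirty && f.isValid,
};

const discardCommand = {
  id: "form.discard",
  isEnabled: (f: FormState) => f.isDirty,
};
```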
  • Commands are executed asynchronously by the system. Commands provided by application authors are executed leveraging the system's isolation model to ensure that they do not compromise the overall portal (as the execution is isolated within the application that owns the command).
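  • The sketch below does not implement a real isolation model (e.g., per-extension iframes or workers are only an assumption and are not shown); it only illustrates the intended effect: an extrinsic command runs asynchronously, and a failure is reported against the owning application rather than propagated into the portal.

```typescript
// Hypothetical sketch only: commands run asynchronously, and a failure in an
// extrinsic command is attributed to the owning application instead of taking
// down the portal. The actual isolation mechanism is not described here.
async function runExtrinsicCommand(
  owningAppId: string,
  execute: () => Promise<void>,
  reportFailure: (appId: string, error: unknown) => void
): Promise<void> {
  try {
    await execute();                   // asynchronous; does not block the user interface
  } catch (error) {
    reportFailure(owningAppId, error); // contained to the owning application
  }
}
```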
  • the application developer creates a small set of commands that are available in multiple contexts. All capabilities of these commands (execution logic, dialogs, and so forth) are preserved, and the user experience is adapted to the constraints of where the command is rendered. This makes it possible for users to interact with a resource at any place in the user interface. There is no single location where “actions” can be executed; rather, any place in the portal allows rich interactions with resources.

Abstract

Computing systems in which multiple non-context-sensitive or core commands may be initiated from each of a number of different user interface contexts. There are also multiple context-sensitive mechanisms for visualizing the commands, depending on which of the multiple possible user interface contexts the commands appear in. At least some embodiments described herein also relate to the presentation of dialogs at various stages of the command lifecycle without the system needing to know the underlying operations of the command, while allowing the developer to specify when dialogs are to appear in that lifecycle.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of each of the following provisional patent applications, each of which is incorporated herein by reference in its entirety:
    • 1. U.S. Provisional Application Ser. No. 61/905,114, filed Nov. 15, 2013;
    • 2. U.S. Provisional Application Ser. No. 61/884,743, filed Sep. 30, 2013;
    • 3. U.S. Provisional Application Ser. No. 61/905,111, filed Nov. 15, 2013;
    • 4. U.S. Provisional Application Ser. No. 61/905,243, filed Nov. 17, 2013;
    • 5. U.S. Provisional Application Ser. No. 61/905,116, filed Nov. 15, 2013;
    • 6. U.S. Provisional Application Ser. No. 61/905,129, filed Nov. 15, 2013;
    • 7. U.S. Provisional Application Ser. No. 61/905,105, filed Nov. 15, 2013;
    • 8. U.S. Provisional Application Ser. No. 61/905,247, filed Nov. 17, 2013;
    • 9. U.S. Provisional Application Ser. No. 61/905,101, filed Nov. 15, 2013;
    • 10. U.S. Provisional Application Ser. No. 61/905,128, filed Nov. 15, 2013; and
    • 11. U.S. Provisional Application Ser. No. 61/905,119, filed Nov. 15, 2013.
    BACKGROUND
  • Computing systems and networks have transformed the way we work, play, and communicate. Computing systems obtain their functionality by executing commands on computing resources accessible to the computing system. Commands might be, for instance, initiated by a user. In that case, the user interfaces with a visualization of the command, thereby causing corresponding operations on the computing asset. During various stages of the lifecycle of a command, the user may be presented with dialogs that ask for confirmation, inform of success or failure, or inform of progress of the command.
  • BRIEF SUMMARY
  • At least some embodiments described herein relate to computing systems in which multiple non-context-sensitive or core commands may be initiated from each of a number of different user interface contexts. There are also multiple context-sensitive mechanisms for visualizing the commands, depending on which of the multiple possible user interface contexts the commands appear in. At least some embodiments described herein also relate to the presentation of dialogs at various stages of the command lifecycle without the system needing to know the underlying operations of the command, while allowing the developer to specify when dialogs are to appear in that lifecycle.
  • This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
  • FIG. 1 abstractly illustrates an example computing system in which the principles described herein may be employed;
  • FIG. 2 illustrates a user interface element in the form of a blade, and in which commands are displayed in a command bar;
  • FIG. 3 illustrates a user interface that represents modifications to the user interface that would occur if the user selects the command bar expansion control of FIG. 2;
  • FIG. 4 illustrates an example context menu that represents another example of a context-sensitive mechanism for visualizing controls;
  • FIG. 5 illustrates an extended example context menu that represents another example of a context-sensitive mechanism for visualizing controls;
  • FIGS. 6A through 6D illustrate various visualizations of the same commands across different user interface contexts;
  • FIG. 7 illustrates that the commands within FIGS. 6A through 6D are indeed the same;
  • FIG. 8 illustrates a life-cycle that the system may be aware of for all commands, whether built-in or extrinsic;
  • FIG. 9 illustrates an example of a dialog that may appear upon initiating a stop website command; and
  • FIG. 10 illustrates that each stage in a life cycle of a command can surface a different dialog, with the application developer indicating whether the corresponding dialog is to appear at each stage in the state machine.
  • DETAILED DESCRIPTION
  • Commanding is a common way of describing behavior in a system, whether distributed or otherwise. Each command represents a unit of functionality that can be applicable to an asset within the system, to the system itself, or to any arbitrary artifact. Commands can be provided by the system (i.e., built-in commands) or by other parties (extrinsic commands).
  • In accordance with the principles described herein, commands are provided consistently across an entire system, even though the system itself may be operating a number of different applications composed by entirely different parties. Furthermore, the embodiments described herein help security by running commands in the right isolation mode, such that harmful (but not necessarily malicious) code does not compromise the system. Preferably, commands should not block the user interface, so they should run asynchronously. As to the user experience, the embodiments described herein allow commands to be surfaced following the same patterns (e.g., a command bar or context menu, also referred to herein as context-sensitive mechanisms for visualization) and provide interactivity options to users (e.g., dialogs) so they can participate in the operation and also understand the operation's status and result.
  • The principles described herein may be implemented using a computing system. For instance, the users may be engaging with the system using a client computing system. The executable logic supporting the system and providing visualizations thereon may also be performed using a computing system. The computing system may even be distributed. Accordingly, a brief description of a computing system will now be provided.
  • Computing systems are now increasingly taking a wide variety of forms. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered a computing system. In this description and in the claims, the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions that may be executed by the processor. The memory may take any form and may depend on the nature and form of the computing system. A computing system may be distributed over a network environment and may include multiple constituent computing systems. An example computing system is illustrated in FIG. 1.
  • As illustrated in FIG. 1, in its most basic configuration, a computing system 100 typically includes at least one processing unit 102 and memory 104. The memory 104 may be physical system memory, which may be volatile, non-volatile, or some combination of the two. The term “memory” may also be used herein to refer to non-volatile mass storage such as physical storage media. If the computing system is distributed, the processing, memory and/or storage capability may be distributed as well. As used herein, the term “executable module” or “executable component” can refer to software objects, routines, or methods that may be executed on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads).
  • In the description that follows, embodiments are described with reference to acts that are performed by one or more computing systems. If such acts are implemented in software, one or more processors of the associated computing system that performs the act direct the operation of the computing system in response to having executed computer-executable instructions. For example, such computer-executable instructions may be embodied on one or more computer-readable media that form a computer program product. An example of such an operation involves the manipulation of data. The computer-executable instructions (and the manipulated data) may be stored in the memory 104 of the computing system 100. Computing system 100 may also contain communication channels 108 that allow the computing system 100 to communicate with other message processors over, for example, network 110.
  • The computing system 100 also includes a display 112 on which a user interface, such as the user interfaces described herein, may be rendered. Such user interfaces may be generated in computer hardware or other computer-represented form prior to rendering. The presentation and/or rendering of such user interfaces may be performed by the computing system 100 by having the processing unit(s) 102 execute one or more computer-executable instructions that are embodied on one or more computer-readable media. Such computer-readable media may form all or a part of a computer program product.
  • Embodiments described herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments described herein also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
  • Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other tangible medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
  • Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
  • Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
  • Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
  • In accordance with principles described herein, a user interface element (often called herein a “part”) represents a basic unit of the user interface. Each of at least some of the parts is associated with corresponding controls that the user may interact with to thereby cause the system to execute respective commands. The execution of the command may, for instance, return data to be projected via the corresponding part. The parts may incorporate extrinsic commands that implement given contracts, and the system may reason about those commands.
  • In accordance with the principles described herein, commands (also called hereinafter “non-context-sensitive commands” or “core commands”) can be associated with resources in the system (such as a website, database, an arbitrary artifact, the system itself, or a piece of the user interface). This association may be persistent, such that when that resource is displayed in different user interface contexts, the non-context-sensitive commands associated with that resource are still available, but displayed using the right context-sensitive mechanism. For instance, the context-sensitive mechanism for visualizing these core commands may be a user experience form factor appropriate for the context in which the part is displayed. Consistency may also be further achieved by having the same context-visualization mechanism used to display the commands of any user interface element that is displayed in a particular user interface context. Commands are thus offered to the user via a well-defined experience that is consistent across the entire system.
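  • By way of illustration only, the following minimal sketch (in TypeScript) shows one way such a persistent association between core commands and a resource type might be expressed. The names used here (CommandRegistry, register, commandsFor, and so forth) are hypothetical and are not part of the described system's actual API.

```typescript
// Hypothetical sketch: core (non-context-sensitive) commands are registered
// once against an asset type, and the same set is resolved wherever that
// asset is surfaced (blade, startboard part, grid, search result, etc.).
interface Command {
  id: string;
  label: string;
  execute(assetId: string): Promise<void>;
}

class CommandRegistry {
  private byAssetType = new Map<string, Command[]>();

  register(assetType: string, commands: Command[]): void {
    this.byAssetType.set(assetType, commands);
  }

  commandsFor(assetType: string): Command[] {
    return this.byAssetType.get(assetType) ?? [];
  }
}

const registry = new CommandRegistry();
registry.register("website", [
  { id: "start", label: "Start", execute: async (id) => { /* call the hosting service */ } },
  { id: "stop", label: "Stop", execute: async (id) => { /* call the hosting service */ } },
  { id: "restart", label: "Restart", execute: async (id) => { /* call the hosting service */ } },
]);
```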
  • The system provides a set of abstractions through a portal that enable application developers to create commands. A command encapsulates an action in the system. The composition tree describes structure and the commands describe behavior. Commands provide a well-defined surface that the system can reason about to support units of behavior. Commands can be system commands (built-in) and custom commands (provided by the application developer). Commands are offered to the user via a well-defined experience that is consistent across the entire system. This experience is built-in and cannot be redefined by application developers. Application developers can only contribute with new commands, but not with new ways of exposing those commands to the user. Thus, the manner of exposing commands (the command experience) is governed by the system.
  • Commands provide application developers and users a consistent model across applications (sometimes referred to as “extensions”) that is compatible with browser capabilities and scalable to all parties to describe behavior in the system. A command may have affinity with portal assets, which can make it available everywhere that asset is presented, if so desired.
  • The user interface may be rich in allowing different user interface elements to be presented in entirely different contexts. For instance, as will be described further below, the user interface might include different contexts such as a favorites area, a blade, and hubs. These represent different places where user interface elements (also called herein “parts”) can be displayed. The commands associated with a given asset can be available in all of these different contexts, enabling the user to take action on a resource in whatever way is most convenient (and, in addition, new commands can be added to accommodate the specifics of each context if needed).
  • Commands can be associated with different resources in the system. For instance, the resource might be a portion of the user interface itself, such as a part. As another example, a portion of the user interface may be what will be referred to herein as a “blade”. A blade is a user interface element that may be placed on a canvas that extends in an extendible direction (such as horizontally). For substantially all of a particular range of the canvas in the extendible dimension, the blade may occupy substantially all of the canvas in the dimension perpendicular to the extendible direction of the canvas. The resource might also be an actual asset in the system, such as a website, database, virtual machine, and so forth. Blades or other parts can also be associated with assets, creating a transitive relationship between the commands and their container, if so desired.
  • Commands are visualized through different context-sensitive mechanisms, depending on the user interface context in which the associated part is displayed. Each context-sensitive mechanism supports a particular user experience. For instance, FIG. 2 illustrates a user interface element 200 in the form of a blade. The blade is associated with an asset in the form of a website (called “thisisanothersite1” in the user interface element 200 of FIG. 2). Commands are presented in a command bar 210 at the top of the blade in this context. Thus, the command bar 210 represents a context-sensitive mechanism for visualizing commands when the commands are displayed in the context of a blade.
  • In FIG. 2, the command bar 210 is illustrated as visualizing three non-context-sensitive commands including the start command 211, the stop command 212, and the restart command 213. The start command 211 is deemphasized as not selectable since the web site is already running, as evidenced within the status window 230. As will be seen from the subsequent windows, the commands 211, 212 and 213 are “non-context-sensitive” in that regardless of the user interface context in which the non-context-sensitive commands are displayed, at least the selectable non-context-sensitive commands (in this case the stop command 212 and the restart command 213) will still be displayed.
  • The command bar 210 also includes an overflow control 221 (also called hereinafter a “command bar expansion control 221”) that is presented when there are more commands associated with the blade than the blade can display in the available space. FIG. 3 illustrates a user interface 300 that represents modifications to the user interface 200 that would occur if the user selects the command bar expansion control 221. Note that the command bar 210 is augmented to be an augmented command bar 310 that shows a second row of commands. For instance, a non-context-sensitive delete command 214 is illustrated in the second row.
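  • As a rough illustration of the overflow behavior described above (a minimal sketch only; the function name layoutCommandBar is hypothetical), the command bar might split its commands into a visible row and an overflow row, reserving one slot for the expansion control:

```typescript
// Hypothetical sketch: split a command list into the row shown in the command
// bar and the overflow row revealed by the command bar expansion control.
function layoutCommandBar<T>(commands: T[], visibleSlots: number): { visible: T[]; overflow: T[] } {
  if (commands.length <= visibleSlots) {
    return { visible: commands, overflow: [] };
  }
  // One slot is reserved for the expansion control itself.
  const shown = Math.max(visibleSlots - 1, 0);
  return { visible: commands.slice(0, shown), overflow: commands.slice(shown) };
}
```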
  • The selectable non-context-sensitive commands 212 through 214 may be presented regardless of the user interface context in which the commands appear. For instance, when the user interface element is associated with a resource, the non-context-sensitive commands may be basic commands associated with the resource that the user might like to initiate regardless of the user interface context in which the resource is presented. For instance, a user might like to start, stop, restart, or delete a web site from any one of a number of different user interface contexts.
  • Referring again to FIGS. 2 and 3, the command bars 210 and 310 also include context-sensitive commands. For instance, there is an overflow indicator 221 (hereinafter referred to as a “command bar expand command” 221). Furthermore, there is a browse command 222, which is related to the underlying asset (e.g., the web site), but which is specific to a presentation in a particular user interface context. For instance, a user might like to browse to the web site when the web site is associated with the blade (since there is more space available to usefully browse), but the user might not be so interested in browsing if they are working in the context of a smaller user interface portion that is associated with that web site. Another context-sensitive command is illustrated as the reset publish profile command 223 in FIG. 3. FIG. 3 illustrates another context-sensitive command in the form of a command bar collapse control 224, which when selected returns the user interface 300 to that of the user interface 200 of FIG. 2.
  • In one embodiment, context-sensitive commands include one or more of 1) non-selectable non-context-sensitive commands (such as the start command 211), 2) commands that are associated with an underlying resource, but which are not to be performed in every user interface context (such as the browse command 222 and the reset publish profile command 223), and 3) commands that are associated with the user interface element itself, but not the underlying resource (such as the overflow indicator 221, or command bar expansion command).
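  • The classification above might be modeled as shown in the following sketch (hypothetical names; not the actual implementation), where commands without a declared context list are non-context-sensitive and appear everywhere, while the rest appear only in the contexts they declare:

```typescript
// Hypothetical sketch: selecting the commands to show in a given context.
type UiContext = "blade" | "part" | "grid" | "searchResult";

interface ContextualCommand {
  id: string;
  label: string;
  contexts?: UiContext[]; // undefined => non-context-sensitive, shown everywhere
}

function commandsForContext(all: ContextualCommand[], context: UiContext): ContextualCommand[] {
  return all.filter((c) => c.contexts === undefined || c.contexts.includes(context));
}
```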
  • FIGS. 2 and 3 illustrate user interface elements when web commands are associated with a blade, which is one example of a user interface context. However, web commands may be displayed in other user interface contexts. For instance, suppose that the web commands are in a smaller user interface part that is within the favorites area, within an activity pane or grid. In those user interface contexts, the active context sensitive commands may be displayed in a context menu. FIG. 4 illustrates an example context menu 400.
  • The context menu 400 again visualizes the active non-context-sensitive commands, including the stop command 412 (corresponding to the stop command 212 of FIG. 2), the restart command 413 (corresponding to the restart command 213 of FIG. 2), and the delete command 414 (corresponding to the delete command 214 of FIG. 3). Because the state of the underlying resource (i.e., the web site) was persisted, the system recognized that the start command is not a selectable command given the current state of the resource. Thus, when accessing commands for that same web site via another user interface context, the active non-context-sensitive commands are again displayed. The context menu 400 also includes a single context-sensitive command in the form of an unpin command 421, which would remove the associated user interface element from the user interface context.
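  • The persisted resource state drives selectability independently of the user interface context, as the following minimal sketch suggests (assuming, for illustration only, a simple two-state website):

```typescript
// Hypothetical sketch: which core commands are selectable is derived from the
// persisted state of the resource, not from the context in which it is shown.
type WebsiteState = "running" | "stopped";

function selectableCommands(state: WebsiteState): string[] {
  return state === "running"
    ? ["stop", "restart", "delete"] // "start" is deemphasized while running
    : ["start", "delete"];
}
```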
  • In this example, the blade user interface element 200 is one example of a user interface context with the command bar 210 being an associated context-sensitive visualization for the commands. The context menu 400 is another example of the associated context-sensitive visualization for the commands, which is associated with other user interface contexts (such as smaller parts, favorites areas, grids, activity panes, and so forth).
  • FIG. 5 illustrates another user interface element 500 that represents a more extended context menu. This user interface element 500 might appear when accessing commands to operate on the web site from yet another user interface context. Accordingly, the user interface element 500 represents yet a third example context-sensitive mechanism for visualizing the web commands.
  • The user interface element 500 again displays the non-context-sensitive commands including the stop command 512 (corresponding to the stop commands 212 and 412 in FIGS. 2 and 4, respectively), the restart command 513 (corresponding to the restart commands 213 and 413 in FIGS. 2 and 4, respectively), and the delete command 514 (corresponding to the delete commands 214 and 414 in FIGS. 3 and 4, respectively). Note that the underlying resource is the web site, and the state of the web site has been preserved. Accordingly, the start command 511 (corresponding to the start command 211 of FIG. 2) is displayed, but in deemphasized form. The browse command 522 (corresponding to the browse command 222 of FIG. 2) and the reset publish profile command 523 (corresponding to the reset publish profile command 223 of FIG. 2) are also displayed, even though they are context-sensitive commands.
  • The commands 511 through 514, 522 and 523 are application commands 510 (also referred to herein as “extrinsic commands”) offered by application developers rather than the underlying system. The extended context menu 500 also includes system commands 530, such as an unpin command 531 and size selection commands 532 through 534. Such system commands 530 are offered by the system regardless of the underlying resource, so long as the commands were selected within the given user interface context that generated the extended context menu 500.
  • The built-in commands provide general infrastructure services (pin/unpin parts, resizing parts, restoring layout, and so forth) and are general in that they apply across all usage domains. Commands provided by application developers are domain specific. For instance, an example set of extrinsic commands for a web site application might include “start”, “stop”, “delete website” and so forth.
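  • The split between application-provided (extrinsic) commands and system (intrinsic) commands in the extended context menu might be expressed along the lines of the following sketch (hypothetical names only):

```typescript
// Hypothetical sketch: the extended context menu groups application commands
// separately from the system commands that are offered regardless of resource.
interface MenuCommand {
  id: string;
  label: string;
  origin: "application" | "system";
}

function buildExtendedMenu(commands: MenuCommand[]): { application: MenuCommand[]; system: MenuCommand[] } {
  return {
    application: commands.filter((c) => c.origin === "application"),
    system: commands.filter((c) => c.origin === "system"),
  };
}
```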
  • Commands are authored by application developers by leveraging a set of artifacts (interfaces and base classes provided by the system) that expose the command contract to the application developers. This allows the application developer to provide the actual behavior of the command (what happens when the command is executed), provide dialogs (which are optional) that will display at different moments of the command's life cycle, and influence the command life-cycle.
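  • A minimal sketch of such a command contract is shown below. The names CommandBase, onExecute, and dialogFor are illustrative assumptions, not the actual artifacts exposed by the system; the point is only that the application developer supplies the behavior and, optionally, dialogs tied to the command's life-cycle:

```typescript
// Hypothetical sketch of a system-exposed command contract.
type Transition = "initiate" | "cancel" | "succeed" | "fail";

interface DialogSpec {
  kind: "confirmation" | "progress" | "success" | "failure";
  template?: string; // optional template name when several are available
}

abstract class CommandBase {
  constructor(public readonly id: string, public readonly label: string) {}

  // The application developer provides the actual behavior of the command.
  abstract onExecute(resourceId: string): Promise<void>;

  // Optionally surface a constrained dialog at a given life-cycle transition.
  dialogFor(transition: Transition): DialogSpec | undefined {
    return undefined;
  }
}

class StopWebsiteCommand extends CommandBase {
  constructor() {
    super("stop", "Stop");
  }

  async onExecute(websiteId: string): Promise<void> {
    // Domain-specific behavior, e.g., asking the hosting service to stop the site.
  }

  dialogFor(transition: Transition): DialogSpec | undefined {
    return transition === "initiate" ? { kind: "confirmation" } : undefined;
  }
}
```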
  • The non-context-sensitive commands can follow a resource in multiple contexts. For example, commands associated with a website can be present in the website's blade (see FIG. 6A), in the website startboard part (see FIG. 6B), when the website is displayed in a grid (see FIG. 6C), in the notifications panel, when the website is part of a search result (see FIG. 6D), or anywhere the website is surfaced. Note that the status of the underlying resource is considered in each of FIGS. 6A through 6D, in that a start command is not offered given that the web site has already started. Furthermore, note that the stop command, the restart command, and the delete command are offered regardless of the user interface context in which the commands and associated resource are visualized. Each of FIGS. 6A through 6D illustrates the non-context-sensitive commands being displayed via a different context-sensitive mechanism as a result of being in a different user interface context. Even though the user experience for displaying the commands and the context where the command is displayed may be different, the actual command is the same, as illustrated in FIG. 7.
  • FIG. 8 illustrates a life-cycle 800 that the system may be aware of for all commands, whether built-in or extrinsic. The life-cycle 800 may be tracked by, for example, a command state tracking module, which may be a single module or a collection of modules.
  • Before the command is initiated, there is no operation, which is the None state 801 in FIG. 8. When the command is initiated, the state transitions 811 to a pending state 802. In the pending state, the command is in process, and the results are pending. If the command is cancelled (transition 812), then the operation ends, transitioning back to the None state 801. If the operation completes and is successful (transition 813), the result is success (the “Success” state 803 in FIG. 8). Otherwise, if the operation is not successful (transition 814), the result is failure (the “Failure” state 804 in FIG. 8). The system understands this lifecycle even without understanding what specifically the underlying operation(s) of the command are doing.
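  • The life-cycle can be captured as a small state machine, as in the following sketch (illustrative only), which mirrors the states and transitions of FIG. 8:

```typescript
// Hypothetical sketch of the command life-cycle tracked by the system:
// None -> Pending on initiation; Pending -> None on cancel;
// Pending -> Success or Failure when the underlying operation completes.
type LifecycleState = "None" | "Pending" | "Success" | "Failure";
type LifecycleEvent = "initiate" | "cancel" | "succeed" | "fail";

const transitions: Record<LifecycleState, Partial<Record<LifecycleEvent, LifecycleState>>> = {
  None: { initiate: "Pending" },
  Pending: { cancel: "None", succeed: "Success", fail: "Failure" },
  Success: {},
  Failure: {},
};

function nextState(state: LifecycleState, event: LifecycleEvent): LifecycleState {
  const next = transitions[state][event];
  if (next === undefined) {
    throw new Error(`Invalid transition '${event}' from state '${state}'`);
  }
  return next;
}
```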
  • The developer can specify whether or not constrained user interface elements (or dialogs) are to appear at each of the transitions 811 through 814 for each command. Accordingly, when making a transition 811 through 814, the system can check to determine whether a dialog is to appear as part of the transition. For instance, such dialogs could ask users for confirmations, inform of progress, or inform of the result of an operation, all depending on which of the transitions 811 through 814 is being made, and what the resource associated with the command is.
  • FIG. 9 illustrates an example of a dialog 900. In this example, the system is aware that the user has selected a stop command. The system may then track the overall lifecycle 800 of the stop command even though the system might not be aware of all that is involved in stopping the web site. The system is also aware of the resource type being operated upon (i.e., a web site) as well as an identifier for that resource (“Wandering”).
  • Immediately upon receiving the stop command, the stop command begins transitioning (as represented by transition 811) from the None state 801 to the Pending state 802. However, as part of this transition, the system verifies whether the browser developer has indicated that a dialog is to appear at this point for this type of resource (e.g., a website), and/or whether the web site developer has indicated that a dialog is to appear at this point for that particular resource (e.g., the “Wandering” website). The browser developer and/or the website author may also specify a dialog template in cases in which there are multiple templates that could be used for that transition and resource type.
  • Here, the system verifies that a dialog is to appear, and thus presents dialog 900. The dialog 900 may be generated knowing nothing more than which transition is involved (and potentially also a dialog template to use which may also be specified by the developer). The dialog 900 may then populate the dialog template using the name of the resource (e.g., “Wandering” website), and then present the dialog to the user. Thus, the presentation of dialogs may be consistent throughout the system regardless of the command being executed, or the resource being operated upon, even without the system knowing the specifics of the underlying operations that support the command.
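  • The check-and-populate behavior described above might look roughly like the following sketch (hypothetical names and template syntax; the actual dialog templates are not disclosed here):

```typescript
// Hypothetical sketch: at each transition the system checks whether a dialog
// was configured, and if so fills its template with the resource type and
// name before presenting it to the user.
type Transition = "initiate" | "cancel" | "succeed" | "fail";

interface DialogConfig {
  template: string; // e.g. "Are you sure you want to stop {resourceType} '{resourceName}'?"
}

function maybeShowDialog(
  config: Partial<Record<Transition, DialogConfig>>,
  transition: Transition,
  resourceType: string,
  resourceName: string,
  show: (text: string) => void
): void {
  const dialog = config[transition];
  if (dialog === undefined) {
    return; // no dialog configured for this transition
  }
  const text = dialog.template
    .replace("{resourceType}", resourceType)
    .replace("{resourceName}", resourceName);
  show(text);
}
```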
  • These dialogs are data-driven and extremely constrained so as to provide a uniform user experience across applications. Dialogs include confirmation (with yes/no buttons), progress (deterministic and non-deterministic), success, and failure (with a retry button). Application developers can configure the command to surface the dialogs at certain points in the lifecycle of the operation.
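  • The constrained set of dialog kinds might be modeled as a small discriminated union, as in this sketch (illustrative only):

```typescript
// Hypothetical sketch of the constrained, data-driven dialog kinds.
type Dialog =
  | { kind: "confirmation"; message: string }               // yes / no buttons
  | { kind: "progress"; message: string; percent?: number } // deterministic when percent is given
  | { kind: "success"; message: string }
  | { kind: "failure"; message: string; canRetry: true };   // offers a retry button

const confirmStop: Dialog = {
  kind: "confirmation",
  message: "Are you sure you want to stop website 'Wandering'?",
};
```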
  • As illustrated in FIG. 10, each stage in the life cycle can surface a different dialog, with the application developer indicating whether the corresponding dialog is to appear at each stage in the state machine.
  • In some cases, the portal can provide abstractions that application developers can use to create intrinsic commands that the system will recognize (at least to the point of being able to track the state machine of FIG. 10). An example can be extensible abstractions for “Save” and “Discard” commands that when used in forms are subject to the validation state and changes to the underlying form.
  • Commands are executed asynchronously by the system. Commands provided by application authors are executed leveraging the system's isolation model to ensure that they do not compromise the overall portal (as the execution is isolated within the application that owns the command).
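  • A rough sketch of this asynchronous, isolated execution is shown below (illustrative only; the actual isolation model, such as sandboxed execution within the owning application, is not reproduced here):

```typescript
// Hypothetical sketch: an extrinsic command runs asynchronously and its
// failure is contained, surfacing through the command's Failure state rather
// than crashing the portal shell.
async function executeIsolated(
  run: () => Promise<void>,
  onFailure: (error: unknown) => void
): Promise<void> {
  try {
    await run(); // in the real system this would be dispatched into the owning application's isolation boundary
  } catch (error) {
    onFailure(error);
  }
}
```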
  • The application developer creates a small set of commands that are available in multiple contexts. All capabilities of those commands (execution logic, dialogs, and so forth) are preserved, and the user experience is adapted to the constraints of where the command is rendered. This makes it possible for users to interact with a resource at any place in the user interface. There is no single location where “actions” can be executed; rather, any place in the portal allows rich interactions with resources.
  • Accordingly, a system has been described that provides consistency in how commands are visualized, as well as how dialogs associated with the command lifecycle are visualized. This is true even when user interface elements of different applications are present within the system.
  • The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (20)

What is claimed is:
1. A computer program product comprising one or more computer-readable storage media having thereon computer-executable instructions that are structured such that, when executed by one or more processors of a computing system, cause the computing system to instantiate and/or operate the following:
a plurality of non-context-sensitive commands that may be initiated from each of a plurality of user interface contexts; and
a plurality of context-sensitive mechanisms for visualizing the plurality of non-context-sensitive commands depending on which of the plurality of user interface contexts in which the plurality of non-context-sensitive commands appear.
2. The computer program product in accordance with claim 1, the plurality of non-context-sensitive commands being associated with a resource upon which each of the plurality of non-context-sensitive commands acts when selected.
3. The computer program product in accordance with claim 1, the one or more computer-readable storage media having thereon computer-executable instructions that are structured such that, when executed by one or more processors of a computing system, cause the computing system to further instantiate and/or operate the following:
one or more context-sensitive commands visualized with the plurality of non-context-sensitive commands, but which may differ depending on which of the plurality of user interface contexts in which the plurality of non-context-sensitive commands appear.
4. The computer program product in accordance with claim 1, the different contexts in the user interface including a first set of one or more contexts associated with a first application, and a second set of one or more contexts associated with a second application.
5. The computer program product in accordance with claim 4, the first set of one or more contexts associated with the first application including a first context of a particular context type, and the second set of one or more contexts associated with the second application also including a second context also of the same particular context type.
6. The computer program product in accordance with claim 5, any context of the particular context type being associated with a particular context-sensitive mechanism for visualizing the plurality of non-context-sensitive commands, such that the mechanism for visualizing the plurality of non-context-sensitive commands is the same in the first context of the particular context type and associated with the first application as the mechanism for visualizing the plurality of non-context-sensitive commands in the second context of the particular context type and associated with the second application.
7. The computer program product in accordance with claim 1, the plurality of non-context-sensitive commands being a first plurality of non-context-sensitive commands, the one or more computer-readable storage media further having thereon computer-executable instructions that are structured such that, when executed by one or more processors of the computing system, cause the computing system to instantiate and/or operate the following:
a second plurality of non-context-sensitive commands that may be initiated from each of the plurality of user interface contexts; the plurality of context-sensitive mechanisms also for visualizing the second plurality of non-context-sensitive commands,
such that for a given user interface context, the same context-sensitive mechanism is used to visualize the first plurality of non-context-sensitive commands as would be used to visualize the second plurality of non-context-sensitive commands in that given user interface context.
8. The computer program product in accordance with claim 1, the plurality of non-context-sensitive commands being selectable from a user interface element, the execution of at least one of the plurality of non-context-sensitive commands resulting in a change in data displayed in the user interface element.
9. The computer program product in accordance with claim 1, the plurality of context-sensitive mechanisms being intrinsic to a system, and not alterable by applications running within the system.
10. A method for executing a command from a user interface element, the method comprising:
an act of initiating the command; and
an act of tracking the command at a plurality of stages;
for each of the plurality of stages, determining whether a dialog is indicated as to be displayed, and if so, displaying a dialog for the corresponding stage that is consistent across a plurality of commands.
11. The method in accordance with claim 10, the method being performed by a command state tracking module, and the command being a first command, the method further comprising:
an act of initiating a second command; and
an act of tracking the second command at a plurality of stages of the second command that are the same as the plurality of stages of the first command;
for each of the plurality of stages of the second command, determining whether a dialog is indicated as to be displayed, and if so, displaying a dialog for the corresponding stage that is consistent across a plurality of commands.
12. The method in accordance with claim 11, the first and second commands both being intrinsic commands.
13. The method in accordance with claim 11, the first and second commands both being extrinsic commands.
14. The method in accordance with claim 11, one of the first and second commands being an extrinsic command, and the other of the first and second commands being an intrinsic command.
15. The method in accordance with claim 10, wherein the act of determining whether a dialog is to be displayed is performed at least at each transition between the plurality of stages.
16. The method in accordance with claim 15, the determination of whether or not to display a dialog being provided by a developer.
17. The method in accordance with claim 15, the act of displaying the dialog performed at each transition being a function of the transition.
18. The method in accordance with claim 10, the act of displaying the dialog being a function of a resource that is being operated upon.
19. A computer program product comprising one or more computer-readable storage media having thereon computer-executable instructions that are structured such that, when executed by one or more processors of a computing system, cause the computing system to perform a method for executing a command from a user interface element, the method comprising:
an act of initiating the command; and
an act of tracking the command at a plurality of stages;
for each of the plurality of stages, an act of determining whether a dialog is indicated as to be displayed, and if so, displaying a dialog for the corresponding stage that is consistent across a plurality of commands.
20. The computer program product in accordance with claim 19, the plurality of commands including a plurality of extrinsic commands and a plurality of intrinsic commands.
US14/231,873 2013-09-30 2014-04-01 Extensible and context-aware commanding infrastructure Abandoned US20150095812A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/231,873 US20150095812A1 (en) 2013-09-30 2014-04-01 Extensible and context-aware commanding infrastructure

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
US201361884743P 2013-09-30 2013-09-30
US201361905128P 2013-11-15 2013-11-15
US201361905114P 2013-11-15 2013-11-15
US201361905101P 2013-11-15 2013-11-15
US201361905129P 2013-11-15 2013-11-15
US201361905105P 2013-11-15 2013-11-15
US201361905111P 2013-11-15 2013-11-15
US201361905116P 2013-11-15 2013-11-15
US201361905119P 2013-11-15 2013-11-15
US201361905247P 2013-11-17 2013-11-17
US201361905243P 2013-11-17 2013-11-17
US14/231,873 US20150095812A1 (en) 2013-09-30 2014-04-01 Extensible and context-aware commanding infrastructure

Publications (1)

Publication Number Publication Date
US20150095812A1 true US20150095812A1 (en) 2015-04-02

Family

ID=52741177

Family Applications (11)

Application Number Title Priority Date Filing Date
US14/231,883 Active 2034-11-24 US9672276B2 (en) 2013-09-30 2014-04-01 Multi-act creation user interface element
US14/231,897 Active 2035-02-28 US9805114B2 (en) 2013-09-30 2014-04-01 Composable selection model through reusable component
US14/231,873 Abandoned US20150095812A1 (en) 2013-09-30 2014-04-01 Extensible and context-aware commanding infrastructure
US14/231,917 Abandoned US20150095846A1 (en) 2013-09-30 2014-04-01 Pan and selection gesture detection
US14/231,862 Active 2035-01-24 US9792354B2 (en) 2013-09-30 2014-04-01 Context aware user interface parts
US14/231,891 Active 2034-11-10 US9483549B2 (en) 2013-09-30 2014-04-01 Persisting state at scale across browser sessions
US14/231,912 Abandoned US20150095849A1 (en) 2013-09-30 2014-04-01 Dialogs positioned with action visualization
US14/231,869 Active 2034-08-22 US9754018B2 (en) 2013-09-30 2014-04-01 Rendering interpreter for visualizing data provided from restricted environment container
US14/231,846 Abandoned US20150095842A1 (en) 2013-09-30 2014-04-01 Extendable blade sequence along pannable canvas direction
US14/231,880 Abandoned US20150095365A1 (en) 2013-09-30 2014-04-01 Query building using schema
US14/231,905 Active 2034-09-08 US9727636B2 (en) 2013-09-30 2014-04-01 Generating excutable code from complaint and non-compliant controls

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US14/231,883 Active 2034-11-24 US9672276B2 (en) 2013-09-30 2014-04-01 Multi-act creation user interface element
US14/231,897 Active 2035-02-28 US9805114B2 (en) 2013-09-30 2014-04-01 Composable selection model through reusable component

Family Applications After (8)

Application Number Title Priority Date Filing Date
US14/231,917 Abandoned US20150095846A1 (en) 2013-09-30 2014-04-01 Pan and selection gesture detection
US14/231,862 Active 2035-01-24 US9792354B2 (en) 2013-09-30 2014-04-01 Context aware user interface parts
US14/231,891 Active 2034-11-10 US9483549B2 (en) 2013-09-30 2014-04-01 Persisting state at scale across browser sessions
US14/231,912 Abandoned US20150095849A1 (en) 2013-09-30 2014-04-01 Dialogs positioned with action visualization
US14/231,869 Active 2034-08-22 US9754018B2 (en) 2013-09-30 2014-04-01 Rendering interpreter for visualizing data provided from restricted environment container
US14/231,846 Abandoned US20150095842A1 (en) 2013-09-30 2014-04-01 Extendable blade sequence along pannable canvas direction
US14/231,880 Abandoned US20150095365A1 (en) 2013-09-30 2014-04-01 Query building using schema
US14/231,905 Active 2034-09-08 US9727636B2 (en) 2013-09-30 2014-04-01 Generating excutable code from complaint and non-compliant controls

Country Status (17)

Country Link
US (11) US9672276B2 (en)
EP (6) EP3053031A1 (en)
JP (2) JP6446038B2 (en)
KR (3) KR102186865B1 (en)
CN (6) CN105683907B (en)
AU (2) AU2014324618A1 (en)
BR (1) BR112016004551A8 (en)
CA (2) CA2922725A1 (en)
CL (1) CL2016000729A1 (en)
HK (1) HK1222731A1 (en)
IL (1) IL244368A0 (en)
MX (2) MX2016003946A (en)
PH (1) PH12016500256A1 (en)
RU (2) RU2686822C2 (en)
SG (2) SG10201802632SA (en)
TW (4) TW201528106A (en)
WO (7) WO2015048203A1 (en)

Family Cites Families (168)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6362033A (en) * 1986-09-02 1988-03-18 Nec Corp Display device for relative information
WO1994024657A1 (en) * 1993-04-20 1994-10-27 Apple Computer Inc. Interactive user interface
JPH09245188A (en) * 1996-03-12 1997-09-19 Fujitsu Ltd Graphic displaying method and its device
US5845299A (en) * 1996-07-29 1998-12-01 Rae Technology Llc Draw-based editor for web pages
US6091415A (en) * 1997-05-02 2000-07-18 Inventec Corporation System and method for displaying multiple dialog boxes in a window display
US5886694A (en) * 1997-07-14 1999-03-23 Microsoft Corporation Method for automatically laying out controls in a dialog window
US5995101A (en) * 1997-10-29 1999-11-30 Adobe Systems Incorporated Multi-level tool tip
US6236400B1 (en) * 1998-04-02 2001-05-22 Sun Microsystems, Inc. Method and apparatus for controlling the display of hierarchical information
US6473102B1 (en) 1998-05-11 2002-10-29 Apple Computer, Inc. Method and system for automatically resizing and repositioning windows in response to changes in display
US7801913B2 (en) * 1998-12-07 2010-09-21 Oracle International Corporation System and method for querying data for implicit hierarchies
JP2000331020A (en) * 1999-05-21 2000-11-30 Nippon Telegr & Teleph Corp <Ntt> Method and device for information reference and storage medium with information reference program stored
US6701513B1 (en) 2000-01-14 2004-03-02 Measurement Computing Corporation Program-development environment for use in generating application programs
US7243335B1 (en) 2000-02-17 2007-07-10 Microsoft Corporation Method and system for reducing coding complexity by providing intelligent manipulable defaults
US6681383B1 (en) 2000-04-04 2004-01-20 Sosy, Inc. Automatic software production system
US6473891B1 (en) 2000-05-03 2002-10-29 Lsi Logic Corporation Wire routing to control skew
US7062475B1 (en) 2000-05-30 2006-06-13 Alberti Anemometer Llc Personalized multi-service computer environment
US6750887B1 (en) 2000-06-02 2004-06-15 Sun Microsystems, Inc. Graphical user interface layout manager
US7171455B1 (en) 2000-08-22 2007-01-30 International Business Machines Corporation Object oriented based, business class methodology for generating quasi-static web pages at periodic intervals
US6919890B2 (en) 2000-09-28 2005-07-19 Curl Corporation Grid and table layout using elastics
US6640655B1 (en) * 2000-10-03 2003-11-04 Varco I/P, Inc. Self tracking sensor suspension mechanism
US6950198B1 (en) 2000-10-18 2005-09-27 Eastman Kodak Company Effective transfer of images from a user to a service provider
US7370040B1 (en) 2000-11-21 2008-05-06 Microsoft Corporation Searching with adaptively configurable user interface and extensible query language
AU2002233991A1 (en) 2000-12-06 2002-06-18 American Express Travel Related Services Company, Inc. Layout generator system and method
US6760128B2 (en) 2000-12-06 2004-07-06 Eastman Kodak Company Providing a payment schedule for utilizing stored images using a designated date
JP2002182812A (en) * 2000-12-14 2002-06-28 Smg Kk Site map display system
US7233998B2 (en) 2001-03-22 2007-06-19 Sony Computer Entertainment Inc. Computer architecture and software cells for broadband networks
US7203678B1 (en) 2001-03-27 2007-04-10 Bea Systems, Inc. Reconfigurable query generation system for web browsers
US20020147963A1 (en) 2001-04-09 2002-10-10 Lee Rusty Shawn Method and apparatus for generating machine control instructions
US20020180811A1 (en) 2001-05-31 2002-12-05 Chu Sing Yun Systems, methods, and articles of manufacture for providing a user interface with selection and scrolling
US6950993B2 (en) 2001-08-02 2005-09-27 Microsoft Corporation System and method for automatic and dynamic layout of resizable dialog type windows
US6944829B2 (en) * 2001-09-25 2005-09-13 Wind River Systems, Inc. Configurable user-interface component management system
US7480864B2 (en) 2001-10-12 2009-01-20 Canon Kabushiki Kaisha Zoom editor
US7620908B2 (en) 2001-12-28 2009-11-17 Sap Ag Managing a user interface
US20050066037A1 (en) * 2002-04-10 2005-03-24 Yu Song Browser session mobility system for multi-platform applications
CA2385224C (en) 2002-05-07 2012-10-02 Corel Corporation Dockable drop-down dialogs
US7065707B2 (en) 2002-06-24 2006-06-20 Microsoft Corporation Segmenting and indexing web pages using function-based object models
US7293024B2 (en) 2002-11-14 2007-11-06 Seisint, Inc. Method for sorting and distributing data among a plurality of nodes
US7000184B2 (en) 2003-01-24 2006-02-14 The Cobalt Group, Inc. Remote web site editing in a standard web browser without external software
US20040165009A1 (en) * 2003-02-20 2004-08-26 International Business Machines Corporation Expansion of interactive user interface components
US7769794B2 (en) 2003-03-24 2010-08-03 Microsoft Corporation User interface for a file system shell
US7823077B2 (en) 2003-03-24 2010-10-26 Microsoft Corporation System and method for user modification of metadata in a shell browser
US7720616B2 (en) * 2003-05-07 2010-05-18 Sureprep, Llc Multi-stage, multi-user engagement submission and tracking process
US7417644B2 (en) 2003-05-12 2008-08-26 Microsoft Corporation Dynamic pluggable user interface layout
US7669140B2 (en) * 2003-08-21 2010-02-23 Microsoft Corporation System and method for providing rich minimized applications
US8037420B2 (en) * 2003-12-04 2011-10-11 International Business Machines Corporation Maintaining browser navigation relationships and for choosing a browser window for new documents
US7711742B2 (en) 2003-12-11 2010-05-04 International Business Machines Corporation Intelligent data query builder
US20080109785A1 (en) 2004-01-16 2008-05-08 Bailey Bendrix L Graphical Program Having Graphical and/or Textual Specification of Event Handler Procedures for Program Objects
GB2411331A (en) * 2004-02-19 2005-08-24 Trigenix Ltd Rendering user interface using actor attributes
US7577938B2 (en) * 2004-02-20 2009-08-18 Microsoft Corporation Data association
US7536672B1 (en) 2004-03-05 2009-05-19 Adobe Systems Incorporated Management of user interaction history with software applications
US7694233B1 (en) * 2004-04-30 2010-04-06 Apple Inc. User interface presentation of information in reconfigured or overlapping containers
CN100343802C (en) * 2004-05-10 2007-10-17 华为技术有限公司 Method and system for unifying users'interface
US8453065B2 (en) 2004-06-25 2013-05-28 Apple Inc. Preview and installation of user interface elements in a display environment
US8046712B2 (en) 2004-06-29 2011-10-25 Acd Systems International Inc. Management of multiple window panels with a graphical user interface
US8117542B2 (en) * 2004-08-16 2012-02-14 Microsoft Corporation User interface for displaying selectable software functionality controls that are contextually relevant to a selected object
US7434173B2 (en) 2004-08-30 2008-10-07 Microsoft Corporation Scrolling web pages using direct interaction
US7720867B2 (en) 2004-09-08 2010-05-18 Oracle International Corporation Natural language query construction using purpose-driven template
US8819569B2 (en) 2005-02-18 2014-08-26 Zumobi, Inc Single-handed approach for navigation of application tiles using panning and zooming
US7728825B2 (en) * 2005-03-22 2010-06-01 Microsoft Corporation Targeting in a stylus-based user interface
US20060224951A1 (en) * 2005-03-30 2006-10-05 Yahoo! Inc. Multiple window browser interface and system and method of generating multiple window browser interface
US20060236264A1 (en) * 2005-04-18 2006-10-19 Microsoft Corporation Automatic window resize behavior and optimizations
US8195646B2 (en) 2005-04-22 2012-06-05 Microsoft Corporation Systems, methods, and user interfaces for storing, searching, navigating, and retrieving electronic information
US7721225B2 (en) * 2005-05-03 2010-05-18 Novell, Inc. System and method for creating and presenting modal dialog boxes in server-side component web applications
US7730418B2 (en) 2005-05-04 2010-06-01 Workman Nydegger Size to content windows for computer graphics
US20070024646A1 (en) 2005-05-23 2007-02-01 Kalle Saarinen Portable electronic apparatus and associated method
US20060282771A1 (en) 2005-06-10 2006-12-14 Tad Vinci Verifying document compliance to a subsidiary standard
US20070033522A1 (en) 2005-08-02 2007-02-08 Lin Frank L System and method for dynamic resizing of web-based GUIs
US7933632B2 (en) 2005-09-16 2011-04-26 Microsoft Corporation Tile space user interface for mobile devices
US8543824B2 (en) 2005-10-27 2013-09-24 Apple Inc. Safe distribution and use of content
US7954064B2 (en) 2005-10-27 2011-05-31 Apple Inc. Multiple dashboards
US8434021B2 (en) * 2005-11-30 2013-04-30 Microsoft Corporation Centralized user interface for displaying contextually driven business content and business related functionality
US7836303B2 (en) 2005-12-09 2010-11-16 University Of Washington Web browser operating system
US8898203B2 (en) * 2005-12-27 2014-11-25 International Business Machines Corporation Generating a separable query design object and database schema through visual view editing
JP4635894B2 (en) 2006-02-13 2011-02-23 ソニー株式会社 Information processing apparatus and method, and program
JP4415961B2 (en) * 2006-03-15 2010-02-17 ブラザー工業株式会社 Removable media device and data control program
US20070233854A1 (en) 2006-03-31 2007-10-04 Microsoft Corporation Management status summaries
US20070234195A1 (en) 2006-04-03 2007-10-04 National Instruments Corporation Simultaneous update of a plurality of user interface elements displayed in a web browser
US7685519B1 (en) * 2006-07-18 2010-03-23 Intuit Inc. Process and apparatus for providing a customizable content tooltip
US20080018665A1 (en) * 2006-07-24 2008-01-24 Jay Behr System and method for visualizing drawing style layer combinations
US8869027B2 (en) 2006-08-04 2014-10-21 Apple Inc. Management and generation of dashboards
US20080065974A1 (en) 2006-09-08 2008-03-13 Tom Campbell Template-based electronic presence management
US7890957B2 (en) 2006-09-08 2011-02-15 Easyonme, Inc. Remote management of an electronic presence
US20080109714A1 (en) 2006-11-03 2008-05-08 Sap Ag Capturing screen information
US8082539B1 (en) * 2006-12-11 2011-12-20 Parallels Holdings, Ltd. System and method for managing web-based forms and dynamic content of website
JP5031353B2 (en) * 2006-12-15 2012-09-19 キヤノン株式会社 Display device, control method, and program
CN101004685A (en) * 2007-01-08 2007-07-25 叶炜 Method for realizing graphical user interface
US9032329B2 (en) 2007-03-23 2015-05-12 Siemens Product Lifecycle Management Software Inc. System and method for dialog position management
US8321847B1 (en) 2007-05-17 2012-11-27 The Mathworks, Inc. Dynamic function wizard
US20080306933A1 (en) * 2007-06-08 2008-12-11 Microsoft Corporation Display of search-engine results and list
US10019570B2 (en) 2007-06-14 2018-07-10 Microsoft Technology Licensing, Llc Protection and communication abstractions for web browsers
US8065628B2 (en) 2007-06-25 2011-11-22 Microsoft Corporation Dynamic user interface for previewing live content
KR20090000507A (en) * 2007-06-28 2009-01-07 삼성전자주식회사 Method and apparatus of displaying information
US8762880B2 (en) * 2007-06-29 2014-06-24 Microsoft Corporation Exposing non-authoring features through document status information in an out-space user interface
US8422550B2 (en) 2007-07-27 2013-04-16 Lagavulin Limited Apparatuses, methods, and systems for a portable, automated contractual image dealer and transmitter
US9009181B2 (en) * 2007-08-23 2015-04-14 International Business Machines Corporation Accessing objects in a service registry and repository
US8126840B2 (en) 2007-10-22 2012-02-28 Noria Corporation Lubrication program management system and methods
US8046353B2 (en) 2007-11-02 2011-10-25 Citrix Online Llc Method and apparatus for searching a hierarchical database and an unstructured database with a single search query
CN101499004A (en) * 2008-01-31 2009-08-05 株式会社日立制作所 System and method for connecting virtual machine and user interface
JP2009193423A (en) * 2008-02-15 2009-08-27 Panasonic Corp Input device for electronic equipment
US20090254822A1 (en) 2008-04-04 2009-10-08 International Business Machines Corporation Hi-efficiency wizard framework system and method
US8219385B2 (en) 2008-04-08 2012-07-10 Incentive Targeting, Inc. Computer-implemented method and system for conducting a search of electronically stored information
JP4171770B1 (en) * 2008-04-24 2008-10-29 任天堂株式会社 Object display order changing program and apparatus
US8375336B2 (en) * 2008-05-23 2013-02-12 Microsoft Corporation Panning content utilizing a drag operation
US8156445B2 (en) 2008-06-20 2012-04-10 Microsoft Corporation Controlled interaction with heterogeneous data
US20100005053A1 (en) 2008-07-04 2010-01-07 Estes Philip F Method for enabling discrete back/forward actions within a dynamic web application
US8566741B2 (en) 2008-08-29 2013-10-22 Microsoft Corporation Internal scroll activation and cursor adornment
US8402381B2 (en) * 2008-09-23 2013-03-19 International Business Machines Corporation Automatically arranging widgets of a model within a canvas using iterative region based widget relative adjustments
US8095412B1 (en) 2008-11-03 2012-01-10 Intuit Inc. Method and system for evaluating expansion of a business
KR20100049474A (en) * 2008-11-03 2010-05-12 삼성전자주식회사 A method for remote user interface session migration to other device
US20100306696A1 (en) 2008-11-26 2010-12-02 Lila Aps (Ahead.) Dynamic network browser
US7962547B2 (en) * 2009-01-08 2011-06-14 International Business Machines Corporation Method for server-side logging of client browser state through markup language
US20100229115A1 (en) 2009-03-05 2010-09-09 Microsoft Corporation Zoomable user interface data generation
US8806371B2 (en) * 2009-03-26 2014-08-12 Apple Inc. Interface navigation tools
US8819570B2 (en) * 2009-03-27 2014-08-26 Zumobi, Inc Systems, methods, and computer program products displaying interactive elements on a canvas
US20100251143A1 (en) 2009-03-27 2010-09-30 The Ransom Group, Inc. Method, system and computer program for creating and editing a website
US8819597B2 (en) 2009-04-10 2014-08-26 Google Inc. Glyph entry on computing device
US9213541B2 (en) * 2009-04-17 2015-12-15 ArtinSoft Corporation, S.A. Creation, generation, distribution and application of self-contained modifications to source code
US20100287530A1 (en) 2009-05-05 2010-11-11 Borland Software Corporation Requirements definition using interactive prototyping
US8269737B2 (en) * 2009-08-20 2012-09-18 Hewlett-Packard Development Company, L.P. Method and apparatus for interpreting input movement on a computing device interface as a one- or two-dimensional input
US8407598B2 (en) 2009-12-09 2013-03-26 Ralph Lee Burton Dynamic web control generation facilitator
JP5523090B2 (en) * 2009-12-25 2014-06-18 キヤノン株式会社 INPUT DEVICE, CONTROL METHOD FOR INPUT DEVICE, PROGRAM, AND STORAGE MEDIUM
US8533667B2 (en) 2009-12-30 2013-09-10 International Business Machines Corporation Call wizard for information management system (IMS) applications
CN101763218A (en) * 2010-01-06 2010-06-30 广东欧珀移动通信有限公司 Input method for handheld equipment
US20110173537A1 (en) 2010-01-11 2011-07-14 Everspeech, Inc. Integrated data processing and transcription service
EP2548200A4 (en) 2010-03-19 2014-01-22 Siemens Healthcare Diagnostics System and method for changeable focus modal windows
US8316323B2 (en) 2010-03-26 2012-11-20 Microsoft Corporation Breadcrumb navigation through heirarchical structures
US20120089914A1 (en) * 2010-04-27 2012-04-12 Surfwax Inc. User interfaces for navigating structured content
US20110271184A1 (en) * 2010-04-28 2011-11-03 Microsoft Corporation Client application and web page integration
US9160756B2 (en) 2010-05-19 2015-10-13 International Business Machines Corporation Method and apparatus for protecting markup language document against cross-site scripting attack
US9110586B2 (en) * 2010-06-03 2015-08-18 Panasonic Intellectual Property Corporation Of America Scrolling apparatus, scrolling method, non-transitory computer readable recording medium and intergrated circuit
CN102270125A (en) * 2010-06-04 2011-12-07 中兴通讯股份有限公司 Device and method for developing Web application
US20110314415A1 (en) * 2010-06-21 2011-12-22 George Fitzmaurice Method and System for Providing Custom Tooltip Messages
US8706854B2 (en) * 2010-06-30 2014-04-22 Raytheon Company System and method for organizing, managing and running enterprise-wide scans
US8544027B2 (en) 2010-07-30 2013-09-24 Sap Ag Logical data model abstraction in a physically distributed environment
US8630462B2 (en) * 2010-08-31 2014-01-14 Activate Systems, Inc. Methods and apparatus for improved motion capture
JP2012069065A (en) * 2010-09-27 2012-04-05 Nintendo Co Ltd Information processing program, and information processing device and method
US8612366B2 (en) 2010-09-29 2013-12-17 Moresteam.Com Llc Systems and methods for performing design of experiments
US8990199B1 (en) 2010-09-30 2015-03-24 Amazon Technologies, Inc. Content search with category-aware visual similarity
US20120124555A1 (en) 2010-11-11 2012-05-17 Codekko Software, Inc. Optimization of Compiled Control Objects
CN102023749A (en) * 2010-12-02 2011-04-20 广东宝莱特医用科技股份有限公司 Area dragging treating method of list type control on touch screen interface of medical equipment
US8612874B2 (en) * 2010-12-23 2013-12-17 Microsoft Corporation Presenting an application change through a tile
US20120191502A1 (en) 2011-01-20 2012-07-26 John Nicholas Gross System & Method For Analyzing & Predicting Behavior Of An Organization & Personnel
JP2012168790A (en) 2011-02-15 2012-09-06 Brother Ind Ltd Display program and display device
US9384183B2 (en) 2011-03-31 2016-07-05 Infosys Limited Method and system for reporting web standard non-compliance of web pages
US9152616B2 (en) 2011-04-28 2015-10-06 Flipboard, Inc. Template-based page layout for web content
US9753699B2 (en) * 2011-06-16 2017-09-05 Microsoft Technology Licensing, Llc Live browser tooling in an integrated development environment
US8566100B2 (en) 2011-06-21 2013-10-22 Verna Ip Holdings, Llc Automated method and system for obtaining user-selected real-time information on a mobile communication device
US8799862B2 (en) 2011-06-24 2014-08-05 Alcatel Lucent Application testing using sandboxes
CN102253841B (en) * 2011-08-09 2014-07-23 东莞兆田数码科技有限公司 Small-scale graphical user interface system
US20130080913A1 (en) * 2011-09-22 2013-03-28 Microsoft Corporation Multi-column notebook interaction
US8836654B2 (en) 2011-10-04 2014-09-16 Qualcomm Incorporated Application window position and size control in (multi-fold) multi-display devices
JP5553812B2 (en) * 2011-10-26 2014-07-16 株式会社ソニー・コンピュータエンタテインメント Scroll control device, terminal device, and scroll control method
KR101888457B1 (en) 2011-11-16 2018-08-16 Samsung Electronics Co., Ltd. Apparatus having a touch screen processing plurality of applications and method for controlling thereof
US8799780B2 (en) 2011-11-28 2014-08-05 International Business Machines Corporation Installation wizard with multidimensional views
US8799988B2 (en) 2012-01-25 2014-08-05 Microsoft Corporation Document communication runtime interfaces
US20150058709A1 (en) 2012-01-26 2015-02-26 Michael Edward Zaletel Method of creating a media composition and apparatus therefor
US10185703B2 (en) 2012-02-20 2019-01-22 Wix.Com Ltd. Web site design system integrating dynamic layout and dynamic content
KR101892567B1 (en) * 2012-02-24 2018-08-28 삼성전자 주식회사 Method and apparatus for moving contents on screen in terminal
US9389872B2 (en) 2012-03-16 2016-07-12 Vmware, Inc. Software wizard implementation framework
EP2665042A1 (en) 2012-05-14 2013-11-20 Crytek GmbH Visual processing based on interactive rendering
US9043722B1 (en) * 2012-06-19 2015-05-26 Surfwax, Inc. User interfaces for displaying relationships between cells in a grid
US20140096042A1 (en) * 2012-07-09 2014-04-03 Aaron Tyler Travis Method and system for generating and storing a collection of interactive browsers within a navigation plane
US9195477B1 (en) 2012-10-09 2015-11-24 Sencha, Inc. Device profiles, deep linking, and browser history support for web applications
US9244971B1 (en) 2013-03-07 2016-01-26 Amazon Technologies, Inc. Data retrieval from heterogeneous storage systems
WO2014157908A1 (en) 2013-03-27 2014-10-02 Samsung Electronics Co., Ltd. Device and method for displaying execution result of application
US10410003B2 (en) 2013-06-07 2019-09-10 Apple Inc. Multiple containers assigned to an application
US9672276B2 (en) * 2013-09-30 2017-06-06 Microsoft Technology Licensing, Llc Multi-act creation user interface element
US9875116B2 (en) 2013-11-26 2018-01-23 Cellco Partnership Sharing of a user input interface of an application session of one application between two or more applications

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5625763A (en) * 1995-05-05 1997-04-29 Apple Computer, Inc. Method and apparatus for automatically generating focus ordering in a dialog on a computer system
US6049812A (en) * 1996-11-18 2000-04-11 International Business Machines Corp. Browser and plural active URL manager for network computers
US6128632A (en) * 1997-03-06 2000-10-03 Apple Computer, Inc. Methods for applying rubi annotation characters over base text characters
US6460060B1 (en) * 1999-01-26 2002-10-01 International Business Machines Corporation Method and system for searching web browser history
US20030011638A1 (en) * 2001-07-10 2003-01-16 Sun-Woo Chung Pop-up menu system
US20080177994A1 (en) * 2003-01-12 2008-07-24 Yaron Mayer System and method for improving the efficiency, comfort, and/or reliability in Operating Systems, such as for example Windows
US20050088410A1 (en) * 2003-10-23 2005-04-28 Apple Computer, Inc. Dynamically changing cursor for user interface
US20080201401A1 (en) * 2004-08-20 2008-08-21 Rhoderick Pugh Secure server authentication and browsing
US8345014B2 (en) * 2008-07-12 2013-01-01 Lester F. Ludwig Control of the operating system on a computing device via finger angle using a high dimensional touchpad (HDTP) touch user interface
US20120246487A1 (en) * 2009-11-13 2012-09-27 Irdeto Canada Corporation System and Method to Protect Java Bytecode Code Against Static And Dynamic Attacks Within Hostile Execution Environments
US20110131532A1 (en) * 2009-12-02 2011-06-02 Russell Deborah C Identifying Content via Items of a Navigation System
US20110265035A1 (en) * 2010-04-23 2011-10-27 Marc Anthony Lepage Graphical context menu
US20120131496A1 (en) * 2010-11-23 2012-05-24 Apple Inc. Grouping and Browsing Open Windows
US20140258970A1 (en) * 2013-03-11 2014-09-11 Research In Motion Limited Collaborative application development environment using a connected device

Also Published As

Publication number Publication date
WO2015048203A1 (en) 2015-04-02
JP2016532924A (en) 2016-10-20
JP2016533556A (en) 2016-10-27
US20150095791A1 (en) 2015-04-02
US20150095813A1 (en) 2015-04-02
CN105593812A (en) 2016-05-18
AU2014324618A1 (en) 2016-02-25
CN105683907A (en) 2016-06-15
CL2016000729A1 (en) 2016-11-18
WO2015048205A1 (en) 2015-04-02
CN105683908A (en) 2016-06-15
PH12016500256A1 (en) 2016-05-16
US20150095365A1 (en) 2015-04-02
CA2922725A1 (en) 2015-04-02
WO2015048206A1 (en) 2015-04-02
KR20160064115A (en) 2016-06-07
CN105683909B (en) 2019-06-25
HK1222731A1 (en) 2017-07-07
EP3053028A1 (en) 2016-08-10
RU2016111604A (en) 2017-10-02
TW201528108A (en) 2015-07-16
TW201528106A (en) 2015-07-16
KR20160063340A (en) 2016-06-03
EP3053031A1 (en) 2016-08-10
US9483549B2 (en) 2016-11-01
WO2015048602A1 (en) 2015-04-02
CN105593813A (en) 2016-05-18
RU2686822C2 (en) 2019-04-30
WO2015048601A1 (en) 2015-04-02
SG10201802632SA (en) 2018-05-30
SG11201601888UA (en) 2016-04-28
AU2014324620A1 (en) 2016-02-25
WO2015048600A1 (en) 2015-04-02
US20150095851A1 (en) 2015-04-02
RU2679540C2 (en) 2019-02-11
US9792354B2 (en) 2017-10-17
RU2016111610A3 (en) 2018-07-05
IL244368A0 (en) 2016-04-21
CN105659199A (en) 2016-06-08
BR112016004551A8 (en) 2020-02-11
EP3053028B1 (en) 2018-12-12
US20150095849A1 (en) 2015-04-02
JP6446038B2 (en) 2018-12-26
US9754018B2 (en) 2017-09-05
US20150095811A1 (en) 2015-04-02
CN105683909A (en) 2016-06-15
TW201516834A (en) 2015-05-01
EP3053027A1 (en) 2016-08-10
RU2016111610A (en) 2017-10-02
US20150095854A1 (en) 2015-04-02
TW201528103A (en) 2015-07-16
US20150095846A1 (en) 2015-04-02
RU2016111604A3 (en) 2018-07-12
EP3053017A1 (en) 2016-08-10
JP6465870B2 (en) 2019-02-06
WO2015048204A1 (en) 2015-04-02
CN105683908B (en) 2019-11-19
US9727636B2 (en) 2017-08-08
US20150095759A1 (en) 2015-04-02
KR102186865B1 (en) 2020-12-04
EP3053030A1 (en) 2016-08-10
US20150095842A1 (en) 2015-04-02
EP3053029A1 (en) 2016-08-10
US9805114B2 (en) 2017-10-31
MX2016004113A (en) 2016-06-06
CA2922985A1 (en) 2015-04-02
KR20160062225A (en) 2016-06-01
CN105683907B (en) 2019-06-28
CN105593813B (en) 2019-09-24
US9672276B2 (en) 2017-06-06
MX2016003946A (en) 2016-06-17

Similar Documents

Publication Publication Date Title
US20150095812A1 (en) Extensible and context-aware commanding infrastructure
US8533666B2 (en) Interactive design environments to visually model, debug and execute resource oriented programs
US20130060596A1 (en) Easy Process Modeling Platform
US11797273B2 (en) System and method for enhancing component based development models with auto-wiring
US20150113500A1 (en) Integrated visualization for modeled customizations
EP3127013B1 (en) Service gallery user interface presentation
CN113168335A (en) Application integration for robotic process automation
US10678561B2 (en) Virtualizing extension code in an application
US9513931B2 (en) System for context based user requests for functionality
US20230385363A1 (en) Web site preview generation based on web site type
WO2023229693A1 (en) Web site preview generation with action control
US20220245206A1 (en) Process flow builder for user-configurable web component sequences
US10175953B2 (en) User interface control and communication
US11797638B2 (en) Aggregate component for parallel browser-initiated actions
Nguyen et al. An efficient platform for mobile application development on cloud environments
Rezaei Timelog system on Android OS

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BIRCK, ANDREW;OLENICK, BRAD;WELICKI, LEON EZEQUIEL;AND OTHERS;SIGNING DATES FROM 20140621 TO 20140811;REEL/FRAME:033657/0531

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034747/0417

Effective date: 20141014

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:039025/0454

Effective date: 20141014

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION