US20160018970A1 - Visualization Object Receptacle - Google Patents
- Publication number
- US20160018970A1 (U.S. application Ser. No. 14/803,898)
- Authority
- US
- United States
- Prior art keywords
- icon
- stack
- application
- user interface
- icons
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/04817—Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment using icons
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
An icon receptacle is disposed along a depth aspect, and one or more icons are disposed within the icon receptacle, one of which is a stack item.
Description
- This application is a continuation application of and claims priority to U.S. application Ser. No. 11/760,692, filed on Jun. 8, 2007.
- A graphical user interface allows a large number of graphical objects or items to be displayed on a display screen at the same time. Leading personal computer operating systems, such as the Apple Mac OS®, provide user interfaces in which a number of graphical representations of system objects can be displayed according to the needs of the user. Example system objects include system functions, alerts, windows, peripherals, files, and applications. Taskbars, menus, virtual buttons, a mouse, a keyboard, and other user interface elements provide mechanisms for accessing and/or activating the system objects corresponding to the displayed representations.
- The graphical objects and access to the corresponding system objects and related functions, however, should be presented in a manner that facilitates an intuitive user experience. The use of metaphors that represent concrete, familiar ideas facilitates such an intuitive user experience. For example, the metaphor of file folders can be used for storing documents; the metaphor of a file cabinet can be used for storing information on a hard disk; and the metaphor of the desktop can be used for an operating system interface.
- As the capabilities of processing devices progress, however, so do the demands on the graphical user interface to convey information to the users in an intuitive manner.
- Disclosed herein are methods, apparatus and systems including a visualization object receptacle. In one implementation, a computer readable medium stores instructions that are executable by a processing device, and upon such execution cause the processing device to generate a graphical user interface on a display device. The graphical user interface defines a depth aspect and a visualization object receptacle disposed along the depth aspect. A visualization object comprises a collective representation of graphical user interface elements that can be displayed in the visualization object receptacle.
- In another implementation, a visualization object receptacle disposed along a depth aspect is generated, and one or more visualization objects are generated within the visualization object receptacle. At least one of the visualization objects is a stack item.
- In another implementation, a three-dimensional desktop defining a depth aspect includes a visualization object receptacle disposed along the depth aspect. One or more visualization objects are disposed within the visualization object receptacle, and at least one of the visualization objects comprises a stack item.
- FIG. 1 is a block diagram of an example system that can be utilized to implement the systems and methods described herein.
- FIG. 2 is a block diagram of an example user interface architecture.
- FIG. 3 is an image of an example visualization object receptacle.
- FIG. 4 is an image of an example stack item.
- FIG. 5 is a block diagram of an example user interface engine architecture.
- FIG. 6 is a block diagram of an example system layer structure that can be utilized to implement the systems and methods described herein.
- FIG. 7 is a block diagram of an example multidimensional desktop environment.
- FIG. 8 is another block diagram of the example multidimensional desktop environment.
- FIG. 9 is another block diagram of the example multidimensional desktop environment.
- FIG. 10 is another block diagram of the example multidimensional desktop environment.
- FIG. 11 is a block diagram of another example multidimensional desktop environment.
- FIG. 12 is a block diagram of another example multidimensional desktop environment.
- FIG. 13 is a block diagram of another example multidimensional desktop environment.
- FIG. 14 is a block diagram of another example multidimensional desktop environment.
- FIG. 15 is a block diagram of another example multidimensional desktop environment.
- FIGS. 16A-D are block diagrams of other example multidimensional desktop environments.
- FIG. 17 is a block diagram of an example desktop transition.
- FIGS. 18A-18D are block diagrams of example visualization object receptacle indicators.
- FIGS. 19A and 19B are block diagrams of an example contextual menu for a visualization object receptacle.
- FIG. 20 is a block diagram of an example visualization object receptacle including type-ahead indications.
- FIGS. 21A and 21B are block diagrams of example selection indicators for a visualization model.
- FIG. 22 is a block diagram of another example multidimensional desktop environment.
- FIG. 23 is a block diagram of another example visualization object receptacle.
- FIG. 24 is a block diagram of an example stack item.
- FIG. 25 is a block diagram of another example stack item.
- FIG. 26 is a block diagram of another example stack item.
- FIG. 27 is a block diagram of another example stack item.
- FIGS. 28A and 28B are block diagrams of example stack items that are color-coded.
- FIG. 29 is a block diagram illustrating an example contextual control scheme applied to an example stack item.
- FIG. 30 is a block diagram illustrating the application of an example visualization model to an example stack item.
- FIGS. 31A and 31B are block diagrams illustrating the application of another example visualization model to an example stack item.
- FIG. 32 is a block diagram illustrating the application of another example visualization model to an example stack item.
- FIG. 33A is a block diagram of an example group association of an example stack item.
- FIG. 33B is a block diagram of an example group association of system objects.
- FIG. 34 is a flow diagram of an example process for transitioning a desktop.
- FIG. 35 is a flow diagram of another example process for transitioning between desktop types.
- FIG. 36 is a flow diagram of an example process for generating a multidimensional desktop environment.
- FIG. 37 is a flow diagram of an example process for rendering a side surface in a multidimensional desktop environment.
- FIG. 38 is a flow diagram of an example process for scrolling a side surface in a multidimensional desktop environment.
- FIG. 39 is a flow diagram of an example process for generating a selection indicator.
- FIG. 40 is a flow diagram of an example process for rendering desktop items.
- FIG. 41 is a flow diagram of an example process for generating an example application environment in a multidimensional desktop environment.
- FIG. 42 is a flow diagram of an example process for transitioning between application environments.
- FIG. 43 is a flow diagram of an example process for generating a visualization object receptacle.
- FIG. 44 is a flow diagram of an example process for color coding visualization objects.
- FIG. 45 is a flow diagram of an example process for color coding visualization objects of related system objects.
- FIG. 46 is a flow diagram of another example process for generating a visualization object receptacle.
- FIG. 47 is a flow diagram of an example process for generating a stack item.
- FIG. 48 is a flow diagram of an example process for displaying stack elements according to modal states.
- FIG. 49 is a flow diagram of an example process for selecting interaction models and/or visualization models.
- FIG. 50 is a flow diagram of another example process for generating a stack item.
- FIG. 51 is a flow diagram of an example process for displaying a stack item according to an execution context.
- FIG. 52 is a flow diagram of an example process for generating and displaying a stack item.
- FIG. 53 is a flow diagram of an example process for automatically selecting and applying an interaction model to a stack item.
- FIG. 54 is a flow diagram of another example process for automatically selecting and applying an interaction model to a stack item.
- FIG. 55 is a flow diagram of another example process for automatically selecting and applying an interaction model to a stack item.
- FIG. 56 is a flow diagram of another example process for automatically selecting and applying an interaction model to a stack item.
- FIG. 57 is a flow diagram of an example process for generating a divet.
- FIG. 58 is a flow diagram of an example process for generating a divet contextual menu.
- FIG. 1 is a block diagram of an example system 100 that can be utilized to implement the systems and methods described herein. The system 100 can, for example, be implemented in a computer device, such as any one of the personal computer devices available from Apple Inc., or other electronic devices. Other example implementations can also include video processing devices, multimedia processing devices, portable computing devices, portable communication devices, set top boxes, and other electronic devices.
- The example system 100 includes a processing device 102, a first data store 104, a second data store 106, a graphics device 108, input devices 110, output devices 112, and a network device 114. A bus system 116, such as a data bus and a motherboard, can be used to establish and control data communication between the components.
- The processing device 102 can, for example, include one or more microprocessors. The first data store 104 can, for example, include a random access memory storage device, such as a dynamic random access memory, or other types of computer-readable medium memory devices. The second data store 106 can, for example, include one or more hard drives, a flash memory, and/or a read only memory, or other types of computer-readable medium memory devices.
- The graphics device 108 can, for example, include a video card, a graphics accelerator card, or a display adapter, and is configured to generate and output images to a display device. In one implementation, the graphics device 108 can be realized in a dedicated hardware card connected to the bus system 116. In another implementation, the graphics device 108 can be realized in a graphics controller integrated into a chipset of the bus system 116. Other implementations can also be used.
- Example input devices 110 can include a keyboard, a mouse, a stylus, a video camera, a multi-touch surface, etc., and example output devices 112 can include a display device, an audio device, etc.
- The network device 114 can, for example, include a wired or wireless network interface operable to communicate data to and from a network 118. The network 118 can include one or more local area networks (LANs) or a wide area network (WAN), such as the Internet.
- In an implementation, the system 100 includes instructions defining an operating system stored in the first data store 104 and/or the second data store 106. Example operating systems can include the MAC OS® X series operating system, the WINDOWS® based operating system, or other operating systems. Upon execution of the operating system instructions, access to various system objects is enabled. Example system objects include data files, applications, functions, windows, etc. To facilitate an intuitive user experience, the system 100 includes a graphical user interface that provides the user access to the various system objects and conveys information about the system 100 to the user in an intuitive manner.
- FIG. 2 is a block diagram of an example user interface architecture 200. The user interface architecture 200 includes a user interface (UI) engine 202 that provides the user access to the various system objects 204 and conveys information about the system 100 to the user.
- Upon execution, the UI engine 202 can cause the graphics device 108 to generate a graphical user interface on an output device 112, such as a display device. In one implementation, the graphical user interface can include a multidimensional desktop 210 and a multidimensional application environment 212. In an implementation, the multidimensional desktop 210 and the multidimensional application environment 212 include x-, y- and z-axis aspects, e.g., a height, width and depth aspect. The x-, y- and z-axis aspects may define a three-dimensional environment, e.g., a "3D" or "2.5D" environment that includes a z-axis, e.g., depth, aspect.
- In an implementation, the multidimensional desktop 210 can include user interface elements, such as visualization objects 220, a visualization object receptacle 222, and stack items 224. In some implementations, the visualization objects 220, the visualization object receptacle 222 and the stack items 224 can be presented in a pseudo-three dimensional (i.e., "2.5D") or a three-dimensional environment as graphical objects having a depth aspect.
- A visualization object 220 can, for example, be a visual representation of a system object. In some implementations, the visualization objects 220 are icons. Other visualization objects can also be used, e.g., alert notification windows, menu command bars, windows, or other visual representations of system objects.
- In an implementation, the multidimensional application environment 212 can include an application environment distributed along a depth aspect. For example, a content frame, e.g., an application window, can be presented on a first surface, and control elements, e.g., toolbar commands, can be presented on a second surface.
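To make the depth aspect concrete, the following sketch shows one conventional way a z-axis (depth) aspect can map to on-screen position and scale in a "2.5D" environment. The projection formula and all names are illustrative assumptions; the patent does not specify a particular projection.

```python
from dataclasses import dataclass

@dataclass
class VisualizationObject:
    x: float  # horizontal position
    y: float  # vertical position
    z: float  # depth aspect: 0.0 = viewing surface, larger = farther back

def project_2_5d(obj: VisualizationObject, focal_length: float = 1.0):
    """Project a desktop coordinate with a depth aspect onto the 2D viewing surface.

    Uses a simple pinhole-style perspective divide: objects farther along
    the z-axis appear smaller and closer to the vanishing point.
    """
    scale = focal_length / (focal_length + obj.z)
    return obj.x * scale, obj.y * scale, scale  # screen x, screen y, draw scale

# An object on the viewing surface keeps its position and full size...
print(project_2_5d(VisualizationObject(100.0, 50.0, 0.0)))  # (100.0, 50.0, 1.0)
# ...while the same object pushed one focal length back is half size.
print(project_2_5d(VisualizationObject(100.0, 50.0, 1.0)))  # (50.0, 25.0, 0.5)
```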
- FIG. 3 is an image of an example visualization object receptacle 300. In one implementation, the visualization object receptacle 300 can include x-, y- and z-axis aspects, e.g., a height, width and depth. In another implementation, the visualization object receptacle 300 can have only a y- and z-axis aspect, e.g., a width and depth. In another implementation, the visualization object receptacle 300 can have only an x- and y-axis aspect, e.g., a height and a width. An example implementation of a visualization object receptacle 300 is the "Dock" user interface in the MAC OS® X Leopard operating system. Other implementations can also be used.
- In some implementations, one or more visualization objects, e.g., icons, can be disposed in the visualization object receptacle 300, e.g., an icon receptacle 300. In one implementation, a lighting and shading effect is applied to emphasize the depth aspect of the visualization object receptacle 300, as illustrated by the corresponding shadows and reflections of the icons.
- In some implementations, the visualization object receptacle 300 can include a front surface 319 to generate a height aspect. In some implementations, a notch 320 can be included in the visualization object receptacle 300. The notch 320 can, for example, be utilized to arrange visualization objects related to particular programs or functions, e.g., files and folders can be disposed on a first side of the notch 320 and applications can be disposed on a second side of the notch 320; or a user may define arrangements according to the notch 320, etc.
- In some implementations, the visualization object receptacle 300 can include status indicators, e.g., 330 and 332, disposed on the front surface 319. The status indicator 330 may be illuminated in a yellow color to indicate that the folder 304 is receiving a file download, and the status indicator 332 may be illuminated in a green color to indicate that a program associated with the visualization object 308 is running.
- In some implementations, the visualization object receptacle 300 may only define a depth aspect, e.g., the visualization object receptacle 300 may not include a front surface 319. In some implementations, the top surface of the visualization object receptacle 300 can be modeled as a liquid for addition and removal of visualization objects. For example, when a visualization object is added to the visualization object receptacle 300, the adjacent visualization objects may move apart to define an open space, and the added visualization object may emerge from the surface into the open space. Surface perturbations, e.g., ripples, can be generated to enhance the visual effect of the addition of the visualization object. Visualization objects can be removed by a substantially reversed visual effect.
- In another implementation, when a visualization object is added to the visualization object receptacle 300, the adjacent visualization objects may move apart to define an open space, and the added visualization object may fall onto the surface into the open space. Surface perturbations, e.g., ripples and splashes, can be generated to enhance the visual effect of the addition of the visualization object. Visualization objects can be removed by a substantially reversed visual effect. Additional features of visualization object receptacles and visualization objects disposed therein are described in more detail below.
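The add/remove behavior described above can be sketched as simple receptacle bookkeeping: neighbors move apart to open a gap, the new object is inserted, and a perturbation event per displaced neighbor is handed to the renderer. The function names and the ripple-event representation are illustrative assumptions, not the patent's implementation.

```python
def add_to_receptacle(receptacle, new_obj, index):
    """Insert new_obj at index, shifting neighbors apart to open a gap.

    receptacle: list of object names ordered left-to-right.
    Returns the updated list plus a list of 'ripple' effect events,
    one per displaced neighbor, for the renderer to animate.
    """
    ripples = [f"ripple:{name}" for name in receptacle[index:]]  # neighbors that slide over
    updated = receptacle[:index] + [new_obj] + receptacle[index:]
    return updated, ripples

def remove_from_receptacle(receptacle, obj):
    """Substantially reversed effect: close the gap left by a removed object."""
    return [name for name in receptacle if name != obj]

dock, effects = add_to_receptacle(["finder", "mail", "trash"], "safari", 1)
print(dock)     # ['finder', 'safari', 'mail', 'trash']
print(effects)  # ['ripple:mail', 'ripple:trash']
print(remove_from_receptacle(dock, "safari"))  # ['finder', 'mail', 'trash']
```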
- FIG. 4 is an image of an example stack item 400. In one implementation, the stack item 400 is a system object that includes a plurality of stack elements, e.g., stack elements 402, 404, 406 and 408. The stack item 400 is associated with the stack elements 402, 404, 406 and 408.
stack item identifier 410 can be displayed on the top stack element, e.g.,stack element 402. In one implementation, thestack item identifier 410 can, for example, comprise a title describing a stack type, e.g., “images” or “documents.” In another implementation, thestack item identifier 410 can, for example, comprise a visual indicator indicating an aspect of the stack, e.g., a dollar sign $ can be displayed for a stack item including system objects related to a financial analysis tool; or a representation of a coin can be displayed as a surface beneath the stack item, etc. Thestack item identifier 410 can, for example, be automatically generated, or can be generated by the user. Other stack item identifiers can also be used. - In one implementation, the
stack elements FIG. 4 . Other stack arrangements can also be used. In one implementation, eachstack element unique indicium - In some implementations, the
stack elements - The
stack item 400 can include visualization objects related to different types of system objects. For example, a stack item can include stack elements related to peripheral devices, e.g., hard drives, universal serial bus devices, etc.; or can include stack elements related to application windows; or can include stack elements related to system functions, e.g., menus, a shutdown function, a sleep function, a backup function, etc.; or can includes stack elements related to recent system alerts; or other system objects. - In some implementations, a
stack item 400 can include visualization objects related to different system views. For example, thestack element 402 can correspond to a work environment; thestack element 404 can correspond to a gaming environment; thestack element 406 can correspond to a music environment; and thestack element 408 can correspond to a movie environment. Selection of any of the corresponding elements 402-408 can cause the user interface to transition to the corresponding environment. - In some implementations, a
stack item 400 can include visualization objects related to multiple monitors. For example, if a monitor in a dual monitor user environment is disabled, the corresponding visualization objects displayed on the disabled monitor can collapse into a monitor stack on the remaining monitor. - Additional features of the stack items and corresponding stack elements are described in more detail below.
-
- FIG. 5 is a block diagram of an example user interface engine architecture 500. The UI engine 202 can, for example, include an interaction and visualization model engine 502, a physics engine 504, and a context engine 506. Other engines can also be included.
visualization model engine 502 can identify an association characteristic of associated visualization objects, e.g., icons. The associated graphical elements can be collectively displayed, e.g., in an object stack, or can be distributed in a desktop/folder hierarchy in which only one icon is displayed. Based on the identified characteristic, the interaction andvisualization model engine 502 can automatically select an interaction model and/or visualization mode that defines how the user may interact with and view the associated graphical elements. For example, if an identified association characteristic is the quantity of associated icons, an interaction model and/or visualization model for browsing the documents related to the icons can be selected based on the quantity. For example, if the quantity of associated icons is less than a first threshold, e.g., four, a mouse-over of any one of the four associated icons can present the associated icons in juxtaposition. Likewise, if the quantity of associated icons is greater than the first threshold and less than a second threshold, e.g., 16, a mouse-over of any one of the associated icons can present the associated icons in an overlapping display in which the icons cycle from back to front. Additionally, if the quantity of associated icons is greater than the second threshold, then a mouse-over of any one of the associated icons can present a scrollable list of associated documents. - Other interaction models and visualization model selection schemes can also be implemented. For example, the interaction and
visualization model engine 502 can cause related visualization objects to move across a user interface when a particular visualization object type is selected, e.g., selection of a word processing program icon may cause word processing document icons to move toward the word processing program icons. In another implementation, selection of a visualization object can cause unrelated visualization objects to be de-emphasize (e.g., reduce in size), and/or related visualization objects to be emphasized (e.g., increase in size). In another implementation, selection of a visualization object can cause related visualization objects to become illuminated. - In one implementation, the
physics engine 504 can apply a physics aspect, such as Newtonian physics models based on mass, velocity, etc., to the visual representations of system objects, such as icons. In an implementation, the icons can be modeled as rigid bodies or non-rigid bodies. For example, placing an icon on a surface next to adjacent icons can cause the adjacent icons to shift positions in response to a simulated disturbance from the icon placement. In one implementation, icon magnetism can be selectively enabled or disabled by the user. In one implementation, icons return to their initial positions upon a disabling of the magnetism aspect. In another implementation, a magnet icon can have a magnetism aspect selected by the user, e.g., a magnetism with respect to a word processing application, or a magnetism with respect to two or more applications, or a magnetism with respect to the last time a document was accessed, e.g., within the last two days, etc. - Other physics models can also be applied. For example, an application icon can include a magnetism aspect, and placing the magnetic application icon on the desktop can cause icons related to the application icon, e.g., icons representing application document files, to be attracted to the magnetic icon and move towards the magnetic icon. Likewise, icons for unrelated system objects, e.g., other application icons and other document icons, can be modeled as having an opposite magnetic polarity from the selected magnetic icon, and thus will be repulsed and shift away from the selected magnetic icon.
- The
context engine 506 can, for example, provide contextual control of a stack item based on a context. For example, stack items, such as thestack item 400, can be defined according to a protection context. Accordingly, system objects corresponding to stack elements within the stack item cannot be deleted until dissociated from the stack item. In some implementations, astack item 400 can have a locked context, and access to thestack item 400 can be password protected. Other contextual control can also be provided, such as contextual control based on a temporal context, e.g., a new object stack of recently added system objects; a download context, such as a download stack for recently downloaded files; or an execution context, or other context types. -
- FIG. 6 is a block diagram of example system layers 600 that can be utilized to implement the systems and methods described herein. Other system layer implementations, however, can also be used.
UI engine 202, or another UI engine capable of generating a three-dimensional user interface environment, operates at anapplication level 602 and implements graphical functions and features available through an application program interface (API)layer 604. Example graphical functions and features include graphical processing, supported by a graphics API, image processing, support by an imaging API, and video processing, supported by a video API. - The
API layer 604, in turn, interfaces with agraphics library layer 606. Thegraphics library layer 604 can, for example, be implemented as a software interface to graphics hardware, such as an implementation of the OpenGL specification. A driver/hardware layer 608 includes drivers and associated graphics hardware, such as a graphics card and associated drivers. -
- FIG. 7 is a block diagram 700 of an example multidimensional desktop environment. In the example implementation, the multidimensional desktop environment 700 includes a back surface 702 axially disposed, e.g., along the z-axis, from a viewing surface 704. In one implementation, the back surface 702 can, for example, be a two-dimensional desktop environment, including one or more menus. The viewing surface 704 can be defined by the entire image on a display device, e.g., a "front pane." One or more side surfaces, such as side surfaces 706, 708, 710 and 712, are extended from the back surface 702 to the viewing surface 704. A visualization object receptacle, e.g., an icon receptacle 714, is generated on one or more of the side surfaces, such as side surface 706. Although only one visualization object receptacle is shown, additional icon receptacles can also be displayed, e.g., along the side surface 708.
reflection region 716 can be generated on the side surface 706, e.g., the "floor." In an implementation, a reflection of the back surface 702 and of graphical items placed on the reflection region 716 can be generated; e.g., shapes 760 and 762 generate reflections in the reflection region 716. - In an implementation, the
visualization object receptacle 714 is positioned at a forward terminus 718 of the reflection region 716. In one implementation, the forward terminus 718 can be offset by an axial distance d from the viewing surface 704. In another implementation, the forward terminus 718 can terminate at the plane defined by the viewing surface 704. - In an implementation, the side surfaces 706, 708, 710 and 712 can intersect at
respective intersections. Although four side surfaces are shown in FIG. 7, fewer or greater numbers of side surfaces can be defined; for example, in an implementation, only some of the side surfaces, e.g., the side surfaces other than the side surface 710, may be defined. - In an implementation, the
intersections of the side surfaces can be rendered in various manners, e.g., as visually delineated or undelineated edges. - In an implementation, the side surfaces 706, 708, 710 and 712 are colored to emphasize the
back surface 702 and the reflection region 716. For example, the side surfaces 706, 708, 710 and 712 can be black in color, or respective patterns or colors can be rendered on each side surface. Other differentiation schemes including color schemes and image schemes can also be applied. - The
visualization object receptacle 714 can include a plurality of visualization objects, e.g., icons. The icons can, for example, correspond to system objects. The visualization object receptacle 714 and the icons can, for example, be implemented as described with respect to the visualization object receptacle 300 of FIG. 3, and as described in more detail below. - In an implementation, stack
items, e.g., the stack item 750, can be interposed between the visualization object receptacle 714 and the back surface 702. The stack items can, for example, be implemented as described with respect to the stack item 400 of FIG. 4 above, and as described in more detail below. In the implementation of FIG. 7, the stack items are disposed on the reflection region 716 and generate reflections in the reflection region 716. - Selection of a particular stack element in a stack item can, for example, launch an associated application if the stack element represents an application document; perform a system function if the stack element represents a system function; or instantiate some other system process.
- In an implementation, a stack item can be placed on the
visualization object receptacle 714. In another implementation, behavior of a stack item when in the visualization object receptacle 714 is similar to the behavior of the stack item when placed on the reflection region 716. - In an implementation, representations of system objects, e.g., icons, stack items, etc., can be disposed on the side surfaces 708, 710 and 712. For example, a window displayed on the
back surface 702 can be selected and dragged to one of the side surfaces 708, 710, or 712. Likewise, a stack item, such as the stack item 750, can be dragged and disposed on one of the side surfaces 708, 710, or 712. - In one implementation, a stack item is created when a representation of a system object, e.g., an icon, is placed on the surface of the
reflection region 716. For example, an icon related to a document can be displayed on the surface 712; upon a selection, dragging and placement of the icon on the reflection region 716, a stack item is created with at least the icon as a stack element. In an implementation, a stack item can also be created by a keyboard input; for example, a user can create a stack item for open windows by a Ctrl-W input, or create a stack item for peripherals by a Ctrl-P input, etc. Other processes to create stack items can also be used. - In one implementation, existing stack items are displaced to provide space for a newly created stack item. In one implementation, the
reflection region 716 can be defined by a surface aspect, such as an equable texture, and the stack items can be displaced along the surface according to a motion model. In another implementation, the reflection region 716 can be defined by a grid aspect, and the stack items can be aligned according to a grid snap. - Other textures and surface behaviors can also be used. In one implementation, a motion model is dependent on a selected surface aspect. For example, an equable texture, such as an image of a hardwood floor or a polished metallic surface, can be associated with a rigid-body Newtonian physics model; conversely, a visible grid aspect, or a raised texture, such as an image of a carpet, pebbles, etc., can be associated with a grid snap. In another implementation, the motion model and textures can be selected independently.
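The texture-to-motion-model association described above can be sketched as follows. The texture names and the grid cell size are assumed values for illustration; an actual implementation would dispatch to a full physics engine rather than an identity placement.

```python
# Illustrative sketch: choose a placement model from the floor's surface aspect.
GRID = 64  # assumed pixels per grid cell

def motion_model(texture):
    """Equable textures get free placement (a rigid-body physics model would
    refine it); gridded or raised textures snap items to the nearest cell."""
    if texture in ("hardwood", "metal"):          # equable surfaces
        return lambda x, y: (x, y)
    if texture in ("grid", "carpet", "pebbles"):  # grid-snap surfaces
        return lambda x, y: (round(x / GRID) * GRID, round(y / GRID) * GRID)
    raise ValueError(f"unknown texture: {texture}")
```

For example, dropping a stack item at (70, 130) on a gridded floor would snap it to (64, 128), while a hardwood floor leaves it where it lands.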
- In one implementation, a maximum number of stack items can be displayed in the
reflection region 716. Upon the insertion or creation of a new stack item, one or more existing stack items are removed from the reflection region 716. In one implementation, a consolidated stack item can be created. The consolidated stack item can, for example, be a collection of stack items, with each stack item being represented by a corresponding stack element. Selection of a corresponding stack element in a consolidated stack item will cause the corresponding stack item to be positioned on the reflection region, and will likewise cause another stack item to be positioned in the consolidated stack item. - In another implementation, one or more existing stack items can be removed from the
reflection region 716 by transitioning to an edge of the reflection region 716 and fading from view, e.g., the stack item 750 may shift towards the intersection 707 and fade by an atomizing effect, by a falling effect, or by some other effect. In another implementation, one or more existing stack items are removed from the reflection region 716 by transitioning to an edge of the reflection region 716 and moving onto one of the side surfaces, e.g., the stack item 750 may shift towards the intersection 707 and move up the side surface 708. -
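The consolidated stack item behavior described above can be sketched as a capped list with overflow into a consolidated collection. The cap of four and the function names are assumptions for illustration; the swap logic follows the text's description of selecting an element from the consolidated stack.

```python
# Illustrative sketch: cap the stack items shown on the reflection region,
# overflowing the oldest items into a consolidated stack item.
MAX_VISIBLE = 4  # assumed maximum number of displayed stack items

def insert_stack_item(visible, consolidated, new_item):
    """Add new_item to the floor; overflow moves into the consolidated stack."""
    visible = visible + [new_item]
    while len(visible) > MAX_VISIBLE:
        consolidated = consolidated + [visible.pop(0)]  # oldest item leaves first
    return visible, consolidated

def select_from_consolidated(visible, consolidated, item):
    """Selecting a stack element positions its stack item on the reflection
    region, and likewise pushes another stack item into the consolidated one."""
    consolidated = [s for s in consolidated if s != item]
    return insert_stack_item(visible, consolidated, item)
```

The swap keeps the number of visible stack items constant, which matches the exchange behavior the text describes.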
FIG. 8 is another block diagram 800 of the example multidimensional desktop environment. In the block diagram of FIG. 8, the visualization object receptacle 714 has been adjustably disposed along a depth axis, e.g., a z-axis, such that the visualization object receptacle 714 is disposed on the back surface 702. In one implementation, the visualization object receptacle 714 can, for example, be preeminently displayed. The visualization object receptacle 714 can, for example, be preeminently displayed by rendering the visualization object receptacle 714 in front of other graphical objects. For example, the icon 742 in the visualization object receptacle 714 is displayed in front of the stack item 750. Other methods can be used to preeminently display the visualization object receptacle 714, such as rendering graphical objects displayed in front of the visualization object receptacle as translucent objects. -
FIG. 9 is another block diagram 900 of the example multidimensional desktop environment. The system implementing the multidimensional desktop environment graphical user interface, such as the system 100 of FIG. 1, has received a selection command for the stack item 750. A selection command for a stack item can be generated by, for example, a mouse-over, a mouse click, a keyboard input, or by some other input. - In the implementation shown in
FIG. 9, a visualization model that causes the stack elements to fan is applied to the stack item 750. Thus, in response to a user input, e.g., a selection or a mouse over, the stack item 750 enters a second modal state from a first modal state and the forwardmost stack element 772 fans upward, followed by the remaining stack elements. While the stack item 750 is selected, a user can, for example, select and open a document related to one of the stack elements. Deselection of the stack item 750, e.g., ceasing the mouse over, causes the stack elements to collapse back into the stack item 750, and the stack item returns to the first modal state. Other selection processes can also be used. - In one implementation, the
stack elements fan out along a common path 780. In another implementation, the stack elements can fan out along separate respective paths. - In one implementation, one of several interaction and/or visualization models can be automatically selected for application to a stack item, such as the
stack item 750. The selection can, for example, be based on a characteristic of the stack item 750, e.g., the number of stack elements in the stack item 750. For example, if the stack item 750 includes a relatively small number of stack elements, the stack elements can be displayed according to the fanning arrangement of FIG. 9. - Other interaction and/or visualization model selection criterion or criteria can also be used. For example, stack elements related to documents in the
stack item 754 can be displayed in an overlapping leafing mode in which the document titles appear, as the user is more likely to discern the relevance of a document from the title than from a thumbnail image of the first page of the document. -
FIG. 10 is another block diagram 1000 of the example multidimensional desktop environment. The system implementing the multidimensional desktop environment graphical user interface, such as the system 100 of FIG. 1, has received a selection command for the stack item 750, and a visualization model that causes the stack elements to be displayed in substantially aligned positions is applied to the stack item 750. In the implementation of FIG. 10, the selection criterion can, for example, be based on a quantity. For example, if the quantity of associated icons is less than a first threshold, e.g., five, a selection of the stack item 750 can present the stack elements as illustrated in FIG. 10. - In one implementation, a selection indicator can be generated to indicate a selected stack item. For example, an under-lighting effect 1002 can be generated to indicate selection of the stack item 750. Other selection indicators can also be used, such as backlighting effects, enlargement effects, outlining effects, or other effects. -
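The quantity-based selection of a visualization model described above can be sketched as follows. The threshold of five follows the example in the text; the fan arc span and the grid layout are assumptions for illustration.

```python
import math

FAN_THRESHOLD = 5  # the example threshold from the text

def layout(stack_elements):
    """Return (model, positions). Small stacks fan out over an arc; larger
    stacks fall back to an aligned, matrix-style grid of (row, col) slots."""
    n = len(stack_elements)
    if n < FAN_THRESHOLD:
        # fan: spread elements over an assumed 60-degree arc (angles in radians)
        step = math.radians(60) / max(n - 1, 1)
        return "fan", [i * step for i in range(n)]
    # matrix: square-ish grid, filled row by row
    cols = math.ceil(math.sqrt(n))
    return "matrix", [(i // cols, i % cols) for i in range(n)]
```

A stack of three documents would fan, while a stack of nine would be laid out on a 3-column grid.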
Additional stack items, such as the stack items 1004 and 1006, can, for example, be automatically instantiated based on a context. For example, the stack item 1004 can automatically appear when the system implementing the graphical user interface of FIG. 10, such as the system 100 of FIG. 1, receives a notification that an event associated with another user that is designated as an "online buddy" has occurred, e.g., the "online buddy" has logged onto a network. - In another implementation, a stack item, such as the
stack item 1006, can automatically appear when an application corresponding to the stack item is selected or executed. For example, selecting the icon 732, which illustratively corresponds to a music application, will instantiate the stack item 1006 in accordance with a selection and/or execution context. - Other contextual controls can also be used, such as modal states, temporal contexts, etc.
-
FIG. 11 is a block diagram of another example multidimensional desktop environment. The multidimensional desktop environment of FIG. 11 includes a back surface 1102 axially disposed, e.g., along the z-axis, from a viewing surface 1104. In one implementation, the back surface 1102 can, for example, be a two-dimensional desktop environment, including one or more menus. One or more side surfaces, e.g., the side surfaces 1106, 1108, 1110 and 1112, extend from the back surface 1102 to the viewing surface 1104, and a visualization object receptacle 1114 is generated on one or more of the side surfaces, such as side surface 1106. - In one implementation, a
reflection region 1116 can be generated on the side surface 1106, e.g., the "floor." The reflection region 1116 can, for example, generate a reflection of the back surface 1102 and desktop items placed on the reflection region 1116. - In an implementation, the side surfaces 1106, 1108, 1110 and 1112 are colored to emphasize the
back surface 1102 and the reflection region 1116. For example, the side surfaces 1106, 1108, 1110 and 1112 can be black in color, or respective patterns, colors, or images can be rendered on each side surface. Other differentiation schemes including color schemes and image schemes can also be applied. - The
visualization object receptacle 1114 can include a plurality of visualization objects, e.g., icons. The icons can, for example, correspond to system objects, such as applications and system functions; for example, the icon 1130 can correspond to a deletion function. Other system objects can also be represented, such as file items, peripheral items, etc. - In an implementation,
stack items, such as the stack item 1146, can be interposed between the visualization object receptacle 1114 and the back surface 1102. A selection indicator can, for example, be generated to indicate a selected stack item. For example, an enlargement effect can be used to indicate a selection of the stack item 1146. Other selection indicators can also be used. - In an implementation, the
reflection region 1116 can be defined by a grid aspect 1150, and the stack items can be aligned according to the grid aspect 1150. In one implementation, the grid aspect 1150 can be visible, e.g., a grid outline, or an association with a texture image. In another implementation, the grid aspect can be invisible. - In another implementation, stack items can be scrolled from side-to-side and/or from front-to-back (or back-to-front) on the
surface 1106. For example, upon a selection of the surface 1106, e.g., by clicking on the surface 1106, the surface 1106 can be scrolled in the directions indicated by the arrows. In one implementation, the intersections at the left edge 1157 and the right edge 1159 of the reflection region 1116 may define a scroll ingress and a scroll egress for a left-to-right scroll direction. In one implementation, stack items are emplaced on the floor surface 1106 at the scroll ingress 1156 (or 1157), and displaced from the floor surface 1106 at the scroll egress 1158 (or 1159). In one implementation, one or more existing stack items are displaced from the surface 1106 by fading from view, e.g., fading by an atomizing effect, by a falling effect, or by some other effect. - In another implementation, one or more existing stack items are displaced from the
surface 1106 by moving onto one of the side surfaces, e.g., the surface 1112. In another implementation, one or more existing stack items are removed from the surface 1106 by moving into a stack element that includes displaced stacks, e.g., "anchor" stacks near the intersections. - In one implementation, windows, such as
the windows 1160 and 1162, can be displayed on the back surface 1102. The windows can, for example, be selected and dragged onto one of the side surfaces; in one implementation, dragging and placing a window onto the reflection region 1116 of the surface 1106 generates a stack item having the selected window as a stack element. Selecting the stack item can, for example, cause the window to reappear in the original position on the back surface 1102. - In one implementation, placing a window on one of the surfaces, such as the
surface 1108, generates a representation of the window, e.g., a window thumbnail 1170 on the surface 1108. The corresponding window can, for example, be restored by dragging the window thumbnail onto the back surface 1102, or by selecting and double-clicking on the window thumbnail 1170, or by some other command invocation. - In one implementation, a lighting aspect can generate a shadow and/or reflection for representations of system objects placed on a side surface. For example, a lighting aspect can generate a reflection or
shadow 1172 of the window thumbnail 1170. In one implementation, a shadow and/or reflection cast on the reflection region 1116 from the back surface 1102 can be limited to a selected representation of a system object. For example, if the window 1160 is currently selected, the shadow or reflection on the reflection region 1116 can be limited to the window 1160, and the remaining windows do not cast shadows or reflections. - In another implementation, the lighting aspect can generate an illumination effect from the
window thumbnail 1170 onto one or more surfaces. For example, the illumination effect can comprise a simulated sunbeam emanating from the window thumbnail 1170. In one implementation, the illumination effect can change according to local environmental states, e.g., the sunbeam can track across the surfaces according to a local time; the intensity of the sunbeam can be modulated according to the local time and local weather conditions that are received over the network 118, e.g., high intensity for sunny days, low intensity for overcast days and during the early evening, and/or being eliminated after a local sunset time and generated after a local sunrise time. - In another implementation, the lighting aspect described above can be associated with a weather widget that can be displayed on one or more of the surfaces. Selection of the weather widget can, for example, provide a detailed weather summary of a selected region.
- In another implementation, a stack item, such as the
stack item 1128, can be operatively associated with window instances, such as the windows 1160 and 1162. In one implementation, the windows can be minimized into corresponding stack elements of the stack item 1128 in response to a first command, and the windows can be restored on the back surface 1102 from the minimized state in response to a second command. - In an implementation, the first and second commands are toggle commands. For example, selection of the
entire stack item 1128, e.g., by receiving a click command substantially concurrently with a mouse-over on the stack item 1128, can cause all windows associated with the stack item, e.g., the windows 1160 and 1162, to be restored on the back surface 1102. Upon cessation of the click command, the windows return to the minimized state. - In another example implementation, selection of a stack element, such as selection of the
stack element 1163 by receiving a click command after a cursor has hovered over the stack element 1163 in excess of a time period, can cause the stack element 1163 to be removed from the stack item 1128. In response, the window 1162 can reappear on the back surface 1102. - In an implementation, the lighting aspect can be configured to generate a shadow effect for each representation of a system object. For example, a selected window can cast shadows on subsequent windows to emphasize a depth aspect and an overall user interface relationship; a stack item can cast a shadow on adjacent representations of system objects; and selecting and dragging an icon can cause a shadow of the icon to be generated on the side and back surfaces as the icon is moved, etc.
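The minimize/restore toggling between windows and stack elements described above can be sketched as follows; the class and method names are assumptions for illustration.

```python
# Illustrative sketch: windows toggle between the back surface and
# corresponding stack elements of an associated stack item.
class WindowStack:
    def __init__(self):
        self.on_back_surface = []  # windows currently displayed
        self.stack_elements = []   # minimized windows held as stack elements

    def minimize_all(self):
        """First command: minimize all windows into stack elements."""
        self.stack_elements += self.on_back_surface
        self.on_back_surface = []

    def restore_all(self):
        """Second command: restore all windows to the back surface."""
        self.on_back_surface += self.stack_elements
        self.stack_elements = []

    def remove_element(self, window):
        """Selecting a single stack element restores just that window."""
        self.stack_elements.remove(window)
        self.on_back_surface.append(window)
```

Here `remove_element` mirrors the per-element selection behavior, e.g., removing the stack element 1163 to make the window 1162 reappear.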
-
FIG. 12 is a block diagram of another example multidimensional desktop environment. In the implementation of FIG. 12, the reflection region 1116 is defined by a surface aspect having an equable texture, on which existing stack items are displaced in response to a new stack item. For example, existing stack items, such as the stack item 1208, are displaced to provide space for a new stack item 1210. As the new stack item 1210 drops onto the surface 1106, the stack items move to provide space for the new stack item 1210. - In one implementation, a maximum number of stack items can be displayed on the
surface 1106. If the addition of a new stack item causes the number of displayed stack items to be exceeded, then a stack item nearest a surface intersection can be displaced from the surface. For example, if the maximum number of stack items to be displayed is four, then the stack item 1208 can continue to move to the edge of the surface 1106, where the stack item 1208 is displaced, e.g., fades from view, atomizes, etc. - In one implementation, the
surfaces 1108 and 1112 can display desktop items. For example, the surface 1108 can display a file desktop item 1220, e.g., a document icon, and the surface 1112 can display a program desktop item, e.g., an application icon 1222. In one implementation, the file desktop item 1220 corresponds to an open file in an application window 1224, and the application icon 1222 corresponds to the executing application. - In another implementation, a plurality of file desktop items and application desktop items can be displayed on the
respective surfaces. For example, the surface 1112 can display two icons corresponding to two executing applications. Selection of one of the application icons can, for example, cause corresponding application windows to be displayed on the back surface 1102 and corresponding document icons to be displayed on the surface 1108. -
FIG. 13 is a block diagram of another example multidimensional desktop environment. In this example implementation, the back surface 1302 does not include menu items; rather, a stack item 1304 having stack elements corresponding to menus is utilized to access the menus. For example, a selection of a stack element of the stack item 1304 and a positioning of the stack item onto the back surface 1302 can cause corresponding menu items to be displayed on the back surface 1302. - The multidimensional desktop environment of
FIG. 13 can, for example, also facilitate a multidimensional application environment. For example, an application content presentation surface 1310, e.g., an application instance displaying editable data, can be displayed on the back surface 1302, and one or more application control elements can be displayed on one or more side surfaces. For example, a tool bar 1312 can be displayed on the surface 1108 to provide access to toolbar function buttons. - Likewise,
menu items 1330 can be displayed on the surface 1112. In one implementation, selection of a menu item generates a textual menu that is axially disposed so that the textual menu appears to be suspended between the back surface 1302 and the viewing surface. For example, selecting the "File" menu from the menu items 1330 can generate the floating textual menu 1332, which can, for example, include a shadow effect 1334 on the back surface 1302. -
FIG. 14 is a block diagram of another example multidimensional desktop environment. The multidimensional desktop environment of FIG. 14 also facilitates a multidimensional application environment. For example, an application content frame 1410, e.g., a window displaying editable data, can be displayed on the back surface 1102, and one or more application control elements can be displayed on one or more side surfaces. For example, a three-dimensional function icon arrangement 1420 can be displayed on the surface 1108, and menu items 1430 can be displayed on the surface 1112. - The three-dimensional
function icon arrangement 1420 can, for example, include a plurality of three-dimensional function icons, such as the three-dimensional function icon 1428. Each three-dimensional function icon can, for example, be associated with a corresponding function, and a three-dimensional function icon can be manipulated, e.g., rotated, to access the associated function. - In an implementation, three-dimensional function icons can be added to the
surface 1108 by use of a menu, such as, for example, the "Customize" menu on the surface 1112. In an implementation, a physics model can be applied to model rotation, movement and displacement of the three-dimensional function icons. For example, removal of the three-dimensional function icon 1428 can cause the remaining three-dimensional function icons to move along the surface 1108. - In an implementation, a three-dimensional
login visualization object 1442 can be utilized to facilitate user logins and/or user environments. For example, three sides of the login visualization object 1442 may correspond to login/logout commands for users, and the remaining three sides of the cube can correspond to user environments and/or other user-definable functions for a current user session. - In an implementation, a portal 1440 can be included on a surface, such as the
back surface 1102. The portal 1440 can be selected to transition to another multi-dimensional environment. In one implementation, the portal 1440 can facilitate transitioning between different application environments, e.g., between two applications that are currently executing. In another implementation, the portal can facilitate transitioning between different multi-dimensional desktop environments, e.g., from a first environment configured for a work environment to a second environment configured for a leisure environment. In another implementation, the portal 1440 can facilitate transitioning between a two-dimensional desktop environment and a three-dimensional desktop environment. Other transitions can also be facilitated by the portal 1440. -
FIG. 15 is a block diagram of another example multidimensional desktop environment. In the implementation of FIG. 15, windows can be dragged or displaced across one or more surfaces. For example, the stack item 1128 can include stack elements 1503 and 1505 corresponding to windows 1502 and 1504, respectively. A selection of the stack element 1503 causes the corresponding window 1502 to transition into view from the surface 1108 and onto the back surface 1102. Likewise, the window 1504, corresponding to the unselected stack element 1505, transitions out of view by sliding across the back surface 1102 and the surface 1112. Other processes to displace, hide, or otherwise deemphasize system objects, such as windows, can also be used. - In an implementation, a
stack item 1510 can include stack elements 1512 and 1514. Selection of the stack element 1512 can transition the graphical user interface to a two-dimensional desktop, and selection of the stack element 1514 can transition to another application environment. - Additional features can also be realized by other implementations. For example, in one implementation, each surface in the multidimensional desktop environment can implement different behavior and/or functional characteristics. In one implementation, each surface can implement different presentation characteristics. For example, on the
bottom surface 1106, icons and other system object representations can be displayed according to a large scale; on the side surface 1108, icons and other system object representations can be displayed according to a small scale; on the back surface 1102, icons and other system object representations can be displayed in a list format; etc. Selecting and dragging an icon or other system object representation from one surface to another will likewise cause the icon or other system object representation to be displayed according to the presentation characteristic of the surface upon which the icon or other system object representation is finally disposed. - In another implementation, a surface can implement a deletion characteristic. For example, the last access time for icons and other system object representations can be monitored. If the last access time for an icon or other system object representation exceeds a first threshold, the icon or other system object representation can be automatically transitioned to the surface implementing the deletion characteristic, e.g.,
surface 1112. Additionally, if the last access time for the icon or other system object representation located on the surface 1112 exceeds a second threshold, the icon or other system object representation can be automatically deleted from view. - In one implementation, a configuration tool can be used to facilitate configuration of the surface characteristics of each surface by the user. For example, a configuration menu can present one or more presentation characteristics for association with one or more surfaces. The one or more presentation characteristics can, for example, be associated by check boxes associated with each surface. Other configuration tools can also be used.
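The two-threshold aging behavior of the deletion characteristic described above can be sketched as follows. The 30- and 60-day thresholds and the function name are assumed values for illustration.

```python
import time

# Illustrative sketch of the two-stage deletion characteristic.
STAGE_THRESHOLD = 30 * 24 * 3600   # assumed: move to the deletion surface after 30 days
DELETE_THRESHOLD = 60 * 24 * 3600  # assumed: remove from view after 60 days

def age_icons(icons, now=None):
    """icons: dict of name -> last access time (epoch seconds).
    Returns (kept, staged, deleted) name lists."""
    now = time.time() if now is None else now
    kept, staged, deleted = [], [], []
    for name, last_access in icons.items():
        idle = now - last_access
        if idle > DELETE_THRESHOLD:
            deleted.append(name)        # removed from view
        elif idle > STAGE_THRESHOLD:
            staged.append(name)         # transitioned to the deletion surface
        else:
            kept.append(name)
    return kept, staged, deleted
```

An icon untouched for 40 days would be staged on the deletion surface; one untouched for 100 days would be removed from view.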
-
FIG. 16A is a block diagram of another example multidimensional desktop environment. The multidimensional desktop environment of FIG. 16A can, for example, implement the features described with respect to FIGS. 2-5 and 7-15. In the example implementation, the multidimensional desktop environment 1600 includes an arcuate back surface 1602 that is axially disposed, e.g., along the z-axis, from a viewing surface 1604. In one implementation, a reflection region 1616 can be generated on the side surface 1606, e.g., the "floor." In an implementation, the side surfaces 1606, 1608, 1610 and 1612 can be defined by arcuate regions that meet at curvature intersections, e.g., the curvature intersections 1607 and 1609. - A curved
visualization object receptacle 1614 can include visualization objects and can, for example, be disposed on the reflection region 1616. Stack items can, for example, be disposed near the curvature intersections 1607 and 1609, respectively. Other arrangements can also be used. - Other multidimensional desktop environment geometries can also be used. For example, in one implementation, the multidimensional desktop environment can conform to a tetrahedron-shaped environment in which a front surface of the tetrahedron defines a viewing surface, and the remaining three surfaces define a left surface, a bottom surface, and a side surface. In another implementation, the multidimensional desktop environment can conform to a triangular environment, in which one side of the triangle defines the viewing surface and the remaining two sides of the triangle define a left surface and a right surface. Other geometries can also be used.
- In one implementation, a configuration tool can be used to facilitate configuration of the multidimensional desktop environment by the user. For example, a configuration menu can present one or more multidimensional desktop environment geometries for selection by the user, such as a rectangular geometry, an arcuate geometry, a triangular geometry, etc. Selection of a geometry can cause the multidimensional desktop environment to be rendered according to the selected geometry.
-
FIG. 16B is a block diagram of another example multidimensional desktop environment. The environment of FIG. 16B is similar to the environments of FIGS. 2-5 and 7-15 above, except that the back surface 1640 and the floor surface 706 define the desktop environment. The features described above with respect to the floor surface 706 in FIGS. 2-5 can be implemented in the desktop environment of FIG. 16B. -
FIG. 16C is a block diagram of another example multidimensional desktop environment. The environment of FIG. 16C is similar to the environment of FIG. 16B above, except that the back surface 1650 defines the desktop environment. A visualization object receptacle 1652 defining a depth aspect can also be displayed near the bottom of the back surface 1650. In some implementations, a depth aspect is further emphasized by generating reflections on the surface of the visualization object receptacle 1652. For example, the visualization objects on the back surface 1650, e.g., the folder icon 1656 and the application window 1658, can generate reflections on the surface of the visualization object receptacle 1652. - In some implementations, the
visualization object receptacle 1652 can have a flat height aspect, e.g., the surface of the visualization object receptacle 1652 can appear as a solid flat plane, or a translucent or transparent plane. In other implementations, a height aspect can be generated. - Visualization objects, such as
icons and stack items, can be disposed on the visualization object receptacle 1652. In some implementations, a status indicator 1669 can illuminate to indicate a status. For example, the stack item 1668 may correspond to recent downloads, e.g., system updates, documents, etc., and the status indicator 1669 may be illuminated to indicate that a download is currently in progress. The status indicator 1669 can, for example, illuminate according to a color code to indicate different status states. - In some implementations, selecting a stack item causes the stack item to expand to display stack elements according to a visualization model, e.g., stack
elements displayed in an expanded arrangement. A collapse widget 1670 can be generated when the contents of a stack item, e.g., the expanded stack elements, are shown according to the visualization model, and a corresponding visualization frame 1674 that surrounds the stack elements can also be generated. - In some implementations, selection of a "Show in Finder"
command object 1682 can display a Finder window for a folder containing the stack items, e.g., when the stack items are stored in a common folder. In other implementations, selection of the "Show in Finder" command object 1682 can display a Finder window containing the stack items themselves. - In some implementations, a stack item collection process can identify visualization objects on a desktop and collapse the objects into a stack item. For example, the
application windows displayed on the desktop can be identified and collapsed into a corresponding stack item. - In some implementations, textual strings associated with the visualization objects, e.g., filenames associated with icons, can be centrally truncated. A centrally truncated string displays the beginning of the textual string and the end of the textual string. In some implementations, a file extension can be shown by the central truncation. In other implementations, the file extension can be omitted. Positioning a cursor on the textual string, or on the visualization object associated with the textual string, can cause the entire textual string to be displayed. For example, as shown in
FIG. 16C, the textual string 1677, i.e., "Movie of Page's birthday.mpg," is truncated to "Mov . . . day.mpg." Conversely, the textual string 1679, i.e., "Movie of Julia.mpg," which is positioned beneath a cursor, is fully displayed. -
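The central truncation described above can be sketched as follows; the maximum display length and the ellipsis style are assumptions, not the exact rendering shown in the figure.

```python
# Illustrative sketch of central (middle) truncation of a filename, keeping
# the beginning of the string and the end, including the file extension.
def truncate_middle(name, max_len=16, ellipsis="..."):
    """Return name unchanged if it fits, else keep its head and tail."""
    if len(name) <= max_len:
        return name
    keep = max_len - len(ellipsis)
    head = keep // 2
    tail = keep - head
    return name[:head] + ellipsis + name[-tail:]
```

With these assumed widths, "Movie of Page's birthday.mpg" becomes "Movie ...day.mpg", preserving both the start of the name and the ".mpg" extension, while short names pass through untouched.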
FIG. 16D is a block diagram of another example multidimensional desktop environment. The environment of FIG. 16D is similar to the environment of FIG. 16C above, except that a fanning visualization model is displayed for the stack items. -
FIG. 17 is a block diagram of an example desktop transition. In one implementation, a computer system, such as the system 100 of FIG. 1, can be configured to transition between a two-dimensional desktop 1702 and a three-dimensional desktop 1730. For example, the two-dimensional desktop 1702 defines a viewing surface 1703 and includes folders, an icon 1712 corresponding to a hard drive, an icon 1714 corresponding to a network, and an icon display region 1720 that displays a plurality of icons 1722. - In response to a transition command, the system can, for example, depth transition the two-dimensional desktop 1702 from the viewing surface 1703 to define a back surface 1732, and generate one or more side surfaces extending from the back surface 1732 to the viewing surface 1703. A visualization object receptacle 1730 can be generated on the surface 1706, and one or more icons 1732 corresponding to desktop items can be disposed in the visualization object receptacle. In the example implementation of FIG. 17, the icons 1732 correspond to the icons 1722. - In one implementation, stack items, such as
stack items desktop folders back surface 1732. In one implementation, two-dimensional desktop items that are not represented by a corresponding icon after the transition to the three-dimensional desktop 1730 can, for example, remain on theback surface 1732. For example, theicons back surface 1732. In another implementation, the two-dimensional desktop items that are not represented by a corresponding icon after the transition to the three-dimensional desktop 1730 can, for example, be eliminated from theback surface 1732. In another implementation, the two-dimensional desktop items that are not represented by a corresponding icon after the transition to the three-dimensional desktop 1730 can, for example, be eliminated from theback surface 1732 and represented by corresponding stack elements in a “catch all” stack item, such asstack item 1750. - The transition from the two-
dimensional desktop 1702 to a three-dimensional desktop 1730 can be substantially reversed to transition from the three-dimensional desktop 1730 to the two-dimensional desktop 1702. -
FIG. 18A is a block diagram of an example visualization object receptacle indicator. An example visualization object receptacle 1802 includes visualization objects, e.g., icons. A selection indicator 1820 can be used to indicate a selected icon. In one implementation, the selection indicator 1820 is generated by an under-lighting effect that illuminates the surface of the visualization object receptacle 1802 below a selected icon, such as the icon 1806. Other selection indicators can also be used, such as a selection status indicator 1821, backlighting effects, outlining effects, or other indicators. -
FIG. 18B is a block diagram of another example visualization object receptacle indicator. In an implementation, a selection indicator 1822 can be used to indicate a selected icon. In one implementation, the selection indicator 1822 is generated by an enlargement of a selected icon, such as the icon 1806, relative to adjacent icons, and an under-lighting effect that illuminates the surface of the visualization object receptacle 1802 below the selected icon 1806 and adjacent icons. In implementations utilizing the selection status indicator 1821, the selection status indicator 1821 can expand into a large selection status indicator 1823. -
FIG. 18C is a block diagram of another example visualization object receptacle indicator. In an implementation, a selection indicator 1824 can be used to indicate a selected icon. In one implementation, the selection indicator 1824 is generated by an enlargement of a selected icon, such as the icon 1806, relative to adjacent icons, and a backlighting effect that illuminates the surface of the visualization object receptacle 1802 below the selected icon 1806 and illuminates adjacent icons. -
FIG. 18D is a block diagram of another example visualization object receptacle indicator. The visualization object receptacle 1802 can, for example, include one or more status indicators to indicate the status of a system object associated with one or more icons. For example, a status indicator 1830 indicating an unselected and executing application can be generated by an under-lighting effect of a first color; a status indicator 1832 indicating a selected and executing application can be generated by an under-lighting effect of a second color; and a status indicator 1834 indicating a launching application can be generated by an under-lighting effect of a third color.
- Other status indicator schemes can also be used. For example, in one implementation, a status indicator 1834 indicating a launching application can be generated by a pulsing under-lighting effect. In another implementation, status indicators can indicate a status by an intensity; for example, an icon corresponding to an open and selected document, e.g., a document icon, a stack item, or an application icon, can be backlit with a relatively high intensity, and an icon corresponding to an open and unselected document can be backlit with a relatively low intensity. -
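The status-indicator scheme can be sketched as a lookup table. The specific color names and intensity values below are illustrative assumptions; the specification requires only that the first, second, and third colors be distinct:

```python
# Hypothetical mapping from application state to an indicator style.
STATUS_STYLES = {
    "executing_unselected": {"color": "first",  "intensity": 0.4, "pulse": False},
    "executing_selected":   {"color": "second", "intensity": 1.0, "pulse": False},
    "launching":            {"color": "third",  "intensity": 0.8, "pulse": True},
}

def indicator_style(state: str) -> dict:
    # States with no entry receive no under-lighting at all.
    return STATUS_STYLES.get(state, {"color": None, "intensity": 0.0, "pulse": False})
```

The pulsing under-lighting effect for a launching application is represented here as a boolean flag that a renderer could animate.
-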
FIGS. 19A and 19B are block diagrams of an example contextual menu for a visualization object receptacle 1802. In some implementations, a selectable divet 1902 can be displayed proximate to an icon, e.g., the icon 1804, to indicate an actionable state associated with a system object represented by the icon 1804. For example, if the icon 1804 is representative of a system update process or program, the selectable divet 1902 can be displayed when a system update is available.
- The selectable divet 1902 can, for example, be a floating orb proximate to the icon 1804. Other shapes or visual representations can also be used. In some implementations, the selectable divet 1902 is color-coded to indicate corresponding actionable states.
- FIG. 19B illustrates an example contextual menu 1910 that can be displayed proximate to the icon 1804 in response to a selection of the selectable divet 1902. The contextual menu 1910 can include one or more menu options related to the actionable state associated with the icon 1804. In some implementations, the divet 1902 remains until a necessary action is taken. In other implementations, the divet 1902 can be removed by a corresponding selection of one of the menu options in the contextual menu 1910. In some implementations, the divet 1902 can fade from view if it is not selected after a period of time, e.g., 30 minutes. -
FIG. 20 is a block diagram of a visualization object receptacle including type-ahead indications. In some implementations, one or more highlight indicators can be generated proximate to icons having textual descriptions that begin with a type input. For example, in response to a first keyboard input "c," highlight indicators can be generated proximate to the icons whose textual descriptions begin with "c"; a second keyboard input "l" would cause the highlight indicators 2002 and/or 2003 to turn off; and a third keyboard input "o" would cause the highlight indicators 2004 and/or 2005 to turn off. Accordingly, the icon 1804, corresponding to the textual description "clock," would be selected by the type input data "c," "l" and "o."
- Other selection indications based on type input can be used. For example, stack elements from a stack item can disappear in response to type input. Thus, if a stack item includes stack elements entitled "Clock," "Calculator," "Classics," "Movies," and "Safari," the keyboard input "c" would cause the "Movies" and "Safari" visualization objects to disappear. A subsequent keyboard input "a" would cause the "Clock" and "Classics" visualization objects to disappear.
- In addition to selections based on a textual description beginning with the type input data, selections can also be based on whether the textual description of the visualization object contains the text input or ends with the text input. For example, all stack elements having .mac extensions can be visualized by selecting an "Ends with" type input option and entering the type input "m," "a" and "c."
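The three match modes described above ("begins with," "contains," and "ends with") can be sketched as a single filter. The function name and signature are illustrative assumptions:

```python
def filter_type_ahead(labels, typed, mode="begins"):
    # Case-insensitive matching of each label against the
    # accumulated type input under the chosen match mode.
    t = typed.lower()
    tests = {
        "begins":   lambda s: s.lower().startswith(t),
        "contains": lambda s: t in s.lower(),
        "ends":     lambda s: s.lower().endswith(t),
    }
    return [s for s in labels if tests[mode](s)]
```

With the stack elements "Clock," "Calculator," "Classics," "Movies," and "Safari," typing "c" leaves the first three visible, and typing "ca" leaves only "Calculator," matching the behavior described above.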
-
FIGS. 21A and 21B are block diagrams of example selection indicators for a visualization model. In FIG. 21A, stack elements can be highlighted by one or more highlight indicators, such as focus rings, in a manner similar to that described with respect to FIG. 20 above. For example, the focus rings 2109 and 2111 can be generated in response to the keyboard input "c." A subsequent keyboard input "l" would cause the focus ring 2111 to fade from view.
- In FIG. 21B, stack elements can be highlighted in response to a cursor position. For example, if the mouse cursor 2120 is first positioned over the visualization object 2110, a first focus ring 2111 can be generated completely or partially around the visualization object 2110. However, if the mouse cursor 2120 is moved to a position over the visualization object 2108, the first focus ring 2111 will fade from view and a second focus ring 2109 will be generated around the visualization object 2108.
- In some implementations, the focus ring persists around a visualization object until the mouse cursor 2120 is positioned over another visualization object. In some implementations, the focus ring persists around a visualization object only when the mouse cursor 2120 is positioned over the visualization object. Other processes for generating and removing selection indicators can also be used. -
FIG. 22 is a block diagram of another example multidimensional desktop environment. In an implementation, an indicator can, for example, be used to indicate representations of system objects having an association. For example, the icon 2206, the stack item 2208, the folder 2210 and the window 2212 can be related by having corresponding system objects related to, for example, an application: the icon 2206 can be the application icon; the stack item 2208 can provide access to particular documents related to the application; the folder 2210 can define a data store storing all application documents; and the window 2212 can be an instance of the executing application. In one implementation, selection of any one of the icon 2206, stack item 2208, folder 2210 or window 2212 can generate a common selection indicator for all items. The common selection indicator can, for example, be realized by a lighting effect, such as a backlighting effect, by a temporary pulsing effect, or by some other permanent or transient effect. -
FIG. 23 is a block diagram of another example visualization object receptacle 2302. The example visualization object receptacle 2302 includes a plurality of visualization object rows and visualization object columns, and a plurality of visualization objects 2304 disposed within the visualization object receptacle 2302 according to the visualization object rows and visualization object columns.
- Although two visualization object rows and six visualization object columns are shown, the visualization object receptacle can include additional or fewer visualization object rows and visualization object columns. In an implementation, a subset of the visualization object rows and visualization object columns can, for example, be visible at any one time.
- The visualization object rows and visualization object columns can, for example, be traversed by shifting the rows and/or columns in unison, as indicated by the solid arrows. For example, when a cursor is positioned on the visualization object receptacle, such as the cursor in the position defined by the intersection of the visualization object row 2312 and the visualization object column 2332, a command (e.g., a control-click command) can cause the visualization object rows and/or columns to shift in unison in response to movement of the cursor. In another implementation, each visualization object row and visualization object column can, for example, be traversed individually by shifting a particular row or column, as indicated by the dashed arrows. For example, when the cursor is positioned on the visualization object receptacle 2302, an option-click command can cause the corresponding visualization object row 2312 and/or the corresponding column 2332 to shift individually in response to movement of the cursor. Other visualization object receptacle navigation schemes can also be used. -
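The two traversal modes (shifting all rows in unison with one command, or shifting a single row individually with another) can be sketched as wrapping rotations of a grid. The class and method names are illustrative assumptions:

```python
class ReceptacleGrid:
    def __init__(self, rows):
        self.rows = [list(r) for r in rows]

    def _rotate(self, row, offset):
        # Rotate right by `offset`, wrapping elements around the edge.
        offset %= len(row)
        return row[-offset:] + row[:-offset] if offset else row

    def shift_all(self, offset):
        # Traverse all rows in unison (the solid-arrow behavior).
        self.rows = [self._rotate(r, offset) for r in self.rows]

    def shift_row(self, index, offset):
        # Traverse one row individually (the dashed-arrow behavior).
        self.rows[index] = self._rotate(self.rows[index], offset)
```

A cursor-movement handler would translate drag distance into the `offset` argument; column shifting would be the symmetric operation on the transposed grid.
-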
FIG. 24 is a block diagram of an example stack item 2400. The stack item 2400 includes a plurality of stack elements, and a boundary 2420 defined by the stack elements defines an inclusion region for the stack item 2400. In one implementation, placement of an icon within the inclusion region generates a stack element associated with the icon. Likewise, placement of a stack element outside the inclusion region disassociates the stack element from the stack item 2400. In another implementation, the inclusion region can be separate from the stack item.
- In some implementations, the display size of the stack item 2400 can change according to a state. For example, if a system object corresponding to a stack element in the stack item 2400 requires attention, the stack item 2400 is rendered at a larger display size. Likewise, positioning a mouse cursor over the stack item 2400 can cause the stack item 2400 to be rendered at a larger display size.
- In some implementations, the stack item 2400 can change orientation and/or appearance according to a state. For example, positioning a mouse cursor over the stack item 2400 can cause the stack item 2400 to rotate, or can cause stack elements in the stack item 2400 to randomly shuffle. -
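The inclusion-region test can be sketched as a simple hit test. The circular-neighborhood approximation and the radius parameter are illustrative assumptions; the specification describes only a boundary defined by the stack elements:

```python
def in_inclusion_region(point, element_positions, radius=48.0):
    # An icon dropped at `point` joins the stack if it lands within
    # `radius` units of any existing stack element; dragging an
    # element beyond every such neighborhood disassociates it.
    px, py = point
    return any((px - x) ** 2 + (py - y) ** 2 <= radius ** 2
               for x, y in element_positions)
```

A drop handler would call this test on mouse release to decide whether to generate or disassociate a stack element.
-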
FIG. 25 is a block diagram of another example stack item 2500. The stack item 2500 includes a plurality of stack elements. -
FIG. 26 is a block diagram of another example stack item 2600. The stack item 2600 includes a plurality of stack elements. The stack element 2602 corresponds to an application icon of an application system object, and the remaining stack elements correspond to other system objects. The stack element 2602 can, for example, be preeminently disposed with respect to the other stack elements; for example, the stack element 2602 can be permanently placed on the top of the aggregation of stack elements in the stack item 2600. Thus, a reordering of the stack elements, such as by selecting the stack element 2612 and placing the stack element 2612 on top of the stack element 2602, or an addition of a new stack element, will not displace the stack element 2602 from the preeminent position.
- Other methods of preeminently disposing a stack element related to an application icon can also be used.
FIG. 27, for example, is a block diagram of another example stack item 2700 in which the stack element 2602 is preeminently disposed by enlarging the application element 2602 relative to the other stack elements. In another implementation, the stack element 2602 can be rendered with an opaque effect so that the entirety of the stack element 2602 is discernable no matter the position of the stack element 2602 in the stack item. -
FIG. 28A is a block diagram of example stack items. In the implementation of FIG. 28A, each stack item is associated with a date range, and instantiation of each stack item is based on its associated date range.
- In one implementation, a stack element associated with a system object is further associated with a stack item if a relevant date associated with the system object is within the date range associated with the stack item. For example, if the stack items 2802, 2804 and 2806 are associated with date ranges of today, the last week, and the last month, respectively, the stack elements in the stack item 2802 correspond to word processing documents modified today; the stack elements in the stack item 2804 correspond to word processing documents modified within the last week; and the stack elements in the stack item 2806 correspond to word processing documents modified within the last month. -
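Assignment of a document to the narrowest matching date-range stack item can be sketched as follows. The function name and the exact range boundaries (7 and 30 days) are assumptions; the specification names only "today," "last week," and "last month" ranges:

```python
from datetime import date, timedelta

def assign_stack(modified: date, today: date) -> str:
    # Pick the narrowest date range whose window contains the
    # document's relevant (here, modification) date.
    age = (today - modified).days
    if age == 0:
        return "today"
    if age <= 7:
        return "last week"
    if age <= 30:
        return "last month"
    return "older"
```

The hypothetical "older" bucket stands in for documents falling outside every instantiated stack item's range.
-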
FIG. 28B is a block diagram of an example stack item 2810 that is color-coded. In the implementation of FIG. 28B, stack elements are color-coded according to a date characteristic. For example, the stack elements 2830 and 2832 are color-coded to identify system objects added during the last week, and other stack elements are color-coded to identify system objects added during other date ranges. -
FIG. 29 is a block diagram illustrating an example contextual control scheme applied to an example stack item 2900. For example, the contextual control can be an application context 2910 that defines an executing and selected state 2912, an executing and non-selected state 2914, and a not executing state 2916. An executing and selected state 2912 can occur, for example, when an application window of an executing or launching application is selected. An executing and non-selected state 2914 can occur, for example, when a process other than the application is selected. A not executing state 2916 can occur, for example, when execution of an application is terminated. In one implementation, the stack item is displayed during the executing and selected state 2912; is minimized, e.g., deemphasized, during the executing and non-selected state 2914; and is suppressed, e.g., deallocated or hidden from view, during the not executing state 2916.
- Other types of contextual control can also be used. For example, contextual control based on a user level associated with system objects, such as a root user level or supervisor level, can control instantiation of a stack item and/or instantiation of stack elements within the stack item, and can, for example, further control commands available to the user.
-
FIG. 30 is a block diagram illustrating the application of an example visualization model to an example stack item 3000. The visualization model can, for example, be implemented according to first and second modal states. In the first modal state, the stack item 3000 is displayed with the stack elements aggregated; in the second modal state, the stack elements are displayed according to the visualization model.
- The example visualization model illustrated in FIG. 30 can, for example, define a multidimensional path defined by a first terminus 3020 and a second terminus 3022, and generate a disposition of the stack elements along the multidimensional path. The stack elements can traverse the path between the first terminus 3020 and the second terminus 3022 in response to a user input.
- In one implementation, an indicator can indicate a preeminent disposition of a stack element. For example, the stack element 3002 can be highlighted by a focus ring when in the preeminent position defining the first terminus 3020. -
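Disposing stack elements along a path between the two termini can be sketched as parametric interpolation. A straight path is used here for simplicity; the specification contemplates a general multidimensional path, which would substitute any parametric function of t:

```python
def dispose_along_path(n, first_terminus, second_terminus):
    # Evenly space n stack elements from the first terminus (the
    # preeminent position) toward the second terminus.
    (x0, y0), (x1, y1) = first_terminus, second_terminus
    positions = []
    for i in range(n):
        t = i / max(n - 1, 1)
        positions.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return positions
```

A user input (e.g., a scroll gesture) would advance each element's t value, causing the elements to traverse the path between the termini.
-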
FIG. 31A is a block diagram illustrating another example visualization model for an example stack item 3100. The visualization model can, for example, be implemented according to first and second modal states as described with respect to FIG. 30. In the second modal state, the stack elements are disposed in an arrangement, e.g., a quadrant arrangement, in which each stack element is separately discernable.
- In one implementation, an indicator can indicate a preeminent disposition of a stack element. For example, a stack element can be highlighted by a focus ring when in the preeminent position defined by the upper left quadrant position. -
FIG. 31B is a block diagram illustrating the application of another example visualization model to an example stack item 3120. The visualization model is similar to the visualization model of FIG. 31A, except that the stack elements traverse paths to their displayed positions. Although the paths of FIG. 31B are curved, other paths can also be used, e.g., straight paths, corkscrew paths, sinusoidal paths, or combinations of such paths. -
FIG. 32 is a block diagram illustrating the application of another example visualization model to an example stack item 3200. The stack item 3200 can, for example, include dozens, hundreds or even thousands of stack elements. For example, the stack element 3212 can be displayed as a translucent stack element, or can be a final stack element near a vanishing point.
- The visualization model can, for example, be implemented according to first and second modal states as described with respect to FIG. 30. In the second modal state, a subset of all the stack elements can be displayed in a list view format. A navigation control 3220 can, for example, be displayed proximate to the arrangement of stack elements, and a selection of either an "up" directional portion 3222 or a "down" directional portion 3224 can cause the stack elements to traverse through the list view format in an up or down direction, respectively. For example, selecting the "down" directional portion 3224 will cause the stack element 3202 to be removed from the list view display, cause the remaining displayed stack elements to shift positions, and cause the stack element 3212 to appear at the top of the list view display. - Selection of a
navigation divet 3226 can generate a contextual menu that includes one or more sort commands. Example sort commands include sorting by date added, sorting by file size, sorting by file type, etc.
- In the implementation of FIG. 32, the list view traverses an arcuate path as indicated by the curved arrow 3230, e.g., a model of a curved surface that is normal to the viewing plane at the central stack element, e.g., the stack element 3206. Accordingly, stack elements that are not normal to the viewing surface, e.g., the stack elements above and below the central stack element, can be rendered with a corresponding perspective distortion. - In some implementations, a user interface engine, e.g., the
UI engine 202 of FIG. 2, can pre-cache display data for a subset of the stack elements displayed in the list view format. The pre-caching can be limited to stack elements that are within a certain number of positions of the stack elements currently displayed in the list view. For example, the stack item 3200 may include thousands of photograph image files; the UI engine 202, however, may only pre-cache thumbnail images of the next five stack elements to be displayed by selection of the "up" directional portion 3222 or the "down" directional portion 3224, respectively.
- In another implementation, a stack item, upon selection, may rotate to a side and present the stack elements as a series of graphical representations of book spines, e.g., as in a book shelf. Depending on the number of stack elements, the book shelf may be one level or multiple levels, or may extend into a vanishing point and be traversed in response to a user input. Visualization objects can be "pulled" from the bookshelf in response to a user input, e.g., a mouse command or a mouse hover, and a subsequent command, e.g., a mouse click, can open a file associated with the visualization object. Other visualization models can also be used.
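The windowed list view and its bounded pre-cache can be sketched together. The class name and the visible-window size are assumptions; the five-element pre-cache depth follows the example above:

```python
class StackListView:
    def __init__(self, elements, visible=6, precache=5):
        self.elements = elements
        self.visible = visible
        self.precache = precache
        self.top = 0  # index of the first displayed element

    def shown(self):
        return self.elements[self.top:self.top + self.visible]

    def scroll(self, delta):
        # "down" is +1, "up" is -1; clamp to the ends of the stack.
        limit = max(len(self.elements) - self.visible, 0)
        self.top = min(max(self.top + delta, 0), limit)

    def to_precache(self):
        # Only elements within `precache` positions of the visible
        # window have display data (e.g., thumbnails) prepared.
        lo = max(self.top - self.precache, 0)
        hi = min(self.top + self.visible + self.precache, len(self.elements))
        return self.elements[lo:self.top] + self.elements[self.top + self.visible:hi]
```

With thousands of photograph files, only the visible slice plus the bounded pre-cache set ever needs thumbnail data resident, which is the point of limiting the pre-cache.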
-
FIG. 33A is a block diagram of an example group association 3300 of an example stack item 3310. The group association 3300 can, for example, be based on one or more identified association characteristics of the stack elements. For example, the group association 3300 can comprise a project association, e.g., files associated with a presentation developed with a first project application 3302 and which utilizes data from files associated with a second project application 3304.
- In one implementation, an interaction model can be selected based on the project association. In an implementation, a multiple launch interaction model can be selected when any one of the system objects related to the
stack elements is opened; for example, opening a file associated with either of the project applications 3302 and 3304 can launch both applications.
- In another implementation, a synchronization interaction model can be selected when one of the system objects related to the
stack elements is modified.
- In another implementation, a reconciliation interaction model can be selected when one of the system objects related to the
stack elements, e.g., a file associated with the stack element 3312, is replaced by a new file. The reconciliation interaction model can, for example, provide one or more contextual menus or other interaction aspects to prompt a user to reconcile all stack elements when any one of the stack elements is replaced. Other reconciliation interaction models can also be used.
- Interaction and/or visualization models can also be applied to other representations of system objects. For example, in one implementation, the system objects can include window instances in the multidimensional desktop environment, and the association characteristics can include a quantity of non-minimized window instances. Accordingly, an interaction model can be automatically selected for facilitating operations on the open windows, depending on the number of open windows. For example, if the number of open windows is greater than five, selection of a browse command can cause the open windows to be automatically displayed in an overlapping arrangement for browsing; and if the number of open windows is less than five, selection of the browse command can cause the open windows to be automatically displayed in a matrix arrangement for browsing.
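The window-count heuristic can be sketched directly. The behavior at exactly five windows is unspecified above, so the choice below (five or fewer maps to the matrix arrangement) is an assumption:

```python
def browse_arrangement(open_window_count: int, threshold: int = 5) -> str:
    # More than `threshold` open windows browse as an overlapping
    # arrangement; otherwise a matrix arrangement is used.
    return "overlapping" if open_window_count > threshold else "matrix"
```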
-
FIG. 33B is a block diagram of an example group association of system objects. The group association 3350 can, for example, be based on one or more identified association characteristics of the system objects, such as documents. The group association 3350 can, for example, be utilized to select one or more visualization and/or interaction models as described above. However, the documents need not be aggregated as stack elements of a stack item. -
FIG. 34 is a flow diagram of an example process 3400 for transitioning a desktop. The process 3400 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6. -
Stage 3402 depth transitions a two-dimensional desktop from a viewing surface to a back surface. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can depth transition a two-dimensional desktop, such as the desktop 1702 of FIG. 17, from a viewing surface to a back surface, such as from the viewing surface 1703 to the back surface 1732 as shown in FIG. 17. -
Stage 3404 generates one or more side surfaces extending from the back surface to the viewing surface. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate one or more side surfaces extending from the back surface to the viewing surface, such as the side surfaces 1706, 1708 and 1710 of FIG. 17. -
Stage 3406 generates a visualization object receptacle, e.g., an icon receptacle, on the one or more side surfaces. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate an icon receptacle on the one or more side surfaces, such as the visualization object receptacle 1730 on the surface 1706 of FIG. 17. -
Stage 3408 disposes one or more visualization objects, e.g., icons, corresponding to desktop items within the visualization object receptacle. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can dispose one or more icons corresponding to desktop items within the visualization object receptacle, such as the icons 1732 in the visualization object receptacle 1730, which correspond to the icons 1722 of FIG. 17. -
FIG. 35 is a flow diagram of another example process 3500 for transitioning between desktop types. The process 3500 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6. -
Stage 3502 identifies two-dimensional desktop items in a two-dimensional desktop environment. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can identify two-dimensional desktop items in a two-dimensional desktop environment, such as the folders of FIG. 17. -
Stage 3504 generates three-dimensional desktop items based on the identified two-dimensional desktop items. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate three-dimensional desktop items based on the identified two-dimensional desktop items, such as the stack items of FIG. 17, which correspond to the folders. -
Stage 3506 eliminates the two-dimensional desktop items from view. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can eliminate two-dimensional desktop items from view, such as the elimination of the folders from the back surface 1732 of FIG. 17. -
Stage 3508 generates the three-dimensional desktop items on at least one surface (e.g., a side surface). For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate the three-dimensional desktop items on at least one side surface, such as the stack items on the bottom side surface 1706 of FIG. 17. -
FIG. 36 is a flow diagram of an example process 3600 for generating a multidimensional desktop environment. The process 3600 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6. -
Stage 3602 axially disposes a back surface from a viewing surface. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can axially dispose a back surface from a viewing surface, such as the back surface 1102 being axially disposed from the viewing surface 1104, as shown in FIG. 11. -
Stage 3604 extends one or more side surfaces from the back surface to the viewing surface. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can extend one or more side surfaces from the back surface to the viewing surface, such as the side surfaces 1106, 1108, 1110 and 1112, as shown in FIG. 11. -
Stage 3606 generates a visualization object receptacle on one or more of the side surfaces. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate an icon receptacle on one or more of the side surfaces, such as the visualization object receptacle 1114 on the side surface 1106, as shown in FIG. 11. -
Stage 3608 generates within the visualization object receptacle one or more visualization objects, e.g., icons, corresponding to one or more system objects. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate within the visualization object receptacle one or more icons corresponding to one or more system objects, such as the icons of FIG. 11. -
FIG. 37 is a flow diagram of an example process 3700 for rendering a side surface in a multidimensional desktop environment. The process 3700 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6. -
Stage 3702 generates stack items on a surface (e.g., a side surface). For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate stack items on a side surface, such as the stack items on the side surface 1106, as shown in FIG. 11. -
Stage 3704 renders a surface texture on the surface. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can render a surface texture on the side surface, such as the grid texture 1150 on the side surface 1106, as shown in FIG. 11. -
FIG. 38 is a flow diagram of an example process 3800 for scrolling a side surface in a multidimensional desktop environment. The process 3800 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6. -
Stage 3802 scrolls the side surface in response to a scroll command. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can scroll the side surface in response to a scroll command, such as the side surface 1106 in the directions indicated by one or more of the arrows, as shown in FIG. 11. -
Stage 3804 scrolls the stack items in a scroll direction. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can scroll the stack items in a scroll direction, such as the stack items in the directions indicated by the arrows, as shown in FIG. 11. -
Stage 3806 displaces a stack item(s) from the side surface at a scroll egress. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can displace a stack item(s) from the side surface at a scroll egress, such as the scroll egress 1158 (or 1159), as shown in FIG. 11. -
Stage 3808 emplaces a stack item(s) on the side surface at a scroll ingress. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can emplace a stack item(s) on the side surface at a scroll ingress, such as the scroll ingress 1156 (or 1157), as shown in FIG. 11. -
FIG. 39 is a flow diagram of an example process 3900 for generating a selection indicator. The process 3900 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6. -
Stage 3902 generates an under-lighting effect as the selection indicator. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate an under-lighting effect as the selection indicator, such as the selection indicator 1822 of FIG. 18B. -
Stage 3904 generates an enlargement effect as the selection indicator. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate an enlargement effect as the selection indicator, such as the enlargement of the icon 1806 as shown in FIG. 18B. -
FIG. 40 is a flow diagram of an example process 4000 for rendering desktop items. The process 4000 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 4002 generates stack items on a first side surface corresponding to a plurality of desktop items. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate stack items on a first side surface corresponding to a plurality of desktop items, such as the stack items, the visualization object receptacle 1114 and the icons, as shown in FIG. 12.
Stage 4004 generates icons corresponding to program items on a second side surface. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate icons corresponding to program items on a second side surface, such as the application icon 1222 on the surface 1112, as shown in FIG. 12.
Stage 4006 generates icons corresponding to file items on a third side surface. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate icons corresponding to file items on a third side surface, such as the file desktop item 1220 on the surface 1108 of FIG. 12.
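A minimal sketch of stages 4002-4006 follows: routing each desktop item to the side surface that renders it. The routing table, function name, and item records are illustrative assumptions; only the surface reference numerals come from FIG. 12.

```python
# Hypothetical mapping of item kinds to side surfaces (cf. FIG. 12).
SURFACE_FOR_KIND = {
    "stack": "first_side_surface",     # e.g., surface 1106
    "program": "second_side_surface",  # e.g., surface 1112
    "file": "third_side_surface",      # e.g., surface 1108
}

def place_desktop_items(items):
    """Group (kind, name) desktop items by the side surface that renders them."""
    placed = {}
    for kind, name in items:
        placed.setdefault(SURFACE_FOR_KIND[kind], []).append(name)
    return placed

layout = place_desktop_items(
    [("program", "app 1222"), ("file", "item 1220"), ("stack", "downloads")]
)
```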
FIG. 41 is a flow diagram of an example process 4100 for generating an example application environment in a multidimensional desktop environment. The process 4100 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 4102 axially disposes a back surface from a viewing surface. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can axially dispose a back surface from a viewing surface, such as the back surface 1102 that is axially disposed from the viewing surface in FIG. 14.
Stage 4104 extends one or more side surfaces from the back surface to the viewing surface. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can extend one or more side surfaces from the back surface to the viewing surface, such as the side surfaces 1106, 1108, and 1112, as shown in FIG. 14.
Stage 4106 generates an application content frame for an application on the back surface. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate an application content frame for an application on the back surface, such as the application content frame 1410 on the back surface 1102, as shown in FIG. 14.
Stage 4108 generates one or more application control elements for the application on the one or more side surfaces. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate one or more application control elements for the application, e.g., the function icons, on the one or more side surfaces, as shown in FIG. 14.
FIG. 42 is a flow diagram of an example process 4200 for transitioning between application environments. The process 4200 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 4202 generates an application portal on one of the side surfaces. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate an application portal on one of the side surfaces, such as the stack item 1510 that includes stack elements, as shown in FIG. 15.
Stage 4204 transitions from a first application environment to a second application environment in response to a selection of the application portal. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can transition from a first application environment to a second application environment in response to a selection of the application portal. As described with respect to FIG. 15, selection of the stack element 1514 can transition to another application environment.
FIG. 43 is a flow diagram of an example process 4300 for generating a visualization object receptacle. The process 4300 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 4302 generates a visualization object receptacle disposed along a depth aspect. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate a visualization object receptacle disposed along a depth aspect, such as the visualization object receptacle 1114, as shown in FIG. 12.
Stage 4304 generates one or more visualization objects disposed within the visualization object receptacle. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate one or more visualization objects disposed within the visualization object receptacle, such as the visualization objects 1122, 1124, 1126, 1128, 1130 and 1132, as shown in FIG. 12.
Stage 4306 preeminently displays the visualization object receptacle. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can preeminently display the visualization object receptacle, such as by displaying the visualization object receptacle near the viewing surface of FIG. 12, or by displaying the visualization object receptacle as described with respect to the visualization object receptacle 714 of FIG. 8.
Stage 4308 generates at least one of the visualization objects as a stack item. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate at least one of the visualization objects as a stack item, such as the stack items, as shown in FIG. 12.
FIG. 44 is a flow diagram of an example process 4400 for color coding visualization objects. The process 4400 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 4402 associates a first color with an executing application. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can associate a first color with an executing application, such as the status indicator 1830, as shown in FIG. 18D.
Stage 4404 associates a second color with a selected and executing application. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can associate a second color with a selected and executing application, such as the status indicator 1832, as shown in FIG. 18D.
Stage 4406 associates a third color with a launching of an application. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can associate a third color with a launching of an application, such as the status indicator 1834, as shown in FIG. 18D.
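Stages 4402-4406 amount to a small state-to-color mapping. A sketch follows, assuming concrete colors; the patent requires only that the three states receive distinguishable colors, so the particular colors and names here are illustrative.

```python
# Hypothetical colors; the patent assigns a distinct color per application
# state but does not name the colors themselves.
STATUS_COLORS = {
    "executing": "blue",               # first color (cf. status indicator 1830)
    "selected_and_executing": "gold",  # second color (cf. status indicator 1832)
    "launching": "green",              # third color (cf. status indicator 1834)
}

def status_color(state):
    """Look up the status indicator color for an application state."""
    return STATUS_COLORS[state]
```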
FIG. 45 is a flow diagram of an example process 4500 for color coding visualization objects of related system objects. The process 4500 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 4502 color codes a selected visualization object disposed in the visualization object receptacle. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can color code a selected visualization object disposed in the visualization object receptacle, such as color coding the visualization object 2206, as shown in FIG. 22.
Stage 4504 applies a corresponding color code to the desktop items associated with the selected visualization object. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can apply a corresponding color code to the desktop items associated with the selected visualization object, such as color coding the stack item 2208, the folder 2210 and the window 2212, as shown in FIG. 22.
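Stages 4502-4504 can be sketched as propagating one color code from the selected visualization object to its associated desktop items. The association map, function name, and color are illustrative assumptions; the reference numerals in the comments come from FIG. 22.

```python
def apply_color_code(selected_object, associations, color):
    """Color the selected object and every desktop item associated with it."""
    coded = {selected_object: color}
    for item in associations.get(selected_object, []):
        coded[item] = color  # cf. stack item 2208, folder 2210, window 2212
    return coded

coded = apply_color_code(
    "object 2206",
    {"object 2206": ["stack 2208", "folder 2210", "window 2212"]},
    "teal",
)
```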
FIG. 46 is a flow diagram of another example process 4600 for generating a visualization object receptacle. The process 4600 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 4602 defines visualization object rows in the visualization object receptacle. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can define visualization object rows in the visualization object receptacle, such as the visualization object rows, as shown in FIG. 23.
Stage 4604 defines visualization object columns in the visualization object receptacle. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can define visualization object columns in the visualization object receptacle, such as the visualization object columns, as shown in FIG. 23.
Stage 4606 disposes the visualization objects within the visualization object receptacle according to the visualization object rows and visualization object columns. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can dispose the visualization objects within the visualization object receptacle according to the visualization object rows and visualization object columns, as indicated by the solid and dashed arrows shown in FIG. 23.
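A sketch of stages 4602-4606 follows, assuming a row-major fill order; FIG. 23 defines the rows and columns, but the fill order and function name here are assumptions.

```python
def dispose_in_receptacle(objects, rows, cols):
    """Assign each visualization object a (row, col) cell, filling row by row."""
    if len(objects) > rows * cols:
        raise ValueError("more objects than receptacle cells")
    # divmod(i, cols) yields (row, col) for a row-major layout
    return {obj: divmod(i, cols) for i, obj in enumerate(objects)}

cells = dispose_in_receptacle(["a", "b", "c", "d", "e"], rows=2, cols=3)
```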
FIG. 47 is a flow diagram of an example process 4700 for generating a stack item. The process 4700 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 4702 generates or identifies a plurality of stack elements corresponding to computer system objects. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate a plurality of stack elements corresponding to computer system objects, such as the stack elements shown in FIG. 29.
Stage 4704 associates the plurality of stack elements with a stack item. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can associate the plurality of stack elements with a stack item, such as the stack item 2900, as shown in FIG. 29.
Stage 4706 aggregates the stack elements into the stack item. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can aggregate the stack elements into the stack item, such as by overlapping the stack elements to form the stack item in FIG. 29.
Stage 4708 provides context control of the stack item. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can provide context control of the stack item, such as the application context 2910, as shown in FIG. 29.
FIG. 48 is a flow diagram of an example process 4800 for displaying stack elements according to modal states. The process 4800 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 4802 displays the stack elements in a substantial overlapping arrangement in a first modal state. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can display the stack elements in a substantial overlapping arrangement in a first modal state, such as the overlapping arrangement of the stack elements in the stack item 3000 in the first modal state, as shown in FIG. 30.
Stage 4804 displays the stack elements in a browsing arrangement in the second modal state. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can display the stack elements in a browsing arrangement in the second modal state, such as the fanning arrangement defined by the first terminus 3020 and the second terminus 3022, as shown in FIG. 30.
Stage 4806 enables the selection of a stack element in the second modal state. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can enable the selection of a stack element in the second modal state, such as a selection of the preeminently disposed stack element 3002, as shown in FIG. 30.
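The two modal states of stages 4802-4806 can be sketched as alternative layout functions. The pixel offsets and 90-degree fan span are assumptions for illustration; FIG. 30 fixes only the overlapping and fanning arrangements themselves.

```python
def arrange_stack(elements, modal_state, fan_span=90.0):
    """Return (offset_px, angle_deg) per element for the given modal state."""
    n = len(elements)
    if modal_state == "overlapped":
        # first modal state: small offsets so the elements substantially overlap
        return [(i * 2, 0.0) for i in range(n)]
    # second modal state: fan the elements evenly between the first and
    # second terminus (cf. termini 3020 and 3022 of FIG. 30)
    step = fan_span / max(n - 1, 1)
    return [(i * 20, i * step) for i in range(n)]

fanned = arrange_stack(["e1", "e2", "e3"], "browsing")
```

In the browsing arrangement each element is exposed, which is what makes the per-element selection of stage 4806 possible.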
FIG. 49 is a flow diagram of an example process 4900 for selecting interaction models and/or visualization models. The process 4900 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 4902 identifies a characteristic of stack elements associated with a stack item. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can identify a quantity of stack elements associated with the stack item, such as the quantity of stack elements shown in FIG. 30, or a type associated with the stack item.
Stage 4904 identifies interaction models and/or visualization models. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can identify a plurality of visualization models, e.g., browsing arrangements, such as the browsing arrangements described with respect to FIGS. 30 and 31, or interaction models, such as the interaction models described with respect to FIGS. 33A and 33B.
Stage 4906 selects an interaction model and/or visualization model based on the characteristic of the stack elements (e.g., the quantity of stack elements, or the type of the stack elements). For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can select one of a plurality of browsing arrangements, such as selecting the fanning arrangement, as shown in FIG. 30, or select one of a plurality of interaction models, as described with respect to FIGS. 33A and 33B.
FIG. 50 is a flow diagram of another example process 5000 for generating a stack item. The process 5000 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 5002 defines the date ranges for a temporal context. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can define the date ranges for a temporal context, such as the date ranges described with respect to FIGS. 28A and 28B.
Stage 5004 associates the corresponding stack items with each date range. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can associate the corresponding stack items with each date range, such as the stack items described with respect to FIGS. 28A and 28B.
Stage 5006 determines for each stack element a date associated with each associated system object. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can determine for each stack element a date associated with each associated system object, such as a file modification date, as described with respect to FIGS. 28A and 28B.
Stage 5008 associates the stack elements with the stack items based on the date ranges associated with the stack items and the dates associated with each system object. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can associate the stack elements with the stack items based on the date ranges associated with the stack items and the dates associated with each system object, such as the stack elements associated with the stack items shown in FIG. 28.
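Stages 5002-5008 describe a date-range bucketing pass. A minimal sketch follows, using a file modification date per system object; the range labels, dates, and function name are illustrative assumptions.

```python
from datetime import date

def bucket_by_date(elements, ranges):
    """elements: {name: date}; ranges: [(label, start, end)], ends inclusive."""
    items = {label: [] for label, _, _ in ranges}
    for name, d in elements.items():
        for label, start, end in ranges:
            if start <= d <= end:
                # the element joins the stack item for that date range
                items[label].append(name)
                break
    return items

items = bucket_by_date(
    {"a.txt": date(2007, 6, 1), "b.txt": date(2007, 6, 9)},
    [("today", date(2007, 6, 9), date(2007, 6, 9)),
     ("earlier", date(2007, 1, 1), date(2007, 6, 8))],
)
```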
FIG. 51 is a flow diagram of an example process 5100 for displaying a stack item according to an execution context. The process 5100 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 5102 associates a stack item with an application system object. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can associate a stack item with an application system object, such as the association of the stack item 2900 with an application, as shown in FIG. 29.
Stage 5104 associates stack elements associated with the application system object with the stack item associated with the application system object. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can associate stack elements associated with the application system object with the stack item associated with the application system object, such as the stack elements of the stack item 2900, as shown in FIG. 29.
Stage 5106 displays the stack item associated with the application system object during an executing context. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can display the stack item associated with the application system object during an executing context, such as the displaying of the stack item 2900 during an executing and selected state 2912, as shown in FIG. 29.
FIG. 52 is a flow diagram of an example process 5200 for generating and displaying a stack item. The process 5200 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 5202 associates a plurality of stack elements with an application. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can associate a plurality of stack elements with an application, such as the stack elements of the stack item 2600 with an application, as shown in FIGS. 26 and 27.
Stage 5204 identifies stack file elements and stack application elements. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can identify stack file elements and stack application elements, such as the file elements and the application element 2602, as shown in FIGS. 26 and 27.
Stage 5206 associates a stack item with the plurality of stack elements. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can associate a stack item with the plurality of stack elements, such as the stack item 2600 with the stack elements, as shown in FIGS. 26 and 27.
Stage 5208 aggregates stack elements to generate stack items. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can aggregate stack elements to generate stack items, such as the aggregation shown in FIG. 26 or 27.
Stage 5210 preeminently disposes the application element. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can preeminently dispose the application element, such as the preeminently disposed stack element 2602, as shown in FIG. 26 or 27.
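Stages 5202-5210 can be sketched as an aggregation that orders the application element first, i.e., preeminently disposed. The (kind, name) element records and function name are illustrative assumptions; the reference numeral in the comment comes from FIGS. 26 and 27.

```python
def aggregate_stack(elements):
    """Aggregate (kind, name) stack elements; application elements lead."""
    apps = [e for e in elements if e[0] == "application"]   # cf. element 2602
    files = [e for e in elements if e[0] == "file"]
    return apps + files  # application element is preeminently disposed

stack = aggregate_stack(
    [("file", "doc1"), ("application", "editor"), ("file", "doc2")]
)
```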
FIG. 53 is a flow diagram of an example process 5300 for automatically selecting and applying an interaction model to a stack item. The process 5300 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 5302 associates visualizations of system objects. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can associate the visualizations of system objects, such as the visualizations corresponding to the stack elements shown in FIG. 30.
Stage 5304 identifies one or more association characteristics of the associated visualizations. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can identify one or more association characteristics of the associated visualizations, such as the number of stack elements shown in FIG. 30.
Stage 5306 automatically selects an interaction model from a plurality of interaction models based on the identified one or more association characteristics. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can automatically select an interaction model from a plurality of interaction models based on the identified one or more association characteristics, such as selecting one of the interaction models shown in FIGS. 30 and 31.
Stage 5308 applies the selected interaction model to the associated visualizations. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can apply the selected interaction model to the associated visualizations, such as the fanning arrangement as shown in FIG. 30.
FIG. 54 is a flow diagram of another example process 5400 for automatically selecting and applying an interaction model to a stack item. The process 5400 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 5402 identifies a quantity of visualizations in the stack association. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can identify a quantity of visualizations in the stack association, such as the quantity of stack elements shown in FIG. 31A.
Stage 5404 selects the interaction model from the plurality of interaction models based on the quantity. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can select the interaction model from the plurality of interaction models based on the quantity, such as the interaction model shown in FIG. 31A.
FIG. 55 is a flow diagram of another example process 5500 for automatically selecting and applying an interaction model to a stack item. The process 5500 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 5502 identifies a type of stack element in the stack association. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can identify a type of stack element in the stack association, such as, for example, a document type.
Stage 5504 selects the interaction model from the plurality of interaction models based on the type. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can select the interaction model from the plurality of interaction models based on the type, such as, for example, an interaction model designed for the document type.
FIG. 56 is a flow diagram of another example process 5600 for automatically selecting and applying an interaction model to a stack item. The process 5600 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 5602 identifies a group association of stack elements in the stack association. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can identify a group association of stack elements in the stack association, such as the project association of FIG. 33A.
Stage 5604 selects the interaction model from the plurality of interaction models based on the group association. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can select the interaction model from the plurality of interaction models based on the group association, such as a multiple launch interaction model, a synchronization interaction model, or a reconciliation interaction model.
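The selection criteria of FIGS. 54-56 (quantity, type, and group association) can be combined into a single dispatch. The threshold, precedence order, and model names below are assumptions for illustration; the patent names the criteria but not their relative priority.

```python
def select_interaction_model(quantity, element_type=None, group=None):
    """Pick an interaction model from stack-association characteristics."""
    if group is not None:
        # group associations can get, e.g., a multiple launch or
        # synchronization interaction model (cf. FIG. 56)
        return "group_model"
    if element_type == "document":
        # an interaction model designed for the document type (cf. FIG. 55)
        return "document_model"
    # quantity-based choice (cf. FIG. 54): small stacks fan, large stacks page
    return "fanning" if quantity <= 6 else "paged"
```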
FIG. 57 is a flow diagram of an example process 5700 for generating a divet. The process 5700 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 5702 generates a visualization object receptacle disposed along a depth aspect. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate the visualization object receptacle 1802 of FIG. 19A.
Stage 5704 generates one or more visualization objects disposed within the visualization object receptacle. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate the one or more visualization objects of FIG. 19A.
Stage 5706 identifies an actionable state associated with one of the visualization objects. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can identify an actionable state, e.g., a system update availability, associated with the visualization object 1804 of FIG. 19A.
Stage 5708 generates a divet displayed proximate to the visualization object to indicate an actionable state associated with the visualization object. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate the divet 1902 of FIG. 19A.
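Stages 5702-5708 reduce to marking objects that have an actionable state with a divet. A sketch follows; the record shapes and function name are illustrative assumptions, with reference numerals drawn from FIG. 19A.

```python
def attach_divets(objects, actionable_states):
    """Map each visualization object to a divet if it has an actionable state."""
    return {
        obj: ("divet" if obj in actionable_states else None)  # cf. divet 1902
        for obj in objects
    }

divets = attach_divets(
    ["object 1804", "object 1806"],
    {"object 1804": "update_available"},  # e.g., a system update availability
)
```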
FIG. 58 is a flow diagram of an example process 5800 for generating a divet contextual menu. The process 5800 can, for example, be implemented in a processing device, such as the system 100 of FIG. 1, implementing user interface software and/or hardware, such as the example implementations described with respect to FIGS. 2, 5 and 6.
Stage 5802 receives a selection of the divet. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can receive a selection, e.g., a mouse click, of the divet 1902 of FIG. 19A.
Stage 5804 generates a contextual menu proximate to the visualization object in response to receiving the selection of the divet. For example, the system 100, implementing any one of the UI engines described in FIGS. 2, 5 and 6, can generate the contextual menu 1910 of FIG. 19B.
The apparatus, methods, flow diagrams, and structure block diagrams described in this patent document may be implemented in computer processing systems including program code comprising program instructions that are executable by the computer processing system. Other implementations may also be used. Additionally, the flow diagrams and structure block diagrams described in this patent document, which describe particular methods and/or corresponding acts in support of steps and corresponding functions in support of disclosed structural means, may also be utilized to implement corresponding software structures and algorithms, and equivalents thereof.
This written description sets forth the best mode of the invention and provides examples to describe the invention and to enable a person of ordinary skill in the art to make and use the invention. This written description does not limit the invention to the precise terms set forth. Thus, while the invention has been described in detail with reference to the examples set forth above, those of ordinary skill in the art may effect alterations, modifications and variations to the examples without departing from the scope of the invention.
Claims (25)
1-45. (canceled)
46. A graphical user interface, comprising:
an icon receptacle disposed along a depth aspect of a user interface;
a plurality of icons disposed within the icon receptacle; and
a selection indicator displayed on the icon receptacle adjacent an icon from among the plurality of icons to indicate that the icon has been selected.
47. The graphical user interface of claim 46, wherein
the icon receptacle comprises a front surface and a depth surface, the plurality of icons being disposed on the depth surface, and
the selection indicator comprises an illumination effect displayed on the front surface of the icon receptacle in front of the selected icon.
48. The graphical user interface of claim 46, wherein
the icon receptacle comprises a depth surface onto which the plurality of icons are disposed, and
the selection indicator comprises an under-lighting effect displayed on the depth surface underneath and around the selected icon.
49. The graphical user interface of claim 48, wherein
the selection indicator further comprises an enlargement of the selected icon relative to adjacent icons on the depth surface, and
the under-lighting effect extends over portions of the depth surface onto which adjacent icons are located.
50. The graphical user interface of claim 48, wherein the icon has been selected in response to detection of a user-operated cursor hovering over the icon.
51. The graphical user interface of claim 48, wherein the icon has been selected in response to one or more characteristics associated with the icon satisfying a user-input criterion.
52. A graphical user interface, comprising:
an icon receptacle disposed along a depth aspect of a user interface;
a plurality of icons disposed within the icon receptacle; and
a status indicator displayed on the icon receptacle adjacent an icon from among the plurality of icons to indicate a status of an application associated with the icon.
53. The graphical user interface of claim 52, wherein the status indicator is color coded in accordance with the application status.
54. The graphical user interface of claim 53, wherein the status indicator comprises
a first color when the application is being launched,
a second color when the application is being executed while being selected, and
a third color when the application is being executed without being selected.
55. The graphical user interface of claim 53, further comprising
one or more desktop items relating to the application, the one or more desktop items being disposed outside of the icon receptacle,
wherein the status indicator comprises a color that also is applied to the desktop items relating to the application.
56. The graphical user interface of claim 55, wherein the one or more desktop items comprise one or more of
an icon of a document associated with the application,
an icon of a file folder that stores one or more files associated with the application, or
a window in which the application is configured to run.
57. The graphical user interface of claim 52, wherein the status indicator comprises a transient effect during a launching of the application.
58. The graphical user interface of claim 52, wherein
the icon receptacle comprises a front surface and a depth surface, the plurality of icons being disposed on the depth surface, and
the status indicator comprises an illumination effect displayed, when the application is being executed, on the front surface of the icon receptacle in front of the selected icon.
59. The graphical user interface of claim 52, wherein
the icon receptacle comprises a depth surface onto which the plurality of icons are disposed, and
the status indicator comprises an under-lighting effect displayed, when the application is being executed, on the depth surface underneath and around the selected icon.
60. The graphical user interface of claim 59, wherein the under-lighting effect pulsates during a launching of the application.
61. The graphical user interface of claim 52, wherein
the status indicator comprises an illumination effect,
the illumination effect has
a first intensity when the application is being executed while being selected, and
a second intensity, lower than the first intensity, when the application is being executed without being selected.
62. A method executed by a hardware processor, the method comprising:
displaying an icon receptacle disposed along a depth aspect of a graphical user interface;
displaying a plurality of icons disposed on a depth surface of the icon receptacle; and
displaying an indicator on the icon receptacle adjacent an icon from among the plurality of icons to indicate either (i) that the icon has been selected or (ii) a status of an application associated with the icon.
63. The method of claim 62, wherein the displaying of the indicator to indicate that the icon has been selected comprises presenting an illumination effect on a front surface of the icon receptacle different from the depth surface.
64. The method of claim 62, wherein the displaying of the indicator to indicate that the icon has been selected comprises presenting an under-lighting effect on the depth surface underneath and around the selected icon.
65. The method of claim 64, wherein
the displaying of the indicator to indicate that the icon has been selected further comprises enlarging the selected icon relative to adjacent icons on the depth surface, and
the presenting of the under-lighting effect on the depth surface further comprises extending the under-lighting effect over portions of the depth surface onto which adjacent icons are located.
66. The method of claim 62, wherein the displaying of the indicator to indicate the status of the application associated with the icon is performed using
a first color when the application is being launched,
a second color when the application is being executed while being selected, and
a third color when the application is being executed without being selected.
67. The method of claim 66, comprising
displaying, outside of the icon receptacle, one or more desktop items relating to the application; and
applying the same color to the indicator and to the desktop items relating to the application.
68. The method of claim 62, wherein the displaying of the indicator to indicate the status of the application associated with the icon comprises presenting, when the application is being executed, an illumination effect on a front surface of the icon receptacle different from the depth surface.
69. The method of claim 62, wherein the displaying of the indicator to indicate the status of the application associated with the icon comprises presenting an illumination effect that has
a first intensity when the application is being executed while being selected, and
a second intensity, lower than the first intensity, when the application is being executed without being selected.
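Claims 60–61 and 66–69 together describe a small state machine for the indicator: a distinct color per application status (launching, executing while selected, executing while unselected), the same color mirrored on the application's desktop items, pulsation while launching, and a lower illumination intensity when the application runs unselected. The following Python sketch illustrates that behavior; the claims specify behavior rather than code, so every name, color value, and intensity here is a hypothetical choice, not part of the patent:

```python
from enum import Enum, auto

class AppStatus(Enum):
    LAUNCHING = auto()
    RUNNING_SELECTED = auto()
    RUNNING_UNSELECTED = auto()

# Hypothetical colors. Claims 66-67 require only that the three statuses
# map to distinct colors, and that the indicator's color is also applied
# to the application's desktop items (documents, folders, windows).
STATUS_COLORS = {
    AppStatus.LAUNCHING: "#f5a623",
    AppStatus.RUNNING_SELECTED: "#4a90d9",
    AppStatus.RUNNING_UNSELECTED: "#7f8c8d",
}

def indicator_style(status: AppStatus) -> dict:
    """Compute the receptacle indicator's rendering parameters."""
    intensity = {
        AppStatus.LAUNCHING: 1.0,           # transient effect while launching
        AppStatus.RUNNING_SELECTED: 1.0,    # first, higher intensity (claim 61)
        AppStatus.RUNNING_UNSELECTED: 0.4,  # second, lower intensity (claim 61)
    }[status]
    return {
        "color": STATUS_COLORS[status],
        "intensity": intensity,
        "pulsate": status is AppStatus.LAUNCHING,  # claim 60
    }

def desktop_item_color(status: AppStatus) -> str:
    """Claim 67: desktop items reuse the indicator's color."""
    return indicator_style(status)["color"]
```

A renderer would then draw the illumination or under-lighting effect on the receptacle's front or depth surface using these parameters; where the effect is drawn (claims 58–59, 63–64) is independent of how the color and intensity are chosen.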
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/803,898 US20160018970A1 (en) | 2007-06-08 | 2015-07-20 | Visualization Object Receptacle |
US16/550,220 US11086495B2 (en) | 2007-06-08 | 2019-08-24 | Visualization object receptacle |
US17/347,947 US20210357108A1 (en) | 2007-06-08 | 2021-06-15 | Visualization Object Receptacle |
US18/118,605 US20230205406A1 (en) | 2007-06-08 | 2023-03-07 | Visual Object Receptacle |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/760,692 US9086785B2 (en) | 2007-06-08 | 2007-06-08 | Visualization object receptacle |
US14/803,898 US20160018970A1 (en) | 2007-06-08 | 2015-07-20 | Visualization Object Receptacle |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/760,692 Continuation US9086785B2 (en) | 2007-06-08 | 2007-06-08 | Visualization object receptacle |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/550,220 Continuation US11086495B2 (en) | 2007-06-08 | 2019-08-24 | Visualization object receptacle |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160018970A1 true US20160018970A1 (en) | 2016-01-21 |
Family
ID=40097044
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/760,692 Active 2029-02-03 US9086785B2 (en) | 2007-06-08 | 2007-06-08 | Visualization object receptacle |
US14/803,898 Abandoned US20160018970A1 (en) | 2007-06-08 | 2015-07-20 | Visualization Object Receptacle |
US16/550,220 Active US11086495B2 (en) | 2007-06-08 | 2019-08-24 | Visualization object receptacle |
US17/347,947 Pending US20210357108A1 (en) | 2007-06-08 | 2021-06-15 | Visualization Object Receptacle |
US18/118,605 Pending US20230205406A1 (en) | 2007-06-08 | 2023-03-07 | Visual Object Receptacle |
Country Status (1)
Country | Link |
---|---|
US (5) | US9086785B2 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10019851B2 (en) * | 2016-10-25 | 2018-07-10 | Microsoft Technology Licensing, Llc | Positioning objects in three-dimensional graphical space |
US20200034023A1 (en) * | 2018-07-27 | 2020-01-30 | Nintendo Co., Ltd. | Non-transitory computer-readable storage medium with executable program stored thereon, information processing apparatus, information processing method, and information processing system |
US10599288B2 (en) | 2016-05-09 | 2020-03-24 | Beijing Xiaomi Mobile Software Co., Ltd. | Method and device for displaying an application interface |
US11503256B2 (en) | 2019-09-04 | 2022-11-15 | Material Technologies Corporation | Object feature visualization apparatus and methods |
US11622096B2 (en) | 2019-09-04 | 2023-04-04 | Material Technologies Corporation | Object feature visualization apparatus and methods |
Families Citing this family (108)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9213365B2 (en) | 2010-10-01 | 2015-12-15 | Z124 | Method and system for viewing stacked screen displays using gestures |
US9207717B2 (en) | 2010-10-01 | 2015-12-08 | Z124 | Dragging an application to a screen using the application manager |
US8166415B2 (en) * | 2006-08-04 | 2012-04-24 | Apple Inc. | User interface for backup management |
US8892997B2 (en) | 2007-06-08 | 2014-11-18 | Apple Inc. | Overflow stack user interface |
US8667418B2 (en) | 2007-06-08 | 2014-03-04 | Apple Inc. | Object stack |
US8745535B2 (en) * | 2007-06-08 | 2014-06-03 | Apple Inc. | Multi-dimensional desktop |
US9086785B2 (en) | 2007-06-08 | 2015-07-21 | Apple Inc. | Visualization object receptacle |
US8615720B2 (en) * | 2007-11-28 | 2013-12-24 | Blackberry Limited | Handheld electronic device and associated method employing a graphical user interface to output on a display virtually stacked groups of selectable objects |
US20090327939A1 (en) * | 2008-05-05 | 2009-12-31 | Verizon Data Services Llc | Systems and methods for facilitating access to content instances using graphical object representation |
TWI374382B (en) * | 2008-09-01 | 2012-10-11 | Htc Corp | Icon operation method and icon operation module |
KR20100041006A (en) * | 2008-10-13 | 2010-04-22 | 엘지전자 주식회사 | A user interface controlling method using three dimension multi-touch |
US8411046B2 (en) | 2008-10-23 | 2013-04-02 | Microsoft Corporation | Column organization of content |
US20100107100A1 (en) | 2008-10-23 | 2010-04-29 | Schneekloth Jason S | Mobile Device Style Abstraction |
US8385952B2 (en) | 2008-10-23 | 2013-02-26 | Microsoft Corporation | Mobile communications device user interface |
US8762885B2 (en) * | 2008-12-15 | 2014-06-24 | Verizon Patent And Licensing Inc. | Three dimensional icon stacks |
US20100169828A1 (en) * | 2008-12-29 | 2010-07-01 | International Business Machines Corporation | Computer desktop organization via magnet icons |
WO2010080934A1 (en) * | 2009-01-07 | 2010-07-15 | David Colter | Method and apparatus for user interface movement scheme |
US8238876B2 (en) | 2009-03-30 | 2012-08-07 | Microsoft Corporation | Notifications |
US8175653B2 (en) | 2009-03-30 | 2012-05-08 | Microsoft Corporation | Chromeless user interface |
US8355698B2 (en) | 2009-03-30 | 2013-01-15 | Microsoft Corporation | Unlock screen |
KR101588733B1 (en) * | 2009-07-21 | 2016-01-26 | 엘지전자 주식회사 | Mobile terminal |
RU2422877C1 (en) * | 2009-11-16 | 2011-06-27 | Виталий Евгеньевич Пилкин | Method of indicating infected electronic files |
US8782562B2 (en) * | 2009-12-02 | 2014-07-15 | Dell Products L.P. | Identifying content via items of a navigation system |
US20110131531A1 (en) * | 2009-12-02 | 2011-06-02 | Deborah Russell | Touch Friendly Applications in an Information Handling System Environment |
JP5361697B2 (en) * | 2009-12-21 | 2013-12-04 | キヤノン株式会社 | Display control apparatus and display control method |
KR20150070197A (en) * | 2010-01-11 | 2015-06-24 | 애플 인크. | Electronic text manipulation and display |
US10397639B1 (en) | 2010-01-29 | 2019-08-27 | Sitting Man, Llc | Hot key systems and methods |
EP2369820B1 (en) * | 2010-03-22 | 2016-04-06 | BlackBerry Limited | Management and display of grouped messages on a communication device |
US8539353B2 (en) * | 2010-03-30 | 2013-09-17 | Cisco Technology, Inc. | Tabs for managing content |
US8522165B2 (en) * | 2010-06-18 | 2013-08-27 | Adobe Systems Incorporated | User interface and method for object management |
US9207859B2 (en) * | 2010-09-14 | 2015-12-08 | Lg Electronics Inc. | Method and mobile terminal for displaying fixed objects independent of shifting background images on a touchscreen |
US8949736B2 (en) * | 2010-10-15 | 2015-02-03 | Sap Se | System and method for immersive process design collaboration on mobile devices |
US10740117B2 (en) | 2010-10-19 | 2020-08-11 | Apple Inc. | Grouping windows into clusters in one or more workspaces in a user interface |
US9292196B2 (en) | 2010-10-19 | 2016-03-22 | Apple Inc. | Modifying the presentation of clustered application windows in a user interface |
US9542202B2 (en) * | 2010-10-19 | 2017-01-10 | Apple Inc. | Displaying and updating workspaces in a user interface |
US9658732B2 (en) | 2010-10-19 | 2017-05-23 | Apple Inc. | Changing a virtual workspace based on user interaction with an application window in a user interface |
US20120159395A1 (en) | 2010-12-20 | 2012-06-21 | Microsoft Corporation | Application-launching interface for multiple modes |
US8689123B2 (en) | 2010-12-23 | 2014-04-01 | Microsoft Corporation | Application reporting in an application-selectable user interface |
US8612874B2 (en) | 2010-12-23 | 2013-12-17 | Microsoft Corporation | Presenting an application change through a tile |
KR101831641B1 (en) * | 2011-02-11 | 2018-04-05 | 삼성전자 주식회사 | Method and apparatus for providing graphic user interface in mobile terminal |
WO2012108714A2 (en) * | 2011-02-11 | 2012-08-16 | Samsung Electronics Co., Ltd. | Method and apparatus for providing graphic user interface in mobile terminal |
US10152192B2 (en) | 2011-02-21 | 2018-12-11 | Apple Inc. | Scaling application windows in one or more workspaces in a user interface |
JP5221694B2 (en) * | 2011-03-08 | 2013-06-26 | 株式会社東芝 | Electronic device, object display method, and object display program. |
TW201237801A (en) * | 2011-03-11 | 2012-09-16 | J Touch Corp | Method for processing three-dimensional image vision effects |
US9354899B2 (en) * | 2011-04-18 | 2016-05-31 | Google Inc. | Simultaneous display of multiple applications using panels |
US8893033B2 (en) | 2011-05-27 | 2014-11-18 | Microsoft Corporation | Application notifications |
US9104307B2 (en) | 2011-05-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Multi-application environment |
TWI480794B (en) * | 2011-07-10 | 2015-04-11 | Compal Electronics Inc | Information display method and electronic device |
US20130057587A1 (en) | 2011-09-01 | 2013-03-07 | Microsoft Corporation | Arranging tiles |
US9146670B2 (en) | 2011-09-10 | 2015-09-29 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
US20130076654A1 (en) * | 2011-09-27 | 2013-03-28 | Imerj LLC | Handset states and state diagrams: open, closed transitional and easel |
KR20130041484A (en) * | 2011-10-17 | 2013-04-25 | 삼성전자주식회사 | Method and apparatus for operating menu screen of user device |
JP5238872B2 (en) * | 2011-12-02 | 2013-07-17 | 株式会社東芝 | Information processing apparatus, display control method, and program |
US20130145266A1 (en) * | 2011-12-02 | 2013-06-06 | Louie D. Mantia | Graphical user interface having interactive stacks of images corresponding to digital assets |
US9223472B2 (en) | 2011-12-22 | 2015-12-29 | Microsoft Technology Licensing, Llc | Closing applications |
CN102566927A (en) * | 2011-12-28 | 2012-07-11 | 惠州Tcl移动通信有限公司 | Touch screen unlocking method and device |
US9928562B2 (en) | 2012-01-20 | 2018-03-27 | Microsoft Technology Licensing, Llc | Touch mode and input type recognition |
US20130191781A1 (en) * | 2012-01-20 | 2013-07-25 | Microsoft Corporation | Displaying and interacting with touch contextual user interface |
JP5861478B2 (en) * | 2012-01-31 | 2016-02-16 | 富士通株式会社 | Display control program, display control apparatus, and display control method |
JP5942456B2 (en) * | 2012-02-10 | 2016-06-29 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
US9378581B2 (en) * | 2012-03-13 | 2016-06-28 | Amazon Technologies, Inc. | Approaches for highlighting active interface elements |
JP5844707B2 (en) * | 2012-09-28 | 2016-01-20 | 富士フイルム株式会社 | Image display control device, image display device, program, and image display method |
KR20140042270A (en) * | 2012-09-28 | 2014-04-07 | 삼성전자주식회사 | Method for executing for application and an electronic device thereof |
KR102126292B1 (en) * | 2012-11-19 | 2020-06-24 | 삼성전자주식회사 | Method for displaying a screen in mobile terminal and the mobile terminal therefor |
KR20140068410A (en) * | 2012-11-28 | 2014-06-09 | 삼성전자주식회사 | Method for providing user interface based on physical engine and an electronic device thereof |
KR102028119B1 (en) * | 2012-11-28 | 2019-11-04 | 삼성전자주식회사 | Method for displaying for application an electronic device thereof |
USD744510S1 (en) * | 2013-01-04 | 2015-12-01 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with animated graphical user interface |
US10257301B1 (en) | 2013-03-15 | 2019-04-09 | MiMedia, Inc. | Systems and methods providing a drive interface for content delivery |
US9606719B2 (en) * | 2013-04-02 | 2017-03-28 | Facebook, Inc. | Interactive elements in a user interface |
EP2800020B1 (en) * | 2013-04-30 | 2020-11-04 | Dassault Systèmes | A computer-implemented method for manipulating three-dimensional modeled objects of an assembly in a three-dimensional scene. |
KR102148725B1 (en) | 2013-07-31 | 2020-08-28 | 삼성전자주식회사 | Method and Apparatus for displaying application |
US20150082224A1 (en) * | 2013-09-13 | 2015-03-19 | MoreStream Development LLC | Computer graphical user interface system, and method for project mapping |
US9557884B2 (en) | 2013-10-22 | 2017-01-31 | Linkedin Corporation | System for generating a user interface for a social network and method therefor |
USD765723S1 (en) * | 2013-12-30 | 2016-09-06 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with icon |
USD753147S1 (en) * | 2013-12-30 | 2016-04-05 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with icon |
USD754155S1 (en) * | 2014-01-07 | 2016-04-19 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD754683S1 (en) * | 2014-01-07 | 2016-04-26 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD754156S1 (en) * | 2014-01-07 | 2016-04-19 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD763867S1 (en) * | 2014-01-07 | 2016-08-16 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD754154S1 (en) * | 2014-01-07 | 2016-04-19 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD754153S1 (en) * | 2014-01-07 | 2016-04-19 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
USD745040S1 (en) * | 2014-01-29 | 2015-12-08 | 3M Innovative Properties Company | Display screen or portion thereof with animated graphical user interface |
US10303324B2 (en) * | 2014-02-10 | 2019-05-28 | Samsung Electronics Co., Ltd. | Electronic device configured to display three dimensional (3D) virtual space and method of controlling the electronic device |
TWD165580S (en) * | 2014-03-07 | 2015-01-21 | 三緯國際立體列印科技股份 | Changeable graphical user interface for display screen |
WO2015149347A1 (en) | 2014-04-04 | 2015-10-08 | Microsoft Technology Licensing, Llc | Expandable application representation |
WO2015154276A1 (en) | 2014-04-10 | 2015-10-15 | Microsoft Technology Licensing, Llc | Slider cover for computing device |
CN105378582B (en) | 2014-04-10 | 2019-07-23 | 微软技术许可有限责任公司 | Calculate the foldable cap of equipment |
USD763882S1 (en) * | 2014-04-25 | 2016-08-16 | Tencent Technology (Shenzhen) Company Limited | Portion of a display screen with animated graphical user interface |
USD765110S1 (en) * | 2014-04-25 | 2016-08-30 | Tencent Technology (Shenzhen) Company Limited | Portion of a display screen with animated graphical user interface |
USD770487S1 (en) * | 2014-04-30 | 2016-11-01 | Tencent Technology (Shenzhen) Company Limited | Display screen or portion thereof with graphical user interface |
USD770488S1 (en) * | 2014-04-30 | 2016-11-01 | Tencent Technology (Shenzhen) Company Limited | Portion of a display screen with graphical user interface |
CN104050004B (en) * | 2014-06-30 | 2018-01-09 | 宇龙计算机通信科技(深圳)有限公司 | interface icon color setting method, device and terminal |
WO2016039570A1 (en) * | 2014-09-12 | 2016-03-17 | Samsung Electronics Co., Ltd. | Method and device for executing applications through application selection screen |
CN106662891B (en) | 2014-10-30 | 2019-10-11 | 微软技术许可有限责任公司 | Multi-configuration input equipment |
KR102215997B1 (en) * | 2014-10-30 | 2021-02-16 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
USD761868S1 (en) * | 2014-12-30 | 2016-07-19 | Microsoft Corporation | Display screen with icon |
US10600012B2 (en) * | 2015-05-01 | 2020-03-24 | The United States Of America, As Represented By The Secretary Of The Navy | Human-machine visualization interfaces and processes for providing real time or near real time actionable information relative to one or more elements of one or more networks, networks, and systems of networks |
US11138306B2 (en) * | 2016-03-14 | 2021-10-05 | Amazon Technologies, Inc. | Physics-based CAPTCHA |
JP6689492B2 (en) * | 2016-03-14 | 2020-04-28 | 富士ゼロックス株式会社 | Terminal device, data processing system and program |
US10353540B2 (en) * | 2016-03-30 | 2019-07-16 | Kyocera Document Solutions Inc. | Display control device |
US10345997B2 (en) * | 2016-05-19 | 2019-07-09 | Microsoft Technology Licensing, Llc | Gesture-controlled piling of displayed data |
US10824294B2 (en) * | 2016-10-25 | 2020-11-03 | Microsoft Technology Licensing, Llc | Three-dimensional resource integration system |
JP7027800B2 (en) * | 2017-10-23 | 2022-03-02 | 富士フイルムビジネスイノベーション株式会社 | Information processing equipment and programs |
US11402988B2 (en) * | 2017-11-08 | 2022-08-02 | Viacom International Inc. | Tiling scroll display |
CN108089786B (en) * | 2017-12-14 | 2019-12-31 | Oppo广东移动通信有限公司 | User interface display method, device, equipment and storage medium |
US11093100B2 (en) * | 2018-03-08 | 2021-08-17 | Microsoft Technology Licensing, Llc | Virtual reality device with varying interactive modes for document viewing and editing |
CN108536352B (en) * | 2018-03-23 | 2021-08-06 | 努比亚技术有限公司 | Desktop icon stacking method, mobile terminal and computer-readable storage medium |
USD960927S1 (en) * | 2020-09-30 | 2022-08-16 | Snap Inc. | Display screen or portion thereof with a graphical user interface |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5757371A (en) * | 1994-12-13 | 1998-05-26 | Microsoft Corporation | Taskbar with start menu |
US20010028369A1 (en) * | 2000-03-17 | 2001-10-11 | Vizible.Com Inc. | Three dimensional spatial user interface |
US20030052927A1 (en) * | 2001-09-20 | 2003-03-20 | International Business Machines Corporation | System and method for indicating a status of multiple features of a data processing system |
US20030142143A1 (en) * | 2002-01-28 | 2003-07-31 | International Business Machines Corporation | Varying heights of application images to convey application status |
US20050021336A1 (en) * | 2003-02-10 | 2005-01-27 | Katsuranis Ronald Mark | Voice activated system and methods to enable a computer user working in a first graphical application window to display and control on-screen help, internet, and other information content in a second graphical application window |
US20060061597A1 (en) * | 2004-09-17 | 2006-03-23 | Microsoft Corporation | Method and system for presenting functionally-transparent, unobtrusive on-screen windows |
US20070016873A1 (en) * | 2005-07-15 | 2007-01-18 | Microsoft Corporation | Visual expression of a state of an application window |
US20070256032A1 (en) * | 2006-04-28 | 2007-11-01 | Petri John E | Presenting identifiers and states of processes in a stacked cursor |
US20080046832A1 (en) * | 2006-08-15 | 2008-02-21 | International Business Machines Corporation | Notification of state transition of an out-of-focus application |
US20110145753A1 (en) * | 2006-03-20 | 2011-06-16 | British Broadcasting Corporation | Content provision |
US20120242692A1 (en) * | 2011-03-17 | 2012-09-27 | Kevin Laubach | Linear Progression Based Window Management |
Family Cites Families (185)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5333256A (en) * | 1989-05-15 | 1994-07-26 | International Business Machines Corporation | Methods of monitoring the status of an application program |
JP3245655B2 (en) | 1990-03-05 | 2002-01-15 | インキサイト ソフトウェア インコーポレイテッド | Workspace display processing method |
FR2662009B1 (en) | 1990-05-09 | 1996-03-08 | Apple Computer | MULTIPLE FACES MANIPULABLE ICON FOR DISPLAY ON COMPUTER. |
FR2693810B1 (en) | 1991-06-03 | 1997-01-10 | Apple Computer | USER INTERFACE SYSTEMS WITH DIRECT ACCESS TO A SECONDARY DISPLAY AREA. |
JPH07104766B2 (en) * | 1991-10-28 | 1995-11-13 | インターナショナル・ビジネス・マシーンズ・コーポレイション | Method and apparatus for displaying multiple objects from menu of data processing system |
US5461710A (en) * | 1992-03-20 | 1995-10-24 | International Business Machines Corporation | Method for providing a readily distinguishable template and means of duplication thereof in a computer system graphical user interface |
EP0592638B1 (en) * | 1992-04-30 | 2001-02-07 | Apple Computer, Inc. | Method and apparatus for organizing information in a computer system |
AU5451794A (en) | 1992-10-28 | 1994-05-24 | Intellution, Inc. | A dynamic graphical system configuration utility |
US5555354A (en) * | 1993-03-23 | 1996-09-10 | Silicon Graphics Inc. | Method and apparatus for navigation within three-dimensional information landscape |
US6262732B1 (en) * | 1993-10-25 | 2001-07-17 | Scansoft, Inc. | Method and apparatus for managing and navigating within stacks of document pages |
US5565657A (en) * | 1993-11-01 | 1996-10-15 | Xerox Corporation | Multidimensional user interface input device |
US5943050A (en) | 1994-04-07 | 1999-08-24 | International Business Machines Corporation | Digital image capture control |
US5564004A (en) | 1994-04-13 | 1996-10-08 | International Business Machines Corporation | Method and system for facilitating the selection of icons |
JPH0869367A (en) | 1994-08-29 | 1996-03-12 | Casio Comput Co Ltd | Menu display device |
US5673377A (en) * | 1994-09-02 | 1997-09-30 | Ray Dream, Inc. | Method and system for displaying a representation of a three-dimensional object with surface features that conform to the surface of the three-dimensional object |
US5515486A (en) | 1994-12-16 | 1996-05-07 | International Business Machines Corporation | Method, apparatus and memory for directing a computer system to display a multi-axis rotatable, polyhedral-shape panel container having front panels for displaying objects |
US5838317A (en) * | 1995-06-30 | 1998-11-17 | Microsoft Corporation | Method and apparatus for arranging displayed graphical representations on a computer interface |
US5678015A (en) | 1995-09-01 | 1997-10-14 | Silicon Graphics, Inc. | Four-dimensional graphical user interface |
US5754809A (en) | 1995-12-12 | 1998-05-19 | Dell U.S.A., L.P. | Perspective windowing technique for computer graphical user interface |
US5801699A (en) | 1996-01-26 | 1998-09-01 | International Business Machines Corporation | Icon aggregation on a graphical user interface |
US6043818A (en) * | 1996-04-30 | 2000-03-28 | Sony Corporation | Background image with a continuously rotating and functional 3D icon |
US6002403A (en) | 1996-04-30 | 1999-12-14 | Sony Corporation | Graphical navigation control for selecting applications on visual walls |
US5880733A (en) * | 1996-04-30 | 1999-03-09 | Microsoft Corporation | Display system and method for displaying windows of an operating system to provide a three-dimensional workspace for a computer system |
JP2000509534A (en) * | 1996-04-30 | 2000-07-25 | ソニー エレクトロニクス インク | User interface for organizing and executing programs, files and data in a computer system |
US5745109A (en) * | 1996-04-30 | 1998-04-28 | Sony Corporation | Menu display interface with miniature windows corresponding to each page |
US5802466A (en) * | 1996-06-28 | 1998-09-01 | Mci Communications Corporation | Personal communication device voice mail notification apparatus and method |
US5736985A (en) * | 1996-07-02 | 1998-04-07 | International Business Machines Corp. | GUI pushbutton with multi-function mini-button |
US5767854A (en) | 1996-09-27 | 1998-06-16 | Anwar; Mohammed S. | Multidimensional data display and manipulation system and methods for using same |
US6088032A (en) | 1996-10-04 | 2000-07-11 | Xerox Corporation | Computer controlled display system for displaying a three-dimensional document workspace having a means for prefetching linked documents |
US5835094A (en) * | 1996-12-31 | 1998-11-10 | Compaq Computer Corporation | Three-dimensional computer environment |
JPH10283158A (en) | 1997-02-04 | 1998-10-23 | Fujitsu Ltd | Stereoscopic display device for window and method therefor |
US6734884B1 (en) | 1997-04-04 | 2004-05-11 | International Business Machines Corporation | Viewer interactive three-dimensional objects and two-dimensional images in virtual three-dimensional workspace |
US6271842B1 (en) | 1997-04-04 | 2001-08-07 | International Business Machines Corporation | Navigation via environmental objects in three-dimensional workspace interactive displays |
US6281898B1 (en) | 1997-05-16 | 2001-08-28 | Philips Electronics North America Corporation | Spatial browsing approach to multimedia information retrieval |
US5767855A (en) | 1997-05-19 | 1998-06-16 | International Business Machines Corporation | Selectively enlarged viewer interactive three-dimensional objects in environmentally related virtual three-dimensional workspace displays |
US6025839A (en) | 1997-06-06 | 2000-02-15 | International Business Machines Corp. | Method for displaying information in a virtual reality environment |
US6577330B1 (en) | 1997-08-12 | 2003-06-10 | Matsushita Electric Industrial Co., Ltd. | Window display device with a three-dimensional orientation of windows |
US6184881B1 (en) * | 1997-10-21 | 2001-02-06 | International Business Machines Corporation | Color and symbol coded visual cues for relating screen menu to executed process |
US6275829B1 (en) * | 1997-11-25 | 2001-08-14 | Microsoft Corporation | Representing a graphic image on a web page with a thumbnail-sized image |
US6613100B2 (en) * | 1997-11-26 | 2003-09-02 | Intel Corporation | Method and apparatus for displaying miniaturized graphical representations of documents for alternative viewing selection |
JP2938420B2 (en) * | 1998-01-30 | 1999-08-23 | インターナショナル・ビジネス・マシーンズ・コーポレイション | Function selection method and apparatus, storage medium storing control program for selecting functions, object operation method and apparatus, storage medium storing control program for operating objects, storage medium storing composite icon |
US6122647A (en) * | 1998-05-19 | 2000-09-19 | Perspecta, Inc. | Dynamic generation of contextual links in hypertext documents |
US6363404B1 (en) | 1998-06-26 | 2002-03-26 | Microsoft Corporation | Three-dimensional models with markup documents as texture |
US6229542B1 (en) | 1998-07-10 | 2001-05-08 | Intel Corporation | Method and apparatus for managing windows in three dimensions in a two dimensional windowing system |
US6353926B1 (en) * | 1998-07-15 | 2002-03-05 | Microsoft Corporation | Software update notification |
US6577304B1 (en) | 1998-08-14 | 2003-06-10 | I2 Technologies Us, Inc. | System and method for visually representing a supply chain |
US6738809B1 (en) * | 1998-08-21 | 2004-05-18 | Nortel Networks Limited | Network presence indicator for communications management |
JP4032649B2 (en) | 1998-08-24 | 2008-01-16 | 株式会社日立製作所 | How to display multimedia information |
US6597358B2 (en) | 1998-08-26 | 2003-07-22 | Intel Corporation | Method and apparatus for presenting two and three-dimensional computer applications within a 3D meta-visualization |
US6188405B1 (en) | 1998-09-14 | 2001-02-13 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface, which exploits spatial memory, to objects |
US6243093B1 (en) | 1998-09-14 | 2001-06-05 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface, which exploits spatial memory in three-dimensions, to objects and which visually groups matching objects |
US6414677B1 (en) | 1998-09-14 | 2002-07-02 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface, which exploits spatial memory in three-dimensions, to objects and which visually groups proximally located objects |
US6160553A (en) | 1998-09-14 | 2000-12-12 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface, which exploits spatial memory in three-dimensions, to objects and in which object occlusion is avoided |
US6166738A (en) | 1998-09-14 | 2000-12-26 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface, which exploits spatial memory in three-dimensions, to objects |
US6054989A (en) | 1998-09-14 | 2000-04-25 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface, which exploits spatial memory in three-dimensions, to objects and which provides spatialized audio |
JP3956553B2 (en) | 1998-11-04 | 2007-08-08 | 富士ゼロックス株式会社 | Icon display processing device |
US6337698B1 (en) * | 1998-11-20 | 2002-01-08 | Microsoft Corporation | Pen-based interface for a notepad computer |
US6590593B1 (en) | 1999-04-06 | 2003-07-08 | Microsoft Corporation | Method and apparatus for handling dismissed dialogue boxes |
US6765567B1 (en) * | 1999-04-06 | 2004-07-20 | Microsoft Corporation | Method and apparatus for providing and accessing hidden tool spaces |
US7119819B1 (en) * | 1999-04-06 | 2006-10-10 | Microsoft Corporation | Method and apparatus for supporting two-dimensional windows in a three-dimensional environment |
US6909443B1 (en) * | 1999-04-06 | 2005-06-21 | Microsoft Corporation | Method and apparatus for providing a three-dimensional task gallery computer interface |
US6262724B1 (en) * | 1999-04-15 | 2001-07-17 | Apple Computer, Inc. | User interface for presenting media information |
GB9908631D0 (en) | 1999-04-15 | 1999-06-09 | Canon Kk | Search engine user interface |
US6426761B1 (en) | 1999-04-23 | 2002-07-30 | International Business Machines Corporation | Information presentation system for a graphical user interface |
US7222309B2 (en) * | 1999-06-02 | 2007-05-22 | Earthlink, Inc. | System and method of a web browser with integrated features and controls |
US7263667B1 (en) | 1999-06-09 | 2007-08-28 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface which facilitates decision making |
US6480210B1 (en) | 1999-06-30 | 2002-11-12 | Koninklijke Philips Electronics N.V. | Video browsing space |
US20010030664A1 (en) * | 1999-08-16 | 2001-10-18 | Shulman Leo A. | Method and apparatus for configuring icon interactivity |
US7134095B1 (en) | 1999-10-20 | 2006-11-07 | Gateway, Inc. | Simulated three-dimensional navigational menu system |
US7322524B2 (en) * | 2000-10-20 | 2008-01-29 | Silverbrook Research Pty Ltd | Graphic design software using an interface surface |
US6388181B2 (en) * | 1999-12-06 | 2002-05-14 | Michael K. Moe | Computer graphic animation, live video interactive method for playing keyboard music |
US6313855B1 (en) | 2000-02-04 | 2001-11-06 | Browse3D Corporation | System and method for web browsing |
US7546538B2 (en) * | 2000-02-04 | 2009-06-09 | Browse3D Corporation | System and method for web browsing |
US7249326B2 (en) * | 2000-04-06 | 2007-07-24 | Microsoft Corporation | Method and system for reducing notification area clutter |
JP4325075B2 (en) * | 2000-04-21 | 2009-09-02 | Sony Corporation | Data object management device |
US6938218B1 (en) | 2000-04-28 | 2005-08-30 | James Nolen | Method and apparatus for three dimensional internet and computer file interface |
US6983424B1 (en) | 2000-06-23 | 2006-01-03 | International Business Machines Corporation | Automatically scaling icons to fit a display area within a data processing system |
US6583798B1 (en) * | 2000-07-21 | 2003-06-24 | Microsoft Corporation | On-object user interface |
US8117281B2 (en) * | 2006-11-02 | 2012-02-14 | Addnclick, Inc. | Using internet content as a means to establish live social networks by linking internet users to each other who are simultaneously engaged in the same and/or similar content |
US7168051B2 (en) * | 2000-10-10 | 2007-01-23 | Addnclick, Inc. | System and method to configure and provide a network-enabled three-dimensional computing environment |
US6727924B1 (en) | 2000-10-17 | 2004-04-27 | Novint Technologies, Inc. | Human-computer interface including efficient three-dimensional controls |
US7673241B2 (en) * | 2002-06-26 | 2010-03-02 | Siebel Systems, Inc. | User interface for multi-media communication for the visually disabled |
US7134092B2 (en) | 2000-11-13 | 2006-11-07 | James Nolen | Graphical user interface method and apparatus |
US6922815B2 (en) * | 2000-11-21 | 2005-07-26 | James A. Nolen, III | Display method and apparatus for facilitating interaction with Web sites |
JP2002175139A (en) | 2000-12-07 | 2002-06-21 | Sony Corp | Information processor, menu display method and program storage medium |
US7266768B2 (en) * | 2001-01-09 | 2007-09-04 | Sharp Laboratories Of America, Inc. | Systems and methods for manipulating electronic information using a three-dimensional iconic representation |
US20030088452A1 (en) * | 2001-01-19 | 2003-05-08 | Kelly Kevin James | Survey methods for handheld computers |
US7216305B1 (en) * | 2001-02-15 | 2007-05-08 | Denny Jaeger | Storage/display/action object for onscreen use |
US6915489B2 (en) * | 2001-03-28 | 2005-07-05 | Hewlett-Packard Development Company, L.P. | Image browsing using cursor positioning |
US6987512B2 (en) | 2001-03-29 | 2006-01-17 | Microsoft Corporation | 3D navigation techniques |
US20040030741A1 (en) | 2001-04-02 | 2004-02-12 | Wolton Richard Ernest | Method and apparatus for search, visual navigation, analysis and retrieval of information from networks with remote notification and content delivery |
US20020167546A1 (en) * | 2001-05-10 | 2002-11-14 | Kimbell Benjamin D. | Picture stack |
US7107549B2 (en) | 2001-05-11 | 2006-09-12 | 3Dna Corp. | Method and system for creating and distributing collaborative multi-user three-dimensional websites for a computer system (3D Net Architecture) |
US6816176B2 (en) | 2001-07-05 | 2004-11-09 | International Business Machines Corporation | Temporarily moving adjacent or overlapping icons away from specific icons being approached by an on-screen pointer on user interactive display interfaces |
US6886138B2 (en) | 2001-07-05 | 2005-04-26 | International Business Machines Corporation | Directing users' attention to specific icons being approached by an on-screen pointer on user interactive display interfaces |
US7299418B2 (en) * | 2001-09-10 | 2007-11-20 | International Business Machines Corporation | Navigation method for visual presentations |
US7299419B2 (en) * | 2001-09-28 | 2007-11-20 | Business Objects, S.A. | Apparatus and method for combining discrete logic visual icons to form a data transformation block |
US7146576B2 (en) | 2001-10-30 | 2006-12-05 | Hewlett-Packard Development Company, L.P. | Automatically designed three-dimensional graphical environments for information discovery and visualization |
US7043701B2 (en) * | 2002-01-07 | 2006-05-09 | Xerox Corporation | Opacity desktop with depth perception |
AUPS058602A0 (en) | 2002-02-15 | 2002-03-14 | Canon Kabushiki Kaisha | Representing a plurality of independent data items |
US6850255B2 (en) * | 2002-02-28 | 2005-02-01 | James Edward Muschetto | Method and apparatus for accessing information, computer programs and electronic communications across multiple computing devices using a graphical user interface |
US20030179240A1 (en) * | 2002-03-20 | 2003-09-25 | Stephen Gest | Systems and methods for managing virtual desktops in a windowing environment |
US7249327B2 (en) * | 2002-03-22 | 2007-07-24 | Fuji Xerox Co., Ltd. | System and method for arranging, manipulating and displaying objects in a graphical user interface |
US7610563B2 (en) * | 2002-03-22 | 2009-10-27 | Fuji Xerox Co., Ltd. | System and method for controlling the display of non-uniform graphical objects |
US20030189602A1 (en) * | 2002-04-04 | 2003-10-09 | Dalton Dan L. | Method and apparatus for browsing images in a digital imaging device |
US7093199B2 (en) * | 2002-05-07 | 2006-08-15 | International Business Machines Corporation | Design environment to facilitate accessible software |
WO2004001575A1 (en) * | 2002-06-24 | 2003-12-31 | National Instruments Corporation | Task based polymorphic graphical program function nodes |
US7292243B1 (en) | 2002-07-02 | 2007-11-06 | James Burke | Layered and vectored graphical user interface to a knowledge and relationship rich data source |
AU2003242939A1 (en) | 2002-07-11 | 2004-02-02 | Koninklijke Philips Electronics N.V. | Conditionally blocking reproduction of content items |
US7814055B2 (en) | 2002-08-28 | 2010-10-12 | Apple Inc. | Method of managing a calendar and a computer system for implementing that method |
US7913183B2 (en) | 2002-10-08 | 2011-03-22 | Microsoft Corporation | System and method for managing software applications in a graphical user interface |
US7373612B2 (en) | 2002-10-21 | 2008-05-13 | Battelle Memorial Institute | Multidimensional structured data visualization method and apparatus, text visualization method and apparatus, method and apparatus for visualizing and graphically navigating the world wide web, method and apparatus for visualizing hierarchies |
US7493573B2 (en) * | 2003-02-07 | 2009-02-17 | Sun Microsystems, Inc. | Scrolling vertical column mechanism for cellular telephone |
US7536650B1 (en) * | 2003-02-25 | 2009-05-19 | Robertson George G | System and method that facilitates computer desktop use via scaling of displayed objects with shifts to the periphery |
US7286526B2 (en) * | 2003-03-14 | 2007-10-23 | International Business Machines Corporation | Uniform management of mixed network systems |
US7523391B1 (en) * | 2003-03-25 | 2009-04-21 | Microsoft Corporation | Indicating change to data form |
US7343567B2 (en) | 2003-04-25 | 2008-03-11 | Microsoft Corporation | System and method for providing dynamic user information in an interactive display |
US7237201B2 (en) * | 2003-05-20 | 2007-06-26 | Aol Llc | Geographic location notification based on identity linking |
JP2005010854A (en) | 2003-06-16 | 2005-01-13 | Sony Computer Entertainment Inc | Information presenting method and system |
GB2404546B (en) * | 2003-07-25 | 2005-12-14 | Purple Interactive Ltd | A method of organising and displaying material content on a display to a viewer |
US20050204306A1 (en) * | 2003-09-15 | 2005-09-15 | Hideya Kawahara | Enhancements for manipulating two-dimensional windows within a three-dimensional display model |
US7480873B2 (en) | 2003-09-15 | 2009-01-20 | Sun Microsystems, Inc. | Method and apparatus for manipulating two-dimensional windows within a three-dimensional display model |
US20050066292A1 (en) * | 2003-09-24 | 2005-03-24 | Xerox Corporation | Virtual piles desktop interface |
US6990637B2 (en) * | 2003-10-23 | 2006-01-24 | Microsoft Corporation | Graphical user interface for 3-dimensional view of a data collection based on an attribute of the data |
US8656274B2 (en) * | 2003-10-30 | 2014-02-18 | Avaya Inc. | Automatic identification and storage of context information associated with phone numbers in computer documents |
US8095882B2 (en) * | 2003-10-30 | 2012-01-10 | Avaya Technology Corp. | Additional functionality for telephone numbers and utilization of context information associated with telephone numbers in computer documents |
US20050125489A1 (en) * | 2003-11-26 | 2005-06-09 | Hanes David H. | System and method for determining messages on a server as relating to at least one functional component of a client system |
WO2005055034A1 (en) * | 2003-12-01 | 2005-06-16 | Research In Motion Limited | Previewing a new event on a small screen device |
US7490314B2 (en) * | 2004-01-30 | 2009-02-10 | Microsoft Corporation | System and method for exposing tasks in a development environment |
US7478328B2 (en) * | 2004-02-17 | 2009-01-13 | Think-Cell Software Gmbh | Method of entering a presentation into a computer |
JP4325449B2 (en) * | 2004-03-19 | 2009-09-02 | Sony Corporation | Display control device, display control method, and recording medium |
US7694236B2 (en) * | 2004-04-23 | 2010-04-06 | Microsoft Corporation | Stack icons representing multiple objects |
US7657846B2 (en) * | 2004-04-23 | 2010-02-02 | Microsoft Corporation | System and method for displaying stack icons |
US7490295B2 (en) * | 2004-06-25 | 2009-02-10 | Apple Inc. | Layer for accessing user interface elements |
US20060020904A1 (en) * | 2004-07-09 | 2006-01-26 | Antti Aaltonen | Stripe user interface |
US7178111B2 (en) | 2004-08-03 | 2007-02-13 | Microsoft Corporation | Multi-planar three-dimensional user interface |
US7441201B1 (en) | 2004-10-19 | 2008-10-21 | Sun Microsystems, Inc. | Method for placing graphical user interface components in three dimensions |
US20060092751A1 (en) * | 2004-11-04 | 2006-05-04 | Hewlett-Packard Development Company, L.P. | Peripheral management |
US20060107229A1 (en) | 2004-11-15 | 2006-05-18 | Microsoft Corporation | Work area transform in a graphical user interface |
AU2004240229B2 (en) | 2004-12-20 | 2011-04-07 | Canon Kabushiki Kaisha | A radial, three-dimensional, hierarchical file system view |
US7478326B2 (en) | 2005-01-18 | 2009-01-13 | Microsoft Corporation | Window information switching system |
US8341541B2 (en) * | 2005-01-18 | 2012-12-25 | Microsoft Corporation | System and method for visually browsing of open windows |
US8464176B2 (en) * | 2005-01-19 | 2013-06-11 | Microsoft Corporation | Dynamic stacking and expansion of visual items |
US7661069B2 (en) | 2005-03-31 | 2010-02-09 | Microsoft Corporation | System and method for visually expressing user interface elements |
US20070011617A1 (en) | 2005-07-06 | 2007-01-11 | Mitsunori Akagawa | Three-dimensional graphical user interface |
US20070055947A1 (en) * | 2005-09-02 | 2007-03-08 | Microsoft Corporation | Animations and transitions |
US7725839B2 (en) | 2005-11-15 | 2010-05-25 | Microsoft Corporation | Three-dimensional active file explorer |
US7730425B2 (en) * | 2005-11-30 | 2010-06-01 | De Los Reyes Isabelo | Function-oriented user interface |
WO2007064987A2 (en) * | 2005-12-04 | 2007-06-07 | Turner Broadcasting System, Inc. (Tbs, Inc.) | System and method for delivering video and audio content over a network |
EP2214094A3 (en) * | 2005-12-19 | 2010-10-06 | Research In Motion Limited | Computing device and method of indicating status of application program |
US7562312B2 (en) * | 2006-01-17 | 2009-07-14 | Samsung Electronics Co., Ltd. | 3-dimensional graphical user interface |
US8250486B2 (en) * | 2006-01-19 | 2012-08-21 | International Business Machines Corporation | Computer controlled user interactive display interface for accessing graphic tools with a minimum of display pointer movement |
US20070192727A1 (en) | 2006-01-26 | 2007-08-16 | Finley William D | Three dimensional graphical user interface representative of a physical work space |
US7823068B2 (en) * | 2006-02-28 | 2010-10-26 | Mark Anthony Ogle Cowtan | Internet-based, dual-paned virtual tour presentation system with orientational capabilities and versatile tabbed menu-driven area for multi-media content delivery |
US20070214431A1 (en) | 2006-03-08 | 2007-09-13 | Microsoft Corporation | Smart gadget resizing |
US20070214436A1 (en) * | 2006-03-13 | 2007-09-13 | Myers Raymond L Jr | Positional navigation graphic link system |
JP2007257336A (en) | 2006-03-23 | 2007-10-04 | Sony Corp | Information processor, information processing method and program thereof |
US20070233782A1 (en) * | 2006-03-28 | 2007-10-04 | Silentclick, Inc. | Method & system for acquiring, storing, & managing software applications via a communications network |
EP2010999A4 (en) * | 2006-04-21 | 2012-11-21 | Google Inc | System for organizing and visualizing display objects |
DE102006021400B4 (en) * | 2006-05-08 | 2008-08-21 | Combots Product Gmbh & Co. Kg | Method and device for providing a selection menu associated with a displayed symbol |
US20080126956A1 (en) * | 2006-08-04 | 2008-05-29 | Kodosky Jeffrey L | Asynchronous Wires for Graphical Programming |
US20080040678A1 (en) * | 2006-08-14 | 2008-02-14 | Richard Crump | Interactive Area Guide Method, System and Apparatus |
EP1895374B1 (en) * | 2006-08-29 | 2016-04-06 | Rockwell Automation Technologies, Inc. | HMI devices with integrated user-defined behaviour |
US7800694B2 (en) * | 2006-08-31 | 2010-09-21 | Microsoft Corporation | Modular grid display |
US7665033B2 (en) * | 2006-08-31 | 2010-02-16 | Sun Microsystems, Inc. | Using a zooming effect to provide additional display space for managing applications |
US7581186B2 (en) * | 2006-09-11 | 2009-08-25 | Apple Inc. | Media manager with integrated browsers |
US8001472B2 (en) * | 2006-09-21 | 2011-08-16 | Apple Inc. | Systems and methods for providing audio and visual cues via a portable electronic device |
US20080115077A1 (en) * | 2006-11-09 | 2008-05-15 | International Business Machines Corporation | Persistent status indicator for calendar |
TWI340340B (en) | 2006-12-01 | 2011-04-11 | Inst Information Industry | User interface apparatus, method, application program, and computer readable medium thereof |
US10156953B2 (en) * | 2006-12-27 | 2018-12-18 | Blackberry Limited | Method for presenting data on a small screen |
US8009628B2 (en) | 2007-02-28 | 2011-08-30 | Telefonaktiebolaget Lm Ericsson (Publ) | Private mobility management enhancements |
US7904062B2 (en) * | 2007-03-08 | 2011-03-08 | Yahoo! Inc. | Scrolling mobile advertisements |
US8640100B2 (en) * | 2007-04-20 | 2014-01-28 | National Instruments Corporation | Debugging a statechart using a graphical program |
KR100886557B1 (en) | 2007-05-03 | 2009-03-02 | 삼성전자주식회사 | System and method for face recognition based on adaptive learning |
US8667418B2 (en) * | 2007-06-08 | 2014-03-04 | Apple Inc. | Object stack |
US9086785B2 (en) | 2007-06-08 | 2015-07-21 | Apple Inc. | Visualization object receptacle |
US8745535B2 (en) * | 2007-06-08 | 2014-06-03 | Apple Inc. | Multi-dimensional desktop |
US7996782B2 (en) * | 2007-06-08 | 2011-08-09 | National Instruments Corporation | Data transfer indicator icon in a diagram |
US8892997B2 (en) * | 2007-06-08 | 2014-11-18 | Apple Inc. | Overflow stack user interface |
US20080307330A1 (en) * | 2007-06-08 | 2008-12-11 | Apple Inc. | Visualization object divet |
US8473859B2 (en) | 2007-06-08 | 2013-06-25 | Apple Inc. | Visualization and interaction models |
US8381122B2 (en) * | 2007-06-08 | 2013-02-19 | Apple Inc. | Multi-dimensional application environment |
US7792748B1 (en) * | 2007-09-19 | 2010-09-07 | Capital One Financial Corporation | Method and system for performing a financial transaction using a user interface |
JP2009211663A (en) | 2008-03-06 | 2009-09-17 | Nissan Motor Co Ltd | Display controller and item display method |
US9086885B2 (en) | 2012-12-21 | 2015-07-21 | International Business Machines Corporation | Reducing merge conflicts in a development environment |
- 2007
  - 2007-06-08 US US11/760,692 patent/US9086785B2/en active Active
- 2015
  - 2015-07-20 US US14/803,898 patent/US20160018970A1/en not_active Abandoned
- 2019
  - 2019-08-24 US US16/550,220 patent/US11086495B2/en active Active
- 2021
  - 2021-06-15 US US17/347,947 patent/US20210357108A1/en active Pending
- 2023
  - 2023-03-07 US US18/118,605 patent/US20230205406A1/en active Pending
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5757371A (en) * | 1994-12-13 | 1998-05-26 | Microsoft Corporation | Taskbar with start menu |
US20010028369A1 (en) * | 2000-03-17 | 2001-10-11 | Vizible.Com Inc. | Three dimensional spatial user interface |
US20030052927A1 (en) * | 2001-09-20 | 2003-03-20 | International Business Machines Corporation | System and method for indicating a status of multiple features of a data processing system |
US20030142143A1 (en) * | 2002-01-28 | 2003-07-31 | International Business Machines Corporation | Varying heights of application images to convey application status |
US20050021336A1 (en) * | 2003-02-10 | 2005-01-27 | Katsuranis Ronald Mark | Voice activated system and methods to enable a computer user working in a first graphical application window to display and control on-screen help, internet, and other information content in a second graphical application window |
US20060061597A1 (en) * | 2004-09-17 | 2006-03-23 | Microsoft Corporation | Method and system for presenting functionally-transparent, unobstrusive on-screen windows |
US20070016873A1 (en) * | 2005-07-15 | 2007-01-18 | Microsoft Corporation | Visual expression of a state of an application window |
US20110145753A1 (en) * | 2006-03-20 | 2011-06-16 | British Broadcasting Corporation | Content provision |
US20070256032A1 (en) * | 2006-04-28 | 2007-11-01 | Petri John E | Presenting identifiers and states of processes in a stacked cursor |
US20080046832A1 (en) * | 2006-08-15 | 2008-02-21 | International Business Machines Corporation | Notification of state transition of an out-of-focus application |
US20120242692A1 (en) * | 2011-03-17 | 2012-09-27 | Kevin Laubach | Linear Progression Based Window Management |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10599288B2 (en) | 2016-05-09 | 2020-03-24 | Beijing Xiaomi Mobile Software Co., Ltd. | Method and device for displaying an application interface |
US11416112B2 (en) | 2016-05-09 | 2022-08-16 | Beijing Xiaomi Mobile Software Co., Ltd. | Method and device for displaying an application interface |
US10019851B2 (en) * | 2016-10-25 | 2018-07-10 | Microsoft Technology Licensing, Llc | Positioning objects in three-dimensional graphical space |
US20200034023A1 (en) * | 2018-07-27 | 2020-01-30 | Nintendo Co., Ltd. | Non-transitory computer-readable storage medium with executable program stored thereon, information processing apparatus, information processing method, and information processing system |
US11003312B2 (en) * | 2018-07-27 | 2021-05-11 | Nintendo Co., Ltd. | Non-transitory computer-readable storage medium with executable program stored thereon, information processing apparatus, information processing method, and information processing system |
US11503256B2 (en) | 2019-09-04 | 2022-11-15 | Material Technologies Corporation | Object feature visualization apparatus and methods |
US11622096B2 (en) | 2019-09-04 | 2023-04-04 | Material Technologies Corporation | Object feature visualization apparatus and methods |
US11681751B2 (en) | 2019-09-04 | 2023-06-20 | Material Technologies Corporation | Object feature visualization apparatus and methods |
US11683459B2 (en) * | 2019-09-04 | 2023-06-20 | Material Technologies Corporation | Object feature visualization apparatus and methods |
Also Published As
Publication number | Publication date |
---|---|
US20230205406A1 (en) | 2023-06-29 |
US11086495B2 (en) | 2021-08-10 |
US9086785B2 (en) | 2015-07-21 |
US20190377478A1 (en) | 2019-12-12 |
US20210357108A1 (en) | 2021-11-18 |
US20080307364A1 (en) | 2008-12-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11086495B2 (en) | Visualization object receptacle | |
US8381122B2 (en) | Multi-dimensional application environment | |
US8667418B2 (en) | Object stack | |
US8473859B2 (en) | Visualization and interaction models | |
US8745535B2 (en) | Multi-dimensional desktop | |
US20080307330A1 (en) | Visualization object divet | |
US10579250B2 (en) | Arranging tiles | |
US20160011742A1 (en) | System And Method For Providing User Access | |
US20080307359A1 (en) | Grouping Graphical Representations of Objects in a User Interface | |
EP1207449A2 (en) | A method and system for facilitating the selection of icons | |
AU2021218062B2 (en) | Object stack | |
AU2023201453A1 (en) | Object stack |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |