US20090231356A1 - Graphical user interface for selection of options from option groups and methods relating to same - Google Patents


Info

Publication number
US20090231356A1
Authority
US
United States
Prior art keywords
color
region
user
computer program
palette
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/406,066
Inventor
Kevin Barnes
Satya Mallick
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TAAZ Inc
Original Assignee
Photometria Inc
Application filed by Photometria Inc filed Critical Photometria Inc
Priority to US12/406,066
Assigned to PHOTOMETRIA, INC. reassignment PHOTOMETRIA, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BARNES, KEVIN, MALLICK, SATYA
Publication of US20090231356A1
Assigned to TAAZ, INC. reassignment TAAZ, INC. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: PHOTOMETRIA, INC.

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14Display of multiple viewports
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0255Targeted advertisements based on user history
    • G06Q30/0256User search
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0269Targeted advertisements based on user profile or attribute
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0273Determination of fees for advertising
    • G06Q30/0275Auctions
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/04Trading; Exchange, e.g. stocks, commodities, derivatives or currency exchange
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G09G5/06Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed using colour palettes, e.g. look-up tables
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0666Adjustment of display parameters for control of colour parameters, e.g. colour temperature
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/0407Resolution change, inclusive of the use of different resolutions for different screen areas
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/10Mixing of images, i.e. displayed pixel being the result of an operation, e.g. adding, on the corresponding input pixels

Definitions

  • One or more embodiments of the invention described herein pertain to the field of computer systems. More particularly, but not by way of limitation, one or more embodiments of the invention enable the rendering of a computer graphical user interface for the selection of options from option groups.
  • The graphical user interface provides users with screen elements that enable the efficient selection of context-appropriate colors, which are applied to an image in a corresponding screen region.
  • Color selection interfaces generally lack an ability to allow a user to select a color based on the availability of the color and/or the appropriateness of a color in a given situation.
  • Color selection interfaces that allow for the grouping of colors into palettes often do so in a fashion that is arbitrary with respect to shade. For example, many group all reds together into a single palette.
  • The interfaces provided for color selection are generally formulaic and vary little from program to program. Almost all involve a process requiring a large amount of user experimentation to create the desired color based on manual selection of the hue, saturation and luminosity (HSL) values or Red, Green, Blue (RGB) values. Others simply provide a limited supply of colors. These interfaces also lack the ability to allow a user to select from a wide variety of colors, which is necessary in an interface for selecting colors for transference to a photographic image. For example, if a user picks a shade of orange from an HSL color selection interface, the interface cannot provide a name for the color that the user could take to a local hardware store to purchase a matching paint color with which to paint a house.
  • Interfaces for color selection also lack a coherent and ordered method of tracking recent selections within the interface, while maintaining the context appropriate application of these selections as described above.
  • One or more embodiments of the invention are directed to a graphical user interface for the selection of options from option groups.
  • The graphical user interface described herein provides users with an arrangement of screen elements that enables the user to make color choices within a particular context and apply the selected colors to an image.
  • For example, a graphical user interface may enable a user to see the results of applying makeup to an image of a person's face.
  • This interface may offer multiple tabs to enable a user to select a particular section of a person's face to which the user chooses to apply makeup.
  • The interface may enable a user to select the color palette of the makeup to be applied through a circular region with a series of option group tabs.
  • Each option group tab has a group of associated colors.
  • The outer portion of the circular region, or flywheel, may have multiple color segments. The color segments are arranged so that adjacent color segments are perceived to be most similar.
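The patent does not specify how segments are ordered so that neighbors look similar; a minimal sketch is a greedy nearest-neighbor pass over RGB values, using Euclidean distance in RGB space as a rough stand-in for perceptual similarity. The function name and distance metric are illustrative assumptions, not the patented method.

```python
import math

def order_by_similarity(colors):
    """Greedily order RGB colors so adjacent entries are close.

    Starts from the first color and repeatedly appends the nearest
    remaining color by Euclidean distance in RGB space (an assumed
    proxy for perceptual similarity).
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    remaining = list(colors)
    ordered = [remaining.pop(0)]
    while remaining:
        nearest = min(remaining, key=lambda c: dist(ordered[-1], c))
        remaining.remove(nearest)
        ordered.append(nearest)
    return ordered

# Two reds and two blues end up adjacent to their near-twins.
palette = [(255, 0, 0), (0, 0, 255), (250, 10, 10), (10, 10, 250)]
print(order_by_similarity(palette))
```

A production interface would more likely sort in a perceptually uniform space such as CIELAB, but the greedy idea is the same.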
  • The inner portion of the circle may present the history of the colors previously chosen, and may have an uncolored center circle surrounded by a series of uncolored circles.
  • When a different option group tab is selected, a new group of color segments may be presented in the flywheel portion of the circle.
  • When the user hovers over a color segment, the size of that segment may expand to allow the user to see the color more clearly.
  • When the user clicks on a color segment, the selected section of the person's face changes to the color of that segment.
  • The selected color fills the center circle.
  • When a different color is subsequently selected, the center circle is filled with the new color, and one of the circles surrounding the center circle is filled with the previously selected color.
  • In this way, the center circle and the circles surrounding it present the history of the previously selected color choices.
  • One or more embodiments of the invention are directed to providing users with a graphical user interface that enables the users to apply virtual makeup to an image. Users may, for instance, upload a picture of themselves and use the graphical user interface components described herein to apply virtual makeup to the image.
  • Users utilize the graphical user interface components to make color choices about various color shades of makeup such as foundation, concealer, blush, eye-shadow, eye-liner, mascara, lipstick, lip liner, lip gloss, and contact lenses. Color choices are made by a user as to what color of makeup to apply, and the system renders the chosen color to the image.
  • The user's color choices and the context within which the choices were made are retained in a recent selection screen region of the interface.
  • The method described here is not limited as to the type of computer it may run on, and may for instance operate on any generalized computer system that has the computational ability to execute the methods described herein and can display the results of the users' choices on a display means.
  • The computer typically includes at least a keyboard, a display device such as a monitor, and a pointing device such as a mouse.
  • The computer also typically comprises a random access memory, a read only memory, a central processing unit, and a storage device such as a hard disk drive.
  • The computer may also comprise a network connection that allows the computer to send and receive data through a computer network such as the Internet.
  • The invention may be embodied on mobile computer platforms such as cell phones, Personal Digital Assistants (PDAs), kiosks, game boxes, or any other computational device.
  • The term "options" as used here refers to choices selectable by a user for application of the option to a subject.
  • In one or more embodiments, the options are colors associated with facial makeup that are applied to a subject image, and the context for the options provided depends on the part of the image the color is to be applied to.
  • The system is able to dynamically generate the options presented to the user and may, for instance, determine what colors to lay out on the fly based on the user's selection. The number of colors and the particular colors to be displayed are generally dictated by user choice. If a user asks to see only a certain type, category or subcategory of makeup, the system renders the color choices based on the user's input.
  • If the user asks for SPF foundations, for example, the system supporting the graphical user interface obtains the makeup that is classified as such, determines the colors to be displayed using the corresponding color data associated with those makeup choices, and renders the choices within the graphical user interface for selection by the user.
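The lookup described above amounts to filtering a product catalog by attributes and collecting the color values of the matches. The sketch below assumes a hypothetical catalog of product dictionaries with `type`, `spf`, and `color` fields; the field names and data are illustrative, not from the patent.

```python
def palette_for_group(items, **criteria):
    """Return the discrete color values of items matching every criterion.

    `items` is a list of dicts describing makeup products; `criteria`
    are attribute/value pairs such as spf=15 (assumed schema).
    """
    matches = [it for it in items
               if all(it.get(k) == v for k, v in criteria.items())]
    return [it["color"] for it in matches]

# Hypothetical catalog entries used only for illustration.
catalog = [
    {"name": "Sheer Silk", "type": "foundation", "spf": 15, "color": (230, 200, 180)},
    {"name": "Matte Base", "type": "foundation", "spf": 0,  "color": (220, 190, 170)},
    {"name": "Sun Shield", "type": "foundation", "spf": 15, "color": (240, 210, 185)},
]
print(palette_for_group(catalog, type="foundation", spf=15))
```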
  • The visual presentation used within the graphical user interface to present color options to the user can vary depending upon which embodiment of the invention is implemented.
  • In one or more embodiments, a circular color flywheel type interface is utilized to present the color choices.
  • The colors displayed on the color flywheel show the user what color choices are available for application to the image.
  • The user may change the colors displayed on the color flywheel by defining or selecting a new option group.
  • Each option group has an associated collection of colors, and when the option group is active, the associated colors are displayed on the color flywheel.
  • Information about the operations and color choices already made by the user is displayed using a collection of circles.
  • The center-most circle indicates which color is active and presently applied to the associated image, and the various circles surrounding this active circle depict what other colors have been or may be applied to the image.
  • One advantage of using a color flywheel to implement one or more embodiments of the invention is that the colors on the color flywheel can be determined on the fly, as mentioned above and more fully described throughout this document.
  • The method described herein, in the context of a graphical user interface, enables users to visually glance at a history view as the history data is collected and gathered based upon user selections within a specific set of option groups. As new options are selected, the interface presents and distinguishes between the currently applied and selected option as well as options that have been recently selected, through the use of dynamic graphical representation. A number of items are kept in the history, and this number can be as large or as small as is called for in any implementation of an embodiment of the invention.
  • The interface moves in sequence from the most recently selected to the earliest selected option. When the interface history is filled, the earliest selected option is removed from the list to make room for new selections.
  • One or more embodiments of the invention are also able to recognize when a new selection by a user is already in the recent history and has not yet been removed; in that case, the history is re-sorted instead of duplicating the entry and taking up a second history position on the interface.
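The history behavior described in the bullets above is a bounded most-recent-first list with de-duplication. A minimal sketch, with class and method names chosen for illustration:

```python
class SelectionHistory:
    """Most-recent-first history with a fixed capacity.

    Re-selecting a value already in the history moves it to the front
    instead of occupying a second position; when the history is full,
    the earliest selection is dropped to make room.
    """
    def __init__(self, capacity=8):
        self.capacity = capacity   # e.g., center circle plus surrounding circles
        self.entries = []          # entries[0] is the currently applied option

    def select(self, option):
        if option in self.entries:
            self.entries.remove(option)   # re-sort rather than duplicate
        self.entries.insert(0, option)
        if len(self.entries) > self.capacity:
            self.entries.pop()            # drop the earliest selection

h = SelectionHistory(capacity=3)
for color in ["red", "plum", "coral", "red", "nude"]:
    h.select(color)
print(h.entries)
```

Selecting "red" a second time promotes it rather than duplicating it, and adding "nude" to a full history evicts the oldest entry.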
  • The interface is able to retain its history through navigation. As a user navigates through option groups that are associated with the same subject area of application, the history persists, allowing selections by a user to be retained and referred back to even between option groups. A new history is created once the user navigates to a new subject area, and the graphical representation of the history is blanked. However, should a user choose to return to the previous subject area, the history for that area would once again become available. The history may also persist across user sessions.
  • In one or more embodiments, the interface is configured to present makeup options for application to an image of a face.
  • A user selecting a set of lipstick colors for application to the face would see their most recent selections represented in the recently selected history regions of the interface.
  • If the user then navigated to eye shadows, the interface would be repopulated with options appropriate to eye shadows, and a new history would be created based on those selections. Navigating back to the lipstick options would repopulate both the interface options and the previous history.
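One way to realize the per-subject-area persistence described above is a dictionary of histories keyed by subject area, so that navigating away and back restores the earlier list. The structure and names below are an illustrative assumption, not the patent's implementation.

```python
from collections import defaultdict

class ContextHistories:
    """Keeps a separate selection history per subject area
    (e.g., 'lips', 'eyes'), so navigating away and back
    restores the earlier history for that area."""
    def __init__(self):
        self.histories = defaultdict(list)  # subject area -> history list
        self.area = None

    def navigate(self, area):
        self.area = area
        return self.histories[area]   # previous history, or a fresh empty one

    def select(self, option):
        # Most recent selection goes to the front of the active history.
        self.histories[self.area].insert(0, option)

ctx = ContextHistories()
ctx.navigate("lips")
ctx.select("ruby red")
ctx.navigate("eyes")          # fresh, empty history for eyes
ctx.select("smoky gray")
print(ctx.navigate("lips"))   # returning restores the lips history
```

Serializing `histories` (for example, to a user profile) would also give the cross-session persistence the specification mentions.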
  • Context sensitivity with respect to the operations to be applied to the image is incorporated into one or more embodiments of the invention. For instance, when an image of a person is obtained by the system, the system processes the image using facial detection algorithms to identify the various parts of a person's face such as the eyes, nose, mouth, skin, cheeks, chin, hair, eyebrows, ears, or any other anatomically identifiable feature. This facial information is stored by the system for later use. When the user is later working on applying color changes to the uploaded image, the information obtained about the image such as where the eyes are located, what part of the image is skin, and where the lips are located is used to present a limited set of operations to the user based on the context within which a particular set of tasks may be applied.
  • The system is configured to present operations to the user that are relevant to the location on the image where the user has positioned the cursor or otherwise selected.
  • When the cursor is over an eye, for example, a mouse click presents operations that are eye specific.
  • When the cursor is over the lips, the operations are lip specific (e.g., relate to the application of lipstick, lip liner, or lip gloss), and when the cursor is over a relevant portion of the skin, the operations presented coincide with the location of the cursor (e.g., relate to foundation or concealer).
  • The location within the image where context-sensitive menus such as these are presented depends upon the particular face within the image, and is driven by the system making use of an automatic facial detection algorithm or by having the user manually identify key anatomical features of the face.
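The context sensitivity described above reduces to a hit test: map the cursor position to a detected facial region, then look up the operations for that region. The bounding-box representation and the operations table below are illustrative assumptions; real facial detection would produce finer-grained region masks.

```python
def operations_at(x, y, regions):
    """Map a cursor position to context-appropriate operations.

    `regions` maps a region name to a bounding box (x0, y0, x1, y1),
    as might be produced by a facial detection step. The operations
    table is illustrative only.
    """
    OPERATIONS = {
        "eyes": ["eye-shadow", "eye-liner", "mascara"],
        "lips": ["lipstick", "lip liner", "lip gloss"],
        "skin": ["foundation", "concealer"],
    }
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return OPERATIONS.get(name, [])
    return []

# Hypothetical boxes for one detected face.
face = {"eyes": (30, 40, 90, 60), "lips": (45, 100, 80, 120)}
print(operations_at(50, 110, face))
```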
  • FIG. 1 shows an exemplary embodiment of an option selector in which option palettes are selected.
  • FIG. 1A illustrates the color palettes selected by an option group in one or more embodiments of the invention.
  • FIG. 1B illustrates an exemplary embodiment where a single color is selected by a user.
  • FIG. 1C illustrates a history of colors in one or more embodiments of the invention.
  • FIG. 1D illustrates an example method for ordering colors in one or more embodiments of the invention.
  • FIG. 2 shows an exemplary method in which the option selector operates where colors and color palettes are used as the selections.
  • FIG. 3 shows an exemplary method in which the option selector of FIG. 1 operates with any available options.
  • FIG. 4 shows an exemplary embodiment of the option selector of FIG. 1 in a web application.
  • FIG. 5 shows the high level operation of the option selector within an application.
  • FIG. 6 shows a method for creating, saving and using a look to apply items associated with the look to an image.
  • The look attributes are saved independent of the image.
  • FIG. 7 shows an image being provided to the system, in which the user can select an already saved ‘look’ to apply to their image, or start on their own and create a new look for the uploaded image.
  • FIG. 8 shows an exemplary embodiment of an interface allowing for the ‘at-a-glance’ viewing of an image with multiple applied looks.
  • FIG. 9 presents exemplary computer and peripherals which, when programmed as described herein, may operate as a specially programmed computer capable of implementing one or more methods, apparatus and/or systems of the invention.
  • One or more embodiments of the invention are implemented in the context of a graphical user interface for selection of options from option groups and methods related to the same.
  • This disclosure relies in part on the novel programming algorithms and user interface elements discussed in two co-pending U.S. Patent applications filed on the same day and owned by the same assignee. These applications are entitled “SYSTEM AND METHOD FOR CREATING AND SHARING PERSONALIZED VIRTUAL MAKEOVER,” Ser. No. ______, filed 17 Mar. 2009, hereinafter known as the “Personalized Makeovers” co-pending patent application and “METHOD OF MONETIZING ONLINE PERSONALIZED BEAUTY PRODUCT SELECTIONS”, Ser. No. ______, filed 17 Mar. 2009, hereinafter, the “Monetizing” co-pending patent application. These patent applications are hereby incorporated by reference into this specification.
  • FIG. 1 shows a graphical user interface comprised of interactive screen regions that, upon activation by the end user, apply desired effects to an associated image. For instance a user may utilize the interface to identify a type of makeup and color and to then apply the selected makeup to a corresponding image. Images are generally uploaded or otherwise saved into a computer for application of the methodology described herein. Once an image is available to the system the user identifies a grouping (e.g., makeup type) that will have an associated collection of options (e.g., color choices). Each grouping has a set of stored values that defines the attributes of the group.
  • The color values associated with each type of foundation that falls within the selected grouping are stored in the system.
  • The values for each item are used to populate the option group.
  • There are various types of makeup, e.g., foundation, concealer, blush, eye-shadow, eye-liner, mascara, lipstick, lip liner, lip gloss, and contact lenses.
  • The user selects a makeup type such as foundation and then defines a grouping within the type.
  • For example, the user may want a color palette from a certain brand of makeup, or may choose colors that are recommended for a certain skin type.
  • The user may want makeup that has certain characteristics (e.g., SPF 15 or any other user-preferred characteristic) and can select such a grouping.
  • Each of the individual items of makeup has a set of discrete color values.
  • The system obtains the various discrete values for the different items of makeup that fall within the grouping.
  • These discrete color values are then presented to the user for selection and application to a corresponding image via the dynamically generated graphical user interface described in FIG. 1 and throughout.
  • Alternatively, the graphical user interface may be statically generated rather than dynamically generated.
  • In that case, a different flywheel may be loaded for each corresponding option, such as lipstick for example.
  • In one or more embodiments, a color flywheel type format is used to display the available color choices to the user. It will be apparent to those of ordinary skill in the art that any geometric shape, or combination of shapes, can be used as an alternative to a circle; hence, while some of the advantages described herein result from the circular color flywheel format, other shapes such as an ellipse, a triangle, or a polygon are feasible to implement.
  • A circular format may consistently generate the same interface regardless of the number of colors within the grouping.
  • A square format may be employed such that the size of the segments changes to accommodate more colors.
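The property that a circular format yields the same interface for any palette size follows from dividing the circumference into equal angular segments. A minimal sketch of that geometry (the function name is illustrative):

```python
def flywheel_segments(n_colors):
    """Divide the flywheel circumference into equal angular segments.

    Returns (start_angle, end_angle) in degrees for each color, so the
    same circular layout is generated regardless of palette size.
    """
    if n_colors <= 0:
        return []
    sweep = 360.0 / n_colors
    return [(i * sweep, (i + 1) * sweep) for i in range(n_colors)]

print(flywheel_segments(4))
```

A renderer would draw each segment as an annular arc between its two angles; the hover-to-enlarge effect mentioned earlier could simply widen the hovered segment's sweep at the expense of its neighbors.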
  • FIG. 1, which illustrates the color flywheel type embodiment of the invention, denotes an option group tab at 101.
  • Each option group tab has a set of associated items, where each item within the group has discrete color values that are presented on the color flywheel. If the option group relates to eye shadow, for instance, the color values depicted are those for each item of eye shadow within the group.
  • A visual cue that provides the user with an easy way to determine what the option group relates to may give the user information used to navigate between option groups.
  • In one or more embodiments, each option group is associated with a color palette that defines a plurality of color choices the user may select. The colors on the color palette are associated with a particular item (e.g., a makeup item) that is part of the grouping.
  • When an option group is selected, the corresponding color palette associated with that group is displayed.
  • The palette displayed in flywheel display segments 102a-102ff comprises the colors associated with option group tab 101. If option group tab 101 were representative of a group of lipstick colors, flywheel display segments 102a through 102ff would represent the various colors of lipstick within the group.
  • Option group tab 101 is stored as a group of references to the items within the grouping. In this case, for instance, option group tab 101 is a set of stored references to the items of lipstick and their corresponding discrete color values.
  • FIG. 1A illustrates how the screen appears once an option group is selected and the corresponding color flywheel (in this case screen region 114) is populated.
  • The number of colors displayed in screen region 114 is not fixed but depends on the option group.
  • Flywheel display segments 102a through 102ff contain various color choices, but the interface depicted may be configured to contain other options. In the examples given, this region is divided into sections around the circumference of circular region 103, shown in FIG. 1. Other methods of displaying the options within flywheel display segments 102a through 102ff are also contemplated as being within the scope and spirit of the invention. Hence, readers should note that other graphical arrangements that place the options associated with a group into an arrangement where the choices are reasonably proximate to the grouping and the history data are also feasible.
  • Screen elements around this middle currently selected screen region 104 contain a history of color values identifying recent selections made by the user as shown in this example at 105 - 112 .
  • Upon activation of the interface, the user is presented with options dynamically generated or preloaded by the application.
  • The option group selection unit is populated at 101 with options the system determined to be relevant. This determination is made by performing a lookup of the options (e.g., color values) associated with a specific option group.
  • The options provided in the Option Selector device of FIG. 1 are used to provide color choices to be applied to a picture, in this example a face.
  • The application in this example is able to recognize the different parts of the face to which makeup would be applied and then make changes.
  • A generalized interface for making use of the graphical user interface depicted in FIGS. 1, 1A, 1B, and 1C is shown in FIG. 4.
  • A user selects a subject at points 401-403, which relates to a specific portion of the subject image.
  • The user is further presented with specific options relating to the selected portion at 405.
  • For a selected mouth, the screen region at 405 showing available details for selection would present lipstick, lip liner, and lip gloss as options to populate the option selector interface.
  • The selection of lipstick in this example would then populate the palette, or option group, selection interface shown on FIG.
  • Each palette contains a variety of lipstick colors, grouped by context. For example, the colors red and orange would be grouped together in one palette, and neutral or naked colors would be grouped in another.
  • When a palette is selected, the color selection region is populated with the colors associated with that palette at flywheel display segment 102a. Selection of a color from this interface applies the lipstick color to the subject image's mouth in a manner consistent with lipstick being applied to a person's face. Selection of a new detail, such as lip liner, repopulates the interface with the associated palettes and colors for that detail.
  • The colors available in each palette are, in one or more embodiments of the invention, options that can be applied to an image of a person in a specific way: lipstick, for instance, is applied to lips, eye-shadows to eyelids, and blush to cheeks.
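Applying a selected color "in a manner consistent with lipstick being applied" implies blending the color into the pixels of the detected region rather than flat-filling it. The sketch below assumes a simple alpha blend over a boolean mask; the patent does not disclose its rendering method, and real implementations would use softer masks and texture-preserving compositing.

```python
def apply_color(image, mask, color, opacity=0.6):
    """Blend `color` into the pixels of `image` selected by `mask`.

    `image` is a 2-D list of (r, g, b) tuples and `mask` a same-shaped
    2-D list of booleans (e.g., the lip region found by face detection).
    A partial `opacity` keeps the underlying skin texture visible.
    """
    out = []
    for row, mrow in zip(image, mask):
        new_row = []
        for px, selected in zip(row, mrow):
            if selected:
                px = tuple(round((1 - opacity) * p + opacity * c)
                           for p, c in zip(px, color))
            new_row.append(px)
        out.append(new_row)
    return out

# One-row image: only the masked (left) pixel receives the lip color.
img = [[(200, 150, 140), (200, 150, 140)]]
mask = [[True, False]]
print(apply_color(img, mask, (180, 30, 60), opacity=0.5))
```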
  • the color options presented to a user are therefore provided based on the context in which they are to be applied.
  • the palettes and colors presented to the user relate to eye makeup, foundations, or other makeup-type applications.
  • the methods described here may be applied to other situations where there is a grouping that has options for application to an image.
  • one example is the planned decoration of a home, where colors are applied based on a desired ‘theme’ or ‘look’.
  • Applying colors to an image is not the only embodiment of the invention as textures, patterns, text, fonts, and any other form of graphically represented information can be applied as option groups and options for this interface.
  • Circular region 103 depicts the history interface in accordance with one or more embodiments of the invention, where 104 through 112 represent specific positions in the history, ranging from the most currently selected option, through a number of recently selected options.
  • the selected option is shown at currently selected screen region 104 , which is used to represent the ‘currently applied’ selection.
  • a lipstick color selected by the user and applied to the subject image would then display on the applied image as well as in current selected screen region 104 .
  • FIG. 1B shows the invention in one or more example embodiments where a single color is selected by a user.
  • a visual cue in the form of an arrow or other display element indicating the selection made (131) is provided to show the selected color.
  • a large patch of this selected color is shown at screen region 116 .
  • on mouse-over, the colors are enlarged to show the user a bigger view of the color associated with the selection. This mouse-over effect is depicted at screen element 130 in FIGS. 1B and 1C.
  • FIG. 1C illustrates the use of the recently selected screen regions 105 through 112 .
  • the newly selected option is shown at the currently selected screen region 104 .
  • This begins to build a history, and moves the last selected color from 104 to the most recently selected screen region of 105 .
  • the currently selected screen region 104 and recently selected screen regions 105-112 have an associated progression.
  • the new option occupies the current selected space of 104 , thereby moving the most recently selected options around recently selected screen regions 105 - 112 in the manner shown.
  • FIG. 1C shows, in an example embodiment of the invention, that a user has selected various colors, which are moved around the screen regions. Once these regions are filled with a history and a new selection is made, the interface drops the earliest selection from recently selected screen region 112 in order to make room for newer selections.
  • the flywheel is part of a graphical user interface where users select a color from a finite number of colors, typically between 50 and 200 colors. Rather than displaying colors in an arbitrary order, it may be preferable for the colors to be ordered in a fashion where successive colors appear similar and where families of colors are proximal. In the virtual makeover application, it is necessary to lay out colors on the flywheel in real-time because the selection of colors to be presented to a user may vary over time (e.g., the collection of products being offered by merchants may change, or a user may only want to see colors of lipstick from her two favorite brands).
  • Colors lie in a 3-dimensional space, often specified in terms of the three primaries red, green, and blue (RGB) or by their hue, saturation, and luminance (HSL).
  • in mapping colors to the flywheel, we are projecting colors from a 3-D space onto a 1-D circle.
  • the flywheel data structure may be a one-dimensional sorted circular doubly linked list of colors for example.
  • the projection may be data-dependent, using the set of input colors to determine the projection.
  • color(i) is the color assigned to the i-th position on the flywheel
  • ⁇ x,y> means the distance in color space between colors x and y.
  • the distance between colors could be any metric.
  • the metric may be the square of the Euclidian distance in the RGB color space.
  • a circular doubly linked data structure has nodes which may contain information describing each color's coordinates in color space for example.
  • a greedy algorithm may be employed to sort through the circular doubly linked data structure with the goal of re-ordering the colors so that colors that are perceived to be most similar are close together. Once the greedy algorithm completes the sorting process, the flywheel display may then be populated with the color in the first node filling the first screen region in the flywheel display, the color in the second node filling the second screen region in the flywheel display, and so forth.
  • FIG. 1D illustrates an example method for ordering colors in one or more embodiments of the invention.
  • a circular doubly linked flywheel data structure is created which may hold “N” records.
  • a doubly linked flywheel data structure may consist of a sequence of nodes, each containing data fields and references pointing to the next and previous nodes.
  • Each of the nodes in the flywheel data structure may be associated with a screen region on the flywheel display for example.
  • Each of the colors in a palette may be associated with each of the nodes in the flywheel data structure.
  • the coordinates in color space for each color may be associated with the flywheel data structure. Thus, for a given color palette, each color will be associated with a node in the circular doubly linked flywheel data structure.
  • variable integer “n” is set to 1.
  • the Local Cost is calculated for colors n ⁇ 1, n, n+1, and n+2.
  • the Local Cost is calculated as d(color(n), color(n−1)) + d(color(n+1), color(n+2)) − d(color(n−1), color(n+1)) − d(color(n), color(n+2)), where d(x, y) is the distance in color space between colors x and y.
  • the calculation may be executed in a tangible memory medium or a microprocessor-based computer for example.
  • the value of the Local Cost may then be considered. Should the value of the Local Cost be less than zero (“0”), the flow may divert to block 154. Should the value of the Local Cost not be less than zero, the flow will divert to block 155.
  • the order of color(n) and color(n+1) may be swapped.
  • the variable integer n is incremented by one (i.e., set to the value of n plus one).
  • the value of “n” is compared to that of “N.” When the value of “n” exceeds the value of “N,” the process flow is diverted to block 158 .
  • otherwise, the process flow is diverted to block 157.
  • if the accumulated time for the process exceeds a timeout value, the process flow is diverted to block 158. If the accumulated time for the process does not exceed the timeout value, the process flow is diverted to block 152.
  • the flywheel display may be displayed.
  • the first screen region may be filled with color(1), the second screen region filled with color(2), and so forth, for example.
  • the flywheel may be displayed on a display monitor for example.
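The greedy re-ordering described for FIG. 1D can be sketched as follows. This is a minimal illustration, not the claimed implementation: a plain Python list with modular indexing stands in for the circular doubly linked flywheel data structure, the distance metric is the squared Euclidean distance in RGB space mentioned above, and a swap is made whenever the swapped arrangement strictly reduces the summed distance between neighboring colors (the same two edge pairs the Local Cost compares).

```python
import time

def dist(c1, c2):
    """Squared Euclidean distance in RGB space (one metric the text suggests)."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2))

def order_flywheel(colors, timeout=1.0):
    """Greedily reorder colors on a circular list so similar colors sit together."""
    colors = list(colors)
    N = len(colors)
    if N < 4:
        return colors  # too few colors for a meaningful 4-point local cost
    start = time.monotonic()
    improved = True
    while improved and time.monotonic() - start < timeout:
        improved = False
        for n in range(N):
            a = colors[(n - 1) % N]
            b = colors[n]
            c = colors[(n + 1) % N]
            d = colors[(n + 2) % N]
            # Only the edges (a,b) and (c,d) change when b and c are swapped.
            before = dist(a, b) + dist(c, d)
            after = dist(a, c) + dist(b, d)
            if after < before:  # swap only when it reduces neighbor distance
                colors[n], colors[(n + 1) % N] = colors[(n + 1) % N], colors[n]
                improved = True
    return colors
```

The loop repeats passes over the circle until no swap improves the layout or the timeout described at block 157 elapses, at which point the list can populate flywheel segments in order.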
  • the user interface has the ability to populate option groups on a context-dependent basis.
  • color palettes and the associated colors are populated based on the specific area of the subject image that colors are intended to be applied to.
  • the circular region 103 is capable of persistently remembering selected options during navigation of other palettes. So if a user were to pick three colors from one palette under ‘lipstick,’ then change palettes and pick three more colors, all six would be displayed as recent selections in the history. This persistence is capable of surviving navigation of other areas of the interface.
  • the palette and option group tab 101 and flywheel display segments 102 a through 102 ff are repopulated with new options upon the selection of a new detail.
  • the interface is also able to recognize whether an option being selected by a user is already in the recently selected screen regions 105-112, and is able to reorder the selections within the history region to represent the new selection order without duplicating the selection in the history or unnecessarily dropping a selection from recently selected screen region 112.
  • FIG. 2 represents the use of the interface in an embodiment relating to colors being applied to a subject image.
  • the interface obtains the palette selection and populates flywheel display segments 102 a through 102 ff with the colors associated with the selected palette.
  • the selection of a color from flywheel display segment 102 a applies said color to a subject image, and displays the color in the currently selected region of 104 .
  • the method then proceeds through decisions relating to the history region of the interface.
  • the interface determines whether or not a history needs to be built, or whether it already exists. A history will need to be built if more than one color has been selected from at least one of the palette options provided as described above.
  • the interface can perform the application of a newly selected color to the image, representing the currently selected color in the currently selected region 104 while moving the last selected color to recently selected region 105.
  • a further decision must be made depending on whether the history regions have been filled with options already or otherwise. This decision occurs at step 207 .
  • Steps 208 and 209 occur in the event that the history has not already been filled, and perform the application of a newly selected color to the image, rotating the color history around circular region 103 to make room for a new addition to the list.
  • Steps 210 through 212 occur in the event that the history region has already been filled, and perform the same function as above, but instead remove the earliest color from the list in order to free up a section. For example, if recently selected screen regions 105-112 were filled with color selections and a user selects a new color, the new color is displayed in currently selected screen region 104 as currently applied, while the last selected color moves to 105. All of the other recently selected colors rotate around, with the color occupying recently selected screen region 112 being removed from the list to make room. At the end of the decisions at 206, 209 or 212, the interface then loops the decision process.
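The history behavior of steps 204 through 212, together with the no-duplicates reordering described elsewhere in this section, can be sketched as a small class. This is an illustrative sketch, not the patented interface: `SelectionHistory`, its slot count, and its method names are invented here; the 'currently selected' slot plays the role of region 104 and the bounded recent list plays the role of regions 105-112.

```python
from collections import deque

class SelectionHistory:
    """One 'currently selected' slot plus a bounded list of recent slots."""

    def __init__(self, recent_slots=8):
        self.current = None                       # plays the role of region 104
        self.recent = deque(maxlen=recent_slots)  # plays the role of regions 105-112

    def select(self, option):
        if option == self.current:
            return                      # re-selecting the current option is a no-op
        if option in self.recent:
            self.recent.remove(option)  # reorder without duplicating the entry
        if self.current is not None:
            # The last selected option moves to the head of the recent list.
            # deque(maxlen=...) silently drops the oldest entry from the far
            # end (region 112) when the history is already full.
            self.recent.appendleft(self.current)
        self.current = option
```

Because the deque persists across `select` calls, selections survive palette changes exactly as the persistence behavior above requires.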
  • the interface obtains the option group selections and populates flywheel display segments 102 a through 102 ff with the options associated with the selected option group.
  • the selection of an option from flywheel display segment 102 a applies said option to the associated subject, and displays the selected option, or its relevant representation, in the ‘currently selected’ screen region of 104 .
  • the method then proceeds through decisions relating to the history region of the interface.
  • the interface determines whether or not a history needs to be built, or whether it already exists.
  • a history will need to be built if more than one option has been selected from at least one of the option group options provided as described above.
  • the interface can perform the application of a newly selected option to the associated subject, representing the currently selected option in the currently selected region 104 while moving the last selected option to recently selected region 105.
  • a further decision must be made depending on whether the history regions have been filled with options already or otherwise. This decision occurs at step 307 .
  • Steps 308 and 309 occur in the event that the history has not already been filled, and perform the application of a newly selected option to the subject, rotating the option history around circular region 103 to make room for a new addition to the list.
  • Steps 310 through 312 occur in the event that the history region has already been filled, and perform the same function as above, but instead remove the earliest option from the list in order to free up a section. For example, if recently selected screen regions 105-112 were filled with option selections and a user selects a new option, the new option is displayed in currently selected screen region 104 as currently applied, while the last selected option moves to 105. All of the other recently selected options rotate around, with the option occupying recently selected screen region 112 being removed from the list to make room.
  • the interface then loops the decision process.
  • FIG. 4 shows an embodiment of the invention in the context of a hypertext page where the options and option groups presented and selectable by the user are presented through the internet to a client computer.
  • an image is presented to the application and areas of the picture for application of colors are identified.
  • the eyes, skin and mouth are identified as being areas for color application and associated with the selection regions of 401 , 402 and 403 .
  • These areas can be further separated into details of color applications, for example the selection of eyes would allow the user to further select eye shadow, mascara, and eyeliner options, whereas selection of the mouth would allow the user to further select lipstick, lip liner and lip gloss.
  • the palette groups are provided at 405 .
  • Selection of the mouth region on 401 populates region 405 with the detail selections of lipstick, lip liner and lip gloss. Further selection of one of these options from 405 populates the color selection interface at 406 which has been described above in FIG. 1 .
  • the selection of a color from 406 applies the color to the subject image at 407 , while populating the color history regions of 103 as described fully above. It will be apparent that there are many embodiments of this interface, whether operated on a local system or through a computer network, and where the subject having options applied may be other than a facial image having makeup applied.
  • the subject image would be a graphical representation of a room, where the subject selections of 401, 402 and 403 could represent the walls, floor, and ceiling of said room, and further options would allow a user to select wallpaper, paint, and fabric.
  • the interface is not limited to a set of three options in each category, but that any plurality of option groups and options can be presented to the user, and that other embodiments of this invention other than those described in example could be used.
  • FIG. 5 describes the use of the interface within the context of the application.
  • a subject image is presented to the application at 501 , which is then analyzed by the application (see “Personalized Makeovers” co-pending patent application) and areas of the image are identified at step 502 .
  • Steps 503 - 507 obtain the options and option groups to be presented to the user through the interface, which are then provided and selected based on the area of image where options are to be applied.
  • the application processes both the subject and the options at step 508 and presents the user with the resulting output at step 509 .
  • the steps are repeated for each subject presented to the application, and each option selected by a user.
  • an image of a face is provided to the application which is then analyzed into areas at 502 .
  • Option groups such as the lipstick, lip liner, eye shadow and foundation options are collected and presented at 507 to the user.
  • the application processes the selection at 508 , and presents an output of the image with the selected colors and makeup types selected at 509 .
  • the system processes the image to identify the various features within the image. This is achieved using facial recognition or other image-processing algorithms, or the features may be located manually by the user. Once the features within an image are located, the system is then able to take actions based on the user input.
  • the image in this example is a face, which has been divided into sections for application of makeup. The eyes, mouth, and skin have been identified as areas for application, and assigned as ‘hotspots’ within the image. These ‘hotspots’ allow the user to directly interact with the image being modified and apply options relevant to the hotspot. For instance, blush is applied to the cheeks and lipstick to the lips.
  • each hotspot is determined by the system once the facial features are identified by the system or user.
  • the eyes, lips, cheeks, eyelashes or any other facial features have an associated set of actions.
  • These hotspots, which may differ from image to image, are based on the facial or image recognition system identifying the features.
  • the user confirms or adjusts the first attempt the system makes at identifying the features.
  • the system presents actions that can be performed on the part of the image associated with the hotspot.
  • commands that can be performed on the part of the image associated with the hotspot may be activated by other means such as a user touching a touch screen or through the use of a light pen for example.
  • This provides the user with context-sensitive menus that are based on different parts of the image having been given a feature classification by the system. An eyelid is thus identified as such, as is every other feature within the image.
  • the system uses this feature information to present commands to the user that are relevant to the feature.
  • a user using an activation command, most commonly a left click on a mouse pointer device, is able to access a menu wherein the menu relates to the area being clicked.
  • commands are presented when a user clicks the right or other buttons on a computer mouse.
  • right clicking on the area of an image that was identified as being eyes presents an eye-related context menu.
  • the user may then select one of the operations on the menu and perform direct manipulation of the image.
  • a right click on a part of the image that has no specific identifier would present the user with options that are applicable to the entire image.
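The context-menu behavior above can be sketched as a lookup from a hotspot's feature classification to its commands. The feature names and command lists below are hypothetical examples, not tables from the patent; the fallback entry models the whole-image options presented for areas with no specific identifier.

```python
# Hypothetical mapping from a hotspot's feature classification to the
# commands offered in its context menu.
CONTEXT_MENUS = {
    "eyes":  ["eye shadow", "mascara", "eyeliner"],
    "mouth": ["lipstick", "lip liner", "lip gloss"],
    "skin":  ["foundation", "blush", "bronzer"],
}

# Options applicable to the entire image, shown when the clicked area
# has no specific feature identifier.
WHOLE_IMAGE_MENU = ["adjust lighting", "reset makeover"]

def menu_for(feature):
    """Return the context menu for a classified hotspot, falling back to
    whole-image options for unclassified areas."""
    return CONTEXT_MENUS.get(feature, WHOLE_IMAGE_MENU)
```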
  • a number of beauty products can be applied to an image to create a personalized makeover.
  • the products are applied to the specific areas of the image to which they pertain: lipstick and lip liner to the lip regions of the image, eye shadow and liner to the eyes, and so on, until a full set of makeup has been created by the user in this personalized makeover.
  • This completed set of selections made by a user and applied to the subject image is then a complete “look.”
  • ‘look’ is used herein as a noun to describe the overall style and appearance of a subject, such as a person, personal fashion, or interior decoration.
  • ‘makeover’ is used as a noun to describe the creation of a new look to improve the attractiveness of a subject, conform to societal fashions, or simply modify the appearance.
  • this interface creates a new ‘look’ which is made up of the options selected by a user through this interface and applied to the subject image.
  • this look could contain a specific style and color of eye shadow for application, a specific color and amount of foundation to apply, specific colors of blush and thickness of application, and so on until all desired makeups are specified and given color and quantity values within the look.
  • a ‘look’ is the collection of options selected by a user from the option selection interface, the ‘look’ can be saved and stored independently of the subject image used to produce the look. Also, ‘looks’ can be predetermined and applied to any number of subject images. For example, if a user wished to apply a ‘high fashion look’ to their own face, the user can provide their own image at 601 , which is processed by the system (further described and referenced in the “Personalized Makeovers” copending patent application).
  • the areas of the image are identified to correspond with the mouth, eyes, skin areas and other parts of the image.
  • the high fashion ‘look’ contains data relating to the beauty products used to create the look, such as a specific brand and color of eye shadow, lipstick and so on. This data is then applied over the subject image to create the look.
  • the look could, in the context of interior decoration, contain furniture types with an associated style, wall color and texture with associated RGB color values and/or a texture image, and door types. It will be apparent to the reader that the concept of a saved look can relate to any number of options, selected from option groups, associated with a subject and presented to the user applied to the subject.
  • looks can be created by a user, and subsequently saved, and selected by another user for application to a new subject image.
  • the image shown at FIG. 4 element 407 is an active rendering of the subject image, with areas of the image identified, and the ‘look’ data applied over the identified areas of the subject image.
  • the ‘look’ is saved and associated with the subject image.
  • Other users are capable of viewing this database and determining what ‘looks’ they like, and may further apply a look to a subject image provided by them.
  • the information that defines a look is stored in a data structure that defines the color values and region information associated with each item that is needed to create the look.
  • the color values are those the user selected while creating the look and the region information defines what part of the image the colors should be applied against.
  • the various features identified during facial detection are used to define the regions for application of the color values. This information is saved and can later be applied to any image even if the look was created using a different image.
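The stored form of a look described above pairs each product's color value with the region it applies to, independent of any particular image. The sketch below assumes a simple JSON encoding; the class and field names (`Look`, `LookItem`, `region`) are invented here for illustration.

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class LookItem:
    """One product in a look: which color value goes on which region."""
    product_type: str   # e.g. "lipstick"
    color_rgb: tuple    # the color value the user selected
    region: str         # named region the color is applied against, e.g. "lips"

@dataclass
class Look:
    name: str
    items: list = field(default_factory=list)

    def save(self, path):
        # The look is stored independently of any subject image, so it can
        # later be applied to a different image.
        with open(path, "w") as f:
            json.dump(asdict(self), f)

    @classmethod
    def load(cls, path):
        with open(path) as f:
            data = json.load(f)
        return cls(name=data["name"],
                   items=[LookItem(**it) for it in data["items"]])
```

Applying a loaded look to a new image then amounts to mapping each item's named region onto the regions found by feature detection in that image.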
  • A method for applying a look to an image in accordance with one or more embodiments of the invention is shown in FIG. 6.
  • the look attributes or options such as what makeup was used to create the look are saved independent of the image.
  • an image is provided to the system and analyzed by the system, where areas for option application such as color values are identified.
  • the features within the image are identified and the user is presented with an interface for defining what makeup to apply to which features. Choices are made for instance as to what lipstick, eye-shadow, foundation, blush, or other makeup items make a look.
  • the look is created and applied to the subject image at step 602 .
  • the look is then saved in a structure that contains the saved combinations of beauty products, hair styles, accessories, and associated with the uploaded image at step 603 .
  • a second user can review the interface and see uploaded images with the saved looks applied at step 604 .
  • the second user can then upload a new image at step 605 , which is further given a saved look to apply to the image in the same manner as the previous image at step 606 .
  • FIG. 7 shows the image being provided to the system, in which the user can select an already saved ‘look’ to apply to their image, or start on their own and create a new look for the uploaded image.
  • the image may be uploaded or selected by a user.
  • the image is analyzed and the image is broken into sections for the option or look application.
  • the system determines whether the user wishes to apply a saved look to an image or whether to allow the user to create their own look through the system of applying options from option groups described above at 703 .
  • the user is then presented with the interface shown in FIG. 4 , where the image at 407 can either be unmodified and ready for option application at step 704 , or can be presented as an image rendered with options from a saved look at step 705 .
  • the rendered image is presented to the user at step 707 .
  • the look is applied to the image as layers over the image using the order generally used during the application of makeup. Foundation for instance is applied before blush, etc.
  • the process of applying a makeover is as follows: a makeover is represented as a base image and a set of layers that may follow the layers of application of real-life makeup. Rendering proceeds by applying the lowest layers up through the highest layers. The low layers would include concealer and foundation; blush might be applied on top of this; a hair image layer may sit at the top layer, with reasonable choices being made by the system about which layer is best to apply next.
  • the makeup associated with the look has an applied area, which is a defined style. This can be represented as a mask, which might be binary or might be an alpha mask containing numerical values.
  • the mask shape may be predefined. It might be warped by a user.
  • multi-toned eye shadow may be represented by three masks, one for each of three different eye shadow products, and these will be rendered in order.
  • the mask shapes are generally different. The user may modify the masks so as to, for example, fall on the crease of the eye and extend to the eye brow.
  • the mask for applying products like foundation or tanners or bronzers might use an automatically constructed mask based on automatic skin detection as described elsewhere. It may use a predefined mask whose shape and location are referenced to automatically or manually detected face features.
  • a rendering process for each layer takes the partially rendered image to that point and renders the new layer to the region defined by the mask.
  • the output of applying that layer will be a function of the makeup properties including its color, transparency, luster, gloss, metals content, and the underlying image color.
  • There are different ways to model each of the ways different types of makeup are applied.
  • One method is called alpha blending.
  • Others use the separation of specular from diffuse reflectance and apply colors to these components.
  • Others use shape models of the face to determine specular components.
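Of the modeling methods named above, alpha blending is the simplest to illustrate. The sketch below blends one layer's product color into a partially rendered image through a per-pixel alpha mask; pure-Python nested lists stand in for a real image representation, and the sketch ignores the luster, gloss, and specular effects the other methods address.

```python
def alpha_blend_layer(base, makeup_rgb, mask):
    """Blend one makeup layer into a partially rendered image.

    base:       rows of (r, g, b) pixels (the image rendered so far)
    makeup_rgb: the product color for this layer
    mask:       per-pixel alpha values in [0, 1]; 0 leaves the base pixel
                untouched, 1 replaces it entirely (a binary mask is the
                special case where alphas are only 0 or 1)
    """
    out = []
    for row_px, row_a in zip(base, mask):
        out_row = []
        for (r, g, b), a in zip(row_px, row_a):
            out_row.append((
                round((1 - a) * r + a * makeup_rgb[0]),
                round((1 - a) * g + a * makeup_rgb[1]),
                round((1 - a) * b + a * makeup_rgb[2]),
            ))
        out.append(out_row)
    return out
```

Rendering a whole look would call this once per layer, lowest layer first, passing each result as the `base` of the next call.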
  • the masks, their positions, and the product parameters are collectively called a look.
  • This look may be saved to a disk or other storage device.
  • the look may be loaded and applied to an original image.
  • the look may be applied to a second image of the same person.
  • the underlying facial features from the first image must be brought into correspondence with the second image.
  • This correspondence induces a transformation of the masked regions.
  • the foundation and bronzer layer would still use the automatically detected skin mask.
  • the subsequent layers would be transformed according to the correspondence.
  • Fast rendering can be accomplished using rendering at multiple resolutions and backgrounding. Each layer is rendered in turn at low resolution, then at progressively higher resolutions until the full-resolution rendering completes.
  • a user may be modifying the makeover (mask positions) faster than the rendering can terminate when at full resolution.
  • With multi-resolution rendering with backgrounding, if a mouse event causes a movement of a layer, the rendering will terminate at whatever resolution it has reached, and rendering will commence at the new position starting at the lowest resolution. This provides a high degree of interactivity. As the defined ‘looks’ can be considered separate but still associated with a subject presented to the system, it is possible for the same subject to have multiple looks applied to it and viewed at a glance.
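The multi-resolution backgrounding scheme can be sketched as a loop over increasing resolutions that polls for cancellation between passes. Both `render_at` and `cancelled` are hypothetical callables standing in for the real renderer and the mouse-event check; the point is that a mouse event abandons the remaining passes while keeping the best preview produced so far.

```python
def progressive_render(render_at, resolutions=(64, 256, 1024),
                       cancelled=lambda: False):
    """Render at increasing resolutions with backgrounding.

    render_at(res) produces a preview at the given resolution; cancelled()
    is polled before each pass and returns True once a mouse event has
    invalidated the work (the caller then restarts from the lowest
    resolution at the new layer position)."""
    preview = None
    for res in resolutions:
        if cancelled():
            break                  # abandon the higher-resolution passes
        preview = render_at(res)   # keep the best preview completed so far
    return preview
```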
  • FIG. 8 shows, in the context of our makeup example, an interface allowing for the ‘at-a-glance’ viewing of an image with multiple applied looks.
  • the original subject image, without any application, is shown at 801 .
  • a copy of the image with a specific applied look is shown at 802 .
  • This look would be a user-created, user-saved look containing details of the beauty product types and options being applied to the image.
  • a subsequent copy of the image is shown at 803 with a different look attached. This allows a user to compare images with various makeup selections (options) applied and make a personal selection based on the images presented to them.
  • One or more embodiments of the invention may be implemented in the form of one or more computer programs that when executed in computer memory may cause one or more computer processors to initiate the methods and processes described herein.
  • the files assembled to make up the software program may be stored on one or more computer-readable media and retrieved when needed to carry out the programmed methods and processes described herein.
  • one or more embodiments of the invention may comprise computer programs, data and other information further comprising but not limited to: sets of computer instructions, code sequences, configuration information, data and other information in any form, format or language usable by a general purpose computer or other data processing device, such that when such a computer or device contains, is programmed with, or has access to said computer programs, the data and other information transforms said general purpose computer into a machine capable of performing the methods and processes described herein, and specifically such as those described above.
  • Various embodiments of the invention may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, computer-readable media or any combination thereof.
  • article of manufacture (or alternatively, “computer program product,”) as used herein is intended to encompass a computer program of any form accessible from any computer-readable device, carrier or media.
  • the software in which various embodiments are implemented may be accessible through a transmission medium, such as for example, from a server over the network.
  • the article of manufacture in which the program is implemented may also employ transmission media, such as a network transmission line and/or a wireless transmission media.
  • a computer-readable medium suitable to provide computer readable instructions and/or computer readable data for the methods and processes described herein may be any type of magnetic, optical, electrical or other storage medium including disk, tape, CD, DVD, flash drive, thumb drive, storage card, distributed storage or any other memory device, location, approach or other storage medium or technique known to those of skill in the art.
  • the methods described here may not be limited as to the type of computer it may run upon and may for instance operate on any generalized computer system that has the computational ability to execute the methods described herein and can display the results of the user's choices on one or more display devices.
  • Display devices appropriate for providing interaction with the invention described herein include, but are not limited to, computer monitors, cell phones, PDAs, televisions, or any other form of computer-controllable output display.
  • a computer system refers to but is not limited to any type of computing device, including its associated computer software, data, peripheral devices, communications equipment and any required or desired computers that may achieve direct or indirect communication with a primary computing device.
  • a general-purpose computer may be utilized to implement one or more aspects of the invention.
  • the computer may include various input and output means, including but not limited to a keyboard or other textual input devices, a display device such as a monitor or other display screen, and a pointing device and/or user selection indicator such as a mouse, keypad, touch screen, pointing device, or other input/output devices known to those of skill in the art.
  • the general purpose computer described herein may include one or more banks of random access memory, read only memory, and one or more central processing unit(s).
  • the general purpose computer described herein may also include one or more data storage device(s) such as a hard disk drive, or other computer readable medium discussed above.
  • An operating system that executes within the computer memory may provide an interface between the hardware and software.
  • the operating system may be responsible for managing, coordinating and sharing of the limited resources within the computer.
  • Software programs that run on the computer may be managed by the operating system, which provides the program of the invention with access to the resources needed to execute. In other embodiments the program may run stand-alone on the processor to perform the methods described herein.
  • the method(s) described herein, when loaded on or executed through or by one or more of the general purpose computer(s) described above, may transform the general purpose computer(s) into a specially programmed computer able to perform the method or methods described herein.
  • the computer-readable storage medium(s) may be encoded with computer program instructions that, when accessed by a computer, cause the computer to load the program instructions into an accessible memory, thereby creating a specially programmed computer able to perform the methods described herein.
  • the specially programmed computer of the invention may also comprise a connection that allows the computer to send and/or receive data through a computer network such as the Internet or other communication network.
  • Mobile computer platforms such as cellular telephones, Personal Digital Assistants (PDAs), other hand-held computing devices, digital recorders, wearable computing devices, kiosks, set top boxes, games boxes or any other computational device, portable, personal, real or virtual or otherwise, may also qualify as a computer system or part of a computer system capable of executing the methods described herein as a specially programmed computer.
  • FIG. 9 depicts a general-purpose computer and peripherals that, when programmed as described herein, may operate as a specially programmed computer capable of implementing one or more methods, apparatus and/or systems of the invention.
  • Processor 907 may be coupled to bi-directional communication infrastructure 902 such as Communication Infrastructure System Bus 902 .
  • Communication Infrastructure 902 may generally be a system bus that provides an interface to the other components in the general-purpose computer system such as Processor 907 , Main Memory 906 , Display Interface 908 , Secondary Memory 912 and/or Communication Interface 924 .
  • Main Memory 906 may provide a computer readable medium for accessing and executing stored data and applications.
  • Display Interface 908 may communicate with Display Unit 910 which may be utilized to display outputs to the user of the specially-programmed computer system.
  • Display Unit 910 may comprise one or more monitors that may visually depict aspects of the computer program to the user.
  • Main Memory 906 and Display Interface 908 may be coupled to Communication Infrastructure 902 , which may serve as the interface point to Secondary Memory 912 and Communication Interface 924 .
  • Secondary Memory 912 may provide additional memory resources beyond Main Memory 906, and may generally function as a storage location for computer programs to be executed by Processor 907. Either fixed or removable computer-readable media may serve as Secondary Memory 912.
  • Secondary Memory 912 may comprise, for example, Hard Disk 914 and Removable Storage Drive 916 that may have an associated Removable Storage Unit 918 . There may be multiple sources of Secondary Memory 912 and systems of the invention may be configured as needed to support the data storage requirements of the user and the methods described herein. Secondary Memory 912 may also comprise Interface 920 that serves as an interface point to additional storage such as Removable Storage Unit 922 . Numerous types of data storage devices may serve as repositories for data utilized by the specially programmed computer system of the invention. For example, magnetic, optical or magnetic-optical storage systems, or any other available mass storage technology that provides a repository for digital information may be used.
  • Communication Interface 924 may be coupled to Communication Infrastructure 902 and may serve as a conduit for data destined for or received from Communication Path 926 .
  • a Network Interface Card (NIC) is an example of the type of device that, once coupled to Communication Infrastructure 902, may provide a mechanism for transporting data to Communication Path 926.
  • Computer networks such as Local Area Networks (LAN), Wide Area Networks (WAN), wireless networks, optical networks, distributed networks, the Internet or any combination thereof are some examples of the type of communication paths that may be utilized by the specially programmed computer system of the invention.
  • Communication Path 926 may comprise any type of telecommunication network or interconnection fabric that can transport data to and from Communication Interface 924 .
  • A Human Interface Device (HID) 930 may be provided.
  • HIDs that enable users to input commands or data to the specially programmed computer of the invention may comprise a keyboard, mouse, touch screen devices, microphones or other audio interface devices, motion sensors or the like. Any other device able to accept any kind of human input and in turn communicate that input to Processor 907 to trigger one or more responses from the specially programmed computer of the invention is likewise within the scope of the system of the invention.
  • While FIG. 9 depicts a physical device, the scope of the system of the invention may also encompass a virtual device, virtual machine or simulator embodied in one or more computer programs executing on a computer or computer system and acting or providing a computer system environment compatible with the methods and processes of the invention.
  • Where a virtual machine, process, device or otherwise performs substantially similarly to a physical computer system of the invention, such a virtual platform will also fall within the scope of a system of the invention, notwithstanding the description herein of a physical system such as that in FIG. 9.
  • One or more embodiments of the invention are configured to enable the specially programmed computer of the invention to take the input data given and transform it into an interface enabling dynamic layout of options or color within a group for application to an image, arrangement and ordering of colors, color selection history, context sensitive menus, and look history and saving, by applying one or more of the methods and/or processes of the invention as described herein.
  • the methods described herein are able to transform the raw input data provided to the system of the invention into a resulting output of the system using the specially programmed computer as described.
  • One or more embodiments of the invention are configured to enable a general-purpose computer to take one or more color palettes, and the color choices associated with each color palette, from memory and transform and display the graphical user interface component on Display Unit 910, for example.
  • the user, through interaction with Human Interface Device 930, enters a selection of a region of a subject image.
  • Processor 907 receives the selection of a region of a subject image, transforms the color palette and color choices data, and transmits the information to the Display Unit 910 for display.
  • the user may interact with the computer system through Human Interface Device 930 and may select a second color choice, which causes Processor 907 to process the information and transmit a signal to the graphical user interface on Display Unit 910.
  • when a user selects an identified region of a subject image, Processor 907 transmits data to Display Unit 910 and enables the user to see context sensitive menus that are associated with the specific identified region of the subject image.
  • Processor 907 requests the color records for a particular color palette. Processor 907 then calculates the arrangement of the colors in a color flywheel display through the use of a greedy algorithm. Processor 907 then transmits the signals to Display Unit 910, where they may be displayed to a user.
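The greedy arrangement step can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: colors are assumed to be RGB tuples, similarity is taken as Euclidean distance in RGB space, and the function name is ours.

```python
import math

def greedy_color_order(colors):
    """Arrange colors so adjacent flywheel segments are similar.

    Nearest-neighbor greedy pass: start from the first color, then
    repeatedly append the remaining color closest (in RGB distance)
    to the color most recently placed.
    """
    remaining = list(colors)
    ordered = [remaining.pop(0)]
    while remaining:
        nearest = min(remaining, key=lambda c: math.dist(c, ordered[-1]))
        remaining.remove(nearest)
        ordered.append(nearest)
    return ordered
```

A perceptually uniform color space would give better adjacency than raw RGB, but the greedy structure is the same either way.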
  • Human Interface Device 930 accepts a user's input, in which the user may select a first identified region of a subject image to apply a first color, and a second identified region of a subject image to apply a second color.
  • Processor 907 may process the metadata describing the first color and second color and store that information as a “look” in Main Memory 906 for example.
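As a concrete illustration of storing a “look” independently of the image, the per-region color metadata might be kept as a simple named mapping. The data structure and function names below are assumptions for illustration, not the patent's actual format:

```python
def save_look(look_store, name, region_colors):
    """Save a named 'look': metadata mapping identified image regions
    (e.g. 'lips', 'eyes') to the colors applied there.

    The look is stored independently of any particular image, so it
    can later be re-applied to a different uploaded photo.
    """
    look_store[name] = dict(region_colors)  # copy, so later edits don't mutate the saved look

def load_look(look_store, name):
    """Return the saved region-to-color mapping for re-application."""
    return dict(look_store.get(name, {}))
```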

Abstract

A system providing a graphical user interface for the selection of options from option groups and method for implementing the same. A graphical user interface enables a user to see the results of applying virtual makeup to an image of a person's face. The interface may enable a user to select the color of the makeup to be applied through a circular region with a series of option tabs. The outer portion of the circular region, or the flywheel, may have multiple color segments. The color segments are arranged so that adjacent color segments may be perceived to be most similar. The inner portion of the circle may present the history of the colors previously chosen and may have an uncolored center circle, and a series of uncolored circles surrounding the center circle.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to three U.S. Provisional Patent Applications, all filed on Mar. 17, 2008, and all co-owned by the same assignee. These applications are entitled “SYSTEM AND METHOD FOR CREATING AND SHARING PERSONALIZED VIRTUAL MAKEOVERS,” Ser. No. 61/037,323, “GRAPHICAL USER INTERFACE FOR SELECTION OF OPTIONS FROM OPTION GROUPS AND METHODS RELATING TO SAME,” Ser. No. 61/037,319, and “METHOD OF MONETIZING ONLINE PERSONALIZED BEAUTY PRODUCT SELECTIONS,” Ser. No. 61/037,314. These provisional patent applications are hereby incorporated by reference in their entirety into this specification.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • One or more embodiments of the invention described herein pertain to the field of computer systems. More particularly, but not by way of limitation, one or more embodiments of the invention enable the rendering of a computer graphical user interface for the selection of options from option groups. In at least one context the graphical user interface provides users with screen elements that enable the efficient selection of context appropriate colors that are applied to an image in a corresponding screen region.
  • 2. Description of the Related Art
  • Computer systems have made longstanding use of various systems for enabling users to affect the color of what is displayed on the computer screen. Early programs allowed a user to draw shapes and text in a rudimentary form. While the earliest incarnations of these graphic design programs functioned only in black and white, as computers developed, color options were added. These early applications, for example PC Paintbrush for Microsoft's Disk Operating System (DOS), created in 1985, allowed a user to select a drawing color from a palette. The user was presented with a variety of colors to choose from in rowed boxes where different shades of color were represented by a mix of pixels using a 16-color Enhanced Graphics Adapter and compatible display.
  • As graphical programs continued to develop, more color options became available, and truer representation of the color spectrum developed. A number of formats for color selection exist, including the rowed format continued from early graphics programs into later versions, such as Microsoft's Paint for Windows XP. Customization of these color selectors allows a user to move a point through a large square that contains certain colors, and also provides access to the millions of shades in between. The user may create a desired color by selecting hue, saturation and luminosity values for a color. While some vary, most color selection user interfaces use this method for color selection. Less commonly, a hexadecimal value associated with a color that is recognized by Hypertext Markup Language (HTML) browsers may be selected to create a desired color.
  • Color selection interfaces, however, generally lack an ability to allow a user to select a color based on the availability of the color and/or the appropriateness of a color in a given situation. Color selection interfaces that allow for the grouping of colors into palettes often do so in a fashion that is arbitrary to the shade of color. For example, many group all reds together into a single palette.
  • The interfaces provided for color selection are generally formulaic and vary little from program to program. Almost all involve a process requiring a large amount of user experimentation to create the desired color based on manual selection of the hue, saturation and luminosity (HSL) values or Red, Green, Blue (RGB) values. Others simply provide a limited supply of colors. These interfaces also lack the ability to allow a user to select from a wide variety of colors, which is necessary in an interface for selecting colors for transference to a photographic image. For example, if a user picks a shade of orange from an HSL color selection interface, the interface is unable to provide a name for the color that would allow the user to go to a local hardware store and purchase a matching paint color with which to paint a house.
  • Interfaces for color selection also lack a coherent and ordered method of tracking recent selections within the interface, while maintaining the context appropriate application of these selections as described above.
  • For at least the limitations described above there is a need for a computer graphical user interface for the selection of options from option groups in various contexts such as the one described in further detail throughout this document.
  • BRIEF SUMMARY OF THE INVENTION
  • One or more embodiments of the invention are directed to a graphical user interface for the selection of options from option groups. By way of example and not by limitation, the graphical user interface described herein provides users with an arrangement of screen elements that enables the user to make color choices within a particular context and apply the selected colors to an image.
  • In one or more embodiments of the invention, a graphical user interface may enable a user to see the results of applying makeup to an image of a person's face. This interface may offer multiple tabs to enable a user to select a particular section of a person's face to which the user chooses to apply makeup. The interface may enable a user to select the color palette of the makeup to be applied through a circular region with a series of option group tabs. Each option group tab has a group of associated colors. The outer portion of the circular region, or the flywheel, may have multiple color segments. The color segments are arranged so that adjacent color segments may be perceived to be most similar. The inner portion of the circle may present the history of the colors previously chosen and may have an uncolored center circle, and a series of uncolored circles surrounding the center circle.
  • In one or more embodiments of the invention, when a user selects an option group tab, a new group of color segments may be presented in the flywheel portion of the circle. As the user moves and holds the cursor above a colored segment, the size of that segment may expand to allow the user to see the color more clearly. When the user clicks on a color segment, the section of the person's face selected changes to the color of the color segment. In addition, the selected color fills the center circle. When the user selects a different color, the center circle may be filled with the selected different color and one of the circles surrounding the center circle may be then filled with the previous selected color. As the user selects additional different colors, the center circle and the circles surrounding the center circle present the history of the previously selected color choices.
  • In the example described here, the graphical user interface components and the methods enabling the graphical user interface components to operate as described are illustrated via a virtual makeover. Thus, while not limited solely to such an implementation, one or more embodiments of the invention are directed to providing users with a graphical user interface that enables the users to apply virtual makeup to an image. Users may, for instance, upload a picture of themselves and use the graphical user interface components described herein to apply virtual makeup to the image. Hence in at least one embodiment of the invention, users utilize the graphical user interface components to make color choices about various color shades of makeup such as foundation, concealer, blush, eye-shadow, eye-liner, mascara, lipstick, lip liner, lip gloss, and contact lenses. Color choices are made by a user as to what color of makeup to apply, and the system renders the chosen color to the image.
  • The user's color choices and the context within which the choices were made are retained in a recent selection screen region of the interface. The method described here is not limited as to the type of computer it may run upon and may for instance operate on any generalized computer system that has the computational ability to execute the methods described herein and can display the results of the users' choices on a display means. The computer typically includes at least a keyboard, a display device such as a monitor, and a pointing device such as a mouse. The computer also typically comprises a random access memory, a read only memory, a central processing unit and a storage device such as a hard disk drive. In some embodiments of the interface, the computer may also comprise a network connection that allows the computer to send and receive data through a computer network such as the Internet. The invention may be embodied on mobile computer platforms such as cell phones, Personal Digital Assistants (PDAs), kiosks, games boxes or any other computational device.
  • The term “options” as used herein refers to choices selectable by a user for application to a subject. For instance, in one embodiment, the options are colors associated with facial makeup that are applied to a subject image, and the context for the options provided depends on the part of the image the color is to be applied to. In one or more embodiments of the invention the system is able to dynamically generate options presented to the user and may, for instance, determine what colors to lay out on the fly based on the user's selection. The number of colors and the particular colors to be displayed are generally dictated by user choice. If a user asks to see only a certain type, category or subcategory of makeup, the system renders the color choices based on the user's input. For instance, if the user selects only organic, SPF foundations, the system supporting the graphical user interface obtains the makeup that is classified as such, determines the colors to be displayed using corresponding color data associated with those makeup choices and renders the choices within the graphical user interface for selection by the user.
  • The visual presentation used within the graphical user interface to present color options to the user can vary depending upon which embodiment of the invention is implemented. In at least one embodiment of the invention a circular color flywheel type interface is utilized to present the color choices. The colors displayed on the color flywheel show the user what color choices are available for application to the image. The user may change the colors displayed on the color flywheel by defining or selecting a new option group. Each option group has an associated collection of colors and when the option group is active, the associated colors are displayed on the color flywheel. In the center portion of the color flywheel information about the operations and color choices already made by the user is displayed using a collection of circles. The center most circle indicates which color is active and presently applied to the associated image and the various circles surrounding this active circle depict what other colors have been or may be applied to the image. One advantage of using a color flywheel to implement one or more embodiments of the invention is that the colors on the color flywheel can be determined on the fly as was mentioned above and will be more fully described throughout this document.
  • The method described herein in the context of a graphical user interface enables users to visually glance at a history view as the history data is collected and gathered based upon user selections within a specific set of option groups. As new options are selected, the interface presents and distinguishes between the currently applied and selected option as well as options that have been recently selected through the use of dynamic graphical representation. A number of items are kept in the history, and this number can be as large or as small as is called for in any implementation of an embodiment of the invention. The interface moves in sequence from the most recently selected to the earliest selected option. Upon the interface history being filled, the earliest selected option is removed from the list to make room for new selections. One or more embodiments of the invention are also able to recognize when a new selection by a user is already in the recent history and has not yet been removed, and to re-sort the history instead of duplicating the entry and taking up a second history position on the interface.
  • The interface is able to retain its history through navigation. As a user navigates through option groups which are associated with the same subject area of application, the history persists, allowing selections by a user to be retained and referred back to even between option groups. A new history is created once the user navigates to a new subject area, and the graphical representation of the history is blanked. However, should a user choose to return to the previous subject area, the history for that area would once again become available. The history may also persist across user sessions.
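The history behavior described in the preceding paragraphs — most recent first, fixed capacity, re-sorting rather than duplicating a repeated selection, and a separate list per subject area — can be sketched as follows. The class and method names are illustrative, not taken from the patent:

```python
class SelectionHistory:
    """Recent-selection history, keyed by subject area.

    Most recent first, fixed capacity. Re-selecting an item already in
    the history moves it to the front instead of duplicating it, and
    each subject area (e.g. 'lips', 'eyes') keeps its own list so the
    history reappears when the user navigates back.
    """

    def __init__(self, capacity=8):
        self.capacity = capacity
        self._areas = {}

    def select(self, area, item):
        history = self._areas.setdefault(area, [])
        if item in history:
            history.remove(item)      # re-sort rather than duplicate
        history.insert(0, item)       # most recent first
        del history[self.capacity:]   # drop the earliest selection when full

    def recent(self, area):
        return list(self._areas.get(area, []))
```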
  • In one embodiment used for example where the interface is configured to present makeup options for application to an image of a face, a user selecting a set of lipstick colors for application to the face would see their most recent selections represented on the recently selected history regions of the interface. Were the user to choose to select eye shadow colors instead, the interface would be repopulated with options appropriate to eye shadows, and a new history would be created based on those selections. Navigating back to the lipstick options would repopulate both the interface with options and the previous history.
  • Context sensitivity with respect to the operations to be applied to the image is incorporated into one or more embodiments of the invention. For instance, when an image of a person is obtained by the system, the system processes the image using facial detection algorithms to identify the various parts of a person's face such as the eyes, nose, mouth, skin, cheeks, chin, hair, eyebrows, ears, or any other anatomically identifiable feature. This facial information is stored by the system for later use. When the user is later working on applying color changes to the uploaded image, the information obtained about the image such as where the eyes are located, what part of the image is skin, and where the lips are located is used to present a limited set of operations to the user based on the context within which a particular set of tasks may be applied. The system is configured to present operations to the user that are relevant to the location on the image where the user has positioned the cursor or otherwise selected. When the cursor is located over the eye, for instance, a mouse click presents operations that are eye specific. When the cursor is positioned over the lips the operations are lip specific (e.g., relate to the application of lipstick, lip liner, or lip gloss) and when the cursor is over a relevant portion of the skin the operations presented coincide with the location of the cursor (e.g., relate to foundation or concealer). The location within the image where context sensitive menus such as the ones described are presented depends upon the particular face within the image and is driven by the system making use of an automatic facial detection algorithm or by having the user manually identify key anatomical features of the face.
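A minimal sketch of the context-sensitive lookup: given bounding boxes for the detected facial features, the operations offered depend on which region contains the cursor. The region representation (axis-aligned bounding boxes) and the operation lists are assumptions for illustration:

```python
def operations_for_point(x, y, regions):
    """Return the makeup operations relevant to the facial region
    under the cursor.

    `regions` maps a region name to a (left, top, right, bottom)
    bounding box from face detection. More specific regions such as
    'eyes' and 'lips' should be listed before the enclosing 'skin'
    box so they are hit-tested first.
    """
    menus = {
        "eyes": ["eye-shadow", "eye-liner", "mascara", "contact lenses"],
        "lips": ["lipstick", "lip liner", "lip gloss"],
        "skin": ["foundation", "concealer", "blush"],
    }
    for name, (left, top, right, bottom) in regions.items():
        if left <= x <= right and top <= y <= bottom:
            return menus.get(name, [])
    return []  # cursor is outside every identified region
```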
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
  • The above and other aspects, features and advantages of the invention will be more apparent from the following more particular description thereof, presented in conjunction with the following drawings wherein:
  • FIG. 1 shows an exemplary embodiment of an option selector in which option palettes are selected.
  • FIG. 1A illustrates the color palettes selected by an option group in one or more embodiments of the invention.
  • FIG. 1B illustrates an exemplary embodiment where a single color is selected by a user.
  • FIG. 1C illustrates a history of colors in one or more embodiments of the invention.
  • FIG. 1D illustrates an example method for ordering colors in one or more embodiments of the invention.
  • FIG. 2 shows an exemplary method in which the option selector operates where colors and color palettes are used as the selections.
  • FIG. 3 shows an exemplary method in which the option selector of FIG. 1 operates with any available options.
  • FIG. 4 shows an exemplary embodiment of the option selector of FIG. 1 in a web application.
  • FIG. 5 shows the high level operation of the option selector within an application.
  • FIG. 6 shows a method for creating, saving and using a look to apply items associated with the look to an image. The look attributes are saved independent of the image.
  • FIG. 7 shows an image being provided to the system, in which the user can select an already saved ‘look’ to apply to their image, or start on their own and create a new look for the uploaded image.
  • FIG. 8 shows an exemplary embodiment of an interface allowing for the ‘at-a-glance’ viewing of an image with multiple applied looks.
  • FIG. 9 presents exemplary computer and peripherals which, when programmed as described herein, may operate as a specially programmed computer capable of implementing one or more methods, apparatus and/or systems of the invention.
  • DETAILED DESCRIPTION
  • A graphical user interface for the selection of options from option groups and method for implementing the same will now be described. In the example given here the invention is described in the context of a graphical user interface component that is used to enable users to select and apply virtual makeup to an image. In the following exemplary description numerous specific details are set forth in order to provide a more thorough understanding of embodiments of the invention. It will be apparent, however, to an artisan of ordinary skill that the present invention may be practiced without incorporating all aspects of the specific details described herein. In other instances, specific features, quantities, or measurements well known to those of ordinary skill in the art have not been described in detail so as not to obscure the invention. Readers should note that, although examples of the invention are set forth herein, the invention is not limited to the specific examples given in that the claims, and the full scope of any equivalents, are what define the invention.
  • One or more embodiments of the invention are implemented in the context of a graphical user interface for selection of options from option groups and methods related to the same. This disclosure relies in part on the novel programming algorithms and user interface elements discussed in two co-pending U.S. Patent applications filed on the same day and owned by the same assignee. These applications are entitled “SYSTEM AND METHOD FOR CREATING AND SHARING PERSONALIZED VIRTUAL MAKEOVER,” Ser. No. ______, filed 17 Mar. 2009, hereinafter known as the “Personalized Makeovers” co-pending patent application and “METHOD OF MONETIZING ONLINE PERSONALIZED BEAUTY PRODUCT SELECTIONS”, Ser. No. ______, filed 17 Mar. 2009, hereinafter, the “Monetizing” co-pending patent application. These patent applications are hereby incorporated by reference into this specification.
  • Interface Enabling Dynamic Layout of Options or Color Within a Group for Application to an Image
  • FIG. 1 shows a graphical user interface comprised of interactive screen regions that, upon activation by the end user, apply desired effects to an associated image. For instance a user may utilize the interface to identify a type of makeup and color and to then apply the selected makeup to a corresponding image. Images are generally uploaded or otherwise saved into a computer for application of the methodology described herein. Once an image is available to the system the user identifies a grouping (e.g., makeup type) that will have an associated collection of options (e.g., color choices). Each grouping has a set of stored values that defines the attributes of the group. For example, if the user selected a brand or generalized color palette of foundation to apply to the skin region within an image, the color values associated with each type of foundation that falls within the selected grouping are stored in the system. When an item falls within the grouping selected by the user, the values for that item are used to populate the option group. In the context of a virtual makeup interface, for example, there are various types of makeup (e.g., foundation, concealer, blush, eye-shadow, eye-liner, mascara, lipstick, lip liner, lip gloss, and contact lenses). The user selects a makeup type such as foundation and then defines a grouping within the type. The user may want a color palette from a certain brand of makeup or choose colors that are recommended for a certain skin type. In other instances the user may want makeup that has certain characteristics (e.g., SPF level 15 or any other user preferred characteristic) and can select such a grouping. Each of the individual items of makeup has a set of discrete color values. When the user identifies the grouping, the system obtains the various discrete values for the different items of makeup that fall within the grouping. 
In one or more embodiments of the invention, these discrete color values are then presented to the user for selection and application to a corresponding image via the dynamically generated graphical user interface described in FIG. 1 and throughout. In one or more embodiments of the invention, the graphical user interface may instead be generated statically rather than dynamically. In one or more embodiments of the invention, a different flywheel may be loaded for each corresponding option, such as lipstick.
  • In the figure, depicted as an example of one or more embodiments of the invention, a color flywheel type format is used to display the available color choices to the user. It will be apparent, however, to those of ordinary skill in the art that any geometric shape, or combination of shapes, can be used as an alternative to a circle; hence, while some of the advantages described herein result from the circular color flywheel format, other shapes such as an ellipse, a triangle, or a polygon are feasible to implement. In one or more embodiments of the invention, a circular format may consistently generate the same interface regardless of the number of colors within the grouping. In one or more embodiments of the invention, a square format may be employed such that the size of the segments may change to accommodate more colors.
  • FIG. 1, which illustrates the color flywheel type embodiment of the invention, denotes an option group tab at 101. Each option group tab has a set of associated items, where each item within the group has discrete color values that are presented on the color flywheel. If the option group relates to eye shadow, for instance, the color values depicted are for each item of eye shadow within the group. A visual cue that provides the user with an easy way to determine what the option group relates to (e.g., what type of makeup) may provide the user with information used to navigate between option groups. In general each option group is associated with a color palette that defines a plurality of color choices the user may select. The colors on the color palette are associated with a particular item (e.g., makeup item) that is part of the grouping. When an option group is selected, the corresponding color palette associated with the selected option group is displayed. In the example depicted in FIG. 1 for instance, when option group tab 101 is active, the palette displayed in flywheel display segments 102 a-102 ff contains the colors associated with option group tab 101. If option group tab 101 were representative of a group of lipstick colors, flywheel display segments 102 a through 102 ff would represent the various colors of lipstick within the group.
  • As shown in FIGS. 1 and 1A, in the context of colors, different color palettes can be selected by selecting a different option group (e.g., 113). The selection of a specific option group at 101 triggers the population of flywheel display segments 102 a, 102 b, and 102 c through 102 ff with the options associated with the selected option group. In one or more embodiments of the invention option group tab 101 is stored as a group of references to the items within the grouping. In this case for instance, option group tab 101 is a set of stored references to the items of lipstick and their corresponding discrete color values. When the user activates option group tab 101 the system obtains the corresponding color values and dynamically populates flywheel display segments 102 a through 102 ff with the color of items within the group. FIG. 1A illustrates how the screen appears once an option group is selected and the corresponding color flywheel (in this case screen region 114) is populated. The number of colors displayed in screen region 114 is not fixed but depends on the option group.
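The population step described above can be sketched as follows. This is a minimal illustration with hypothetical item and group names; the text specifies only that a group is stored as references to items whose discrete color values fill the flywheel segments, so the storage format shown here is an assumption.

```python
# Minimal sketch with hypothetical item and group names: a group is
# stored as references to items, and each item's discrete color value
# fills one flywheel display segment (102a, 102b, ...).

ITEMS = {
    "rose-lipstick":  (201, 63, 92),
    "coral-lipstick": (230, 110, 85),
    "nude-lipstick":  (196, 140, 118),
}

OPTION_GROUPS = {
    "lipstick-brights": ["rose-lipstick", "coral-lipstick"],
    "lipstick-nudes":   ["nude-lipstick"],
}

def populate_flywheel(group_name):
    """Resolve the selected group's item references into the list of
    discrete color values that populates the flywheel segments."""
    return [ITEMS[ref] for ref in OPTION_GROUPS[group_name]]
```

Selecting a different option group simply re-runs the lookup, which is why the number of populated segments can vary from group to group.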
  • In the example given at FIG. 1, flywheel display segments 102 a through 102 ff contain various color choices, but the interface depicted may be configured to contain other options. In the examples given, this region is divided into sections around the circumference of the circular region 103 shown in FIG. 1. Other methods of displaying the options within flywheel display segments 102 a through 102 ff are also contemplated as being within the scope and spirit of the invention. Hence readers should note that other graphical arrangements that place the options associated with a group reasonably proximate to the grouping and to the history data are also feasible.
  • Currently selected screen region 104, which in the example depicted here is located in the middle of the user interface, allows for the display of the currently selected option. Screen elements around this middle currently selected screen region 104 contain a history of color values identifying recent selections made by the user, as shown in this example at 105-112.
  • Upon activation of the interface the user is presented with options dynamically generated or preloaded by the application. Once a user enters the application, the option group selection unit is populated at 101 with options the system determined to be relevant. This determination is made by performing a lookup of the options (e.g., color values) associated with a specific option group. In the examples provided, the options provided in the Option Selector device of FIG. 1 are used to provide color choices to be applied to a picture, in this example a face. The application in this example is able to recognize the different parts of the face to which makeup would be applied and then make changes.
  • A generalized interface for making use of the graphical user interface depicted in FIGS. 1, 1A, 1B, and 1C is shown at FIG. 4. In the example provided and shown in FIG. 4, a user selects a subject at points 401-403, which relates to a specific portion of the subject image. Upon the selection of one of these subjects, the user is further presented with specific options relating to the selected portion at 405. For example, if the subject selected was the mouth, the screen region at 405 showing available details for selection would present lipstick, lip liner, and lip gloss as options to populate the option selector interface. The selection of lipstick in this example would then populate the palette, or option group, selection interface shown in FIG. 1 at option group tab 101, and at 406. Each palette contains a variety of lipstick colors, grouped by context. For example, the colors red and orange would be grouped together in one palette, and neutral or nude colors would be grouped in another. Upon selection of a palette, the color selection is populated with the colors associated with the palette at flywheel display segment 102 a. Selection of a color from this interface applies the lipstick color to the subject image's mouth in a manner consistent with lipstick being applied to a person's face. Selection of a new detail, such as lip liner, repopulates the interface with associated palettes and colors for that detail. The colors available in each palette are, in one or more embodiments of the invention, options that can be applied to an image of a person in a specific way. Lipstick, for instance, is applied to lips, eye-shadows to eyelids, and blush to cheeks.
  • The color options presented to a user are therefore provided based on the context in which they are to be applied. In the makeup example the palettes and colors presented to the user relate to eye makeup, foundations, or other makeup applications. Outside the context of the makeup example described herein, the methods described here may be applied to other situations where there is a grouping that has options for application to an image, for example the planned decoration of a home, where colors are applied based on a desired ‘theme’ or ‘look.’ Applying colors to an image is not the only embodiment of the invention, as textures, patterns, text, fonts, and any other form of graphically represented information can be applied as option groups and options for this interface.
  • As previously mentioned, parts of the interface comprise a history region. Circular region 103 depicts the history interface in accordance with one or more embodiments of the invention, where 104 through 112 represent specific positions in the history, ranging from the currently selected option through a number of recently selected options. Upon selection of an option (e.g., a color value) from the interface (e.g., the color flywheel) and the application of this option to the associated subject image, the selected option is shown at currently selected screen region 104, which is used to represent the ‘currently applied’ selection. In keeping with our provided example embodiment, a lipstick color selected by the user and applied to the subject image would then display on the applied image as well as in currently selected screen region 104. FIG. 1B shows the invention in one or more example embodiments where a single color is selected by a user. A visual cue in the form of an arrow or other display element indicating the selection made (131) is provided to show the selected color. A large patch of this selected color is shown at screen region 116. Before a color is selected, but while the mouse is over the color flywheel portion of the interface, the colors are enlarged to show the user a bigger view of the color associated with the selection. This mouse-over effect is depicted at screen element 130 in FIGS. 1B and 1C.
  • FIG. 1C illustrates the use of the recently selected screen regions 105 through 112. Upon the selection of a subsequent option from flywheel display segment 102 a, the newly selected option is shown at the currently selected screen region 104. This begins to build a history, and moves the last selected color from 104 to the most recently selected screen region 105. As illustrated in FIG. 1, the currently selected screen region 104 and recently selected screen regions 105-112 have an associated progression. As a new selection is made by the user, the new option occupies the currently selected space of 104, thereby moving the most recently selected options around recently selected screen regions 105-112 in the manner shown. FIG. 1C shows, in an example embodiment of the invention, that a user has selected various colors which are moved around the screen region. Once these regions are filled with a history and a new selection is made, the interface drops the earliest selection from recently selected screen region 112 in order to make room for newer selections.
  • Arrangement and Ordering of Colors
  • The flywheel is part of a graphical user interface where users select a color from a finite number of colors, typically between 50 and 200 colors. Rather than displaying colors in an arbitrary order, it may be preferable for the colors to be ordered in a fashion where successive colors appear similar and where families of colors are proximal. In the virtual makeover application, it is necessary to lay out colors on the flywheel in real-time because the selection of colors to be presented to a user may vary over time (e.g., the collection of products being offered by merchants may change, or a user may only want to see colors of lipstick from her two favorite brands).
  • Colors lie in a 3-dimensional space, often specified in terms of the three primaries red, green, and blue (RGB) or by their Hue, Saturation, and Luminance (HSL). In mapping colors to the flywheel, we are projecting colors from a 3-D space onto a 1-D circle. Unlike the standard color wheel, which is 2-dimensional and where colors are typically organized in polar coordinates with the hue defining the angle and the saturation defining the radius, the flywheel data structure may be a one-dimensional sorted circular doubly linked list of colors, for example. Rather than performing this dimensionality reduction in a data-independent way (e.g., projecting to the flywheel using only Hue), the projection may be data-dependent, using the set of input colors to determine the projection. This is important since the colors being displayed in a flywheel for certain applications (e.g., display of cosmetics colors) may not be uniformly distributed through the color space, but are often highly clustered. For example, lipsticks may contain many red-toned colors, but may contain very few blue, yellow, and green tones.
  • To assign a discrete set of N colors to N positions on a flywheel display, we define a cost function
  • COST = Σ_{i=0}^{N−1} ⟨color(i), color((i+1) mod N)⟩
  • where color(i) is the color assigned to the i-th position on the flywheel, and where ⟨x,y⟩ denotes the distance in color space between colors x and y. For the subsequent method, the distance between colors could be any metric. In the implementation, the metric may be the square of the Euclidean distance in the RGB color space. Finding the assignment of colors to positions that minimizes this cost function is a combinatorial optimization problem. Finding the global minimum of this cost function is computationally expensive, and so a greedy algorithm can be used which is fast, but is not guaranteed to find the global minimum. A greedy algorithm estimates the solution of a problem by making the locally optimal choice at each stage with the hope of finding the global optimum. In one or more embodiments of the invention, a color palette has multiple colors. A circular doubly linked data structure has nodes which may contain information describing each color's coordinates in color space, for example. A greedy algorithm may be employed to sort the circular doubly linked data structure, re-ordering the colors so that colors perceived to be most similar are close together. Once the greedy algorithm completes the sorting process, the flywheel display may then be populated with the color in the first node filling the first screen region in the flywheel display, the color in the second node filling the second screen region, and so forth.
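The cost function above can be written out directly. The following sketch assumes the squared-Euclidean RGB metric mentioned as one possible implementation; the function and variable names are illustrative only.

```python
# Sketch of the flywheel cost function (hypothetical names), using the
# squared Euclidean distance in RGB space as the metric <x, y>.

def dist(x, y):
    """Squared Euclidean distance between two RGB colors."""
    return sum((a - b) ** 2 for a, b in zip(x, y))

def cost(colors):
    """COST = sum over i of <color(i), color((i+1) mod N)> around the circle."""
    n = len(colors)
    return sum(dist(colors[i], colors[(i + 1) % n]) for i in range(n))

# A smooth ordering scores lower than a jumbled one:
smooth  = [(0, 0, 0), (10, 0, 0), (20, 0, 0), (30, 0, 0)]
jumbled = [(0, 0, 0), (20, 0, 0), (10, 0, 0), (30, 0, 0)]
```

Here `cost(smooth)` is 1200 while `cost(jumbled)` is 1800, illustrating why orderings with similar colors adjacent are preferred.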
  • FIG. 1D illustrates an example method for ordering colors in one or more embodiments of the invention. At block 150, a circular doubly linked flywheel data structure is created which may hold “N” records. A doubly linked flywheel data structure may consist of a sequence of nodes with each containing data fields and references pointing to the next or previous nodes. Each of the nodes in the flywheel data structure may be associated with a screen region on the flywheel display for example. Each of the colors in a palette may be associated with each of the nodes in the flywheel data structure. In one or more embodiments of the invention, the coordinates in color space for each color may be associated with the flywheel data structure. Thus, for a given color palette, each color will be associated with a node in the circular doubly linked flywheel data structure.
  • At block 151, variable integer “n” is set to 1. At block 152, the Local Cost is calculated for colors n−1, n, n+1, and n+2. The Local Cost is the change in the total cost that would result from swapping color(n) and color(n+1): the distance in color space between color(n−1) and color(n+1), plus the distance in color space between color(n) and color(n+2), minus the distance in color space between color(n−1) and color(n), and minus the distance in color space between color(n+1) and color(n+2). In one or more embodiments of the invention, the calculation may be executed in a tangible memory medium or on a microprocessor-based computer, for example.
  • At block 153, the value of the Local Cost may be considered. Should the value of the Local Cost be less than zero (“0”), the flow may divert to block 154. Should the value of the Local Cost be not less than zero, the flow will divert to block 155. At block 154, the order of color(n) and color(n+1) may be swapped. At block 155, the variable n is incremented by one. At block 156, the value of “n” is compared to that of “N.” When the value of “n” exceeds the value of “N,” the process flow is diverted to block 158. When the value of “n” does not exceed the value of “N,” the process flow is diverted to block 157. At block 157, if the accumulated time for the process exceeds a timeout value, the process flow is diverted to block 158. If the accumulated time for the process does not exceed a timeout value, the process flow is diverted to block 152.
  • At block 158, the flywheel display may be displayed. The first screen region may be filled with color(1), the second screen region filled with color(2), and so forth, for example. In one or more embodiments of the invention, the flywheel may be displayed on a display monitor, for example.
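One possible reading of the FIG. 1D flow is sketched below. A plain Python list stands in for the circular doubly linked flywheel data structure, the walk over adjacent pairs is repeated until a full pass makes no swap or a timeout expires, and a swap is made only when it lowers the total cost; all names are hypothetical.

```python
import time

def dist(x, y):
    """Squared Euclidean distance between two RGB colors."""
    return sum((a - b) ** 2 for a, b in zip(x, y))

def greedy_order(colors, timeout=0.5):
    """Greedy local-swap ordering: repeatedly test each adjacent pair
    around the circle and swap color(n) and color(n+1) whenever the
    swap lowers the total cost (a list stands in for the circular
    doubly linked flywheel data structure)."""
    colors = list(colors)
    N = len(colors)
    deadline = time.monotonic() + timeout
    improved = True
    while improved and time.monotonic() < deadline:
        improved = False
        for n in range(N):
            a = colors[(n - 1) % N]
            b = colors[n]
            c = colors[(n + 1) % N]
            d = colors[(n + 2) % N]
            # Change in total cost if b and c were swapped:
            delta = (dist(a, c) + dist(b, d)) - (dist(a, b) + dist(c, d))
            if delta < 0:
                colors[n], colors[(n + 1) % N] = c, b
                improved = True
    return colors
```

The result is a permutation of the input whose circular cost is no worse than the starting order; as the text notes, a greedy pass of this kind is fast but is not guaranteed to reach the global minimum.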
  • Color Selection History
  • As previously mentioned, the user interface has the ability to populate option groups on a context-dependent basis. In the makeup example, color palettes and the associated colors are populated based on the specific area of the subject image that colors are intended to be applied to. In this and any other embodiments of the invention, the circular region 103 is capable of persistently remembering selected options during navigation of other palettes. So if a user were to pick three colors from one palette under ‘lipstick,’ subsequently change palettes, and pick three more colors, all six would be displayed as recent selections in the history. This persistence is capable of surviving navigation of other areas of the interface. As previously mentioned, the palette and option group tab 101 and flywheel display segments 102 a through 102 ff are repopulated with new options upon the selection of a new detail. The navigation to a new region, for example ‘lip liner,’ would create a new ‘history’ to populate, and previous selections from the lipstick detail would no longer display in circular region 103. A new history would be created based on user selections under lip liner. However, should the user return to the lipstick detail, the palette and option group tab 101 and flywheel display segments 102 a through 102 ff would be repopulated, and the previous history associated with that detail would once again be displayed. This allows a user to navigate through extensive options and sub-options while keeping the histories for each section separate and persistent through navigation, without requiring separate instances of the interface for each detail being used. It will be apparent to the reader that the use of histories connected to each group of palettes and color groups can be applied to any embodiment where multiple option groups are available depending on the context of the selection to be made.
For example, wall colors retained in a history would be persistent and remembered through navigation of other areas of home decoration such as wallpaper options.
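The per-detail persistence described above might be kept, for example, as a simple mapping from each detail to its own selection list; the names below are hypothetical and the sketch shows only that navigating away from a detail and back again finds its history intact.

```python
# Hypothetical sketch: one persistent selection history per detail
# (lipstick, lip liner, wall color, ...), newest selection first.

histories = {}  # detail name -> list of selections, newest first

def select(detail, option):
    """Record a selection under the history for its detail."""
    histories.setdefault(detail, []).insert(0, option)

select("lipstick", "ruby red")
select("lipstick", "coral")
select("lip liner", "mauve")
# Returning to "lipstick" finds its history unchanged by the
# intervening "lip liner" selection.
```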
  • The interface is also able to recognize whether an option being selected by a user is already present in recently selected screen regions 105-112, and to reorder the selections within the history region to represent the new selection order without duplicating the selection in the history or unnecessarily dropping a selection from recently selected screen region 112.
  • FIG. 2 represents the use of the interface in an embodiment relating to colors being applied to a subject image. At step 201 the interface obtains the palette selection and populates flywheel display segments 102 a through 102 ff with the colors associated with the selected palette. At step 202, the selection of a color from flywheel display segment 102 a applies said color to a subject image, and displays the color in the currently selected region of 104. At step 203, the method then proceeds through decisions relating to the history region of the interface. At step 204 the interface determines whether a history needs to be built or whether it already exists. A history will need to be built if more than one color has been selected from at least one of the palette options provided as described above. In the event that a history does not exist and is required, at steps 205 through 206 the interface applies the newly selected color to the image, representing the currently selected color in the currently selected region of 104 and moving the last selected color to the recently selected region of 105. In the event that a history already exists, a further decision must be made depending on whether the history regions have already been filled with options. This decision occurs at step 207. Steps 208 and 209 occur in the event that the history has not already been filled, and perform the application of a newly selected color to the image, rotating the color history around circular region 103 to make room for a new addition to the list. Steps 210 through 212 occur in the event that the history region has already been filled, and perform the same function as above but instead remove the earliest color from the list in order to free up a section.
For example, if recently selected screen regions 105-112 were filled with color selections and a user selects a new color, the new color is displayed in currently selected screen region 104 as currently applied, while the last selected color moves to 105. All of the other recently selected colors rotate around, with the color occupying recently selected screen region 112 being removed from the list to make room. At the end of the decisions at 206, 209, or 212, the interface then loops the decision process.
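The history behavior of FIG. 2, together with the reordering-without-duplication behavior described above, could be sketched with a fixed-capacity structure such as the following. The class and field names are hypothetical; eight slots stand in for regions 105-112 and a single slot for region 104.

```python
from collections import deque

class ColorHistory:
    """Sketch of the FIG. 2 history: region 104 holds the current
    selection, regions 105-112 hold recent selections, the oldest
    entry drops off when full, and re-selecting a color in the
    history moves it rather than duplicating it."""

    def __init__(self, capacity=8):          # regions 105-112
        self.current = None                  # region 104
        self.recent = deque(maxlen=capacity)

    def select(self, color):
        if color == self.current:
            return                           # already currently applied
        if color in self.recent:
            self.recent.remove(color)        # reorder, never duplicate
        if self.current is not None:
            self.recent.appendleft(self.current)  # 104 -> 105; deque's
                                             # maxlen drops the entry at 112
        self.current = color
```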
  • While the following paragraph appears to be duplicative, this paragraph and the accompanying figure are intended to illustrate the higher level of abstraction at which this invention can operate, with a number of options, option groups, and history interfaces dependent on the context in which the invention is applied. At step 301 the interface obtains the option group selections and populates flywheel display segments 102 a through 102 ff with the options associated with the selected option group. At step 302, the selection of an option from flywheel display segment 102 a applies said option to the associated subject, and displays the selected option, or its relevant representation, in the ‘currently selected’ screen region of 104. At step 303, the method then proceeds through decisions relating to the history region of the interface. At step 304 the interface determines whether a history needs to be built or whether it already exists. A history will need to be built if more than one option has been selected from at least one of the option groups provided as described above. In the event that a history does not exist and is required, at steps 305 through 306 the interface applies the newly selected option to the associated subject, representing the currently selected option in the currently selected region of 104 and moving the last selected option to the recently selected region of 105. In the event that a history already exists, a further decision must be made depending on whether the history regions have already been filled with options. This decision occurs at step 307. Steps 308 and 309 occur in the event that the history has not already been filled, and perform the application of a newly selected option to the subject, rotating the option history around circular region 103 to make room for a new addition to the list.
Steps 310 through 312 occur in the event that the history region has already been filled, and perform the same function as above but instead remove the earliest option from the list in order to free up a section. For example, if recently selected screen regions 105-112 were filled with option selections and a user selects a new option, the new option is displayed in currently selected screen region 104 as currently applied, while the last selected option moves to 105. All of the other recently selected options rotate around, with the option occupying recently selected screen region 112 being removed from the list to make room. At the end of the decisions at 306, 309, or 312, the interface then loops the decision process.
  • As touched on above, FIG. 4 shows an embodiment of the invention in the context of a hypertext page where the options and option groups presented and selectable by the user are presented through the internet to a client computer. In the example provided in FIG. 4, an image is presented to the application and areas of the picture for application of colors are identified. In this case, the eyes, skin, and mouth are identified as being areas for color application and associated with the selection regions of 401, 402 and 403. These areas can be further separated into details of color applications; for example, the selection of eyes would allow the user to further select eye shadow, mascara, and eyeliner options, whereas selection of the mouth would allow the user to further select lipstick, lip liner, and lip gloss. Upon selection of an identified area, the palette groups are provided at 405. Selection of the mouth region at 403 populates region 405 with the detail selections of lipstick, lip liner, and lip gloss. Further selection of one of these options from 405 populates the color selection interface at 406, which has been described above in FIG. 1. The selection of a color from 406 applies the color to the subject image at 407, while populating the color history regions of 103 as described fully above. It will be apparent that there are many embodiments of this interface, whether operated on a local system or through a computer network, and where the subject having options applied may be other than a facial image having makeup applied. In a home development embodiment, for example, the subject image would be a graphical representation of a room, where the subject selections of 401, 402 and 403 could represent the walls, floor, and ceiling of said room, and further options would allow a user to select wallpaper, paint, and fabric.
It will also be apparent to the reader that the interface is not limited to a set of three options in each category, but that any plurality of option groups and options can be presented to the user, and that embodiments of this invention other than those described by example could be used.
  • Context Sensitive Menus
  • FIG. 5 describes the use of the interface within the context of the application. A subject image is presented to the application at 501, which is then analyzed by the application (see the co-pending “Personalized Makeovers” patent application) and areas of the image are identified at step 502. Steps 503-507 obtain the options and option groups to be presented to the user through the interface, which are then provided and selected based on the area of the image where options are to be applied. Upon the selection of an option by the user, the application processes both the subject and the options at step 508 and presents the user with the resulting output at step 509. The steps are repeated for each subject presented to the application, and for each option selected by a user. In our makeup example, an image of a face is provided to the application, which is then analyzed into areas at 502. Option groups, such as the lipstick, lip liner, eye shadow, and foundation options, are collected and presented to the user at 507. Upon selection of a particular makeup type and color, the application processes the selection at 508 and presents an output of the image with the selected colors and makeup types at 509.
  • When an image is provided to the system, the system processes the image to identify the various features within the image. This is achieved using facial recognition or other image processing algorithms, or the features are located manually by the user. Once the features within an image are located, the system is then able to take actions based on the user input. The image in this example is a face, which has been divided into sections for application of makeup. The eyes, mouth, and skin have been identified as areas for application, and assigned as ‘hotspots’ within the image. These ‘hotspots’ allow the user to directly interact with the image being modified and apply options relevant to the hotspot. For instance, blush is applied to the cheeks and lipstick to the lips.
  • The location of each hotspot is determined by the system once the facial features are identified by the system or user. The eyes, lips, cheeks, eyelashes, or any other facial features have an associated set of actions. These hotspots, which may differ from image to image, are based on the facial or image recognition system identifying the features. In some cases the user confirms or adjusts the first attempt the system makes at identifying the features. In one or more embodiments of the invention, when the user clicks the mouse over a hotspot the system presents actions that can be performed on the part of the image associated with the hotspot. In one or more embodiments of the invention, commands that can be performed on the part of the image associated with the hotspot may be activated by other means, such as a user touching a touch screen or through the use of a light pen, for example. This provides the user with context sensitive menus that are based on different parts of the image having been given a feature classification by the system. An eyelid is thus identified as such, as is every other feature within the image.
  • The system uses this feature information to present commands to the user that are relevant to the feature. This means that a user, using an activation command, most commonly a click on a mouse pointer device, is able to access a menu that relates to the area being clicked. In one or more embodiments of the invention, commands are presented when a user clicks the right or other buttons on a computer mouse. In this example, right clicking on the area of an image that was identified as being the eyes presents an eye-related context menu. The user may then select one of the operations on the menu and perform direct manipulation of the image. A right click on a part of the image that has no specific identifier would present the user with options that are applicable to the entire image. These context sensitive menus, and the options they provide, are associated with the particular image upon that image being processed and shown to the user through the interface provided.
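A context-sensitive menu of this kind can be sketched as a mapping from feature classifications to action lists, with a whole-image fallback for clicks that hit no classified feature. The feature names and actions below are illustrative assumptions, not the system's actual command set.

```python
# Hypothetical sketch: each classified feature ("hotspot") maps to the
# actions relevant to it; a click outside any hotspot falls back to
# actions applicable to the entire image.

HOTSPOT_ACTIONS = {
    "eyes":   ["apply eye shadow", "apply eyeliner", "apply mascara"],
    "lips":   ["apply lipstick", "apply lip liner", "apply lip gloss"],
    "cheeks": ["apply blush"],
}

IMAGE_ACTIONS = ["zoom", "undo", "save look"]

def context_menu(feature):
    """Return the menu for the clicked region, or the whole-image menu
    when the click hits no classified feature."""
    return HOTSPOT_ACTIONS.get(feature, IMAGE_ACTIONS)
```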
  • Look History/Saving
  • In the examples given, a number of beauty products can be applied to an image to create a personalized makeover. The products are applied to the specific areas of the image to which they pertain: lipstick and lip liner to the lip regions of the image, eye shadow and liner to the eyes, and so on, until a full set of makeup has been created by the user in this personalized makeover. This completed set of selections made by a user and applied to the subject image is then a complete “look.” The term look is used herein as a noun to describe an overall style and appearance of a subject, such as a person, personal fashion, or interior decoration. The term makeover is used as a noun to describe the creation of a new look to improve the attractiveness of a subject, conform to societal fashions, or simply modify the appearance. As such, giving a subject image a makeover through this interface creates a new ‘look’ which is made up of the options selected by a user through this interface and applied to the subject image. For example, this look could contain a specific style and color of eye shadow for application, a specific color and amount of foundation to apply, specific colors of blush and thickness of application, and so on, until all desired makeups are specified and given color and quantity values within the look.
  • When users create a look by making various makeup choices and applying those choices to an image, the resulting “look” can be saved and later applied to a different image and/or the same image. Since a ‘look’ is the collection of options selected by a user from the option selection interface, the ‘look’ can be saved and stored independently of the subject image used to produce it. Also, ‘looks’ can be predetermined and applied to any number of subject images. For example, if a user wished to apply a ‘high fashion look’ to their own face, the user can provide their own image at 601, which is processed by the system (further described and referenced in the co-pending “Personalized Makeovers” patent application). In processing, the areas of the image are identified to correspond with the mouth, eyes, skin areas, and other parts of the image. The high fashion ‘look’ contains data relating to the beauty products used to create the look, such as a specific brand and color of eye shadow, lipstick, and so on. This data is then applied over the subject image to create the look. In other embodiments, the look could, in the context of interior decoration, contain furniture types with an associated style, wall color and texture with associated RGB color values and/or a texture image, and door types. It will be apparent to the reader that the concept of a saved look can relate to any number of options, selected from option groups, presented to the user, and applied to a subject.
  • Alternatively, looks can be created by a user, saved, and subsequently selected by another user for application to a new subject image. Here, the image shown at FIG. 4 element 407 is an active rendering of the subject image, with areas of the image identified and the ‘look’ data applied over the identified areas. Once the user has made selections that he or she is happy with, the ‘look’ is saved and associated with the subject image. Other users can browse this database, determine which ‘looks’ they like, and further apply a chosen look to a subject image provided by them.
  • The information that defines a look is stored in a data structure that records the color values and region information associated with each item needed to create the look. The color values are those the user selected while creating the look, and the region information defines what part of the image the colors should be applied to. The various features identified during facial detection are used to define the regions for application of the color values. This information is saved and can later be applied to any image, even if the look was created using a different image.
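As a concrete illustration, such a data structure might be sketched as follows. This is a hypothetical layout under the assumptions stated in the comments; the field names, the `Look`/`LookItem` classes, and the Python representation are illustrative, not taken from the specification:

```python
# Hypothetical sketch of a saved "look": per-item color values plus the
# facial region (from facial detection) each color is applied to.
from dataclasses import dataclass, field

@dataclass
class LookItem:
    region: str             # region name from facial detection, e.g. "lips"
    color: tuple            # (R, G, B) value selected by the user
    opacity: float = 1.0    # how heavily the product is applied

@dataclass
class Look:
    name: str
    items: list = field(default_factory=list)

look = Look(name="high fashion")
look.items.append(LookItem(region="lips", color=(180, 30, 60), opacity=0.8))
look.items.append(LookItem(region="cheeks", color=(230, 120, 130), opacity=0.4))

# Only region names and colors are stored, never pixels, so the same look
# can later be applied to any image whose regions have been detected.
```

Because the structure references regions symbolically rather than by pixel coordinates, it remains valid for any image in which the same regions can be identified.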
  • A method for applying a look to an image in accordance with one or more embodiments of the invention is shown in FIG. 6. The look attributes or options, such as what makeup was used to create the look, are saved independent of the image. At step 601, an image is provided to the system and analyzed, and areas for option application, such as color values, are identified. The features within the image are identified and the user is presented with an interface for defining what makeup to apply to which features. Choices are made, for instance, as to what lipstick, eye shadow, foundation, blush, or other makeup items make up a look. At step 602, the look is created and applied to the subject image. The look is then saved in a structure that contains the saved combinations of beauty products, hair styles, and accessories, and is associated with the uploaded image at step 603. A second user can review the interface and see uploaded images with the saved looks applied at step 604. The second user can then upload a new image at step 605, and a saved look is applied to that image in the same manner as the previous image at step 606.
  • FIG. 7 shows the image being provided to the system, in which the user can select an already saved ‘look’ to apply to their image, or start on their own and create a new look for the uploaded image. At step 701, the image may be uploaded or selected by a user. At step 702, the image is analyzed and broken into sections for the option or look application. At 703, the system determines whether the user wishes to apply a saved look to the image or to create their own look through the system of applying options from option groups described above. The user is then presented with the interface shown in FIG. 4, where the image at 407 can either be unmodified and ready for option application at step 704, or can be presented as an image rendered with options from a saved look at step 705. The rendered image is presented to the user at step 707.
  • The look is applied to the image as layers over the image, using the order generally followed during the application of real makeup; foundation, for instance, is applied before blush. In one or more embodiments of the invention, the process of applying a makeover is as follows: a makeover is represented as a base image and a set of layers that may follow the layering of real-life makeup application. Rendering proceeds by applying the lowest layers up through the highest layers. The low layers would include concealer and foundation; blush might be applied on top of these, with a hair image layer at the top, and with reasonable choices being made by the system about which layer is best to apply next. The makeup associated with the look has an applied area, which is a defined style. This can be represented as a mask, which might be binary or might be an alpha mask containing numerical values. The mask shape may be predefined, and it might be warped by a user. For example, multi-toned eye shadow may be represented by three masks, one for each of three different eye shadow products, and these will be rendered in order. The mask shapes are generally different. The user may modify the masks so as to, for example, fall on the crease of the eye and extend to the eye brow.
  • The mask for applying products like foundation, tanners, or bronzers might use an automatically constructed mask based on automatic skin detection as described elsewhere, or a predefined mask whose shape and location are referenced to automatically or manually detected face features.
  • A rendering process for each layer takes the partially rendered image to that point and renders the new layer into the region defined by the mask. The output of applying that layer will be a function of the makeup properties, including its color, transparency, luster, gloss, and metal content, and of the underlying image color. There are different ways to model how each type of makeup is applied. One method is called alpha blending. Others use the separation of specular from diffuse reflectance and apply colors to these components. Others use shape models of the face to determine specular components. Once a layer is applied, it is not modified by layers above.
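A minimal sketch of the alpha-blending method, combined with the bottom-up layer ordering described above, might look like the following. All names and values are illustrative assumptions; luster, gloss, and metal content are omitted for brevity, and the image is reduced to a dictionary of pixels:

```python
# Illustrative sketch: each layer carries a product color and a mask giving
# a per-pixel alpha in [0, 1]; layers composite lowest-first over the base.

def blend(base, color, alpha):
    """Alpha-blend a product color over a base pixel, channel by channel."""
    return tuple(round(alpha * c + (1 - alpha) * b) for c, b in zip(color, base))

def render(image, layers):
    """Apply layers lowest-first; once applied, a layer becomes the new base."""
    for layer in layers:                          # e.g. foundation, then blush
        for (x, y), alpha in layer["mask"].items():
            image[(x, y)] = blend(image[(x, y)], layer["color"], alpha)
    return image

image = {(0, 0): (200, 170, 150)}                 # one skin-toned pixel
layers = [
    {"color": (210, 180, 160), "mask": {(0, 0): 1.0}},   # opaque foundation
    {"color": (220, 110, 120), "mask": {(0, 0): 0.3}},   # translucent blush
]
render(image, layers)                             # pixel becomes (213, 159, 148)
```

Note how the property that "once a layer is applied, it is not modified by layers above" falls out of the loop structure: each layer only reads the result of the layers beneath it.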
  • The masks and their positions, together with the product parameters (color, transparency, luster, gloss, metal content, and flecks), are called a look. This look may be saved to a disk or other storage device. The look may be loaded and applied to the original image, or applied to a second image of the same person. For this to be effective, the underlying facial features from the first image must be brought into correspondence with those of the second image. This correspondence induces a transformation of the masked regions. The foundation and bronzer layers would still use the automatically detected skin mask; the subsequent layers would be transformed according to the correspondence.
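The induced transformation can be illustrated with a deliberately simple model. This is an assumption for illustration only: a real system would fit a richer warp from many landmarks, whereas here a uniform scale and translation are estimated from a single pair of matched eye-corner landmarks and each mask pixel is moved through it:

```python
# Illustrative sketch: carry a look's mask from image A to image B using
# two matched landmarks (e.g. the eye corners) in each image.

def fit_scale_translation(src_pts, dst_pts):
    """Estimate uniform scale from the landmark spans, then the shift."""
    (ax, ay), (bx, by) = src_pts
    (cx, cy), (dx, dy) = dst_pts
    src_span = ((bx - ax) ** 2 + (by - ay) ** 2) ** 0.5
    dst_span = ((dx - cx) ** 2 + (dy - cy) ** 2) ** 0.5
    s = dst_span / src_span
    return s, cx - s * ax, cy - s * ay            # scale, tx, ty

def transform_mask(mask, s, tx, ty):
    """Move every mask pixel through the fitted transform."""
    return {(round(s * x + tx), round(s * y + ty)): a
            for (x, y), a in mask.items()}

# Eye corners in the first image vs. the second (larger, shifted) image.
s, tx, ty = fit_scale_translation([(10, 20), (30, 20)], [(15, 25), (55, 25)])
lip_mask = {(18, 40): 1.0, (22, 40): 0.5}
transform_mask(lip_mask, s, tx, ty)
```

As the specification notes, this warping would apply only to feature-anchored layers; skin-detection-based masks such as foundation would simply be recomputed on the second image.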
  • Fast rendering can be accomplished using rendering at multiple resolutions and backgrounding. Each layer is rendered in turn at low resolution, then at higher resolutions, until the full-resolution rendering completes. In an interactive setting such as a makeup editor, a user may be modifying the makeover (mask positions) faster than the rendering can complete at full resolution. With multi-resolution rendering and backgrounding, if a mouse event causes a movement of a layer, the rendering terminates at whatever resolution it has reached and recommences at the new position starting at the lowest resolution. This provides a high degree of interactivity. As the defined ‘looks’ can be considered separate from, but still associated with, a subject presented to the system, it is possible for the same subject to have multiple looks applied to it and viewed at a glance. Conversely, it is also possible for the same look to be applied to multiple images. FIG. 8 shows, in the context of our makeup example, an interface allowing for the ‘at-a-glance’ viewing of an image with multiple applied looks. The original subject image, without any application, is shown at 801. A copy of the image with a specific applied look is shown at 802. This would be a user-created, user-saved look containing details of the beauty product types and options applied to the image. A subsequent copy of the image is shown at 803 with a different look applied. This allows a user to compare images with various makeup selections (options) applied and make a personal selection based on the images presented to them.
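The coarse-to-fine rendering with cancellation described above can be sketched as follows. This is a single-threaded simplification under stated assumptions: the resolutions, the `canceled` callback, and the `render_at` stand-in are all hypothetical, and a real editor would run the passes on a worker thread:

```python
# Illustrative sketch of multi-resolution rendering with backgrounding:
# render at increasing resolutions, abandoning the sequence as soon as
# user interaction (e.g. a mouse event moving a layer) invalidates it.

class Canceled(Exception):
    pass

def render_at(resolution, canceled):
    """Stand-in for rendering all layers at one resolution."""
    if canceled():
        raise Canceled
    return f"rendered@{resolution}"

def progressive_render(resolutions, canceled):
    done = []
    try:
        for r in resolutions:          # e.g. 1/8, 1/4, 1/2, full resolution
            done.append(render_at(r, canceled))
    except Canceled:
        pass                           # caller restarts from lowest resolution
    return done

# With no interaction, every pass completes, ending at full resolution.
progressive_render([128, 256, 512, 1024], canceled=lambda: False)
```

The key property is that a low-resolution result is always available almost immediately, so the user sees a usable preview even while dragging masks continuously.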
  • Computer System Aspect
  • One or more embodiments of the invention may be implemented in the form of one or more computer programs that, when executed in computer memory, may cause one or more computer processors to initiate the methods and processes described herein. The files assembled to make up the software program may be stored on one or more computer-readable media and retrieved when needed to carry out the programmed methods and processes described herein. Within the scope of a computer-implemented embodiment of the invention, readers should note that one or more embodiments of the invention may comprise computer programs, data and other information further comprising, but not limited to: sets of computer instructions, code sequences, configuration information, data and other information in any form, format or language usable by a general purpose computer or other data processing device, such that when such a computer or device contains, is programmed with, or has access to said computer programs, the data and other information transforms said general purpose computer into a machine capable of performing the methods and processes described herein, and specifically such as those described above.
  • Various embodiments of the invention may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, computer-readable media or any combination thereof. The term “article of manufacture” (or alternatively, “computer program product”) as used herein is intended to encompass a computer program of any form accessible from any computer-readable device, carrier or media. In addition, the software in which various embodiments are implemented may be accessible through a transmission medium, such as, for example, from a server over a network. The article of manufacture in which the program is implemented may also employ transmission media, such as a network transmission line and/or a wireless transmission medium. Those skilled in the art will recognize that many modifications may be made to this configuration without departing from the scope of the invention.
  • A computer-readable medium suitable to provide computer readable instructions and/or computer readable data for the methods and processes described herein may be any type of magnetic, optical, electrical or other storage medium including disk, tape, CD, DVD, flash drive, thumb drive, storage card, distributed storage or any other memory device, location, approach or other storage medium or technique known to those of skill in the art.
  • In one or more embodiments of the invention, the methods described herein are not limited as to the type of computer they may run upon and may, for instance, operate on any generalized computer system that has the computational ability to execute the methods described herein and can display the results of the user's choices on one or more display devices. Display devices appropriate for providing interaction with the invention described herein include, but are not limited to, computer monitors, cell phones, PDAs, televisions, or any other form of computer-controllable output display. As used herein, a computer system refers to, but is not limited to, any type of computing device, including its associated computer software, data, peripheral devices, communications equipment and any required or desired computers that may achieve direct or indirect communication with a primary computing device.
  • In one or more embodiments of the invention, a general-purpose computer may be utilized to implement one or more aspects of the invention. In one or more embodiments of the invention, the computer may include various input and output means, including but not limited to a keyboard or other textual input devices, a display device such as a monitor or other display screen, and a pointing device and/or user selection indicator such as a mouse, keypad, touch screen, or other input/output devices known to those of skill in the art. The general purpose computer described herein may include one or more banks of random access memory, read only memory, and one or more central processing unit(s). The general purpose computer described herein may also include one or more data storage device(s) such as a hard disk drive, or other computer-readable media discussed above. An operating system that executes within the computer memory may provide an interface between the hardware and software. The operating system may be responsible for managing, coordinating and sharing the limited resources within the computer. Software programs that run on the computer may rely on the operating system to provide the program of the invention with access to the resources needed to execute. In other embodiments the program may run stand-alone on the processor to perform the methods described herein.
  • In one or more embodiments of the invention, the method(s) described herein, when loaded on or executed through or by one or more of the general purpose computer(s) described above, may transform the general purpose computer(s) into a specially programmed computer able to perform the method or methods described herein. In one or more embodiments of the invention, the computer-readable storage medium(s) may be encoded with computer program instructions that, when accessed by a computer, cause the computer to load the program instructions to a memory accessible to it, thereby creating a specially programmed computer able to perform the methods described herein.
  • The specially programmed computer of the invention may also comprise a connection that allows the computer to send and/or receive data through a computer network such as the Internet or other communication network. Mobile computer platforms such as cellular telephones, Personal Digital Assistants (PDAs), other hand-held computing devices, digital recorders, wearable computing devices, kiosks, set top boxes, game boxes or any other computational device, portable, personal, real or virtual or otherwise, may also qualify as a computer system or part of a computer system capable of executing the methods described herein as a specially programmed computer.
  • FIG. 9 depicts a general-purpose computer and peripherals that, when programmed as described herein, may operate as a specially programmed computer capable of implementing one or more methods, apparatus and/or systems of the invention. Processor 907 may be coupled to bi-directional communication infrastructure 902 such as Communication Infrastructure System Bus 902. Communication Infrastructure 902 may generally be a system bus that provides an interface to the other components in the general-purpose computer system such as Processor 907, Main Memory 906, Display Interface 908, Secondary Memory 912 and/or Communication Interface 924.
  • Main Memory 906 may provide a computer-readable medium for accessing and executing stored data and applications. Display Interface 908 may communicate with Display Unit 910, which may be utilized to display outputs to the user of the specially-programmed computer system. Display Unit 910 may comprise one or more monitors that may visually depict aspects of the computer program to the user. Main Memory 906 and Display Interface 908 may be coupled to Communication Infrastructure 902, which may serve as the interface point to Secondary Memory 912 and Communication Interface 924. Secondary Memory 912 may provide additional memory resources beyond Main Memory 906, and may generally function as a storage location for computer programs to be executed by Processor 907. Either fixed or removable computer-readable media may serve as Secondary Memory 912. Secondary Memory 912 may comprise, for example, Hard Disk 914 and Removable Storage Drive 916 that may have an associated Removable Storage Unit 918. There may be multiple sources of Secondary Memory 912, and systems of the invention may be configured as needed to support the data storage requirements of the user and the methods described herein. Secondary Memory 912 may also comprise Interface 920 that serves as an interface point to additional storage such as Removable Storage Unit 922. Numerous types of data storage devices may serve as repositories for data utilized by the specially programmed computer system of the invention. For example, magnetic, optical or magneto-optical storage systems, or any other available mass storage technology that provides a repository for digital information, may be used.
  • Communication Interface 924 may be coupled to Communication Infrastructure 902 and may serve as a conduit for data destined for or received from Communication Path 926. A Network Interface Card (NIC) is an example of the type of device that, once coupled to Communication Infrastructure 902, may provide a mechanism for transporting data to Communication Path 926. Computer networks such as Local Area Networks (LAN), Wide Area Networks (WAN), wireless networks, optical networks, distributed networks, the Internet or any combination thereof are some examples of the types of communication paths that may be utilized by the specially programmed computer system of the invention. Communication Path 926 may comprise any type of telecommunication network or interconnection fabric that can transport data to and from Communication Interface 924.
  • To facilitate user interaction with the specially programmed computer system of the invention, one or more Human Interface Devices (HIDs) 930 may be provided. Examples of HIDs that enable users to input commands or data to the specially programmed computer of the invention include a keyboard, mouse, touch screen devices, microphones or other audio interface devices, motion sensors or the like, as well as any other device able to accept any kind of human input and in turn communicate that input to Processor 907 to trigger one or more responses from the specially programmed computer; all such devices are within the scope of the system of the invention.
  • While FIG. 9 depicts a physical device, the scope of the system of the invention may also encompass a virtual device, virtual machine or simulator embodied in one or more computer programs executing on a computer or computer system and acting as, or providing, a computer system environment compatible with the methods and processes of the invention. Where a virtual machine, process, device or otherwise performs substantially similarly to that of a physical computer system of the invention, such a virtual platform will also fall within the scope of a system of the invention, notwithstanding the description herein of a physical system such as that in FIG. 9.
  • One or more embodiments of the invention are configured to enable the specially programmed computer of the invention to take the input data given and transform it into an interface enabling dynamic layout of options or color within a group for application to an image, arrangement and ordering of colors, color selection history, context sensitive menus, and look history and saving, by applying one or more of the methods and/or processes of the invention as described herein. Thus the methods described herein are able to transform the raw input data provided to the system of the invention into a resulting output of the system using the specially programmed computer as described.
  • One or more embodiments of the invention are configured to enable a general-purpose computer to take one or more color palettes, and the color choices associated with each color palette, from memory, and to transform and display the graphical user interface component on Display Unit 910, for example. The user, through interaction with Human Interface Device 930, enters a selection of a region of a subject image. Processor 907 receives the selection of a region of a subject image, transforms the color palette and color choice data, and transmits the information to Display Unit 910 for display. The user may interact with the computer system through Human Interface Device 930 and may select a second color choice, which causes Processor 907 to process the information and transmit a signal to the graphical user interface on Display Unit 910.
  • In one or more embodiments of the invention, when a user selects an identified region of a subject image, Processor 907 transmits data to Display Unit 910 and enables the user to see context sensitive menus that are associated with the specific identified region of the subject image.
  • In one or more embodiments of the invention, Processor 907 requests the color records for a particular color palette. Processor 907 then calculates the arrangement of the colors in a color flywheel display through the use of a greedy algorithm. Processor 907 then transmits the signals to Display Unit 910, where they may be displayed to a user.
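One way to read the claimed local-cost swap (see claims 26–27) is the following greedy pass, sketched under explicit assumptions: Euclidean RGB distance stands in for the color-space difference, a plain list stands in for the circular doubly linked structure, and the pass count is arbitrary. Adjacent colors on the circular flywheel are swapped whenever the swap lowers the summed distance between neighbors (a negative local cost):

```python
# Illustrative greedy arrangement of a circular color flywheel.

def dist(c1, c2):
    """Euclidean distance in RGB space (an assumed metric)."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def local_cost_of_swap(colors, i):
    """Change in total neighbor distance if colors[i] and its successor swap.
    The edge between the two swapped colors is unchanged, so only the two
    outer edges contribute."""
    n = len(colors)
    prev, a = colors[i - 1], colors[i]
    b, nxt = colors[(i + 1) % n], colors[(i + 2) % n]
    return (dist(prev, b) + dist(a, nxt)) - (dist(prev, a) + dist(b, nxt))

def greedy_arrange(colors, passes=10):
    colors = list(colors)
    for _ in range(passes):
        changed = False
        for i in range(len(colors)):
            if local_cost_of_swap(colors, i) < 0:      # swap only if it helps
                j = (i + 1) % len(colors)
                colors[i], colors[j] = colors[j], colors[i]
                changed = True
        if not changed:
            break                                       # reached a local optimum
    return colors

# Two reds and two blues interleaved; the pass groups similar hues together.
palette = [(255, 0, 0), (0, 0, 255), (200, 40, 40), (40, 40, 220)]
greedy_arrange(palette)
```

Run on the interleaved palette above, the pass moves the two reds next to each other (and, since the list is circular, the two blues as well), which is the visual grouping a flywheel display benefits from.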
  • In one or more embodiments of the invention, Human Interface Device 930 accepts a user's input in which the user may select a first identified region of a subject image to apply a first color, and a second identified region of the subject image to apply a second color. Processor 907 may process the metadata describing the first color and second color and store that information as a “look” in Main Memory 906, for example.
  • While the invention herein disclosed has been described by means of specific embodiments and applications thereof, numerous modifications and variations could be made thereto by those skilled in the art without departing from the scope of the invention set forth in the claims.

Claims (32)

1. A computer program product for rendering a graphical user interface component for selecting color to apply to an image comprising computer readable program code, said computer readable program code executing in a tangible memory medium and configured to:
display a graphical user interface component comprising a palette selection region, a color selection region, a history screen region further comprising a current selected region and a plurality of recently selected screen regions;
identify an identified region of a subject image to apply a color;
define at least one color palette to associate with said palette selection region;
display a plurality of color choices within said at least one color palette in said color selection region;
obtain input from a user representing a first color choice made by said user;
apply said first color choice to said identified region of a subject image;
display said first color choice in said current selected region;
if a second color choice is input by said user, apply said second color choice to said identified region of said subject image;
display said second color choice in said current selected region; and
display said first color choice in one of said plurality of recently selected screen regions.
2. The computer program product of claim 1 wherein said color palette is defined by said user.
3. The computer program product of claim 1 wherein said color palette is defined by a default selection.
4. The computer program product of claim 1 wherein said color palette is defined based on a color profile of an associated image.
5. The computer program product of claim 1 wherein said color palette is defined based on said identified region of said subject image.
6. The computer program product of claim 1 wherein said color palette represents colors associated with a cosmetic product.
7. The computer program product of claim 1 wherein said color palette represents a product line of products and their associated characteristics.
8. The computer program product of claim 6 wherein said cosmetic product is a facial cosmetic product.
9. The computer program product of claim 8 wherein said facial cosmetic product is foundation.
10. The computer program product of claim 8 wherein said facial cosmetic product is blush.
11. The computer program product of claim 8 wherein said facial cosmetic product is concealer.
12. The computer program product of claim 8 wherein said facial cosmetic product is eye shadow.
13. The computer program product of claim 8 wherein said facial cosmetic product is mascara.
14. The computer program product of claim 8 wherein said facial cosmetic product is eye liner.
15. The computer program product of claim 8 wherein said facial cosmetic product is lipstick.
16. The computer program product of claim 8 wherein said facial cosmetic product is lip gloss.
17. The computer program product of claim 8 wherein said facial cosmetic product is lip liner.
18. The computer program product of claim 1 wherein said palette selection region is a tab adjacent to said color selection region.
19. The computer program product of claim 1 wherein said color selection region is visually represented as a flywheel separated into segments for a plurality of said color choices.
20. The computer program product of claim 1 wherein said recently selected screen region moves in descending order of most to least recently selected color choices.
21. The computer program product of claim 1 wherein definition of said at least one color palette to associate with said palette selection region is defined by said user.
22. The computer program product of claim 1 wherein definition of said at least one color palette to associate with said palette selection region is defined by a system.
23. The computer program product of claim 1 wherein definition of said at least one color palette to associate with said palette selection region is defined by said identified region of subject image.
24. The computer program product of claim 1 wherein a context sensitive menu is presented to said user when said user selects said identified region of said subject image, wherein said context sensitive menu presents at least one action that is associated with said identified region.
25. The computer program product of claim 1 further configured to present at least one option group wherein said at least one option group is associated with said at least one color palette.
26. A method for displaying a color selection region comprising:
creating a circular doubly linked flywheel data structure, said flywheel data structure comprising a plurality of records;
associating each record of said plurality of records with a segment on a flywheel display;
associating each color of a plurality of colors with each record of said plurality of records;
calculating a local cost for a color;
when said local cost is less than zero, swapping said record of said color with a next record of a next color; and
displaying a first color in said plurality of colors on a first segment of said flywheel display.
27. The method of claim 26 wherein said calculating a local cost for a color comprises determining differences in color space of said record and said next record.
28. The method of claim 26 wherein said flywheel display is visually represented as a circle.
29. The method of claim 26 wherein said flywheel display is visually represented as an ellipse.
30. A computer program product for rendering a graphical user interface component for selecting color to apply to an image comprising computer readable program code, said computer readable program code executing in a tangible memory medium and configured to:
display a graphical user interface component comprising a palette selection region, a color selection region further comprising a flywheel display, a history screen region further comprising a current selected region and a plurality of recently selected screen regions;
identify a region of a subject image to apply a color to;
define at least one color palette to associate with said palette selection region;
create a circular doubly linked flywheel data structure, said flywheel data structure comprising a plurality of records;
associate each record of said plurality of records with a segment on a flywheel display;
associate each color of a plurality of colors with each record of said plurality of records;
calculate a local cost for a color;
when said local cost is less than zero, swap said record of said color with a next record of a next color; and
display a first color in said plurality of colors on a first segment of said flywheel display;
obtain input from a user representing a first color choice made by said user;
apply said first color choice to said identified region of a subject image;
display said first color choice in said current selected region;
if a second color choice is input by said user, apply said second color choice to said identified region of said subject image;
display said second color choice in said current selected region; and
display said first color choice in one of said plurality of recently selected screen regions.
31. A computer program product for rendering a graphical user interface component for selecting color to apply to an image comprising computer readable program code, said computer readable program code executing in a tangible memory medium and configured to:
display a graphical user interface component comprising a palette selection region, a color selection region;
identify a first identified region of a subject image to apply a first color;
obtain input from a user to determine a first color choice of said first identified region;
identify a second identified region of said subject image to apply a second color;
obtain input from said user to determine a second color choice of said second identified region; and,
store a look of said subject image, said look comprising said first color choice and said second color choice.
32. A computer readable storage medium encoded with computer program instructions which when accessed by a computer cause the computer to load the program instructions to a memory therein creating a special purpose data structure causing the computer to operate as a specially programmed computer executing a method of displaying color selection history comprising:
displaying in a specially programmed computer a graphical user interface component comprising a palette selection region, a color selection region, a history screen region further comprising a current selected region and a plurality of recently selected screen regions;
receiving through a human interface device, an identified region of a subject image to apply a color;
computing at least one color palette to associate with said palette selection region in one or more processors;
displaying a plurality of color choices within at least one color palette in said color selection region;
receiving through said human interface device input from a user representing a first color choice made by said user;
displaying said first color choice to said identified region of a subject image;
displaying said first color choice in said current selected region;
if a second color choice is input by said user, displaying said second color choice in said identified region of said subject image;
displaying said second color choice in said current selected region; and
displaying said first color choice in one of said plurality of recently selected screen regions.
US12/406,066 2008-03-17 2009-03-17 Graphical user interface for selection of options from option groups and methods relating to same Abandoned US20090231356A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/406,066 US20090231356A1 (en) 2008-03-17 2009-03-17 Graphical user interface for selection of options from option groups and methods relating to same

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US3731408P 2008-03-17 2008-03-17
US3731908P 2008-03-17 2008-03-17
US3732308P 2008-03-17 2008-03-17
US12/406,066 US20090231356A1 (en) 2008-03-17 2009-03-17 Graphical user interface for selection of options from option groups and methods relating to same

Publications (1)

Publication Number Publication Date
US20090231356A1 true US20090231356A1 (en) 2009-09-17

Family

ID=41062548

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/406,066 Abandoned US20090231356A1 (en) 2008-03-17 2009-03-17 Graphical user interface for selection of options from option groups and methods relating to same
US12/406,063 Abandoned US20090234716A1 (en) 2008-03-17 2009-03-17 Method of monetizing online personal beauty product selections
US12/406,099 Expired - Fee Related US9058765B1 (en) 2008-03-17 2009-03-17 System and method for creating and sharing personalized virtual makeovers

Family Applications After (2)

Application Number Title Priority Date Filing Date
US12/406,063 Abandoned US20090234716A1 (en) 2008-03-17 2009-03-17 Method of monetizing online personal beauty product selections
US12/406,099 Expired - Fee Related US9058765B1 (en) 2008-03-17 2009-03-17 System and method for creating and sharing personalized virtual makeovers

Country Status (1)

Country Link
US (3) US20090231356A1 (en)

Cited By (94)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090280150A1 (en) * 2008-05-09 2009-11-12 Tamar Lara Kamen Targeted And Individualized Cosmetic Delivery
US20100068247A1 (en) * 2008-09-16 2010-03-18 Tsung-Wei Robert Mou Method And System For Providing Targeted And Individualized Delivery Of Cosmetic Actives
US20100235152A1 (en) * 2009-03-11 2010-09-16 Kimura Mitsunori Interactive contact lens simulation system and method
US20110113378A1 (en) * 2009-11-09 2011-05-12 International Business Machines Corporation Contextual abnormality captchas
US20110123703A1 (en) * 2008-05-09 2011-05-26 Fatemeh Mohammadi Method and System For Automatic or Manual Evaluation to Provide Targeted and Individualized Delivery of Cosmetic Actives in a Mask or Patch Form
US20110164787A1 (en) * 2009-07-13 2011-07-07 Pierre Legagneur Method and system for applying cosmetic and/or accessorial enhancements to digital images
US20110197164A1 (en) * 2010-02-11 2011-08-11 Samsung Electronics Co. Ltd. Method and system for displaying screen in a mobile device
US20120105336A1 (en) * 2010-10-27 2012-05-03 Hon Hai Precision Industry Co., Ltd. Electronic cosmetic case with 3d function
US20120221418A1 (en) * 2000-08-24 2012-08-30 Linda Smith Targeted Marketing System and Method
WO2012122419A1 (en) * 2011-03-08 2012-09-13 Affinnova, Inc. System and method for concept development
US20120306991A1 (en) * 2011-06-06 2012-12-06 Cisco Technology, Inc. Diminishing an Appearance of a Double Chin in Video Communications
US20130019208A1 (en) * 2011-07-14 2013-01-17 Microsoft Corporation Managing content color through context based color menu
US20130111337A1 (en) * 2011-11-02 2013-05-02 Arcsoft Inc. One-click makeover
WO2013062718A1 (en) * 2011-10-28 2013-05-02 Apple Inc. On-screen image adjustments
US8491926B2 (en) 2008-09-16 2013-07-23 Elc Management Llc Method and system for automatic or manual evaluation to provide targeted and individualized delivery of cosmetic actives in a mask or patch form
US20130271485A1 (en) * 2010-10-29 2013-10-17 Omron Corporation Image-processing device, image-processing method, and control program
US20140040789A1 (en) * 2012-05-08 2014-02-06 Adobe Systems Incorporated Tool configuration history in a user interface
US20140047389A1 (en) * 2012-08-10 2014-02-13 Parham Aarabi Method and system for modification of digital images through rotational cascading-effect interface
US20140111539A1 (en) * 2012-10-22 2014-04-24 FiftyThree, Inc. Methods and apparatus for providing color palette management within a graphical user interface
CN103885702A (en) * 2012-12-20 2014-06-25 宏达国际电子股份有限公司 Menu Management Methods And Systems
CN104380339A (en) * 2013-04-08 2015-02-25 松下电器(美国)知识产权公司 Image processing device, image processing method, and program, capable of virtual reproduction of makeup application state
GB2518589A (en) * 2013-07-30 2015-04-01 Holition Ltd Image processing
CN104866165A (en) * 2014-02-21 2015-08-26 联想(北京)有限公司 Information processing method and electronic equipment
AU2015101183B4 (en) * 2014-09-02 2015-11-19 Apple Inc. User interface for receiving user input
US9208132B2 (en) 2011-03-08 2015-12-08 The Nielsen Company (Us), Llc System and method for concept development with content aware text editor
WO2016054164A1 (en) * 2014-09-30 2016-04-07 Tcms Transparent Beauty, Llc Precise application of cosmetic looks from over a network environment
US9311383B1 (en) 2012-01-13 2016-04-12 The Nielsen Company (Us), Llc Optimal solution identification system and method
USD754182S1 (en) * 2013-12-20 2016-04-19 Teenage Engineering Ab Display screen or portion thereof with graphical user interface
US9459781B2 (en) 2012-05-09 2016-10-04 Apple Inc. Context-specific user interfaces for displaying animated sequences
US9467436B2 (en) * 2013-01-04 2016-10-11 Gary Stephen Shuster Captcha systems and methods
USRE46178E1 (en) 2000-11-10 2016-10-11 The Nielsen Company (Us), Llc Method and apparatus for evolutionary design
US20170004635A1 (en) * 2006-08-14 2017-01-05 Albert D. Edgar System and Method for Applying a Reflectance Modifying Agent to Change a Person's Appearance Based on a Digital Image
US9547425B2 (en) 2012-05-09 2017-01-17 Apple Inc. Context-specific user interfaces
EP3017415A4 (en) * 2013-07-03 2017-01-18 Glasses.Com Inc. Systems and methods for recommending products via crowdsourcing and detecting user characteristics
US9785995B2 (en) 2013-03-15 2017-10-10 The Nielsen Company (Us), Llc Method and apparatus for interactive evolutionary algorithms with respondent directed breeding
US9799041B2 (en) 2013-03-15 2017-10-24 The Nielsen Company (Us), Llc Method and apparatus for interactive evolutionary optimization of concepts
US20180042361A1 (en) * 2014-12-02 2018-02-15 L'oreal Dispensing system and method for learning to use such a dispensing system
US9916075B2 (en) 2015-06-05 2018-03-13 Apple Inc. Formatting content for a reduced-size user interface
US20180129367A1 (en) * 2016-11-04 2018-05-10 Microsoft Technology Licensing, Llc Action-enabled inking tools
US20180129366A1 (en) * 2016-11-04 2018-05-10 Microsoft Technology Licensing, Llc Personalized persistent collection of customized inking tools
US10016046B2 (en) 2005-08-12 2018-07-10 Tcms Transparent Beauty, Llc System and method for applying a reflectance modifying agent to improve the visual attractiveness of human skin
CN108292423A (en) * 2015-12-25 2018-07-17 松下知识产权经营株式会社 Local dressing producing device, local dressing utilize program using device, local dressing production method, local dressing using method, local dressing production process and local dressing
USD823320S1 (en) * 2017-05-24 2018-07-17 Koninklijke Philips N.V. Display screen with graphical user interface
US10055121B2 (en) 2015-03-07 2018-08-21 Apple Inc. Activity based thresholds and feedbacks
US10092082B2 (en) 2007-05-29 2018-10-09 Tcms Transparent Beauty Llc Apparatus and method for the precision application of cosmetics
EP3022639B1 (en) * 2013-07-16 2018-10-31 Pinterest, Inc. Object based contextual menu controls
USD844013S1 (en) * 2017-05-24 2019-03-26 Koninklijke Philips N.V. Display screen with animated graphical user interface
US10254948B2 (en) 2014-09-02 2019-04-09 Apple Inc. Reduced-size user interfaces for dynamically updated application overviews
US10268367B2 (en) 2010-02-19 2019-04-23 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
US10272294B2 (en) 2016-06-11 2019-04-30 Apple Inc. Activity and workout updates
CN109844800A (en) * 2016-10-14 2019-06-04 松下知识产权经营株式会社 Virtual cosmetic device and virtual cosmetic method
US10354263B2 (en) 2011-04-07 2019-07-16 The Nielsen Company (Us), Llc Methods and apparatus to model consumer choice sourcing
US10452253B2 (en) 2014-08-15 2019-10-22 Apple Inc. Weather user interface
US10486174B2 (en) 2007-02-12 2019-11-26 Tcms Transparent Beauty Llc System and method for applying a reflectance modifying agent electrostatically to improve the visual attractiveness of human skin
US10613745B2 (en) 2012-05-09 2020-04-07 Apple Inc. User interface for receiving user input
US10771606B2 (en) 2014-09-02 2020-09-08 Apple Inc. Phone user interface
US10809884B2 (en) 2017-11-06 2020-10-20 The Sherwin-Williams Company Paint color selection and display system and method
US10824317B2 (en) * 2017-06-14 2020-11-03 Behr Process Corporation Systems and methods for assisting with color selection
US10849406B2 (en) 2014-12-02 2020-12-01 L'oreal System for dispensing at least one makeup product and method for dispensing and evaluating makeup
US10872318B2 (en) 2014-06-27 2020-12-22 Apple Inc. Reduced size user interface
US10901576B1 (en) 2016-11-01 2021-01-26 Swimc Llc Color selection and display
WO2021021442A1 (en) * 2019-07-31 2021-02-04 L'oreal Improved color wheel interface
US10925377B2 (en) 2014-12-02 2021-02-23 L'oreal Dispensing system having at least two outlet interfaces
US10990270B2 (en) 2012-05-09 2021-04-27 Apple Inc. Context-specific user interfaces
US20210166067A1 (en) * 2018-09-21 2021-06-03 Fujifilm Corporation Image suggestion apparatus, image suggestion method, and image suggestion program
USD923033S1 (en) 2020-02-12 2021-06-22 SpotLogic, Inc. Computer display panel with a home screen graphical user interface for an application that optimizes interpersonal interaction
USD924916S1 (en) 2020-02-12 2021-07-13 SpotLogic, Inc. Computer display panel with a meeting planning graphical user interface for an application that optimizes interpersonal interaction
CN113126846A (en) * 2021-04-27 2021-07-16 广州市妇女儿童医疗中心 Intelligent medicine information processing method and device, computer equipment and storage medium
USD925595S1 (en) * 2020-02-12 2021-07-20 SpotLogic, Inc. Computer display panel with a graphical user interface for an application that optimizes interpersonal interaction
US11076680B2 (en) 2014-12-02 2021-08-03 L'oreal System for dispensing a makeup product
US20210241501A1 (en) * 2020-01-31 2021-08-05 L'oreal System and method of lipstick bulktone and application evaluation
US11087388B1 (en) 2016-10-31 2021-08-10 Swimc Llc Product-focused search method and apparatus
USD932507S1 (en) 2020-02-12 2021-10-05 SpotLogic, Inc. Computer display panel with a meeting objective editing graphical user interface for an application that optimizes interpersonal interaction
US20210318796A1 (en) * 2018-08-17 2021-10-14 Matrix Analytics Corporation System and Method for Fabricating Decorative Surfaces
USD933692S1 (en) 2020-02-12 2021-10-19 SpotLogic, Inc. Computer display panel with a meeting objective editing graphical user interface for an application that optimizes interpersonal interaction
USD940731S1 (en) * 2019-10-31 2022-01-11 Eli Lilly And Company Display screen with a graphical user interface
US11225373B2 (en) 2014-12-02 2022-01-18 L'oreal Assembly comprising an airbrush
US11341685B2 (en) * 2019-05-03 2022-05-24 NipLips, LLC Color-matching body part to lip product
US20220240650A1 (en) * 2021-01-29 2022-08-04 L'oreal Remote beauty consultation system
US20220269402A1 (en) * 2016-06-29 2022-08-25 Dassault Systemes Generation of a color of an object displayed on a gui
US20220326837A1 (en) * 2021-04-13 2022-10-13 Apple Inc. Methods for providing an immersive experience in an environment
US11604571B2 (en) 2014-07-21 2023-03-14 Apple Inc. Remote user interface
EP4160540A1 (en) * 2021-09-29 2023-04-05 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for producing special effect, electronic device and storage medium
US11636662B2 (en) 2021-09-30 2023-04-25 Snap Inc. Body normal network light and rendering control
US11651572B2 (en) 2021-10-11 2023-05-16 Snap Inc. Light and rendering of garments
US11657417B2 (en) 2015-04-02 2023-05-23 Nielsen Consumer Llc Methods and apparatus to identify affinity between segment attributes and product characteristics
US11670059B2 (en) 2021-09-01 2023-06-06 Snap Inc. Controlling interactive fashion based on body gestures
US11673054B2 (en) 2021-09-07 2023-06-13 Snap Inc. Controlling AR games on fashion items
US11714536B2 (en) 2021-05-21 2023-08-01 Apple Inc. Avatar sticker editor user interfaces
US11734866B2 (en) 2021-09-13 2023-08-22 Snap Inc. Controlling interactive fashion based on voice
US11816144B2 (en) 2022-03-31 2023-11-14 Pinterest, Inc. Hair pattern determination and filtering
GB2619283A (en) * 2022-05-26 2023-12-06 Holition Ltd Simulating foundation makeup effect in augmented images
US11900506B2 (en) * 2021-09-09 2024-02-13 Snap Inc. Controlling interactive fashion based on facial expressions
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar

Families Citing this family (102)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011515729A (en) * 2008-02-13 2011-05-19 チェン,ヤウリン,シー. Beauty product sales system and method
US20090276704A1 (en) * 2008-04-30 2009-11-05 Finn Peter G Providing customer service hierarchies within a virtual universe
US8365092B2 (en) * 2008-07-03 2013-01-29 Ebay Inc. On-demand loading of media in a multi-media presentation
US8893015B2 (en) 2008-07-03 2014-11-18 Ebay Inc. Multi-directional and variable speed navigation of collage multi-media
US10282391B2 (en) 2008-07-03 2019-05-07 Ebay Inc. Position editing tool of collage multi-media
US8386406B2 (en) * 2009-07-08 2013-02-26 Ebay Inc. Systems and methods for making contextual recommendations
TW201118778A (en) * 2009-11-17 2011-06-01 Inst Information Industry System and method for recommending product and automatic service equipment thereof
US8494901B2 (en) * 2010-02-17 2013-07-23 Ebay Inc. Methods and systems for multi-merchant couponing
JP5648299B2 (en) * 2010-03-16 2015-01-07 株式会社ニコン Eyeglass sales system, lens company terminal, frame company terminal, eyeglass sales method, and eyeglass sales program
US20120016773A1 (en) * 2010-07-19 2012-01-19 Strauss Steven D Customizable Method and System for Determining a Preliminary Cost Estimate for a Home Renovation Project
US20120036048A1 (en) 2010-08-06 2012-02-09 Diy Media, Inc. System and method for distributing multimedia content
US20120066315A1 (en) * 2010-09-14 2012-03-15 Douglas Louis Tuman Visual identifiers as links to access resources
US8626589B2 (en) * 2011-01-26 2014-01-07 Google Inc. Auction-based application launching
US20120327257A1 (en) * 2011-06-24 2012-12-27 O'keefe Brian Joseph Photo product using images from different locations
US11928172B2 (en) * 2011-08-04 2024-03-12 Tara Chand Singhal Systems and methods for a web browser for use in handheld wireless devices that renders web pages without advertisement
US9262766B2 (en) * 2011-08-31 2016-02-16 Vibrant Media, Inc. Systems and methods for contextualizing services for inline mobile banner advertising
US20130054356A1 (en) * 2011-08-31 2013-02-28 Jason Richman Systems and methods for contextualizing services for images
WO2013033445A2 (en) * 2011-08-31 2013-03-07 Vibrant Media Inc. Systems and methods for contextualizing a toolbar, an image and inline mobile banner advertising
US9047383B1 (en) 2011-09-06 2015-06-02 Google Inc. Analyzing user profiles
MX2014005448A (en) * 2011-11-07 2014-08-22 Outerwall Inc Consumer operated kiosk for sampling beauty products and associated systems and methods.
US20130159895A1 (en) * 2011-12-15 2013-06-20 Parham Aarabi Method and system for interactive cosmetic enhancements interface
US20130268305A1 (en) * 2012-04-09 2013-10-10 Social Club Hub, Inc. Social club networking environment
US9449412B1 (en) * 2012-05-22 2016-09-20 Image Metrics Limited Adaptive, calibrated simulation of cosmetic products on consumer devices
US9460462B1 (en) * 2012-05-22 2016-10-04 Image Metrics Limited Monetization using video-based simulation of cosmetic products
US8560625B1 (en) * 2012-09-01 2013-10-15 Google Inc. Facilitating photo sharing
JP2014052858A (en) * 2012-09-07 2014-03-20 Sony Corp Information processing device and method, program, and information processing system
US10650445B1 (en) 2012-10-30 2020-05-12 Amazon Technologies, Inc. Collaborative bidding in an online auction
US20140207608A1 (en) * 2013-01-23 2014-07-24 Cortney Leupke System of Providing an Enhanced Salon Experience
USD737376S1 (en) 2013-03-14 2015-08-25 Outerwall Inc Consumer operated kiosk for sampling products
CN105209870B (en) 2013-03-15 2018-11-20 Matchco公司 System and method for external preparation that is specified and preparing customization
US20140351092A1 (en) * 2013-05-21 2014-11-27 Heidi Burkhart Methods, Systems, and Media for Marketing Beauty Products and Services
JP5662537B2 (en) * 2013-05-27 2015-01-28 株式会社Kpiソリューションズ Information processing system and information processing method
US10002498B2 (en) * 2013-06-17 2018-06-19 Jason Sylvester Method and apparatus for improved sales program and user interface
US9477973B2 (en) 2013-06-25 2016-10-25 International Business Machines Visually generated consumer product presentation
CN103489107B (en) * 2013-08-16 2015-11-25 北京京东尚科信息技术有限公司 A kind of method and apparatus making virtual fitting model image
EP3072098A4 (en) * 2013-11-22 2017-04-19 Hair Construction, Inc. Networked style logistics
WO2015114785A1 (en) * 2014-01-30 2015-08-06 楽天株式会社 Attribute display system, attribute display method, and attribute display program
US20150302423A1 (en) * 2014-04-17 2015-10-22 Xerox Corporation Methods and systems for categorizing users
USD748196S1 (en) 2014-08-27 2016-01-26 Outerwall Inc. Consumer operated kiosk for sampling products
TWI680747B (en) * 2014-11-12 2020-01-01 日商新力股份有限公司 Information processing device, information processing method and information processing program
KR20160084151A (en) * 2015-01-05 2016-07-13 주식회사 모르페우스 Method, system and non-transitory computer-readable recording medium for providing face-based service
US20160239867A1 (en) * 2015-02-16 2016-08-18 Adobe Systems Incorporated Online Shopping Cart Analysis
RU2596062C1 (en) 2015-03-20 2016-08-27 Автономная Некоммерческая Образовательная Организация Высшего Профессионального Образования "Сколковский Институт Науки И Технологий" Method for correction of eye image using machine learning and method of machine learning
US9679192B2 (en) * 2015-04-24 2017-06-13 Adobe Systems Incorporated 3-dimensional portrait reconstruction from a single photo
WO2017090794A1 (en) * 2015-11-26 2017-06-01 추이안 System for accessory matching through three-dimensional simulation
JP6212533B2 (en) * 2015-12-28 2017-10-11 株式会社オプティム Screen sharing system, screen sharing method, and screen sharing program
ES2912310T3 (en) 2016-01-05 2022-05-25 Reald Spark Llc Gaze Correction in Multiview Images
WO2017127784A1 (en) * 2016-01-21 2017-07-27 Skwarek Alison M Virtual hair consultation
US20170263031A1 (en) * 2016-03-09 2017-09-14 Trendage, Inc. Body visualization system
CN107180453B (en) * 2016-03-10 2019-08-16 腾讯科技(深圳)有限公司 The edit methods and device of character face's model
US10665004B2 (en) 2016-04-14 2020-05-26 C. J. Wereski System and method for editing and monetizing personalized images at a venue
CN107346386A (en) * 2016-05-05 2017-11-14 阿里巴巴集团控股有限公司 A kind of certification, information generating method and device for certification
US9940519B2 (en) 2016-06-24 2018-04-10 Fotonation Limited Image processing method and system for iris recognition
JP6448869B2 (en) * 2016-08-05 2019-01-09 株式会社オプティム Image processing apparatus, image processing system, and program
JP6861287B2 (en) * 2016-10-18 2021-04-21 スノー コーポレーション Effect sharing methods and systems for video
US10755459B2 (en) * 2016-10-19 2020-08-25 Adobe Inc. Object painting through use of perspectives or transfers in a digital medium environment
JP7252701B2 (en) * 2017-05-23 2023-04-05 株式会社Preferred Networks Systems, Programs and Methods
US20180350155A1 (en) * 2017-05-31 2018-12-06 L'oreal System for manipulating a 3d simulation of a person by adjusting physical characteristics
US10332293B2 (en) * 2017-06-09 2019-06-25 Facebook, Inc. Augmenting reality with reactive programming
WO2018227349A1 (en) * 2017-06-12 2018-12-20 美的集团股份有限公司 Control method, controller, intelligent mirror and computer readable storage medium
US10453374B2 (en) * 2017-06-23 2019-10-22 Samsung Electronics Co., Ltd. Display apparatus and method for displaying
WO2019014646A1 (en) 2017-07-13 2019-01-17 Shiseido Americas Corporation Virtual facial makeup removal, fast facial detection and landmark tracking
CN107403149A (en) * 2017-07-17 2017-11-28 广东欧珀移动通信有限公司 Iris identification method and related product
EP4293574A3 (en) 2017-08-08 2024-04-03 RealD Spark, LLC Adjusting a digital representation of a head region
IT201700099120A1 (en) * 2017-09-05 2019-03-05 Salvatore Lamanna LIGHTING SYSTEM FOR SCREEN OF ANY KIND
KR102421539B1 (en) 2017-10-20 2022-07-14 로레알 Method of making custom applicators for application of cosmetic compositions
KR102546863B1 (en) * 2017-10-20 2023-06-22 로레알 Manufacturing method of custom applicator for application of cosmetic composition
US11157985B2 (en) 2017-11-29 2021-10-26 Ditto Technologies, Inc. Recommendation system, method and computer program product based on a user's physical features
CN108596992B (en) * 2017-12-31 2021-01-01 广州二元科技有限公司 Rapid real-time lip gloss makeup method
CN107895343B (en) * 2017-12-31 2021-02-23 广州二元科技有限公司 Image processing method for quickly and simply blush based on facial feature positioning
US10936175B2 (en) 2018-02-02 2021-03-02 Perfect Corp. Systems and methods for implementing a pin mechanism in a virtual cosmetic application
US10607264B2 (en) 2018-02-02 2020-03-31 Perfect Corp. Systems and methods for virtual application of cosmetic effects to photo albums and product promotion
US10691932B2 (en) 2018-02-06 2020-06-23 Perfect Corp. Systems and methods for generating and analyzing user behavior metrics during makeup consultation sessions
US10431010B2 (en) * 2018-02-09 2019-10-01 Perfect Corp. Systems and methods for virtual application of cosmetic effects to a remote user
US10574881B2 (en) * 2018-02-15 2020-02-25 Adobe Inc. Smart guide to capture digital images that align with a target image model
US11017575B2 (en) 2018-02-26 2021-05-25 Reald Spark, Llc Method and system for generating data to provide an animated visual representation
US10395436B1 (en) 2018-03-13 2019-08-27 Perfect Corp. Systems and methods for virtual application of makeup effects with adjustable orientation view
US10762665B2 (en) * 2018-05-23 2020-09-01 Perfect Corp. Systems and methods for performing virtual application of makeup effects based on a source image
US10719729B2 (en) * 2018-06-06 2020-07-21 Perfect Corp. Systems and methods for generating skin tone profiles
US11676157B2 (en) 2018-07-13 2023-06-13 Shiseido Company, Limited System and method for adjusting custom topical agents
US11257142B2 (en) 2018-09-19 2022-02-22 Perfect Mobile Corp. Systems and methods for virtual application of cosmetic products based on facial identification and corresponding makeup information
US10685457B2 (en) 2018-11-15 2020-06-16 Vision Service Plan Systems and methods for visualizing eyewear on a user
US11113702B1 (en) * 2018-12-12 2021-09-07 Amazon Technologies, Inc. Online product subscription recommendations based on a customers failure to perform a computer-based action and a monetary value threshold
FR3091610B1 (en) * 2019-01-08 2021-05-28 Surys Digital image processing method
CN109886778A (en) * 2019-01-29 2019-06-14 上海华程西南国际旅行社有限公司 The recommended method and system of the tie-in sale product of air ticket
US10832493B2 (en) * 2019-02-28 2020-11-10 Soul Vision Creations Private Limited Programmatic hairstyle opacity compositing for 3D rendering
US11163940B2 (en) * 2019-05-25 2021-11-02 Microsoft Technology Licensing Llc Pipeline for identifying supplemental content items that are related to objects in images
US20210100992A1 (en) * 2019-10-07 2021-04-08 The Procter & Gamble Company Method of Making Applicator With Precision Eye Opening
CN111400764B (en) * 2020-03-25 2021-05-07 支付宝(杭州)信息技术有限公司 Personal information protection wind control model training method, risk identification method and hardware
US11335088B2 (en) * 2020-03-30 2022-05-17 Snap Inc. Augmented reality item collections
KR20220161461A (en) 2020-03-31 2022-12-06 스냅 인코포레이티드 Augmented Reality Experiences for Physical Products in Messaging Systems
CN111741337B (en) * 2020-06-29 2022-04-22 北京金山安全软件有限公司 Recommendation information display method, device and equipment
CN112200626A (en) * 2020-09-30 2021-01-08 京东方科技集团股份有限公司 Method and device for determining recommended product, electronic equipment and computer readable medium
US20220101405A1 (en) * 2020-09-30 2022-03-31 Revieve Oy System and method for determining a skin tone
CN112348736B (en) * 2020-10-12 2023-03-28 武汉斗鱼鱼乐网络科技有限公司 Method, storage medium, device and system for removing black eye
US11528427B2 (en) * 2020-11-27 2022-12-13 Jk Holdings, Llc. Shape and reflectance reconstruction
US20230050535A1 (en) * 2021-01-11 2023-02-16 Tetavi Ltd. Volumetric video from an image source
FR3133257A1 (en) * 2022-03-04 2023-09-08 L'oreal Experimental systems, devices and methods for the design of cosmetic applications
WO2023099960A1 (en) * 2021-11-30 2023-06-08 L'oreal Cosmetic application design experience systems, devices, and methods
US20230260009A1 (en) * 2022-02-15 2023-08-17 Loop Commerce, Inc. Systems and methods for dynamic post-transaction orders and redemption
KR102436130B1 (en) * 2022-04-25 2022-08-26 주식회사 룰루랩 Method and apparatus for determining fitness for skin analysis image
US11638553B1 (en) * 2022-04-29 2023-05-02 Lululab Inc. Skin condition analyzing and skin disease diagnosis device

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5893373A (en) * 1994-07-18 1999-04-13 Reynolds; Justine Bedell Method for application of cosmetics
US5924426A (en) * 1997-04-17 1999-07-20 Galazin; Norma Cosmetic personal color analysis method and kit using value scale, colors and charts
US6293284B1 (en) * 1999-07-07 2001-09-25 Division Of Conopco, Inc. Virtual makeover
US6697518B2 (en) * 2000-11-17 2004-02-24 Yale University Illumination based image synthesis
US6728407B1 (en) * 1999-05-17 2004-04-27 International Business Machines Corporation Method for automatically determining trackers along contour and storage medium storing program for implementing the same
US6750890B1 (en) * 1999-05-17 2004-06-15 Fuji Photo Film Co., Ltd. Method and device for displaying a history of image processing information
US20050033662A1 (en) * 2003-08-04 2005-02-10 James Buch Method for visualizing differing types of window coverings within a room setting
US7072815B1 (en) * 2002-08-06 2006-07-04 Xilinx, Inc. Relocation of components for post-placement optimization
US20060167959A1 (en) * 2003-02-25 2006-07-27 Koninklijke Philips Electronics N.V. Storing programs on disk for multiple-user retrieval
US20070188491A1 (en) * 2005-12-12 2007-08-16 Ensco, Inc. System and method for fast efficient contour shading of sampled data
US20080007564A1 (en) * 2004-11-01 2008-01-10 Koshi Tokunaga Image Processing Apparatus and Image Processing Method
US20080062192A1 (en) * 2006-09-13 2008-03-13 Adobe Systems Incorporated Color selection interface
US7502033B1 (en) * 2002-09-30 2009-03-10 Dale Axelrod Artists' color display system
US8085276B2 (en) * 2006-11-30 2011-12-27 Adobe Systems Incorporated Combined color harmony generation and artwork recoloring mechanism

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9019538D0 (en) * 1990-09-07 1990-10-24 Philips Electronic Associated Tracking a moving object
JP2001268594A (en) * 2000-03-15 2001-09-28 Infiniteface.Com Inc Client server system for three-dimensional beauty simulation
US6785421B1 (en) * 2000-05-22 2004-08-31 Eastman Kodak Company Analyzing images to determine if one or more sets of materials correspond to the analyzed images
AU7664301A (en) * 2000-06-27 2002-01-21 Ruth Gal Make-up and fashion accessory display and marketing system and method
US7079158B2 (en) * 2000-08-31 2006-07-18 Beautyriot.Com, Inc. Virtual makeover system and method
US6412658B1 (en) * 2001-06-01 2002-07-02 Imx Labs, Inc. Point-of-sale body powder dispensing system
US7437344B2 (en) * 2001-10-01 2008-10-14 L'oreal S.A. Use of artificial intelligence in providing beauty advice
US7324668B2 (en) * 2001-10-01 2008-01-29 L'oreal S.A. Feature extraction in beauty analysis
US7634103B2 (en) * 2001-10-01 2009-12-15 L'oreal S.A. Analysis using a three-dimensional facial image
US6845171B2 (en) * 2001-11-19 2005-01-18 Microsoft Corporation Automatic sketch generation
US7082211B2 (en) * 2002-05-31 2006-07-25 Eastman Kodak Company Method and system for enhancing portrait images
US20050018216A1 (en) * 2003-07-22 2005-01-27 International Business Machines Corporation Apparatus and method to advertise to the consumer based off a digital image
US7412105B2 (en) * 2003-10-03 2008-08-12 Adobe Systems Incorporated Tone selective adjustment of images
US20070019882A1 (en) * 2004-01-30 2007-01-25 Shoji Tanaka Makeup simulation program, makeup simulation device, and makeup simulation method
US7872654B2 (en) * 2004-11-15 2011-01-18 Dreamworks Animation Llc Animating hair using pose controllers
US20060179453A1 (en) * 2005-02-07 2006-08-10 Microsoft Corporation Image and other analysis for contextual ads
US20060197775A1 (en) * 2005-03-07 2006-09-07 Michael Neal Virtual monitor system having lab-quality color accuracy
US7418371B2 (en) * 2005-03-30 2008-08-26 Seoul National University Industry Foundation Method and system for graphical hairstyle generation using statistical wisp model and pseudophysical approaches
US7612794B2 (en) * 2005-05-25 2009-11-03 Microsoft Corp. System and method for applying digital make-up in video conferencing
US8031206B2 (en) * 2005-10-12 2011-10-04 Noregin Assets N.V., L.L.C. Method and system for generating pyramid fisheye lens detail-in-context presentations
US20070174085A1 (en) * 2006-01-26 2007-07-26 Koo Charles C System and method for ordered recommendation of healthcare or personal care products
US7634108B2 (en) * 2006-02-14 2009-12-15 Microsoft Corp. Automated face enhancement
US8660319B2 (en) * 2006-05-05 2014-02-25 Parham Aarabi Method, system and computer program product for automatic and semi-automatic modification of digital images of faces
WO2007140609A1 (en) * 2006-06-06 2007-12-13 Moreideas Inc. Method and system for image and video analysis, enhancement and display for communication
US8077931B1 (en) * 2006-07-14 2011-12-13 Chatman Andrew S Method and apparatus for determining facial characteristics
US7733346B2 (en) * 2006-07-28 2010-06-08 Sony Corporation FACS solving in motion capture
US20080218532A1 (en) * 2007-03-08 2008-09-11 Microsoft Corporation Canvas-like authoring experience atop a layout engine
US20080319844A1 (en) * 2007-06-22 2008-12-25 Microsoft Corporation Image Advertising System
US8433611B2 (en) * 2007-06-27 2013-04-30 Google Inc. Selection of advertisements for placement with content
US8437514B2 (en) * 2007-10-02 2013-05-07 Microsoft Corporation Cartoon face generation
US7958066B2 (en) * 2007-11-02 2011-06-07 Hunch Inc. Interactive machine learning advice facility

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5893373A (en) * 1994-07-18 1999-04-13 Reynolds; Justine Bedell Method for application of cosmetics
US5924426A (en) * 1997-04-17 1999-07-20 Galazin; Norma Cosmetic personal color analysis method and kit using value scale, colors and charts
US6000407A (en) * 1997-04-17 1999-12-14 Galazin; Norma Cosmetic personal color analysis method and kit using value scale, colors, seasonal color designation, and charts
US6728407B1 (en) * 1999-05-17 2004-04-27 International Business Machines Corporation Method for automatically determining trackers along contour and storage medium storing program for implementing the same
US6750890B1 (en) * 1999-05-17 2004-06-15 Fuji Photo Film Co., Ltd. Method and device for displaying a history of image processing information
US6293284B1 (en) * 1999-07-07 2001-09-25 Division Of Conopco, Inc. Virtual makeover
US6697518B2 (en) * 2000-11-17 2004-02-24 Yale University Illumination based image synthesis
US7072815B1 (en) * 2002-08-06 2006-07-04 Xilinx, Inc. Relocation of components for post-placement optimization
US7502033B1 (en) * 2002-09-30 2009-03-10 Dale Axelrod Artists' color display system
US20060167959A1 (en) * 2003-02-25 2006-07-27 Koninklijke Philips Electronics N.V. Storing programs on disk for multiple-user retrieval
US20050033662A1 (en) * 2003-08-04 2005-02-10 James Buch Method for visualizing differing types of window coverings within a room setting
US20080007564A1 (en) * 2004-11-01 2008-01-10 Koshi Tokunaga Image Processing Apparatus and Image Processing Method
US20070188491A1 (en) * 2005-12-12 2007-08-16 Ensco, Inc. System and method for fast efficient contour shading of sampled data
US20080062192A1 (en) * 2006-09-13 2008-03-13 Adobe Systems Incorporated Color selection interface
US8085276B2 (en) * 2006-11-30 2011-12-27 Adobe Systems Incorporated Combined color harmony generation and artwork recoloring mechanism

Cited By (172)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10783528B2 (en) * 2000-08-24 2020-09-22 Facecake Marketing Technologies, Inc. Targeted marketing system and method
US20120221418A1 (en) * 2000-08-24 2012-08-30 Linda Smith Targeted Marketing System and Method
USRE46178E1 (en) 2000-11-10 2016-10-11 The Nielsen Company (Us), Llc Method and apparatus for evolutionary design
US11147357B2 (en) 2005-08-12 2021-10-19 Tcms Transparent Beauty, Llc System and method for applying a reflectance modifying agent to improve the visual attractiveness of human skin
US11445802B2 (en) 2005-08-12 2022-09-20 Tcms Transparent Beauty, Llc System and method for applying a reflectance modifying agent to improve the visual attractiveness of human skin
US10016046B2 (en) 2005-08-12 2018-07-10 Tcms Transparent Beauty, Llc System and method for applying a reflectance modifying agent to improve the visual attractiveness of human skin
US10043292B2 (en) * 2006-08-14 2018-08-07 Tcms Transparent Beauty Llc System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image
US20170004635A1 (en) * 2006-08-14 2017-01-05 Albert D. Edgar System and Method for Applying a Reflectance Modifying Agent to Change a Person's Appearance Based on a Digital Image
US10467779B2 (en) 2007-02-12 2019-11-05 Tcms Transparent Beauty Llc System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image
US10163230B2 (en) 2007-02-12 2018-12-25 Tcms Transparent Beauty Llc System and method for applying a reflectance modifying agent to change a person's appearance based on a digital image
US10486174B2 (en) 2007-02-12 2019-11-26 Tcms Transparent Beauty Llc System and method for applying a reflectance modifying agent electrostatically to improve the visual attractiveness of human skin
US10092082B2 (en) 2007-05-29 2018-10-09 Tcms Transparent Beauty Llc Apparatus and method for the precision application of cosmetics
US8597667B2 (en) 2008-05-09 2013-12-03 Elc Management Llc Targeted and individualized cosmetic delivery
US20090280150A1 (en) * 2008-05-09 2009-11-12 Tamar Lara Kamen Targeted And Individualized Cosmetic Delivery
US8358348B2 (en) 2008-05-09 2013-01-22 Elc Management Llc Method and system for automatic or manual evaluation to provide targeted and individualized delivery of cosmetic actives in a mask or patch form
US20110123703A1 (en) * 2008-05-09 2011-05-26 Fatemeh Mohammadi Method and System For Automatic or Manual Evaluation to Provide Targeted and Individualized Delivery of Cosmetic Actives in a Mask or Patch Form
US8491926B2 (en) 2008-09-16 2013-07-23 Elc Management Llc Method and system for automatic or manual evaluation to provide targeted and individualized delivery of cosmetic actives in a mask or patch form
US8425477B2 (en) 2008-09-16 2013-04-23 Elc Management Llc Method and system for providing targeted and individualized delivery of cosmetic actives
US20100068247A1 (en) * 2008-09-16 2010-03-18 Tsung-Wei Robert Mou Method And System For Providing Targeted And Individualized Delivery Of Cosmetic Actives
US20100235152A1 (en) * 2009-03-11 2010-09-16 Kimura Mitsunori Interactive contact lens simulation system and method
US20110164787A1 (en) * 2009-07-13 2011-07-07 Pierre Legagneur Method and system for applying cosmetic and/or accessorial enhancements to digital images
US8498456B2 (en) * 2009-07-13 2013-07-30 Stylecaster, Inc. Method and system for applying cosmetic and/or accessorial enhancements to digital images
US8495518B2 (en) * 2009-11-09 2013-07-23 International Business Machines Corporation Contextual abnormality CAPTCHAs
US20110113378A1 (en) * 2009-11-09 2011-05-12 International Business Machines Corporation Contextual abnormality captchas
US9501216B2 (en) * 2010-02-11 2016-11-22 Samsung Electronics Co., Ltd. Method and system for displaying a list of items in a side view form and as a single three-dimensional object in a top view form in a mobile device
US20110197164A1 (en) * 2010-02-11 2011-08-11 Samsung Electronics Co. Ltd. Method and system for displaying screen in a mobile device
US10268367B2 (en) 2010-02-19 2019-04-23 Microsoft Technology Licensing, Llc Radial menus with bezel gestures
US8421769B2 (en) * 2010-10-27 2013-04-16 Hon Hai Precision Industry Co., Ltd. Electronic cosmetic case with 3D function
US20120105336A1 (en) * 2010-10-27 2012-05-03 Hon Hai Precision Industry Co., Ltd. Electronic cosmetic case with 3d function
US20130271485A1 (en) * 2010-10-29 2013-10-17 Omron Corporation Image-processing device, image-processing method, and control program
US9208132B2 (en) 2011-03-08 2015-12-08 The Nielsen Company (Us), Llc System and method for concept development with content aware text editor
US9218614B2 (en) 2011-03-08 2015-12-22 The Nielsen Company (Us), Llc System and method for concept development
WO2012122419A1 (en) * 2011-03-08 2012-09-13 Affinnova, Inc. System and method for concept development
WO2012122428A1 (en) * 2011-03-08 2012-09-13 Affinnova, Inc. System and method for concept development
WO2012122430A1 (en) * 2011-03-08 2012-09-13 Affinnova, Inc. System and method for concept development
WO2012122424A1 (en) * 2011-03-08 2012-09-13 Affinnova, Inc. System and method for concept development
US9111298B2 (en) 2011-03-08 2015-08-18 Affinnova, Inc. System and method for concept development
WO2012122431A1 (en) * 2011-03-08 2012-09-13 Affinnova, Inc. System and method for concept development
US9262776B2 (en) 2011-03-08 2016-02-16 The Nielsen Company (Us), Llc System and method for concept development
US9208515B2 (en) 2011-03-08 2015-12-08 Affinnova, Inc. System and method for concept development
US8868446B2 (en) 2011-03-08 2014-10-21 Affinnova, Inc. System and method for concept development
US10354263B2 (en) 2011-04-07 2019-07-16 The Nielsen Company (Us), Llc Methods and apparatus to model consumer choice sourcing
US11842358B2 (en) 2011-04-07 2023-12-12 Nielsen Consumer Llc Methods and apparatus to model consumer choice sourcing
US11037179B2 (en) 2011-04-07 2021-06-15 Nielsen Consumer Llc Methods and apparatus to model consumer choice sourcing
US20120306991A1 (en) * 2011-06-06 2012-12-06 Cisco Technology, Inc. Diminishing an Appearance of a Double Chin in Video Communications
US8687039B2 (en) * 2011-06-06 2014-04-01 Cisco Technology, Inc. Diminishing an appearance of a double chin in video communications
US20130019208A1 (en) * 2011-07-14 2013-01-17 Microsoft Corporation Managing content color through context based color menu
WO2013062718A1 (en) * 2011-10-28 2013-05-02 Apple Inc. On-screen image adjustments
US9424799B2 (en) 2011-10-28 2016-08-23 Apple Inc. On-screen image adjustments
US20130111337A1 (en) * 2011-11-02 2013-05-02 Arcsoft Inc. One-click makeover
US9311383B1 (en) 2012-01-13 2016-04-12 The Nielsen Company (Us), Llc Optimal solution identification system and method
US20140040789A1 (en) * 2012-05-08 2014-02-06 Adobe Systems Incorporated Tool configuration history in a user interface
US10496259B2 (en) 2012-05-09 2019-12-03 Apple Inc. Context-specific user interfaces
US11740776B2 (en) 2012-05-09 2023-08-29 Apple Inc. Context-specific user interfaces
US9547425B2 (en) 2012-05-09 2017-01-17 Apple Inc. Context-specific user interfaces
US10990270B2 (en) 2012-05-09 2021-04-27 Apple Inc. Context-specific user interfaces
US9582165B2 (en) 2012-05-09 2017-02-28 Apple Inc. Context-specific user interfaces
US10613743B2 (en) 2012-05-09 2020-04-07 Apple Inc. User interface for receiving user input
US9459781B2 (en) 2012-05-09 2016-10-04 Apple Inc. Context-specific user interfaces for displaying animated sequences
US9804759B2 (en) 2012-05-09 2017-10-31 Apple Inc. Context-specific user interfaces
US10606458B2 (en) 2012-05-09 2020-03-31 Apple Inc. Clock face generation based on contact on an affordance in a clock face selection mode
US10613745B2 (en) 2012-05-09 2020-04-07 Apple Inc. User interface for receiving user input
US8910082B2 (en) * 2012-08-10 2014-12-09 Modiface Inc. Method and system for modification of digital images through rotational cascading-effect interface
US20140047389A1 (en) * 2012-08-10 2014-02-13 Parham Aarabi Method and system for modification of digital images through rotational cascading-effect interface
US20140111539A1 (en) * 2012-10-22 2014-04-24 FiftyThree, Inc. Methods and apparatus for providing color palette management within a graphical user interface
US9563972B2 (en) * 2012-10-22 2017-02-07 FiftyThree, Inc. Methods and apparatus for providing color palette management within a graphical user interface
CN103885702A (en) * 2012-12-20 2014-06-25 宏达国际电子股份有限公司 Menu Management Methods And Systems
US9467436B2 (en) * 2013-01-04 2016-10-11 Gary Stephen Shuster Captcha systems and methods
US9860247B2 (en) * 2013-01-04 2018-01-02 Gary Stephen Shuster CAPTCHA systems and methods
US11574354B2 (en) 2013-03-15 2023-02-07 Nielsen Consumer Llc Methods and apparatus for interactive evolutionary algorithms with respondent directed breeding
US10839445B2 (en) 2013-03-15 2020-11-17 The Nielsen Company (Us), Llc Method and apparatus for interactive evolutionary algorithms with respondent directed breeding
US9785995B2 (en) 2013-03-15 2017-10-10 The Nielsen Company (Us), Llc Method and apparatus for interactive evolutionary algorithms with respondent directed breeding
US9799041B2 (en) 2013-03-15 2017-10-24 The Nielsen Company (Us), Llc Method and apparatus for interactive evolutionary optimization of concepts
US11195223B2 (en) 2013-03-15 2021-12-07 Nielsen Consumer Llc Methods and apparatus for interactive evolutionary algorithms with respondent directed breeding
US20150145882A1 (en) * 2013-04-08 2015-05-28 Panasonic Intellectual Property Corporation Of America Image processing device, image processing method, and program, capable of virtual reproduction of makeup application state
US9603437B2 (en) * 2013-04-08 2017-03-28 Panasonic Intellectual Property Corporation Of America Image processing device, image processing method, and program, capable of virtual reproduction of makeup application state
EP2985732A4 (en) * 2013-04-08 2016-04-13 Panasonic Ip Corp America Image processing device, image processing method, and program, capable of virtual reproduction of makeup application state
CN104380339A (en) * 2013-04-08 2015-02-25 松下电器(美国)知识产权公司 Image processing device, image processing method, and program, capable of virtual reproduction of makeup application state
EP3017415A4 (en) * 2013-07-03 2017-01-18 Glasses.Com Inc. Systems and methods for recommending products via crowdsourcing and detecting user characteristics
EP3022639B1 (en) * 2013-07-16 2018-10-31 Pinterest, Inc. Object based contextual menu controls
US10152199B2 (en) 2013-07-16 2018-12-11 Pinterest, Inc. Object based contextual menu controls
GB2518589A (en) * 2013-07-30 2015-04-01 Holition Ltd Image processing
GB2518589B (en) * 2013-07-30 2019-12-11 Holition Ltd Image processing
USD754182S1 (en) * 2013-12-20 2016-04-19 Teenage Engineering Ab Display screen or portion thereof with graphical user interface
CN104866165A (en) * 2014-02-21 2015-08-26 联想(北京)有限公司 Information processing method and electronic equipment
US11250385B2 (en) 2014-06-27 2022-02-15 Apple Inc. Reduced size user interface
US11720861B2 (en) 2014-06-27 2023-08-08 Apple Inc. Reduced size user interface
US10872318B2 (en) 2014-06-27 2020-12-22 Apple Inc. Reduced size user interface
US11604571B2 (en) 2014-07-21 2023-03-14 Apple Inc. Remote user interface
US10452253B2 (en) 2014-08-15 2019-10-22 Apple Inc. Weather user interface
US11550465B2 (en) 2014-08-15 2023-01-10 Apple Inc. Weather user interface
US11042281B2 (en) 2014-08-15 2021-06-22 Apple Inc. Weather user interface
US11922004B2 (en) 2014-08-15 2024-03-05 Apple Inc. Weather user interface
AU2015101183B4 (en) * 2014-09-02 2015-11-19 Apple Inc. User interface for receiving user input
US10254948B2 (en) 2014-09-02 2019-04-09 Apple Inc. Reduced-size user interfaces for dynamically updated application overviews
US10771606B2 (en) 2014-09-02 2020-09-08 Apple Inc. Phone user interface
US11700326B2 (en) 2014-09-02 2023-07-11 Apple Inc. Phone user interface
WO2016054164A1 (en) * 2014-09-30 2016-04-07 Tcms Transparent Beauty, Llc Precise application of cosmetic looks from over a network environment
US10553006B2 (en) 2014-09-30 2020-02-04 Tcms Transparent Beauty, Llc Precise application of cosmetic looks from over a network environment
US10602830B2 (en) * 2014-12-02 2020-03-31 L'oreal Dispensing system and method for learning to use such a dispensing system
US11076680B2 (en) 2014-12-02 2021-08-03 L'oreal System for dispensing a makeup product
US10849406B2 (en) 2014-12-02 2020-12-01 L'oreal System for dispensing at least one makeup product and method for dispensing and evaluating makeup
US11225373B2 (en) 2014-12-02 2022-01-18 L'oreal Assembly comprising an airbrush
US10925377B2 (en) 2014-12-02 2021-02-23 L'oreal Dispensing system having at least two outlet interfaces
US20180042361A1 (en) * 2014-12-02 2018-02-15 L'oreal Dispensing system and method for learning to use such a dispensing system
US10055121B2 (en) 2015-03-07 2018-08-21 Apple Inc. Activity based thresholds and feedbacks
US10409483B2 (en) 2015-03-07 2019-09-10 Apple Inc. Activity based thresholds for providing haptic feedback
US11657417B2 (en) 2015-04-02 2023-05-23 Nielsen Consumer Llc Methods and apparatus to identify affinity between segment attributes and product characteristics
US10572132B2 (en) 2015-06-05 2020-02-25 Apple Inc. Formatting content for a reduced-size user interface
US9916075B2 (en) 2015-06-05 2018-03-13 Apple Inc. Formatting content for a reduced-size user interface
CN108292423A (en) * 2015-12-25 2018-07-17 松下知识产权经营株式会社 Local dressing producing device, local dressing utilize program using device, local dressing production method, local dressing using method, local dressing production process and local dressing
US20180268572A1 (en) * 2015-12-25 2018-09-20 Panasonic Intellectual Property Management Co., Ltd. Makeup part generating apparatus, makeup part utilizing apparatus, makeup part generating method, makeup part utilizing method, non-transitory computer-readable recording medium storing makeup part generating program, and non-transitory computer-readable recording medium storing makeup part utilizing program
US10783672B2 (en) * 2015-12-25 2020-09-22 Panasonic Intellectual Property Management Co., Ltd. Makeup part generating apparatus, makeup part utilizing apparatus, makeup part generating method, makeup part utilizing method, non-transitory computer-readable recording medium storing makeup part generating program, and non-transitory computer-readable recording medium storing makeup part utilizing program
US11660503B2 (en) 2016-06-11 2023-05-30 Apple Inc. Activity and workout updates
US10272294B2 (en) 2016-06-11 2019-04-30 Apple Inc. Activity and workout updates
US11918857B2 (en) 2016-06-11 2024-03-05 Apple Inc. Activity and workout updates
US11161010B2 (en) 2016-06-11 2021-11-02 Apple Inc. Activity and workout updates
US11148007B2 (en) 2016-06-11 2021-10-19 Apple Inc. Activity and workout updates
US11733851B2 (en) * 2016-06-29 2023-08-22 Dassault Systemes Generation of a color of an object displayed on a GUI
US20220269402A1 (en) * 2016-06-29 2022-08-25 Dassault Systemes Generation of a color of an object displayed on a gui
US11501479B2 (en) * 2016-10-14 2022-11-15 Panasonic Intellectual Property Management Co., Ltd. Virtual make-up apparatus and virtual make-up method
US11069105B2 (en) * 2016-10-14 2021-07-20 Panasonic Intellectual Property Management Co., Ltd. Virtual make-up apparatus and virtual make-up method
CN109844800A (en) * 2016-10-14 2019-06-04 松下知识产权经营株式会社 Virtual cosmetic device and virtual cosmetic method
US11087388B1 (en) 2016-10-31 2021-08-10 Swimc Llc Product-focused search method and apparatus
US10901576B1 (en) 2016-11-01 2021-01-26 Swimc Llc Color selection and display
US10739988B2 (en) * 2016-11-04 2020-08-11 Microsoft Technology Licensing, Llc Personalized persistent collection of customized inking tools
US20180129366A1 (en) * 2016-11-04 2018-05-10 Microsoft Technology Licensing, Llc Personalized persistent collection of customized inking tools
US20180129367A1 (en) * 2016-11-04 2018-05-10 Microsoft Technology Licensing, Llc Action-enabled inking tools
US10871880B2 (en) * 2016-11-04 2020-12-22 Microsoft Technology Licensing, Llc Action-enabled inking tools
USD823320S1 (en) * 2017-05-24 2018-07-17 Koninklijke Philips N.V. Display screen with graphical user interface
USD844013S1 (en) * 2017-05-24 2019-03-26 Koninklijke Philips N.V. Display screen with animated graphical user interface
US11960713B2 (en) * 2017-06-14 2024-04-16 Behr Process Corporation Systems and methods for assisting with color selection
US20220057924A1 (en) * 2017-06-14 2022-02-24 Behr Process Corporation Systems And Methods For Assisting With Color Selection
US11650729B2 (en) * 2017-06-14 2023-05-16 Behr Process Corporation Systems and methods for assisting with color selection
US11169682B2 (en) * 2017-06-14 2021-11-09 Behr Process Corporation Systems and methods for assisting with color selection
US20230280898A1 (en) * 2017-06-14 2023-09-07 Behr Process Corporation Systems And Methods For Assisting With Color Selection
US10824317B2 (en) * 2017-06-14 2020-11-03 Behr Process Corporation Systems and methods for assisting with color selection
US10809884B2 (en) 2017-11-06 2020-10-20 The Sherwin-Williams Company Paint color selection and display system and method
US20210318796A1 (en) * 2018-08-17 2021-10-14 Matrix Analytics Corporation System and Method for Fabricating Decorative Surfaces
US20210166067A1 (en) * 2018-09-21 2021-06-03 Fujifilm Corporation Image suggestion apparatus, image suggestion method, and image suggestion program
US11599739B2 (en) * 2018-09-21 2023-03-07 Fujifilm Corporation Image suggestion apparatus, image suggestion method, and image suggestion program
US11341685B2 (en) * 2019-05-03 2022-05-24 NipLips, LLC Color-matching body part to lip product
KR20220035260A (en) * 2019-07-31 2022-03-21 로레알 Improved color wheel interface
WO2021021442A1 (en) * 2019-07-31 2021-02-04 L'oreal Improved color wheel interface
JP7145358B2 (en) 2019-07-31 2022-09-30 ロレアル Improved color wheel interface
US10977836B2 (en) * 2019-07-31 2021-04-13 L'oreal Color wheel interface
KR102462620B1 (en) * 2019-07-31 2022-11-03 로레알 Improved color wheel interface
JP2022535155A (en) * 2019-07-31 2022-08-04 ロレアル Improved color wheel interface
CN114424156A (en) * 2019-07-31 2022-04-29 莱雅公司 Improved color wheel interface
USD940731S1 (en) * 2019-10-31 2022-01-11 Eli Lilly And Company Display screen with a graphical user interface
US11875428B2 (en) * 2020-01-31 2024-01-16 L'oreal System and method of lipstick bulktone and application evaluation
US20210241501A1 (en) * 2020-01-31 2021-08-05 L'oreal System and method of lipstick bulktone and application evaluation
USD924916S1 (en) 2020-02-12 2021-07-13 SpotLogic, Inc. Computer display panel with a meeting planning graphical user interface for an application that optimizes interpersonal interaction
USD925595S1 (en) * 2020-02-12 2021-07-20 SpotLogic, Inc. Computer display panel with a graphical user interface for an application that optimizes interpersonal interaction
USD923033S1 (en) 2020-02-12 2021-06-22 SpotLogic, Inc. Computer display panel with a home screen graphical user interface for an application that optimizes interpersonal interaction
USD933692S1 (en) 2020-02-12 2021-10-19 SpotLogic, Inc. Computer display panel with a meeting objective editing graphical user interface for an application that optimizes interpersonal interaction
USD932507S1 (en) 2020-02-12 2021-10-05 SpotLogic, Inc. Computer display panel with a meeting objective editing graphical user interface for an application that optimizes interpersonal interaction
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
US11849829B2 (en) * 2021-01-29 2023-12-26 L'oreal Remote beauty consultation system
US20220240650A1 (en) * 2021-01-29 2022-08-04 L'oreal Remote beauty consultation system
US20220326837A1 (en) * 2021-04-13 2022-10-13 Apple Inc. Methods for providing an immersive experience in an environment
CN113126846A (en) * 2021-04-27 2021-07-16 广州市妇女儿童医疗中心 Intelligent medicine information processing method and device, computer equipment and storage medium
US11714536B2 (en) 2021-05-21 2023-08-01 Apple Inc. Avatar sticker editor user interfaces
US11670059B2 (en) 2021-09-01 2023-06-06 Snap Inc. Controlling interactive fashion based on body gestures
US11673054B2 (en) 2021-09-07 2023-06-13 Snap Inc. Controlling AR games on fashion items
US11900506B2 (en) * 2021-09-09 2024-02-13 Snap Inc. Controlling interactive fashion based on facial expressions
US11734866B2 (en) 2021-09-13 2023-08-22 Snap Inc. Controlling interactive fashion based on voice
EP4160540A1 (en) * 2021-09-29 2023-04-05 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for producing special effect, electronic device and storage medium
US11636662B2 (en) 2021-09-30 2023-04-25 Snap Inc. Body normal network light and rendering control
US11651572B2 (en) 2021-10-11 2023-05-16 Snap Inc. Light and rendering of garments
US11816144B2 (en) 2022-03-31 2023-11-14 Pinterest, Inc. Hair pattern determination and filtering
GB2619283A (en) * 2022-05-26 2023-12-06 Holition Ltd Simulating foundation makeup effect in augmented images

Also Published As

Publication number Publication date
US9058765B1 (en) 2015-06-16
US20090234716A1 (en) 2009-09-17

Similar Documents

Publication Publication Date Title
US20090231356A1 (en) Graphical user interface for selection of options from option groups and methods relating to same
US11854072B2 (en) Applying virtual makeup products
US11854070B2 (en) Generating virtual makeup products
US20230043249A1 (en) Avatar Editing Environment
US10534605B2 (en) Application system having a gaming engine that enables execution of a declarative language
US9817561B2 (en) Proposing visual display components for processing data
Bruckner et al. Result-driven exploration of simulation parameter spaces for visual effects design
US8306286B1 (en) Method and apparatus for determining facial characteristics
KR102294134B1 (en) Authoring tools for synthesizing hybrid slide-canvas presentations
US9024952B2 (en) Discovering and configuring representations of data via an insight taxonomy
US20160260237A1 (en) Extensions for modifying a graphical object to display data
US11087503B2 (en) Interactive color palette interface for digital painting
US20090278848A1 (en) Drawing familiar graphs while system determines suitable form
CN107066465A (en) Information presentation system
US20210304453A1 (en) Augmented reality experiences for physical products in a messaging system
US10019143B1 (en) Determining a principal image from user interaction
US20100332485A1 (en) Ordering of data items
CN111798559A (en) Selection device and method of virtual image characteristics
CN109782975A (en) A kind of manicure device image processing method, system, nail art device and medium
CN110231903B (en) Parameter adjusting method and device
CN107404427A (en) One kind chat background display method and device
US20240144626A1 (en) Avatar editing environment
Guevarra Modeling and Animation Using Blender
Guevarra et al. Blending with Blender: The Shading Workspace
Van de Broek et al. Perspective Chapter: Evolution of User Interface and User Experience in Mobile Augmented and Virtual Reality Applications

Legal Events

Date Code Title Description
AS Assignment

Owner name: PHOTOMETRIA, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BARNES, KEVIN;MALLICK, SATYA;REEL/FRAME:022711/0696

Effective date: 20090316

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: TAAZ, INC., CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:PHOTOMETRIA, INC.;REEL/FRAME:033917/0911

Effective date: 20101005