WO2010045593A2 - A system and method for content customization based on emotional state of the user - Google Patents
- Publication number
- WO2010045593A2 (PCT/US2009/061062)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- content
- engine
- profile
- submitted
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
Abstract
A new approach is proposed that contemplates systems and methods to present a script of content comprising one or more content items to a user online, wherein such content is not only relevant to addressing a problem submitted by the user, but is also customized and tailored to the specific needs and preferences of the user based on the user's profile and/or emotional state at the time. Such an approach enables a personal "agent" that understands the user's emotional state, specific needs and interests by maintaining a personal profile and history of the user. Based on in-depth personal knowledge and understanding, the agent is capable of identifying, retrieving, customizing, and presenting to the user a unique experience that distinguishes it from the experiences of any other users in the general public.
Description
A SYSTEM AND METHOD FOR CONTENT CUSTOMIZATION BASED ON EMOTIONAL
STATE OF THE USER
BACKGROUND

With the growing volume of content available over the Internet, people are increasingly seeking answers to their questions or problems online. Due to the overwhelming amount of information available online, however, it is often difficult for a lay person to browse the Web and find the content that actually addresses his/her problem. Even when the user is able to find content relevant to his/her problem, such content is most likely of the "one size fits all" type that addresses concerns of the general public but does not target the specific needs of the user as an individual.

Although some online vendors do keep track of the web surfing and/or purchasing history or tendencies of a user online for the purpose of recommending services and products based on such information, such an online footprint of the user is only passively gathered or monitored and often does not truly reflect the user's real intention or interest. For a non-limiting example, the fact that a person purchased certain goods as gifts for his/her friend(s) is not indicative of his/her own interest in such goods. Furthermore, under certain circumstances, the content that the user is looking for may depend heavily upon the user's emotional state (mood) at the time the problem is submitted. For a non-limiting example, the user may be looking for totally different things, depending upon whether he/she is in a happy or sad mood, when he/she asks for "music that feels good."
The foregoing examples of the related art and limitations related therewith are intended to be illustrative and not exclusive. Other limitations of the related art will become apparent upon a reading of the specification and a study of the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 depicts an example of a system diagram to support content customization based on user profile.
FIG. 2 illustrates an example of the various information that may be included in a user profile.
FIG. 3 illustrates an example of a three-dimensional emotion circumplex model, which illustrates relationships within and between primary emotions.
FIG. 4 depicts a flowchart of an example of a process to establish the user's profile and/or assess his/her emotional state.
FIG. 5 illustrates an example of various types of content items in a script of content and the potential elements in each of them.
FIG. 6 depicts a flowchart of an example of a process to support content customization based on user profile.
DETAILED DESCRIPTION OF EMBODIMENTS
The approach is illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements. It should be noted that references to "an" or "one" or "some" embodiment(s) in this disclosure are not necessarily to the same embodiment, and such references mean at least one.

A new approach is proposed that contemplates systems and methods to present a script of content (also known as a user experience, referred to hereinafter as "content") comprising one or more content items to a user online, wherein such content is not only relevant to addressing a problem submitted by the user, but is also customized and tailored to the specific needs and preferences of the user based on the user's profile and/or emotional state at the time. Such an approach enables a personal "agent" that understands the user's emotional state, specific needs, and interests by maintaining a personal profile of the user. Such a profile is more than a simple tracking of the user's activities online; it further includes feedback and answers provided by the user him/herself to prior engagements and/or "interview" questions by the agent. Based on such in-depth personal knowledge and understanding, the agent is capable of identifying, retrieving, customizing, and presenting to the user content that specifically addresses his/her problem or concern. With such an approach, a user can efficiently and accurately find what he/she is looking for and have a unique experience that distinguishes it from the experiences of any other person in the general public, while vendors in various market segments, including but not limited to online advertising, computer games, leadership/management training, and adult education, can better provide their customers with content that is tailored to meet each individual client's personal and emotional needs.
FIG. 1 depicts an example of a system diagram to support content customization based on user's profile and emotional state at the time. Although the diagrams depict
components as functionally separate, such depiction is merely for illustrative purposes. It will be apparent that the components portrayed in this figure can be arbitrarily combined or divided into separate software, firmware and/or hardware components. Furthermore, it will also be apparent that such components, regardless of how they are combined or divided, can execute on the same host or multiple hosts, and wherein the multiple hosts can be connected by one or more networks.
In the example of FIG. 1 , the system 100 includes a user interaction engine 102, which includes at least a user interface 104, a display component 106, and a communication interface 108; a profile engine 110, which includes at least a communication interface 112 and a profiling component 114; a profile library (database) 116 coupled to the profile engine 110; a content engine 118, which includes at least a communication interface 120, a content retrieval component 122, and a customization component 124; a script template library (database) 126 and a content library (database) 128, both coupled to the content engine 118; and a network 130. As used herein, the term engine refers to software, firmware, hardware, or other component that is used to effectuate a purpose. The engine will typically include software instructions that are stored in non-volatile memory (also referred to as secondary memory). When the software instructions are executed, at least a subset of the software instructions is loaded into memory (also referred to as primary memory) by a processor. The processor then executes the software instructions in memory. The processor may be a shared processor, a dedicated processor, or a combination of shared or dedicated processors. A typical program will include calls to hardware components (such as I/O devices), which typically requires the execution of drivers. The drivers may or may not be considered part of the engine, but the distinction is not critical.
As used herein, the term library or database is used broadly to include any known or convenient means for storing data, whether centralized or distributed, relational or otherwise.
In the example of FIG. 1 , each of the engines and libraries can run on one or more hosting devices (hosts). Here, a host can be a computing device, a communication device, a storage device, or any electronic device capable of running a software component. For non-limiting examples, a computing device can be but is not limited to a laptop PC, a desktop PC, a tablet PC, an iPod, a PDA, or a server machine. A storage device can be but is not limited to a hard disk drive, a flash memory drive, or any
portable storage device. A communication device can be but is not limited to a mobile phone.
In the example of FIG. 1, the communication interfaces 108, 112, and 120 are software components that enable the user interaction engine 102, the profile engine 110, and the content engine 118 to communicate with each other following certain communication protocols, such as the TCP/IP protocol. The communication protocols between two devices are well known to those of skill in the art. In the example of FIG. 1, the network 130 enables the user interaction engine 102, the profile engine 110, and the content engine 118 to communicate and interact with each other. Here, the network 130 can be a communication network based on certain communication protocols, such as the TCP/IP protocol. Such a network can be, but is not limited to, the Internet, an intranet, a wide area network (WAN), a local area network (LAN), a wireless network, Bluetooth, WiFi, or a mobile communication network. The physical connections of the network and the communication protocols are well known to those of skill in the art.
In the example of FIG. 1, the user interaction engine 102 is configured to enable a user to submit or raise a problem for which the user intends to seek help or counseling via the user interface 104 and to present to the user a script of content relevant to addressing the problem submitted by the user via the display component 106. Here, the problem (or question, interest, issue, event, condition, or concern, hereinafter referred to as a problem) of the user provides the context for the content that is to be presented to him/her. The problem can be related to one or more of the personal, emotional, spiritual, relational, physical, practical, or any other needs of the user. In some embodiments, the user interface 104 can be a Web-based browser, which allows the user to access the system 100 remotely via the network 130.
In some embodiments, the user interaction engine 102 presents a pre-determined list of problems that could possibly be submitted by the user in the form of a list, such as a pull down menu, and the user may submit his/her problem by simply picking and choosing a problem in the menu. Such menu can be organized by various categories or topics in more than one level. By organizing and standardizing the potential problems from the user, the menu not only saves the user's time and effort in submitting the problems, but also makes it easier to identify relevant script templates and/or content items for the problem submitted. In some embodiments, the user interaction engine 102 is configured to enable the user to provide feedback to the content presented to him/her via the user interface 104.
Here, such feedback can be, for non-limiting examples, ratings or rankings of the content, an indication of preference as to whether the user would like to see the same or similar content in the same category in the future, or any written comments or suggestions on the content that eventually drive the customization of the content. For non-limiting examples, a rating can be from 0-10, where 0 is worst and 10 is best, or 5 stars. A comment by a user can also indicate, for example, that he/she does not want to see content items such as poetry.
In the example of FIG. 1 , the profile engine 110 manages a profile of the user maintained in the profile library 116 via the profiling component 114 for the purpose of generating and customizing the content to be presented to the user. The user profile may contain at least the following areas of user information:
Administrative information includes account information such as name, region, email address, and payment options of the user.
Static profile contains information of the user that does not change over time, such as the user's gender and date of birth to calculate his/her age and for potential astrological consideration.
Dynamic profile contains information of the user that may change over time, such as parental status, marital status, relationship status, as well as current interests, hobbies, habits, and concerns of the user. In addition, the dynamic profile may also contain ADA-compliance information of the user, such as poor eyesight, hearing loss, etc., which reflects the user's present physical conditions.
Psycho-Spiritual Dimension describes the psychological, spiritual, and religious component of the user, such as the user's belief system (a religious, philosophical or intellectual tradition, e.g., Christian, Buddhist, Jewish, atheist, non-religious), degree of adherence (e.g., committed/devout, practicing, casual, no longer practicing, "openness" to alternatives) and influences (e.g., none, many, parents, mother, father, other relative, friend, spouse, spiritual leader/religious leader, self).
Community Profile contains information defining how the user interacts with the online community of experts and professionals (e.g., which of the experts he/she likes or dislikes in the community and which problems to which the user is willing to receive request for wisdom (RFW) and to provide his/her own input on the matter).
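The profile areas above can be sketched as a simple data structure. This is an illustrative model only, not part of the claimed system; all field names are assumptions chosen to mirror the five areas described.

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    # Administrative information
    name: str
    region: str
    email: str
    # Static profile (does not change over time)
    gender: str
    date_of_birth: str                              # used to derive age
    # Dynamic profile (may change over time)
    marital_status: str = "unknown"
    interests: list = field(default_factory=list)
    ada_needs: list = field(default_factory=list)   # e.g. "poor eyesight"
    # Psycho-spiritual dimension
    belief_system: str = "non-religious"
    adherence: int = 0                              # 0 (none) .. 10 (devout)
    # Community profile
    liked_experts: list = field(default_factory=list)

profile = UserProfile(name="Alice", region="US", email="a@example.com",
                      gender="F", date_of_birth="1980-05-01")
profile.interests.append("poetry")
```

A record like this would be persisted in the profile library and updated by the profiling component as interview answers and feedback arrive.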
FIG. 2 illustrates an example of the various information that may be included in a user profile.
In some embodiments, the profile engine 110 initiates one or more questions to the user via the user interaction engine 102 for the purpose of soliciting and gathering at
least part of the information listed above to establish the profile of the user. Here, such questions focus on the aspects of the user's life that are not available through other means. The questions initiated by the profile engine 110 may focus on the personal interests, the psycho-spiritual dimension, and the dynamic and community profiles of the user. For a non-limiting example, the questions may focus on the user's personal interests, which may not be truly obtained by simply observing the user's purchasing habits.
In some embodiments, the profile engine 110 updates the profile of the user via the profiling component 114 based on the prior history/record and dates of one or more of: problems that have been raised by the user; relevant content that has been presented to the user; script templates that have been used to generate and present the content to the user; feedback from the user to the content that has been presented to the user.
In some embodiments, the profile engine 110 assesses the emotional state of the user at the time when he/she submits the problem, before any content is generated, customized, and delivered to address the user's problem. Typically, the user's emotional state is not part of the problem he/she submitted unless the user submits "feelings" as a key problem to be addressed. The assessment of the user's emotional state, however, is especially important when the user's emotional state lies at positive or negative extremes, such as joy, rage, or terror, since it may substantially affect the answer or content that the user is looking for: the user would apparently look for different things for the same problem depending upon whether he/she is happy or sad. By assessing the user's emotional state prior to generating, customizing, and delivering the content to address the specific problem submitted by the user, the system is able to customize the content so that it not only addresses the problem submitted by the user based on the user's profile, but also reflects and meets the user's emotional need at the time, improving the effectiveness and utility of the content before it is delivered to the user. The table below shows examples of possible primary, secondary, and tertiary emotional states as summarized in Parrott, W. (2001), Emotions in Social Psychology, Psychology Press, Philadelphia.
| Primary emotion | Secondary emotion | Tertiary emotions |
|---|---|---|
| Love | Affection | Adoration, affection, love, fondness, liking, attraction, caring, tenderness, compassion, sentimentality |
| Love | Lust | Arousal, desire, lust, passion, infatuation |
| Love | Longing | Longing |
| Joy | Cheerfulness | Amusement, bliss, cheerfulness, gaiety, glee, jolliness, joviality, joy, delight, enjoyment, gladness, happiness, jubilation, elation, satisfaction, ecstasy, euphoria |
| Joy | Zest | Enthusiasm, zeal, zest, excitement, thrill, exhilaration |
| Joy | Contentment | Contentment, pleasure |
| Joy | Pride | Pride, triumph |
| Joy | Optimism | Eagerness, hope, optimism |
| Joy | Enthrallment | Enthrallment, rapture |
| Joy | Relief | Relief |
| Surprise | Surprise | Amazement, surprise, astonishment |
| Anger | Irritation | Aggravation, irritation, agitation, annoyance, grouchiness, grumpiness |
| Anger | Exasperation | Exasperation, frustration |
| Anger | Rage | Anger, rage, outrage, fury, wrath, hostility, ferocity, bitterness, hate, loathing, scorn, spite, vengefulness, dislike, resentment |
| Anger | Disgust | Disgust, revulsion, contempt |
| Anger | Envy | Envy, jealousy |
| Anger | Torment | Torment |
| Sadness | Suffering | Agony, suffering, hurt, anguish |
| Sadness | Sadness | Depression, despair, hopelessness, gloom, glumness, sadness, unhappiness, grief, sorrow, woe, misery, melancholy |
| Sadness | Disappointment | Dismay, disappointment, displeasure |
| Sadness | Shame | Guilt, shame, regret, remorse |
| Sadness | Neglect | Alienation, isolation, neglect, loneliness, rejection, homesickness, defeat, dejection, insecurity, embarrassment, humiliation, insult |
| Sadness | Sympathy | Pity, sympathy |
| Fear | Horror | Alarm, shock, fear, fright, horror, terror, panic, hysteria, mortification |
| Fear | Nervousness | Anxiety, nervousness, tenseness, uneasiness, apprehension, worry, distress, dread |
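For illustration, a condensed version of this hierarchy could be stored as a nested mapping so that a tertiary emotion word supplied by the user can be traced back to its primary emotion. This is a sketch under assumed names; only a few branches of the table are shown.

```python
# Condensed sketch of Parrott's (2001) hierarchy; only a few branches shown.
EMOTION_TREE = {
    "Love":    {"Affection": ["adoration", "fondness", "compassion"],
                "Lust": ["desire", "passion"]},
    "Joy":     {"Cheerfulness": ["delight", "happiness", "elation"],
                "Optimism": ["hope", "eagerness"]},
    "Sadness": {"Suffering": ["agony", "anguish"],
                "Neglect": ["loneliness", "rejection"]},
    "Fear":    {"Horror": ["terror", "panic"],
                "Nervousness": ["worry", "dread"]},
}

def primary_emotion(tertiary):
    """Map a tertiary emotion word back to its primary emotion, or None."""
    for primary, secondaries in EMOTION_TREE.items():
        for words in secondaries.values():
            if tertiary in words:
                return primary
    return None

print(primary_emotion("loneliness"))  # prints Sadness
```

A lookup like this would let the profile engine normalize a free-form emotional self-report into one of the primary emotional dimensions before content selection.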
In some embodiments, the profile engine 110 initiates one or more questions to the user via the user interaction engine 102 for the purpose of soliciting and gathering at least part of the information necessary to establish the profile of the user and/or to assess the user's emotional state. Here, such questions focus on the aspects of the user's life and his/her current emotional state that are not available through other means. The questions initiated by the profile engine 110 may focus on the personal interests and psycho-spiritual dimensions of the user's past profile as well as the present emotional well-being of the user. For a non-limiting example, the questions may focus on how the user is feeling right now and whether he/she is up or down for the moment, which may not be truly obtained by simply observing the user's past behavior or activities.
In some embodiments, the profile engine 110 presents a visual representation of emotions, such as a location-appropriate version of an unfolded emotion circumplex, to the user via the user interaction engine 102, and enables the user to select up to three of his/her active emotional states by clicking on the appropriate region on the circumplex. FIG. 3 illustrates an example of a three-dimensional emotion circumplex
model, which illustrates relationships within and between eight primary emotions much the way a color wheel illustrates relationships between colors. The vertical dimension of the cone 302 represents intensity, with different emotions of similar intensities sharing circular bands. The eight main segments 304 are designed to suggest eight primary emotional dimensions arranged as four pairs of opposites: anger, fear, sadness, disgust, surprise, curiosity, acceptance, and joy. In some embodiments, additional key emotions, such as lust, loneliness, and jealousy, can also be represented in the circumplex. In addition, the profile engine 110 can adjust or reverse the direction of certain emotional intensity so that some subtle emotions are in the center of the circumplex while the extremes are on the edges of the circumplex. For a non-limiting example, such reversal of emotional intensity would allow a "peace" emotion-state to be in the center of the circumplex, symbolizing the synonymous nature of "peace" and "centeredness."

FIG. 4 depicts a flowchart of an example of a process to establish the user's profile and/or assess his/her emotional state. Although this figure depicts functional steps in a particular order for purposes of illustration, the process is not limited to any particular order or arrangement of steps. One skilled in the relevant art will appreciate that the various steps portrayed in this figure could be omitted, rearranged, combined, and/or adapted in various ways. In the example of FIG. 4, the flowchart 400 starts at block 402, where the identity of the user submitting a problem for help or counseling is identified. If the user is a first-time visitor, the flowchart 400 continues to block 404, where the user is registered. The flowchart 400 then continues to block 406, where a set of interview questions is initiated to solicit information from the user for the purpose of establishing the user's profile and/or assessing his/her emotional state at the time.
The flowchart 400 continues to block 408 where the user is optionally presented with a visual representation of emotions and enabled to select up to three of his/her active emotional states. The flowchart 400 ends at block 410 where the profile and/or emotional state of the user is provided to the content engine 118 for the purpose of retrieving and customizing the content relevant to the problem.
In the example of FIG. 1 , the content engine 118 identifies and retrieves the content relevant to the problem submitted by the user via the content retrieval component 122 and customizes the content based on the profile and/or emotional state of the user at the time via customization component 124 in order to present to the user a unique experience. A script of content herein can include one or more content items, each of
which can be individually identified, retrieved, composed, and presented by the content engine 118 to the user online as part of the user's multimedia experience (MME). Here, each content item can be, but is not limited to, a media type of a (displayed or spoken) text (for a non-limiting example, an article, a quote, a personal story, or a book passage), a (still or moving) image, a video clip, an audio clip (for a non-limiting example, a piece of music or sounds from nature), and other types of content items from which a user can learn information or be emotionally impacted. Here, each item of the content can either be provided by another party or created or uploaded by the user him/herself. In some embodiments, each of a text, image, video, and audio item can include one or more elements of: title, author (name, unknown, or anonymous), body (the actual item), source, type, and location. For a non-limiting example, a text item can include a source element of one of literary, personal experience, psychology, self help, and religious, and a type element of one of essay, passage, personal story, poem, quote, sermon, speech, and summary. For another non-limiting example, a video, an audio, and an image item can all include a location element that points to the location (e.g., file path or URL) or access method of the video, audio, or image item. In addition, an audio item may also include elements on the album, genre, or track number of the audio item as well as its audio type (music or spoken word). In some embodiments, the content engine 118 can associate each of a text, image, video, and audio item that is purchasable with a link to a resource of the item where such content item can be purchased from an affiliated vendor of the item, such as Amazon Associates, iTunes, etc.
The user interaction engine 102 can then present the link together with the corresponding item in the content to the user and enable the user to purchase a content item of his/her interest by clicking the link associated with the content item. FIG. 5 illustrates an example of various types of content items and the potential elements in each of them.
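The content-item elements described above (title, author, body, source, type, location, and an optional purchase link) might be modeled as a record like the following. The field names are illustrative assumptions, not terms defined by the claims.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContentItem:
    media: str                           # "text", "image", "video", or "audio"
    title: str
    author: str                          # a name, "unknown", or "anonymous"
    body: Optional[str] = None           # the actual item, for text items
    source: Optional[str] = None         # e.g. "literary", "self help"
    item_type: Optional[str] = None      # e.g. "poem", "quote", "sermon"
    location: Optional[str] = None       # file path or URL for media items
    purchase_link: Optional[str] = None  # affiliate link, if purchasable

# Hypothetical audio item with a location and an affiliate purchase link.
clip = ContentItem(media="audio", title="Morning Calm", author="unknown",
                   location="https://example.com/calm.mp3",
                   purchase_link="https://example.com/buy/calm")
```

The user interaction engine could then render `purchase_link`, when present, alongside the item so the user can buy it with one click.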
In some embodiments, the content engine 118 may customize the content based on the user's profile including one or more of: the user's prior visits, his/her recent comments and ratings on content related to the same or relevant problems, and his/her response to requests for wisdom. For a non-limiting example, content items that did not appeal to the user in the past based on his/her feedback will likely be excluded. In some situations when the user is not sure what he/she is looking for, the user may simply choose "Get me through the day" from the problem list and the content engine 118 will automatically retrieve and present content to the user based on the user's profile.
When the user is a first time visitor or his/her profile is otherwise thin, the content engine 118 may automatically identify and retrieve content items relevant to the problem.
In some embodiments, the content engine 118 may customize the content based on the user's emotional state at the time. More specifically, the content engine 118 may generate and present the user with content that focuses on addressing both the problem he/she has submitted and the user's emotional need at the time. If no such dual-purpose content exists in the content library 128 or can be generated to serve both aims, the content engine 118 may generate a portion of the content that focuses first on the problem submitted by the user, and then generate another portion of the content that focuses on the emotional need of the user. The ratio between the problem-related portion and the emotion-related portion of the content (if no dual-purpose content exists) is set to reflect the urgency of the user's emotional state at the time as indicated by the assessment by the profile engine 110. For a non-limiting example, if the user is highly emotional and depressed at the time when he/she asks for content that "feels good," the content engine 118 should generate content that includes relaxing and soothing images, quotations, and music instead of fast-paced content with cheerful tones.

In some embodiments, the content engine 118 may customize the content based on an "experience path" of the user. Here, the user experience path can be a psychological process (e.g., stages of grief: denial → anger → bargaining → depression → acceptance). The user experience path contains an ordered list of path nodes, each of which represents a stage in the psychological process. By associating the user experience path and path nodes with a content item, the content engine 118 can select content items for the user that are appropriate to his/her current stage in the psychological process.
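The experience-path selection described above can be sketched as follows. The path, item tags, and the choice to also allow the next stage are all illustrative assumptions; the patent only requires that selected items suit the user's current stage.

```python
# Ordered path nodes for one example psychological process (stages of grief).
GRIEF_PATH = ["denial", "anger", "bargaining", "depression", "acceptance"]

# Each candidate item is tagged with the path node it suits (hypothetical data).
items = [
    {"title": "It isn't real yet", "node": "denial"},
    {"title": "Letting anger speak", "node": "anger"},
    {"title": "Finding peace", "node": "acceptance"},
]

def items_for_stage(items, path, current_stage):
    """Return items tagged with the user's current stage or the one after it,
    so content meets the user where they are and gently looks ahead."""
    idx = path.index(current_stage)
    allowed = set(path[idx:idx + 2])
    return [it for it in items if it["node"] in allowed]

print([it["title"] for it in items_for_stage(items, GRIEF_PATH, "denial")])
```

Here a user at "denial" is offered denial- and anger-stage items, while a user at "acceptance" would only see acceptance-stage content.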
In some embodiments, the content engine 118 may identify and retrieve the content in response to the problem submitted by the user by identifying a script template for the problem submitted by the user and generating a script of the content by retrieving content items based on the script template. Here, a script template defines a sequence of media types with timing information for the corresponding content items to be composed as part of the multimedia content. For each type of content item in the content, the script template may specify whether the content item is repeatable or non-repeatable, how many times it should be repeated (if repeatable) as part of the script, or what the delay should be between repeats. For repeatable content items, more recently viewed content items should have a lower chance of selection than less recently viewed (or never viewed) content items.
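One way to realize the recency rule above is to weight each repeatable item by how long ago it was last viewed. The exponential half-life form and the one-week constant are assumptions for illustration; the patent only requires that recently viewed items be less likely to be selected.

```python
import random
import time

def selection_weight(last_viewed, now=None, half_life=7 * 86400):
    """Weight grows from 0 (just viewed) toward 1 (never or long-ago viewed).
    half_life is the age, in seconds, at which the weight reaches 0.5."""
    if last_viewed is None:
        return 1.0
    now = now if now is not None else time.time()
    age = max(0.0, now - last_viewed)
    return 1.0 - 0.5 ** (age / half_life)

def pick_repeatable(items, now=None):
    """Randomly select one item, favoring stale or never-viewed items."""
    weights = [selection_weight(it.get("last_viewed"), now) for it in items]
    if sum(weights) == 0:
        return random.choice(items)
    return random.choices(items, weights=weights, k=1)[0]

fresh = {"title": "just seen", "last_viewed": 1000.0}
stale = {"title": "never seen", "last_viewed": None}
print(pick_repeatable([stale, fresh], now=1000.0)["title"])  # prints never seen
```

With `now` equal to the fresh item's view time, its weight is exactly zero, so the never-viewed item is always chosen; in between, selection is probabilistic.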
In the example of FIG. 1, the profile library 116 is embedded in a computer readable medium, which, in operation, maintains a set of user profiles of the users. Once the content has been generated and presented to a user, the profile of the user stored in the profile library 116 can be updated to include the problem submitted by the user as well as the content presented to him/her as part of the user history. If the user optionally provides feedback on the content, the profile of the user can also be updated to include the user's feedback on the content.

In the example of FIG. 1, the script template library 126 maintains script templates corresponding to the pre-defined set of problems that are available to the user, while the content library 128 maintains content items as well as definitions, tags, and resources of the content relevant to the user-submitted problems. In some embodiments, the content engine 118 may automatically generate a script template for the problem by periodically data mining the relevant content items in the content library 128. More specifically, the content engine 118 may first browse through and identify the content item categories in the content library 128 that are most relevant to the problem submitted. The content engine 118 then determines the most effective way to present such relevant content items based on, for non-limiting examples, the nature of the content items (e.g., displayable or audible) and the feedback received from users as to how they would prefer the content items to be presented to them to best address the problem. The content engine 118 then generates the script template for the problem and saves the template in the script template library 126. In the example of FIG. 1, the content library 128 covers both the definition of content items and how the content tags are applied.
It may serve as a media "book shelf" that includes a collection of content items relevant to and customized based on each user's profile, experiences, and preferences. The content engine 118 may retrieve content items either from the content library 128 or, in case the relevant content items are not available there, identify the content items over the Web and save them in the content library 128 so that these content items will be readily available for future use.
In some embodiments, the content items in content library 128 can be tagged and organized appropriately to enable the content engine 118 to access and browse the content library 128. Here, the content engine 118 may browse the content items by problems, types of content items, dates collected, and by certain categories such as belief systems to build the content based on the user's profile and/or understanding of
the items' "connections" with the problem submitted by the user. For a non-limiting example, a sample music clip might be selected to be included in the content because it was encoded for a user with an issue of sadness.
In some embodiments, the content engine 118 may allow the user to add self-created content items (such as his/her personal stories, self-composed or edited images, audios, or video clips) into the content library 128 and make them available either for his/her own use only or more widely available to other users who may share the same problem with the user. In some embodiments, the content engine 118 may occasionally include one or more content items in the customized content for the purpose of gathering feedback from the user. Here, the content items can be randomly selected by the content engine 118 from categories in the content library 128 that are relevant to the problem submitted by the user. Such content items may be newly generated and/or newly included in the content library 128 and may not yet have been provided to users on a large scale. It is thus important to gather feedback on such content items from a group of users in order to evaluate them.
In some embodiments, each content item in content library 128 can be associated with multiple tags for the purpose of easy identification, retrieval, and customization by the content engine 118 based on the user's profile. For a non-limiting example, a content item can be tagged as generic (default value assigned) or humorous (which should be used only when humor is appropriate). For another non-limiting example, a pair of (belief system, degree of adherence range) can be used to tag a content item as either appropriate for all Christians (Christian, 0-10) or only for devout Christians (Christian, 8-10). Thus, the content engine 118 will only retrieve a content item for the user where the tag of the content item matches the user's profile.
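The tag-matching rule above can be sketched minimally as follows; the data layout and function names are assumptions for illustration, not part of the disclosure:

```python
def tag_matches(tag, profile):
    """Return True if a single content-item tag is satisfied by the profile."""
    if tag == "generic":                   # default value: suits any user
        return True
    if tag == "humorous":                  # use only when humor is appropriate
        return profile.get("humor_appropriate", False)
    if isinstance(tag, tuple):             # (belief system, low, high) adherence range
        belief, low, high = tag
        degree = profile.get("beliefs", {}).get(belief)
        return degree is not None and low <= degree <= high
    return False

def retrieve_for_user(items, profile):
    # an item is retrieved only when every one of its tags matches the profile
    return [i for i in items if all(tag_matches(t, profile) for t in i["tags"])]

items = [
    {"name": "devotional reading", "tags": [("Christian", 8, 10)]},
    {"name": "general reflection", "tags": ["generic"]},
]
devout = {"beliefs": {"Christian": 9}}
print([i["name"] for i in retrieve_for_user(items, devout)])
# → ['devotional reading', 'general reflection']
```

A profile with a lower degree of adherence (e.g. Christian, 3) would receive only the generic item, matching the (Christian, 8-10) example in the text.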
In some embodiments, the content engine 118 incorporates wisdom from a community of users and experts into the customized content. Here, the wisdom can simply be content items such as expert opinions and advice that have been supplied in response to a request for wisdom (RFW) issued by the user. The content items are treated just like any other content items once they are reviewed and rated/commented by the user. While the system 100 depicted in FIG. 1 is in operation, the user interaction engine 102 enables the user to login and submit a problem of his/her concern via the user interface 104. The user interaction engine 102 communicates the identity of the user together with the problem submitted by the user to the content engine 118 and/or the profile engine 110. Once the user is registered, the profile engine 110 may establish a profile
of the user that accurately reflects the user's interests or concerns and/or assess the user's emotional state at the time when he/she submits the problem by interviewing the user with a set of questions and/or presenting the user with a visual representation of emotions to enable the user to select his/her active emotional state(s). Upon receiving the problem and the identity of the user, the content engine 118 obtains the emotional state of the user, as well as the profile of the user from the profile library 116 and the script template of the problem from the script template library 126, respectively. The content engine 118 then identifies and retrieves content items based on the script template of the problem from the content library 128 via the content retrieval component 122 and populates the script template based on the user's profile to create a script of the content that addresses the user's problem and reflects the user's emotional state via the customization component 124. Once the content is generated, the user interaction engine 102 presents it to the user via the display component 106 and enables the user to rate or provide feedback on the content presented. The profile engine 110 may then update the user's profile with the history of the problems submitted by the user, the content items presented to the user, and the feedback and ratings from the user of the content.
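The run-time flow of populating a script template can be sketched as follows; the template and library contents are invented for illustration and are not taken from the disclosure:

```python
# Hypothetical script template library: each problem maps to a list of
# content slots to be filled in order.
SCRIPT_TEMPLATES = {
    "sadness": ["opening_message", "music_clip", "closing_advice"],
}

# Hypothetical content library keyed by (problem, slot).
CONTENT_LIBRARY = {
    ("sadness", "opening_message"): "You are not alone in feeling this way.",
    ("sadness", "music_clip"): "calm_piano_sample",
    ("sadness", "closing_advice"): "Consider reaching out to a friend today.",
}

def build_script(problem, profile, emotional_state):
    """Fill the problem's template slot by slot; in a fuller implementation
    the profile and emotional state would steer the choice among
    alternative items for each slot."""
    return [CONTENT_LIBRARY[(problem, slot)] for slot in SCRIPT_TEMPLATES[problem]]

script = build_script("sadness", profile={}, emotional_state="sad")
```

The key structural point is the separation of concerns: the template fixes *what kinds* of items appear and in what order, while retrieval and customization decide *which* item fills each slot.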
FIG. 6 depicts a flowchart of an example of a process to support content customization based on the user's profile and emotional state at the time. Although this figure depicts functional steps in a particular order for purposes of illustration, the process is not limited to any particular order or arrangement of steps. One skilled in the relevant art will appreciate that the various steps portrayed in this figure could be omitted, rearranged, combined, and/or adapted in various ways. In the example of FIG. 6, the flowchart 600 starts at block 602, where a user is enabled to submit a problem to which the user intends to seek help or counseling. The problem submission process can be done via a user interface and be standardized via a list of pre-defined problems organized by topics and categories.
In the example of FIG. 6, the flowchart 600 continues to block 604, where a profile of the user is established and his/her emotional state at the time the problem is submitted is assessed. At least a portion of the profile can be established and the emotional state can be assessed by initiating interview questions to the user targeted at soliciting information on his/her personal interests and/or concerns. In addition, a visual representation of emotions can be presented to the user to enable the user to select one or more of his/her active emotion states at the time.
In the example of FIG. 6, the flowchart 600 continues to block 606, where a content comprising one or more content items that is relevant to the problem submitted by the user is identified and retrieved. Here, content items can be automatically identified and retrieved based on a script template associated with the problem submitted by the user, and a script of the content can be formed by "filling" the script template with the content retrieved.
In the example of FIG. 6, the flowchart 600 continues to block 608, where the retrieved content is customized based on the profile and/or the current emotional state of the user. Such customization reflects the user's preference as to what kind of content items he/she would like to be included in the content to fit his/her emotional state at the time, as well as how he/she prefers each of the items in the content to be presented.
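One aspect of such customization, echoed in the claims below, is balancing problem-related items against emotion-related items according to the urgency of the user's emotional state. A minimal sketch, assuming a simple linear policy that the disclosure does not itself specify:

```python
def split_counts(total_items, urgency):
    """Split a script between problem-related and emotion-related items.
    The mapping from urgency (0..1) to the emotion-related share is an
    assumed linear policy, purely for illustration."""
    share = 0.2 + 0.6 * min(max(urgency, 0.0), 1.0)
    emotional = round(total_items * share)
    return total_items - emotional, emotional
```

Under this assumed policy, a highly urgent emotional state shifts most of a ten-item script toward emotion-focused content, while a calm state leaves the script dominated by problem-focused items.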
In the example of FIG. 6, the flowchart 600 ends at block 610, where the customized content relevant to the problem is presented to the user. Optionally, the user may also be presented with links to resources from which items in the presented content can be purchased. The presented content items may also be saved for future reference. In the example of FIG. 6, the flowchart 600 may optionally continue to block 612, where the user is enabled to provide feedback by rating and commenting on the content presented. Such feedback will then be used to update the profile of the user in order to make future content customization more accurate.
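The feedback-to-profile update at block 612 can be sketched as follows; the profile's internal structure (history list, running rating averages) is a hypothetical choice, not one prescribed by the disclosure:

```python
def record_feedback(profile, problem, presented_items, ratings):
    """Fold one presentation and its ratings into the user's profile
    so that future content customization can be made more accurate."""
    # keep the history of problems submitted and items presented
    profile.setdefault("history", []).append(
        {"problem": problem, "items": presented_items, "ratings": ratings})
    # maintain a running average rating per content item
    avgs = profile.setdefault("rating_averages", {})
    counts = profile.setdefault("rating_counts", {})
    for item, rating in zip(presented_items, ratings):
        n = counts.get(item, 0)
        avgs[item] = (avgs.get(item, 0.0) * n + rating) / (n + 1)
        counts[item] = n + 1
    return profile
```

Running averages are one lightweight way to let low-rated items fall out of future scripts without storing every interaction twice.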
One embodiment may be implemented using a conventional general-purpose or specialized digital computer or microprocessor(s) programmed according to the teachings of the present disclosure, as will be apparent to those skilled in the computer art. Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those skilled in the software art. The invention may also be implemented by the preparation of integrated circuits or by interconnecting an appropriate network of conventional component circuits, as will be readily apparent to those skilled in the art. One embodiment includes a computer program product which is a machine readable medium (media) having instructions stored thereon/in which can be used to program one or more hosts to perform any of the features presented herein. The machine readable medium can include, but is not limited to, one or more types of disks including floppy disks, optical discs, DVDs, CD-ROMs, microdrives, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, DRAMs, VRAMs, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type
of media or device suitable for storing instructions and/or data. Stored on any one of the computer readable media, the present invention includes software for controlling the hardware of the general-purpose/specialized computer or microprocessor and for enabling the computer or microprocessor to interact with a human viewer or other mechanism utilizing the results of the present invention. Such software may include, but is not limited to, device drivers, operating systems, execution environments/containers, and applications.
The foregoing description of various embodiments of the claimed subject matter has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the claimed subject matter to the precise forms disclosed. Many modifications and variations will be apparent to the practitioner skilled in the art. Particularly, while the concept "interface" is used in the embodiments of the systems and methods described above, it will be evident that such a concept can be interchangeably used with equivalent software concepts such as class, method, type, module, component, bean, object model, process, thread, and other suitable concepts. While the concept "component" is used in the embodiments of the systems and methods described above, it will be evident that such a concept can be interchangeably used with equivalent concepts such as class, method, type, interface, module, object model, and other suitable concepts. Embodiments were chosen and described in order to best describe the principles of the invention and its practical application, thereby enabling others skilled in the relevant art to understand the claimed subject matter, the various embodiments, and the various modifications that are suited to the particular use contemplated.
Claims
What is claimed is:
1. A system, comprising: a user interaction engine, which in operation, enables the user to submit a problem to which the user intends to seek help or counseling; presents to the user a content relevant to addressing the problem submitted by the user; a profile engine, which in operation, assesses an emotional state of a user at the time the problem is submitted; a content engine, which in operation, identifies and retrieves the content relevant to the problem submitted by the user; customizes the content based on the emotional state of the user at the time.
2. The system of claim 1, wherein: the problem submitted by the user relates to one or more of: personal, emotional, psychological, spiritual, relational, physical, practical, or any other needs of the user.
3. The system of claim 1, wherein: the emotional state of the user includes one or more of primary, secondary, and tertiary emotions of the user. 4. The system of claim 1, wherein: the profile engine establishes and maintains a profile of the user.
5. The system of claim 4, wherein: the content engine customizes the content based on the profile of the user.
6. The system of claim 1, wherein: the profile engine initiates one or more questions to the user to solicit information for the purpose of assessing the emotional state of the user.
7. The system of claim 1, wherein: the profile engine presents a visual representation of emotions to the user and enables the user to select one or more of his/her active emotion states via the visual representation. 8. The system of claim 7, wherein:
the visual representation of emotions is a three-dimensional emotion circumplex. 9. The system of claim 7, wherein: the profile engine adjusts emotions represented and their positions in the visual representation of emotions. 10. The system of claim 1, wherein: the user interaction engine is configured to enable the user to provide feedback to the content presented.
11. The system of claim 1, wherein: the content engine identifies a script template relevant to the problem submitted by the user; customizes the script template based on the profile of the user; retrieves the content based on the script template.
12. The system of claim 1, wherein: the content includes one or more items, wherein each of the one or more items is a text, an image, an audio, or a video item.
13. The system of claim 1, further comprising: a content library embedded in a computer readable medium, which in operation, maintains content as well as definitions, tags, and source of the content relevant to user-submitted problems. 14. The system of claim 13, wherein: the content in the content library is tagged and organized appropriately for the purpose of easy identification, retrieval, and customization.
15. The system of claim 13, wherein: the content engine associates a link to a resource of each item in the content. 16. The system of claim 15, wherein: the user interaction engine presents the link together with the corresponding item in the content to the user.
17. The system of claim 1, wherein: the content engine customizes the content based on one or more of: the user's prior visits, his/her recent comments and ratings on content related to the same or relevant problems, and his/her response to requests for wisdom.
18. The system of claim 1, wherein: the content engine generates the content that focuses on addressing both the problem the user submitted and the user's emotional need at the time.
19. The system of claim 1, wherein: the content engine sets the ratio between problem-related portion and emotion-related portion of the content to reflect urgency of the user's emotional state at the time.
20. The system of claim 1, wherein: the content engine customizes the content based on an experience path of the user.
21. The system of claim 1, wherein: the content engine includes one or more randomly selected content items in the content for the purpose of gathering feedback from the user.
22. The system of claim 1, wherein: the content engine incorporates opinions and advice from a community of users and experts into the content.
23. A computer-implemented method, comprising: enabling the user to submit a problem to which the user intends to seek help or counseling; assessing an emotional state of a user at the time the problem is submitted; identifying and retrieving a content relevant to the problem submitted by the user; customizing the content based on the emotional state of the user; presenting the customized content relevant to the problem to the user.
24. The method of claim 23, further comprising: establishing and maintaining a profile of the user; customizing the content based on the profile of the user.
25. The method of claim 23, further comprising: initiating one or more questions to the user to solicit information for the purpose of assessing the emotional state of the user.
26. The method of claim 23, further comprising: presenting a visual representation of emotions to the user and enabling the user to select one or more of his/her active emotion states via the visual representation.
27. The method of claim 26, further comprising: adjusting emotions represented and their positions in the visual representation of emotions.
28. The method of claim 23, further comprising: enabling the user to provide feedback to the content presented. 29. The method of claim 23, further comprising: identifying a script template for the problem submitted by the user; customizing the script template based on the profile of the user; retrieving the content based on the script template.
30. The method of claim 23, further comprising: maintaining definitions, tags, and source of content relevant to user-submitted problems.
31. The method of claim 23, further comprising: tagging the content appropriately for the purpose of easy identification, retrieval, and customization.
32. The method of claim 23, further comprising: associating a source of or a link to each item in the content; presenting the source and the link together with the corresponding item in the content to the user.
33. The method of claim 23, further comprising: customizing the content based on one or more of: the user's prior visits, his/her recent comments and ratings on content related to the same or relevant problems, and his/her response to requests for wisdom.
34. The method of claim 23, further comprising: generating the content that focuses on addressing both the problem the user submitted and the user's emotional need at the time.
35. The method of claim 23, further comprising: setting the ratio between problem-related portion and emotion-related portion of the content to reflect urgency of the user's emotional state at the time.
36. The method of claim 23, further comprising: customizing the content based on an experience path of the user. 37. The method of claim 23, further comprising:
including one or more randomly selected content items in the content for the purpose of gathering feedback from the user.
38. The method of claim 23, further comprising: incorporating opinions and advice from a community of users and experts into the content.
39. A system, comprising: means for enabling the user to submit a problem to which the user intends to seek help or counseling; means for assessing an emotional state of a user at the time the problem is submitted; means for identifying and retrieving a content relevant to the problem submitted by the user; means for customizing the content based on the emotional state of the user; means for presenting the customized content relevant to the problem to the user.
40. A machine readable medium having software instructions stored thereon that when executed cause a system to: enable the user to submit a problem to which the user intends to seek help or counseling; assess an emotional state of a user at the time the problem is submitted; identify and retrieve a content relevant to the problem submitted by the user; customize the content based on the emotional state of the user; present the customized content relevant to the problem to the user.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/253,893 US20100100826A1 (en) | 2008-10-17 | 2008-10-17 | System and method for content customization based on user profile |
US12/253,893 | 2008-10-17 | ||
US12/476,953 | 2009-06-02 | ||
US12/476,953 US20100107075A1 (en) | 2008-10-17 | 2009-06-02 | System and method for content customization based on emotional state of the user |
Publications (3)
Publication Number | Publication Date |
---|---|
WO2010045593A2 true WO2010045593A2 (en) | 2010-04-22 |
WO2010045593A8 WO2010045593A8 (en) | 2010-06-24 |
WO2010045593A3 WO2010045593A3 (en) | 2010-08-12 |
Family
ID=42107286
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2009/061062 WO2010045593A2 (en) | 2008-10-17 | 2009-10-16 | A system and method for content customization based on emotional state of the user |
Country Status (2)
Country | Link |
---|---|
US (1) | US20100107075A1 (en) |
WO (1) | WO2010045593A2 (en) |
US8316393B2 (en) * | 2008-10-01 | 2012-11-20 | At&T Intellectual Property I, L.P. | System and method for a communication exchange with an avatar in a media communication system |
KR101541497B1 (en) * | 2008-11-03 | 2015-08-04 | 삼성전자 주식회사 | Computer readable medium recorded contents, Contents providing apparatus for mining user information, Contents providing method, User information providing method and Contents searching method |
TW201022968A (en) * | 2008-12-10 | 2010-06-16 | Univ Nat Taiwan | A multimedia searching system, a method of building the system and associate searching method thereof |
2009
- 2009-06-02 US US12/476,953 patent/US20100107075A1/en not_active Abandoned
- 2009-10-16 WO PCT/US2009/061062 patent/WO2010045593A2/en active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020191775A1 (en) * | 2001-06-19 | 2002-12-19 | International Business Machines Corporation | System and method for personalizing content presented while waiting |
US20050096973A1 (en) * | 2003-11-04 | 2005-05-05 | Heyse Neil W. | Automated life and career management services |
US20070179351A1 (en) * | 2005-06-30 | 2007-08-02 | Humana Inc. | System and method for providing individually tailored health-promoting information |
KR100727340B1 (en) * | 2005-10-19 | 2007-06-15 | 김용훈 | System and method for providing food information according to present situation |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113053492A (en) * | 2021-04-02 | 2021-06-29 | 北方工业大学 | Self-adaptive virtual reality intervention system and method based on user background and emotion |
Also Published As
Publication number | Publication date |
---|---|
WO2010045593A3 (en) | 2010-08-12 |
US20100107075A1 (en) | 2010-04-29 |
WO2010045593A8 (en) | 2010-06-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20100107075A1 (en) | 2010-04-29 | System and method for content customization based on emotional state of the user | |
Tiggemann et al. | Social media is not real: The effect of ‘Instagram vs reality’ images on women’s social comparison and body image | |
Pilipets | From Netflix streaming to Netflix and chill: The (dis)connected body of the serial binge-viewer | |
Nikolinakou et al. | Viral video ads: Emotional triggers and social media virality | |
Hobbs et al. | Liquid love? Dating apps, sex, relationships and the digital transformation of intimacy | |
Baym et al. | Mindfully scrolling: Rethinking Facebook after time deactivated | |
US20100100826A1 (en) | System and method for content customization based on user profile | |
Van Koningsbruggen et al. | Spontaneous hedonic reactions to social media cues | |
Knoll et al. | Good guy vs. bad guy: The influence of parasocial interactions with media characters on brand placement effects | |
Galloway et al. | Does Movie Viewing Cultivate Young People's Unrealistic Expectations About Love and Marriage? | |
Lunenfeld | Design research: Methods and perspectives | |
US20100114937A1 (en) | System and method for content customization based on user's psycho-spiritual map of profile | |
O'Neill | The work of seduction: Intimacy and subjectivity in the London ‘seduction community’ | |
US20110113041A1 (en) | System and method for content identification and customization based on weighted recommendation scores | |
Quiroz | Cultural Tourism in Transnational Adoption: “Staged Authenticity” and Its Implications for Adopted Children | |
US20170235849A1 (en) | Matching system and method psychometric instrument system and method and system and method using same | |
Wang | Using attitude functions, self-efficacy, and norms to predict attitudes and intentions to use mobile devices to access social media during sporting event attendance | |
US20100106668A1 (en) | System and method for providing community wisdom based on user profile | |
Castillo-Abdul et al. | Hola followers! Content analysis of YouTube channels of female fashion influencers in Spain and Ecuador | |
US20100100542A1 (en) | System and method for rule-based content customization for user presentation | |
Gershon | What do we talk about when we talk about animation | |
Zhang et al. | Dawn or dusk? Will virtual tourism begin to boom? An integrated model of AIDA, TAM, and UTAUT | |
King | ‘Chick Crack’: Self-Esteem, Science and Women’s Dating Advice | |
US20100100827A1 (en) | System and method for managing wisdom solicited from user community | |
Hardy et al. | Examining sexually explicit material use in adults over the age of 65 years |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09821347; Country of ref document: EP; Kind code of ref document: A2 |
122 | Ep: pct application non-entry in european phase | Ref document number: 09821347; Country of ref document: EP; Kind code of ref document: A2 |
NENP | Non-entry into the national phase |
Ref country code: DE |