US20050060641A1 - Audiovisual information management system with selective updating - Google Patents

Audiovisual information management system with selective updating

Info

Publication number
US20050060641A1
Authority
US
United States
Prior art keywords
description
user
preferences
audio
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/977,039
Inventor
Muhammed Sezan
Petrus Beek
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp Laboratories of America Inc
Original Assignee
Sharp Laboratories of America Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Laboratories of America Inc filed Critical Sharp Laboratories of America Inc
Priority to US10/977,039
Assigned to SHARP LABORATORIES OF AMERICA, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: VAN BEEK, PETRUS; SEZAN, MUHAMMED IBRAHIM
Publication of US20050060641A1
Legal status: Abandoned

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 12/2805 Home Audio Video Interoperability [HAVI] networks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • H04L 12/282 Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 2012/2847 Home automation networks characterised by the type of home appliance used
    • H04L 2012/2849 Audio/video appliances

Definitions

  • the present invention relates to a system for managing audiovisual information, and in particular to a system for audiovisual information browsing, filtering, searching, archiving, and personalization.
  • Video cassette recorders may record video programs in response to pressing a record button or may be programmed to record video programs based on the time of day.
  • the viewer must program the VCR based on information from a television guide to identify relevant programs to record. After recording, the viewer scans through the entire video tape to select relevant portions of the program for viewing using the functionality provided by the VCR, such as fast forward and fast reverse.
  • the searching and viewing is based on a linear search, which may require significant time to locate the desired portions of the program(s) and fast forward to the desired portion of the tape.
  • ReplayTV and TiVo have developed hard disk based systems that receive, record, and play television broadcasts in a manner similar to a VCR.
  • the systems may be programmed with the viewer's viewing preferences.
  • the systems use a telephone line interface to receive scheduling information similar to that available from a television guide. Based upon the system programming and the scheduling information, the system automatically records programs that may be of potential interest to the viewer. Unfortunately, viewing the recorded programs occurs in a linear manner and may require substantial time.
  • each system must be programmed for an individual's preference, likely in a different manner.
  • each individual viewer is required to program the device according to his particular viewing preferences.
  • each different type of device has different capabilities and limitations which limit the selections of the viewer.
  • each device includes a different interface which the viewer may be unfamiliar with. Further, if the operator's manual is inadvertently misplaced it may be difficult for the viewer to efficiently program the device.
  • the present invention overcomes the aforementioned drawbacks of the prior art by providing a method of using a system with at least one of audio, image, and a video comprising a plurality of frames comprising the steps of providing a usage preferences description scheme where the usage preference description scheme includes at least one of a browsing preferences description scheme, a filtering preferences description scheme, a search preferences description scheme, and a device preferences description scheme.
  • the browsing preferences description scheme relates to a user's viewing preferences.
  • the filtering and search preferences description schemes relate to at least one of (1) content preferences of the at least one of audio, image, and video, (2) classification preferences of the at least one of audio, image, and video, (3) keyword preferences of the at least one of audio, image, and video, and (4) creation preferences of the at least one of audio, image, and video.
  • the device preferences description scheme relates to the user's preferences regarding presentation characteristics.
  • a usage history description scheme is provided where the usage history description scheme includes at least one of a browsing history description scheme, a filtering history description scheme, a search history description scheme, and a device usage history description scheme.
  • the browsing history description scheme relates to a user's viewing preferences.
  • the filtering and search history description schemes relate to at least one of (1) content usage history of the at least one of audio, image, and video, (2) classification usage history of the at least one of audio, image, and video, (3) keyword usage history of the at least one of audio, image, and video, and (4) creation usage history of the at least one of audio, image, and video.
  • the device usage history description scheme relates to the user's preferences regarding presentation characteristics. The usage preferences description scheme and the usage history description scheme are used to enhance system functionality.
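The preference hierarchy just described can be sketched as nested data structures. The following Python sketch is purely illustrative; all class and field names are assumptions, since the patent specifies the schemes abstractly rather than as concrete code.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

# Hypothetical field names; the patent describes the schemes abstractly.
@dataclass
class FilteringAndSearchPreferences:
    content: List[str] = field(default_factory=list)         # content preferences
    classification: List[str] = field(default_factory=list)  # e.g., genre/category
    keywords: List[str] = field(default_factory=list)
    creation: List[str] = field(default_factory=list)        # e.g., creator, date

@dataclass
class UsagePreferences:
    # Browsing preferences relate to the user's viewing preferences,
    # e.g., a preferred summary view.
    browsing: Optional[str] = None
    filtering_and_search: FilteringAndSearchPreferences = field(
        default_factory=FilteringAndSearchPreferences)
    # Device preferences relate to presentation characteristics.
    device: Dict[str, str] = field(default_factory=dict)

prefs = UsagePreferences(browsing="highlight")
prefs.filtering_and_search.keywords.append("basketball")
prefs.device["aspect_ratio"] = "4:3"
```

The usage history description scheme would mirror this shape, recording what the user actually did rather than what the user declared.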
  • FIG. 1 is an exemplary embodiment of a program, a system, and a user, with associated description schemes, of an audiovisual system of the present invention.
  • FIG. 2 is an exemplary embodiment of the audiovisual system, including an analysis module, of FIG. 1 .
  • FIG. 3 is an exemplary embodiment of the analysis module of FIG. 2 .
  • FIG. 4 is an illustration of a thumbnail view (category) for the audiovisual system.
  • FIG. 5 is an illustration of a thumbnail view (channel) for the audiovisual system.
  • FIG. 6 is an illustration of a text view (channel) for the audiovisual system.
  • FIG. 7 is an illustration of a frame view for the audiovisual system.
  • FIG. 8 is an illustration of a shot view for the audiovisual system.
  • FIG. 9 is an illustration of a key frame view for the audiovisual system.
  • FIG. 10 is an illustration of a highlight view for the audiovisual system.
  • FIG. 11 is an illustration of an event view for the audiovisual system.
  • FIG. 12 is an illustration of a character/object view for the audiovisual system.
  • FIG. 13 is an alternative embodiment of a program description scheme including a syntactic structure description scheme, a semantic structure description scheme, a visualization description scheme, and a meta information description scheme.
  • FIG. 14 is an exemplary embodiment of the visualization description scheme of FIG. 13 .
  • FIG. 15 is an exemplary embodiment of the meta information description scheme of FIG. 13 .
  • FIG. 16 is an exemplary embodiment of a segment description scheme for the syntactic structure description scheme of FIG. 13 .
  • FIG. 17 is an exemplary embodiment of a region description scheme for the syntactic structure description scheme of FIG. 13 .
  • FIG. 18 is an exemplary embodiment of a segment/region relation description scheme for the syntactic structure description scheme of FIG. 13 .
  • FIG. 19 is an exemplary embodiment of an event description scheme for the semantic structure description scheme of FIG. 13 .
  • FIG. 20 is an exemplary embodiment of an object description scheme for the semantic structure description scheme of FIG. 13 .
  • FIG. 21 is an exemplary embodiment of an event/object relation graph description scheme for the syntactic structure description scheme of FIG. 13 .
  • FIG. 22 is an exemplary embodiment of a user preference description scheme.
  • FIG. 23 is an exemplary embodiment of the interrelationship between a usage history description scheme, an agent, and the usage preference description scheme of FIG. 22 .
  • FIG. 24 is an exemplary embodiment of the interrelationship between audio and/or video programs together with their descriptors, user identification, and the usage preference description scheme of FIG. 22 .
  • FIG. 25 is an exemplary embodiment of a usage preference description scheme of FIG. 22 .
  • FIG. 26 is an exemplary embodiment of the interrelationship between the usage description schemes and the MPEG-7 description schemes.
  • FIG. 27 is an exemplary embodiment of a usage history description scheme of FIG. 22 .
  • FIG. 28 is an exemplary system incorporating the user history description scheme.
  • a VCR permits the selection of the recording times but the user has to correlate the television guide with the desired recording times.
  • similarly, a user may select a preferred set of preselected radio stations for his home stereo, presumably selecting the same set of preselected stations for each of the user's vehicles. If another household member desires a different set of preselected stereo selections, each audio device would need to be reprogrammed at substantial inconvenience.
  • the present inventors came to the realization that users of visual information and listeners to audio information, such as for example radio, audio tapes, video tapes, movies, and news, desire to be entertained and informed in more than merely one uniform manner.
  • the audiovisual information presented to a particular user should be in a format and include content suited to their particular viewing preferences.
  • the format should be dependent on the content of the particular audiovisual information.
  • the amount of information presented to a user or a listener should be limited to only the amount of detail desired by the particular user at the particular time. For example with the ever increasing demands on the user's time, the user may desire to watch only 10 minutes of or merely the highlights of a basketball game.
  • the present inventors came to the realization that the necessity of programming multiple audio and visual devices with their particular viewing preferences is a burdensome task, especially when presented with unfamiliar recording devices when traveling.
  • traveling users desire to easily configure unfamiliar devices, such as audiovisual devices in a hotel room, with their viewing and listening preferences in an efficient manner.
  • the present inventors came to the further realization that a convenient technique of merely recording the desired audio and video information is not sufficient because the presentation of the information should be in a manner that is time efficient, especially in light of the limited time frequently available for the presentation of such information.
  • the user should be able to access only that portion of all of the available information that the user is interested in, while skipping the remainder of the information.
  • a user is not capable of watching or otherwise listening to the vast potential amount of information available through all, or even a small portion of, the sources of audio and video information.
  • the user is not likely even aware of the potential content of information that he may be interested in.
  • the present inventors came to the realization that a system that records and presents to the user audio and video information based upon the user's prior viewing and listening habits, preferences, and personal characteristics, generally referred to as user information, is desirable.
  • the system may present such information based on the capabilities of the system devices. This permits the system to record desirable information and to customize itself automatically to the user and/or listener.
  • the terms user, viewer, and/or listener may be used interchangeably for any type of content.
  • user information should be portable between and usable by different devices so that other devices may likewise be configured automatically to the particular user's preferences upon receiving the viewing information.
  • the present inventors analyzed a typical audio and video presentation environment to determine the significant portions of the typical audiovisual environment.
  • the video, image, and/or audio information 10 is provided or otherwise made available to a user and/or a (device) system.
  • the video, image, and/or audio information is presented to the user from the system 12 (device), such as a television set or a radio.
  • the user interacts both with the system (device) 12 to view the information 10 in a desirable manner and has preferences to define which audio, image, and/or video information is obtained in accordance with the user information 14 .
  • after the proper identification of the different major aspects of an audiovisual system, the present inventors then realized that information is needed to describe the informational content of each portion of the audiovisual system 16 .
  • each portion is identified together with its interrelationship to the other portions.
  • the description schemes include data that is auxiliary to the programs 10 , the system 12 , and the user 14 , to store a set of information, ranging from human readable text to encoded data, that can be used in enabling browsing, filtering, searching, archiving, and personalization.
  • the three portions may be combined together to provide an interactivity not previously achievable.
  • different programs 10 , different users 14 , and different systems 12 may be combined together in any combination, while still maintaining full compatibility and functionality. It is to be understood that the description scheme may contain the data itself or include links to the data, as desired.
  • a program description scheme 18 related to the video, still image, and/or audio information 10 preferably includes two sets of information, namely, program views and program profiles.
  • the program views define logical structures of the frames of a video that define how the video frames are potentially to be viewed, in a manner suitable for efficient browsing.
  • the program views may contain a set of fields that contain data for the identification of key frames, segment definitions between shots, highlight definitions, video summary definitions, different lengths of highlights, thumbnail set of frames, individual shots or scenes, representative frame of the video, grouping of different events, and a close-up view.
  • the program view descriptions may contain thumbnail, slide, key frame, highlights, and close-up views so that users can filter and search not only at the program level but also within a particular program.
  • the description scheme also enables users to access information in varying detail amounts by supporting, for example, a key frame view as a part of a program view providing multiple levels of summary ranging from coarse to fine.
  • the program profiles define distinctive characteristics of the content of the program, such as actors, stars, rating, director, release date, time stamps, keyword identification, trigger profile, still profile, event profile, character profile, object profile, color profile, texture profile, shape profile, motion profile, and categories.
  • the program profiles are especially suitable to facilitate filtering and searching of the audio and video information.
  • the description scheme enables users to discover interesting programs that they may otherwise be unaware of by providing a user description scheme.
  • the user description scheme provides information to a software agent that in turn performs a search and filtering on behalf of the user by possibly using the system description scheme and the program description scheme information. It is to be understood that in one of the embodiments of the invention merely the program description scheme is included.
  • Program views contained in the program description scheme are a feature that supports functionality such as a close-up view.
  • In the close-up view, a certain image object, e.g., a famous basketball player such as Michael Jordan, can be viewed up close by playing back a close-up sequence that is separate from the original program.
  • An alternative view can be incorporated in a straightforward manner.
  • Character profile, on the other hand, may contain the spatio-temporal position and size of a rectangular region around the character of interest. This region can be enlarged by the presentation engine, or the presentation engine may darken outside the region to focus the user's attention on the character over a certain number of frames.
  • Information within the program description scheme may contain data about the initial size or location of the region, movement of the region from one frame to another, and duration in terms of the number of frames featuring the region.
  • the character profile also provides provision for including text annotation and audio annotation about the character as well as web page information, and any other suitable information.
  • Such character profiles may include the audio annotation which is separate from and in addition to the associated audio track of the video.
  • the program description scheme may likewise contain similar information regarding audio (such as radio broadcasts) and images (such as analog or digital photographs or a frame of a video).
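A program description scheme along these lines, with program views, program profiles, and a character profile's spatio-temporal region, might be modeled as follows. This is a hedged sketch; the field names and the rectangle representation are illustrative assumptions, not the patent's normative format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class RegionOfInterest:
    # Spatio-temporal rectangle around a character (illustrative layout):
    # the region spans [start_frame, end_frame] and rect is (x, y, w, h).
    start_frame: int
    end_frame: int
    rect: Tuple[int, int, int, int]

@dataclass
class CharacterProfile:
    name: str
    regions: List[RegionOfInterest] = field(default_factory=list)
    text_annotation: str = ""   # textual annotation about the character
    audio_annotation: str = ""  # separate from the program's audio track
    web_page: str = ""

@dataclass
class ProgramDescription:
    # Program views: logical structures over the frames of the video.
    key_frames: List[int] = field(default_factory=list)
    highlights: List[Tuple[int, int]] = field(default_factory=list)
    # Program profiles: distinctive characteristics of the content.
    actors: List[str] = field(default_factory=list)
    categories: List[str] = field(default_factory=list)
    characters: List[CharacterProfile] = field(default_factory=list)

game = ProgramDescription(categories=["sports"])
game.characters.append(CharacterProfile(
    "Michael Jordan",
    regions=[RegionOfInterest(100, 250, (320, 80, 200, 300))]))

def region_duration(r: RegionOfInterest) -> int:
    # Duration in terms of the number of frames featuring the region.
    return r.end_frame - r.start_frame + 1
```

A presentation engine could use each `RegionOfInterest` to enlarge the rectangle or darken the area outside it for the frames it spans.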
  • the user description scheme 20 preferably includes the user's personal preferences, and information regarding the user's viewing history such as for example browsing history, filtering history, searching history, and device setting history.
  • the user's personal preferences includes information regarding particular programs and categorizations of programs that the user prefers to view.
  • the user description scheme may also include personal information about the particular user, such as demographic and geographic information, e.g. zip code and age.
  • the explicit definition of the particular programs or attributes related thereto permits the system 16 to select those programs from the information contained within the available program description schemes 18 that may be of interest to the user. Frequently, the user does not desire to learn to program the device nor desire to explicitly program the device.
  • the user description scheme 20 may not be sufficiently robust to include explicit definitions describing all desirable programs for a particular user.
  • the capability of the user description scheme 20 to adapt to the viewing habits of the user to accommodate different viewing characteristics not explicitly provided for or otherwise difficult to describe is useful.
  • the user description scheme 20 may be augmented or any technique can be used to compare the information contained in the user description scheme 20 to the available information contained in the program description scheme 18 to make selections.
  • the user description scheme provides a technique for holding user preferences ranging from program categories to program views, as well as usage history.
  • User description scheme information is persistent but can be updated by the user or by an intelligent software agent on behalf of the user at any arbitrary time. It may also be disabled by the user, at any time, if the user decides to do so.
  • the user description scheme is modular and portable so that users can carry or port it from one device to another, such as with a handheld electronic device or smart card or transported over a network connecting multiple devices.
  • if the user description scheme is standardized among different manufacturers or products, user preferences become portable. For example, a user can personalize the television receiver in a hotel room, permitting users to access information they prefer at any time and anywhere. In a sense, the user description scheme is persistent and timeless.
  • selected information within the user description scheme may be encrypted since at least part of the information may be deemed to be private (e.g., demographics).
  • a user description scheme may be associated with an audiovisual program broadcast and compared with a particular user's description scheme of the receiver to readily determine whether or not the program's intended audience profile matches that of the user. It is to be understood that in one of the embodiments of the invention merely the user description scheme is included.
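Because the user description scheme is meant to be modular and portable (carried on a smart card or sent over a network to an unfamiliar device), a natural sketch is a serializable record. JSON is just one plausible encoding; the patent does not prescribe a wire format, and the field names here are assumptions.

```python
import json
from dataclasses import asdict, dataclass, field
from typing import Dict, List

@dataclass
class UserDescription:
    user_id: str
    preferred_categories: List[str] = field(default_factory=list)
    # Demographic fields may be deemed private and encrypted in practice.
    demographics: Dict[str, str] = field(default_factory=dict)
    device_settings: Dict[str, str] = field(default_factory=dict)

def export_user_description(user: UserDescription) -> str:
    # Serialize so the scheme can travel with the user to another device.
    return json.dumps(asdict(user))

def import_user_description(payload: str) -> UserDescription:
    # A hotel-room receiver could configure itself from this payload.
    return UserDescription(**json.loads(payload))

alice = UserDescription("alice", ["sports", "news"],
                        {"zip": "98765"}, {"volume": "7"})
clone = import_user_description(export_user_description(alice))
```

The round trip preserves the record, which is the property portability depends on.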
  • the system description scheme 22 preferably manages the individual programs and other data.
  • the management may include maintaining lists of programs, categories, channels, users, videos, audio, and images.
  • the management may include the capabilities of a device for providing the audio, video, and/or images. Such capabilities may include, for example, screen size, stereo, AC3, DTS, color, black/white, etc.
  • the management may also include relationships between any one or more of the user, the audio, and the images in relation to one or more of a program description scheme(s) and a user description scheme(s). In a similar manner the management may include relationships between one or more of the program description scheme(s) and user description scheme(s). It is to be understood that in one of the embodiments of the invention merely the system description scheme is included.
  • the descriptors of the program description scheme and the user description scheme should overlap, at least partially, so that potential desirability of the program can be determined by comparing descriptors representative of the same information.
  • the program and user description scheme may include the same set of categories and actors.
  • the program description scheme has no knowledge of the user description scheme, and vice versa, so that each description scheme is not dependent on the other for its existence. It is not necessary for the description schemes to be fully populated. It is also beneficial not to include the program description scheme with the user description scheme because there will likely be thousands of programs with associated description schemes which, if combined with the user description scheme, would result in an unnecessarily large user description scheme. It is desirable to keep the user description scheme small so that it is more readily portable. Accordingly, a system including only the program description scheme and the user description scheme would be beneficial.
  • the user description scheme and the system description scheme should include at least partially overlapping fields. With overlapping fields the system can capture the desired information, which would otherwise not be recognized as desirable.
  • the system description scheme preferably includes a list of users and available programs. Based on the master list of available programs, and the associated program description scheme, the system can match the desired programs. It is also beneficial not to include the system description scheme with the user description scheme because there will likely be thousands of programs stored in the system description schemes which, if combined with the user description scheme, would result in an unnecessarily large user description scheme. It is desirable to keep the user description scheme small so that it is more readily portable.
  • the user description scheme may include radio station preselected frequencies and/or types of stations, while the system description scheme includes the available stations for radio stations in particular cities. When traveling to a different city the user description scheme together with the system description scheme will permit reprogramming the radio stations. Accordingly, a system including only the system description scheme and the user description scheme would be beneficial.
  • the program description scheme and the system description scheme should include at least partially overlapping fields. With the overlapping fields, the system description scheme will be capable of storing the information contained within the program description scheme, so that the information is properly indexed. With proper indexing, the system is capable of matching such information with the user information, if available, for obtaining and recording suitable programs. If the program description scheme and the system description scheme were not overlapping then no information would be extracted from the programs and stored.
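The requirement that the schemes share at least partially overlapping fields can be illustrated with a simple matcher: only descriptor fields present in both the program description and the user description can contribute to a match. The scoring rule below is an illustrative assumption, not the patent's method.

```python
from typing import Dict, List

def match_score(program: Dict[str, List[str]],
                user: Dict[str, List[str]]) -> int:
    # Only fields the two description schemes share can be compared;
    # a field present in just one scheme contributes nothing.
    shared = set(program) & set(user)
    return sum(len(set(program[f]) & set(user[f])) for f in shared)

user_prefs = {"categories": ["sports", "news"], "actors": ["Actor A"]}
game = {"categories": ["sports"], "director": ["Director D"]}
drama = {"categories": ["drama"], "actors": ["Actor C"]}

# The game matches on the shared "categories" field; the drama shares
# fields with the user preferences but has no overlapping values.
scores = {"game": match_score(game, user_prefs),
          "drama": match_score(drama, user_prefs)}
```

Note that `game`'s "director" field scores nothing because the user description scheme has no such field, which is exactly why overlapping descriptors are required.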
  • System capabilities specified within the system description scheme of a particular viewing system can be correlated with a program description scheme to determine the views that can be supported by the viewing system. For instance, if the viewing device is not capable of playing back video, its system description scheme may describe its viewing capabilities as limited to keyframe view and slide view only.
  • Program description scheme of a particular program and system description scheme of the viewing system are utilized to present the appropriate views to the viewing system.
  • a server of programs serves the appropriate views according to a particular viewing system's capabilities, which may be communicated over a network or communication channel connecting the server with the user's viewing device.
  • the program description scheme is associated with the program, even if displayed at a different time. Accordingly, a system including only the system description scheme and the program description scheme would be beneficial.
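Serving views according to device capabilities, as in the keyframe/slide example above, reduces to intersecting the views a program offers with those the system description scheme reports. A minimal sketch, with assumed view names:

```python
from typing import Iterable, List, Set

def supported_views(program_views: Iterable[str],
                    device_capabilities: Set[str]) -> List[str]:
    # A server would send only the views the viewing system can render;
    # a device without video playback might report {"keyframe", "slide"}.
    return [v for v in program_views if v in device_capabilities]

views = supported_views(["video", "highlight", "keyframe", "slide"],
                        {"keyframe", "slide"})
# views == ["keyframe", "slide"]
```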
  • the programs 10 , the users 14 , and the system 12 may be interchanged with one another while maintaining the functionality of the entire system 16 .
  • the audio, visual, or audiovisual program 38 is received by the system 16 .
  • the program 38 may originate at any suitable source, such as for example broadcast television, cable television, satellite television, digital television, Internet broadcasts, world wide web, digital video discs, still images, video cameras, laser discs, magnetic media, computer hard drive, video tape, audio tape, data services, radio broadcasts, and microwave communications.
  • the program description stream may originate from any suitable source, such as for example PSIP/DVB-SI information in digital television broadcasts, specialized digital television data services, specialized Internet services, world wide web, data files, data over the telephone, and memory, such as computer memory.
  • the program, user, and/or system description scheme may be transported over a network (communication channel).
  • the system description scheme may be transported to the source to provide the source with views or other capabilities that the device is capable of using.
  • the source provides the device with image, audio, and/or video content customized or otherwise suitable for the particular device.
  • the system 16 may include any device(s) suitable to receive any one or more of such programs 38 .
  • An audiovisual program analysis module 42 performs an analysis of the received programs 38 to extract and provide program related information (descriptors) to the description scheme (DS) generation module 44 .
  • the program related information may be extracted from the data stream including the program 38 or obtained from any other source, such as for example data transferred over a telephone line, data already transferred to the system 16 in the past, or data from an associated file.
  • the program related information preferably includes data defining both the program views and the program profiles available for the particular program 38 .
  • the analysis module 42 performs an analysis of the programs 38 using information obtained from (i) automatic audio-video analysis methods on the basis of low-level features that are extracted from the program(s), (ii) event detection techniques, (iii) data that is available (or extractable) from data sources or electronic program guides (EPGs, DVB-SI, and PSIP), and (iv) user information obtained from the user description scheme 20 to provide data defining the program description scheme.
  • the selection of a particular program analysis technique depends on the amount of readily available data and the user preferences. For example, if a user prefers to watch a 5 minute video highlight of a particular program, such as a basketball game, the analysis module 42 may invoke a knowledge based system 90 ( FIG. 3 ) to determine the highlights that form the best 5 minute summary.
  • the knowledge based system 90 may invoke a commercial filter 92 to remove commercials and a slow motion detector 54 to assist in creating the video summary.
  • the analysis module 42 may also invoke other modules to bring information together (e.g., textual information) to author particular program views.
  • the analysis module 42 may create a key-frame summary by identifying key-frames of a multi-level summary and passing the information to be used to generate the program views, and in particular a key frame view, to the description scheme.
  • Referring also to FIG. 3 , the analysis module 42 may also include other sub-modules, such as for example, a de-mux/decoder 60 , a data and service content analyzer 62 , a text processing and text summary generator 64 , a closed caption analyzer 66 , a title frame generator 68 , an analysis manager 70 , an audiovisual analysis and feature extractor 72 , an event detector 74 , a key-frame summarizer 76 , and a highlight summarizer 78 .
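The 5-minute highlight example can be made concrete with a greedy selection over scored, commercial-free segments. This is a sketch under stated assumptions: segment scores (e.g., informed by the slow motion detector 54) are assumed to be computed upstream, and the greedy rule is illustrative, not the knowledge based system 90 itself.

```python
from typing import List, Tuple

Segment = Tuple[int, int, float]  # (start_second, duration_seconds, score)

def best_summary(segments: List[Segment], budget_seconds: int) -> List[Segment]:
    # Greedily take the highest-scoring segments that still fit within
    # the time budget, then return them in chronological order.
    chosen: List[Segment] = []
    used = 0
    for seg in sorted(segments, key=lambda s: s[2], reverse=True):
        if used + seg[1] <= budget_seconds:
            chosen.append(seg)
            used += seg[1]
    return sorted(chosen)

clips = [(10, 60, 0.9), (200, 120, 0.7), (400, 180, 0.8), (700, 90, 0.4)]
summary = best_summary(clips, 300)  # a 5 minute (300 s) highlight budget
# Picks the 0.9 and 0.8 clips (240 s total); the rest would exceed the budget.
```

The chosen segments would then be recorded as a highlight view in the program description scheme.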
  • the generation module 44 receives the system information 46 for the system description scheme.
  • the system information 46 preferably includes data for the system description scheme 22 generated by the generation module 44 .
  • the generation module 44 also receives user information 48 including data for the user description scheme.
  • the user information 48 preferably includes data for the user description scheme generated within the generation module 44 .
  • the user input 48 may include, for example, meta information to be included in the program and system description scheme.
  • the user description scheme (or corresponding information) is provided to the analysis module 42 for selective analysis of the program(s) 38 .
  • the user description scheme may be suitable for triggering the highlight generation functionality for a particular program and thus generating the preferred views and storing associated data in the program description scheme.
  • the generation module 44 and the analysis module 42 provide data to a data storage unit 50 .
  • the storage unit 50 may be any storage device, such as memory or magnetic media.
  • a search, filtering, and browsing (SFB) module 52 implements the description scheme technique by parsing and extracting information contained within the description scheme.
  • the SFB module 52 may perform filtering, searching, and browsing of the programs 38 , on the basis of the information contained in the description schemes.
  • An intelligent software agent is preferably included within the SFB module 52 that gathers and provides user specific information to the generation module 44 to be used in authoring and updating the user description scheme (through the generation module 44 ). In this manner, desirable content may be provided to the user through a display 80 .
  • the selections of the desired program(s) to be retrieved, stored, and/or viewed may be programmed, at least in part, through a graphical user interface 82 .
  • the graphical user interface may also include or be connected to a presentation engine for presenting the information to the user through the graphical user interface.
  • the intelligent management and consumption of audiovisual information using the multi-part description stream device provides a next-generation device suitable for the modern era of information overload.
  • the device responds to changing lifestyles of individuals and families, and allows everyone to obtain the information they desire anytime and anywhere they want.
  • An example of the use of the device may be as follows. A user comes home from work late Friday evening being happy the work week is finally over. The user desires to catch up with the events of the world and then watch ABC's 20/20 show later that evening. It is now 9 PM and the 20/20 show will start in an hour at 10 PM. The user is interested in the sporting events of the week, and all the news about the Microsoft case with the Department of Justice.
  • the user description scheme may include a profile indicating a desire that the particular user wants to obtain all available information regarding the Microsoft trial and selected sporting events for particular teams.
  • the system description scheme and program description scheme provide information regarding the content of the available information that may selectively be obtained and recorded.
  • In an autonomous manner, and based on the three description schemes, the system has periodically obtained and recorded during the past week the audiovisual information that may be of interest to the user.
  • the device most likely has recorded more than one hour of audiovisual information so the information needs to be condensed in some manner.
  • the user starts interacting with the system with a pointer or voice commands to indicate a desire to view recorded sporting programs.
  • On the display, the user is presented with a list of recorded sporting events including Basketball and Soccer. Apparently the user's favorite Football team did not play that week because it was not recorded.
  • the user is interested in basketball games and indicates a desire to view games.
  • a set of title frames is presented on the display that captures an important moment of each game.
  • the user selects the Chicago Bulls game and indicates a desire to view a 5 minute highlight of the game.
  • the system automatically generates highlights.
  • the highlights may be generated by audio or video analysis, or the program description scheme includes data indicating the frames that are presented for a 5 minute highlight.
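One way frame or segment data in a program description scheme could drive highlight assembly is sketched below. The segment fields (`start`, `duration`, `score`) are assumptions for illustration; the patent states only that the scheme may indicate which frames form a 5 minute highlight.

```python
# Sketch: assemble a fixed-duration highlight from scored segments that a
# program description scheme might mark. Field names are assumptions.

def build_highlight(segments, target_seconds):
    """Pick the highest-scored segments whose total length fits the target."""
    chosen, total = [], 0.0
    for seg in sorted(segments, key=lambda s: s["score"], reverse=True):
        if total + seg["duration"] <= target_seconds:
            chosen.append(seg)
            total += seg["duration"]
    # Present the chosen segments in their original temporal order.
    return sorted(chosen, key=lambda s: s["start"])

segments = [
    {"start": 10, "duration": 120, "score": 0.9},   # e.g., a slam dunk
    {"start": 400, "duration": 200, "score": 0.7},  # e.g., a fast break
    {"start": 900, "duration": 100, "score": 0.8},  # e.g., a buzzer beater
]
highlight = build_highlight(segments, target_seconds=300)
```

A greedy fit like this is only one policy; any selection that respects the target duration would serve the same role.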
  • the system may have also recorded web-based textual information regarding the particular Chicago-Bulls game which may be selected by the user for viewing. If desired, the summarized information may be recorded onto a storage device, such as a DVD with a label.
  • the stored information may also include an index code so that it can be located at a later time. After viewing the sporting events the user may decide to read the news about the Microsoft trial. It is now 9:50 PM and the user is done viewing the news.
  • the user has selected to delete all the recorded news items after viewing them. The user then remembers to do one last thing before 10 PM in the evening.
  • the next day the user desires to watch the VHS tape that he received from his brother that day, containing footage about his brother's new baby girl and his vacation to Peru last summer.
  • the user wants to watch the whole 2-hour tape but he is anxious to see what the baby looks like and also the new stadium built in Lima, which was not there last time he visited Peru.
  • the user plans to take a quick look at a visual summary of the tape, browse, and perhaps watch a few segments for a couple of minutes, before the user takes his daughter to her piano lesson at 10 AM the next morning.
  • the user plugs the tape into his VCR, which is connected to the system, and invokes the summarization functionality of the system to scan the tape and prepare a summary.
  • the user can then view the summary the next morning to quickly discover the baby's looks, and playback segments between the key-frames of the summary to catch a glimpse of the crying baby.
  • the system may also record the tape content onto the system hard drive (or storage device) so the video summary can be viewed quickly. It is now 10:10 PM, and it seems that the user is 10 minutes late for viewing 20/20. Fortunately, the system, based on the three description schemes, has already been recording 20/20 since 10 PM. Now the user can start watching the recorded portion of 20/20 as the recording of 20/20 proceeds. The user will be done viewing 20/20 at 11:10 PM.
  • the average consumer has an ever increasing number of multimedia devices, such as a home audio system, a car stereo, several home television sets, web browsers, etc.
  • the user currently has to customize each of the devices for optimal viewing and/or listening preferences.
  • a removable storage device such as a smart card
  • the user may insert the card including the user preferences into such media devices for automatic customization.
  • the user only has to specify his preferences at most once, on a single device and subsequently, the descriptors are automatically uploaded into devices by the removable storage device.
  • the user description scheme may also be loaded into other devices using a wired or wireless network connection, e.g.
  • the system can store the user history and create entries in the user description scheme based on the user's audio and video viewing habits. In this manner, the user would never need to program the viewing information to obtain desired information.
  • the user descriptor scheme enables modeling of the user by providing a central storage for the user's listening, viewing, browsing preferences, and user's behavior. This enables devices to be quickly personalized, and enables other components, such as intelligent agents, to communicate on the basis of a standardized description format, and to make smart inferences regarding the user's preferences.
  • FIG. 2 depicts an audiovisual searching, filtering, browsing, and/or recording appliance that is personalizable.
  • the list of more specific applications/implementations given below is not exhaustive but covers a range.
  • the user description scheme is a major enabler for personalizable audiovisual appliances. If the structure (syntax and semantics) of the description schemes is known amongst multiple appliances, the user can carry (or otherwise transfer) the information contained within his user description scheme from one appliance to another, perhaps via a smart card (where these appliances support a smart card interface) in order to personalize them. Personalization can range from device settings, such as display contrast and volume control, to settings of television channels, radio stations, web stations, web sites, geographic information, and demographic information such as age, zip code, etc. Appliances that can be personalized may access content from different sources. They may be connected to the web, terrestrial or cable broadcast, etc., and they may also access multiple or different types of single media such as video, music, etc.
  • Different members of the household can instantly personalize the viewing experience by inserting their own smart card into the family remote. In the absence of such a remote, this same type of personalization can be achieved by plugging in the smart card directly to the television system.
  • the remote may likewise control audio systems.
  • the television receiving system may hold user description schemes for multiple users in local storage and identify different users (or groups of users) by using an appropriate input interface, for example, an interface using user-voice identification technology. It is noted that in a networked system the user description scheme may be transported over the network.
  • the user description scheme is generated by direct user input, and by software that observes the user to determine his/her usage pattern and usage history. The user description scheme can be updated in a dynamic fashion by the user or automatically.
  • a well defined and structured description scheme design allows different devices to interoperate with each other.
  • a modular design also provides portability.
  • the description scheme adds new functionality to those of the current VCR.
  • An advanced VCR system can learn from the user via direct input of preferences, or by watching the usage pattern and history of the user.
  • the user description scheme holds the user's preferences and usage history.
  • An intelligent agent can then consult with the user description scheme and obtain information that it needs for acting on behalf of the user. Through the intelligent agent, the system acts on behalf of the user to discover programs that fit the taste of the user, alert the user about such programs, and/or record them autonomously.
  • An agent can also manage the storage in the system according to the user description scheme, i.e., prioritizing the deletion of programs (or alerting the user for transfer to a removable media), or determining their compression factor (which directly impacts their visual quality) according to user's preferences and history.
  • the program description scheme and the system description scheme work in collaboration with the user description scheme in achieving some tasks.
  • the program description scheme and system description scheme in an advanced VCR or other system will enable the user to browse, search, and filter audiovisual programs. Browsing in the system offers capabilities that are well beyond fast forwarding and rewinding. For instance, the user can view a thumbnail view of different categories of programs stored in the system. The user then may choose frame view, shot view, key frame view, or highlight view, depending on their availability and user's preference. These views can be readily invoked using the relevant information in the program description scheme, especially in program views. The user at any time can start viewing the program either in parts, or in its entirety.
  • the program description scheme may be readily available from many services such as: (i) from broadcast (carried by an EPG defined as a part of ATSC-PSIP (ATSC-Program and System Information Protocol) in the USA or DVB-SI (Digital Video Broadcast-Service Information) in Europe); (ii) from specialized data services (in addition to PSIP/DVB-SI); (iii) from specialized web sites; (iv) from the media storage unit containing the audiovisual content (e.g., DVD); (v) from advanced cameras (discussed later), and/or may be generated (i.e., for programs that are being stored) by the analysis module 42 or by user input 48 .
  • Contents of digital still and video cameras can be stored and managed by a system that implements the description schemes, e.g., a system as shown in FIG. 2 .
  • Advanced cameras can store a program description scheme, for instance, in addition to the audiovisual content itself.
  • the program description scheme can be generated either in part or in its entirety on the camera itself via an appropriate user input interface (e.g., speech, visual menu drive, etc.). Users can input to the camera the program description scheme information, especially the high-level (or semantic) information that may otherwise be difficult for the system to extract automatically.
  • Some camera settings and parameters (e.g., date and time), as well as quantities computed in the camera (e.g., a color histogram to be included in the color profile), may be included in the program description scheme.
  • the system can browse the camera content, or transfer the camera content and its description scheme to the local storage for future use. It is also possible to update or add information to the description scheme generated in the camera.
  • the IEEE 1394 and HAVi standard specifications enable this type of “audiovisual content” centric communication among devices.
  • the description scheme APIs can be used in the context of HAVi to browse and/or search the contents of a camera or a DVD that also contains a description scheme associated with its content, i.e., doing more than merely invoking the PLAY API to play back and linearly view the media.
  • the description schemes may be used in archiving audiovisual programs in a database.
  • the search engine uses the information contained in the program description scheme to retrieve programs on the basis of their content.
  • the program description scheme can also be used in navigating through the contents of the database or the query results.
  • the user description scheme can be used in prioritizing the results of the user query during presentation. It is, of course, possible to make the program description scheme more comprehensive depending on the nature of the particular application.
  • the description scheme fulfills the user's desire to have applications that pay attention and are responsive to their viewing and usage habits, preferences, and personal demographics.
  • the proposed user description scheme directly addresses this desire in its selection of fields and interrelationship to other description schemes. Because the description schemes are modular in nature, the user can port his user description scheme from one device to another in order to “personalize” the device.
  • the proposed description schemes can be incorporated into current products similar to those from TiVo and Replay TV in order to extend their entertainment informational value.
  • the description scheme will enable audiovisual browsing and searching of programs and enable filtering within a particular program by supporting multiple program views such as the highlight view.
  • the description scheme will handle programs coming from sources other than television broadcasts, which TiVo and Replay TV are not designed to handle.
  • other products may be interconnected to such devices to extend their capabilities, such as devices supporting an MPEG-7 description.
  • MPEG-7 is a standard of the Moving Picture Experts Group for standardizing descriptions and description schemes for audiovisual information.
  • the device may also be extended to be personalized by multiple users, as desired.
  • the intelligent software agents can communicate among themselves to make intelligent inferences regarding the user's preferences.
  • the development and upgrade of intelligent software agents for browsing and filtering applications can be simplified based on the standardized user description scheme.
  • the description scheme is multi-modal in the sense that it holds both high level (semantic) and low level features and/or descriptors.
  • Examples of high and low level descriptors are an actor's name and motion model parameters, respectively.
  • High level descriptors are easily readable by humans while low level descriptors are more easily read by machines and less understandable by humans.
  • the program description scheme can be readily harmonized with existing EPG, PSIP, and DVB-SI information facilitating search and filtering of broadcast programs. Existing services can be extended in the future by incorporating additional information using the compliant description scheme.
  • one case may include audiovisual programs that are prerecorded on a media such as a digital video disc where the digital video disc also contains a description scheme that has the same syntax and semantics of the description scheme that the SFB module uses.
  • a transcoder (converter) may be used, as desired.
  • the user may want to browse and view the content of the digital video disc. In this case, the user may not need to invoke the analysis module to author a program description. However, the user may want to invoke his or her user description scheme in filtering, searching and browsing the digital video disc content. Other sources of program information may likewise be used in the same manner.
  • An example of an audiovisual interface suitable for the preferred audiovisual description scheme is shown in FIGS. 4-12 .
  • Referring to FIG. 4, selecting the thumbnail function as a function of category provides a display with a set of categories on the left hand side. Selecting a particular category, such as news, provides a set of thumbnail views of different programs that are currently available for viewing.
  • the different programs may also include programs that will be available at a different time for viewing.
  • the thumbnail views are short video segments that provide an indication of the content of the actual program to which each corresponds.
  • a thumbnail view of available programs in terms of channels may be displayed, if desired.
  • Referring to FIG. 6, a text view of available programs in terms of channels may be displayed, if desired.
  • a frame view of particular programs may be displayed, if desired.
  • a representative frame is displayed in the center of the display with a set of representative frames of different programs in the left hand column. The frequency of the number of frames may be selected, as desired.
  • a set of frames are displayed on the lower portion of the display representative of different frames during the particular selected program.
  • a shot view of particular programs may be displayed, as desired.
  • a representative frame of a shot is displayed in the center of the display with a set of representative frames of different programs in the left hand column.
  • a set of shots are displayed on the lower portion of the display representative of different shots (segments of a program, typically sequential in nature) during the particular selected program.
  • a key frame view of particular programs may be displayed, as desired.
  • a representative frame is displayed in the center of the display with a set of representative frames of different programs in the left hand column.
  • a set of key frame views are displayed on the lower portion of the display representative of different key frame portions during the particular selected program. The number of key frames in each key frame view can be adjusted by selecting the level.
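A minimal sketch of how the selected level might map to the number of key frames shown follows. Uniform subsampling of candidate frames is an assumption for illustration; the description scheme as specified records only a level and time references.

```python
# Sketch: a multi-level key-frame summary where a higher level keeps
# more frames. The "base * level" rule and uniform spacing are assumptions.

def key_frames_for_level(frames, level, base=4):
    """Return base * level frames, spread evenly over the program."""
    count = min(base * level, len(frames))
    step = len(frames) / count
    return [frames[int(i * step)] for i in range(count)]

frames = list(range(0, 1200, 10))  # frame indices of candidate key frames
coarse = key_frames_for_level(frames, level=1)  # few frames, quick overview
fine = key_frames_for_level(frames, level=3)    # more frames, more detail
```

Adjusting the level thus changes only how many of the pre-computed key frames are presented, not the underlying analysis.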
  • a highlight view may likewise be displayed, as desired.
  • an event view may likewise be displayed, as desired.
  • a character/object view may likewise be displayed, as desired.
  • the description scheme may be implemented in any language and include any of the included descriptions (or more), as desired.
  • the proposed program description scheme includes three major sections for describing a video program.
  • the first section identifies the described program.
  • the second section defines a number of views which may be useful in browsing applications.
  • Program ID <ProgramID> program-id </ProgramID>
  • Source Location <SourceLocation> source-url </SourceLocation>
  • Thumbnail View <ThumbnailView> <Image> thumbnail-image </Image> </ThumbnailView>
  • Date-Time Profile <DateTimeProfile> <ProductionDate> date </ProductionDate> <ReleaseDate> date </ReleaseDate> <RecordingDate> date </RecordingDate> <RecordingTime> time </RecordingTime> ... </DateTimeProfile>
  • Keyword Profile <KeywordProfile> keyword ... </KeywordProfile>
  • Trigger Profile <TriggerProfile> trigger-frame-id ... </TriggerProfile>
  • Shape Profile <ShapeProfile> ... </ShapeProfile>
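Descriptors like those above can be emitted as ordinary XML; a minimal sketch using Python's standard library follows. The `ProgramDescription` wrapper element and the example values are assumptions, since the listing shows only the individual descriptors.

```python
# Sketch: build a fragment of a program description scheme as XML.
import xml.etree.ElementTree as ET

def program_description(program_id, source_url, keywords):
    # Wrapper element name is an assumption; child names follow the listing.
    root = ET.Element("ProgramDescription")
    ET.SubElement(root, "ProgramID").text = program_id
    ET.SubElement(root, "SourceLocation").text = source_url
    profile = ET.SubElement(root, "KeywordProfile")
    profile.text = " ".join(keywords)
    return root

doc = program_description("bulls-19990101", "file://recordings/game.mpg",
                          ["basketball", "Bulls"])
xml_text = ET.tostring(doc, encoding="unicode")
```

Because the scheme is plain structured text, the same document can be parsed back by any receiving appliance that knows the element names.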
  • the proposed user description scheme includes three major sections for describing a user.
  • the first section identifies the described user.
  • the second section records a number of settings which may be preferred by the user.
  • Filtering Preferences <FilteringPreferences> <Categories> category-name ... </Categories> <Channels> channel-number ... </Channels> <Ratings> rating-id ... </Ratings> <Shows> show-name ... </Shows> <Authors> author-name ... </Authors> <Producers> producer-name ... </Producers> <Directors> director-name ... </Directors> <Actors> actor-name ... </Actors> <Keywords> keyword ... </Keywords> <Titles> title-text ... </Titles> ... </FilteringPreferences>
  • Search Preferences <SearchPreferences> <Categories> category-name ... </Categories> <Channels> channel-number ... </Channels> <Ratings> rating-id ... </Ratings> <Shows> show-name ... </Shows> <Authors> author-name ... </Authors> <Producers> producer-name ... </Producers> <Directors> director-name ... </Directors> <Actors> actor-name ... </Actors> <Keywords> keyword ... </Keywords> <Titles> title-text ... </Titles> ... </SearchPreferences>
  • Filtering History <FilteringHistory> <Categories> category-name ... </Categories> <Channels> channel-number ... </Channels> <Ratings> rating-id ... </Ratings> <Shows> show-name ... </Shows> <Authors> author-name ... </Authors> <Producers> producer-name ... </Producers> <Directors> director-name ... </Directors> <Actors> actor-name ... </Actors> <Keywords> keyword ... </Keywords> <Titles> title-text ... </Titles> ... </FilteringHistory>
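A sketch of how filtering preferences such as those listed above might be applied to a program list follows. The matching rule (a program must be in a preferred category and share at least one keyword) is an assumption, since the scheme defines fields but not matching semantics.

```python
# Sketch: apply a user's FilteringPreferences fields to a program list.
# Empty preference fields impose no constraint.

def filter_programs(programs, preferences):
    categories = set(preferences.get("Categories", []))
    keywords = set(preferences.get("Keywords", []))
    selected = []
    for prog in programs:
        if categories and prog["category"] not in categories:
            continue
        if keywords and keywords.isdisjoint(prog["keywords"]):
            continue
        selected.append(prog)
    return selected

programs = [
    {"title": "Bulls vs. Lakers", "category": "sports", "keywords": {"basketball"}},
    {"title": "Evening News", "category": "news", "keywords": {"trial"}},
]
prefs = {"Categories": ["sports"], "Keywords": ["basketball"]}
hits = filter_programs(programs, prefs)
```

The same shape of predicate would serve the search preferences, which mirror the filtering fields.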
  • the descriptor <FilteringHistory> captures the history of a user's filtering-related activities.
  • the proposed system description scheme includes four major sections for describing a system.
  • the first section identifies the described system.
  • the second section keeps a list of all known users.
  • the third section keeps lists of available programs.
  • Channels <Channels> <Channel> <ChannelID> channel-id </ChannelID> <ChannelName> channel-name </ChannelName> <SubChannels> sub-channel-id ... </SubChannels> </Channel> <Channel> <ChannelID> channel-id </ChannelID> <ChannelName> channel-name </ChannelName> <SubChannels> sub-channel-id ... </SubChannels> </Channel> ... </Channels>
  • the descriptor <Views> lists views which are supported by a video system or device. Each view is specified by the descriptor <View>.
  • the descriptor <ViewName> contains a string which should match one of the following views used in the program description schemes: ThumbnailView, SlideView, FrameView, ShotView, KeyFrameView, HighlightView, EventView, and CloseUpView.
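Matching a program's available views against the views a device supports might then look like the following sketch; the set-intersection policy is an assumption, but the view names come from the listing above.

```python
# Sketch: offer only views that both the device (system description
# scheme) and the program (program description scheme) support.

SYSTEM_VIEWS = {"ThumbnailView", "FrameView", "KeyFrameView", "HighlightView"}

def presentable_views(program_views):
    """Views present in both the device's and the program's descriptions."""
    return sorted(SYSTEM_VIEWS & set(program_views))

views = presentable_views(["ThumbnailView", "HighlightView", "CloseUpView"])
```

A user interface built this way never offers a view the program description cannot back with data.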
  • the modified program description scheme 400 includes four separate types of information, namely, a syntactic structure description scheme 402 , a semantic structure description scheme 404 , a visualization description scheme 406 , and a meta information description scheme 408 . It is to be understood that in any particular system one or more of the description schemes may be included, as desired.
  • the visualization description scheme 406 enables fast and effective browsing of video programs (and audio programs) by allowing access to the necessary data, preferably in a one-step process.
  • the visualization description scheme 406 provides for several different presentations of the video content (or audio), such as for example, a thumbnail view description scheme 410 , a key frame view description scheme 412 , a highlight view description scheme 414 , an event view description scheme 416 , a close-up view description scheme 418 , and an alternative view description scheme 420 .
  • Other presentation techniques and description schemes may be added, as desired.
  • the thumbnail view description scheme 410 preferably includes an image 422 or reference to an image representative of the video content and a time reference 424 to the video.
  • the key frame view description scheme 412 preferably includes a level indicator 426 and a time reference 428 .
  • the level indicator 426 accommodates the presentation of a different number of key frames for the same video portion depending on the user's preference.
  • the highlight view description scheme 414 includes a length indicator 430 and a time reference 432 .
  • the length indicator 430 accommodates the presentation of a different highlight duration of a video depending on the user's preference.
  • the event view description scheme 416 preferably includes an event indicator 434 for the selection of the desired event and a time reference 436 .
  • the close-up view description scheme 418 preferably includes a target indicator 438 and a time reference 440 .
  • the alternate view description scheme preferably includes a source indicator 442 .
  • the meta information description scheme 408 generally includes various descriptors which carry general information about a video (or audio) program such as the title, category, keywords, etc. Additional descriptors, such as those previously described, may be included, as desired.
  • the syntactic structure description scheme 402 specifies the physical structure of a video program (or audio), e.g., a table of contents.
  • the physical features may include for example, color, texture, motion, etc.
  • the syntactic structure description scheme 402 preferably includes three modules, namely a segment description scheme 450 , a region description scheme 452 , and a segment/region relation graph description scheme 454 .
  • the segment description scheme 450 may be used to define relationships between different portions of the video consisting of multiple frames of the video.
  • a segment description scheme 450 may contain another segment description scheme 450 and/or shot description scheme to form a segment tree. Such a segment tree may be used to define a temporal structure of a video program.
  • a video program may be segmented into story units, scenes, and shots, from which the segment description scheme 450 may contain such information as a table of contents.
  • the shot description scheme may contain a number of key frame description schemes, a mosaic description scheme(s), a camera motion description scheme(s), etc.
  • the key frame description scheme may contain a still image description scheme which may in turn contain color and texture descriptors. It is noted that various low level descriptors may be included in the still image description scheme under the segment description scheme. Also, the visual descriptors may be included in the region description scheme which is not necessarily under a still image description scheme. One example of a segment description scheme 450 is shown in FIG. 16 .
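A segment tree as described above can be sketched as a simple nested structure; the node layout (`title`, `children`) is an assumption for illustration, since the patent specifies only that segments may contain other segments to form a table of contents.

```python
# Sketch: a nested segment tree (story units -> scenes -> shots)
# flattened into an indented table of contents.

def table_of_contents(segment, depth=0):
    """Walk a segment tree and emit one indented line per segment."""
    lines = ["  " * depth + segment["title"]]
    for child in segment.get("children", []):
        lines.extend(table_of_contents(child, depth + 1))
    return lines

program = {
    "title": "Basketball Game",
    "children": [
        {"title": "First Half", "children": [{"title": "Opening tip shot"}]},
        {"title": "Second Half"},
    ],
}
toc = table_of_contents(program)
```

The same recursive walk supports jumping playback to any node, since each segment also carries time references in the full scheme.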
  • the region description scheme 452 defines the interrelationships between groups of pixels of the same and/or different frames of the video.
  • the region description scheme 452 may also contain geometrical features, color, texture features, motion features, etc.
  • the segment/region relation graph description scheme 454 defines the interrelationships between a plurality of regions (or region description schemes), a plurality of segments (or segment description schemes), and/or a plurality of regions (or description schemes) and segments (or description schemes).
  • the semantic structure description scheme 404 is used to specify semantic features of a video program (or audio), e.g. semantic events.
  • the semantic structure description scheme 404 preferably includes three modules, namely an event description scheme 480 , an object description scheme 482 , and an event/objection relation graph description scheme 484 .
  • the event description scheme 480 may be used to form relationships between different events of the video normally consisting of multiple frames of the video.
  • An event description scheme 480 may contain another event description scheme 480 to form an event tree. Such an event tree may be used to define a semantic index table for a video program. Multiple event trees may be created, thereby creating multiple index tables.
  • a video program may include multiple events, such as a basketball dunk, a fast break, and a free throw, and the event description scheme may contain such information as an index table.
  • the event description scheme may also contain references which link the event to the corresponding segments and/or regions specified in the syntactic structure description scheme. One example of an event description scheme is shown in FIG. 19 .
  • the object description scheme 482 defines the interrelationships between groups of pixels of the same and/or different frames of the video representative of objects.
  • the object description scheme 482 may contain another object description scheme and thereby form an object tree. Such an object tree may be used to define an object index table for a video program.
  • the object description scheme may also contain references which link the object to the corresponding segments and/or regions specified in the syntactic structure description scheme.
  • the event/object relation graph description scheme 484 defines the interrelationships between a plurality of events (or event description schemes), a plurality of objects (or object description schemes), and/or a plurality of events (or description schemes) and objects (or description schemes).
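The event/object relation graph might be represented as labeled edges between event and object description scheme nodes. This representation is an assumption for illustration; the patent specifies only that the graph defines interrelationships among events, objects, and their description schemes.

```python
# Sketch: an event/object relation graph as a list of labeled edges.

class RelationGraph:
    def __init__(self):
        self.edges = []

    def relate(self, source, relation, target):
        """Record a directed, labeled relation between two scheme nodes."""
        self.edges.append((source, relation, target))

    def related_to(self, node):
        """All (relation, target) pairs reachable from node in one step."""
        return [(rel, tgt) for src, rel, tgt in self.edges if src == node]

graph = RelationGraph()
graph.relate("event:dunk", "performed_by", "object:player23")
graph.relate("event:dunk", "occurs_in", "segment:second-half")

links = graph.related_to("event:dunk")
```

Edges into segment nodes illustrate the cross-links back to the syntactic structure description scheme mentioned above.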
  • the present inventors came to the realization that the particular design of the user preference description scheme is important to implement portability, while permitting adaptive updating, of the user preference description scheme. Moreover, the user preference description scheme should be readily usable by the system while likewise being suitable for modification based on the user's historical usage patterns. It is possible to collectively track all users of a particular device to build a database for the historical viewing preferences of the users of the device, and thereafter process the data dynamically to determine which content the users would likely desire. However, this implementation would require the storage of a large amount of data and the associated dynamic processing requirements to determine the user preferences. It is to be understood that the user preference description scheme may be used alone or in combination with other description schemes.
  • the usage preference description scheme 500 includes a description scheme of the user's audio and/or video consumption preferences.
  • the usage preference description scheme 500 describes one or more of the following, depending on the particular implementation, (a) browsing preferences, (b) filtering preferences, (c) searching preferences, and (d) device preferences of the user.
  • the type of preferences shown in the usage preference description scheme 500 are generally immediately usable by the system for selecting and otherwise using the available audio and/or video content.
  • the usage preference description scheme 500 includes data describing audio and/or video consumption of the user.
  • the usage history description scheme 502 includes a description scheme of the user's historical audio and/or video activity, such as browsing, device settings, viewing, and selection.
  • the usage history description scheme 502 describes one or more of the following, depending on the particular implementation, (a) browsing history, (b) filtering history, (c) searching history, and (d) device usage history.
  • the type of preferences shown in the usage history description scheme 502 are not generally immediately usable by the system for selecting and otherwise using the available audio and/or video content.
  • the data contained in the usage history description scheme 502 may be considered generally “unprocessed”, at least in comparison to the data contained in the usage preferences description scheme 500 because it generally contains the historical usage data of the audio and/or video content of the viewer.
  • capturing the user's usage history facilitates “automatic” composition of user preferences by a machine, as desired.
  • the usage history description scheme 502 is preferably relatively symmetric to the usage preference description scheme 500.
  • the symmetry permits more effective updating because less interpretation between the two description schemes is necessary in order to determine what data should be included in the preferences.
  • Numerous algorithms can then be applied in utilization of the history information in deriving user preferences. For instance, statistics can be computed from the history and utilized for this purpose.
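The statistics-based derivation described above can be sketched concretely. The following is a minimal illustration, in which the field names, list encoding of the usage history description 502, and thresholds are hypothetical assumptions rather than anything specified in the description:

```python
from collections import Counter

def derive_category_preferences(history_entries, top_n=3, min_count=2):
    """Derive preferred program categories from raw usage history.

    `history_entries` is a hypothetical encoding of the usage history
    description 502: one dict with a "category" field per consumed
    program. Categories consumed often enough are promoted into the
    derived preferences; `top_n` and `min_count` are illustrative.
    """
    counts = Counter(entry["category"] for entry in history_entries)
    # Keep only categories seen often enough to suggest a real preference.
    return [cat for cat, n in counts.most_common(top_n) if n >= min_count]

history = [
    {"category": "basketball"}, {"category": "basketball"},
    {"category": "news"}, {"category": "basketball"}, {"category": "news"},
    {"category": "cooking"},
]
preferred = derive_category_preferences(history)  # ["basketball", "news"]
```

Because the output is immediately a list of preferred categories, it is in a form the usage preference description 500 can absorb directly, which is the point of keeping the two schemes symmetric.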
  • the user preference description 20 may also include a user identification (user identifier) description 504 .
  • the user identification description 504 includes an identification of the particular user that is using the device. By incorporating a user identification description 504 more than one user may use the device while maintaining a different or a unique set of data within the usage preference description 500 and the usage history description 502 . Accordingly, the user identification description 504 associates the appropriate usage preference description(s) 500 and usage history description(s) 502 for the particular user identified by the user identification description 504 . With multiple user identification descriptions 504 , multiple entries within a single user identification description 504 identifying different users, and/or including the user identification description within the usage preference description 500 and/or usage history description 502 to provide the association therebetween, multiple users can readily use the same device while maintaining their individuality.
  • the user may more readily customize content anonymously.
  • the user's user identification description 504 may be used to identify multiple different sets of usage preference descriptions 500 —usage history descriptions 502 , from which the user may select for present interaction with the device depending on usage conditions.
  • the use of multiple user identification descriptions for the same user is useful when the user uses multiple different types of devices, such as a television, a home stereo, a business television, a hotel television, and a vehicle audio player, and maintains multiple different sets of preference descriptions.
  • the identification may likewise be used to identify groups of individuals, such as for example, a family.
  • the user identification requirements may be overridden by employing a temporary session user identification assigned by such devices.
  • the user identification description 504 may also contain demographic information of the user. In this manner, as the usage history description 502 increases during use over time, this demographic data and/or data regarding usage patterns may be made available to other sources. The data may be used for any purpose, such as for example, providing targeted advertising or programming on the device based on such data.
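As a sketch of the association described above, a device might key each usage preference description 500 and usage history description 502 by user identification, so that several users share one device without mixing their data. The dictionary layout and field names here are illustrative assumptions:

```python
# Hypothetical in-memory association between a user identification
# description (504) and that user's own usage preference description
# (500) and usage history description (502).
profiles = {}

def get_profile(user_id):
    # Create an empty preference/history pair on first use of the device.
    if user_id not in profiles:
        profiles[user_id] = {"usage_preferences": {}, "usage_history": []}
    return profiles[user_id]

alice = get_profile("alice")
alice["usage_history"].append({"program": "evening news", "action": "viewed"})
bob = get_profile("bob")  # Bob's data remains separate from Alice's
```

A temporary session identifier, as mentioned for public devices, would simply be another key in the same mapping, discarded when the session ends.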
  • an agent 510 processes the usage history description(s) 502 for a particular user to “automatically” determine the particular user's preferences.
  • the user's usage preference description 500 is updated to reflect data stored in the usage history description 502 .
  • This processing by the agent 510 is preferably performed on a periodic basis so that during normal operation the usage history description 502 does not need to be processed, or otherwise queried, to determine the user's current browsing, filtering, searching, and device preferences.
  • the usage preference description 500 is relatively compact and suitable for storage on a portable storage device, such as a smart card, for use by other devices as previously described.
  • the user may be traveling away from home with his smart card containing his usage preference description 500 .
  • the user will likely be browsing, filtering, searching, and setting device preferences of audio and/or video content on devices into which he provided his usage preference description 500 .
  • the audio and/or video content browsed, filtered, searched, and device preferences of the user may not be typically what he is normally interested in.
  • the user may desire more than one profile depending on the season, such as football season, basketball season, baseball season, fall, winter, summer, and spring. Accordingly, it may not be appropriate for the device to create a usage history description 502 and thereafter have the agent 510 “automatically” update the user's usage preference description 500 .
  • the device should include an option that disables the agent 510 from updating the usage preference description 500 .
  • the usage preference description 500 may include one or more fields or data structures that indicate whether or not the user desires the usage preference description 500 (or portions thereof) to be updated.
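One possible shape for such a switch is an agent update routine that skips any section the user has locked. The field names (`allow_auto_update`, `values`) are hypothetical; the description only requires that one or more fields indicate whether updating is permitted:

```python
def agent_update(usage_preferences, derived_preferences):
    """Merge agent-derived preferences into the usage preference
    description 500, honoring per-section "allow automatic update"
    switches (a hypothetical encoding of the fields described above)."""
    for section, values in derived_preferences.items():
        entry = usage_preferences.get(section)
        if entry is None or not entry.get("allow_auto_update", False):
            continue  # the user disabled automatic updating for this section
        entry["values"] = values
    return usage_preferences

prefs = {
    "filtering": {"allow_auto_update": True, "values": ["news"]},
    "browsing": {"allow_auto_update": False, "values": ["highlights"]},
}
agent_update(prefs, {"filtering": ["news", "basketball"],
                     "browsing": ["full playback"]})
# filtering is updated; browsing keeps the user's locked setting
```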
  • the device may use the program descriptions provided by any suitable source describing the current and/or future audio and/or video content available from which a filtering agent 520 selects the appropriate content for the particular user(s).
  • the content is selected based upon the usage preference description for a particular user identification(s) to determine a list of preferred audio and/or video programs.
  • with a relatively compact user preference description 500, the user's preferences are readily movable to different devices, such as a personal video recorder, a TiVO player, a RePlay Networks player, a car audio player, or other audio and/or video appliances. Yet, the user preference description 500 may be updated in accordance with the user's browsing, filtering, searching, and device preferences.
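A filtering agent 520 of the kind described above might, as a rough sketch, score the available program descriptions against the usage preference description and return the preferred programs first. The one-point-per-matching-field scoring below is an illustrative assumption, not the patent's method:

```python
def rank_programs(program_descriptions, usage_preferences):
    """Select and rank programs for a user: score each program
    description against the filtering preferences and keep matches."""
    def score(program):
        s = 0
        # One point for a preferred category (assumed scoring rule).
        if program.get("category") in usage_preferences.get("categories", []):
            s += 1
        # One point per shared keyword.
        s += len(set(program.get("keywords", []))
                 & set(usage_preferences.get("keywords", [])))
        return s

    ranked = sorted(program_descriptions, key=score, reverse=True)
    return [p for p in ranked if score(p) > 0]

programs = [
    {"title": "Cooking Hour", "category": "cooking", "keywords": []},
    {"title": "NBA Tonight", "category": "sports", "keywords": ["basketball"]},
]
prefs = {"categories": ["sports"], "keywords": ["basketball"]}
preferred = rank_programs(programs, prefs)  # list of preferred programs
```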
  • the usage preference description 500 preferably includes three different categories of descriptions, depending on the particular implementation.
  • the preferred descriptions include (a) a browsing preferences description 530, (b) a filtering and search preferences description 532, and (c) a device preferences description 534.
  • the browsing preferences description 530 relates to the viewing preferences of audio and/or video programs.
  • the filtering and search preferences description 532 relates to audio and/or video program level preferences.
  • the program level preferences are not necessarily used at the same time as the (browsing) viewing preferences.
  • preferred programs can be determined as a result of filtering program descriptions according to user's filtering preferences. A particular preferred program may subsequently be viewed in accordance with user's browsing preferences.
  • the device preferences description 534 relates to the preferences for setting up the device in relation to the type of content being presented, e.g. romance, drama, action, violence, evening, morning, day, weekend, weekday, and/or the available presentation devices.
  • presentation devices may include stereo sound, mono sound, surround sound, multiple potential displays, multiple different sets of audio speakers, AC-3, and Dolby Digital. It may likewise be observed that the device preferences description 534 is likewise separate, at least logically, from the browsing description 530 and filtering/search preferences description 532 .
  • the browsing preferences description 530 contains descriptors that describe preferences of the user for browsing multimedia (audio and/or video) information.
  • the browsing preferences may include user's preference for continuous playback of the entire program versus visualizing a short summary of the program.
  • Various summary types may be described in the program descriptions describing multiple different views of programs, where these descriptions are utilized by the device to facilitate rapid non-linear browsing, viewing, and navigation. Parameters of the various summary types should also be specified, e.g., the number of hierarchy levels when a keyframe summary is preferred, or the time duration of the video highlight when a highlight summary is preferred.
  • browsing preferences may also include descriptors describing parental control settings.
  • a switch descriptor (set by the user) should also be included to specify whether or not the preferences can be modified without consulting the user first. This prevents inadvertent changing or updating of the preferences by the device.
  • the browsing preferences are media content dependent. For example, a user may prefer 15 minute video highlight of a basketball game or may prefer to see only the 3-point shots. The same user may prefer a keyframe summary with two levels of hierarchy for home videos.
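The browsing preferences above, including per-summary-type parameters such as the number of hierarchy levels or the highlight duration, might be encoded along the following lines. All field names are hypothetical:

```python
# Hypothetical encoding of browsing preference entries (530): each entry
# names a preferred summary type together with the parameters the device
# needs to render it, e.g. a two-level keyframe summary for home videos
# or a 15-minute highlight for a basketball game.
browsing_preferences = [
    {"summary_type": "keyframe", "hierarchy_levels": 2},
    {"summary_type": "highlight", "duration_minutes": 15},
]

def summary_parameters(summary_type):
    # Return the parameters for the chosen view, or None if the user
    # expressed no preference for that summary type.
    for pref in browsing_preferences:
        if pref["summary_type"] == summary_type:
            return {k: v for k, v in pref.items() if k != "summary_type"}
    return None
```

Media-content dependence could be added by keying the entries on a program category as well, so the same user can hold different summary preferences for sports and for home videos.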
  • the filtering and search preferences description 532 preferably has four descriptions defined therein, depending on the particular embodiment.
  • the keyword preferences description 540 is used to specify favorite topics that may not be captured in the title, category, etc., information. This permits the acceptance of a query for matching entries in any of the available data fields.
  • the content preferences description 542 is used to facilitate capturing, for instance, favorite actors, directors.
  • the creation preferences description 544 is used to specify capturing, for instance, titles of favorite shows.
  • the classification preferences description 546 is used to specify descriptions, for instance, a favorite program category.
  • a switch descriptor, activated by the user, may be included to specify whether or not the preferences may be modified without consulting the user, as previously described.
  • the device preferences description 534 contains descriptors describing preferred audio and/or video rendering settings, such as volume, balance, bass, treble, brightness, contrast, closed captioning, AC-3, Dolby digital, which display device of several, type of display device, etc.
  • the settings of the device relate to how the user browses and consumes the audio and/or video content. It is desirable to be able to specify the device setting preferences in a media type and content-dependent manner. For example the preferred volume settings for an action movie may be higher than a drama, or the preferred settings of bass for classical music and rock music may be different.
  • a switch descriptor, activated by the user, may be included to specify whether or not the preferences may be modified without consulting the user, as previously described.
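A media- and content-dependent lookup of device settings, as described above, could be sketched as follows; the `(media type, category)` keying and the fallback `default` entry are illustrative assumptions:

```python
def device_settings(media_type, category, device_preferences):
    """Look up preferred rendering settings (534) in a media- and
    content-dependent manner, e.g. louder volume for an action movie
    than for a drama, different bass for classical versus rock."""
    return device_preferences.get((media_type, category),
                                  device_preferences["default"])

device_preferences = {
    ("video", "action"): {"volume": 80, "bass": 60},
    ("audio", "classical"): {"volume": 55, "bass": 40},
    "default": {"volume": 65, "bass": 50},
}
settings = device_settings("video", "action", device_preferences)
```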
  • the usage preferences description may be used in cooperation with an MPEG-7 compliant data stream and/or device.
  • MPEG-7 descriptions are described in ISO/IEC JTC1/SC29/WG11, "MPEG-7 Media/Meta DSs (V0.2)", August 1999, incorporated by reference herein. It is preferable that media content descriptions are consistent with descriptions of the preferences of users consuming the media. Consistency can be achieved by using common descriptors in media and user preference descriptions or by specifying a correspondence between user preferences and media descriptors. Browsing preferences descriptions are preferably consistent with media descriptions describing different views and summaries of the media.
  • the content preferences description 542 is preferably consistent with, e.g., a subset of the content description of the media 553 specified in MPEG-7 by content description scheme.
  • the classification preferences description 546 is preferably consistent with, e.g., a subset of the classification description 554 defined in MPEG-7 as a classification description scheme.
  • the creation preferences description 544 is preferably consistent with, e.g., a subset of the creation description 556 specified in MPEG-7 by a creation description scheme.
  • the keyword preferences description 540 is preferably a string supporting multiple languages and consistent with corresponding media content description schemes. Consistency between media and user preference descriptions is depicted in FIG. 26 by double arrows in the case of content, creation, and classification preferences.
  • the usage history description 502 preferably includes three different categories of descriptions, depending on the particular implementation.
  • the preferred descriptions include (a) browsing history description 560 , (b) filtering and search history description 562 , and (c) device usage history description 564 , as previously described in relation to the usage preference description 500 .
  • the filtering and search history description 562 preferably has four descriptions defined therein, depending on the particular embodiment, namely, a keyword usage history description 566, a content usage history description 568, a creation usage history description 570, and a classification usage history description 572, as previously described with respect to the preferences.
  • the usage history description 502 may contain additional descriptors therein (or description if desired) that describe the time and/or time duration of information contained therein.
  • the time refers to the duration of consuming a particular audio and/or video program.
  • the duration of time that a particular program has been viewed provides information that may be used to determine user preferences. For example, if a user only watches a show for 5 minutes then it may not be a suitable preference for inclusion in the usage preference description 500.
  • the present inventors came to the realization that an even more accurate measure of the user's preference for a particular audio and/or video program is the time viewed in light of the total duration of the program. This accounts for the relative viewing duration of a program. For example, watching 30 minutes of a 4 hour show may be of less relevance than watching 30 minutes of a 30 minute show in determining preference data for inclusion in the usage preference description 500.
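The relative viewing duration measure can be written down directly; the 50% inclusion threshold below is an illustrative assumption, not a value given in the description:

```python
def viewing_relevance(minutes_watched, program_minutes):
    """Relative viewing duration: the fraction of the program actually
    watched, a more accurate preference signal than absolute time."""
    return minutes_watched / program_minutes

def include_in_preferences(minutes_watched, program_minutes, threshold=0.5):
    # Treat the program as a preference candidate only if the user
    # watched a substantial fraction of it (threshold is an assumption).
    return viewing_relevance(minutes_watched, program_minutes) >= threshold

# 30 minutes of a 4-hour show vs. 30 minutes of a 30-minute show:
long_show = include_in_preferences(30, 240)   # watched 0.125 of the program
short_show = include_in_preferences(30, 30)   # watched the entire program
```

Under this measure the same 30 minutes of absolute viewing time yields opposite conclusions for the two programs, which is exactly the distinction the text draws.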
  • audio/video program descriptions are available from the broadcast or other source, such as a telephone line.
  • the user preference description facilitates personalization of the browsing, filtering and search, and device settings.
  • the user preferences are stored at the user's terminal with provision for transporting it to other systems, for example via a smart card.
  • the user preferences may be stored in a server and the content adaptation can be performed according to user descriptions at the server and then the preferred content is transmitted to the user.
  • the user may directly provide the user preferences, if desired.
  • the user preferences and/or user history may likewise be provided to a service provider.
  • the system may employ an application that records user's usage history in the form of usage history description, as previously defined.
  • the usage history description is then utilized by another application, e.g., a smart agent, to automatically map usage history to user preferences.

Abstract

A method of using a system with at least one of audio, image, and video comprising a plurality of frames, comprising the steps of providing a usage preferences description where the usage preference description includes at least one of a browsing preferences description, a filtering preferences description, a search preferences description, and a device preferences description. The browsing preferences description relates to a user's viewing preferences. The filtering and search preferences descriptions relate to at least one of (1) content preferences of the at least one of audio, image, and video, (2) classification preferences of the at least one of audio, image, and video, (3) keyword preferences of the at least one of audio, image, and video, and (4) creation preferences of the at least one of audio, image, and video. The device preferences description relates to user's preferences regarding presentation characteristics. A usage history description is provided where the usage history description includes at least one of a browsing history description, a filtering history description, a search history description, and a device usage history description. The browsing history description relates to a user's viewing preferences. The filtering and search history descriptions relate to at least one of (1) content usage history of the at least one of audio, image, and video, (2) classification usage history of the at least one of audio, image, and video, (3) keyword usage history of the at least one of audio, image, and video, and (4) creation usage history of the at least one of audio, image, and video. The device usage history description relates to user's preferences regarding presentation characteristics. The usage preferences description and the usage history description are used to enhance system functionality.

Description

    BACKGROUND OF THE INVENTION
  • The present invention relates to a system for managing audiovisual information, and in particular to a system for audiovisual information browsing, filtering, searching, archiving, and personalization.
  • Video cassette recorders (VCRs) may record video programs in response to pressing a record button or may be programmed to record video programs based on the time of day. However, the viewer must program the VCR based on information from a television guide to identify relevant programs to record. After recording, the viewer scans through the entire video tape to select relevant portions of the program for viewing using the functionality provided by the VCR, such as fast forward and fast reverse. Unfortunately, the searching and viewing is based on a linear search, which may require significant time to locate the desired portions of the program(s) and fast forward to the desired portion of the tape. In addition, it is time consuming to program the VCR in light of the television guide to record desired programs. Also, unless the viewer recognizes the programs from the television guide as desirable it is unlikely that the viewer will select such programs to be recorded.
  • RePlayTV and TiVo have developed hard disk based systems that receive, record, and play television broadcasts in a manner similar to a VCR. The systems may be programmed with the viewer's viewing preferences. The systems use a telephone line interface to receive scheduling information similar to that available from a television guide. Based upon the system programming and the scheduling information, the system automatically records programs that may be of potential interest to the viewer. Unfortunately, viewing the recorded programs occurs in a linear manner and may require substantial time. In addition, each system must be programmed for an individual's preference, likely in a different manner.
  • Freeman et al., U.S. Pat. No. 5,861,881, disclose an interactive computer system where subscribers can receive individualized content.
  • With all the aforementioned systems, each individual viewer is required to program the device according to his particular viewing preferences. Unfortunately, each different type of device has different capabilities and limitations which limit the selections of the viewer. In addition, each device includes a different interface which the viewer may be unfamiliar with. Further, if the operator's manual is inadvertently misplaced it may be difficult for the viewer to efficiently program the device.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention overcomes the aforementioned drawbacks of the prior art by providing a method of using a system with at least one of audio, image, and video comprising a plurality of frames, comprising the steps of providing a usage preferences description scheme where the usage preference description scheme includes at least one of a browsing preferences description scheme, a filtering preferences description scheme, a search preferences description scheme, and a device preferences description scheme. The browsing preferences description scheme relates to a user's viewing preferences. The filtering and search preferences description schemes relate to at least one of (1) content preferences of the at least one of audio, image, and video, (2) classification preferences of the at least one of audio, image, and video, (3) keyword preferences of the at least one of audio, image, and video, and (4) creation preferences of the at least one of audio, image, and video. The device preferences description scheme relates to user's preferences regarding presentation characteristics. A usage history description scheme is provided where the usage history description scheme includes at least one of a browsing history description scheme, a filtering history description scheme, a search history description scheme, and a device usage history description scheme. The browsing history description scheme relates to a user's viewing preferences. The filtering and search history description schemes relate to at least one of (1) content usage history of the at least one of audio, image, and video, (2) classification usage history of the at least one of audio, image, and video, (3) keyword usage history of the at least one of audio, image, and video, and (4) creation usage history of the at least one of audio, image, and video. The device usage history description scheme relates to user's preferences regarding presentation characteristics. 
The usage preferences description scheme and the usage history description scheme are used to enhance system functionality.
  • The foregoing and other objectives, features and advantages of the invention will be more readily understood upon consideration of the following detailed description of the invention, taken in conjunction with the accompanying drawings.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • FIG. 1 is an exemplary embodiment of a program, a system, and a user, with associated description schemes, of an audiovisual system of the present invention.
  • FIG. 2 is an exemplary embodiment of the audiovisual system, including an analysis module, of FIG. 1.
  • FIG. 3 is an exemplary embodiment of the analysis module of FIG. 2.
  • FIG. 4 is an illustration of a thumbnail view (category) for the audiovisual system.
  • FIG. 5 is an illustration of a thumbnail view (channel) for the audiovisual system.
  • FIG. 6 is an illustration of a text view (channel) for the audiovisual system.
  • FIG. 7 is an illustration of a frame view for the audiovisual system.
  • FIG. 8 is an illustration of a shot view for the audiovisual system.
  • FIG. 9 is an illustration of a key frame view for the audiovisual system.
  • FIG. 10 is an illustration of a highlight view for the audiovisual system.
  • FIG. 11 is an illustration of an event view for the audiovisual system.
  • FIG. 12 is an illustration of a character/object view for the audiovisual system.
  • FIG. 13 is an alternative embodiment of a program description scheme including a syntactic structure description scheme, a semantic structure description scheme, a visualization description scheme, and a meta information description scheme.
  • FIG. 14 is an exemplary embodiment of the visualization description scheme of FIG. 13.
  • FIG. 15 is an exemplary embodiment of the meta information description scheme of FIG. 13.
  • FIG. 16 is an exemplary embodiment of a segment description scheme for the syntactic structure description scheme of FIG. 13.
  • FIG. 17 is an exemplary embodiment of a region description scheme for the syntactic structure description scheme of FIG. 13.
  • FIG. 18 is an exemplary embodiment of a segment/region relation description scheme for the syntactic structure description scheme of FIG. 13.
  • FIG. 19 is an exemplary embodiment of an event description scheme for the semantic structure description scheme of FIG. 13.
  • FIG. 20 is an exemplary embodiment of an object description scheme for the semantic structure description scheme of FIG. 13.
  • FIG. 21 is an exemplary embodiment of an event/object relation graph description scheme for the semantic structure description scheme of FIG. 13.
  • FIG. 22 is an exemplary embodiment of a user preference description scheme.
  • FIG. 23 is an exemplary embodiment of the interrelationship between a usage history description scheme, an agent, and the usage preference description scheme of FIG. 22.
  • FIG. 24 is an exemplary embodiment of the interrelationship between audio and/or video programs together with their descriptors, user identification, and the usage preference description scheme of FIG. 22.
  • FIG. 25 is an exemplary embodiment of a usage preference description scheme of FIG. 22.
  • FIG. 26 is an exemplary embodiment of the interrelationship between the usage description schemes and MPEG-7 description schemes.
  • FIG. 27 is an exemplary embodiment of a usage history description scheme of FIG. 22.
  • FIG. 28 is an exemplary system incorporating the user history description scheme.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • Many households today have many sources of audio and video information, such as multiple television sets, multiple VCR's, a home stereo, a home entertainment center, cable television, satellite television, internet broadcasts, world wide web, data services, specialized Internet services, portable radio devices, and a stereo in each of their vehicles. For each of these devices, a different interface is normally used to obtain, select, record, and play the video and/or audio content. For example, a VCR permits the selection of the recording times but the user has to correlate the television guide with the desired recording times. Another example is the user selecting a preferred set of preselected radio stations for his home stereo and also presumably selecting the same set of preselected stations for each of the user's vehicles. If another household member desires a different set of preselected stereo selections, the programming of each audio device would need to be reprogrammed at substantial inconvenience.
  • The present inventors came to the realization that users of visual information and listeners to audio information, such as for example radio, audio tapes, video tapes, movies, and news, desire to be entertained and informed in more than merely one uniform manner. In other words, the audiovisual information presented to a particular user should be in a format and include content suited to their particular viewing preferences. In addition, the format should be dependent on the content of the particular audiovisual information. The amount of information presented to a user or a listener should be limited to only the amount of detail desired by the particular user at the particular time. For example, with the ever increasing demands on the user's time, the user may desire to watch only 10 minutes of, or merely the highlights of, a basketball game. In addition, the present inventors came to the realization that the necessity of programming multiple audio and visual devices with their particular viewing preferences is a burdensome task, especially when presented with unfamiliar recording devices when traveling. When traveling, users desire to easily configure unfamiliar devices, such as audiovisual devices in a hotel room, with their viewing and listening preferences in an efficient manner.
  • The present inventors came to the further realization that a convenient technique of merely recording the desired audio and video information is not sufficient because the presentation of the information should be in a manner that is time efficient, especially in light of the limited time frequently available for the presentation of such information. In addition, the user should be able to access only that portion of all of the available information that the user is interested in, while skipping the remainder of the information.
  • A user is not capable of watching or otherwise listening to the vast potential amount of information available through all, or even a small portion of, the sources of audio and video information. In addition, with the increasing information potentially available, the user is not likely even aware of the potential content of information that he may be interested in. In light of the vast amount of audio, image, and video information, the present inventors came to the realization that a system that records and presents to the user audio and video information based upon the user's prior viewing and listening habits, preferences, and personal characteristics, generally referred to as user information, is desirable. In addition, the system may present such information based on the capabilities of the system devices. This permits the system to record desirable information and to customize itself automatically to the user and/or listener. It is to be understood that the terms user, viewer, and/or listener may be used interchangeably for any type of content. Also, the user information should be portable between and usable by different devices so that other devices may likewise be configured automatically to the particular user's preferences upon receiving the viewing information.
  • In light of the foregoing realizations and motivations, the present inventors analyzed a typical audio and video presentation environment to determine the significant portions of the typical audiovisual environment. First, referring to FIG. 1 the video, image, and/or audio information 10 is provided or otherwise made available to a user and/or a (device) system. Second, the video, image, and/or audio information is presented to the user from the system 12 (device), such as a television set or a radio. Third, the user interacts both with the system (device) 12 to view the information 10 in a desirable manner and has preferences to define which audio, image, and/or video information is obtained in accordance with the user information 14. After the proper identification of the different major aspects of an audiovisual system the present inventors then realized that information is needed to describe the informational content of each portion of the audiovisual system 16.
  • With three portions of the audiovisual presentation system 16 identified, the functionality of each portion is identified together with its interrelationship to the other portions. To define the necessary interrelationships, a set of description schemes containing data describing each portion is defined. The description schemes include data that is auxiliary to the programs 10, the system 12, and the user 14, to store a set of information, ranging from human readable text to encoded data, that can be used in enabling browsing, filtering, searching, archiving, and personalization. By providing a separate description scheme describing the program(s) 10, the user 14, and the system 12, the three portions (program, user, and system) may be combined together to provide an interactivity not previously achievable. In addition, different programs 10, different users 14, and different systems 12 may be combined together in any combination, while still maintaining full compatibility and functionality. It is to be understood that the description scheme may contain the data itself or include links to the data, as desired.
  • A program description scheme 18 related to the video, still image, and/or audio information 10 preferably includes two sets of information, namely, program views and program profiles. The program views define logical structures of the frames of a video that define how the video frames are potentially to be viewed, suitable for efficient browsing. For example, the program views may contain a set of fields that contain data for the identification of key frames, segment definitions between shots, highlight definitions, video summary definitions, different lengths of highlights, a thumbnail set of frames, individual shots or scenes, a representative frame of the video, a grouping of different events, and a close-up view. The program view descriptions may contain thumbnail, slide, key frame, highlight, and close-up views so that users can filter and search not only at the program level but also within a particular program. The description scheme also enables users to access information in varying amounts of detail by supporting, for example, a key frame view as a part of a program view, providing multiple levels of summary ranging from coarse to fine. The program profiles define distinctive characteristics of the content of the program, such as actors, stars, rating, director, release date, time stamps, keyword identification, trigger profile, still profile, event profile, character profile, object profile, color profile, texture profile, shape profile, motion profile, and categories. The program profiles are especially suitable to facilitate filtering and searching of the audio and video information. The description scheme also enables users to discover interesting programs of which they may otherwise be unaware, through the provision of a user description scheme.
The user description scheme provides information to a software agent that in turn performs a search and filtering on behalf of the user by possibly using the system description scheme and the program description scheme information. It is to be understood that in one of the embodiments of the invention merely the program description scheme is included.
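The two-part structure described above, program views for browsing and program profiles for filtering and searching, can be sketched as a simple data structure. This is an illustrative sketch only; the field names below are hypothetical and are not drawn from any standard schema.

```python
# Hypothetical sketch of a program description scheme holding the two sets of
# information described above: program views (for browsing) and program
# profiles (for filtering and searching). Field names are illustrative.
from dataclasses import dataclass, field

@dataclass
class ProgramViews:
    key_frames: list = field(default_factory=list)        # frame indices of key frames
    highlight_segments: list = field(default_factory=list)  # (start, end) frame pairs
    thumbnail_frames: list = field(default_factory=list)
    close_up_segments: list = field(default_factory=list)

@dataclass
class ProgramProfiles:
    actors: list = field(default_factory=list)
    categories: list = field(default_factory=list)
    rating: str = ""
    keywords: list = field(default_factory=list)

@dataclass
class ProgramDescriptionScheme:
    title: str
    views: ProgramViews = field(default_factory=ProgramViews)
    profiles: ProgramProfiles = field(default_factory=ProgramProfiles)

# A filter can then search across programs (e.g., by category) as well as
# within a particular program (e.g., by keyword or view).
pds = ProgramDescriptionScheme(
    title="Basketball Game",
    profiles=ProgramProfiles(categories=["sports", "basketball"],
                             keywords=["highlight", "final quarter"]),
)
```

Note that the scheme may contain the data itself or merely links to the data; the lists above could equally hold references.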
  • Program views contained in the program description scheme are a feature that supports functionality such as a close-up view. In the close-up view, a certain image object, e.g., a famous basketball player such as Michael Jordan, can be viewed up close by playing back a close-up sequence that is separate from the original program. An alternative view can be incorporated in a straightforward manner. A character profile, on the other hand, may contain the spatio-temporal position and size of a rectangular region around the character of interest. This region can be enlarged by the presentation engine, or the presentation engine may darken the area outside the region to focus the user's attention on the character. Information within the program description scheme may contain data about the initial size or location of the region, movement of the region from one frame to another, and duration, in terms of the number of frames featuring the region. The character profile also provides provision for including text annotation and audio annotation about the character, as well as web page information and any other suitable information. Such character profiles may include audio annotation which is separate from, and in addition to, the associated audio track of the video.
  • The program description scheme may likewise contain similar information regarding audio (such as radio broadcasts) and images (such as analog or digital photographs or a frame of a video).
  • The user description scheme 20 preferably includes the user's personal preferences and information regarding the user's viewing history, such as, for example, browsing history, filtering history, searching history, and device setting history. The user's personal preferences include information regarding particular programs and categorizations of programs that the user prefers to view. The user description scheme may also include personal information about the particular user, such as demographic and geographic information, e.g., zip code and age. The explicit definition of the particular programs, or attributes related thereto, permits the system 16 to select those programs from the information contained within the available program description schemes 18 that may be of interest to the user. Frequently, the user neither desires to learn to program the device nor desires to program it explicitly. In addition, the user description scheme 20 may not be sufficiently robust to include explicit definitions describing all desirable programs for a particular user. In such a case, the capability of the user description scheme 20 to adapt to the viewing habits of the user, to accommodate different viewing characteristics not explicitly provided for or otherwise difficult to describe, is useful. The user description scheme 20 may then be augmented automatically, or any suitable technique may be used to compare the information contained in the user description scheme 20 against the available information contained in the program description scheme 18 to make selections. The user description scheme provides a technique for holding user preferences ranging from program categories to program views, as well as usage history. User description scheme information is persistent, but can be updated by the user, or by an intelligent software agent on behalf of the user, at any arbitrary time. It may also be disabled by the user at any time, if the user decides to do so.
In addition, the user description scheme is modular and portable, so that users can carry or port it from one device to another, such as with a handheld electronic device or smart card, or have it transported over a network connecting multiple devices. When the user description scheme is standardized among different manufacturers or products, user preferences become portable. For example, a user can personalize the television receiver in a hotel room, permitting users to access information they prefer at any time and anywhere. In a sense, the user description scheme is persistent and not tied to any particular time or device. In addition, selected information within the user description scheme may be encrypted, since at least part of the information may be deemed to be private (e.g., demographics). A user description scheme may be associated with an audiovisual program broadcast and compared, at the receiver, with a particular user's description scheme to readily determine whether or not the program's intended audience profile matches that of the user. It is to be understood that in one of the embodiments of the invention merely the user description scheme is included.
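The portability described above amounts to serializing the scheme on one device and restoring it on another, whether the carrier is a smart card or a network connection. The sketch below illustrates the round trip with a hypothetical, illustrative schema; the field names are assumptions, not part of any standard.

```python
# Illustrative sketch of porting a user description scheme between devices by
# serializing it, as a smart card or network transfer might. The schema used
# here is hypothetical.
import json

user_ds = {
    "preferences": {"categories": ["news", "basketball"],
                    "radio_presets": [88.5, 101.1]},
    "usage_history": {"browsing": [], "filtering": [], "searching": []},
    "demographics": {"zip_code": "12345", "age_group": "30-39"},
}

# "Carry" the scheme to another device as a compact byte string. In practice
# private fields (e.g., demographics) could be encrypted at this step.
card_payload = json.dumps(user_ds).encode("utf-8")

# The second device restores the same preferences without reprogramming.
restored = json.loads(card_payload.decode("utf-8"))
```

Because only the byte string crosses the device boundary, any appliance that understands the standardized structure can be personalized this way.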
  • The system description scheme 22 preferably manages the individual programs and other data. The management may include maintaining lists of programs, categories, channels, users, videos, audio, and images. The management may include the capabilities of a device for providing the audio, video, and/or images. Such capabilities may include, for example, screen size, stereo, AC3, DTS, color, black/white, etc. The management may also include relationships between any one or more of the user, the audio, and the images in relation to one or more of a program description scheme(s) and a user description scheme(s). In a similar manner the management may include relationships between one or more of the program description scheme(s) and user description scheme(s). It is to be understood that in one of the embodiments of the invention merely the system description scheme is included.
  • The descriptors of the program description scheme and the user description scheme should overlap, at least partially, so that the potential desirability of the program can be determined by comparing descriptors representative of the same information. For example, the program and user description schemes may include the same set of categories and actors. The program description scheme has no knowledge of the user description scheme, and vice versa, so that each description scheme is not dependent on the other for its existence. It is not necessary for the description schemes to be fully populated. It is also beneficial not to combine the program description scheme with the user description scheme, because there will likely be thousands of programs with associated description schemes which, if combined with the user description scheme, would result in an unnecessarily large user description scheme. It is desirable to keep the user description scheme small so that it is more readily portable. Accordingly, a system including only the program description scheme and the user description scheme would be beneficial.
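The comparison of overlapping descriptors can be sketched as a simple scoring function over the shared fields. This is a minimal illustration, not the system's actual matching algorithm; the scoring rule (counting shared descriptor values) is an assumption.

```python
# A minimal sketch of comparing overlapping descriptors (categories, actors)
# between a program description scheme and a user description scheme to score
# a program's potential desirability. The counting rule is an assumption.
def desirability(program_ds, user_ds):
    score = 0
    # Only fields present in both schemes are compared; neither scheme
    # depends on the other for its existence, and neither need be fully
    # populated.
    for fld in ("categories", "actors"):
        program_vals = set(program_ds.get(fld, []))
        user_vals = set(user_ds.get(fld, []))
        score += len(program_vals & user_vals)
    return score

program = {"categories": ["sports", "basketball"], "actors": ["Michael Jordan"]}
user = {"categories": ["basketball"], "actors": ["Michael Jordan"]}
print(desirability(program, user))  # 2: one shared category, one shared actor
```

Because the function reads each scheme independently through `get`, an empty or partially populated scheme simply contributes no score, consistent with the independence described above.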
  • The user description scheme and the system description scheme should include at least partially overlapping fields. With overlapping fields, the system can capture desired information that would otherwise not be recognized as desirable. The system description scheme preferably includes a list of users and available programs. Based on the master list of available programs and the associated program description schemes, the system can match the desired programs. It is also beneficial not to combine the system description scheme with the user description scheme, because there will likely be thousands of programs stored in the system description schemes which, if combined with the user description scheme, would result in an unnecessarily large user description scheme. It is desirable to keep the user description scheme small so that it is more readily portable. For example, the user description scheme may include radio station preselected frequencies and/or types of stations, while the system description scheme includes the available radio stations in particular cities. When traveling to a different city, the user description scheme together with the system description scheme will permit reprogramming the radio stations. Accordingly, a system including only the system description scheme and the user description scheme would be beneficial.
  • The program description scheme and the system description scheme should include at least partially overlapping fields. With the overlapping fields, the system description scheme will be capable of storing the information contained within the program description scheme, so that the information is properly indexed. With proper indexing, the system is capable of matching such information with the user information, if available, for obtaining and recording suitable programs. If the program description scheme and the system description scheme were not overlapping, then no information could be extracted from the programs and stored. System capabilities specified within the system description scheme of a particular viewing system can be correlated with a program description scheme to determine the views that can be supported by the viewing system. For instance, if the viewing device is not capable of playing back video, its system description scheme may describe its viewing capabilities as limited to a keyframe view and a slide view only. The program description scheme of a particular program and the system description scheme of the viewing system are utilized to present the appropriate views to the viewing system. Thus, a server of programs serves the appropriate views according to a particular viewing system's capabilities, which may be communicated over a network or communication channel connecting the server with the user's viewing device. It is preferred to maintain the program description scheme separate from the system description scheme, because content providers repackage the content and description schemes in different styles, times, and formats. Preferably, the program description scheme remains associated with the program, even if displayed at a different time. Accordingly, a system including only the system description scheme and the program description scheme would be beneficial.
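The capability negotiation just described, a server withholding views the device cannot support, can be sketched as a small filter. The view names and the capability flag below are hypothetical placeholders for whatever the system description scheme would actually carry.

```python
# Hypothetical sketch of correlating a viewing system's capabilities (from
# its system description scheme) with the views a program offers (from its
# program description scheme), so a server sends only supportable views.
def supported_views(program_views, system_capabilities):
    # Views assumed to require video playback are withheld from devices
    # whose system description scheme reports no such capability.
    needs_video = {"highlight", "close-up"}
    views = []
    for v in program_views:
        if v in needs_video and not system_capabilities.get("video_playback", False):
            continue
        views.append(v)
    return views

program_views = ["keyframe", "slide", "highlight", "close-up"]
limited_device = {"video_playback": False}  # e.g., keyframe/slide viewer only
print(supported_views(program_views, limited_device))  # ['keyframe', 'slide']
```

In a networked deployment, the `system_capabilities` mapping would be the part of the system description scheme transported to the server over the communication channel.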
  • By preferably maintaining the independence of each of the three description schemes while having fields that correlate the same information, the programs 10, the users 14, and the system 12 may be interchanged with one another while maintaining the functionality of the entire system 16. Referring to FIG. 2, the audio, visual, or audiovisual program 38, is received by the system 16. The program 38 may originate at any suitable source, such as for example broadcast television, cable television, satellite television, digital television, Internet broadcasts, world wide web, digital video discs, still images, video cameras, laser discs, magnetic media, computer hard drive, video tape, audio tape, data services, radio broadcasts, and microwave communications. The program description stream may originate from any suitable source, such as for example PSIP/DVB-SI information in digital television broadcasts, specialized digital television data services, specialized Internet services, world wide web, data files, data over the telephone, and memory, such as computer memory. The program, user, and/or system description scheme may be transported over a network (communication channel). For example, the system description scheme may be transported to the source to provide the source with views or other capabilities that the device is capable of using. In response, the source provides the device with image, audio, and/or video content customized or otherwise suitable for the particular device. The system 16 may include any device(s) suitable to receive any one or more of such programs 38. An audiovisual program analysis module 42 performs an analysis of the received programs 38 to extract and provide program related information (descriptors) to the description scheme (DS) generation module 44. 
The program related information may be extracted from the data stream including the program 38 or obtained from any other source, such as for example data transferred over a telephone line, data already transferred to the system 16 in the past, or data from an associated file. The program related information preferably includes data defining both the program views and the program profiles available for the particular program 38. The analysis module 42 performs an analysis of the programs 38 using information obtained from (i) automatic audio-video analysis methods on the basis of low-level features that are extracted from the program(s), (ii) event detection techniques, (iii) data that is available (or extractable) from data sources or electronic program guides (EPGs, DVB-SI, and PSIP), and (iv) user information obtained from the user description scheme 20 to provide data defining the program description scheme.
  • The selection of a particular program analysis technique depends on the amount of readily available data and the user preferences. For example, if a user prefers to watch a 5 minute video highlight of a particular program, such as a basketball game, the analysis module 42 may invoke a knowledge based system 90 (FIG. 3) to determine the highlights that form the best 5 minute summary. The knowledge based system 90 may invoke a commercial filter 92 to remove commercials and a slow motion detector 54 to assist in creating the video summary. The analysis module 42 may also invoke other modules to bring information together (e.g., textual information) to author particular program views. For example, if the program 38 is a home video where there is no further information available then the analysis module 42 may create a key-frame summary by identifying key-frames of a multi-level summary and passing the information to be used to generate the program views, and in particular a key frame view, to the description scheme. Referring also to FIG. 3, the analysis module 42 may also include other sub-modules, such as for example, a de-mux/decoder 60, a data and service content analyzer 62, a text processing and text summary generator 64, a close caption analyzer 66, a title frame generator 68, an analysis manager 70, an audiovisual analysis and feature extractor 72, an event detector 74, a key-frame summarizer 76, and a highlight summarizer 78.
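The 5 minute summary generation described above, removing commercials and keeping the best moments within a time budget, can be illustrated with a greedy selection over scored segments. The segment scores and the greedy rule are assumptions for illustration; the actual knowledge based system 90 is not specified at this level of detail.

```python
# Sketch of assembling a 5-minute highlight from scored segments, in the
# spirit of the knowledge-based summarization described above. The segment
# scores and the greedy selection rule are illustrative assumptions.
def build_summary(segments, budget_seconds=300):
    # segments: (start_s, end_s, score, is_commercial) tuples
    candidates = [s for s in segments if not s[3]]       # commercial filter 92
    candidates.sort(key=lambda s: s[2], reverse=True)    # best moments first
    summary, used = [], 0
    for start, end, score, _ in candidates:
        length = end - start
        if used + length <= budget_seconds:
            summary.append((start, end))
            used += length
    return sorted(summary)  # present highlights in program order

segments = [(0, 120, 0.90, False), (120, 180, 0.20, True),
            (180, 300, 0.80, False), (300, 420, 0.95, False)]
print(build_summary(segments))  # [(0, 120), (300, 420)]
```

The resulting segment list is exactly the kind of data that would be stored in the program description scheme as a highlight view, so the summary can later be played back without re-analysis.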
  • The generation module 44 receives the system information 46 for the system description scheme. The system information 46 preferably includes data for the system description scheme 22 generated by the generation module 44. The generation module 44 also receives user information 48 including data for the user description scheme. The user information 48 preferably includes data for the user description scheme generated within the generation module 44. The user input 48 may include, for example, meta information to be included in the program and system description scheme. The user description scheme (or corresponding information) is provided to the analysis module 42 for selective analysis of the program(s) 38. For example, the user description scheme may be suitable for triggering the highlight generation functionality for a particular program and thus generating the preferred views and storing associated data in the program description scheme. The generation module 44 and the analysis module 42 provide data to a data storage unit 50. The storage unit 50 may be any storage device, such as memory or magnetic media.
  • A search, filtering, and browsing (SFB) module 52 implements the description scheme technique by parsing and extracting information contained within the description scheme. The SFB module 52 may perform filtering, searching, and browsing of the programs 38 on the basis of the information contained in the description schemes. An intelligent software agent is preferably included within the SFB module 52 that gathers and provides user specific information to the generation module 44 to be used in authoring and updating the user description scheme (through the generation module 44). In this manner, desirable content may be provided to the user through a display 80. The selections of the desired program(s) to be retrieved, stored, and/or viewed may be programmed, at least in part, through a graphical user interface 82. The graphical user interface may also include, or be connected to, a presentation engine for presenting the information to the user through the graphical user interface.
  • The intelligent management and consumption of audiovisual information using the multi-part description stream device provides a next-generation device suitable for the modern era of information overload. The device responds to changing lifestyles of individuals and families, and allows everyone to obtain the information they desire anytime and anywhere they want.
  • An example of the use of the device may be as follows. A user comes home from work late Friday evening, happy that the work week is finally over. The user desires to catch up with the events of the world and then watch ABC's 20/20 show later that evening. It is now 9 PM and the 20/20 show will start in an hour, at 10 PM. The user is interested in the sporting events of the week and all the news about the Microsoft case with the Department of Justice. The user description scheme may include a profile indicating that the particular user wants to obtain all available information regarding the Microsoft trial and selected sporting events for particular teams. In addition, the system description scheme and program description scheme provide information regarding the content of the available information that may selectively be obtained and recorded. During the past week, the system, in an autonomous manner, has periodically obtained and recorded the audiovisual information that may be of interest to the user, based on the three description schemes. The device most likely has recorded more than one hour of audiovisual information, so the information needs to be condensed in some manner. The user starts interacting with the system with a pointer or voice commands to indicate a desire to view recorded sporting programs. On the display, the user is presented with a list of recorded sporting events, including basketball and soccer. Apparently the user's favorite football team did not play that week, because no game was recorded. The user is interested in basketball games and indicates a desire to view them. A set of title frames is presented on the display that captures an important moment of each game. The user selects the Chicago Bulls game and indicates a desire to view a 5 minute highlight of the game. The system automatically generates the highlights.
The highlights may be generated by audio or video analysis, or the program description scheme may include data indicating the frames that are presented for a 5 minute highlight. The system may have also recorded web-based textual information regarding the particular Chicago Bulls game, which may be selected by the user for viewing. If desired, the summarized information may be recorded onto a storage device, such as a DVD, with a label. The stored information may also include an index code so that it can be located at a later time. After viewing the sporting events, the user may decide to read the news about the Microsoft trial. It is now 9:50 PM and the user is done viewing the news. In fact, the user has selected to delete all the recorded news items after viewing them. The user then remembers to do one last thing before 10 PM. The next day, the user desires to watch the VHS tape that he received from his brother that day, containing footage of his brother's new baby girl and his vacation to Peru last summer. The user wants to watch the whole 2-hour tape, but he is anxious to see what the baby looks like and also the new stadium built in Lima, which was not there the last time he visited Peru. The user plans to take a quick look at a visual summary of the tape, browse, and perhaps watch a few segments for a couple of minutes, before he takes his daughter to her piano lesson at 10 AM the next morning. The user plugs the tape into his VCR, which is connected to the system, and invokes the summarization functionality of the system to scan the tape and prepare a summary. The user can then view the summary the next morning to quickly discover the baby's looks, and play back segments between the key-frames of the summary to catch a glimpse of the crying baby. The system may also record the tape content onto the system hard drive (or storage device) so the video summary can be viewed quickly.
It is now 10:10 PM, and it seems that the user is 10 minutes late for viewing 20/20. Fortunately, the system, based on the three description schemes, has already been recording 20/20 since 10 PM. Now the user can start watching the recorded portion of 20/20 as the recording of 20/20 proceeds. The user will be done viewing 20/20 at 11:10 PM.
  • The average consumer has an ever increasing number of multimedia devices, such as a home audio system, a car stereo, several home television sets, web browsers, etc. The user currently has to customize each of the devices for optimal viewing and/or listening preferences. By storing the user preferences on a removable storage device, such as a smart card, the user may insert the card including the user preferences into such media devices for automatic customization. This results in the desired programs being automatically recorded on the VCR, and in the radio stations being set for the car stereo and home audio system. In this manner, the user only has to specify his preferences at most once, on a single device, and subsequently the descriptors are automatically uploaded into devices by the removable storage device. The user description scheme may also be loaded into other devices using a wired or wireless network connection, e.g., that of a home network. Alternatively, the system can store the user history and create entries in the user description scheme based on the user's audio and video viewing habits. In this manner, the user would never need to program the viewing information to obtain desired information. In a sense, the user description scheme enables modeling of the user by providing a central storage for the user's listening, viewing, and browsing preferences, and the user's behavior. This enables devices to be quickly personalized, and enables other components, such as intelligent agents, to communicate on the basis of a standardized description format and to make smart inferences regarding the user's preferences.
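The automatic customization described above, one user description scheme driving several heterogeneous devices, can be sketched as each device reading only the fields it understands from the shared scheme. The device classes and field names below are hypothetical illustrations.

```python
# Illustrative sketch of a removable store (e.g., smart card) customizing
# several devices from one user description scheme. The device interfaces
# and scheme fields here are hypothetical.
class Device:
    def __init__(self, name):
        self.name = name
        self.settings = {}

    def personalize(self, user_ds):
        # Each device picks out only the fields it understands; unknown
        # fields in the scheme are simply ignored.
        if self.name == "car_stereo":
            self.settings["presets"] = user_ds.get("radio_presets", [])
        elif self.name == "vcr":
            self.settings["record_categories"] = user_ds.get("categories", [])

user_ds = {"radio_presets": [88.5, 101.1], "categories": ["news", "basketball"]}
car, vcr = Device("car_stereo"), Device("vcr")
for d in (car, vcr):
    d.personalize(user_ds)
print(car.settings["presets"], vcr.settings["record_categories"])
```

The same `personalize` step could be triggered by inserting a smart card or by receiving the scheme over a home network; either way the user specifies preferences once.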
  • Many different realizations and applications can be readily derived from FIGS. 2 and 3 by appropriately organizing and utilizing their different parts, or by adding peripherals and extensions as needed. In its most general form, FIG. 2 depicts an audiovisual searching, filtering, browsing, and/or recording appliance that is personalizable. The list of more specific applications/implementations given below is not exhaustive but covers a range.
  • The user description scheme is a major enabler for personalizable audiovisual appliances. If the structure (syntax and semantics) of the description schemes is known amongst multiple appliances, the user can carry (or otherwise transfer) the information contained within his user description scheme from one appliance to another, perhaps via a smart card where these appliances support a smart card interface, in order to personalize them. Personalization can range from device settings, such as display contrast and volume control, to settings of television channels, radio stations, web stations, web sites, geographic information, and demographic information such as age, zip code, etc. Appliances that can be personalized may access content from different sources. They may be connected to the web, terrestrial or cable broadcast, etc., and they may also access multiple or different types of single media, such as video, music, etc.
  • For example, one can personalize the car stereo using a smart card removed from the home system and plugged into the car stereo system, to be able to tune to favorite stations at certain times. As another example, one can also personalize television viewing by plugging the smart card into a remote control that in turn will autonomously command the television receiving system to present the user with information about current and future programs that fits the user's preferences. Different members of the household can instantly personalize the viewing experience by inserting their own smart card into the family remote. In the absence of such a remote, this same type of personalization can be achieved by plugging the smart card directly into the television system. The remote may likewise control audio systems. In another implementation, the television receiving system holds user description schemes for multiple users in local storage and identifies different users (or groups of users) using an appropriate input interface, for example an interface using voice identification technology. It is noted that in a networked system the user description scheme may be transported over the network.
  • The user description scheme is generated by direct user input, and by using software that observes the user to determine his or her usage pattern and usage history. The user description scheme can be updated in a dynamic fashion by the user or automatically. A well defined and structured description scheme design allows different devices to interoperate with each other. A modular design also provides portability.
  • The description scheme adds new functionality to that of the current VCR. An advanced VCR system can learn from the user via direct input of preferences, or by watching the usage pattern and history of the user. The user description scheme holds the user's preferences and usage history. An intelligent agent can then consult the user description scheme and obtain the information that it needs for acting on behalf of the user. Through the intelligent agent, the system acts on behalf of the user to discover programs that fit the taste of the user, alert the user about such programs, and/or record them autonomously. An agent can also manage the storage in the system according to the user description scheme, i.e., prioritizing the deletion of programs (or alerting the user for transfer to removable media), or determining their compression factor (which directly impacts their visual quality) according to the user's preferences and history.
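The storage management behavior just described can be sketched as ranking recordings for deletion so that programs matching the user's preferred categories are kept longest. The ranking rule (counting category overlap) is an assumed illustration of how an agent might consult the user description scheme, not the specified algorithm.

```python
# A minimal sketch of agent-driven storage management per the user
# description scheme: programs matching fewer of the user's preferred
# categories are deleted (or offered for transfer) first. The rule is assumed.
def deletion_order(recordings, preferred_categories):
    prefs = set(preferred_categories)

    def keep_score(rec):
        # More overlap with preferences -> kept longer (deleted later).
        return len(prefs & set(rec["categories"]))

    return sorted(recordings, key=keep_score)

recordings = [
    {"title": "Cooking Show", "categories": ["cooking"]},
    {"title": "Bulls Game", "categories": ["sports", "basketball"]},
    {"title": "Evening News", "categories": ["news"]},
]
order = deletion_order(recordings, ["basketball", "sports", "news"])
print([r["title"] for r in order])  # deleted first -> deleted last
```

A fuller agent might also weigh usage history (e.g., already-watched programs first) or choose a higher compression factor instead of deletion, per the same scheme.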
  • The program description scheme and the system description scheme work in collaboration with the user description scheme in achieving some tasks. In addition, the program description scheme and system description scheme in an advanced VCR or other system will enable the user to browse, search, and filter audiovisual programs. Browsing in the system offers capabilities that are well beyond fast forwarding and rewinding. For instance, the user can view a thumbnail view of different categories of programs stored in the system. The user then may choose frame view, shot view, key frame view, or highlight view, depending on their availability and user's preference. These views can be readily invoked using the relevant information in the program description scheme, especially in program views. The user at any time can start viewing the program either in parts, or in its entirety.
  • In this application, the program description scheme may be readily available from many sources, such as: (i) from broadcast (carried by the EPG defined as a part of ATSC-PSIP (ATSC Program and System Information Protocol) in the USA or DVB-SI (Digital Video Broadcast-Service Information) in Europe); (ii) from specialized data services (in addition to PSIP/DVB-SI); (iii) from specialized web sites; (iv) from the media storage unit containing the audiovisual content (e.g., DVD); (v) from advanced cameras (discussed later); and/or it may be generated (i.e., for programs that are being stored) by the analysis module 42 or from user input 48.
  • Contents of digital still and video cameras can be stored and managed by a system that implements the description schemes, e.g., a system as shown in FIG. 2. Advanced cameras can store a program description scheme, for instance, in addition to the audiovisual content itself. The program description scheme can be generated either in part or in its entirety on the camera itself via an appropriate user input interface (e.g., speech, visual menu drive, etc.). Users can input to the camera the program description scheme information, especially those high-level (or semantic) information that may otherwise be difficult to automatically extract by the system. Some camera settings and parameters (e.g., date and time), as well as quantities computed in the camera (e.g., color histogram to be included in the color profile), can also be used in generating the program description scheme. Once the camera is connected, the system can browse the camera content, or transfer the camera content and its description scheme to the local storage for future use. It is also possible to update or add information to the description scheme generated in the camera.
  • The IEEE 1394 and HAVi standard specifications enable this type of “audiovisual content” centric communication among devices. The description scheme APIs can be used in the context of HAVi to browse and/or search the contents of a camera or a DVD that also contains a description scheme associated with its content, i.e., doing more than merely invoking the PLAY API to play back and linearly view the media.
  • The description schemes may be used in archiving audiovisual programs in a database. The search engine uses the information contained in the program description scheme to retrieve programs on the basis of their content. The program description scheme can also be used in navigating through the contents of the database or the query results. The user description scheme can be used in prioritizing the results of the user query during presentation. It is possible of course to make the program description scheme more comprehensive depending on the nature of the particular application.
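The prioritization of query results by the user description scheme, mentioned above, can be sketched as re-ranking retrieved programs by their overlap with the user's preferences. This is an illustrative sketch; the ranking criterion is an assumption.

```python
# Hypothetical sketch of using the user description scheme to prioritize the
# results of a content query against an archive, as described above. Results
# matching more of the user's preferred categories are presented first.
def prioritize(results, user_ds):
    prefs = set(user_ds.get("categories", []))
    return sorted(results,
                  key=lambda r: len(prefs & set(r["categories"])),
                  reverse=True)

results = [{"title": "Opera Night", "categories": ["music"]},
           {"title": "Playoff Recap", "categories": ["sports", "basketball"]}]
ranked = prioritize(results, {"categories": ["basketball"]})
print(ranked[0]["title"])  # Playoff Recap
```

Here the search engine's retrieval (driven by the program description scheme) and the presentation ordering (driven by the user description scheme) remain separate steps, mirroring the independence of the two schemes.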
  • The description scheme fulfills users' desire for applications that are attentive and responsive to their viewing and usage habits, preferences, and personal demographics. The proposed user description scheme directly addresses this desire in its selection of fields and its interrelationship to other description schemes. Because the description schemes are modular in nature, a user can port his or her user description scheme from one device to another in order to “personalize” the device.
  • The proposed description schemes can be incorporated into current products similar to those from TiVo and Replay TV in order to extend their entertainment and informational value. In particular, the description scheme will enable audiovisual browsing and searching of programs and enable filtering within a particular program by supporting multiple program views, such as the highlight view. In addition, the description scheme will handle programs coming from sources other than television broadcasts, which TiVo and Replay TV are not designed to handle. Moreover, by standardizing TiVo and Replay TV types of devices, other products may be interconnected with such devices to extend their capabilities, such as devices supporting an MPEG-7 description. MPEG-7 is the Moving Picture Experts Group-7, acting to standardize descriptions and description schemes for audiovisual information. The device may also be extended to be personalized by multiple users, as desired.
  • Because the description scheme is defined, the intelligent software agents can communicate among themselves to make intelligent inferences regarding the user's preferences. In addition, the development and upgrade of intelligent software agents for browsing and filtering applications can be simplified based on the standardized user description scheme.
  • The description scheme is multi-modal in the sense that it holds both high-level (semantic) and low-level features and/or descriptors. For example, an actor name and motion model parameters are high- and low-level descriptors, respectively. High-level descriptors are easily readable by humans, while low-level descriptors are more easily read by machines and less understandable by humans. The program description scheme can be readily harmonized with existing EPG, PSIP, and DVB-SI information, facilitating search and filtering of broadcast programs. Existing services can be extended in the future by incorporating additional information using the compliant description scheme.
  • For example, one case may include audiovisual programs that are prerecorded on a medium such as a digital video disc, where the digital video disc also contains a description scheme that has the same syntax and semantics as the description scheme that the FSB module uses. If the FSB module uses a different description scheme, a transcoder (converter) of the description scheme may be employed. The user may want to browse and view the content of the digital video disc. In this case, the user may not need to invoke the analysis module to author a program description. However, the user may want to invoke his or her user description scheme in filtering, searching, and browsing the digital video disc content. Other sources of program information may likewise be used in the same manner.
  • It is to be understood that any of the techniques described herein with relation to video are equally applicable to images (such as a still image or a frame of a video) and audio (such as radio).
  • An example of an audiovisual interface suitable for the preferred audiovisual description scheme is shown in FIGS. 4-12. Referring to FIG. 4, selecting the thumbnail function as a function of category provides a display with a set of categories on the left hand side. Selecting a particular category, such as news, provides a set of thumbnail views of different programs that are currently available for viewing. In addition, the different programs may also include programs that will be available at a different time for viewing. The thumbnail views are short video segments that provide an indication of the content of the respective actual programs to which they correspond. Referring to FIG. 5, a thumbnail view of available programs in terms of channels may be displayed, if desired. Referring to FIG. 6, a text view of available programs in terms of channels may be displayed, if desired. Referring to FIG. 7, a frame view of particular programs may be displayed, if desired. A representative frame is displayed in the center of the display with a set of representative frames of different programs in the left hand column. The frequency of the frames may be selected, as desired. Also, a set of frames is displayed on the lower portion of the display, representative of different frames during the particular selected program. Referring to FIG. 8, a shot view of particular programs may be displayed, as desired. A representative frame of a shot is displayed in the center of the display with a set of representative frames of different programs in the left hand column. Also, a set of shots is displayed on the lower portion of the display, representative of different shots (segments of a program, typically sequential in nature) during the particular selected program. Referring to FIG. 9, a key frame view of particular programs may be displayed, as desired.
A representative frame is displayed in the center of the display with a set of representative frames of different programs in the left hand column. Also, a set of key frame views is displayed on the lower portion of the display, representative of different key frame portions during the particular selected program. The number of key frames in each key frame view can be adjusted by selecting the level. Referring to FIG. 10, a highlight view may likewise be displayed, as desired. Referring to FIG. 11, an event view may likewise be displayed, as desired. Referring to FIG. 12, a character/object view may likewise be displayed, as desired.
  • An example of the description schemes is shown below in XML. The description scheme may be implemented in any language and include any of the included descriptions (or more), as desired.
  • The proposed program description scheme includes three major sections for describing a video program. The first section identifies the described program. The second section defines a number of views which may be useful in browsing applications. The third section defines a number of profiles which may be useful in filtering and search applications. Therefore, the overall structure of the proposed description scheme is as follows:
    <?XML version=”1.0”>
    <!DOCTYPE MPEG-7 SYSTEM “mpeg-7.dtd”>
    <ProgramIdentity>
    <ProgramID> ... </ProgramID>
    <ProgramName> ... </ProgramName>
    <SourceLocation> ... </SourceLocation>
    </ProgramIdentity>
    <ProgramViews>
    <ThumbnailView> ... </ThumbnailView>
    <SlideView> ... </SlideView>
    <FrameView> ... </FrameView>
    <ShotView> ... </ShotView>
    <KeyFrameView> ... </KeyFrameView>
    <HighlightView> ... </HighlightView>
    <EventView> ... </EventView>
    <CloseUpView> ... </CloseUpView>
    <AlternateView> ... </AlternateView>
    </ProgramViews>
    <ProgramProfiles>
    <GeneralProfile> ... </GeneralProfile>
    <CategoryProfile> ... </CategoryProfile>
    <DateTimeProfile> ... </DateTimeProfile>
    <KeywordProfile> ... </KeywordProfile>
    <TriggerProfile> ... </TriggerProfile>
    <StillProfile> ... </StillProfile>
    <EventProfile> ... </EventProfile>
    <CharacterProfile> ... </CharacterProfile>
    <ObjectProfile> ... </ObjectProfile>
    <ColorProfile> ... </ColorProfile>
    <TextureProfile> ... </TextureProfile>
    <ShapeProfile> ... </ShapeProfile>
    <MotionProfile> ... </MotionProfile>
    </ProgramProfiles>
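The three top-level sections above can be treated as ordinary XML by client software. As a minimal sketch (the <Program> wrapper element and all sample ids and values are hypothetical, not part of the scheme, which lists the sections as siblings), a browsing application might extract the program identity and the available views as follows:

```python
import xml.etree.ElementTree as ET

# Minimal well-formed sample of the three top-level sections.
# The <Program> wrapper element and all sample values are hypothetical.
doc = """
<Program>
  <ProgramIdentity>
    <ProgramID>prog-42</ProgramID>
    <ProgramName>Evening News</ProgramName>
    <SourceLocation>http://example.com/news.mpg</SourceLocation>
  </ProgramIdentity>
  <ProgramViews>
    <ThumbnailView><Image>thumb.jpg</Image></ThumbnailView>
  </ProgramViews>
  <ProgramProfiles>
    <CategoryProfile>news</CategoryProfile>
  </ProgramProfiles>
</Program>
"""

root = ET.fromstring(doc)
program_id = root.findtext("ProgramIdentity/ProgramID")
program_name = root.findtext("ProgramIdentity/ProgramName")
# The tags under <ProgramViews> tell the interface which views it can offer.
available_views = [child.tag for child in root.find("ProgramViews")]
print(program_id, program_name, available_views)
```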
  • Program Identity
  • Program ID
    <ProgramID> program-id </ProgramID>
      • The descriptor <ProgramID> contains a number or a string to identify a program.
  • Program Name
    <ProgramName> program-name </ProgramName>
      • The descriptor <ProgramName> specifies the name of a program.
  • Source Location
    <SourceLocation> source-url </SourceLocation>
      • The descriptor <SourceLocation> specifies the location of a program in URL format.
  • Program Views
  • Thumbnail View
    <ThumbnailView>
    <Image> thumbnail-image </Image>
    </ThumbnailView>
      • The descriptor <ThumbnailView> specifies an image as the thumbnail representation of a program.
  • Slide View
    <SlideView> frame-id ... </SlideView>
      • The descriptor <SlideView> specifies a number of frames in a program which may be viewed as snapshots or in a slide show manner.
  • Frame View
    <FrameView> start-frame-id end-frame-id </FrameView>
      • The descriptor <FrameView> specifies the start and end frames of a program. This is the most basic view of a program and any program has a frame view.
  • Shot View
    <ShotView>
    <Shot id=””> start-frame-id end-frame-id display-frame-id </Shot>
    <Shot id=””> start-frame-id end-frame-id display-frame-id </Shot>
    ...
    </ShotView>
      • The descriptor <ShotView> specifies a number of shots in a program. The <Shot> descriptor defines the start and end frames of a shot. It may also specify a frame to represent the shot.
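Because each <Shot> records its start and end frames, a player can map any frame position back to its containing shot. A minimal sketch (shot ids and frame numbers are hypothetical):

```python
import xml.etree.ElementTree as ET

# Hypothetical shot view: two shots covering frames 0-299 and 300-899.
doc = """
<ShotView>
  <Shot id="s1">0 299 150</Shot>
  <Shot id="s2">300 899 600</Shot>
</ShotView>
"""

def shot_containing(view_xml, frame_id):
    """Return the id of the shot whose [start, end] range contains frame_id."""
    for shot in ET.fromstring(view_xml).findall("Shot"):
        start, end, _display = map(int, shot.text.split())
        if start <= frame_id <= end:
            return shot.get("id")
    return None

print(shot_containing(doc, 450))
```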
  • Key-Frame View
    <KeyFrameView>
    <KeyFrames level=””>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    ...
    </KeyFrames>
    <KeyFrames level=””>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    ...
    </KeyFrames>
    ...
    </KeyFrameView>
      • The descriptor <KeyFrameView> specifies key frames in a program. The key frames may be organized in a hierarchical manner and the hierarchy is captured by the descriptor <KeyFrames> with a level attribute. The clips which are associated with each key frame are defined by the descriptor <Clip>. Here the display frame in each clip is the corresponding key frame.
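Because the level attribute captures the hierarchy, a browser can serve a coarser or finer set of key frames on request. A minimal sketch (levels, clip ids, and frame numbers are hypothetical):

```python
import xml.etree.ElementTree as ET

# Hypothetical key-frame hierarchy: level 1 is coarse, level 2 is finer.
doc = """
<KeyFrameView>
  <KeyFrames level="1">
    <Clip id="k1">0 899 450</Clip>
  </KeyFrames>
  <KeyFrames level="2">
    <Clip id="k2">0 449 200</Clip>
    <Clip id="k3">450 899 700</Clip>
  </KeyFrames>
</KeyFrameView>
"""

def clips_at_level(view_xml, level):
    """Return the clip ids defined at the requested key-frame level."""
    for key_frames in ET.fromstring(view_xml).findall("KeyFrames"):
        if int(key_frames.get("level")) == level:
            return [clip.get("id") for clip in key_frames.findall("Clip")]
    return []

print(clips_at_level(doc, 2))
```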
  • Highlight View
    <HighlightView>
    <Highlight length=””>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    ...
    </Highlight>
    <Highlight length=””>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    ...
    </Highlight>
    ...
    </HighlightView>
      • The descriptor <HighlightView> specifies clips that form highlights of a program. A program may have different versions of highlights, tailored to various time lengths. The clips are grouped into each version of the highlight, which is specified by the descriptor <Highlight> with a length attribute.
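A filtering or browsing application can then pick the highlight version whose length attribute best matches the user's preferred duration. A minimal sketch, with hypothetical lengths in seconds and hypothetical clip ids:

```python
import xml.etree.ElementTree as ET

# Hypothetical highlight view: a 60-second and a 300-second version.
doc = """
<HighlightView>
  <Highlight length="60">
    <Clip id="c1">0 1800 900</Clip>
  </Highlight>
  <Highlight length="300">
    <Clip id="c2">0 1800 900</Clip>
    <Clip id="c3">5400 9000 7200</Clip>
  </Highlight>
</HighlightView>
"""

def pick_highlight(view_xml, preferred_length):
    """Return the <Highlight> whose length attribute is closest to the preference."""
    view = ET.fromstring(view_xml)
    return min(view.findall("Highlight"),
               key=lambda h: abs(int(h.get("length")) - preferred_length))

best = pick_highlight(doc, 240)
print(best.get("length"), len(best.findall("Clip")))
```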
  • Event View
    <EventView>
    <Events name=””>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    ...
    </Events>
    <Events name=””>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    ...
    </Events>
    ...
    </EventView>
      • The descriptor <EventView> specifies clips which are related to certain events in a program. The clips are grouped into the corresponding events, which are specified by the descriptor <Events> with a name attribute.
  • Close-Up View
    <CloseUpView>
    <Target name=””>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    ...
    </Target>
    <Target name=””>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    <Clip id=””> start-frame-id end-frame-id display-frame-id
    </Clip>
    ...
    </Target>
    ...
    </CloseUpView>
      • The descriptor <CloseUpView> specifies clips which zoom in on certain targets in a program. The clips are grouped into the corresponding targets, which are specified by the descriptor <Target> with a name attribute.
  • Alternate View
    <AlternateView>
    <AlternateSource id=””> source-url </AlternateSource>
    <AlternateSource id=””> source-url </AlternateSource>
    ...
    </AlternateView>
      • The descriptor <AlternateView> specifies sources which may be shown as alternate views of a program. Each alternate view is specified by the descriptor <AlternateSource> with an id attribute. The location of the source may be specified in URL format.
  • Program Profiles
  • General Profile
    <GeneralProfile>
    <Title> title-text </Title>
    <Abstract> abstract-text </Abstract>
    <Audio> voice-annotation </Audio>
    <Www> web-page-url </Www>
    <ClosedCaption> yes/no </ClosedCaption>
    <Language> language-name </Language>
    <Rating> rating </Rating>
    <Length> time </Length>
    <Authors> author-name ... </Authors>
    <Producers> producer-name ... </Producers>
    <Directors> director-name ... </Directors>
    <Actors> actor-name ... </Actors>
    ...
    </GeneralProfile>
      • The descriptor <GeneralProfile> describes the general aspects of a program.
  • Category Profile
    <CategoryProfile> category-name ... </CategoryProfile>
      • The descriptor <CategoryProfile> specifies the categories under which a program may be classified.
  • Date-Time Profile
    <DateTimeProfile>
    <ProductionDate> date </ProductionDate>
    <ReleaseDate> date </ReleaseDate>
    <RecordingDate> date </RecordingDate>
    <RecordingTime> time </RecordingTime>
    ...
    </DateTimeProfile>
      • The descriptor <DateTimeProfile> specifies various date and time information of a program.
  • Keyword Profile
    <KeywordProfile> keyword ... </KeywordProfile>
      • The descriptor <KeywordProfile> specifies a number of keywords which may be used to filter or search a program.
  • Trigger Profile
    <TriggerProfile> trigger-frame-id ... </TriggerProfile>
      • The descriptor <TriggerProfile> specifies a number of frames in a program which may be used to trigger certain actions during playback of the program.
  • Still Profile
    <StillProfile>
    <Still id=””>
    <HotRegion id =””>
    <Location> x1 y1 x2 y2 </Location>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    <Www> web-page-url </Www>
    </HotRegion>
    <HotRegion id =””>
    <Location> x1 y1 x2 y2 </Location>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    <Www> web-page-url </Www>
    </HotRegion>
    ...
    </Still>
    <Still id=””>
    <HotRegion id =””>
    <Location> x1 y1 x2 y2 </Location>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    <Www> web-page-url </Www>
    </HotRegion>
    <HotRegion id =””>
    <Location> x1 y1 x2 y2 </Location>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    <Www> web-page-url </Www>
    </HotRegion>
    ...
    </Still>
    ...
    </StillProfile>
      • The descriptor <StillProfile> specifies hot regions or regions of interest within a frame. The frame is specified by the descriptor <Still> with an id attribute which corresponds to the frame-id. Within a frame, each hot region is specified by the descriptor <HotRegion> with an id attribute.
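One use of the hot regions is hit testing: mapping a user's selection point on a still frame to the region(s) whose bounding boxes contain it, so that the associated text, audio, or web-page annotations can be presented. A minimal sketch (region ids and coordinates are hypothetical; coordinates follow the <Location> "x1 y1 x2 y2" convention above):

```python
import xml.etree.ElementTree as ET

# Hypothetical still frame with two hot regions.
doc = """
<Still id="f120">
  <HotRegion id="r1"><Location>10 10 100 80</Location></HotRegion>
  <HotRegion id="r2"><Location>150 40 300 200</Location></HotRegion>
</Still>
"""

def regions_at(still_xml, x, y):
    """Return the ids of all hot regions whose bounding box contains (x, y)."""
    hits = []
    for region in ET.fromstring(still_xml).findall("HotRegion"):
        x1, y1, x2, y2 = map(int, region.findtext("Location").split())
        if x1 <= x <= x2 and y1 <= y <= y2:
            hits.append(region.get("id"))
    return hits

print(regions_at(doc, 50, 50))
```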
  • Event Profile
    <EventProfile>
    <EventList> event-name ... </EventList>
    <Event name=””>
    <Www> web-page-url </Www>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    ...
    </Event>
    <Event name=””>
    <Www> web-page-url </Www>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    ...
    </Event>
    ...
    </EventProfile>
      • The descriptor <EventProfile> specifies the detailed information for certain events in a program. Each event is specified by the descriptor <Event> with a name attribute. Each occurrence of an event is specified by the descriptor <Occurrence> with an id attribute which may be matched with a clip id under <EventView>.
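The id match between <Occurrence> and the clips under <EventView> lets an application jump from an event description to the playable clip. A minimal cross-reference sketch (event names, ids, and frame numbers are hypothetical):

```python
import xml.etree.ElementTree as ET

# Hypothetical event profile and event view sharing the occurrence/clip id "c1".
profile = """
<EventProfile>
  <Event name="goal">
    <Occurrence id="c1"><Duration>100 250</Duration></Occurrence>
  </Event>
</EventProfile>
"""
view = """
<EventView>
  <Events name="goal">
    <Clip id="c1">100 250 175</Clip>
  </Events>
</EventView>
"""

def clip_for_occurrence(profile_xml, view_xml, event_name, occ_id):
    """Resolve an event occurrence id to its playable clip in the event view."""
    occ_ids = {o.get("id")
               for ev in ET.fromstring(profile_xml).findall("Event")
               if ev.get("name") == event_name
               for o in ev.findall("Occurrence")}
    if occ_id not in occ_ids:
        return None
    for ev in ET.fromstring(view_xml).findall("Events"):
        if ev.get("name") == event_name:
            for clip in ev.findall("Clip"):
                if clip.get("id") == occ_id:
                    return clip
    return None

print(clip_for_occurrence(profile, view, "goal", "c1").text)
```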
  • Character Profile
    <CharacterProfile>
    <CharacterList> character-name ... </CharacterList>
    <Character name=””>
    <ActorName> actor-name </ActorName>
    <Gender> male </Gender>
    <Age> age </Age>
    <Www> web-page-url </Www>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Location> frame:[x1 y1 x2 y2] ... </Location>
    <Motion> vx vy vz vα vβ vγ </Motion>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Location> frame:[x1 y1 x2 y2] ... </Location>
    <Motion> vx vy vz vα vβ vγ </Motion>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    ...
    </Character>
    <Character name=””>
    <ActorName> actor-name </ActorName>
    <Gender> male </Gender>
    <Age> age </Age>
    <Www> web-page-url </Www>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Location> frame:[x1 y1 x2 y2] ... </Location>
    <Motion> vx vy vz vα vβ vγ </Motion>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Location> frame:[x1 y1 x2 y2] ... </Location>
    <Motion> vx vy vz vα vβ vγ </Motion>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    ...
    </Character>
    ...
    </CharacterProfile>
      • The descriptor <CharacterProfile> specifies the detailed information for certain characters in a program. Each character is specified by the descriptor <Character> with a name attribute. Each occurrence of a character is specified by the descriptor <Occurrence> with an id attribute which may be matched with a clip id under <CloseUpView>.
  • Object Profile
    <ObjectProfile>
    <ObjectList> object-name ... </ObjectList>
    <Object name=””>
    <Www> web-page-url </Www>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Location> frame:[x1 y1 x2 y2] ... </Location>
    <Motion> vx vy vz vα vβ vγ </Motion>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Location> frame:[x1 y1 x2 y2] ... </Location>
    <Motion> vx vy vz vα vβ vγ </Motion>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    ...
    </Object>
    <Object name=””>
    <Www> web-page-url </Www>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Location> frame:[x1 y1 x2 y2] ... </Location>
    <Motion> vx vy vz vα vβ vγ </Motion>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    <Occurrence id=””>
    <Duration> start-frame-id end-frame-id </Duration>
    <Location> frame:[x1 y1 x2 y2] ... </Location>
    <Motion> vx vy vz vα vβ vγ </Motion>
    <Text> text-annotation </Text>
    <Audio> voice-annotation </Audio>
    </Occurrence>
    ...
    </Object>
    ...
    </ObjectProfile>
      • The descriptor <ObjectProfile> specifies the detailed information for certain objects in a program. Each object is specified by the descriptor <Object> with a name attribute. Each occurrence of an object is specified by the descriptor <Occurrence> with an id attribute which may be matched with a clip id under <CloseUpView>.
  • Color Profile
    <ColorProfile>
    ...
    </ColorProfile>
      • The descriptor <ColorProfile> specifies the detailed color information of a program. All MPEG-7 color descriptors may be placed under this descriptor.
  • Texture Profile
    <TextureProfile>
    ...
    </TextureProfile>
      • The descriptor <TextureProfile> specifies the detailed texture information of a program. All MPEG-7 texture descriptors may be placed under this descriptor.
  • Shape Profile
    <ShapeProfile>
    ...
    </ShapeProfile>
      • The descriptor <ShapeProfile> specifies the detailed shape information of a program. All MPEG-7 shape descriptors may be placed under this descriptor.
  • Motion Profile
    <MotionProfile>
    ...
    </MotionProfile>
      • The descriptor <MotionProfile> specifies the detailed motion information of a program. All MPEG-7 motion descriptors may be placed under this descriptor.
  • User Description Scheme
  • The proposed user description scheme includes four major sections for describing a user. The first section identifies the described user. The second section records a number of settings which may be preferred by the user. The third section records some statistics which may reflect certain usage patterns of the user. The fourth section records demographic information about the user. Therefore, the overall structure of the proposed description scheme is as follows:
    <?XML version=”1.0”>
    <!DOCTYPE MPEG-7 SYSTEM “mpeg-7.dtd”>
    <UserIdentity>
    <UserID> ... </UserID>
    <UserName> ... </UserName>
    </UserIdentity>
    <UserPreferences>
    <BrowsingPreferences> ... </BrowsingPreferences>
    <FilteringPreferences> ... </FilteringPreferences>
    <SearchPreferences> ... </SearchPreferences>
    <DevicePreferences> ... </DevicePreferences>
    </UserPreferences>
    <UserHistory>
    <BrowsingHistory> ... </BrowsingHistory>
    <FilteringHistory> ... </FilteringHistory>
    <SearchHistory> ... </SearchHistory>
    <DeviceHistory> ... </DeviceHistory>
    </UserHistory>
    <UserDemographics>
    <Age> ... </Age>
    <Gender> ... </Gender>
    <ZIP> ... </ZIP>
    </UserDemographics>
  • User Identity
  • User ID
    <UserID> user-id </UserID>
      • The descriptor <UserID> contains a number or a string to identify a user.
  • User Name
    <UserName> user-name </UserName>
      • The descriptor <UserName> specifies the name of a user.
  • User Preferences
  • Browsing Preferences
    <BrowsingPreferences>
    <Views>
    <ViewCategory id=””> view-id ... </ViewCategory>
    <ViewCategory id=””> view-id ... </ViewCategory>
    ...
    </Views>
    <FrameFrequency> frequency ... </FrameFrequency>
    <ShotFrequency> frequency ... </ShotFrequency>
    <KeyFrameLevel> level-id ... </KeyFrameLevel>
    <HighlightLength> length ... </HighlightLength>
    ...
    </BrowsingPreferences>
      • The descriptor <BrowsingPreferences> specifies the browsing preferences of a user. The user's preferred views are specified by the descriptor <Views>. For each category, the preferred views are specified by the descriptor <ViewCategory> with an id attribute which corresponds to the category id. The descriptor <FrameFrequency> specifies at what interval the frames should be displayed on a browsing slider under the frame view. The descriptor <ShotFrequency> specifies at what interval the shots should be displayed on a browsing slider under the shot view. The descriptor <KeyFrameLevel> specifies at what level the key frames should be displayed on a browsing slider under the key frame view. The descriptor <HighlightLength> specifies which version of the highlight should be shown under the highlight view.
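A browsing application can consult these preferences when a category is selected, for example to decide which views to offer first. A minimal sketch (the category ids and view names are hypothetical sample values):

```python
import xml.etree.ElementTree as ET

# Hypothetical browsing preferences for two categories.
doc = """
<BrowsingPreferences>
  <Views>
    <ViewCategory id="news">HighlightView EventView</ViewCategory>
    <ViewCategory id="sports">HighlightView</ViewCategory>
  </Views>
  <HighlightLength>300</HighlightLength>
</BrowsingPreferences>
"""

def preferred_views(prefs_xml, category_id):
    """Return the user's preferred view ids for a category, if any."""
    prefs = ET.fromstring(prefs_xml)
    for cat in prefs.findall("Views/ViewCategory"):
        if cat.get("id") == category_id:
            return cat.text.split()
    return []

print(preferred_views(doc, "news"))
```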
  • Filtering Preferences
    <FilteringPreferences>
    <Categories> category-name ... </Categories>
    <Channels> channel-number ... </Channels>
    <Ratings> rating-id ... </Ratings>
    <Shows> show-name ... </Shows>
    <Authors> author-name ... </Authors>
    <Producers> producer-name ... </Producers>
    <Directors> director-name ... </Directors>
    <Actors> actor-name ... </Actors>
    <Keywords> keyword ... </Keywords>
    <Titles> title-text ... </Titles>
    ...
    </FilteringPreferences>
      • The descriptor <FilteringPreferences> specifies the filtering related preferences of a user.
  • Search Preferences
    <SearchPreferences>
    <Categories> category-name ... </Categories>
    <Channels> channel-number ... </Channels>
    <Ratings> rating-id ... </Ratings>
    <Shows> show-name ... </Shows>
    <Authors> author-name ... </Authors>
    <Producers> producer-name ... </Producers>
    <Directors> director-name ... </Directors>
    <Actors> actor-name ... </Actors>
    <Keywords> keyword ... </Keywords>
    <Titles> title-text ... </Titles>
    ...
    </SearchPreferences>
      • The descriptor <SearchPreferences> specifies the search related preferences of a user.
  • Device Preferences
    <DevicePreferences>
    <Brightness> brightness-value </Brightness>
    <Contrast> contrast-value </Contrast>
    <Volume> volume-value </Volume>
    </DevicePreferences>
      • The descriptor <DevicePreferences> specifies the device preferences of a user.
  • Usage History
  • Browsing History
    <BrowsingHistory>
    <Views>
    <ViewCategory id=””> view-id ... </ViewCategory>
    <ViewCategory id=””> view-id ... </ViewCategory>
    ...
    </Views>
    <FrameFrequency> frequency ... </FrameFrequency>
    <ShotFrequency> frequency ... </ShotFrequency>
    <KeyFrameLevel> level-id ... </KeyFrameLevel>
    <HighlightLength> length ... </HighlightLength>
    ...
    </BrowsingHistory>
      • The descriptor <BrowsingHistory> captures the history of a user's browsing related activities.
  • Filtering History
    <FilteringHistory>
    <Categories> category-name ... </Categories>
    <Channels> channel-number ... </Channels>
    <Ratings> rating-id ... </Ratings>
    <Shows> show-name ... </Shows>
    <Authors> author-name ... </Authors>
    <Producers> producer-name ... </Producers>
    <Directors> director-name ... </Directors>
    <Actors> actor-name ... </Actors>
    <Keywords> keyword ... </Keywords>
    <Titles> title-text ... </Titles>
    ...
    </FilteringHistory>
  • The descriptor <FilteringHistory> captures the history of a user's filtering related activities.
  • Search History
    <SearchHistory>
    <Categories> category-name ... </Categories>
    <Channels> channel-number ... </Channels>
    <Ratings> rating-id ... </Ratings>
    <Shows> show-name ... </Shows>
    <Authors> author-name ... </Authors>
    <Producers> producer-name ... </Producers>
    <Directors> director-name ... </Directors>
    <Actors> actor-name ... </Actors>
    <Keywords> keyword ... </Keywords>
    <Titles> title-text ... </Titles>
    ...
    </SearchHistory>
      • The descriptor <SearchHistory> captures the history of a user's search related activities.
  • Device History
    <DeviceHistory>
    <Brightness> brightness-value ... </Brightness>
    <Contrast> contrast-value ... </Contrast>
    <Volume> volume-value ... </Volume>
    </DeviceHistory>
      • The descriptor <DeviceHistory> captures the history of a user's device related activities.
  • User Demographics
  • Age
    <Age> age </Age>
      • The descriptor <Age> specifies the age of a user.
  • Gender
    <Gender> ... </Gender>
      • The descriptor <Gender> specifies the gender of a user.
  • ZIP Code
    <ZIP> ... </ZIP>
      • The descriptor <ZIP> specifies the ZIP code of where a user lives.
  • System Description Scheme
  • The proposed system description scheme includes four major sections for describing a system. The first section identifies the described system. The second section keeps a list of all known users. The third section keeps lists of available programs. The fourth section describes the capabilities of the system. Therefore, the overall structure of the proposed description scheme is as follows:
    <?XML version=”1.0”>
    <!DOCTYPE MPEG-7 SYSTEM “mpeg-7.dtd”>
    <SystemIdentity>
    <SystemID> ... </SystemID>
    <SystemName> ... </SystemName>
    <SystemSerialNumber> ... </SystemSerialNumber>
    </SystemIdentity>
    <SystemUsers>
    <Users> ... </Users>
    </SystemUsers>
    <SystemPrograms>
    <Categories> ... </Categories>
    <Channels> ... </Channels>
    <Programs> ... </Programs>
    </SystemPrograms>
    <SystemCapabilities>
    <Views> ... </Views>
    </SystemCapabilities>
  • System Identity
  • System ID
    <SystemID> system-id </SystemID>
      • The descriptor <SystemID> contains a number or a string to identify a video system or device.
  • System Name
    <SystemName> system-name </SystemName>
      • The descriptor <SystemName> specifies the name of a video system or device.
  • System Serial Number
    <SystemSerialNumber> system-serial-number </SystemSerialNumber>
      • The descriptor <SystemSerialNumber> specifies the serial number of a video system or device.
  • System Users
  • Users
    <Users>
    <User>
    <UserID> user-id </UserID>
    <UserName> user-name </UserName>
    </User>
    <User>
    <UserID> user-id </UserID>
    <UserName> user-name </UserName>
    </User>
    ...
    </Users>
      • The descriptor <SystemUsers> lists a number of users who have registered on a video system or device. Each user is specified by the descriptor <User>. The descriptor <UserID> specifies a number or a string which should match the number or string specified in <UserID> in one of the user description schemes.
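The <UserID> match is what ties the system description to the user descriptions, so a system can verify that every registered user has a resolvable user description scheme. A minimal consistency-check sketch (user ids and names are hypothetical):

```python
import xml.etree.ElementTree as ET

# Hypothetical registered-user list from a system description.
system_users = """
<Users>
  <User><UserID>u1</UserID><UserName>Alice</UserName></User>
  <User><UserID>u2</UserID><UserName>Bob</UserName></User>
</Users>
"""

def unresolved_users(users_xml, known_ids):
    """Return registered user ids that have no matching user description scheme."""
    listed = {u.findtext("UserID")
              for u in ET.fromstring(users_xml).findall("User")}
    return listed - known_ids

# Suppose only the user description scheme for "u1" is on this device.
print(unresolved_users(system_users, {"u1"}))
```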
  • Programs in the System
  • Categories
    <Categories>
    <Category>
    <CategoryID> category-id </CategoryID>
    <CategoryName> category-name </CategoryName>
    <SubCategories> sub-category-id ... </SubCategories>
    </Category>
    <Category>
    <CategoryID> category-id </CategoryID>
    <CategoryName> category-name </CategoryName>
    <SubCategories> sub-category-id ... </SubCategories>
    </Category>
    ...
    </Categories>
      • The descriptor <Categories> lists a number of categories which have been registered on a video system or device. Each category is specified by the descriptor <Category>. The major-sub relationship between categories is captured by the descriptor <SubCategories>.
  • Channels
    <Channels>
    <Channel>
    <ChannelID> channel-id </ChannelID>
    <ChannelName> channel-name </ChannelName>
    <SubChannels> sub-channel-id ... </SubChannels>
    </Channel>
    <Channel>
    <ChannelID> channel-id </ChannelID>
    <ChannelName> channel-name </ChannelName>
    <SubChannels> sub-channel-id ... </SubChannels>
    </Channel>
    ...
    </Channels>
      • The descriptor <Channels> lists a number of channels which have been registered on a video system or device. Each channel is specified by the descriptor <Channel>. The major-sub relationship between channels is captured by the descriptor <SubChannels>.
  • Programs
    <Programs>
    <CategoryPrograms>
    <CategoryID> category-id </CategoryID>
    <Programs> program-id ... </Programs>
    </CategoryPrograms>
    <CategoryPrograms>
    <CategoryID> category-id </CategoryID>
    <Programs> program-id ... </Programs>
    </CategoryPrograms>
    ...
    <ChannelPrograms>
    <ChannelID> channel-id </ChannelID>
    <Programs> program-id ... </Programs>
    </ChannelPrograms>
    <ChannelPrograms>
    <ChannelID> channel-id </ChannelID>
    <Programs> program-id ... </Programs>
    </ChannelPrograms>
    ...
    </Programs>
      • The descriptor <Programs> lists the programs which are available on a video system or device. The programs are grouped under corresponding categories or channels. Each group of programs is specified by the descriptor <CategoryPrograms> or <ChannelPrograms>. Each program id contained in the descriptor <Programs> should match the number or string specified in <ProgramID> in one of the program description schemes.
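Resolving a category to its program ids is then a direct lookup. A minimal sketch (category, channel, and program ids are hypothetical):

```python
import xml.etree.ElementTree as ET

# Hypothetical program lists grouped by category and by channel.
doc = """
<Programs>
  <CategoryPrograms>
    <CategoryID>news</CategoryID>
    <Programs>p1 p2</Programs>
  </CategoryPrograms>
  <ChannelPrograms>
    <ChannelID>7</ChannelID>
    <Programs>p2 p3</Programs>
  </ChannelPrograms>
</Programs>
"""

def programs_in_category(programs_xml, category_id):
    """Return the program ids registered under the given category id."""
    for group in ET.fromstring(programs_xml).findall("CategoryPrograms"):
        if group.findtext("CategoryID") == category_id:
            return group.findtext("Programs").split()
    return []

print(programs_in_category(doc, "news"))
```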
  • System Capabilities
  • Views
    <Views>
    <View>
    <ViewID> view-id </ViewID>
    <ViewName> view-name </ViewName>
    </View>
    <View>
    <ViewID> view-id </ViewID>
    <ViewName> view-name </ViewName>
    </View>
    ...
    </Views>
  • The descriptor <Views> lists views which are supported by a video system or device. Each view is specified by the descriptor <View>. The descriptor <ViewName> contains a string which should match one of the following views used in the program description schemes: ThumbnailView, SlideView, FrameView, ShotView, KeyFrameView, HighlightView, EventView, and CloseUpView.
  • The present inventors came to the realization that the program description scheme may be further modified to provide additional capabilities. Referring to FIG. 13, the modified program description scheme 400 includes four separate types of information, namely, a syntactic structure description scheme 402, a semantic structure description scheme 404, a visualization description scheme 406, and a meta information description scheme 408. It is to be understood that in any particular system one or more of the description schemes may be included, as desired.
  • Referring to FIG. 14, the visualization description scheme 406 enables fast and effective browsing of video programs (and audio programs) by allowing access to the necessary data, preferably in a one-step process. The visualization description scheme 406 provides for several different presentations of the video content (or audio), such as for example, a thumbnail view description scheme 410, a key frame view description scheme 412, a highlight view description scheme 414, an event view description scheme 416, a close-up view description scheme 418, and an alternative view description scheme 420. Other presentation techniques and description schemes may be added, as desired. The thumbnail view description scheme 410 preferably includes an image 422 or reference to an image representative of the video content and a time reference 424 to the video. The key frame view description scheme 412 preferably includes a level indicator 426 and a time reference 428. The level indicator 426 accommodates the presentation of a different number of key frames for the same video portion depending on the user's preference. The highlight view description scheme 414 includes a length indicator 430 and a time reference 432. The length indicator 430 accommodates the presentation of a different highlight duration of a video depending on the user's preference. The event view description scheme 416 preferably includes an event indicator 434 for the selection of the desired event and a time reference 436. The close-up view description scheme 418 preferably includes a target indicator 438 and a time reference 440. The alternative view description scheme 420 preferably includes a source indicator 442. To increase performance of the system it is preferred to specify the data which is needed to render such views in a centralized and straightforward manner. By doing so, it is then feasible to access the data in a simple one-step process without complex parsing of the video.
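By way of a non-limiting illustrative sketch (hypothetical Python names, not part of the described system), the view descriptions above can be held in a centralized map so that the data needed to render any view is a single lookup rather than a parse of the full video description:

```python
from dataclasses import dataclass

@dataclass
class KeyFrameView:
    level: int            # level indicator: how many key frames to present
    time_refs: list       # time references into the video, one per key frame

@dataclass
class HighlightView:
    length: float         # length indicator: desired highlight duration (seconds)
    time_refs: list       # time references marking the highlight segments

# Centralized registry of view descriptions for one program.
views = {
    "KeyFrameView": KeyFrameView(level=2, time_refs=[0.0, 42.5, 90.0]),
    "HighlightView": HighlightView(length=300.0, time_refs=[120.0, 410.0]),
}

def render_data(view_name):
    """One-step access to the data needed to render a view."""
    return views[view_name]
```

The one-step lookup is the design point: the renderer never scans or parses the underlying video to discover which frames or segments a view requires.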
  • Referring to FIG. 15, the meta information description scheme 408 generally includes various descriptors which carry general information about a video (or audio) program such as the title, category, keywords, etc. Additional descriptors, such as those previously described, may be included, as desired.
  • Referring again to FIG. 13, the syntactic structure description scheme 402 specifies the physical structure of a video program (or audio), e.g., a table of contents. The physical features may include, for example, color, texture, motion, etc. The syntactic structure description scheme 402 preferably includes three modules, namely a segment description scheme 450, a region description scheme 452, and a segment/region relation graph description scheme 454. The segment description scheme 450 may be used to define relationships between different portions of the video consisting of multiple frames of the video. A segment description scheme 450 may contain another segment description scheme 450 and/or shot description scheme to form a segment tree. Such a segment tree may be used to define a temporal structure of a video program. Multiple segment trees may be created and thereby create multiple tables of contents. For example, a video program may be segmented into story units, scenes, and shots, from which the segment description scheme 450 may contain such information as a table of contents. The shot description scheme may contain a number of key frame description schemes, a mosaic description scheme(s), a camera motion description scheme(s), etc. The key frame description scheme may contain a still image description scheme which may in turn contain color and texture descriptors. It is noted that various low level descriptors may be included in the still image description scheme under the segment description scheme. Also, the visual descriptors may be included in the region description scheme which is not necessarily under a still image description scheme. One example of a segment description scheme 450 is shown in FIG. 16.
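The nesting of segment description schemes into a segment tree, and the derivation of a table of contents from it, can be sketched as follows (illustrative Python only; the frame spans and labels are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Segment:
    # A segment covers a contiguous span of frames and may contain
    # sub-segments, forming a segment tree (temporal structure).
    label: str
    start_frame: int
    end_frame: int
    children: list = field(default_factory=list)

def table_of_contents(segment, depth=0):
    """Flatten a segment tree into an indented table of contents."""
    lines = ["  " * depth + f"{segment.label} [{segment.start_frame}-{segment.end_frame}]"]
    for child in segment.children:
        lines.extend(table_of_contents(child, depth + 1))
    return lines

# Example: a program segmented into story units and scenes.
program = Segment("Program", 0, 9000, [
    Segment("Story 1", 0, 4000, [Segment("Scene 1.1", 0, 1500),
                                 Segment("Scene 1.2", 1500, 4000)]),
    Segment("Story 2", 4000, 9000),
])
```

Because segments nest arbitrarily, several independent trees over the same program yield several alternative tables of contents, as the specification notes.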
  • Referring to FIG. 17, the region description scheme 452 defines the interrelationships between groups of pixels of the same and/or different frames of the video. The region description scheme 452 may also contain geometrical features, color, texture features, motion features, etc.
  • Referring to FIG. 18, the segment/region relation graph description scheme 454 defines the interrelationships between a plurality of regions (or region description schemes), a plurality of segments (or segment description schemes), and/or a plurality of regions (or description schemes) and segments (or description schemes).
  • Referring again to FIG. 13, the semantic structure description scheme 404 is used to specify semantic features of a video program (or audio), e.g. semantic events. In a similar manner to the syntactic structure description scheme, the semantic structure description scheme 404 preferably includes three modules, namely an event description scheme 480, an object description scheme 482, and an event/object relation graph description scheme 484. The event description scheme 480 may be used to form relationships between different events of the video normally consisting of multiple frames of the video. An event description scheme 480 may contain another event description scheme 480 to form an event tree. Such an event tree may be used to define a semantic index table for a video program. Multiple event trees may be created, thereby creating multiple index tables. For example, a video program may include multiple events, such as a basketball dunk, a fast break, and a free throw, and the event description scheme may contain such information as an index table. The event description scheme may also contain references which link the event to the corresponding segments and/or regions specified in the syntactic structure description scheme. One example of an event description scheme is shown in FIG. 19.
  • Referring to FIG. 20, the object description scheme 482 defines the interrelationships between groups of pixels of the same and/or different frames of the video representative of objects. The object description scheme 482 may contain another object description scheme and thereby form an object tree. Such an object tree may be used to define an object index table for a video program. The object description scheme may also contain references which link the object to the corresponding segments and/or regions specified in the syntactic structure description scheme.
  • Referring to FIG. 21, the event/object relation graph description scheme 484 defines the interrelationships between a plurality of events (or event description schemes), a plurality of objects (or object description schemes), and/or a plurality of events (or description schemes) and objects (or description schemes).
  • After further consideration, the present inventors came to the realization that the particular design of the user preference description scheme is important to implement portability, while permitting adaptive updating, of the user preference description scheme. Moreover, the user preference description scheme should be readily usable by the system while likewise being suitable for modification based on the user's historical usage patterns. It is possible to collectively track all users of a particular device to build a database for the historical viewing preferences of the users of the device, and thereafter process the data dynamically to determine which content the users would likely desire. However, this implementation would require the storage of a large amount of data and the associated dynamic processing requirements to determine the user preferences. It is to be understood that the user preference description scheme may be used alone or in combination with other description schemes.
  • Referring to FIG. 22, to achieve portability and potentially decreased processing requirements the user preference description scheme 20 should be divided into at least two separate description schemes, namely, a usage preference description scheme 500 and a usage history description scheme 502. The usage preference description scheme 500, described in detail later, includes a description scheme of the user's audio and/or video consumption preferences. The usage preference description scheme 500 describes one or more of the following, depending on the particular implementation, (a) browsing preferences, (b) filtering preferences, (c) searching preferences, and (d) device preferences of the user. The type of preferences shown in the usage preference description scheme 500 are generally immediately usable by the system for selecting and otherwise using the available audio and/or video content. In other words, the usage preference description scheme 500 includes data describing audio and/or video consumption of the user. The usage history description scheme 502, described in detail later, includes a description scheme of the user's historical audio and/or video activity, such as browsing, device settings, viewing, and selection. The usage history description scheme 502 describes one or more of the following, depending on the particular implementation, (a) browsing history, (b) filtering history, (c) searching history, and (d) device usage history. The type of preferences shown in the usage history description scheme 502 are not generally immediately usable by the system for selecting and otherwise using the available audio and/or video content. The data contained in the usage history description scheme 502 may be considered generally “unprocessed”, at least in comparison to the data contained in the usage preferences description scheme 500 because it generally contains the historical usage data of the audio and/or video content of the viewer.
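The two-part split described above can be sketched as a pair of symmetric records (hypothetical Python names; the patent defines these as description schemes, not code): the preference side holds immediately usable settings, while the history side holds the raw, "unprocessed" activity records.

```python
from dataclasses import dataclass, field

@dataclass
class UsagePreferenceDescription:
    # Immediately usable by the system for selecting content.
    browsing: dict = field(default_factory=dict)
    filtering: dict = field(default_factory=dict)
    searching: dict = field(default_factory=dict)
    device: dict = field(default_factory=dict)

@dataclass
class UsageHistoryDescription:
    # Raw records of the user's activity; not directly usable for selection.
    browsing_history: list = field(default_factory=list)
    filtering_history: list = field(default_factory=list)
    searching_history: list = field(default_factory=list)
    device_usage_history: list = field(default_factory=list)
```

Note the field-for-field symmetry between the two records, which is exactly what the following paragraph argues makes updating the preferences from the history straightforward.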
  • In general, capturing the user's usage history facilitates “automatic” composition of user preferences by a machine, as desired. When updating the user preference description scheme 500 it is desirable that the usage history description scheme 502 be relatively symmetric to the usage preference description scheme 500. The symmetry permits more effective updating because less interpretation between the two description schemes is necessary in order to determine what data should be included in the preferences. Numerous algorithms can then be applied in utilization of the history information in deriving user preferences. For instance, statistics can be computed from the history and utilized for this purpose.
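One simple statistic of the kind mentioned above is total viewing time per category; a minimal sketch (hypothetical Python, with made-up history records) of deriving ranked category preferences from raw history:

```python
from collections import Counter

def derive_category_preferences(viewing_history, top_n=2):
    """Derive ranked category preferences from raw viewing history.

    viewing_history: list of (category, minutes_viewed) records.
    Returns the top_n categories weighted by total viewing time.
    """
    totals = Counter()
    for category, minutes in viewing_history:
        totals[category] += minutes
    return [category for category, _ in totals.most_common(top_n)]

# Example raw history records (category, minutes viewed).
history = [("sports", 120), ("news", 30), ("sports", 90), ("drama", 45)]
```

Because history and preferences share the same vocabulary of categories, the derived list can be written directly into the usage preference description with no interpretation step in between.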
  • After consideration of the usage preference description 500 and the usage history description 502, the present inventors came to the realization that in the home environment many different users with different viewing and usage preferences may use the same device. For example, with a male adult preferring sports, a female adult preferring afternoon talk shows, and a three year old child preferring children's programming, the total information contained in the usage preference description 500 and the usage history description 502 will not be individually suitable for any particular user. The resulting composite data and its usage by the device is frustrating to the users because the device will not properly select and present audio and/or video content that is tailored to any particular user. To alleviate this limitation, the user preference description 20 may also include a user identification (user identifier) description 504. The user identification description 504 includes an identification of the particular user that is using the device. By incorporating a user identification description 504 more than one user may use the device while maintaining a different or a unique set of data within the usage preference description 500 and the usage history description 502. Accordingly, the user identification description 504 associates the appropriate usage preference description(s) 500 and usage history description(s) 502 for the particular user identified by the user identification description 504. With multiple user identification descriptions 504, multiple entries within a single user identification description 504 identifying different users, and/or including the user identification description within the usage preference description 500 and/or usage history description 502 to provide the association therebetween, multiple users can readily use the same device while maintaining their individuality. 
Also, without the user identification description in the preferences and/or history, the user may more readily customize content anonymously. In addition, the user's user identification description 504 may be used to identify multiple different sets of usage preference descriptions 500 and usage history descriptions 502, from which the user may select for present interaction with the device depending on usage conditions. The use of multiple user identification descriptions for the same user is useful when the user uses multiple different types of devices, such as a television, a home stereo, a business television, a hotel television, and a vehicle audio player, and maintains multiple different sets of preference descriptions. Further, the identification may likewise be used to identify groups of individuals, such as for example, a family. In addition, for devices that are used on a temporary basis, such as those in hotel rooms or rental cars, the user identification requirements may be overridden by employing a temporary session user identification assigned by such devices. In applications where privacy concerns may be resolved or are otherwise not a concern, the user identification description 504 may also contain demographic information of the user. In this manner, as the usage history description 502 increases during use over time, this demographic data and/or data regarding usage patterns may be made available to other sources. The data may be used for any purpose, such as for example, providing targeted advertising or programming on the device based on such data.
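The association the user identification description provides can be sketched as a simple per-user registry (hypothetical Python; user identifiers and program titles are illustrative): each identifier keys its own preference and history sets, so a shared device never blends one user's activity into another's profile.

```python
# Hypothetical association of user identifiers with per-user descriptions.
profiles = {}

def register_user(user_id):
    profiles[user_id] = {"preferences": {}, "history": []}

def record_viewing(user_id, program):
    # Each user's history stays separate, so composite data from a shared
    # household device does not corrupt any individual's preferences.
    profiles[user_id]["history"].append(program)

register_user("adult-1")
register_user("child-1")
record_viewing("adult-1", "evening-news")
record_viewing("child-1", "cartoon-hour")
```

The same structure accommodates a group identifier (e.g. a family) or a temporary session identifier for a hotel or rental-car device: either is simply another key in the registry.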
  • Referring to FIG. 23, periodically an agent 510 processes the usage history description(s) 502 for a particular user to “automatically” determine the particular user's preferences. In this manner, the user's usage preference description 500 is updated to reflect data stored in the usage history description 502. This processing by the agent 510 is preferably performed on a periodic basis so that during normal operation the usage history description 502 does not need to be processed, or otherwise queried, to determine the user's current browsing, filtering, searching, and device preferences. The usage preference description 500 is relatively compact and suitable for storage on a portable storage device, such as a smart card, for use by other devices as previously described.
  • Frequently, the user may be traveling away from home with his smart card containing his usage preference description 500. During such traveling the user will likely be browsing, filtering, searching, and setting device preferences of audio and/or video content on devices into which he provided his usage preference description 500. However, in some circumstances the audio and/or video content browsed, filtered, and searched, and the device preferences of the user, may not be typical of what he is normally interested in. In addition, for a single device the user may desire more than one profile depending on the season, such as football season, basketball season, baseball season, fall, winter, summer, and spring. Accordingly, it may not be appropriate for the device to create a usage history description 502 and thereafter have the agent 510 "automatically" update the user's usage preference description 500. This would in effect corrupt the user's usage preference description 500. Accordingly, the device should include an option that disables the agent 510 from updating the usage preference description 500. Alternatively, the usage preference description 500 may include one or more fields or data structures that indicate whether or not the user desires the usage preference description 500 (or portions thereof) to be updated.
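The update-disable option described above amounts to a guard on the agent's merge step; a minimal sketch (hypothetical Python, with the flag modeled as a parameter rather than a stored field):

```python
def update_preferences(preferences, history_stats, allow_update=True):
    """Agent step: merge statistics derived from the usage history into
    the usage preferences, but only when the user has not disabled
    automatic updating (e.g. while traveling, or during a special season).
    """
    if not allow_update:
        return preferences  # leave the preference description untouched
    merged = dict(preferences)   # non-destructive: original stays intact
    merged.update(history_stats)
    return merged
```

In the patent's terms, `allow_update` corresponds to the switch field carried inside the usage preference description itself, so the guard travels with the smart card rather than depending on any one device's settings.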
  • Referring to FIG. 24, the device may use the program descriptions provided by any suitable source describing the current and/or future audio and/or video content available from which a filtering agent 520 selects the appropriate content for the particular user(s). The content is selected based upon the usage preference description for a particular user identification(s) to determine a list of preferred audio and/or video programs.
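The filtering agent's selection step can be sketched as a match between program descriptions and the user's preference description (hypothetical Python; program titles, categories, and field names are illustrative):

```python
def filter_programs(program_descriptions, usage_preferences):
    """Filtering-agent sketch: select programs whose category matches
    one of the user's preferred categories."""
    preferred = set(usage_preferences.get("categories", []))
    return [p["title"] for p in program_descriptions
            if p.get("category") in preferred]

# Program descriptions as supplied by a broadcast or other source.
programs = [
    {"title": "Monday Night Game", "category": "sports"},
    {"title": "Cooking at Noon", "category": "lifestyle"},
    {"title": "Playoff Recap", "category": "sports"},
]
prefs = {"categories": ["sports"]}
```

The output is the "list of preferred audio and/or video programs" of FIG. 24; a fuller agent would also weigh keyword, creation, and classification preferences in the same pass.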
  • As it may be observed, with a relatively compact user preference description 500 the user's preferences are readily movable to different devices, such as a personal video recorder, a TiVO player, a RePlay Networks player, a car audio player, or other audio and/or video appliance. Yet, the user preference description 500 may be updated in accordance with the user's browsing, filtering, searching, and device preferences.
  • Referring to FIG. 25, the usage preference description 500 preferably includes three different categories of descriptions, depending on the particular implementation. The preferred descriptions include (a) browsing preferences description 530, (b) filtering and search preferences description 532, and (c) device preferences description 534. The browsing preferences description 530 relates to the viewing preferences of audio and/or video programs. The filtering and search preferences description 532 relates to audio and/or video program level preferences. The program level preferences are not necessarily used at the same time as the (browsing) viewing preferences. For example, preferred programs can be determined as a result of filtering program descriptions according to the user's filtering preferences. A particular preferred program may subsequently be viewed in accordance with the user's browsing preferences. Accordingly, efficient implementation may be achieved if the browsing preferences description 530 is separate, at least logically, from the filtering and search preferences description 532. The device preferences description 534 relates to the preferences for setting up the device in relation to the type of content being presented, e.g. romance, drama, action, violence, evening, morning, day, weekend, weekday, and/or the available presentation devices. For example, presentation devices may include stereo sound, mono sound, surround sound, multiple potential displays, multiple different sets of audio speakers, AC-3, and Dolby Digital. It may likewise be observed that the device preferences description 534 is likewise separate, at least logically, from the browsing description 530 and filtering/search preferences description 532.
  • The browsing preferences description 530 contains descriptors that describe preferences of the user for browsing multimedia (audio and/or video) information. In the case of video, for example, the browsing preferences may include the user's preference for continuous playback of the entire program versus visualizing a short summary of the program. Various summary types may be described in the program descriptions describing multiple different views of programs, where these descriptions are utilized by the device to facilitate rapid non-linear browsing, viewing, and navigation. Parameters of the various summary types should also be specified, e.g., the number of hierarchy levels when the keyframe summary is preferred, or the time duration of the video highlight when the highlight summary is preferred. In addition, browsing preferences may also include descriptors describing parental control settings. A switch descriptor (set by the user) should also be included to specify whether or not the preferences can be modified without consulting the user first. This prevents inadvertent changing or updating of the preferences by the device. In addition, it is desirable that the browsing preferences are media content dependent. For example, a user may prefer a 15 minute video highlight of a basketball game or may prefer to see only the 3-point shots. The same user may prefer a keyframe summary with two levels of hierarchy for home videos.
  • The filtering and search preferences description 532 preferably has four descriptions defined therein, depending on the particular embodiment. The keyword preferences description 540 is used to specify favorite topics that may not be captured in the title, category, etc., information. This permits the acceptance of a query for matching entries in any of the available data fields. The content preferences description 542 is used to capture, for instance, favorite actors and directors. The creation preferences description 544 is used to capture, for instance, titles of favorite shows. The classification preferences description 546 is used to specify, for instance, a favorite program category. A switch descriptor, activated by the user, may be included to specify whether or not the preferences may be modified without consulting the user, as previously described.
  • The device preferences description 534 contains descriptors describing preferred audio and/or video rendering settings, such as volume, balance, bass, treble, brightness, contrast, closed captioning, AC-3, Dolby digital, which display device of several, type of display device, etc. The settings of the device relate to how the user browses and consumes the audio and/or video content. It is desirable to be able to specify the device setting preferences in a media type and content-dependent manner. For example the preferred volume settings for an action movie may be higher than a drama, or the preferred settings of bass for classical music and rock music may be different. A switch descriptor, activated by the user, may be included to specify whether or not the preferences may be modified without consulting the user, as previously described.
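The content-dependent device settings described above can be sketched as a lookup keyed on media type and content genre (hypothetical Python; the genres and setting values are illustrative):

```python
# Hypothetical content-dependent device preference settings.
device_preferences = {
    ("video", "action"):    {"volume": 8, "surround": True},
    ("video", "drama"):     {"volume": 5, "surround": False},
    ("audio", "classical"): {"bass": 3},
    ("audio", "rock"):      {"bass": 7},
}

def settings_for(media_type, genre):
    """Return the preferred rendering settings for the given media type
    and genre, or an empty mapping when no preference is recorded."""
    return device_preferences.get((media_type, genre), {})
```

The two-part key captures the specification's point that the same user legitimately wants different volume or bass settings depending on what is being presented, not merely a single global device setting.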
  • Referring to FIG. 26, the usage preferences description may be used in cooperation with an MPEG-7 compliant data stream and/or device. MPEG-7 descriptions are described in ISO/IEC JTC1/SC29/WG11 “MPEG-7 Media/Meta DSs (V0.2),” August 1999, incorporated by reference herein. It is preferable that media content descriptions are consistent with descriptions of preferences of users consuming the media. Consistency can be achieved by using common descriptors in media and user preference descriptions or by specifying a correspondence between user preferences and media descriptors. Browsing preferences descriptions are preferably consistent with media descriptions describing different views and summaries of the media. The content preferences description 542 is preferably consistent with, e.g., a subset of the content description of the media 553 specified in MPEG-7 by a content description scheme. The classification preferences description 546 is preferably consistent with, e.g., a subset of the classification description 554 defined in MPEG-7 as a classification description scheme. The creation preferences description 544 is preferably consistent with, e.g., a subset of the creation description 556 specified in MPEG-7 by a creation description scheme. The keyword preferences description 540 is preferably a string supporting multiple languages and consistent with corresponding media content description schemes. Consistency between media and user preference descriptions is depicted in FIG. 26 by double arrows in the case of content, creation, and classification preferences.
  • Referring to FIG. 27, the usage history description 502 preferably includes three different categories of descriptions, depending on the particular implementation. The preferred descriptions include (a) browsing history description 560, (b) filtering and search history description 562, and (c) device usage history description 564, as previously described in relation to the usage preference description 500. The filtering and search history description 562 preferably has four descriptions defined therein, depending on the particular embodiment, namely, a keyword usage history description 566, a content usage history description 568, a creation usage history description 570, and a classification usage history description 572, as previously described with respect to the preferences. The usage history description 502 may contain additional descriptors therein (or descriptions if desired) that describe the time and/or time duration of information contained therein. The time refers to the duration of consuming a particular audio and/or video program. The duration of time that a particular program has been viewed provides information that may be used to determine user preferences. For example, if a user only watches a show for 5 minutes then it may not be a suitable preference for inclusion in the usage preference description 500. In addition, the present inventors came to the realization that an even more accurate measure of the user's preference of a particular audio and/or video program is the time viewed in light of the total duration of the program. This accounts for the relative viewing duration of a program. For example, watching 30 minutes of a 4 hour show may be of less relevance than watching 30 minutes of a 30 minute show to determine preference data for inclusion in the usage preference description 500.
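The relative-viewing-duration measure in the example above is simply time viewed divided by total program length; a minimal sketch (hypothetical Python, with the 0.5 cutoff as an assumed threshold not stated in the specification):

```python
def viewing_ratio(minutes_viewed, program_minutes):
    """Relative viewing duration: time viewed over total program length."""
    return minutes_viewed / program_minutes

def is_preference_candidate(minutes_viewed, program_minutes, threshold=0.5):
    # Watching 30 minutes of a 30-minute show (ratio 1.0) signals more
    # interest than 30 minutes of a 4-hour show (ratio 0.125), even though
    # the absolute time viewed is identical.
    return viewing_ratio(minutes_viewed, program_minutes) >= threshold
```

Under this measure the specification's two examples separate cleanly: the 30-of-30-minute viewing qualifies for inclusion in the usage preference description while the 30-of-240-minute viewing does not.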
  • Referring to FIG. 28, an exemplary audio and/or video program receiver with persistent storage is illustrated. As shown, audio/video program descriptions are available from the broadcast or other source, such as a telephone line. The user preference descriptions facilitate personalization of the browsing, filtering and search, and device settings. In this embodiment, the user preferences are stored at the user's terminal with provision for transporting them to other systems, for example via a smart card. Alternatively, the user preferences may be stored in a server and the content adaptation can be performed according to user descriptions at the server, and then the preferred content is transmitted to the user. The user may directly provide the user preferences, if desired. The user preferences and/or user history may likewise be provided to a service provider. The system may employ an application that records the user's usage history in the form of a usage history description, as previously defined. The usage history description is then utilized by another application, e.g., a smart agent, to automatically map usage history to user preferences.
  • The terms and expressions that have been employed in the foregoing specification are used as terms of description and not of limitation, and there is no intention, in the use of such terms and expressions, of excluding equivalents of the features shown and described or portions thereof, it being recognized that the scope of the invention is defined and limited only by the claims that follow.

Claims (5)

1-84. canceled
85. A method of using a system with at least one of audio, image, and a video comprising a plurality of frames comprising the steps of:
(a) providing a usage preferences description wherein said usage preference description includes at least two of a browsing preferences description, a filtering preferences description, a search preferences description, and a device preferences description where,
(i) said browsing preferences description relates to a user's viewing preferences;
(ii) said filtering preferences descriptions and said search preferences descriptions relate to at least one of (1) content preferences of said at least one of audio, image, and video, (2) classification preferences of said at least one of audio, image, and video, (3) keyword preferences of said at least one of audio, image, and video, and (4) creation preferences of said at least one of audio, image, and video; and
(iii) said device preferences description relates to user's preferences regarding presentation characteristics of the presentation device;
(b) providing a usage history description where said usage history description includes at least one of a browsing history description, a filtering history description, a search history description, and a device usage history description where,
(i) said browsing history description relates to a user's viewing history;
(ii) said filtering history description and said search history descriptions relate to at least one of (1) content usage history of said at least one of audio, image, and video, (2) classification usage history of said at least one of audio, image, and video, (3) keyword usage history of said at least one of audio, image, and video, and (4) creation usage history of said at least one of audio, image, and video; and
(iii) said device usage history description relates to user's history regarding presentation characteristics of the presentation device;
(c) selectively updating said usage preferences description based on the content of said usage history description.
86. The method of claim 85 wherein said selective updating is based upon a response from said user.
87. The method of claim 85 wherein said selective updating is based upon at least one setting within at least one of said usage preferences description and said usage history description.
88. The method of claim 87 wherein said selective updating is based upon at least one setting within said usage preferences description.
US10/977,039 1999-09-16 2004-10-28 Audiovisual information management system with selective updating Abandoned US20050060641A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/977,039 US20050060641A1 (en) 1999-09-16 2004-10-28 Audiovisual information management system with selective updating

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US15438899P 1999-09-16 1999-09-16
US54144700A 2000-03-31 2000-03-31
US10/977,039 US20050060641A1 (en) 1999-09-16 2004-10-28 Audiovisual information management system with selective updating

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US54144700A Continuation 1999-09-16 2000-03-31

Publications (1)

Publication Number Publication Date
US20050060641A1 true US20050060641A1 (en) 2005-03-17

Family

ID=46303165

Family Applications (9)

Application Number Title Priority Date Filing Date
US10/977,718 Expired - Fee Related US7178107B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with identification prescriptions
US10/977,716 Expired - Fee Related US7424677B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with usage preferences
US10/977,828 Expired - Fee Related US7197709B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with multiple user identifications
US10/977,859 Expired - Fee Related US7424678B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with advertising
US10/977,717 Expired - Fee Related US7194687B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with user identification
US10/977,885 Expired - Fee Related US7194688B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with seasons
US10/977,039 Abandoned US20050060641A1 (en) 1999-09-16 2004-10-28 Audiovisual information management system with selective updating
US10/977,866 Expired - Fee Related US7181691B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with presentation service
US10/977,536 Expired - Fee Related US7509580B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with preferences descriptions

Family Applications Before (6)

Application Number Title Priority Date Filing Date
US10/977,718 Expired - Fee Related US7178107B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with identification prescriptions
US10/977,716 Expired - Fee Related US7424677B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with usage preferences
US10/977,828 Expired - Fee Related US7197709B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with multiple user identifications
US10/977,859 Expired - Fee Related US7424678B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with advertising
US10/977,717 Expired - Fee Related US7194687B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with user identification
US10/977,885 Expired - Fee Related US7194688B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with seasons

Family Applications After (2)

Application Number Title Priority Date Filing Date
US10/977,866 Expired - Fee Related US7181691B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with presentation service
US10/977,536 Expired - Fee Related US7509580B2 (en) 1999-09-16 2004-10-28 Audiovisual information management system with preferences descriptions

Country Status (1)

Country Link
US (9) US7178107B2 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020129371A1 (en) * 2001-03-08 2002-09-12 Matsushita Elecric Industrial Co., Ltd. Media distribution apparatus and media distribution method
US20060085371A1 (en) * 2002-09-24 2006-04-20 Koninklijke Philips Electronics, N.V. System and method for associating different types of media content
US20060090183A1 (en) * 2004-10-26 2006-04-27 David Zito Method and apparatus for a search-enabled remote control device
US20070089143A1 (en) * 2003-11-10 2007-04-19 Lefevre Chad A Method and apparatus for providing dynamic display of content information associated with a device in a network
US20070136459A1 (en) * 2005-12-09 2007-06-14 Sbc Knowledge Ventures Lp Session continuity in multimedia services
DE102008003914A1 (en) * 2008-01-10 2009-07-16 Deutsche Telekom Ag Service for web radio and internet television
EP2476249A2 (en) * 2009-09-07 2012-07-18 LG Electronics Inc. Image display apparatus and operation method therefore
EP2510682A2 (en) * 2009-12-08 2012-10-17 LG Electronics Inc. Image display apparatus and method for operating the same
US20140298414A1 (en) * 2013-03-27 2014-10-02 Apple Inc. Browsing remote content using a native user interface
US20150279428A1 (en) * 2014-03-27 2015-10-01 Samsung Electronics Co., Ltd. Method for generating thumbnail and electronic device thereof
US20160044388A1 (en) * 2013-03-26 2016-02-11 Orange Generation and delivery of a stream representing audiovisual content
WO2017196851A1 (en) * 2016-05-10 2017-11-16 Rovi Guides, Inc. Systems and methods for notifying different users about missed content by tailoring catch-up segments to each different user
US20180063253A1 (en) * 2015-03-09 2018-03-01 Telefonaktiebolaget Lm Ericsson (Publ) Method, system and device for providing live data streams to content-rendering devices
US10225603B2 (en) * 2017-03-13 2019-03-05 Wipro Limited Methods and systems for rendering multimedia content on a user device
CN110045976A (en) * 2019-05-23 2019-07-23 中国联合网络通信集团有限公司 A kind of update method and system of application
US10694137B2 (en) 2016-05-10 2020-06-23 Rovi Guides, Inc. Systems and methods for resizing content based on a relative importance of the content
CN111414150A (en) * 2019-01-04 2020-07-14 厦门雅基软件有限公司 Game engine rendering method and device, electronic equipment and computer storage medium
US20210084350A1 (en) * 2018-01-11 2021-03-18 Editorji Technologies Private Limited Method and system for customized content
US11012719B2 (en) * 2016-03-08 2021-05-18 DISH Technologies L.L.C. Apparatus, systems and methods for control of sporting event presentation based on viewer engagement
US11778287B2 (en) 2014-10-09 2023-10-03 Stats Llc Generating a customized highlight sequence depicting multiple events
US11863848B1 (en) * 2014-10-09 2024-01-02 Stats Llc User interface for interaction with customized highlight shows
US11882345B2 (en) 2014-10-09 2024-01-23 Stats Llc Customized generation of highlights show with narrative component

Families Citing this family (187)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
US7162532B2 (en) 1998-02-23 2007-01-09 Koehler Steven M System and method for listening to teams in a race event
US7966078B2 (en) 1999-02-01 2011-06-21 Steven Hoffberg Network media appliance system and method
US7050110B1 (en) * 1999-10-29 2006-05-23 Intel Corporation Method and system for generating annotations video
GB2365651B (en) * 1999-12-07 2004-04-21 Sony Corp Information retrieving apparatus and information receipt apparatus,and method of retrieving and receiving information
US7096185B2 (en) * 2000-03-31 2006-08-22 United Video Properties, Inc. User speech interfaces for interactive media guidance applications
US7962948B1 (en) 2000-04-07 2011-06-14 Virage, Inc. Video-enabled community building
US8171509B1 (en) 2000-04-07 2012-05-01 Virage, Inc. System and method for applying a database to video multimedia
US7260564B1 (en) 2000-04-07 2007-08-21 Virage, Inc. Network video guide and spidering
US7212990B1 (en) * 2000-05-31 2007-05-01 Microsoft Corp. System and method for managing and controlling accounts with profile information
JP4529240B2 (en) * 2000-06-13 2010-08-25 ソニー株式会社 Information processing apparatus and method, information processing system, and recording medium
US20060064716A1 (en) * 2000-07-24 2006-03-23 Vivcom, Inc. Techniques for navigating multiple video streams
US20020065678A1 (en) * 2000-08-25 2002-05-30 Steven Peliotis iSelect video
US6774908B2 (en) * 2000-10-03 2004-08-10 Creative Frontier Inc. System and method for tracking an object in a video and linking information thereto
JP2002158627A (en) * 2000-11-16 2002-05-31 Sony Corp Broadcast receiver, viewing information calculation method and viewing information calculation system
US6629104B1 (en) * 2000-11-22 2003-09-30 Eastman Kodak Company Method for adding personalized metadata to a collection of digital images
US20020184336A1 (en) * 2001-03-01 2002-12-05 Rising Hawley K. Occurrence description schemes for multimedia content
US20030088687A1 (en) 2001-12-28 2003-05-08 Lee Begeja Method and apparatus for automatically converting source video into electronic mail messages
US20030163815A1 (en) * 2001-04-06 2003-08-28 Lee Begeja Method and system for personalized multimedia delivery service
US8060906B2 (en) * 2001-04-06 2011-11-15 At&T Intellectual Property Ii, L.P. Method and apparatus for interactively retrieving content related to previous query results
US8479238B2 (en) * 2001-05-14 2013-07-02 At&T Intellectual Property Ii, L.P. Method for content-based non-linear control of multimedia playback
US20060212442A1 (en) * 2001-05-16 2006-09-21 Pandora Media, Inc. Methods of Presenting and Providing Content to a User
US7962482B2 (en) 2001-05-16 2011-06-14 Pandora Media, Inc. Methods and systems for utilizing contextual feedback to generate and modify playlists
US20060206478A1 (en) * 2001-05-16 2006-09-14 Pandora Media, Inc. Playlist generating methods
CA2348353A1 (en) * 2001-05-22 2002-11-22 Marc Arseneau Local broadcast system
GB0122189D0 (en) * 2001-09-13 2001-10-31 Pace Micro Tech Plc Television system
US20030098869A1 (en) * 2001-11-09 2003-05-29 Arnold Glenn Christopher Real time interactive video system
US7925139B2 (en) * 2001-12-03 2011-04-12 Sony Corporation Distributed semantic descriptions of audiovisual content
US20030121058A1 (en) * 2001-12-24 2003-06-26 Nevenka Dimitrova Personal adaptive memory system
US7474327B2 (en) 2002-02-12 2009-01-06 Given Imaging Ltd. System and method for displaying an image stream
US8285111B2 (en) * 2002-04-19 2012-10-09 Tivo Inc. Method and apparatus for creating an enhanced photo digital video disc
US8000584B1 (en) * 2002-04-26 2011-08-16 Tivo Inc. Approach for storing digital content onto digital versatile discs (DVDs)
US7200611B2 (en) 2002-05-13 2007-04-03 Microsoft Corporation TV program database
US6996390B2 (en) * 2002-06-26 2006-02-07 Microsoft Corporation Smart car radio
US20040024780A1 (en) * 2002-08-01 2004-02-05 Koninklijke Philips Electronics N.V. Method, system and program product for generating a content-based table of contents
US7788688B2 (en) * 2002-08-22 2010-08-31 Lg Electronics Inc. Digital TV and method for managing program information
US7882528B1 (en) * 2002-09-19 2011-02-01 Microsoft Corporation Methods and systems for enhancing a user's viewing experience
US7643550B2 (en) * 2002-10-09 2010-01-05 Hewlett-Packard Development Company, L.P. Method for presenting streaming media for an event
US7143352B2 (en) * 2002-11-01 2006-11-28 Mitsubishi Electric Research Laboratories, Inc Blind summarization of video content
KR100513278B1 (en) * 2003-04-17 2005-09-09 삼성전자주식회사 System for supporting user interface and method thereof
US20040210896A1 (en) * 2003-04-21 2004-10-21 Chou Charles C.L. Distributed interactive media authoring and recording
US8180275B2 (en) 2003-07-24 2012-05-15 Sirius Xm Radio Inc. Computer based multi-channel radio system and user interface
US9131272B2 (en) * 2003-11-04 2015-09-08 Universal Electronics Inc. System and method for saving and recalling state data for media and home appliances
TWI227640B (en) * 2003-11-17 2005-02-01 Power Whale Information Ltd Real-time communication device and method for people with the same TV program preference
JP2005275692A (en) * 2004-03-24 2005-10-06 Sony Corp Content providing apparatus, content providing system, web site change apparatus, web site change system, content providing method and web site change method
US20060020966A1 (en) * 2004-07-22 2006-01-26 Thomas Poslinski Program guide with integrated progress bar
JP4835439B2 (en) * 2004-08-10 2011-12-14 ソニー株式会社 Information signal processing method, information signal processing apparatus, and computer program recording medium
JP2006186922A (en) * 2004-12-28 2006-07-13 Toshiba Corp Video display device, video signal output device, and channel selection method for video display device
US20060159339A1 (en) * 2005-01-20 2006-07-20 Motorola, Inc. Method and apparatus as pertains to captured image statistics
TW200704183A (en) * 2005-01-27 2007-01-16 Matrix Tv Dynamic mosaic extended electronic programming guide for television program selection and display
US20060174276A1 (en) 2005-01-31 2006-08-03 Derrenberger Mike A Customer associated profile for accessing audio and video media objects
KR101087102B1 (en) * 2005-02-01 2011-11-25 엘지전자 주식회사 Program information method of digital broadcasting receiver
JP2006227843A (en) * 2005-02-16 2006-08-31 Sony Corp Content information management system, content information management device, content information management method and computer program
US7769189B1 (en) 2005-04-12 2010-08-03 Apple Inc. Preserving noise during editing of a signal
JP4561453B2 (en) * 2005-04-19 2010-10-13 株式会社日立製作所 Recording / reproducing apparatus and recording / reproducing method
US8295682B1 (en) * 2005-07-13 2012-10-23 Apple Inc. Selecting previously-selected segments of a signal
US20070019932A1 (en) * 2005-07-19 2007-01-25 Konica Minolta Technology U.S.A., Inc. Digital photo album producing apparatus
EP2498210A1 (en) 2005-07-22 2012-09-12 Kangaroo Media, Inc. System and methods for enhancing the experience of spectators attending a live sporting event
US8042140B2 (en) 2005-07-22 2011-10-18 Kangaroo Media, Inc. Buffering content on a handheld electronic device
US8538761B1 (en) 2005-08-01 2013-09-17 Apple Inc. Stretching/shrinking selected portions of a signal
US8364294B1 (en) 2005-08-01 2013-01-29 Apple Inc. Two-phase editing of signal data
RU2421800C2 (en) * 2005-08-01 2011-06-20 Конинклейке Филипс Электроникс Н.В. Organisation of content by means of dynamic profile
US8875196B2 (en) 2005-08-13 2014-10-28 Webtuner Corp. System for network and local content access
JP2007096560A (en) * 2005-09-28 2007-04-12 Hitachi Ltd User-taste extractor
US20070101394A1 (en) * 2005-11-01 2007-05-03 Yesvideo, Inc. Indexing a recording of audiovisual content to enable rich navigation
KR100650407B1 (en) * 2005-11-15 2006-11-29 삼성전자주식회사 Method and apparatus for generating video abstract information at high speed on based multi-modal
US8522142B2 (en) * 2005-12-08 2013-08-27 Google Inc. Adaptive media player size
US7584442B2 (en) * 2005-12-09 2009-09-01 Lsi Corporation Method and apparatus for generating memory models and timing database
US7870125B1 (en) * 2005-12-27 2011-01-11 Charter Communications Holding Company Integrated media content server system and method for the customization of metadata that is associated therewith
EP2343889A1 (en) * 2005-12-29 2011-07-13 United Video Properties, Inc. Systems and methods for managing content
US20070157220A1 (en) * 2005-12-29 2007-07-05 United Video Properties, Inc. Systems and methods for managing content
KR100836611B1 (en) * 2006-02-28 2008-06-10 (주)케이티에프테크놀로지스 Apparatus and method for using broadcasting programs as backgroud, and portable terminal using the same
US8732154B2 (en) * 2007-02-28 2014-05-20 Samsung Electronics Co., Ltd. Method and system for providing sponsored information on electronic devices
US20080221989A1 (en) * 2007-03-09 2008-09-11 Samsung Electronics Co., Ltd. Method and system for providing sponsored content on an electronic device
US9507778B2 (en) 2006-05-19 2016-11-29 Yahoo! Inc. Summarization of media object collections
JP2007324870A (en) * 2006-05-31 2007-12-13 Canon Inc Recording and reproducing device, recording and reproducing method, and program
US20070283268A1 (en) * 2006-06-06 2007-12-06 Berger Adam L Advertising delivery
US20080005679A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Context specific user interface
US20080005166A1 (en) * 2006-06-29 2008-01-03 International Business Machines Corporation Dynamic search result of audio-visual and related content
US7712052B2 (en) * 2006-07-31 2010-05-04 Microsoft Corporation Applications of three-dimensional environments constructed from images
CN101536374B (en) * 2006-09-29 2012-05-09 意大利电信股份公司 Method of enjoying broadcasted communication services through distinct electronic apparatuses
BRPI0605830A2 (en) * 2006-10-10 2014-11-04 Scopus Tecnologia Ltda METHOD FOR INSTALLING ELECTRONIC TRANSACTIONS TO AN INSTITUTION
US8594702B2 (en) 2006-11-06 2013-11-26 Yahoo! Inc. Context server for associating information based on context
US8402356B2 (en) 2006-11-22 2013-03-19 Yahoo! Inc. Methods, systems and apparatus for delivery of media
US20080120178A1 (en) * 2006-11-22 2008-05-22 Ronald Martinez Methods, Systems and Apparatus for Delivery of Media
US9110903B2 (en) 2006-11-22 2015-08-18 Yahoo! Inc. Method, system and apparatus for using user profile electronic device data in media delivery
US20080127272A1 (en) * 2006-11-28 2008-05-29 Brian John Cragun Aggregation of Multiple Media Streams to a User
US8812637B2 (en) 2006-11-28 2014-08-19 International Business Machines Corporation Aggregation of multiple media streams to a user
US8769099B2 (en) 2006-12-28 2014-07-01 Yahoo! Inc. Methods and systems for pre-caching information on a mobile computing device
KR101387831B1 (en) * 2007-02-08 2014-04-23 엘지전자 주식회사 Method for user interface in data broadcasting receiver
CN101022539B (en) * 2007-02-15 2010-08-25 华为技术有限公司 Method, system, interactive EPG and business middleware for realizing business service control
US8769559B2 (en) 2007-03-30 2014-07-01 Verizon Patent And Licensing Inc. Systems and methods for using incentives to increase advertising effectiveness
US20080250315A1 (en) * 2007-04-09 2008-10-09 Nokia Corporation Graphical representation for accessing and representing media files
EP2153563A4 (en) * 2007-05-30 2011-04-27 Creatier Interactive Llc Method and system for enabling advertising and transaction within user generated video content
US9047374B2 (en) * 2007-06-08 2015-06-02 Apple Inc. Assembling video content
US9542394B2 (en) * 2007-06-14 2017-01-10 Excalibur Ip, Llc Method and system for media-based event generation
KR20090011518A (en) * 2007-07-26 2009-02-02 엘지전자 주식회사 Apparatus and method for displaying
US20090062943A1 (en) * 2007-08-27 2009-03-05 Sony Computer Entertainment Inc. Methods and apparatus for automatically controlling the sound level based on the content
US20090106807A1 (en) * 2007-10-19 2009-04-23 Hitachi, Ltd. Video Distribution System for Switching Video Streams
US9594844B2 (en) * 2007-11-08 2017-03-14 Microsoft Technology Licensing, Llc Selectively deleting items that are not of interest to a user
US8015192B2 (en) * 2007-11-20 2011-09-06 Samsung Electronics Co., Ltd. Cliprank: ranking media content using their relationships with end users
US8069142B2 (en) * 2007-12-06 2011-11-29 Yahoo! Inc. System and method for synchronizing data on a network
US20090150507A1 (en) * 2007-12-07 2009-06-11 Yahoo! Inc. System and method for prioritizing delivery of communications via different communication channels
US8307029B2 (en) * 2007-12-10 2012-11-06 Yahoo! Inc. System and method for conditional delivery of messages
US8671154B2 (en) 2007-12-10 2014-03-11 Yahoo! Inc. System and method for contextual addressing of communications on a network
US8166168B2 (en) * 2007-12-17 2012-04-24 Yahoo! Inc. System and method for disambiguating non-unique identifiers using information obtained from disparate communication channels
US8365235B2 (en) * 2007-12-18 2013-01-29 Netflix, Inc. Trick play of streaming media
US9305087B2 (en) * 2007-12-20 2016-04-05 Google Technology Holdings Method and apparatus for acquiring content-based capital via a sharing technology
US8059891B2 (en) * 2007-12-30 2011-11-15 Intel Corporation Markov stationary color descriptor
US9706345B2 (en) 2008-01-04 2017-07-11 Excalibur Ip, Llc Interest mapping system
US9626685B2 (en) 2008-01-04 2017-04-18 Excalibur Ip, Llc Systems and methods of mapping attention
US8762285B2 (en) 2008-01-06 2014-06-24 Yahoo! Inc. System and method for message clustering
US20090182618A1 (en) 2008-01-16 2009-07-16 Yahoo! Inc. System and Method for Word-of-Mouth Advertising
US8560390B2 (en) 2008-03-03 2013-10-15 Yahoo! Inc. Method and apparatus for social network marketing with brand referral
US8554623B2 (en) 2008-03-03 2013-10-08 Yahoo! Inc. Method and apparatus for social network marketing with consumer referral
US8538811B2 (en) 2008-03-03 2013-09-17 Yahoo! Inc. Method and apparatus for social network marketing with advocate referral
EP2099198A1 (en) * 2008-03-05 2009-09-09 Sony Corporation Method and device for personalizing a multimedia application
US8745133B2 (en) 2008-03-28 2014-06-03 Yahoo! Inc. System and method for optimizing the storage of data
US8589486B2 (en) 2008-03-28 2013-11-19 Yahoo! Inc. System and method for addressing communications
US8271506B2 (en) 2008-03-31 2012-09-18 Yahoo! Inc. System and method for modeling relationships between entities
EP2286356A4 (en) * 2008-06-03 2013-03-06 Whirlpool Co Appliance development toolkit
US8452855B2 (en) 2008-06-27 2013-05-28 Yahoo! Inc. System and method for presentation of media related to a context
US8706406B2 (en) 2008-06-27 2014-04-22 Yahoo! Inc. System and method for determination and display of personalized distance
US8813107B2 (en) * 2008-06-27 2014-08-19 Yahoo! Inc. System and method for location based media delivery
US8086700B2 (en) 2008-07-29 2011-12-27 Yahoo! Inc. Region and duration uniform resource identifiers (URI) for media objects
US8583668B2 (en) 2008-07-30 2013-11-12 Yahoo! Inc. System and method for context enhanced mapping
US10230803B2 (en) 2008-07-30 2019-03-12 Excalibur Ip, Llc System and method for improved mapping and routing
JP4924565B2 (en) * 2008-08-01 2012-04-25 沖電気工業株式会社 Information processing system and viewing effect measurement method
US8386506B2 (en) 2008-08-21 2013-02-26 Yahoo! Inc. System and method for context enhanced messaging
US8259082B2 (en) 2008-09-12 2012-09-04 At&T Intellectual Property I, L.P. Multimodal portable communication interface for accessing video content
US8281027B2 (en) 2008-09-19 2012-10-02 Yahoo! Inc. System and method for distributing media related to a location
US9600484B2 (en) 2008-09-30 2017-03-21 Excalibur Ip, Llc System and method for reporting and analysis of media consumption data
US8108778B2 (en) 2008-09-30 2012-01-31 Yahoo! Inc. System and method for context enhanced mapping within a user interface
US9237295B2 (en) * 2008-10-15 2016-01-12 Samsung Electronics Co., Ltd. System and method for keyframe analysis and distribution from broadcast television
US20100095345A1 (en) * 2008-10-15 2010-04-15 Samsung Electronics Co., Ltd. System and method for acquiring and distributing keyframe timelines
US8024317B2 (en) 2008-11-18 2011-09-20 Yahoo! Inc. System and method for deriving income from URL based context queries
US8032508B2 (en) 2008-11-18 2011-10-04 Yahoo! Inc. System and method for URL based query for retrieving data related to a context
US8060492B2 (en) * 2008-11-18 2011-11-15 Yahoo! Inc. System and method for generation of URL based context queries
US9805123B2 (en) 2008-11-18 2017-10-31 Excalibur Ip, Llc System and method for data privacy in URL based context queries
US9224172B2 (en) 2008-12-02 2015-12-29 Yahoo! Inc. Customizable content for distribution in social networks
US8055675B2 (en) 2008-12-05 2011-11-08 Yahoo! Inc. System and method for context based query augmentation
US8166016B2 (en) 2008-12-19 2012-04-24 Yahoo! Inc. System and method for automated service recommendations
KR101564415B1 (en) * 2009-01-07 2015-10-30 삼성전자주식회사 Method and apparatus for playing contents by integrated channel management
US20100185518A1 (en) * 2009-01-21 2010-07-22 Yahoo! Inc. Interest-based activity marketing
US20100205238A1 (en) * 2009-02-06 2010-08-12 International Business Machines Corporation Methods and apparatus for intelligent exploratory visualization and analysis
US20100241689A1 (en) * 2009-03-19 2010-09-23 Yahoo! Inc. Method and apparatus for associating advertising with computer enabled maps
US8150967B2 (en) 2009-03-24 2012-04-03 Yahoo! Inc. System and method for verified presence tracking
US9172482B2 (en) 2009-03-31 2015-10-27 At&T Intellectual Property I, L.P. Content recommendations based on personal preferences
US20100280913A1 (en) * 2009-05-01 2010-11-04 Yahoo! Inc. Gift credit matching engine
US20100303448A1 (en) * 2009-05-29 2010-12-02 Samsung Electronics Co., Ltd. Electronic apparatus and method for recording content
US10223701B2 (en) 2009-08-06 2019-03-05 Excalibur Ip, Llc System and method for verified monetization of commercial campaigns
US8914342B2 (en) 2009-08-12 2014-12-16 Yahoo! Inc. Personal data platform
KR101615262B1 (en) * 2009-08-12 2016-04-26 삼성전자주식회사 Method and apparatus for encoding and decoding multi-channel audio signal using semantic information
US8364611B2 (en) 2009-08-13 2013-01-29 Yahoo! Inc. System and method for precaching information on a mobile device
US20110078572A1 (en) * 2009-09-30 2011-03-31 Rovi Technologies Corporation Systems and methods for analyzing clickstream data
US8769560B2 (en) * 2009-10-13 2014-07-01 At&T Intellectual Property I, L.P. System and method to obtain content and generate modified content based on a time limited content consumption state
KR20110047398A (en) * 2009-10-30 2011-05-09 삼성전자주식회사 Image providing system and image providing mehtod of the same
US8621514B2 (en) 2010-06-23 2013-12-31 Echostar Broadcasting Corporation Apparatus, systems and methods for a video thumbnail electronic program guide
USD667021S1 (en) * 2010-09-24 2012-09-11 Research In Motion Limited Display screen with graphical user interface
US20120113239A1 (en) * 2010-11-08 2012-05-10 Hagai Krupnik System and method for displaying an image stream
CN103636147A (en) 2011-05-17 2014-03-12 韦伯图纳公司 System and method for scalable high-accuracy sensor and ID-based audience measurement system
AU2012258732A1 (en) 2011-05-24 2013-12-12 WebTuner, Corporation System and method to increase efficiency and speed of analytics report generation in Audience Measurement Systems
WO2012162693A1 (en) 2011-05-26 2012-11-29 WebTuner, Corporation Highly scalable audience measurement system with client event pre-processing
US9106940B2 (en) * 2011-09-19 2015-08-11 Time Warner Cable Enterprises Llc Methods and apparatus for customizing video services provided to customers in hotels
US9110891B2 (en) * 2011-12-12 2015-08-18 Google Inc. Auto-translation for multi user audio and video
GB2511668A (en) * 2012-04-12 2014-09-10 Supercell Oy System and method for controlling technical processes
US20140328570A1 (en) * 2013-01-09 2014-11-06 Sri International Identifying, describing, and sharing salient events in images and videos
KR101989599B1 (en) * 2012-09-10 2019-06-14 네이버 주식회사 Method and system for effective search retargeting in search advertising
USD755211S1 (en) * 2012-11-28 2016-05-03 Lg Electronics Inc. Display screen with graphical user interface
JP6814236B2 (en) * 2012-11-30 2021-01-13 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Information processing method
US9589594B2 (en) 2013-02-05 2017-03-07 Alc Holdings, Inc. Generation of layout of videos
USD766286S1 (en) * 2013-12-05 2016-09-13 Lg Electronics Inc. Display of a television receiver with graphical user interface
USD766288S1 (en) * 2013-12-05 2016-09-13 Lg Electronics Inc. Display of a television receiver with graphical user interface
USD766287S1 (en) * 2013-12-05 2016-09-13 Lg Electronics Inc. Display of a television receiver with graphical user interface
USD754680S1 (en) * 2013-12-05 2016-04-26 Lg Electronics Inc. Display of a television receiver with graphical user interface
USD771087S1 (en) * 2013-12-05 2016-11-08 Lg Electronics Inc. Display of a television receiver with graphical user interface
US9215510B2 (en) 2013-12-06 2015-12-15 Rovi Guides, Inc. Systems and methods for automatically tagging a media asset based on verbal input and playback adjustments
USD755202S1 (en) * 2013-12-30 2016-05-03 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD754678S1 (en) * 2013-12-30 2016-04-26 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD749111S1 (en) 2014-01-24 2016-02-09 Microsoft Corporation Display screen with graphical user interface
USD770486S1 (en) 2014-01-24 2016-11-01 Microsoft Corporation Display screen with graphical user interface
USD764524S1 (en) 2014-08-28 2016-08-23 Microsoft Corporation Display screen with graphical user interface
USD764525S1 (en) 2014-08-28 2016-08-23 Microsoft Corporation Display screen with graphical user interface
RU2583764C1 (en) 2014-12-03 2016-05-10 Общество С Ограниченной Ответственностью "Яндекс" Method of processing request for user to access web resource and server
CN108763369B (en) * 2018-05-17 2021-01-05 北京奇艺世纪科技有限公司 Video searching method and device
CN108804598A (en) * 2018-05-29 2018-11-13 王妃 Cloud atlas distributed video sorting technique
US11722739B2 (en) * 2021-07-02 2023-08-08 Datashapes, Inc. Navigating content by relevance

Citations (95)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4183056A (en) * 1977-05-23 1980-01-08 Kewp Electronic Systems, Inc. Apparatus and method for monitoring sports contests
US4253108A (en) * 1979-06-04 1981-02-24 Zenith Radio Corporation Control for color killer and automatic color limiter
US4321635A (en) * 1979-04-20 1982-03-23 Teac Corporation Apparatus for selective retrieval of information streams or items
US4324402A (en) * 1979-01-05 1982-04-13 Mattel, Inc. Electronic baseball game
US4729044A (en) * 1985-02-05 1988-03-01 Lex Computing & Management Corporation Method and apparatus for playing serially stored segments in an arbitrary sequence
US5012334A (en) * 1990-01-29 1991-04-30 Dubner Computer Systems, Inc. Video image bank for storing and retrieving video image sequences
US5101364A (en) * 1990-02-09 1992-03-31 Massachusetts Institute Of Technology Method and facility for dynamic video composition and viewing
US5109482A (en) * 1989-01-11 1992-04-28 David Bohrman Interactive video control system for displaying user-selectable clips
US5200825A (en) * 1992-07-01 1993-04-06 Beam Laser Systems, Inc. Commercial insertion system remotely controlling multiple video switches
US5288069A (en) * 1992-11-20 1994-02-22 Susan Matsumoto Talking football
USD354059S (en) * 1992-12-03 1995-01-03 Discovery Communications, Inc. Remote control unit
US5381477A (en) * 1993-02-16 1995-01-10 Scientific-Atlanta, Inc. Method of selecting cable television converter groups
US5404316A (en) * 1992-08-03 1995-04-04 Spectra Group Ltd., Inc. Desktop digital video processing system
US5410344A (en) * 1993-09-22 1995-04-25 Arrowsmith Technologies, Inc. Apparatus and method of selecting video programs based on viewers' preferences
US5483278A (en) * 1992-05-27 1996-01-09 Philips Electronics North America Corporation System and method for finding a movie of interest in a large movie database
USD368263S (en) * 1994-07-12 1996-03-26 Discovery Communications, Inc. Remote control unit
US5600781A (en) * 1994-09-30 1997-02-04 Intel Corporation Method and apparatus for creating a portable personalized operating environment
US5600364A (en) * 1992-12-09 1997-02-04 Discovery Communications, Inc. Network controller for cable television delivery systems
US5600573A (en) * 1992-12-09 1997-02-04 Discovery Communications, Inc. Operations center with video storage for a television program packaging and delivery system
US5610653A (en) * 1992-02-07 1997-03-11 Abecassis; Max Method and system for automatically tracking a zoomed video image
US5710884A (en) * 1995-03-29 1998-01-20 Intel Corporation System for automatically updating personal profile server with updates to additional user information gathered from monitoring user's electronic consuming habits generated on computer during use
US5717814A (en) * 1992-02-07 1998-02-10 Max Abecassis Variable-content video retriever
US5717923A (en) * 1994-11-03 1998-02-10 Intel Corporation Method and apparatus for dynamically customizing electronic information to individual end users
US5727129A (en) * 1996-06-04 1998-03-10 International Business Machines Corporation Network system for profiling and actively facilitating user activities
US5732216A (en) * 1996-10-02 1998-03-24 Internet Angles, Inc. Audio message exchange system
US5734853A (en) * 1992-12-09 1998-03-31 Discovery Communications, Inc. Set top terminal for cable television delivery systems
US5857190A (en) * 1996-06-27 1999-01-05 Microsoft Corporation Event logging system and method for logging events in a network system
US5861881A (en) * 1991-11-25 1999-01-19 Actv, Inc. Interactive computer system for providing an interactive presentation with personalized video, audio and graphics responses for multiple viewers
US5867226A (en) * 1995-11-17 1999-02-02 Thomson Consumer Electronics, Inc. Scheduler employing a predictive agent for use in a television receiver
US5867386A (en) * 1991-12-23 1999-02-02 Hoffberg; Steven M. Morphological pattern recognition based controller system
US5875107A (en) * 1996-12-05 1999-02-23 Mitsubishi Denki Kabushiki Kaisha Inverter apparatus
US5878222A (en) * 1994-11-14 1999-03-02 Intel Corporation Method and apparatus for controlling video/audio and channel selection for a communication signal based on channel data indicative of channel contents of a signal
US5877821A (en) * 1997-01-30 1999-03-02 Motorola, Inc. Multimedia input and control apparatus and method for multimedia communications
US5892536A (en) * 1996-10-03 1999-04-06 Personal Audio Systems and methods for computer enhanced broadcast monitoring
US6014183A (en) * 1997-08-06 2000-01-11 Imagine Products, Inc. Method and apparatus for detecting scene changes in a digital video stream
US6020883A (en) * 1994-11-29 2000-02-01 Fred Herz System and method for scheduling broadcast of and access to video programs and other data using customer profiles
US6029195A (en) * 1994-11-29 2000-02-22 Herz; Frederick S. M. System for customized electronic identification of desirable objects
US6038367A (en) * 1992-02-07 2000-03-14 Abecassis; Max Playing a video responsive to a comparison of two sets of content preferences
US6041323A (en) * 1996-04-17 2000-03-21 International Business Machines Corporation Information search method, information search device, and storage medium for storing an information search program
US6049821A (en) * 1997-01-24 2000-04-11 Motorola, Inc. Proxy host computer and method for accessing and retrieving information between a browser and a proxy
US6055018A (en) * 1997-11-04 2000-04-25 Ati Technologies, Inc. System and method for reconstructing noninterlaced captured content for display on a progressive screen
US6055569A (en) * 1998-01-27 2000-04-25 Go Ahead Software Inc. Accelerating web access by predicting user action
US6169542B1 (en) * 1998-12-14 2001-01-02 Gte Main Street Incorporated Method of delivering advertising through an interactive video distribution system
US6177931B1 (en) * 1996-12-19 2001-01-23 Index Systems, Inc. Systems and methods for displaying and recording control interface with television programs, video, advertising information and program scheduling information
US6181335B1 (en) * 1992-12-09 2001-01-30 Discovery Communications, Inc. Card for a set top terminal
US6185625B1 (en) * 1996-12-20 2001-02-06 Intel Corporation Scaling proxy server sending to the client a graphical user interface for establishing object encoding preferences after receiving the client's request for the object
US6195497B1 (en) * 1993-10-25 2001-02-27 Hitachi, Ltd. Associated image retrieving apparatus and method
US6198767B1 (en) * 1995-03-27 2001-03-06 International Business Machines Corporation Apparatus for color component compression
US6199076B1 (en) * 1996-10-02 2001-03-06 James Logan Audio program player including a dynamic program selection controller
US6201536B1 (en) * 1992-12-09 2001-03-13 Discovery Communications, Inc. Network manager for cable television system headends
US6212527B1 (en) * 1996-07-08 2001-04-03 Survivors Of The Shoah Visual History Foundation Method and apparatus for cataloguing multimedia data
US6215526B1 (en) * 1998-11-06 2001-04-10 Tivo, Inc. Analog video tagging and encoding system
US6216129B1 (en) * 1998-12-03 2001-04-10 Expanse Networks, Inc. Advertisement selection system supporting discretionary target market characteristics
US6219837B1 (en) * 1997-10-23 2001-04-17 International Business Machines Corporation Summary frames in video
US6339842B1 (en) * 1998-06-10 2002-01-15 Dennis Sunga Fernandez Digital television with subscriber conference overlay
US6342904B1 (en) * 1998-12-17 2002-01-29 Newstakes, Inc. Creating a slide presentation from full motion video
US20020013943A1 (en) * 2000-04-07 2002-01-31 Seth Haberman System and method for simultaneous broadcast for personalized messages
US20020018594A1 (en) * 2000-07-06 2002-02-14 Mitsubishi Electric Research Laboratories, Inc. Method and system for high-level structure analysis and event detection in domain specific videos
US20020026345A1 (en) * 2000-03-08 2002-02-28 Ari Juels Targeted delivery of informational content with privacy protection
US6353444B1 (en) * 1998-03-05 2002-03-05 Matsushita Electric Industrial Co., Ltd. User interface apparatus and broadcast receiving apparatus
US6363380B1 (en) * 1998-01-13 2002-03-26 U.S. Philips Corporation Multimedia computer system with story segmentation capability and operating program therefor including finite automaton video parser
US6363160B1 (en) * 1999-01-22 2002-03-26 Intel Corporation Interface using pattern recognition and tracking
US6370504B1 (en) * 1997-05-29 2002-04-09 University Of Washington Speech recognition on MPEG/Audio encoded files
US6370688B1 (en) * 1999-05-26 2002-04-09 Enounce, Inc. Method and apparatus for server broadcast of time-converging multi-media streams
US6374404B1 (en) * 1998-12-16 2002-04-16 Sony Corporation Of Japan Intelligent device having background caching of web pages from a digital television broadcast signal and method of same
US20030001880A1 (en) * 2001-04-18 2003-01-02 Parkervision, Inc. Method, system, and computer program product for producing and distributing enhanced media
US20030007555A1 (en) * 2001-04-27 2003-01-09 Mitsubishi Electric Research Laboratories, Inc. Method for summarizing a video using motion descriptors
US20030026592A1 (en) * 2000-12-28 2003-02-06 Minoru Kawahara Content creating device and method
US20030033288A1 (en) * 2001-08-13 2003-02-13 Xerox Corporation Document-centric system with auto-completion and auto-correction
US6522342B1 (en) * 1999-01-27 2003-02-18 Hughes Electronics Corporation Graphical tuning bar for a multi-program data stream
US6530082B1 (en) * 1998-04-30 2003-03-04 Wink Communications, Inc. Configurable monitoring of program viewership and usage of interactive applications
US6535639B1 (en) * 1999-03-12 2003-03-18 Fuji Xerox Co., Ltd. Automatic video summarization using a measure of shot importance and a frame-packing method
US6542546B1 (en) * 2000-02-02 2003-04-01 Mitsubishi Electric Research Laboratories, Inc. Adaptable compressed bitstream transcoder
US20030067554A1 (en) * 2000-09-25 2003-04-10 Klarfeld Kenneth A. System and method for personalized TV
US20040003041A1 (en) * 2002-04-02 2004-01-01 Worldcom, Inc. Messaging response system
US6675158B1 (en) * 1999-11-30 2004-01-06 Sony Corporation Method and apparatus for organizing data pertaining to audiovisual content
US6678659B1 (en) * 1997-06-20 2004-01-13 Swisscom Ag System and method of voice information dissemination over a network using semantic representation
US6678635B2 (en) * 2001-01-23 2004-01-13 Intel Corporation Method and system for detecting semantic events
US6681395B1 (en) * 1998-03-20 2004-01-20 Matsushita Electric Industrial Company, Ltd. Template set for generating a hypertext for displaying a program guide and subscriber terminal with EPG function using such set broadcast from headend
US20040015569A1 (en) * 2002-07-16 2004-01-22 Mikko Lonnfors System and method for providing partial presence notifications
US20040017389A1 (en) * 2002-07-25 2004-01-29 Hao Pan Summarization of soccer video content
US6691126B1 (en) * 2000-06-14 2004-02-10 International Business Machines Corporation Method and apparatus for locating multi-region objects in an image or video database
US20040030750A1 (en) * 2002-04-02 2004-02-12 Worldcom, Inc. Messaging response system
US20040032486A1 (en) * 2002-08-16 2004-02-19 Shusman Chad W. Method and apparatus for interactive programming using captioning
US6697523B1 (en) * 2000-08-09 2004-02-24 Mitsubishi Electric Research Laboratories, Inc. Method for summarizing a video using motion and color descriptors
US6704929B1 (en) * 1999-08-18 2004-03-09 Webtv Networks, Inc. Tracking viewing behavior of a home entertainment system
US20050021784A1 (en) * 2001-09-07 2005-01-27 Christian Prehofer Device and method for the automatic configuration of user profiles
US20050028194A1 (en) * 1998-01-13 2005-02-03 Elenbaas Jan Hermanus Personalized news retrieval system
US20050055713A1 (en) * 2003-09-09 2005-03-10 Samsung Electronics Co., Ltd. Apparatus and method for sharing recommended programs using digital set-top boxes
US6868440B1 (en) * 2000-02-04 2005-03-15 Microsoft Corporation Multi-level skimming of multimedia content using playlists
US6983478B1 (en) * 2000-02-01 2006-01-03 Bellsouth Intellectual Property Corporation Method and system for tracking network use
US6993245B1 (en) * 1999-11-18 2006-01-31 Vulcan Patents Llc Iterative, maximally probable, batch-mode commercial detection for audiovisual content
US7003792B1 (en) * 1998-11-30 2006-02-21 Index Systems, Inc. Smart agent based on habit, statistical inference and psycho-demographic profiling
US20070011148A1 (en) * 1998-11-12 2007-01-11 Accenture Llp System, method and article of manufacture for advanced information gathering for targeted activities
US7185355B1 (en) * 1998-03-04 2007-02-27 United Video Properties, Inc. Program guide system with preference profiles

Family Cites Families (110)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4252108A (en) * 1979-01-29 1981-02-24 Drow John P Solar heating device
US4298884A (en) 1980-03-31 1981-11-03 Zenith Radio Corporation Chroma amplifier and color killer
US4520404A (en) 1982-08-23 1985-05-28 Kohorn H Von System, apparatus and method for recording and editing broadcast transmissions
US4937685A (en) 1983-12-02 1990-06-26 Lex Computer And Management Corporation Method of display presentation for video editing
JP3002471B2 (en) 1988-08-19 2000-01-24 株式会社日立製作所 Program distribution device
US5241671C1 (en) * 1989-10-26 2002-07-02 Encyclopaedia Britannica Educa Multimedia search system using a plurality of entry path means which indicate interrelatedness of information
US5222924A (en) 1990-01-31 1993-06-29 Chan Shin Over-drive gear device
USD348521S (en) * 1990-05-11 1994-07-05 Eastman Kodak Company Immunoassay unit for analysis of biological liquid
US5148154A (en) 1990-12-04 1992-09-15 Sony Corporation Of America Multi-dimensional user interface
US5459830A (en) 1991-07-22 1995-10-17 Sony Corporation Animation data index creation drawn from image data sampling composites
US5901246A (en) * 1995-06-06 1999-05-04 Hoffberg; Steven M. Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
US6081750A (en) 1991-12-23 2000-06-27 Hoffberg; Steven Mark Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
US5875108A (en) * 1991-12-23 1999-02-23 Hoffberg; Steven M. Ergonomic man-machine interface incorporating adaptive pattern recognition based control system
US6553178B2 (en) * 1992-02-07 2003-04-22 Max Abecassis Advertisement subsidized video-on-demand system
US5684918A (en) 1992-02-07 1997-11-04 Abecassis; Max System for integrating video and communications
US5434678A (en) 1993-01-11 1995-07-18 Abecassis; Max Seamless transmission of non-sequential video segments
JP3038077B2 (en) 1992-03-11 2000-05-08 日本ビクター株式会社 Digital ACC circuit and digital chroma killer circuit
US5223924A (en) 1992-05-27 1993-06-29 North American Philips Corporation System and method for automatically correlating user preferences with a T.V. program information database
SE470037C (en) 1992-10-27 1995-03-23 Ericsson Telefon Ab L M Device for mobile telecommunication systems to enable synchronization of the transmitters of the base stations
US5371551A (en) 1992-10-29 1994-12-06 Logan; James Time delayed digital video system using concurrent recording and playback
USD348251S (en) 1992-12-09 1994-06-28 Discovery Communications, Inc. Menu control panel for a universal remote control unit
US5986690A (en) 1992-12-09 1999-11-16 Discovery Communications, Inc. Electronic book selection and delivery system
US5798785A (en) 1992-12-09 1998-08-25 Discovery Communications, Inc. Terminal for suggesting programs offered on a television program delivery system
US5659350A (en) 1992-12-09 1997-08-19 Discovery Communications, Inc. Operations center for a television program packaging and delivery system
US5333091B2 (en) 1993-01-08 1996-12-17 Arthur D Little Enterprises Method and apparatus for controlling a videotape player to automatically scan past recorded commercial messages
JP3297914B2 (en) 1993-01-08 2002-07-02 ソニー株式会社 Television receiver
US5987211A (en) 1993-01-11 1999-11-16 Abecassis; Max Seamless transmission of non-sequential video segments
US5339393A (en) 1993-04-15 1994-08-16 Sony Electronics, Inc. Graphical user interface for displaying available source material for editing
CA2121151A1 (en) 1993-04-16 1994-10-17 Trevor Lambert Method and apparatus for automatic insertion of a television signal from a remote source
USD368991S (en) * 1993-08-20 1996-04-16 Booda Products, Inc. Chew toy for dogs
AU7975094A (en) 1993-10-12 1995-05-04 Orad, Inc. Sports event video
US5550965A (en) 1993-12-27 1996-08-27 Lucent Technologies Inc. Method and system for operating a data processor to index primary data in real time with iconic table of contents
CA2140850C (en) 1994-02-24 1999-09-21 Howard Paul Katseff Networked system for display of multimedia presentations
US5521841A (en) 1994-03-31 1996-05-28 Siemens Corporate Research, Inc. Browsing contents of a given video sequence
US6230501B1 (en) * 1994-04-14 2001-05-15 Promxd Technology, Inc. Ergonomic systems and methods providing intelligent adaptive surfaces and temperature control
EP0687112B1 (en) * 1994-06-08 2006-09-20 Matsushita Electric Industrial Co., Ltd. Image conversion apparatus
US5635982A (en) 1994-06-27 1997-06-03 Zhang; Hong J. System for automatic video segmentation and key frame extraction for video sequences having both sharp and gradual transitions
USD381991S (en) 1994-07-12 1997-08-05 Discovery Communications, Inc. Remote control unit
US5682460A (en) 1994-08-29 1997-10-28 Motorola, Inc. Method for selecting transmission preferences
US5675752A (en) 1994-09-15 1997-10-07 Sony Corporation Interactive applications generator for an interactive presentation environment
US5664227A (en) 1994-10-14 1997-09-02 Carnegie Mellon University System and method for skimming digital audio/video data
JP3511409B2 (en) 1994-10-27 2004-03-29 株式会社半導体エネルギー研究所 Active matrix type liquid crystal display device and driving method thereof
US5696965A (en) 1994-11-03 1997-12-09 Intel Corporation Electronic information appraisal agent
USD402310S (en) 1994-11-07 1998-12-08 Discovery Communications, Inc. Electronic book
US6571279B1 (en) * 1997-12-05 2003-05-27 Pinpoint Incorporated Location enhanced information delivery system
US5617565A (en) 1994-11-29 1997-04-01 Hitachi America, Ltd. Broadcast interactive multimedia system
US5805733A (en) 1994-12-12 1998-09-08 Apple Computer, Inc. Method and system for detecting scenes and summarizing video sequences
US5821945A (en) 1995-02-03 1998-10-13 The Trustees Of Princeton University Method and apparatus for video browsing based on content and structure
US5761881A (en) 1995-05-10 1998-06-09 Wall; Benjamin Process and apparatus for wrapping paper rolls
US5907324A (en) * 1995-06-07 1999-05-25 Intel Corporation Method for saving and accessing desktop conference characteristics with a persistent conference object
US5900867A (en) 1995-07-17 1999-05-04 Gateway 2000, Inc. Self identifying remote control device having a television receiver for use in a computer
GB9517808D0 (en) 1995-08-31 1995-11-01 Philips Electronics Uk Ltd Interactive entertainment personalisation
US5758259A (en) 1995-08-31 1998-05-26 Microsoft Corporation Automated selective programming guide
US6226678B1 (en) * 1995-09-25 2001-05-01 Netspeak Corporation Method and apparatus for dynamically defining data communication utilities
US5694163A (en) 1995-09-28 1997-12-02 Intel Corporation Method and apparatus for viewing of on-line information service chat data incorporated in a broadcast television program
US5958006A (en) 1995-11-13 1999-09-28 Motorola, Inc. Method and apparatus for communicating summarized data
US5794210A (en) 1995-12-11 1998-08-11 Cybergold, Inc. Attention brokerage
EP0824826B1 (en) 1996-03-04 2003-10-22 Koninklijke Philips Electronics N.V. A user-oriented multimedia presentation system for multiple presentation items that each behave as an agent
US5848396A (en) 1996-04-26 1998-12-08 Freedom Of Information, Inc. Method and apparatus for determining behavioral profile of a computer user
US5945988A (en) 1996-06-06 1999-08-31 Intel Corporation Method and apparatus for automatically determining and dynamically updating user preferences in an entertainment system
US5920360A (en) 1996-06-07 1999-07-06 Electronic Data Systems Corporation Method and system for detecting fade transitions in a video signal
US5778108A (en) 1996-06-07 1998-07-07 Electronic Data Systems Corporation Method and system for detecting transitional markers such as uniform fields in a video signal
WO1997048230A1 (en) 1996-06-13 1997-12-18 Starsight Telecast, Inc. Method and apparatus for searching a guide using program characteristics
US5781188A (en) 1996-06-27 1998-07-14 Softimage Indicating activeness of clips and applying effects to clips and tracks in a timeline of a multimedia work
DK0932398T3 (en) * 1996-06-28 2006-09-25 Ortho Mcneil Pharm Inc Use of topiramate or derivatives thereof for the manufacture of a medicament for the treatment of manic depressive bipolar disorders
FI972718A0 (en) * 1996-07-02 1997-06-24 More Magic Software Mms Oy Methods and arrangements for distributing a user interface
US5926624A (en) * 1996-09-12 1999-07-20 Audible, Inc. Digital information library and delivery system with logic for generating files targeted to the playback device
US5764916A (en) 1996-09-27 1998-06-09 Ichat, Inc. Method and apparatus for real time communication over a computer network
US6137486A (en) 1996-09-30 2000-10-24 Sanyo Electric Co., Ltd. Image display control device for restricting display of video data viewed on a television in accordance with a restrict level of the video data
US5828809A (en) 1996-10-01 1998-10-27 Matsushita Electric Industrial Co., Ltd. Method and apparatus for extracting indexing information from digital video data
US6088455A (en) 1997-01-07 2000-07-11 Logan; James D. Methods and apparatus for selectively reproducing segments of broadcast programming
US5986692A (en) 1996-10-03 1999-11-16 Logan; James D. Systems and methods for computer enhanced broadcast monitoring
US5774666A (en) 1996-10-18 1998-06-30 Silicon Graphics, Inc. System and method for displaying uniform network resource locators embedded in time-based medium
US5828839A (en) 1996-11-14 1998-10-27 Interactive Broadcaster Services Corp. Computer network chat room based on channel broadcast in real time
US6543053B1 (en) * 1996-11-27 2003-04-01 University Of Hong Kong Interactive video-on-demand system
US6233590B1 (en) * 1996-12-13 2001-05-15 Venson M. Shaw Server apparatus for distributed communications supporting multiple user/application environment
US6446261B1 (en) * 1996-12-20 2002-09-03 Princeton Video Image, Inc. Set top device for targeted electronic insertion of indicia into video
US6076166A (en) 1997-01-17 2000-06-13 Philips Electronics North America Corporation Personalizing hospital intranet web sites
US5930783A (en) * 1997-02-21 1999-07-27 Nec Usa, Inc. Semantic and cognition based image retrieval
US5913030A (en) 1997-03-18 1999-06-15 International Business Machines Corporation Method and system for client/server communications with user information revealed as a function of willingness to reveal and whether the information is required
GB9706620D0 (en) * 1997-04-01 1997-05-21 Sgs Thomson Microelectronics A method for remotely controlling a plurality of apparatus using a single remote control device
EP0916222B1 (en) * 1997-06-03 2007-04-04 Koninklijke Philips Electronics N.V. Navigating through television programs
US6070167A (en) 1997-09-29 2000-05-30 Sharp Laboratories Of America, Inc. Hierarchical method and system for object-based audiovisual descriptive tagging of images for information retrieval, editing, and manipulation
US6064385A (en) * 1997-09-30 2000-05-16 Compaq Computer Corporation Systems with user preference setting schemes
US6594699B1 (en) * 1997-10-10 2003-07-15 Kasenna, Inc. System for capability based multimedia streaming over a network
US6064449A (en) * 1997-10-31 2000-05-16 Webtv Networks, Inc. Automatic characterization of a television signal
US6128624A (en) 1997-11-12 2000-10-03 Ncr Corporation Collection and integration of internet and electronic commerce data in a database during web browsing
US5973683A (en) 1997-11-24 1999-10-26 International Business Machines Corporation Dynamic regulation of television viewing content based on viewer profile and viewing history
EP0962074B1 (en) * 1997-11-25 2012-12-19 Motorola Mobility LLC Audio content player methods, systems, and articles of manufacture
US6078928A (en) * 1997-12-12 2000-06-20 Missouri Botanical Garden Site-specific interest profiling system
US5956026A (en) 1997-12-19 1999-09-21 Sharp Laboratories Of America, Inc. Method for hierarchical summarization and browsing of digital video
US6252544B1 (en) * 1998-01-27 2001-06-26 Steven M. Hoffberg Mobile communication device
US6006265A (en) 1998-04-02 1999-12-21 Hotv, Inc. Hyperlinks resolution at and by a special network server in order to enable diverse sophisticated hyperlinking upon a digital network
US6005603A (en) * 1998-05-15 1999-12-21 International Business Machines Corporation Control of a system for processing a stream of information based on information content
JP4198786B2 (en) * 1998-06-30 2008-12-17 株式会社東芝 Information filtering system, information filtering apparatus, video equipment, and information filtering method
US6546555B1 (en) * 1998-07-23 2003-04-08 Siemens Corporate Research, Inc. System for hypervideo filtering based on end-user payment interest and capability
US6233389B1 (en) * 1998-07-30 2001-05-15 Tivo, Inc. Multimedia time warping system
US6898762B2 (en) * 1998-08-21 2005-05-24 United Video Properties, Inc. Client-server electronic program guide
JP2000090098A (en) * 1998-09-09 2000-03-31 Hitachi Ltd Data base querying method, its implementing device, and medium recording processing program thereof
US6317722B1 (en) * 1998-09-18 2001-11-13 Amazon.Com, Inc. Use of electronic shopping carts to generate personal recommendations
US7720723B2 (en) * 1998-09-18 2010-05-18 Amazon Technologies, Inc. User interface and methods for recommending items to users
US6412008B1 (en) * 1999-01-28 2002-06-25 International Business Machines Corporation System and method for cooperative client/server customization of web pages
US6236395B1 (en) * 1999-02-01 2001-05-22 Sharp Laboratories Of America, Inc. Audiovisual information management system
US6593936B1 (en) * 1999-02-01 2003-07-15 At&T Corp. Synthetic audiovisual description scheme, method and system for MPEG-7
US6426761B1 (en) * 1999-04-23 2002-07-30 International Business Machines Corporation Information presentation system for a graphical user interface
US6621895B1 (en) * 1999-08-31 2003-09-16 Nortel Networks Limited Enhanced communication services for data networks
US6546101B1 (en) * 2000-06-02 2003-04-08 Motorola, Inc. Communication device having illuminated audio indicator
US6766362B1 (en) * 2000-07-28 2004-07-20 Seiko Epson Corporation Providing a network-based personalized newspaper with personalized content and layout
US6702256B2 (en) * 2001-07-17 2004-03-09 Agilent Technologies, Inc. Flow-switching microdevice
US6900402B2 (en) * 2003-01-08 2005-05-31 Jeckson Electric Co., Ltd. Pushbutton switch with LED indicator

Patent Citations (100)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4183056A (en) * 1977-05-23 1980-01-08 Kewp Electronic Systems, Inc. Apparatus and method for monitoring sports contests
US4324402A (en) * 1979-01-05 1982-04-13 Mattel, Inc. Electronic baseball game
US4321635A (en) * 1979-04-20 1982-03-23 Teac Corporation Apparatus for selective retrieval of information streams or items
US4253108A (en) * 1979-06-04 1981-02-24 Zenith Radio Corporation Control for color killer and automatic color limiter
US4729044A (en) * 1985-02-05 1988-03-01 Lex Computing & Management Corporation Method and apparatus for playing serially stored segments in an arbitrary sequence
US5109482A (en) * 1989-01-11 1992-04-28 David Bohrman Interactive video control system for displaying user-selectable clips
US5012334A (en) * 1990-01-29 1991-04-30 Dubner Computer Systems, Inc. Video image bank for storing and retrieving video image sequences
US5012334B1 (en) * 1990-01-29 1997-05-13 Grass Valley Group Video image bank for storing and retrieving video image sequences
US5101364A (en) * 1990-02-09 1992-03-31 Massachusetts Institute Of Technology Method and facility for dynamic video composition and viewing
US5861881A (en) * 1991-11-25 1999-01-19 Actv, Inc. Interactive computer system for providing an interactive presentation with personalized video, audio and graphics responses for multiple viewers
US5867386A (en) * 1991-12-23 1999-02-02 Hoffberg; Steven M. Morphological pattern recognition based controller system
US6038367A (en) * 1992-02-07 2000-03-14 Abecassis; Max Playing a video responsive to a comparison of two sets of content preferences
US6011895A (en) * 1992-02-07 2000-01-04 Abecassis; Max Keyword responsive variable content video program
US5717814A (en) * 1992-02-07 1998-02-10 Max Abecassis Variable-content video retriever
US6208805B1 (en) * 1992-02-07 2001-03-27 Max Abecassis Inhibiting a control function from interfering with a playing of a video
US5724472A (en) * 1992-02-07 1998-03-03 Abecassis; Max Content map for seamlessly skipping a retrieval of a segment of a video
US5610653A (en) * 1992-02-07 1997-03-11 Abecassis; Max Method and system for automatically tracking a zoomed video image
US5483278A (en) * 1992-05-27 1996-01-09 Philips Electronics North America Corporation System and method for finding a movie of interest in a large movie database
US5200825A (en) * 1992-07-01 1993-04-06 Beam Laser Systems, Inc. Commercial insertion system remotely controlling multiple video switches
US5404316A (en) * 1992-08-03 1995-04-04 Spectra Group Ltd., Inc. Desktop digital video processing system
US5288069A (en) * 1992-11-20 1994-02-22 Susan Matsumoto Talking football
USD354059S (en) * 1992-12-03 1995-01-03 Discovery Communications, Inc. Remote control unit
US6052554A (en) * 1992-12-09 2000-04-18 Discovery Communications, Inc. Television program delivery system
US5600573A (en) * 1992-12-09 1997-02-04 Discovery Communications, Inc. Operations center with video storage for a television program packaging and delivery system
US5600364A (en) * 1992-12-09 1997-02-04 Discovery Communications, Inc. Network controller for cable television delivery systems
US5734853A (en) * 1992-12-09 1998-03-31 Discovery Communications, Inc. Set top terminal for cable television delivery systems
US6201536B1 (en) * 1992-12-09 2001-03-13 Discovery Communications, Inc. Network manager for cable television system headends
US6181335B1 (en) * 1992-12-09 2001-01-30 Discovery Communications, Inc. Card for a set top terminal
US5381477A (en) * 1993-02-16 1995-01-10 Scientific-Atlanta, Inc. Method of selecting cable television converter groups
US5410344A (en) * 1993-09-22 1995-04-25 Arrowsmith Technologies, Inc. Apparatus and method of selecting video programs based on viewers' preferences
US6195497B1 (en) * 1993-10-25 2001-02-27 Hitachi, Ltd. Associated image retrieving apparatus and method
USD368263S (en) * 1994-07-12 1996-03-26 Discovery Communications, Inc. Remote control unit
US5600781A (en) * 1994-09-30 1997-02-04 Intel Corporation Method and apparatus for creating a portable personalized operating environment
US5717923A (en) * 1994-11-03 1998-02-10 Intel Corporation Method and apparatus for dynamically customizing electronic information to individual end users
US5878222A (en) * 1994-11-14 1999-03-02 Intel Corporation Method and apparatus for controlling video/audio and channel selection for a communication signal based on channel data indicative of channel contents of a signal
US6020883A (en) * 1994-11-29 2000-02-01 Fred Herz System and method for scheduling broadcast of and access to video programs and other data using customer profiles
US6029195A (en) * 1994-11-29 2000-02-22 Herz; Frederick S. M. System for customized electronic identification of desirable objects
US6198767B1 (en) * 1995-03-27 2001-03-06 International Business Machines Corporation Apparatus for color component compression
US5710884A (en) * 1995-03-29 1998-01-20 Intel Corporation System for automatically updating personal profile server with updates to additional user information gathered from monitoring user's electronic consuming habits generated on computer during use
US5867226A (en) * 1995-11-17 1999-02-02 Thomson Consumer Electronics, Inc. Scheduler employing a predictive agent for use in a television receiver
US6041323A (en) * 1996-04-17 2000-03-21 International Business Machines Corporation Information search method, information search device, and storage medium for storing an information search program
US5727129A (en) * 1996-06-04 1998-03-10 International Business Machines Corporation Network system for profiling and actively facilitating user activities
US5857190A (en) * 1996-06-27 1999-01-05 Microsoft Corporation Event logging system and method for logging events in a network system
US6212527B1 (en) * 1996-07-08 2001-04-03 Survivors Of The Shoah Visual History Foundation Method and apparatus for cataloguing multimedia data
US5732216A (en) * 1996-10-02 1998-03-24 Internet Angles, Inc. Audio message exchange system
US6199076B1 (en) * 1996-10-02 2001-03-06 James Logan Audio program player including a dynamic program selection controller
US5892536A (en) * 1996-10-03 1999-04-06 Personal Audio Systems and methods for computer enhanced broadcast monitoring
US5875107A (en) * 1996-12-05 1999-02-23 Mitsubishi Denki Kabushiki Kaisha Inverter apparatus
US6177931B1 (en) * 1996-12-19 2001-01-23 Index Systems, Inc. Systems and methods for displaying and recording control interface with television programs, video, advertising information and program scheduling information
US6185625B1 (en) * 1996-12-20 2001-02-06 Intel Corporation Scaling proxy server sending to the client a graphical user interface for establishing object encoding preferences after receiving the client's request for the object
US6049821A (en) * 1997-01-24 2000-04-11 Motorola, Inc. Proxy host computer and method for accessing and retrieving information between a browser and a proxy
US5877821A (en) * 1997-01-30 1999-03-02 Motorola, Inc. Multimedia input and control apparatus and method for multimedia communications
US6370504B1 (en) * 1997-05-29 2002-04-09 University Of Washington Speech recognition on MPEG/Audio encoded files
US6678659B1 (en) * 1997-06-20 2004-01-13 Swisscom Ag System and method of voice information dissemination over a network using semantic representation
US6014183A (en) * 1997-08-06 2000-01-11 Imagine Products, Inc. Method and apparatus for detecting scene changes in a digital video stream
US6219837B1 (en) * 1997-10-23 2001-04-17 International Business Machines Corporation Summary frames in video
US6055018A (en) * 1997-11-04 2000-04-25 Ati Technologies, Inc. System and method for reconstructing noninterlaced captured content for display on a progressive screen
US6363380B1 (en) * 1998-01-13 2002-03-26 U.S. Philips Corporation Multimedia computer system with story segmentation capability and operating program therefor including finite automaton video parser
US20050028194A1 (en) * 1998-01-13 2005-02-03 Elenbaas Jan Hermanus Personalized news retrieval system
US6055569A (en) * 1998-01-27 2000-04-25 Go Ahead Software Inc. Accelerating web access by predicting user action
US7185355B1 (en) * 1998-03-04 2007-02-27 United Video Properties, Inc. Program guide system with preference profiles
US6353444B1 (en) * 1998-03-05 2002-03-05 Matsushita Electric Industrial Co., Ltd. User interface apparatus and broadcast receiving apparatus
US6681395B1 (en) * 1998-03-20 2004-01-20 Matsushita Electric Industrial Company, Ltd. Template set for generating a hypertext for displaying a program guide and subscriber terminal with EPG function using such set broadcast from headend
US6530082B1 (en) * 1998-04-30 2003-03-04 Wink Communications, Inc. Configurable monitoring of program viewership and usage of interactive applications
US6339842B1 (en) * 1998-06-10 2002-01-15 Dennis Sunga Fernandez Digital television with subscriber conference overlay
US6215526B1 (en) * 1998-11-06 2001-04-10 Tivo, Inc. Analog video tagging and encoding system
US20070011148A1 (en) * 1998-11-12 2007-01-11 Accenture Llp System, method and article of manufacture for advanced information gathering for targetted activities
US7003792B1 (en) * 1998-11-30 2006-02-21 Index Systems, Inc. Smart agent based on habit, statistical inference and psycho-demographic profiling
US6216129B1 (en) * 1998-12-03 2001-04-10 Expanse Networks, Inc. Advertisement selection system supporting discretionary target market characteristics
US6169542B1 (en) * 1998-12-14 2001-01-02 Gte Main Street Incorporated Method of delivering advertising through an interactive video distribution system
US6374404B1 (en) * 1998-12-16 2002-04-16 Sony Corporation Of Japan Intelligent device having background caching of web pages from a digital television broadcast signal and method of same
US6342904B1 (en) * 1998-12-17 2002-01-29 Newstakes, Inc. Creating a slide presentation from full motion video
US6363160B1 (en) * 1999-01-22 2002-03-26 Intel Corporation Interface using pattern recognition and tracking
US6522342B1 (en) * 1999-01-27 2003-02-18 Hughes Electronics Corporation Graphical tuning bar for a multi-program data stream
US6535639B1 (en) * 1999-03-12 2003-03-18 Fuji Xerox Co., Ltd. Automatic video summarization using a measure of shot importance and a frame-packing method
US6370688B1 (en) * 1999-05-26 2002-04-09 Enounce, Inc. Method and apparatus for server broadcast of time-converging multi-media streams
US6704929B1 (en) * 1999-08-18 2004-03-09 Webtv Networks, Inc. Tracking viewing behavior of a home entertainment system
US6993245B1 (en) * 1999-11-18 2006-01-31 Vulcan Patents Llc Iterative, maximally probable, batch-mode commercial detection for audiovisual content
US6675158B1 (en) * 1999-11-30 2004-01-06 Sony Corporation Method and apparatus for organizing data pertaining to audiovisual content
US6983478B1 (en) * 2000-02-01 2006-01-03 Bellsouth Intellectual Property Corporation Method and system for tracking network use
US6542546B1 (en) * 2000-02-02 2003-04-01 Mitsubishi Electric Research Laboratories, Inc. Adaptable compressed bitstream transcoder
US6868440B1 (en) * 2000-02-04 2005-03-15 Microsoft Corporation Multi-level skimming of multimedia content using playlists
US20020026345A1 (en) * 2000-03-08 2002-02-28 Ari Juels Targeted delivery of informational content with privacy protection
US20020013943A1 (en) * 2000-04-07 2002-01-31 Seth Haberman System and method for simultaneous broadcast for personalized messages
US6691126B1 (en) * 2000-06-14 2004-02-10 International Business Machines Corporation Method and apparatus for locating multi-region objects in an image or video database
US20020018594A1 (en) * 2000-07-06 2002-02-14 Mitsubishi Electric Research Laboratories, Inc. Method and system for high-level structure analysis and event detection in domain specific videos
US6697523B1 (en) * 2000-08-09 2004-02-24 Mitsubishi Electric Research Laboratories, Inc. Method for summarizing a video using motion and color descriptors
US20030067554A1 (en) * 2000-09-25 2003-04-10 Klarfeld Kenneth A. System and method for personalized TV
US20030026592A1 (en) * 2000-12-28 2003-02-06 Minoru Kawahara Content creating device and method
US6678635B2 (en) * 2001-01-23 2004-01-13 Intel Corporation Method and system for detecting semantic events
US20030001880A1 (en) * 2001-04-18 2003-01-02 Parkervision, Inc. Method, system, and computer program product for producing and distributing enhanced media
US20030007555A1 (en) * 2001-04-27 2003-01-09 Mitsubishi Electric Research Laboratories, Inc. Method for summarizing a video using motion descriptors
US20030033288A1 (en) * 2001-08-13 2003-02-13 Xerox Corporation Document-centric system with auto-completion and auto-correction
US20050021784A1 (en) * 2001-09-07 2005-01-27 Christian Prehofer Device and method for the automatic configuration of user profiles
US20040030750A1 (en) * 2002-04-02 2004-02-12 Worldcom, Inc. Messaging response system
US20040003041A1 (en) * 2002-04-02 2004-01-01 Worldcom, Inc. Messaging response system
US20040015569A1 (en) * 2002-07-16 2004-01-22 Mikko Lonnfors System and method for providing partial presence notifications
US20040017389A1 (en) * 2002-07-25 2004-01-29 Hao Pan Summarization of soccer video content
US20040032486A1 (en) * 2002-08-16 2004-02-19 Shusman Chad W. Method and apparatus for interactive programming using captioning
US20050055713A1 (en) * 2003-09-09 2005-03-10 Samsung Electronics Co., Ltd. Apparatus and method for sharing recommended programs using digital set-top boxes

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020129371A1 (en) * 2001-03-08 2002-09-12 Matsushita Electric Industrial Co., Ltd. Media distribution apparatus and media distribution method
US20060085371A1 (en) * 2002-09-24 2006-04-20 Koninklijke Philips Electronics, N.V. System and method for associating different types of media content
US20070089143A1 (en) * 2003-11-10 2007-04-19 Lefevre Chad A Method and apparatus for providing dynamic display of content information associated with a device in a network
US20060090183A1 (en) * 2004-10-26 2006-04-27 David Zito Method and apparatus for a search-enabled remote control device
US8015184B2 (en) * 2004-10-26 2011-09-06 Yahoo! Inc. Method and apparatus for a search-enabled remote control device
US20070136459A1 (en) * 2005-12-09 2007-06-14 Sbc Knowledge Ventures Lp Session continuity in multimedia services
US8577953B2 (en) * 2005-12-09 2013-11-05 At&T Intellectual Property I, Lp System and method for providing multimedia services
DE102008003914A1 (en) * 2008-01-10 2009-07-16 Deutsche Telekom Ag Service for web radio and internet television
EP2476249A2 (en) * 2009-09-07 2012-07-18 LG Electronics Inc. Image display apparatus and operation method therefore
EP2476249A4 (en) * 2009-09-07 2013-07-31 Lg Electronics Inc Image display apparatus and operation method therefore
EP2510682A2 (en) * 2009-12-08 2012-10-17 LG Electronics Inc. Image display apparatus and method for operating the same
EP2510682A4 (en) * 2009-12-08 2014-01-29 Lg Electronics Inc Image display apparatus and method for operating the same
US20160044388A1 (en) * 2013-03-26 2016-02-11 Orange Generation and delivery of a stream representing audiovisual content
US20140298414A1 (en) * 2013-03-27 2014-10-02 Apple Inc. Browsing remote content using a native user interface
US10375342B2 (en) * 2013-03-27 2019-08-06 Apple Inc. Browsing remote content using a native user interface
US20150279428A1 (en) * 2014-03-27 2015-10-01 Samsung Electronics Co., Ltd. Method for generating thumbnail and electronic device thereof
US9685197B2 (en) * 2014-03-27 2017-06-20 Samsung Electronics Co., Ltd Method for generating thumbnail and electronic device thereof
US11882345B2 (en) 2014-10-09 2024-01-23 Stats Llc Customized generation of highlights show with narrative component
US11863848B1 (en) * 2014-10-09 2024-01-02 Stats Llc User interface for interaction with customized highlight shows
US11778287B2 (en) 2014-10-09 2023-10-03 Stats Llc Generating a customized highlight sequence depicting multiple events
US20180063253A1 (en) * 2015-03-09 2018-03-01 Telefonaktiebolaget Lm Ericsson (Publ) Method, system and device for providing live data streams to content-rendering devices
US11503345B2 (en) * 2016-03-08 2022-11-15 DISH Technologies L.L.C. Apparatus, systems and methods for control of sporting event presentation based on viewer engagement
US11012719B2 (en) * 2016-03-08 2021-05-18 DISH Technologies L.L.C. Apparatus, systems and methods for control of sporting event presentation based on viewer engagement
US20230076146A1 (en) * 2016-03-08 2023-03-09 DISH Technologies L.L.C. Apparatus, systems and methods for control of sporting event presentation based on viewer engagement
US10694137B2 (en) 2016-05-10 2020-06-23 Rovi Guides, Inc. Systems and methods for resizing content based on a relative importance of the content
WO2017196851A1 (en) * 2016-05-10 2017-11-16 Rovi Guides, Inc. Systems and methods for notifying different users about missed content by tailoring catch-up segments to each different user
US10225603B2 (en) * 2017-03-13 2019-03-05 Wipro Limited Methods and systems for rendering multimedia content on a user device
US20210084350A1 (en) * 2018-01-11 2021-03-18 Editorji Technologies Private Limited Method and system for customized content
US11641500B2 (en) * 2018-01-11 2023-05-02 Editorji Technologies Private Limited Method and system for customized content
CN111414150A (en) * 2019-01-04 2020-07-14 厦门雅基软件有限公司 Game engine rendering method and device, electronic equipment and computer storage medium
CN110045976A (en) * 2019-05-23 2019-07-23 中国联合网络通信集团有限公司 A kind of update method and system of application

Also Published As

Publication number Publication date
US7178107B2 (en) 2007-02-13
US7197709B2 (en) 2007-03-27
US7194687B2 (en) 2007-03-20
US20050091686A1 (en) 2005-04-28
US7181691B2 (en) 2007-02-20
US20050091165A1 (en) 2005-04-28
US7194688B2 (en) 2007-03-20
US20050188328A1 (en) 2005-08-25
US20050141864A1 (en) 2005-06-30
US20050120034A1 (en) 2005-06-02
US20050120033A1 (en) 2005-06-02
US20050091685A1 (en) 2005-04-28
US20050131727A1 (en) 2005-06-16
US7424677B2 (en) 2008-09-09
US7424678B2 (en) 2008-09-09
US7509580B2 (en) 2009-03-24

Similar Documents

Publication Publication Date Title
US7509580B2 (en) Audiovisual information management system with preferences descriptions
US6236395B1 (en) Audiovisual information management system
US8028314B1 (en) Audiovisual information management system
US7055168B1 (en) Method for interpreting and executing user preferences of audiovisual information
US20040268383A1 (en) Audiovisual information management system
US8020183B2 (en) Audiovisual management system
US20030206710A1 (en) Audiovisual management system
US20030061610A1 (en) Audiovisual management system
US20030121040A1 (en) Audiovisual management system
EP1100268B1 (en) Audiovisual information management system
EP1580990A2 (en) Audiovisual information management system

Legal Events

Date Code Title Description
AS Assignment

Owner name: SHARP LABORATORIES OF AMERICA, INC., WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SEZN, MUHAMMED IBRAHIM;VAN BEEK, PETRUS;REEL/FRAME:016215/0114;SIGNING DATES FROM 20000411 TO 20000412

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION