WO2002054192A2 - Synchronized multimedia presentation - Google Patents

Synchronized multimedia presentation

Info

Publication number
WO2002054192A2
Authority
WO
WIPO (PCT)
Prior art keywords
presentation
file
display
multimedia file
network
Prior art date
Application number
PCT/US2002/000087
Other languages
French (fr)
Other versions
WO2002054192A3 (en)
Inventor
Yuewei Wang
Ganesh Jampani
Yenjen Lee
Original Assignee
3Cx, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 3Cx, Inc.
Priority to AU2002237756A1
Publication of WO2002054192A2
Publication of WO2002054192A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F 16/43 Querying
    • G06F 16/438 Presentation of query results
    • G06F 16/4387 Presentation of query results by the use of playlists
    • G06F 16/4393 Multimedia presentations, e.g. slide shows, multimedia albums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 Structure of client; Structure of client peripherals
    • H04N 21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4316 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications

Definitions

  • the present invention relates to a system and method for displaying synchronized multimedia presentations on a terminal(s) attached to a network, and more particularly relates to a platform that synchronizes secondary multimedia files to a primary multimedia file within a presentation.
  • HTML-based applications have drastically increased. Many businesses and educational institutions have implemented HTML applications to service customers, employees and students. Examples of these types of applications include commercials, instructions, shopping, and Internet educational courses. Typically, a user will interface with an HTML-based application that will perform some function (i.e., display a product and corresponding price, give a detailed description of the installation of a certain product, display a student's course schedule).
  • multimedia files including both HTML-based and audio/video
  • the display window generally contains only the streaming video file.
  • an individual may give a presentation to a group of people and have a multimedia file (e.g., a PowerpointTM slide show) streamed from a remote server to a conference room or other location where the presentation is being given.
  • multimedia files may be streamed to other types of presentations including Internet classes, sales presentations or on-demand movie/television programs.
  • Many businesses are continually trying to improve the quality of their website above other competitors. This need for quality is most relevant with businesses within the online shopping space. An important aspect to these web businesses is the level of interaction a website affords individuals. Additionally, businesses are always trying to improve the visual quality of their websites.
  • the use of multiple multimedia files that are concurrently displayed allows a user a higher level of interaction and a content provider an increase in the quality and quantity of content presentations available for delivery. Additionally, a content provider may monitor the use and receive feedback when multimedia files are viewed concurrently. As a result, the provider is able to deliver a quality viewing experience for an individual and at the same time monitor a viewer's reactions and preferences to specific characteristics within the presentation.
  • an individual giving a presentation must control a streaming multimedia file.
  • a person using a slide show must click through each slide or manually play and stop a streaming video. This requirement often times reduces the flow of a presentation and may distract both the presenter and audience. Also, the individual may make an error in controlling these multimedia files resulting in an extended break during which the multimedia presentation must be corrected.
  • a streamed multimedia file enhances a presentation, the level of interactivity between users is limited and between other files is non-existent. For example, an individual may scroll through a slide presentation or stop a streaming video; however, an individual may not effectively display a multi-file multimedia presentation that includes multiple files being concurrently displayed and controlled. This limitation within existing multimedia platforms is caused by a lack of file synchronization or cooperation that displays multiple files concurrently.
  • the present invention provides a system and method for synchronizing and displaying multimedia files in a presentation. Specifically, the present invention creates a platform on which rich content presentations are shown by synchronizing at least one secondary multimedia file to a primary multimedia file.
  • the system comprises a first server coupled to a network, a second server coupled to the network, a database coupled to the second server, and at least one client coupled to the network.
  • These networked devices allow multiple files to be transmitted, synchronized and displayed through the network.
  • a presentation may be stored remotely, transmitted to a client, synchronized locally at the client and displayed on an attached display device accordingly.
  • the first server streams a primary multimedia file across the network to the client.
  • This first server may be a video server that streams a video file to the client.
  • This video file is buffered at the client and shown by the display device that is coupled to the client.
  • the first multimedia file may be shown in a particular window within the display device.
  • a streaming video may be shown within an ActiveX controlled window within a web browser.
  • Other multimedia files within the presentation are synchronized to this primary multimedia file to provide a rich multi-file presentation.
  • the second server transmits a secondary multimedia file across the network to the client.
  • the secondary multimedia file may be an HTML file(s), graphic files, or text files.
  • the secondary multimedia file may be pre-fetched from the second server before the presentation begins. In such an instance, the secondary multimedia file is stored locally on the client in a storage device such as a cache or hard disk drive. This pre-fetching increases the available bandwidth at the client for the primary multimedia file. However, pre-fetching the secondary multimedia file is not required; rather, both the primary and secondary multimedia files may be streamed concurrently during the presentation.
  • the second server also transmits a synchronization file across the network to the client.
  • the synchronization file contains a plurality of indices that synchronize the secondary multimedia file(s) to the primary multimedia file.
  • the synchronization file may contain a first index that triggers a particular HTML file to be shown when a frame within the primary file is reached. Specifically, as a video is shown in a first window, an HTML picture may be shown in a second window after the 200th frame in the video is shown. This synchronization allows a viewer to see multiple windows showing synchronized multimedia content during the presentation.
  • This synchronization file may be pre-fetched from the second server before the presentation begins. Additionally, the synchronization file may be processed at the second server allowing the second server to control the synchronization of the presentation. It is important to note that the first server and the second server may be located on the same physical device on the network.
  • the client receives these multimedia files from the servers and displays the synchronized multimedia presentation to a viewer.
  • the presentation may be synchronized at the client according to the synchronization file.
  • the client has a certain level of intelligence required to perform these synchronization functions.
  • a processor in the client may access the synchronization file and determine the indices that synchronize the secondary multimedia file to the primary multimedia file.
  • the processor monitors the primary multimedia file to determine when a particular index is reached (e.g., a particular frame).
  • a secondary multimedia file is displayed in a window on the display device. This process continues until the primary multimedia file is complete and/or all the secondary multimedia files have been shown.
  • the display device may comprise a single computer monitor with a web browser having multiple windows.
  • a web browser may have a first window in which the primary multimedia file (e.g., video) is shown.
  • the web browser may have a second window in which the secondary multimedia file(s) is shown.
  • the web browser may have other windows that provide further windows for displaying secondary multimedia files or offer other functions that may be integrated within a presentation.
  • a window within the web browser may be used to provide real-time chatting for a viewer. The viewer may be able to ask questions regarding the presentation or discuss the presentation with other viewers at other locations.
  • the present invention also offers a security platform that may be integrated within the presentation. Specifically, a viewer may only have limited or no access to the presentation until fulfilling a security function. For example, a viewer may be required to supply a password in order to access particular presentations. Additionally, a viewer may not be given rights to the presentation file that would allow the user to modify the settings of the presentation. Specifically, a viewer may need to supply a password in order to adjust the presentation in any manner. Also, the presentation may be secured by allowing a viewer to access the presentation only from a particular location. For example, a viewer may be sent an email containing a hyperlink referring to a particular presentation. Viewing rights within the email may allow the viewer to see the presentation (e.g., from a particular network address).
  • the present invention may also provide viewer interaction features within the presentation platform.
  • a presentation may have various interactive features that allow a viewer to select items/topics of interest or respond to questions. Responses to these interactive features may operate as indices within the synchronization file that trigger a particular secondary multimedia file. For example, a viewer may be asked a question regarding a particular item shown in the primary multimedia file. In response to the answer, a specific secondary multimedia file may be shown in a separate window that gives further detail of a particular item or topic. This feature allows a viewer to tailor the presentation by providing requests for more information on particular topics.
  • the present invention may also monitor these interactive functions in order to provide valuable information to a service provider or other individual/entity responsible for the presentation.
  • This information is transmitted from the client to a database coupled to the network. This transmission may occur intermittently during the presentation or may be transmitted after the presentation has concluded. Thereafter, the information may be organized and analyzed. For example, a presentation may ask a viewer a question regarding a color of a particular product. The response is then transmitted to a database containing responses from a large number of viewers. A manufacturer may analyze this information in order to determine whether a product should be manufactured in a particular color.
  • FIG. 1 is an illustration of a graphical user interface used to display a synchronized multimedia presentation according to an embodiment of the present invention.
  • FIG. 2 is an illustration of a system used to deliver the synchronized multimedia presentation to a display device according to an embodiment of the present invention.
  • FIG. 3 is a block diagram of a device that may be used to display the synchronized multimedia presentation according to an embodiment of the present invention.
  • FIG. 4 is an illustration representing the hierarchical structure of a synchronized multimedia presentation according to an embodiment of the present invention.
  • FIG. 5 is an illustration of a storage and control device used for a synchronized multimedia presentation according to an embodiment of the present invention.
  • FIG. 6 is a flow diagram of a synchronized multimedia presentation according to an embodiment of the present invention.
  • FIG. 7 is a flow diagram of file synchronization within a multimedia presentation according to an embodiment of the present invention.
  • FIG. 8 is a flow diagram of a user response storage system implemented within a synchronized multimedia presentation according to an embodiment of the present invention.
  • FIG. 9 is a flow diagram of a user interaction monitor implemented within a synchronized multimedia presentation according to an embodiment of the present invention.
  • FIG. 10 is a general graphical user interface on which a synchronized multimedia presentation may be initiated and viewed.
  • FIG. 11 is an illustration of a synchronized multimedia presentation system having multiple display devices.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS [0034] A system and method for displaying, modifying, and creating a presentation containing synchronized concurrently-displayed files is described. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the invention. It will be apparent, however, to one skilled in the art that the invention can be practiced without these specific details. In other instances, structures and devices are shown in block diagram form in order to avoid obscuring the invention.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
  • the present invention is described below as operating or working with a plurality of servers attached to a network.
  • software implementing the present invention may be stored locally on a client terminal or in conjunction with one or a plurality of servers on a network. Additionally, the software may be stored within other storage devices (i.e., database, SAN) connected to the network.
  • the present invention is directed to a system and method for displaying presentations containing synchronized multimedia files.
  • a user can view the presentation on a display device attached to a network.
  • the present invention provides a viewing system allowing a user to view synchronized presentations on a web browser 100 after initiating or accessing an appropriate file through a network.
  • Figure 1 shows an embodiment of the present invention displayed within a web browser.
  • the web browser 100 is one example of a display device on which the present invention may be shown. Examples of web browsers include Microsoft Explorer and Netscape Navigator.
  • the browser 100 may be partitioned into different windows on which various different types of multimedia files may be shown.
  • a first display window 110 displays a primary multimedia file.
  • this primary multimedia file is a video file (i.e., mpeg, avi, real, asf, mp3, etc.) although it may be any type of multimedia file including those conforming to VoIP and H.323 standards.
  • the size, shape and color quality of the window may all be pre-determined and/or adjusted by the viewer.
  • a second display window 120 within the browser 100 displays a secondary multimedia file.
  • the secondary multimedia file may be an HTML based file or any type of file including a video file, a Microsoft PowerpointTM file, an image file, or a word processing file.
  • the secondary multimedia file may be shown concurrently with the primary multimedia file during a presentation.
  • the secondary multimedia files may also be automatically converted to a standard file type by the platform. For example, a PowerpointTM slide may be converted to an XML slide to facilitate easier control and viewing of the slide.
  • the secondary multimedia file is controlled by a synchronization file.
  • the synchronization file synchronizes the display of the secondary multimedia file to the primary file.
  • the synchronization file is retrieved from a remote location before the presentation begins.
  • the synchronization file contains at least one index that relates the display of the secondary multimedia file to the primary multimedia file.
  • the secondary multimedia file is displayed by triggering the pre-determined indices within the synchronization file as the primary file progresses.
  • An index may relate to a frame or moment of time within the primary multimedia file or any other indicator through which the secondary multimedia file may be controlled.
  • This interval of time may be a default time value, relate to the triggering of a subsequent secondary file, or a predetermined amount of time.
  • a text message window 190 displays text to a viewer during the presentation.
  • the text messages may be displayed by triggering a predetermined index within the synchronization file. Once the index is realized and the text message is displayed, the text message remains displayed for an interval of time. This interval of time may be a default time value, relate to the triggering of a subsequent secondary file, or a pre-determined amount of time.
  • the text message window 190 may facilitate real-time network chatting that allows a viewer to communicate with another individual through the network. In this instance, this type of chatting function would operate according to the H.323 standard or other standards that allow multiple applications to operate on a network concurrently.
  • a presentation indices window 130 displays a summary of indices contained within the presentation. This summary allows the viewer not only to quickly scan the presentation but also to review or jump to specific secondary multimedia files indexed to the primary multimedia file. Additionally, the presentation indices window 130 may be hidden from the user or decreased to allow the expansion of other windows (e.g., second display window 120) within the browser 100.
  • a presentation control window 135 contains a plurality of controls that allow a viewer to manipulate a presentation. For example, a viewer may play, pause, rewind, fast forward or select continual play from a variety of controls within this window 135. Additionally, a status bar (not shown) may be implemented to show a currently displayed frame or time position relative to the entire video file. A user may also control other viewing options within the presentation by accessing controls such as window size, font, font size, font color, and window arrangement. Also, the presentation control window 135 may contain a counter that counts the number of frames that have been displayed as well as displaying the total number of frames within a certain video file. [0049] The web browser may also have a thumbnail view option to choose video, slide or a full thumbnail view.
  • a video thumbnail is a snapshot of the frame at the index point. This snapshot is automatically extracted at the time of authoring.
  • a slide thumbnail is a smaller view of the slide document (HTML, MS PowerPoint slide, etc.). If a thumbnail cannot be extracted, a generic image/blank is shown.
  • the full thumbnail view may show both the video thumbnail and the corresponding slide thumbnail together for an index point.
  • a thumbnail item may also show a time stamp of the index, image and the annotation.
  • FIG. 2 shows a block diagram of a networked system used to provide synchronized multimedia presentations according to an embodiment of the present invention.
  • This system includes a network to which a client 205, a web server 220 and a video server 215 are coupled.
  • a database 225 is coupled to the web server 220 to allow stored data to be transmitted onto the network 210.
  • a synchronized multimedia presentation contains a primary multimedia file, for example a video file, stored on the video server 215 attached to the network 210.
  • the synchronized multimedia presentation also contains a secondary multimedia file, for example an HTML-based file but may be any type of file including another video file.
  • the secondary multimedia file is indexed to the primary multimedia file by a synchronization file.
  • Both the secondary multimedia file and the synchronization file are stored on the database 225 and are transmitted onto the network 210 via the web server 220. Additionally, the database 225 may be used to store other types of data relating to the presentation; for example, feedback information regarding a viewer's response to the presentation may be transmitted from the client 205 to the web server 220 and stored in the database 225.
  • components of a synchronized multimedia presentation are streamed from the video server 215 and the web server 220 to the client 205.
  • a video file is stored on the video server 215 and streamed to the client 205 during a presentation.
  • Secondary multimedia files are stored in the database 225 and transmitted to the client from the web server 220.
  • there will be only one secondary multimedia file, typically an HTML-based file.
  • multiple secondary multimedia files may be used to implement numerous files being synchronized and displayed within the presentation.
  • the secondary multimedia files do not need to be HTML-based files but can be any type of file including video, text, image files.
  • An individual may view the synchronized presentation by logging onto the network 210 and initiating the presentation.
  • the individual may use a uniform resource locator to address the presentation across the network 210 and display it within a web browser. This initiation may include entering a password, opening an email message or clicking on a hyperlink.
  • the presentation may be transmitted to the client 205 using various types of methods.
  • the secondary multimedia files may be pre-fetched from the web server 220 and buffered in the client terminal 205 before the presentation begins.
  • the synchronization file, containing indexing information, stored within the database 225 may be pre-fetched and buffered in the client terminal 205.
  • a video file may be streamed to the client terminal 205 from the video server 215.
  • Software stored either locally on the client 205 or remotely on a server, monitors the video file as it is displayed and synchronizes the secondary multimedia file(s) according to the indexing information within the synchronization file.
  • the client 205 provides a screen(s) on which the presentation may be shown as well as a processing device and storage device that allow the components of the presentation to be synchronized. It is important to note that the web server 220 and the video server 215 may be located on the same physical device. For example, a single server may operate both as the web server 220 and the video server 215.
  • Figure 3 shows an example of a client 205 on which a synchronized multimedia presentation may be shown.
  • the client terminal comprises a control unit 300 coupled, via a bus, to a display 305, a keyboard 310, a cursor control 315, a network controller 320, and an I/O device 325.
  • the control unit 300 is typically a personal computer or computing box attached to a network. However, it may also be a personal digital assistant or any other device able to receive, process and display data. In one embodiment, the control unit 300 has an operating system (i.e., Windows, UNIX, etc.) upon which multiple applications operate.
  • the control unit 300 comprises a processor 350, main memory 335, and a data storage device, all connected to a bus 330.
  • a processor 350 processes data signals and may comprise various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, or an architecture implementing a combination of instruction sets. Although only a single processor is shown, multiple processors may be attached.
  • Main memory 335 may store instructions and/or data that may be executed by processor 350.
  • the instructions and/or data may comprise code for performing any and/or all of the techniques described herein.
  • Main memory 335 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, or some other memory device known in the art.
  • the memory 335 preferably includes a web browser 340 of a conventional type that provides access to the Internet and processes HTML, XML or other markup language to generate images on the display device 305.
  • the web browser 340 could be Netscape Navigator or Microsoft Internet Explorer.
  • Data storage device 345 stores data and instructions for processor 350 and may comprise one or more devices including a hard disk drive, a floppy disk drive, a CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device, or some other mass storage device known in the art.
  • System bus 330 represents a shared bus for communicating information and data throughout control unit 300.
  • System bus 330 may represent one or more buses including an industry standard architecture (ISA) bus, a peripheral component interconnect (PCI) bus, a universal serial bus (USB), or some other bus known in the art to provide similar functionality.
  • Devices coupled to the system bus 330 include display device 305, keyboard 310, cursor control device 315, network controller 320 and audio device 325.
  • Display device 305 represents any device equipped to display electronic images and data as described herein.
  • Display device 305 may be a cathode ray tube (CRT), liquid crystal display (LCD), or any other similarly equipped display device, screen, or monitor.
  • display device 305 is equipped with a touch screen in which a touch-sensitive, transparent panel covers the screen of display device 305.
  • Keyboard 310 represents an alphanumeric input device coupled to control unit 300 to communicate information and command selections to processor 350.
  • Cursor control 315 represents a user input device equipped to communicate positional data as well as command selections to processor 350.
  • Cursor control 315 may include a mouse, a trackball, a stylus, a pen, a touch screen, cursor direction keys, or other mechanisms to cause movement of a cursor.
  • Network controller 320 links control unit 300 to a network that may include multiple processing systems.
  • the network of processing systems may comprise a local area network (LAN), a wide area network (WAN) (e.g., the Internet), and/or any other interconnected data path across which multiple devices may communicate.
  • One or more I/O devices 325 are coupled to the system bus 330.
  • the I/O device 325 may be an audio device equipped to receive audio input and transmit audio output. Audio input may be received through various devices including a microphone within audio device 325 and network controller 320.
  • audio output may originate from various devices including processor 350 and network controller 320.
  • audio device 325 is a general-purpose audio add-in/expansion card designed for use within a general purpose computer system.
  • audio device 325 may contain one or more analog-to-digital or digital-to-analog converters, and/or one or more digital signal processors to facilitate audio processing.
  • control unit 300 may include more or less components than those shown in Figure 3 without departing from the spirit and scope of the present invention.
  • control unit 300 may include additional memory, such as, for example, a first or second level cache, or one or more application specific integrated circuits (ASICs).
  • additional components may be coupled to control unit 300 including, for example, image scanning devices, digital still or video cameras, or other devices that may or may not be equipped to capture and/or download electronic data to control unit 300.
  • Figure 4 is a block diagram representing a general hierarchical model of the multiple layers within an embodiment of the present invention.
  • a display composite 400 is generated through the combination of a video layer 410, an HTML layer 420, and a control/synchronization layer 430.
  • the video layer 410 comprises a video file (i.e., primary multimedia file) that is streamed from the video server 215 and displayed in a first display window 110. Specifically, this video is streamed from a video server during the presentation and buffered at a client. The video stream is then processed and displayed in a window on the web browser.
  • An HTML layer 420 comprises an HTML file(s) that is pre-fetched from the database 225 via the web server 220.
  • the HTML file is stored within the client before the streaming of the video file begins.
  • the HTML file(s) is synchronized to the streaming video and displayed in the second display window 120 within the web browser 100. Note that the HTML files need not be pre-fetched but can be synchronized at the server-side and streamed from the server according to the synchronization.
  • a control layer 430 controls and synchronizes both the video file and the HTML file.
  • this control data may be stored within a single synchronization file that is pre-fetched from the database 225 via the web server 220.
  • the control layer 430 manages the display options of the window in which the video is shown and the window in which the HTML file is shown. These display options include the size and brightness of the window, the volume of an audio file, etc.
  • the control layer 430 synchronizes the display of both the video and HTML file. As described above, the display of the HTML file(s) may be triggered by a particular index (e.g., time index or frame index) in the video file. If multiple HTML files are used, then each file may have an index to the video file that will trigger its display.
  • the composite layer 400 controls the incorporation of the video presentation and HTML presentation within a single browser.
  • the composite layer 400 utilizes a variety of applications to implement this incorporation including ActiveX, and the underlying operating system (Microsoft WindowsTM or UnixTM).
  • FIG. 5 shows a more detailed drawing of the main memory 335 of the client 205.
  • the main memory generally comprises an operating system 500 whereon a number of applications operate.
  • a bus 330 couples the operating system 500 to the applications module 505.
  • a video storage module 510 is coupled to the bus 330.
  • the video storage module 510 stores/buffers a primary video file received from the video server 215.
  • the primary video file may be stored locally or streamed in real time from the video server 215 and buffered in the video storage module. Examples of a video storage module include a portion of a hard disk drive or a RAM module.
  • An HTML storage module 520 is coupled to the bus 330.
  • the HTML storage module 520 stores secondary multimedia files received from the web server 220. These secondary multimedia files may be either converted to an HTML format by an author prior to storage or automatically converted by the HTML storage module 520 to an HTML based file. For example, a text file may be automatically converted to an HTML file using one of numerous conversion files known within the art. These secondary multimedia files may be stored locally within the HTML storage module 520, pre-fetched from the web server 220 and buffered in the module 520, or streamed in real time and buffered in the module 520. Examples of an HTML storage module 520 include a portion of a hard disk drive or a RAM module.
  • a synchronization control module 525 is coupled to the bus 330.
  • the synchronization control module 525 stores a synchronization file containing indexing information and file addressing information created during the authoring of the presentation. Generally, this information is buffered in the synchronization control module 525 before the presentation begins. However, the synchronization control module 525 may receive indexing data during the presentation.
  • the processor 350 accesses this indexing information from the synchronization control module 525 during the presentation in order to properly synchronize the HTML file(s) to the video file. Examples of a synchronization control module 525 include a portion of a hard disk drive or a RAM module.
  • a graphical user interface control module 530 is coupled to a bus 330.
  • the graphical user interface control module 530 stores graphical display options created by the author during the creation of a presentation. Examples of these options include the size of each of the display windows, the volume of the audio, and the duration a secondary multimedia file is displayed. Once these display options are stored within the module 530, the processor 350 may access this data in order to provide the correct graphics on the display 305. Examples of a graphical control module 530 include a portion of a hard disk drive or a RAM module.
  • FIG. 6 is a flowchart showing a first method for displaying synchronized multimedia files contained within a presentation.
  • In order to display a presentation, an individual must initiate 605 the presentation from the client 205 connected to the network 210. The presentation may be initiated within a browser using a uniform resource locator. After the presentation has been initiated, certain files may be pre-fetched from a server and buffered at a client 205. For example, secondary multimedia files, such as HTML files stored on the database 225, may be pre-fetched 610 and buffered at the client 205 before the presentation begins. Additionally, synchronization data may be pre-fetched 615 from the database 225 and buffered at the client 205 before the presentation begins.
  • the presentation may be displayed without pre-fetching any secondary multimedia files.
  • a primary video file is streamed 620 from the video server 215 through the network 210 to the client 205.
  • Synchronization data is used by the processor 350 to synchronize and display 625 the secondary multimedia file(s) in relation to the primary video file.
  • the synchronization data is either processed locally or remotely on a server.
  • the client terminal 205 processes the synchronization data and displays both files accordingly.
  • synchronization data is processed remotely on a server, either the video server 215 or the web server 220. In this example, both servers would communicate with each other in order to trigger the transmission and display of secondary multimedia files from the web server 220.
  • FIG. 7 is a flowchart showing a method for synchronizing the display of a secondary multimedia file to a primary multimedia file.
  • a primary multimedia file is shown 705 within a display.
  • a processor, within a display device or coupled to the display device, monitors 710 the video file in order to properly identify synchronization indices relating to the secondary multimedia file(s). For example, the progression of frames within a video file may be monitored in order to identify a frame index relating to a secondary multimedia file. Similarly, the time progression of a video file as it is being displayed may be monitored to identify a timing index relating to a secondary multimedia file.
  • a particular HTML file is displayed in a second window when an index at the 200th frame of the video file is reached.
  • This HTML file may continue to be displayed until the next index is reached or for a particular period of time.
  • the video file continues playing 730. This process continues until the entire video is played.
  • Figure 8 shows a flowchart describing a user response storage system that may be implemented within a synchronized multimedia presentation.
  • a user may be prompted by a question or other inquiry 805. For example, a user may be asked what he or she thinks of a certain part of the presentation.
  • This question may be embedded within a secondary multimedia file.
  • the system may wait 810 for a response from the user before continuing on with the presentation or may simply continue the presentation.
  • This response may be a user activating a graphical icon in the presentation or an audio response that is recorded by the display device.
  • the system transmits the response to a storage system either locally or remotely on an attached network.
  • the response may be transmitted 815 across the network 210 to the web server 220, whereupon it is stored 820 in the database 225 (an illustrative sketch of this transmission appears at the end of this section).
  • the response information may be stored in a different location on the network. In any event, this response information may then be accessed 825 from the database 225 and analyzed 830.
  • Figure 9 illustrates a method for monitoring interactive functions that may be embedded 905 within the primary video file.
  • Interactive functions include pausing, stopping, or rewinding the video file.
  • more complex interactive functions may be embedded in the primary video that allow a user to identify or activate points of interest within the primary video. For example, a user may be able to use a mouse to click on an item shown within the primary video to identify the item as something of interest. This click may then initiate a secondary multimedia file in a different window that describes the item of interest in more detail.
  • This interactive primary video display also allows a viewer to interact with the video by clicking on certain frames, by editing a video or multimedia file, or other ways of interaction.
  • This interaction is monitored 910 by a client (e.g., 205) on which the presentation is shown and stored locally.
  • the client transmits 915 the interaction data through an attached network to a web server.
  • This data may be transmitted during the presentation, or stored locally and transmitted at the end of the presentation.
  • the interaction data is then stored 920 in a storage device that is coupled to the network.
  • this interaction data may be transmitted to the web server 220 and stored in the attached database 225. Thereafter, if a provider or a business wants to analyze the stored data, the interaction data may be retrieved 925 from the database and subsequently analyzed 930.
  • FIG. 10 illustrates a browser window 1000 for accessing a synchronized multimedia presentation.
  • An addressing window 1010 may be used to access the presentation by a specific uniform resource locator via a web page.
  • the user uses an Internet browser 1000 like Microsoft ExplorerTM or Netscape NavigatorTM.
  • the user is given a variety of choices, one of which is to see a list of presentations contained within multiple directories within the presentation view window 1040. Access to this list may be password protected.
  • a presentation security window 1030 within the browser 1000 may require a password from a user in order to gain access to the presentation directories or open a particular presentation.
  • the user may select a 'view a presentation' icon (not shown) within the presentation view window 1040. Once the icon is selected, the user may input the name of the presentation or select the presentation from a default home directory. Once a specific presentation is selected, the presentation will be displayed within the browser or a different browser.
  • a user may modify settings of the presentation using a presentation control window 1050 within the browser 1000.
  • controls such as various window sizes within the presentation, ActiveX specific controls, brightness, video rate, etc. may be controlled within this window 1050 prior to displaying the presentation.
  • This functionality allows a presenter to tune the presentation before it is given.
  • the presentation controls may be adjusted locally where the presentation is given or adjusted remotely at a terminal coupled to the network.
  • a user may also initiate a presentation by opening an email and clicking on a hyperlink. The hyperlink will take the browser directly to a URL address where the presentation will automatically begin. This emailing operation allows an author to invite individuals to view a presentation by merely sending out an email containing a hyperlink to a corresponding URL address.
  • Security features may be embedded into the presentation or platform on which the presentation operates that allow the presentation to be initiated by a single or multiple IP addresses corresponding to invitation emails.
  • FIG 11 illustrates a system on which a single presentation may be transmitted to multiple display devices coupled to a network.
  • the web server 220 and video server 215 transmit multiple multimedia files that are synchronized and incorporated within a presentation.
  • the presentation may be shown on multiple display devices.
  • a primary multimedia file, a secondary multimedia file and a synchronization file may be transmitted to a client 1100 coupled to the network 210.
  • the client 1100 comprises a storage device and processing unit that allows the different multimedia files to be synchronized and incorporated into a single presentation. Once these multimedia files have been appropriately processed, the files are transmitted to an array of display devices on which the presentation is shown.
  • a first display device 1105 is coupled to the client 1100 and shows a primary multimedia file.
  • this primary multimedia file may be a video file that is streamed onto the client 1100 from the video server 215.
  • the first display device 1105 may be coupled to the client 1100 via a serial or parallel connection, as would be the case if it were a computer monitor.
  • the first display device 1105 may be coupled to the client 1100 via a network connection such as an Ethernet or IP network connection.
  • the first display device 1105 would likely require a certain level of intelligence in order to properly function on the network.
  • a second display device 1110 is coupled to the client 1100 and shows a secondary multimedia file.
  • the secondary multimedia file may be another video file, an HTML file, a text file or any other type of file that will allow it to be shown concurrently with the primary multimedia file.
  • the second display device 1110 may be networked to the client 1100 or coupled either in series or parallel to the client 1100.
  • a third display device 1115 is coupled to the client 1100 and may show another secondary multimedia file or provide another window that may be implemented within the presentation. For example, this third display device 1115 may show an HTML file(s) or be used as a text chat room that enables a viewer to communicate with someone in real-time as the presentation is progressing.
  • the present invention facilitates streaming the presentation to multiple viewers that are watching it concurrently.
  • the present invention provides a platform on which an instructor may teach multiple students located in different locations. Specifically, a streaming video file of the instructor is transmitted to each client at which a student is located. Other teaching material may be transmitted to these clients and synchronized to the streaming video.
  • one skilled in the art will recognize a large number of different types of applications that may operate on this synchronized multimedia presentation platform.
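The user response and interaction monitoring flows summarized above (FIGS. 8 and 9) amount to posting viewer data from the client to the web server for storage in the attached database. A brief, non-authoritative sketch of that transmission, assuming a JSON-over-HTTP transport and an endpoint URL that do not appear in the patent:

```typescript
// Hypothetical transmission of a viewer's response from the client to the web
// server, which stores it in the attached database for later analysis.
interface ViewerResponse {
  presentationId: string;
  questionId: string;
  answer: string;          // e.g. a preferred product color
  viewerAddress?: string;  // optional: where the response came from
}

async function submitResponse(response: ViewerResponse): Promise<void> {
  // The endpoint is illustrative; the patent only states that the response
  // crosses the network to the web server and is stored in the database.
  await fetch("https://web-server.example.com/responses", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(response),
  });
}
```

Responses could equally be batched locally and transmitted after the presentation concludes, as the flow diagrams above allow.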

Abstract

The present invention provides a system and method for synchronizing and displaying a multimedia presentation. This presentation comprises a primary multimedia file and at least one secondary multimedia file. These files are transmitted from network servers to a client on which the presentation is shown. The secondary multimedia file(s) are synchronized to the primary multimedia file either at the server or at the client. This synchronization allows for multiple windows to show different multimedia files concurrently. This display is automated because of the synchronization of the files. These files may be synchronized using indices such as time or frame progression of the primary file. Finally, interactive features may be indexed to the primary multimedia file as well. This invention creates a highly interactive, content rich platform on which presentations may be displayed.

Description

SYNCHRONIZED MULTIMEDIA PRESENTATION
BACKGROUND
A. Technical Field
[0001] The present invention relates to a system and method for displaying synchronized multimedia presentations on a terminal(s) attached to a network, and more particularly relates to a platform that synchronizes secondary multimedia files to a primary multimedia file within a presentation.
B. Background of the Invention
[0002] Recent advances in technology as well as physical expansions of network infrastructures have increased the available bandwidth of a large number of existing networks, including the Internet. This increase in bandwidth has greatly expanded a typical network's capacity to stream large amounts of data to client terminals on a network. Due to this increase in network capacity, service and content providers' ability to simultaneously stream multiple files to a client has vastly increased over the past years. These advancements have also increased the quality and quantity of files that a provider may deliver to a particular client. For example, in the recent past, if a user wanted to view a video clip, the video file needed to be delivered to and saved locally on a corresponding client terminal before actually viewing the clip. However, currently a provider may deliver content to a client in real-time, allowing a user to view a file as it is being streamed (i.e., Webcasts, server-side cached real-time multimedia files). [0003] Due to the increase in network capacity, many bandwidth considerations that had often constrained software developers in producing new software applications are no longer relevant. As a result, software products requiring higher bandwidth are being produced and made available to customers. These new software products provide customers greater control and quality in a variety of network and multimedia applications. Additionally, advances in data compression and file protocols have also played a significant role in increasing network utilization. All of these advancements have allowed service providers to effectively stream large multimedia files to a client terminal. Examples of these multimedia files include RealPlayer™ audio/video files, mpeg, avi, and asf.
[0004] The quality and functionality of HTML-based applications have drastically increased. Many businesses and educational institutions have implemented HTML applications to service customers, employees and students. Examples of these types of applications include commercials, instructions, shopping, and Internet educational courses. Typically, a user will interface with an HTML-based application that will perform some function (i.e., display a product and corresponding price, give a detailed description of the installation of a certain product, display a student's course schedule).
[0005] The quality and functionality of audio/video applications have drastically increased as well, due in main part to the increases in network bandwidth. These video applications typically require large amounts of bandwidth in order to function properly on a corresponding client. Content providers, as well as businesses, provide these video applications to generate income or advertise a certain product. Examples of these high bandwidth video applications include Webcasts and streaming video. Typically, a user will only need to initiate a multimedia video/audio stream to display the file on a client terminal.
[0006] Generally, multimedia files, including both HTML-based and audio/video, are viewed independently within a single presentation. For example, when a user is viewing a presentation containing streaming video, the display window generally contains only the streaming video file. Also, an individual may give a presentation to a group of people and have a multimedia file (e.g., a Powerpoint™ slide show) streamed from a remote server to a conference room or other location where the presentation is being given. These multimedia files may be streamed to other types of presentations including Internet classes, sales presentations or on-demand movie/television programs. [0007] Many businesses are continually trying to improve the quality of their website above other competitors. This need for quality is most relevant with businesses within the online shopping space. An important aspect to these web businesses is the level of interaction a website affords individuals. Additionally, businesses are always trying to improve the visual quality of their websites.
[0008] The use of multiple multimedia files that are concurrently displayed allows a user a higher level of interaction and a content provider an increase in the quality and quantity of content presentations available for delivery. Additionally, a content provider may monitor the use and receive feedback when multimedia files are viewed concurrently. As a result, the provider is able to deliver a quality viewing experience for an individual and at the same time monitor a viewer's reactions and preferences to specific characteristics within the presentation.
[0009] Typically, an individual giving a presentation must control a streaming multimedia file. For example, a person using a slide show must click through each slide or manually play and stop a streaming video. This requirement often times reduces the flow of a presentation and may distract both the presenter and audience. Also, the individual may make an error in controlling these multimedia files resulting in an extended break during which the multimedia presentation must be corrected.
[0010] Although a streamed multimedia file enhances a presentation, the level of interactivity between users is limited and between other files is non-existent. For example, an individual may scroll through a slide presentation or stop a streaming video; however, an individual may not effectively display a multi-file multimedia presentation that includes multiple files being concurrently displayed and controlled. This limitation within existing multimedia platforms is caused by a lack of file synchronization or cooperation that displays multiple files concurrently.
[0011] Accordingly it is desirable to provide a platform that displays synchronized multiple multimedia files. Additionally, it is desirable to provide a platform that allows an individual to interact with a multi-file multimedia presentation that is synchronized to a single primary multimedia file.
SUMMARY OF THE INVENTION
[0012] The present invention provides a system and method for synchronizing and displaying multimedia files in a presentation. Specifically, the present invention creates a platform on which rich content presentations are shown by synchronizing at least one secondary multimedia file to a primary multimedia file.
[0013] The system comprises a first server coupled to a network, a second server coupled to the network, a database coupled to the second server, and at least one client coupled to the network. These networked devices allow multiple files to be transmitted, synchronized and displayed through the network. As a result, a presentation may be stored remotely, transmitted to a client, synchronized locally at the client and displayed on an attached display device accordingly.
[0014] The first server streams a primary multimedia file across the network to the client. This first server may be a video server that streams a video file to the client. This video file is buffered at the client and shown by the display device that is coupled to the client. Specifically, the first multimedia file may be shown in a particular window within the display device. For example, a streaming video may be shown within an ActiveX controlled window within a web browser. Other multimedia files within the presentation are synchronized to this primary multimedia file to provide a rich multi-file presentation.
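As a point of illustration only, attaching the streamed primary file to the first display window might look as follows in a modern browser; the patent describes an ActiveX controlled window rather than this API, and the element id and stream URL are assumptions.

```typescript
// Rough stand-in for attaching the streamed primary multimedia file to the
// first display window; the browser buffers the stream as it arrives.
function startPrimaryStream(streamUrl: string): HTMLVideoElement {
  // Assumes the page already contains a <video> element with id "first"
  // serving as the first display window.
  const video = document.getElementById("first") as HTMLVideoElement;
  video.src = streamUrl;   // e.g. a URL served by the video server
  video.preload = "auto";  // ask the browser to buffer ahead of playback
  void video.play();       // begin showing the primary multimedia file
  return video;
}
```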
[0015] The second server transmits a secondary multimedia file across the network to the client. The secondary multimedia file may be an HTML file(s), graphic files, or text files. The secondary multimedia file may be pre-fetched from the second server before the presentation begins. In such an instance, the secondary multimedia file is stored locally on the client in a storage device such as a cache or hard disk drive. This pre-fetching increases the available bandwidth at the client for the primary multimedia file. However, pre-fetching the secondary multimedia file is not required; rather, both the primary and secondary multimedia files may be streamed concurrently during the presentation.
[0016] The second server also transmits a synchronization file across the network to the client. The synchronization file contains a plurality of indices that synchronize the secondary multimedia file(s) to the primary multimedia file. For example, the synchronization file may contain a first index that triggers a particular HTML file to be shown when a frame within the primary file is reached. Specifically, as a video is shown in a first window, an HTML picture may be shown in a second window after the 200th frame in the video is shown. This synchronization allows a viewer to see multiple windows showing synchronized multimedia content during the presentation. This synchronization file may be pre-fetched from the second server before the presentation begins. Additionally, the synchronization file may be processed at the second server, allowing the second server to control the synchronization of the presentation. It is important to note that the first server and the second server may be located on the same physical device on the network.
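The patent does not prescribe a concrete on-the-wire format for the synchronization file; the following TypeScript sketch is illustrative only, assuming a JSON-like representation in which each index names a trigger point in the primary file, the window to update, and the secondary file to show. All field names and URLs are hypothetical.

```typescript
// Hypothetical shape of a synchronization file; the patent does not
// prescribe a concrete format, so every field name here is illustrative.
interface SyncIndex {
  frame?: number;        // trigger when this frame of the primary file is reached
  timeSeconds?: number;  // or trigger at this playback time
  targetWindow: string;  // window in which to show the secondary file
  secondaryUrl: string;  // location of the secondary multimedia file
  holdSeconds?: number;  // optional display interval; default is "until the next index"
}

interface SyncFile {
  primaryUrl: string;    // primary multimedia file streamed from the first server
  indices: SyncIndex[];
}

// Example: show an HTML picture in the second window after frame 200 of the video.
const example: SyncFile = {
  primaryUrl: "rtsp://video-server.example/presentation.asf",
  indices: [
    { frame: 200, targetWindow: "second", secondaryUrl: "https://web-server.example/slides/picture1.html" },
    { frame: 450, targetWindow: "second", secondaryUrl: "https://web-server.example/slides/picture2.html", holdSeconds: 30 },
  ],
};
```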
[0017] The client receives these multimedia files from the servers and displays the synchronized multimedia presentation to a viewer. As mentioned above, the presentation may be synchronized at the client according to the synchronization file. In this instance, the client has a certain level of intelligence required to perform these synchronization functions. A processor in the client may access the synchronization file and determine the indices that synchronize the secondary multimedia file to the primary multimedia file. The processor then monitors the primary multimedia file to determine when a particular index is reached (e.g., a particular frame). In response, a secondary multimedia file is displayed in a window on the display device. This process continues until the primary multimedia file is complete and/or all the secondary multimedia files have been shown.
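As a rough illustration of the client-side processing described in paragraph [0017], the sketch below polls the primary file's current frame and shows each secondary file when its index is reached. The `currentFrame` and `showInWindow` helpers are hypothetical stand-ins for the player and display-window interfaces; nothing in the patent defines them.

```typescript
// Hypothetical client-side synchronization step: poll the primary file's
// progress and trigger each secondary file when its frame index is reached.
type FrameIndex = { frame: number; targetWindow: string; secondaryUrl: string };

function runSynchronization(
  indices: FrameIndex[],
  currentFrame: () => number,                        // assumed hook into the video player
  showInWindow: (win: string, url: string) => void   // assumed hook into the display windows
): void {
  const pending = [...indices].sort((a, b) => a.frame - b.frame);
  const timer = setInterval(() => {
    const frame = currentFrame();
    // Fire every index whose frame has now been reached.
    while (pending.length > 0 && pending[0].frame <= frame) {
      const next = pending.shift()!;
      showInWindow(next.targetWindow, next.secondaryUrl);
    }
    // Stop once every secondary multimedia file has been shown.
    if (pending.length === 0) clearInterval(timer);
  }, 100);
}
```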
[0018] The display device may comprise a single computer monitor with a web browser having multiple windows. For example, a web browser may have a first window in which the primary multimedia file (e.g., video) is shown. The web browser may have a second window in which the secondary multimedia file(s) is shown. The web browser may have additional windows for displaying secondary multimedia files or for offering other functions that may be integrated within a presentation. For example, a window within the web browser may be used to provide real-time chatting for a viewer. The viewer may be able to ask questions regarding the presentation or discuss the presentation with other viewers at other locations.
[0019] The present invention also offers a security platform that may be integrated within the presentation. Specifically, a viewer may have only limited or no access to the presentation until fulfilling a security function. For example, a viewer may be required to supply a password in order to access particular presentations. Additionally, a viewer may not be given rights to the presentation file that would allow the viewer to modify the settings of the presentation. Specifically, a viewer may need to supply a password in order to adjust the presentation in any manner. Also, the presentation may be secured by allowing a viewer to access the presentation only from a particular location. For example, a viewer may be sent an email containing a hyperlink referring to a particular presentation. Viewing rights within the email may allow the viewer to see the presentation (e.g., from a particular network address).
[0020] The present invention may also provide viewer interaction features within the presentation platform. Specifically, a presentation may have various interactive features that allow a viewer to select items/topics of interest or respond to questions. Responses to these interactive features may operate as indices within the synchronization file that trigger a particular secondary multimedia file. For example, a viewer may be asked a question regarding a particular item shown in the primary multimedia file. In response to the answer, a specific secondary multimedia file may be shown in a separate window that gives further detail on a particular item or topic. This feature allows a viewer to tailor the presentation by requesting more information on particular topics.
[0021] The present invention may also monitor these interactive functions in order to provide valuable information to a service provider or other individual/entity responsible for the presentation. This information is transmitted from the client to a database coupled to the network. This transmission may occur intermittently during the presentation, or the information may be transmitted after the presentation has concluded. Thereafter, the information may be organized and analyzed. For example, a presentation may ask a viewer a question regarding the color of a particular product. The response is then transmitted to a database containing responses from a large number of viewers. A manufacturer may analyze this information in order to determine whether a product should be manufactured in a particular color.
[0022] The features and advantages described in this summary and the following detailed description are not all-inclusive, and particularly, many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims hereof.
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] Fig. 1 is an illustration of a graphical user interface used to display a synchronized multimedia presentation according to an embodiment of the present invention.
[0024] Fig. 2 is an illustration of a system used to deliver the synchronized multimedia presentation to a display device according to an embodiment of the present invention.
[0025] Fig. 3 is a block diagram of a device that may be used to display the synchronized multimedia presentation according to an embodiment of the present invention.
[0026] Fig. 4 is an illustration representing the hierarchical structure of a synchronized multimedia presentation according to an embodiment of the present invention.
[0027] Fig. 5 is an illustration of a storage and control device used for a synchronized multimedia presentation according to an embodiment of the present invention.
[0028] Fig. 6 is a flow diagram of a synchronized multimedia presentation according to an embodiment of the present invention.
[0029] Fig. 7 is a flow diagram of file synchronization within a multimedia presentation according to an embodiment of the present invention.
[0030] Fig. 8 is a flow diagram of a user response storage system implemented within a synchronized multimedia presentation according to an embodiment of the present invention.
[0031] Fig. 9 is a flow diagram of a user interaction monitor implemented within a synchronized multimedia presentation according to an embodiment of the present invention.
[0032] Fig. 10 is a general graphical user interface on which a synchronized multimedia presentation may be initiated and viewed.
[0033] Fig. 11 is an illustration of a synchronized multimedia presentation system having multiple display devices.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0034] A system and method for displaying, modifying, and creating a presentation containing synchronized concurrently-displayed files is described. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the invention. It will be apparent, however, to one skilled in the art that the invention can be practiced without these specific details. In other instances, structures and devices are shown in block diagram form in order to avoid obscuring the invention.
[0035] Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
[0036] Some portions of the detailed description that follows are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
[0037] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "processing" or "computing" or "retrieving" or "indexing" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
[0038] The present invention also relates to apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.
[0039] The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description below. In addition, the present invention is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
[0040] Moreover, the present invention is claimed below as operating or working with a plurality of servers attached to a network. As such, software implementing the present invention may be stored locally on a client terminal or in conjunction with one or a plurality of servers on a network. Additionally, the software may be stored within other storage devices (e.g., a database or SAN) connected to the network.
A. Overview of a User Interface for a Synchronized Multimedia Presentation
[0041] The present invention is directed to a system and method for displaying presentations containing synchronized multimedia files. Generally, a user can view the presentation on a display device attached to a network. For example, the present invention provides a viewing system allowing a user to view synchronized presentations on a web browser 100 after initiating or accessing an appropriate file through a network. Figure 1 shows an embodiment of the present invention displayed within a web browser.
[0042] The web browser 100 is one example of a display device on which the present invention may be shown. Examples of web browsers include Microsoft Explorer and Netscape Navigator. The browser 100 may be partitioned into different windows on which various different types of multimedia files may be shown. For example, within the web browser 100, a first display window 110 displays a primary multimedia file. Generally, this primary multimedia file is a video file (i.e., mpeg, avi, real, asf, mp3, etc.) although it may be any type of multimedia file including those conforming to VoIP and H.323 standards. The size, shape and color quality of the window may all be pre-determined and/or adjusted by the viewer.
[0043] A second display window 120 within the browser 100 displays a secondary multimedia file. The secondary multimedia file may be an HTML based file or any type of file including a video file, a Microsoft Powerpoint™ file, an image file, or a word processing file. The secondary multimedia file may be shown concurrently with the primary multimedia file during a presentation. The secondary multimedia files may also be automatically converted to a standard file type by the platform. For example, a Powerpoint™ slide may be converted to an XML slide to facilitate easier control and viewing of the slide. During the actual presentation, the secondary multimedia file is controlled by a synchronization file.
[0044] The synchronization file synchronizes the display of the secondary multimedia file to the primary file. According to one embodiment, the synchronization file is retrieved from a remote location before the presentation begins. The synchronization file contains at least one index that relates the display of the secondary multimedia file to the primary multimedia file. Specifically, the secondary multimedia file is displayed by triggering the pre-determined indices within the synchronization file as the primary file progresses. An index may relate to a frame or moment of time within the primary multimedia file or any other indicator through which the secondary multimedia file may be controlled. Once the index is realized and the secondary multimedia file is displayed, the secondary multimedia file remains displayed for an interval of time. This interval of time may be a default time value, relate to the triggering of a subsequent secondary file, or a predetermined amount of time.
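To make the three interval options just described concrete, the following sketch resolves how long a triggered secondary file might stay on screen; the default value, field names, and trigger shape are assumptions for illustration rather than anything specified in the patent.

```typescript
// Sketch of resolving the display interval for a triggered secondary file,
// following the three options named above; values and names are assumed.
interface Trigger { atSeconds: number; holdSeconds?: number }

const DEFAULT_HOLD_SECONDS = 15; // assumed default display interval

function displayInterval(current: Trigger, next?: Trigger): number {
  if (current.holdSeconds !== undefined) return current.holdSeconds;   // a pre-determined amount of time
  if (next !== undefined) return next.atSeconds - current.atSeconds;   // until the next secondary file triggers
  return DEFAULT_HOLD_SECONDS;                                         // fall back to a default time value
}
```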
[0045] Although there are only two display windows displaying multimedia files in this example, it is important to note that according to the present invention there can be numerous windows concurrently displaying numerous multimedia files.
[0046] A text message window 190 displays text to a viewer during the presentation. Like the secondary multimedia files, the text messages may be displayed by triggering a predetermined index within the synchronization file. Once the index is realized and the text message is displayed, the text message remains displayed for an interval of time. This interval of time may be a default time value, relate to the triggering of a subsequent secondary file, or a pre-determined amount of time. Additionally, the text message window 190 may facilitate real-time network chatting that allows a viewer to communicate with another individual through the network. In this instance, this type of chatting function would operate according to the H.323 standard or other standards that allow multiple applications to operate on a network concurrently.
[0047] A presentation indices window 130 displays a summary of indices contained within the presentation. This summary allows the viewer not only to quickly scan the presentation but also to review or jump to specific secondary multimedia files indexed to the primary multimedia file. Additionally, the presentation indices window 130 may be hidden from the user or decreased in size to allow the expansion of other windows (e.g., the second display window 120) within the browser 100.
[0048] A presentation control window 135 contains a plurality of controls that allow a viewer to manipulate a presentation. For example, a viewer may play, pause, rewind, fast forward or select continual play from a variety of controls within this window 135. Additionally, a status bar (not shown) may be implemented to show a currently displayed frame or time position relative to the entire video file. A user may also control other viewing options within the presentation by accessing controls such as window size, font, font size, font color, and window arrangement. Also, the presentation control window 135 may contain a counter that counts the number of frames that have been displayed as well as displaying the total number of frames within a certain video file.
[0049] The web browser may also have a thumbnail view option to choose a video, slide or full thumbnail view. A video thumbnail is a snapshot of the frame at the index point. This snapshot is automatically extracted at the time of authoring. A slide thumbnail is a smaller view of the slide document (HTML, MS PowerPoint slide, etc.). If a thumbnail cannot be extracted, a generic image/blank is shown. The full thumbnail view may show both the video thumbnail and the corresponding slide thumbnail together for an index point. A thumbnail item may also show a time stamp of the index, an image and the annotation.
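The patent states only that a video thumbnail is extracted automatically at authoring time. As one hedged illustration of how an authoring tool running in a browser could do this, the sketch below seeks an HTML video element to the index point and copies that frame to a canvas; the element handling and thumbnail size are assumptions, not details from the patent.

```typescript
// Hedged sketch: capture a video thumbnail at an index point using a
// <video> element and a canvas. This mechanism is an assumption; the patent
// only says the snapshot is extracted at authoring time.
async function captureThumbnail(
  video: HTMLVideoElement,
  timeSeconds: number,
  width = 120,
  height = 90
): Promise<string> {
  return new Promise((resolve) => {
    const onSeeked = () => {
      video.removeEventListener("seeked", onSeeked);
      const canvas = document.createElement("canvas");
      canvas.width = width;
      canvas.height = height;
      canvas.getContext("2d")!.drawImage(video, 0, 0, width, height);
      resolve(canvas.toDataURL("image/png")); // data URL usable as an <img> src in the indices window
    };
    video.addEventListener("seeked", onSeeked);
    video.currentTime = timeSeconds; // seek to the index point before capturing
  });
}
```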
[0050] Having just described an overview of a display device for showing synchronized presentations, the systems and methods used to display the synchronized presentations are described below. First, the system and its components will be described. Second, the software architecture and its various modules will be described. Third, methods of operation of the present invention will be described. Finally, an example of a display device capable of showing synchronized presentations is described in detail.
B. Synchronized Multimedia Presentation System
[0051] Figure 2 shows a block diagram of a networked system used to provide synchronized multimedia presentations according to an embodiment of the present invention. This system includes a network 210 to which a client 205, a web server 220 and a video server 215 are coupled. A database 225 is coupled to the web server 220 to allow stored data to be transmitted onto the network 210. As previously described, a synchronized multimedia presentation contains a primary multimedia file, for example a video file, stored on the video server 215 attached to the network 210. The synchronized multimedia presentation also contains a secondary multimedia file, which is, for example, an HTML-based file but may be any type of file, including another video file. The secondary multimedia file is indexed to the primary multimedia file by a synchronization file. Both the secondary multimedia file and the synchronization file are stored on the database 225 and are transmitted onto the network 210 via the web server 220. Additionally, the database 225 may be used to store other types of data relating to the presentation; for example, feedback information regarding a viewer's response to the presentation may be transmitted from the client 205 to the web server 220 and stored in the database 225.
[0052] According to a first embodiment of the present invention, components of a synchronized multimedia presentation are streamed from the video server 215 and the web server 220 to the client 205. In this example, a video file is stored on the video server 215 and streamed to the client 205 during a presentation. Secondary multimedia files are stored in the database 225 and transmitted to the client from the web server 220. Generally, there will be only one secondary multimedia file, typically an HTML-based file. However, multiple secondary multimedia files may be used to implement numerous files being synchronized and displayed within the presentation. Additionally, the secondary multimedia files do not need to be HTML-based files but can be any type of file, including video, text, and image files.
[0053] An individual may view the synchronized presentation by logging onto the network 210 and initiating the presentation. For example, the individual may use a uniform resource locator to address the presentation across the network 210 and display it within a web browser. This initiation may include entering a password, opening an email message or clicking on a hyperlink.
[0054] The presentation may be transmitted to the client 205 using various types of methods. For example, the secondary multimedia files may be pre-fetched from the web server 220 and buffered in the client terminal 205 before the presentation begins. Also, the synchronization file containing indexing information, which is stored within the database 225, may be pre-fetched and buffered in the client terminal 205. Once these files are buffered, a video file may be streamed to the client terminal 205 from the video server 215. Software, stored either locally on the client 205 or remotely on a server, monitors the video file as it is displayed and synchronizes the secondary multimedia file(s) according to the indexing information within the synchronization file.
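A minimal sketch of this pre-fetch step follows, assuming the synchronization file and the secondary multimedia files are addressable over HTTP and that the synchronization file lists the secondary file URLs under an `indices` field; these names and the in-memory cache are illustrative, not part of the patent.

```typescript
// Illustrative pre-fetch of presentation assets before the video stream starts.
// URLs, the "indices" field, and the cache shape are assumptions.
async function prefetchPresentationAssets(syncFileUrl: string): Promise<Map<string, string>> {
  const cache = new Map<string, string>();

  // Fetch and buffer the synchronization file first.
  const sync = await (await fetch(syncFileUrl)).json();
  cache.set(syncFileUrl, JSON.stringify(sync));

  // Buffer each secondary multimedia file it references so that only the
  // primary video competes for bandwidth once streaming begins.
  for (const index of sync.indices ?? []) {
    const body = await (await fetch(index.secondaryUrl)).text();
    cache.set(index.secondaryUrl, body);
  }
  return cache;
}
```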
[0055] The client 205 provides a screen(s) on which the presentation may be shown, as well as a processing device and storage device that allow the components of the presentation to be synchronized. It is important to note that the web server 220 and the video server 215 may be located on the same physical device. For example, a single server may operate both as the web server 220 and the video server 215.
[0056] Figure 3 shows an example of a client 205 on which a synchronized multimedia presentation may be shown. The client terminal comprises a control unit 300 coupled, via a bus, to a display 305, a keyboard 310, a cursor control 315, a network controller 320, and an I/O device 325.
[0057] The control unit 300 is typically a personal computer or computing box attached to a network. However, it may also be a personal digital assistant or any other device able to receive, process and display data. In one embodiment, the control unit 300 has an operating system (e.g., Windows, UNIX, etc.) upon which multiple applications operate. The control unit 300 comprises a processor 350, main memory 335, and a data storage device 345, all connected to a bus 330.
[0058] A processor 350 processes data signals and may comprise various computing architectures including a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, or an architecture implementing a combination of instruction sets. Although only a single processor is shown, multiple processors may be attached.
[0059] Main memory 335 may store instructions and/or data that may be executed by processor 350. The instructions and/or data may comprise code for performing any and/or all of the techniques described herein. Main memory 335 may be a dynamic random access memory (DRAM) device, a static random access memory (SRAM) device, or some other memory device known in the art. The memory 335 preferably includes a web browser 340 of a conventional type that provides access to the Internet and processes HTML, XML or other markup language to generate images on the display device 305. For example, the web browser 340 could be Netscape Navigator or Microsoft Internet Explorer.
[0060] Data storage device 345 stores data and instructions for processor 350 and may comprise one or more devices including a hard disk drive, a floppy disk drive, a CD-ROM device, a DVD-ROM device, a DVD-RAM device, a DVD-RW device, a flash memory device, or some other mass storage device known in the art.
[0061] System bus 330 represents a shared bus for communicating information and data throughout control unit 300. System bus 330 may represent one or more buses including an industry standard architecture (ISA) bus, a peripheral component interconnect (PCI) bus, a universal serial bus (USB), or some other bus known in the art to provide similar functionality.
[0062] Additional components coupled to control unit 300 through system bus 330 include display device 305, keyboard 310, cursor control device 315, network controller 320 and audio device 325. Display device 305 represents any device equipped to display electronic images and data as described herein. Display device 305 may be a cathode ray tube (CRT), liquid crystal display (LCD), or any other similarly equipped display device, screen, or monitor. In one embodiment, display device 305 is equipped with a touch screen in which a touch-sensitive, transparent panel covers the screen of display device 305.
[0063] Keyboard 310 represents an alphanumeric input device coupled to control unit 300 to communicate information and command selections to processor 350. Cursor control 315 represents a user input device equipped to communicate positional data as well as command selections to processor 350. Cursor control 315 may include a mouse, a trackball, a stylus, a pen, a touch screen, cursor direction keys, or other mechanisms to cause movement of a cursor. Network controller 320 links control unit 300 to a network that may include multiple processing systems. The network of processing systems may comprise a local area network (LAN), a wide area network (WAN) (e.g., the Internet), and/or any other interconnected data path across which multiple devices may communicate.
[0064] One or more I/O devices 325 are coupled to the system bus 330. For example, the I/O device 325 may be an audio device equipped to receive audio input and transmit audio output. Audio input may be received through various devices including a microphone within audio device 325 and network controller 320. Similarly, audio output may originate from various devices including processor 350 and network controller 320. In one embodiment, audio device 325 is a general-purpose audio add-in/expansion card designed for use within a general purpose computer system. Optionally, audio device 325 may contain one or more analog-to-digital or digital-to-analog converters, and/or one or more digital signal processors to facilitate audio processing.
[0065] It should be apparent to one skilled in the art that control unit 300 may include more or fewer components than those shown in Figure 3 without departing from the spirit and scope of the present invention. For example, control unit 300 may include additional memory, such as, for example, a first or second level cache, or one or more application specific integrated circuits (ASICs). Similarly, additional components may be coupled to control unit 300 including, for example, image scanning devices, digital still or video cameras, or other devices that may or may not be equipped to capture and/or download electronic data to control unit 300.
[0066] Figure 4 is a block diagram representing a general hierarchical model of the multiple layers within an embodiment of the present invention. A display composite 400 is generated through the combination of a video layer 410, an HTML layer 420, and a control/synchronization layer 430. The video layer 410 comprises a video file (i.e., primary multimedia file) that is streamed from the video server 215 and displayed in a first display window 110. Specifically, this video is streamed from a video server during the presentation and buffered at a client. The video stream is then processed and displayed in a window on the web browser 100.
[0067] An HTML layer 420 comprises an HTML file(s) that is pre-fetched from the database 225 via the web server 220. The HTML file is stored within the client before the streaming of the video file begins. The HTML file(s) is synchronized to the streaming video and displayed in the second display window 120 within the web browser 100. Note that the HTML files need not be pre-fetched but can instead be synchronized at the server side and streamed from the server according to the synchronization.
[0068] A control layer 430 controls and synchronizes both the video file and the HTML file. In one example, this control data may be stored within a single synchronization file that is pre-fetched from the database 225 via the web server 220. The control layer 430 manages the display options of the window in which the video is shown and the window in which the HTML file is shown. These display options include the size and brightness of the window, the volume of an audio file, etc. Additionally, the control layer 430 synchronizes the display of both the video and HTML file. As described above, the display of the HTML file(s) may be triggered by a particular index (e.g., time index or frame index) in the video file. If multiple HTML files are used, then each file may have an index to the video file that will trigger its display.
[0069] The composite layer 400 controls the incorporation of the video presentation and the HTML presentation within a single browser. The composite layer 400 utilizes a variety of applications to implement this incorporation, including ActiveX and the underlying operating system (Microsoft Windows™ or Unix™). Thus, the multi-layered presentation provides a user an independently controlled presentation while still allowing a presentation designer the ability to incorporate user interactivity within the presentation.
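As a hedged, modern illustration of the layered composite just described, the sketch below places the primary video window and a secondary HTML window inside a single page. The element ids and the iframe-based second window are assumptions; the patent's own example relies on an ActiveX-controlled window instead.

```typescript
// Hedged sketch of a composite layout: the video layer and the HTML layer
// each occupy their own window (element) inside a single page.
function buildCompositeLayout(
  root: HTMLElement,
  videoUrl: string
): { video: HTMLVideoElement; htmlWindow: HTMLIFrameElement } {
  const video = document.createElement("video");
  video.id = "primary-window";        // first display window (video layer)
  video.src = videoUrl;
  video.controls = true;

  const htmlWindow = document.createElement("iframe");
  htmlWindow.id = "secondary-window"; // second display window (HTML layer)

  root.append(video, htmlWindow);
  return { video, htmlWindow };
}
```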
C. Client-Side Control of Synchronized Presentation
[0070] Figure 5 shows a more detailed drawing of the main memory 335 of the client 205. The main memory generally comprises an operating system 500 whereon a number of applications operate. A bus 330 couples the operating system 500 to the applications module 505. In a preferred embodiment, a video storage module 510 is coupled to the bus 330. The video storage module 510 stores/buffers a primary video file received from the video server 215. The primary video file may be stored locally or streamed in real time from the video server 215 and buffered in the video storage module. Examples of a video storage module include a portion of a hard disk drive or a RAM module.
[0071] An HTML storage module 520 is coupled to the bus 330. The HTML storage module 520 stores secondary multimedia files received from the web server 220. These secondary multimedia files may be either converted to an HTML format by an author prior to storage or automatically converted by the HTML storage module 520 to an HTML-based file. For example, a text file may be automatically converted to an HTML file using one of numerous conversion utilities known within the art. These secondary multimedia files may be stored locally within the HTML storage module 520, pre-fetched from the web server 220 and buffered in the module 520, or streamed in real time and buffered in the module 520. Examples of an HTML storage module 520 include a portion of a hard disk drive or a RAM module.
[0072] A synchronization control module 525 is coupled to the bus 330. The synchronization control module 525 stores a synchronization file containing indexing information and file addressing information created during the authoring of the presentation. Generally, this information is buffered in the synchronization control module 525 before the presentation begins. However, the synchronization control module 525 may also receive indexing data during the presentation. The processor 350 accesses this indexing information from the synchronization control module 525 during the presentation in order to properly synchronize the HTML file(s) to the video file. Examples of a synchronization control module 525 include a portion of a hard disk drive or a RAM module.
[0073] A graphical user interface control module 530 is coupled to the bus 330. The graphical user interface control module 530 stores graphical display options created by the author during the creation of a presentation. Examples of these options include the size of each of the display windows, the volume of the audio, and the duration for which a secondary multimedia file is displayed. Once these display options are stored within the module 530, the processor 350 may access this data in order to provide the correct graphics on the display 305. Examples of a graphical user interface control module 530 include a portion of a hard disk drive or a RAM module.
D. Methods for Displaying a Synchronized Multimedia Presentation
[0074] Figure 6 is a flowchart showing a first method for displaying synchronized multimedia files contained within a presentation. In order to display a presentation, an individual must initiate 605 the presentation from the client 205 connected to the network 210. The presentation may be initiated within a browser using a uniform resource locator. After the presentation has been initiated, certain files may be pre-fetched from a server and buffered at the client 205. For example, secondary multimedia files, such as HTML files stored in the database 225, may be pre-fetched 610 and buffered at the client 205 before the presentation begins. Additionally, synchronization data may be pre-fetched 615 from the database 225 and buffered at the client 205 before the presentation begins. However, it is important to note that the presentation may be displayed without pre-fetching any secondary multimedia files.
[0075] A primary video file is streamed 620 from the video server 215 through the network 210 to the client 205. Synchronization data is used by the processor 350 to synchronize and display 625 the secondary multimedia file(s) in relation to the primary video file. The synchronization data is processed either locally or remotely on a server. In one example, the client terminal 205 processes the synchronization data and displays both files accordingly. In a second example, the synchronization data is processed remotely on a server, either the video server 215 or the web server 220. In this example, both servers would communicate with each other in order to trigger the transmission and display of secondary multimedia files from the web server 220.
[0076] Figure 7 is a flowchart showing a method for synchronizing the display of a secondary multimedia file to a primary multimedia file. Initially, a primary multimedia file is shown 705 within a display. A processor, within a display device or coupled to the display device, monitors 710 the video file in order to properly identify synchronization indices relating to the secondary multimedia file(s). For example, the progression of frames within a video file may be monitored in order to identify a frame index relating to a secondary multimedia file. Similarly, the time progression of a video file as it is being displayed may be monitored to identify a timing index relating to a secondary multimedia file.
[0077] Once an index is identified, a secondary multimedia file is displayed 720 within another window. For example, a particular HTML file is displayed in a second window when an index at the 200th frame of the video file is reached. This HTML file may continue to be displayed until the next index is reached or for a particular period of time. However, during this process, the video file continues playing 730. This process continues until the entire video is played.
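The following sketch is a hedged stand-in for the monitoring loop of Figure 7, using a timing index rather than a frame count: it listens to the player's time progression and triggers each secondary file as its index is passed, while the primary video keeps playing. The `showSecondary` helper and the index shape are assumptions.

```typescript
// Hedged illustration of steps 705-730: monitor the primary file's time
// progression and show each secondary file when its timing index is reached.
function monitorTimeIndices(
  video: HTMLVideoElement,
  indices: { timeSeconds: number; url: string }[],
  showSecondary: (url: string) => void   // assumed hook into the second display window
): void {
  const pending = [...indices].sort((a, b) => a.timeSeconds - b.timeSeconds);
  video.addEventListener("timeupdate", () => {
    while (pending.length > 0 && video.currentTime >= pending[0].timeSeconds) {
      showSecondary(pending.shift()!.url); // the video keeps playing (step 730)
    }
  });
}
```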
E. Methods for Providing User Interactivity with the Presentation
[0078] Figure 8 shows a flowchart describing a user response storage system that may be implemented within a synchronized multimedia presentation. During the presentation, a user may be prompted by a question or other inquiry 805. For example, a user may be asked what he/she thinks of a certain part of the presentation. This question may be embedded within a secondary multimedia file. The system may wait 810 for a response from the user before continuing on with the presentation or may simply continue the presentation. This response may be the user activating a graphical icon in the presentation or an audio response that is recorded by the display device.
[0079] After the user responds, the system transmits the response to a storage system either locally or remotely on an attached network. For example, the response may be transmitted 815 across the network 210 to the web server 220, whereupon it is stored 820 in the database 225. In another embodiment, the response information may be stored in a different location on the network. In any event, this response information may then be accessed 825 from the database 225 and analyzed 830.
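As an illustration of transmitting a viewer response over the network for storage, the sketch below posts the answer to a web server endpoint. The `/api/responses` path and the payload fields are hypothetical, since the patent does not define a transport format.

```typescript
// Illustrative transmission of a viewer response (step 815) to the web server
// for storage in the attached database; endpoint and fields are assumptions.
async function submitResponse(presentationId: string, questionId: string, answer: string): Promise<void> {
  await fetch("/api/responses", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      presentationId,
      questionId,
      answer,
      submittedAt: new Date().toISOString(),
    }),
  });
}
```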
[0080] Figure 9 illustrates a method for monitoring interactive functions that may be embedded 905 within the primary video file. Interactive functions include pausing, stopping, or rewinding the video file. Also, more complex interactive functions may be embedded in the primary video that allow a user to identify or activate points of interest within the primary video. For example, a user may be able to use a mouse to click on an item shown within the primary video to identify the item as something of interest. This click may then initiate a secondary multimedia file in a different window that describes the item of interest in more detail. This interactive primary video display also allows a viewer to interact with the video by clicking on certain frames, by editing a video or multimedia file, or through other forms of interaction. This interaction is monitored 910 by a client (e.g., 205) on which the presentation is shown and is stored locally. Next, the client transmits 915 the interaction data through an attached network to a web server. This data may be transmitted during the presentation, or stored locally and transmitted at the end of the presentation.
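For the "store locally and transmit at the end of the presentation" option, a simple client-side buffer might look like the following; the event shape and endpoint are assumptions made only for illustration.

```typescript
// Sketch of buffering interaction events locally (step 910) and flushing them
// to the web server when the presentation concludes (step 915).
type InteractionEvent = { kind: string; frame: number; occurredAt: string };

const interactionBuffer: InteractionEvent[] = [];

function recordInteraction(kind: string, frame: number): void {
  interactionBuffer.push({ kind, frame, occurredAt: new Date().toISOString() });
}

async function flushInteractions(presentationId: string): Promise<void> {
  if (interactionBuffer.length === 0) return;
  await fetch(`/api/presentations/${presentationId}/interactions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(interactionBuffer),
  });
  interactionBuffer.length = 0; // clear the local store once the server has the data
}
```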
[0081] The interaction data is then stored 920 in a storage device that is coupled to the network. For example, this interaction data may be transmitted to the web server 220 and stored in the attached database 225. Thereafter, if a provider or a business wants to analyze the stored data, the interaction data may be retrieved 925 from the database and subsequently analyzed 930.
F. Embodiment of Other User Interfaces for Initiating a Synchronized Multimedia Presentation
[0082] Figure 10 illustrates a browser window 1000 for accessing a synchronized multimedia presentation. An addressing window 1010 may be used to access the presentation by a specific uniform resource locator via a web page. To view the web page, the user uses an Internet browser 1000 like Microsoft Explorer™ or Netscape Navigator™. According to this example, the user is given a variety of choices, one of which is to see a list of presentations contained within multiple directories within the presentation view window 1040. Access to this list may be password protected. Accordingly, a presentation security window 1030 within the browser 1000 may require a password from a user in order to gain access to the presentation directories or open a particular presentation.
[0083] If the user knows which file to view, then the user may select a "view a presentation" icon (not shown) within the presentation view window 1040. Once the icon is selected, the user may input the name of the presentation or select the presentation from a default home directory. Once a specific presentation is selected, the presentation will be displayed within the browser or a different browser.
[0084] A user may modify settings of the presentation using a presentation control window 1050 within the browser 1000. For example, controls such as various window sizes within the presentation, ActiveX-specific controls, brightness, video rate, etc. may be adjusted within this window 1050 prior to displaying the presentation. This functionality allows a presenter to tune the presentation before it is given. The presentation controls may be adjusted locally where the presentation is given or adjusted remotely at a terminal coupled to the network.
[0085] A user may also initiate a presentation by opening an email and clicking on a hyperlink. The hyperlink will take the browser directly to a URL address where the presentation will automatically begin. This emailing operation allows an author to invite individuals to view a presentation by merely sending out an email containing a hyperlink to a corresponding URL address. Security features may be embedded into the presentation, or into the platform on which the presentation operates, that allow the presentation to be initiated only by a single IP address or multiple IP addresses corresponding to invitation emails.
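One hedged way to realize the emailed-hyperlink initiation just described is to embed an invitation token in the presentation URL that the platform can verify before granting viewing rights; the parameter names and the token check are assumptions, not details from the patent.

```typescript
// Hypothetical construction of the hyperlink embedded in an invitation email;
// the query parameters are illustrative only.
function buildInvitationLink(baseUrl: string, presentationId: string, inviteToken: string): string {
  const url = new URL(baseUrl);
  url.searchParams.set("presentation", presentationId);
  url.searchParams.set("invite", inviteToken); // checked server-side before the presentation begins
  return url.toString();
}

// Example usage (all values assumed):
// buildInvitationLink("https://presentations.example/view", "q1-sales-review", "a1b2c3");
```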
G. Multi-Display Synchronized Multimedia Presentation
[0086] Figure 11 illustrates a system on which a single presentation may be transmitted to multiple display devices coupled to a network. As previously discussed, the web server 220 and video server 215 transmit multiple multimedia files that are synchronized and incorporated within a presentation. According to one embodiment, the presentation may be shown on multiple display devices. For example, a primary multimedia file, a secondary multimedia file and a synchronization file may be transmitted to a client 1100 coupled to the network 210. The client 1100 comprises a storage device and a processing unit that allow the different multimedia files to be synchronized and incorporated into a single presentation. Once these multimedia files have been appropriately processed, files are transmitted to an array of display devices on which the presentation is shown.
[0087] A first display device 1105 is coupled to the client 1100 and shows a primary multimedia file. As previously discussed, this primary multimedia file may be a video file that is streamed onto the client 1100 from the video server 215. The first display device 1105 may be coupled to the client 1100 via a serial or parallel connection, as would be the case if it were a computer monitor. The first display device 1105 may be coupled to the client 1100 via a network connection such as an Ethernet or IP network connection. However, in this instance, the first display device 1105 would likely require a certain level of intelligence in order to properly function on the network.
[0088] A second display device 1110 is coupled to the client 1100 and shows a secondary multimedia file. The secondary multimedia file may be another video file, an HTML file, a text file or any other type of file that will allow it to be shown concurrently with the primary multimedia file. As was the case with the first display device 1105, the second display device 1110 may be networked to the client 1100 or coupled either in series or parallel to the client 1100. A third display device 1115 is coupled to the client 1100 and may show another secondary multimedia file or provide another window that may be implemented within the presentation. For example, this third display device 1115 may show an HTML file(s) or be used as a text chat room that enables a viewer to communicate with someone in real time as the presentation is progressing.
[0089] It is important to note that some or all of these windows may be combined into a single web browser. Additionally, the present invention facilitates streaming the presentation to multiple viewers who are watching it concurrently. For example, the present invention provides a platform on which an instructor may teach multiple students located in different locations. Specifically, a streaming video file of the instructor is transmitted to each client at which a student is located. Other teaching material may be transmitted to these clients and synchronized to the streaming video. Thus, one skilled in the art will recognize a large number of different types of applications that may operate on this synchronized multimedia presentation platform.
[0090] While the present invention has been described with reference to certain embodiments, those skilled in the art will recognize that various modifications may be provided. Variations upon and modifications to the preferred embodiments are provided for by the present invention, which is limited only by the following claims.

Claims

We claim:
1. A networked presentation system that displays synchronized multimedia files within a presentation, the system comprising: a display device, coupled to a network, that displays synchronized multimedia files within a presentation; a video server, coupled to the network, that streams a video file across the network to the display device; a first web server, coupled to the network, that transmits a secondary multimedia file across the network to the display device; a second web server, coupled to the network, that transmits a synchronization file across the network to the display device; and a database, coupled to the web server, that stores the secondary multimedia file.
2. The networked presentation system of claim 1 wherein the video file is streamed from the video server during the presentation.
3. The networked presentation system of claim 1 wherein the secondary multimedia file is retrieved from the first web server before the presentation begins.
4. The networked presentation system of claim 1 wherein the synchronization file is retrieved from the second web server before the presentation begins.
5. The networked presentation system of claim 4 wherein the synchronization file uses a frame index to synchronize the secondary multimedia file to the video file.
6. The networked presentation system of claim 4 wherein the synchronization file uses a time index to synchronize the secondary multimedia file to the video file.
7. The networked presentation system of claim 1 wherein the first web server and the second web server are the same.
8. The networked presentation system of claim 1 further comprising a security access control, coupled to the network, that limits access to the video file.
9. The networked presentation system of claim 8 wherein a user is required to provide a password to access the video file.
10. The networked presentation system of claim 1 wherein a third web server transmits a response mechanism that allows real-time interaction between a user and the presentation.
11. The networked presentation system of claim 10 wherein the response mechanism is a graphical interface that the user may activate in response to the presentation.
12. The networked presentation system of claim 10 wherein the response mechanism is a recording device that stores the user's audio response to the presentation.
13. The networked presentation system of claim 10 further comprising a second database, coupled to the network, that stores the real-time interaction between the user and the presentation.
14. The networked presentation system of claim 1 wherein the display device comprises a computer monitor on which the video file and the secondary multimedia file are displayed.
15. The networked presentation system of claim 1 wherein the display device comprises a plurality of screens on which the video file and the secondary file are displayed.
16. A method for displaying a synchronized networked multimedia presentation, the method comprising: retrieving a primary multimedia file from a first remote location on a network; retrieving a secondary multimedia file from a second remote location on the network; displaying the primary multimedia file within a first display window connected to the network; synchronizing the display of the secondary multimedia file to the primary multimedia file; and displaying the secondary multimedia file within a second display window connected to the network.
17. The method of claim 16 wherein the primary multimedia file is a video file that is streamed from the first remote location during the presentation.
18. The method of claim 16 wherein the primary multimedia file is a video file that is retrieved from the first remote location before it is displayed in the first display window.
19. The method of claim 16 wherein the secondary multimedia file is retrieved from the first remote location before it is displayed in the second display window.
20. The method of claim 16 further comprising retrieving a synchronization file containing synchronization data for the presentation from a third remote location on the network.
21. The method of claim 20 wherein the second remote location and the third remote location are the same.
22. The method of claim 20 wherein the synchronization file is retrieved from the third remote location prior to the displaying of the presentation.
23. The method of claim 16 further comprising displaying text within a text message window that is coupled to the network.
24. The method of claim 23 wherein text displayed within the text message window is synchronized to the primary multimedia file.
25. The method of claim 16 further comprising displaying a presentation index within a presentation indices window that is coupled to the network.
26. The method of claim 25 wherein the presentation index is a time index relating to the primary multimedia file.
27. The method of claim 25 wherein the presentation index is a frame index relating to the primary multimedia file.
28. The method of claim 16 further comprising displaying a presentation control window that allows a user to directly control the presentation.
29. The method of claim 16 further comprising providing a response mechanism to a viewer of the presentation to allow real-time interaction between the presentation and the user.
30. The method of claim 29 wherein the response mechanism is a graphical user interface that the user may activate in response to the presentation.
31. The method of claim 29 wherein the response mechanism is a recording device that stores an audio response from the user.
32. The method of claim 29 further comprising storing the user interaction with the response mechanism within a database coupled to the network.
33. The method of claim 16 wherein the first display window and the second display window are shown within a computer monitor.
34. The method of claim 16 wherein the first display window is shown on a first screen and the second display window is shown on a second screen.
35. The method of claim 16 further comprising providing access security to the presentation to prevent unauthorized access of files relating to the presentation.
36. The method of claim 35 wherein the user is required to provide a password to access files relating to the presentation.
37. The method of claim 16 wherein a user accesses the presentation over a network through a uniform resource locator.
38. A synchronized multimedia presentation display comprising: a first display window that shows a primary multimedia file stored within a first remote location; a second display window that shows a secondary multimedia file stored within a second remote location; and a presentation indices window that shows a synchronization index describing a display timing relationship between the first multimedia file and the second multimedia file.
39. The presentation display of claim 38 wherein the primary multimedia file is a video file that is streamed from the first remote location during a presentation.
40. The presentation display of claim 38 wherein the primary multimedia file is a video file that is retrieved from the first remote location before it is displayed in the first display window.
41. The presentation display of claim 38 wherein the secondary multimedia file is retrieved from the second remote location before it is displayed in the second display window.
42. The presentation display of claim 38 wherein the synchronization index is retrieved from a third remote location before it is displayed in the presentation indices window.
43. The presentation display of claim 42 wherein the second remote location and the third remote location are the same location on the network.
44. The presentation display of claim 38 further comprising a text window that displays text during the presentation.
45. The presentation display of claim 44 wherein text within the text window is synchronized to the primary multimedia file.
46. The presentation display of claim 38 wherein the secondary multimedia file is indexed to the primary multimedia file by a frame index.
47. The presentation display of claim 38 wherein the secondary multimedia file is indexed to the primary multimedia file by a time index.
48. The presentation display of claim 38 further comprising a presentation control window that allows a user to directly control the presentation.
49. The presentation display of claim 38 further comprising a response mechanism within the presentation that allows real-time interaction between the presentation and the user.
50. The presentation display of claim 49 wherein the response mechanism is a graphical user interface that the user may activate in response to the presentation.
51. The presentation display of claim 49 wherein the response mechanism is a recording device that stores an audio response from the user.
52. The presentation display of claim 38 further comprising a security access window that limits access to the presentation.
53. The presentation display of claim 52 wherein a user is required to enter a password to access the presentation.
54. The presentation display of claim 38 further comprising a network address window that allows a user to access the presentation through a uniform resource locator.
PCT/US2002/000087 2001-01-04 2002-01-04 Synchronized multimedia presentation WO2002054192A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002237756A AU2002237756A1 (en) 2001-01-04 2002-01-04 Synchronized multimedia presentation

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US25999201P 2001-01-04 2001-01-04
US60/259,992 2001-01-04
US3936302A 2002-01-03 2002-01-03
US10/039,363 2002-01-03

Publications (2)

Publication Number Publication Date
WO2002054192A2 true WO2002054192A2 (en) 2002-07-11
WO2002054192A3 WO2002054192A3 (en) 2009-02-26

Family

ID=26716059

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/000087 WO2002054192A2 (en) 2001-01-04 2002-01-04 Synchronized multimedia presentation

Country Status (2)

Country Link
AU (1) AU2002237756A1 (en)
WO (1) WO2002054192A2 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1517328A1 (en) 2003-09-16 2005-03-23 Ricoh Company Information editing device, information editing method, and computer program product
WO2005029346A1 (en) * 2003-09-24 2005-03-31 Magian Design Studio Pty Ltd Method and system for management and publication of media assets in a distributed network
WO2007049999A1 (en) * 2005-10-26 2007-05-03 Timetomarket Viewit Sweden Ab Information intermediation system
US7519739B2 (en) 2003-08-26 2009-04-14 International Business Machines Corporation Synchronizing a client user interface with a server backend
US7907815B2 (en) 2003-04-23 2011-03-15 Lg Electronics Inc. Method and apparatus for synchronous reproduction of main contents recorded on an interactive recording medium and additional contents therefor
TWI419564B (en) * 2009-04-30 2013-12-11 Hon Hai Prec Ind Co Ltd System and method for storing video images
WO2014014963A1 (en) * 2012-07-16 2014-01-23 Questionmine, LLC Apparatus and method for synchronizing interactive content with multimedia
US20140129522A1 (en) * 2012-11-04 2014-05-08 International Business Machines Corporation Method for Synchronization and Management of System Activities with Locally Installed Applications
US9086788B2 (en) 2011-12-12 2015-07-21 International Business Machines Corporation Context-sensitive collaboration channels
US9124657B2 (en) 2011-12-14 2015-09-01 International Business Machines Corporation Dynamic screen sharing for optimal performance
US9134889B2 (en) 2011-12-14 2015-09-15 International Business Machines Corporation Variable refresh rates for portions of shared screens
US9582808B2 (en) 2011-12-12 2017-02-28 International Business Machines Corporation Customizing a presentation based on preferences of an audience
US9588652B2 (en) 2011-12-12 2017-03-07 International Business Machines Corporation Providing feedback for screen sharing
US10304412B1 (en) 2016-06-30 2019-05-28 Google Llc Device synchronization
CN114679595A (en) * 2022-02-23 2022-06-28 武汉华瑾科技有限公司 Multi-window processing method and system for interactive teaching information
WO2023211442A1 (en) * 2022-04-28 2023-11-02 Futurewei Technologies, Inc. Adaptive lecture video conferencing

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5966121A (en) * 1995-10-12 1999-10-12 Andersen Consulting Llp Interactive hypervideo editing system and interface
US5978835A (en) * 1993-10-01 1999-11-02 Collaboration Properties, Inc. Multimedia mail, conference recording and documents in video conferencing
US6204840B1 (en) * 1997-04-08 2001-03-20 Mgi Software Corporation Non-timeline, non-linear digital multimedia composition method and system
US6317794B1 (en) * 1997-11-12 2001-11-13 Ncr Corporation Computer system and computer implemented method for synchronization of simultaneous web views

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5978835A (en) * 1993-10-01 1999-11-02 Collaboration Properties, Inc. Multimedia mail, conference recording and documents in video conferencing
US5966121A (en) * 1995-10-12 1999-10-12 Andersen Consulting Llp Interactive hypervideo editing system and interface
US6204840B1 (en) * 1997-04-08 2001-03-20 Mgi Software Corporation Non-timeline, non-linear digital multimedia composition method and system
US6317794B1 (en) * 1997-11-12 2001-11-13 Ncr Corporation Computer system and computer implemented method for synchronization of simultaneous web views

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7907815B2 (en) 2003-04-23 2011-03-15 Lg Electronics Inc. Method and apparatus for synchronous reproduction of main contents recorded on an interactive recording medium and additional contents therefor
US7519739B2 (en) 2003-08-26 2009-04-14 International Business Machines Corporation Synchronizing a client user interface with a server backend
EP1517328A1 (en) 2003-09-16 2005-03-23 Ricoh Company Information editing device, information editing method, and computer program product
US7844163B2 (en) 2003-09-16 2010-11-30 Ricoh Company, Ltd. Information editing device, information editing method, and computer product
WO2005029346A1 (en) * 2003-09-24 2005-03-31 Magian Design Studio Pty Ltd Method and system for management and publication of media assets in a distributed network
WO2007049999A1 (en) * 2005-10-26 2007-05-03 Timetomarket Viewit Sweden Ab Information intermediation system
JP2009514326A (en) * 2005-10-26 2009-04-02 エガード、アニカ Information brokerage system
TWI419564B (en) * 2009-04-30 2013-12-11 Hon Hai Prec Ind Co Ltd System and method for storing video images
US9086788B2 (en) 2011-12-12 2015-07-21 International Business Machines Corporation Context-sensitive collaboration channels
US9582808B2 (en) 2011-12-12 2017-02-28 International Business Machines Corporation Customizing a presentation based on preferences of an audience
US9852432B2 (en) 2011-12-12 2017-12-26 International Business Machines Corporation Customizing a presentation based on preferences of an audience
US9600152B2 (en) 2011-12-12 2017-03-21 International Business Machines Corporation Providing feedback for screen sharing
US9588652B2 (en) 2011-12-12 2017-03-07 International Business Machines Corporation Providing feedback for screen sharing
US9124657B2 (en) 2011-12-14 2015-09-01 International Business Machines Corporation Dynamic screen sharing for optimal performance
US9131021B2 (en) 2011-12-14 2015-09-08 International Business Machines Corporation Dynamic screen sharing for optimal performance
US9134889B2 (en) 2011-12-14 2015-09-15 International Business Machines Corporation Variable refresh rates for portions of shared screens
US9141264B2 (en) 2011-12-14 2015-09-22 International Business Machines Corporation Variable refresh rates for portions of shared screens
US9535577B2 (en) 2012-07-16 2017-01-03 Questionmine, LLC Apparatus, method, and computer program product for synchronizing interactive content with multimedia
WO2014014963A1 (en) * 2012-07-16 2014-01-23 Questionmine, LLC Apparatus and method for synchronizing interactive content with multimedia
US20140129522A1 (en) * 2012-11-04 2014-05-08 International Business Machines Corporation Method for Synchronization and Management of System Activities with Locally Installed Applications
US8903780B2 (en) * 2012-11-04 2014-12-02 International Business Machines Corporation Method for synchronization and management of system activities with locally installed applications
US8903768B2 (en) * 2012-11-04 2014-12-02 International Business Machines Corporation Method and system for synchronization and management of system activities with locally installed applications
US10304412B1 (en) 2016-06-30 2019-05-28 Google Llc Device synchronization
CN114679595A (en) * 2022-02-23 2022-06-28 武汉华瑾科技有限公司 Multi-window processing method and system for interactive teaching information
WO2023211442A1 (en) * 2022-04-28 2023-11-02 Futurewei Technologies, Inc. Adaptive lecture video conferencing

Also Published As

Publication number Publication date
AU2002237756A8 (en) 2009-03-26
WO2002054192A3 (en) 2009-02-26
AU2002237756A1 (en) 2002-07-16

Similar Documents

Publication Title
US9584571B2 (en) System and method for capturing, editing, searching, and delivering multi-media content with local and global time
WO2002054192A2 (en) Synchronized multimedia presentation
US20040080528A1 (en) Systems and methods for presenting interactive programs over the internet
US7761507B2 (en) Networked chat and media sharing systems and methods
US7664813B2 (en) Dynamic data presentation
US8341683B2 (en) Convergence-enabled DVD and web system
US6557042B1 (en) Multimedia summary generation employing user feedback
US20030160813A1 (en) Method and apparatus for a dynamically-controlled remote presentation system
US20050154679A1 (en) System for inserting interactive media within a presentation
US20020077900A1 (en) Internet protocol-based interstitial advertising
US20040010629A1 (en) System for accelerating delivery of electronic presentations
US20050144258A1 (en) Method and system for facilitating associating content with a portion of a presentation to which the content relates
US20050022127A1 (en) Enhanced media player
US20110138282A1 (en) System and method for synchronizing static images with dynamic multimedia contents
EP0737930A1 (en) Method and system for comicstrip representation of multimedia presentations
US20070061326A1 (en) Receiving display station on a communication network for accessing and displaying network documents associated with a television program display in which the text stream of the TV program on the display station provides user selectable links to predetermined network source sites
Cisco Chapter 8: Managing Online Presentations
Cisco Managing Online Presentations
WO2002077750A2 (en) Authoring platform for synchronized multimedia presentation
CN114902690B (en) Multimedia system and method for multimedia playing platform
Li et al. Frame-buffer on demand: Applications of stateless client systems in web-based learning
JP5042061B2 (en) Character information display device, character information display method, and character information display program
Van Horn Looking ahead
US20110099483A1 (en) Website Recording of Reactions of a Designated User through interaction with characters
Leung et al. A Virtual Laboratory for an Online Web-Based Course-’Rapid E-business Systems Development’

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
NENP Non-entry into the national phase in:

Ref country code: DE

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase in:

Ref country code: JP