CLIENT PRESENTATION PAGE CONTENT SYNCHRONIZED TO A STREAMING DATA SIGNAL
Cross-Reference to Related Applications
This application claims priority to U.S. provisional application Serial No. 60/153,132, filed September 9, 1999. This co-pending application is incorporated herein by reference in its entirety.
Field of the Invention
The present invention relates generally to apparatus and methods for providing a synchronized client presentation page and, more specifically, to an apparatus and method for real-time synchronization of elements of a client presentation page to a streaming data signal having a video and/or audio component.
Background of the Invention
With the growth of networked computers, the Internet, intranets and other computer networks are being used to distribute and present information. Presentations over a computer network can be displayed to a user using a web page. A typical web page can contain many elements, including video and/or audio components. Operators, the creators of a web page presentation, can coordinate text and graphical elements of the web page with the video and/or audio components using known software tools. For example, software tools such as the Synchronized Multimedia Integration Language ("SMIL"), as supported by RealNetworks, Inc., and ePublisher by Avid Technology, Inc., allow an operator to predefine and prearrange text and graphics elements of the presentation page to be coordinated with the video portion of the presentation page. The coordination is all done prior to dissemination of the presentation page, using pre-recorded video.
Summary of the Invention
The present invention synchronizes the content (i.e., text and/or graphics) of a client presentation page with the video and/or audio signals in a real-time environment, allowing an operator to change the synchronization up until the video and/or audio components are sent to the client for presentation. The invention does not require a pre-timed presentation, nor a prerecorded video and/or audio portion, so that synchronization can be done using a live presentation. The invention relates to a method and apparatus for synchronizing content of a client presentation page to a streaming data signal, where the streaming data signal includes a video component and/or an audio component.
In one aspect, the invention relates to a method for synchronizing the content of a client presentation page to a streaming data signal which comprises a video component and/or an audio component. The method embeds one or more data events into the streaming data signal simultaneously with the generation of the data event, transmits the streaming data signal to a client and retrieves on the client, or from the server, data in response to each data event in the streaming data signal. The method further provides in the client presentation page a video and/or an audio component portion of the data signal as the data signal is processed, and simultaneously provides with the video and/or audio component portion in the client presentation page a representation of the data in response to each data event.
In another embodiment, this method generates one or more data events in response to user input. In another embodiment, this method contains the step of processing the streaming data signal on the client. In another embodiment, the streaming data signal contains an encoded
digital signal. In another embodiment, the video and/or audio component portion of the digital signal is associated with a live production. In another embodiment, the method of retrieving involves requesting from a server the data in response to each data event in the data signal and receiving the requested data at the client. In another embodiment, this method of retrieving also includes a process of receiving at the client, prior to receiving the data event, data that will be used for each data event in the data signal, storing at the client the received data for subsequent use, and retrieving from the client storage data in response to each data event in the data signal being processed on the client.
In another embodiment, the method for synchronizing involves storing the streaming data signal. In another embodiment, this method includes changing by a power user a source from which a video component and an audio component portion of the digital signal is received.
In another aspect, the invention involves a method for generating a data signal for synchronizing content of a client presentation page to a streaming data signal. This method of generating includes providing a digital signal containing a video component and/or an audio component, generating one or more data events in response to user input, embedding each data event into the digital signal substantially simultaneously with the generation of the data event to create a data signal and streaming to a client the data signal for providing in the client presentation page a video and/or an audio component portion of the data signal as the data signal is processed. The method further includes transmitting to a client data corresponding to each data event for substantially simultaneously providing with the video component and/or audio component portion in the client presentation page a representation of the data in response to each data event.
In another aspect, the invention relates to a method for presenting to a user a client presentation page synchronized to a streaming data signal. This method includes receiving from a server a streaming data signal, the data signal comprising at least one embedded data event and
a video component and/or an audio component, processing the data signal on the client and retrieving data in response to each data event in the data signal being processed on the client. The method further includes providing in the client presentation page the video and/or audio component portion of the data signal as the data signal is processed, and substantially simultaneously providing with the video and/or audio component portion in the client presentation page a representation of the data in response to each data event.
In another aspect, the invention relates to a system for synchronizing content of a client presentation page to a streaming data signal. The system includes a producer module for generating one or more data events in response to a user input, and a data signal generation module in communication with the producer module for receiving each data event and in communication with a data source for receiving a source signal comprising a video component and/or an audio component. The data signal generation module embeds the data event into a digital representation of the source signal substantially simultaneously with generation of the data event to create a digital data signal and streams the data signal to a client. The system further includes a web server module in communication with the producer module, the web server module transmitting to the client data corresponding to each embedded data event, and a client in communication with the web server module for receiving the corresponding data and in communication with the data signal generation module for receiving the data signal. The client processes the data signal, retrieves the corresponding data in response to each data event in the processed data signal, and provides in the client presentation page the video and/or audio component portion of the data signal as the data signal is processed and a representation of the corresponding data in response to each data event, thereby substantially simultaneously providing the representation with the video and/or audio component portion in the client presentation page. In another embodiment, the digital data signal comprises an encoded digital signal. In another embodiment, the video component and audio component portion of the data signal is associated
with a live production. In another embodiment, the client is configured for requesting from the web server module corresponding data in response to each data event in the processed data signal. In another embodiment, the client also includes a memory buffer for storing the received data. In another embodiment, the system includes an archive module for storing the streaming data signal. In another embodiment, the producer module is further configured to allow a power user to change receipt of the source signal from the data source to a second data source from which a second source signal is received. In another embodiment, the data source is a video switcher.
In another aspect, the invention relates to a server node for generating a data signal for synchronizing content of a client presentation page to a streaming data signal. The server node includes a producer module for generating one or more data events in response to a user input, a data signal generation module in communication with the producer module for receiving each data event and in communication with a data source for receiving a source signal comprising a video component and/or an audio component. The data signal generation module embeds the data event into a digital representation of the source signal substantially simultaneously with generation of the data event to create a digital data signal and streams the data signal to a client for display of the video and/or audio component portion of the digital data signal in the client presentation page as the digital data signal is processed by the client. The server node further includes a web server module in communication with the producer module, the web server module transmitting to the client data corresponding to each embedded data event for display of a representation of the data in the client presentation page in response to each data event, so that the representation is substantially simultaneously displayed with the video and/or audio component portion.
In another aspect, the invention relates to a client node presenting to a user a client presentation page synchronized to a streaming data signal. The client node includes a client in
communication with a web server module for receiving corresponding data and in communication with a data signal generation module for receiving a digital data signal comprising at least one embedded data event and at least one video component and/or audio component. The client processes the digital data signal, retrieves the corresponding data in response to each data event in the processed digital data signal, and displays in the client presentation page the video and/or audio component portion of the digital data signal as the digital data signal is processed, and a representation of the corresponding data in response to each data event, thereby substantially simultaneously displaying the representation with the video and/or audio component portion in the client presentation page.
Brief Description of the Drawings
The foregoing and other objects, features and advantages of the present invention, as well as the invention itself, will be more fully understood from the following description of preferred embodiments, when read together with the accompanying drawings, in which:
FIG. 1a is a high level block diagram of an embodiment of the invention.
FIG. 1b is a high level block diagram of an embodiment of a display produced by the invention.
FIG. 2 is a high level block diagram of another embodiment of the invention.
FIG. 3a is a screen shot of an embodiment of a configuration of a control panel used to control a video switcher, according to the invention.
FIG. 3b is a screen shot of an embodiment of a configuration of a control panel used to control the embedding of data events, according to the invention.
FIG. 4 is a screen shot of an embodiment of a graphical interface used to create a synchronized client presentation page produced by the invention.
FIG. 5 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
FIG. 6 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
FIG. 7 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
FIG. 8 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
FIGS. 9a and 9b are screen shots of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention.
FIG. 9c is a screen shot of an embodiment of a survey created by an operator for use in a synchronized client presentation page produced by the invention.
FIG. 9d is a screen shot of an embodiment of a survey result displayed in a synchronized client presentation page produced by the invention.
FIG. 10 is a screen shot of an embodiment of a graphical interface used to enter data to create a synchronized client presentation page produced by the invention with a customized layout.
Detailed Description of Preferred Embodiments
In broad overview, FIGS. 1a and 1b depict an exemplary embodiment of a multimedia presentation system 10. FIG. 1a depicts a partial page regeneration system 10 that includes a first computing system ("server node") 14 in communication with a second computing system ("client node") 18 over a network 22. The network 22 can be a local-area network (LAN), such as a company intranet, or a wide-area network (WAN), such as the Internet or the World Wide Web. The server node 14 and the client node 18 can be connected to the network 22 through a variety of connections including standard telephone lines, LAN or WAN links (e.g., T1, T3, 56 kb, X.25), broadband connections (ISDN, Frame Relay, ATM), and wireless connections. The connections to the network 22 and among the various modules can be established using a variety of communication protocols (e.g., TCP/IP, IPX, SPX, NetBIOS, Ethernet, RS232, and direct asynchronous connections).
As depicted in FIG. 1b, the presentation system 10 displays to a user a synchronized client presentation page 26 on the display 30 at the client node 18. The synchronized client presentation page 26 includes several elements 34a, 34b, 34c, 34d, 34n (referred to generally as elements 34). The number of elements 34 varies depending on what an operator wants to display to a user. For example, the client presentation page 26 can be what is generally referred to as a web page. The client presentation page 26 can be written in any format comprehensible by the client node 18 including, for example, HTML, XML, VRML, WML, (Display) PostScript and nroff.
Each element 34 represents something that is displayed to the user. An element 34 can be, for example, a picture, an image, a graphic, a title and/or a character string. An element 34 also can be, for example, a slide for Microsoft® PowerPoint® (manufactured by Microsoft Corporation of Redmond, WA), a spreadsheet for Microsoft® Excel (also from Microsoft Corporation), an HTML-encoded page or the display of a chat session.
The client presentation page 26 also includes a video component 38 and/or an audio component. The audio component is presented to the user through an audio player 42. The audio player 42 represents any type of device and/or software that converts a digital signal to a signal audible to the user. The video component 38 and/or the audio component (e.g., provided through the audio player 42) are used to synchronize the other elements 34 of the client presentation page 26 and are referred to generally as the synchronizing component(s). The video component 38 is shown separately to distinguish it as the synchronizing component. Note, however, that other elements 34 can be video and/or audio signals and can even be streamed signals.

The server node 14 (FIG. 1a) includes a producer module 62, a data signal generation module 66 and a web-server module 70. The server node 14 and the three modules 62, 66, 70 are a combination of hardware and software. The implementation of the functions described for each can vary depending on the tools the user has available to implement the described functions. Particular implementations discussed herein are for exemplary purposes only and do not limit the possibilities of implementation. In addition, the three modules 62, 66, 70 need not be physically located near one another or within hardware designated as the server. The three modules 62, 66, 70 can be located anywhere throughout the computer network, in electrical communication with one another. The term node indicates a logical unit rather than a physical unit.

The producer module 62 coordinates synchronization of text/graphic data with audio and/or video signals. The producer module 62 is in electrical communication with the data signal generation module 66. The communication can be implemented using any communication protocol (e.g., TCP/IP), based on the requirements imposed by the data signal generation module 66.
The producer module 62 transmits control signals to the data signal generation module 66 to control the conversion of the video and/or audio input signal 58 to the streaming data signal 50.
The producer module 62 generates data events. Each data event represents the desired state of an element 34 (FIG. lb) at the particular point in the video/audio at which the data event is inserted. The producer module 62 transmits each data event to the data signal generation module 66 as it is generated to create real-time synchronization. The producer module 62 also is in communication with the web-server module 70.
The communication can be implemented using any communication protocol (e.g., TCP/IP) based on the requirements imposed by the web-server module 70. The producer module 62 transmits control signals to the web-server module 70 to control the access of client nodes 18 to the stored data needed for the presentation of the client presentation page 26. The producer module 62 also ensures that the data needed for the other elements 34 of the client presentation page 26 is stored on the web-server module 70, so that client nodes 18 can retrieve data as the data is needed to respond to data events.
The data signal generation module 66 of the server node 14 generates the streaming data signal 50. The data signal generation module 66 receives a video and/or audio input signal 58. The video and/or audio input signal 58 is the synchronizing component, i.e., the signal to which the other elements 34 of the client presentation page 26 are synchronized. The video and/or audio input signal 58 can be an analog signal or a digital signal. If the video and/or audio input signal 58 is an analog signal, the data signal generation module 66 converts the analog input into a digital signal. This conversion process can be done by using A/D hardware (e.g.,
the Osprey® 200 video capture card manufactured by ViewCast.com, Inc. of Dallas, Texas) or software that converts analog video and/or audio signals to digital signals.
Further, the digital signal can be compressed using known compression codecs (e.g., RealVideo G2 with SVT from RealNetworks, Inc. or any of the MPEG standards developed by MPEG, an ISO working group). The control signals received from the producer module 62 instruct the data signal generation module 66 in the encoding process. For example, the producer
module 62 converts into control signals the user inputs in the encoder settings box 156 (FIG. 10), such as the video quality, audio format, frame rate and data transmission rate.
The data signal generation module 66 also receives data events from the producer module 62. In another embodiment, the data signal generation module 66 receives data events from a third-party source (not shown). For example, if the presentation is a sporting event, a database with the scores of the event can transmit new scores as they change, as a text-type data event, to the data signal generation module 66. Similarly, as another example, if the presentation is an auction, a database with the prices of items for auction can transmit new prices as they change, as a text-type data event, to the data signal generation module 66. As the data events are received from the producer module 62 or a third-party source
(not shown), they are embedded into the digital version of the video and/or audio input signal 58. The data event is embedded based on the encoding used by the data signal generation module 66. For example, if the digital version of the video and/or audio input signal 58 is being compressed using a RealVideo codec, the data event is embedded after the compression process. The data signal generation module 66 embeds each data event in the digital signal as it is received from the producer module 62. Thus, the data event is synchronized to the video and/or audio input signal 58 by embedding the data event at nearly the exact location in the digital version of the video and/or audio input signal 58 with which the data event is intended to coincide. The data signal generation module 66 streams the digital signal, with embedded data events, to the client node 18 using known streaming server software and/or hardware (e.g., the RealNetworks G2 server). In one embodiment, the data signal generation module 66 includes an encoder server (not shown) for the conversion and compression and a streaming server to stream the streaming data signal to the client node 18.
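The embedding described above can be sketched as follows. This is an illustrative simplification, not any actual streaming container format: the `DataEvent` fields and the byte-offset bookkeeping in `StreamingSignal` are assumptions introduced for the example.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DataEvent:
    # Hypothetical fields: the frame identifier, event type, and
    # additional data described elsewhere in this specification.
    frame_id: int
    event_type: str
    payload: str

@dataclass
class StreamingSignal:
    """Sketch of an encoded stream with events embedded at byte offsets."""
    encoded: bytearray = field(default_factory=bytearray)
    events: List[Tuple[int, DataEvent]] = field(default_factory=list)

    def append_media(self, chunk: bytes) -> None:
        # Encoded (and possibly compressed) video/audio accumulates here.
        self.encoded.extend(chunk)

    def embed_event(self, event: DataEvent) -> int:
        # The event is embedded at the current end of the encoded stream,
        # i.e. at the point in the video/audio being generated right now,
        # which is what synchronizes it to the live signal.
        offset = len(self.encoded)
        self.events.append((offset, event))
        return offset

signal = StreamingSignal()
signal.append_media(b"\x00" * 1024)       # first block of encoded video
where = signal.embed_event(DataEvent(2, "image", "bottom_bar_right.gif"))
signal.append_media(b"\x00" * 1024)       # encoding continues afterwards
```

Because the event is attached at the moment of generation rather than at a pre-timed position, no pre-recorded or pre-timed signal is required.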
The client node 18 includes a display 30 to display the synchronized client presentation page 26 and a data signal processing module 74. To receive the synchronized client presentation page 26, the client node 18 establishes a connection with the streaming server of the data signal generation module 66. In one embodiment, the client node 18 and the data signal generation module 66 communicate using the Real Time Streaming Protocol ("RTSP"), a protocol commonly used to transmit true streaming media to one or more viewers simultaneously. RTSP provides for viewers randomly accessing the stream and uses the Real-time Transport Protocol ("RTP") as the transport protocol. RTP is designed to deliver live media to one or more viewers simultaneously. In this embodiment, this protocol is a true streaming protocol (i.e., the protocol matches the bandwidth of the media signal to the viewer's connection, so that the media is always seen in real time), which is why one of the control signals is the data transfer rate.
In another embodiment, progressive download techniques (e.g., QuickTime's "fast start" feature) can be used. Progressive download allows the user to view the file as it is being downloaded. Progressive download files do not adjust to match the bandwidth of the user's connection like a "true streaming" format. Progressive download is also called "HTTP Streaming" because standard HTTP servers can deliver progressive download files, and no
special protocols are needed.
When the client node 18 starts receiving the streaming data signal 50, the data signal processing module 74 processes the data events embedded in the streaming data signal 50. The data events can be removed in a number of ways, depending on the software the client is executing for displaying video signals. For example, RealPlayer by RealNetworks, Inc. removes the data events from the streaming data signal and transmits the data events to the data signal processing module 74. The data signal processing module 74 can include a typical web browser and additional code (e.g., Java) that instructs the web browser to retrieve data files from the web-server module 70, using the data that is included in the data event.
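The client-side handling can be sketched as follows: events are separated from media, and the data for each event is retrieved either from local client storage (if it was transmitted ahead of time) or from the web-server module. The `fetch_from_server` callable, the dictionary-based event chunks, and the cache layout are assumptions made for this illustration.

```python
def split_stream(chunks):
    """Separate media chunks from embedded data events. A player such as
    RealPlayer performs this separation internally; this is a sketch."""
    media, events = [], []
    for chunk in chunks:
        (events if isinstance(chunk, dict) else media).append(chunk)
    return media, events

class DataRetriever:
    """Retrieve the data for an event, preferring data already stored on
    the client and falling back to a request to the web-server module."""
    def __init__(self, fetch_from_server):
        self.fetch = fetch_from_server   # hypothetical callable, e.g. an HTTP GET
        self.cache = {}

    def preload(self, key, data):
        self.cache[key] = data           # data pushed ahead of its event

    def retrieve(self, key):
        if key not in self.cache:
            self.cache[key] = self.fetch(key)   # connect, fetch, disconnect
        return self.cache[key]

media, events = split_stream([b"\x00" * 512,
                              {"frame": 2, "type": "image", "data": "slide6.gif"},
                              b"\x00" * 512])
calls = []
retriever = DataRetriever(lambda key: calls.append(key) or f"<contents of {key}>")
retriever.preload("slide1.gif", "<contents of slide1.gif>")
first = retriever.retrieve("slide1.gif")        # served from client storage
second = retriever.retrieve(events[0]["data"])  # fetched from the web server
```

The cache-first lookup corresponds to the embodiment in which data is transmitted to the client before the data event is processed.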
In response to each data event, if necessary, the client node 18 establishes a connection to the web-server module 70 of the server node 14 using any network communication protocol (e.g., TCP/IP). The client node 18 retrieves the data 54 needed for the data event and displays the graphical representation of the data in the synchronized client presentation page 26. Once the data requested has been received, the connection is terminated. When new data is required, the connection is reestablished. In another embodiment, the data is transmitted to the client node 18 before the client node 18 processes the data event. In this embodiment, the client node 18 does not establish a connection with the web-server module 70 to retrieve data. The client node 18 instead retrieves the data from a storage location within the client node 18. FIG. 2 depicts another embodiment of the presentation system 10'. The server node
14' further includes a data source 88, a control panel 80 and an archive module 84. The data source 88 is in communication with data signal generation module 66 for transmitting a video and/or audio input signal 58 and in communication with the producer module 62 for receiving control signals. The data source 88 is a device that receives multiple video and/or audio signals as inputs and creates an output signal to be used as the video and/or audio input signal 58. In one embodiment, the data source 88 selects the input to use based on the control signals from the producer module 62.
The data source 88 can be, for example, a microphone, a compact disk ("CD") and/or an audio mixer. The data source 88 also can be, for example, a video camera, a video tape recorder ("VTR") and/or a video switcher. For example, an ECHOlab 5000 video switcher (manufactured by e-StudioLIVE, Inc. of Chelmsford, Massachusetts) can be used to allow the operator to add special effects to the video signal. A data source 88 allows the operator to select from a number of video inputs.
For example, the video inputs can be cameras at various angles of an item being displayed for an auction. The data source 88 selects from the various inputs and transmits an
output that is received by the data signal generation module 66 and streamed to the client node 18. Thus, switching can be done at any time, and it is not necessary to rebuffer any video files on the client node 18 before the switchover can occur. The switchover is seamless and goes unnoticed by the user at the client node 18. Since the data signal generation module 66 creates synchronization by embedding a data event as the streaming data signal 50 is generated, the other elements 34 of the client presentation page 26 are synchronized to the new video signal input selected, without any interruption.
The control panel 80 is in electrical communication with the data source 88 and the producer module 62 for transmitting control signals in response to input by an operator. In one embodiment, the data source 88 selects the video input to use based on the control signals from the control panel 80. In another embodiment, the control panel 80 uses control signals to create special effects on the video input. In one embodiment, the control panel 80 can be a configurable input device, for example, the ECHOlab Commander by e-StudioLIVE, Inc. As shown in FIG. 3a, the control panel 80 can be configured with commands to control the data source 88 when the data source 88 is a video switcher. Each button 81 is configured to send a command to the video switcher (i.e., data source 88) to create a special effect. As shown in FIG. 3b, the control panel 80' also can be configured to control the insertion of predefined data events into the streaming data signal 50. In this configuration, each of the buttons 81' can represent a desired state of the other elements 34 of the client presentation page 26. The operator presses a button 81' at the point of the video presentation at which to change the current state of an element 34 to the state represented by the button. Pressing the button causes the producer module 62 to generate a data event to effect the desired state represented by the button.
The archive module 84 is in communication with the data signal generation module 66. The archive module 84 captures the streaming data signal 50 as it is transmitted to the client node 18. The archive module 84 stores the captured streaming data signal 50 in a storage
medium. A user may subsequently retrieve the stored streaming data signal 50 and watch the synchronized client presentation page 26, as it appeared during the original transmission. Users can view chat sessions and surveys of the archived presentation as they appeared in the original presentation. In other embodiments, the original chat sessions and surveys can be replaced or augmented with chat sessions and surveys of the users accessing the archived streaming data signal 50.
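Because the data events are stored in the captured signal at the positions where they were embedded, a replay can reproduce the original synchronization without any separate timing track. A minimal sketch, assuming the archive records events as `(byte offset, event)` pairs alongside the captured media:

```python
def replay_timeline(total_bytes, recorded_events):
    """Interleave archived media with its recorded events, firing each
    event at the byte offset where it was originally embedded, so the
    replayed page appears as it did during the original transmission."""
    items, pos = [], 0
    for offset, event in sorted(recorded_events):
        if offset > pos:
            items.append(("media", offset - pos))  # media up to the event
            pos = offset
        items.append(("event", event))
    if pos < total_bytes:
        items.append(("media", total_bytes - pos))
    return items

timeline = replay_timeline(4096, [(1024, "show slide 6"),
                                  (3072, "show survey")])
```

The same mechanism allows the archived chat sessions and surveys to reappear at their original points, or to be replaced by live ones for users viewing the archive.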
The streaming data signal 50 contains the synchronizing component (i.e., the video and/or audio input signal 58) and data events. The producer module 62 generates the data events. Before the producer module 62 can generate data events to embed into streaming data signal 50, the operator must define them.
FIG. 4 depicts a screenshot of an exemplary embodiment of a graphical interface generated by the producer module 62 that the operator uses to predefine the desired elements 34 of the client presentation page 26. The operator has access to the producer module 62 using an input device (not shown) in communication with the producer module 62 (e.g., a keyboard and mouse of a personal computer). In this graphical interface, the client presentation page 94 has been predefined with five elements 100, 101, 102, 103, 104. The synchronizing component is a video (with audio) signal displayed in element 101. In one embodiment, the elements 100, 101, 102, 103, 104 are defined to the client node 18 using HTML frame tags. The operator creates a predefined list (e.g., a playlist 90) of the desired states of the elements 100, 101, 102, 103, 104 that the operator wants to use as part of the client presentation page 94 during the video presentation.
The playlist 90 in FIG. 4 has two defined events labeled "First Event" and "Second Event." These events represent the two desired states the operator uses to create his/her synchronized client presentation page 94. In the first event, frame 0, which corresponds to element 100, displays the string "This is some text". In frame 2, which corresponds to element
102, the image bottom_bar_right, stored as a GIF file, is displayed. In the second event, frame 0, which corresponds to element 100, displays the string "Second Event Text". In frame 4, which corresponds to element 104, the homepage associated with the listed URL http://www.e-studiolive.com is displayed. Since frame 2 is not changed in the second event, it continues displaying the image from the first event. The first and second events of the playlist 90 also include a command "pause 30." This command instructs the producer module 62 that, after a 30-second pause, it should generate the next data event. Without the pause command, the producer module 62 waits until the operator instructs it to generate an event. Even with the pause command, the operator can override the automatic generation of a data event; the operator has ultimate control over the data events that are generated. However, absent intervention, the pause command causes the producer module 62 to generate data events automatically.
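The playlist semantics described above — a frame keeps its content until a later event changes it — can be sketched as follows, using the two events of FIG. 4. The dictionary/tuple layout is an assumption for illustration; the specification does not prescribe a storage format.

```python
playlist = [
    {"name": "First Event", "pause": 30,
     "actions": [(0, "text", "This is some text"),
                 (2, "image", "bottom_bar_right.gif")]},
    {"name": "Second Event", "pause": 30,
     "actions": [(0, "text", "Second Event Text"),
                 (4, "url", "http://www.e-studiolive.com")]},
]

def page_state(playlist, upto):
    """Apply events in order; a frame not touched by a later event
    (here frame 2 in the second event) keeps its previous content."""
    state = {}
    for event in playlist[:upto + 1]:
        for frame, kind, data in event["actions"]:
            state[frame] = (kind, data)
    return state

after_second = page_state(playlist, 1)   # state after the second event
```

Note that `after_second` still carries the GIF in frame 2, matching the behavior described for the second event.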
As an example, an operator is a professor giving a live lecture. The professor has 16 slides to use in aiding discussion, each stored on the web-server module 70 as a GIF image file. The live presentation of the professor is displayed in element 101. The slides are presented in element
102. A chat session for students' questions is presented in element 103. Since the professor does not know in advance how long he needs to display each slide, or whether the order may change based on student questions, the professor defines a playlist with 16 events, one for each slide, without using the pause feature. The professor also uses a configurable control panel 80, where 16 of the keys are configured to represent each event. As the professor is giving his lecture, he presses the button on the control panel 80 representing the slide he wants displayed. While discussing the current slide, a student types in the chat session that he does not understand how the variable in the displayed equation was determined. Upon reading the question, the professor states that the variable in question was determined from the set of initial conditions. The
professor presses the button on the control panel 80 to display slide six, on which the calculation of variable in question is shown.
When the professor presses the button on the control panel 80, the control panel 80 transmits a control signal representing the button pressed to the producer module 62. Upon receiving that control signal, the producer module 62 generates a data event representing event six, which, according to the playlist, displays the sixth slide. The producer module 62 transmits the data event to the data signal generation module 66, which embeds the data event into the streaming data signal 50 being streamed to the client node 18. The data signal processing module 74 of the client node 18 obtains the streaming data signal 50 from the server node 14 and extracts the embedded data event. In response to that data event, the data signal processing module 74 retrieves the data needed for the data event, in this case slide six, from the storage buffer of the client node 18 or, if not there, from the web-server module 70. The client node 18 displays the representation of the retrieved data, the sixth slide, on the display 30. While the data signal processing module 74 is processing the data event and retrieving the associated data, it is also decompressing the video signal and displaying the video on the display 30. The result is that a representation of the data associated with the data event is retrieved and displayed at substantially the same time as the video. By placing the data event in the streaming data signal 50 at the server node 14 where the operator wants it to occur, the data event, and thus the display of the representation of the associated data, is synchronized to the video signal.
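The client-side handling described above can be sketched in a few lines. This is a minimal sketch under stated assumptions: the function and variable names (handle_data_event, storage_buffer, fetch_from_web_server) are hypothetical, and the network fetch is a stand-in.

```python
# Illustrative sketch of the client-side dispatch described above: on
# receiving an embedded data event, the data signal processing module
# looks for the referenced data in a local storage buffer first, and
# falls back to fetching it from the web-server module.

storage_buffer = {}  # client-side cache: file name -> data

def fetch_from_web_server(file_name):
    # Stand-in for a network request to the web-server module.
    return f"<contents of {file_name}>"

def handle_data_event(frame_id, file_name, display):
    """Retrieve the event's data (cache first, then web-server) and
    display it in the indicated frame."""
    if file_name in storage_buffer:
        data = storage_buffer[file_name]
    else:
        data = fetch_from_web_server(file_name)
        storage_buffer[file_name] = data
    display(frame_id, data)

shown = []
handle_data_event(2, "slide_six.GIF", lambda f, d: shown.append((f, d)))
```

In the actual system this dispatch would run concurrently with video decoding, so the retrieved data appears at substantially the same time as the corresponding video.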
The data events inserted into the streaming data signal 50 include a frame identifier, a data event type and additional data. The frame identifier indicates to the data signal processing module 74 the frame of the client presentation page 26 in which the representation of the data should be displayed. The data event type indicates to the data signal processing module 74 what additional data parameters will follow. Data event types can include, for example, titles, text events, survey events, links, pictures and HTML files. The additional data are the parameters needed to display a particular data type; the parameters vary with each data event type. For example, the data event type can be a picture type event. When creating a picture event for the playlist 90, the operator enters the file name 170 (FIG. 7) that holds the data of the image, the scale of the picture when displayed in the frame 174 (FIG. 7), and the position of the picture in the frame 178 (FIG. 7). In a picture data event, the associated data of the data event includes the parameters for the file name, the scale and the position. In the previous professor example, the frame identifier is two (i.e., element 102) and the data event type is a picture. The additional data includes the file name (slide_six.GIF) and the scale and position parameters: display the picture at 95% of the frame, centered both horizontally and vertically in the frame.
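The three-part structure of a data event (frame identifier, event type, type-specific additional data) can be illustrated with a simple encoding. The JSON field layout below is an assumption made for illustration; the invention does not specify a wire format.

```python
# Minimal sketch of one possible encoding of the data event described
# above: a frame identifier, a data event type, and then the additional
# parameters that the type implies. JSON is used here only for clarity.

import json

def make_picture_event(frame_id, file_name, scale, position):
    return json.dumps({
        "frame": frame_id,      # where on the client page to display
        "type": "picture",      # tells the client which parameters follow
        "file": file_name,
        "scale": scale,         # percent of the frame
        "position": position,   # e.g., centered horizontally/vertically
    })

# The professor example: slide six shown in frame 2 at 95%, centered.
event = make_picture_event(2, "slide_six.GIF", 95, "center/center")
decoded = json.loads(event)
```

A title or link event would carry different additional parameters after the same frame-identifier and type fields, which is why the type must precede them.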
Some data event types (e.g., title, link) are self-contained (i.e., they do not include a file name but instead include all the necessary data). For example, the data event type can be a title type event. When creating a title event for the playlist 90, the operator enters the text of the title 126 (FIG. 6), the font of the title 130 (FIG. 6), the colors of the text and the background of the title 134 (FIG. 6), and the position of the title in the frame 138 (FIG. 6). In a title data event, the associated data of the data event includes the parameters for the text, the font, the text and background colors and the position. For this data type no file is retrieved because the text to be displayed, along with the parameters describing how to display it, is included in the data event itself. In another embodiment, as hardware and/or software capacity of the multimedia presentation system 10 allows, the file name in the additional data for all of the data event types can be replaced with the file itself. In this embodiment, all the data needed to create the desired states of the other elements 34 of the client presentation page 26 is included as a data event in the streaming data signal 50, and the client node 18 does not need to retrieve any data from the web-server module 70.
To assist an operator in creating a playlist, the producer module 62 generates various graphical user interfaces ("GUI"). FIG. 5 depicts a GUI to create an event. The operator enters a description of the event in the description box 110, so that during a presentation, the operator can quickly identify which event to select. The operator can optionally select a pause time by checking the define pause box 114 and entering a time in the time box 118. When the event is complete, the operator clicks on the OK box 122 and the event is added to the playlist. Once an event is defined, the operator can define the desired state of each of the elements 34 for that event.
FIG. 6 depicts a GUI to create an element that is a text string. The operator enters the string in the text box 126. The operator can select font characteristics 130 of the text string, text and background colors 134, and the position 138 of the text within an element 34. In one embodiment, the producer module 62 has a predetermined layout, as depicted in FIG. 4, of 5 elements 100, 101, 102, 103, and 104. An operator using this GUI can select one of the 4 elements 100, 102, 103, 104 in which to display the text string by checking the location in the where box 142. Only four of the five elements can be chosen because element 101 is reserved for the video synchronizing component. Finally, if the operator wants to add a hyperlink to the text string, the operator can check the link box 146 and enter a URL of the link in the address box 150. If a user viewing a presentation selects the hyperlink, the browser of the client node 18 opens a separate window with the client presentation page of the hyperlink. When the text string is complete, the operator clicks on the OK box 122 and the text string is added to the playlist under the selected event.
Similarly, the producer module 62 generates GUIs to create elements containing an image (e.g., FIG. 7), HTML code (e.g., FIG. 8) or a survey (e.g., FIG. 9). The producer module 62 uses the data entered through the GUI to generate an associated data event. Through the GUI, the operator enters the desired frame, the file or text string to be used, and any additional data to aid in the display of the file.
Creating a survey requires slightly more data. In creating a survey, the producer module 62 generates a GUI (e.g., FIG. 9a) to allow entry of the questions and a GUI (e.g., FIG. 9b) to allow entry of the choice of answers. The producer module 62 also generates a view (e.g., FIG. 9c) of the survey to show the operator how the survey is presented to a user on the client node 18. The survey has two events associated with it: one presents the survey to the user; the other displays the results of the survey. FIG. 9d depicts a view of the results of a survey presented to users after those users were presented with and answered the questions of the survey.
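The two survey events described above can be sketched as follows. All structures and names here are illustrative assumptions; the percentage tally is one plausible way to prepare the results view of FIG. 9d.

```python
# Hedged sketch of the two survey events described above: a first event
# presents the question and answer choices; a second event presents the
# tallied results after users have answered.

def make_survey_event(question, choices):
    """Event one: present the survey to the user."""
    return {"type": "survey", "question": question, "choices": choices}

def make_results_event(question, votes):
    """Event two: display the results as percentages of all answers."""
    total = sum(votes.values())
    percentages = {choice: round(100 * n / total)
                   for choice, n in votes.items()}
    return {"type": "survey_results", "question": question,
            "results": percentages}

survey = make_survey_event("Was the initial-conditions step clear?",
                           ["Yes", "No"])
results = make_results_event(survey["question"], {"Yes": 30, "No": 10})
```

Because the two phases are separate events, the operator controls when the results appear, keeping them synchronized with the live video discussion.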
The layout of the client presentation page 26 (i.e., the placement of the elements 34, including the video component 38) is controlled using a frameset file written in HTML. Even though in one embodiment the producer module 62 has a predetermined layout (e.g., client presentation page 94, FIG. 4), the producer module 62 provides a GUI to allow the operator to customize the layout. FIG. 10 depicts an example of a GUI that includes a show data tab 160. When the operator selects this tab 160, the operator has access to a custom layout button 164. When the operator selects this button 164, the producer module 62 generates a custom options box 168 for further information. The operator enters a template of his/her custom layout. If the custom layout template references any other files, those files must also be entered. The list of additional files is needed so that the producer module 62 can verify that all of the files needed to transmit the synchronized client presentation page 26 to the client node 18 are located on the web-server module 70. The custom layout template is a frameset file written in HTML.
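The verification step described above can be sketched as follows. The sample frameset, frame names, and regex-based extraction are illustrative assumptions, not the producer module's actual implementation.

```python
# Sketch of the verification described above: the producer module
# collects the files referenced by a custom HTML frameset template so it
# can confirm they are present on the web-server module before use.

import re

frameset_template = """
<frameset rows="80%,20%">
  <frameset cols="70%,30%">
    <frame name="video" src="video.html">
    <frame name="slides" src="slides.html">
  </frameset>
  <frame name="chat" src="chat.html">
</frameset>
"""

def referenced_files(template):
    """Return the set of file names the frameset refers to."""
    return set(re.findall(r'src="([^"]+)"', template))

def missing_files(template, files_on_server):
    """Files the template needs that are not on the web-server module."""
    return referenced_files(template) - set(files_on_server)

missing = missing_files(frameset_template, ["video.html", "chat.html"])
```

The frame names in the template ("video", "slides", "chat" here) are also what would populate the "Where" dropdown when the operator later places data events into a custom layout.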
Once the operator has designed a frameset, the operator can insert show elements, such as titles, pictures, links and surveys, in the same way as for a predetermined layout. The operator can simply right-click in the playlist at the point where a data event is to be inserted and define the data event using one of the GUIs described above (e.g., FIG. 6, FIG. 7, FIG. 8, FIG. 9). One difference is that, instead of selecting the frame in which to place the data event by clicking in a diagram of the frames in the where box 142 (FIG. 6), the operator uses a dropdown box labeled "Where" that lists all frames by the names given in the frameset file template. The operator selects the frame by highlighting and clicking the name of the frame in the dropdown list.
For another illustrative example of a presentation, the synchronizing component can be a video component 38 that is a live broadcast of a sales agent webcasting a sales pitch for the latest product, the super widget, over the Internet. Element a 34a contains the registered trademark of the company. Element c 34c contains the name of the sales agent and all of his/her contact information (e.g., phone number, fax number and e-mail address). These two elements are displayed at the beginning of the live broadcast and do not change during the sales pitch. Element n 34n contains a chat session that displays user input. The users use the chat session to ask the sales agent questions during the sales pitch, so the sales agent can tailor the live broadcast to the needs of the users.
Element b 34b displays many different items to aid the sales agent. During some points of the sales pitch, the element 34b displays Microsoft® PowerPoint® slides (PowerPoint is manufactured by Microsoft Corporation of Redmond, WA) portraying features of the super widget. At other points, the element 34b displays still photographs of the super widget in different colors. At still other points, the element 34b contains video of the super widget being used. The invention allows the sales agent to use any of these aids (e.g., PowerPoint® slides, still photographs, video) at any point during the sales pitch. To be most effective, and since the invention allows real-time synchronization, the sales agent uses the chat session as a guide and displays the aids in response to the reaction of the users. The sales agent (or the operator) synchronizes the display of the aids with his/her live sales pitch as he/she is talking. Element d 34d displays an image of a bar graph with an entry for yes and an entry for no. During the sales pitch, the sales agent surveys the users with yes or no questions and, as the results are received from the users, they are displayed on the graph. In other sales pitches, the sales agent has used multiple choice questions with four or five answers to choose from. Again, the display of the results is synchronized with the video presentation of the sales agent so he/she can discuss the results as they are presented to the users.
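The live yes/no tally described above can be sketched in a few lines. This is an illustrative sketch only; the names and the text-based bar rendering are assumptions standing in for the graphical bar graph of element 34d.

```python
# Illustrative sketch of the live tally described above: as yes/no
# answers arrive from users, counts are updated and re-rendered so the
# displayed results stay synchronized with the live presentation.

from collections import Counter

def tally(answers):
    """Count the answers received so far."""
    return Counter(answers)

def bar(label, count):
    """Render one text-based bar for a label and its count."""
    return f"{label}: " + "#" * count

counts = tally(["yes", "yes", "no", "yes"])
rows = [bar("yes", counts["yes"]), bar("no", counts["no"])]
```

In the actual system the counts would be redrawn as each new answer arrives, and the same approach extends to multiple-choice questions with four or five bars.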
Equivalents
The invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The foregoing embodiments are therefore to be considered in all respects illustrative rather than limiting on the invention described herein. Scope of the invention is thus indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.