WO2003096682A1 - Video production system for automating the execution of a video show - Google Patents

Video production system for automating the execution of a video show

Info

Publication number
WO2003096682A1
WO2003096682A1 (PCT/US2003/014427)
Authority
WO
WIPO (PCT)
Prior art keywords
production
show
icon
keyer
control
Prior art date
Application number
PCT/US2003/014427
Other languages
French (fr)
Other versions
WO2003096682A9 (en)
Inventor
Alex Holtz
Robert J. Snyder
John R. Benson
William H. Couch
Marcel Larocque
Richard Todd
Charles M. Hoeppner
Keith Gregory Tingle
Kevin K. Morrow
Maurice Smith
Original Assignee
Parkervision, Inc.
Priority date
Filing date
Publication date
Priority claimed from US10/208,810 (US20030001880A1)
Priority claimed from US10/247,783 (US11109114B2)
Application filed by Parkervision, Inc.
Priority to EP03724519A (EP1552685A4)
Priority to AU2003230350A (AU2003230350A1)
Publication of WO2003096682A1
Publication of WO2003096682A9

Classifications

    • G: PHYSICS
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00: Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10: Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34: Indicating arrangements
    • G: PHYSICS
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00: Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02: Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031: Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034: Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/268: Signal distribution or switching
    • G: PHYSICS
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00: Record carriers by type
    • G11B2220/20: Disc-shaped record carriers
    • G: PHYSICS
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00: Record carriers by type
    • G11B2220/20: Disc-shaped record carriers
    • G11B2220/25: Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2508: Magnetic discs
    • G11B2220/2516: Hard disks
    • G: PHYSICS
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00: Record carriers by type
    • G11B2220/40: Combinations of multiple record carriers
    • G11B2220/41: Flat as opposed to hierarchical combination, e.g. library of tapes or discs, CD changer, or groups of record carriers that together store one title

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Circuits (AREA)

Abstract

A method and system (100) for controlling a production studio for producing a television show. The method comprises sending control commands to a plurality of video production devices (104, 106, 110, 108) from a processing unit (102); associating one or more icons representing video production device control buttons with one or more control commands; creating, via a hierarchical user interface, a transition macro by placing one or more of the icons on a time sheet; and executing, via the hierarchical user interface, the transition macro to control the plurality of video production devices (104, 106, 110, 108) during the television show.

Description

VIDEO PRODUCTION SYSTEM FOR AUTOMATING THE EXECUTION OF A VIDEO SHOW
Background of the Invention
Field of the Invention
The present invention relates generally to video production, and more specifically, to a system, method and computer program product for automating the execution of a live or live-to-tape video show. The present invention also relates generally to media production, and more specifically, to keying a video production. The present invention also relates generally to production of live and as-live shows, and more particularly to production of live and as-live shows using live segments and re-purposed archived materials. The present invention also relates generally to media production, and more specifically, to automating production devices during a media production.
Related Art
Conventionally, the execution of a live or live-to-tape video show, such as a network news broadcast, talk show, or the like, is largely a manual process involving a team of specialized individuals working together in a video production environment having a studio and a control room. The video production environment is comprised of many diverse types of video production devices, such as video cameras, microphones, video tape recorders (VTRs), video switching devices, audio mixers, digital video effects devices, teleprompters, and video graphic overlay devices, etc. The basics of video production techniques are described in "Television Production Handbook," Zettl, 1997, Wadsworth Publishing Company, which is incorporated herein by reference. In a conventional production environment, the video production devices are manually operated by a production crew (which does not include the performers and actors, also known as the "talent") of artistic and technical personnel working together under the direction of a director. A standard production crew is made up of nine or more individuals, including camera operators (usually one for each camera, where there are usually three cameras), a video engineer who controls the camera control units (CCUs) for each camera, a teleprompter operator, a character generator operator, a lighting director who controls the studio lights, a technical director who controls the video switcher, an audio technician who controls an audio mixer, tape operator(s) who control(s) a bank of VTRs, and a floor director inside the studio who gives cues to the talent. Typically, the director coordinates the entire production crew by issuing verbal instructions to them according to a script referred to as a director's rundown sheet. Generally, each member of the production crew is equipped with a headset and a microphone to allow constant communication with each other and the director through an intercom system.
During the execution of a live or live-to-tape video show, the production crew must perform multiple parallel tasks using the variety of video production devices. Furthermore, these tasks must all be coordinated and precisely synchronized according to very strict timing requirements. Coordination between the production crew, the director and the talent is vitally important for the successful execution of a show. Accordingly, the logistics of executing a show are extremely difficult to plan and realize, and the execution of a show is highly susceptible to errors; in the industry, errors are generally expected to occur during the execution of a show. Accordingly, experienced production crews not only attempt to reduce the frequency of errors, but also attempt to react quickly in taking corrective action so that the inevitable errors that do occur go unnoticed by the viewing audience. However, it is quite apparent from watching live television broadcasts that this goal is not always met. Another problem with the conventional production environment is that the director does not have total control in executing a show because of the director's reliance on the production crew. The production crew does not always follow the instructions of the director due to mis-communication and/or misinterpretation of the director's cues. Further, the director cannot achieve certain desired transitions and sophisticated or enhanced visual effects because of the real time nature of the execution of the show and the fast pace and short time available.
The real time nature of the execution of the show creates great stress for the director, the production crew, and the talent. Everyone is extremely concerned about failure. The real time nature of the execution of the show also necessitates re-creation of the format, including transitions and special effects, for the show.
Another drawback of the conventional production environment is that the failure of any member of the production crew to be present for the execution of the show may prevent or hamper the show from occurring as planned. Thus, directors constantly worry about whether crew members will show up for work, particularly on weekends and holidays.
Conversely, there are situations in environments other than broadcast, such as business television and video training environments, where due to downsizing or budgetary constraints the number of available personnel for the production crew is so limited that shows cannot be produced with high quality.
Producing live or live-to-tape video shows is very expensive because of the large size of the video production crew. The compensation to the individuals that make up the production crew is substantial, and can run in the range of several million dollars per year for the entire crew. Furthermore, the compensation for a member of a production crew is commensurate with the video market of the station. The level of compensation for the top markets is substantially higher than for the lesser markets, and the compensation for network affiliates is higher than for independent broadcasters and cable networks.
This disparity in compensation produces frequent turnover in production crew personnel causing a director to frequently hire and train new members of the crew.
Another disadvantage with the conventional production environment is the inability to preview the show. That is, it is costly and impractical for the production crew to rehearse the show prior to its execution. The talent and the director cannot preview the transitions in a succinct manner.
Therefore, what is needed is a video production system and method that addresses the above problems.
Definitions Of Terms
Certain terms used in this document have specific meanings as follows:
"Activating an icon" means selecting or triggering the icon.
"Button" is an icon that is intended to represent an electrical push-button appearing as part of a graphical user interface. Moving a mouse pointer over the graphical button and pressing one of the physical mouse buttons starts some software action.
"Execution of a show" means the implementation of the steps necessary to broadcast the show or record it in any tangible medium of expression.
"Frame" means one-thirtieth of a second.
"Graphical Controls" are one or more icons used for controlling a video production device.
"Hot-key" is a programmable icon.
"Icon" means a small picture intended to represent something in a graphical user interface. When an icon is clicked on with a mouse, for example, some action is performed. Icons are usually stored as bitmaps, but of course can be stored using other formats.
"Pre-production" is the planning process whereby the video director plans the steps necessary to execute the show.
"Show" is a live or live-to-tape production.
"Show template" is a stored file of a transition macro that can be used in whole or in part as a starting point to produce another show.
"Transition macro" means a set of video production commands, where each video production command is transmitted from a processing unit to a video production device. Transition macro also refers to a set of icons that have been dragged and dropped (i.e., assembled) onto the control lines of a transition macro time sheet.
"Video production command" is any command or instruction that controls a video production device.

The concept of video keying enables two video sources to be combined into a composite video image by selectively switching between the two sources. A video keyer switches between the two sources in accordance with a switching signal. The switching signal, however, is derived from a video source rather than a fixed pattern generator. An internal key typically uses a luminance level of the video to create the switch. This is practical for superimposing black-and-white graphics or text onto the video.
Multiple types of video keyers can be found. For example, luminance keyers and chroma keyers represent two commonly known keyers. For luminance or chroma keyers, a monochrome key signal is used to determine when to switch. These simple keyers switch between two sources based on the level of the key signal in relation to key level and/or clip controls. The key signal can be derived from an overall brightness level (i.e., luminance key), color or hue information (i.e., chroma key), or a combination of both.
Another type of keyer is a linear keyer. Linear keyers provide a full range of transparency, which allows natural and pleasing compositing of images. With a linear keyer, the key signal is used to effectively dissolve between two video sources, one representing a background image and the other representing a foreground image. If the value of the key signal is zero or black, the foreground image is completely transparent and thus cannot be seen over the background image. If the key signal has the absolute value of one hundred units or white, the foreground image is completely opaque and thus the background image cannot be seen under the foreground image. For key signal values between zero and one hundred, the foreground image achieves an increasing degree of translucence as the key signal approaches the absolute value. Accordingly, the background image becomes less visible through the foreground image as the absolute value of the key signal is approached. An advantage of using a linear keyer is that it can maintain the proper levels of anti-aliased images (especially graphics) when creating a composite image.
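The linear keyer just described amounts to a per-pixel mix in which the key signal (zero to one hundred units) sets the opacity of the foreground over the background. The following is a minimal sketch of that behavior on a single pixel value; it is an illustration only, not the patent's implementation, and the function name is invented:

```python
def linear_key(bg, fg, key):
    """Mix one pixel value: key = 0 leaves only the background,
    key = 100 leaves only the foreground, and values in between
    dissolve the foreground over the background."""
    a = key / 100.0  # foreground opacity derived from the key signal
    return a * fg + (1.0 - a) * bg
```

For example, a key value of 50 yields an even blend of the two sources, while real keyers apply the same mix to every pixel of a video frame.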
In a news broadcast environment, keyers are used to create lower thirds on a video shot. As such, graphic titles are retrieved from a character generator, and located or "keyed" at the lower third portion of a television screen. In addition, keyers are used to create "over-the-shoulder" (OTS) boxes. OTS boxes can be used to enhance the presentation of an on-camera shot of the news anchor by highlighting a graphic, picture, or text to support the topic being discussed.
To create a composite view having the lower third and/or OTS as described in the previous example, a director for a newscast would set up a keyer on a "mixed effect" bank (M/E).
First, the director would select an "unoccupied" keyer that would receive video input from the camera that is recording the news anchor. This video input would serve as the background video. The director would then select the desired graphic title and/or an OTS video source that would be used as the foreground image. The director would also specify the desired location for placing the graphic title and/or OTS over the background video.
After the keyer has been set up on the M/E, the director would preview the composite view (i.e., the keyed shot with the lower third or OTS) to check for accuracy. Once the director is satisfied with the result, the composite or keyed shot is transitioned to air on a program channel. If another keyed shot is required, the director must select another unoccupied keyer and follow the same process from M/E, to preview, to air on the program channel.
As can be seen in a live production environment, the director must keep track of the keyers to avoid on-air mistakes. In other words, the director must be able to quickly determine which keyers are keying content for air, which ones are keying content for a preview channel, and which ones are unoccupied. In a complex newscast, many keyed shots are produced in a short time span, and most of the keyed shots occur back-to-back. Thus, it is extremely challenging for a director to accurately and quickly identify which keyer(s) on which M/E bank are on-air and which ones have already been setup in preview. Therefore, a need exists to develop a technology that addresses these concerns thereby simplifying the keying process.
The demand for 24-hour news programming is high, and growing steadily. Conventional systems and methods of producing news shows are labor intensive and, thus, expensive. What is needed are improved automated systems and methods for providing news programming that are less labor intensive and less expensive.
Conventionally, the production of a live or live-to-tape video show (such as a network news broadcast, talk show, or the like) is largely a manual process involving a team of specialized individuals working together in a video production environment having a studio and a control room. The video production environment is comprised of many diverse types of video production devices, such as video cameras, microphones, video tape recorders (VTRs), video switching devices, audio mixers, digital video effects devices, teleprompters, and video graphic overlay devices, etc. In a conventional production environment, the video production devices are manually operated by a production crew (which does not include the performers and actors, also known as the "talent") of artistic and technical personnel working together under the direction of a director. A standard production crew is made up of nine or more individuals, including camera operators (usually one for each camera, where there are usually three cameras), a video engineer who controls the camera control units (CCUs) for each camera, a teleprompter operator, a character generator operator, a lighting director who controls the studio lights, a technical director who controls the video switcher, an audio technician who controls an audio mixer, tape operator(s) who control(s) a bank of VTRs, and a floor director inside the studio who gives cues to the talent. Typically, the director coordinates the entire production crew by issuing verbal instructions to them according to a script referred to as a director's rundown sheet. Generally, each member of the production crew is equipped with a headset and a microphone to allow constant communication with each other and the director through an intercom system. The video produced by the crew is delivered or transmitted to a master control system that, in turn, broadcasts the video over traditional mediums to a television set.
Traditional mediums include the appropriate ranges of the frequency spectrum for television, satellite communications, and cable transmissions. The global Internet and other computer networks present an alternative distribution medium for video productions and the like.
During the execution of a live or live-to-tape video show, the production crew must perform multiple parallel tasks using the variety of video production devices. Furthermore, these tasks must all be coordinated and precisely synchronized according to very strict timing requirements.
Coordination between the production crew, the director and the talent is vitally important for the successful execution of a show. Accordingly, the logistics of executing a show are extremely difficult to plan and realize.
In the early days, producer rundowns were created manually on paper as a way of putting together the show. Newer technology allows this process to be carried out on networked computers. Companies and services such as iNEWS™ (i.e., the iNEWS™ news service available on the iNews.com website), Newsmaker, Comprompter, and the Associated Press (AP) have developed news automation systems to manage the workflow processes associated with a newsroom operation. A news automation system is a network-based service that aggregates stories from news services, such as AP, Konas, and CNN services, police and fire information systems, and field reporters. During a news automation process, all components of a news production (including wire services, assignment editors, reporters, editors, producers, and directors) are connected so that the show-building process can be streamlined with file sharing, indexing, and archiving by show names. A news automation system allows a producer or director to develop a rundown sheet and always know the status of stories during the rundown assembly process. However, if a news automation source changes or becomes unavailable, the director must be able to quickly adjust the rundown to avoid errors on the air. Thus, a significant problem with today's conventional production environment is that the director must be able to quickly assign sources while executing the show. During a live production, production equipment may fail to operate, or members of the crew or talent may miss their cues. The director must be able to quickly react to these dynamic events. Therefore, a need exists to develop a technology that addresses these concerns.
Summary of the Invention
The present invention solves at least some of the above identified problems in conventional systems by providing an integrated video production system, method and computer program product (referred to collectively as
"video production system" for purposes of brevity) for automating the execution of a live or live-to-tape video show. The video production system is integrated such that a single person ("a video director") has control over all video production devices used in executing the show. Such devices include, but are not limited to, video cameras, robotic pan/tilt heads, video tape players and recorders (VTRs), video servers and virtual recorders, character generators, still stores, digital video disk players (DVDs), digital video effects (DVE), audio mixers, audio sources (e.g., CD's and DAT's), video switchers, and teleprompting systems. The automation capability provided by the video production system allows the video director to pre-produce a live show (such as a news show or talk show), preview the show in advance of "air time", and then, with a touch of a button or other trigger, execute the live show. Consequently, a live show or live-to-tape show can be executed more cost efficiently, with greater control over logistics and personnel, with enhanced functionality and transitions, in less time and with less stress, and with fewer people and fewer human errors than was previously possible. The present invention also allows the video director to reuse formats of prior shows by leveraging show templates.
In an embodiment, a video production system is provided having a processing unit in communication with and/or controlling one or more of the video production devices mentioned above. The processing unit displays on a monitor or other display device a graphical user interface (GUI) that consists of graphical controls for controlling the video production devices that it is in communication with. The graphical controls are made up of icons that the video director activates to control a video production device. The video director uses a keyboard and mouse or other input device or interface (including voice activated, touch screen, heads up display, etc.) to activate the icons, and thereby remotely control the video production devices. In this manner, a director is given control over video production devices used in executing a show.
The processing unit also enables the video director to automate the execution of a show. According to an embodiment, the video director pre-produces the show to create a director's rundown sheet, creates a transition macro (or multiple transition macros), which specifies one or more video production commands, and instructs the processing unit to execute the transition macro. Executing a transition macro means transmitting the one or more video production commands that are specified by the transition macro to the appropriate video production devices.
Upon receiving a video production command, a video production device performs the function corresponding to the received command. In this manner, the processing unit provides automated control of the video production devices, and thereby provides a system for automating the execution of a show in real time. This feature provides the director with the advantage of not having to rely on a production crew to execute a show. The cost and time savings this feature provides are therefore substantial.
Additionally, the human errors that normally occur during the execution of a show are no longer an issue. Advantageously, the invention may include a timer and means for associating a timer value with each video production command specified by the transition macro, thereby creating a timer driven transition macro. In this embodiment, a video production command is transmitted to a video production device only when the timer reaches the timer value associated with the video production command. An advantage of this feature is that the video production commands are scheduled according to the timer. The timer is activated by the video director activating a timer start icon displayed by the processing unit or is activated by the processing unit receiving a timer start command from an external system, such as a teleprompting system. The timer can also be stopped at any point in time, thereby providing the video director with control over the execution of a transition macro.
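The timer-driven transition macro described above amounts to a schedule of (timer value, device, command) entries that are released as the show timer advances. The following is a hedged sketch under that reading; the class name, method names, and device labels are hypothetical, not the patent's implementation:

```python
import heapq

class TransitionMacro:
    """Schedule of video production commands keyed to a show timer.
    A command is dispatched only once the timer reaches its value."""

    def __init__(self):
        self._events = []  # min-heap of (timer_value, seq, device, command)
        self._seq = 0      # tie-breaker preserving insertion order

    def add(self, at_seconds, device, command):
        heapq.heappush(self._events, (at_seconds, self._seq, device, command))
        self._seq += 1

    def due(self, elapsed_seconds):
        """Pop and return every (device, command) whose time has arrived."""
        fired = []
        while self._events and self._events[0][0] <= elapsed_seconds:
            _, _, device, command = heapq.heappop(self._events)
            fired.append((device, command))
        return fired
```

Stopping the timer corresponds to simply no longer advancing `elapsed_seconds`, which halts command dispatch and mirrors the director's ability to pause execution of the macro.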
In an embodiment, the processing unit is programmed to provide a graphical user interface (GUI) that enables the director to easily create timer-driven transition macros via a hierarchical time sheet. The hierarchical time sheet includes a plurality of control lines and possibly a plurality of hierarchical group layers. In a preferred embodiment, each of the control lines corresponds to a video production device. The video director creates a transition macro by defining one or more hierarchical group layer GUIs, where the group layer GUIs may include an object group layer GUI, a TME group layer GUI, a page group layer GUI, a story layer GUI, and a show layer GUI.
A show is the container for everything and can be divided into various story layers. A story can contain multiple page layers, a page layer can contain multiple TME layers, and a TME layer can contain multiple object layers.
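The show/story/page/TME/object containment just described maps naturally onto nested containers. The sketch below follows the patent's layer names, but the field names and the flattening helper are added illustrations, not part of the disclosed system:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectLayer:
    name: str                      # e.g. a single device command icon

@dataclass
class TMELayer:
    objects: list = field(default_factory=list)

@dataclass
class PageLayer:
    tmes: list = field(default_factory=list)

@dataclass
class Story:
    pages: list = field(default_factory=list)

@dataclass
class Show:
    stories: list = field(default_factory=list)

    def all_objects(self):
        """Flatten the hierarchy into execution order."""
        return [obj
                for story in self.stories
                for page in story.pages
                for tme in page.tmes
                for obj in tme.objects]
```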
In another aspect of the present invention, a method, system and computer program product are provided to implement parallel automated keying for one or more media productions (such as, news programs, situation comedies, concerts, movies, video rentals, radio broadcast, animation, etc.). A plurality of automated keyers are programmable to key one or more layers on a media production. Hence, multiple keyers are serially positioned to composite multiple keyer layers. Additionally, two or more keyers are positioned to composite productions for output on two separate channels, such as a program channel or a preview channel.
In an embodiment, the automated keyers are placed in a fixed arrangement comprising two groups. A first group of keyers are serially positioned to support multiple keyer layers. The total number of automated keyers placed in series depends on the maximum number of keyer layers established for the keyer system. A second group of keyers are placed in parallel to support multiple outputs. The total number of parallel keyers placed depends on the maximum number of channels established for the keyer system.
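The serial/parallel arrangement above can be pictured as repeated linear keying, one pass per keyer layer, with the final composite fanned out to each output channel. A simplified numeric sketch follows, using single-value "pixels" for clarity (real keyers operate on full video frames; function names are invented):

```python
def composite_layers(background, layers):
    """Serially apply (foreground, key) layers, one keyer per layer;
    key is 0..100 per the linear keying model (100 = opaque foreground)."""
    out = background
    for fg, key in layers:
        a = key / 100.0
        out = a * fg + (1.0 - a) * out
    return out

def fan_out(composite, channels):
    """Parallel keyer outputs: the same composite feeds each channel."""
    return {ch: composite for ch in channels}
```

Chaining two half-transparent layers over a black background, for instance, leaves more of the later foreground visible, since each serial keyer composites over the output of the previous one.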
In another embodiment, an automated router comprises a plurality of automated keyers. The automated router is responsive to control signals that instruct the automated keyers to float. By floating, the automated keyers are programmable to be grouped in serial and parallel variable arrangements and assigned to multiple flow paths to output composite productions to multiple channels (e.g., program and preview). For example, the automated router can be instructed to allow two keyer layers to be composited and transitioned between a preview and program channel. In another example, the automated router is instructed to route a video shot having no keyer layers, a single keyer layer, or multiple keyer layers.
A user interface allows a director, or other personnel, to configure the attributes or properties for the key effects. Accordingly, the director specifies a background media source, a fill source, and a key source. By specifying the background media source, the director identifies a media production that will be keyed according to the present invention. The fill source specifies a device and/or file for the content that will be keyed on the background media production. The key source is associated with the fill source and specifies the shape and position of the key fill on the background media production. The three sources are recorded to a configuration file. In an embodiment, one or more automation control icons are placed on a graphical user interface to configure the key effects. As such, the director operates an input device to open an automation control icon that produces a dialog box. The dialog box is responsive to receiving the background, fill, and key information. The automation control icon is associated with a set of computer readable broadcast instructions. When the automation control icon is activated by a timer associated with the user interface, the associated broadcast instructions are executed to transmit commands to an automated keyer that implements the key effects. In an embodiment, the broadcast instructions are created from the Transition Macro™ multimedia production control program developed by ParkerVision, Inc. (Jacksonville, FL) that can be executed to control an automated multimedia production system. The Transition Macro™ program is described in the pending U.S. application entitled "System and
Method for Real Time Video Production and Multicasting" (U.S. Appl. Ser. No. 09/634,735), which is incorporated herein by reference as though set forth in its entirety.
The automated keyers of the present invention can be implemented in a manual media production environment as well as an automated media production environment. An automated multimedia production environment includes a media production processing device that automatically or semi-automatically commands and controls the operation of a variety of media production devices, including an automated keyer. Therefore, in an embodiment, the media production processing device sends control signals to the automated router to implement the serial and parallel variable arrangements and assigned flow paths, as previously discussed.
The present invention also includes a system and method for monitoring, updating, and altering the operating states of the automated keyers. The operating states are monitored to determine if a keyer is currently keying content for a program channel, keying content for a preview channel, or unoccupied. Hence, prior to implementing the key effects from the configuration file, the operating states are monitored to select an unoccupied keyer. In an embodiment using an automation control icon, when the icon is activated, the associated broadcast instructions call or implement a routine to automatically select an automated keyer. Afterwards, keyer control commands are transmitted to send the configuration data (i.e., background, fill, and key source) to the selected keyer to composite the predefined key effects.
In an embodiment, only an unoccupied keyer is selected. However, in another embodiment, a keyer operating in a preview state is chosen if no unoccupied keyers are currently available. The present invention includes mechanisms that allow a director to approve or reject the selection of a keyer before the keyer is placed in operation.
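The selection order just described can be sketched as follows. This is a hedged illustration only: it prefers an unoccupied keyer, falls back to one in preview when none is unoccupied, and lets the director approve or reject the choice. State names and function signatures are assumptions, not the patented routine.

```python
UNOCCUPIED, PREVIEW, PROGRAM = "unoccupied", "preview", "program"

def select_keyer(keyer_states, approve=lambda keyer_id: True):
    """Return a usable keyer id, or None if every keyer is on program."""
    for wanted in (UNOCCUPIED, PREVIEW):          # unoccupied first, preview second
        for keyer_id, state in keyer_states.items():
            if state == wanted and approve(keyer_id):
                return keyer_id
    return None

states = {"K1": PROGRAM, "K2": PREVIEW, "K3": UNOCCUPIED}
assert select_keyer(states) == "K3"                            # unoccupied keyer wins
states["K3"] = PROGRAM
assert select_keyer(states) == "K2"                            # preview keyer is the fallback
assert select_keyer(states, approve=lambda k: False) is None   # director rejects all
```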
Therefore, the present invention provides methodologies and/or techniques for automatically selecting a keyer and compositing predefined keyer layer(s). The composite media production is routed over a preview channel so that the director can review the production. If the key effects are approved, the director steps or transitions the composite media production from preview to program. As a result, the director is not required to keep track of the keyer operating states when keying and reviewing a media production during a live production.
In another aspect of the invention, briefly stated, the present invention is directed to a system, method, and computer program product for producing a show. In an embodiment, the invention is directed to a production system having a first production path, a second production path, and a control system that causes the first production path to generate a show in a first aspect ratio
(4:3), and that causes the second production path to generate the same show in a second aspect ratio (16:9).
In another embodiment, the invention is directed to producing a show from live material and from archived material. This aspect of the invention operates by producing a first show comprising a plurality of stories, segmenting the first show, and storing the show segments in an archive. Then, the invention produces a second show using live portions as well as show segments retrieved from the archive.
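The archive-and-reuse flow above can be sketched briefly. This is illustrative only: segments of a first show are stored in an archive keyed by show and segment name, and a second show is assembled from live portions plus retrieved segments; the data structures and names are assumptions.

```python
archive = {}

def archive_segments(show_id, segments):
    # Store each (name, content) segment of a produced show in the archive.
    for name, content in segments:
        archive[(show_id, name)] = content

def build_show(live_portions, retrieved_keys):
    # Assemble a second show from live material and archived segments.
    return live_portions + [archive[key] for key in retrieved_keys]

archive_segments("6pm_news", [("sports", "sports package"), ("weather", "forecast")])
second_show = build_show(["live intro"], [("6pm_news", "sports")])
assert second_show == ["live intro", "sports package"]
```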
The invention is also directed to a media manager that interacts with a server. In some cases, the server is integrated with the production system.
The media manager automatically assigns channels/ports of the server when accessing material stored in the server.

In yet another aspect of the invention, a method, system, and computer program product overcomes the above problems by providing a director control interface that serves as a link between a newsroom information management system and a production control system. A newsroom information management system includes a news automation system, such as those available from iNEWS™, Newsmaker, Comprompter, and the Associated Press. A production control system includes an automated production control environment, such as the embodiments described in the pending U.S. application entitled "Method, System and Computer Program Product for Producing and Distributing Enhanced Media Downstreams" (U.S.
Appl. Ser. No. 09/836,239), which is incorporated herein by reference in its entirety.
In embodiments of the present invention, the director control interface extracts production information from a newsroom information management system and populates a production control system. The director control interface enables the director to build a show while mitigating errors and checking for conflicts during the building process.
In an embodiment, the director control interface automatically selects macro elements, which are executed on the production control system. The director can override the selection process and choose the macro elements.
The director control interface monitors the newsroom information management system for rundown changes, evaluates the changes, and updates the production control system either automatically or with approval from the director. The director control interface is compatible with any type of newsroom information management system from which it can extract the requisite production information.
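The monitor-evaluate-update cycle above can be sketched as follows. This is a hypothetical illustration: rundown rows detected in the newsroom system but absent from the production control system are applied either automatically or only with the director's approval; the list representation and names are assumptions.

```python
def sync_rundown(newsroom_rundown, production_rundown, approve=lambda row: True):
    # Detect rows present in the newsroom rundown but not yet in production.
    changes = [row for row in newsroom_rundown if row not in production_rundown]
    # Apply each change automatically, or only if the director approves it.
    return production_rundown + [row for row in changes if approve(row)]

production = ["headlines", "weather"]
newsroom = ["headlines", "weather", "sports"]
assert sync_rundown(newsroom, production) == ["headlines", "weather", "sports"]
assert sync_rundown(newsroom, production, approve=lambda r: False) == production
```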
Further features and advantages of the invention, as well as the structure and operation of various embodiments of the invention, are described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements. The drawing in which an element first appears is generally indicated by the left-most digit(s) in the corresponding reference number.
Brief Description of the Drawings
The accompanying drawings, which are incorporated herein and form part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
FIG. 1 illustrates an embodiment of an integrated, fully automated video production system.
FIG. 2 illustrates an interactive graphical user interface (GUI) for the fully automated video production system according to an embodiment of the present invention.
FIG. 3 illustrates a block diagram of an example computer system useful for implementing the present invention.
FIG. 4 illustrates an interactive graphical user interface (GUI) for the fully automated video production system according to an embodiment of the present invention.
FIG. 5 illustrates an alternative view of the time sheet GUI of FIG. 4.
FIG. 6 illustrates an encode mark configuration GUI according to an embodiment of the present invention.
FIG. 7 illustrates an alternative view of the time sheet GUI of FIG. 4.
FIG. 8 illustrates an encode object configuration GUI according to an embodiment of the present invention.
FIG. 9 illustrates the hierarchy of the group levels according to an embodiment of the present invention.
FIG. 10 further illustrates the group level hierarchy of FIG. 9.
FIG. 11 illustrates the object group layer GUI according to an embodiment of the present invention.
FIG. 12 illustrates the TME group layer GUI according to an embodiment of the present invention.
FIG. 13 illustrates the page group layer GUI according to an embodiment of the present invention.
FIG. 14 illustrates the story group layer GUI according to an embodiment of the present invention.
FIG. 15 illustrates an example time sheet row setup dialog according to an embodiment of the present invention.
FIG. 16 illustrates an example operation flowchart of the present invention upon receiving a command from the user to exit time sheet row setup dialog in FIG. 15 according to an embodiment of the present invention.
FIG. 17 illustrates an example time sheet layout setup dialog according to an embodiment of the present invention.
FIG. 18 illustrates an example timeline prep setup dialog according to an embodiment of the present invention.
FIG. 19 illustrates another embodiment of the page group layer GUI of the present invention.
FIG. 20 illustrates example GUI buttons according to an embodiment of the present invention.
FIG. 21 illustrates an example rundown converter dialog according to an embodiment of the present invention.
FIG. 22 illustrates an example ETLA search according to an embodiment of the present invention.
FIG. 23 illustrates an example graphical time sheet view according to an embodiment of the present invention.
FIG. 24 illustrates an operational flow for keying a media production according to an embodiment of the present invention.
FIG. 25 illustrates an operational flow for identifying and selecting a keyer according to an embodiment of the present invention.
FIG. 26 illustrates an operational flow diagram for identifying and selecting a keyer according to another embodiment of the present invention.
FIG. 27 illustrates an operational flow diagram for identifying and selecting a keyer according to another embodiment of the present invention.
FIG. 28a illustrates an initial state of two keyers according to an embodiment of the present invention.
FIG. 28b illustrates a queue for two unoccupied keyers according to an embodiment of the present invention.
FIG. 28c illustrates program and preview states of two keyers according to an embodiment of the present invention.
FIG. 29a illustrates an initial state of three keyers according to an embodiment of the present invention.
FIG. 29b illustrates a queue for three unoccupied keyers according to an embodiment of the present invention.
FIG. 29c illustrates program and preview states of keyers according to another embodiment of the present invention.
FIG. 29d illustrates program and preview states of keyers according to another embodiment of the present invention.
FIG. 30a illustrates an initial state of a plurality of keyers according to an embodiment of the present invention.
FIG. 30b illustrates a queue for a plurality of keyers according to an embodiment of the present invention.
FIG. 30c illustrates a program state of keyers according to an embodiment of the present invention.
FIG. 30d illustrates a preview state of keyers according to an embodiment of the present invention.
FIG. 30e illustrates an unoccupied state of keyers according to an embodiment of the present invention.
FIG. 31 illustrates a keyer system according to an embodiment of the present invention.
FIG. 32 illustrates a keyer system according to another embodiment of the present invention.
FIG. 33 illustrates a user interface for a show rundown according to an embodiment of the present invention.
FIG. 34 illustrates a user interface for keying a media production according to an embodiment of the present invention.
FIG. 35 illustrates a video image keyed according to an embodiment of the present invention.
FIG. 36 illustrates a display for a quad box application according to an embodiment of the present invention.
FIG. 37 is an example production system useful for simultaneously generating outputs in native 4:3 format and native 16:9 format.
FIGS. 38 and 39 are example flowcharts depicting the operation of the invention when producing shows using live content and re-purposed archived materials.
FIG. 40 is an example production system for producing shows using live content and re-purposed archived materials.
FIG. 41 is an example user interface for the production system of FIG. 40.
FIG. 42 is a block diagram of a media manager according to an embodiment of the invention.
FIG. 43 illustrates a director control interface according to an embodiment of the present invention.
FIG. 44 illustrates a director control interface according to another embodiment of the present invention.
FIG. 45 illustrates a director control interface according to another embodiment of the present invention.
FIG. 46 illustrates an operational flow for building a director control interface according to an embodiment of the present invention.
FIG. 47 illustrates a production control interface according to an embodiment of the present invention.
FIG. 48 illustrates a production control interface according to another embodiment of the present invention.

Detailed Description of the Preferred Embodiments
A. System Architecture Overview
FIG. 1 illustrates, according to an embodiment of the present invention, an integrated video production system 100 for automating the execution of a show. Integrated video production system 100 is described in detail in commonly assigned U.S. Patent Application Serial No. 10/200,776, filed July 24, 2002, by Holtz et al., and entitled "Real Time Video Production System and Method" (hereinafter referred to as the "'776 application"). The disclosure of the '776 application is incorporated herein by reference as though set forth in its entirety. To facilitate understanding of the present invention, integrated video production system 100 will be briefly discussed herein with reference to FIG. 1. As shown in FIG. 1, video production system 100, in a representative embodiment, includes a processing unit 102 in communication with a variety of video production devices. Such video production devices include, but are not limited to, a video switcher 104; a digital video effects device (DVE) 106; an audio mixer 110; a teleprompting system 108; video cameras and robotics (for pan, tilt, zoom, focus, and iris control) 120, 122, 124, and 126; a record/playback device (RPD) 128; and a character generator and/or still store 130. RPD 128 can be a video tape recorder/player (VTR), a video server, a virtual recorder, a digital audio tape (DAT) recorder, or any device that stores, records, generates or plays back via magnetic, optical, electronic, or any other storage media. Lines 170-188 represent logical communication paths between processing unit 102 and the video production devices 104-130 listed above. Each of these components is described in detail in the '776 application.
A video director 135 uses processing unit 102 to produce a show. In an embodiment, processing unit 102 displays graphical user interfaces (GUIs) 132 and 133 on display devices 114 and 115, respectively. In another embodiment, processing unit 102 displays GUIs 132 and 133 together on a single display device. GUIs 132 and 133 display graphical controls corresponding to the video production devices 104-130. Video director 135 uses a keyboard 118 and a mouse 116 to interact with the processing unit 102 by manipulating the graphical controls of GUIs 132 and 133. In response to video director 135 activating a graphical control from GUI 132 or 133, processing unit 102 transmits a video production command to the video production device corresponding to the activated graphical control. In this manner, video director 135 centrally controls the operation of each of the video production devices.

FIGs. 2 and 4 illustrate an embodiment of GUI 132 and an embodiment of GUI 133, respectively. GUI 132 includes video switcher graphical controls 202 for controlling video switcher 104 and DVE 106; audio mixer graphical controls 204 for controlling audio mixer 110; RPD graphical controls 206 for controlling up to twelve RPDs; camera graphical controls 205 for controlling one or more cameras that are in communication with processing unit 102; and DVE controls 203 for controlling DVE 106. GUI 132 is described in detail in the '776 application.
GUI 133 of FIG. 4 is a user-friendly graphical interface that enables the director (i.e., video director 135 from FIG. 1), or other personnel, to interact with the control system and make timely edits and revisions to the production as it is being filmed, videotaped, or broadcast. The graphical interface is an event-driven, timeline-based application. The time sheet of the graphical interface has a timeline and control lines. The control lines are populated with various icons that are linked to the control system. The present invention includes a mechanism that improves the director's ability to change the order and grouping of the selected icons in response to timely changes to the rundown at various levels of granularity. The present invention also includes resynchronization and error correction routines for the altered time sheet. The enhanced time sheet of the present invention is described next in more detail.

B. Time Sheet
FIG. 4 illustrates an embodiment of an interactive time sheet created by a timeline-based application of graphical user interface (GUI) 133, according to an embodiment of the invention. The time sheet includes a horizontal timeline 402 and one or more horizontal control lines 404a-404p. Automation control icons 406a-406t are positioned onto control lines 404a- 404p at various locations relative to timeline 402, and configured to be associated with one or more video production commands and at least one video production device. FIG. 4 illustrates an embodiment of the time sheet after the placement of automation control icons 406a-406t onto control lines 404a-404p.
A timer (not shown) is integrated into timeline 402, and operable to activate a specific automation control icon 406a-406t as a timer indicator 408 travels across timeline 402 to reach a location linked to the specific automation control icon 406. As a result, video production system 100 would execute the video production commands to operate the associated video production device.
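The timer mechanism above can be sketched as follows. This is a minimal illustration, not the patented implementation: icons are linked to locations on the timeline, and as the timer indicator reaches each location, that icon's production commands are executed; the tick granularity and names are assumptions.

```python
def run_timeline(icons, end_time, execute):
    """icons: iterable of (time, command) pairs; execute sends a command out."""
    pending = sorted(icons)                 # order icons by timeline position
    for indicator in range(end_time + 1):   # the timer indicator advances tick by tick
        while pending and pending[0][0] <= indicator:
            _, command = pending.pop(0)
            execute(command)                # operate the associated production device

fired = []
run_timeline([(5, "take camera 2"), (2, "roll VTR")], 10, fired.append)
assert fired == ["roll VTR", "take camera 2"]
```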
In regards to automation control icons 406a-406t, label icon 406a permits a director to name one or more segments or portions of a time sheet.
In an embodiment, the director would drag and drop a label icon 406a onto control line 404a, and double click on the positioned label icon 406a to open up a dialogue box to enter a text description. The text would be displayed on the positioned label icon 406a. Control line 404a is also operable to receive a step mark icon 406b, a general purpose input/output (GPI/O) mark icon 406c, a user mark icon 406d, and an encode mark 406e. Encode mark 406e is described in detail below with reference to FIG. 5. Step mark icon 406b and GPI/O mark icon 406c are associated with time sheet step commands. The time sheet step commands instruct timer indicator or cursor 408 to start or stop running until deactivated or reactivated by the director or another video production device. For example, step mark icon 406b and GPI/O mark icon 406c can be placed onto control line 404a to specify a time when timer indicator 408 would automatically stop running. In other words, timer indicator 408 would stop moving across timeline 402 without the director having to manually stop the process, or without another device (e.g., a teleprompting system 108) having to transmit a timer stop command. If a step mark icon 406b is activated to stop timer indicator 408, timer indicator 408 can be restarted either manually by the director or automatically by another external device transmitting a step command. If a GPI/O mark icon 406c is used to stop timer indicator 408, timer indicator 408 can be restarted by a GPI or GPO device transmitting a GPI/O signal.
In an embodiment, step mark icon 406b and GPI/O mark icon 406c may be used to place a logical break between two segments on the time sheet. In other words, step mark icon 406b and GPI/O mark icon 406c are placed onto control line 404a to designate segments within a video production. One or more configuration files can also be associated with a step mark icon 406b and GPI/O mark icon 406c to link metadata with the designated segment.
Transition icons 406f-406g are associated with automation control commands for controlling video switching equipment. Thus, transition icons 406f-406g can be positioned onto control lines 404b-404c to control one or more devices to implement a variety of transition effects or special effects into a video production. Such transition effects include, but are not limited to, fades, wipes, DVE, downstream keyer (DSK) effects, and the like. DVE includes, but is not limited to, warps, dual-box effects, page turns, slab effects, and sequences. DSK effects include DVE and DSK linear, chroma and luma keyers.
Keyer control icon 406h is positioned on control line 404d, and used to prepare and execute keyer layers either in linear, luma, chroma or a mix thereof for preview or program output. The keyers can be upstream or downstream of the DVE. Audio icon 406i can be positioned onto control line 404e and is associated with commands for controlling audio equipment, such as audio mixers, digital audio tape (DAT), cassette equipment, other audio sources (e.g., CDs and DATs), and the like. Teleprompter icon 406j can be positioned onto control line 404f and is associated with commands for controlling a teleprompting system to integrate a script into the timeline. Character generator (CG) icon 406k can be positioned onto control line 404g and is associated with commands for controlling a CG or still store to integrate a CG page into the timeline. Camera icons 406l-406n can be positioned onto control lines 404h-404j and are associated with commands for controlling the movement and settings of one or more cameras. VTR icons 406p-406r can be positioned onto control lines 404k-404m and are associated with commands for controlling VTR settings and movement. GPO icon 406s can be positioned onto control line 404n and is associated with commands for controlling GPI or GPO devices. Encode object icon 406t can be positioned onto control line 404p and is associated with encoding commands which are described in detail below with respect to FIG. 7. User mark icon 406d is provided to precisely associate or align one or more automation control icons 406a-406c and 406e-406t with a particular time value. For example, if a director desires to place teleprompter icon 406j onto control line 404f such that the timer value associated with teleprompter icon 406j is exactly 10 seconds, the director would first drag and drop user mark icon 406d onto control line 404a at the ten second mark. The director would then drag and drop teleprompter icon 406j onto the positioned user mark icon 406d.
Teleprompter icon 406j is then automatically placed on control line 404f such that the timer value associated with teleprompter icon 406j is ten seconds. In short, any icon that is dragged and dropped onto the user mark 406d is automatically placed on the appropriate control line and has a timer value of ten seconds. This feature helps to provide multiple icons with the exact same timer value.
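The user-mark behavior just described can be sketched briefly. This is a hedged illustration only: an icon dropped on a positioned user mark inherits the mark's timer value and is routed to its own control line; the icon-to-line mapping and names are assumptions.

```python
# Hypothetical mapping from icon type to its control line.
CONTROL_LINE_FOR = {"teleprompter": "404f", "audio": "404e"}

def drop_on_user_mark(mark_time, icon_type, time_sheet):
    # The icon snaps to the mark's timer value on its appropriate line.
    line = CONTROL_LINE_FOR[icon_type]
    time_sheet.setdefault(line, []).append((mark_time, icon_type))
    return line

sheet = {}
drop_on_user_mark(10.0, "teleprompter", sheet)
drop_on_user_mark(10.0, "audio", sheet)  # multiple icons share the exact same timer value
assert sheet == {"404f": [(10.0, "teleprompter")], "404e": [(10.0, "audio")]}
```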
After the appropriate automation control icons 406 have been properly positioned onto the time sheet, the time sheet can be stored in a file for later retrieval and modification. Accordingly, a show template or generic time sheet can be re-used to produce a variety of different shows. A director could recall the show template by filename, make any required modifications (according to a new rundown sheet), and save the time sheet with a new filename.
As described above, one video production device is teleprompting system 108 (FIG. 1) that includes a processing unit and one or more displays for presenting a teleprompting script (herein referred to as "script") to the talent. In an embodiment, teleprompting system 108 is the SCRIPT Viewer™, available from ParkerVision, Inc. As described in the '776 application, teleprompting system 108 can be used to create, edit, and run scripts of any length, at multiple speeds, in a variety of colors and fonts. In an embodiment of the present invention, teleprompting system 108 is operable to permit a director to use a text editor to insert video production commands into a script (herein referred to as "script commands"). The text editor can be a personal computer or like workstation, or the text editor can be an integrated component of time sheet GUI 133. Referring to FIG. 4, text window 410 permits a script to be viewed, including script commands. Script controls 412 are a set of graphical controls that enable a director to operate the teleprompting system and view changes in speed, font size, script direction and other parameters of the script in text window 410.
The script commands that can be inserted by teleprompting system 108 include a cue command, a delay command, a pause command, a time sheet step command, and an enhanced video command. The present invention is not limited to the aforementioned script commands. As would be apparent to one skilled in the relevant art(s), commands other than those just listed can be inserted into a script.

FIG. 5 illustrates the top region of GUI 133 (FIG. 4) to provide a view of control line 404a. Control line 404a is used to enter icons 406a-406d that are associated with step commands and icon alignment commands, as discussed above. Another automation control icon that can be placed on control line 404a is encode mark 406e. In an embodiment, encode mark 406e operates like a Web Mark™ developed by ParkerVision, Inc. During the encoding process, encode mark 406e identifies a distinct segment within a video production. As timer indicator 408 advances beyond encode mark 406e, the encoding system is instructed to index the beginning of a new segment.
In an embodiment, the properties of each encode mark 406e are established by activating encode mark 406e to open a configuration GUI. FIG. 6 illustrates an embodiment of an encode mark configuration GUI 600.
GUI 600 can be used to set the time for initiating the encoding commands associated with encode mark 406e. The time can be manually entered or is automatically entered at the time of placing encode mark 406e on control line 404a. GUI 600 also permits an operator to designate a name for the segment, and specify the segment type classification. Segment type classification includes a major and minor classification. For example, a major classification or topic can be sports, weather, headline news, traffic, health watch, elections, and the like. Exemplary minor classifications or categories can be local sports, college basketball, NFL football, high school baseball, local weather, national weather, local politics, local community issues, local crime, editorials, national news, and the like. Classifications can expand beyond two levels to an unlimited number of levels for additional granularity and resolution for segment type identification and advertisement targeting. In short, the properties associated with each encode mark 406e provide a set of metadata that can be linked to a specific segment. These properties can be subsequently searched to identify or retrieve the segment from an archive.
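The metadata described above can be sketched as a simple record. This is illustrative only: each encode mark carries a name and a list of classification levels (major, minor, and deeper), which can later be searched to retrieve segments from an archive; the field names and time-code format are assumptions.

```python
def make_encode_mark(time_code, name, *classification_levels):
    # Classification can extend to any number of levels for extra granularity.
    return {"time": time_code, "name": name,
            "classification": list(classification_levels)}

def find_segments(marks, term):
    # Search the metadata to identify segments for retrieval from an archive.
    return [m for m in marks if term in m["classification"]]

marks = [
    make_encode_mark("00:00:10", "NFL recap", "sports", "NFL football"),
    make_encode_mark("00:03:40", "Local forecast", "weather", "local weather"),
]
assert [m["name"] for m in find_segments(marks, "sports")] == ["NFL recap"]
```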
FIG. 7 illustrates the bottom region of GUI 133 (FIG. 4) to provide a view of control line 404p. Control line 404p is used to enter automation control icons 406t, which are associated with encoded transmission commands. The encoded transmission commands instruct the encoding system to start or stop the encoding process until deactivated or reactivated by an operator or another video production device.
Encode object icons 406t are placed on control line 404p to produce encode objects. In an embodiment, encode object icon 406t operates like Web Objects™ developed by ParkerVision, Inc. FIG. 8 illustrates an embodiment of a configuration GUI 800 that can be used to set the searchable properties of each encode object icon 406t. In this embodiment, start stream object 802, data object 804 and stream stop object 806 are three types of encode object icons 406t that can be used. Start stream object 802 initializes the encoding system and starts the encoding process. In comparison with encode mark 406e, start stream object 802 instructs the encoding system to start the encoding process to identify a distinct show, whereas encode mark
406e instructs the encoding system to designate a portion of the video stream as a distinct segment. The metadata contained in start stream object 802 is used to provide a catalog of available shows, and the metadata in encode mark 406e is used to provide a catalog of available show segments. Data object 804 is used to identify auxiliary information to be displayed with the video stream. As described in detail below, auxiliary information includes graphics or text in a HTML page and is referenced in GUI 800 by its URL address.
Stream stop object 806 is used to stop the encoding process and designate the end of a distinct show. Once timer indicator 408 passes the stream stop object 806, the encoding system would start the post-production processes, such as indexing segments, cataloging segments, pacing script, and the like.
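The three encode objects can be sketched as a minimal state machine. This is an illustration under stated assumptions, not the patented system: a start stream object begins encoding a distinct show, data objects attach auxiliary URLs while encoding, and a stream stop object ends encoding and triggers post-production; the class and method names are invented for this sketch.

```python
class EncodingSystem:
    def __init__(self):
        self.encoding = False
        self.aux_urls = []
        self.post_production_runs = 0

    def start_stream(self):
        self.encoding = True            # begin encoding a distinct show

    def data_object(self, url):
        if self.encoding:
            self.aux_urls.append(url)   # auxiliary HTML page referenced by URL

    def stream_stop(self):
        self.encoding = False           # end of the distinct show
        self.post_production_runs += 1  # e.g., index and catalog segments

enc = EncodingSystem()
enc.start_stream()
enc.data_object("http://example.com/story.html")
enc.stream_stop()
assert enc.aux_urls == ["http://example.com/story.html"]
assert enc.post_production_runs == 1
```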
The encoding start and stop times can be manually entered into GUI 800 or automatically updated upon placement of start stream object 802, data object 804 or stream stop object 806 onto control line 404p. GUI 800 also permits one to designate a show identifier, show name or description for the production. Other properties include the scheduled or projected air date and air time for the production. A copyright field is provided to specify any restrictions placed on the use or re-use of a specific show or show segment.
For example, a broadcasting studio may not have a license to transmit specific content on the Internet, but may have permission to provide the content over a private network or the air waves. The content can be restricted for educational uses, single broadcast, transmissions to designated clients, and the like. The appropriate component of system 100 (e.g., enhanced video server 115, streaming server 125, IM server 130, etc.) would verify the copyright field prior to streaming the content to an enhanced video client 120.

Referring back to FIG. 4 and FIG. 7, as timer indicator 408 moves or passes over each encode object icon 406t (i.e., start stream object 802, data object 804 or stream stop object 806), the associated encoding commands are automatically processed. However, the present invention enables an operator to manually alter the encoding process during execution. In particular, encoding control region 702 provides a set of graphical controls that enable an operator to modify the encoding process. The encoding graphical controls include a ready control 704, start control 706, stop control 708, and data control 710. Ready control 704 has an "activate" state and a "de-activate" state. As such, ready control 704 is operable to send "read" or "not read" commands to timer indicator 408 depending on whether ready control 704 is operating in an activate or de-activate state, respectively. In an embodiment, when ready control 704 is operating in an activate state, timer indicator 408 signals the encoding system to read and process the associated encoding commands as timer indicator 408 passes each encode object icon 406t and encode mark 406e. Similarly, when deactivated, ready control 704 instructs timer indicator 408 to signal the encoding system to not read the encoding commands associated with each encode object icon 406t and encode mark 406e.
Therefore, when ready control 704 is de-activated, ready control 704 allows directors to perform test runs to preview a show prior to the broadcast. A preview mode is desirable to allow directors to check the show to make sure that the correct sources and transitions are selected.
Start control 706 is used to initiate the encoding system manually. In an embodiment, start control 706 is operable to manually override a deactivate state established by ready control 704 or stop control 708 (discussed below). Start control 706 can be used to manually activate the encoding process to send video streams to streaming server 125 that contain time-sensitive production elements, such as a breaking news element, or other manually prepared video productions.
Stop control 708 is operable to deactivate the encoding process and stop transmissions to streaming server 125. Stop control 708 would deactivate an encoding process initiated by either ready control 704 or start control 706. Stop control 708 provides directors with the ability to stop the encoding system manually, for example, to avoid airing unauthorized content.
Data control 710 is used to enter auxiliary information and link the information to a specific segment or an entire show. The auxiliary information is entered by typing the URL reference in reference window 712 and activating data control 710. Accordingly, auxiliary information can be entered via the configuration GUI 800 for data object 804 or reference window 712. Data control 710 enables directors to enter URLs at any time during manual operations.
As described above, GUI 133 of FIG. 4 is a user- friendly graphical interface that enables the director (i.e., video director 135 from FIG. 1), or other personnel, to interact with the control system and make timely edits and revisions to the production as it is being filmed, videotaped, or broadcast. In an embodiment of GUI 133, the time sheet includes a horizontal timeline 402 and one or more horizontal control lines 404a-404p. In particular, the time sheet section of GUI 133 provides the video director (or user) with a more efficient way of maneuvering around the time sheet. This is accomplished by allowing the user to group and/or manipulate elements (or icons) on different levels, increase the speed at which elements are triggered from the time sheet and increase the user's flexibility to define a custom time sheet view.
To facilitate the understanding of the present invention with regards to the time sheet section of GUI 133, the following definitions are provided.
Hierarchical Grouping - a series of layered collections of objects (e.g., icons 406a-406t discussed above) within a system.
Layer - one of the various levels of collections.

Object - a single icon or element dropped onto the timeline or time sheet of GUI 133. Example icons are GPI/O mark icon 406c, a DVE icon, audio icon 406i, and so forth.

Event - objects (icons) placed between GPI/O mark icons 406c, which execute one element of the video production.

TME (Transition Macro Elements) - a collection of one or more objects, events or icons on the time sheet.
Page - a collection of one or more TME's on the time sheet. (Newsroom systems may define a page as a single line on the rundown or as a unique slug within the rundown.)
Story - a collection of one or more pages on the time sheet. On newsroom systems, a single line or slug, or multiple lines or slugs, may make up a story. A story may require input from the user at some point.
Show - a group of one or more stories on the time sheet. Every object on the time sheet is part of a show. When the time sheet is saved, it may be saved as a show.
Layout - the layout maintains user-definable time sheet views (visible rows, grouping view), LBN pages, camera preset hotkeys, CG/SS hotkeys, the switcher layout, the audio layout (audio presets, page setup, aux setup, channel setup), and the position and visibility of GUI windows.
Layer Handles - a graphical bar displayed on the time sheet that corresponds to a "Layer", which gives the ability to grab and maneuver the specific "Layer".
ITME (Intelligent Transition Elements) - a series of one or more objects (or icons) on the time sheet that contains link information to other objects (or icons).
Class ID (Major ID) - defines the module with which the present invention will communicate (DVE, audio, keyers, ScriptViewer, cameras, machine control, CG/SS, GPO, web, etc.).

NCS (Newsroom Computer System) - the newsroom management software that creates show rundowns. The rundowns become the running order of stories and events within a show.
Rundown Converter - the intelligent intermediary between an NCS and video production system 100 (FIG. 1).

C. Hierarchical Group Levels
The present invention defines a hierarchy of at least five (5) group levels for the time sheet of GUI 133. The hierarchy of the group levels is illustrated in FIG. 9. FIG. 9 illustrates that the object level is at the bottom of the hierarchy and the show level is at the top of the hierarchy. Moving from the object level to the show level, the other levels include a TME level, a page level and a story level. This hierarchy is further illustrated in FIG. 10.
FIG. 10 illustrates that one or more objects (or icons) make up a TME, one or more TMEs make up a page, one or more pages make up a story and one or more stories make up a show. The present invention provides for group level GUIs that illustrate each of these group layers in the time sheet of GUI 133. The group layer GUIs are illustrated in FIGs. 11-14. The object group layer GUI is illustrated in FIG. 11. The TME group layer GUI is illustrated in FIG. 12. The page group layer GUI is illustrated in FIG. 13. The story group layer GUI is illustrated in FIG. 14. Each of these group layer GUIs will be described in more detail below.
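The five-level containment of FIG. 10 can be sketched with simple data structures. The class and field names below are illustrative assumptions, not the system's actual data model:

```python
# Hypothetical sketch of the hierarchy of FIGs. 9-10:
# object -> TME -> page -> story -> show.
from dataclasses import dataclass, field

@dataclass
class Obj:                 # a single icon dropped on the time sheet
    name: str

@dataclass
class TME:                 # a collection of one or more objects
    objects: list = field(default_factory=list)

@dataclass
class Page:                # a collection of one or more TMEs
    tmes: list = field(default_factory=list)

@dataclass
class Story:               # a collection of one or more pages
    pages: list = field(default_factory=list)

@dataclass
class Show:                # a group of one or more stories; the container for everything
    stories: list = field(default_factory=list)

    def all_objects(self):
        # Walk the hierarchy from the show level down to the object level.
        return [o for st in self.stories for p in st.pages
                for t in p.tmes for o in t.objects]
```

A dropped object is thus, by construction, a member of every layer above it, which mirrors the default membership rule described in the next section.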
D. Hierarchical Flow
Several rules apply to the group layer GUIs of the time sheet for GUI 133 of the present invention. First, when an object (or icon) is dropped on the time sheet, it is by default a member of all group layers. For example, the object that is dropped is part of the object, TME, page, story and show layers. Additional objects can be placed within any level above the object level. An object dropped in a previous object's TME layer is a member of that first object's TME layer, page layer, story layer and show layer. An object dropped in a previous object's page layer starts a new TME layer, but is a member of the first object's page layer, story layer and show layer. Objects can be gathered under the TME level, TMEs can be gathered under the page level, pages can be gathered under the story level, and everything is under the show level. A show is the container for everything, and can be divided into various stories. A story can contain multiple page layers, a page layer can contain multiple TME layers, and a TME layer can contain multiple object layers. The present invention provides graphical layer handles for easier manipulation of the TME, page and story group layers. Referring to FIG. 12, an example TME layer GUI, two handles are illustrated: handle 1202 and handle 1204. In FIG. 13, an example page layer GUI, two handles are illustrated: handle 1302 and handle 1304. In FIG. 14, an example story layer GUI, one handle 1402 is illustrated.
Each handle shown in FIGs. 12-14 is a graphical bar that stretches from the beginning to the end of the layer. In an embodiment of the invention, a handle should not extend beyond the right edge of the layer. The handles in the page layer may be titled with the slug name from the newsroom system for that page. For example, in FIG. 13, the slug name for handle 1302 is "A01-Under Attack" and the slug name for handle 1304 is "A02-America at War."
A TME layer in the time sheet of GUI 133 can be "grabbed" and manipulated from anywhere in the TME layer. Labels can be placed on any control line 404a-404p (FIG. 4) of the time sheet. In an embodiment of the invention, a handle should not extend beyond the right edge of the TME layer.
Grouping rules provided by the present invention are described next.
The present invention provides a number of grouping rules. One rule is that the left edges of a TME layer, the page layer and the story layer are the same. Another rule is that like layers do not overlap. For example, a TME layer should not overlap another TME layer, a page layer should not overlap another page layer, and a story layer should not overlap another story layer. Another rule is that any object placed in a TME layer is always at least one frame to the right of the left edge of the TME layer. Yet another rule is that all prep (pre-process) times to the left of an object extend the TME layer to the left (the number of frames for prep + 1 frame). Another rule is that TME layers are spaced two (2) frames apart by default; the minimum spacing is one frame. The present invention also allows for user-definable TME layer spacing settings. Another rule is that page and story layers are spaced three (3) frames apart by default; the minimum spacing is one frame. The present invention also allows for user-definable page and story layer spacing settings. The present invention is not limited to the aforementioned rules. As would be apparent to one skilled in the relevant art(s), rules other than those just listed may be enforced by the invention. The time sheet setup provided by the present invention is described next.
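A few of these rules can be expressed as a small validity check. The `(start_frame, end_frame)` layer representation and the function names are assumptions made for illustration:

```python
# Hypothetical check of two grouping rules stated above: like layers may not
# overlap, and consecutive layers must honor a minimum spacing (one frame),
# with TME layers spaced two frames apart by default.

def layers_valid(layers, min_spacing=1):
    """layers: list of (start_frame, end_frame) tuples, sorted by start_frame."""
    for (s1, e1), (s2, e2) in zip(layers, layers[1:]):
        if s2 <= e1:                  # like layers must not overlap
            return False
        if s2 - e1 < min_spacing:     # end-to-start distance between layers
            return False
    return True

def place_next_tme(prev_end, spacing=2):
    # Default placement: TME layers are spaced two frames apart.
    return prev_end + spacing
```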
E. Time Sheet Setup Dialogs
A goal of the present invention is to provide the user with the maximum flexibility to define the look and layout of the time sheet of GUI 133. The present invention provides this maximum flexibility without negatively affecting the performance of video production system 100 (FIG. 1) and/or requiring the user to make massive manual changes to a large library of TMEs, LBNs, shows, and so forth. The time sheet setup includes, but is not limited to, four dialogs: (1) a timeline speed dialog; (2) a time sheet row setup dialog; (3) a time sheet layout setup dialog; and (4) a time sheet pre-process ("prep") setup dialog. Each of these is described in more detail below.
1. Timeline Speed Dialog
The timeline speed dialog includes a slider control that allows the user to change the speed at which timer indicator 408 travels across timeline 402 (FIG. 4).
2. Time Sheet Row Setup Dialog
As described above with reference to FIG. 4 and GUI 133, the time sheet includes a horizontal timeline 402 and one or more horizontal control lines 404a-404p. Automation control icons 406a-406t are positioned onto control lines 404a-404p at various locations relative to timeline 402, and configured to be associated with one or more video production commands and at least one video production device. The present invention provides a time sheet row setup dialog that includes three lists, as described with reference to FIG. 15. The time sheet row setup dialog may be password protected.
Referring to FIG. 15, time sheet row setup dialog 1500 includes three main lists. These three lists include a major ID list 1502, a current row order list 1504 and a new row order list 1506. Current row order list 1504 includes two columns, a row number 1508 and an icon 1510. New row order list 1506 also includes two columns, a row number 1512 and an icon 1514. Each of these is described in more detail next.
Major ID list 1502 contains icons representing each class ID (or major ID) to which a row can be assigned (e.g., class IDs for TME building).
Current row order list 1504 includes the row number 1508 and icon 1510 columns. New row order list 1506 includes the row number 1512 and icon 1514 columns. Here, the user may drag icons from the major ID list 1502 and drop them on the new row order list 1506. In addition, the user may drag icons from the current row order list 1504 and drop them on the new row order list 1506. The user may also create a row when an icon is dragged from the major ID list 1502 and dropped on the new row order list 1506. Here, the icon is automatically placed in the first available list index of icon column 1514 and the row number is assigned based on its position in the list.
When a user drags an icon (or item) from the current row order list 1504 and drops it on the new row order list 1506 (and not on an existing icon in new row order list 1506), the icon is placed at the same list index as the list index in the current row order list 1504. In addition, the icon is automatically "mapped" from current row order to new row order. If another icon already exists at that list index, then that icon is moved to the first available list index that has not been "mapped" and the icon from the current row order is placed at the same list index as the current row order list index and is automatically "mapped." In an embodiment of the present invention, at least one list icon for each active TME building class ID should be created (e.g., DVE, audio, keyers, script viewer, cameras, machine control, CG/SS, GPO, web, and so forth). The present invention allows the user to replace a row in time sheet row setup dialog 1500 if an icon from major ID list 1502 is dropped on an assigned list icon that is not "mapped." Here, the dropped icon replaces the existing icon. If the icon is dropped on a "mapped" icon, then a warning dialog may appear to inform the user that a mapped icon cannot be replaced. The present invention allows the user to delete a row in time sheet row setup dialog 1500 by right clicking on an icon in icon column 1514 of new row order list 1506 and then executing a "delete row command." When a row is deleted, all non-mapped icons below the deleted row are re-ordered by filling in the available non-mapped rows. The user can insert a row in time sheet row setup dialog 1500 by right clicking on an icon and then executing an "insert row command." When a row is inserted, all non-mapped icons re-order down, filling in the available non-mapped rows. If the list icon selected is a "mapped" icon, then a warning message may appear to warn the user that mapped rows cannot be moved.
The user can move rows in time sheet row setup dialog 1500 by holding the left button of the mouse to drag an icon. When the dragged icon is dropped onto another non-mapped row, the icon dropped (if not mapped) and all non-mapped icons below it re-order. Here, the dragged icon replaces the dropped icon. If a list icon is dragged and dropped onto an empty row, then that icon is placed on that row.
The present invention also allows for row mapping in time sheet row setup dialog 1500. Here, when icons are dragged from current row order list 1504 to new row order list 1506, the mapping is automatic since the row placement is the same. To map a current row order list icon to a different row of the new row order list 1506, the user may hold the left button of the mouse down to drag an icon from the current row order list 1504 and drop it on the row in the new row order list 1506 that the user wishes to associate the icon with. The color of the row number entry that is affected is changed to red (or any other predetermined color) and this indicates it is no longer available for selection. The present invention then draws a line from the current row order list icon to the new row order list icon to indicate the mapping relationship. Once an icon in either list is mapped, that icon is no longer available to be mapped to any other icon.
If the user wishes to un-map a mapped icon, he or she can select any mapped icon from the current row order list 1504 or the new row order list 1506 and right mouse click to select the "re-map row command." The line connecting the two icons is erased and the color of the affected row number entry is changed from red to black (or any other predetermined color). The relationship between the two icons is severed and each icon is available to be re-mapped.
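The mapping behavior might be modeled as follows; `RowMapper` and its method names are hypothetical, chosen only to illustrate the one-to-one mapping constraint and the "re-map row command":

```python
# Hypothetical model of row mapping in dialog 1500: once a current-row icon is
# mapped to a new-row icon, neither end may participate in another mapping
# until the "re-map row command" severs the link.

class RowMapper:
    def __init__(self):
        self.current_to_new = {}   # current row index -> new row index

    def map_row(self, cur, new):
        if cur in self.current_to_new or new in self.current_to_new.values():
            # Mirrors the red / unavailable-for-selection marking in the dialog.
            raise ValueError("row already mapped")
        self.current_to_new[cur] = new

    def unmap_row(self, cur):
        # "re-map row command": the relationship is severed, both rows are free.
        self.current_to_new.pop(cur, None)

    def unmapped_current(self, all_current):
        # Rows that must still be mapped (or deleted) before the dialog closes.
        return [r for r in all_current if r not in self.current_to_new]
```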
An example operation of the present invention upon receiving a command from the user to exit time sheet row setup dialog 1500 is illustrated in the flowchart of FIG. 16. In step 1602, the user sends a command to exit the time sheet row setup dialog 1500. Control then passes to step 1604.
In step 1604, the present invention determines whether changes were made to the lists in the time sheet row setup dialog 1500. The lists include major ID list 1502, current row order list 1504 and new row order list 1506. If the outcome of step 1604 is negative, then control passes to step 1606 where the time sheet row setup dialog 1500 is closed. At this point the flowchart in FIG. 16 ends. Alternatively, if the outcome of step 1604 is positive, then control passes to step 1608. In step 1608, the present invention checks a 'new row grid' to ensure that at least one row in the lists of time sheet row setup dialog 1500 has been created for every active TME building class ID. Control then passes to step 1610.
In step 1610, the present invention checks the row mapping to ensure all current row order list icons (i.e., column icon 1510) have been mapped to new row order list icons (i.e., column icon 1514). Control then passes to step 1612. In step 1612, if all current row order list icons have been mapped to new row order list icons, then control passes to step 1606 where the time sheet row setup dialog 1500 is closed. The flowchart in FIG. 16 ends at this point. Alternatively, control passes to step 1614. In step 1614, a warning message is given that not all rows have been mapped. Here, the TME library may be affected if all current row order list icons are not either mapped or deleted. The present invention provides the user with the opportunity to delete all unmapped current row order list icons. Control then passes to step 1616. In step 1616, if the user wants to delete all unmapped current row order list icons, then control passes to step 1618 where all unmapped current row order list icons are deleted and the flowchart in FIG. 16 ends. Alternatively, control passes to step 1620.
In step 1620, the user is returned to the time sheet row setup dialog 1500 to map any unmapped current row order list icons. The flowchart in
FIG. 16 ends at this point.
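The FIG. 16 exit flow can be condensed into a single decision function. The argument names and string return values are illustrative assumptions, not part of the disclosed system:

```python
# Hypothetical condensation of the FIG. 16 flowchart: close immediately when
# nothing changed; otherwise require a row per active class ID and a full
# mapping, or offer to delete the unmapped current rows.

def exit_row_setup(changed, active_class_ids, rows_by_class,
                   unmapped_rows, delete_unmapped):
    if not changed:
        return "closed"                      # steps 1604 -> 1606
    missing = [c for c in active_class_ids if c not in rows_by_class]
    if missing:
        return "missing rows"                # step 1608 check fails
    if not unmapped_rows:
        return "closed"                      # steps 1610/1612 -> 1606
    if delete_unmapped:
        return "deleted unmapped"            # steps 1614/1616 -> 1618
    return "return to dialog"                # step 1620
```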
Once all current row order list icons are either mapped or deleted, the present invention assigns a new row setup GUID. Once the new row setup GUID is assigned, the present invention opens a dialog to update the TME library. The update begins and the TME Library is searched. Each object, using the "new row order" map, replaces the old row number with the new row number and replaces the old Row Setup GUID with the new Row Setup GUID. If the old row number was not mapped to the New Row Order, then the object is deleted from the TME. Upon completion of the TME Library update, the user is prompted, "Do you wish to update another TME Library?"
If the answer is "Yes", then the "Update TME Library" dialog is opened and the update process is repeated for the new selected library. If the answer is "No", then the dialog is closed and the process is complete.
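The library update rule (re-number mapped objects, delete objects on unmapped rows) might look like this in outline; the dictionary-based object representation is an assumption for illustration:

```python
# Hypothetical sketch of the TME library update: each object's old row number
# is replaced via the "new row order" map and its Row Setup GUID is replaced
# with the new GUID; objects whose old row was not mapped are deleted.

def update_tme(objects, row_map, new_setup_guid):
    """objects: list of dicts with 'row' and 'guid' keys (an assumed format)."""
    updated = []
    for obj in objects:
        if obj["row"] not in row_map:
            continue                                   # unmapped row: delete from TME
        updated.append({"row": row_map[obj["row"]],    # new row number
                        "guid": new_setup_guid})       # new Row Setup GUID
    return updated
```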
A copy of the Old Row Order, New Row Order, Mapping and Row Setup GUID is automatically saved by the present invention. The Old Row Order can be recalled from a menu item called "Restore Old Row Order". Here, the current state is replaced with the saved Old Row Order, New Row Order, Mapping and Row Setup GUID. This enables the updating of a TME Library from the last known state to the current row setup. The third dialog included in the time sheet setup is described next.
3. Time Sheet Layout Setup Dialog
The time sheet layout setup dialog of the present invention allows the user to define: (1) the spacing of TMEs, pages and stories; (2) visible rows on the time sheet of GUI 133; (3) the layering view of the object, TME, page and story group levels; (4) the LBN pages loaded; (5) the camera preset hot keys loaded; (6) the CG/SS hot keys loaded; (7) the switcher layout; (8) the audio layout (audio presets, page setup, aux setup, channel setup); (9) the position and visibility of the different GUI windows, and so forth. The present invention is not limited to the aforementioned features of the time sheet layout setup dialog.
An embodiment of the time sheet layout setup dialog 1700 is shown in FIG. 17. Time sheet layout setup dialog 1700 includes a table 1702 entitled "row view" that includes a three-column list. The three-column list includes a hide row 1704, a row number 1706 and an icon 1708. Dialog 1700 also includes a check box 1710 to allow the user to save the layout with a default window position. In addition, dialog 1700 includes a check box 1712 to allow the user to save the layout with the current window positions. Dialog 1700 also includes a check box 1714 to allow the user to save the layout.
Referring to table 1702 in FIG. 17, the present invention allows the user to check the "hide row" box (in hide row 1704) for each row (in row number 1706) the user wishes to hide in the time sheet of GUI 133. Alternatively, the user can un-check the "hide row" box for each row the user wants visible in the time sheet of GUI 133.
As mentioned above, dialog 1700 includes a check box 1710 to allow the user to save the layout with a default window position, a check box 1712 to allow the user to save the layout with the current window positions, and a check box 1714 to allow the user to save the layout. In the present invention, when a layout is saved, various modules each load setup files. For example, a switcher module can load a switcher setup file, an audio module can load an audio setup file, a LBN module can load LBN pages, a camera preset module can load camera preset pages, and so forth. When video production system 100 (FIG. 1) is started, default setup files are loaded for each module. When a layout is saved, a setup file for each module's current setup is saved with the same name as the layout file name. In addition, the current row view is saved with the layout and, depending on which option is selected (save layout with default window positions or save layout with current window positions), the modules' position and visibility are saved with the layout.
The user may recall a layout by selecting a File/Load Layout menu. The invention allows the user to locate and select a layout to load. The fourth dialog included in the time sheet setup is described next.
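The save behavior might be sketched as below; the file-naming scheme is an assumption, chosen only to illustrate that every module's current setup is stored under the layout's name together with the row view and window-position option:

```python
# Hypothetical sketch of saving a layout: each module's setup is stored under
# the layout name, along with the row view and the window-position choice.

def save_layout(name, modules, row_view, use_current_positions):
    """modules: {module_name: setup_data} (an assumed representation)."""
    files = {f"{name}.{mod}.setup": setup for mod, setup in modules.items()}
    files[f"{name}.rowview"] = row_view
    files[f"{name}.windows"] = "current" if use_current_positions else "default"
    return files
```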
4. Time Sheet Pre-process ("Prep") Setup Dialog
The timeline prep setup allows the user to assign the left edge trigger for prep (pre-process) objects or icons. An embodiment of the timeline prep setup dialog 1800 is shown in FIG. 18 and includes a text box 1802 and a combo box 1804. Text box 1802 contains the prep number (the number of frames to the left of the icon in the time sheet of GUI 133 at which the pre-process will occur). Combo box 1804 contains a list of icons that are defined to have prep (pre-process) attributes. One prep number can be assigned for all prep objects by selecting "ALL", the first selection in the list. Examples of prep icons include, but are not limited to, a digital video effects device (DVE) icon (not shown in FIG. 4) and the keyer control icon 406h (FIG. 4). Time sheet views of the present invention are described next.
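The prep trigger computation described above can be sketched as follows; the function name and the dictionary form of the prep-number settings are assumptions:

```python
# Hypothetical prep trigger: a pre-process fires the assigned number of
# frames to the left of the icon's left edge. Selecting "ALL" applies one
# prep number to every prep-capable icon type.

def prep_trigger_frame(icon_left_edge, prep_numbers, icon_type):
    # prep_numbers: {"ALL": n} or per-type entries such as {"DVE": 10}
    frames = prep_numbers.get("ALL", prep_numbers.get(icon_type, 0))
    return max(0, icon_left_edge - frames)   # never before the timeline start
```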
F. Time Sheet Views
As described above with reference to FIG. 4, the time sheet includes a horizontal timeline 402 and one or more horizontal control lines 404a-404p. Automation control icons 406a-406t are positioned onto control lines 404a- 404p at various locations relative to timeline 402, and configured to be associated with one or more video production commands and at least one video production device. The time sheet of GUI 133 provides user selectable and definable views of the time sheet (e.g., user layouts). The present invention provides at least two different view property types, including grouping views and visible rows.
1. Time Sheet Window
The user can resize the time sheet window to see more rows. In an embodiment of the present invention, three rows in the time sheet are fixed. A GPI-Slug row is fixed at the top of the time sheet window. The layer handles are fixed at the bottom of the time sheet window. (See, for example, FIG. 12 and example TME layer GUI that illustrates two handles, handle 1202 and handle 1204, located at the bottom of the time sheet window.) The TME label row is typically fixed above the group handle row. This example embodiment of fixed rows is not meant to limit the invention.
In an embodiment of the present invention, time sheet rows typically have a height of 32 pixels. Typically, sixteen (16) rows can be seen in the default window size of the time sheet. If there are more visible rows than can be seen in the time sheet window, then the time sheet window can be scrolled from top to bottom with a scroll bar. All rows in the time sheet scroll except for the fixed rows. This example embodiment of the present invention is not meant to limit the invention. As would be apparent to one skilled in the relevant art(s), time sheet views other than those listed could be provided.
2. Visible Rows of the Time Sheet
In the time sheet setup feature of the present invention described above with reference to FIG. 17, the number and position of rows can be assigned. Each assigned row can be visible or hidden from view. Here, the visible rows are assigned in time sheet setup and stored with the user layout. The present invention allows the user at any time to see all rows in the time sheet. The user can right mouse click anywhere on the time sheet to see the popup menu. Depending on the current view, either "Show Hidden Rows" or "Hide Rows" will be available for selection. If rows are hidden and the user selects "Show Hidden Rows," all rows are painted to the time sheet in GUI 133. If all rows are visible in the time sheet, rows are selected by the user to be hidden in time sheet setup, and the user selects "Hide Rows," then only viewable rows are painted to the time sheet of GUI 133.
3. Layering Views of the Time Sheet
The time sheet of the present invention provides user-selectable layering views. The different views of the time sheet are based on the layer levels described above with reference to FIGs. 10-14. FIG. 10 illustrates that one or more objects make up a TME, one or more TMEs make up a page, one or more pages make up a story and one or more stories make up a show. The present invention provides for group level GUIs that illustrate each of these group layers in the time sheet of GUI 133. The group layer GUIs are illustrated in FIGs. 11-14. The object group layer GUI is illustrated in FIG. 11. The TME group layer GUI is illustrated in FIG. 12. The page group layer GUI is illustrated in FIG. 13. The story group layer GUI is illustrated in FIG. 14. FIG. 19 illustrates another embodiment of the page group layer GUI. The objects or icons are always visible on the time sheet, but the colored grouping levels seen on the time sheet will change with each view (TME, page or story). For example, the TME view will show the TME layer, the page view will show the page layer, and the story view will show the story layer. Operation of the time sheet is described next.

G. Time Sheet Operation
1. Time Sheet Objects or Icons
Various automation control icons 406a-406t were described above with reference to FIG. 4. These icons included label icon 406a, step mark icon 406b, general purpose input/output (GPI/O) mark icon 406c, user mark icon 406d, encode mark 406e, transition icons 406f-406g, keyer control icon 406h, audio icon 406i, teleprompter icon 406j, character generator (CG) icon 406k, camera icons 406l-406n, VTR icons 406p-406r, GPO icon 406s and encode object icon 406t. Additional embodiments of some of these icons will be described next, along with new icons not described with reference to FIG. 4.
a. GPI/O Mark Icon
GPI/O (general purpose input/output) mark icon 406c is associated with time sheet step commands. The time sheet step commands instruct timer indicator 408 to start or stop running until deactivated or reactivated by the director or another video production device. The default GPI number is one. A GPI/O mark icon property page includes a time control with the GPI/O mark timeline position number. The timeline position number is in time. The time is typically in hours/minutes/seconds/frames (hh/mm/ss/ff). The GPI/O mark icon property page also includes a combo box to select the triggering GPI/O number. When a GPI signal for the assigned GPI is received, timer indicator 408 begins to play.
b. Jump Mark Icon
The jump mark icon (not shown in FIG. 4) is similar to step mark icon 406b. When timer indicator 408 hits a jump mark icon, it jumps to the next GPI mark icon on the timeline. All icons to the right of the jump mark icon and up to the left GPI mark icon are executed. If a jump mark icon triggers another jump mark icon, the jumps do not accumulate. No matter how many jump mark icons timer indicator 408 encounters, it will only jump to the next GPI mark icon. If no GPI mark icon is on the timeline to the right of the jump mark icon, then timer indicator 408 does not jump. A jump mark property page includes a time control with the jump mark timeline position number. The timeline position number is in time. The time is in hours/minutes/seconds/frames (hh/mm/ss/ff).
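The jump-mark rules can be condensed into a small function. The `(frame, kind)` mark representation is an assumption made for illustration:

```python
# Hypothetical resolution of a jump mark: the timer indicator only ever jumps
# to the next GPI mark, chained jump marks never accumulate, and with no GPI
# mark to the right the indicator does not jump at all.

def resolve_jump(position, marks):
    """marks: list of (frame, kind) tuples sorted by frame; kind is 'jump' or 'gpi'."""
    for frame, kind in marks:
        if frame > position and kind == "gpi":
            return frame          # the single target, regardless of intervening jumps
    return position               # no GPI mark to the right: no jump
```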
c. DVE Mark Icon
A DVE (digital video effects device) mark icon has three triggers that include a prep (pre-process), a trans (process) and a post (post-process). Prep (pre-process) occurs (a user definable number of frames) to the left of the left edge of the DVE mark icon. Trans (process) occurs at the left edge of the DVE mark icon.
The user definable prep number can be a global prep number or individually assigned number for each time sheet icon type. Typically, the minimum number is two (2) frames. The typical default prep number is ten (10) frames. DVE mark icon pre-process only occurs if the previous DVE mark icon, for the same DVE, has completed its duration. The prep commands are buffered. When the previous DVE mark icon completes its transition, the prep commands are sent.
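The buffering rule might be modeled like this; the `DvePrep` class and its command-list interface are hypothetical:

```python
# Hypothetical model of DVE prep buffering: prep commands for a DVE are held
# until the previous DVE mark's transition has completed, then sent.

class DvePrep:
    def __init__(self):
        self.buffered = []
        self.prev_transition_done = True
        self.sent = []

    def prep(self, commands):
        if self.prev_transition_done:
            self.sent.extend(commands)      # previous transition finished: send now
        else:
            self.buffered.extend(commands)  # otherwise hold until it completes

    def on_transition_complete(self):
        self.prev_transition_done = True
        self.sent.extend(self.buffered)     # flush the buffered prep commands
        self.buffered.clear()
```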
The items that are prepped may include: DVE process effects for the assigned DVE button; video switches for the assigned fields (program, preview, aux1, aux2, preview keyers fill, preview keyers hole); and preview keyers turn (on or off) at prep. The DVE trans occurs at the left edge of the DVE mark icon. The DVE trans can only occur after prep is completed. A left mouse double click on the DVE mark icon opens the DVE property page.

d. Keyer Icon
Keyer icons have three triggers: prep (pre-process), trans (process) and post (post-process). Prep occurs a user-definable number of frames to the left of the left edge of the keyer icon. Trans occurs at the left edge of the keyer icon. The items that occur at prep include video switches for the assigned fields (aux keyers background, aux keyers fill, aux keyers hole, DSK keyers fill, DSK keyers hole). The items that occur at trans include: aux keyers turn (on or off) at trans; DSK keyers turn (on or off) at trans; and video switches to the aux video outs occur at trans.
e. Audio Icon
As described above with reference to FIG. 4, audio icon 406i can be positioned onto control line 404e and is associated with commands for controlling audio equipment, such as audio mixers, digital audio tape (DAT), cassette equipment, other audio sources (e.g., CDs and DATs), and the like. Audio icons trigger on the left edge of the audio icon. A left mouse double click on the audio icon opens the audio property page.
f. Script Viewer Icon
Script viewer icons trigger on the left edge of the script viewer icon. A left mouse double click on the script viewer icon opens the script viewer property page.
g. CG/SS Icon
CG/SS icons trigger on the left edge of the CG/SS icon. A left mouse double click on the CG/SS icon opens the CG/SS property page.

h. Machine Control Icon
Machine control icons trigger on the left edge of the machine control icon. A left mouse double click on the machine control icon opens the machine control property page.
i. Camera Preset Icon
Camera preset icons trigger on the left edge of the camera preset icon. A left mouse double click on the camera preset icon opens the camera preset property page.
j. GPO Icon
As described above with reference to FIG. 4, GPO icon 406s is associated with commands for controlling GPI or GPO devices. GPO icons trigger on the left edge of the GPO icon. A left mouse double click on the GPO icon opens the GPO property page.
2. Timer Indicator Controls
As described above with reference to FIG. 4, a timer (not shown) is integrated into timeline 402, and operable to activate a specific automation control icon 406a-406t as a timer indicator or cursor 408 travels across timeline 402 to reach a location linked to the specific automation control icon. The timer indicator may be controlled via GUI controls, keyboard controls, GPI inputs and an optional shot box. Cursor controls include play, cue, stop, next/previous GPI, next/previous TME, next page, and next story. Each of these is described next with example GUI buttons in FIG. 20. It is important to note that the example GUI buttons in FIG. 20 are for illustration purposes only and are not meant to limit the invention.

a. Play
Timer indicator 408 starts when a GUI Play Button 2002 is pressed. Timer indicator 408 may also start when the Alt and Spacebar keys on the keyboard are pressed. When timer indicator 408 stops at a GPI mark, timer indicator 408 starts when it receives a GPI input. Timer indicator 408 also starts when a ShotBox Play Button is pressed.
b. Cue
Timer indicator 408 jumps back to the beginning of timeline 402 when the Alt key on the keyboard and a GUI Cue Button 2004 are pressed. Timer indicator 408 jumps back to the beginning of timeline 402 when the Alt and C keys on the keyboard are pressed. When timer indicator 408 is cued, it automatically stops before jumping back to the beginning of timeline 402.
c. Stop
Timer indicator 408 stops when a GUI Stop Button 2006 is pressed. Timer indicator 408 stops when the Alt and S keys on the keyboard are pressed.
Timer indicator 408 stops at GPI marks.
d. Next/Previous GPI
Timer indicator 408 only skips to the next GPI mark when timer indicator 408 is stopped. When a Next G Button 2008 is pressed, timer indicator 408 jumps to the next GPI mark on timeline 402. None of the timeline icons jumped over are executed. When the Previous G Button 2016 is pressed, timer indicator 408 jumps to the previous GPI mark on timeline 402. None of the timeline icons (or objects) jumped over are executed. The ShotBox will have Next and Previous Buttons to advance timer indicator 408 to the next or previous GPI mark.
e. Next/Previous TME
Timer indicator 408 only skips to the next TME when timer indicator 408 is stopped. When a Next T Button 2010 is pressed, timer indicator 408 jumps to the left edge of the next TME on timeline 402. None of the timeline icons jumped over are executed. When a Previous T Button 2018 is pressed, timer indicator 408 jumps to the left edge of the previous TME on timeline 402. None of the timeline icons jumped over are executed. The ShotBox will have Next and Previous Buttons to advance timer indicator 408 to the next or previous TME.
f. Next/Previous Page
Timer indicator 408 only skips to the next page when timer indicator 408 is stopped. When a Next P Button 2012 is pressed, timer indicator 408 jumps to the left edge of the next page on timeline 402. None of the timeline icons jumped over are executed. When a Previous P Button 2020 is pressed, timer indicator 408 jumps to the left edge of the previous page on timeline 402. None of the timeline icons jumped over are executed. The ShotBox will have Next and Previous Buttons to advance timer indicator 408 to the next or previous page.
g. Next/Previous Story
Timer indicator 408 only skips to the next story when timer indicator 408 is stopped. When a Next S Button 2014 is pressed, timer indicator 408 jumps to the left edge of the next story on timeline 402. None of the timeline icons jumped over are executed. When a Previous S Button 2022 is pressed, timer indicator 408 jumps to the left edge of the previous story on timeline 402. None of the timeline icons jumped over are executed. The ShotBox will have Next and Previous Buttons to advance timer indicator 408 to the next or previous story.
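The next/previous navigation behavior described above can be sketched as follows. This is a minimal illustration only; the Timeline class, mark kinds, and field names are assumptions for illustration and are not part of the disclosed system. The essential point, per the description above, is that the stopped cursor jumps directly to the target mark and none of the icons jumped over are executed.

```python
# Illustrative sketch: a stopped timer indicator jumps to the next or
# previous mark of a given kind (GPI, TME, page, story) and the icons
# between the old and new cursor positions are skipped, never executed.

class Timeline:
    def __init__(self, marks):
        # marks: list of (position, kind) tuples, e.g. (120, "gpi")
        self.marks = sorted(marks)
        self.cursor = 0
        self.executed = []          # record of icons actually triggered

    def jump_next(self, kind):
        """Jump the stopped cursor forward to the next mark of `kind`."""
        for pos, k in self.marks:
            if k == kind and pos > self.cursor:
                self.cursor = pos   # icons jumped over are not executed
                return pos
        return None

    def jump_previous(self, kind):
        """Jump the stopped cursor back to the previous mark of `kind`."""
        for pos, k in reversed(self.marks):
            if k == kind and pos < self.cursor:
                self.cursor = pos
                return pos
        return None
```

Note that `self.executed` remains empty across jumps, reflecting the rule that skipped icons are never triggered.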
3. Timeline Speed
The timeline speed is adjustable by the user within a range. The timeline speed can be adjusted only when timer indicator 408 is stopped. The spacing between timeline icons is directly related to the speed of timer indicator 408. When the timeline speed is adjusted, the spacing of timeline icons must be adjusted as well. The timeline speed dialog can be accessed from the timeline setup dialog.
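The speed/spacing relationship described above can be sketched as a simple rescaling: if each icon's horizontal position encodes its trigger time, changing the timeline speed multiplies every position by the speed ratio so each icon still triggers at the same wall-clock time. The function name and units below are illustrative assumptions.

```python
# Minimal sketch: rescale icon x-positions when the timeline speed
# changes (allowed only while the timer indicator is stopped), so that
# each icon keeps the same trigger time.

def rescale_icon_positions(positions, old_speed, new_speed):
    """Rescale timeline icon offsets for a new speed.

    positions: pixel offsets of icons along the timeline
    old_speed, new_speed: timeline speed in pixels per second
    """
    factor = new_speed / old_speed
    return [p * factor for p in positions]
```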
H. Dynamic Time Sheet
The dynamic time sheet feature of the present invention increases the amount of information with regard to the relationship between icons or group levels on the time sheet. Features of the present invention that relate to the dynamic time sheet include, but are not limited to, dynamic links, intelligent transition macro elements (ITME), TME replacement, auto-channel, global macro changes, and conflict identification. Each of these features is described next with reference to a Newsroom Computer System (NCS). As described above, newsroom management software creates show rundowns. This rundown becomes the running order of stories and events within a show. A rundown converter is the intelligent intermediary between an NCS and video production system 100 (FIG. 1). The NCS is utilized for illustration purposes only and is not meant to limit the invention.
1. Dynamic Links
In an embodiment of the present invention, a dynamic link is maintained with the NCS rundown. As changes are made on the NCS rundown, the time sheet of the present invention is updated with the changes. The rundown converter module will maintain the link between the NCS rundown and the time sheet. The first rundown convert occurs when the user checks a show in the rundown converter dialog. A dynamic link is maintained between the rundown converter module, the NCS and the time sheet until the rundown is unchecked.
The dynamic time sheet has three modes of operation, including an automatic time sheet update mode, a manual time sheet update mode, and a no time sheet update mode. Each of these modes is discussed next.
The rundown converter module may be set to update the time sheet automatically. Here, when a change is made on the NCS rundown, rundown converter automatically updates the time sheet. Alternatively, the rundown converter module may be set to update the time sheet manually. Here, when a change is made on the NCS rundown, rundown converter alerts the time sheet that a change has been made, but the time sheet is not updated until the user accepts the changes. Finally, in the no time sheet update mode, no changes or alerts are sent to the time sheet.
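The three update modes described above can be sketched as a simple dispatch on the converter's mode setting. The class, constants, and method names below are illustrative assumptions only, not the disclosed implementation.

```python
# Sketch of the three time-sheet update modes: automatic (apply the NCS
# change immediately), manual (alert only, hold the change until the
# user accepts), and none (send no changes or alerts at all).

AUTO, MANUAL, NONE = "auto", "manual", "none"

class RundownConverter:
    def __init__(self, mode=AUTO):
        self.mode = mode
        self.timesheet = []      # changes applied to the time sheet
        self.pending = []        # changes awaiting user acceptance

    def on_ncs_change(self, change):
        if self.mode == AUTO:
            self.timesheet.append(change)   # update immediately
        elif self.mode == MANUAL:
            self.pending.append(change)     # alert; wait for user accept
        # NONE: nothing is sent to the time sheet

    def accept_pending(self):
        """User accepts the held changes (manual mode)."""
        self.timesheet.extend(self.pending)
        self.pending.clear()
```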
The present invention provides a rundown converter dialog that consists of a tree view and a setup menu. An example rundown converter dialog is shown in FIG. 21. The rundown server setup dialog is used to build the list of available rundowns in the NCS.
2. Intelligent Transition Macro Elements (ITME)
As described above, a TME (Transition Macro Element) is a collection of one or more objects, events, or icons on the time sheet. ITMEs are TMEs with built-in link rules of operation. These relational instructions, in addition to icon mapping, are used by an interface to the time sheet to make the necessary changes to appropriate linked icons. The time sheet of the present invention can use the same set of build rules used by the TME builder.
3. TME Replacement
When a TME is saved, a TME GUID is stored with the TME. If the user wishes to replace all TMEs with the same TME GUID, the user may right click on the mouse and select "Replace TME." A dialog opens to select the TME to replace the existing TME and all TMEs on the time sheet with the same TME GUID.
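The "Replace TME" operation described above amounts to a replace-by-identifier pass over the time sheet: every TME carrying the same GUID as the selected TME is swapped for the replacement. The dict-based TME representation below is an assumption for illustration.

```python
# Sketch of "Replace TME": every TME on the time sheet sharing the
# target GUID is replaced with the TME chosen from the dialog.

def replace_tmes(timesheet, target_guid, replacement):
    """Return a new time sheet with every TME carrying target_guid replaced."""
    return [replacement if tme["guid"] == target_guid else tme
            for tme in timesheet]
```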
4. Auto-Channel
The auto-channel feature of the present invention automatically assigns server channels from a pool of server channels. Each time a server load command is encountered, the auto-channel module finds the next available server channel and makes the necessary changes within the appropriate icons based on the ITME instructions. Auto-channel can only pool channels from the same device. Multiple channels may be connected to the same media. When setting up the port for a server device, it can be designated as a pooled device. When the time sheet encounters a pooled device load command identified by its GUID, the auto-channel module assigns the next available channel for that device. Each time a channel is assigned, the interface increments to the next channel. When the last channel in the pool of assigned channels is assigned, the interface resets to the first channel. When the channel is assigned, the auto-channel module uses the ITME link instructions to populate the appropriate time sheet icons.
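The increment-then-wrap behavior described above is a round-robin assignment over one device's channel pool. The following sketch illustrates only that rotation; the class and field names are assumptions, and the ITME link population step is omitted.

```python
# Sketch of auto-channel assignment: channels of one pooled device are
# handed out in order, wrapping back to the first channel after the
# last channel in the pool has been assigned.

class ChannelPool:
    def __init__(self, device_guid, channels):
        self.device_guid = device_guid
        self.channels = channels         # e.g. ["ch1", "ch2", "ch3"]
        self.next_index = 0

    def assign(self):
        """Return the next available channel, wrapping around the pool."""
        channel = self.channels[self.next_index]
        self.next_index = (self.next_index + 1) % len(self.channels)
        return channel
```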
5. Global Macro Changes
Another feature of the dynamic time sheet of the present invention is the ability to make global changes across the time sheet. This involves replacing a non-linked source with another source, wherever encountered on the time sheet. An example of a desire for a global change would occur if "Mic1" goes bad and the user wants to replace all "Mic1" entries with "Mic2." The change would occur within a single type of icon on the time sheet and would involve replacing a linked source with another source, wherever encountered on the time sheet. Using the ITME link data, the time sheet will make the necessary changes to all linked icons.
6. Conflict Identification
As the present invention increases the ability to maneuver around the time sheet, and increases the amount of relational data between icons on the time sheet, there is a need to add intelligence to the time sheet to be able to identify conflicts. The present invention provides a set of global rules that comprise known production or system violations that produce on-air mistakes. For example, if a camera is on-air and the time sheet encounters a camera preset that is different from the last camera preset issued for that camera, then an error message should be given. In another example, if a tape or server machine is on-air and the time sheet encounters a tape or server cue command, then an error message should be given. A further example is if a CG channel is on-air and the time sheet encounters a CG command for that CG channel, then an error message should be given. These global rules are provided for illustration purposes only and are not meant to limit the invention.
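The global rules above can be sketched as checks applied to each upcoming time sheet event against the set of on-air sources. The rule set, event shape, and messages below are illustrative assumptions mirroring the three examples in the text, not an exhaustive or authoritative rule base.

```python
# Sketch of conflict identification: each global rule flags a known
# production violation for an event whose source is currently on-air.

def check_conflicts(on_air_sources, event):
    """Return an error message if the event violates a global rule, else None."""
    kind, source = event["kind"], event["source"]
    if source not in on_air_sources:
        return None                    # off-air devices may be prepped freely
    if kind == "camera_preset":
        return f"{source} is on-air: camera preset would move a live shot"
    if kind in ("tape_cue", "server_cue"):
        return f"{source} is on-air: cue command would disrupt playout"
    if kind == "cg_command":
        return f"{source} is on-air: CG command would change a live graphic"
    return None
```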
I. Enhanced Time Sheet Look Ahead (ETLA)
The ETLA (enhanced time sheet look ahead) feature of the time sheet is designed to act on (e.g., select) time sheet icons as soon as those icons are free from their current operation. The farther in advance a device can be prepped or cued, the less time it takes to air the device. The ETLA operation is seamless to the end user, because the operational rules are built into the time sheet. Some of the benefits of the ETLA feature of the present invention include icon status for future conflict identification or resolution; it guarantees no unwanted on-air cueing; it provides for a tighter, faster show pace due to pre-loaded media; and it provides for a visually clearer time sheet due to better-defined group level separation.
The following definitions are provided for the ETLA feature of the present invention.
ETLA (Enhanced Time Sheet Look Ahead) - the process in which the time sheet searches ahead, to the right of the timer indicator, looking for ETLA icons to trigger.
ETLA Search - the point at which the time sheet begins looking for the next ETLA icon to trigger.
Done - when an icon is no longer on-air or needed for a particular TME or story. There are multiple levels of Done. They include: TME Done - when timer indicator 408 steps past the first GPI mark in the next TME; Page Done - when timer indicator 408 steps past the first GPI mark in the next page; and Story Done - when timer indicator 408 steps past the first GPI mark in the next story.
Various ETLA icons include, but are not limited to, a camera preset icon, a still store load icon, a VTR cue icon and a server load icon.
1. ETLA Search
The ETLA search is done at different levels, with a set of boundaries. The boundaries are determined by the time sheet group levels (TME, page, story) in which the ETLA icons reside. The first ETLA search begins when timer indicator 408 stops at the first GPI mark on the time sheet. Each time timer indicator 408 advances past the first GPI mark of a new TME, a search begins. The first occurrence of each ETLA icon (i.e., camera preset, still store load, VTR cue, server load) to the right of timer indicator 408 is searched for on the time sheet. Regarding the TME group level, if an ETLA icon is part of the TME group when the ETLA search begins, then that ETLA icon type is not searched for until timer indicator 408 passes the first GPI mark of the next TME. The ETLA search does not extend beyond the current story. Regarding the page group level, if an ETLA icon is part of the page level when the ETLA search begins, then that ETLA icon type is not searched for until timer indicator 408 passes the first GPI mark of the next page. Regarding the story group level, if an ETLA icon is part of the story level when the ETLA search begins, then that ETLA icon type is not searched for until timer indicator 408 passes the first GPI mark of the next story. Finally, with the show group level, an ETLA search will look for ETLA icons until it reaches the end of the show level.
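The search behavior above can be sketched as a left-to-right scan that records only the first occurrence of each ETLA icon type to the right of the timer indicator, excluding icons that belong to the indicator's current group. This is a simplified single-level illustration; the data shapes are assumptions, and the full TME/page/story boundary hierarchy is collapsed into one `group` field.

```python
# Sketch of an ETLA search: find the first occurrence of each ETLA icon
# type (camera preset, still store load, VTR cue, server load) to the
# right of the cursor, skipping icons inside the current group level.

def etla_search(icons, cursor, current_group):
    """Return the first icon of each type right of cursor, per group rules."""
    found = {}
    for icon in sorted(icons, key=lambda i: i["pos"]):
        if icon["pos"] <= cursor or icon["group"] == current_group:
            continue                    # behind cursor, or in current group
        if icon["type"] not in found:
            found[icon["type"]] = icon  # first occurrence only; search ends
    return found
```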
2. ETLA Rules
The present invention provides ETLA rules. For example, one ETLA rule states that no ETLA icon is triggered (executed) if the icon's associated source is on-air. Another rule is that all ETLA icons are triggered (executed) on the left edge of the icon, if the (Preset, ID, Timecode, Clip ID) is different from the last loaded (Preset, ID, Timecode, Clip ID) for that device or device channel and the icon's associated source is not on-air.
Another ETLA rule provided by the present invention is that the first search begins when timer indicator 408 stops at the first GPI mark on the time sheet. For the TME level, a search begins when timer indicator 408 passes the first GPI mark of a new TME. For LBN insertion, when the LBN is dropped on the time sheet, the search begins to the right of timer indicator 408. For a time sheet jump, when timer indicator 408 is moved by jumping to the next TME, page, or story, or by time sheet bar jumping, the search begins to the right of timer indicator 408.
In the example in FIG. 22, when timer indicator 408 stops at the first GPI mark on the time sheet, the ETLA search begins. The first occurrence of each ETLA icon type, to the right of timer indicator 408, is searched for on the time sheet. The boundaries of the search for each ETLA icon are set by the group level of each icon. Since timer indicator 408 rests in the first TME and the camera preset ETLA icon is part of that TME, that icon is not searched for on the rest of the time sheet at this point. The first ETLA icon found to the right of timer indicator 408, not in the current TME, is the "VT1 Load Clip" icon. Since this is the next occurrence of this icon and "VT1" is not on-air, the clip is loaded. The search ends for this icon type. The next ETLA icon found to the right of timer indicator 408 is the camera 3 preset. The camera 3 preset is sent and the search ends for this icon type. The next ETLA icon found to the right of timer indicator 408 is the "VT2 Load Clip" icon. Since this is the next occurrence of this icon and "VT2" is not on-air, the clip is loaded. The search ends for this icon type.
J. Graphical Time Sheet View
The graphical time sheet view 2300 in FIG. 23 is a different way of representing the events on the time sheet. Instead of rows and icons, the time sheet would consist of a graphical representation of the events' output. For example, if the event was an OTS (over the shoulder) TME, then the graphical representation would contain an image to represent the camera shot position and an image to represent the OTS graphic. Control icons control the different icons that make up an event. For example, a V button may be used to control the video switching, an A button may be used to control the audio switching, a P button may be used to control the camera preset, and an M button may be used to control the device of the OTS.
K. Example Environment of the Present Invention
Referring to FIG. 3, an example computer system 300 useful in implementing the present invention is shown. The computer system 300 includes one or more processors, such as processor 304. The processor 304 is connected to a communication infrastructure 306 (e.g., a communications bus, crossover bar, or network). Various software embodiments are described in terms of this exemplary computer system. After reading this description, it will become apparent to a person skilled in the relevant art(s) how to implement the invention using other computer systems and/or computer architectures. Computer system 300 can include a display interface 302 that forwards graphics, text, and other data from the communication infrastructure 306 (or from a frame buffer not shown) for display on the display unit 330.
Computer system 300 also includes a main memory 308, preferably random access memory (RAM), and can also include a secondary memory 310. The secondary memory 310 can include, for example, a hard disk drive 312 and/or a removable storage drive 314, representing a floppy disk drive, a magnetic tape drive, an optical disk drive, etc. The removable storage drive 314 reads from and/or writes to a removable storage unit 318 in a well-known manner. Removable storage unit 318 represents a floppy disk, magnetic tape, optical disk, etc., which is read by and written to by removable storage drive 314. As will be appreciated, the removable storage unit 318 includes a computer usable storage medium having stored therein computer software and/or data.
In alternative embodiments, secondary memory 310 can include other similar means for allowing computer programs or other instructions to be loaded into computer system 300. Such means can include, for example, a removable storage unit 322 and an interface 320. Examples of such can include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM, or PROM) and associated socket, and other removable storage units 322 and interfaces 320 which allow software and data to be transferred from the removable storage unit 322 to computer system 300.
Computer system 300 can also include a communications interface 324. Communications interface 324 allows software and data to be transferred between computer system 300 and external devices. Examples of communications interface 324 can include a modem, a network interface (such as an Ethernet card), a communications port, a PCMCIA slot and card, etc. Software and data transferred via communications interface 324 are in the form of signals 328 which can be electronic, electromagnetic, optical or other signals capable of being received by communications interface 324. These signals 328 are provided to communications interface 324 via a communications path (i.e., channel) 326. This channel 326 carries signals 328 and can be implemented using wire or cable, fiber optics, a phone line, a cellular phone link, an RF link and other communications channels.
In this document, the terms "computer program medium" and
"computer usable medium" are used to generally refer to media such as removable storage drive 314, a hard disk installed in hard disk drive 312, and signals 328. These computer program products are means for providing software to computer system 300. The invention is directed to such computer program products.
Computer programs (also called computer control logic) are stored in main memory 308 and/or secondary memory 310. Computer programs can also be received via communications interface 324. Such computer programs, when executed, enable the computer system 300 to perform the features of the present invention as discussed herein. In particular, the computer programs, when executed, enable the processor 304 to perform the features of the present invention. Accordingly, such computer programs represent controllers of the computer system 300.
In an embodiment where the invention is implemented using software, the software can be stored in a computer program product and loaded into computer system 300 using removable storage drive 314, hard drive 312 or communications interface 324. The control logic (software), when executed by the processor 304, causes the processor 304 to perform the functions of the invention as described herein.
In another embodiment, the invention is implemented primarily in hardware using, for example, hardware components such as application specific integrated circuits (ASICs). Implementation of the hardware state machine so as to perform the functions described herein will be apparent to persons skilled in the relevant art(s).
In yet another embodiment, the invention is implemented using a combination of both hardware and software. Another aspect of the invention involving an autokeying method, system and computer program product is described next with reference to FIGs. 24-36.
L. Autokeying
The present invention comprises various techniques and/or methodologies for monitoring, altering, and updating the operating state of a plurality of keying systems. The operating state determines, inter alia, if a keying system is currently keying content for air, keying content for preview, or not keying content. If a keying system is not keying content, the keying system is deemed to be "unoccupied" and available for use.
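The three operating states described above can be sketched as a small state enumeration; the names and the availability test below are illustrative assumptions, not the disclosed implementation.

```python
# Sketch of keyer operating states: a keyer is keying content for air
# (program), keying content for preview, or not keying content at all
# (unoccupied); only an unoccupied keyer is available for use.

from enum import Enum

class KeyerState(Enum):
    PROGRAM = "keying content for air"
    PREVIEW = "keying content for preview"
    UNOCCUPIED = "not keying content"

def is_available(state):
    """A keyer is available only when it is not keying content."""
    return state is KeyerState.UNOCCUPIED
```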
The present invention further describes techniques and/or methodologies for automatically selecting a keyer to key layers on a media production. As used herein, the term "media production" includes the production of any and all forms of media or multimedia in accordance with the method, system, and computer program product of the present invention. A media production includes, but is not limited to, video of news programs, television programming (such as, documentaries, situation comedies, dramas, variety shows, interviews, or the like), sporting events, concerts, infomercials, movies, video rentals, or any other content. For example, a media production can include streaming video related to corporate communications and training, educational distance learning, or home shopping video-based "e" or "t" - commerce. Media productions also include live or recorded audio (including radio broadcast), graphics, animation, computer-generated content, text, and other forms of media and multimedia.
Accordingly, a media production can be live, as-live, or live-to-tape. In a "live broadcast" embodiment of the present invention, a media production is recorded and immediately broadcast over traditional airwaves or other mediums (e.g., cable, satellite, etc.) to a television or the like. At the same time (or substantially the same time), the media production can be encoded for distribution over a computer network. In an embodiment, the computer network includes the Internet, and the media production is formatted in hypertext markup language (HTML), or the like, for distribution over the World Wide Web. However, the present invention is not limited to the Internet. A system and method for synchronizing and transmitting traditional and network distributions are described in the pending U.S. application entitled "Method, System, and Computer Program Product for Producing and Distributing Enhanced Media" (U.S. Appl. Ser. No. 10/208,810), which is incorporated herein by reference in its entirety. The term "as-live" refers to a live media production that has been recorded for a delayed broadcast over traditional or network mediums. The delay period is typically a matter of seconds and is based on a number of factors. For example, a live broadcast may be delayed to grant an editor sufficient time to approve the content or edit the content to remove objectionable subject matter.
The term "live-to-tape" refers to a live media production that has been stored to any type of record playback device (RPD), including a video tape recorder/player (VTR), video recorder/server, virtual recorder (VR), digital audio tape (DAT) recorder, or any mechanism that stores, records, generates, or plays back via magnetic, optical, electronic, or any other storage media. It should be understood that "live-to-tape" represents only one embodiment of the present invention. The present invention is equally applicable to any other type of production that uses or does not use live talent (such as cartoons, computer-generated characters, animation, etc.). Accordingly, reference herein to "live," "as-live," or "live-to-tape" is made for illustration purposes, and is not limiting. Additionally, traditional or network distributions can be live or repurposed from previously stored media productions.
As discussed above, the present invention provides methods for keying a media production for distribution over traditional or network mediums. The keyed media production can also be saved to a storage medium for subsequent retrieval. In an embodiment, parallel automated keyers are used to allow multiple media productions to be keyed and outputted to a program or preview channel. The operating state is monitored, updated, and altered to simplify the keyer operations and thereby reduce the burden on the video director, so that the director can focus attention on the "on-air" product. In other words, the present invention enables a director, or other personnel, to select a key to preview before the key appears in a program channel when a transition occurs. The director no longer has to determine which keyer is being used on-air to determine which keyer is available for the program channel.
Referring to FIG. 24, flowchart 2400 represents the general operational flow of an embodiment of the present invention. More specifically, flowchart 2400 shows an example of a control flow for automating parallel keyers according to the present invention. In other words, the control flow provides an example of compositing one or more keys or keyer layers on two or more media productions at the same time.
The control flow of flowchart 2400 begins at step 2401 and passes immediately to step 2403. At step 2403, a director, another crew member, or the like establishes the configuration parameters for the automated keying system. As described in greater detail below, an automated keying system comprises one or more keyers, which can be internal or external to a media production switcher. The keyers can be luma keyers, chroma keyers, linear keyers, or any combination thereof.
In an embodiment, the configuration parameters are written to a configuration file, as described in greater detail below. The configuration parameters specify attributes or properties for the desired key effects for a media production. Hence, one configuration parameter is the background media source. By specifying the background media source, the director identifies the media production that will be keyed according to the present invention. For example, if the media production is a live or as-live signal, the director would select an input port (e.g., "input 1") to the keying system and a video source (e.g., "camera 1"). Similarly, if the media production is repurposed or an on-demand selection of a recording, the director would select an input port (e.g., "input 2"), a source (e.g., "virtual recorder 1"), and a filename.
A second configuration parameter is a fill source. The fill source specifies a device and/or file for media or multimedia content that will be keyed on the background media production. The key fill comprises data, graphics, text, title, captioning, matte color, photographs, still stores, video, animation, or any other type of media or multimedia. By specifying the key fill, the director indicates the source and content (e.g., template, filename, etc.) of the fill. Therefore, the source can be a graphic device, character generator, video server, file server, or the like.
Another configuration parameter is a key source. The key source is associated with the fill source and specifies the shape and position of the key fill on the background media. For example, a key can be located in the lower-third region of the background, in the upper-third as commonly used for over-the-shoulder keying, or the like. The key source, in essence, is used to cut a hole in the background media. The key source can come from the same device that provides the fill source or from another device, which feeds the key source to the keyer switcher to cut the key hole.
At step 2409, the configuration parameters are accessed, and at step 2412, an unoccupied keyer is selected to receive the configuration parameters. The current state of all keyers is monitored to determine if the keyers are currently in use. If a keyer is not being used, this unoccupied keyer is identified and selected.
At step 2415, the configuration parameters are executed to route the specified background media, fill, and key sources to the selected keyer. The configuration parameters also instruct the selected keyer to automatically produce a composite shot or image displaying the desired key effects, namely the predefined background media, key and fill.
FIG. 35 illustrates an example of a composite shot 3500 according to an embodiment of the present invention. As shown, a background video 3502 can include a video of a news anchor coming from a camera in a television studio. The key signal can come from a graphic device and is fed into a production switcher to cut a hole in background video 3502. A fill video 3504 is used to fill the hole to complete composite shot 3500. Fill 3504 can be a second video that is fed into the production switcher from the graphic device or another source. In this example, fill video 3504 is displayed in an over-the-shoulder (OTS) box. The OTS box includes the fill video 3504 along with an additional graphic, picture, or text (collectively shown as 3506) to support the topic being discussed by the news anchor. The key signal can also instruct the production switcher to receive graphic titles 3508 from a character generator, and key the titles 3508 at the lower third portion of the on-camera shot of the news anchor. The keyed titles 3508 can be translucent (such as, the CBS™ icon) or opaque (i.e., "Technology News") with respect to background video 3502.
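The cut-and-fill operation described above can be sketched per pixel: a linear key mixes the fill into the background in proportion to the key value, so a full key shows the fill, a zero key shows the background, and intermediate values give translucent keys such as a see-through logo. The function and value ranges below are illustrative assumptions.

```python
# Sketch of linear keying for one pixel:
#   out = background * (1 - key) + fill * key
# key in [0.0, 1.0]: 0 = background only, 1 = fill only,
# intermediate values yield a translucent key (e.g. a station logo).

def composite_pixel(background, fill, key):
    """Linearly mix a fill pixel over a background pixel by key value."""
    return background * (1.0 - key) + fill * key
```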
The example depicted in FIG. 35 has been described with reference to a linear keyer in a live production environment. However, it should be understood that the present invention also can be used with linear keyers, luminance keyers, chroma keyers, or a combination thereof. Referring back to FIG. 24 at step 2418, the composite shot is fed over a preview channel to a display or storage medium. At step 2421, the director reviews the composite shot on a preview display for accuracy. If the composite shot requires any modifications, the director can reset the configuration parameters at step 2403. On the other hand, if the composite shot is approved, the control flow passes to step 2424. At step 2424, the director steps or transitions the composite shot from preview to program output. The program output takes the composite media production to air and/or to a storage medium. Afterwards, the control flow ends as indicated at step 2495. As discussed above, the present invention allows the operating states of a plurality of keyers to be monitored and selected for compositing keyer layers. Referring to FIG. 25, flowchart 2409 represents the general operational flow of an embodiment of keyer selection step 2409. More specifically, flowchart 2409 shows an example of a control flow for identifying and selecting an unoccupied keyer according to the present invention.
The control flow of flowchart 2409 begins at step 2501 and passes immediately to step 2503. At step 2503, all keyers are monitored to determine or update their operating states. In an embodiment, each keyer is monitored to determine if the keyer is compositing a media production over a program channel, compositing a media production over a preview channel, or not being used at all. In another embodiment, each keyer is monitored to determine whether or not the keyer is being used without regard to whether the keyer is in a preview or program state. If the keyer is not being used, it is deemed as being available and is denoted as being in an "unoccupied" state.
At step 2506, the keyer states are evaluated. If no keyers are in an unoccupied state, the control flow passes back to step 2503 and the keyer states are monitored until a keyer becomes available. In an embodiment, a message is sent to the director and the director is granted an option to manually change the state of an occupied keyer (i.e., program or preview state). If one or more unoccupied keyers are found, the control flow passes to step 2509. All unoccupied keyers are queued and selected on a first-in-first-out (FIFO) basis. Hence as future keyers become available, the keyers are placed at the bottom of the queue. As needed, keyers are selected from the top of the queue to perform the keying operations. Although the present invention implements a FIFO routine to select keyers from a queue, other selection techniques and methodologies can be used as long as an available keyer can be automatically chosen with little or no user interaction. After an unoccupied keyer is selected, the control flow ends as indicated at step 2595.
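The first-in-first-out queuing described above can be sketched with a simple queue: keyers that become unoccupied join the back, and selections come from the front, so the longest-waiting keyer is chosen automatically. The class and method names are illustrative assumptions.

```python
# Sketch of FIFO keyer selection: newly freed keyers are appended to
# the back of the queue; keying work takes keyers from the front.

from collections import deque

class KeyerQueue:
    def __init__(self):
        self.queue = deque()

    def mark_unoccupied(self, keyer):
        """A keyer that has finished keying joins the back of the queue."""
        self.queue.append(keyer)

    def select(self):
        """Take the longest-waiting unoccupied keyer, or None if all busy."""
        return self.queue.popleft() if self.queue else None
```

As the text notes, FIFO is only one possible policy; any scheme that automatically yields an available keyer with little or no user interaction would serve.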
In FIG. 25, only unoccupied keyers are selected. However as discussed, the director can be granted an option to select an occupied keyer if no unoccupied keyers are available. In an embodiment, a preview keyer can be identified and selected automatically. The capability of selecting a preview keyer is described with reference to FIG. 26, where flowchart 2409 represents the general operational flow of another embodiment of keyer selection step 2409. More specifically, flowchart 2409 shows another example of a control flow for identifying and selecting an unoccupied keyer according to the present invention.
The control flow of flowchart 2409 begins at step 2601 and passes immediately to step 2503. At step 2503, the operating states are monitored or updated to determine whether the keyers are in a program, preview, or unoccupied state as previously discussed. At step 2506, the keyer states are evaluated. If an unoccupied keyer is found, the control flow passes to step 2609. If no unoccupied keyer is found, the control flow passes to step 2606. At step 2606, the keyer states are evaluated to determine if any keyer is currently compositing video for a preview channel. If no keyer is in a preview state, the control flow passes back to step 2503. If at least one preview keyer is found, the control flow passes to step 2609.
At step 2609, a keyer identified from step 2506 or step 2606 is selected. The keyers are queued according to their operating state (e.g., unoccupied, preview, etc.). The queues are emptied on a FIFO basis, or the like. Once a selection has been made, the control flow ends as indicated at step 2695.
In FIG. 26, a preview keyer is automatically selected if no unoccupied keyers are found. However, in another embodiment of the present invention, the director can accept or reject the selected preview keyer. This is described with reference to FIG. 27, where flowchart 2409 represents the general operational flow of another embodiment of keyer selection step 2409. More specifically, flowchart 2409 shows another example of a control flow for identifying and selecting an unoccupied keyer according to the present invention. The control flow of flowchart 2409 begins at step 2701 and passes immediately to step 2503. At step 2503, the operating states are monitored or updated as previously discussed. At step 2506, the keyer states are evaluated. If an unoccupied keyer is found, the control flow passes to step 2703. At step 2703, an unoccupied keyer is chosen from an unoccupied queue using FIFO or the like, as discussed above.
If no unoccupied keyer is found at step 2506, the control flow passes to step 2606. At step 2606, the keyer states are evaluated to determine if any preview keyers are found. If no keyer is in a preview state, the control flow passes back to step 2503. If at least one preview keyer is found, the control flow passes to step 2706.
At step 2706, a keyer is chosen from a preview queue using FIFO or the like. At step 2709, the director is sent a message and given the option of approving or rejecting the chosen keyer. If approval is denied, the control flow passes back to step 2503. Otherwise, the control flow passes to step 2712. At step 2712, the chosen keyer from step 2709 or step 2703 is identified and implemented as the selected keyer. Once a selection has been made, the control flow ends as indicated at step 2795.
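The selection logic of flowchart 2409 in FIG. 27 can be modeled roughly as follows. The function and its arguments are hypothetical stand-ins for the monitoring, queuing, and approval steps described above:

```python
def select_keyer(unoccupied, preview, approve):
    """Illustrative model of FIG. 27: prefer an unoccupied keyer; otherwise
    offer the top preview keyer to the director for approval. Both queues
    are emptied FIFO. `approve` models the director's response to the
    message of step 2709."""
    if unoccupied:                   # step 2506: unoccupied keyer found
        return unoccupied.pop(0)     # step 2703: FIFO from unoccupied queue
    if preview:                      # step 2606: preview keyer found
        if approve(preview[0]):      # step 2709: director approves
            return preview.pop(0)    # step 2712: implement chosen keyer
    return None                      # real control flow resumes monitoring (step 2503)

# An unoccupied keyer is preferred over any preview keyer.
assert select_keyer(["K2"], ["K3"], approve=lambda k: True) == "K2"
# With no unoccupied keyers, the director may approve a preview keyer...
assert select_keyer([], ["K3"], approve=lambda k: True) == "K3"
# ...or reject it, in which case monitoring would resume.
assert select_keyer([], ["K3"], approve=lambda k: False) is None
```

In the actual control flow, a rejection loops back to step 2503 rather than terminating; the sketch returns `None` at that point purely for brevity.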
Flowchart 2409 shows that the director approves or rejects the keyer chosen at step 2706 (i.e., a preview keyer). This gives the director control over whether a currently keyed production should be interrupted or cancelled. This can be an effective mechanism for incorporating a late-breaking news segment, or some other type of unforeseeable event, into a live production.
Examples of a system and method for inserting late-breaking events into an automated production are described in the pending U.S. application entitled "System and Method for Real Time Video Production and Multicasting" (U.S. Appl. Ser. No. 09/634,735), which is incorporated herein by reference as though set forth in its entirety.
However in another embodiment, the director may approve or reject the keyer chosen at either step 2706 (i.e., a preview keyer) or step 2703 (i.e., an unoccupied keyer). Therefore, the rationale for approving a chosen keyer is not limited to whether to interrupt a currently keyed production. As discussed, in an embodiment, the keyers of the present invention are queued and emptied on a FIFO basis. This routine is explained with reference to FIGs. 28-30, which illustrate the queuing and selecting process for different quantities of keyers. Specifically, FIGs. 28a-28c provide an example using two keyers. FIGs. 29a-29d provide an example using three keyers. Finally, FIGs. 30a-30e provide an example using a plurality of keyers.
Hence, the present invention is not restricted to any particular quantity of keyers.
FIGs. 28a-28c illustrate various keyer states for two keyers (i.e., K1 and K2) that are queued and selected for a program and preview channel, according to the present invention. FIG. 28a shows keyer K1 and keyer K2 in an initial state. At this point in time, both keyers are unoccupied. As such, keyer K1 and keyer K2 are placed in an availability queue as shown in FIG. 28b. In FIG. 28c, keyer K1 is selected from the availability queue, and is being used to key program output. Keyer K2 is selected to key a preview channel.
FIGs. 29a-29d illustrate another example of keyers being set up in a queue and selected for compositing a media production. FIG. 29a shows three keyers (i.e., K1, K2, and K3) in an initial unoccupied state. FIG. 29b shows keyer K1, keyer K2, and keyer K3 after they have been placed in an availability queue. In FIG. 29c, the availability queue has been emptied, using FIFO, to place keyer K1 in a program state and keyer K2 in a preview state. In FIG. 29d, the director has selected keyer K3 to preview a second keyed shot. The keyer state of keyer K3 is monitored by placing it at the bottom of the preview queue. As such, if the preview queue is searched to select a keyer, as discussed with reference to FIGs. 26 and 27, keyer K2 would be the first keyer selected from the top of the preview queue.
FIGs. 30a-30e illustrate another embodiment of the present invention that supports a plurality of keyers K1-Kn. FIG. 30a shows keyers K1-Kn in their initial unoccupied states. FIG. 30b shows the unoccupied keyers K1-Kn in an availability queue. FIG. 30c shows that first keyer K1 and then keyer K4 have been selected and transitioned to a program state. FIG. 30d shows that first keyer K3, then keyer K5, and finally keyer K6 are operating in a preview state. In FIG. 30e, the current status of the availability queue includes keyers K7-Kn and K2. Keyer K2 is at the bottom of the availability queue because it was previously selected for preview or program, but currently, is no longer in use. Hence, keyer K2 goes to the bottom because the availability queue is emptied according to a FIFO routine as previously discussed. As described above, the present invention allows one or more keyers to be programmed ahead of time without having to consider the source of a current media production or whether a keyer is being used. The present invention also allows automatic preparation of the next composited media production that will be taken to air. When it is time for the next shot and if the next shot requires a keyed image, the control flow of FIG. 24 is repeated for the next shot. If a keyed image is not required, the automated keyers are deactivated, and the background media production continues to be processed through the keyer system but without any video manipulation.
FIG. 31 illustrates a keyer system 3100 according to an embodiment of the present invention. Keyer system 3100 includes an input router 3102, a plurality of keyers K1-K4, and a video switcher 3104. Input router 3102 receives the background, fill, and key sources as discussed above. Input router 3102 routes the sources to an appropriate keyer K1-K4 that has been selected, as discussed above. Keyers K1-K4 can be internal or external to video switcher 3104. Keyers K1-K4 can be luminance keyers, chroma keyers, linear keyers, or a combination thereof.
As can be seen, keyer system 3100 enables a media production to be configured with two keyer layers. Keyers K1 and K2 provide a first keyer layer. A second keyer layer is provided by keyers K3 and K4.
Keyers K1 and K3 provide a composite shot to a program input port 3106 of video switcher 3104, and keyers K2 and K4 provide a composite shot to a preview input port 3108 of video switcher 3104. According to an embodiment of the present invention, composited shots are always "automatically" sent to a preview channel. Once the background, key, and fill sources are selected and composited on preview, the director "steps" or "transitions" the shot from a preview output 3112 to a program output 3110.
Therefore, keyer system 3100 is set up on "preview" to see a "composite" view prior to taking the shot to air. The preview process provides assurance that the composite shot meets the director's approval prior to "transitioning" to air on the "program" channel. Although keyer system 3100 shows two keyer layers being composited on a background, system 3100 also allows a single key on both program 3106 and preview 3108 inputs to switcher 3104. Since the fixed architecture of FIG. 31 shows two keyers serially positioned, a keyed production having a single key would pass through the second keyer (i.e., K3 or K4) without being keyed. Additionally as previously discussed, if a keyed image is not required, the background media production is routed, without any fill or key sources, through both keyers (i.e., K1 and K3, or K2 and K4) to switcher 3104. Control signals 3120 are transmitted from a media production processing device, which is described below with reference to FIG. 33. Control signals 3120 provide instructions to various components of system 3100, such as instructions to switch between program output 3110 and preview output 3112. Control signals 3120 also enable a keyer K1-K4 to be manipulated and switched remotely via communications from a user interface. FIG. 32 illustrates another embodiment of keyer system 3100. As shown, keyers K1-K4 reside within a router 3206 that allows keyers K1-K4 to "float." This means that keyers K1-K4 can be grouped in serial and parallel variable arrangements and assigned to the appropriate signal flow path to program 3110 and preview 3112 channels automatically. In an embodiment, keyer system 3100 is programmed through software to key two layers back-to-back between preview and program over and over again. In another embodiment, the director programs keyer system 3100 to go from an "on-camera" shot with no keyer layers to one with four keyer layers.
As discussed above with reference to FIG. 24, the operating states of floating keyers K1-K4 can be monitored and, thus, programmed, without the director having to track which keyer is being used for what effect to make sure they do not impact the composite picture on the program channel. To accomplish this, the logic in the software always knows which keyers K1-K4 are on program and which ones are available for the user. In addition, keyers K1-K4 always route the signals to "preview" during the automation process.
In an embodiment, input router 3102 allows auxiliary pairs of background, fill, and key signals to go into the floating keyer router 3206. The auxiliary inputs allow the director to composite images with keys for insertion in dual, tri or quad box applications. FIG. 36 shows an example of a quad box display 3600 for a quad box application according to an embodiment of the present invention. Quad box display 3600 provides sufficient auxiliary sets 3602-3608 of background, fill, and key signals (i.e., four sets) for up to four-channel digital video effect (DVE) boards. The present invention can support more or fewer channels of DVE; the quantity of channels is not to be considered a limitation. In addition, the architecture can support more or fewer floating keyers K1-K4. As shown in FIGs. 30a-30e, the architecture of the present invention can be modified to support a plurality of keyers K1-Kn.
The architecture illustrated in FIG. 32 provides the flexibility for multiple variations of production compositing: from simple keyer layers on an "on-camera" shot, to multiple keyer layers back-to-back, all the way to compositing keyer layers within the inserts of a DVE dual, tri, quad, or larger box application.
The present invention can be implemented in a manual media production environment as well as an automated media production environment. The pending U.S. application entitled "Method, System and Computer Program Product for Producing and Distributing Enhanced Media Downstreams" (U.S. Appl. Ser. No. 09/836,239) describes representative embodiments of manual and automated multimedia production systems that are implementable with the present invention, and is incorporated herein by reference in its entirety. As described in the aforesaid U.S. application, an automated multimedia production environment includes a centralized media production processing device that automatically or semi-automatically commands and controls the operation of a variety of media production devices in analog and/or digital video environments. The term "media production device" includes video switcher (such as switcher 3104), digital video effects device (DVE), audio mixer, teleprompting system, video cameras and robotics (for pan, tilt, zoom, focus, and iris control), record/playback device (RPD), character generator, still store, studio lighting devices, news automation devices, master control/media management automation systems, commercial insertion devices, compression/decompression devices (codec), virtual sets, or the like. The term "RPD" includes VTRs, video recorders/servers, virtual recorder (VR), digital audio tape (DAT) recorder, or any mechanism that stores, records, generates or plays back via magnetic, optical, electronic, or any other storage media. In an embodiment, the media production processing device receives and routes live feeds (such as field news reports, news services, sporting events, or the like) from any type of communications source, including satellite, terrestrial (e.g., fiber optic, copper, UTP, STP, coaxial, hybrid fiber-coaxial (HFC), or the like), radio, microwave, free-space optics, or any other form or method of transmission, in lieu of, or in addition to, producing a live show within a studio.
In addition to controlling media production devices, an automated media production processing device is configurable to convert an electronic show rundown into computer readable broadcast instructions, which are executed to send control commands to the media production devices. An exemplary embodiment of an electronic rundown is described in greater detail below with reference to FIG. 33. An electronic show rundown is often prepared by the show director, a web master, web cast director, or the like.
The director prepares the rundown to specify element-by-element instructions for producing a live or non-live show.
An electronic rundown can be a text-based or an object-oriented listing of production commands. When activated, the electronic rundown is converted into computer readable broadcast instructions to automate the execution of a show without the need for an expensive production crew to control the media production devices. In an embodiment, the broadcast instructions are created from the Transition Macro™ multimedia production control program developed by ParkerVision, Inc. (Jacksonville, FL) that can be executed to control an automated multimedia production system. The Transition Macro™ program is described in the pending U.S. application entitled "System and Method for Real Time Video Production and Multicasting" (U.S. Appl. Ser. No. 09/634,735), which is incorporated herein by reference as though set forth in its entirety. As described in the aforesaid U.S. application, the Transition Macro™ program is an event-driven application that allows serial and parallel processing of media production commands to automate the control of a multimedia production environment. Each media production command is associated with a timer value and at least one media production device.
FIG. 33 illustrates an embodiment of an object-oriented, electronic show rundown created by an event-driven application on a graphical user interface (GUI) 3300. The electronic rundown includes a horizontal timeline 3302 and one or more horizontal control lines 3304a-3304p. Automation control icons 3306a-3306t are positioned onto control lines 3304a-3304p at various locations relative to timeline 3302, and configured to be associated with one or more media production commands and at least one media production device.
A timer (not shown) is integrated into timeline 3302, and operable to activate a specific automation control icon 3306a-3306t as a timer indicator 3308 travels across timeline 3302 to reach a location linked to the specific automation control icon 3306a-3306t. As a result, the media production processing device would execute the media production commands to operate the associated media production device.
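The relationship between timer indicator 3308 and the positioned icons can be illustrated with a minimal model; the icon entries and command strings below are invented for illustration and are not part of the disclosed system:

```python
# Each automation control icon carries a timer value (its position on
# timeline 3302) and the production command it triggers when the timer
# indicator reaches that position.
icons = [
    (0.0, "camera", "cut to camera 1"),
    (5.0, "teleprompter", "start script"),
    (10.0, "transition", "dissolve to VTR"),
]

def advance(icons, previous, current):
    """Return the commands for icons that the timer indicator passed
    between `previous` and `current` seconds, in timeline order."""
    due = [(t, dev, cmd) for (t, dev, cmd) in icons if previous < t <= current]
    return [cmd for (_, _, cmd) in sorted(due)]

assert advance(icons, -1.0, 0.0) == ["cut to camera 1"]
assert advance(icons, 0.0, 7.5) == ["start script"]
assert advance(icons, 7.5, 12.0) == ["dissolve to VTR"]
```

Each returned command would then be dispatched by the media production processing device to the associated media production device.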
In regards to automation control icons 3306a-3306t, label icon 3306a permits a director to name one or more elements, segments, or portions of the electronic rundown. In an embodiment, the director would drag and drop a label icon 3306a onto control line 3304a, and double click on the positioned label icon 3306a to open up a dialogue box to enter a text description. The text would be displayed on the positioned label icon 3306a. Referring to FIG. 33, exemplary label icons 3306a have been generated to designate "CUE," "OPEN VT 3," "C2 TI T2," etc. Control line 3304a is also operable to receive a step mark icon 3306b, a general purpose input/output (GPI/O) mark icon 3306c, a user mark icon 3306d, and an encode mark 3306e. Step mark icon 3306b and GPI/O mark icon 3306c are associated with rundown step commands. The rundown step commands instruct timer indicator 3308 to start or stop running until deactivated or reactivated by the director or another media production device.
For example, step mark icon 3306b and GPI/O mark icon 3306c can be placed onto control line 3304a to specify a time when timer indicator 3308 would automatically stop running. In other words, timer indicator 3308 would stop moving across timeline 3302 without the director having to manually stop the process, or without another device (e.g., a teleprompting system (not shown)) having to transmit a timer stop command. If a step mark icon 3306b is activated to stop timer indicator 3308, timer indicator 3308 can be restarted either manually by the director or automatically by another external device transmitting a step command. If a GPI/O mark icon 3306c is used to stop timer indicator 3308, timer indicator 3308 can be restarted by a GPI or GPO device transmitting a GPI/O signal. In an embodiment, step mark icon 3306b and GPI/O mark icon 3306c are used to place a logical break between two elements on the electronic rundown. In other words, step mark icon 3306b and GPI/O mark icon 3306c are placed onto control line 3304a to designate segments within a media production. One or more configuration files can also be associated with a step mark icon 3306b and GPI/O mark icon 3306c to link metadata with the designated segment.
Encode mark 3306e can also be placed on control line 3304a. In an embodiment, encode mark 3306e is generated by the Web Mark™ software application developed by ParkerVision, Inc. Encode mark 3306e identifies a distinct segment within the media production produced by the electronic rundown of GUI 3300. As timer indicator 3308 advances beyond encode mark 3306e, an encoding system is instructed to index the beginning of a new segment. The encoding system automatically clips the media production into separate files based on the placement of encode mark 3306e. This facilitates the indexing, cataloging and future recall of segments identified by the encode mark 3306e. Encode mark 3306e allows the director to designate a name for the segment, and specify a segment type classification. Segment type classification includes a major and minor classification. For example, a major classification or topic can be sports, weather, headline news, traffic, health watch, elections, and the like. An exemplary minor classification or category can be local sports, college basketball, NFL football, high school baseball, local weather, national weather, local politics, local community issues, local crime, editorials, national news, and the like. Classifications can expand beyond two levels to an unlimited number of levels for additional granularity and resolution for segment type identification and advertisement targeting. In short, the properties associated with each encode mark 3306e provide a set of metadata that can be linked to a specific segment. These properties can be subsequently searched to identify or retrieve the segment from an archive.
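The hierarchical segment-type metadata carried by an encode mark might be modeled as follows; the field names and classification values are hypothetical examples patterned on the ones listed above:

```python
# Hypothetical record of the metadata an encode mark could carry: a segment
# name plus a hierarchy of type classifications (major, minor, and beyond).
encode_mark = {
    "name": "Friday Night Scores",
    "classification": ["sports", "local sports", "high school baseball"],
}

def matches(mark, *levels):
    """True if the mark's classification hierarchy begins with the given
    levels, supporting metadata searches at any level of granularity."""
    return list(levels) == mark["classification"][: len(levels)]

assert matches(encode_mark, "sports")                  # major topic
assert matches(encode_mark, "sports", "local sports")  # minor category
assert not matches(encode_mark, "weather")
```

A search over archived segments could then retrieve all "sports" segments at the major level, or narrow to "local sports" and deeper levels as described.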
Transition icons 3306f-3306g are associated with automation control commands for controlling video switching equipment. Thus, transition icons 3306f-3306g can be positioned onto control lines 3304b-3304c to control one or more devices to implement a variety of transition effects or special effects into a media production. Such transition effects include, but are not limited to, fades, wipes, DVE, downstream keyer (DSK) effects, and the like. DVE includes, but is not limited to, warps, dual-box effects, page turns, slab effects, and sequences. DSK effects include DVE and DSK linear, chroma and luma keyers (such as K1-K4 and K1-Kn, discussed above).
Keyer control icon 3306h is positioned on control line 3304d, and used to prepare and execute keyer layers either in linear, luma, chroma or a mix thereof for preview or program output. The keyers can be upstream or downstream of the DVE.
Audio icon 3306i can be positioned onto control line 3304e and is associated with commands for controlling audio equipment, such as audio mixers, digital audio tape (DAT), cassette equipment, other audio sources (e.g., CDs and DATs), and the like. Teleprompter icon 3306j can be positioned onto control line 3304f and is associated with commands for controlling a teleprompting system to integrate a script into the timeline. Character generator (CG) icon 3306k can be positioned onto control line 3304g and is associated with commands for controlling a CG or still store to integrate a CG page into the timeline. Camera icons 3306l-3306n can be positioned onto control lines 3304h-3304j and are associated with commands for controlling the movement and settings of one or more cameras. VTR icons 3306p-3306r can be positioned onto control lines 3304k-3304m and are associated with commands for controlling VTR settings and movement. GPO icon 3306s can be positioned onto control line 3304n and is associated with commands for controlling GPI or GPO devices.
Encode object icons 3306t are placed on control line 3304p to produce encode objects, and are associated with encoding commands. In an embodiment, encode object icons 3306t are produced by the Web Objects™ software application developed by ParkerVision, Inc. When activated, an encode object icon 3306t initializes the encoding system and starts the encoding process. A second encode object icon 3306t can also be positioned to terminate the encoding process. Encode object icon 3306t also enables the director to link context-sensitive or other media (including an advertisement, other video, web site, etc.) with the media production. In comparison with encode mark 3306e, encode object icon 3306t instructs the encoding system to start and stop the encoding process to identify a distinct show, whereas encode mark 3306e instructs the encoding system to designate a portion of the media stream as a distinct segment. The metadata contained in encode object icon 3306t is used to provide a catalog of available shows, and the metadata in encode mark 3306e is used to provide a catalog of available show segments.
User mark icon 3306d is provided to precisely associate or align one or more automation control icons 3306a-3306c and 3306e-3306t with a particular time value. For example, if a director desires to place teleprompter icon 3306j onto control line 3304f such that the timer value associated with teleprompter icon 3306j is exactly ten seconds, the director would first drag and drop user mark icon 3306d onto control line 3304a at the ten second mark. The director would then drag and drop teleprompter icon 3306j onto the positioned user mark icon 3306d. Teleprompter icon 3306j is then automatically placed on control line 3304f such that the timer value associated with teleprompter icon 3306j is ten seconds. In short, any icon that is dragged and dropped onto the user mark 3306d is automatically placed on the appropriate control line and has a timer value of ten seconds. This feature helps to provide multiple icons with the exact same timer value.
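The snapping behavior of user mark 3306d can be sketched as a small model; the data structures and the mapping of icon types to control lines are assumptions for illustration only:

```python
def drop_on_user_mark(icon, user_mark_time, control_lines):
    """Illustrative model: snap a dropped icon onto its own control line
    while inheriting the user mark's timer value, so that several icons
    can share one exact time."""
    line = control_lines[icon["type"]]  # e.g., a teleprompter icon -> 3304f
    return dict(icon, time=user_mark_time, line=line)

# Assumed mapping of icon types to control lines, per FIG. 33.
lines = {"teleprompter": "3304f", "audio": "3304e"}
tp = drop_on_user_mark({"type": "teleprompter"}, 10.0, lines)
au = drop_on_user_mark({"type": "audio"}, 10.0, lines)
assert tp["line"] == "3304f" and au["line"] == "3304e"
assert tp["time"] == au["time"] == 10.0  # identical timer values
```

Both icons land on their appropriate control lines with exactly the same ten-second timer value, mirroring the alignment feature described above.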
After the appropriate automation control icons 3306 have been properly positioned onto the electronic rundown, the electronic rundown can be stored in a file for later retrieval and modification. Accordingly, a show template or generic electronic rundown can be re-used to produce a variety of different shows. A director could recall the show template by filename, make any required modifications (according to a new electronic rundown), and save the electronic rundown with a new filename.
As described above, one media production device is a teleprompting system (not shown) that includes a processing unit and one or more displays for presenting a teleprompting script (herein referred to as "script") to the talent. In an embodiment, the teleprompting system is the SCRIPT Viewer™, available from ParkerVision, Inc. As described in the pending U.S. application entitled "Method, System and Computer Program Product for Producing and Distributing Enhanced Media Downstreams" (U.S. Appl. Ser. No. 09/836,239), a teleprompting system can be used to create, edit, and run scripts of any length, at multiple speeds, in a variety of colors and fonts. In an embodiment of the present invention, the teleprompting system is operable to permit a director to use a text editor to insert media production commands into a script (herein referred to as "script commands"). The text editor can be a personal computer or like workstation, or the text editor can be an integrated component of electronic rundown GUI 3300. Referring to FIG. 33, text window 3310 permits a script to be viewed, including script commands. Script controls 3312 are a set of graphical controls that enable a director to operate the teleprompting system and view changes in speed, font size, script direction and other parameters of the script in text window 3310.
The script commands that can be inserted by the teleprompting system include a cue command, a delay command, a pause command, a rundown step command, and an enhanced media command. As discussed below, enhanced media commands permit auxiliary information to be synchronized and linked for display, or referenced, with a script and video. This allows the display device to display streaming video, HTML or other format graphics, or related topic or extended-play URLs and data. The present invention is not limited to the aforementioned script commands. As would be apparent to one skilled in the relevant art(s), commands other than those just listed can be inserted into a script. Thus, GUI 3300 enables the director to automate the control of the media production devices. GUI 3300 also enables the director to establish the configuration parameters, described above, for implementing the keying effects according to the present invention. As discussed, transition icons 3306f-3306g can be positioned onto control lines 3304b-3304c to set up DSK effects as well as other transition effects. If the director operates an input device to double click on a positioned transition icon 3306f-3306g, a dialogue box is opened to allow the director to specify a background, fill, and key source for the configuration parameters. This can be explained with reference to FIG. 34.
FIG. 34 shows a portion of the electronic rundown of GUI 3300. As shown, the activation of a transition icon 3306f-3306g generates a dialogue box 3402. Dialogue box 3402 allows the director to set the transition properties, including the configuration parameters for key effects. Dialogue box 3402 includes time field 3404, which denotes the start and stop time value for implementing the key effects. The time value in time field 3404 corresponds to the timer values on timeline 3302.
Dialogue box 3402 also includes a program background field 3406, a preview background field 3408, and an auxiliary background field 3410. Program background field 3406, preview background field 3408, and auxiliary background field 3410 identify the background source of the media production to be keyed. Referring back to FIG. 32, program background field 3406, preview background field 3408, and auxiliary background field 3410 identify the input port(s) that input router 3102 uses to receive the program routed video outputs, preview routed video outputs, and auxiliary routed video outputs, respectively, for router 3206.
Referring back to FIG. 34, a program fill field 3414 identifies a program fill source that is keyed on the program background media. A preview fill field 3416 identifies a preview fill source that is keyed on the preview background media. Additionally, an auxiliary fill field 3412 identifies an auxiliary fill source that is keyed on the auxiliary background media. Referring back to FIG. 32, program fill field 3414, preview fill field 3416, and auxiliary fill field 3412 identify the input port(s) that input router 3102 uses to receive the program routed fill outputs, preview routed fill outputs, and auxiliary routed fill outputs, respectively, for router 3206.
Also shown in FIG. 34, a program key field 3420 identifies a program key source that is associated with program fill field 3414. In addition, a preview key field 3422 identifies a preview key source that is associated with preview fill field 3416. Further, an auxiliary key field 3418 identifies an auxiliary key source that is associated with auxiliary fill field 3412. Referring back to FIG. 32, program key field 3420, preview key field 3422, and auxiliary key field 3418 identify the input port(s) that input router 3102 uses to receive the program routed key outputs, preview routed key outputs, and auxiliary routed key outputs, respectively, for router 3206.
Hence when activated, transition icons 3306f-3306g transmit commands that process the predefined fill and key attributes to composite the associated graphic layers within the specified media stream. More specifically, the broadcast instructions corresponding to a positioned transition icon 3306f-3306g are executed as timer indicator 3308 reaches the timer value on timeline 3302 that matches the time value (i.e., time field 3404) specified for the positioned transition icon 3306f-3306g. The broadcast instructions are programmed to select the correct keyer (e.g., K1-K4). In an embodiment, the broadcast instructions call or implement a routine that determines which keyer is currently available, as discussed above with reference to FIGs. 24-27. After a keyer is selected, the properties from transition icons 3306f-3306g are assigned to the keyer. The keyed output is placed on a preview bus. This allows the director to review a keyer layer prior to broadcasting the show. The automation supported by the broadcast instructions also allows the director to concentrate on the quality aspect of the show instead of trying to determine which keyer is currently available and where it is routed.
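The execution path described above (select an available keyer, assign the transition icon's properties, and route the keyed output to preview) might be modeled as follows; the function and field names are hypothetical and the source names are invented for illustration:

```python
def execute_transition_icon(icon, available_keyers, preview_bus):
    """Illustrative model: when the timer reaches the icon's time value,
    pick an available keyer (FIFO), assign the icon's background/fill/key
    properties to it, and route the keyed output to the preview bus for
    director review before transitioning to program."""
    if not available_keyers:
        return None  # real control flow would keep monitoring keyer states
    keyer = available_keyers.pop(0)
    keyer_config = {
        "keyer": keyer,
        "background": icon["background"],
        "fill": icon["fill"],
        "key": icon["key"],
    }
    preview_bus.append(keyer_config)  # composited shots always go to preview
    return keyer_config

preview = []
icon = {"background": "CAM1", "fill": "CG", "key": "CG-alpha"}
cfg = execute_transition_icon(icon, ["K1", "K2"], preview)
assert cfg["keyer"] == "K1" and preview[0] is cfg
```

The sketch reflects the automation benefit noted above: the director never tracks which keyer is free or where it is routed, because selection and preview routing happen as a side effect of the icon firing.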
Although GUI 3300 is described with reference to an automated production environment, it should be understood that a similar user interface could be used for a manual production environment. In a manual environment, control lines 3304b-3304c would execute broadcast instructions to select a keyer and assign the keyer attributes. However, the remaining control lines 3304a and 3304d-3304p would not transmit control commands to automate the control of other media production devices. The media production devices would be manually controlled. FIGs. 24-32 and 33-36 are conceptual illustrations allowing an easy explanation of the present invention. It should be understood that embodiments of the present invention could be implemented in hardware, firmware, software, or a combination thereof. In such an embodiment, the various components and steps would be implemented in hardware, firmware, and/or software to perform the functions of the present invention. That is, the same piece of hardware, firmware, or module of software could perform one or more of the illustrated blocks (i.e., components or steps).
The present invention just described can be implemented in one or more computer systems capable of carrying out the functionality as described above with reference to FIG. 3. Another aspect of the present invention involving systems, methods and computer program products for automated real-time execution of live inserts of repurposed stored content distribution, and multiple aspect ratio automated simulcast production is described next with reference to FIGs. 37-42.
M. Systems, Methods and Computer Program Products for Automated Real-Time Execution of Live Inserts of Repurposed Stored Content Distribution, and Multiple Aspect Ratio Automated Simulcast Production
Multiple Aspect Ratio Automated Simulcast Production
The invention is directed to a method, system, and computer program product for simulcasting digital video through outputs having differing format requirements. As described in greater detail below, in an embodiment, the present invention allows broadcasters to automate and produce a simulcast of dual 4:3 and 16:9 aspect ratio "live" or "as-live" programming. The present invention allows automatic adjustments to be made to character generators, still stores, transition effects, etc., such that the production can be synchronized and transmitted over parallel mediums with non-substantial delay.
Overview
The market for digital television continues to evolve as broadcasters transition from transmitting an analog NTSC signal to digital signals as mandated by the Federal Communications Commission (FCC). The vast majority of the installed television sets in the United States today have a 4:3 aspect ratio. New digital sets will come with the wider 16:9 aspect ratio similar to the format used in motion pictures.
Market analysts expect the installed base of digital television sets to be at 4.2 million by the end of 2002. In addition, digital settop boxes that receive the broadcast digital signal either over-the-air or through cable for conversion onto existing analog television sets are expected to have an installed base of 39.5 million. Television households by the end of 2002 are forecasted to be at 104.5 million. This shows a market penetration of 4% for digital television sets (16:9) and 38% for digital settop boxes (4:3). By the year 2005, the market penetration numbers will change for digital television sets from 4% to 17% while digital settop boxes change from 38% to 77%. This means that consumers will go through a period whereby many will be looking for a 16:9 format signal from broadcasters versus the standard 4:3 format currently used with analog television sets.
The broadcaster will be challenged to deliver several different types of programming solutions as follows:
- 4:3 original ("native") output "letter boxed" to fit a 16:9 digital television set. The letter box could be "black" or contain text, data or graphic content.
- 16:9 original ("native") output sent out "as is" for display on 16:9 digital television sets.
- 16:9 original ("native") output "cropped" to fit into a 4:3 analog television set with digital settop box converter.
In several of the cases above, compromises exist in order to serve the consumer during the market transition from a 4:3 to a 16:9 aspect ratio format. For example, some televisions include circuitry to convert from one format to another. However, such conversions introduce artifacts and negatively impact resolution and/or the viewing experience. It is preferable that televisions receive and use signals in their original, or native, formats. Therefore, a need exists to automate the process of producing simultaneous 4:3 and 16:9 formats from a single user interface to avoid duplicating both equipment and personnel resources. The simultaneous native output of both formats allows the broadcaster to design a "look and feel" that better enables them to take advantage of both 4:3 and 16:9 aspect ratios without negatively impacting one over the other.
The invention is applicable to formats other than 4:3 and 16:9, and is also applicable to the processing of greater than two formats, as will be appreciated by persons skilled in the relevant arts based on the teachings contained herein.
Detailed Description
The invention allows broadcasters to automate and produce the simulcast of dual 4:3 and 16:9 aspect ratio "live" and "as-live" programming. FIG. 37 is a block diagram of a production system 3702 according to an embodiment of the invention. The production system 3702 includes a user interface 3706, to enable interaction with users 3704. Through the user interface 3706, users 3704 can design and produce shows, such as but not limited to live and as-live shows.
The production system 3702 includes a control system 3708, which controls the production system 3702 (and components thereof) in the manner described herein. The control system 3708 is implemented using a computer operating according to software. Such software, when stored on a computer readable medium (such as a CD, tape, hard drive, signals traveling over a wired or wireless medium, etc.), is referred to as computer program product. The invention is directed to such computer program products, as well as methods embodied in such computer program products, and systems incorporating such computer program products. Alternatively, the control system 3708 is implemented using predominately hardware (such as hardware state machines or application specific processors).
The production system 3702 includes a number of production paths 3710A and 3710B. In the example shown in FIG. 37, such production paths 3710A and 3710B include a 4:3 production path 3710A and a 16:9 production path 3710B. These examples are provided for illustrative purposes only, and are not limiting. The invention is applicable to other formats. Also, the invention can accommodate more than two production paths.
The production system 3702 also includes a number of sources 3714, which may be cameras, tape machines, still stores, etc. The sources 3714 provide material for the show to the production paths 3710A and 3710B, either directly or via the control system 3708.
In operation, the control system 3708 controls the 4:3 production path 3710A to generate a show in a native 4:3 format, to thereby generate a native 4:3 output 3712A. Also, the control system 3708 controls the 16:9 production path 3710B to generate a show in a native 16:9 format, to thereby generate a native 16:9 output 3712B. In an embodiment, the control system 3708 simultaneously controls the 4:3 production path 3710A and the 16:9 production path 3710B such that the native 4:3 output 3712A and the native 16:9 output 3712B are synchronized with each other. In practice, the outputs 3712A, 3712B may not be precisely synchronized with each other, but are sufficiently synchronized so that any offset from each other is not easily discernable by viewers (or is at least not sufficiently distracting to viewers). In an embodiment, the production system 3702 is similar to that described in "Real Time Video Production System and Method," Serial No. 09/215,161, filed December 18, 1998, now U.S. Patent No. 6,452,612, issued September 17, 2002, referenced above (although the invention can be implemented using any production system having the functionality described herein). However, the production system 3702 is modified to include multiple production paths 3710A and 3710B, and the control system 3708 is modified to control the multiple production paths 3710A and 3710B. Further information regarding the production system 3702 is provided below.
Transition Macros for Controlling Multiple Production Paths
The invention uses a Transition Macro to achieve the functionality described herein. A transition macro is a set of video production commands, where each video production command is transmitted from a processing unit to a video production device. A user designs a production by combining one or more transition macros in a script. Execution of the transition macros causes the production system 3702 to produce the show in accordance with the script. Transition macros are described in great detail in U.S. Patent No. 6,452,612,
"Real Time Video Production System and Method," referenced above.
The '612 patent describes transition macros as controlling elements in a single production path. In the present invention, transition macros control elements in multiple production paths 3710A and 3710B. For example, in the example of FIG. 37, a given transition macro controls elements in both the 4:3 production path 3710A and the 16:9 production path 3710B (although some transition macros may still be limited to a subset of the production paths 3710A and 3710B). This feature of the invention is further described below.
Material for Multiple Formats
In an embodiment, the production system 3702 includes material in the native formats for each of the system 3702's production paths 3710A and 3710B. For example, for a given still, the production system 3702 includes a 4:3 version of the still, and a 16:9 version of the still. The production paths 3710A, 3710B may each include a dedicated still store, or may share a still store. Another example relates to cameras. In an embodiment, the 4:3 production path 3710A includes 4:3 format cameras, and the 16:9 production path 3710B includes 16:9 format cameras. Such duplication is generally (but not always) the case for the production paths 3710A and 3710B, to enable generation of the outputs 3712A and 3712B in the multiple native formats. For example, consider the case of mixed effect (ME) banks. In embodiments of the invention, the production system 3702 includes ME banks dedicated for the 4:3 production path 3710A, and other ME banks dedicated for the 16:9 production path 3710B.
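As an illustration only (not the patent's implementation), a transition macro that controls both production paths might carry parallel command lists, one per format; the command vocabulary and source names below are assumptions:

```python
# Hypothetical sketch: one transition macro expands into a command list for
# the 4:3 path and a command list for the 16:9 path, so a single user action
# drives both native outputs in step.

def cut_to_camera_macro(camera_id):
    """Build per-format command lists for a 'cut to camera' transition macro."""
    return {
        "4:3":  [("select_source", camera_id + "-4x3"),  ("cut",)],
        "16:9": [("select_source", camera_id + "-16x9"), ("cut",)],
    }

def execute_macro(macro):
    """Dispatch each path's commands; returns a log of (format, command) pairs."""
    log = []
    for fmt, commands in macro.items():
        for command in commands:
            # in a real system, the command would be sent to that path's devices
            log.append((fmt, command))
    return log

log = execute_macro(cut_to_camera_macro("CAM1"))
print(len(log))  # 4 commands: two per production path
```

The key point is that the user places one macro in the script; the per-format expansion is internal to the macro.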
A number of methods for acquiring materials for the production (in the different formats) are used, depending on a number of factors such as cost, availability, material types, as well as other implementation dependent factors.
For example, in an embodiment, field acquisitions will originally be in 16:9 format, since that format will capture more data (when compared to 4:3). The 16:9 raw footage will then be used to create 4:3 format material (via appropriate cropping, for example). The result will be two separate files, one in 16:9 format, and one in 4:3 format. Since both files were created prior to production, it is possible to optimize both for their respective uses (i.e., one for producing a 4:3 output, and one for producing a 16:9 output). In another embodiment, field acquisitions are obtained in multiple original formats.
Such operation of the invention is in contrast to many conventional post-production conversion techniques, which try to convert a 16:9 signal to a 4:3 signal (or vice versa). Since such cropping is performed post-production, the cropping is not specifically tailored to the characteristics and parameters of the target format.
Processing Applicable Format
As noted above, the production paths 3710A and 3710B include components that are dedicated to their operation. In practice, each production path 3710A and 3710B may include a dedicated component, or they may share a single component. For example, in some embodiments, the production paths 3710A and 3710B may each include character generator (CG) and still store components, or they may share the same character generator (CG) and still store components.
In an embodiment, all files are identified as being in either 4:3 format or 16:9 format (for example, each file may include meta data that denotes the applicable format). When the 4:3 production path 3710A accesses material, it ensures that it accesses files tagged as being in the 4:3 format. Similarly, when the 16:9 production path 3710B accesses material, it ensures that it accesses files tagged as being in the 16:9 format. In embodiments, such control is achieved through operation of the control system 3708, and/or through appropriate coding of transition macros.
Keyer Functionality
In an embodiment, the control system 3708 includes a keyer for the 4:3 production path 3710A, and a second keyer for the 16:9 production path 3710B.
As will be appreciated, the output signals 3712A and 3712B comprise a composite of signals from multiple sources. For example, the output signals 3712A and 3712B may comprise a first video signal having a hole, or "key," cut therein. A second video signal is then inserted inside the key. The output signals 3712A and 3712B are thus a composite of the first video signal and the second video signal.
In the invention, the production system 3702 includes video switchers to implement keying. The video switchers switch between multiple video sources (i.e., the first and second video signals from the above example). The switching operation of the video switchers is controlled by the control system 3708, which is operating according to commands / instructions contained in transition macros.
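The compositing just described can be sketched conceptually; frames are modeled here as 2-D lists of pixel values purely for illustration, whereas actual switchers perform this in dedicated hardware:

```python
# Conceptual sketch of keying: a key mask cuts a "hole" in the first video
# signal, and the second signal is inserted inside it. The output is a
# composite of the two signals.

def key_composite(background, insert, key_mask):
    """Where key_mask is 1, take the insert pixel; elsewhere keep background."""
    return [
        [ins if k else bg for bg, ins, k in zip(bg_row, in_row, k_row)]
        for bg_row, in_row, k_row in zip(background, insert, key_mask)
    ]

bg   = [[1, 1], [1, 1]]   # first video signal
ins  = [[9, 9], [9, 9]]   # second video signal
mask = [[0, 1], [0, 0]]   # the "key" hole
print(key_composite(bg, ins, mask))  # [[1, 9], [1, 1]]
```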
A given transition macro includes commands for both the 4:3 format and the 16:9 format. For example, assume the user 3704 inserts into a script the icon for a particular transition macro relating to keying. In an embodiment, the user 3704 need not be concerned with whether the icon relates to the 4:3 format or the 16:9 format. Instead, the icon is associated with a transition macro that includes video commands to perform the keying function for both the 4:3 format and the 16:9 format, such that the outputs 3712A and
3712B are synchronized. In other words, the invention provides a user interface 3706 that shields the user 3704 from the complexities of the generation of dual outputs 3712A, 3712B in dual formats.
Adjustment of Effects
The control system 3708 independently adjusts effects (such as DVE transitions like cuts, fades, wipes, etc.) for the desired look in a particular format (i.e., 4:3 or 16:9). This is achieved through transition macros. Specifically, a particular "cut" transition macro includes commands to effect the cut effect for the 4:3 format, and includes other commands to effect the cut effect for the 16:9 format. Such commands access either 4:3 format source materials, or 16:9 format source materials, depending on whether the commands are for the 4:3 format or the 16:9 format. Such commands also have appropriate duration settings so the native 4:3 output 3712A is synchronized with the native 16:9 output 3712B.
Source Management
The invention achieves source management to simultaneously generate the 4:3 output 3712A and the 16:9 output 3712B. The invention synchronizes and manages dual output control of video server ports for proper ingestion and processing in the system. The invention performs automated format detection to properly address source material that does not match one format or the other. Therefore, the invention utilizes a number of processes to "letterbox" a 4:3 format or "crop" a 16:9 format for proper display.
In other words, the control system 3708 must ensure that source materials are served to the 4:3 production path 3710A and the 16:9 production path 3710B such that the 4:3 output 3712A is synchronized with the 16:9 output 3712B. In one embodiment, the control system 3708 issues control signals to simultaneously activate servers in the two production paths 3710A, 3710B. In this embodiment, the production paths 3710A, 3710B each have dedicated servers. In another embodiment, the production paths 3710A, 3710B share servers. In this embodiment, the control system 3708 communicates with the shared server(s) to serve materials to the production paths 3710A, 3710B in a synchronous manner.
Exact synchronization between the outputs 3712A and 3712B is not necessary. The goal of the invention is to align the outputs 3712A and 3712B with one another so differences are not easily discernable to the human eye. The invention also performs automated format detection to properly address source material that does not match one format or the other. For example, suppose in a show a particular source was available only in 4:3 format. In order to use this source in both the 4:3 production path 3710A and the 16:9 production path 3710B, the source would have to be converted to the 16:9 format. Accordingly, the invention automatically determines whether a given source is only available in certain formats (or, equivalently, whether a given source is not available in any formats of interest). The invention then automatically converts the source to the other formats, using any well known technique. Such conversion can be performed either pre-production, or during production.
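A minimal sketch of the automated format detection and conversion decision described above; the function names and the dimension-based detection are illustrative assumptions:

```python
# Hypothetical sketch: infer a source's aspect ratio from its frame
# dimensions, then decide whether it must be letterboxed (4:3 into 16:9)
# or cropped (16:9 into 4:3) before use in the other production path.

from fractions import Fraction

def detect_format(width, height):
    """Classify a source by its pixel aspect ratio."""
    ratio = Fraction(width, height)
    if ratio == Fraction(4, 3):
        return "4:3"
    if ratio == Fraction(16, 9):
        return "16:9"
    return "unknown"

def conversion_needed(source_format, target_format):
    """Return the conversion process required, or None if formats match."""
    if source_format == target_format:
        return None
    if source_format == "4:3" and target_format == "16:9":
        return "letterbox"
    if source_format == "16:9" and target_format == "4:3":
        return "crop"
    return "unsupported"

print(detect_format(1920, 1080))         # 16:9
print(conversion_needed("16:9", "4:3"))  # crop
```

The conversion itself (pre-production or during production) would then be dispatched to whichever letterbox or crop process the system provides.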
Addressing Camera Sources
The invention addresses camera sources properly according to the respective formats (4:3 or 16:9).
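A hedged sketch of one way such format-aware camera handling might work: cropping a 16:9 frame to 4:3 around a region of interest known from the script. The frame size, coordinates, and function name are assumptions for illustration:

```python
# Hypothetical sketch: choose a 4:3 crop window of a 16:9 camera frame that
# keeps a script-defined region of interest (e.g. an over-the-shoulder
# graphic) in frame.

def crop_window_4x3(frame_w, frame_h, roi_left, roi_right):
    """Return (left, width) of a 4:3 crop of a 16:9 frame covering the ROI."""
    crop_w = frame_h * 4 // 3             # 4:3 width at full frame height
    center = (roi_left + roi_right) // 2  # center the window on the ROI
    # clamp so the window stays inside the frame
    left = max(0, min(center - crop_w // 2, frame_w - crop_w))
    return left, crop_w

# 1920x1080 frame; the graphic occupies x in [1300, 1800]
left, width = crop_window_4x3(1920, 1080, 1300, 1800)
print(left, width)  # 480 1440
```

Because the region of interest is known during pre-production, the crop can be chosen so the pertinent information appears in both native outputs.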
In some embodiments, both 4:3 cameras and 16:9 cameras are utilized to produce the native 4:3 output 3712A and the native 16:9 output 3712B. In other embodiments, only cameras of a single format are used, such as 16:9 cameras. The video output of such cameras is then adjusted (cropped) to form the 4:3 source material. As a result, source materials in both the 16:9 format and the 4:3 format are made available to the production paths 3710A, 3710B. Cropping of the 16:9 video material into the 4:3 format takes into consideration the parameters of the 4:3 format, as well as the parameters of the show being produced. Accordingly, compositing the video materials to form the 4:3 output 3712A and the 16:9 output 3712B is enhanced, thereby increasing the quality of these output signals 3712A and 3712B. For example, consider the case where the script calls for a video signal to appear above and to the right of the anchor's shoulders. By knowing this ahead of time (during pre-production), the 16:9 original video signal can be cropped into the 4:3 format so that all the pertinent information will be conveyed in both the 16:9 output 3712B, as well as the 4:3 output 3712A.
User Interface
As is clear from the description above, the user interface 3706 enables users 3704 to create scripts that, when executed by the control system 3708, simultaneously generate the 4:3 output 3712A and the 16:9 output 3712B. Such dual format operation of the invention is transparent to the user 3704. In other words, the user 3704 does not need to explicitly design the script to achieve such dual format broadcasts. Instead, the invention utilizes transition macros that are coded for both formats, as described above.
The invention also supports "hot keys" that are coded for both formats. For example, the invention supports Late Breaking News keys, which when activated by the user 3704 inserts into the script commands to accommodate a late breaking news segment. Hot keys are described in greater detail in U.S. Patent No. 6,452,612, referenced above. According to the invention, such hot keys are associated with transition macros having instructions for both the 4:3 format and the 16:9 format.
Automated Real-Time Execution Of Live Inserts Of Repurposed Stored Content Distribution
A method, system, and computer program product are provided for segmenting and marking digital video productions, such that all or segments of the production can be retrieved for a repurposed distribution over traditional mediums or computer networks, such as the Internet. As described in greater detail below, the present invention includes methodologies for removing or editing keyers, character generators, etc. The present invention enables one to change the order of the repurposed distribution or add segments from another production. The present invention also enables the insertion of advertisements and other information into the repurposed video stream.
Overview
Cable 24-hour news channels have found a niche for consumers looking for real time news information at the local/regional level. The 24-hour news channel model serves the community looking for updated news around the clock without having to wait for specific times as is typical of traditional broadcast local news. Traditional local broadcasters deliver live newscasts daily in the morning (5:00AM - 7:00AM), afternoon (11:00AM - 1:00PM), evenings (4:00PM - 6:00PM) and late evening hours (10:00PM - 12:00AM). The demand for local 24-hour news channels can be attributed to the different working hours and shifts associated with today's work environments along with the sense of immediacy that has been cultivated by the Internet.
In addition, in the not so distant future, local broadcasters will be able to participate in this market trend through the appropriate use of allocated digital bandwidth. The local broadcaster will be able to divide their digital bandwidth into multiple standard definition television (SDTV) channels that allow them to develop new applications. Local broadcasters in mid to large markets will take advantage of these channel opportunities to provide for new revenue applications such as 24-hour local news, local sports, local shopping and local education programming.
Two methods of deployment exist. One is to provide 24-hour "live" programming around the clock. The second is to provide "live" programming when the "acts" change. In other words, a cable or local broadcaster can produce live content in the first 1/2 hour followed by a "repurposed" 1/2 hour on the back end, with live inserts for stories or events that require real time updates such as traffic and weather reports along with breaking news stories of events occurring in real time. Today, this is managed manually since the intelligence does not exist to automate this process. The manual process does not allow for true automated live inserts without many resources to manage the process.
Therefore, a need exists to automate this process so that digital SDTV 24-hour programming channels can cost effectively produce content while the market develops. This will require a design that intelligently integrates the automated production environment with storage equipment, including the ability to edit stories in real time during the production process while automatically maintaining a database and daily schedule at both the micro (story level) and macro (show level) resolution.
Detailed Description
FIG. 40 is a production system 4002 according to an embodiment of the invention. The production system 4002 includes a user interface 4006 for enabling interaction with users 4004. The production system 4002 also includes a control system 4008 which is similar to the control system 3708 described above. The control system 4008 controls the production system
4002 to achieve the functionality described herein.
The production system 4002 includes a number of sources 4010, including a story archive 4012 and other production devices 4014. The story archive 4012 is a database having stored therein previously produced stories
(although a "story" stored in this database may be any portion of a show). The production devices 4014 are any known device for producing a show, such as a video switcher, camera, audio control, still store, tape machine, etc.
The production system 4002 further includes one or more switchers 4016, which receive sources 4010 and are controlled by the control system 4008.
The switchers 4016 generate an output 4018, which in an embodiment represents a live or as-live show.
The production system 4002 may be implemented using a production system as described in U.S. Patent No. 6,452,612, referenced above. FIG. 38 is a flowchart 3802 representing the operation of the production system 4002 according to an embodiment of the invention.
In step 3804, the user 4004 interacts with the user interface 4006 to generate a script for the show that is to be produced. Such operation is described in detail in U.S. Patent No. 6,452,612, referenced above. In step 3806, the production system 4002 produces the show according to the script generated in step 3804. Such operation is described in detail in
U.S. Patent No. 6,452,612, referenced above.
In step 3808, the production system 4002 segments the show
(produced in step 3806) according to some criteria. In an embodiment, the show is segmented according to content. Specifically, in an embodiment, the show is segmented according to story. Accordingly, in step 3808, the production system 4002 divides the show into its component stories.
In step 3810, these stories are stored in the story archive 4012, and indexed (or otherwise marked or tagged) for later retrieval. In an embodiment, the production system 4002 performs step 3808 by recording the beginning and end of each story in the show, and/or the duration of each story. This information is stored, for example in the story archive 4012, along with the show. Also stored is the story type or category of each story. Subsequently, the production system 4002 can uniquely access any story in the show using this information (meta data). It is noted that a "story" can be any portion of the show. Further description regarding the segmentation of a show is provided in "Method, System and Computer Program Product for Producing and Distributing Enhanced Media Downstreams," Serial No. 09/836,239, filed April 18, 2001, Attorney Docket No. 1752.0200000, referenced above.
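A minimal sketch of the segmentation metadata described in steps 3808 and 3810; the field names and data layout are assumptions for illustration:

```python
# Hypothetical sketch: record each story's in-point, out-point, duration,
# and category so any story in the show can later be uniquely retrieved
# from the story archive by itself.

def segment_show(show_id, boundaries, categories):
    """boundaries: story start times in seconds, ending with the show's end."""
    stories = []
    for i in range(len(boundaries) - 1):
        start, end = boundaries[i], boundaries[i + 1]
        stories.append({
            "show": show_id,
            "story": i,
            "start": start,
            "end": end,
            "duration": end - start,   # stored for later schedule fitting
            "category": categories[i],
        })
    return stories

archive = segment_show("news-0510", [0, 90, 210, 330],
                       ["traffic", "weather", "sports"])
print(archive[1]["duration"], archive[1]["category"])  # 120 weather
```

With this meta data stored alongside the show, a later production can pull, say, only the weather story without replaying the surrounding material.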
In step 3812, the production system 4002 generates additional programming, such as a new live or as-live show, using a combination of live inserts and archived material from the story archive 4012. The live inserts are generated using the production devices 4014. The material from the story archive 4012 may be modified, if necessary, using the production devices 4014. In some cases, the additional programming is comprised of only material from the story archive 4012, although such materials may be modified to some extent. The operation of step 3812 is shown in greater detail in FIG. 39, which shall now be considered.
In step 3902, the user 4004 interacts with the production system 4002 via the user interface 4006 to create or modify a listing for a Current Show. The Current Show is a new show being designed for production using live inserts and material from the story archive 4012. The listing lists the components of the Current Show. For example, FIG. 41 shows a Current Show Listing 4106 having 6 components / segments / stories: a traffic report 4120A, a weather report 4120B, sports spotlight 4120C, a live 10:15AM insert 4118B, a bridge collapse story 4120E, and an entertainment segment 4120F.
Note that the sports spotlight 4120C and the bridge collapse story 4120E are from the story archive 4012 (shown in the archived video window 4108 of FIG. 41). The 10:15AM insert 4118B is a live insert (see the live segments window 4104). FIG. 41 illustrates an example user interface 4102 (provided by the user interface 4006 shown in FIG. 40) for enabling the user 4004 to create and modify the Current Show Listing 4106. The user 4004 can drag and drop live segments from the live segment window 4104, and/or archived stories from the archived video window 4108. Within the current show window 4106, the user 4004 can use well known computer pointing techniques to arrange and order the components of the Current Show. Other user interfaces could alternatively be used, such as those described in U.S. Patent No. 6,452,612 and/or pending U.S. application Ser. No. 09/836,239, referenced above.
In step 3904, the production system 4002 maps the Current Show Listing 4106 to a production system script that comprises transition macros. Scripts and transition macros are described in detail in U.S. Patent No. 6,452,612, as well as other patents and patent applications referenced above.
In step 3906, the production system 4002 produces the Current Show according to the Current Show Listing 4106 and/or the script (from step 3904). Step 3906 includes steps 3908, 3910, and 3912, which shall now be described.
In step 3908, the production system 4002 retrieves and modifies, as necessary, archived materials from the story archive 4012, in accordance with the Current Show Listing 4106. Such retrieved materials are inserted into the output 4018 via the switcher 4016.
Step 3908 involves automatic "downstream" keyer changes for "as live" replay of the retrieved materials (as necessary). For example, suppose the retrieved materials include a "bug" (for example, a time and temperature overlay on top of the video). In this example, the production system 4002 would cause the keyer (one of the production devices 4014) to either eliminate the bug, or update the bug with the current time and temperature.
In an embodiment, step 3908 is achieved by modifying the retrieved materials, in the manner just described. In an alternative embodiment, this is achieved by not recording the bug in the materials when the materials were originally produced (in step 3806). In this embodiment, the bug is inserted into the output 4018 in real-time.
In step 3910, the production system 4002 generates live segments in accordance with the Current Show Listing 4106. Such live segments are inserted into the output 4018 via the switcher 4016. Note that the user interface 4102 includes a Time Over/Under for Current ½ Hour timer 4114, which indicates the amount of time that the show is over/under given the stories already aired and the stories that are yet to be aired, and a Time Left for Current ½ Hour timer 4116, which indicates the time remaining in the 30 minute show (these counters 4114, 4116 are based on a 30 minute show). These counters 4114, 4116 are used by the director 4004 to ensure that the show fits into the allotted time (i.e., 30 minutes in the example of FIG. 41). The director 4004 can control the length of any live segments to ensure that the show fits within the 30 minute slot. Alternatively, the director 4004 can select other archived materials from the story archive 4012 based on the duration of such materials to ensure the show fits within the allotted slot. Manipulation of the Current Show Listing 4106 in this manner is described in greater detail in "Method, System and Computer Program Product for Producing and Distributing Enhanced Media Downstreams," Serial No. 09/836,239, filed April 18, 2001, Attorney Docket No. 1752.0200000, referenced above (in the context of an approval application, system, method, and computer program product).
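The over/under and time-left computations behind timers 4114 and 4116 might be sketched as follows; a 30-minute slot is assumed per FIG. 41, and the function names are illustrative:

```python
# Hypothetical sketch of the two rundown timers: time over/under compares
# aired time plus the planned durations of remaining segments against the
# slot; time left is simply the slot minus aired time.

SLOT = 30 * 60  # seconds in the half-hour slot

def over_under(aired_seconds, remaining_durations):
    """Positive result: show runs over the slot; negative: under."""
    return aired_seconds + sum(remaining_durations) - SLOT

def time_left(aired_seconds):
    """Seconds remaining in the current slot."""
    return SLOT - aired_seconds

# 15 minutes aired; three remaining segments planned at 300, 400, 250 seconds
print(over_under(900, [300, 400, 250]))  # 50 seconds over
print(time_left(900))                    # 900 seconds remaining
```

A director seeing a positive over/under value could shorten a live segment or swap in a shorter archived story, as described above.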
In step 3912, the production system 4002 inserts advertisements into the output stream 4018. In an embodiment, the ads are "searched" to import the right ad with the correct duration, and possibly content. In other words, based on the scheduled "break" length, and/or the content of stories around the break, commercials of corresponding duration will be accessed and inserted into the output stream 4018.
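A hedged sketch of the commercial "search" just described: find a combination of available spots whose durations exactly fill the scheduled break. A simple depth-first search is used here purely for illustration; the spot ids and durations are assumptions:

```python
# Hypothetical sketch: choose a subset of commercial spots whose total
# length matches the break length, so the break is filled exactly.

def fill_break(spots, break_len):
    """Return a list of spot ids whose durations sum to break_len, or None."""
    def search(i, remaining, chosen):
        if remaining == 0:
            return chosen
        if i == len(spots) or remaining < 0:
            return None
        spot_id, duration = spots[i]
        # either take this spot or skip it
        return (search(i + 1, remaining - duration, chosen + [spot_id])
                or search(i + 1, remaining, chosen))
    return search(0, break_len, [])

spots = [("ad-A", 30), ("ad-B", 15), ("ad-C", 60), ("ad-D", 15)]
print(fill_break(spots, 45))  # ['ad-A', 'ad-B']
```

If segments run over or under, the same search can be rerun with the adjusted break length, matching the compensation described in the next step.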
As the Current Show is produced, some segments may run longer or shorter than originally expected. The production system 4002 keeps track of the overage or underage, and may adjust the duration of commercial breaks to compensate for such overage/underage. If the duration(s) of the commercial breaks change, then the production system 4002 selects and inserts into the output stream 4018 combination(s) of commercials that fit the duration of the breaks. Other aspects involving the selection of commercials are described in
"Advertisement Management Method, System, and Computer Program Product," Serial No. 10/247,783, filed September 20, 2002, Attorney Docket No. 1752.0450001, referenced above.

Media Manager
FIG. 42 is a block diagram showing a media manager 4202 and a server 4204 according to an embodiment of the invention. The media manager 4202 and server 4204 are optionally part of the production systems 3702, 4002 shown in FIGS. 37 and 40, for example.
The media manager 4202 manages the selection, play out status, and play out channel assignment of both video clips and graphic files from the server 4204.
The media manager 4202 / server 4204 combination has a number of features:
Media Types: The combination combines the management of a video server, still store, and character generator under one software interface. The media may consist of video and key. The key can be linked with the video for play out.
Integrated server: In an embodiment, the server 4204 is an integrated server that is capable of internally keying media, with key signals, and outputting the composite media out a single channel.

Clip & Graphic Selection: The Media Manager 4202 interfaces with user interfaces 3706, 4006 to allow the user 3704, 4004 to browse low-resolution images of the video clips and graphic files available in the server 4204. These clip or graphic IDs can then be assigned to a particular story within the rundown. Non-integrated servers may only be able to export a list of clip or graphic IDs available for selection.
Play out Status: The Media Manager 4202 updates the status of a clip assigned to a particular story. When a clip is assigned to a particular story within the rundown, the Media Manager 4202 tracks the air readiness of that particular clip. If a predefined graphic ID were entered into the rundown that does not exist, Media Manager 4202 would notify the rundown that the graphic requested is not air ready. When the production system 3702, 4002 monitors a rundown, a list of clips and graphics is generated under each story slug. If a clip or graphic is not air ready, the text is colored red. The status of the clip or graphic is automatically updated until the rundown is unmonitored. When a list item that is colored red becomes air ready, the text is changed to black.
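The red/black air-readiness coloring described above amounts to a membership test against the server's media, re-run while the rundown is monitored. A minimal sketch, assuming hypothetical names (this is not the patent's implementation):

```python
def status_color(item_id, server_media):
    """A rundown list item is colored red until the clip or graphic it
    references exists on the server (i.e., is air ready), black after."""
    return "black" if item_id in server_media else "red"

def refresh_statuses(rundown_items, server_media):
    """Re-color every clip/graphic listed under a story slug, as is
    done automatically while the rundown remains monitored."""
    return {item: status_color(item, server_media) for item in rundown_items}
```

When the missing graphic later arrives on the server, the next refresh flips its entry from red to black.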
Auto Assign Play out Channels: When a particular story is previewed for air, Media Manager 4202 automatically assigns the clips and graphics to an available play out channel (video & key) from a pool of channels. There are two parts to the channel assignment. Part one loads the media to the auto-assigned channel. Part two routes the auto-assigned channel to the proper background or key channel. (Channels on-air and requested channels for preview cannot exceed the total number of channels for the system.)
The media manager 4202 and server 4204 shall now be described in greater detail. As noted above, in an embodiment, the server 4204 is an integrated server. The media manager 4202 communicates and interacts with the server 4204 in the server 4204's native language, such that the media manager 4202 is capable of accessing additional functionality of the server 4204. The media manager 4202 manages the play-out of server clips and still devices. The media manager 4202 controls which channels (ports) 4206 of the server 4204 these play out on, and will ascertain whether or not the clip exists. When a particular clip on the script comes up to be output, the media manager 4202 automatically assigns the channel that it will be played out on, from the pool of channels that exist. This process is called auto-channeling. For example, suppose the production system 3702, 4002 is processing a "load media" command from a transition macro. According to the invention, the media manager 4202 automatically finds the next unused port 4206 of the server 4204, and uses that port 4206 to output the media from the server 4204. In other words, the port 4206 of the server 4204 need not be hard-coded into the "load media" command.
There are various benefits of auto-channeling. First, if the server port must be hard-coded into transition macros, then the library of transition macros becomes very large (for example, there must be a transition macro for each function / port combination). With auto-channeling, only a single transition macro for a given function is needed, since the channel 4206 of the server 4204 that is going to be used is automatically determined by the media manager 4202 when the command is executed. Second, auto-channeling greatly simplifies user operation, since the user need not be concerned with specifying the server port to use. Instead, the user can focus on selecting and ordering the transition macros in order to produce the desired show.
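The auto-channeling behavior described above, resolving the server port at execution time rather than hard-coding it into the transition macro, can be sketched as follows (illustrative Python only; names are assumptions, not the patent's implementation):

```python
def auto_assign_port(ports_in_use, total_ports):
    """Return the next unused server port (channel) from the pool, or
    None if every channel is on air or requested for preview."""
    for port in range(1, total_ports + 1):
        if port not in ports_in_use:
            return port
    return None

def load_media(clip_id, ports_in_use, total_ports):
    """Execute a 'load media' command with no hard-coded port: the
    port is chosen by auto-channeling when the command runs."""
    port = auto_assign_port(ports_in_use, total_ports)
    if port is None:
        raise RuntimeError("no free play-out channel")
    ports_in_use.add(port)
    return clip_id, port
```

Because the port is a runtime decision, one "load media" transition macro serves every port, rather than one macro per function/port combination.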
It is noted that the media manager 4202 can operate with non-integrated, commercially available servers. In this case, it is necessary to produce a configuration file or otherwise inform the media manager 4202 of the number and configuration of the server's ports / channels. Another aspect of the present invention, a director interface for production automation control, is described next with reference to FIGs. 43-48.
N. Director Interface for Production Automation Control
The present invention provides a director control interface for extracting production information from a newsroom information management system (such as, a news automation system available from iNEWS™,
Newsmaker, Comprompter, and the Associated Press) and populating a production control system. To produce a show (such as a news program), a producer creates a rundown to select the stories that will be featured on the show. The producer can save the rundown to a rundown file within a newsroom information management system, which allows other personnel involved with the production to gain access to the rundown. An example of a system that integrates a newsroom rundown with a production control system is described in the pending U.S. application entitled "Method, System and Computer Program Product for Full News Integration and Automation in a Real Time Video Production Environment" (U.S. Appl. Ser. No. 09/822,855), which is incorporated herein by reference in its entirety. The director for the show uses the producer's rundown file as the basis for creating a director's rundown sheet. The director's rundown sheet comprises multiple elements for producing a show. An element, therefore, comprises a group of commands for instructing a production crew to operate the production equipment and thereby, produce a segment or special effects for a show. An example is a voice-over (VO) element. In this case, several commands are required to execute a VO element or line item on the director's rundown. Specifically, commands are required for a video switcher, audio mixer, teleprompter, and a record playback device (RPD), such as a videotape recorder/player (VTR) or video server. These commands are "grouped" together to define the VO element.
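The grouping of per-device commands into a single element, such as the VO element just described, could be represented as follows. This is a hypothetical data-structure sketch; the device names, actions, and fields are assumptions for illustration, not the patent's format:

```python
# Hypothetical representation of a voice-over (VO) element: a named
# group of production commands executed together as one rundown line.
vo_element = {
    "name": "VO",
    "commands": [
        {"device": "video_switcher", "action": "take", "source": "server_ch1"},
        {"device": "audio_mixer", "action": "open", "channel": "anchor_mic"},
        {"device": "teleprompter", "action": "cue", "script": "vo_script"},
        {"device": "rpd", "action": "play", "clip": "vo_clip"},
    ],
}

def devices_for(element):
    """List the production devices an element instructs."""
    return [cmd["device"] for cmd in element["commands"]]
```

Executing the element means issuing every command in the group, which is what lets one rundown line item stand in for coordinated switcher, mixer, teleprompter, and RPD operations.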
In an automated production control environment, an element represents a group of commands for automating the control of production equipment without significant human interactions. An example of an automated production control environment is described in the aforementioned U.S. application 09/822,855 as well as pending U.S. application entitled "System and Method for Real Time Video Production and Multicasting" (U.S. Appl. Ser. No. 09/634,735), which is incorporated herein by reference in its entirety. As described in these U.S. applications, an automated production can be managed and controlled by an automation control program, such as the
Transition Macro™ multimedia production control program developed by ParkerVision, Inc. (Jacksonville, FL). Hence, an automation control program includes several groupings of commands, representing a macro element or group of macro elements. Accordingly, the director would create a macro element, comprising all the production commands necessary to represent an element on the show rundown. The macro element is executable to control the designated production devices, and thereby, produce a show segment or special effect, such as an introduction, package and tag segment (INTRO/PKG/TAG), a voice over segment (VO), a sound-on-tape segment (SOT), an over-the-shoulder segment (OTS), a VO/SOT combination, an on camera segment (ON-CAM), or other types of elements or segments of a show.
As used herein, the term "media production" includes the production of any and all forms of media or multimedia in accordance with the method, system, and computer program product of the present invention. A media production includes, but is not limited to, video of news programs, television programming (such as, documentaries, situation comedies, dramas, variety shows, interviews, or the like), sporting events, concerts, infomercials, movies, video rentals, or any other content. For example, a media production can include streaming video related to corporate communications and training, educational distance learning, or home shopping video-based "e" or "t" - commerce. Media productions also include live or recorded audio (including radio broadcast), graphics, animation, computer-generated content, text, and other forms of media and multimedia. Accordingly, a media production can be live, as-live, or live-to-tape. In a "live broadcast" embodiment of the present invention, a media production is recorded and immediately broadcast over traditional airwaves or other mediums (e.g., cable, satellite, etc.) to a television or the like. At the same time (or substantially the same time), the media production can be encoded for distribution over a computer network. In an embodiment, the computer network includes the Internet, and the media production is formatted in hypertext markup language (HTML), or the like, for distribution over the World Wide Web. However, the present invention is not limited to the Internet. A system and method for synchronizing and transmitting traditional and network distributions are described in the pending U.S. application entitled "Method, System, and Computer Program Product for Producing and Distributing Enhanced Media" (U.S. Appl. Ser. No. 10/208,810), which is incorporated herein by reference in its entirety.
The term "as-live" refers to a live media production that has been recorded for a delayed broadcast over traditional or network mediums. The delay period is typically a matter of seconds and is based on a number of factors. For example, a live broadcast may be delayed to grant an editor sufficient time to approve the content or edit the content to remove objectionable subject matter.
The term "live-to-tape" refers to a live media production that has been stored to any type of record playback device (RPD), including a video tape recorder/player (VTR), video recorder/server, virtual recorder (VR), digital audio tape (DAT) recorder, or any mechanism that stores, records, generates, or plays back via magnetic, optical, electronic, or any other storage media. It should be understood that "live-to-tape" represents only one embodiment of the present invention. The present invention is equally applicable to any other type of production that uses or does not use live talent (such as cartoons, computer-generated characters, animation, etc.). Accordingly, reference herein to "live," "as-live," or "live-to-tape" is made for illustration purposes, and is not limiting. Additionally, traditional or network distributions can be live or repurposed from previously stored media productions. In an embodiment, a macro element is imported, or integrated, into an automation control program, such as the Transition Macro™ multimedia production control program developed by ParkerVision, Inc. (Jacksonville, FL) that can be executed to control an automated multimedia production system. The Transition Macro™ program is described in the pending U.S. application entitled "System and Method for Real Time Video Production and
Multicasting" (U.S. Appl. Ser. No. 09/634,735), which is incorporated herein by reference in its entirety. As described in the aforesaid U.S. application, the Transition Macro™ program is an event-driven application that allows serial and parallel processing of media production commands. The pending U.S. application entitled "Method, System and Computer Program Product for
Producing and Distributing Enhanced Media Downstreams" (U.S. Appl. Ser. No. 09/836,239) also describes representative embodiments of a multimedia production environment that is implementable with the present invention, and is incorporated herein by reference in its entirety. As described in the aforesaid U.S. applications, an automated multimedia production environment includes a centralized media production processing device that automatically or semi- automatically commands and controls the operation of a variety of media production devices in analog and/or digital video environments.
The term "media production device" includes video switcher, digital video effects device (DVE), audio mixer, teleprompting system, video cameras and robotics (for pan, tilt, zoom, focus, and iris control), record/playback device (RPD), character generator, still store, studio lighting devices, news automation devices, master control/media management automation systems, commercial insertion devices, compression/decompression devices (codec), virtual sets, or the like. The term "RPD" includes VTRs, video recorders/servers, virtual recorder (VR), digital audio tape (DAT) recorder, or any mechanism that stores, records, generates or plays back via magnetic, optical, electronic, or any other storage media. In an embodiment, the media production processing device receives and routes live feeds (such as, field news reports, news services, sporting events, or the like) from any type of communications source, including satellite, terrestrial
(e.g., fiber optic, copper, UTP, STP, coaxial, hybrid fiber-coaxial (HFC), or the like), radio, microwave, free-space optics, or any other form or method of transmission, in lieu of, or in addition to, producing a live show within a studio. As discussed, a director control interface links a rundown file from a newsroom information management system with a production control system. The director control interface serves as a management tool for extracting the requisite information from a newsroom rundown file (e.g., the producer's rundown) and populating the production control system with the appropriate macro elements. Therefore, the director control interface of the present invention is compatible with any type of newsroom information management system as long as it can extract the requisite production information.
FIG. 43 illustrates a director control interface 4300 according to an embodiment of the present invention. Director control interface 4300 includes a plurality of page control lines 4301(a)-4301(n). Each page control line
4301(a)-4301(n) corresponds to a page or line-item from a newsroom rundown file. Control columns 4302-4318 include production information that has been selected from the newsroom rundown file or inputted by the director. Control columns 4302-4318 include auto-build column 4302, group column 4303, page column 4304, slug column 4305, on-cam column 4306, camera column 4307, shot type column 4308, VT/VR column 4309, v-source column 4310, SS column 4311, effects column 4312, order column 4313, TME column 4314, layer column 4315, Web ID column 4316, Web segment column 4317, and Web URL column 4318.
Auto-build column 4302 is associated with commands for selecting one or more macro elements that, when executed, control a production control system and produce one or more segments of a media production. In an embodiment, auto-build column 4302 automatically builds effects column 4312 based on the information in the other control columns 4302-4318. Group column 4303 allows the director to select and group together multiple page control lines 4301(a)-4301(n). Page control lines 4301(a)-4301(n) can be grouped under one story level. Therefore, multiple stories (identified by slug column 4305) can be grouped under a single story level. The grouped page control lines 4301(a)-4301(n) can be moved, deleted, or cached as a group.
A single control line 4301(a)-4301(n) or a group of control lines 4301(a)-4301(n) can be stored to a cache button (not shown) that is displayed on director control interface 4300. All data associated with the cached page control line 4301(a)-4301(n) is also stored to the cache button. As such, in an embodiment, one or more cache buttons allow the director to float stories and then re-insert them as needed. When activated, a cache button inserts pages or stories stored at the cache button into the page control lines 4301(a)-4301(n) and pushes everything else down.
Page column 4304 includes an alpha-numeric designator or page number for each page control line 4301(a)-4301(n). A collection of one or more pages (i.e., page control lines 4301(a)-4301(n)) comprises a story or segment of a media production. As shown, each control line 4301(a)-4301(n) is sequentially designated as A1, A2, A3, A4, A5, A6, B0, B1, B2, etc. The first character in page column 4304 identifies a specific block within a media production. A newscast, for example, is typically assembled into blocks known as A, B, C, and D blocks. A show block can be used to identify segments of a media production that can be used to sell advertisements.
Slug column 4305 identifies a unique story slug for each page control line 4301(a)-4301(n). The story slug is unique because the information provided in slug column 4305 does not change and therefore, is a constant descriptor of each control line 4301(a)-4301(n). This can be explained with reference to FIG. 44 and FIG. 45.
Another embodiment of director control interface 4300 is shown in FIG. 44 and FIG. 45. In FIG. 44, the first control line 4301(a)-4301(n) on director control interface 4300 is "A01 - Shooting". Specifically, page column 4304 reads "A01," and slug column 4305 reads "Shooting." FIG. 45 shows another embodiment of director control interface 4300, where the producer has moved the "Shooting" slug to another position. The slug is now positioned after the control line 4301(a)-4301(n) that has a page column 4304 reading "A05" and a slug column 4305 reading "Touchdown Club." A new control line 4301(a)-4301(n) has been created and given the designator "A05.5" in page column 4304. The new control line 4301(a)-4301(n) receives the story slug "Shooting", its script, and all of its production commands, and the previous control line 4301(a)-4301(n) having the value "A01" in page column 4304 no longer exists. As such, the slug value in slug column 4305 is unique, and becomes a key field for synchronizing director control interface 4300 with the newsroom rundown, as discussed in greater detail below.
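Because the slug is the constant key field while page numbers change as stories move, re-synchronizing the director's control lines with a reordered newsroom rundown can be sketched as a lookup keyed on slug. A minimal illustrative Python sketch (names assumed, not the patent's implementation):

```python
def resync_rundown(director_lines, newsroom_slug_order):
    """Reorder the director's page control lines to follow the
    newsroom rundown, matching on the unique slug rather than the
    page number (which is renumbered when a story is moved)."""
    by_slug = {line["slug"]: line for line in director_lines}
    return [by_slug[slug] for slug in newsroom_slug_order if slug in by_slug]
```

Moving "Shooting" after "Touchdown Club" in the newsroom rundown thus carries its script and production commands with it, since the match is on slug, not page.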
Referring back to FIG. 43, on-cam column 4306 indicates the talent(s) that will speak or read the story identified by slug column 4305. As described in the aforementioned U.S. application entitled "System and Method for Real Time Video Production and Multicasting" (U.S. Appl. Ser. No. 09/634,735), a camera preset and/or audio preset can be established for recording a news anchor. On-cam column 4306 allows one or more anchor names to be associated with a preset position. For example, a "News Set" may consist of Talent Position 1, Talent Position 2, Talent Position 3, and Talent Position 4. The anchor name "Deb" can be assigned to Talent Position 1, and the anchor name "Tom" can be assigned to Talent Position 2. Director control interface 4300 has the ability to equate the names placed in on-cam column 4306 with the talent positions on the "News Set."
Camera column 4307 indicates the camera source for recording the story identified in slug column 4305. A primary and secondary camera can be selected. In an embodiment, a drop-down list of available cameras can be reviewed for camera selection. The list depends on the talent position and/or desired special effects or shot types.
Shot type column 4308 includes instructions for framing a camera shot. For example, the framing can be a straight shot, over-the-shoulder (OTS) shot, wide shot, or the like. In an embodiment, a dialog box can be opened to display a list of user-defined shot type names that can be selected for entry.
VT/VR column 4309 specifies the type of element or segment being produced for the story identified in slug column 4305. As discussed above, a segment type includes an INTRO/PKG/TAG, VO, SOT, OTS, VO/SOT combination, ON-CAM, or the like. In an embodiment, a dialog box is opened to display a list of user-defined segment type names that can be selected for entry.
V-source column 4310 identifies a machine source and filename. Machine source includes a RPD as described above. The filename can be expressed as a time code, server clip identifier, or the like. V-source column 4310 can include one or more filenames for one or more machine sources.
SS column 4311 identifies a source and address for a still store or character generator (CG) device. SS column 4311 can include one or more sources and/or addresses to a single or multiple store or CG devices.
Effects column 4312 indicates the type of transition effects or special effects that are needed for the story identified in slug column 4305. The effects include, but are not limited to, fades, wipes, DVE, downstream keyer (DSK) effects, or the like. DVE effects include, but are not limited to, warps, dual-box (double box) effects, page turns, music, slab effects, and sequences. DSK effects include DVE and DSK linear, chroma and luma keyers. In an embodiment, a dialog box is opened to display a list of user-defined effect names that can be selected for entry. An effect can be assigned to an entire page control line 4301(a)-4301(n). Such global effects include a double box, bump, or the like. An effect can also be assigned to a specific layer of a page control line 4301(a)-4301(n). For example, a second level entry grid can be opened to allow a director to assign an effect to a particular production layer (e.g., camera, tape, still store, etc.).
Order column 4313 indicates the order in which events will occur while producing the story identified in slug column 4305. For example, if a page control line 4301(a)-4301(n) includes a camera shot, tape, and still store, order column 4313 can specify the order as being (CAM, VT, SS), (VT, SS, CAM), (CAM, SS, VT), or the like.
TME column 4314 includes an association name or acronym for an association file, which corresponds to one or more macro elements for producing a segment of a media production. In an embodiment, a tool window is opened to allow the director to search for an association name to select the correct macro element. If auto-build column 4302 is activated, TME column 4314 is left blank and the macro element(s) is automatically selected, as discussed below.
A system and method for selecting and importing association names or acronyms are described in the aforementioned U.S. application entitled "Method, System and Computer Program Product for Full News Integration and Automation in a Real Time Video Production Environment" (U.S. Appl. Ser. No. 09/822,855). Other examples of such systems and methods are described in the pending U.S. application entitled "Method, System, and Computer Program Product for Producing and Distributing Enhanced Media" (U.S. Appl. Ser. No. 10/208,810), which is incorporated herein by reference in its entirety; and in pending U.S. application entitled "Method, System and Computer Program Product for Producing and Distributing Enhanced Media Downstreams" (U.S. Appl. Ser. No. 09/836,239), which is incorporated herein by reference in its entirety.
Layer column 4315 includes instructions for keying or compositing layers over a background image. In an embodiment, layer column 4315 specifies the source(s) for a background, key hole, and key fill. Layer column 4315 can also specify a keyer(s), DVE(s), DVE channels, or other related production values.
Web ID column 4316 includes post-production distribution instructions for the story identified in slug column 4305. As discussed above, a media production can be encoded for distribution over a computer network, such as the global Internet. Web ID column 4316 enables the director to associate the story of slug column 4305 with a scheduled network distribution. In an embodiment, a combo box is opened to list the available shows for encoding. The director selects one of the available shows to associate with the story identified in slug column 4305.
Web segment column 4317 identifies a classifier for the story identified in slug column 4305. The director can select the classifier from a combo box listing all show segment classifications. In an embodiment, the director can choose from a library of major and minor classifications. For example, a major classification or topic can be sports, weather, headline news, traffic, health watch, elections, or the like. An exemplary minor classification or category can be local sports, college basketball, NFL football, high school baseball, local weather, national weather, local politics, local community issues, local crime, editorials, national news, or the like. Classifications can expand beyond two levels to an unlimited number of levels for additional granularity and resolution for segment type identification and advertisement targeting.
Web URL column 4318 allows the director to enter auxiliary information for the story identified in slug column 4305. Auxiliary information enhances the value of the story by making available graphics, extended play segments, opinion research data, URLs, advertisements, or the like. Web URL column 4318 includes a filename, path, URL, or like address to auxiliary information that is linked to director control interface 4300.
The aforementioned list of control columns 4302-4318 is provided by way of example and not limitation. Additional control columns for selecting requisite production information can be included and are intended to be within the scope of the present invention. Director control interface 4300 also includes an import activator 4319, link activator 4320, air activator 4321, start time field 4322, end time field 4323, and over/under field 4324. Import activator 4319 instructs director control interface 4300 to import the production information into a production control system. Link activator 4320 instructs director control interface 4300 to monitor the newsroom rundown for changes. Air activator 4321 allows the director to approve a page control line 4301(a)-4301(n) to be executed on a production control system. Once the production control system starts to execute the macro elements imported from director control interface 4300, start time field 4322 displays the time the show begins. End time field 4323 displays the projected completion time for the show.
Over/under field 4324 displays a contemporaneous difference between an updated projected completion time and the originally projected completion time displayed in end time field 4323. Each story identified at slug column 4305 has an estimated story time. When the macro elements associated with a story start to execute, the actual time for the previous story replaces its estimated story time. The updated projected completion time is a measure of the actual duration of all executed stories plus an estimated duration for the stories remaining to be executed. The updated projected completion time minus the value of end time field 4323 equals the over/under time reported in over/under field 4324.
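The over/under computation just described reduces to a simple sum. A minimal Python sketch of that arithmetic (names are assumptions for illustration):

```python
def over_under(actual_aired, estimated_remaining, start_time, scheduled_end):
    """Over/under time: updated projected completion time minus the
    originally projected completion time (the end time field value).

    actual_aired: actual durations (seconds) of stories already executed.
    estimated_remaining: estimated durations of stories yet to execute.
    """
    projected_end = start_time + sum(actual_aired) + sum(estimated_remaining)
    return projected_end - scheduled_end
```

A positive result means the show is running over (e.g., 40 seconds heavy); a negative result means it is running under, time the director can fill with a live segment or archived material.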
In an embodiment, director control interface 4300 includes a next story button (not shown) and next page button (not shown). The next story button skips to the next story on director control interface 4300. As such, the current event is not executed. The next page button skips to the next page number (page column 4304), so that the current event is not executed. As discussed above, several pages may compose a single story.
Referring to FIG. 46, flowchart 4600 describes an example of a control flow for building director control interface 4300. Specifically, flowchart 4600 describes an operational flow for setting production values and macro elements for execution on a production control system, according to an embodiment of the present invention.
The control flow of flowchart 4600 begins at step 4601 and passes immediately to step 4603. At step 4603, the director identifies or selects the production information for building a director's rundown for a show. In an embodiment, the production information is based on industry standard terms. Therefore, the present invention can be implemented in various environments without requiring the director to learn specific jargon.
The production information is collected from a newsroom information management system. As discussed above, a producer selects stories for a show and saves the selections to a newsroom rundown file. Portions of the production information are imported into director control interface 4300 from a newsroom rundown. In an embodiment, a dialog is opened to select a path to the newsroom rundown. Thus, the dialog would contain a tree of rundown files and dates prepared and saved by the producer.
Once imported, the newsroom rundown populates the page control lines 4301(a)-4301(n) of director control interface 4300. At a minimum, the field values in page column 4304, slug column 4305, and on-cam column 4306 are extracted from the newsroom rundown. The director has the option of extracting additional field values (such as, tape, stills, OTS, etc.), as needed to build each story. The director must also specify or confirm a running order of stories by block, page number, and unique slug name.
As discussed above, director control interface 4300 has two modes of operation. In manual mode, the director manually selects macro elements for a story. In auto-build mode, director control interface 4300 automatically builds macro elements for a story. The mode determines the type of production information that must be extracted from the newsroom rundown or completed by the director.
For auto-build mode, the following fields must be imported or completed on director control interface 4300. Auto-build column 4302 must be activated. Page column 4304, slug column 4305, on-cam column 4306, camera column 4307, shot type column 4308, v-source column 4310, SS column 4311, effects column 4312, Web ID column 4316, Web segment column 4317, and Web URL column 4318 must be completed with the requisite data.
For manual mode, the following fields must be imported or completed. Auto-build column 4302 must be deactivated. Page column 4304, slug column 4305, on-cam column 4306, camera column 4307, shot type column 4308, VT/VR column 4309, v-source column 4310, SS column 4311, effects column 4312, order column 4313, TME column 4314, layer column 4315, Web ID column 4316, Web segment column 4317, and Web URL column 4318 must be completed.
After the production information has been imported or completed, control passes to step 4606. At step 4606, one or more macro element files are identified or selected for each story that is uniquely identified at slug column 4305. If manual mode is set at step 4603, the director would input the association names for the appropriate macro element file. If, however, auto-build mode is set at step 4603, functions or routines associated with auto-build column 4302 are executed to select the appropriate macro element files. In an embodiment, a library of macro elements is indexed by production field values. The auto-build functions or routines are executed to search the macro element library to find macro element files having production field values that match the production information specified in page control lines 4301(a)-4301(n). In an embodiment, each combination of production values has a default macro element file. Some combinations have a secondary choice, third choice, etc.
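The auto-build lookup described above, a query against a macro element library indexed by production field values, with a default and fallback choices, can be sketched as follows. This is an illustrative Python sketch; the field names and library layout are assumptions, not the patent's schema:

```python
def auto_build_matches(control_line, macro_library, fields):
    """Return macro element files whose indexed production field
    values match a page control line. The first match is the default
    choice; later matches are the secondary choice, third choice, etc.

    control_line: dict of production values (on-cam, shot type, ...).
    macro_library: list of dicts, each with a 'file' key plus the
    indexed field values.
    """
    return [entry["file"] for entry in macro_library
            if all(entry.get(f) == control_line.get(f) for f in fields)]
```

If the default choice turns out to be unusable (for example, the device it targets is already committed), the builder can fall back to the next match in the list.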
Auto-building is performed on one page control line 4301(a)-4301(n) at a time. At times, a previous page control line 4301(a)-4301(n) may need to be changed due to selections made on a current page control line 4301(a)-4301(n). For example, assume there are two DVE devices designated as DVE1 and DVE2. Further assume that DVE1 is the only available DVE that can implement a page turn effect. If four keyer layers are used on one page control line 4301(a)-4301(n), an auto-build default macro element may be built for DVE1. If on the next page control line 4301(a)-4301(n), a similar four keyer layer effect is required and a page turn is needed, the auto-build functions or routines must go back to the previous page control line 4301(a)-4301(n) and build effects on DVE2 to allow DVE1 to page turn DVE2 off to transition to the next page control line 4301(a)-4301(n).
In an embodiment, the auto-build functions or routines select macro element files based on a user-defined macro element type. The director specifies the type of macro element file to be built. The auto-build functions or routines identify the appropriate search fields and query the macro element library for the appropriate macro elements.
In an embodiment, the present invention supports four macro element types: a camera macro element type, a tape/server macro element type, a still store macro element type, and an effects macro element type. Each macro element type is associated with a combination of different control columns 4302-4318. A camera macro element type is associated with on-cam column 4306, shot type column 4308, camera column 4307, and effects column 4312. A tape/server macro element type is associated with on-cam column 4306, VT/VR column 4309, v-source column 4310, and effects column 4312. A still store macro element type is associated with on-cam column 4306, SS column 4311, v-source column 4310, and effects column 4312. Finally, an effects macro element type is associated with a camera macro element type, tape/server macro element type, or still store macro element type. Alternatively, an effects macro element type can be a separate macro element type (e.g., Double Box, Bumps, etc.).
After the macro element files have been selected and associated with each story of page control lines 4301(a)-4301(n), control passes to step 4609. At step 4609, the page control lines 4301(a)-4301(n) are checked for errors. In an embodiment, director control interface 4300 includes a status column (not shown) that indicates the current state of each page control line 4301(a)-4301(n). If the production information is complete for a particular page control line 4301(a)-4301(n), a status light for the status column turns green. If anything is missing, the status light is red. Additionally, if a page control line 4301(a)-4301(n) does not have sufficient information to auto-build a macro element, the status light turns red and/or TME column 4314 turns red.
If a tape/server macro element type or a still store macro element type is assigned and no identifier is given, the status light turns red and/or v-source column 4310 or SS column 4311 turns red.
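The per-line status check described above reduces to testing whether every required production field is filled. The sketch below is an illustrative assumption; the actual required-column set and status mechanics in the patent differ by macro element type.

```python
# Illustrative sketch of the per-line status check described above.
# The required-field list is an assumption, not the patented column set.

REQUIRED_FIELDS = ["page", "slug", "on_cam", "camera", "shot_type", "tme"]

def line_status(page_control_line):
    """Return 'green' if all required production fields are filled, else 'red'."""
    missing = [f for f in REQUIRED_FIELDS if not page_control_line.get(f)]
    return "green" if not missing else "red"

line = {"page": "A01", "slug": "FIRE", "on_cam": "ANCHOR1",
        "camera": "CAM1", "shot_type": "CU", "tme": "tme_cam1_cu"}
print(line_status(line))             # -> green
print(line_status({"page": "A02"}))  # -> red (most fields missing)
```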
In an embodiment, director control interface 4300 detects and/or resolves conflicts while it executes the error checking process. For example, when a macro element is automatically assigned to a page control line 4301(a)-4301(n), the macro element(s) assigned to the previous page control line 4301(a)-4301(n) is checked for conflicts. If a conflict exists, the conflict is resolved by selecting an alternate macro element. If no alternate macro element is available or has been specified, the story is flagged and/or TME column 4314 turns red.
Conflict checking can also be executed for macro elements selected in manual mode. Thus, for auto-build and manual modes, macro elements assigned in TME column 4314 for each page control line 4301(a)-4301(n) are checked for conflicts with macro elements assigned in the preceding and subsequent page control lines 4301(a)-4301(n). In an embodiment, a dialog box is opened to create a list of macro elements that cannot go back to back. The dialog box can also include an option for assigning an alternate macro element to be inserted if a conflict arises. Therefore, back-to-back conflicting macro elements are flagged, and if an alternate macro element has been assigned, the alternate macro element is inserted. If an alternate macro element has not been assigned, the status light would turn red and/or TME column 4314 would turn red to flag the director.
In an embodiment, shot type column 4308 is checked for conflicts. A conflict may arise if two page control lines 4301(a)-4301(n) specify instructions for back-to-back camera shots that have different field values in shot type column 4308. For example, assume that CAM1 has been assigned as a primary camera and CAM2 has been assigned as a secondary camera for two macro element files. The macro element files are associated with two adjacent page control lines 4301(a)-4301(n). In other words, an association name for the first macro element file is referenced in TME column 4314 for the first page control line 4301(a)-4301(n). Similarly, an association name for the second macro element file is referenced in TME column 4314 for the second page control line 4301(a)-4301(n). Further, assume that CAM1 has been assigned to execute an OTS camera shot for the first macro element. A different shot type, however, is specified in shot type column 4308 for CAM1 in the second macro element. Since CAM1 has been selected to record two distinct back-to-back shot types, it would be difficult to produce a smooth transition between the two macro elements. To resolve the conflict, the secondary camera CAM2 is selected for the second macro element. If no secondary camera had been assigned, the status light would turn red and/or camera column 4307 and TME column 4314 would turn red.
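The camera shot-type conflict resolution just described can be sketched as follows. The data shapes and field names are hypothetical; the sketch only captures the rule that the same camera cannot frame two distinct back-to-back shots, so a secondary camera is substituted when one is assigned.

```python
# Hedged sketch of back-to-back shot-type conflict resolution: if adjacent
# page control lines assign the same camera with different shot types, fall
# back to a secondary camera. Data shapes are illustrative assumptions.

def resolve_camera_conflict(prev_line, curr_line):
    """Return the camera to use for `curr_line`, or None to flag the director."""
    same_camera = prev_line["camera"] == curr_line["camera"]
    different_shot = prev_line["shot_type"] != curr_line["shot_type"]
    if same_camera and different_shot:
        # Conflict: one camera cannot frame two distinct back-to-back shots.
        # A missing secondary camera (None) corresponds to the red status light.
        return curr_line.get("secondary_camera")
    return curr_line["camera"]

prev = {"camera": "CAM1", "shot_type": "OTS"}
curr = {"camera": "CAM1", "shot_type": "CU", "secondary_camera": "CAM2"}
print(resolve_camera_conflict(prev, curr))  # -> CAM2
```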
Referring back to FIG. 46, if the production information passes error checking (including conflict resolution), control passes to step 4612. At step 4612, the director imports the production information from director control interface 4300 to a production control system. Techniques and/or methodologies for importing a newsroom rundown to populate a control interface for a production control system are described in the aforementioned U.S. application entitled "Method, System and Computer Program Product for Full News Integration and Automation in a Real Time Video Production Environment" (U.S. Appl. Ser. No. 09/822,855).
FIG. 47 illustrates a production control interface 4700 for a production control system, according to an embodiment of the present invention. Production control interface 4700 includes a plurality of control lines 4701(a)-4701(n). As shown, control lines 4701(a)-4701(n) have not been populated with production information from director control interface 4300. To load the macro element files from director control interface 4300, the director, in an embodiment, can activate an icon, use a pull-down menu, or the like to execute an import function. In this embodiment, import window 4702, activated from a pull-down tab, identifies the ready-to-air rundown to be converted into macro elements. To activate, the director clicks on the "import" button. During the conversion process, the association names listed in TME column 4314 would call up the macro element files.
FIG. 48 illustrates another embodiment of production control interface 4700. As shown, control lines 4701(a)-4701(n) have been populated with macro element files following the conversion process. In this embodiment, macro element files 4804(a)-4804(e) are associated with production commands for five elements from a show rundown. In an embodiment, different colors can be assigned to each macro element file 4804(a)-4804(e) to allow the director to quickly and visually identify the type of element (e.g., VO, INTRO, SOT, or the like).
Referring back to FIG. 46, control passes to step 4615 after the production information has been imported into a production control system (such as production control interface 4700). At step 4615, director control interface 4300 monitors the newsroom rundown for changes (e.g., from the producer). If changes are detected, the present invention provides mechanisms for updating director control interface 4300 and/or production control interface 4700. As discussed above, the field value in slug column 4305 is unique, and represents a key field for synchronizing director control interface 4300 with a newsroom rundown. Hence, the slug field is a key for searching the records of the newsroom rundown on a periodically scheduled basis. The records matching the slug key are compared with the production information corresponding to the page control line 4301(a)-4301(n) having the same slug value in its slug column 4305.
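The slug-keyed comparison described above might look like the following sketch. The record layout and the subset of synchronized fields are assumptions made for illustration; the patent itself compares the full production information for each matching slug.

```python
# Sketch of slug-keyed synchronization between the newsroom rundown and the
# director interface; record layout is an illustrative assumption.

SYNC_FIELDS = ["on_cam", "camera", "shot_type"]  # illustrative subset

def detect_changes(newsroom_records, page_control_lines):
    """Return slugs whose synced fields differ between rundown and interface."""
    by_slug = {rec["slug"]: rec for rec in newsroom_records}
    changed = []
    for line in page_control_lines:
        rec = by_slug.get(line["slug"])  # slug is the unique key field
        if rec and any(rec.get(f) != line.get(f) for f in SYNC_FIELDS):
            changed.append(line["slug"])
    return changed

# The producer reassigns the talent read for the "FIRE" story:
rundown = [{"slug": "FIRE", "on_cam": "ANCHOR2", "camera": "CAM1", "shot_type": "CU"}]
lines = [{"slug": "FIRE", "on_cam": "ANCHOR1", "camera": "CAM1", "shot_type": "CU"}]
print(detect_changes(rundown, lines))  # -> ['FIRE']
```

A scheduler would call `detect_changes` on the periodic basis the text describes, then apply the linking-mode policy to any slugs returned.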
If changes are detected, several courses of action are taken depending on the linking mode. As discussed above, link activator 4320 instructs director control interface 4300 to monitor the newsroom rundown for changes. Link activator 4320 also enables the director to specify the linking mode for correcting or synchronizing director control interface 4300. In an embodiment, four linking modes are supported by the present invention. The linking modes include fully linked, timeline approval only, timeline and director interface approval, and fully manual.
In a fully linked mode, changes that are made on the newsroom rundown are automatically updated on director control interface 4300. After director control interface 4300 has been updated, the changes are evaluated for conflicts. If auto-build has been activated, association names for the macro elements files are selected or updated (if required) in TME column 4314. Production control interface 4700 is also updated.
If the mode has been set for timeline approval only, changes that are made on the newsroom rundown are automatically updated on director control interface 4300. After director control interface 4300 has been updated, the changes are evaluated for conflicts. If auto-build has been activated, TME column 4314 is updated, if required, with the proper association names for macro element files. However, production control interface 4700 is not automatically updated. The director is alerted that changes have been made to director control interface 4300. The director is granted an option to accept the changes and update production control interface 4700.
If the mode is set for timeline and director interface approval, the changes that are made on the newsroom rundown are not automatically made on director control interface 4300. The changes must first be approved by the director. When director control interface 4300 has been properly updated, the director must grant authorization to update production control interface 4700.
In fully manual mode, the changes that are made on the newsroom rundown are automatically updated on director control interface 4300. The director, however, must manually review TME column 4314 and select or update the association names for macro elements, if necessary. The director must re-import the production information from director control interface 4300 to production control interface 4700 to incorporate the changes.
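The four linking modes reduce to a small policy table over two outcomes: whether director control interface 4300 updates automatically, and whether production control interface 4700 updates without further action. The sketch below is an assumption-laden simplification (mode names and the single approval flag are invented for illustration; the "timeline and director interface approval" mode actually involves two separate approvals).

```python
# Illustrative dispatch over the four linking modes described above; the mode
# names and the single `director_approves` flag are simplifying assumptions.
from enum import Enum

class LinkMode(Enum):
    FULLY_LINKED = 1
    TIMELINE_APPROVAL = 2
    TIMELINE_AND_DIRECTOR_APPROVAL = 3
    FULLY_MANUAL = 4

def apply_rundown_change(mode, director_approves=False):
    """Return (director_interface_updated, production_interface_updated)."""
    if mode is LinkMode.FULLY_LINKED:
        return True, True                      # both update automatically
    if mode is LinkMode.TIMELINE_APPROVAL:
        return True, director_approves         # director approves second step
    if mode is LinkMode.TIMELINE_AND_DIRECTOR_APPROVAL:
        return director_approves, director_approves  # approval gates both
    # FULLY_MANUAL: director interface updates, but the director must
    # re-import to the production interface by hand.
    return True, False

print(apply_rundown_change(LinkMode.FULLY_LINKED))       # -> (True, True)
print(apply_rundown_change(LinkMode.TIMELINE_APPROVAL))  # -> (True, False)
```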
Thus, according to embodiments of the present invention, changes in the newsroom rundown ripple to director control interface 4300 and production control interface 4700. An anchor read change is one example. A producer may change the talent that has been assigned to read a story or stories. When talent reads are changed, instructions associated with director control interface 4300 re-assign the macro elements for each story line that has changed. Each time a new macro element is selected, the macro element is compared to the previous macro element and the next macro element for conflicts. This process continues down the rundown until no conflicts are found or no macro elements are changed.
The director can impose changes onto director control interface 4300 by inserting or deleting page control lines 4301(a)-4301(n). These changes must also be synchronized with production control interface 4700 and the newsroom rundown. For instance, if stories are deleted from director control interface 4300 and not from the newsroom rundown, director control interface 4300 is no longer synchronized with the newsroom rundown and the link between the two is suspended. If link activator 4320 is activated to un-suspend the link, the two rundowns are compared and warnings are issued of any mismatch. The director can decide to accept the newsroom rundown changes or not. If the director chooses to not accept the changes, the link is once again suspended.
If the director inserts cache pages back into director control interface 4300, the stories are inserted on director control interface 4300, but not on the newsroom rundown. Consequently, director control interface 4300 is no longer synchronized with the newsroom rundown, and the link between the two is suspended. If link activator 4320 is activated to un-suspend the link, the rundowns are compared and warnings are issued of any mismatch. The director can decide to accept the newsroom rundown changes or not. If the director chooses to not accept the changes, the link is again suspended.
Once the director control interface 4300 has been synchronized to any changes in the newsroom rundown, the control flow ends as indicated at step 495.
FIGs. 43-48 are conceptual illustrations allowing an easy explanation of the present invention. It should be understood that embodiments of the present invention could be implemented in hardware, firmware, software, or a combination thereof. In such an embodiment, the various components and steps would be implemented in hardware, firmware, and/or software to perform the functions of the present invention. That is, the same piece of hardware, firmware, or module of software could perform one or more of the illustrated blocks (i.e., components or steps).
The present invention can be implemented in one or more computer systems capable of carrying out the functionality described above with reference to FIG. 3.
O. Conclusion
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention. This is especially true in light of technology and terms within the relevant art(s) that may be later developed. Thus, the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims

What Is Claimed Is:
1. A method for controlling a production studio for producing a television show, the method comprising: sending control commands to a plurality of video production devices from a processing unit; associating one or more icons representing video production device control buttons with one or more control commands; creating, via a hierarchical user interface, a transition macro by placing one or more of said icons on a time sheet; and executing, via said hierarchical user interface, said transition macro to control said plurality of video production devices during the television show.
2. The method of claim 1, wherein the creating step includes placing one or more of said icons on said time sheet to create a hierarchical view of group layers.
3. The method of claim 2, wherein said hierarchical view of group layers is a graphical user interface view.
4. The method of claim 2, wherein said hierarchical view of group layers includes one or more of an object group layer, a TME group layer, a page group layer, a story group layer and a show group layer.
5. The method of claim 4, wherein said TME group layer, said page group layer and said story group layer each have an associated handle that provides for the maneuverability of the layer.
6. The method of claim 4, wherein said object group layer may be a member of said TME group layer, said TME group layer may be a member of said page group layer, said page group layer may be a member of said story group layer and said story group layer may be a member of said show group layer.
7. The method of claim 1, further comprising maintaining a dynamic link between a newsroom computer system and said time sheet.
8. The method of claim 1, wherein two or more of said icons, each being associated with different ones of said video production devices, are placed on said time sheet so that they are executed simultaneously during said executing step.
9. The method of claim 8, further comprising selecting one or more of said icons to execute as soon as they are free from their current operation.
10. The method of claim 1, wherein said plurality of video production devices includes one of a camera, a character generator, a digital video effect (DVE) device, and an audio mixer.
11. The method of claim 1, where said icon is one of GPI/O Mark Icon, Jump Mark Icon, DVE Mark Icon, Keyer Icon, Audio Icon, Script Viewer Icon, CG/SS Icon, Machine Control Icon, Camera Preset Icon, and GPO Icon.
12. A system for controlling a production studio for producing a television show, comprising: a processing unit that sends control commands to a plurality of video production devices; one or more icons that represent video production device control buttons associated with one or more control commands; and a hierarchical user interface that is used to create a transition macro by placing one or more of said icons on a time sheet, wherein said hierarchical user interface is also used to execute said transition macro to control said plurality of video production devices during the television show.
13. The system of claim 12, wherein one or more of said icons are placed on said time sheet to create a hierarchical view of group layers.
14. The system of claim 13, wherein said hierarchical view of group layers is a graphical user interface view.
15. The system of claim 13, wherein said hierarchical view of group layers includes one or more of an object group layer, a TME group layer, a page group layer, a story group layer and a show group layer.
16. The system of claim 15, wherein said TME group layer, said page group layer and said story group layer each have an associated handle that provides for the maneuverability of the layer.
17. The system of claim 15, wherein said object group layer may be a member of said TME group layer, said TME group layer may be a member of said page group layer, said page group layer may be a member of said story group layer and said story group layer may be a member of said show group layer.
18. The system of claim 12, further comprising a dynamic link between a newsroom computer system and said time sheet.
19. The system of claim 12, wherein two or more of said icons, each being associated with different ones of said video production devices, are placed on said time sheet so that they are executed simultaneously during said executing step.
20. The system of claim 19, wherein one or more of said icons are selected to execute as soon as they are free from their current operation.
21. The system of claim 12, wherein said plurality of video production devices includes one of a camera, a character generator, a digital video effect (DVE) device, and an audio mixer.
22. The system of claim 12, where said icon is one of GPI/O Mark Icon, Jump Mark Icon, DVE Mark Icon, Keyer Icon, Audio Icon, Script Viewer Icon, CG/SS Icon, Machine Control Icon, Camera Preset Icon, and GPO Icon.
23. A method of controlling a plurality of automated keyers to produce a composite media production, comprising the steps of: executing a first production command to determine the availability of the plurality of keyers; selecting an available keyer in response to step (1); and transmitting a control command that, when executed, instructs said available keyer to produce the composite media production in response to receiving a media production source and a fill source.
24. The method according to claim 23, wherein step (1) further comprises the step of: monitoring an operating state of at least one of the plurality of keyers to determine said availability.
25. The method according to claim 23, wherein step (1) further comprises the step of: detecting at least one of the plurality of keyers as being denoted as an unoccupied keyer.
26. The method according to claim 25, wherein step (2) further comprises the step of: selecting said unoccupied keyer as said available keyer in response to detecting only one unoccupied keyer.
27. The method according to claim 25, wherein step (1) further comprises the step of: detecting a plurality of unoccupied keyers.
28. The method according to claim 27, wherein step (2) further comprises the steps of: identifying one of said plurality of unoccupied keyers that has a current state of being continuously denoted as an unoccupied keyer a greater period of time than other of said plurality of unoccupied keyers; and selecting said one of said plurality of unoccupied keyers as said available keyer.
29. A method of controlling a plurality of production devices to produce a show, comprising the steps of: scheduling a sequence of production events for producing a show, each production event being associated with one or more production commands, each of said one or more production commands being executable to send a control command for controlling a corresponding production device; sending a first control command to produce a media production segment upon completion of a first production event from said sequence; and sending a second control command to identify an available keyer upon completion of said first production event to key said media production.
30. The method according to claim 29, further comprising the step of: sending a third control command to deliver said media production segment to said available keyer.
31. The method according to claim 29, further comprising the step of: sending a fourth control command to access a fill source to key said media production segment.
32. A method of compositing a media production, comprising the steps of: accessing the media production having a key signal; monitoring a plurality of keyers to identify an available keyer; receiving a key fill associated with said key signal; detecting a key value with said key fill; and compositing a keyer layer in the media production, said keyer layer comprising said key fill and said key value.
33. The method according to claim 32, wherein said key fill being at least one of a title, text, graphic, video still store, video, and matte color.
34. The method according to claim 32, wherein said key value being at least one of an image shape, a hue, and a brightness level.
35. A system for controlling a plurality of production devices to produce a show, comprising: an automation control processor for scheduling a sequence of production events within the show, wherein each production event comprises one or more production commands, wherein each of said one or more production commands is executable to send a control command to control a corresponding production device; a plurality of keyers, wherein state information pertaining to each of said keyers is delivered to said automation control processor; and an input router for communicating signals from a media production source, a fill source, or a key source to a selected keyer from said plurality of keyers, wherein said selected keyer is determined from said state information.
36. A system for keying a media production, comprising: an input router; a first keying device for compositing a first keyer layer on the media production to thereby produce a composite media production, wherein said first keying device accesses the media production from said input router; and a second keying device for compositing a second keyer layer on said composite media production, wherein said second keying device is positioned in series with said first keying device.
37. The system of claim 36, further comprising: a switcher for receiving said composite media production, wherein said composite media production includes said first keyer layer and said second keyer layer.
38. A system for keying a media production, comprising: an input router; and a routing matrix including a plurality of keying devices, wherein each keying device is responsive to receiving a keying command that, when executed, instructs a keying device to key the media production in parallel or in series with another keying device.
39. The system of claim 38, further comprising: a switcher for receiving the media production from said routing matrix, wherein the media production comprises a layer keyed from said routing matrix.
40. A computer program product comprising a computer useable medium having control logic embedded in said medium for causing a computer to select an available keying device from a plurality of keying devices, said control logic comprising: first means for causing the computer to monitor state information pertaining to the plurality of keying devices; second means for causing the computer to detect a keying device having state information indicating that said keying device is not currently operating to produce a preview output or a program output; and third means for causing the computer to select said keying device detected by said second means as the available keying device, in response to said second means detecting only one keying device having said state information.
41. The computer program product according to claim 40, further comprising: fourth means for causing the computer to execute a first-in-first-out routine to select the available keying device in response to said second means detecting two or more keying devices having said state information.
42. The computer program product according to claim 40, further comprising: fourth means for causing the computer to execute a first-in-first-out routine to select a keying device producing a preview output in response to said second means detecting no keying device having said state information.
43. A production system, comprising: a first production path; a second production path; and a control system that causes said first production path to generate a show in a first aspect ratio, and that causes said second production path to generate said show in a second aspect ratio.
44. The production system of claim 43, wherein said first aspect ratio is a 4:3 aspect ratio, and said second aspect ratio is a 16:9 aspect ratio.
45. The production system of claim 43, wherein said first production path comprises first production devices supporting said first aspect ratio, and said second production path comprises second production devices supporting said second aspect ratio.
46. The production system of claim 45, wherein said control system operates according to a script comprising a plurality of transition macros, at least one of said transition macros, when executed, causing said first production path and said second production path to perform a function, said at least one of said transition macros comprising a first set of instructions that, when executed, cause said first production path to perform said function, said at least one of said transition macros comprising a second set of instructions that, when executed, cause said second production path to perform said function.
47. The production system of claim 43, wherein said control system comprises means for controlling said first production path and said second production path to synchronously generate said show in said first aspect ratio and said show in said second aspect ratio.
48. The production system of claim 43, further comprising: means for simulcasting said show in said first aspect ratio and said show in said second aspect ratio.
49. A method for producing a show, comprising: producing a first show comprising a plurality of stories; segmenting said first show; storing said show segments in an archive; and producing a second show comprising live portions and one or more of said show segments from said archive.
50. The method of claim 49, wherein step (3) comprises: storing in said archive meta data information indicating the beginning, end, and duration of each of said show segments.
51. The method of claim 50, wherein step (4) comprises: retrieving said one or more of said show segments from said archive using said meta data information.
52. The method of claim 51 , wherein step (4) further comprises: causing a keyer to modify "bugs" contained in said retrieved show segments.
53. The method of claim 49, wherein step (4) comprises: producing said live portions by executing one or more transition macros, each comprising a plurality of instructions that, when executed, control production devices.
54. The method of claim 53, wherein step (4) further comprises: adjusting duration of said live portions to conform said second show to its assigned time duration.
55. The method of claim 49, further comprising: inserting advertisements into said second show.
56. A system for producing a show, comprising: means for producing a first show comprising a plurality of stories; means for segmenting said first show; means for storing said show segments in an archive; and means for producing a second show comprising live portions and one or more of said show segments from said archive.
57. The system of claim 56, wherein said storing means comprises: means for storing in said archive meta data information indicating the beginning, end, and duration of each of said show segments.
58. The system of claim 57, wherein said second show producing means comprises: means for retrieving said one or more of said show segments from said archive using said meta data information.
59. The system of claim 58, wherein said second show producing means further comprises: means for causing a keyer to modify "bugs" contained in said retrieved show segments.
60. The system of claim 56, wherein said second show producing means comprises: means for producing said live portions by executing one or more transition macros, each comprising a plurality of instructions that, when executed, control production devices.
61. The system of claim 60, wherein said second show producing means further comprises: means for adjusting duration of said live portions to conform said second show to its assigned time duration.
62. The system of claim 56, further comprising: means for inserting advertisements into said second show.
63. A production system, comprising: a plurality of production devices, including a server; a media manager; and a control system that operates according to a script to produce a show, said script including commands that, when executed, cause said production devices to perform functions; wherein said media manager automatically determines channels of said server to utilize when accessing material in said server in accordance with execution of said commands.
64. A method of producing a show, comprising the steps of: identifying production information to build a rundown for the show; selecting one or more macro elements associated with said production information; importing said one or more macro elements into a production control system; and executing said one or more macro elements to produce said show.
65. The method according to claim 64, further comprising the step of: monitoring a newsroom automation system to detect changes in said production information.
66. The method according to claim 65, further comprising the step of: updating said production control system in response to detecting said changes.
67. The method according to claim 66, further comprising the step of: prompting for approval prior to executing step (6).
68. The method according to claim 64, further comprising the step of: checking for errors in said rundown.
69. The method according to claim 64, further comprising the step of: resolving conflicts detected in said rundown.
70. The method according to claim 64, wherein step (2) comprises the step of: identifying a macro element matching said production values to automatically select said one or more macro elements.
71. The method according to claim 64, wherein step (2) comprises the steps of: detecting a macro element type, said macro element type being associated with a combination of production values from said production information; and identifying a macro element matching said combination to automatically select said one or more macro elements.
72. The method according to claim 64, wherein step (1) comprises the step of: extracting said production information from a newsroom information management system to build said rundown.
73. A system for producing a show, comprising: a newsroom information management system for managing workflow within a newsroom environment; a director interface for extracting production information from said newsroom information management system; and a production control system for receiving said production information and executing one or more macro elements corresponding to said production information.
74. The system of claim 73, wherein said director interface comprises means for selecting said one or more macro elements.
75. The system of claim 73, wherein said director interface comprises a macro identifier for identifying a macro element having production values matching said production information to select said one or more macro elements.
76. The system of claim 73, wherein said director interface comprises means for detecting changes between said newsroom information management system and said director interface.
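Purely as an illustration, and not as part of the claimed subject matter, the workflow of claims 64 and 70–72 — extracting production information into a rundown, matching macro elements by their production values, and executing the matches through a production control system — might be sketched as below. All function and field names are hypothetical.

```python
# Hypothetical sketch of the claim-64 workflow: build a rundown from
# newsroom items (claim 72), select macro elements whose production
# values match each item (claim 70), and execute the selection.

def build_rundown(newsroom_items):
    """Extract production information from newsroom items into a rundown."""
    return [{"slug": i["slug"], "values": i["values"]} for i in newsroom_items]

def select_macros(rundown, macro_library):
    """For each rundown item, pick the macro element with matching values."""
    selected = []
    for item in rundown:
        for macro in macro_library:
            if macro["values"] == item["values"]:
                selected.append(macro)
                break
    return selected

def execute(macros):
    """Stand-in for the production control system running each macro."""
    return [m["name"] for m in macros]

macro_library = [
    {"name": "2shot-vo", "values": ("camera2", "voiceover")},
    {"name": "pkg-roll", "values": ("server", "package")},
]
rundown = build_rundown([
    {"slug": "weather", "values": ("camera2", "voiceover")},
    {"slug": "fire", "values": ("server", "package")},
])
print(execute(select_macros(rundown, macro_library)))  # ['2shot-vo', 'pkg-roll']
```

In practice a macro element would carry device instructions rather than a name, but the matching step — comparing a combination of production values against a macro library — is the part claims 70 and 71 describe.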
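Likewise, claim 63's media manager, which "automatically determines channels of said server to utilize when accessing material," can be illustrated with a minimal channel allocator. This is an assumed sketch, not the patented implementation; all names are hypothetical.

```python
# Hypothetical sketch of claim 63: a media manager that automatically
# assigns a free server channel when a script command accesses material,
# and reuses the channel already assigned to a clip.

class MediaManager:
    def __init__(self, num_channels):
        self.free = list(range(1, num_channels + 1))  # available server channels
        self.in_use = {}  # clip id -> assigned channel

    def acquire(self, clip_id):
        """Return the channel for a clip, assigning a free one if needed."""
        if clip_id in self.in_use:
            return self.in_use[clip_id]
        channel = self.free.pop(0)
        self.in_use[clip_id] = channel
        return channel

    def release(self, clip_id):
        """Return a clip's channel to the free pool when playout ends."""
        self.free.append(self.in_use.pop(clip_id))

mm = MediaManager(num_channels=4)
print(mm.acquire("clip-a"))  # 1
print(mm.acquire("clip-b"))  # 2
mm.release("clip-a")
print(mm.acquire("clip-c"))  # 3
```

The point of the claim is that the script's commands never name a channel; the media manager resolves channel selection at execution time, which is what the `acquire` call stands in for here.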
PCT/US2003/014427 2002-05-09 2003-05-09 Video production system for automating the execution of a video show WO2003096682A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP03724519A EP1552685A4 (en) 2002-05-09 2003-05-09 Video production system for automating the execution of a video show
AU2003230350A AU2003230350A1 (en) 2002-05-09 2003-05-09 Video production system for automating the execution of a video show

Applications Claiming Priority (14)

Application Number Priority Date Filing Date Title
US37867202P 2002-05-09 2002-05-09
US37865502P 2002-05-09 2002-05-09
US37867102P 2002-05-09 2002-05-09
US37865702P 2002-05-09 2002-05-09
US37865602P 2002-05-09 2002-05-09
US60/378,656 2002-05-09
US60/378,655 2002-05-09
US60/378,657 2002-05-09
US60/378,671 2002-05-09
US60/378,672 2002-05-09
US10/208,810 2002-08-01
US10/208,810 US20030001880A1 (en) 2001-04-18 2002-08-01 Method, system, and computer program product for producing and distributing enhanced media
US10/247,783 2002-09-20
US10/247,783 US11109114B2 (en) 2001-04-18 2002-09-20 Advertisement management method, system, and computer program product

Publications (2)

Publication Number Publication Date
WO2003096682A1 true WO2003096682A1 (en) 2003-11-20
WO2003096682A9 WO2003096682A9 (en) 2004-02-26

Family

ID=29424931

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2003/014427 WO2003096682A1 (en) 2002-05-09 2003-05-09 Video production system for automating the execution of a video show

Country Status (3)

Country Link
EP (1) EP1552685A4 (en)
AU (1) AU2003230350A1 (en)
WO (1) WO2003096682A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1555772A2 (en) * 2004-01-15 2005-07-20 Yamaha Corporation Remote control method of external devices
WO2005069606A1 (en) * 2003-12-18 2005-07-28 Eastman Kodak Company Look preservation for a motion picture
WO2006103578A1 (en) * 2005-03-29 2006-10-05 Koninklijke Philips Electronics N.V. Method and device for providing multiple video pictures
WO2009126130A1 (en) * 2008-04-11 2009-10-15 Thomson Licensing Auto channel assignment for live productions
WO2012024069A2 (en) * 2010-08-20 2012-02-23 Andrea Keating Data analytics system
US8332754B2 (en) 2009-11-04 2012-12-11 International Business Machines Corporation Rendering sections of content in a document
WO2022077106A1 (en) * 2020-10-13 2022-04-21 Grass Valley Canada Virtualized production switcher and method for media production
CN116996631A (en) * 2023-09-26 2023-11-03 牡丹江师范学院 System and method for making real-time display singing video

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN102256050B (en) * 2011-08-22 2013-07-17 北京东方艾迪普科技发展有限公司 Intelligent studio view broadcast control system and method

Citations (4)

Publication number Priority date Publication date Assignee Title
US5231499A (en) * 1991-02-11 1993-07-27 Ampex Systems Corporation Keyed, true-transparency image information combine
US5930446A (en) * 1995-04-08 1999-07-27 Sony Corporation Edition system
US6134380A (en) * 1997-08-15 2000-10-17 Sony Corporation Editing apparatus with display of prescribed information on registered material
US6452612B1 (en) * 1998-12-18 2002-09-17 Parkervision, Inc. Real time video production system and method

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP3320197B2 (en) * 1994-05-09 2002-09-03 キヤノン株式会社 Image editing apparatus and method
AU6526600A (en) * 1999-08-06 2001-03-05 Avid Technology, Inc. Generation and versatile usage of effect tree presets
WO2001052526A2 (en) * 2000-01-14 2001-07-19 Parkervision, Inc. System and method for real time video production

Non-Patent Citations (1)

Title
See also references of EP1552685A4 *

Cited By (19)

Publication number Priority date Publication date Assignee Title
WO2005069606A1 (en) * 2003-12-18 2005-07-28 Eastman Kodak Company Look preservation for a motion picture
US6972828B2 (en) 2003-12-18 2005-12-06 Eastman Kodak Company Method and system for preserving the creative intent within a motion picture production chain
US7583355B2 (en) 2003-12-18 2009-09-01 Eastman Kodak Company Method and system for preserving the creative intent within a motion picture production chain
US7782439B2 (en) 2003-12-18 2010-08-24 Eastman Kodak Company Method and system for preserving the creative intent within a motion picture production chain
EP1555772A2 (en) * 2004-01-15 2005-07-20 Yamaha Corporation Remote control method of external devices
US8935444B2 (en) 2004-01-15 2015-01-13 Yamaha Corporation Remote control method of external devices
EP1555772A3 (en) * 2004-01-15 2013-07-17 Yamaha Corporation Remote control method of external devices
US8326133B2 (en) 2005-03-29 2012-12-04 Koninklijke Philips Electronics N.V. Method and device for providing multiple video pictures
WO2006103578A1 (en) * 2005-03-29 2006-10-05 Koninklijke Philips Electronics N.V. Method and device for providing multiple video pictures
US8432496B2 (en) 2008-04-11 2013-04-30 Thomson Licensing Auto channel assignment for live productions
WO2009126130A1 (en) * 2008-04-11 2009-10-15 Thomson Licensing Auto channel assignment for live productions
US8332754B2 (en) 2009-11-04 2012-12-11 International Business Machines Corporation Rendering sections of content in a document
US9229916B2 (en) 2009-11-04 2016-01-05 International Business Machines Corporation Rendering sections of content in a document
WO2012024069A2 (en) * 2010-08-20 2012-02-23 Andrea Keating Data analytics system
WO2012024069A3 (en) * 2010-08-20 2014-03-20 Andrea Keating Data analytics system
WO2022077106A1 (en) * 2020-10-13 2022-04-21 Grass Valley Canada Virtualized production switcher and method for media production
GB2615486A (en) * 2020-10-13 2023-08-09 Grass Valley Canada Virtualized production switcher and method for media production
CN116996631A (en) * 2023-09-26 2023-11-03 牡丹江师范学院 System and method for making real-time display singing video
CN116996631B (en) * 2023-09-26 2023-12-08 牡丹江师范学院 System and method for making real-time display singing video

Also Published As

Publication number Publication date
WO2003096682A9 (en) 2004-02-26
EP1552685A4 (en) 2006-06-07
AU2003230350A1 (en) 2003-11-11
EP1552685A1 (en) 2005-07-13

Similar Documents

Publication Publication Date Title
US10546612B2 (en) Systems, methods, and computer program products for automated real-time execution of live inserts of repurposed stored content distribution
US20040027368A1 (en) Time sheet for real time video production system and method
US8661366B2 (en) Building macro elements for production automation control
US7835920B2 (en) Director interface for production automation control
US6952221B1 (en) System and method for real time video production and distribution
US7024677B1 (en) System and method for real time video production and multicasting
US8560951B1 (en) System and method for real time video production and distribution
US20020054244A1 (en) Method, system and computer program product for full news integration and automation in a real time video production environment
US20110107368A1 (en) Systems and Methods for Selecting Ad Objects to Insert Into Video Content
US20020175931A1 (en) Playlist for real time video production
JP2008518564A (en) Digital screening film screening schedule setting
EP1262063B1 (en) System for real time video production and multicasting
US20030214605A1 (en) Autokeying method, system, and computer program product
US7649573B2 (en) Television production technique
EP1552685A1 (en) Video production system for automating the execution of a video show
CA2523947C (en) Building macro elements for production automation control
US8063990B2 (en) Television production technique
Leirpoll et al. Multi-Camera Editing

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: The EPO has been informed by WIPO that EP was designated in this application
COP Corrected version of pamphlet

Free format text: PAGES 19/50, 35/50, 45/50 AND 49/50, DRAWINGS, REPLACED BY NEW PAGES 19/50, 35/50, 45/50 AND 49/50; DUE TO LATE TRANSMITTAL BY THE RECEIVING OFFICE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2003724519

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2003724519

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP