US20040257434A1 - Personal multimedia device video format conversion across multiple video formats - Google Patents

Personal multimedia device video format conversion across multiple video formats

Info

Publication number
US20040257434A1
US20040257434A1 (application US 10/601,733)
Authority
US
United States
Prior art keywords
video signal
format
frame rate
multimedia device
personal multimedia
Legal status
Abandoned
Application number
US10/601,733
Inventor
Robert Davis
Kuriacose Joseph
Ernest Seah
Current Assignee
DirecTV Group Inc
Original Assignee
Hughes Electronics Corp
Application filed by Hughes Electronics Corp
Priority to US 10/601,733
Assigned to HUGHES ELECTRONICS CORPORATION (assignment of assignors' interest). Assignors: JOSEPH, KURIACOSE; SEAH, ERNEST; DAVIS, ROBERT
Publication of US20040257434A1
Priority to US 11/879,306 (US 7,688,384 B2)
Legal status: Abandoned

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41: Structure of client; Structure of client peripherals
    • H04N 21/422: Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/4223: Cameras
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N 21/4402: Processing involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/440218: Reformatting by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
    • H04N 21/440263: Reformatting by altering the spatial resolution, e.g. for displaying on a connected PDA
    • H04N 21/440281: Reformatting by altering the temporal resolution, e.g. by frame skipping
    • H04N 21/47: End-user applications
    • H04N 21/478: Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N 21/4788: Supplemental services for communicating with other users, e.g. chatting
    • H04N 7/00: Television systems
    • H04N 7/14: Systems for two-way working
    • H04N 7/141: Systems for two-way working between two video terminals, e.g. videophone
    • H04N 7/147: Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • H04N 7/148: Interfacing a video terminal to a particular transmission medium, e.g. ISDN

Definitions

  • FIG. 8 is a flow chart illustrating methods performed in a videoconference established in a manner consistent with aspects of the invention.
  • two or more set-top boxes may communicate with one another over network 110 to form a network connection (Act 801 ).
  • user 101 - 1 may form a video conference connection with users 101 - 2 and 101 - 3 .
  • devices other than set-top boxes 121 may also participate in the videoconference.
  • a set-top box may communicate with a personal computer to establish a videoconference.
  • data received from the video capture device 120 associated with set-top boxes 121 is transmitted over network 110 (Act 802).
  • the video will generally be transmitted at the frame rate and resolution native to the transmitting set-top box or video capture device.
  • Set-top boxes 121 also receive and display video signals over network 110 (Act 803 ).
  • Video signals that are not in the native format of the set-top box are converted as described above.
  • a video signal can be converted to the appropriate video signal format on-the-fly, without having to pre-negotiate a particular video standard to use in the video conference.
  • each of the participants of the video conference could potentially be using a different native video standard, yet the video conference could still proceed without having to first negotiate a common standard.
  • the format decision is made in a distributed manner, as illustrated in the sketch following this list.
  • a standard set-top box may be used to implement videoconferencing with videoconference partners that transmit video in a format different than the native format of the set-top box.
  • Techniques described herein provide for conversion between video formats at a personal multimedia device using existing hardware in the personal multimedia device.
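  • The distributed decision can be pictured with a short, hypothetical sketch (Python is used purely for illustration): each receiving device compares the incoming format to its own native format and converts locally, so no common standard is ever negotiated. The device identifiers, dictionary keys, and convert_to_native helper are assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the distributed, receiver-side decision described
# above. Each participant converts whatever arrives to its own native format,
# so no common standard is pre-negotiated.

NATIVE_FORMAT = {"stb-europe": "PAL", "stb-us": "NTSC", "stb-asia": "NTSC"}

def handle_incoming(receiver_id, stream_header, frames):
    incoming = stream_header["video_format"]        # e.g. read from MPEG headers
    native = NATIVE_FORMAT[receiver_id]
    if incoming == native:
        return frames                               # play back without conversion
    return convert_to_native(frames, incoming, native)

def convert_to_native(frames, src_fmt, dst_fmt):
    # Placeholder for the frame-rate and resolution conversion detailed in the
    # description below (FIGS. 3-7).
    return frames

# Example: a PAL stream arriving at a US receiver is converted; at a European
# receiver it would be played back as-is.
handle_incoming("stb-us", {"video_format": "PAL"}, frames=[])
```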

Abstract

Personal multimedia devices can detect when an incoming video format is different from a native format and make a local decision to convert incoming video formats to a format native to the personal multimedia device. The personal multimedia device may include a media processor that comprises an MPEG decoder/encoder, graphics processors, and a video decoder/encoder. These components increase a frame rate of a received video signal when the frame rate of the received video signal is less than a frame rate of the native video format of the personal multimedia device and decrease the frame rate of the received video signal when the frame rate of the received video signal is greater than the native frame rate of the set-top box. The graphics processor scales a frame resolution of the frames in the received video signal to correspond to the native video format.

Description

    BACKGROUND OF THE INVENTION
  • A. Field of the Invention [0001]
  • The present invention relates generally to communications systems, and more particularly, to videoconferencing between entities that use multiple video standards. [0002]
  • B. Description of Related Art [0003]
  • Personal multimedia devices are devices that provide multimedia services to a user. Examples of personal multimedia devices include set-top boxes and personal video recording (PVR) devices. A set-top box is a device that enables a television set to receive and decode digital television (DTV) broadcasts. Set-top boxes are frequently used to receive satellite and cable television signals. At a basic level, a set-top box will include an input connection for the television signal broadcast by the satellite or cable company, and an output connection leading to the user's television. [0004]
  • Some set-top boxes additionally include more advanced features, such as a network port (e.g., Ethernet) and/or other input/output connections, such as a keyboard or a universal serial bus (USB) connection. With these set-top boxes, in addition to simply watching television, users may perform a number of more interactive activities, such as surfing the web and sending email. [0005]
  • An additional activity that may be performed in conjunction with the more advanced set-top boxes is videoconferencing. More specifically, a video camera may be connected to the set-top box through a connection such as the USB connection. Audio and video recorded by the video camera may be processed by the set-top box and then transmitted through the network port to a network, such as the Internet. At the receiving end, another set-top box may receive the video signal from the network, process the video signal into a format compatible with the receiving television, and present the video signal on the receiving television. [0006]
  • There are situations, however, in which different set-top boxes in the above-described video conferencing scheme are incompatible with one another. Set-top boxes in different regions of the world may use different video formats. Televisions in Europe, for example, typically use the Phase Alternation Line (PAL) analog television display standard while televisions in North America typically use the National Television Systems Committee (NTSC) standard. When attempting to implement a videoconference with televisions using different standards, the received video cannot be properly displayed on the receiving television. [0007]
  • Therefore, there is a need in the art to improve video conferencing capabilities of personal multimedia devices such as set-top boxes. [0008]
  • SUMMARY OF THE INVENTION
  • A personal multimedia device, as described herein, converts received video signals to a format native to the device. The device may automatically detect a format of the received video signal. The conversion may be performed by adding or removing frames to the received video signal so that the frame rate of the received video signal is close to that of the native format. [0009]
  • One aspect consistent with the invention is directed to a personal multimedia device that includes a media processing component that increases a frame rate of a received video signal, when the frame rate of the received video signal is less than a frame rate of a native video format of the personal multimedia device by adding frames to the received video signal where the added frames are based on at least one of the received frames. Further, the media processing component decreases the frame rate of the received video signal when the frame rate of the received video signal is greater than a native frame rate of the personal multimedia device by removing frames from the received video signal. [0010]
  • Another aspect consistent with the invention is directed to a personal multimedia device that includes a media processing component that detects a frame rate of a received video signal and compares the frame rate to a frame rate native to the personal multimedia device. The media processing component modifies a frame rate of the received video signal when the frame rate of the received video signal is different than the frame rate native to the personal multimedia device by temporally filtering the sequences of frames in the received video signal and also deriving a new set of interpolated frames at the new frame rate. [0011]
  • Another aspect consistent with the invention is directed to a method for converting a compressed video signal in a first format to a second format. The method includes determining a frame rate of the video signal in the first format based on header information included in the compressed video signal and decoding the compressed video signal to uncompress the video signal. The method further includes increasing the frame rate of the video signal in the first format when the determined frame rate is less than a frame rate of the second format by adding frames to the uncompressed video signal of the first format in which each added frame is based on at least one frame in the uncompressed video signal in the first format. Still further, the method includes decreasing the frame rate of the video signal in the first format when the determined frame rate is greater than the frame rate of the second format by removing frames from the uncompressed video signal in the first format. Finally, the method includes scaling a frame resolution of each of the frames of the video signal in the first format to adjust a resolution of the video signal in the first format to match the resolution of the video signal in the second format.[0012]
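  • As a rough illustration of this method (and not the patent's implementation), the following Python sketch models the decoded signal as a list of frames, resamples it toward a target frame rate by repeating or dropping frames, and applies a stubbed resolution-scaling step; the VideoSignal, resample, and scale_frame names are assumptions.

```python
# Minimal sketch of the summary's method, assuming the stream is already
# decoded to a list of frames. Names are illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class VideoSignal:
    frames: List[object]      # uncompressed frames
    fps: float                # frame rate
    lines: int                # vertical resolution (lines per frame)

def resample(frames, src_fps, dst_fps):
    """Repeat frames (src < dst) or drop frames (src > dst) to approach dst_fps."""
    n_out = round(len(frames) * dst_fps / src_fps)
    return [frames[min(int(i * src_fps / dst_fps), len(frames) - 1)]
            for i in range(n_out)]

def scale_frame(frame, target_lines):
    # Stand-in for the graphics processor's rescaling step (Act 311 in FIG. 3).
    return frame

def convert(signal: VideoSignal, dst_fps: float, dst_lines: int) -> VideoSignal:
    frames = resample(signal.frames, signal.fps, dst_fps)
    frames = [scale_frame(f, dst_lines) for f in frames]
    return VideoSignal(frames, dst_fps, dst_lines)
```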
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate the invention and, together with the description, explain the invention. In the drawings, [0013]
  • FIG. 1 is a diagram illustrating an exemplary video conferencing system; [0014]
  • FIG. 2 is a diagram illustrating functional components of a set-top box; [0015]
  • FIG. 3 is a flow chart illustrating operation of set-top box in performing video format conversions consistent with aspects of the invention; [0016]
  • FIG. 4 is a diagram illustrating increasing the frames-per-second of a detected video signal; [0017]
  • FIG. 5 is a diagram illustrating an alternate implementation for increasing the frames-per-second of a detected video signal; [0018]
  • FIG. 6 is a diagram illustrating decreasing the frames-per-second of a detected video signal; [0019]
  • FIG. 7 is a diagram illustrating temporal filtering and sharpening; and [0020]
  • FIG. 8 is a flow chart illustrating methods performed in a videoconference established consistent with aspects of the invention.[0021]
  • DETAILED DESCRIPTION
  • The following detailed description of the invention refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements. Also, the following detailed description does not limit the invention. Instead, the scope of the invention is defined by the appended claims and equivalents. [0022]
  • A personal multimedia device, as described herein, converts between different video standards to thus facilitate videoconferencing and other streaming video applications across different video transmission standards. The conversion can be performed using standard components of a personal multimedia device such as a set-top box and thus does not add additional cost to the set-top boxes. [0023]
  • Video Conference Overview
  • [0024] FIG. 1 is a diagram illustrating an exemplary video conferencing system 100. Users 101-1 through 101-N (collectively users 101) may use system 100 to videoconference with one another. Network 110 couples users 101 and 102 with one another. Video streams created by users 101 may be transmitted over network 110. Network 110 may be a private network or a public network such as the Internet.
  • [0025] For a particular user 101, system 100 may include a video capture device 120, a personal multimedia device 121, and a television 122. Video capture device 120 may be, for example, a conventional video camera. Television 122 presents the video and audio signals transmitted from user 102 via network 110. Television 122 may alternatively be implemented as other types of multimedia presentation devices, such as a computer monitor and speakers.
  • [0026] Personal multimedia device 121 is connected to network 110. Personal multimedia device 121 is also connected to receive the video signals from video capture device 120 and to output television compatible video signals to television 122.
  • [0027] Consistent with an aspect of the invention, personal multimedia device 121 may be programmed to convert between different video formats native to different ones of televisions 122. For example, video capture device 120 of user 101-1 may output PAL formatted signals and television 122 of user 101-2 may be an NTSC standard television. In this situation, personal multimedia device 121 of user 101-2 may convert the video signal transmitted over network 110 into an NTSC compatible signal before outputting the signal to television 122 of user 101-2.
  • [0028] In one implementation, personal multimedia device 121 is a set-top box. Accordingly, personal multimedia device 121 may alternatively be referred to herein as set-top box 121.
  • Set-Top Box Overview
  • [0029] FIG. 2 is a diagram illustrating functional components of a set-top box, such as set-top box 121. Three functional components are shown in FIG. 2 for simplicity: channel decoder 210, media processor 220, and input/output interface 230.
  • [0030] Video signals may be received by set-top box 121 at channel decoder 210. Depending on the type of set-top box, channel decoder 210 may include one or more of satellite transceiver 211, terrestrial decoder 212, and cable transceiver 213. Satellite transceiver 211 may be designed to receive television signals broadcast from a satellite. Satellite transceiver 211 may thus include one or more conventional directional antennas (not shown) capable of transmitting/receiving signals via radio frequency waves. Satellite transceiver 211 may convert the received signals to digital signals using conventional techniques and transmit the converted signals to media processor 220.
  • [0031] Terrestrial decoder 212 may include logic for receiving and processing conventional broadcast television signals. Terrestrial decoder 212 may include, for example, an omni-directional antenna (not shown) and circuitry for processing television signals received over the omni-directional antenna. A digital signal resulting from the processing from terrestrial decoder 212 may be output to media processor 220.
  • [0032] Cable transceiver 213 receives signals transmitted over a cable system, such as an optical or coaxial cable system, and converts the signals into a digital signal. Media processor 220 receives the signals from channel decoder 210 and processes the signals into a form suitable for viewing on television 122. Media processor 220 may include a number of functional components, such as MPEG decoder/encoder 221, graphics processor(s) 222, and video encoders/decoders 223.
  • [0033] MPEG decoder/encoder 221 may perform encoding and decoding functions relating to the Moving Picture Experts Group (MPEG) family of standards. The MPEG standards are used for coding audio-visual information in a digital compressed form. Because MPEG video streams are compressed, they can be transmitted with significantly less bandwidth than uncompressed streams. Multiple versions of the MPEG standard exist, such as the MPEG-2 and MPEG-4 standards. MPEG decoders/encoders are known in the art and will not be described further herein.
  • [0034] Graphics processor(s) 222 perform image-processing functions on an uncompressed image. Graphics processor(s) 222 may, for example, scale an image, manipulate the color in the image, or superimpose text or graphics on an image. Graphics processors are generally known in the art and are frequently implemented using custom semiconductor chips designed to efficiently perform graphic functions.
  • [0035] Video decoder/encoder 223 may be configured to convert uncompressed video streams into analog television signals. Video decoder/encoder 223 can be, for example, a PAL or NTSC decoder/encoder that contains logic to convert video signals to one or both of these standards. Video decoders/encoders are generally well known in the art.
  • [0036] Input/output interface 230 may include logic for connecting additional devices to set-top box 121. Input/output interface 230 may include network ports, such as an Ethernet port, keyboard and mouse ports, USB ports, etc. Through input/output interface 230, set-top box 121 can provide personal computer-like functions. In particular, through a network port such as an Ethernet port, set-top box 121 may connect and communicate with other devices, including other set-top boxes coupled to network 110. Two set-top boxes may, for example, exchange MPEG video streams over network 110.
  • The components illustrated in FIG. 2 are functional components useful in explaining the present invention. Actual set-top boxes may contain many additional components. Also, in actual set-top boxes, multiple ones of the components shown in FIG. 2 may be implemented as a single physical device or conversely, multiple physical devices may be used to implement one of the functional components shown in FIG. 2. It should also be understood that the components illustrated in FIG. 2 may be implemented in hardware, software, or any combination of hardware and software. [0037]
  • In alternate implementations, instead of using the MPEG family of compression standards, other compression standards, such as H.261, H.263, or H.264 may be used. In general, the H.261, H.263, and H.264 compression standards use best effort to achieve a frame rate of 15 or 30 frames-per-second. [0038]
  • Set-Top Box Operation
  • As discussed above, video or television signals may be implemented in one of a number of different formats. The PAL standard, for example, has a line resolution of 625 lines per frame and is displayed at 25 frames-per-second (fps). The NTSC standard has a line resolution of 525 lines per frame and is displayed at 29.97 fps. [0039]
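  • For reference in the sketches that follow, the nominal parameters just described can be collected in a small table; the FILM entry (24 fps) is taken from the discussion of FIG. 5 below, and the dictionary itself is illustrative only.

```python
# Nominal parameters of the formats discussed in this description.
# Line counts are lines per frame (625 for PAL, 525 for NTSC).
FORMATS = {
    "PAL":  {"lines": 625, "fps": 25.0},
    "NTSC": {"lines": 525, "fps": 30000 / 1001},  # 29.97 fps
    "FILM": {"lines": None, "fps": 24.0},         # film-rate source, per FIG. 5
}
```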
  • [0040] Set-top boxes 121 will generally be programmed to output a “native” format that is common to the region in which the set-top box is used. Consistent with an aspect of the invention, media processor 220 converts between incoming video formats and the native format of set-top box 121. The conversion is performed using hardware that is conventionally available in many set-top boxes, and can thus be implemented in set-top box 121 with relatively little incremental cost. FIG. 3 is a flow chart illustrating operation of set-top box 121 in performing such conversions.
  • [0041] Incoming video signals may be processed by channel decoder 210 (Act 301). The output of channel decoder 210 may be a digital version of the video signal. The video signal may be compressed using a video compression standard such as MPEG-2 or MPEG-4. The MPEG bit stream may be decoded by MPEG decoder/encoder 221 to produce a non-compressed video signal in a format such as PAL or NTSC. Alternatively, the incoming video signals may be received as a digital video bit stream (e.g., as an MPEG bit stream) over network 110 by input/output interface 230. In this situation, the digital video stream may be transmitted directly to media processor 220.
  • [0042] Information relating to the format of a video signal, such as the frame rate, may be embedded in fields associated with the MPEG bit stream. Media processor 220 may examine fields in the MPEG video stream to determine the format, such as the frame rate and resolution of the video signal. In one implementation, the video signal is compressed using either the MPEG-2 or MPEG-4 standard. Media processor 220 may examine the MPEG bit stream to determine the appropriate MPEG version (Act 302). This information is contained in header fields present in the MPEG bit stream.
  • [0043] Based on the MPEG version determined in Act 302, media processor 220 determines the format of the video signal (Acts 303 and 304). For MPEG-2 video streams, the frame rate is described in a four-bit “frame rate” field. Also, under MPEG-2, a three-bit “video format” field describes the video format encoded in the MPEG bit stream. Tables I and II, below, describe the meaning of various values in the “frame rate” and “video format” fields under MPEG-2.
    TABLE I
    MPEG-2 Frame Rate Field
    Frame Rate Code      Frame Rate Value
    0000                 Forbidden
    0001                 24.000 ÷ 1.001 (23.976)
    0010                 24
    0011                 25
    0100                 30.000 ÷ 1.001 (29.97)
    0101                 30
    0110                 50
    0111                 60.000 ÷ 1.001 (59.94)
    1000                 60
    1001 through 1111    Reserved
  • [0044]
    TABLE II
    MPEG-2 Video Format Field
    Video Format Code    Meaning
    000                  Component
    001                  PAL
    010                  NTSC
    011                  SECAM (Systeme Electronique Couleur Avec Memoire)
    100                  MAC
    101                  Unspecified video format
    110                  Reserved
    111                  Reserved
  • [0045] Thus, for MPEG-2 bit streams, media processor 220 may use either the frame rate field to infer the format of the video stream or the video format field to directly determine the format of the video stream (Act 303). As an example of inferring the format of a video stream, consider the situation in which the frame rate is 29.97 fps. This frame rate corresponds to an NTSC video signal.
  • [0046] MPEG-4 video streams include a video format field similar to the video format field contained in MPEG-2 video streams. Accordingly, for MPEG-4 video streams, the video format field can be used to determine the format, and hence the frame rate, of the video stream (Act 304).
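  • A minimal sketch of Act 303, assuming the frame rate and video format codes of Tables I and II have already been read from the stream's headers: map the codes to a frame rate and a format name, and fall back to inferring the format from the frame rate when the video format field is unspecified. The function and table names are illustrative, not the patent's code.

```python
# Sketch of Act 303 using the code values of Tables I and II above.

FRAME_RATE_TABLE = {                       # Table I (frame rate code -> fps)
    1: 24000 / 1001, 2: 24.0, 3: 25.0, 4: 30000 / 1001,
    5: 30.0, 6: 50.0, 7: 60000 / 1001, 8: 60.0,
}
VIDEO_FORMAT_TABLE = {                     # Table II (video format code -> name)
    0: "Component", 1: "PAL", 2: "NTSC", 3: "SECAM", 4: "MAC", 5: "Unspecified",
}

def detect_format(frame_rate_code: int, video_format_code: int):
    fps = FRAME_RATE_TABLE[frame_rate_code]
    fmt = VIDEO_FORMAT_TABLE.get(video_format_code, "Unspecified")
    if fmt == "Unspecified":
        # Infer the format from the frame rate, as described above.
        if abs(fps - 25.0) < 0.01:
            fmt = "PAL"
        elif abs(fps - 30000 / 1001) < 0.01:
            fmt = "NTSC"
    return fmt, fps

# Example: frame rate code 0100 (29.97 fps), video format unspecified -> NTSC.
print(detect_format(0b0100, 0b101))        # ('NTSC', 29.97...)
```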
  • [0047] If the frame rate of the format detected in Acts 303 and 304 is the same as that of the native format of set-top box 121, the received video stream can be played back by set-top box 121 without conversion (Acts 305 and 306). In this situation, MPEG decoder/encoder 221 may decode the MPEG video stream to an uncompressed version, which may then be converted directly to the appropriate analog output format by video decoder/encoder 223.
  • [0048] When, however, the frame rate of the format detected in Acts 303 and 304 is different from that of the native format of set-top box 121, media processor 220 determines whether the native format has more frames-per-second than the detected format (Act 307). If the native format has more frames-per-second than the detected format, media processing component 220 increases the frames-per-second of the detected video signal (Act 308).
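  • The branch just described (Acts 305 through 309) can be restated compactly; the sketch below is only a restatement of the flow chart, and the small tolerance on the frame-rate comparison is an assumption.

```python
# Restatement of the FIG. 3 branch (Acts 305-309); illustrative only.

def frame_rate_path(detected_fps: float, native_fps: float) -> str:
    if abs(detected_fps - native_fps) < 0.01:       # same frame rate
        return "play back without conversion"        # Acts 305, 306
    if native_fps > detected_fps:
        return "increase frames-per-second"          # Act 308 (FIGS. 4 and 5)
    return "decrease frames-per-second"              # Act 309 (FIG. 6)

# Example: a PAL (25 fps) stream arriving at an NTSC (29.97 fps) set-top box.
print(frame_rate_path(25.0, 30000 / 1001))           # increase frames-per-second
```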
  • [0049] FIG. 4 is a diagram illustrating increasing the frames-per-second of the detected video signal as performed in Act 308. The video signal may first be decoded by MPEG decoder/encoder 221 to obtain an uncompressed version of the video signal. Assume that for this example, the detected video format is PAL (25 fps) and the native video format is NTSC (29.97 fps).
  • [0050] In FIG. 4, a PAL video signal 410 is represented as a number of frames 411-1 through 411-5. Although only five frames are shown in FIG. 4, in practice, the process shown for the five frames in FIG. 4 may be repeated for each set of five frames in the video signal. PAL video signals are in an interlaced format. In an interlaced video signal, each frame is rendered in two successive sweeps. Thus, for example, with a PAL frame that includes 625 lines, the odd lines of the frame may be rendered first followed by the even lines. In this manner, when rendering the frame, the television interlaces the odd and even portions of the frame. The odd and even interlaced portions of frames 411 are indicated in FIG. 4 with “A” for the odd fields and “B” for the even fields.
  • [0051] The target video format, an NTSC signal, is illustrated in FIG. 4 as video signal 420. Video signal 420 includes six frames 421-1 through 421-6. Thus, as shown in this figure, the five frames in PAL video signal 410 are converted into NTSC video signal 420 containing six NTSC frames 421.
  • [0052] The first frame of NTSC video signal 420, frame 421-1, is formed from the odd and even portions of the first frame of PAL video signal 410, frame 411-1. That is, frame 421-1 is essentially the same as frame 411-1, although frame 421-1 is associated with a shorter time duration. Similarly, frame 421-2 is formed from the odd and even portions of frame 411-2. Frame 421-3, however, is formed from the odd portion of frame 411-2 and the even portion of 411-3. Frame 421-4 is formed from the odd portion of frame 411-3 and the even portion of 411-4. Frame 421-5 is formed from the odd and even portions of frame 411-4. Frame 421-6 is formed from the odd and even portions of frame 411-5.
  • [0053] In the operations shown in FIG. 4, five PAL formatted frames are converted to six NTSC formatted frames. A pure five-to-six conversion would actually produce 30.00 NTSC fps instead of the required 29.97 fps. To correct this, MPEG decoder/encoder 221 may additionally multiply the converted 30.00 fps signal by 0.999 to obtain a 29.97 fps video signal. One way of physically implementing the multiply operation is by dropping a frame every 1001 frames.
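  • One way to express the FIG. 4 mapping in code, assuming each decoded frame is available as an (odd field, even field) pair, is sketched below. The field-pairing pattern follows the frame-by-frame description above, and the final step drops one frame every 1001 frames to apply the 0.999 correction; all names are illustrative, not the patent's implementation.

```python
# Sketch of the five-to-six field mapping of FIG. 4 (PAL 25 fps -> NTSC).
# Frames are modeled as (odd_field, even_field) tuples.

# (source index for odd field, source index for even field) per output frame,
# for every group of five input frames producing six output frames.
FIELD_PATTERN_5_TO_6 = [(0, 0), (1, 1), (1, 2), (2, 3), (3, 3), (4, 4)]

def pal_to_ntsc(frames):
    """frames: list of (odd_field, even_field) tuples at 25 fps."""
    out = []
    for start in range(0, len(frames) - len(frames) % 5, 5):
        group = frames[start:start + 5]
        for odd_src, even_src in FIELD_PATTERN_5_TO_6:
            out.append((group[odd_src][0], group[even_src][1]))
    # A pure five-to-six conversion yields 30.00 fps; dropping one frame every
    # 1001 frames applies the 0.999 factor to reach 29.97 fps.
    return [f for i, f in enumerate(out) if (i + 1) % 1001 != 0]
```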
  • [0054] FIG. 5 is a diagram illustrating another implementation of Act 308. In this example, a FILM formatted video signal is converted to an NTSC formatted video signal. The FILM format is a 24 fps format compared to the 29.97 fps format of NTSC.
  • [0055] FILM video signal 510 is represented as a number of frames 511-1 through 511-4. Although only four frames are shown in FIG. 5, in practice, the process shown for the four frames in FIG. 5 may be repeated for each set of four frames in the video signal. The target video format, an NTSC signal, is illustrated as video signal 520. Video signal 520 includes frames 521-1 through 521-5.
  • [0056] As shown in FIG. 5, the first frame of NTSC video signal 520, frame 521-1, is formed from the odd and even portions of the first frame of FILM video signal 510, frame 511-1. Similarly, frame 521-2 is formed from the odd and even portions of frame 511-2. Frame 521-3, however, is formed from the odd portion of frame 511-2 and the even portion of 511-3. Frame 521-4 is formed from the odd and even portions of frame 511-3. Frame 521-5 is formed from the odd and even portions of frame 511-4.
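  • The FIG. 5 mapping can be sketched the same way, with a four-to-five pattern taken from the preceding paragraph (again assuming frames are available as (odd field, even field) pairs; this is an illustration, not the patent's code).

```python
# Sketch of FIG. 5 (FILM 24 fps -> NTSC): four input frames become five output
# frames via the field recombination just described. Frames are modeled as
# (odd_field, even_field) tuples.

FIELD_PATTERN_4_TO_5 = [(0, 0), (1, 1), (1, 2), (2, 2), (3, 3)]

def film_to_ntsc(frames):
    """frames: list of (odd_field, even_field) tuples at 24 fps."""
    out = []
    for start in range(0, len(frames) - len(frames) % 4, 4):
        group = frames[start:start + 4]
        out.extend((group[o][0], group[e][1]) for o, e in FIELD_PATTERN_4_TO_5)
    # 24 fps * 5/4 = 30.00 fps; as with FIG. 4, an occasional dropped frame
    # could bring this to 29.97 fps.
    return out
```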
  • Referring back to FIG. 3, when the frame rate of the format detected in [0057] Act 307 is different than the native format of set-top box 121, and the native format has less frames-per-second than the detected format, media processing component 220 decreases the frames-per-second of the detected video signal (Act 309).
  • FIG. 6 is a diagram illustrating decreasing the frames-per-second of the detected video signal as performed in [0058] Act 309. Assume that for this example, the detected video format is NTSC (29.97 fps) and the native video format is PAL (25 fps). In this situation, media processor 220 occasionally drops frames to reduce the frames-per-second. For most applications, an occasional dropped frame will not be noticed when converting from NTSC to PAL video formats.
  • In FIG. 6, an NTSC video signal 610 is represented as a number of frames 611-1 through 611-6. The target video format, a PAL signal, is illustrated in FIG. 6 as video signal 620. Video signal 620 includes five frames 621-1 through 621-5. Thus, as shown in this figure, the six frames in NTSC video signal 610 are converted into PAL video signal 620 containing five PAL frames 621. The conversion is performed by copying, on a one-to-one basis, frames 611-1 to 611-5 to frames 621-1 to 621-5, respectively. Frame 611-6 is dropped. [0059]
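  • A minimal sketch of the FIG. 6 six-to-five reduction follows (the helper name is an assumption for illustration only): every sixth frame of the incoming NTSC stream is discarded.

    def drop_every_sixth(frames):
        """Keep five of every six frames, reducing roughly 29.97 fps toward 25 fps."""
        return [frame for i, frame in enumerate(frames) if (i % 6) != 5]

    # Twelve input frames become ten output frames; the 6th and 12th frames are dropped.
    assert drop_every_sixth(list(range(1, 13))) == [1, 2, 3, 4, 5, 7, 8, 9, 10, 11]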
  • A straight six-to-five frame conversion from NTSC would actually yield a video signal at 24.975 fps. Accordingly, MPEG decoder/encoder 221 may additionally divide the frame rate of PAL video signal 620 by 0.999 to achieve 25 fps. In practice, the divide operation may be performed by occasionally adding a duplicate frame to the signal. [0060]
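  • A quick check of the rates involved, using exact fractions (a verification sketch only; this code is not part of the patent):

    from fractions import Fraction

    ntsc = Fraction(2997, 100)                     # 29.97 fps
    after_drop = ntsc * Fraction(5, 6)             # six-to-five reduction -> 24.975 fps
    corrected = after_drop / Fraction(999, 1000)   # divide by 0.999
    assert corrected == 25                         # exactly 25 fps (PAL)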
  • FIGS. 4-6 illustrate frame rate conversion for video signals having interlaced formats. For video formats having progressive (non-interlaced) frames, increasing or decreasing the fps of the video signal reduces to a simple mathematical relation of dropping/repeating X frame(s) every Y frame(s). Thus, for example, if a first progressive video format having 10 fps is to be converted to a second video format having 15 fps, every other frame in the first video format could be repeated to form the second video format. [0061]
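  • For the 10 fps to 15 fps example, repeating every second frame produces the required rate. The small sketch below is illustrative only (the function name is an assumption).

    def repeat_every_other_frame(frames):
        """Repeat every second frame so that 10 progressive fps becomes 15 fps."""
        out = []
        for i, frame in enumerate(frames):
            out.append(frame)
            if i % 2 == 1:       # repeat the 2nd, 4th, 6th, ... frame of each group
                out.append(frame)
        return out

    # Ten input frames become fifteen output frames.
    assert len(repeat_every_other_frame(list(range(10)))) == 15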
  • The techniques shown in FIGS. 4-6 do not impact lip-sync because the Audio Presentation Time Stamp (PTS) does not change. Accordingly, these techniques do not require modification of the audio stream corresponding to the video signal. [0062]
  • Referring back to FIG. 3, graphics processor(s) 222 may rescale the picture size by scaling the resolution of each of the frames. In particular, if the detected frame size and the native frame size are not the same (Act 310), graphics processor(s) 222 may rescale the frame size so that the converted video signal has the correct frame size (Act 311). In the case of PAL to NTSC conversion, graphics processor(s) 222 scales the vertical dimension by a factor of 0.84. Conversely, for NTSC to PAL conversion, graphics processor(s) 222 stretches the vertical dimension by a factor of 1.19 (1/0.84). Alternatively, black bars can be inserted if it is desirable to maintain the aspect ratio. [0063]
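  • The following fragment illustrates the vertical rescale with a simple nearest-neighbor line resampler. It is a sketch only: the function name, the nearest-neighbor approach, and the assumption of 576 active PAL lines are not from the patent, and an actual set-top box would normally use the graphics processor's filtered scaler.

    def rescale_vertical(lines, factor):
        """Resample the lines of a frame by the given vertical factor
        using nearest-neighbor selection (illustrative only)."""
        src_count = len(lines)
        dst_count = max(1, round(src_count * factor))
        return [lines[min(src_count - 1, int(i / factor))] for i in range(dst_count)]

    # PAL -> NTSC: compress the vertical scale by 0.84; NTSC -> PAL: stretch by 1/0.84.
    pal_frame = [f"line-{n}" for n in range(576)]
    ntsc_frame = rescale_vertical(pal_frame, 0.84)
    print(len(ntsc_frame))   # 484 lines (about 0.84 * 576)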
  • Video decoder/encoder 223 may next process the converted video signal into an analog format appropriate for television 122 (Act 312). Video decoder/encoder 223 may, for example, add the frame timing information required by the particular video format. [0064]
  • The above-discussed implementations perform frame rate conversion using relatively simple frame add/drop techniques for increasing/decreasing the frame rate. In other implementations, set-top box 121 may implement more complicated image conversion techniques. For example, the set-top box may perform temporal filtering and image sharpening to convert video signals more optimally. [0065]
  • FIG. 7 is a diagram illustrating interpolation through temporal filtering in additional detail for an exemplary NTSC to PAL conversion. NTSC video signal 710 includes six frames 711-1 through 711-6. Instead of simply dropping every sixth frame as described above with reference to FIG. 6, frames may be temporally filtered and sharpened in a more optimal (although potentially more processor/memory resource intensive) manner. Thus, target frame 721-2, for example, may be formed as a combination of the information in frames 711-3 and 711-4, or even as a combination of the information in frames 711-2 through 711-5. [0066]
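  • As a rough sketch of this idea (the equal weights and the numpy-based blend are assumptions for illustration; the patent does not specify a particular filter), an interpolated target frame can be formed as a weighted average of neighboring source frames:

    import numpy as np

    def blend_frames(frames, weights):
        """Blend several source frames into one interpolated frame
        using normalized weights (illustrative temporal filter)."""
        weights = np.asarray(weights, dtype=np.float64)
        weights = weights / weights.sum()
        stacked = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
        return np.tensordot(weights, stacked, axes=1)

    # Example: form target frame 721-2 from source frames 711-3 and 711-4 with
    # equal weights; a wider support such as 711-2 through 711-5 works the same way.
    f3 = np.full((480, 720), 100.0)   # stand-in luma planes
    f4 = np.full((480, 720), 120.0)
    target = blend_frames([f3, f4], [0.5, 0.5])
    print(target[0, 0])               # 110.0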
  • Video Conferencing
  • As previously mentioned, one application of the above-described video format conversion performed by set-top box 121 is videoconferencing. FIG. 8 is a flow chart illustrating methods performed in a videoconference established in a manner consistent with aspects of the invention. [0067]
  • To begin, two or more set-top boxes may communicate with one another over network 110 to form a network connection (Act 801). For example, user 101-1 may form a video conference connection with users 101-2 and 101-3. In some implementations, devices other than set-top boxes 121 may also participate in the videoconference. For example, a set-top box may communicate with a personal computer to establish a videoconference. [0068]
  • During the videoconference, data received from the video capture device 120 associated with set-top boxes 121 is transmitted over network 110 (Act 802). The video will generally be transmitted at the frame rate and resolution native to the transmitting set-top box or video capture device. Set-top boxes 121 also receive and display video signals over network 110 (Act 803). Video signals that are not in the native format of the set-top box are converted as described above, on-the-fly, without having to pre-negotiate a particular video standard to use in the video conference. Thus, each of the participants of the video conference could potentially be using a different native video standard, yet the video conference could still proceed without having to first negotiate a common standard. Relative to the centralized format decisions common in broadcast television, the format decision is made in a distributed manner. [0069]
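  • The receive path of Act 803 can be summarized by a small sketch. Every name below is hypothetical (the patent does not prescribe an API); the point is only that each receiver converts incoming video to its own native format, so no common standard needs to be negotiated among the participants.

    from dataclasses import dataclass

    @dataclass
    class VideoSignal:
        fmt: str            # e.g., "NTSC", "PAL", "FILM"
        frames: list

    NATIVE_FORMAT = "NTSC"  # native format of this receiving set-top box (assumed)

    def convert_to_native(signal):
        """Placeholder for the frame-rate and resolution conversion of FIGS. 4-6."""
        return VideoSignal(NATIVE_FORMAT, signal.frames)   # conversion details elided

    def receive_and_display(signal):
        """Convert each incoming signal on the fly; the format decision is distributed."""
        if signal.fmt != NATIVE_FORMAT:
            signal = convert_to_native(signal)
        print(f"displaying {len(signal.frames)} frames in {signal.fmt}")

    receive_and_display(VideoSignal("PAL", frames=[None] * 25))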
  • In the videoconferencing application described above, a standard set-top box may be used to implement videoconferencing with videoconference partners that transmit video in a format different from the native format of the set-top box. [0070]
  • Conclusion
  • Techniques described herein provide for conversion between video formats at a personal multimedia device using existing hardware in the personal multimedia device. [0071]
  • It will be apparent to one of ordinary skill in the art that aspects of the invention, as described above, may be implemented in many different forms of software, firmware, and hardware in the implementations illustrated in the figures. The actual software code or specialized control hardware used to implement aspects consistent with the present invention is not limiting of the present invention. Thus, the operation and behavior of the aspects were described without reference to the specific software code or hardware logic. It should be understood that a person of ordinary skill in the art would be able to design software and control hardware to implement the aspects of the present invention based on the description herein. [0072]
  • The foregoing description of preferred embodiments of the present invention provides illustration and description, but is not intended to be exhaustive or to limit the invention to the precise form disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention. For example, devices other than traditional set-top boxes may be used to convert between video formats. Video game consoles, for example, may be used to implement video conferences consistent with aspects of the invention. [0073]
  • No element, act, or instruction used in the description of the present application should be construed as critical or essential to the invention unless explicitly described as such. Also, as used herein, the article “a” is intended to include one or more items. Where only one item is intended, the term “one” or similar language is used. [0074]
  • The scope of the invention is defined by the claims and their equivalents. [0075]

Claims (30)

What is claimed:
1. A personal multimedia device comprising:
a media processing component configured to:
detect a frame rate of a received video signal;
compare the frame rate to a frame rate native to the personal multimedia device;
increase a frame rate of the received video signal when the frame rate of the received video signal is less than the frame rate native to the personal multimedia device by adding frames to the received video signal where the added frames are based on at least one of the received frames, and
decrease the frame rate of the received video signal when the frame rate of the received video signal is greater than the frame rate native to the personal multimedia device by removing frames from the received video signal.
2. The personal multimedia device of claim 1, wherein the media processing component further includes at least one graphics processor configured to scale a frame resolution of frames in the received video signal to correspond to a native video format of the personal multimedia device.
3. The personal multimedia device of claim 1, wherein each of the added frames is derived from an odd field of a first frame of the received video signal and an even field of a second frame of the received video signal.
4. The personal multimedia device of claim 1, wherein when the received video signal is a Phase Alternation Line (PAL) formatted video signal and the native video format is a National Television Systems Committee (NTSC) formatted video signal, the media processing component adds one frame for every five received frames.
5. The personal multimedia device of claim 1, wherein when the received video signal is a National Television Systems Committee (NTSC) formatted video signal and the native video format is a Phase Alternation Line (PAL) formatted video signal, the media processing component removes one frame for every six received frames.
6. The personal multimedia device of claim 1, wherein when the received video signal is a FILM formatted video signal and the native video format is a National Television Systems Committee (NTSC) formatted video signal, the media processing component adds one frame for every four received frames.
7. The personal multimedia device of claim 1, wherein the media processing component further includes:
a video decoder/encoder configured to process the scaled video signal to an analog format appropriate for display on a television using the native video format of the personal multimedia device.
8. The personal multimedia device of claim 1, further comprising:
a channel decoder configured to decode television signals; and
an input/output interface including logic configured to connect external devices to the personal multimedia device.
9. The personal multimedia device of claim 1, wherein the media processing component determines the frame rate of the received video signal by examining header fields in a compressed version of the received video signal.
10. The personal multimedia device of claim 9, wherein the compressed version of the received video signal is compressed based on one of MPEG-2, MPEG-4, H.261, H.263, or H.264 standards.
11. A personal multimedia device comprising:
a media processing component configured to:
detect a frame rate of a received video signal;
compare the frame rate to a frame rate native to the personal multimedia device; and
modify a frame rate of the received video signal when the frame rate of the received video signal is different than the frame rate native to the personal multimedia device by temporally filtering the sequences of frames in the received video signal to derive a new set of interpolated frames at the frame rate native to the personal multimedia device,
wherein the media processing component further includes at least one graphics processor configured to scale a frame resolution of frames in the received video signal to correspond to a native video format of the personal multimedia device.
12. The personal multimedia device of claim 11, wherein the media processing component further includes:
a video decoder/encoder configured to process the scaled video signal to an analog format appropriate for display on a television using the native video format of the personal multimedia device.
13. The personal multimedia device of claim 11, further comprising:
a channel decoder configured to decode television signals; and
an input/output interface including logic configured to connect external devices to the personal multimedia device.
14. A method for converting a compressed video signal in a first format to a second format, the method comprising:
determining a frame rate of the video signal in the first format based on header information included in the compressed video signal;
decoding the compressed video signal to uncompress the video signal;
increasing the frame rate of the video signal in the first format when the determined frame rate is less than a frame rate of the second format by adding frames to the uncompressed video signal of the first format, where each added frame is based on at least one frame in the uncompressed video signal in the first format;
decreasing the frame rate of the video signal in the first format when the determined frame rate is greater than the frame rate of the second format by removing frames from the uncompressed video signal in the first format; and
scaling a frame resolution of each of the frames of the video signal in the first format to adjust a resolution of the video signal in the first format to match a resolution of the video signal in the second format.
15. The method of claim 14, wherein each of the added frames is derived from an odd field of a first frame of the video signal in the first format and an even field of a second frame of the video signal in the second format.
16. The method of claim 14, wherein when the video signal in the first format is a Phase Alternation Line (PAL) formatted video signal and the video signal in the second format is a National Television Systems Committee (NTSC) formatted video signal, increasing the frame rate includes adding one frame for every five received frames.
17. The method of claim 14, wherein when the video signal in the first format is a National Television Systems Committee (NTSC) formatted video signal and the video signal in the second format is a Phase Alternation Line (PAL) formatted video signal, decreasing the frame rate includes removing one frame for every six received frames.
18. The method of claim 14, wherein when the video signal in the first format is a FILM formatted video signal and the video signal in the second format is a National Television Systems Committee (NTSC) formatted video signal, increasing the frame rate includes adding one additional frame for every four received frames.
19. The method of claim 14, wherein determining a frame rate comprises:
determining the frame rate of the video signal in the first format by examining header fields in the compressed video signal in the first format.
20. The method of claim 19, wherein the compressed video signal in the first format is compressed based on one of MPEG-2, MPEG-4, H.261, H.263, and H.264 standards.
21. A video conference system comprising:
a first personal multimedia device configured to output video signals in a first format and to convert received video signals that are not in the first format to television compatible video signals in the first format;
a first video capture device coupled to the first personal multimedia device and configured to transmit video signals to the first personal multimedia device;
a second personal multimedia device coupled to the first personal multimedia device via a network, the second personal multimedia device configured to output video signals in a second format and to convert received video signals that are not in the second format to television compatible video signals in the second format; and
a second video capture device coupled to the second personal multimedia device and configured to transmit video signals to the second personal multimedia device.
22. The video conference system of claim 21, further comprising:
a third personal multimedia device coupled to the first personal multimedia device and the second personal multimedia device via the network, the third personal multimedia device configured to convert the received video signal to a third format when the native format is different from the first and second formats.
23. The system of claim 21, wherein the first and second personal multimedia devices are set-top boxes.
24. The video conference system of claim 21, wherein the first personal multimedia device includes:
a media processing component configured to:
increase a frame rate of a received video signal when the frame rate of the received video signal is less than a frame rate of a native video format of the first personal multimedia device by adding frames to the received video signal where the added frames are based on at least one of the received frames, and
decrease the frame rate of the received video signal when the frame rate of the received video signal is greater than a native frame rate of the first personal multimedia device by removing frames from the received video signal, wherein
the media processing component further includes at least one graphics processor configured to scale a frame resolution of the frames in the received video signal to correspond to the native video format of the first personal multimedia device.
25. The video conference system of claim 21, wherein each of the added frames is derived from an odd field of a first frame of the received video signal and an even field of a second frame of the received video signal.
26. The video conference system of claim 21, wherein when the received video signal is a Phase Alternation Line (PAL) formatted video signal and the native video format is a National Television Systems Committee (NTSC) formatted video signal, the media processing component adds one frame for every five received frames.
27. The video conference system of claim 21, wherein when the received video signal is a National Television Systems Committee (NTSC) formatted video signal and the native video format is a Phase Alternation Line (PAL) formatted video signal, the media processing component removes one frame for every six received frames.
28. The video conference system of claim 21, wherein when the received video signal is a FILM formatted video signal and the native video format is a National Television Systems Committee (NTSC) formatted video signal, the media processing component adds one additional frame for every four received frames.
29. A device for converting a video signal in a first format to a second format, the device comprising:
means for determining a frame rate of the video signal in the first format;
means for increasing the frame rate when the determined frame rate is less than a frame rate of the second format by adding frames to the video signal of the first format, where each added frame is based on at least one frame in the video signal in the first format;
means for decreasing the frame rate when the determined frame rate is greater than the frame rate of the second format by removing frames from the video signal in the first format; and
means for scaling a frame resolution of each of the frames of the video signal in the first format to adjust a resolution of the video signal in the first format to match a resolution of the video signal in the second format.
30. The device of claim 29, wherein the device is a set-top box.
US10/601,733 2003-06-23 2003-06-23 Personal multimedia device video format conversion across multiple video formats Abandoned US20040257434A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10/601,733 US20040257434A1 (en) 2003-06-23 2003-06-23 Personal multimedia device video format conversion across multiple video formats
US11/879,306 US7688384B2 (en) 2003-06-23 2007-07-17 Personal multimedia device video format conversion across multiple video formats

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US10/601,733 US20040257434A1 (en) 2003-06-23 2003-06-23 Personal multimedia device video format conversion across multiple video formats

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/879,306 Continuation US7688384B2 (en) 2003-06-23 2007-07-17 Personal multimedia device video format conversion across multiple video formats

Publications (1)

Publication Number Publication Date
US20040257434A1 true US20040257434A1 (en) 2004-12-23

Family

ID=33518009

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/601,733 Abandoned US20040257434A1 (en) 2003-06-23 2003-06-23 Personal multimedia device video format conversion across multiple video formats
US11/879,306 Active US7688384B2 (en) 2003-06-23 2007-07-17 Personal multimedia device video format conversion across multiple video formats

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/879,306 Active US7688384B2 (en) 2003-06-23 2007-07-17 Personal multimedia device video format conversion across multiple video formats

Country Status (1)

Country Link
US (2) US20040257434A1 (en)

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060146353A1 (en) * 2004-12-30 2006-07-06 Microsoft Corporation Strategies for rendering job information using a multi-personality driver device
US20060238648A1 (en) * 2005-04-20 2006-10-26 Eric Wogsberg Audiovisual signal routing and distribution system
WO2006113776A2 (en) * 2005-04-20 2006-10-26 Jupiter Systems Audiovisual signal routing and distribution system
US20060242669A1 (en) * 2005-04-20 2006-10-26 Jupiter Systems Display node for use in an audiovisual signal routing and distribution system
US20060239294A1 (en) * 2005-04-20 2006-10-26 Jupiter Systems Capture node for use in an audiovisual signal routing and distribution system
US20060244816A1 (en) * 2005-04-28 2006-11-02 Hyeonkuk Jeong Adjusting sampling rate for encoding
US20060244819A1 (en) * 2005-04-28 2006-11-02 Thomas Pun Video encoding in a video conference
US20060244755A1 (en) * 2005-04-28 2006-11-02 Microsoft Corporation Pre-rendering conversion of graphical data
US20060247045A1 (en) * 2005-04-28 2006-11-02 Hyeonkuk Jeong Audio processing in a multi-participant conference
US20060245378A1 (en) * 2005-04-28 2006-11-02 Hyeonkuk Jeong Multi-participant conference setup
US20060245377A1 (en) * 2005-04-28 2006-11-02 Hyeonkuk Jeong Heterogeneous video conferencing
US20060244812A1 (en) * 2005-04-28 2006-11-02 Hyeonkuk Jeong Video processing in a multi-participant video conference
US20070202923A1 (en) * 2006-02-24 2007-08-30 Searete, Llc System and method for transferring media content between a portable device and a video display
EP1849239A2 (en) * 2005-01-25 2007-10-31 Collaboration Properties, Inc. Multiple-channel codec and transcoder environment for gateway, mcu, broadcast, and video storage applications
US20070297427A1 (en) * 2004-09-22 2007-12-27 Icube Corp. Media Gateway
US20080100740A1 (en) * 2006-10-27 2008-05-01 Mediatek Inc. Methods and apparatuses for adjusting digital video signals
WO2008100640A1 (en) * 2007-02-16 2008-08-21 Marvell World Trade Lte. Methods and systems for improving low resolution and low frame rate video
US20080211959A1 (en) * 2007-01-05 2008-09-04 Nikhil Balram Methods and systems for improving low-resolution video
EP1973617A2 (en) * 2005-12-23 2008-10-01 WMS Gaming Inc. Transient or persistent game play in wagering games
US20080291209A1 (en) * 2007-05-25 2008-11-27 Nvidia Corporation Encoding Multi-media Signals
US20100066808A1 (en) * 2008-09-12 2010-03-18 Embarq Holdings Company, Llc System and method for encoding changes for video conferencing through a set-top box
US20100149301A1 (en) * 2008-12-15 2010-06-17 Microsoft Corporation Video Conferencing Subscription Using Multiple Bit Rate Streams
US20100240442A1 (en) * 2006-02-17 2010-09-23 Wms Gaming Inc. Providing alternative persistent state recovery techniques
US8064752B1 (en) 2003-12-09 2011-11-22 Apple Inc. Video encoding
US8135261B1 (en) 2003-12-09 2012-03-13 Apple Inc. Insertion and usage of metadata in digital video
US8159612B1 (en) * 2004-12-13 2012-04-17 Nvidia Corporation Apparatus, system, and method for processing digital audio/video signals
WO2013052879A1 (en) * 2011-10-06 2013-04-11 Qualcomm Incorporated Frame buffer format detection
US8606949B2 (en) * 2005-04-20 2013-12-10 Jupiter Systems Interconnection mechanism for multiple data streams
US8660380B2 (en) 2006-08-25 2014-02-25 Nvidia Corporation Method and system for performing two-dimensional transform on data value array with reduced power consumption
US8660182B2 (en) 2003-06-09 2014-02-25 Nvidia Corporation MPEG motion estimation based on dual start points
US8666181B2 (en) 2008-12-10 2014-03-04 Nvidia Corporation Adaptive multiple engine image motion detection system and method
US8724702B1 (en) 2006-03-29 2014-05-13 Nvidia Corporation Methods and systems for motion estimation used in video coding
US8731071B1 (en) 2005-12-15 2014-05-20 Nvidia Corporation System for performing finite input response (FIR) filtering in motion estimation
US8756482B2 (en) 2007-05-25 2014-06-17 Nvidia Corporation Efficient encoding/decoding of a sequence of data frames
US8787732B2 (en) 2003-12-09 2014-07-22 Apple Inc. Exporting metadata associated with digital video
US8839110B2 (en) 2011-02-16 2014-09-16 Apple Inc. Rate conform operation for a media-editing application
US8854539B2 (en) * 2002-07-23 2014-10-07 Visualon, Inc. Method and system for direct recording of video information onto a disk medium
US8861701B2 (en) 2005-04-28 2014-10-14 Apple Inc. Multi-participant conference adjustments
US20140313283A1 (en) * 2011-12-22 2014-10-23 Yoshinaga Kato Electronic device and non-transitory computer readable recording medium storing program for controlling electronic device
US8873625B2 (en) 2007-07-18 2014-10-28 Nvidia Corporation Enhanced compression in representing non-frame-edge blocks of image frames
US8947492B2 (en) 2010-06-18 2015-02-03 Microsoft Corporation Combining multiple bit rate and scalable video coding
US9015755B2 (en) 2008-07-29 2015-04-21 Centurylink Intellectual Property Llc System and method for an automatic television channel change
US9032461B2 (en) 2008-09-12 2015-05-12 Centurylink Intellectual Property Llc System and method for video conferencing through a television forwarding device
US20150161565A1 (en) * 2012-08-24 2015-06-11 SNN, Inc. Methods and systems for producing, previewing, and publishing a video press release over an electronic network
US9118927B2 (en) 2007-06-13 2015-08-25 Nvidia Corporation Sub-pixel interpolation and its application in motion compensated encoding of a video signal
US9330060B1 (en) 2003-04-15 2016-05-03 Nvidia Corporation Method and device for encoding and decoding video image data
US20160219248A1 (en) * 2013-08-29 2016-07-28 Vid Scale, Inc. User-adaptive video telephony
US9412414B2 (en) 2011-02-16 2016-08-09 Apple Inc. Spatial conform operation for a media-editing application
US9544542B2 (en) * 2015-03-31 2017-01-10 Brother Kogyo Kabushiki Kaisha Teleconference management server device, teleconference management method and non-transitory computer-readable medium
US9870802B2 (en) 2011-01-28 2018-01-16 Apple Inc. Media clip management
US9997196B2 (en) 2011-02-16 2018-06-12 Apple Inc. Retiming media presentations
US10324605B2 (en) 2011-02-16 2019-06-18 Apple Inc. Media-editing application with novel editing tools
US10467855B2 (en) 2017-06-01 2019-11-05 Igt Gaming system and method for modifying persistent elements
US11044420B2 (en) * 2018-10-29 2021-06-22 Henry M. Pena Real time video special effects system and method
US11303847B2 (en) * 2019-07-17 2022-04-12 Home Box Office, Inc. Video frame pulldown based on frame analysis
US11747972B2 (en) 2011-02-16 2023-09-05 Apple Inc. Media-editing application with novel editing tools

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5022105B2 (en) * 2007-05-29 2012-09-12 富士フイルム株式会社 Camera display control apparatus and method
US20090100493A1 (en) * 2007-10-16 2009-04-16 At&T Knowledge Ventures, Lp. System and Method for Display Format Detection at Set Top Box Device
US9204086B2 (en) * 2008-07-17 2015-12-01 Broadcom Corporation Method and apparatus for transmitting and using picture descriptive information in a frame rate conversion processor
US8446452B2 (en) * 2008-11-07 2013-05-21 Magor Communications Corporation Video rate adaptation for congestion control
US20130076979A1 (en) * 2010-08-20 2013-03-28 Shinya Kadono Video transmitting apparatus, video transmitting method, video receiving apparatus, video receiving method, and video transmitting-receiving apparatus
US8582474B2 (en) * 2010-12-09 2013-11-12 At&T Intellectual Property I, L.P. Video conference system and method
DE102011013737A1 (en) * 2011-03-11 2012-09-13 Deutsches Zentrum für Luft- und Raumfahrt e.V. satellite
US9779471B2 (en) 2014-10-01 2017-10-03 Qualcomm Incorporated Transparent pixel format converter
CN104580835A (en) * 2015-01-12 2015-04-29 陶然 Private cinema decoding monitoring device

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5651104A (en) * 1995-04-25 1997-07-22 Evans & Sutherland Computer Corporation Computer graphics system and process for adaptive supersampling
US6529680B1 (en) * 1996-04-26 2003-03-04 Mitsubishi Digital Electronics America, Inc. Device for selecting and controlling a plurality of signal sources in a television system
US6121978A (en) * 1998-01-07 2000-09-19 Ati Technologies, Inc. Method and apparatus for graphics scaling
IL141104A0 (en) * 1998-07-27 2002-02-10 Webtv Networks Inc Remote computer access
US6466624B1 (en) * 1998-10-28 2002-10-15 Pixonics, Llc Video decoder with bit stream based enhancements
JP2003529964A (en) * 1999-08-27 2003-10-07 ノキア コーポレイション Mobile multimedia terminal for DVB-1 and large and small cell communication
US7890661B2 (en) * 2001-05-16 2011-02-15 Aol Inc. Proximity synchronizing audio gateway device
US20020188952A1 (en) * 2001-06-08 2002-12-12 Istvan Anthony F. Systems and methods for accessing interactive content via synthetic channels
EP1445688A4 (en) * 2001-10-24 2006-09-20 Matsushita Electric Ind Co Ltd Printing system, printer, data output device, printing method
US7209874B2 (en) * 2002-02-25 2007-04-24 Zoran Corporation Emulator-enabled network connectivity to a device
US20030188322A1 (en) * 2002-03-28 2003-10-02 General Instrument Corporation Method and system for remotely displaying television program content using streaming video

Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5111292A (en) * 1991-02-27 1992-05-05 General Electric Company Priority selection apparatus as for a video signal processor
US6215515B1 (en) * 1992-02-19 2001-04-10 Netergy Networks, Inc. Videocommunicating device with an on-screen telephone keypad user-interface method and arrangement
US6314575B1 (en) * 1994-09-14 2001-11-06 Time Warner Entertainment Company, L.P. Telecasting service for providing video programs on demand with an interactive interface for facilitating viewer selection of video programs
US5819048A (en) * 1995-07-04 1998-10-06 Canon Kabushiki Kaisha Image data processing apparatus transmitting data in accordance with a reception rate
US5757967A (en) * 1995-10-19 1998-05-26 Ibm Corporation Digital video decoder and deinterlacer, format/frame rate converter with common memory
US6124880A (en) * 1996-05-24 2000-09-26 Nynex Science & Technology Inc. Multi-user video switchable translator
US5835636A (en) * 1996-05-28 1998-11-10 Lsi Logic Corporation Method and apparatus for reducing the memory required for decoding bidirectionally predictive-coded frames during pull-down
US6133940A (en) * 1996-09-04 2000-10-17 8×8, Inc. Telephone web browser arrangement and method
US6037991A (en) * 1996-11-26 2000-03-14 Motorola, Inc. Method and apparatus for communicating video information in a communication system
US20030012279A1 (en) * 1997-03-17 2003-01-16 Navin Chaddha Multimedia compression system with additive temporal layers
US6124881A (en) * 1997-04-25 2000-09-26 Fujitsu Limited System for generating composed signals of multiple pictures for use in a video conference system
US6343098B1 (en) * 1998-02-26 2002-01-29 Lucent Technologies Inc. Efficient rate control for multi-resolution video encoding
US20020154698A1 (en) * 1998-03-27 2002-10-24 Jeongnam Youn Method and apparatus for motion estimation for high performance transcoding
US7069573B1 (en) * 1999-12-09 2006-06-27 Vidiator Enterprises Inc. Personal broadcasting and viewing method of audio and video data using a wide area network
US20020145611A1 (en) * 2000-02-01 2002-10-10 Dye Thomas A. Video controller system with object display lists
US6983478B1 (en) * 2000-02-01 2006-01-03 Bellsouth Intellectual Property Corporation Method and system for tracking network use
US20030206242A1 (en) * 2000-03-24 2003-11-06 Choi Seung Jong Device and method for converting format in digital TV receiver
US6919929B1 (en) * 2001-03-29 2005-07-19 National Semiconductor Corporation Method and system for implementing a video and graphics interface signaling protocol
US6803922B2 (en) * 2002-02-14 2004-10-12 International Business Machines Corporation Pixel formatter for two-dimensional graphics engine of set-top box system
US20030219238A1 (en) * 2002-04-05 2003-11-27 Akira Yamaguchi Frame conversion apparatus and frame conversion method
US20050033849A1 (en) * 2002-06-20 2005-02-10 Bellsouth Intellectual Property Corporation Content blocking
US20040239803A1 (en) * 2003-05-27 2004-12-02 Steve Selby Method and system for changing the frame rate to be optimal for the material being displayed while maintaining a stable image throughout

Cited By (103)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8854539B2 (en) * 2002-07-23 2014-10-07 Visualon, Inc. Method and system for direct recording of video information onto a disk medium
US9330060B1 (en) 2003-04-15 2016-05-03 Nvidia Corporation Method and device for encoding and decoding video image data
US8660182B2 (en) 2003-06-09 2014-02-25 Nvidia Corporation MPEG motion estimation based on dual start points
US8135261B1 (en) 2003-12-09 2012-03-13 Apple Inc. Insertion and usage of metadata in digital video
US8886016B2 (en) 2003-12-09 2014-11-11 Apple Inc. Propagating metadata associated with digital video
US8811802B2 (en) 2003-12-09 2014-08-19 Aplle, Inc. Insertion and usage of metadata in digital video
US8787732B2 (en) 2003-12-09 2014-07-22 Apple Inc. Exporting metadata associated with digital video
US8666222B2 (en) * 2003-12-09 2014-03-04 Apple Inc. Video encoding
US20120093216A1 (en) * 2003-12-09 2012-04-19 David Robert Black Video encoding
US8064752B1 (en) 2003-12-09 2011-11-22 Apple Inc. Video encoding
US20070297427A1 (en) * 2004-09-22 2007-12-27 Icube Corp. Media Gateway
US8931027B2 (en) * 2004-09-22 2015-01-06 Icube Corp. Media gateway
US8159612B1 (en) * 2004-12-13 2012-04-17 Nvidia Corporation Apparatus, system, and method for processing digital audio/video signals
US20060146353A1 (en) * 2004-12-30 2006-07-06 Microsoft Corporation Strategies for rendering job information using a multi-personality driver device
EP1849239A2 (en) * 2005-01-25 2007-10-31 Collaboration Properties, Inc. Multiple-channel codec and transcoder environment for gateway, mcu, broadcast, and video storage applications
US20080117965A1 (en) * 2005-01-25 2008-05-22 Collaboration Properties, Inc. Multiple-Channel Codec and Transcoder Environment for Gateway, Mcu, Broadcast, and Video Storage Applications
EP1849239A4 (en) * 2005-01-25 2010-12-29 Avistar Comm Corp Multiple-channel codec and transcoder environment for gateway, mcu, broadcast, and video storage applications
US8606949B2 (en) * 2005-04-20 2013-12-10 Jupiter Systems Interconnection mechanism for multiple data streams
WO2006113776A3 (en) * 2005-04-20 2007-09-20 Jupiter Systems Audiovisual signal routing and distribution system
US10469553B2 (en) 2005-04-20 2019-11-05 Jupiter Systems, Llc Interconnection mechanism for multiple data streams
US20140223024A1 (en) * 2005-04-20 2014-08-07 Jupiter Systems Interconnection mechanism for multiple data streams
US8553716B2 (en) 2005-04-20 2013-10-08 Jupiter Systems Audiovisual signal routing and distribution system
US8547997B2 (en) * 2005-04-20 2013-10-01 Jupiter Systems Capture node for use in an audiovisual signal routing and distribution system
US20060239294A1 (en) * 2005-04-20 2006-10-26 Jupiter Systems Capture node for use in an audiovisual signal routing and distribution system
US20060242669A1 (en) * 2005-04-20 2006-10-26 Jupiter Systems Display node for use in an audiovisual signal routing and distribution system
US9549011B2 (en) * 2005-04-20 2017-01-17 Infocus Corporation Interconnection mechanism for multiple data streams
WO2006113776A2 (en) * 2005-04-20 2006-10-26 Jupiter Systems Audiovisual signal routing and distribution system
US20060238648A1 (en) * 2005-04-20 2006-10-26 Eric Wogsberg Audiovisual signal routing and distribution system
US20100189178A1 (en) * 2005-04-28 2010-07-29 Thomas Pun Video encoding in a video conference
US8269816B2 (en) 2005-04-28 2012-09-18 Apple Inc. Video encoding in a video conference
US7817180B2 (en) 2005-04-28 2010-10-19 Apple Inc. Video processing in a multi-participant video conference
US20100321469A1 (en) * 2005-04-28 2010-12-23 Hyeonkuk Jeong Video Processing in a Multi-Participant Video Conference
US20060244755A1 (en) * 2005-04-28 2006-11-02 Microsoft Corporation Pre-rendering conversion of graphical data
US7864209B2 (en) 2005-04-28 2011-01-04 Apple Inc. Audio processing in a multi-participant conference
US7899170B2 (en) * 2005-04-28 2011-03-01 Apple Inc. Multi-participant conference setup
US20060244819A1 (en) * 2005-04-28 2006-11-02 Thomas Pun Video encoding in a video conference
US7949117B2 (en) * 2005-04-28 2011-05-24 Apple Inc. Heterogeneous video conferencing
US20060247045A1 (en) * 2005-04-28 2006-11-02 Hyeonkuk Jeong Audio processing in a multi-participant conference
US20060244816A1 (en) * 2005-04-28 2006-11-02 Hyeonkuk Jeong Adjusting sampling rate for encoding
US20060244812A1 (en) * 2005-04-28 2006-11-02 Hyeonkuk Jeong Video processing in a multi-participant video conference
US20060245377A1 (en) * 2005-04-28 2006-11-02 Hyeonkuk Jeong Heterogeneous video conferencing
US7692682B2 (en) 2005-04-28 2010-04-06 Apple Inc. Video encoding in a video conference
US8638353B2 (en) 2005-04-28 2014-01-28 Apple Inc. Video processing in a multi-participant video conference
US20060245378A1 (en) * 2005-04-28 2006-11-02 Hyeonkuk Jeong Multi-participant conference setup
US7653250B2 (en) 2005-04-28 2010-01-26 Apple Inc. Adjusting sampling rate for encoding
US8861701B2 (en) 2005-04-28 2014-10-14 Apple Inc. Multi-participant conference adjustments
US8520053B2 (en) 2005-04-28 2013-08-27 Apple Inc. Video encoding in a video conference
US8594293B2 (en) 2005-04-28 2013-11-26 Apple Inc. Multi-participant conference setup
US8731071B1 (en) 2005-12-15 2014-05-20 Nvidia Corporation System for performing finite input response (FIR) filtering in motion estimation
US9704343B2 (en) 2005-12-23 2017-07-11 Bally Gaming, Inc. Transient or persistent game play in wagering games
EP1973617A2 (en) * 2005-12-23 2008-10-01 WMS Gaming Inc. Transient or persistent game play in wagering games
US20080300049A1 (en) * 2005-12-23 2008-12-04 Wms Gaming Inc Transient or Persistent Game Play in Wagering Games
US10290183B2 (en) 2005-12-23 2019-05-14 Bally Gaming, Inc. Transient or persistent game play in wagering games
EP1973617A4 (en) * 2005-12-23 2011-03-02 Wms Gaming Inc Transient or persistent game play in wagering games
US9293001B2 (en) 2005-12-23 2016-03-22 Bally Gaming, Inc. Transient or persistent game play in wagering games
US20100240442A1 (en) * 2006-02-17 2010-09-23 Wms Gaming Inc. Providing alternative persistent state recovery techniques
US8216058B2 (en) 2006-02-17 2012-07-10 Wms Gaming Inc. Providing alternative persistent state recovery techniques
US20070202923A1 (en) * 2006-02-24 2007-08-30 Searete, Llc System and method for transferring media content between a portable device and a video display
US8724702B1 (en) 2006-03-29 2014-05-13 Nvidia Corporation Methods and systems for motion estimation used in video coding
US8666166B2 (en) 2006-08-25 2014-03-04 Nvidia Corporation Method and system for performing two-dimensional transform on data value array with reduced power consumption
US8660380B2 (en) 2006-08-25 2014-02-25 Nvidia Corporation Method and system for performing two-dimensional transform on data value array with reduced power consumption
US20080100740A1 (en) * 2006-10-27 2008-05-01 Mediatek Inc. Methods and apparatuses for adjusting digital video signals
US20080211959A1 (en) * 2007-01-05 2008-09-04 Nikhil Balram Methods and systems for improving low-resolution video
US8269886B2 (en) 2007-01-05 2012-09-18 Marvell World Trade Ltd. Methods and systems for improving low-resolution video
US8819760B2 (en) 2007-01-05 2014-08-26 Marvell World Trade Ltd. Methods and systems for improving low-resolution video
US20080198264A1 (en) * 2007-02-16 2008-08-21 Nikhil Balram Methods and systems for improving low resolution and low frame rate video
US8885099B2 (en) 2007-02-16 2014-11-11 Marvell World Trade Ltd. Methods and systems for improving low resolution and low frame rate video
WO2008100640A1 (en) * 2007-02-16 2008-08-21 Marvell World Trade Lte. Methods and systems for improving low resolution and low frame rate video
US20080291209A1 (en) * 2007-05-25 2008-11-27 Nvidia Corporation Encoding Multi-media Signals
US8756482B2 (en) 2007-05-25 2014-06-17 Nvidia Corporation Efficient encoding/decoding of a sequence of data frames
US9118927B2 (en) 2007-06-13 2015-08-25 Nvidia Corporation Sub-pixel interpolation and its application in motion compensated encoding of a video signal
US8873625B2 (en) 2007-07-18 2014-10-28 Nvidia Corporation Enhanced compression in representing non-frame-edge blocks of image frames
US9015755B2 (en) 2008-07-29 2015-04-21 Centurylink Intellectual Property Llc System and method for an automatic television channel change
US8208001B2 (en) * 2008-09-12 2012-06-26 Embarq Holdings Company, Llc System and method for encoding changes for video conferencing through a set-top box
US8692863B2 (en) 2008-09-12 2014-04-08 Centurylink Intellectual Property Llc System and method for setting resolution utilized for video conferencing through a streaming device
US20140247318A1 (en) * 2008-09-12 2014-09-04 Centurylink Intellectual Property Llc System and method for initiating a video conferencing through a streaming device
US9025000B2 (en) * 2008-09-12 2015-05-05 Centurylink Intellectual Property Llc System and method for initiating a video conferencing through a streaming device
US9032461B2 (en) 2008-09-12 2015-05-12 Centurylink Intellectual Property Llc System and method for video conferencing through a television forwarding device
US20100066808A1 (en) * 2008-09-12 2010-03-18 Embarq Holdings Company, Llc System and method for encoding changes for video conferencing through a set-top box
US8666181B2 (en) 2008-12-10 2014-03-04 Nvidia Corporation Adaptive multiple engine image motion detection system and method
US20100149301A1 (en) * 2008-12-15 2010-06-17 Microsoft Corporation Video Conferencing Subscription Using Multiple Bit Rate Streams
US8947492B2 (en) 2010-06-18 2015-02-03 Microsoft Corporation Combining multiple bit rate and scalable video coding
US9870802B2 (en) 2011-01-28 2018-01-16 Apple Inc. Media clip management
US9997196B2 (en) 2011-02-16 2018-06-12 Apple Inc. Retiming media presentations
US8839110B2 (en) 2011-02-16 2014-09-16 Apple Inc. Rate conform operation for a media-editing application
US11747972B2 (en) 2011-02-16 2023-09-05 Apple Inc. Media-editing application with novel editing tools
US10324605B2 (en) 2011-02-16 2019-06-18 Apple Inc. Media-editing application with novel editing tools
US9412414B2 (en) 2011-02-16 2016-08-09 Apple Inc. Spatial conform operation for a media-editing application
US11157154B2 (en) 2011-02-16 2021-10-26 Apple Inc. Media-editing application with novel editing tools
WO2013052879A1 (en) * 2011-10-06 2013-04-11 Qualcomm Incorporated Frame buffer format detection
US20130242117A1 (en) * 2011-10-06 2013-09-19 Qualcomm Incorporated Frame buffer format detection
US8730328B2 (en) * 2011-10-06 2014-05-20 Qualcomm Incorporated Frame buffer format detection
US20140313283A1 (en) * 2011-12-22 2014-10-23 Yoshinaga Kato Electronic device and non-transitory computer readable recording medium storing program for controlling electronic device
US9319628B2 (en) * 2011-12-22 2016-04-19 Ricoh Company, Ltd. Electronic device and non-transitory computer readable recording medium storing program for controlling electronic device
US20150161565A1 (en) * 2012-08-24 2015-06-11 SNN, Inc. Methods and systems for producing, previewing, and publishing a video press release over an electronic network
US9171289B2 (en) * 2012-08-24 2015-10-27 SNN Incorporated Methods and systems for producing, previewing, and publishing a video press release over an electronic network
US20160219248A1 (en) * 2013-08-29 2016-07-28 Vid Scale, Inc. User-adaptive video telephony
US11356638B2 (en) 2013-08-29 2022-06-07 Vid Scale, Inc. User-adaptive video telephony
US9544542B2 (en) * 2015-03-31 2017-01-10 Brother Kogyo Kabushiki Kaisha Teleconference management server device, teleconference management method and non-transitory computer-readable medium
US10467855B2 (en) 2017-06-01 2019-11-05 Igt Gaming system and method for modifying persistent elements
US11044420B2 (en) * 2018-10-29 2021-06-22 Henry M. Pena Real time video special effects system and method
US11303847B2 (en) * 2019-07-17 2022-04-12 Home Box Office, Inc. Video frame pulldown based on frame analysis
US11711490B2 (en) 2019-07-17 2023-07-25 Home Box Office, Inc. Video frame pulldown based on frame analysis

Also Published As

Publication number Publication date
US20070277221A1 (en) 2007-11-29
US7688384B2 (en) 2010-03-30

Similar Documents

Publication Publication Date Title
US7688384B2 (en) Personal multimedia device video format conversion across multiple video formats
JP4509146B2 (en) Signal processing device
US5444491A (en) Television system with multiple transmission formats
US5408270A (en) Advanced television system
JP4724734B2 (en) Device that performs format conversion
US6310654B1 (en) Decoder device and receiver using the same
US8270920B2 (en) Systems and methods for receiving and transferring video information
US20090122185A1 (en) Providing video streams of a program with different stream type values coded according to the same video coding specification
US5801782A (en) Analog video encoder with metered closed caption data on digital video input interface
US6256350B1 (en) Method and apparatus for low cost line-based video compression of digital video stream data
US20070040943A1 (en) Digital noise reduction apparatus and method and video signal processing apparatus
JP2001501400A (en) Television receiver with integrated receiver and decoder
US7020205B1 (en) Sending progressive video sequences suitable for MPEG and other data formats
JPH11164322A (en) Aspect ratio converter and its method
US6195393B1 (en) HDTV video frame synchronizer that provides clean digital video without variable delay
JP2009111442A (en) Video transmission system and method
US20070222891A1 (en) Systems and methods for video data conversion
US7580457B2 (en) Unified system for progressive and interlaced video transmission
US20050025462A1 (en) Method and apparatus for equipping personal digital product with functions of recording and displaying of the digital video/audio multi-media
CA2886992A1 (en) Methods and apparatuses for adaptively filtering video signals
US9154669B2 (en) Image apparatus for determining type of image data and method for processing image applicable thereto
Lim A proposal for an HDTV/ATV standard with multiple transmission formats
US20040179136A1 (en) Image transmission system and method thereof
Haskell et al. Introduction to digital multimedia, compression, and mpeg-2
Lim et al. HDTV transmission formats and migration path

Legal Events

Date Code Title Description
AS Assignment

Owner name: HUGHES ELECTRONICS CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DAVIS, ROBERT;JOSEPH, KURIACOSE;SEAH, ERNEST;REEL/FRAME:014229/0432;SIGNING DATES FROM 20030610 TO 20030620

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION