US20090116494A1 - Method and System to Transport High-Quality Video Signals - Google Patents

Method and System to Transport High-Quality Video Signals

Info

Publication number
US20090116494A1
US20090116494A1 (Application No. US12/264,586)
Authority
US
United States
Prior art keywords
subframe
data
video
signal
sequence number
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/264,586
Inventor
Pierre Costa
John Robert Erickson
Ahmad Ansari
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US12/264,586
Publication of US20090116494A1
Legal status: Abandoned

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/22 Adaptations for optical transmission
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q SELECTING
    • H04Q11/00 Selecting arrangements for multiplex systems
    • H04Q11/04 Selecting arrangements for multiplex systems for time-division multiplexing
    • H04Q11/0428 Integrated services digital network, i.e. systems for transmission of different types of digitised signals, e.g. speech, data, telecentral, television signals
    • H04Q11/0478 Provisions for broadband connections
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/54 Store-and-forward switching systems
    • H04L12/56 Packet switching systems
    • H04L12/5601 Transfer mode dependent, e.g. ATM
    • H04L2012/5603 Access techniques
    • H04L2012/5604 Medium of transmission, e.g. fibre, cable, radio
    • H04L2012/5605 Fibre
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/54 Store-and-forward switching systems
    • H04L12/56 Packet switching systems
    • H04L12/5601 Transfer mode dependent, e.g. ATM
    • H04L2012/5638 Services, e.g. multimedia, GOS, QOS
    • H04L2012/5646 Cell characteristics, e.g. loss, delay, jitter, sequence integrity
    • H04L2012/5652 Cell construction, e.g. including header, packetisation, depacketisation, assembly, reassembly
    • H04L2012/5653 Cell construction, e.g. including header, packetisation, depacketisation, assembly, reassembly using the ATM adaptation layer [AAL]
    • H04L2012/5654 Cell construction, e.g. including header, packetisation, depacketisation, assembly, reassembly using the ATM adaptation layer [AAL] using the AAL1
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/54 Store-and-forward switching systems
    • H04L12/56 Packet switching systems
    • H04L12/5601 Transfer mode dependent, e.g. ATM
    • H04L2012/5638 Services, e.g. multimedia, GOS, QOS
    • H04L2012/5646 Cell characteristics, e.g. loss, delay, jitter, sequence integrity
    • H04L2012/5652 Cell construction, e.g. including header, packetisation, depacketisation, assembly, reassembly
    • H04L2012/5653 Cell construction, e.g. including header, packetisation, depacketisation, assembly, reassembly using the ATM adaptation layer [AAL]
    • H04L2012/5658 Cell construction, e.g. including header, packetisation, depacketisation, assembly, reassembly using the ATM adaptation layer [AAL] using the AAL5
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/54 Store-and-forward switching systems
    • H04L12/56 Packet switching systems
    • H04L12/5601 Transfer mode dependent, e.g. ATM
    • H04L2012/5638 Services, e.g. multimedia, GOS, QOS
    • H04L2012/5664 Support of Video, e.g. MPEG

Abstract

Society of Motion Picture and Television Engineers (SMPTE) video data is separated into first data and second data. A first signal is formed based on the first data. A second signal is formed based on the second data. The first signal is transported via a first Optical Carrier 3 (OC-3) channel. The second signal is transported via a second OC-3 channel.

Description

    RELATED APPLICATIONS
  • The present application is a continuation of U.S. patent application Ser. No. 11/499,356 (still pending), filed Aug. 4, 2006, which is a continuation of U.S. patent application Ser. No. 09/956,475 (now U.S. Pat. No. 7,110,412), filed Sep. 18, 2001, the entirety of each of which is hereby incorporated by reference.
  • TECHNICAL FIELD
  • The present invention relates to methods and systems for transporting high-quality video signals.
  • BACKGROUND
  • The video industry has adopted the Society of Motion Picture and Television Engineers (SMPTE) 259M (level C) standard almost exclusively for high-quality video in studio and production applications. In some applications, an SMPTE 259M signal is to be transported to a remote location, which may be several miles away, for example. Current methods of transporting SMPTE 259M signals or other professional-quality video signals to remote locations use either dark fiber overlay networks or proprietary methods over very high bandwidth pipes. For example, an OC-12 channel may be used to transport an SMPTE 259M signal.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention is pointed out with particularity in the appended claims. However, other features of the invention will become more apparent and the invention will be best understood by referring to the following detailed description in conjunction with the accompanying drawings in which:
  • FIG. 1 is a block diagram of an embodiment of a system to transport high-quality video;
  • FIG. 2 is a flow chart of an embodiment of a method performed at a transmitter end;
  • FIG. 3 is a flow chart of an embodiment of a method performed at a receiver end;
  • FIG. 4 illustrates the SMPTE 259M data structure;
  • FIG. 5 is a block diagram illustrating an embodiment of an uncompressed signal based on a subframe of an SMPTE frame;
  • FIG. 6 is a schematic block diagram of an embodiment of a video processor at the transmitter end;
  • FIG. 7 is a schematic block diagram of an embodiment of a video processor at the receiver end;
  • FIG. 8 is a block diagram of an embodiment of a system to provide timing information;
  • FIG. 9 is a block diagram of an embodiment of a system to reconstruct the timing information at the receiver end; and
  • FIG. 10 is a block diagram depicting a packing method for transmitting 10-bit word information using 8-bit bytes.
  • DETAILED DESCRIPTION OF THE DRAWINGS
  • Briefly, embodiments of the present invention provide an improved process for transporting high-quality video. The process includes separating the video data into two sets of data, encapsulating each set of data into asynchronous transfer mode (ATM) cells, and transporting two ATM cell-based bit streams over dual, concatenated Optical Carrier 3 (OC-3) channels. Error checking and/or correction is used to reduce the probability of data errors during transport.
  • Embodiments of the present invention are described with reference to FIG. 1, which is a block diagram of an embodiment of a system to transport high-quality video, FIG. 2, which is a flow chart of an embodiment of a method performed at a transmitter end 20, and FIG. 3, which is a flow chart of an embodiment of a method performed at a receiver end 22.
  • As indicated by block 24, the method comprises separating SMPTE video data 26, such as SMPTE 259M video data, into first uncompressed data 30 and second uncompressed data 32. Preferably, the act of separating the SMPTE video data comprises separating each frame of the SMPTE video data into a first subframe and a second subframe. The first subframe comprises even lines of active video from the frame, and the second subframe comprises odd lines of active video from the frame. In addition to active video, the first subframe and the second subframe may further comprise horizontal ancillary data, optional video data and/or vertical ancillary data.
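  • As a rough illustration of this separation step, the Python sketch below splits a frame, represented here as an ordered list of active-video lines, into an even-line subframe and an odd-line subframe. The frame representation and the helper name split_frame are assumptions made for illustration, not the patent's implementation.

```python
from typing import List, Tuple

Line = bytes  # one line of active video, already packed by the caller


def split_frame(active_lines: List[Line]) -> Tuple[List[Line], List[Line]]:
    """Separate a frame's active-video lines into an even-line subframe and
    an odd-line subframe (line numbering starts at 1)."""
    even_subframe = [ln for i, ln in enumerate(active_lines, start=1) if i % 2 == 0]
    odd_subframe = [ln for i, ln in enumerate(active_lines, start=1) if i % 2 == 1]
    return even_subframe, odd_subframe
```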
  • The SMPTE 259M standard is inherently suitable to separate video data because of its field and frame-oriented data structure. In addition, extraneous data can be eliminated since the timing signals are not necessary to carry along a transport stream.
  • FIG. 4 illustrates the SMPTE 259M data structure. The data structure comprises active video fields 34 and 36. The active video fields 34 and 36 contain component pixel data. The active video field 34 includes odd lines of active video from a frame, while the active video field 36 includes even lines of active video from a frame. Optional video fields 40 and 42 contain vertical blanking interval (VBI) data and non-critical data. Horizontal ancillary (HANC) data fields 44 and 46 contain audio, timing and control information. Vertical ancillary (VANC) data fields 48 and 49 contain special user information. A clear delineation between the fields is created by the inherent timing signals EAV 50 and SAV 52.
  • The aforementioned structure is exploited to separate the SMPTE video data into two equal blocks. The first uncompressed data includes data from the active video field 34, the HANC data field 44, and a portion of the data from the optional video field 40 and/or the VANC data field 48. The second uncompressed data includes data from the active video field 36, the HANC data field 46, and a portion of the data from the optional video field 42 and/or the VANC data field 49.
  • Referring back to FIG. 2, the method comprises acts of forming a first uncompressed signal based on the first uncompressed data (block 54) and forming a second uncompressed signal based on the second uncompressed data (block 56). The first uncompressed signal is based on each first subframe. The second uncompressed signal is based on each second subframe.
  • The act of forming the first uncompressed signal further comprises appending a corresponding first sequence number to each first subframe, and encapsulating each first subframe with its corresponding first sequence number into at least one asynchronous transfer mode (ATM) cell. Similarly, the act of forming the second uncompressed signal further comprises appending a corresponding second sequence number to each second subframe, and encapsulating each second subframe with its corresponding second sequence number into at least one ATM cell. The sequence numbers are appended to each subframe since traffic in most ATM networks can take any of several paths, each with a potentially different latency and cell delay variation. The sequence numbers are used at the receiving end 22 to order reconstructed frames.
  • In one embodiment, each sequence number is defined by 20 bits. One bit of the sequence number is used to identify whether the field is field 1 or field 2. Choosing 20 bits for the sequence number field allows sequence numbers up to 2^(20−1) = 524,288. For a frame rate of 30 frames per second, the maximum video length for 20 sequence number bits is (524,288 frames)/((30 frames per second)*(3600 seconds per hour)), which approximately equals 4.854 hours.
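  • A minimal sketch of how such a 20-bit sequence tag could be built, assuming, as the text describes, one bit identifying field 1 or field 2 and the remaining 19 bits carrying a frame counter; the exact bit positions and the 3-byte packing shown here are assumptions for illustration.

```python
def make_sequence_tag(frame_number: int, field: int) -> bytes:
    """Pack a 20-bit sequence tag: 1 field-identifier bit plus a 19-bit
    frame counter, returned as 3 bytes (the top 4 bits are unused)."""
    assert field in (1, 2)
    counter = frame_number % (1 << 19)      # wraps after 524,288 frames
    tag = ((field - 1) << 19) | counter     # bit 19 identifies field 1 or 2
    return tag.to_bytes(3, "big")


# At 30 frames per second, 2**19 frame numbers cover about 4.85 hours:
# (2 ** 19) / (30 * 3600) ≈ 4.854
```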
  • FIG. 5 is a block diagram illustrating an embodiment of an uncompressed signal which results from the act of either block 54 or block 56 in FIG. 2. The uncompressed signal comprises a bit stream including a sequence number 60, ancillary data 62, and active video data 64 for a subframe from a first frame. Thereafter, the bit stream includes a sequence number 70, ancillary data 72, and active video data 74 for a subframe from a second frame. The pattern of including a sequence number, ancillary data and active video data for a subframe is repeated for each succeeding frame.
  • The above-described encapsulation method distinguishes video data (e.g. field/frame data) from ancillary data to facilitate the SMPTE 259M video data being properly reconstructed at the receiver end 22. Since timing relationships are well-defined in the SMPTE 259M standard, and since a fixed frequency of 270 Mbps is used, logic at the receiver end 22 can add the proper timing signals.
  • The bandwidth required to transmit the above bit stream is calculated as follows. With respect to the active video bandwidth, each field has 244 active video lines. The number of words per line is (720 pixels per line)*(2 words per pixel for the multiplexed Cb, Y, Cr samples)=1440. Since each word consists of 10 bits, the number of bits per line is (1440 words per line)*(10 bits per word)=14,400. Thus, the total number of active video bits per field is (14,400 bits per line)*(244 active video lines per field)=3.5136 Mb. Since each frame is based on 2 fields, the total number of active video bits per frame is (3.5136 Mb per field)*(2 fields per frame)=7.0272 Mb. For a frame rate of 30 frames per second, the active video bit rate is (7.0272 Mb per frame)*(30 frames per second)=210.816 Mbps. Accounting for ATM overhead with a cell tax of 1.09433, the total active video bandwidth is 1.09433*210.816 Mbps=230.7043 Mbps.
  • The bandwidth for the HANC data is determined as follows. The HANC bit rate is 30 Mbps. Accounting for ATM overhead with a cell tax of 1.09433, the HANC bandwidth is 1.09433*30 Mbps=32.8302 Mbps.
  • The bandwidth for the VANC/optional data is determined as follows. Each frame has 20 lines allocated for VANC/optional data. The 20 lines comprise any 10 lines selected from lines 1-20, and any 10 lines selected from lines 264-283. Since the number of bits per line is 14,400, the total number of VANC/optional bits per frame is (14,400 bits per line)*(20 VANC/optional lines per frame)=288,000. For a frame rate of 30 frames per second, the VANC/optional bit rate is (288,000 bits per frame)*(30 frames per second)=8.64 Mbps. Accounting for ATM overhead with a cell tax of 1.09433, the VANC/optional bandwidth is 1.09433*8.64 Mbps=9.4550112 Mbps.
  • The total data rate is equal to the sum of the total active video bandwidth, the HANC bandwidth and the VANC/optional bandwidth. Thus, the total data rate is 230.7043 Mbps+32.8302 Mbps+9.4550112 Mbps, which equals 272.984612 Mbps. This is less than the 299.52 Mbps bandwidth available on two OC-3 links. Since the data is separated into two fields, the total data rate per field is 272.984612 Mbps/2, which approximately equals 136.4923 Mbps.
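  • The bandwidth budget above can be checked with a few lines of arithmetic. The sketch below simply reproduces the text's figures (244 active lines per field, 1,440 ten-bit words per line, 30 frames per second, a 1.09433 ATM cell tax) and verifies that the total fits under the 299.52 Mbps available on two OC-3 links; small rounding differences from the quoted totals are expected.

```python
CELL_TAX = 1.09433                       # ATM overhead factor from the text

bits_per_line = 1440 * 10                # 1,440 ten-bit words per line
active_bps = bits_per_line * 244 * 2 * 30 * CELL_TAX   # 244 lines/field, 2 fields, 30 fps
hanc_bps = 30e6 * CELL_TAX               # 30 Mbps of horizontal ancillary data
vanc_bps = bits_per_line * 20 * 30 * CELL_TAX           # 20 VANC/optional lines per frame

total_bps = active_bps + hanc_bps + vanc_bps
assert total_bps < 299.52e6              # fits within the two OC-3 links
print(f"total = {total_bps / 1e6:.2f} Mbps, per stream = {total_bps / 2e6:.2f} Mbps")
```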
  • Optionally, the act of forming the first uncompressed signal further comprises adding a first ATM adaptation layer (AAL) with either an error checking code or an error correcting code. Similarly, the act of forming the second uncompressed signal may optionally comprise adding a second ATM adaptation layer with either an error checking code or an error correcting code. A block coding algorithm such as Reed Solomon or another forward error correcting (FEC) code may be used.
  • Referring back to FIGS. 1 and 2, the method comprises transporting 80 the first uncompressed signal via a first OC-3 channel 82, and transporting 84 the second uncompressed signal via a second OC-3 channel 86. The OC-3 channels 82 and 86 are provided by an ATM network 90.
  • The adaptation layers may be added because additional bandwidth is available on the two OC-3 links beyond the 272.984612 Mbps required by the two bit streams. Either AAL-1 with FEC or AAL-5 with FEC may be used. The former is less efficient but more robust, and the latter is more efficient but slightly less robust. The selection between these two adaptation layers may be dictated by the requirements of a specific application. Note that the FEC process is symmetrical, requiring the inverse algorithm to be processed at the receiver end 22.
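  • For illustration, a much-simplified AAL-5-style encapsulation is sketched below: the payload is padded so that, with an 8-byte trailer carrying a length field and a CRC-32, it fills whole 48-byte cell payloads. The trailer layout is only approximate, zlib's CRC-32 is used as a stand-in for the AAL-5 CRC, and the FEC and ATM cell headers discussed in the text are omitted.

```python
import zlib


def aal5_like_encapsulate(payload: bytes) -> list:
    """Simplified AAL-5-style CPCS framing: pad so that payload plus an
    8-byte trailer fills whole 48-byte cells, append UU/CPI/length/CRC-32,
    then slice into 48-byte cell payloads (cell headers and FEC omitted)."""
    pad_len = (-(len(payload) + 8)) % 48
    pdu = payload + b"\x00" * pad_len
    trailer = bytes([0, 0]) + len(payload).to_bytes(2, "big")   # CPCS-UU, CPI, length
    crc = zlib.crc32(pdu + trailer) & 0xFFFFFFFF                # stand-in for the AAL-5 CRC-32
    pdu += trailer + crc.to_bytes(4, "big")
    return [pdu[i:i + 48] for i in range(0, len(pdu), 48)]
```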
  • Turning now to FIG. 3, a method performed at the receiver end 22 comprises receiving the first uncompressed signal via the first OC-3 channel (block 92), and receiving the second uncompressed signal via the second OC-3 channel (block 94). As described above, the first uncompressed signal comprises a bit stream of a first plurality of ATM cells, and the second uncompressed signal comprises a bit stream of a second plurality of ATM cells. The ATM cells are extracted from the incoming bit streams.
  • As indicated by blocks 96 and 100, the method optionally comprises performing error checking based on the first uncompressed signal, and performing error checking based on the second uncompressed signal. An inverse FEC block code algorithm is used for error checking and recovery. If an error is detected, the block code may provide correction depending on which block code is used and the type and number of errors.
  • As indicated by block 102, the method comprises extracting each first subframe and its corresponding first sequence number from the first plurality of ATM cells. As indicated by block 104, the method comprises extracting each second subframe and its corresponding second sequence number from the second plurality of ATM cells. In these acts, the data payload is extracted from the AAL-1 or AAL-5 encapsulation.
  • As indicated by block 106, the method comprises reconstructing SMPTE video data 108, such as SMPTE 259M video data. Each frame of the SMPTE video data is reconstructed based on a first corresponding subframe represented within the first uncompressed signal and a second corresponding subframe represented within the second uncompressed signal. Further, the EAV and SAV timing signals are added to reconstructed frames. The reconstructed frames are ordered based on each first sequence number and each second sequence number.
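  • A hedged sketch of this reconstruction step: the two received subframes are re-interleaved into a full set of active-video lines, after which the EAV and SAV timing words would be re-inserted. The function name and the odd-first line ordering are illustrative assumptions.

```python
from typing import List

Line = bytes


def reconstruct_frame(odd_subframe: List[Line], even_subframe: List[Line]) -> List[Line]:
    """Re-interleave the two subframes into a full set of active-video lines
    (line 1 odd, line 2 even, and so on); EAV/SAV timing words would then be
    re-inserted around each line to rebuild the SMPTE 259M frame."""
    frame: List[Line] = []
    for odd_line, even_line in zip(odd_subframe, even_subframe):
        frame.extend((odd_line, even_line))
    frame.extend(odd_subframe[len(even_subframe):])   # tolerate an unequal line count
    return frame
```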
  • One approach to ordering the frames comprises using a buffer management process to synchronize the arriving data based on the sequence numbers. A modified leaky bucket (LB) algorithm or similar technique can be used to synchronize the two fields. Optimization can be performed by varying the limit parameter based on the LB counter and the last compliance time. The arrival time is based on the arrival of the sequence number. This allows for a fast implementation in silicon, using the sequence number to direct data to the appropriate buffers.
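  • One plausible reading of the modified leaky bucket is sketched below as a generic continuous-state leaky bucket whose increment and limit parameters pace arriving subframes; how its state would be keyed to sequence numbers and per-field buffers is an assumption beyond what the text states. In use, one bucket per field stream could gate arrivals, with conforming subframes directed by their sequence numbers to the appropriate FIFO.

```python
class LeakyBucket:
    """Continuous-state leaky bucket, sketched as one way to pace arriving
    subframes; the limit parameter could be varied as the text suggests."""

    def __init__(self, increment: float, limit: float):
        self.increment = increment            # nominal inter-arrival time (e.g. 1/30 s)
        self.limit = limit                    # tolerated jitter / burst budget
        self.counter = 0.0                    # the LB counter
        self.last_compliance_time = None      # time of the last conforming arrival

    def conforming(self, now: float) -> bool:
        """Return True if a subframe arriving at time `now` is conforming."""
        if self.last_compliance_time is None:
            self.last_compliance_time = now
            self.counter = self.increment
            return True
        leaked = max(0.0, self.counter - (now - self.last_compliance_time))
        if leaked > self.limit:
            return False                      # arrived too early: hold it in a buffer
        self.counter = leaked + self.increment
        self.last_compliance_time = now
        return True
```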
  • It is noted that some acts described with reference to FIGS. 2 and 3 need not be performed in the order shown in FIGS. 2 and 3. Further, some of the acts may be performed concurrently. For example, the act of transporting the first signal via the first OC-3 channel typically is performed concurrently with the act of transporting the second signal via the second OC-3.
  • Referring back to FIG. 1, the transmitter end 20 comprises a video processor 110 which performs the method described with reference to FIG. 2. FIG. 6 is a schematic block diagram of an embodiment of the video processor 110 at the transmitter end 20. Each video frame 120 within SMPTE 259M video data 122 is separated into two active video subframes. A temporary buffer 124 stores one of the two active video subframes. A temporary buffer 126 stores the other of the two active video subframes. The temporary buffers 124 and 126 may have equal sizes. Ancillary data 130 within the SMPTE 259M video data 122 is appended to outputs of the temporary buffers 124 and 126. The resulting streams are applied to first-in-first-out (FIFOs) 132 and 134. A sequence number is added to the FIFO stream 132 by tagging logic 136. A sequence number is added to the FIFO stream 134 by tagging logic 140. An AAL 142 applies FEC to the output of the tagging logic 136. An AAL 144 applies FEC to the output of the tagging logic 140. A physical layer 146 couples the AAL 142 to the OC-3 channel 82 in FIG. 1. A physical layer 150 couples the AAL 144 to the OC-3 channel 86 in FIG. 1. The aforementioned components of the video processor 110 are directed by system control logic 152.
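  • Tying the earlier sketches together, a transmit-side flow roughly matching this block diagram might look as follows. It assumes the hypothetical helpers defined above (split_frame, make_sequence_tag, aal5_like_encapsulate), and the segmentation and channel handoff details are simplified assumptions rather than the patent's design.

```python
def transmit_frame(active_lines, ancillary, frame_number, send_oc3_a, send_oc3_b):
    """Rough transmit-side flow: split the frame, tag each subframe with a
    sequence number, encapsulate into cell payloads, and hand each stream to
    its OC-3 channel (send_oc3_a / send_oc3_b are placeholder callables)."""
    even_subframe, odd_subframe = split_frame(active_lines)
    for field, subframe, send in ((1, even_subframe, send_oc3_a),
                                  (2, odd_subframe, send_oc3_b)):
        data = make_sequence_tag(frame_number, field) + ancillary + b"".join(subframe)
        # An AAL-5 CPCS PDU carries at most 65,535 bytes, so a large subframe
        # is segmented into several PDUs before cell slicing.
        for start in range(0, len(data), 65535):
            for cell_payload in aal5_like_encapsulate(data[start:start + 65535]):
                send(cell_payload)
```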
  • Referring back to FIG. 1, the receiver end 22 comprises a video processor 158 which performs the method described with reference to FIG. 3. FIG. 7 is a schematic block diagram of an embodiment of the video processor 158 at the receiver end 22. A physical layer 160 couples the OC-3 channel 82 in FIG. 1 to an AAL 162. A physical layer 164 couples the OC-3 channel 86 in FIG. 1 to an AAL 166. The physical layers 160 and 164 extract ATM cells from an incoming bit stream. The AALs 162 and 166 perform an inverse FEC block code algorithm for error checking and/or correcting, and extract the data payload from AAL-1/5 encapsulation.
  • Tagging logic 170 is responsive to the AAL 162 to order each subframe based on its sequence number, and to remove the sequence number. Tagging logic 172 is responsive to the AAL 166 to order each subframe based on its sequence number, and to remove the sequence number. The resulting synchronized buffers are indicated by FIFOs 174 and 176. Ancillary data 180 is extracted from each subframe. Temporary buffers 182 and 184 store the two active video portions which, when combined with EAV and SAV signals, form a video frame 186. The video frame 186 is in accordance with an SMPTE standard such as SMPTE 259M. The aforementioned components of the video processor 158 are directed by system control logic 190. The system control logic 190, among other things, directs synchronization of data from the two separate fields.
  • FIG. 8 is a block diagram of an embodiment of a system to provide timing information in addition to the sequence number. The timing information is based upon a first clock 200 and a second clock 202. Preferably, the first clock 200 has a frequency of 90 kHz, and the second clock 202 has a frequency of 27 MHz.
  • A first counter 204 is responsive to the first clock 200. A second counter 206 is responsive to the second clock 202. Preferably, the first counter 204 is a 23-bit counter and the second counter 206 is a 9-bit counter. The timing information has an upper portion 210 comprising bits from the second counter 206, and a lower portion 212 comprising bits from the first counter 204. The timing information is encapsulated as described above for the bit stream. Even with these additional 32 bits, the overall bandwidth remains within the bandwidth limit of the two OC-3 links.
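  • A small sketch of packing the two counters into a single 32-bit timing word, with the 9-bit 27 MHz counter as the upper portion and the 23-bit 90 kHz counter as the lower portion, as the figure description indicates; any bit layout beyond that split is an assumption.

```python
def pack_timing(counter_90khz: int, counter_27mhz: int) -> int:
    """Pack the 23-bit 90 kHz counter (lower portion) and the 9-bit 27 MHz
    counter (upper portion) into a single 32-bit timing word."""
    return ((counter_27mhz & 0x1FF) << 23) | (counter_90khz & 0x7FFFFF)


def unpack_timing(word: int):
    """Recover (90 kHz counter, 27 MHz counter) from the 32-bit timing word."""
    return word & 0x7FFFFF, (word >> 23) & 0x1FF
```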
  • FIG. 9 is a block diagram of an embodiment of a system to reconstruct the timing information at the receiver end. A clock recovery module 214 outputs a first clock signal based on the lower portion 212 of the received timing information, and a second clock signal based on the upper portion 210. The clock recovery module 214 may be embodied using a phase-locked loop circuit. Preferably, the first clock signal has a frequency of 90 kHz and the second clock signal has a frequency of 27 MHz.
  • The clock signals can be useful in reducing jitter and synchronizing data. The use of field/frame counters allows better decisions to be made when reconstructing frames at the receiver. If link errors occur, the receiver can perform a first check on field number and decide what to do based thereupon. For example, the receiver may decide to use a previous frame and wait for the next consecutive frames to resynchronize.
  • FIG. 10 is a block diagram depicting a packing method for transmitting 10-bit word information using 8-bit bytes. Four consecutive pixel samples 220, 222, 224 and 226 are packed into five consecutive bytes 230, 232, 234, 236 and 238.
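  • One possible packing consistent with this description is sketched below: four 10-bit samples (40 bits) are written most-significant-bit first into five bytes. The exact bit ordering defined by the figure is not reproduced here, so this particular layout is an assumption.

```python
def pack_10bit_samples(samples) -> bytes:
    """Pack four consecutive 10-bit pixel samples (40 bits) into five bytes,
    most significant bits first."""
    s0, s1, s2, s3 = samples
    word = 0
    for s in (s0, s1, s2, s3):
        word = (word << 10) | (s & 0x3FF)
    return word.to_bytes(5, "big")


def unpack_10bit_samples(five_bytes: bytes):
    """Recover the four 10-bit samples from five packed bytes."""
    word = int.from_bytes(five_bytes, "big")
    return tuple((word >> shift) & 0x3FF for shift in (30, 20, 10, 0))
```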
  • Several embodiments including preferred embodiments of a method and system to transport high-quality video signals have been described herein.
  • The herein-described methods and systems allow high-bandwidth, real-time video signals to be transmitted over existing ATM infrastructure. Use of two OC-3 links rather than one OC-12 connection translates into a significant bandwidth savings.
  • It will be apparent to those skilled in the art that the disclosed invention may be modified in numerous ways and may assume many embodiments other than the preferred form specifically set out and described above.
  • Accordingly, it is intended by the appended claims to cover all modifications of the invention which fall within the true spirit and scope of the invention.

Claims (15)

1. A method comprising:
separating a frame of Society of Motion Picture and Television Engineers (SMPTE) video data into a first subframe and a second subframe, wherein the first subframe comprises even lines of active video and the second subframe comprises odd lines of active video;
transporting a first signal that has been formed based on the first subframe via a first Optical Carrier 3 (OC-3) channel; and
transporting a second signal that has been formed based on the second subframe via a second OC-3 channel.
2. The method of claim 1, further comprising:
forming the first signal based on the first subframe; and
forming the second signal based on the second subframe.
3. The method of claim 2, wherein forming the first signal comprises:
appending a first sequence number to the first subframe; and
encapsulating the first subframe and the first sequence number into at least one asynchronous transfer mode (ATM) cell.
4. The method of claim 3, wherein forming the second signal comprises:
appending a second sequence number to the second subframe; and
encapsulating the second subframe and the second sequence number into at least one ATM cell.
5. The method of claim 1, wherein the first subframe comprises active video data and ancillary data, and the second subframe comprises active video data and ancillary data.
6. The method of claim 1, wherein the SMPTE video data is SMPTE 259M video data.
7. A computer-readable storage medium comprising a set of instructions to direct a processor to perform the acts of:
separating a frame of Society of Motion Picture and Television Engineers (SMPTE) video data into a first subframe and a second subframe, wherein the first subframe comprises even lines of active video and the second subframe comprises odd lines of active video;
transporting a first signal that has been formed based on the first subframe via a first Optical Carrier 3 (OC-3) channel; and
transporting a second signal that has been formed based on the second subframe via a second OC-3 channel.
8. The computer-readable storage medium of claim 7, further comprising a set of instructions to direct a processor to perform acts of:
forming the first signal based on the first subframe; and
forming the second signal based on the second subframe.
9. The computer-readable storage medium of claim 8, wherein forming the first signal based on the first subframe comprises:
appending a first sequence number to the first subframe; and
encapsulating the first subframe and the first sequence number into at least one asynchronous transfer mode (ATM) cell.
10. The computer-readable storage medium of claim 8, wherein forming the second signal comprises:
appending a second sequence number to the second subframe; and
encapsulating the second subframe and the second sequence number into at least one ATM cell.
11. A video processor operative to separate Society of Motion Picture and Television Engineers (SMPTE) video data into first data and second data, the video processor comprising:
a first ATM adaptation layer (AAL) operative to form a first signal based on the first data and to couple the first signal to a first Optical Carrier 3 (OC-3) channel, and
a second AAL operative to form a second signal based on the second data and to couple the second signal to a second OC-3 channel.
12. The video processor of claim 11, wherein the first AAL is further operative to append a first sequence number to at least a portion of the first data and to encapsulate the first sequence number and the at least a portion of the first data into at least one asynchronous transfer mode (ATM) cell.
13. The video processor of claim 11, wherein the second AAL is further operative to append a second sequence number to at least a portion of the second data and to encapsulate the second sequence number and at least a portion of the second data into at least one ATM cell.
14. The video processor of claim 11, wherein the first and second data are first and second subframes.
15. The video processor of claim 11, wherein the first data comprises even lines of active video and the second data comprises odd lines of active video.
US12/264,586 2001-09-18 2008-11-04 Method and System to Transport High-Quality Video Signals Abandoned US20090116494A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/264,586 US20090116494A1 (en) 2001-09-18 2008-11-04 Method and System to Transport High-Quality Video Signals

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US09/956,475 US7110412B2 (en) 2001-09-18 2001-09-18 Method and system to transport high-quality video signals
US11/499,356 US7447216B2 (en) 2001-09-18 2006-08-04 Method and system to transport high-quality video signals
US12/264,586 US20090116494A1 (en) 2001-09-18 2008-11-04 Method and System to Transport High-Quality Video Signals

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/499,356 Continuation US7447216B2 (en) 2001-09-18 2006-08-04 Method and system to transport high-quality video signals

Publications (1)

Publication Number Publication Date
US20090116494A1 true US20090116494A1 (en) 2009-05-07

Family

ID=25498278

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/956,475 Expired - Fee Related US7110412B2 (en) 2001-09-18 2001-09-18 Method and system to transport high-quality video signals
US11/499,356 Expired - Fee Related US7447216B2 (en) 2001-09-18 2006-08-04 Method and system to transport high-quality video signals
US12/264,586 Abandoned US20090116494A1 (en) 2001-09-18 2008-11-04 Method and System to Transport High-Quality Video Signals

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US09/956,475 Expired - Fee Related US7110412B2 (en) 2001-09-18 2001-09-18 Method and system to transport high-quality video signals
US11/499,356 Expired - Fee Related US7447216B2 (en) 2001-09-18 2006-08-04 Method and system to transport high-quality video signals

Country Status (1)

Country Link
US (3) US7110412B2 (en)

Families Citing this family (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6307487B1 (en) 1998-09-23 2001-10-23 Digital Fountain, Inc. Information additive code generator and decoder for communication systems
US7068729B2 (en) 2001-12-21 2006-06-27 Digital Fountain, Inc. Multi-stage code generator and decoder for communication systems
US7110412B2 (en) * 2001-09-18 2006-09-19 Sbc Technology Resources, Inc. Method and system to transport high-quality video signals
AU2003211057A1 (en) * 2002-02-15 2003-09-09 Digital Fountain, Inc. System and method for reliably communicating the content of a live data stream
US9240810B2 (en) 2002-06-11 2016-01-19 Digital Fountain, Inc. Systems and processes for decoding chain reaction codes through inactivation
EP2348640B1 (en) 2002-10-05 2020-07-15 QUALCOMM Incorporated Systematic encoding of chain reaction codes
US7139960B2 (en) 2003-10-06 2006-11-21 Digital Fountain, Inc. Error-correcting multi-stage code generator and decoder for communication systems having single transmitters or multiple transmitters
US7310807B2 (en) 2003-10-29 2007-12-18 Sbc Knowledge Ventures, L.P. System and method for local video distribution
US20050149988A1 (en) * 2004-01-06 2005-07-07 Sbc Knowledge Ventures, L.P. Delivering interactive television components in real time for live broadcast events
WO2005112250A2 (en) 2004-05-07 2005-11-24 Digital Fountain, Inc. File download and streaming system
US8904458B2 (en) 2004-07-29 2014-12-02 At&T Intellectual Property I, L.P. System and method for pre-caching a first portion of a video file on a set-top box
US8584257B2 (en) * 2004-08-10 2013-11-12 At&T Intellectual Property I, L.P. Method and interface for video content acquisition security on a set-top box
US20060037043A1 (en) * 2004-08-10 2006-02-16 Sbc Knowledge Ventures, L.P. Method and interface for managing movies on a set-top box
US20060048178A1 (en) * 2004-08-26 2006-03-02 Sbc Knowledge Ventures, L.P. Interface for controlling service actions at a set top box from a remote control
US8086261B2 (en) 2004-10-07 2011-12-27 At&T Intellectual Property I, L.P. System and method for providing digital network access and digital broadcast services using combined channels on a single physical medium to the customer premises
KR100644095B1 (en) * 2004-10-13 2006-11-10 박우현 Method of realizing interactive advertisement under digital broadcasting environment by extending program associated data-broadcasting to internet area
US20060174279A1 (en) * 2004-11-19 2006-08-03 Sbc Knowledge Ventures, L.P. System and method for managing television tuners
US8434116B2 (en) 2004-12-01 2013-04-30 At&T Intellectual Property I, L.P. Device, system, and method for managing television tuners
US7716714B2 (en) * 2004-12-01 2010-05-11 At&T Intellectual Property I, L.P. System and method for recording television content at a set top box
US7474359B2 (en) 2004-12-06 2009-01-06 At&T Intellectual Properties I, L.P. System and method of displaying a video stream
US20060156372A1 (en) * 2005-01-12 2006-07-13 Sbc Knowledge Ventures, L.P. System, method and interface for managing content at a set top box
US7436346B2 (en) * 2005-01-20 2008-10-14 At&T Intellectual Property I, L.P. System, method and interface for controlling multiple electronic devices of a home entertainment system via a single control device
US20060168610A1 (en) * 2005-01-26 2006-07-27 Sbc Knowledge Ventures, L.P. System and method of managing content
US20060174309A1 (en) * 2005-01-28 2006-08-03 Sbc Knowledge Ventures, L.P. System and method of managing set top box memory
US7307574B2 (en) * 2005-02-02 2007-12-11 Sbc Knowledge Ventures, Lp Remote control, apparatus, system and methods of using the same
US20060179466A1 (en) * 2005-02-04 2006-08-10 Sbc Knowledge Ventures, L.P. System and method of providing email service via a set top box
US8214859B2 (en) * 2005-02-14 2012-07-03 At&T Intellectual Property I, L.P. Automatic switching between high definition and standard definition IP television signals
US20060184991A1 (en) * 2005-02-14 2006-08-17 Sbc Knowledge Ventures, Lp System and method of providing television content
US20060218590A1 (en) * 2005-03-10 2006-09-28 Sbc Knowledge Ventures, L.P. System and method for displaying an electronic program guide
US20060230421A1 (en) * 2005-03-30 2006-10-12 Sbc Knowledge Ventures, Lp Method of using an entertainment system and an apparatus and handset for use with the entertainment system
US7623539B2 (en) * 2005-03-31 2009-11-24 Agere Systems Inc. Apparatus and method for processing cells in an ATM adaptation layer device in a communications system that exhibits cell delay variation
US20060236343A1 (en) * 2005-04-14 2006-10-19 Sbc Knowledge Ventures, Lp System and method of locating and providing video content via an IPTV network
US20060245358A1 (en) * 2005-04-29 2006-11-02 Beverly Harlan T Acceleration of data packet transmission
US8054849B2 (en) 2005-05-27 2011-11-08 At&T Intellectual Property I, L.P. System and method of managing video content streams
US20060282785A1 (en) * 2005-06-09 2006-12-14 Sbc Knowledge Ventures, L.P. System and method of displaying content in display windows
US7908627B2 (en) * 2005-06-22 2011-03-15 At&T Intellectual Property I, L.P. System and method to provide a unified video signal for diverse receiving platforms
US8893199B2 (en) * 2005-06-22 2014-11-18 At&T Intellectual Property I, L.P. System and method of managing video content delivery
US20070011133A1 (en) * 2005-06-22 2007-01-11 Sbc Knowledge Ventures, L.P. Voice search engine generating sub-topics based on recognitiion confidence
US8282476B2 (en) * 2005-06-24 2012-10-09 At&T Intellectual Property I, L.P. Multimedia-based video game distribution
US8635659B2 (en) 2005-06-24 2014-01-21 At&T Intellectual Property I, L.P. Audio receiver modular card and method thereof
US8365218B2 (en) 2005-06-24 2013-01-29 At&T Intellectual Property I, L.P. Networked television and method thereof
US20060294568A1 (en) * 2005-06-24 2006-12-28 Sbc Knowledge Ventures, L.P. Video game console modular card and method thereof
US8190688B2 (en) * 2005-07-11 2012-05-29 At&T Intellectual Property I, Lp System and method of transmitting photographs from a set top box
US7873102B2 (en) * 2005-07-27 2011-01-18 At&T Intellectual Property I, Lp Video quality testing by encoding aggregated clips
US7831887B2 (en) * 2005-12-15 2010-11-09 General Instrument Corporation Method and apparatus for using long forward error correcting codes in a content distribution system
US9136983B2 (en) 2006-02-13 2015-09-15 Digital Fountain, Inc. Streaming and buffering using variable FEC overhead and protection periods
US9270414B2 (en) 2006-02-21 2016-02-23 Digital Fountain, Inc. Multiple-field based code generator and decoder for communications systems
WO2007134196A2 (en) 2006-05-10 2007-11-22 Digital Fountain, Inc. Code generator and decoder using hybrid codes
US9178535B2 (en) 2006-06-09 2015-11-03 Digital Fountain, Inc. Dynamic stream interleaving and sub-stream based delivery
US9432433B2 (en) 2006-06-09 2016-08-30 Qualcomm Incorporated Enhanced block-request streaming system using signaling or block creation
US9380096B2 (en) 2006-06-09 2016-06-28 Qualcomm Incorporated Enhanced block-request streaming system for handling low-latency streaming
US9419749B2 (en) 2009-08-19 2016-08-16 Qualcomm Incorporated Methods and apparatus employing FEC codes with permanent inactivation of symbols for encoding and decoding processes
US9209934B2 (en) 2006-06-09 2015-12-08 Qualcomm Incorporated Enhanced block-request streaming using cooperative parallel HTTP and forward error correction
US9386064B2 (en) 2006-06-09 2016-07-05 Qualcomm Incorporated Enhanced block-request streaming using URL templates and construction rules
CN101802797B (en) 2007-09-12 2013-07-17 数字方敦股份有限公司 Generating and communicating source identification information to enable reliable communications
US9281847B2 (en) 2009-02-27 2016-03-08 Qualcomm Incorporated Mobile reception of digital video broadcasting—terrestrial services
US9288010B2 (en) 2009-08-19 2016-03-15 Qualcomm Incorporated Universal file delivery methods for providing unequal error protection and bundled file delivery services
US9917874B2 (en) 2009-09-22 2018-03-13 Qualcomm Incorporated Enhanced block-request streaming using block partitioning or request controls for improved client-side handling
US9225961B2 (en) 2010-05-13 2015-12-29 Qualcomm Incorporated Frame packing for asymmetric stereo video
US9596447B2 (en) 2010-07-21 2017-03-14 Qualcomm Incorporated Providing frame packing type information for video coding
US9456015B2 (en) 2010-08-10 2016-09-27 Qualcomm Incorporated Representation groups for network streaming of coded multimedia data
US8958375B2 (en) 2011-02-11 2015-02-17 Qualcomm Incorporated Framing for an improved radio link protocol including FEC
US9270299B2 (en) 2011-02-11 2016-02-23 Qualcomm Incorporated Encoding and decoding using elastic codes with flexible source block mapping
US9253233B2 (en) 2011-08-31 2016-02-02 Qualcomm Incorporated Switch signaling methods providing improved switching between representations for adaptive HTTP streaming
US9843844B2 (en) 2011-10-05 2017-12-12 Qualcomm Incorporated Network streaming of media data
US9294226B2 (en) 2012-03-26 2016-03-22 Qualcomm Incorporated Universal object delivery and template-based file delivery

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08191416A (en) 1995-01-10 1996-07-23 Sony Corp Digital video/audio processor
US6078958A (en) 1997-01-31 2000-06-20 Hughes Electronics Corporation System for allocating available bandwidth of a concentrated media output

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US554254A (en) * 1896-02-11 Tuning-key
US5373327A (en) * 1993-02-25 1994-12-13 Hewlett-Packard Company Detection, correction and display of illegal color information in a digital video signal
US5541556A (en) * 1993-12-23 1996-07-30 Gennum Corporation Clock recovery circuit for serial digital video
US5896181A (en) * 1994-03-30 1999-04-20 Sony Corporation Video signal processing apparatus with matrix switching capability
US5929921A (en) * 1995-03-16 1999-07-27 Matsushita Electric Industrial Co., Ltd. Video and audio signal multiplex sending apparatus, receiving apparatus and transmitting apparatus
US6226443B1 (en) * 1995-07-07 2001-05-01 Matsushita Electric Industrial Co., Ltd. Recording and reproducing apparatus
US6055247A (en) * 1995-07-13 2000-04-25 Sony Corporation Data transmission method, data transmission apparatus and data transmission system
US5721738A (en) * 1995-08-28 1998-02-24 Sony Corporation Data transmission method, system, and apparatus
US6026434A (en) * 1995-10-11 2000-02-15 Sony Corporation Data transmission processing system
US5742610A (en) * 1996-02-06 1998-04-21 Motorola, Inc. Method and apparatus for use in a data communications network serving subscribers operating at a plurality of transmission data rates
US6157674A (en) * 1996-03-21 2000-12-05 Sony Corporation Audio and video data transmitting apparatus, system, and method thereof
US6028934A (en) * 1996-04-05 2000-02-22 Nec Corporation TD multiplexing digital video signals with scramble of scrambling stages more than one greater in number than the video signals
US6137944A (en) * 1996-12-26 2000-10-24 Matsushita Electric Industrial Co., Ltd. Image recording/reproducing apparatus
US6163284A (en) * 1998-03-31 2000-12-19 Sony Corporation Signal processing circuit and signal processing method
US6785289B1 (en) * 1998-06-05 2004-08-31 Sarnoff Corporation Method and apparatus for aligning sub-stream splice points in an information stream
US6414960B1 (en) * 1998-12-29 2002-07-02 International Business Machines Corp. Apparatus and method of in-service audio/video synchronization testing
US20020180869A1 (en) * 2000-09-02 2002-12-05 Magic Lantern, Llc, A Limited Liability Company Of The State Of Kansas Laser projection system
US20020163936A1 (en) * 2001-05-01 2002-11-07 Adc Dsl Systems, Inc. Bit-level control for dynamic bandwidth allocation
US20040246977A1 (en) * 2001-06-04 2004-12-09 Jason Dove Backplane bus
US7110412B2 (en) * 2001-09-18 2006-09-19 Sbc Technology Resources, Inc. Method and system to transport high-quality video signals
US7141732B2 (en) * 2002-12-06 2006-11-28 Alpine Electronics, Inc. Storing apparatus and storing method for music data

Also Published As

Publication number Publication date
US20060268892A1 (en) 2006-11-30
US7447216B2 (en) 2008-11-04
US20030056223A1 (en) 2003-03-20
US7110412B2 (en) 2006-09-19

Similar Documents

Publication Publication Date Title
US7447216B2 (en) Method and system to transport high-quality video signals
US5790543A (en) Apparatus and method for correcting jitter in data packets
US6807191B2 (en) Decoder for compressed and multiplexed video and audio data
US5887187A (en) Single chip network adapter apparatus
US6026088A (en) Network architecture
US5966387A (en) Apparatus and method for correcting jitter in data packets
CN1182722C (en) Video signal reducing device and method
US6456782B1 (en) Data processing device and method for the same
US6744782B1 (en) Communications device, method thereof, communications system and recording medium
CN101340590B (en) Multiplex apparatus and multiplex method
JP3045715B2 (en) Transmission system, transmitting device, recording/reproducing device, and recording device
US7466967B2 (en) Communication system
US5606558A (en) Method of and devices for transmitting in ATM cells information supplied in the form of a series of distinct entities for a given application
JP3927443B2 (en) Moving picture transmission/reception system and moving picture transmission/reception method
US6339597B1 (en) AAL5 jitter reduction method and apparatus
US6266384B1 (en) Method and apparatus for time base recovery and processing
US7415528B2 (en) Apparatus and method for transmitting hierarchically multimedia data TS to prevent jitter of timing information and for recovering the multimedia data TS
US6636531B1 (en) Communication device and method
JPH08275151A (en) Distribution decoder for multiplexed compressed image-audio data
JP2005065120A (en) Data transmitting apparatus, data receiving apparatus and data transmission system
KR100579132B1 (en) Apparatus and Method of the hierarchical transmission of a multimedia data TS for preventing the jitter of timing information, and Apparatus and Method of recovering a multimedia data TS transmitted hierarchically
US7130265B1 (en) Data multiplexing device and data multiplexing method, and data transmitter
CN100421449C (en) Control method for clock synchronization preservation in network digital TV system
US20050083861A1 (en) Method and arrangement for converting a first data stream into a second data stream
US20070195822A1 (en) Method and system for locating packet boundaries

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION