USRE39955E1 - Multiple encoder output buffer apparatus for differential coding of video information - Google Patents


Info

Publication number
USRE39955E1
Authority
US
United States
Prior art keywords
data
network
buffer
output buffers
time information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US08/881,965
Inventor
Andrew J. Kuzma
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intel Corp
Original Assignee
Intel Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corp filed Critical Intel Corp
Priority to US08/881,965 priority Critical patent/USRE39955E1/en
Assigned to INTEL CORPORATION reassignment INTEL CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KUZMA, ANDREW J.
Application granted granted Critical
Publication of USRE39955E1 publication Critical patent/USRE39955E1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/146Data rate or code amount at the encoder output
    • H04N19/152Data rate or code amount at the encoder output by measuring the fullness of the transmission buffer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/60Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding
    • H04N19/61Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/434Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding

Definitions

  • The coded sequence is propagated to the output buffer 27, which is used to maintain a constant bit rate for the output to the network.
  • As was described, prior art methods used the fullness of output buffer 27 to regulate the degree of quantization applied by the compressing and encoding circuitry, because constant bandwidth availability was assumed.
  • The transform coder block 26 also outputs the compressed image data to generate a new transmit reference image for storage in the transmit reference image buffer.
  • The encoding logic provides the compressed image data to a decoding block 28, which has an inverse quantizer and inverse discrete cosine transform decoder. These can be used to combine the decoded image data with the previously stored transmit reference image to yield a new transmit reference image corresponding to the image most recently propagated on the network. It is this image data that would be used in calculating the changes in the image when sending the next frame of information.
  • The transmit reference image, which is the same image that will be reconstructed at the other end by the video decoder, is used as the basis of subsequent encoding, including motion vectors and motion compensated image compression.
  • The prior art feedback mechanism using the output buffer assumed a constant bit rate would be available for the transmission of information. This assumption no longer holds for video conferencing devices on bursty networks such as CSMA LANs.
  • The solution proposed by the present invention is to provide feedback between the video CODEC and the communications channel such that the characteristics of the channel are used to drive multiple video output buffers. These buffers share an original temporal video reference but will have different subsequent temporal video images.
  • The communications channel interface picks the subsequent video image buffer that best matches the current condition.
  • The video algorithm can be tuned to provide video output buffers with the best guess of how the buffers should be configured.
  • The remaining buffers can be flushed to be refilled again based on a newly calculated transmit reference image.
  • The final action is to revert to an exception handler similar to current video CODECs, i.e., inserting a key frame to restart the encoding of the video data transmission.
  • FIG. 3 illustrates conceptually the logical multiple output buffers of the present invention.
  • As the video camera 20 records an image, it is encoded by the encoding circuitry described above, and the encoded information is propagated to the output buffer 27.
  • The network may not be able to receive this newly calculated image data. Accordingly, the camera continues to detect images, the data is encoded, and the newly calculated data is stored in subsequent output buffers such as 41, 42 or 43.
  • The data stored in buffer 41 may correspond to the difference between the transmit reference image and the image 1/15th of a second later than the data stored in buffer 27.
  • Output buffers 42 and 43 may store data corresponding to the temporal change between the transmit reference image and the image before the camera at successively later times.
  • The video encoder and camera circuitry described may be incorporated as part of a station on the network and are responsive to information received over the communications channel.
  • The output buffer with the most current image may be signaled to transmit its information to the receiving node.
  • The channel information is then used to calculate the next transmit reference image for storage.
  • The output buffers are then flushed and are again loaded in a time sequential manner until the data is again ready to be sent over the network. While four (4) output buffers are illustrated, this is purely for illustrative purposes, in that as many buffers may be implemented as computing power and resources provide.
  • FIG. 5 illustrates another conceptualization of the present invention.
  • The encoder, through feedback from the data communications channel, creates several logical output buffers corresponding to behavioral predictions based on the feedback from the communications channel.
  • Logical output buffer 1 could represent the case where more bandwidth will be dynamically allocated to this natural data compression over the next unit of time.
  • The unit of time could be an image frame or, for example, a frame of sampled audio.
  • The various predictions of the bandwidth available to the compression algorithm are shown below in Table I.
  • Logical output buffer 1 might represent the data from an image taken 1/15th of a second later than transmit reference 0, while logical output buffer 2 might represent the differential coding from an image half a second later than transmit reference 0.
  • FIG. 5 makes it clear that logical buffers may be created in a common block of memory and that the number of such buffers is limited only by the available computational power to simultaneously encode them and the memory sufficient to handle them.
  • FIG. 6 is used to illustrate that the present invention is not necessarily limited to video encoding, and illustrates a frame of audio information. For example, in the G.728 standard each frame of data is 5 milliseconds long. The frame may be stored as a transmit reference, and subsequent transmissions may follow the differential coding principles wherein only the changed information is sent to the receiving node.
  • The audio encoder may be responsive to feedback from the network and maintain a plurality of logical output buffers such as those described in the video application.
  • One possible application for such an implementation would be in wireless telephony, wherein portions of an audio transmission may be lost when a transmitting station goes through a tunnel.
  • The responding network indicates that its most recently received information is slightly stale and that a late-change logical output buffer should be used in providing the encoded differential information.
  • FIG. 7 shows information about a real-time object 100 that is to be conveyed from a transmitting node to a receiving node on some sort of network.
  • This real-time object 100 may be a video image or it may be a sound, depending on the particular implementation.
  • A capture mechanism 110 detects the real-time object and encodes it into electronic information.
  • The capture mechanism may be a camera for video information as described above, or a microphone or stereo microphones for audio information.
  • This information is then processed by differential encoder 115, which compares the newly captured real-time object to the previously recorded object stored in transmit reference buffer 120.
  • The differentially encoded data is then propagated to a logical output buffer 125, which operates as those described above.
  • The particular output buffer having the best information conveys it over the network, and that same information is used to calculate a new transmit reference to be stored in transmit reference buffer 120.

Abstract

Feedback is introduced between a video CODEC and the intended communications channel such that the characteristics of the channel are used to drive multiple video output buffers. These multiple output buffers share an original temporal video reference, but have different subsequent temporal video images. The communications channel interface then picks the subsequent video image buffer that best matches the current conditions experienced by it. By using a predictor of the channel performance, the video algorithm can be tuned to provide video output buffers with the best guess of how the buffers should be configured. A number of subsequent histories of an image are buffered until the receiving channel indicates it is ready to receive the next. Then the appropriate output buffer having the corresponding temporal change in the video is used to supply the next frame change information to the receiving station.

Description

This is a reissue of Ser. No. 08/159,665 filed Nov. 30, 1993, now U.S. Pat. No. 5,416,520.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to the transmission of information over a communications path. More particularly, the present invention relates to the communication of high-bandwidth information over networks of varying types.
2. Art Background
Until recently, telecommunications and computing were considered to be entirely separate disciplines. Telecommunications was analog and done in real time whereas computing was digital and performed at a rate determined by the processing speed of a computer. Today, such technologies as speech processing, electronic mail and facsimile have blurred these lines. In the coming years, computing and telecommunications will become almost indistinguishable in a race to support a broad range of new multimedia (i.e., voice, video and data) applications. These applications are made possible by emerging digital-processing technologies, which include: compressed audio (both high fidelity audio and speech), high resolution still images, and compressed video. The emerging technologies will allow for collaboration at a distance, including video conferencing.
Of these technologies, video is particularly exciting in terms of its potential applications. But video is also the most demanding in terms of processing power and sheer volume of data to be processed. Uncompressed digital video requires somewhere between 50 and 200 Mb/s (megabits per second) to support the real-time transmission of standard television quality images. This makes impractical the widespread use of uncompressed digital video in telecommunications applications.
Fortunately, there is considerable redundancy in video data, both in terms of information theory and human perception. This redundancy allows for the compression of digital video sequences into lower transmission rates. For some time, researchers have been aware of a variety of techniques that can be used to compress video data sequences anywhere from 2:1 to 1000:1, depending on the quality required by the application. Until recently, however, it was not practical to incorporate these techniques into low cost video-based applications.
A number of standards have been recently developed for such activities as video conferencing, the transmission and storage of standard high quality still images, as well as standards for interactive video playback to provide interoperability between numerous communications points. The standards recognize a need for quality video compression to reduce the tremendous amount of data required for the transmission of video information.
Two important methods of data compression for video information are used widely throughout the various standards for video communication. These are the concepts of frame differencing and motion compensation. Frame differencing recognizes that a normal video sequence has little variation from one frame to the next. If, instead of coding each frame, only the differences between a frame and the previous frame are coded, then the amount of information needed to describe the new frame will be dramatically reduced. Motion compensation recognizes that much of the difference that does occur between successive frames can be characterized as a simple translation of motion, caused either by the moving of objects in the scene or by a pan of the field of view. Rather than form a simple difference between blocks in a current frame and the same block in the previous frame, the area around those blocks can be searched in the previous frame to find an offset block that more closely matches the block of the current frame. Once a best match has been identified, the difference between a reference block in the current frame and the best match in the previous frame is coded to produce a vector that describes the offset of the best match. This motion vector then can be used with the previous frame to produce the equivalent of what the current frame should be. These methods, and others, are incorporated into systems which make possible the rapid transmission of real-time video information.
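The block search described above can be sketched as follows. This is an illustrative toy, not code from the patent: the block size, search range, and the sum-of-absolute-differences (SAD) cost function are common choices assumed for the example, and real coders use larger blocks and far more efficient search strategies.

```python
# Toy block-matching motion search: for a block of the current frame, find
# the offset block in the previous frame that matches it most closely, and
# return the motion vector plus the residual (the frame difference that
# remains after motion compensation). Frames are lists of row lists.

def sad(block_a, block_b):
    """Sum of absolute differences between two equal-sized blocks."""
    return sum(abs(a - b)
               for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def extract_block(frame, top, left, size):
    return [row[left:left + size] for row in frame[top:top + size]]

def motion_search(prev_frame, cur_frame, top, left, size, search_range):
    cur_block = extract_block(cur_frame, top, left, size)
    # Start from the zero-offset match (plain frame differencing).
    best = (0, 0, sad(cur_block, extract_block(prev_frame, top, left, size)))
    h, w = len(prev_frame), len(prev_frame[0])
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            t, l = top + dy, left + dx
            if 0 <= t <= h - size and 0 <= l <= w - size:
                cost = sad(cur_block, extract_block(prev_frame, t, l, size))
                if cost < best[2]:
                    best = (dy, dx, cost)
    dy, dx, _ = best
    ref = extract_block(prev_frame, top + dy, left + dx, size)
    residual = [[c - r for c, r in zip(cr, rr)]
                for cr, rr in zip(cur_block, ref)]
    return (dy, dx), residual
```

For a scene that is a pure pan, the search finds the translation and the residual is all zeros, so almost no difference information needs to be coded.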
As the worlds of telecommunications and computers blend closely together, the telecommunications aspects of communications will have to contend with some of the constraints of the computer world. In particular, video conferencing over existing computer networks will prove a challenge, in that maintaining real-time communication over traffic-burdened existing network protocols may prove insurmountable.
Current video algorithms assume a nearly constant bandwidth availability for the encoding of video information. This is evidenced by the use of only a single output buffer for traditional video encoder output. It is common to use the output buffer fullness as a feedback parameter for encoding subsequent images; i.e., with higher or lower levels of quantization. A well-known effect resulting from using a single output buffer is called “bit-bang”: the output buffer is over-depleted by the interface to the communications channel, causing the feedback loop to indicate that the buffer can handle a great deal of data, which in turn causes the video compression algorithm to under-optimize the subsequent image coding. The user perceives bit-bang as uneven quality and frame rate.
To alleviate bit-bang, the typical approach has been to limit the amount of data pulled from the encoder video output buffer to a fraction of the total size of the output buffer; 10% to 30% is typical. This approach keeps the feedback indicator rather small and the encoding more uniform. The underlying assumption of this approach is that the communications channel will usually not be changing rapidly. Exceptions are caused by connectivity interruptions, such as burst errors, which are handled strictly as exceptions to the call. In a local area network (LAN), or other collision-sensing multiple access channel, or in other networks with burst characteristics (such as noisy RF channels), this underlying assumption no longer holds. Over these sorts of communications channels, unanticipated transmission delays may result in bit-bang problems which are not so readily overcome by limiting the size of the feedback buffer. Thus, video jerkiness will result in real-time video communication over such channels. It would be advantageous, and is therefore an object of the present invention, to provide a video transmission mechanism which can be accommodated on such potentially bursty networks.
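The prior-art mitigation described above can be sketched as a small rate-control loop. This is an illustration under stated assumptions, not the patent's mechanism: the class name, buffer size, 20% drain cap, and the linear fullness-to-quantizer mapping are all invented for the example; only the two ideas it demonstrates (capping the per-tick drain at a fraction of the buffer, and driving the quantizer step from buffer fullness) come from the text.

```python
# Sketch of prior-art single-buffer rate control: the channel interface may
# drain at most a fixed fraction of the buffer per tick (damping bit-bang),
# and buffer fullness drives the quantizer step for the next frame.

class RateControlledBuffer:
    def __init__(self, capacity, drain_fraction=0.2):
        self.capacity = capacity
        # Cap within the typical 10%-30% range mentioned above.
        self.max_drain = int(capacity * drain_fraction)
        self.fill = 0

    def enqueue(self, bits):
        """Encoder deposits a coded frame (clamped at capacity)."""
        self.fill = min(self.capacity, self.fill + bits)

    def drain(self, channel_bits):
        """Channel pulls data, but never more than the cap, so one fast
        tick cannot empty the buffer and whipsaw the feedback loop."""
        pulled = min(channel_bits, self.max_drain, self.fill)
        self.fill -= pulled
        return pulled

    def quantizer_step(self):
        """Fuller buffer -> coarser quantization (fewer bits next frame)."""
        return 1 + (self.fill * 30) // self.capacity
```

Even if the channel offers to take far more data than is buffered, the drain is capped, so the fullness signal, and hence the quantizer step, moves gradually rather than collapsing to "buffer empty, send everything."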
SUMMARY OF THE INVENTION
From the foregoing, it can be appreciated that there is a need for a mechanism of incorporating real-time video data communication over traditional network protocols to smooth video transmission. It is therefore an object of the present invention to provide a method and apparatus for the conveyance of video data over such networks as local area networks.
These and other objects of the present invention are provided by introducing feedback between the video CODEC and the intended communications channel such that the characteristics of the channel are used to drive multiple video output buffers. These multiple output buffers share an original temporal video reference, but have different subsequent temporal video images. The communications channel interface then picks the subsequent video image buffer that best matches the current conditions experienced by it. By using a predictor of the channel performance, the video algorithm can be tuned to provide video output buffers with the best guess of how the buffers should be configured. A number of subsequent histories of an image are buffered until the receiving channel indicates it is ready to receive the next. Then the appropriate output buffer having the corresponding temporal change in the video is used to supply the next frame change information to the receiving station.
BRIEF DESCRIPTION OF THE DRAWINGS
The objects, features and advantages of the present invention will be apparent from the following detailed description in which:
FIG. 1 demonstrates a hypothetical network having a plurality of video-capable nodes for interacting and providing video conferencing capabilities.
FIG. 2 illustrates hardware to be utilized in implementing the present invention in one embodiment.
FIG. 3 illustrates a logical rendition of a plurality of output buffers with successive time interval video information for one embodiment of the present invention.
FIG. 4 illustrates a branching tree structure corresponding to successive temporal transmit reference images for one embodiment of the present invention.
FIG. 5 illustrates alternative logical output buffer uses for channel dependent data transmission over a network.
FIG. 6 illustrates characteristics of audio information which may be transmitted over a network in accordance with another embodiment of the present invention.
FIG. 7 illustrates a generalized block diagram of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
A method and apparatus are described for the conveyance of real-time isochronous data over bursty networks. Although the present invention is described predominantly in terms of the transmission of video information, the concepts and method are broad enough to encompass the transmission of real-time audio and other data requiring isochronous data transfer. Throughout this detailed description, numerous details are specified, such as bit rates and frame sizes, in order to provide a thorough understanding of the present invention. It will be understood by one skilled in the art, however, that the present invention may be practiced without such specific details. In other instances, well-known control structures and gate-level circuits have not been shown in detail in order not to obscure the present invention unnecessarily. In particular, some functions are described as being carried out by various logic circuits. Those of ordinary skill in the art, having had these functions described, will be able to implement the necessary logic circuits without undue experimentation.
FIG. 1 is used to illustrate a simple network having a plurality of video-capable nodes. The network is illustrated as a simple star network 10 having a centrally incorporated multi-point control unit (MCU). The network is presented as having five (5) nodes 11, 12, 13, 14 and 15. For the purposes of explanation, these will all be considered video-capable nodes, with nodes 12 and 13 supporting IRV video (160 pixels×120 lines) while nodes 11, 14 and 15 support HQV video (320 pixels×240 lines). The network illustrated in FIG. 1 is purely for illustrative purposes, and many more nodes, including non-video-capable nodes, may be incorporated on the same network as the illustrated nodes. Further, the present invention may be applied to any network configuration besides the star configuration of FIG. 1, such as token ring networks, branching tree networks, etc. The fundamental requirement for the network with these video-capable nodes is that the nodes be able to transmit data, including video data, from one point to another and receive acknowledgments from the receiving node.
FIG. 2 illustrates typical video encoding hardware to which the present invention may be applied. This can be used for preparing video data to be transmitted over a network of the type illustrated in FIG. 1 to provide real-time video conferencing. A video camera 20 receives the video image that is to be encoded and conveyed. Such cameras are common and work on a number of technologies such as charge coupled devices, etc. The video camera may directly include video CODEC 21 or it may be tightly coupled as illustrated in the figure. The video CODEC 21 receives the electronic image from the camera and digitizes the image when being used in its encoder capacity. Video CODECS are generally known and come in a number of varieties which may be used for encoding video data to be transmitted and decoding video data when received. In FIG. 2, the camera output is propagated to the capture buffer 22 of video CODEC 21.
From the capture buffer 22, the video information is processed by motion estimation circuitry 23. The motion estimation circuitry generates motion vectors which describe the difference between a portion of a video image and the previously recorded image in terms of a translational offset. The motion estimation circuitry compares the current frame with the previous frame stored in the transmit reference image buffer 30, about which more will be said further herein. The outputs of the motion estimation circuitry are the motion vectors and the motion compensated image 24. The motion compensated image 24 is then processed by the differential pulse code modulation (DPCM) circuitry 25, which generates digital information describing the changes relative to the previously stored transmit reference image. Finally, a final stage of coding is done at transform coding block 26, which also performs quantization and run-length encoding. Run-length encoding is a technique for compressing data sequences that have large numbers of zeros and is well known to those of ordinary skill in the art. This transform coder may perform a discrete cosine transform (DCT).
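The run-length encoding step mentioned above can be sketched as follows. This is an illustrative example of the general technique, not the patent's implementation; the (zero_run, value) pairing and the trailing-run sentinel are assumptions.

```python
def run_length_encode(coeffs):
    """Collapse a zero-dominated coefficient sequence into (zero_run, value)
    pairs, as is typically done after quantization in a transform coder."""
    pairs = []
    run = 0
    for c in coeffs:
        if c == 0:
            run += 1
        else:
            pairs.append((run, c))   # run zeros, then this nonzero value
            run = 0
    if run:
        pairs.append((run, None))    # trailing zeros, marked with a sentinel
    return pairs

# A mostly-zero block of quantized coefficients collapses to four pairs.
print(run_length_encode([5, 0, 0, 0, -2, 0, 1, 0, 0]))
# → [(0, 5), (3, -2), (1, 1), (2, None)]
```

Long zero runs, common after coarse quantization of DCT coefficients, collapse to a single pair each, which is where the compression comes from.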
From the transform coding block, the coded sequence is propagated to the output buffer 27 which is used to maintain a constant bit rate for the output to the network. As was described, prior art methods used the output buffer fullness to regulate the degree of quantization that would be applied to the compressing and encoding circuitry because a constant bandwidth availability was assumed.
The transform coding block 26 also outputs the compressed image data used to generate a new transmit reference image for storage in the transmit reference image buffer. The encoding logic provides the compressed image data to a decoding block 28 having an inverse quantizer and an inverse discrete cosine transform decoder. The decoded image data is combined with the previously stored transmit reference image to yield a new transmit reference image, which corresponds to the image most recently propagated on the network. It is this image data that is used in calculating the changes in the image when sending the next frame of information. In other words, the transmit reference image, which is the same image that will be reconstructed at the other end by the video decoder, is used as the basis of subsequent encoding, including motion vectors and motion compensated image compression.
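The reference-update loop described above, in which the encoder rebuilds the same transmit reference the decoder will reconstruct, can be sketched in simplified scalar form. The quantizer step size and the toy three-sample "frame" are assumptions for illustration only; the patent's actual pipeline uses motion compensation and a DCT.

```python
def encode_frame(frame, reference, step=8):
    """One differential coding step: quantize the difference against the
    transmit reference, then rebuild the reference exactly as a decoder
    would (quantization loss included), so both ends stay in sync."""
    quantized = [round((f - r) / step) for f, r in zip(frame, reference)]
    new_reference = [r + q * step for r, q in zip(reference, quantized)]
    return quantized, new_reference

# Only the quantized residual would be transmitted; the encoder keeps
# new_reference as the base for the next frame's differences.
code, new_ref = encode_frame([100, 130, 90], [96, 120, 96])
print(code, new_ref)
```

Because the encoder updates its reference from the quantized residual rather than the raw frame, quantization error cannot accumulate between encoder and decoder.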
As was described in the previous section, the prior art feedback mechanism using the output buffer assumed a constant bit rate would be available for the transmission of information. This assumption no longer holds for video conferencing type devices which are on bursty networks such as CSMA LAN networks. The solution proposed by the present invention is to provide feedback between the video CODEC and the communications channel such that the characteristics of the channel are used to drive multiple video output buffers. These buffers share an original temporal video reference but will have different subsequent temporal video images. The communications channel interface then picks the subsequent video image buffer that best matches the current condition. By using a predictor of the channel performance, the video algorithm can be tuned to provide video output buffers with the best guess of how the buffers should be configured. Once a particular output buffer's image data is selected, the remaining buffers can be flushed to be refilled again based on a newly calculated transmit reference image. In the limit, the final action is to revert to an exception handler similar to current video CODECS, i.e., insert a key frame to restart the encoding of video data transmission.
FIG. 3 illustrates conceptually the logical multiple output buffers of the present invention. When the video camera 20 records an image, it is encoded by the encoding circuitry described above and the encoded information is propagated to the output buffer 27. In a bursty network, the network may not be able to receive this newly calculated image data. Accordingly, the camera continues to detect images, the data continues to be encoded, and the newly encoded data is stored in subsequent output buffers such as 41, 42 or 43. For example, the information stored in the output buffer 27 may correspond to the digital information equivalent to the changes from the transmit reference image stored in the transmit reference image buffer 30 at time t=0. Output buffer 41 may then store data corresponding to the difference between the transmit reference image and the image captured 1/15th of a second later than the image encoded into buffer 27. Likewise, output buffers 42 and 43 may store data corresponding to the temporal change between the transmit reference image and the image before the camera at successively later times.
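The scheme of filling successive output buffers against a single transmit reference can be sketched as follows. The class name, the scalar quantizer, and the send-the-most-current selection policy are illustrative assumptions, not details from the patent.

```python
class MultiBufferEncoder:
    """Sketch of the multiple-output-buffer scheme: every captured frame is
    differentially coded against the SAME transmit reference, and each
    result waits in its own buffer until the network grants access."""

    def __init__(self, reference, step=8):
        self.reference = list(reference)
        self.step = step
        self.buffers = []   # one differential encoding per frame captured

    def capture(self, frame):
        diff = [round((f - r) / self.step)
                for f, r in zip(frame, self.reference)]
        self.buffers.append(diff)

    def transmit(self):
        """Network is ready: send the most current buffer, rebuild the
        transmit reference from it, and flush the remaining buffers."""
        best = self.buffers[-1]
        self.reference = [r + q * self.step
                          for r, q in zip(self.reference, best)]
        self.buffers.clear()
        return best

enc = MultiBufferEncoder([100, 100])
enc.capture([108, 100])     # frame at t = 0
enc.capture([116, 92])      # network still busy; frame 1/15 s later
sent = enc.transmit()       # bus granted: most current data goes out
```

Note that the reference is rebuilt only from the buffer actually transmitted, matching the description that the stale buffers are flushed and refilled against the newly calculated transmit reference.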
The video encoder and camera circuitry described may be incorporated as part of a station on the network and may be responsive to information received over the communications channel. When a given node again has the bus, the output buffer with the most current image may be signaled to transmit its information to the receiving node. Likewise, the channel information is used to calculate the next transmit reference image for storage. The output buffers are then flushed and are again loaded in a time-sequential manner until the data is again ready to be sent over the network. While four (4) output buffers are illustrated, this is purely for illustrative purposes; as many buffers may be implemented as computing power and resources provide.
FIG. 4 illustrates conceptually a branching tree that is pruned at times T=1, T=2, T=3, etc., for each slice of information that is taken and propagated on the network. This conceptualizes the use of multiple output buffers as a tree which is continually pruned, with the most recent pruning corresponding to the present transmit reference image.
FIG. 5 illustrates another conceptualization of the present invention. The encoder, through feedback from the data communications channel, creates several logical output buffers corresponding to behavioral predictions based on the feedback from the communications channel. For example, logical output buffer 1 could represent the case where more bandwidth will be dynamically allocated to this natural data compression over the next unit of time. The unit of time could be an image frame or, for example, a frame of sampled audio. In FIG. 5, the various predictions of the bandwidth available to the compression algorithm are shown below in Table I.
TABLE I

Logical Output Buffer    Prediction of Bandwidth per Unit Time
                         Relative to Current Transmit Reference
1                        about the same
2                        a lot more
3                        more
4                        a lot less
For video coding, more bandwidth could be used to get sharper images and/or higher frame rate. The actual data contained in the logical output buffers can be significantly different, too. For example, in video coding, the new transmit reference might be calculated from different input images in time and/or spatial resolution. Logical output buffer 1 might represent the data from an image taken 1/15th of a second later than transmit reference 0, while logical output buffer 2 might represent the differential coding from an image half a second later than transmit reference 0. Such an approach would be good for video coding for channels where the bit rate allocated to video may undergo extreme fluctuations such as in the bursty networks described above.
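One way the predictions of Table I might drive per-buffer encoder configuration can be sketched as follows. The quantizer steps and frame intervals are invented for illustration and are not taken from the patent; the point is only that each logical output buffer is configured for a different bandwidth guess.

```python
# Table I's predictions, keyed by logical output buffer number.
PREDICTIONS = {
    1: "about the same",
    2: "a lot more",
    3: "more",
    4: "a lot less",
}

def configure_buffer(prediction):
    """Map a bandwidth prediction to hypothetical encoder settings: more
    bandwidth buys a finer quantizer and a higher frame rate."""
    if prediction == "a lot more":
        return {"quant_step": 4, "frame_interval_s": 1 / 30}
    if prediction == "more":
        return {"quant_step": 6, "frame_interval_s": 1 / 15}
    if prediction == "about the same":
        return {"quant_step": 8, "frame_interval_s": 1 / 15}
    return {"quant_step": 16, "frame_interval_s": 1 / 2}   # "a lot less"

# One configuration per logical output buffer, encoded in parallel.
configs = {n: configure_buffer(p) for n, p in PREDICTIONS.items()}
```

When the channel's actual behavior becomes known, the interface simply picks the buffer whose configuration best matched reality and discards the rest.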
While the output buffers are illustrated with reference to FIGS. 2 and 3 as, for example, discrete memory elements, FIG. 5 makes it clear that logical buffers may be created in a common block of memory and that the number of such buffers is limited only by the computational power available to encode them simultaneously and the memory available to hold them. FIG. 6 is used to illustrate that the present invention is not necessarily limited to video encoding and illustrates a frame of audio information. For example, in the G.728 standard each frame of data is 5 milliseconds long. The frame may be stored as a transmit reference, and subsequent transmissions may follow the differential coding principles wherein only the changed information is sent to the receiving node. The audio encoder may be responsive to feedback from the network and maintain a plurality of logical output buffers such as those described in the video application. One possible application for such an implementation would be in wireless telephony, wherein portions of an audio transmission may be lost when a transmitting station goes through a tunnel. The responding network indicates that its most recently received information is slightly stale and that a late-change logical output buffer should be used in providing the encoded differential information.
In a more general description of the present invention, reference is now made to FIG. 7. Information about a real-time object 100 that is desired to be conveyed from a transmitting node to a receiving node on some sort of network is shown. This real-time object 100 may be a video image or it may be a sound depending on the particular implementation. A capture mechanism 110 detects the real-time object and encodes it into electronic information. The capture mechanism may be a camera for video information as described above or a microphone or stereo microphones for audio information. This information is then processed by differential encoder 115 which compares the newly captured real-time object to the previously stored recorded object in transmit reference buffer 120. The differentially encoded data is then propagated to a logical output buffer 125 which operates as those described above. When the network clears the output buffers for transmission, the particular output buffer having the best information conveys it over the network and that same information is used to calculate a new transmit reference to be stored in transmit reference buffer 120.
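The step in which the interface picks the output buffer with the best information for current channel conditions could follow a policy like the one sketched below. The fit-to-granted-bandwidth criterion is an assumption, one of many selection strategies consistent with the description.

```python
def select_buffer(buffers, granted_bytes):
    """Pick the fullest buffer that fits in the bandwidth the network just
    granted; if none fits, fall back to the smallest buffer."""
    fitting = [b for b in buffers if len(b) <= granted_bytes]
    if fitting:
        return max(fitting, key=len)   # most information that still fits
    return min(buffers, key=len)

# Three candidate differential encodings of increasing size.
candidates = [b"abc", b"abcdef", b"abcdefghij"]
print(select_buffer(candidates, 8))    # the 6-byte buffer fits; 10 does not
```

Whatever policy is used, the selected buffer's contents both go out over the network and seed the calculation of the next transmit reference, so encoder and decoder agree on the new base image.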
There has thus been described a method and apparatus of differential coding for use in bursty transmission networks which greatly improves the quality of transmitted compressed information. Although the present invention has been described in terms of preferred embodiments, it will be appreciated that various modifications and alterations might be made by those skilled in the art without departing from the spirit and scope of the invention. The invention should, therefore, be measured in terms of the claims which follow.

Claims (27)

1. For use in a communications network having a plurality of nodes wherein a node may encode real-time information for propagating over said network, a method of processing said real-time information comprising the steps of:
providing said node with a plurality of output buffers;
(a) electronically capturing said real-time information and converting it into electronic data;
(b) differentially encoding said electronic data using a previously stored transmit reference image as a base to produce differential data;
(c) storing said differential data in one of said plurality of output buffers;
(d) monitoring said network for access to propagate said differential data;
repeating steps (a)-(d) until said node may propagate said differential data over said network;
transmitting data over said network from the one of said plurality of output buffers providing a best differential data to a receiving node on said network, wherein said best differential data represents a differential data whose use in conjunction with the previously stored transmit reference image produces an image that approximates a current frame better than use of other differential data contained in said plurality of output buffers; and
calculating a new transmit reference image based on said best differential data and said previously stored transmit reference image.
2. An apparatus comprising:
an encoder for producing encoded real-time information;
a transmit reference buffer for storing a current transmit reference;
compression circuitry coupled to the encoder and to the transmit reference buffer for producing compressed data based upon the current transmit reference and the encoded real-time information;
a plurality of dynamically created output buffers coupled to the compression circuitry for storing the compressed data, each dynamically created output buffer being created and configured based upon one or more characteristics of a communication channel to be used for transmitting the encoded real-time information over a network; and
a network interface coupled to the plurality of dynamically created output buffers, the network interface for interfacing with the network, for determining a selected output buffer from the plurality of dynamically created output buffers and for transmitting data over the network from the selected output buffer, the selected output buffer containing compressed data which accommodates the one or more characteristics of the network better than compressed data in at least one other buffer of the plurality of dynamically created output buffers.
3. The apparatus of claim 2, wherein the selected output buffer contains compressed data which accommodates one or more characteristics of the network better than compressed data in all other buffers of the plurality of output buffers.
4. An apparatus for transmitting real-time information over a network, the apparatus comprising:
an encoder for producing encoded real-time information;
a transmit reference buffer for storing a current transmit reference;
compression circuitry coupled to the encoder and to the transmit reference buffer for producing compressed data based upon the current transmit reference and the encoded real-time information; and
a plurality of dynamically created output buffers coupled to the compression circuitry for buffering the compressed data, each of the plurality of dynamically created output buffers having contents and being created and configured based upon one or more characteristics of a communication channel to be used for transmitting the encoded real-time information over a network, the contents of a selected output buffer of the plurality of dynamically created output buffers to be transmitted onto a data communications channel of the network based upon the one or more characteristics of the data communications channel.
5. The apparatus of claim 4 further comprising a network interface coupled to the plurality of output buffers, the network interface for interfacing with the network, the network interface determining the selected output buffer and transmitting data over the network from the selected output buffer.
6. The apparatus of claim 5, wherein the selected output buffer contains compressed data which, when used in conjunction with the current transmit reference, accommodates the one or more characteristics of the data communications channel better than compressed data from at least another buffer of the plurality of output buffers.
7. The apparatus of claim 5, wherein the selected output buffer contains compressed data which, when used in conjunction with the current transmit reference, accommodates the one or more characteristics of the data communications channel better than compressed data from all other buffers of the plurality of output buffers.
8. The apparatus of claim 4, wherein the compressed data comprises a differential between the encoded real-time information and the current transmit reference.
9. The apparatus of claim 4, wherein the one or more characteristics of the data communications channel include bandwidth availability on the data communications channel.
10. The apparatus of claim 4, wherein the one or more characteristics of the data communications channel include burstiness of traffic on the data communications channel.
11. The apparatus of claim 4, wherein the one or more characteristics of the data communications channel include transmission delay on the data communications channel.
12. The apparatus of claim 4, wherein the encoded real-time information includes video information.
13. The apparatus of claim 4, wherein the encoded real-time information includes audio information.
14. An apparatus comprising:
an encoder for producing encoded real-time information;
a transmit reference buffer for storing a current transmit reference;
compression circuitry coupled to the encoder and to the transmit reference buffer for producing compressed data based upon the current transmit reference and the encoded real-time information;
a plurality of dynamically created output buffers coupled to the compression circuitry for storing the compressed data, each dynamically created output buffer being created and configured based upon one or more characteristics of a communication channel to be used for transmitting the encoded real-time information over a network; and
a network interface coupled to the plurality of output buffers, the network interface for selecting a selected output buffer of the plurality of output buffers by determining, with reference to one or more predetermined coding strategies, whether compressed data from the selected output buffer is appropriate for transmission to a receiving node.
15. The apparatus of claim 14, wherein the one or more predetermined coding strategies include minimizing artifacts.
16. The apparatus of claim 14, wherein the one or more predetermined coding strategies include allocating available bandwidth to achieve a higher frame rate.
17. The apparatus of claim 14, wherein each of the output buffers is dynamically created and configured in accordance with characteristics of a communication channel being used to transmit the encoded real-time information over the network.
18. An apparatus comprising:
an encoder for producing encoded real-time information;
compression circuitry coupled to the encoder for producing compressed data based upon a previously stored transmit reference and the encoded real-time information;
a plurality of dynamically created output buffers coupled to the compression circuitry for storing the compressed data, each dynamically created output buffer being created and configured based upon one or more characteristics of a communication channel to be used for transmitting the encoded real-time information over a network; and
a network interface coupled to the plurality of dynamically created output buffers, the network interface transmitting compressed data from a selected output buffer of the plurality of dynamically created output buffers, the compressed data from the selected output buffer when used in conjunction with the previously stored transmit reference approximating a next frame expected by a receiving apparatus.
19. The apparatus of claim 18, wherein the selected output buffer is selected based upon current conditions of a communication channel to be used for transmitting the contents of the selected output buffer.
20. A method of transmitting data over a network comprising:
encoding the data by determining the differences between the data and a transmit reference to produce differential data;
storing the differential data in a plurality of output buffers dynamically created based upon characteristics of a communication channel to be used for transmitting the differential data over the network;
selecting one of the plurality of output buffers as a current transmit buffer based upon current conditions of a communications channel in the network used to transmit the differential data; and
transmitting the differential data from the current transmit buffer over the network.
21. The method of claim 20, additionally comprising compressing the differential data prior to storing the differential data in one of the plurality of output buffers.
22. The method of claim 20, additionally comprising repeating said encoding, storing, selecting, and transmitting using the data from the current transmit buffer as the transmit reference.
23. A method of transmitting real-time data over a network comprising:
encoding the real-time data by determining the differences between the real-time data and a transmit reference to produce differential data;
storing the differential data in one of a plurality of output buffers, each output buffer dynamically created based upon one or more characteristics of a data communications channel of the network;
selecting one of the plurality of output buffers as a current transmit buffer by determining whether the differential data in a particular output buffer accommodates one or more characteristics of the network better than differential data in at least one other output buffer of the plurality of output buffers; and
transmitting differential data from the current transmit buffer over the network.
24. The method of claim 23, additionally comprising compressing the differential data prior to storing the differential data in one of the plurality of output buffers.
25. The method of claim 23, additionally comprising repeating said encoding, storing, selecting, and transmitting using the data from the current transmit buffer as the transmit reference.
26. An apparatus comprising:
an encoder for producing encoded real-time information;
compression circuitry coupled to the encoder for producing compressed data based upon a previously stored transmit reference and the encoded real-time information;
a plurality of dynamically created output buffers coupled to the compression circuitry for storing the compressed data, each buffer being configured in accordance with characteristics of a communication channel to be used for transmitting the encoded real-time information over a network; and
a network interface coupled to the plurality of output buffers, the network interface transmitting compressed data from a selected output buffer of the plurality of output buffers, the compressed data from the selected output buffer when used in conjunction with the previously stored transmit reference approximating a next frame expected by a receiving apparatus.
27. The apparatus of claim 26, wherein said encoder produces encoded real-time information by determining the differences between the real-time information and a transmit reference.
US08/881,965 1993-11-30 1997-05-16 Multiple encoder output buffer apparatus for differential coding of video information Expired - Lifetime USRE39955E1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US08/881,965 USRE39955E1 (en) 1993-11-30 1997-05-16 Multiple encoder output buffer apparatus for differential coding of video information

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/159,665 US5416520A (en) 1993-11-30 1993-11-30 Multiple encoder output buffer apparatus for differential coding of video information
US08/881,965 USRE39955E1 (en) 1993-11-30 1997-05-16 Multiple encoder output buffer apparatus for differential coding of video information

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US08/159,665 Reissue US5416520A (en) 1993-11-30 1993-11-30 Multiple encoder output buffer apparatus for differential coding of video information

Publications (1)

Publication Number Publication Date
USRE39955E1 true USRE39955E1 (en) 2007-12-25

Family

ID=22573466

Family Applications (3)

Application Number Title Priority Date Filing Date
US08/159,665 Ceased US5416520A (en) 1993-11-30 1993-11-30 Multiple encoder output buffer apparatus for differential coding of video information
US08/368,372 Expired - Lifetime US5485211A (en) 1993-11-30 1995-01-04 Multiple encoder output buffer apparatus for differential coding of video information
US08/881,965 Expired - Lifetime USRE39955E1 (en) 1993-11-30 1997-05-16 Multiple encoder output buffer apparatus for differential coding of video information

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US08/159,665 Ceased US5416520A (en) 1993-11-30 1993-11-30 Multiple encoder output buffer apparatus for differential coding of video information
US08/368,372 Expired - Lifetime US5485211A (en) 1993-11-30 1995-01-04 Multiple encoder output buffer apparatus for differential coding of video information

Country Status (1)

Country Link
US (3) US5416520A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100038044A1 (en) * 2004-12-23 2010-02-18 Mark Alan Burazin Method of Making Textured Tissue Sheets Having Highlighted Designs

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5758194A (en) * 1993-11-30 1998-05-26 Intel Corporation Communication apparatus for handling networks with different transmission protocols by stripping or adding data to the data stream in the application layer
JP3226719B2 (en) * 1994-06-21 2001-11-05 キヤノン株式会社 Information transmission method and apparatus
DE69503916T2 (en) 1994-10-14 1999-01-28 United Parcel Service Inc MULTI-STAGE PACKAGE TRACKING SYSTEM
US5550595A (en) * 1994-12-16 1996-08-27 Intel Corporation Apparatus and method for motion estimation with enhanced camera interface
US5694346A (en) * 1995-01-18 1997-12-02 International Business Machines Corporation Integrated circuit including fully testable small scale read only memory constructed of level sensitive scan device shift register latches
US5600646A (en) * 1995-01-27 1997-02-04 Videoserver, Inc. Video teleconferencing system with digital transcoding
US5838664A (en) 1997-07-17 1998-11-17 Videoserver, Inc. Video teleconferencing system with digital transcoding
US5754190A (en) * 1995-06-07 1998-05-19 Advanced Micro Devices System for reproducing images utilizing image libraries
AUPN399295A0 (en) * 1995-07-06 1995-07-27 Diginet Systems Pty. Limited Virtual broadband transmission
US5886744A (en) * 1995-09-08 1999-03-23 Intel Corporation Method and apparatus for filtering jitter from motion estimation video data
US5706054A (en) * 1995-12-01 1998-01-06 Intel Corporation Method and apparatus for adjusting video data to limit the effects of automatic focusing control on motion estimation video coders
EP0854652B1 (en) * 1996-08-07 2010-11-17 Panasonic Corporation Picture and sound decoding device, picture and sound encoding device, and information transmission system
US6480541B1 (en) 1996-11-27 2002-11-12 Realnetworks, Inc. Method and apparatus for providing scalable pre-compressed digital video with reduced quantization based artifacts
US6172672B1 (en) 1996-12-18 2001-01-09 Seeltfirst.Com Method and system for providing snapshots from a compressed digital video stream
US6151632A (en) * 1997-03-14 2000-11-21 Microsoft Corporation Method and apparatus for distributed transmission of real-time multimedia information
US6292834B1 (en) 1997-03-14 2001-09-18 Microsoft Corporation Dynamic bandwidth selection for efficient transmission of multimedia streams in a computer network
US6128668A (en) * 1997-11-07 2000-10-03 International Business Machines Corporation Selective transformation of multimedia objects
US6192032B1 (en) 1998-01-02 2001-02-20 International Business Machines Corporation Rate attenuation systems, methods and computer program products for reducing low priority video frame packets transmitted over a network
US6549669B1 (en) 1999-04-27 2003-04-15 Telefonaktiebolaget L M Ericsson (Publ) Predictive audio and video encoding with smooth scene switching capability
KR100305797B1 (en) * 1999-07-08 2001-09-13 이계안 Method for noise removing of the beyond lane alarm device for vehicle
GB2359209A (en) * 2000-02-09 2001-08-15 Motorola Ltd Apparatus and methods for video distribution via networks
US7051125B1 (en) * 2002-02-14 2006-05-23 Polycom, Inc. System and method for managing data flow in a conference
US6925117B2 (en) * 2003-03-12 2005-08-02 Kabushiki Kaisha Toshiba Data transmission apparatus, method and program, data reception apparatus and method, and data transmission and reception system, using differential data
US20060002315A1 (en) * 2004-04-15 2006-01-05 Citrix Systems, Inc. Selectively sharing screen data
US20060031779A1 (en) * 2004-04-15 2006-02-09 Citrix Systems, Inc. Selectively sharing screen data
US7827139B2 (en) * 2004-04-15 2010-11-02 Citrix Systems, Inc. Methods and apparatus for sharing graphical screen data in a bandwidth-adaptive manner
US7680885B2 (en) * 2004-04-15 2010-03-16 Citrix Systems, Inc. Methods and apparatus for synchronization of data set representations in a bandwidth-adaptive manner
EP1638333A1 (en) * 2004-09-17 2006-03-22 Mitsubishi Electric Information Technology Centre Europe B.V. Rate adaptive video coding
US8443040B2 (en) * 2005-05-26 2013-05-14 Citrix Systems Inc. Method and system for synchronizing presentation of a dynamic data set to a plurality of nodes
WO2007040361A1 (en) * 2005-10-05 2007-04-12 Lg Electronics Inc. Method and apparatus for signal processing and encoding and decoding method, and apparatus therefor
US8792640B2 (en) 2008-01-29 2014-07-29 Sony Corporation Systems and methods for securing a digital communications link
CN104247407A (en) * 2012-02-10 2014-12-24 易卜拉欣·纳拉 Data, multimedia & video transmission updating system
CN104486554B (en) * 2015-01-09 2018-02-13 苏州科达科技股份有限公司 Method for snap control, system and grasp shoot device
US9564917B1 (en) * 2015-12-18 2017-02-07 Intel Corporation Instruction and logic for accelerated compressed data decoding

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4320500A (en) * 1978-04-10 1982-03-16 Cselt - Centro Studi E Laboratori Telecomunicazioni S.P.A. Method of and system for routing in a packet-switched communication network
US4860244A (en) * 1983-11-07 1989-08-22 Digital Equipment Corporation Buffer system for input/output portion of digital data processing system
US5010401A (en) * 1988-08-11 1991-04-23 Mitsubishi Denki Kabushiki Kaisha Picture coding and decoding apparatus using vector quantization
US5077742A (en) * 1988-01-11 1991-12-31 Ricoh Company, Ltd. Error-corrected facsimile communication control system
US5117350A (en) * 1988-12-15 1992-05-26 Flashpoint Computer Corporation Memory address mechanism in a distributed memory architecture
US5121202A (en) * 1989-05-12 1992-06-09 Nec Corporation Adaptive interframe prediction coded video communications system
US5128931A (en) * 1989-12-11 1992-07-07 Mitsubishi Denki Kabushiki Kaisha Data exchange apparatus
US5140417A (en) * 1989-06-20 1992-08-18 Matsushita Electric Co., Ltd. Fast packet transmission system of video data
US5155484A (en) * 1991-09-13 1992-10-13 Salient Software, Inc. Fast data compressor with direct lookup table indexing into history buffer
US5164939A (en) * 1988-03-17 1992-11-17 Kabushiki Kaisha Toshiba Packet switching device
US5196933A (en) * 1990-03-23 1993-03-23 Etat Francais, Ministere Des Ptt Encoding and transmission method with at least two levels of quality of digital pictures belonging to a sequence of pictures, and corresponding devices
US5227875A (en) * 1990-08-20 1993-07-13 Kabushiki Kaisha Toshiba System for transmitting encoded image data with quick image expansion and contraction
US5276681A (en) * 1992-06-25 1994-01-04 Starlight Networks Process for fair and prioritized access to limited output buffers in a multi-port switch
US5343465A (en) * 1993-06-11 1994-08-30 Bell Communications Research, Inc. Method and system for real-time burstiness analysis of network traffic
US5396494A (en) * 1991-07-01 1995-03-07 At&T Corp. Method for operating an asynchronous packet bus for transmission of asynchronous and isochronous information
US5408465A (en) * 1993-06-21 1995-04-18 Hewlett-Packard Company Flexible scheme for admission control of multimedia streams on integrated networks
US5440344A (en) * 1992-04-28 1995-08-08 Mitsubishi Denki Kabushiki Kaisha Video encoder using adjacent pixel difference for quantizer control
US5497153A (en) * 1992-07-23 1996-03-05 Samsung Electronics Co., Ltd. System for variable-length-coding and variable-length-decoding digital data for compressing transmission data
US5502727A (en) * 1993-04-20 1996-03-26 At&T Corp. Image and audio communication system having graphical annotation capability
US5629736A (en) * 1994-11-01 1997-05-13 Lucent Technologies Inc. Coded domain picture composition for multimedia communications systems
US5640208A (en) * 1991-06-27 1997-06-17 Sony Corporation Video signal encoding in accordance with stored parameters
US5905821A (en) * 1992-02-28 1999-05-18 Canon Kabushiki Kaisha Compression/expansion circuit having transfer means and storage means with address management of the storage means

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100038044A1 (en) * 2004-12-23 2010-02-18 Mark Alan Burazin Method of Making Textured Tissue Sheets Having Highlighted Designs

Also Published As

Publication number Publication date
US5416520A (en) 1995-05-16
US5485211A (en) 1996-01-16

Similar Documents

Publication Publication Date Title
USRE39955E1 (en) Multiple encoder output buffer apparatus for differential coding of video information
US5793895A (en) Intelligent error resilient video encoder
US5212742A (en) Method and apparatus for encoding/decoding image data
US5267334A (en) Encoding/decoding moving images with forward and backward keyframes for forward and reverse display
US5821986A (en) Method and apparatus for visual communications in a scalable network environment
CA2084178C (en) Statistical multiplexer for a multichannel image compression system
US5758194A (en) Communication apparatus for handling networks with different transmission protocols by stripping or adding data to the data stream in the application layer
US5532744A (en) Method and apparatus for decoding digital video using parallel processing
US8374236B2 (en) Method and apparatus for improving the average image refresh rate in a compressed video bitstream
US6108027A (en) Progressive still frame mode
Schiller et al. Efficient coding of side information in a low bitrate hybrid image coder
US20110026592A1 (en) Intra block walk around refresh for h.264
JP2001511983A (en) Rate control method and apparatus for performing video encoding at a low bit rate based on a perceptual characteristic-based trellis
CN1214629C (en) Terminal and method of transmitting still image
US5796436A (en) Video data storing device and method for coding video data based upon determining whether or not video data conforms to a predetermined standard
Zhang Very low bit rate video coding standards
Chen et al. A robust coding scheme for packet video
CN112004084B (en) Code rate control optimization method and system by utilizing quantization parameter sequencing
KR100385620B1 (en) Improved MPEG coding method, moving picture transmitting system and method thereof
Huitema et al. Software codecs and work station video conferences
CN112004087B (en) Code rate control optimization method taking double frames as control units and storage medium
CN112004083B (en) Method and system for optimizing code rate control by utilizing inter-frame prediction characteristics
EP0762773B1 (en) Hierarchial video encoder and decoder
JPH02222388A (en) Moving picture encoding method
KR100323752B1 (en) Method and Telephone for still picture transport

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTEL CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KUZMA, ANDREW J.;REEL/FRAME:008853/0472

Effective date: 19971118