US20130151972A1 - Media processing comparison system and techniques - Google Patents
- Publication number
- US20130151972A1 (application Ser. No. 13/708,232)
- Authority
- US
- United States
- Prior art keywords
- media
- sample
- decoded
- samples
- queue
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
Definitions
- Media content players may be hardware, software, firmware, or combinations thereof that render sets of video, audio, graphics, images, and/or data content (“media content items”) for presentation to users.
- Media content items that are rendered and presented to users may be referred to as “media presentations.”
- Media content players may be stand-alone consumer electronic devices, or included in other electronic devices. Although any type of known or later developed electronic device may be or include a media content player, examples of electronic devices that have served user demand for consumption of media content items include but are not limited to: servers; optical media players; personal computers; personal media players; video recorders; portable communication devices such as phones and personal digital assistants; set-top boxes; and gaming devices.
- A media processing comparison system (“MPCS”) and techniques that facilitate subjective quality comparisons between media presentations produced by different media processing components in a particular operating environment are described herein.
- In one exemplary implementation, the media processing components are hardware, software, firmware, or combinations thereof, operating in a multimedia pipeline associated with a media content player, such as a Windows® multimedia pipeline operating in a personal computing device. It will be appreciated, however, that any known or later developed framework may be used to implement a multimedia processing pipeline, and that any known or later developed media content player operating environment is possible.
- In operation, the MPCS receives a particular media content item from a particular media source.
- The particular media content item is generally arranged as an ordered stream of encoded media samples, such as video, audio, graphics, images, data, or combinations thereof, playable within a predetermined amount of time referred to as a “play duration.”
- In the exemplary implementation, the particular media content item is a video clip.
- Upon decoding, a particular encoded media sample (for example, a particular video frame) is playable to a user at a particular presentation time within the play duration.
- The MPCS decodes a particular encoded media sample using two or more different instances of media processing components that perform the same functions (such as instances of parsers, encoder/decoder pairs (“codecs”), decryptors, demultiplexers, or combinations thereof supplied by different entities).
- Each different instance of media processing component(s) produces a decoded media sample corresponding to a particular encoded media sample.
- In the exemplary implementation, a particular media content item (or a particular encoded media sample thereof) is replicated to form multiple identical streams of encoded media samples, and a particular stream is input to and decoded by a particular instance of media processing component(s).
- Also in the exemplary implementation, decoded media samples produced by a particular instance of media processing component(s) are stored in (for example, written to) a queue.
- Each instance of media processing component(s) may store decoded media samples in a separate queue, or fewer queues (for example, a single queue) may be used.
- A single renderer renders decoded media samples received from each instance of media processing component(s), producing rendered media samples.
- In the exemplary implementation, decoded media samples produced by a particular instance of media processing component(s) are received for rendering from a particular queue.
- The decoded media samples may be pushed or pulled from the particular queue.
- Generally, to ensure synchronicity, rendering does not occur until at least one decoded media sample (playable at or about the same presentation time) is available from each instance of media processing component(s).
- The renderer also coordinates the synchronous presentation, as separate media presentations, of the rendered media samples from each instance of media processing component(s).
- A presentation thread reads decoded media samples from the queue(s) after it is determined that the queue(s) contain at least one decoded media sample (playable at or about the same presentation time) from each instance of media processing component(s).
- Queue locks may be used to maintain consistency in the state of the queue(s) during read/write operations, and decoded media samples produced by a particular instance of media processing component(s) may be deleted after rendering and/or presentation has occurred.
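The decode-queue-render flow described above can be sketched as follows. This is a minimal, single-threaded Python illustration; the two decoder stubs, the two-queue layout, and all names are assumptions for illustration, not taken from the patent:

```python
from collections import deque

# Two hypothetical instances of media processing component(s) that
# perform the same function (e.g., codecs supplied by different entities).
def decoder_a(encoded_sample):
    return ("A", encoded_sample)

def decoder_b(encoded_sample):
    return ("B", encoded_sample)

decoders = (decoder_a, decoder_b)
queues = [deque() for _ in decoders]  # one queue per instance

def decode_and_enqueue(encoded_sample):
    # The encoded sample is replicated so each instance receives an
    # identical input; each output goes to that instance's queue.
    for dec, q in zip(decoders, queues):
        q.append(dec(encoded_sample))

def render_if_ready():
    # Rendering does not occur until at least one decoded media sample
    # is available from each instance.
    if all(queues):
        return tuple(q.popleft() for q in queues)
    return None

decode_and_enqueue("frame-0")
pair = render_if_ready()  # one decoded sample from each instance
```

After one encoded sample is replicated and decoded, `render_if_ready` returns a tuple holding one decoded sample per instance; with any queue empty it returns `None`, which models the gating condition.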
- The separate, synchronized media presentations are subjectively compared and/or selected for storage by a user in a sample-by-sample manner.
- In this manner, media presentation quality may be measured efficiently and in real time, without time-consuming intermediate steps and storage, such as decoding and saving uncompressed decoded data into separate files and then serially comparing the uncompressed files.
- FIG. 1 is a simplified functional block diagram of a media processing comparison system.
- FIG. 2 is a graphical illustration of an exemplary media timeline associated with the exemplary media content item shown in FIG. 1 .
- FIG. 3 is a flowchart of a method for rendering media content using aspects of the media processing comparison system shown in FIG. 1 .
- FIG. 4 is a simplified functional block diagram of an exemplary operating environment in which the media processing comparison system shown in FIG. 1 and/or the method shown in the flowchart of FIG. 3 may be implemented or used.
- Although media content players can concurrently render and/or present more than one media content item (for example, a main movie along with features such as a director's commentary, actor biographies, or advertising), such media content players do not generally allow deinterlacing of multiple input pins and/or real-time presentation quality comparison.
- A media processing comparison system (“MPCS”) and techniques are described herein, which enable media presentation quality to be evaluated without time-consuming intermediate steps and storage, such as decoding and saving uncompressed decoded data into separate files and then serially comparing the uncompressed files.
- The MPCS and techniques facilitate concurrent, subjective quality comparisons between media presentations produced by different instances of media processing components performing the same functions (for example, instances of media processing components supplied by different entities) in a particular media content player.
- The media processing components include but are not limited to instances of parsers, encoder/decoder pairs (“codecs”), decryptors, demultiplexers, or combinations thereof, in the form of hardware, software, firmware, or combinations thereof.
- FIG. 1 is a simplified functional block diagram of an exemplary implementation of an MPCS 100 .
- Design choices dictate how specific functions of MPCS 100 are implemented. Such functions may be implemented using hardware, software, firmware, or combinations thereof.
- MPCS 100 is, or operates within, a multimedia pipeline associated with a media content player, such as a Windows® multimedia pipeline operating in a personal computing device.
- It will be appreciated, however, that any known or later developed framework may be used to implement a multimedia processing pipeline, and that any known or later developed media content player operating environment is possible.
- Examples of media content players and operating environments therefor include but are not limited to: optical media players; operating systems for computing devices; personal media players; set-top boxes; servers; personal computers; video recorders; mobile phones; personal digital assistants; and gaming devices.
- MPCS receives a media content item 105 from a media source 102 , performs processing tasks using two or more instances (instances 1 through N are shown) of media processing component(s) 104 , and uses a renderer 106 to prepare the media content item for presentation to a user (not shown) as two or more synchronized media presentations 157 (media presentations 1 through N, corresponding to instances 1 through N of media processing component(s), are shown).
- Media source 102 represents any device, location, or data from which media content item 105 is derived or obtained. Examples of media sources 102 include but are not limited to optical media, hard drives, network locations, over-the-air transmissions, and other sources. In general, any computer-readable medium may serve as media source 102 (computer-readable media 404 are shown and discussed further below, in connection with FIG. 4 ).
- Media content item 105 represents an ordered set (for example, a stream) of digital information (such as video, audio, graphics, images, data, or combinations thereof) arranged into a number of individually-presentable units, referred to herein as “media samples.”
- In the exemplary implementation, a particular media content item 105 is a video clip having sequentially ordered frames.
- Before decoding, the media samples are referred to as encoded media samples 107; after decoding, as decoded media samples 117; after rendering, as rendered media samples 127; and, after presentation via presentation device 120 (discussed further below) and user selection, as user-designated media samples 166.
- It will be appreciated that the naming convention(s) used herein is/are for illustrative purposes only, and that any desired naming conventions may be used.
- Sequences of individually-presentable units and the digital information therein may be grouped in any desirable manner, and represented by any desired units, for example, bits, frames, data packets, groups of pictures, enhanced video object units, etc.
- The digital information, or amount thereof, within a particular media sample may be based on several factors, such as the characteristics of the video, audio, graphics, images, or data that forms the media sample, or one or more parameters associated with media source 102 from which the media sample is derived (for example, media source identity and/or location, codec parameters or settings, or encryption parameters or settings).
- FIG. 2 is a graphical illustration of an exemplary media timeline 200 associated with a particular media content item 105 having a number of individually-presentable units referred to as media samples 201 .
- Media content item 105 has a predetermined play duration 205 , which represents a particular amount of time in which the media content item is playable to a user.
- Each media sample 201 has an associated, predetermined presentation time 202 within the play duration.
- One or more upcoming media samples 201 may be prepared for presentation in advance of the scheduled/predetermined presentation time(s) 202.
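The relationship between presentation times 202 and play duration 205 can be illustrated numerically. The 30 frames-per-second rate and clip length below are assumptions for illustration only; the patent does not fix either:

```python
FPS = 30          # assumed frame rate (illustrative)
NUM_SAMPLES = 90  # assumed number of media samples in the clip

# Each media sample has a predetermined presentation time within the
# play duration; the last sample's time falls just inside the duration.
presentation_times = [i / FPS for i in range(NUM_SAMPLES)]
play_duration = NUM_SAMPLES / FPS  # 3.0 seconds for this example
```

Under these assumptions, sample 30 is playable at exactly 1.0 second, and every presentation time is strictly less than the play duration.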
- Media processing component(s) blocks 104 represent any devices, instructions, or techniques used to retrieve video, audio, graphics, images, and/or data content from encoded media samples 107 received from media source 102, and to produce decoded media samples 117.
- Media processing component(s) perform functions that may include (but are not limited to) one or more of the following: parsing; encoding/decoding; decryption; demultiplexing; or a combination of the foregoing.
- Media processing component(s) may be, for example, parsers, codecs, decryptors, demultiplexers, or combinations thereof, that are implemented using hardware, software, firmware, or any combination thereof.
- Each instance of media processing component(s) 104 performs the same function(s) to produce a decoded media sample 117 corresponding to a particular encoded media sample 107 playable at a particular presentation time 202.
- Particular instances of media processing component(s) 104 may be different in various ways, including but not limited to: being supplied by different entities; operating in accordance with different standards; having different settings; or being implemented in different forms (for example, hardware, software, or firmware).
- Media content item 105, or individual media samples thereof, is/are replicated in such a manner that there are encoded media samples 107 concurrently available for input to each instance of media processing component(s) 104. It will be understood that when encoded media samples are input to an instance of media processing component(s), processing/decoding order and output order may be different.
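The observation that processing/decoding order and output order may differ (as with bidirectionally predicted frames, for example) can be illustrated by reordering decoded samples by presentation time before they are queued. The sample tuples below are invented for illustration:

```python
# Decoded samples tagged with their presentation-time indices, emitted
# by a decoder in decode order rather than presentation order.
decoded = [(2, "frame-2"), (0, "frame-0"), (1, "frame-1")]

# Restore presentation order before writing to the instance's queue.
in_presentation_order = sorted(decoded, key=lambda sample: sample[0])
```

Sorting on the presentation-time key yields frames 0, 1, 2 in order, ready to be stored queue-side in the order they will be presented.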
- Decoded media samples 117 produced by a particular instance of media processing component(s) 104 are stored in one or more queues (two queues are shown, queue 1 140 and queue N 145). As shown, each instance of media processing component(s) 104 stores decoded media samples 117 in a separate queue, although fewer queues (for example, one queue) may be used. Decoded media samples 117 may be pushed onto the queues (by instances of media processing component(s), for example) or alternatively may be pulled from the queues (by renderer 106, for example). Queue locks (queue locks 141 and 142 are shown) may be used to maintain consistency in the state of the queues during read/write operations.
- Renderer 106 represents any device, instructions (for example, media rendering instructions 110 , which as shown include rendering thread 111 and presentation thread 113 ), or technique used to: receive decoded media samples 117 from instances of media processing component(s) 104 ; prepare the decoded media samples for rendering; and produce rendered media samples 127 .
- Rendered media samples 127 represent information that has been prepared for presentation at or about a particular presentation time 202 by renderer 106 .
- Decoded media samples 117 produced by a particular instance of media processing component(s) are received for rendering from a particular queue via rendering thread 111.
- Renderer 106 may have multiple input pins. Queue locks 141 and 142 may be used to maintain consistency in the state of the queues during read/write operations.
- One manner of maintaining synchronicity is for rendering thread 111 to ensure that rendering of decoded media samples to produce rendered media samples 127 does not occur until at least one decoded media sample 117 (playable at or about the same presentation time 202 ) is available from each instance of media processing component(s) 104 .
- Another manner of maintaining synchronicity is to ensure that media samples are not dropped, that is, to ensure that the number of decoded media samples received from each instance of media processing component(s) is the same. Thus, it is not generally necessary for rendering thread 111 to transmit quality messages to upstream media processing components.
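One way to realize this gating, blocking until every instance has contributed a sample and never dropping samples so the per-instance counts stay equal, is a condition variable over the queues. This is a Python sketch; the two-instance layout and all names are assumptions:

```python
import threading

queues = {"instance_1": [], "instance_2": []}  # per-instance sample queues
cond = threading.Condition()                   # guards the queue state

def enqueue(instance, decoded_sample):
    # Called when an instance of media processing component(s)
    # produces a decoded sample.
    with cond:
        queues[instance].append(decoded_sample)
        cond.notify_all()

def render_next():
    # The rendering thread blocks until at least one decoded sample is
    # available from every instance. Nothing is ever dropped, so the
    # number of samples received from each instance remains the same.
    with cond:
        cond.wait_for(lambda: all(queues.values()))
        return {name: q.pop(0) for name, q in queues.items()}

enqueue("instance_1", "sample-0a")
enqueue("instance_2", "sample-0b")
rendered = render_next()  # one sample per instance, same position
```

Because the wait predicate requires every queue to be non-empty, no upstream quality messages are needed: the renderer simply does not proceed until a matched set of samples exists.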
- Presentation thread 113 coordinates the synchronous presentation of rendered media samples 127 associated with each instance of media processing component(s) 104 .
- Rendered media samples 127 associated with different instances of media processing component(s) 104 are presented to a user as separate, synchronous media presentations 157 (media presentations 1 through N are shown) via one or more presentation devices 120, such as displays and/or speakers.
- A particular media presentation 157 represents the visible and/or audible information that is produced based on rendered media samples 127 by presentation device 120 and perceivable by the user.
- A particular media presentation 157 is generally presented on a sample-by-sample basis, synchronized by presentation times 202 with the media presentations associated with other instances of media processing component(s).
- One manner of maintaining synchronicity is for presentation thread 113 to ensure that presentation of rendered media samples 127 does not occur until at least one rendered media sample (playable at the same presentation time 202 ) associated with each instance of media processing component(s) is available. In the exemplary implementation, this may be accomplished via an interaction of rendering thread 111 and queues 140 , 145 , as discussed above, or may be accomplished via an interaction of presentation thread 113 and queues 140 , 145 , or both.
- State variables (not shown) may be used by presentation thread 113 to check whether particular queues are empty or have decoded and/or rendered samples to be presented. Upon presentation, decoded and/or rendered media samples 127 may be removed from the queues.
- At the end of playback, one or more events reach renderer 106.
- Certain events indicate to rendering thread 111 that it should not expect to receive additional decoded media samples 117 from one or more instances of media processing component(s) 104, and rendering thread 111 may notify presentation thread 113. It is desirable for presentation thread 113 to wait to exit until it has received notice from rendering thread 111 that it should not expect to receive rendered media samples 127 associated with any of the instances of media processing component(s) 104.
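The handoff described above, in which the rendering thread notifies the presentation thread that no further samples will arrive and the presentation thread waits for that notice before exiting, can be sketched with a sentinel on a shared queue. The names and the single shared queue are illustrative assumptions:

```python
import queue
import threading

END_OF_STREAM = object()   # notice: expect no further rendered samples
rendered_samples = queue.Queue()
presented = []

def presentation_thread_body():
    # Present samples until the rendering thread's end-of-stream notice
    # arrives; only then does the presentation thread exit.
    while True:
        item = rendered_samples.get()
        if item is END_OF_STREAM:
            break
        presented.append(item)

t = threading.Thread(target=presentation_thread_body)
t.start()
rendered_samples.put(("inst1-frame0", "inst2-frame0"))
rendered_samples.put(END_OF_STREAM)  # the rendering thread's notice
t.join()
```

Using an identity-checked sentinel object means no legitimate sample value can be mistaken for the end-of-stream notice.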
- The user may designate desired media samples (referred to as “user-designated media samples 166”) that are used to form an edited media content file 121.
- The user may compare media presentations 157 on a sample-by-sample basis and designate desired rendered media samples 127 for inclusion in (or drop rendered media samples from) edited media content file 121.
- Edited media content file 121 is generally an uncompressed file that may be stored in temporary or persistent memory.
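Sample-by-sample designation can be sketched as picking, for each presentation time, which instance's rendered sample enters the edited file. The sample values and user choices below are invented for illustration:

```python
# Rendered samples from two instances, keyed by presentation-time index.
rendered = {
    0: {"instance_1": "f0-a", "instance_2": "f0-b"},
    1: {"instance_1": "f1-a", "instance_2": "f1-b"},
    2: {"instance_1": "f2-a", "instance_2": "f2-b"},
}
# Hypothetical user designations; an unlisted time means the sample
# was dropped from the edited file.
user_choices = {0: "instance_2", 2: "instance_1"}

edited_media_content = [
    rendered[t][inst] for t, inst in sorted(user_choices.items())
]
```

Here the user kept instance 2's sample at time 0, dropped time 1 entirely, and kept instance 1's sample at time 2, so the edited content holds those two samples in presentation order.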
- In this manner, a user may efficiently and in real time subjectively evaluate and compare media presentation quality, and thus evaluate the performance of different media processing components, without extensive and time-consuming intermediate steps and storage.
- FIG. 3 is a flowchart illustrating certain aspects of a method for rendering media content, such as media content item 105 , using aspects of MPCS 100 .
- The method(s) illustrated in FIG. 3 may be implemented using computer-executable instructions executed by one or more general, multi-purpose, or single-purpose processors (exemplary computer-executable instructions 406 and processor 402 are discussed further below, in connection with FIG. 4).
- It will be appreciated that FIG. 3 is exemplary in nature, and that the subject matter defined in the claims is not necessarily limited to the specific features or acts described herein. Rather, the specific features and acts are disclosed as example forms of implementing the claims.
- The methods described herein are not constrained to a particular order or sequence.
- Some of the described methods, or elements thereof, can occur or be performed concurrently. It will be understood that all of the steps shown need not occur in performance of the functions described herein.
- The method begins at block 300 and continues at block 302, where a media timeline associated with a particular media content item, such as media timeline 200, is ascertained. Based on the media timeline, as indicated at blocks 304 and 306, respectively, a presentation time, such as a particular presentation time 202, and a corresponding media sample, such as a particular encoded media sample 107, are identified. Next, the media sample is decoded using two or more instances of one or more media processing components that perform the same functions (such as parsers, codecs, decryptors, demultiplexers, or combinations thereof) to produce decoded media samples, such as decoded media samples 117, which are stored (in separate queues, for example).
- At diamond 310, it is determined whether a decoded media sample associated with each instance of media processing component(s) is present (for example, whether a decoded media sample is present in each queue). If not, and if the end of the stream has not been reached and playback has not otherwise ended as determined at block 312 (in which case the method exits, as indicated at block 314), then the method returns to diamond 310 until it is determined that a decoded media sample associated with each instance of media processing component(s) is present.
- Once the decoded media samples are present, they are rendered (to produce rendered media samples 127, for example) and synchronously presented (in a sample-by-sample manner as media presentations 157, for example).
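The loop described above can be condensed into a short Python sketch. The decoder stubs and timeline are invented for illustration; in practice the decoders would be the component instances under comparison:

```python
def run_comparison(timeline, decoders):
    """Sketch of the FIG. 3 loop: for each (presentation time, encoded
    sample), decode with every instance, store per-instance, and emit a
    synchronized tuple once every queue holds a sample."""
    queues = [[] for _ in decoders]
    presentations = []
    for _time, encoded in timeline:           # identify time and sample
        for dec, q in zip(decoders, queues):  # decode with each instance
            q.append(dec(encoded))            # store in separate queues
        if all(queues):                       # the decision diamond
            presentations.append(tuple(q.pop(0) for q in queues))
    return presentations

timeline = [(0.0, "e0"), (1 / 30, "e1")]
out = run_comparison(timeline, [lambda s: "A:" + s, lambda s: "B:" + s])
```

Each element of the result pairs one rendered-ready sample from each instance at the same presentation position, which is what the synchronous sample-by-sample presentation consumes.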
- Separate software or hardware threads, such as rendering thread 111 and presentation thread 113, may be used to ensure synchronization between media presentations associated with different instances of media processing component(s).
- In this manner, a user may efficiently and in real time subjectively evaluate and compare media presentation quality on a sample-by-sample basis, and thus evaluate the performance of different media processing components, without extensive and time-consuming intermediate steps and storage.
- FIG. 4 is a simplified block diagram of an exemplary operating environment 400 in which aspects of MPCS 100 and/or the method(s) shown in FIG. 3 may be implemented or used.
- Operating environment 400 is generally indicative of a wide variety of general-purpose or special-purpose computing environments, and is not intended to suggest any limitation as to the scope of use or functionality of the system(s) and methods described herein.
- Operating environment 400 may be an electronic device such as a mobile phone, a server, a gaming device, a personal digital assistant, a personal computer, a personal media player, a computer/television device, a set-top box, a hard-drive storage device, a video recorder, an optical media player, a device temporarily or permanently mounted in transportation equipment such as a wheeled vehicle, a plane, or a train, or another type of known or later developed electronic device.
- Operating environment 400 includes processor(s) 402, computer-readable media 404, computer-executable instructions 406, user interface(s) 416, communication interface(s) 410, and specialized hardware/firmware 442.
- One or more buses 421 or other communication media may be used to carry data, addresses, control signals, and other information within, to, or from operating environment 400 or elements thereof.
- Processor 402, which may be a real or a virtual processor, controls functions of the operating environment by executing computer-executable instructions 406.
- The processor may execute instructions at the assembly, compiled, or machine level to perform a particular process.
- Computer-readable media 404 may represent any number and combination of local or remote devices, in any form, now known or later developed, capable of recording, storing, or transmitting computer-readable data.
- Computer-readable media 404 may be, or may include, a semiconductor memory (such as a read-only memory (“ROM”), any type of programmable ROM (“PROM”), a random access memory (“RAM”), or a flash memory, for example); a magnetic storage device (such as a floppy disk drive, a hard disk drive, a magnetic drum, a magnetic tape, or a magneto-optical disk); an optical storage device (such as any type of compact disk or digital versatile disk); a bubble memory; a cache memory; a core memory; a holographic memory; a memory stick; or any combination thereof.
- The computer-readable media may also include transmission media and data associated therewith.
- Examples of transmission media/data include, but are not limited to, data embodied in any form of wireline or wireless transmission, such as packetized or non-packetized data carried by a modulated carrier signal.
- Computer-executable instructions 406 represent any signal processing methods or stored instructions. Generally, computer-executable instructions 406 are implemented as software components according to well-known practices for component-based software development, and encoded in computer-readable media. Computer programs may be combined or distributed in various ways. Computer-executable instructions 406, however, are not limited to implementation by any specific embodiments of computer programs, and in other instances may be implemented by, or executed in, hardware, software, firmware, or any combination thereof.
- User interface(s) 416 represents the combination of physical or logical presentation tools and controls that define the way a user interacts with a particular application or device, such as MPCS 100 .
- Presentation tools are used to provide output to a user.
- An example of a physical presentation tool is presentation device 120 (such as a display or speaker).
- Another example of a physical presentation tool is printed material on a surface such as paper, glass, metal, etc.
- An example of a logical presentation tool is a data organization technique (for example, a window, a menu, or a layout thereof). Controls facilitate the receipt of input from a user.
- An example of a physical control is an input device such as a remote control, a display, a mouse, a pen, a stylus, a trackball, a keyboard, a microphone, or a scanning device.
- An example of a logical control is a data organization technique (for example, a window, a menu, or a layout thereof) via which a user may issue commands. It will be appreciated that the same physical device or logical construct may function to provide outputs to, and receive inputs from, a user.
- Communication interface(s) 410 represent one or more physical or logical elements, such as connectivity devices or computer-executable instructions, which enable communication between operating environment 400 and external devices or services, via one or more protocols or techniques. Such communication may be, but is not necessarily, client-server type communication or peer-to-peer communication. Information received at a given network interface may traverse one or more layers of a communication protocol stack.
- Specialized hardware 442 represents any hardware or firmware that implements functions of operating environment 400 .
- Examples of specialized hardware include media processing components 104 or aspects thereof, application-specific integrated circuits, clocks, and the like.
- It will be appreciated that operating environment 400 may include fewer, more, or different components or functions than those described.
- Functional components of operating environment 400 may be implemented by one or more devices, which are co-located or remotely located, in a variety of ways.
- Connections depicted herein may be logical or physical in practice to achieve a coupling or communicative interface between elements. Connections may be implemented, among other ways, as inter-process communications among software processes, or inter-machine communications among networked computers.
Abstract
Description
- It is common for various entities to supply different media processing components of a media content player (for example, parsers, encoder/decoder pairs (“codecs”), decryptors, demultiplexers, and combinations thereof), and it is desirable to evaluate the quality of the media presentations produced by the interoperation of such media processing components in particular environments.
- A media processing comparison system (“MPCS”) and techniques that facilitate subjective quality comparisons between media presentations produced by different media processing components in a particular operating environment are described herein. In one exemplary implementation, the media processing components are hardware, software, firmware, or combinations thereof, operating in a multimedia pipeline associated with a media content player, such as a Windows® multimedia pipeline operating in personal computing device. It will be appreciated, however, that any known or later developed framework may be used to implement a multimedia processing pipeline, and that any known or later developed media content player operating environment is possible.
- In operation, the MPCS receives a particular media content item from a particular media source. The particular media content item is generally arranged as an ordered stream of encoded media samples, such as video, audio, graphics, images, data, or combinations thereof, playable within a predetermined amount of time referred to as a “play duration.” In the exemplary implementation, the particular media content item is a video clip. Upon decoding, a particular encoded media sample (for example, a particular video frame) is playable to a user at a particular presentation time within the play duration.
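The relationship among media samples, presentation times, and the play duration can be sketched as follows. This is an illustrative model only; the fixed per-sample duration and the function name are assumptions, not part of the disclosure:

```python
def build_timeline(num_samples, sample_duration):
    """Assign each media sample a predetermined presentation time
    within the play duration (e.g., video frames at a fixed rate)."""
    presentation_times = [i * sample_duration for i in range(num_samples)]
    play_duration = num_samples * sample_duration
    return presentation_times, play_duration

# A hypothetical 4-frame clip at 4 frames per second:
times, duration = build_timeline(4, 0.25)
# times == [0.0, 0.25, 0.5, 0.75], duration == 1.0
```

Each decoded sample would then be scheduled for presentation at its predetermined time within the play duration.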
- The MPCS decodes a particular encoded media sample using two or more different instances of media processing components that perform the same functions (such as instances of parsers, encoder/decoder pairs (“codecs”), decryptors, demultiplexers, or combinations thereof supplied by different entities). Each different instance of media processing component(s) produces a decoded media sample corresponding to a particular encoded media sample. In the exemplary implementation, a particular media content item (or a particular encoded media sample thereof) is replicated to form multiple identical streams of encoded media samples, and a particular stream is input to and decoded by a particular instance of media processing component(s). Also in the exemplary implementation, decoded media samples produced by a particular instance of media processing component(s) are stored in (for example, written to) a queue. Each instance of media processing component(s) may store decoded media samples in a separate queue, or fewer queues (for example, a single queue) may be used.
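The replicate-and-decode arrangement described above can be sketched as follows. This is a minimal illustration in which each component instance is modeled as a plain callable, whereas real instances are hardware/software codecs, parsers, and the like:

```python
from collections import deque

def decode_with_instances(encoded_samples, instances):
    """Replicate the ordered encoded stream so that every component
    instance decodes each sample, and store each instance's decoded
    output in its own queue."""
    queues = [deque() for _ in instances]
    for sample in encoded_samples:           # same stream fed to all instances
        for decode, queue in zip(instances, queues):
            queue.append(decode(sample))     # one decoded sample per instance
    return queues

# Two stand-in "decoders" whose outputs can later be compared side by side:
q1, q2 = decode_with_instances(["f0", "f1"], [str.upper, str.lower])
# list(q1) == ["F0", "F1"], list(q2) == ["f0", "f1"]
```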
- A single renderer renders decoded media samples received from each instance of media processing component(s), producing rendered media samples. In the exemplary implementation, decoded media samples produced by a particular instance of media processing component(s) are received for rendering from a particular queue. The decoded media samples may be pushed or pulled from the particular queue. Generally, to ensure synchronicity, rendering does not occur until at least one decoded media sample (playable at or about the same presentation time) is available from each instance of media processing component(s).
- The renderer also coordinates the synchronous presentation, as separate media presentations, of the rendered media samples from each instance of media processing component(s). In the exemplary implementation, a presentation thread reads decoded media samples from the queue(s) after it is determined that the queue(s) contain at least one decoded media sample (playable at or about the same presentation time) from each instance of media processing component(s). Queue locks may be used to maintain consistency in the state of the queue(s) during read/write operations, and decoded media samples produced by a particular instance of media processing component(s) may be deleted after rendering and/or presentation has occurred.
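A minimal sketch of the synchronization rule just described: nothing is read for rendering until every instance's queue holds a sample playable at (or about) the same presentation time. The `(presentation_time, payload)` tuple layout and the tolerance parameter are assumptions made for illustration:

```python
from collections import deque

def take_synchronized(queues, tolerance=0.0):
    """Pop one decoded sample from each queue only when every queue has
    one and the head samples' presentation times agree within
    `tolerance`; otherwise leave the queues untouched and return None."""
    if any(not q for q in queues):
        return None                      # some instance has not caught up yet
    times = [q[0][0] for q in queues]    # peek at each head sample's time
    if max(times) - min(times) > tolerance:
        return None                      # not playable at about the same time
    return [q.popleft() for q in queues]

q1, q2 = deque([(0.0, "frame-a")]), deque()
take_synchronized([q1, q2])              # None: queue 2 is still empty
q2.append((0.0, "frame-b"))
pair = take_synchronized([q1, q2])       # one sample from each, both at t=0.0
```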
- The separate, synchronized media presentations are subjectively compared and/or selected for storage by a user in a sample-by-sample manner. In this manner, media presentation quality may be measured efficiently and in real-time, without time-consuming intermediate steps and storage, such as decoding and saving uncompressed decoded data into separate files and then serially comparing the uncompressed files.
- This Summary is provided to introduce a selection of concepts in a simplified form. The concepts are further described in the Detailed Description section. Elements or steps other than those described in this Summary are possible, and no element or step is necessarily required. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended for use as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
- FIG. 1 is a simplified functional block diagram of a media processing comparison system.
- FIG. 2 is a graphical illustration of an exemplary media timeline associated with the exemplary media content item shown in FIG. 1.
- FIG. 3 is a flowchart of a method for rendering media content using aspects of the media processing comparison system shown in FIG. 1.
- FIG. 4 is a simplified functional block diagram of an exemplary operating environment in which the media processing comparison system shown in FIG. 1 and/or the method shown in the flowchart of FIG. 3 may be implemented or used.
- To enhance investment in, and user satisfaction with, media content players, it is desirable to provide high-quality presentation of video, audio, and/or data content. Currently, when a user desires to subjectively evaluate video quality produced by different media content players or media processing components thereof, such evaluation is generally performed by capturing the output of the player and/or component under test and serially comparing it to the output of a reference player and/or component. Many time-consuming steps (such as decoding and saving uncompressed decoded data into separate files) are often necessary, followed by a serial comparison of the uncompressed files, and reliance on human memory often results in incorrect quality assessments. Although some known media content players can concurrently render and/or present more than one media content item (for example, a main movie along with features such as a director's commentary, actor biographies, or advertising), such media content players do not generally allow deinterlacing of multiple input pins and/or real-time presentation quality comparison.
- A media processing comparison system (“MPCS”) and techniques are described herein, which enable media presentation quality to be evaluated without time-consuming intermediate steps and storage, such as decoding and saving uncompressed decoded data into separate files and then serially comparing the uncompressed files. The MPCS and techniques facilitate concurrent, subjective quality comparisons between media presentations produced by different instances of media processing components performing the same functions (for example, instances of media processing components supplied by different entities) in a particular media content player. In one exemplary implementation, the media processing components include but are not limited to instances of parsers, encoder/decoder pairs (“codecs”), decryptors, demultiplexers, or combinations thereof, in the form of hardware, software, firmware, or combinations thereof.
- Turning to the drawings, where like numerals designate like components,
FIG. 1 is a simplified functional block diagram of an exemplary implementation of an MPCS 100. In general, design choices dictate how specific functions of MPCS 100 are implemented. Such functions may be implemented using hardware, software, firmware, or combinations thereof. - In the exemplary implementation, MPCS 100 is, or operates within, a multimedia pipeline associated with a media content player, such as a Windows® multimedia pipeline operating in a personal computing device. It will be appreciated, however, that any known or later developed framework may be used to implement a multimedia processing pipeline, and that any known or later developed media content player operating environment is possible. Examples of media content players and operating environments therefor include but are not limited to: optical media players; operating systems for computing devices; personal media players; set-top boxes; servers; personal computers; video recorders; mobile phones; personal digital assistants; and gaming devices.
- As shown in
FIG. 1, MPCS receives a media content item 105 from a media source 102, performs processing tasks using two or more instances (instances 1 through N are shown) of media processing component(s) 104, and uses a renderer 106 to prepare the media content item for presentation to a user (not shown) as two or more synchronized media presentations 157 (media presentations 1 through N, corresponding to instances 1 through N of media processing component(s), are shown). -
Media source 102 represents any device, location, or data from which media content item 105 is derived or obtained. Examples of media sources 102 include but are not limited to optical media, hard drives, network locations, over-the-air transmissions, and other sources. In general, any computer-readable medium may serve as media source 102 (computer-readable media 404 are shown and discussed further below, in connection with FIG. 4). -
Media content item 105 represents an ordered set (for example, a stream) of digital information (such as video, audio, graphics, images, data, or combinations thereof) arranged into a number of individually-presentable units, referred to herein as “media samples.” In an exemplary scenario, a particular media content item 105 is a video clip having sequentially ordered frames. As shown, prior to processing, the media samples are referred to as encoded media samples 107; after processing via instances of media processing component(s) 104 (discussed further below), the media samples are referred to as decoded media samples 117; after rendering via renderer 106 (discussed further below), the media samples are referred to as rendered media samples 127; and after presentation via presentation device 120 (also discussed further below) and user selection, the media samples are referred to as user-designated media samples 166. It will be appreciated, however, that the naming convention(s) used herein is/are for illustrative purposes only, and that any desired naming conventions may be used. It will also be appreciated that sequences of individually-presentable units and the digital information therein may be grouped in any desirable manner, and represented by any desired units, for example, bits, frames, data packets, groups of pictures, enhanced video object units, etc. It will further be appreciated that the digital information or amount thereof within a particular media sample may be based on several factors, such as the characteristics of the video, audio, graphics, images, or data that forms the media sample, or one or more parameters associated with media source 102 from which the media sample is derived (for example, media source identity and/or location, codec parameters or settings, or encryption parameters or settings). - With continuing reference to
FIG. 1, brief reference is made to FIG. 2, which is a graphical illustration of an exemplary media timeline 200 associated with a particular media content item 105 having a number of individually-presentable units referred to as media samples 201. Media content item 105 has a predetermined play duration 205, which represents a particular amount of time in which the media content item is playable to a user. Each media sample 201 has an associated, predetermined presentation time 202 within the play duration. To avoid user-perceptible glitches in a media presentation, one or more upcoming media samples 201 may be prepared for presentation in advance of the scheduled/predetermined presentation time(s) 202. - Referring again to
FIG. 1, media processing component(s) blocks 104 (1 through N are shown) represent any devices, instructions, or techniques used to retrieve video, audio, graphics, images, and/or data content from encoded media samples 107 received from media source 102, and to produce decoded media samples 117. Media processing component(s) perform functions that may include (but are not limited to) one or more of the following: parsing; encoding/decoding; decryption; demultiplexing; or a combination of the foregoing. Media processing component(s) may be, for example, parsers, codecs, decryptors, demultiplexers, or combinations thereof, that are implemented using hardware, software, firmware, or any combination thereof. - Generally, each instance of media processing component(s) 104 performs the same function(s) to produce a decoded
media sample 117 corresponding to a particular encoded media sample 107 playable at a particular presentation time 202. Particular instances of media processing component(s) 104 may be different in various ways, including but not limited to: being supplied by different entities; operating in accordance with different standards; having different settings; or being implemented in different forms (for example, hardware, software, or firmware). In one exemplary implementation, media content item 105 or individual media samples thereof is/are replicated in such a manner that there are encoded media samples 107 concurrently available for input to each instance of media processing component(s) 104. It will be understood that when encoded media samples are input to an instance of media processing component(s), processing/decoding order and output order may be different. - Decoded
media samples 117 produced by a particular instance of media processing component(s) 104 are stored in one or more queues (two queues are shown, queue 1 140 and queue N 145). As shown, each instance of media processing component(s) 104 stores decoded media samples 117 in a separate queue, although fewer queues (for example, one queue) may be used. Decoded media samples 117 may be pushed onto the queues (by instances of media processing component(s), for example) or alternatively may be pulled onto the queues (by renderer 106, for example). Queue locks (queue locks 141 and 142 are shown) may be used to maintain consistency in the state of the queues during read/write operations. -
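The queue locks just mentioned can be modeled with standard threading primitives. This is only an illustrative sketch; the `LockedQueue` class and its method names are invented for the example and are not the patented implementation:

```python
import threading
from collections import deque

class LockedQueue:
    """Decoded-sample queue whose state stays consistent while component
    instances write (push) and the renderer reads (pop) concurrently."""
    def __init__(self):
        self._samples = deque()
        self._lock = threading.Lock()

    def push(self, sample):
        with self._lock:              # writer holds the queue lock
            self._samples.append(sample)

    def pop(self):
        with self._lock:              # reader holds the same lock
            return self._samples.popleft() if self._samples else None

    def __len__(self):
        with self._lock:
            return len(self._samples)
```

A component instance would call `push` from its own thread while the rendering thread calls `pop`; holding the lock for each operation guarantees that every read or write observes a consistent queue state.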
Renderer 106 represents any device, instructions (for example, media rendering instructions 110, which as shown include rendering thread 111 and presentation thread 113), or technique used to: receive decoded media samples 117 from instances of media processing component(s) 104; prepare the decoded media samples for rendering; and produce rendered media samples 127. Rendered media samples 127 represent information that has been prepared for presentation at or about a particular presentation time 202 by renderer 106. - In the exemplary implementation, decoded
media samples 117 produced by a particular instance of media processing component(s) are received for rendering from a particular queue via rendering thread 111. Renderer 106 may have multiple input pins. Queue locks 141 and 142 may be used to maintain consistency in the state of the queues during read/write operations. One manner of maintaining synchronicity is for rendering thread 111 to ensure that rendering of decoded media samples to produce rendered media samples 127 does not occur until at least one decoded media sample 117 (playable at or about the same presentation time 202) is available from each instance of media processing component(s) 104. Another manner of maintaining synchronicity is to ensure that media samples are not dropped, that is, to ensure that the number of decoded media samples received from each instance of media processing component(s) is the same. Thus, it is not generally necessary for rendering thread 111 to transmit quality messages to upstream media processing components. -
Presentation thread 113 coordinates the synchronous presentation of rendered media samples 127 associated with each instance of media processing component(s) 104. Generally, rendered media samples 127 associated with different instances of media processing component(s) 104 are presented to a user as separate, synchronous media presentations 157 (media presentations 1 through N are shown) via one or more presentation devices 120 such as displays and/or speakers. A particular media presentation 157 represents the visible and/or audible information that is produced based on rendered media samples 127 by presentation device 120 and perceivable by the user. - A
particular media presentation 157 is generally presented on a sample-by-sample basis, synchronized by presentation times 202 with the media presentations associated with other instances of media processing component(s). One manner of maintaining synchronicity is for presentation thread 113 to ensure that presentation of rendered media samples 127 does not occur until at least one rendered media sample (playable at the same presentation time 202) associated with each instance of media processing component(s) is available. In the exemplary implementation, this may be accomplished via an interaction of rendering thread 111 with the queues, and of presentation thread 113 with the queues, which enables presentation thread 113 to check whether particular queues are empty or have decoded and/or rendered samples to be presented. Upon presentation, decoded and/or rendered media samples 127 may be removed from the queues. - When
media source 102 reaches the end of a particular media content item 105 (or when a user stops, pauses, or jumps to specific samples within media presentation(s)), one or more events reach renderer 106. In the exemplary implementation, certain events indicate to rendering thread 111 that it should not expect to receive additional decoded media samples 117 from one or more instances of media processing component(s) 104, and rendering thread 111 may notify presentation thread 113. It is desirable for presentation thread 113 to wait to exit until it has received notice from rendering thread 111 that it should not expect to receive rendered media samples 127 associated with any of the instances of media processing component(s) 104. - Optionally, the user may designate desired media samples (referred to as “user-designated
media samples 166”) that are used to form an edited media content file 121. For example, the user may compare media presentations 157 on a sample-by-sample basis and designate desired rendered media samples 127 for inclusion in (or drop rendered media samples from) edited media content file 121. Edited media content file 121 is generally an uncompressed file that may be stored in temporary or persistent memory. - In this manner, a user may efficiently and in real-time subjectively evaluate and compare media presentation quality, and thus evaluate performance of different media processing components, without extensive and time-consuming intermediate steps and storage.
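The end-of-stream handshake described above (the rendering side signals that no further samples will arrive, and the presentation side exits only after receiving that notice) is commonly modeled with a sentinel value. The names below are assumptions for illustration, not the disclosed event mechanism:

```python
from collections import deque

EOS = object()  # sentinel standing in for the "no more samples" notice

def present_until_eos(queue, present):
    """Consume rendered samples until the end-of-stream notice arrives;
    the presentation side exits only after seeing the notice."""
    presented = 0
    while queue:
        item = queue.popleft()
        if item is EOS:
            break                     # notice received: safe to exit
        present(item)
        presented += 1
    return presented

shown = []
count = present_until_eos(deque(["s0", "s1", EOS]), shown.append)
# count == 2, shown == ["s0", "s1"]
```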
- With continuing reference to
FIGS. 1 and 2, FIG. 3 is a flowchart illustrating certain aspects of a method for rendering media content, such as media content item 105, using aspects of MPCS 100. The method(s) illustrated in FIG. 3 may be implemented using computer-executable instructions executed by one or more general, multi-purpose, or single-purpose processors (exemplary computer-executable instructions 406 and processor 402 are discussed further below, in connection with FIG. 4). It will be appreciated that the method of FIG. 3 is exemplary in nature, and that the subject matter defined in the claims is not necessarily limited to the specific features or acts described herein. Rather, such features and acts are disclosed as example forms of implementing the claims. Unless specifically stated, the methods described herein are not constrained to a particular order or sequence. In addition, some of the described methods or elements thereof can occur or be performed concurrently. It will be understood that all of the steps shown need not occur in performance of the functions described herein. - The method begins at
block 300 and continues at block 302, where a media timeline associated with a particular media content item, such as media timeline 200, is ascertained. Based on the media timeline, as indicated at the subsequent blocks, a particular presentation time 202 and a corresponding media sample, such as a particular encoded media sample 107, are identified. Next, the media sample is decoded using two or more instances of one or more media processing components that perform the same functions (such as parsers, codecs, decryptors, demultiplexers, or combinations thereof) to produce decoded media samples, such as decoded media samples 117, which are stored (in separate queues, for example). - At
diamond 310, it is determined whether a decoded media sample associated with each instance of media processing component(s) is present (for example, whether a decoded media sample is present in each queue). If not, it is determined at block 312 whether the end of the stream has been reached or playback has otherwise ended (in which case the method exits, as indicated at block 314); otherwise, the method returns to diamond 310 until it is determined that a decoded media sample associated with each instance of media processing component(s) is present. - As indicated at
block 316, the decoded media samples are rendered (to produce rendered media samples 127, for example) and synchronously presented (in a sample-by-sample manner as media presentations 157, for example). As discussed above in connection with FIG. 1, separate software or hardware threads, such as rendering thread 111 and presentation thread 113, may be used to ensure synchronization between media presentations associated with different instances of media processing component(s). Prior to presentation of rendered media samples, it may be desirable to wait until a rendered media sample associated with each instance of media processing component(s) is present; this may be accomplished using the same (or additional) queues as described in connection with block 308 and techniques as described in connection with diamond 310. - As indicated at
block 318, a user may efficiently and in real-time subjectively evaluate and compare media presentation quality on a sample-by-sample basis, and thus evaluate performance of different media processing components, without extensive and time-consuming intermediate steps and storage. - With continuing reference to
FIGS. 1 through 3, FIG. 4 is a simplified block diagram of an exemplary operating environment 400 in which aspects of MPCS 100 and/or the method(s) shown in FIG. 3 may be implemented or used. Operating environment 400 is generally indicative of a wide variety of general-purpose or special-purpose computing environments, and is not intended to suggest any limitation as to the scope of use or functionality of the system(s) and methods described herein. For example, operating environment 400 may be an electronic device such as a mobile phone, a server, a gaming device, a personal digital assistant, a personal computer, a personal media player, a computer/television device, a set-top box, a hard-drive storage device, a video recorder, an optical media player, a device temporarily or permanently mounted in transportation equipment such as a wheeled vehicle, a plane, or a train, or another type of known or later developed electronic device. - As shown, operating
environment 400 includes processor(s) 402, computer-readable media 404, computer-executable instructions 406, user interface(s) 416, communication interface(s) 410, and specialized hardware/firmware 442. One or more buses 421 or other communication media may be used to carry data, addresses, control signals, and other information within, to, or from operating environment 400 or elements thereof. -
Processor 402, which may be a real or a virtual processor, controls functions of the operating environment by executing computer-executable instructions 406. The processor may execute instructions at the assembly, compiled, or machine-level to perform a particular process. - Computer-
readable media 404 may represent any number and combination of local or remote devices, in any form, now known or later developed, capable of recording, storing, or transmitting computer-readable data. In particular, computer-readable media 404 may be, or may include, a semiconductor memory (such as a read only memory (“ROM”), any type of programmable ROM (“PROM”), a random access memory (“RAM”), or a flash memory, for example); a magnetic storage device (such as a floppy disk drive, a hard disk drive, a magnetic drum, a magnetic tape, or a magneto-optical disk); an optical storage device (such as any type of compact disk or digital versatile disk); a bubble memory; a cache memory; a core memory; a holographic memory; a memory stick; or any combination thereof. The computer-readable media may also include transmission media and data associated therewith. Examples of transmission media/data include, but are not limited to, data embodied in any form of wireline or wireless transmission, such as packetized or non-packetized data carried by a modulated carrier signal. - Computer-
executable instructions 406 represent any signal processing methods or stored instructions. Generally, computer-executable instructions 406 are implemented as software components according to well-known practices for component-based software development, and encoded in computer-readable media. Computer programs may be combined or distributed in various ways. Computer-executable instructions 406, however, are not limited to implementation by any specific embodiments of computer programs, and in other instances may be implemented by, or executed in, hardware, software, firmware, or any combination thereof. - User interface(s) 416 represents the combination of physical or logical presentation tools and controls that define the way a user interacts with a particular application or device, such as
MPCS 100. Presentation tools are used to provide output to a user. An example of a physical presentation tool is presentation device 120 (such as a display or speaker). Another example of a physical presentation tool is printed material on a surface such as paper, glass, metal, etc. An example of a logical presentation tool is a data organization technique (for example, a window, a menu, or a layout thereof). Controls facilitate the receipt of input from a user. An example of a physical control is an input device such as a remote control, a display, a mouse, a pen, a stylus, a trackball, a keyboard, a microphone, or a scanning device. An example of a logical control is a data organization technique (for example, a window, a menu, or a layout thereof) via which a user may issue commands. It will be appreciated that the same physical device or logical construct may function to provide outputs to, and receive inputs from, a user. - Communication interface(s) 410 represent one or more physical or logical elements, such as connectivity devices or computer-executable instructions, which enable communication between
operating environment 400 and external devices or services, via one or more protocols or techniques. Such communication may be, but is not necessarily, client-server type communication or peer-to-peer communication. Information received at a given network interface may traverse one or more layers of a communication protocol stack. -
Specialized hardware 442 represents any hardware or firmware that implements functions of operating environment 400. Examples of specialized hardware include media processing components 104 or aspects thereof, application-specific integrated circuits, clocks, and the like. - It will be appreciated that particular configurations of operating
environment 400 may include fewer, more, or different components or functions than those described. In addition, functional components of operating environment 400 may be implemented by one or more devices, which are co-located or remotely located, in a variety of ways. - Functions/components described herein as being computer programs are not limited to implementation by any specific embodiments of computer programs. Rather, such functions/components are processes that convey or transform data, and may generally be implemented by, or executed in, hardware, software, firmware, or any combination thereof.
- It will be understood that when one element is indicated as being responsive to another element, the elements may be directly or indirectly coupled. Connections depicted herein may be logical or physical in practice to achieve a coupling or communicative interface between elements. Connections may be implemented, among other ways, as inter-process communications among software processes, or inter-machine communications among networked computers.
- The word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any implementation or aspect thereof described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations or aspects thereof.
- As it is understood that embodiments other than the specific embodiments described above may be devised without departing from the spirit and scope of the appended claims, it is intended that the scope of the subject matter herein will be governed by the following claims.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/708,232 US20130151972A1 (en) | 2009-07-23 | 2012-12-07 | Media processing comparison system and techniques |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/507,875 US8351768B2 (en) | 2009-07-23 | 2009-07-23 | Media processing comparison system and techniques |
US13/708,232 US20130151972A1 (en) | 2009-07-23 | 2012-12-07 | Media processing comparison system and techniques |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/507,875 Continuation US8351768B2 (en) | 2009-07-23 | 2009-07-23 | Media processing comparison system and techniques |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130151972A1 (en) | 2013-06-13 |
Family
ID=43496894
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/507,875 Expired - Fee Related US8351768B2 (en) | 2009-07-23 | 2009-07-23 | Media processing comparison system and techniques |
US13/708,232 Abandoned US20130151972A1 (en) | 2009-07-23 | 2012-12-07 | Media processing comparison system and techniques |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/507,875 Expired - Fee Related US8351768B2 (en) | 2009-07-23 | 2009-07-23 | Media processing comparison system and techniques |
Country Status (3)
Country | Link |
---|---|
US (2) | US8351768B2 (en) |
CN (1) | CN102473088B (en) |
WO (1) | WO2011011180A2 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8351768B2 (en) * | 2009-07-23 | 2013-01-08 | Microsoft Corporation | Media processing comparison system and techniques |
US8943020B2 (en) * | 2012-03-30 | 2015-01-27 | Intel Corporation | Techniques for intelligent media show across multiple devices |
US9674255B1 (en) * | 2014-03-26 | 2017-06-06 | Amazon Technologies, Inc. | Systems, devices and methods for presenting content |
CN110582025B (en) * | 2018-06-08 | 2022-04-01 | 北京百度网讯科技有限公司 | Method and apparatus for processing video |
Citations (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5677980A (en) * | 1995-06-20 | 1997-10-14 | Nippon Steel Corporation | Decoder for compressed digital signal |
US5920356A (en) * | 1995-06-06 | 1999-07-06 | Compressions Labs, Inc. | Coding parameter adaptive transform artifact reduction process |
US6057882A (en) * | 1996-10-29 | 2000-05-02 | Hewlett-Packard Company | Testing architecture for digital video transmission system |
US6269482B1 (en) * | 1997-07-14 | 2001-07-31 | Altinex, Inc. | Methods of testing electrical signals and compensating for degradation |
KR20010102899A (en) * | 2001-10-23 | 2001-11-17 | 박영남 | Apparatus and method for implementing multi-display of mpeg2 file in mpeg2 file reproducing system |
US6400400B1 (en) * | 1997-07-30 | 2002-06-04 | Sarnoff Corporation | Method and apparatus for automated testing of a video decoder |
US20020129374A1 (en) * | 1991-11-25 | 2002-09-12 | Michael J. Freeman | Compressed digital-data seamless video switching system |
US6477204B1 (en) * | 1997-12-19 | 2002-11-05 | Kabushiki Kaisha Toshiba | Video image decoding method and apparatus |
US6603505B1 (en) * | 1999-09-10 | 2003-08-05 | Kdd Corporation | Device for objectively evaluating the quality of a digital transmitted picture |
US20030174349A1 (en) * | 2002-03-15 | 2003-09-18 | Eastman Kodak Company | Preview function in a digital data preservation system |
US20040174384A1 (en) * | 2002-12-03 | 2004-09-09 | Pioneer Corporation | Image signal processing apparatus and method |
EP1548730A2 (en) * | 2003-12-26 | 2005-06-29 | Humax Co., Ltd. | Method for establishing recording quality in digital recording device |
US20050180511A1 (en) * | 2004-02-13 | 2005-08-18 | Kabushiki Kaisha Toshiba | H. 264 codec IC, DVD playback apparatus, H. 264 codec method |
JP2006166130A (en) * | 2004-12-08 | 2006-06-22 | Shinano Kenshi Co Ltd | Quality evaluation support device and quality evaluation support program for compressed video information |
US20060282867A1 (en) * | 2005-06-13 | 2006-12-14 | Yoshiaki Mizuhashi | Image processing apparatus capable of adjusting image quality by using moving image samples |
US20070047542A1 (en) * | 2005-08-30 | 2007-03-01 | Microsoft Corporation | Real-time audio-visual quality monitoring in a network |
US20070274676A1 (en) * | 2004-09-10 | 2007-11-29 | Giuseppe Diomelli | Method and Apparatus For Unified Management Of Different Type Of Communications Over Lanwan And Internet Networks, Using A Web Browser |
US20080043031A1 (en) * | 2006-08-15 | 2008-02-21 | Ati Technologies, Inc. | Picture adjustment methods and apparatus for image display device |
CN101453571A (en) * | 2007-12-04 | 2009-06-10 | 康佳集团股份有限公司 | Method and equipment for television image effect comparison |
US20090322774A1 (en) * | 2006-08-03 | 2009-12-31 | Kenichiro Hosoi | Image display control device, image processing device, image display control method, its program, and recording medium with the program recorded therein |
US20100110199A1 (en) * | 2008-11-03 | 2010-05-06 | Stefan Winkler | Measuring Video Quality Using Partial Decoding |
US20100223649A1 (en) * | 2009-03-02 | 2010-09-02 | Jason Robert Suitts | Automated Assessment of Digital Video Encodings |
US20100260271A1 (en) * | 2007-11-16 | 2010-10-14 | Thomson Licensing Llc. | System and method for encoding video |
US7974485B1 (en) * | 2005-10-27 | 2011-07-05 | Nvidia Corporation | Split-frame post-processing in a programmable video pipeline |
US20120076471A1 (en) * | 2005-10-11 | 2012-03-29 | Apple Inc. | Image capture and manipulation |
US8321907B2 (en) * | 2004-02-09 | 2012-11-27 | Panasonic Corporation | Broadcast receiving apparatus, broadcast receiving method, broadcast receiving program, and broadcast receiving circuit |
US8351768B2 (en) * | 2009-07-23 | 2013-01-08 | Microsoft Corporation | Media processing comparison system and techniques |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1255406B1 (en) * | 2000-04-21 | 2007-02-07 | Matsushita Electric Industrial Co., Ltd. | Trick play apparatus for digital storage medium |
US6748395B1 (en) | 2000-07-14 | 2004-06-08 | Microsoft Corporation | System and method for dynamic playlist of media |
US7023471B2 (en) | 2001-10-31 | 2006-04-04 | Intel Corporation | Video quality assessment with test video sequences |
US20040139173A1 (en) | 2002-12-11 | 2004-07-15 | Jeyhan Karaoguz | Media processing system automatically offering access to newly available media in a media exchange network |
JP4649865B2 (en) * | 2003-11-06 | 2011-03-16 | ソニー株式会社 | Information processing apparatus, information recording medium, information processing method, and computer program |
US7391434B2 (en) | 2004-07-27 | 2008-06-24 | The Directv Group, Inc. | Video bit stream test |
US7259779B2 (en) | 2004-08-13 | 2007-08-21 | Microsoft Corporation | Automatic assessment of de-interlaced video quality |
JP4408845B2 (en) * | 2005-07-27 | 2010-02-03 | シャープ株式会社 | Video composition apparatus and program |
JP2007066012A (en) | 2005-08-31 | 2007-03-15 | Toshiba Corp | Apparatus, method and program for drawing image |
KR100731358B1 (en) | 2005-11-09 | 2007-06-21 | 삼성전자주식회사 | Method and system for measuring the video quality |
WO2007078227A1 (en) * | 2006-01-05 | 2007-07-12 | Telefonaktiebolaget Lm Ericsson (Publ) | Media content management |
US8380864B2 (en) | 2006-12-27 | 2013-02-19 | Microsoft Corporation | Media stream slicing and processing load allocation for multi-user media systems |
US7961936B2 (en) * | 2007-03-30 | 2011-06-14 | Intel Corporation | Non-overlap region based automatic global alignment for ring camera image mosaic |
2009
- 2009-07-23 US US12/507,875 patent/US8351768B2/en not_active Expired - Fee Related
2010
- 2010-07-01 CN CN201080032993.0A patent/CN102473088B/en not_active Expired - Fee Related
- 2010-07-01 WO PCT/US2010/040769 patent/WO2011011180A2/en active Application Filing
2012
- 2012-12-07 US US13/708,232 patent/US20130151972A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
Park, machine-generated translation of KR20010102899, 11/17/2001. * |
Also Published As
Publication number | Publication date |
---|---|
CN102473088B (en) | 2014-08-13 |
WO2011011180A2 (en) | 2011-01-27 |
WO2011011180A3 (en) | 2011-03-24 |
US20110018889A1 (en) | 2011-01-27 |
US8351768B2 (en) | 2013-01-08 |
CN102473088A (en) | 2012-05-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9990350B2 (en) | Videos associated with cells in spreadsheets | |
KR101530101B1 (en) | Interfaces for digital media processing | |
US10528631B1 (en) | Media data presented with time-based metadata | |
US10580459B2 (en) | Dynamic media interaction using time-based metadata | |
US9548950B2 (en) | Switching camera angles during interactive events | |
US7500175B2 (en) | Aspects of media content rendering | |
US7721308B2 (en) | Synchronization aspects of interactive multimedia presentation management | |
US20090193345A1 (en) | Collaborative interface | |
US20060236219A1 (en) | Media timeline processing infrastructure | |
KR20070121728A (en) | Media timeline sorting | |
WO2022170836A1 (en) | Method and apparatus for processing track data of multimedia file, and medium and device | |
KR20090082888A (en) | Timing aspects of media content rendering | |
US20130151972A1 (en) | Media processing comparison system and techniques | |
US10084840B2 (en) | Social networking with video annotation | |
CN112015927A (en) | Multimedia file editing method and device, electronic equipment and storage medium | |
CN106792219B (en) | Method and device for live broadcast review | |
US9542922B2 (en) | Method for inserting watermark to image and electronic device thereof | |
McCune | Learning AV Foundation: A hands-on guide to mastering the AV foundation framework | |
US20230377606A1 (en) | Video editing projects using single bundled video files | |
WO2006030995A1 (en) | Index-based authoring and editing system for video contents | |
CN101826093A (en) | Video search engine based on multi-dimensional content mining | |
KR20130050539A (en) | Mobile terminal and system for providing a sound source, and method for providing a sound source | |
WO2018005569A1 (en) | Videos associated with cells in spreadsheets | |
WO2022156646A1 (en) | Video recording method and device, electronic device and storage medium | |
CN104485123A (en) | Metro line operation scene replay method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: MICROSOFT CORPORATION; REEL/FRAME: 034544/0541
Effective date: 20141014

AS | Assignment |
Owner name: MICROSOFT CORPORATION, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: DALAL, FIROZ; SADHWANI, SHYAM; REEL/FRAME: 035781/0684
Effective date: 20090720
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: MICROSOFT CORPORATION; REEL/FRAME: 035781/0698
Effective date: 20141014
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |