US20160021418A1 - Displaying Heterogeneous Video - Google Patents

Displaying Heterogeneous Video

Info

Publication number
US20160021418A1
Authority
US
United States
Prior art keywords
display
video
sources
display screen
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/870,949
Inventor
Oleg Rashkovskiy
Scott A. Rosenberg
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US14/870,949
Publication of US20160021418A1
Legal status: Abandoned

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44209Monitoring of downstream path of the transmission network originating from a server, e.g. bandwidth variations of a wireless network
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4316Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network
    • H04N21/43637Adapting the video or multiplex stream to a specific local network, e.g. a IEEE 1394 or Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4438Window management, e.g. event handling following interaction with the user interface
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4622Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/482End-user interface for program selection
    • H04N21/4821End-user interface for program selection using a grid, e.g. sorted out by channel and broadcast time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • H04N21/4858End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows

Abstract

Heterogeneous video may be independently encoded in a processor-based system and transmitted for display on a display device. In the display device, the independent video streams may be de-packetized for display on the same display at different frame rates. Thus, each of the video sources may be displayed in a separate portion of the display at its native frame rate. For example, an electronic programming guide may be transmitted for display at one frame rate and displayed in a distinct region on the display screen while video corresponding to an ongoing television program may be displayed in another portion of the display screen at its native frame rate.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a divisional of U.S. patent application Ser. No. 12/006,011, filed Dec. 28, 2007, which is a divisional of U.S. patent application Ser. No. 09/522,053, filed Mar. 9, 2000, and issued as U.S. Pat. No. 7,337,463 on Feb. 26, 2008.
  • BACKGROUND
  • This invention relates generally to video display devices and particularly to display devices that display heterogeneous video.
  • Heterogeneous video is video from disparate sources which is intended to be displayed on a single video display device. The video may be graphics or streaming video normally associated with television programming. The graphics may come, for example, from an associated processor-based system for display on a display device which also receives other video sources. The heterogeneous video may also include video from a playback device such as a video cassette recorder or digital versatile disk player, games, and applications like e-mail, web browsers and word processors.
  • Conventionally, heterogeneous video is displayed on a single display device by mixing the disparate content within a processor-based system and then coding the disparate content to a single common video output signal for interface to the display. Inevitably, such an output represents an awkward compromise between different ideal representations for each of the sources and limitations imposed by the actual display device.
  • For example, the simultaneous display of a first video at twenty-four bits per pixel and sixty frames per second with a second video at sixteen bits per pixel and sixty frames per second may require that both videos be converted into a common output format, for example, of twenty-four bits per pixel and sixty frames per second. While this output format may be advantageous for the first video, it amounts to an over-representation of the second video, which only requires sixteen bits per pixel. This “up-conversion” of the second video source unnecessarily increases the amount of bandwidth required to transport the second source to the display.
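  • As a rough, back-of-the-envelope illustration of the bandwidth cost described above, the sketch below compares the raw rate of a single stream before and after up-conversion; the 1280×720 resolution is an assumed example figure, not taken from this disclosure.

```python
# Raw-rate comparison for the example above. The 1280x720 resolution is an
# assumed, illustrative figure; only the bit depths and frame rate come from the text.
WIDTH, HEIGHT, FPS = 1280, 720, 60

def stream_mbps(bits_per_pixel, width=WIDTH, height=HEIGHT, fps=FPS):
    """Uncompressed bandwidth of one video stream, in megabits per second."""
    return width * height * bits_per_pixel * fps / 1e6

native = stream_mbps(16)        # second video carried at its native 16 bpp
upconverted = stream_mbps(24)   # same video forced into the common 24 bpp format

print(f"16 bpp native:       {native:.0f} Mb/s")
print(f"24 bpp up-converted: {upconverted:.0f} Mb/s")
print(f"extra bandwidth:     {upconverted - native:.0f} Mb/s")
```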
  • The unnecessary additional bandwidth and the sub-optimal representation of at least one of the video sources may result from the practice of adapting heterogeneous video sources to a single format. Thus, there is a need for a way of handling heterogeneous video which does not unnecessarily waste bandwidth or diminish the representation of some (if not all) of the video from the various video sources.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1A and 1B are schematic depictions of one embodiment of the present invention including a computing device which communicates with a display device;
  • FIG. 2 is a screen display for the embodiment shown in FIGS. 1A and 1B in accordance with one embodiment of the present invention;
  • FIG. 3 is a schematic depiction corresponding to FIG. 1B for still another embodiment of the present invention;
  • FIG. 4 is a schematic cross-sectional view of one embodiment of the display in accordance with the present invention;
  • FIG. 5 is a more detailed, enlarged cross-sectional view of the display of the type shown in FIG. 4;
  • FIG. 6 is a schematic diagram of one cell of the display in accordance with another embodiment of the present invention;
  • FIG. 7 is a schematic diagram of a display implementing one embodiment of the present invention; and
  • FIG. 8 is a flow chart for software in accordance with one embodiment of the present invention.
  • DETAILED DESCRIPTION
  • Referring to FIG. 1, a computing device 12 may communicate over an interface 34 with a display device 14. In one embodiment of the present invention, the interface 34 is a wireless interface such as a radio wave or infrared interface. However, other interfaces including wired connections may be utilized as well.
  • In one embodiment of the present invention, the computing device 12 may be a set-top box that communicates with a display device 14 that is a television receiver. However, the computing device 12 may be any processor-based device including a desktop computer, a laptop computer, or a processor-based appliance.
  • In accordance with one embodiment of the present invention, when a heterogeneous video is intended to be displayed on the display device 14, each of the disparate video streams (such as the sources 1-4 in FIG. 1) that make up the heterogeneous video may be conveyed from the computing device 12 to the display device 14 independently. These video streams come from a variety of sources, including a television signal or graphics generated by the processing unit of the computing device 12.
  • The computing device 12 may include a central processing unit 18, graphics or other processing units 26, a television receiver 22, and other input devices 24 coupled by a bus 16. A system memory 20 may also be coupled to the bus 16. A video controller 28, coupled to the bus 16, provides a series of independent video sources, indicated as sources 1-4 in FIG. 1, to an arbitration and packetization unit 30.
  • As an example, a first video stream (source 1) may come from a terrestrial or satellite television system while a second video stream (source 2) may represent graphical information describing the first video source and generated locally by the computing device 12. These graphics may be a television programming guide also known as an electronic content guide or an electronic programming guide (EPG).
  • The arbitration and packetization unit 30 may simultaneously drive independent video sources at their natural rates onto different portions of the display screen of a display device 14. Thus, in the example described above, the video information may be displayed on one portion of the display 14 at a native frame rate of sixty Hertz while the graphical information may be updated less frequently, for example at twenty-five Hertz.
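  • A minimal sketch of pacing two sources at their native update rates (sixty Hertz video, twenty-five Hertz graphics) is given below; the class name, the one-millisecond tick and the one-second simulation window are illustrative assumptions, not details from this disclosure.

```python
# Sketch of pacing two sources at their native rates. All names and the timing
# granularity are assumptions made only for illustration.
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    rate_hz: float
    next_due: float = 0.0   # time (seconds) at which the next frame should be sent

def frames_due(sources, now):
    """Return the names of the sources whose next frame is due at time `now`."""
    due = []
    for s in sources:
        if now >= s.next_due:
            due.append(s.name)
            s.next_due += 1.0 / s.rate_hz
    return due

sources = [Source("television program", 60.0), Source("EPG graphics", 25.0)]
counts = {s.name: 0 for s in sources}
for tick in range(1000):                       # simulate one second in 1 ms steps
    for name in frames_due(sources, tick / 1000.0):
        counts[name] += 1
print(counts)                                  # {'television program': 60, 'EPG graphics': 25}
```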
  • Each of the video sources may be converted into packets by the unit 30 for more efficient transport by a transport modulation unit 32. If more sources are available than the bandwidth permits, the unit 30 arbitrates which sources are transmitted during a given interval.
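  • The following sketch illustrates one plausible reading of the packetize-then-arbitrate step: frames are split into fixed-size packets and, when an interval's budget would be exceeded, lower-priority sources are deferred. The packet layout, the payload size, the byte budget and the priority rule are assumptions rather than details from this disclosure.

```python
# Sketch of the packetize-then-arbitrate step; constants and the priority rule are assumed.
PACKET_PAYLOAD = 1024  # bytes per packet (assumed)

def packetize(source_id, frame_bytes):
    """Split one frame into sequence-numbered packets tagged with their source."""
    return [
        {"source": source_id, "seq": i // PACKET_PAYLOAD, "payload": frame_bytes[i:i + PACKET_PAYLOAD]}
        for i in range(0, len(frame_bytes), PACKET_PAYLOAD)
    ]

def arbitrate(frames, budget_bytes):
    """Queue whole frames, highest priority first, until the interval's budget is spent."""
    queued = []
    for source_id, frame in sorted(frames.items()):   # lower source id = higher priority (assumed)
        packets = packetize(source_id, frame)
        size = sum(len(p["payload"]) for p in packets)
        if size <= budget_bytes:
            queued.extend(packets)
            budget_bytes -= size
    return queued

frames = {1: bytes(4096), 2: bytes(2048)}  # dummy frames for sources 1 and 2
print(len(arbitrate(frames, budget_bytes=5000)), "packets handed to the transport modulator")
```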
  • In the display device 14, the received packets from the interface 34 are demodulated by a transport demodulation unit 36. The demodulated packets are then depacketized in the depacketization unit 38. The unit 38 recreates the separate video sources which are then converted by the conversion unit 40 into an appropriate format for display on the display element 42.
  • In one embodiment of the present invention, the display screen 42 may use liquid crystal over semiconductor (LCOS) imaging devices with embedded storage and other processing elements. The conversion unit 40 receives the independent video signals, converts the signals to a format compatible with its LCOS display element or screen 42 and then drives the signals onto the LCOS display screen 42. By incorporating an integrated storage into the LCOS display screen 42, as described hereinafter, the video signals may be driven onto the display element 42 in a random access pattern similar to that employed when driving new information into traditional memory technologies such as dynamic random access memory (DRAM) and static random access memory (SRAM).
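  • On the receiving side, a sketch of depacketization followed by a random-access write into display storage with row/column addressing might look like the following; the packet layout and the memory class are assumptions intended only to illustrate the memory-style write described above.

```python
# Receiver-side sketch: regroup packets per source (as in unit 38) and write a
# recovered pixel update into embedded display storage by row and column, much
# like writing new data into a DRAM or SRAM. The data layout is assumed.
from collections import defaultdict

def depacketize(packets):
    """Reassemble each source's payloads in sequence order."""
    per_source = defaultdict(list)
    for p in sorted(packets, key=lambda p: (p["source"], p["seq"])):
        per_source[p["source"]].append(p["payload"])
    return {sid: b"".join(chunks) for sid, chunks in per_source.items()}

class EmbeddedDisplayStorage:
    """Display storage addressed like a memory array rather than scanned as a raster."""
    def __init__(self, rows, cols):
        self.cells = [[0] * cols for _ in range(rows)]

    def write(self, row, col, value):
        self.cells[row][col] = value   # random-access write of a single pixel value

packets = [
    {"source": 1, "seq": 1, "payload": b"DEF"},
    {"source": 1, "seq": 0, "payload": b"ABC"},
]
print(depacketize(packets))            # {1: b'ABCDEF'}

screen = EmbeddedDisplayStorage(rows=4, cols=4)
screen.write(2, 3, 0xFF)               # update one pixel without rewriting the whole frame
```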
  • Thus, referring to FIG. 2, an electronic programming guide (EPG) may be displayed in the region 46 of the LCOS display screen 42 and an ongoing television program may be displayed in the region 44 in one embodiment of the invention. The EPG may include a plurality of channels listed down the left margin and a series of times, listed across the top of the region 46 in a grid pattern indicating what programs (shows #1-7) are available for viewing on any given channel at any of the indicated times. Thus, the user may select from among the electronic programming guide entries. For example, by mouse clicking using an input device, a user may select a particular program for full screen viewing. While making the decision, one video source may be displayed in the region 44. Of course, a wide variety of display formats may be used.
  • In this way, the video stream from one video source may be displayed at its native rate in the region 44 while the electronic programming guide (which may be in the form of graphical information) displayed in the region 46 may be updated less frequently. Thus, each of the video sources may be assigned to a distinct, predetermined region of the overall display screen 42 to facilitate the accommodation of the native characteristics of each of two or more video sources.
  • While FIG. 2 illustrates a situation where only two video sources are displayed, more than two video sources may be provided by dedicating specific screen portions to each video source. In one embodiment of the present invention, the display screen 42 may be divided into quadrants each assigned to one of four potential video sources. In some cases, where only one source is available, that source may be displayed full screen. In other cases, the user may select among the various available sources for either full screen or split screen display.
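  • One way the quadrant assignment mentioned above could be expressed is sketched below; the coordinate convention and the screen dimensions in the example call are illustrative assumptions.

```python
# Sketch of dedicating fixed screen regions to sources using a quadrant split.
# Regions are (x, y, width, height); the example screen size is assumed.
def quadrant_regions(width, height):
    """Map source ids 1-4 onto the four quadrants of the display screen."""
    w, h = width // 2, height // 2
    return {
        1: (0, 0, w, h),   # top-left
        2: (w, 0, w, h),   # top-right
        3: (0, h, w, h),   # bottom-left
        4: (w, h, w, h),   # bottom-right
    }

def region_for(source_id, active_sources, width, height):
    """A lone source gets the full screen; otherwise each source keeps its quadrant."""
    if len(active_sources) == 1:
        return (0, 0, width, height)
    return quadrant_regions(width, height)[source_id]

print(region_for(2, active_sources=[1, 2], width=1920, height=1080))  # (960, 0, 960, 540)
```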
  • In some embodiments of the present invention, the video streams that drive less information to the display than others need not be “up-converted” to higher information formats prior to transmission to the display. This may save transmission bandwidth between the computing device 12 and the display device 14 that might otherwise be used for the transport of other valuable video information. In addition, the ability to present video sources at their native rates may yield a perceptually superior video presentation, since the rate conversion process may degrade a source relative to its native format. Because the burden of converting all video sources into a single common format may be removed from the computing device 12, and the bandwidth of the information being driven to the display may potentially be reduced, the requirements on the memory and processing elements in the computing device 12 may be eased, yielding a system with additional performance margin. The added performance margin may then be applied to other processing operations or, alternatively, the system may be cost-reduced to a level just sufficient to drive the display.
  • The display element 42 may be implemented as an electro-optical device 110, such as a spatial light modulator (SLM) as shown in FIG. 4. The device 110 may include a plurality of reflective mirrors 112 defined on a semiconductor substrate 114 in accordance with one embodiment of the present invention. Advantageously, the device 110 is implemented using liquid crystal over semiconductor (LCOS) technology. LCOS technology may form large screen projection displays or smaller displays (using direct view rather than projection technology). With LCOS technology, the liquid crystal display is formed in association with the same substrate that forms complementary metal oxide semiconductor (CMOS) circuit elements. The display may be a reflective liquid crystal display.
  • The device 110 may include a silicon substrate 114 with a metal layer defining the mirrors 112. The mirrors 112 may be the mirrors of an electro-optic display such as a liquid crystal display. For example, the mirrors 112 may be part of a spatial light modulator (SLM) for one of the color planes of a tricolor display. Potentials applied to the mirrors 112 alter the liquid crystal to modulate the incoming light to create images which then can be directly viewed or projected onto a projection screen.
  • Referring to FIG. 5, each cell or pixel of the display may include a reflective mirror 124 forming one of the mirrors of one of the pixels 112 shown in FIG. 4. In one embodiment of the invention, each cell may be rectangular or square and a slight spacing may occur between each adjacent mirror 124. Thus, a rectangular array of mirrors 124 may form an array of pixel elements in conjunction with liquid crystal material 120 positioned over the mirrors 124.
  • The LCOS structure includes a substrate 114 having doped regions 132 formed therein. The doped regions 132 may define transistors for logic elements and/or memory cells which operate in conjunction with the display pixels as will be described hereinafter. Four or more metal layers may be provided, including a metal one layer 130 which is spaced by an inter-layer dielectric (ILD) 131 from a metal two layer 128 and a metal three layer 126. A metal four layer may form the pixel mirrors 124. Thus, for example, the metal two layer 128 may provide light blocking and the metal one layer may provide the desired interconnections for forming the semiconductor logic and memory devices. The pixel mirrors 124 may be coupled, by way of vias 133, with the other metal layers.
  • A dielectric layer 122 may be formed over the mirror 124. A liquid crystal or electro-optic material 120 is sandwiched between a pair of buffered polyimide layers 119a and 119b. One electrode of the liquid crystal device is formed by the metal layer 124. The other electrode is formed by an indium tin oxide (ITO) layer 118.
  • A top plate 116 may be formed of transparent material. The ITO layer 118 may be coated on the top plate 116. The polyimide layers 119a and 119b provide electrical isolation between the capacitor plates which sandwich the electro-optic material 120. However, other insulating materials may be coated on the ITO layer 118 in place of or in addition to the polyimide layers.
  • Using the LCOS structure, for example as depicted in FIG. 6, a memory element or array may be incorporated into the same silicon substrate which includes the pixel array. A memory 160 may be integrated with each pixel cell 112. In some embodiments, pixel information may be passed through a digital to analog converter (DAC) 162 to produce gray scale information. The particular manner in which pixels are arranged in the storage array and converted to analog signals may vary by implementation.
  • Each pixel cell 112 metal electrode or top metal 124 may be coupled to a separate DAC 162. In one embodiment of the present invention, the DAC may be an eight bit DAC coupled to eight one bit storage elements 160. Each storage memory 160 may, for example, be a static random access memory (SRAM) cell. Each one bit storage element 160 may be coupled by a transfer transistor 158 to a different row 156 and a column 154. Thus, the information which is used to refresh the metal 124 may be stored in the memory 160. When it is desired to change the pixel information to change the displayed image, then the information in the memory 160 is refreshed.
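  • A behavioural sketch of the pixel cell of FIG. 6, with eight one-bit storage elements feeding an eight-bit DAC that drives the pixel mirror, is shown below; treating the DAC output as a normalized drive level is an assumed convention, not a detail from this disclosure.

```python
# Behavioural model of one pixel cell: eight one-bit storage elements (160) feed
# an eight-bit DAC (162) driving the pixel mirror (124). The normalized 0.0-1.0
# output convention is assumed for illustration.
class PixelCell:
    def __init__(self):
        self.bits = [0] * 8                     # eight one-bit storage elements

    def store(self, value):
        """Latch an 8-bit gray-scale value into the per-pixel storage."""
        self.bits = [(value >> i) & 1 for i in range(8)]

    def dac_output(self):
        """Convert the stored bits back to a normalized mirror drive level."""
        code = sum(bit << i for i, bit in enumerate(self.bits))
        return code / 255.0

cell = PixelCell()
cell.store(128)
print(f"mirror drive level: {cell.dac_output():.3f}")   # 0.502
```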
  • Since the display refresh controller only needs to refresh new information to the display 14, the overall drain on the computing device 12 including the buses and memory may be reduced, potentially yielding better performance out of the other components in the computer system which rely on these limited resources. In addition, the amount of redundant information flowing to the display 14 may be reduced, allowing more new information to be sent to the display. This potentially enables the display of higher resolution or higher rate images.
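  • The refresh saving described above can be pictured as a simple changed-pixel comparison, sketched below; representing a frame as a mapping from pixel position to value is an assumption used only for illustration.

```python
# Sketch of "refresh only what changed": compare the new frame with the previous
# one and emit writes only for pixels whose values differ. The frame representation
# is assumed for illustration.
def changed_pixels(previous, current):
    """Yield (row, col, value) only for pixels that differ from the previous frame."""
    for (row, col), value in current.items():
        if previous.get((row, col)) != value:
            yield row, col, value

prev = {(0, 0): 10, (0, 1): 20}
curr = {(0, 0): 10, (0, 1): 99}                 # only one pixel actually changed
print(list(changed_pixels(prev, curr)))         # [(0, 1, 99)] -> the only write transmitted
```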
  • In one embodiment of the present invention, a projection display 164, shown in FIG. 7, includes the spatial light modulator display panels 166, 174 and 176, using liquid crystal over silicon technology with integrated memory. The reflective liquid crystal display projection system 164 typically includes a modulator or display panel (LCD display panels 174, 166 and 176) for each primary color that is projected onto a screen 192. In this manner, for a red-green-blue (RGB) color space, the projection system 164 may include an LCD display panel 174 that is associated with the red color band, an LCD display panel 166 that is associated with the green color band and an LCD display panel 176 that is associated with the blue color band. Each of the LCD display panels 166, 174 and 176 modulates light from the light source 194 and the optics 196 to form red, green and blue images, respectively, which add together to form a composite color image on the screen 192. To accomplish this, each LCD display panel receives electrical signals indicating the corresponding modulated beam image to be formed.
  • More particularly, the projection display 164 may include a beam splitter 186 that directs a substantially collimated white beam 198 of light, provided by the light source 194, to optics that separate the white beam 198 into red 182, blue 178 and green 202 beams. In this manner, the white light beam 198 may be directed to a red dichroic mirror 172 that reflects the red beam towards the LCD display panel 174 that, in turn, modulates the red beam 182. The blue beam passes through the red dichroic mirror to a blue dichroic mirror 170 that reflects the blue beam towards the LCD panel 176 for modulation. The green beam 202 passes through the red and blue dichroic mirrors for modulation by the LCD display panel 166.
  • For reflective LCD display panels, each LCD display panel 166, 174 and 176 modulates the incident beam and reflects the modulated beams 168, 200 and 190 respectively, so that the modulated beams return on the paths described above to the beam splitter 186. The beam splitter 186, in turn, directs the modulated beams through projection optics such as a lens 188, to form modulated beam images that ideally overlap and combine to form the composite image on the screen 192. Each of the panels 166, 174, and 176 may be implemented using liquid crystal over semiconductor technology as illustrated for example in FIG. 5.
  • Referring to FIG. 3, another embodiment of the present invention may include a computing device 12 which operates as described in connection with FIG. 1A and a display device 14a which has been modified from the embodiment shown in FIG. 1B.
  • The display device 14a may use a legacy display technology such as a cathode ray tube or a thin film transistor-liquid crystal display 52 with integrated memory and processing elements. Like the embodiment of FIG. 1A, independent video streams may be driven to the display device 14a from the computing device 12. In the display device 14a, these video sources are converted to a common frame buffer format, and driven onto the integrated memory 48 through the use of an arbiter and addressing circuitry 40a. Simultaneously, video is scanned out of this memory 48 in a synchronous, regular fashion by a controller 50 and driven onto the imaging elements 52.
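  • A sketch of this frame-buffer variant, with an arbiter writing converted sources into shared memory while a controller scans the whole buffer out each refresh, is given below; the buffer dimensions, the one-byte-per-pixel format and the method names are assumptions.

```python
# Sketch of the FIG. 3 variant: an arbiter writes each converted source into a
# shared frame buffer (48) while a controller (50) scans the whole buffer out at
# a fixed rate. Dimensions, pixel format and names are assumed.
class FrameBuffer:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.pixels = bytearray(width * height)   # one byte per pixel (assumed)

    def write_region(self, x, y, w, h, data):
        """Arbiter side: store a source's converted frame into its region of the buffer."""
        for row in range(h):
            start = (y + row) * self.width + x
            self.pixels[start:start + w] = data[row * w:(row + 1) * w]

    def scan_out(self):
        """Controller side: read the buffer out line by line every refresh, regardless of changes."""
        for row in range(self.height):
            yield self.pixels[row * self.width:(row + 1) * self.width]

fb = FrameBuffer(width=8, height=4)
fb.write_region(0, 0, 4, 2, bytes(range(8)))       # source 1 into the top-left region
fb.write_region(4, 2, 4, 2, bytes([255] * 8))      # source 2 into the bottom-right region
print(sum(1 for _ in fb.scan_out()), "lines scanned this refresh")   # 4
```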
  • In accordance with one embodiment of the present invention, software 210, shown in FIG. 8, may be utilized by the computing device 12 in the embodiment of FIG. 1A to control the generation of independent, packetized video sources. At least two video streams are received in the computing device 12 as indicated in block 212. Each video stream is packetized as indicated in block 214. The independent, packetized video streams are then sent to the display as indicated in block 216 for subsequent de-packetization and display.
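  • The FIG. 8 flow (blocks 212, 214 and 216) can be summarized in the short sketch below; the helper names and the trivial stand-ins for the packetizer and transport are assumptions.

```python
# Compact sketch of the FIG. 8 flow: receive at least two streams (block 212),
# packetize each one (block 214) and hand the packets to the transport for the
# display (block 216). Helper names and stand-ins are assumed.
def send_heterogeneous_video(streams, packetize, transmit):
    """streams: dict mapping a source id to an iterable of frames."""
    if len(streams) < 2:
        raise ValueError("expected at least two independent video streams")   # block 212
    for source_id, frames in streams.items():
        for frame in frames:
            for packet in packetize(source_id, frame):                        # block 214
                transmit(packet)                                               # block 216

send_heterogeneous_video(
    {1: [b"video-frame"], 2: [b"epg-frame"]},
    packetize=lambda sid, frame: [{"source": sid, "payload": frame}],
    transmit=lambda pkt: print("tx source", pkt["source"], len(pkt["payload"]), "bytes"),
)
```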
  • Some embodiments of the present invention may exhibit superior performance to systems where heterogeneous video signals are mixed in the computing device prior to modulation for the display. Since only new information is transferred to the display, bandwidth may be saved, power consumption may be reduced and the generation of heat in the communication between the computing device and the display may be reduced in some embodiments. In addition, in some embodiments of the present invention, perceptually superior display of disparate video sources is possible relative to systems where the content is mixed in the computing device. Instead of aggregating and synchronizing all the video sources in the computing device and thereby forcing all streams into a single, least-common-denominator format and timing, video sources may be independently streamed to the display and presented in their native format. In some cases, the performance of the computing device may be enhanced.
  • While the present invention has been described with respect to a limited number of embodiments, those skilled in the art will appreciate numerous modifications and variations therefrom. It is intended that the appended claims cover all such modifications and variations as fall within the true spirit and scope of this present invention.

Claims (10)

What is claimed is:
1. A method comprising:
streaming at least two independent video sources for display on a video display screen; and
causing said sources to be displayed at separate regions of said display screen.
2. The method of claim 1 including forming said sources into packets in a first device and transporting said packets to a second device.
3. The method of claim 2 including depacketizing said packets in said second device.
4. The method of claim 1 including transmitting said video sources from a processor-based system to a display device including said display screen.
5. The method of claim 4 including transmitting said video sources over a wireless connection between said processor-based system and said display device.
6. The method of claim 1, wherein said display screen includes a pixel array and a memory array, refreshing said memory array and said pixel array in the same refresh cycle.
7. The method of claim 6 including displaying said sources on a display that uses liquid crystal over semiconductor technology.
8. The method of claim 1 including streaming video sources for display on said display screen at different frame rates.
9. The method of claim 1 wherein one of said video sources includes television programming and the other of said video sources includes graphical information.
10. The method of claim 1 including streaming a first video source that includes television programming information and a second video source that includes an electronic programming guide information.
US14/870,949 2000-03-09 2015-09-30 Displaying Heterogeneous Video Abandoned US20160021418A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/870,949 US20160021418A1 (en) 2000-03-09 2015-09-30 Displaying Heterogeneous Video

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US09/522,053 US7337463B1 (en) 2000-03-09 2000-03-09 Displaying heterogeneous video
US12/006,011 US20080106511A1 (en) 2000-03-09 2007-12-28 Displaying heterogeneous video
US14/870,949 US20160021418A1 (en) 2000-03-09 2015-09-30 Displaying Heterogeneous Video

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/006,011 Division US20080106511A1 (en) 2000-03-09 2007-12-28 Displaying heterogeneous video

Publications (1)

Publication Number Publication Date
US20160021418A1 true US20160021418A1 (en) 2016-01-21

Family

ID=39103757

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/522,053 Expired - Fee Related US7337463B1 (en) 2000-03-09 2000-03-09 Displaying heterogeneous video
US12/006,011 Abandoned US20080106511A1 (en) 2000-03-09 2007-12-28 Displaying heterogeneous video
US14/870,949 Abandoned US20160021418A1 (en) 2000-03-09 2015-09-30 Displaying Heterogeneous Video

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US09/522,053 Expired - Fee Related US7337463B1 (en) 2000-03-09 2000-03-09 Displaying heterogeneous video
US12/006,011 Abandoned US20080106511A1 (en) 2000-03-09 2007-12-28 Displaying heterogeneous video

Country Status (1)

Country Link
US (3) US7337463B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107333164A (en) * 2016-04-29 2017-11-07 北京学而思教育科技有限公司 A kind of image processing method and device

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090100493A1 (en) * 2007-10-16 2009-04-16 At&T Knowledge Ventures, Lp. System and Method for Display Format Detection at Set Top Box Device
US8049761B1 (en) * 2007-11-08 2011-11-01 Nvidia Corporation Bus protocol for transferring pixel data between chips
US9386349B2 (en) 2012-09-27 2016-07-05 Canoe Ventures, Llc Asset conflict resolution for content on demand asset insertion
US9883208B2 (en) 2012-09-27 2018-01-30 Canoe Ventures Llc Data synchronization for content on demand asset insertion decisions
US9398340B2 (en) 2012-09-27 2016-07-19 Canoe Ventures, Llc Asset qualification for content on demand insertion
US8805721B2 (en) 2012-09-27 2014-08-12 Canoe Ventures Instantiation of asset insertion processing on multiple computing devices for directing insertion of assets into content on demand
US9872075B2 (en) 2012-09-27 2018-01-16 Canoe Ventures Asset scoring and ranking for content on demand insertion
US10636336B2 (en) * 2015-04-17 2020-04-28 Nvidia Corporation Mixed primary display with spatially modulated backlight
US20190014332A1 (en) * 2017-07-07 2019-01-10 Apple Inc. Content-aware video coding

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5708961A (en) * 1995-05-01 1998-01-13 Bell Atlantic Network Services, Inc. Wireless on-premises video distribution using digital multiplexing
US5729549A (en) * 1995-03-16 1998-03-17 Bell Atlantic Network Services, Inc. Simulcasting digital video programs for broadcast and interactive services
US5945974A (en) * 1996-05-15 1999-08-31 Cirrus Logic, Inc. Display controller with integrated half frame buffer and systems and methods using the same
US5959598A (en) * 1995-07-20 1999-09-28 The Regents Of The University Of Colorado Pixel buffer circuits for implementing improved methods of displaying grey-scale or color images
US5995146A (en) * 1997-01-24 1999-11-30 Pathway, Inc. Multiple video screen display system
US6128025A (en) * 1997-10-10 2000-10-03 International Business Machines Corporation Embedded frame buffer system and synchronization method
US6233253B1 (en) * 1997-05-23 2001-05-15 Thomson Licensing S.A. System for digital data format conversion and bit stream generation
US6246386B1 (en) * 1998-06-18 2001-06-12 Agilent Technologies, Inc. Integrated micro-display system
US6487722B1 (en) * 1998-02-12 2002-11-26 Sony Corporation EPG transmitting apparatus and method, EPG receiving apparatus and method, EPG transmitting/receiving system and method, and provider
US6651252B1 (en) * 1999-10-27 2003-11-18 Diva Systems Corporation Method and apparatus for transmitting video and graphics in a compressed form
US7091968B1 (en) * 1998-07-23 2006-08-15 Sedna Patent Services, Llc Method and apparatus for encoding a user interface

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH047990A (en) * 1990-04-25 1992-01-13 Mitsubishi Electric Corp Multiterminal type picture transmitter
US5619995A (en) * 1991-11-12 1997-04-15 Lobodzinski; Suave M. Motion video transformation system and method
US5625410A (en) * 1993-04-21 1997-04-29 Kinywa Washino Video monitoring and conferencing system
DE69535818D1 (en) * 1995-09-20 2008-10-02 Hitachi Ltd IMAGE DISPLAY DEVICE
US6172988B1 (en) * 1996-01-31 2001-01-09 Tiernan Communications, Inc. Method for universal messaging and multiplexing of video, audio, and data streams
JPH09247119A (en) * 1996-03-11 1997-09-19 Oki Electric Ind Co Ltd Multiplexer
US6108039A (en) * 1996-05-23 2000-08-22 C-Cube Microsystems, Inc. Low bandwidth, two-candidate motion estimation for interlaced video
JP2933133B2 (en) * 1997-01-28 1999-08-09 日本電気株式会社 Digital video signal multiplexing and separation systems
US6005620A (en) * 1997-01-31 1999-12-21 Hughes Electronics Corporation Statistical multiplexer for live and pre-compressed video
US6188436B1 (en) * 1997-01-31 2001-02-13 Hughes Electronics Corporation Video broadcast system with video data shifting
US6084910A (en) * 1997-01-31 2000-07-04 Hughes Electronics Corporation Statistical multiplexer for video signals
US6806909B1 (en) * 1997-03-03 2004-10-19 Koninklijke Philips Electronics N.V. Seamless splicing of MPEG-2 multimedia data streams
US6667494B1 (en) * 1997-08-19 2003-12-23 Semiconductor Energy Laboratory Co., Ltd. Semiconductor device and semiconductor display device
US6160545A (en) * 1997-10-24 2000-12-12 General Instrument Corporation Multi-regional interactive program guide for television
JPH11205696A (en) * 1998-01-20 1999-07-30 Sony Corp Video transmitting device and video transmitting method
JP3684525B2 (en) * 1998-02-19 2005-08-17 富士通株式会社 Multi-screen composition method and multi-screen composition device
US6140983A (en) * 1998-05-15 2000-10-31 Inviso, Inc. Display system having multiple memory elements per pixel with improved layout design
US6980183B1 (en) * 1999-07-30 2005-12-27 Intel Corporation Liquid crystal over semiconductor display with on-chip storage

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5729549A (en) * 1995-03-16 1998-03-17 Bell Atlantic Network Services, Inc. Simulcasting digital video programs for broadcast and interactive services
US5708961A (en) * 1995-05-01 1998-01-13 Bell Atlantic Network Services, Inc. Wireless on-premises video distribution using digital multiplexing
US5959598A (en) * 1995-07-20 1999-09-28 The Regents Of The University Of Colorado Pixel buffer circuits for implementing improved methods of displaying grey-scale or color images
US5945974A (en) * 1996-05-15 1999-08-31 Cirrus Logic, Inc. Display controller with integrated half frame buffer and systems and methods using the same
US5995146A (en) * 1997-01-24 1999-11-30 Pathway, Inc. Multiple video screen display system
US6233253B1 (en) * 1997-05-23 2001-05-15 Thomson Licensing S.A. System for digital data format conversion and bit stream generation
US6128025A (en) * 1997-10-10 2000-10-03 International Business Machines Corporation Embedded frame buffer system and synchronization method
US6487722B1 (en) * 1998-02-12 2002-11-26 Sony Corporation EPG transmitting apparatus and method, EPG receiving apparatus and method, EPG transmitting/receiving system and method, and provider
US6246386B1 (en) * 1998-06-18 2001-06-12 Agilent Technologies, Inc. Integrated micro-display system
US7091968B1 (en) * 1998-07-23 2006-08-15 Sedna Patent Services, Llc Method and apparatus for encoding a user interface
US6651252B1 (en) * 1999-10-27 2003-11-18 Diva Systems Corporation Method and apparatus for transmitting video and graphics in a compressed form

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107333164A (en) * 2016-04-29 2017-11-07 北京学而思教育科技有限公司 A kind of image processing method and device

Also Published As

Publication number Publication date
US20080106511A1 (en) 2008-05-08
US7337463B1 (en) 2008-02-26

Similar Documents

Publication Publication Date Title
US20160021418A1 (en) Displaying Heterogeneous Video
US7209103B2 (en) Liquid crystal projector
US6153927A (en) Packaged integrated processor and spatial light modulator
CN101299331B (en) Display controller for displaying multiple windows and method for the same
JP3513371B2 (en) Matrix substrate, liquid crystal device and display device using them
JP3987119B2 (en) Liquid crystal panel driving device, liquid crystal device and electronic apparatus
KR100454993B1 (en) Driver with built-in RAM, display unit with the driver, and electronic device
US6559821B2 (en) Matrix substrate and liquid crystal display as well as projector using the same
JPH10177371A (en) Matrix substrate, liquid crystal device and display device using them
CN100416357C (en) Image display device and projector
KR20020065854A (en) Image display system and image information transmission method
US20070159444A1 (en) Display Array of Display Panel
JP2003271108A (en) Liquid crystal display device
KR100362957B1 (en) Display device and display system
JP2020071469A (en) Image control device, display wall system using the same, and control method of outputting image to display wall
US20220383832A1 (en) Display assembly, display device and driving method for display assembly
JPH11265162A (en) Electro-optical device and electronic equipment
US6980183B1 (en) Liquid crystal over semiconductor display with on-chip storage
Janssen et al. Design aspects of a scrolling color LCoS display
JP3843658B2 (en) Electro-optical device drive circuit, electro-optical device, and electronic apparatus
US20050225521A1 (en) [liquid crystal on silicon panel and driving method thereof]
JP3261583B2 (en) Reflective LCD panel
Shimizu Single-panel reflective LCD projector
CN212519193U (en) Display screen and television
Alt Displays for electronic imaging

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION