US20120320113A1 - Direct-view MEMS display devices and methods for generating images thereon - Google Patents

Direct-view MEMS display devices and methods for generating images thereon

Info

Publication number
US20120320113A1
Authority
US
United States
Prior art keywords
sub
frame
image
display
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/597,983
Inventor
Nesbitt W. Hagood, IV
Jignesh Gandhi
Abraham McAllister
Rainer M. Malzbender
Stephen R. Lewis
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SnapTrack Inc
Original Assignee
Pixtronix Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/361,294 (published as US20060209012A1)
Application filed by Pixtronix Inc filed Critical Pixtronix Inc
Priority to US13/597,983
Assigned to PIXTRONIX, INC. reassignment PIXTRONIX, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GANDHI, JIGNESH, HAGOOD, NESBITT W., IV, LEWIS, STEPHEN R., MCALLISTER, ABRAHAM R., MALZBENDER, RAINER M.
Publication of US20120320113A1
Assigned to SNAPTRACK, INC. reassignment SNAPTRACK, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PIXTRONIX, INC.

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/3433Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using light modulating elements actuated by an electric field and being other than liquid crystal devices and electrochromic devices
    • G09G3/346Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using light modulating elements actuated by an electric field and being other than liquid crystal devices and electrochromic devices based on modulation of the reflection angle, e.g. micromirrors
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/004Optical devices or arrangements for the control of light using movable or deformable optical elements based on a displacement or a deformation of a fluid
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/02Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the intensity of light
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/2003Display of colours
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/2007Display of intermediate tones
    • G09G3/2018Display of intermediate tones by time modulation using two or more time intervals
    • G09G3/2022Display of intermediate tones by time modulation using two or more time intervals using sub-frames
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/2007Display of intermediate tones
    • G09G3/2077Display of intermediate tones by a combination of two or more gradation control methods
    • G09G3/2081Display of intermediate tones by a combination of two or more gradation control methods with combination of amplitude modulation and time modulation
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/3406Control of illumination source
    • G09G3/3413Details of control of colour illumination sources
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2300/00Aspects of the constitution of display devices
    • G09G2300/08Active matrix structure, i.e. with use of active elements, inclusive of non-linear two terminal elements, in the pixels together with light emitting or modulating elements
    • G09G2300/0809Several active elements per pixel in active matrix panels
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2310/00Command of the display device
    • G09G2310/02Addressing, scanning or driving the display screen or processing steps related thereto
    • G09G2310/0235Field-sequential colour display
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2310/00Command of the display device
    • G09G2310/02Addressing, scanning or driving the display screen or processing steps related thereto
    • G09G2310/0243Details of the generation of driving signals
    • G09G2310/0251Precharge or discharge of pixel before applying new pixel voltage
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2310/00Command of the display device
    • G09G2310/08Details of timing specific for flat panels, other than clock recovery
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/0247Flicker reduction other than flicker reduction circuits used for single beam cathode-ray tubes
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0626Adjustment of display parameters for control of overall brightness
    • G09G2320/064Adjustment of display parameters for control of overall brightness by time modulation of the brightness of the illumination source
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/066Adjustment of display parameters for control of contrast
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2330/00Aspects of power supply; Aspects of display protection and defect management
    • G09G2330/02Details of power systems and of start or stop of display operation
    • G09G2330/021Power management, e.g. power saving
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/14Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2360/144Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G09G5/06Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed using colour palettes, e.g. look-up tables
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/363Graphics controllers
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/39Control of the bit-mapped memory
    • G09G5/393Arrangements for updating the contents of the bit-mapped memory

Definitions

  • the invention relates to the field of imaging displays; in particular, it relates to controller circuits and processes for controlling light modulators incorporated into imaging displays.
  • Displays built from mechanical light modulators are an attractive alternative to displays based on liquid crystal technology.
  • Mechanical light modulators are fast enough to display video content with good viewing angles and with a wide range of color and grey scale. Mechanical light modulators have been successful in projection display applications. Direct-view displays using mechanical light modulators have not yet demonstrated sufficiently attractive combinations of brightness and low power.
  • a direct-view display includes an array of MEMS light modulators and a control matrix both formed on a transparent substrate, where each of the light modulators can be driven into at least two states.
  • the control matrix transmits data and actuation voltages to the array and may include, for each light modulator, a transistor and a capacitor.
  • the direct-view display also includes a controller for controlling the states of each of the light modulators in the array.
  • the controller includes an input, a processor, a memory, and an output.
  • the input receives image data encoding an image frame for display on the direct-view display.
  • the processor derives a plurality of sub-frame data sets from the image data. Each sub-frame data set indicates desired states of light modulators in multiple rows and multiple columns of the array.
  • the memory stores the plurality of sub-frame data sets.
  • the output outputs the plurality of sub-frame data sets according to an output sequence to drive light modulators into the states indicated in the sub-frame data sets.
  • the plurality of sub-frame data sets may include distinct sub-frame data sets for at least two of at least three color components of the image frame or for four color components of the image frame, where the four color components may consist of red, green, blue, and white.
  • the output sequence includes a plurality of events corresponding to the sub-frame data sets.
  • the controller stores different time values associated with events corresponding to at least two sub-frame data sets.
  • the time values may be selected to prevent illumination of the array while the modulators change states and may correlate to a brightness of a sub-frame image resulting from an outputting of a sub-frame data set of the plurality of sub-frame data sets.
  • the direct-view display may include a plurality of lamps, in which case the controller may store time values associated with lamp illumination events and/or lamp extinguishing events included in the output sequence.
  • the output sequence may include addressing events, where the controller stores time values associated with the addressing events.
  • the output sequence is stored at least in part in memory.
  • the direct-view display may include a data link to an external processor for receiving changes to the output sequence.
  • the direct-view display may include a plurality of lamps, where the output sequence includes a lamp illumination sequence.
  • the lamp illumination sequence may include data corresponding to the length of time and/or intensity with which lamps are illuminated in association with sub-frame data sets output in the output sequence. The length of time that a lamp is illuminated for each sub-frame data set in the lamp illumination sequence is preferably less than or equal to 4 milliseconds.
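
As a concrete illustration of how an output sequence with stored time values, lamp illumination events, and lamp extinguishing events might be organized, the minimal sketch below models a sequence fragment as a list of timed events. The data structure, field names, and specific timings are hypothetical and are not prescribed by the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SequenceEvent:
    """One entry in a hypothetical output sequence (all fields illustrative)."""
    time_us: int                    # stored time value for the event, in microseconds
    kind: str                       # "load_bitplane", "lamp_on", or "lamp_off"
    bitplane: Optional[int] = None  # index into the stored sub-frame data sets
    lamp: Optional[str] = None      # e.g. "red", "green", "blue", or "white"
    intensity: float = 1.0          # relative lamp drive level

# Fragment of a sequence for one red sub-frame image. The lamp stays off while
# the modulators are addressed and change state, then is lit for a period
# proportional to the sub-frame's weight (here 2 ms, under the 4 ms bound
# mentioned above).
red_fragment: List[SequenceEvent] = [
    SequenceEvent(time_us=0,    kind="load_bitplane", bitplane=7),
    SequenceEvent(time_us=500,  kind="lamp_on",  lamp="red", intensity=1.0),
    SequenceEvent(time_us=2500, kind="lamp_off", lamp="red"),
]

for event in red_fragment:
    print(event)
```
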
  • the processor derives the plurality of sub-frame data sets by decomposing the image frame into a plurality of sub-frame images and assigning a weight to each sub-frame image of the plurality of sub-frame images.
  • the controller may cause a sub-frame image to be illuminated for a length of time and/or with an illumination intensity proportional to the weight assigned to the sub-frame image.
  • the processor may assign the weight according to a coding scheme.
  • the coding scheme is a binary coding scheme
  • the sub-frame data sets are bitplanes
  • each color component of the image frame is decomposed into at least a most significant sub-frame image and a next most significant sub-frame image.
  • the most-significant sub-frame image may contribute to a displayed image frame twice as much as the next most significant sub-frame image.
  • the bitplane corresponding to the most significant sub-image of at least one color component of the image frame may be output at two distinct times which may be separated by no more than 25 milliseconds.
  • the length of time between a first time the bitplane corresponding to the most significant sub-frame image of a color component of the image frame is output and a second time the bitplane corresponding to the most significant sub-frame image of the color component is output is preferably within 10% of the length of time between the second time the bitplane corresponding to the most significant sub-frame image of the color component is output and a subsequent time at which a sub-frame image corresponding to a most significant sub-frame image of the color component is output.
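
To make the binary coding scheme concrete, the sketch below decomposes a single 8-bit color channel into bitplanes and assigns each a binary weight, so that the most significant sub-frame image contributes twice as much as the next most significant one. The 8-bit depth and the function names are assumptions for illustration only.

```python
import numpy as np

def decompose_to_bitplanes(channel: np.ndarray, bits: int = 8):
    """Split one color channel (values 0..2**bits - 1) into weighted bitplanes.

    Returns (weight, bitplane) pairs from least to most significant. Each
    bitplane is a boolean array giving the desired state (e.g. open/closed)
    of every modulator for that sub-frame image.
    """
    planes = []
    for b in range(bits):
        plane = (channel >> b) & 1
        planes.append((1 << b, plane.astype(bool)))
    return planes

# Hypothetical 2x3 red channel of an image frame.
red = np.array([[0, 128, 255],
                [64, 192, 37]], dtype=np.uint8)

planes = decompose_to_bitplanes(red)
for weight, plane in planes:
    # Illumination time (or intensity) is made proportional to the weight.
    print(f"weight {weight:3d} -> illuminate for {weight} relative time units")

# Summing the weighted bitplanes recovers the original channel values.
reconstruction = sum(weight * plane for weight, plane in planes)
assert np.array_equal(reconstruction, red)
```
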
  • At least one sub-frame data set corresponding to a first color component of the image frame is output before at least one sub-frame data set corresponding to a second color component of the image frame, and at least one sub-frame data set corresponding to the first color component of the image frame is output after at least one sub-frame data set corresponding to the second color component of the image frame.
  • Lamps of at least two different colors may be illuminated to display a single sub-frame image corresponding to a single sub-frame data set, where a lamp of one of the colors may be illuminated with a substantially greater intensity than lamps of the other colors.
  • the direct-view display includes a memory for storing a plurality of alternative output sequences and may include an output sequence switching module for switching between the output sequence and the plurality of alternative output sequences.
  • the output sequence switching module may respond to the processor, to a user interface included in the direct-view display, and/or to instructions received from a second processor, external to the controller, included in the device in which the direct-view display is incorporated.
  • the user interface may be a manual switch.
  • the direct-view display includes a sequence parameter calculation module for deriving changes to the output sequence. Based on characteristics of a received image frame, the sequence parameter calculation module may derive changes to the output sequence, to timing values stored in relation to events included in the output sequence, and/or to sub-frame data sets.
  • the direct-view display may include a plurality of lamps, in which case the sequence parameter calculation module may derive changes to lamp intensity values stored in relation to lamp illumination events included in the output sequence.
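
One way a sequence parameter calculation module might derive new lamp intensity values from characteristics of a received frame is sketched below. The scaling rule, the 10% floor, and all names are invented for illustration; the patent does not specify a particular calculation.

```python
import numpy as np

def adjust_lamp_intensities(frame: np.ndarray, base_intensities: dict) -> dict:
    """Derive per-lamp intensity values from a received RGB frame (illustrative).

    frame: H x W x 3 array with values in [0, 255].
    base_intensities: nominal drive level per lamp name.
    A dim frame allows the lamps to be driven less intensely, saving power,
    with the lost brightness made up elsewhere in the pipeline.
    """
    mean_per_channel = frame.reshape(-1, 3).mean(axis=0) / 255.0
    scale = {"red": mean_per_channel[0],
             "green": mean_per_channel[1],
             "blue": mean_per_channel[2]}
    # Keep at least 10% of nominal so dark scenes still resolve (arbitrary floor).
    return {lamp: max(0.1, scale.get(lamp, 1.0)) * level
            for lamp, level in base_intensities.items()}

dim_frame = np.full((4, 4, 3), 64, dtype=np.uint8)
print(adjust_lamp_intensities(dim_frame, {"red": 1.0, "green": 1.0, "blue": 1.0}))
```
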
  • the array of light modulators includes a plurality of independently actuatable banks of light modulators.
  • the control matrix may include a plurality of global actuation interconnects, where each global actuation interconnect corresponds to a respective bank of light modulators.
  • the plurality of banks may be located adjacent one another in the array.
  • each bank of light modulators may include a plurality of rows in the array, where the banks are interwoven with one another in the array.
  • the display of a sub-frame image corresponding to a particular significance and color component in one of the banks is no more than 25 ms from a subsequent display of a sub-frame image corresponding to the same significance and color component, and is no more than 25 ms after a prior display of a sub-frame image corresponding to that significance and color component in the other of the banks.
  • the light modulators include shutters.
  • the shutters may selectively reflect light and/or selectively allow light to pass through corresponding apertures to form the image frame.
  • the shutters may be driven transverse to the substrate.
  • the light modulators are reflective light modulators.
  • the light modulators selectively allow the passage of light towards a viewer.
  • a light guide is positioned proximate the array of light modulators.
  • the output sequence includes a plurality of global actuation events.
  • the direct-view display may include a global actuation interconnect coupled to the array of light modulators for causing light modulators in multiple rows and multiple columns of the array of light modulators to actuate substantially simultaneously.
  • In another aspect of the invention, a direct-view display includes an array of MEMS light modulators and a control matrix, both formed on a transparent substrate, where each of the light modulators can be driven into at least two states, as well as lamps of at least three colors.
  • the control matrix transmits data and actuation voltages to the array.
  • the direct-view display also includes a controller for controlling the states of each of the light modulators in the array. The controller also controls the illumination of lamps to illuminate the array of light modulators with lamps of at least two colors at the same time to form a portion of an image. At least one of the colors illuminating the array of light modulators may be of greater intensity than the other colors.
  • Another aspect of the invention includes a method for displaying an image frame on a direct-view display.
  • the method includes the steps of receiving image data encoding the image frame; deriving a plurality of sub-frame data sets from the image data; storing the plurality of sub-frame data sets in a memory; and outputting the plurality of sub-frame data sets according to an output sequence.
  • Each sub-frame data set indicates desired states of MEMS light modulators in multiple rows and multiple columns of a light modulator array formed on a transparent substrate.
  • the step of outputting the plurality of sub-frame data sets drives the MEMS light modulators into the desired states indicated in each sub-frame data set and includes transmitting data and actuation voltages to the light modulator array via a control matrix formed on the transparent substrate.
  • In another aspect of the invention, a direct-view display includes an array of MEMS light modulators and a control matrix, both formed on a transparent substrate, wherein each of the light modulators can be driven into at least two states.
  • the control matrix transmits data and actuation voltages to the array.
  • the direct-view display also includes a controller for controlling the states of each of the light modulators in the array.
  • the controller also controls the illumination of lamps of at least four colors to display an image.
  • the lamps may include at least a red lamp, a green lamp, a blue lamp, and a white lamp.
  • the lamps may include at least a red lamp, a green lamp, a blue lamp, and a yellow lamp.
  • the direct-view display may include a processor for translating three color image data into four color image data.
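
A common way to translate three-color image data into four-color data is to extract a shared white component from the RGB values. The sketch below illustrates that general idea; the specific mapping is an assumption and is not taken from the patent.

```python
import numpy as np

def rgb_to_rgbw(rgb: np.ndarray) -> np.ndarray:
    """Translate three-color image data into four-color (RGBW) data.

    A simple white-extraction rule: the white channel carries the common
    (minimum) component of R, G, and B, and that amount is subtracted from
    the color channels. Other mappings are possible.
    """
    rgb = rgb.astype(np.int16)
    white = rgb.min(axis=-1, keepdims=True)
    return np.concatenate([rgb - white, white], axis=-1).astype(np.uint8)

pixel = np.array([[[200, 150, 100]]], dtype=np.uint8)
print(rgb_to_rgbw(pixel))   # [[[100  50   0 100]]]
```
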
  • Another aspect of the invention includes a method for displaying an image on a direct-view display.
  • the method includes the steps of controlling states of MEMS light modulators in a light modulator array formed on a transparent substrate, where each of the MEMS light modulators can be driven into at least two states; transmitting data and actuation voltages to the light modulator array via a control matrix formed on the transparent substrate; and controlling the illumination of lamps of at least four colors to display the image.
  • FIG. 1 is a schematic diagram of a direct-view MEMS-based display according to an illustrative embodiment of the invention
  • FIG. 2A is a perspective view of an illustrative shutter-based light modulator suitable for incorporation into the direct-view MEMS-based display of FIG. 1 , according to an illustrative embodiment of the invention
  • FIG. 2B is a cross-sectional view of a rollershade-based light modulator suitable for incorporation into the direct-view MEMS-based display of FIG. 1 , according to an illustrative embodiment of the invention
  • FIG. 2C is a cross sectional view of a light-tap-based light modulator suitable for incorporation into the direct-view MEMS-based display of FIG. 1 , according to an illustrative embodiment of the invention
  • FIG. 2D is a cross sectional view of an electrowetting-based light modulator suitable for incorporation into the direct-view MEMS-based display of FIG. 1 , according to an illustrative embodiment of the invention
  • FIG. 3A is a schematic diagram of a control matrix suitable for controlling the light modulators incorporated into the direct-view MEMS-based display of FIG. 1 , according to an illustrative embodiment of the invention
  • FIG. 3B is a perspective view of an array of shutter-based light modulators connected to the control matrix of FIG. 3A , according to an illustrative embodiment of the invention
  • FIG. 3C illustrates a portion of a direct view display that includes the array of light modulators depicted in FIG. 3B disposed on top of a backlight, according to an illustrative embodiment of the invention
  • FIG. 3D is a schematic diagram of another suitable control matrix for inclusion in the direct-view MEMS-based display of FIG. 1 , according to an illustrative embodiment of the invention.
  • FIG. 4 is a timing diagram for a method of displaying an image on a display using a field sequential color technique
  • FIG. 5 is a timing diagram for a method of displaying an image on a display using a time-division gray scale technique
  • FIG. 6A is a schematic diagram of a digital image signal received by a display device, according to an illustrative embodiment of the invention.
  • FIG. 6B is a schematic diagram of a memory buffer useful for converting a received image signal into a bitplane, according to an illustrative embodiment of the invention.
  • FIG. 6C is a schematic diagram of portions of two bitplanes, according to an illustrative embodiment of the invention.
  • FIG. 7 is a block diagram of a display apparatus, according to an illustrative embodiment of the invention.
  • FIG. 8 is a flow chart of a method of displaying images suitable for use by the display apparatus of FIG. 7, according to an illustrative embodiment of the invention.
  • FIG. 9 is a more detailed flow chart of a portion of a first implementation of the method of FIG. 8, according to an illustrative embodiment of the invention.
  • FIG. 10 is a timing diagram illustrating the timing of various image formation events in the method of FIG. 9 , according to an illustrative embodiment of the invention.
  • FIG. 11 is a more detailed flow chart of a portion of a second implementation of the method of FIG. 8 , according to an illustrative embodiment of the invention.
  • FIG. 12 is a timing diagram illustrating the timing of various image formation events in a first implementation of the method of FIG. 11 , according to an illustrative embodiment of the invention.
  • FIG. 13 is a timing diagram illustrating the timing of various image formation events in a second implementation of the method of FIG. 11 , according to an illustrative embodiment of the invention.
  • FIG. 14A is a timing diagram illustrating the timing of various image formation events in a third implementation of the method of FIG. 11 , according to an illustrative embodiment of the invention.
  • FIG. 14B is a timing diagram illustrating the timing of various image formation events in a fourth implementation of the method of FIG. 11 , according to an illustrative embodiment of the invention.
  • FIG. 14C depicts various pulse profiles for lamps, according to an illustrative embodiment of the invention.
  • FIG. 15 is a timing diagram illustrating the timing of various image formation events in a fourth implementation of the method of FIG. 11 , according to an illustrative embodiment of the invention.
  • FIG. 16 is a timing diagram illustrating the timing of various image formation events in a fifth implementation of the method of FIG. 11 , according to an illustrative embodiment of the invention.
  • FIG. 17 is a timing diagram illustrating the timing of various image formation events in a sixth implementation of the method of FIG. 11 , according to an illustrative embodiment of the invention.
  • FIG. 18 is a more detailed flow chart of a portion of a third implementation of the method of FIG. 8 , according to an illustrative embodiment of the invention.
  • FIG. 19 is a timing diagram illustrating the timing of various image formation events in an implementation of the method of FIG. 18 , according to an illustrative embodiment of the invention.
  • FIG. 20 is a block diagram of a controller suitable for inclusion in the display apparatus of FIG. 1 , according to an illustrative embodiment of the invention.
  • FIG. 21 is a flow chart of a method of displaying an image suitable for use by the controller of FIG. 20 , according to an illustrative embodiment of the invention.
  • FIG. 22 is a block diagram of a second controller suitable for inclusion in the display apparatus of FIG. 1 , according to an illustrative embodiment of the invention.
  • FIG. 23 is a flow chart of a method of displaying an image suitable for use by the controller of FIG. 22 , according to an illustrative embodiment of the invention.
  • FIG. 1 is a schematic diagram of a direct-view MEMS-based display apparatus 100 , according to an illustrative embodiment of the invention.
  • the display apparatus 100 includes a plurality of light modulators 102 a - 102 d (generally “light modulators 102 ”) arranged in rows and columns.
  • light modulators 102 a and 102 d are in the open state, allowing light to pass.
  • Light modulators 102 b and 102 c are in the closed state, obstructing the passage of light.
  • the display apparatus 100 can be utilized to form an image 104 for a backlit display, if illuminated by a lamp or lamps 105 .
  • the apparatus 100 may form an image by reflection of ambient light originating from the front of the apparatus.
  • the apparatus 100 may form an image by reflection of light from a lamp or lamps positioned in the front of the display, i.e. by use of a frontlight.
  • each light modulator 102 corresponds to a pixel 106 in the image 104 .
  • the display apparatus 100 may utilize a plurality of light modulators to form a pixel 106 in the image 104 .
  • the display apparatus 100 may include three color-specific light modulators 102 . By selectively opening one or more of the color-specific light modulators 102 corresponding to a particular pixel 106 , the display apparatus 100 can generate a color pixel 106 in the image 104 .
  • the display apparatus 100 includes two or more light modulators 102 per pixel 106 to provide grayscale in an image 104 .
  • a “pixel” corresponds to the smallest picture element defined by the resolution of image.
  • the term “pixel” refers to the combined mechanical and electrical components utilized to modulate the light that forms a single pixel of the image.
  • Display apparatus 100 is a direct-view display in that it does not require imaging optics that are necessary for projection applications.
  • in a projection display, the image formed on the surface of the display apparatus is projected onto a screen or onto a wall.
  • the display apparatus is substantially smaller than the projected image.
  • in a direct-view display, the user sees the image by looking directly at the display apparatus, which contains the light modulators and optionally a backlight or front light for enhancing brightness and/or contrast seen on the display.
  • Direct-view displays may operate in either a transmissive or reflective mode.
  • in the transmissive mode, the light modulators filter or selectively block light which originates from a lamp or lamps positioned behind the display.
  • the light from the lamps is optionally injected into a lightguide or “backlight” so that each pixel can be uniformly illuminated.
  • Transmissive direct-view displays are often built onto transparent or glass substrates to facilitate a sandwich assembly arrangement where one substrate, containing the light modulators, is positioned directly on top of the backlight.
  • Each light modulator 102 includes a shutter 108 and an aperture 109 .
  • in the open state, the shutter 108 is positioned such that it allows light to pass through the aperture 109 towards a viewer.
  • in the closed state, the shutter 108 is positioned such that it obstructs the passage of light through the aperture 109 .
  • the aperture 109 is defined by an opening patterned through a reflective or light-absorbing material in each light modulator 102 .
  • the display apparatus also includes a control matrix connected to the substrate and to the light modulators for controlling the movement of the shutters.
  • the control matrix includes a series of electrical interconnects (e.g., interconnects 110 , 112 , and 114 ), including at least one write-enable interconnect 110 (also referred to as a “scan-line interconnect”) per row of pixels, one data interconnect 112 for each column of pixels, and one common interconnect 114 providing a common voltage to all pixels, or at least to pixels from both multiple columns and multiple rows in the display apparatus 100 .
  • applying a write-enabling voltage V we to the write-enable interconnect 110 for a given row of pixels prepares the pixels in the row to accept new shutter movement instructions.
  • the data interconnects 112 communicate the new movement instructions in the form of data voltage pulses.
  • in some implementations, the data voltage pulses applied to the data interconnects 112 directly contribute to an electrostatic movement of the shutters.
  • in other implementations, the data voltage pulses control switches, e.g., transistors or other non-linear circuit elements that control the application of separate actuation voltages, which are typically higher in magnitude than the data voltages, to the light modulators 102 . The application of these actuation voltages then results in the electrostatically driven movement of the shutters 108 .
  • FIG. 2A is a perspective view of an illustrative shutter-based light modulator 200 suitable for incorporation into the direct-view MEMS-based display apparatus 100 of FIG. 1 , according to an illustrative embodiment of the invention.
  • the light modulator 200 includes a shutter 202 coupled to an actuator 204 .
  • the actuator 204 is formed from two separate compliant electrode beam actuators 205 (the “actuators 205 ”), as described in U.S. patent application Ser. No. 11/251,035, filed on Oct. 14, 2005.
  • the shutter 202 couples on one side to the actuators 205 .
  • the actuators 205 move the shutter 202 transversely over a surface 203 in a plane of motion which is substantially parallel to the surface 203 .
  • the opposite side of the shutter 202 couples to a spring 207 which provides a restoring force opposing the forces exerted by the actuator 204 .
  • Each actuator 205 includes a compliant load beam 206 connecting the shutter 202 to a load anchor 208 .
  • the load anchors 208 along with the compliant load beams 206 serve as mechanical supports, keeping the shutter 202 suspended proximate to the surface 203 .
  • the surface includes one or more aperture holes 211 for admitting the passage of light.
  • the load anchors 208 physically connect the compliant load beams 206 and the shutter 202 to the surface 203 and electrically connect the load beams 206 to a bias voltage, in some instances, ground.
  • aperture holes 211 are formed in the substrate by etching an array of holes through the substrate 204 . If the substrate 204 is transparent, such as glass or plastic, then the first step of the processing sequence involves depositing a light blocking layer onto the substrate and etching the light blocking layer into an array of holes 211 .
  • the aperture holes 211 can be generally circular, elliptical, polygonal, serpentine, or irregular in shape.
  • Each actuator 205 also includes a compliant drive beam 216 positioned adjacent to each load beam 206 .
  • the drive beams 216 couple at one end to a drive beam anchor 218 shared between the drive beams 216 .
  • the other end of each drive beam 216 is free to move.
  • Each drive beam 216 is curved such that it is closest to the load beam 206 near the free end of the drive beam 216 and the anchored end of the load beam 206 .
  • a display apparatus incorporating the light modulator 200 applies an electric potential to the drive beams 216 via the drive beam anchor 218 .
  • a second electric potential may be applied to the load beams 206 .
  • the resulting potential difference between the drive beams 216 and the load beams 206 pulls the free ends of the drive beams 216 towards the anchored ends of the load beams 206 , and pulls the shutter ends of the load beams 206 toward the anchored ends of the drive beams 216 , thereby driving the shutter 202 transversely towards the drive anchor 218 .
  • the compliant members 206 act as springs, such that when the voltage across the beams 206 and 216 is removed, the load beams 206 push the shutter 202 back into its initial position, releasing the stress stored in the load beams 206 .
  • a light modulator such as light modulator 200 incorporates a passive restoring force, such as a spring, for returning a shutter to its rest position after voltages have been removed.
  • Other shutter assemblies incorporate a dual set of “open” and “closed” actuators and separate sets of “open” and “closed” electrodes for moving the shutter into either an open or a closed state.
  • FIG. 2B is a cross-sectional view of a rolling actuator-based light modulator 220 suitable for incorporation into the direct-view MEMS-based display apparatus 100 of FIG. 1 , according to an illustrative embodiment of the invention.
  • a rolling actuator-based light modulator includes a moveable electrode disposed opposite a fixed electrode and biased to move in a preferred direction to produce a shutter upon application of an electric field.
  • the light modulator 220 includes a planar electrode 226 disposed between a substrate 228 and an insulating layer 224 and a moveable electrode 222 having a fixed end 230 attached to the insulating layer 224 . In the absence of any applied voltage, a moveable end 232 of the moveable electrode 222 is free to roll towards the fixed end 230 to produce a rolled state.
  • a voltage between the electrodes 222 and 226 causes the moveable electrode 222 to unroll and lie flat against the insulating layer 224 , whereby it acts as a shutter that blocks light traveling through the substrate 228 .
  • the moveable electrode 222 returns to the rolled state after the voltage is removed.
  • the bias towards a rolled state may be achieved by manufacturing the moveable electrode 222 to include an anisotropic stress state.
  • FIG. 2C is a cross-sectional view of a light-tap-based light modulator 250 suitable for incorporation into the direct-view MEMS-based display apparatus 100 of FIG. 1 , according to an illustrative embodiment of the invention.
  • a light tap works according to a principle of frustrated total internal reflection. That is, light 252 is introduced into a light guide 254 , in which, without interference, light 252 is for the most part unable to escape the light guide 254 through its front or rear surfaces due to total internal reflection.
  • the light tap 250 includes a tap element 256 that has a sufficiently high index of refraction that, in response to the tap element 256 contacting the light guide 254 , light 252 impinging on the surface of the light guide adjacent the tap element 256 escapes the light guide 254 through the tap element 256 towards a viewer, thereby contributing to the formation of an image.
  • the tap element 256 is formed as part of a beam 258 of flexible, transparent material. Electrodes 260 coat portions of one side of the beam 258 . Opposing electrodes 260 are disposed on a cover plate 264 positioned adjacent the layer 258 on the opposite side of the light guide 254 . By applying a voltage across the electrodes 260 , the position of the tap element 256 relative to the light guide 254 can be controlled to selectively extract light 252 from the light guide 254 .
  • FIG. 2D is a cross sectional view of a third illustrative non-shutter-based light modulator suitable for inclusion in various embodiments of the invention
  • FIG. 2D is a cross sectional view of an electrowetting-based light modulation array 270 .
  • the light modulation array 270 includes a plurality of electrowetting-based light modulation cells 272 a - 272 d (generally “cells 272 ”) formed on an optical cavity 274 .
  • the light modulation array 270 also includes a set of color filters 276 corresponding to the cells 272 .
  • Each cell 272 includes a layer of water (or other transparent conductive or polar fluid) 278 , a layer of light absorbing oil 280 , a transparent electrode 282 (made, for example, from indium-tin oxide) and an insulating layer 284 positioned between the layer of light absorbing oil 280 and the transparent electrode 282 .
  • the electrode takes up a portion of a rear surface of a cell 272 .
  • the remainder of the rear surface of a cell 272 is formed from a reflective aperture layer 286 that forms the front surface of the optical cavity 274 .
  • the reflective aperture layer 286 is formed from a reflective material, such as a reflective metal or a stack of thin films forming a dielectric mirror.
  • an aperture is formed in the reflective aperture layer 286 to allow light to pass through.
  • the electrode 282 for the cell is deposited in the aperture and over the material forming the reflective aperture layer 286 , separated by another dielectric layer.
  • the remainder of the optical cavity 274 includes a light guide 288 positioned proximate the reflective aperture layer 286 , and a second reflective layer 290 on a side of the light guide 288 opposite the reflective aperture layer 286 .
  • a series of light redirectors 291 are formed on the rear surface of the light guide, proximate the second reflective layer.
  • the light redirectors 291 may be either diffuse or specular reflectors.
  • One or more light sources 292 inject light 294 into the light guide 288 .
  • an additional transparent substrate is positioned between the light guide 290 and the light modulation array 270 .
  • the reflective aperture layer 286 is formed on the additional transparent substrate instead of on the surface of the light guide 290 .
  • applying a voltage to the electrode 282 of a cell causes the light absorbing oil 280 in the cell to collect in one portion of the cell 272 .
  • the light absorbing oil 280 no longer obstructs the passage of light through the aperture formed in the reflective aperture layer 286 (see, for example, cells 272 b and 272 c ).
  • Light escaping the backlight at the aperture is then able to escape through the cell and through a corresponding color (for example, red, green, or blue) filter in the set of color filters 276 to form a color pixel in an image.
  • when the electrode 282 is grounded, the light absorbing oil 280 covers the aperture in the reflective aperture layer 286 , absorbing any light 294 attempting to pass through it.
  • the area under which oil 280 collects when a voltage is applied to the cell 272 constitutes wasted space in relation to forming an image. This area cannot pass light through, whether a voltage is applied or not, and therefore, without the inclusion of the reflective portions of reflective apertures layer 286 , would absorb light that otherwise could be used to contribute to the formation of an image. However, with the inclusion of the reflective aperture layer 286 , this light, which otherwise would have been absorbed, is reflected back into the light guide 290 for future escape through a different aperture.
  • the rolling actuator-based light modulator 220 , light tap 250 , and electrowetting-based light modulation array 270 are not the only examples of non-shutter-based MEMS modulators suitable for control by the control matrices described herein.
  • Other forms of non-shutter-based MEMS modulators could likewise be controlled by various ones of the control matrices described herein without departing from the scope of the invention.
  • FIG. 3A is a schematic diagram of a control matrix 300 suitable for controlling the light modulators incorporated into the direct-view MEMS-based display apparatus 100 of FIG. 1 , according to an illustrative embodiment of the invention.
  • FIG. 3B is a perspective view of an array 320 of shutter-based light modulators connected to the control matrix 300 of FIG. 3A , according to an illustrative embodiment of the invention.
  • the control matrix 300 may address an array of pixels 320 (the “array 320 ”).
  • Each pixel 301 includes an elastic shutter assembly 302 , such as the shutter assembly 200 of FIG. 2A , controlled by an actuator 303 .
  • Each pixel also includes an aperture layer 322 that includes aperture holes 324 . Further electrical and mechanical descriptions of shutter assemblies such as shutter assembly 302 , and variations thereon, can be found in U.S. patent application Ser. Nos. 11/251,035 and 11/326,696.
  • the control matrix 300 is fabricated as a diffused or thin-film-deposited electrical circuit on the surface of a substrate 304 on which the shutter assemblies 302 are formed.
  • the control matrix 300 includes a scan-line interconnect 306 for each row of pixels 301 in the control matrix 300 and a data-interconnect 308 for each column of pixels 301 in the control matrix 300 .
  • Each scan-line interconnect 306 electrically connects a write-enabling voltage source 307 to the pixels 301 in a corresponding row of pixels 301 .
  • Each data interconnect 308 electrically connects a data voltage source, (“Vd source”) 309 to the pixels 301 in a corresponding column of pixels 301 .
  • the data voltage V d provides the majority of the energy necessary for actuation of the shutter assemblies 302 .
  • the data voltage source 309 also serves as an actuation voltage source.
  • the control matrix 300 includes a transistor 310 and a capacitor 312 .
  • the gate of each transistor 310 is electrically connected to the scan-line interconnect 306 of the row in the array 320 in which the pixel 301 is located.
  • the source of each transistor 310 is electrically connected to its corresponding data interconnect 308 .
  • the actuators 303 of each shutter assembly include two electrodes.
  • the drain of each transistor 310 is electrically connected in parallel to one electrode of the corresponding capacitor 312 and to the one of the electrodes of the corresponding actuator 303 .
  • the other electrode of the capacitor 312 and the other electrode of the actuator 303 in shutter assembly 302 are connected to a common or ground potential.
  • the control matrix 300 write-enables each row in the array 320 in sequence by applying V we to each scan-line interconnect 306 in turn.
  • for a write-enabled row, the application of V we to the gates of the transistors 310 of the pixels 301 in the row allows the flow of current through the data interconnects 308 through the transistors to apply a potential to the actuator 303 of the shutter assembly 302 . While the row is write-enabled, data voltages V d are selectively applied to the data interconnects 308 .
  • the data voltage applied to each data interconnect 308 is varied in relation to the desired brightness of the pixel 301 located at the intersection of the write-enabled scan-line interconnect 306 and the data interconnect 308 .
  • the data voltage is selected to be either a relatively low magnitude voltage (i.e., a voltage near ground) or to meet or exceed V at (the actuation threshold voltage).
  • when the applied data voltage meets or exceeds V at , the actuator 303 in the corresponding shutter assembly 302 actuates, opening the shutter in that shutter assembly 302 .
  • the voltage applied to the data interconnect 308 remains stored in the capacitor 312 of the pixel 301 even after the control matrix 300 ceases to apply V we to a row. It is not necessary, therefore, to wait and hold the voltage V we on a row for times long enough for the shutter assembly 302 to actuate; such actuation can proceed after the write-enabling voltage has been removed from the row.
  • the voltages in the capacitors 312 in a row remain substantially stored until an entire video frame is written, and in some implementations until new data is written to the row.
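
The row-at-a-time addressing of control matrix 300 described above can be summarized in pseudocode. The sketch below uses invented function names and a software stand-in for the drive electronics; it only mirrors the sequence of write-enabling a scan line, driving the data interconnects, and relying on the per-pixel capacitors to retain the data voltage.

```python
def write_bitplane(bitplane, set_write_enable, set_data_voltages):
    """Address the array one row at a time (illustrative pseudocode).

    bitplane: 2-D sequence of desired modulator states.
    set_write_enable(row, on): drives V we on that row's scan-line interconnect.
    set_data_voltages(states): places data voltages on the column interconnects.
    Because each pixel's capacitor retains the stored voltage after the row is
    no longer write-enabled, actuation can complete while later rows are written.
    """
    for row, states in enumerate(bitplane):
        set_write_enable(row, True)    # apply the write-enabling voltage
        set_data_voltages(states)      # ~ground or >= V at, per column
        set_write_enable(row, False)   # charge remains on the capacitors 312

# Trivial stand-in drivers that only log what they would do.
log = []
write_bitplane(
    bitplane=[[1, 0, 1], [0, 1, 1]],
    set_write_enable=lambda r, on: log.append(f"row {r} write-enable {'on' if on else 'off'}"),
    set_data_voltages=lambda states: log.append(f"columns driven to {states}"),
)
print("\n".join(log))
```
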
  • the pixels 301 of the array 320 are formed on a substrate 304 .
  • the array includes an aperture layer 322 , disposed on the substrate, which includes a set of aperture holes 324 for each pixel 301 in the array 320 .
  • the aperture holes 324 are aligned with the shutter assemblies 302 in each pixel.
  • the substrate 304 is made of a transparent material, such as glass or plastic.
  • in other cases, the substrate 304 is made of an opaque material in which holes are etched to form the aperture holes 324 .
  • the shutter assembly 302 together with the actuator 303 can be made bi-stable. That is, the shutters can exist in at least two equilibrium positions (e.g. open or closed) with little or no power required to hold them in either position. More particularly, the shutter assembly 302 can be mechanically bi-stable. Once the shutter of the shutter assembly 302 is set in position, no electrical energy or holding voltage is required to maintain that position. The mechanical stresses on the physical elements of the shutter assembly 302 can hold the shutter in place.
  • the shutter assembly 302 together with the actuator 303 can also be made electrically bi-stable.
  • in an electrically bi-stable shutter assembly, there exists a range of voltages below the actuation voltage of the shutter assembly which, if applied to a closed actuator (with the shutter being either open or closed), hold the actuator closed and the shutter in position, even if an opposing force is exerted on the shutter.
  • the opposing force may be exerted by a spring such as spring 207 in shutter-based light modulator 200 , or the opposing force may be exerted by an opposing actuator, such as an “open” or “closed” actuator.
  • the light modulator array 320 is depicted as having a single MEMS light modulator per pixel. Other embodiments are possible in which multiple MEMS light modulators are provided in each pixel, thereby providing the possibility of more than just binary “on” or “off” optical states in each pixel. Certain forms of coded area-division gray scale are possible wherein multiple MEMS light modulators are provided in the pixel and where the aperture holes 324 associated with each of the light modulators have unequal areas.
  • FIG. 3D is yet another suitable control matrix 340 for inclusion in the display apparatus 100 , according to an illustrative embodiment of the invention.
  • Control matrix 340 controls an array of pixels 342 that include shutter assemblies 344 .
  • the control matrix 340 includes a single data interconnect 348 for each column of pixels 342 in the control matrix.
  • the actuators in the shutter assemblies 344 can be made either electrically bi-stable or mechanically bi-stable.
  • the control matrix 340 includes a scan-line interconnect 346 for each row of pixels 342 in the control matrix 340 .
  • the control matrix 340 further includes a charge interconnect 350 , a global actuation interconnect 354 , and a shutter common interconnect 355 .
  • These interconnects 350 , 354 and 355 are shared among pixels 342 in multiple rows and multiple columns in the array. In one implementation, the interconnects 350 , 354 , and 355 are shared among all pixels 342 in the control matrix 340 .
  • Each pixel 342 in the control matrix includes a shutter charge transistor 356 , a shutter discharge transistor 358 , a shutter write-enable transistor 357 , and a data store capacitor 359 .
  • Control matrix 340 also incorporates an optional voltage stabilizing capacitor 352 which is connected in parallel with the source and drain of discharge switch transistor 358 .
  • the gate terminals of the charging transistors 356 are connected directly to the charge interconnect 350 , along with the drain terminals of the charging transistors 356 .
  • the charging transistors 356 operate essentially as diodes; they can pass a current in only one direction.
  • the control matrix 340 applies a voltage pulse to the charge interconnect 350 , allowing current to flow through charging transistor 356 and into the shutter assemblies 344 of the pixels 342 .
  • each of the shutter electrodes of shutter assemblies 344 will be in the same voltage state.
  • the potential of charge interconnect 350 is reset to zero, and the charging transistors 356 will prevent the charge stored in the shutter assemblies 344 from being dissipated through charge interconnect 350 .
  • the charge interconnect 350 transmits a pulsed voltage equal to or greater than V at , e.g., 40V.
  • the imposition of a voltage in excess of V at causes all of the shutter assemblies connected to the charge interconnect 350 to actuate or move into the same state, for instance the shutter closed state.
  • the control matrix 340 applies a write-enabling voltage V we to the scan-line interconnect 346 corresponding to each row. While a particular row of pixels 342 is write-enabled, the control matrix 340 applies a data voltage to the data interconnect 348 corresponding to each column of pixels 342 in the control matrix 340 .
  • the application of V we to the scan-line interconnect 346 for the write-enabled row turns on the write-enable transistor 357 of the pixels 342 in the corresponding scan line.
  • the voltage applied to the data interconnect 348 is thereby stored on the data store capacitor 359 of the respective pixels 342.
  • in control matrix 340, the global actuation interconnect 354 is connected to the source of the shutter discharge switch transistor 358. Maintaining the global actuation interconnect 354 at a potential significantly above that of the shutter common interconnect 355 prevents the turn-on of the discharge switch transistor 358, regardless of what charge is stored on the capacitor 359.
  • Global actuation in control matrix 340 is achieved by bringing the potential on the global actuation interconnect 354 to ground or to substantially the same potential as the shutter common interconnect 355, enabling the discharge switch transistor 358 to turn on in accordance with whether a data voltage has been stored on capacitor 359.
  • if a data voltage has been stored on capacitor 359, the discharge transistor turns on, charge drains out of the actuators of shutter assembly 344, and the shutter assembly 344 is allowed to move or actuate into its relaxed state, for instance the shutter open state.
  • if no data voltage has been stored, the discharge transistor 358 does not turn on and the shutter assembly 344 remains charged.
  • a voltage remains across the actuators of shutter assemblies 344 and those pixels remain, for instance, in the shutter closed state.
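  • The charge-then-selectively-discharge sequence described above can be pictured in pseudo-firmware form. The C sketch below is illustrative only: the driver functions (pulse_charge_interconnect, set_write_enable, set_data_voltage, set_global_actuation_grounded, wait_for_shutters) are hypothetical stand-ins for the data, scan, and common drivers, and the 4x4 array size is arbitrary.

```c
#include <stdbool.h>
#include <stdio.h>

#define NUM_ROWS 4   /* tiny array, for illustration only */
#define NUM_COLS 4

/* Stub driver hooks; real hardware would drive the named interconnects. */
static void pulse_charge_interconnect(double v)   { printf("charge pulse %.0f V\n", v); }
static void set_write_enable(int row, bool on)    { printf("row %d write-enable %d\n", row, on); }
static void set_data_voltage(int col, bool on)    { (void)col; (void)on; }
static void set_global_actuation_grounded(bool g) { printf("global actuation %s\n", g ? "grounded" : "raised"); }
static void wait_for_shutters(void)               { /* shutter actuation time */ }

/* One sub-frame update of a FIG. 3D style control matrix:
   1. a charge pulse above Vat on charge interconnect 350 drives every
      shutter to the same (e.g. closed) state;
   2. each row is write-enabled in turn and data voltages are stored on the
      data store capacitors 359;
   3. grounding global actuation interconnect 354 lets the discharge
      transistors of pixels holding a data voltage turn on, releasing those
      shutters into their relaxed (e.g. open) state. */
static void update_subframe(const bool open_shutter[NUM_ROWS][NUM_COLS])
{
    pulse_charge_interconnect(40.0);

    for (int row = 0; row < NUM_ROWS; row++) {
        set_write_enable(row, true);
        for (int col = 0; col < NUM_COLS; col++)
            set_data_voltage(col, open_shutter[row][col]);
        set_write_enable(row, false);
    }

    set_global_actuation_grounded(true);
    wait_for_shutters();
    set_global_actuation_grounded(false);
}

int main(void)
{
    bool image[NUM_ROWS][NUM_COLS] = { { true, false, true, false } };
    update_subframe(image);
    return 0;
}
```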
  • Control matrix 340 does not depend on electrical bi-stability in the shutter assembly 344 in order to achieve global actuation.
  • the global actuation interconnect 354 is connected to every shutter discharge transistor 358 in every row and column in the array of pixels. In other implementations the global actuation interconnect 354 is connected to the shutter discharge transistors within only a sub-group of pixels in multiple rows and columns.
  • the array of pixels can be arranged in banks, where each bank of pixels is connected by means of a global actuation interconnect to a unique global actuation driver. In this implementation the control circuit can load data into a selected bank and then actuate only the selected bank globally by means of the selected global actuation driver.
  • the display is separated into two banks, with one set of global drivers and global actuation interconnects connected to pixels in the odd-numbered rows while a separate set of global drivers and global actuation interconnects is connected to pixels in the even-numbered rows.
  • as many as 6 or 8 separately actuatable addressing banks are employed.
  • Other implementations of circuits for controlling displays are described in U.S. Ser. No. 11/607,715 filed Dec. 1, 2006 and entitled “Circuits for Controlling Display Apparatus,” which is incorporated herein by reference.
  • FIG. 3C illustrates a portion of a direct view display 380 that includes the array of light modulators 320 depicted in FIG. 3B disposed on top of backlight 330 .
  • the backlight 330 is made of a transparent material, e.g. glass or plastic, and functions as a light guide for evenly distributing light from lamps 382, 384, and 386 throughout the display plane.
  • the lamps 382 , 384 , and 386 can be alternate color lamps, e.g. red, green, and blue lamps respectively.
  • a variety of lamps 382-386 can be employed in the displays, including without limitation: incandescent lamps, fluorescent lamps, lasers, or light emitting diodes (LEDs). Further, the lamps 382-386 of direct view display 380 can be combined into a single assembly containing multiple lamps. For instance, a combination of red, green, and blue LEDs can be combined with or substituted for a white LED in a small semiconductor chip, or assembled into a small multi-lamp package. Similarly, each lamp can represent an assembly of 4-color LEDs, for instance a combination of red, yellow, green, and blue LEDs.
  • the shutter assemblies 302 function as light modulators. By use of electrical signals from the associated control matrix the shutter assemblies 302 can be set into either an open or a closed state. Only the open shutters allow light from the lightguide 330 to pass through to the viewer, thereby forming a direct view image.
  • the light modulators are formed on the surface of substrate 304 that faces away from the light guide 330 and toward the viewer.
  • the substrate 304 can be reversed, such that the light modulators are formed on a surface that faces toward the light guide.
  • it is sometimes preferable to form an aperture layer, such as aperture layer 322 directly onto the top surface of the light guide 330 .
  • it is preferable that the spacing between the plane of the shutter assemblies 302 and the aperture layer 322 be kept as small as possible, preferably less than 10 microns, and in some cases as close as 1 micron.
  • Descriptions of other optical assemblies useful for this invention can be found in US Patent Application Publication No. 20060187528A1 filed Sep. 2, 2005 and entitled “Methods and Apparatus for Spatial Light Modulation” and in U.S. Ser. No. 11/528,191 filed Sep. 26, 2006 and entitled “Display Apparatus with Improved Optical Cavities,” which are both incorporated herein by reference.
  • color pixels are generated by illuminating groups of light modulators corresponding to different colors, for example, red, green, and blue. Each light modulator in the group has a corresponding filter to achieve the desired color.
  • the filters absorb a great deal of light, in some cases as much as 60% of the light passing through the filters, thereby limiting the efficiency and brightness of the display.
  • the use of multiple light modulators per pixel decreases the amount of space on the display that can be used to contribute to a displayed image, further limiting the brightness and efficiency of such a display.
  • the human brain, in response to viewing rapidly changing images (for example, at frequencies greater than 20 Hz), averages the images together to perceive an image which is the combination of the images displayed within the corresponding period.
  • This phenomenon can be utilized to display color images while using only single light modulators for each pixel of a display, using a technique referred to in the art as field sequential color.
  • field sequential color techniques eliminate the need for color filters and multiple light modulators per pixel.
  • an image frame to be displayed is divided into a number of sub-frame images, each corresponding to a particular color component (for example, red, green, or blue) of the original image frame.
  • the light modulators of a display are set into states corresponding to the color component's contribution to the image.
  • the light modulators then are illuminated by a lamp of the corresponding color.
  • the sub-images are displayed in sequence at a frequency (for example, greater than 60 Hz) sufficient for the brain to perceive the series of sub-frame images as a single image.
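  • As a rough illustration of this field sequential sequence, the following C sketch serializes the per-color addressing and illumination events of one frame. The helper functions and the sub-frame period are hypothetical placeholders for the modulator and lamp drivers, not part of any control matrix described herein.

```c
#include <stdio.h>

enum color { RED, GREEN, BLUE, NUM_COLORS };
static const char *color_name[NUM_COLORS] = { "red", "green", "blue" };

/* Stubs standing in for the modulator and lamp drivers. */
static void load_subframe(enum color c) { printf("load %s sub-frame data\n", color_name[c]); }
static void lamp(enum color c, int on)  { printf("%s lamp %s\n", color_name[c], on ? "on" : "off"); }
static void wait_subframe_period(void)  { /* e.g. 1/(3*60) s at a 60 Hz frame rate */ }

/* One field-sequential-color frame: set the modulators for each color
   component in turn, then flash the matching lamp; displayed rapidly
   enough, the eye fuses the sub-frame images into one full-color image. */
static void display_frame(void)
{
    for (int c = RED; c < NUM_COLORS; c++) {
        load_subframe((enum color)c);   /* addressing event (lamps off) */
        lamp((enum color)c, 1);         /* illumination event           */
        wait_subframe_period();
        lamp((enum color)c, 0);
    }
}

int main(void) { display_frame(); return 0; }
```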
  • the data used to generate the sub-frames are often fractured in various memory components. For example, in some displays, data for a given row of the display are kept in a shift-register dedicated to that row. Image data is shifted in and out of each shift register to a light modulator in a corresponding column in that row of the display according to a fixed clock cycle.
  • FIG. 4 is a timing diagram 400 corresponding to a display process for displaying images using field sequential color, which can be implemented according to an illustrative embodiment of the invention, for example, by a MEMS direct-view display described in FIG. 7 .
  • the timing diagrams included herein, including the timing diagram 400 of FIG. 4 conform to the following conventions.
  • the top portions of the timing diagrams illustrate light modulator addressing events.
  • the bottom portions illustrate lamp illumination events.
  • the addressing portions depict addressing events by diagonal lines spaced apart in time. Each diagonal line corresponds to a series of individual data loading events during which data is loaded into each row of an array of light modulators, one row at a time. Depending on the control matrix used to address and drive the modulators included in the display, each loading event may require a waiting period to allow the light modulators in a given row to actuate. In some implementations, all rows in the array of light modulators are addressed prior to actuation of any of the light modulators. Upon completion of loading data into the last row of the array of light modulators, all light modulators are actuated substantially simultaneously. One method for such actuation is described further in relation to FIG. 11 .
  • Lamp illumination events are illustrated by pulse trains corresponding to each color of lamp included in the display. Each pulse indicates that the lamp of the corresponding color is illuminated, thereby displaying the sub-frame image loaded into the array of light modulators in the immediately preceding addressing event.
  • the time at which the first addressing event in the display of a given image frame begins is labeled on each timing diagram as AT 0 . In most of the timing diagrams, this time falls shortly after the detection of a voltage pulse vsync, which precedes the beginning of each video frame received by a display.
  • the times at which each subsequent addressing event takes place are labeled as AT 1 , AT 2 , . . . AT(n ⁇ 1), where n is the number of sub-frame images used to display the image frame.
  • the diagonal lines are further labeled to indicate the data being loaded into the array of light modulators. For example, in the timing diagrams included herein, D 0 represents the first data loaded into the array of light modulators for a frame and D(n−1) represents the last data loaded into the array of light modulators for the frame.
  • a bitplane is a coherent set of data identifying desired modulator states for modulators in multiple rows and multiple columns of an array of light modulators.
  • each bitplane corresponds to one of a series of sub-frame images derived according to a binary coding scheme. That is, each sub-frame image for a color component of an image frame is weighted according to a binary series 1, 2, 4, 8, 16, etc.
  • the bitplane with the lowest weighting is referred to as the least significant bitplane and is labeled in the timing diagrams and referred to herein by the first letter of the corresponding color component followed by the number 0.
  • for each next-most significant bitplane of a color component, the number following the first letter of the color component increases by one. For example, for an image frame broken into 4 bitplanes per color, the least significant red bitplane is labeled and referred to as the R 0 bitplane. The next most significant red bitplane is labeled and referred to as R 1, and the most significant red bitplane is labeled and referred to as R 3.
  • Lamp-related events are labeled as LT 0 , LT 1 , LT 2 . . . LT(n ⁇ 1).
  • the lamp-related event times labeled in a timing diagram either represent times at which a lamp is illuminated or times at which a lamp is extinguished.
  • the meaning of the lamp times in a particular timing diagram can be determined by comparing their position in time relative to the pulse trains in the illumination portion of the particular timing diagram.
  • a single sub-frame image is used to display each of three color components of an image frame.
  • data, D 0 indicating modulator states desired for a red sub-frame image are loaded into an array of light modulators beginning at time AT 0 .
  • the red lamp is illuminated at time LT 0 , thereby displaying the red sub-frame image.
  • Data, D 1 indicating modulator states corresponding to a green sub-frame image are loaded into the array of light modulators at time AT 1 .
  • a green lamp is illuminated at time LT 1 .
  • data, D 2 indicating modulator states corresponding to a blue sub-frame image are loaded into the array of light modulators and a blue lamp is illuminated at times AT 2 and LT 2 , respectively. The process then repeats for subsequent image frames to be displayed.
  • the level of gray scale achievable by a display that forms images according to the timing diagram of FIG. 4 depends on how finely the state of each light modulator can be controlled. For example, if the light modulators are binary in nature, i.e., they can only be on or off, the display will be limited to generating 8 different colors.
  • the level of gray scale can be increased for such a display by providing light modulators that can be driven into additional intermediate states.
  • MEMS light modulators can be provided which exhibit an analog response to applied voltage.
  • the number of grayscale levels achievable in such a display is limited only by the resolution of digital to analog converters which are supplied in conjunction with data voltage sources, such as voltage source 309 .
  • finer grayscale can be generated if the time period used to display each sub-frame image is split into multiple time periods, each having its own corresponding sub-frame image.
  • a display that forms two sub-frame images of equal length and light intensity per color component can generate 27 different colors instead of 8.
  • Gray scale techniques that break each color component of an image frame into multiple sub-frame images are referred to, generally, as time division gray scale techniques.
  • FIG. 5 is a timing diagram corresponding to a display process for displaying an image frame by displaying multiple equally weighted sub-frame images per color that can be implemented by various embodiments of the invention.
  • each color component of an image frame is divided into four equally weighted sub-frame images. More particularly, each sub-frame image for a given color component is illuminated for the same amount of time at the same lamp intensity.
  • the number portion of the data identifier (e.g., R 0, R 1, or G 3) only refers to the order in which the corresponding sub-frame image is displayed, and not to any weighting value.
  • if the light modulators are binary in nature, a display utilizing this grayscale technique can generate 5 gray scale levels per color, or 125 distinct colors (see the arithmetic note below).
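  • As a check on these figures, under the stated assumption of binary modulators and equally weighted, equal-intensity sub-frame images, the arithmetic is

$$\text{levels per color} = k + 1, \qquad \text{colors} = (k+1)^3,$$

so $k = 2$ equal sub-frames per color gives $3^3 = 27$ colors, and the $k = 4$ case of FIG. 5 gives $5^3 = 125$ colors, consistent with the counts quoted above.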
  • data, R 0 indicating modulator states desired for a first red sub-frame image are loaded into an array of light modulators beginning at time AT 0 .
  • the red lamp is illuminated, thereby displaying the first red sub-frame image.
  • the red lamp is extinguished at time AT 1 , which is when data, R 1 , indicating modulator states corresponding to the next red sub-frame image are loaded into the array of light modulators.
  • the same steps repeat for each red sub-frame image corresponding to data R 1 , R 2 and R 3 .
  • the steps as described for the red sub-frame images R 0 -R 3 then repeat for the green sub-frame images G 0 -G 3 , and then for the blue sub-frame images B 0 -B 3 .
  • the process then repeats for subsequent image frames to be displayed.
  • the addressing times in FIG. 5 can be established through a variety of methods. Since the data is loaded at regular intervals, and since the sub-frame images are illuminated for equal times, a fixed clock cycle running with a frequency 12 times that of the vsync frequency can be sufficient for coordinating the display process.
  • FIGS. 6A-6C depict a process for generating a bitplane, according to an illustrative embodiment of the invention.
  • FIG. 6A is a schematic diagram of a digital image signal 600 received by a display device.
  • the image signal 600 encodes data corresponding to image frames.
  • the image signal 600 includes a series of bits for each pixel included in the image frame.
  • the data is encoded in a pixel-by-pixel fashion. That is, the image signal includes all data for the color of a single pixel in the image frame before it includes data for the next pixel.
  • the data for an image frame begins with a vsync signal indicating the beginning of the image frame.
  • the image signal 600 then includes, for example, 24 bits indicating the color of the pixel in the first row of the first column of the image frame. Of the 24 bits, 8 encode a red component of the pixel, 8 encode a green component, and 8 encode a blue component of the pixel. Each set of eight bits is referred to as a coded word. An eight bit coded word for each color enables a description of 256 unique brightness levels for each color, or 16 million unique combinations of the colors red, green, and blue.
  • each of the 8 bits represents a particular position or place value (also referred to as a significance value) in the coded word.
  • these place values are indicated by a coding scheme such as R 0 , R 1 , R 2 , R 3 , etc.
  • R 0 represents the least significant bit for the color red.
  • R 7 represents the most significant bit for the color red.
  • G 7 is the most significant bit for the color green, and B 7 is the most significant bit for the color blue.
  • the place values corresponding to R 0, R 1, R 2, . . . R 7 are given by the binary series 2^0, 2^1, 2^2, . . . , 2^7.
  • the image signal 600 may include more or fewer bits per color component of an image.
  • the image signal 600 may include 3, 4, 5, 6, 7, 9, 10, 11, 12 or more bits per color component of an image frame.
  • the data as received in image signal 600 is organized by rows and columns. Generally the image signal provides all of the data for pixels in the first row before proceeding to subsequent rows. Within the first row, all of the data is received for the pixel in the first column before it is received for pixels in succeeding columns of the same row.
  • FIG. 6B is a schematic diagram of a memory buffer 620 useful for converting a received image signal into a bitplane, according to an illustrative embodiment of the invention.
  • a bitplane includes data for pixels in multiple columns and multiple rows of a display corresponding to a single significance value of a grayscale coded word for a color component of an image frame.
  • bits having the same significance value are grouped together into a single data structure.
  • a small memory buffer 620 is employed to organize incoming image data.
  • the memory buffer 620 is organized in an array of rows and columns, and allows for data to be read in and out by addressing either individual rows or individual columns.
  • Incoming data which, as described above, is received in a pixel by pixel format, is read into the memory buffer 620 in successive rows.
  • the memory buffer 620 stores data relevant to only a single designated row of the display, i.e. it operates on only a fraction of the incoming data at any given time.
  • Each numbered row within the memory buffer 620 contains the complete pixel data for a given column of the designated display row.
  • Each row of the memory buffer 620 contains complete gray scale data for a given pixel.
  • the data in the memory buffer 620 can be read out to populate a bitplane data structure.
  • the data is read out column by column.
  • Each column of the memory buffer 620 holds a single place value of the gray scale coded words for the pixels in the designated row of the display.
  • These values correspond to desired states of light modulators in the display. For example, a 0 may refer to an "open" light modulator state and a 1 may refer to a "closed" light modulator state, or vice versa. This process repeats for multiple rows in the display; a sketch of such a bitplane extraction appears below.
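  • A compact way to picture this row-in, column-out operation is the following C sketch, which slices one display row of 24-bit pixel words into eight red bitplane rows (R0 through R7). The 0x00RRGGBB packing, the row width, and the function names are illustrative assumptions, not the patent's data format.

```c
#include <stdint.h>
#include <stdio.h>

#define COLS 10          /* columns shown in FIG. 6C              */
#define BITS_PER_COLOR 8 /* 8-bit coded word per color component  */

/* Extract the red bitplane rows for one display row.  pixels[] holds the
   incoming 24-bit words (assumed 0x00RRGGBB); bitplane_row[b] receives one
   bit per column for red significance level b (R0 = least significant). */
static void row_to_red_bitplanes(const uint32_t pixels[COLS],
                                 uint16_t bitplane_row[BITS_PER_COLOR])
{
    for (int b = 0; b < BITS_PER_COLOR; b++)
        bitplane_row[b] = 0;

    for (int col = 0; col < COLS; col++) {
        uint8_t red = (pixels[col] >> 16) & 0xFF;          /* red coded word   */
        for (int b = 0; b < BITS_PER_COLOR; b++)
            if (red & (1u << b))
                bitplane_row[b] |= (uint16_t)(1u << col);  /* column bit, plane b */
    }
}

int main(void)
{
    uint32_t pixels[COLS] = { 0x00FF0000, 0x00800000, 0x00010000 }; /* sample row */
    uint16_t planes[BITS_PER_COLOR];
    row_to_red_bitplanes(pixels, planes);
    for (int b = 0; b < BITS_PER_COLOR; b++)
        printf("R%d row bits: 0x%03X\n", b, (unsigned)planes[b]);
    return 0;
}
```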
  • FIG. 6C is a schematic diagram of portions of two bitplanes 650 and 660 , according to an illustrative embodiment of the invention.
  • the first bitplane 650 includes data corresponding to the least significant bits of the gray scale coded words identifying the level of red (i.e., R 0 values) for the first 10 columns and 15 rows of pixels of a display.
  • the second bitplane 660 includes data corresponding to the second-least significant bits of the gray scale coded words identifying the level of red (i.e., R 1 ) for the same 10 columns and 15 rows of pixels of the display.
  • a sub-frame data set will refer herein to the general case of data structures which are not necessarily bitplanes: namely data structures that store information about the desired states of modulators in multiple rows and multiple columns of the array.
  • in ternary coding, a single sub-frame data set would include a ternary value for each of the pixels in multiple rows and columns, e.g. a 0, 1, or 2.
  • Sequential sub-frame images according to a ternary coding scheme would be weighted according to the base-3 numbering system, with weights in the series 1, 3, 9, 27, etc.
  • a ternary coding system makes possible even greater numbers of achievable gray scale levels when displayed using an equal number of sub-frame images.
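  • As an arithmetic illustration only: for a fixed number $n$ of sub-frame images per color, a base-$b$ coding scheme of this kind can in principle express

$$b^{\,n} \text{ gray levels per color},$$

so for $n = 4$, binary coding yields $2^4 = 16$ levels, ternary coding $3^4 = 81$, and base-5 coding $5^4 = 625$. Realizing these counts assumes pixels or lamp intensities capable of taking $b$ distinct values during each sub-frame.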
  • the use of quaternary or base-5 coding systems becomes advantageous in the control system.
  • FIG. 7 is a block diagram of a direct-view display 700 , according to an illustrative embodiment of the invention.
  • the direct-view display 700 includes an array of light modulators 702 , a controller 704 , a set of lamps 706 , and driver sets 708 , 710 , 714 , and 716 .
  • the array of light modulators 702 includes light modulators arranged in rows and columns. Suitable light modulators include, without limitation, any of the MEMS-based light modulators described above in relation to FIGS. 2A-2D.
  • the array of light modulators 702 takes the form of the array of light modulators 320 depicted in FIG. 3B .
  • the light modulators are controlled by a control matrix, such as the control matrices described in FIGS. 3A and 3D.
  • the controller receives an image signal 717 from an external source and outputs data and control signals to the drivers 708, 710, 714, and 716 to control the light modulators in the array of light modulators 702 and the lamps 706.
  • the order in which the data and control signals are output is referred to herein as an "output sequence," described further below.
  • the controller 704 includes an input processing module 718 , a memory control module 720 , a frame buffer 722 , a timing control module 724 , and a schedule table store 726 .
  • a module may be implemented as a hardware circuit including application specific integrated circuits, custom VLSI circuits or gate arrays, off-the-shelf semiconductors such as logic chips, memories, transistors, or other discrete components.
  • a module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices or the like.
  • Modules may also be implemented in software for execution by various types of processors.
  • An identified module of executable code may, for instance, include one or more physical or logical blocks of computer instructions which may, for instance, be organized as an object, procedure, or function. Nevertheless, the executables of an identified module need not be physically located together, but may include disparate instructions stored in different locations which, when joined logically together, make up the module and achieve the stated purpose for the module.
  • a module of executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices.
  • operational data may be identified and illustrated herein within modules, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices, and may exist, at least partially, merely as electronic signals on a system or network.
  • the illustration of direct view display 700 in FIG. 7 portrays the controller 704 and drivers 708 , 710 , 714 , and 716 as separate functional blocks. These blocks are understood to represent distinguishable circuits and/or modules of executable code.
  • the blocks 704 , 708 , 710 , 714 , and 716 may be provided as distinct chips or circuits which are connected together by means of circuit boards and/or cables. In other implementations, several of these blocks can be designed together into a single semiconductor chip such that their boundaries are nearly indistinguishable except by function.
  • the storage area referred to as frame buffer 722 is provided as a functional area within a custom design of the controller circuit 704 . In other implementations the frame buffer 722 is represented by a separate off-the-shelf memory chip such as a DRAM or SRAM.
  • the input processing module 718 receives the image signal 717 and processes the data encoded therein into a format suitable for displaying via the array of light modulators 702 .
  • the input processing module 718 takes the data encoding each image frame and converts it into a series of sub-frame data sets.
  • a sub-frame data set includes information about the desired states of modulators in multiple rows and multiple columns of the array of light modulators 702 aggregated into a coherent data structure.
  • the number and content of sub-frame data sets used to display an image frame depends on the grayscale technique employed by the controller 704 .
  • the number and content of the sub-frame data sets needed to form an image frame using a coded time-division gray scale technique differ from the number and content of the sub-frame data sets used to display an image frame using a non-coded time division gray scale technique.
  • the image processing module 718 may convert the image signal 717 into non-coded sub-frame data sets, ternary coded sub-frame data sets, or other forms of coded sub-frame data sets. Preferably, however, the image processing module 718 converts the image signal 717 into bitplanes, as described above in relation to FIGS. 6A-6C.
  • the input processing module can carry out a number of other optional processing tasks. It may re-format or interpolate incoming data. For instance, it may rescale incoming data horizontally, vertically, or both, to fit within the spatial-resolution limits of modulator array 702 . It may also convert incoming data from an interlaced format to a progressive scan format. It may also resample the incoming data in time to reduce frame rates while maintaining acceptable flicker within the characteristics of MEMS display 700 . It may perform adjustments to contrast gradations of the incoming data, in some cases referred to as gamma corrections, to better match the gamma characteristics and/or contrast precision available in the MEMS display 700 .
  • for displays employing a fourth lamp color, the input processing module will transform the data from the incoming 3-color space and map it to coordinates appropriate to the 4-color space.
  • the input processing module 718 outputs the sub-frame data sets to the memory control module 720 .
  • the memory control module 720 then stores the sub-frame data sets in the frame buffer 722 .
  • the frame buffer is preferably a random access memory, although other types of serial memory can be used without departing from the scope of the invention.
  • the memory control module 720 stores the sub-frame data set in a predetermined memory location based on the color and significance in a coding scheme of the sub-frame data set. In other implementations, the memory control module 720 stores the sub-frame data set in a dynamically determined memory location and stores that location in a lookup table for later identification.
  • the frame buffer 722 is configured for the storage of bitplanes.
  • the memory control module 720 is also responsible for, upon instruction from the timing control module 724 , retrieving sub-image data sets from the frame buffer 722 and outputting them to the data drivers 708 .
  • the data drivers 708 load the data output from the memory control module 720 into the light modulators of the array of light modulators 702 .
  • the memory control module 720 outputs the data in the sub-image data sets one row at a time.
  • the frame buffer 722 includes two buffers, whose roles alternate. While the memory control module 720 stores newly generated bitplanes corresponding to a new image frame in one buffer, it extracts bitplanes corresponding to the previously received image frame from the other buffer for output to the array of light modulators 702 . Both buffer memories can reside within the same circuit, separated only by address.
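  • The alternating-buffer arrangement can be sketched as a simple ping-pong scheme in C. The bank sizes, word widths, and function names below are illustrative assumptions; the point is only that writes for the incoming frame and reads for the outgoing frame target different halves of the same memory, swapped once per vsync.

```c
#include <stdint.h>
#include <string.h>
#include <stdio.h>

#define PLANES 24            /* e.g. 8 bitplanes x 3 colors per frame   */
#define PLANE_WORDS 1024     /* storage per bitplane (illustrative)     */

/* Two buffer halves whose roles alternate each frame: one receives newly
   derived bitplanes while the other is read out to the modulators. */
static uint32_t frame_buffer[2][PLANES][PLANE_WORDS];
static int write_bank = 0;   /* bank receiving the incoming frame       */

static void store_bitplane(int plane, const uint32_t *data)
{
    memcpy(frame_buffer[write_bank][plane], data, sizeof frame_buffer[0][0]);
}

static const uint32_t *fetch_bitplane(int plane)
{
    return frame_buffer[1 - write_bank][plane];   /* read the other bank */
}

static void swap_banks(void)   /* called once per vsync */
{
    write_bank = 1 - write_bank;
}

int main(void)
{
    uint32_t plane[PLANE_WORDS] = { 0xDEADBEEF };
    store_bitplane(0, plane);
    swap_banks();
    printf("first word of plane 0: 0x%08X\n", (unsigned)fetch_bitplane(0)[0]);
    return 0;
}
```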
  • the timing control module 724 manages the output by the controller 704 of data and command signals according to an output sequence.
  • the output sequence includes the order and timing with which sub-frame data sets are output to the array of light modulators 702 and the timing and character of illumination events.
  • the output sequence in some implementations, also includes global actuation events.
  • At least some of the parameters that define the output sequence are stored in volatile memory. This volatile memory is referred to as schedule table store 726 .
  • a table including the data stored in the schedule table store 726 is referred to herein as a “schedule table” or alternately as a “sequence table”. The data stored therein need not actually be stored in table format.
  • the data stored in the schedule table store 726 is easier for a human to understand if displayed in table format.
  • the actual data structure used to store output sequence data can be, for example, a series of bit strings. Each string of bits includes a series of coded words corresponding to timing values, memory addresses, and illumination data.
  • An illustrative data structure for storing output sequence parameters is described further in relation to FIG. 24 . Other data structures may be employed without departing from the scope of the invention.
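  • One plausible, purely illustrative packing of such an entry is sketched below in C; the field names, widths, and example values are assumptions for illustration and do not reproduce the data structure of FIG. 24 or of Table 1.

```c
#include <stdint.h>
#include <stdio.h>

/* One output-sequence entry: timing values, a frame-buffer location for the
   sub-frame data set, and illumination data.  Field widths are illustrative. */
typedef struct {
    uint32_t addressing_time;   /* clock cycles after vsync to begin loading   */
    uint32_t lamp_time;         /* clock cycles for the lamp on/off event      */
    uint16_t buffer_address;    /* memory location of the sub-frame data set   */
    uint8_t  lamp_mask;         /* which lamp(s) the entry refers to, bit0=red */
    uint8_t  lamp_intensity;    /* optional lamp intensity for the sub-frame   */
} schedule_entry_t;

/* A schedule table is then an ordered array of entries, ended by a sentinel. */
static const schedule_entry_t sequence[] = {
    {   100,  4100, 0x0000, 0x01, 255 },   /* hypothetical entry: bitplane at M0 */
    {  4200,  6200, 0x0400, 0x01, 255 },   /* hypothetical entry: bitplane at M1 */
    { UINT32_MAX, 0, 0, 0, 0 },            /* end-of-sequence marker             */
};

int main(void)
{
    for (int i = 0; sequence[i].addressing_time != UINT32_MAX; i++)
        printf("entry %d: load at %u, lamp event at %u\n",
               i, (unsigned)sequence[i].addressing_time,
               (unsigned)sequence[i].lamp_time);
    return 0;
}
```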
  • Some output sequence parameters may be stored as hardwired logic in the timing control module 724 .
  • the logic incorporated into the timing control module to wait until a particular event time can be expressed as a comparison between a running counter and a stored timing value; when the counter reaches the timing value, a trigger signal is sent.
  • the trigger signal may be sent to the memory control module 720 to initiate the loading of a bitplane into the modulators. Or, the trigger signal could be sent to lamp driver 706 to switch the lamp on or off.
  • the logic takes the form of logic circuitry built directly into the timing control module 724 .
  • the particular timing parameter 1324 is a scalar value contained within the command sequence.
  • the logic does not include a specific value for a number of clock pulses to wait, but refers instead to one of a series of timing values which are stored in the schedule table store 726 (a sketch of this wait-and-trigger logic appears below).
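  • The following C fragment is a minimal sketch of the kind of wait-and-trigger behavior described above, assuming a free-running clock-cycle counter compared against a timing value fetched from the schedule table store; the function names and values are illustrative, not the patent's logic.

```c
#include <stdint.h>
#include <stdio.h>

static uint32_t counter;                 /* clock cycles since vsync (simulated) */

static void send_trigger(const char *target)
{
    /* In hardware this would pulse the memory control module or a lamp driver. */
    printf("trigger %s at cycle %u\n", target, (unsigned)counter);
}

/* Wait until the counter reaches the timing value retrieved from the
   schedule table store, then fire the trigger signal. */
static void wait_and_trigger(uint32_t timing_value, const char *target)
{
    while (counter < timing_value)
        counter++;                       /* stands in for waiting on the clock */
    send_trigger(target);
}

int main(void)
{
    wait_and_trigger(100, "load bitplane R3");   /* e.g. addressing time AT0 */
    wait_and_trigger(4100, "red lamp on");       /* e.g. lamp time LT0       */
    return 0;
}
```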
  • the output sequence parameters stored in the schedule table store 726 vary in different embodiments of the invention.
  • the schedule table store 726 stores timing values associated with each sub-frame data set.
  • the schedule table store 726 may store timing values associated with the beginning of each addressing event in the output sequence, as well as timing values associated with lamp illumination and/or lamp extinguishing events.
  • the schedule table store 726 stores lamp intensity values instead of or in addition to timing values associated with addressing events.
  • the schedule table store 726 stores an identifier indicating where each sub-image data set is stored in the frame buffer 722 , and illumination data indicating the color or colors associated with each respective sub-image data set.
  • the nature of the timing values stored in the schedule table store 726 can vary depending on the specific implementation of the controller 704 .
  • the timing value, as stored in the schedule table store 726 in one implementation, is a number of clock cycles, which for example, have passed since the initiation of the display of an image frame, or since the last addressing or lamp event was triggered.
  • the timing value may be an actual time value, stored in microseconds or milliseconds.
  • Table 1 is an illustrative schedule table illustrating parameters suitable for storage in the schedule table store 726 for use by the timing control module 724 .
  • Additional illustrative schedule tables are described in further detail in relation to FIGS. 13 , 14 A-B, 15 - 17 and 19 .
  • the Table 1 schedule table includes two timing values for each sub-frame data set, an addressing time and a lamp illumination time.
  • the addressing times AT 0 -AT(n ⁇ 1) are associated with times at which the memory control module 720 outputs a respective sub-frame data set, in this case a bitplane, to the array of light modulators 702 .
  • the lamp illumination times LT 0 -LT(n ⁇ 1) are associated with times at which corresponding lamps are illuminated.
  • each time value in the schedule table may trigger more than one event.
  • lamp activity is synchronized with the actuation of the light modulators to avoid illuminating the light modulators while they are not in an addressed state.
  • the addressing times AT not only trigger addressing events, they also trigger lamp extinguishing events. Similarly, in other implementations, lamp extinguishing events also trigger addressing events.
  • the address data, labeled in the table as “memory location of sub-frame data set,” in the schedule table can be stored in a number of forms.
  • the address is the specific memory location in the frame buffer at which the corresponding bitplane begins, referenced by buffer, column, and row numbers.
  • the address stored in the schedule table store 726 is an identifier for use in conjunction with a look up table maintained by the memory control module 720 .
  • the identifier may have a simple 6-bit binary “xxxxxx” word structure where the first 2 bits identify the color associated with the bitplane, while the next 4 bits refer to the significance of the bitplane.
  • the actual memory location of the bitplane is then stored in a lookup table maintained by the memory control module 720 when the memory control module 720 stores the bitplane into the frame buffer.
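  • A minimal C sketch of such an identifier and lookup table follows; the particular bit packing (color in the two high bits, significance in the four low bits) and the address values are illustrative assumptions consistent with, but not dictated by, the description above.

```c
#include <stdint.h>
#include <stdio.h>

/* 6-bit identifier: 2 bits select the color, 4 bits the significance of the
   bitplane, roughly as described in the text. */
enum { COLOR_RED = 0, COLOR_GREEN = 1, COLOR_BLUE = 2 };

static uint8_t make_id(uint8_t color, uint8_t significance)
{
    return (uint8_t)(((color & 0x3) << 4) | (significance & 0xF));
}

/* Lookup table maintained by the memory control module: identifier ->
   frame-buffer address, filled in when each bitplane is stored. */
static uint16_t bitplane_address[64];

int main(void)
{
    uint8_t id = make_id(COLOR_RED, 3);          /* bitplane R3             */
    bitplane_address[id] = 0x0000;               /* e.g. memory location M0 */
    printf("R3 id=0x%02X stored at 0x%04X\n",
           (unsigned)id, (unsigned)bitplane_address[id]);
    return 0;
}
```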
  • the memory locations for bitplanes in the output sequence may be stored as hardwired logic within the timing control module 724 .
  • the timing control module 724 may retrieve schedule table entries using several different methods. In one implementation the order of entries in the schedule table is fixed; the timing control module 724 retrieves each entry in order until reaching a special entry that designates the end of the sequence. Alternatively, a sequence table entry may contain codes that direct the timing control module 724 to retrieve an entry which may be different from the next entry in the table. These additional fields may incorporate the ability to perform jumps, branches, and looping in analogy with the control features of a standard microprocessor instruction set. Such flow control modifications to the operation of the timing control module 724 allow a reduction in the size of the sequence table.
  • the direct-view display 700 also includes a programming link 730 .
  • the programming link 730 provides a means by which the schedule table store 726 may be modified by external circuits or computers. In other embodiments the programming link connects directly to a system processor within the same housing as the direct view display 700 .
  • the system processor may be programmed to alter the schedule table store in accordance with the type of image or data to be displayed by display 700 .
  • the external processor using the programming link 730 , can modify the parameters stored in the schedule table store 726 to alter the output sequence used by the controller 704 .
  • the programming link 730 can be used to change the timing parameters stored in the schedule table store 726 to accommodate different frame rates.
  • the timing parameters associated with each bitplane and the number of bitplanes displayed can be modified by the programming link 730 to adjust the number of colors or grayscale the display can provide.
  • Average brightness can be adjusted by changing lamp intensity values.
  • Color saturation can be modified by the programming link by altering the percentage of brightness formed using a white color field or by adjusting color mixing (described further in relation to FIG. 17).
  • the direct-view display includes a set of lamps 706 for illuminating the array of light modulators 702 .
  • the direct-view display 700 includes a red lamp, a green lamp, and a blue lamp.
  • the direct-view display 700 also includes a white lamp.
  • the direct-view display 700 includes multiple lamps for each color spaced along a side of the array of light modulators 702 .
  • a useful 4-color lamp combination with expanded color gamut is red, blue, true green (about 520 nm) plus parrot green (about 550 nm).
  • Another 5-color combination which expands the color gamut is red, green, blue, cyan, and yellow.
  • a 5-color analogue to the well known YIQ color space can be established with the lamps white, orange, blue, purple, and green.
  • a 5-color analog to the well known YUV color space can be established with the lamps white, blue, yellow, red, and cyan.
  • a useful 6-color space can be established with the lamp colors red, green, blue, cyan, magenta, and yellow.
  • a 6-color space can also be established with the colors white, cyan, magenta, yellow, orange, and green.
  • a large number of other 4-color and 5-color combinations can be derived from amongst the colors already listed above.
  • Further combinations of 6, 7, 8 or 9 lamps with different colors can be produced from the colors listed above. Additional colors may be employed using lamps with spectra which lie in between the colors listed above.
  • the direct-view display 700 also includes a number of sets of driver circuits 708 , 710 , 714 , and 716 controlled by, and in electrical communication with the various components of the controller 704 .
  • the direct-view display 700 includes a set of scan drivers 708 for write-enabling each of the rows of the array of light modulators in sequence.
  • the scan drivers 708 are controlled by, and in electrical communication with the timing control module 724 .
  • Data drivers 710 are in electrical communication with the memory control module 720.
  • the direct-view display 700 may include one driver circuit 710 for each column in the array of light modulators 702 , or it may have some smaller number of data drivers 710 , each responsible for loading data into multiple columns of the array of light modulators 702 .
  • the direct-view display 700 includes a series of common drivers 714 , including global actuation drivers, actuation voltage drivers, and, in some embodiments, additional common voltage drivers.
  • Common drivers 714 are in electrical communication with the timing control module 724 and with light modulators in multiple rows and multiple columns of the array of light modulators 702.
  • the lamps 706 are driven by lamp drivers 716 .
  • the lamps may be in electrical communication with the memory control module 720 and/or the timing control module 724.
  • the timing control module 724 controls the timing of the illumination of the lamps 706 .
  • Illumination intensity information may also be supplied by the timing control module 724 , or it may be supplied by the memory control module 720 .
  • Some electronic devices employing displays according to this invention employ variations on the design of controller 704.
  • the controller does not include an input processing module or a frame buffer. Instead the system processor attached to the electronic device provides a pre-formatted output sequence of bitplanes for display by the controller, drivers, and the array of MEMS light modulators.
  • the timing control module coordinates the output of bitplane data for the array of modulators and controls the illumination of lamps associated with each bitplane.
  • the timing control module may make reference to a schedule table store, within which are stored timing values for addressing and lamp events and/or lamp intensities associated with each of the bitplanes.
  • FIG. 8 is a flow chart of a method of displaying video 800 (the “display method 800 ”) suitable for use by a direct-view display such as the direct-view display 700 of FIG. 7 , according to an illustrative embodiment of the invention.
  • the display method 800 begins with the provision of an array of light modulators (step 801 ), such as the array of light modulators 702 .
  • the display method 800 proceeds with two interrelated processes, which operate in parallel.
  • the first process is referred to herein as an image processing process 802 of the display method 800 .
  • the second process is referred to as a display process 804 .
  • the image processing process 802 begins with the receipt of an image signal (step 806 ) by the video input 718 .
  • the image signal encodes one or more image frames for display on the direct-view display 700 .
  • the image signal is received as indicated in FIG. 6A . That is, data for each pixel is received sequentially, pixel-by-pixel, row-by-row.
  • the data for a given pixel includes one or more bits for each color component of the pixel.
  • upon receipt of data for an image frame (step 806), the controller 704 of the direct-view display 700 derives a plurality of sub-frame data sets for the image frame (step 808). Preferably, the image processing module 718 of the controller 704 derives a plurality of bitplanes based on the data in the image signal 717 as described above in relation to FIGS. 6A-6C.
  • the imaging process continues at step 810 , wherein the sub-frame data sets are stored in the memory.
  • the bitplanes are stored in the frame buffer 722, according to address information that allows them to be randomly accessed at later points in the process.
  • the display process 804 begins with the initiation of the display of an image frame (step 812 ), for example, in response to the detection of a vsync pulse in the input signal 717 . Then, the first sub-frame data set corresponding to the image frame is output by the memory control module 720 (step 814 ) to the array of light modulators 702 in an addressing event. The memory address of this first sub-frame data set is determined based on data in the schedule table store 726 . Preferably, the sub-frame data set is a bitplane. After the modulators addressed in the first sub-frame data set achieve the state indicated in the sub-frame data set, the lamp or lamps corresponding to the sub-frame data set loaded into the light modulators is illuminated (step 816 ).
  • the time at which the lamp is illuminated may be governed by a timing value stored in the schedule table store 726 associated with the sub-frame image.
  • the lamp remains illuminated until the next time the light modulators in the array of light modulators begin to change state, at which time the lamp is extinguished.
  • the extinguishing time may be determined based on a time value stored in the schedule table store 726 .
  • the extinguishing time may be before or after the next addressing event begins.
  • the controller 704 determines, based on the output sequence, whether the recently displayed sub-frame image is the last sub-frame image to be displayed for the image frame (decision block 818 ). If it is not the last sub-frame image, the next sub-frame data set is loaded into the array of light modulators 702 in another addressing event (step 814 ). If the recently displayed sub-frame image is the last sub-frame image of an image frame, the controller 704 awaits initiation of the display of a subsequent display initiation event (step 812 ).
  • FIG. 9 is a more detailed flow chart of an illustrative display process 900 suitable for use as part of the display method 800 for displaying images on the direct-view display 700 .
  • the sub-frame data sets employed by the direct-view display are bitplanes.
  • the display process 900 begins with the initiation of the display of an image frame (step 902 ).
  • the display of an image frame may be initiated (step 902 ) in response to the detection by the controller 704 of a vsync pulse in the image signal 717 .
  • a bitplane corresponding to the image frame is output by the controller 704 to the array of light modulators 702 (step 904).
  • Each row of the sub-frame data set is loaded sequentially.
  • the controller 704 waits a sufficient amount of time to ensure the light modulators in the respective row actuate before beginning to address the next row in the array of light modulators 702 . During this time, as states of the light modulators in the array of light modulators 702 are in flux, the lamps of the direct-view display 700 remain off.
  • this waiting time is stored in the schedule table store 726 . In other implementations, this waiting time is a fixed value hardwired into the timing control module 724 as a number of clock cycles following the beginning of an addressing event.
  • the controller then waits a time stored in the schedule table store 726 associated with the sub-frame image before extinguishing the lamp (step 908).
  • the controller 704 determines whether the most recently displayed sub-frame image is the last sub-frame image of the image frame being displayed. If the most recently displayed sub-frame image is the last sub-frame image for the image frame, the controller awaits the initiation of the display of a subsequent image frame (step 902 ). If it is not the last sub-frame image for the image frame, the controller 704 begins loading the next bitplane (step 904 ) into the array of light modulators 702 . This addressing event may be triggered directly by the extinguishing of the lamp at step 908 , or it may begin after a time associated with a timing value stored in the schedule table store 726 passes.
  • FIG. 10 is a timing diagram 1000 that corresponds to an implementation of the display process 900 that utilizes an output sequence having as parameters the values stored in the Table 1 schedule table.
  • the timing diagram 1000 corresponds to a coded-time division grayscale display process in which image frames are displayed by displaying four sub-frame images for each of three color components (red, green, and blue) of the image frame. Each sub-frame image displayed of a given color is displayed at the same intensity for half as long a time period as the prior sub-frame image, thereby implementing a binary weighting scheme for the sub-frame images.
  • the display of an image frame begins upon the detection of a vsync pulse.
  • the first sub-frame data set R 3 stored beginning at memory location M 0 , is loaded into the array of light modulators 702 in an addressing event that begins at time AT 0 .
  • the red lamp is then illuminated at time LT 0 .
  • LT 0 is selected such that it occurs after each of the rows in the array of light modulators 702 has been addressed, and the light modulators included therein have actuated.
  • the controller 704 of the direct-view display both extinguishes the red lamp and begins loading the subsequent bitplane, R 2 , into the array of light modulators 702 .
  • this bitplane is stored beginning at memory location M 1 . The process repeats until all bitplanes identified in the Table 1 schedule table have been displayed.
  • the controller 704 extinguishes the red lamp and begins loading the most significant green bitplane, G 3 , into the array of light modulators 702 .
  • the controller 704 turns on the green lamp until time AT 7, at which time it is extinguished again.
  • the time period between vsync pulses in the timing diagram is indicated by the symbol FT, indicating a frame time.
  • the addressing times AT 0 , AT 1 , etc. as well as the lamp times LT 0 , LT 1 , etc. are designed to accomplish 4 sub-frame images per color within a frame time FT of 16.6 milliseconds, i.e. according to a frame rate of 60 Hz.
  • the time values stored in schedule table store 726 can be altered to accomplish 4 sub-frame images per color within a frame time FT of 33.3 milliseconds, i.e. according to a frame rate of 30 Hz.
  • frame rates as low as 24 Hz may be employed or frame rates in excess of 100 Hz may be employed.
  • in timing diagram 1000, the controller outputs 4 sub-frame images to the array of light modulators 702 for each color to be displayed.
  • the illumination of each of the 4 sub-frame images is weighted according to the binary series 1, 2, 4, 8.
  • the display process in timing diagram 1000 therefore, displays a 4-digit binary word for gray scale in each color, that is, it is capable of displaying 16 distinct gray scale levels for each color, despite the loading of only 4 sub-images per color.
  • the implementation of timing diagram 1000 is capable of displaying more than 4000 distinct colors.
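  • The arithmetic behind these figures: with four binary-weighted sub-frame images per color, a pixel's per-color gray level is

$$g = 8b_3 + 4b_2 + 2b_1 + b_0, \qquad b_i \in \{0,1\},$$

where each $b_i$ indicates whether the pixel's shutter is open during the sub-frame of weight $2^i$; this gives $2^4 = 16$ levels per color and $16^3 = 4096$ color combinations, consistent with the "more than 4000 distinct colors" quoted above.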
  • the sub-frame images in the sequence of sub-frame images need not be weighted according to the binary series 1, 2, 4, 8, etc.
  • the use of base-3 weighting can be useful as a means of expressing sub-frame data sets derived from a ternary coding scheme.
  • Still other implementations employ a mixed coding scheme. For instance the sub-frame images associated with the least significant bits may be derived and illuminated according to a binary weighting scheme, while the sub-frame images associated with the most significant bits may be derived and illuminated with a more linear weighting scheme.
  • Such a mixed coding helps to reduce the large differences in illumination periods for the most significant bits and is helpful in reducing image artifacts such as dynamic false contouring.
  • FIG. 11 is a more detailed flow chart of an illustrative display process 1100 suitable for use as part of the display method 800 for displaying images on the direct-view display 700 .
  • the display process 1100 utilizes bitplanes for sub-frame data sets.
  • display process 1100 includes a global actuation functionality. In a display utilizing global actuation, pixels in multiple rows and multiple columns of the display are addressed before any of the actuators actuate. In the display process 1100 , all rows of the display are addressed prior to actuation.
  • While in display process 900 a controller must wait a certain amount of time after loading data into each row of light modulators to allow sufficient time for the light modulators to actuate, in display process 1100 the controller need only wait this "actuation time" once, after all rows have been addressed.
  • One control matrix capable of providing a global actuation functionality is described above in relation to FIG. 3D .
  • Display process 1100 begins with the initiation of the display of a new image frame (step 1102 ). Such an initiation may be triggered by the detection of a vsync voltage pulse in the image signal 717 . Then, at a time stored in the schedule table store 726 after the initiation of the display process for the image frame, the controller 704 begins loading the first bitplane into the light modulators of the array of light modulators 702 (step 1104 ).
  • any lamp currently illuminated is extinguished.
  • Step 1106 may occur at or before the loading of a particular bitplane (step 1104 ) is completed, depending on the significance of the bitplane. For example, in some embodiments, to maintain the binary weighting of bitplanes with respect to one another, some bitplanes may need to be illuminated for a time period that is less than the amount of time it takes to load the next bitplane into the array of light modulators 702 . Thus, a lamp illuminating such a bitplane is extinguished while the next bitplane is being loaded into the array of light modulators (step 1104 ). To ensure that lamps are extinguished at the appropriate time, in one embodiment, a timing value is stored in the schedule table store 726 to indicate the appropriate light extinguishing time.
  • the controller 704 When the controller 704 has completed loading a given bitplane into the array of light modulators 702 (step 1104 ) and extinguished any illuminated lamps (step 1106 ), the controller 704 issues a global actuation command (step 1108 ) to a global actuation driver, causing all of the light modulators in the array of light modulators 702 to actuate at substantially the same time.
  • Global actuation drivers represent a type of common driver 714 included as part of display 700 .
  • the global actuation drivers may connect to modulators in the array of light modulators, for instance, by means of global actuation interconnects such as interconnect 354 of control matrix 340 .
  • the global actuation step 1108 includes a series of steps or commands issued by the timing control module 724.
  • the global actuation step may involve a (first) charging of shutter mechanisms by means of a charging interconnect, followed by a (second) driving of a shutter common interconnect toward ground potential (at which point all commonly connected light modulators move into their closed state), followed, after a constant waiting period for shutter actuation, by a (third) grounding of the global actuation interconnect (at which point only selected shutters move into their designated open states).
  • Each of the charging interconnects, shutter common interconnects, and global actuation interconnects is connected to a separate driver circuit, responsive to trigger signals sent at the appropriate times according to timing values stored in the timing control module 724 .
  • the controller 704 After waiting the actuation time of the light modulators, the controller 704 issues an illumination command (step 1110 ) to the lamp drivers to turn on the lamp corresponding to the recently loaded bitplane.
  • the actuation time is the same for each bitplane loaded, and thus need not be stored in the schedule table store 726 . It can be permanently stored in the timing control module 724 in hardware, firmware, or software.
  • the controller 704 determines, based on the output sequence, whether the currently loaded bitplane is the last bitplane for the image frame to be displayed. If so, the controller 704 awaits initiation of the display of the next image frame (step 1102 ). Otherwise, the controller 704 begins loading the next bitplane into the array of light modulators 702 .
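  • Read as controller pseudo-firmware, the loop of display process 1100 looks roughly like the C sketch below. The driver calls are hypothetical stubs, the sequence is serialized for clarity (in practice the illumination of one bitplane overlaps the loading of the next), and the per-bitplane illumination times are assumed to come from the schedule table store.

```c
#include <stdio.h>

#define ROWS 4   /* tiny array for illustration */

/* Stub driver calls standing in for the scan, common, and lamp drivers. */
static void load_row(int plane, int row) { printf("load plane %d row %d\n", plane, row); }
static void lamp_off(void)               { printf("lamp off\n"); }
static void lamp_on(int plane)           { printf("lamp on for plane %d\n", plane); }
static void global_actuate(void)         { printf("global actuation\n"); }
static void wait_actuation_time(void)    { /* single fixed wait after actuation      */ }
static void wait_illumination(int plane) { /* weighted time from the schedule table  */ (void)plane; }

/* Display-process-1100 style loop: address every row first, extinguish any
   illuminated lamp, actuate all modulators at once, then illuminate for the
   weighted period of the just-loaded bitplane. */
static void show_frame(int num_planes)
{
    for (int plane = 0; plane < num_planes; plane++) {
        for (int row = 0; row < ROWS; row++)
            load_row(plane, row);      /* no per-row actuation wait is needed */
        lamp_off();
        global_actuate();
        wait_actuation_time();
        lamp_on(plane);
        wait_illumination(plane);
    }
    lamp_off();
}

int main(void) { show_frame(2); return 0; }
```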
  • FIG. 12 is a timing diagram 1200 that corresponds to an implementation of the display process 1100 that utilizes an output sequence having as parameters the values stored in the Table 1 schedule table. While the display processes corresponding to FIGS. 10 and 12 utilize similar stored parameters, their operation is quite different. Similar to the display process corresponding to timing diagram 1000 of FIG. 10 , the display process corresponding to timing diagram 1200 uses a coded-time division grayscale addressing process in which image frames are displayed by displaying four sub-frame images for each of three color components (red, green, and blue) of the image frame. Each sub-frame image displayed of a given color is displayed at the same intensity for half as long a time period as the prior sub-frame image, thereby implementing a binary weighting scheme for the sub-frame images.
  • the display process corresponding to timing diagram 1200 differs from the timing diagram 1000 in that it incorporates the global actuation functionality described in the display process 1100 .
  • As a result, the lamps in the display are illuminated for a significantly greater portion of the frame time.
  • the display can therefore either display brighter images, or it can operate its lamps at lower power levels while maintaining the same brightness level.
  • The lower illumination level operating mode, while providing equivalent image brightness, consumes less energy.
  • the display of an image frame in timing diagram 1200 begins upon the detection of a vsync pulse.
  • The bitplane R 3, stored beginning at memory location M 0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT 0.
  • After the controller 704 outputs the last row of data for a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command. After waiting the actuation time, the controller 704 causes the red lamp to be illuminated.
  • the controller 704 begins loading the subsequent bitplane R 2 , which, according to the schedule table, is stored beginning at memory location M 1 , into the array of light modulators 702 .
  • Lamp extinguishing event times LT 0 -LT 11 occur at times stored in the schedule table store 726 .
  • the times may be stored in terms of clock cycles following the detection of a vsync pulse, or they may be stored in terms of clock cycles following the beginning of the loading of the previous bitplane into the array of light modulators 702 .
  • the lamp extinguishing times are set in the schedule table to coincide with the completion of an addressing event corresponding to the subsequent sub-frame image.
  • LT 0 is set to occur at a time after AT 0 which coincides with the completion of the loading of bitplane R 2 .
  • LT 1 is set to occur at a time after AT 1 which coincides with the completion of the loading of bitplane R 1 .
  • Some bitplanes, such as R 0, G 0, and B 0, however, are intended to be illuminated for a period of time that is less than the amount of time it takes to load a bitplane into the array.
  • Thus, the corresponding lamp extinguishing times LT 3, LT 7, and LT 11 occur in the middle of subsequent addressing events.
  • the sequence of lamp illumination and data addressing can be reversed. For instance the addressing of bitplanes corresponding to the subsequent sub-frame image can follow immediately upon the completion of a global actuation event, while the illumination of a lamp can be delayed until a lamp illumination event at some point after the addressing has begun.
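  • The following sketch illustrates, under assumed example numbers for the addressing time, actuation wait, and most significant illumination period, how binary-weighted illumination windows relate to the addressing events, and why the lamp extinguishing time of the smallest bitplane falls inside the next addressing event (as with LT 3, LT 7, and LT 11 above). The names and values are hypothetical, not taken from any schedule table.

```python
# Sketch: how binary-weighted illumination periods interact with addressing
# events.  ADDRESS_TIME, ACTUATION_WAIT, and MSB_ILLUMINATION are assumed
# example values (in clock cycles), not numbers from any schedule table.
ADDRESS_TIME = 200      # time to load one bitplane into the array
ACTUATION_WAIT = 20     # time allowed for the shutters to settle
MSB_ILLUMINATION = 800  # illumination period of the most significant bitplane

def illumination_windows(n_bits=4):
    """Return (lamp_on, lamp_off) times for the bitplanes of one color."""
    windows, next_address_start, prev_off = [], 0, 0
    for k in range(n_bits):
        period = MSB_ILLUMINATION >> k          # binary weighting: halve per bit
        load_done = next_address_start + ADDRESS_TIME
        # Global actuation (and hence illumination) waits until this bitplane
        # is fully loaded and the previously lit lamp has been extinguished.
        on = max(load_done, prev_off) + ACTUATION_WAIT
        windows.append((on, on + period))
        prev_off = on + period
        next_address_start = on                 # loading the next bitplane starts right away
    return windows

for on, off in illumination_windows():
    print(on, off)  # the smallest bitplane's off time lands before the next load completes
```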
  • FIG. 13 is a timing diagram 1300 that corresponds to another implementation of the display process 1100 that utilizes a table similar to Table 2 as a schedule table.
  • the timing diagram 1300 corresponds to a coded-intensity grayscale addressing process similar to that described with respect to FIG. 5 in that each sub-frame image for a given color component (red, green, and blue) is illuminated for the same amount of time.
  • Instead, each sub-frame image of a particular color component is illuminated at half the intensity of the prior sub-frame image of the color component, thereby implementing a binary weighting scheme without varying lamp illumination times.
  • the display of an image frame in timing diagram 1300 begins upon the detection of a vsync pulse.
  • The bitplane R 3, stored beginning at memory location M 0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT 0.
  • After the controller 704 outputs the last row of data of a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command.
  • After waiting the actuation time, the controller causes the red lamp to be illuminated at a lamp intensity IL 0 stored in the Table 2 schedule table. Then, similar to the addressing process described above,
  • the controller 704 begins loading the subsequent bitplane R 2 , which, according to the schedule table, is stored beginning at memory location M 1 , into the array of light modulators 702 .
  • the sub-frame image corresponding to bitplane R 2 is illuminated at an intensity level IL 1 , as indicated in Table 2, which is equal to half of the intensity level IL 0 .
  • the intensity level IL 2 for bitplane R 1 is equal to half of the intensity level IL 1
  • The intensity level IL 3 for bitplane R 0 is equal to half of the intensity level IL 2.
  • the controller 704 may extinguish the illuminating lamp at the completion of an addressing event corresponding to the next sub-frame image. As such, no corresponding time value needs to be stored in the schedule table store 726 corresponding to lamp illumination times.
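  • A minimal sketch of the coded-intensity weighting of timing diagram 1300, assuming a normalized peak intensity rather than the values actually stored in Table 2:

```python
# Sketch of the coded-intensity weighting of timing diagram 1300: equal
# illumination periods, with the lamp intensity halved from one bitplane to
# the next.  IL0 here is a normalized assumption, not a value from Table 2.
IL0 = 1.0  # intensity for the most significant bitplane (normalized)

def intensity_levels(n_bits=4):
    """Return IL0, IL1, ... with each level half of the preceding one."""
    return [IL0 / (2 ** k) for k in range(n_bits)]

print(intensity_levels())  # [1.0, 0.5, 0.25, 0.125]
```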
  • FIG. 14A is a timing diagram 1400 that corresponds to another implementation of the display process 1100 that utilizes a table similar to Table 3 as a schedule table.
  • the timing diagram 1400 corresponds to a coded-time division grayscale addressing process in which image frames are displayed by displaying five sub-frame images for each of three color components (red, green, and blue) of the image frame.
  • the display process corresponding to timing diagram 1400 can display twice the number of gray scale levels at each color as the display process that corresponds to timing diagram 1200 .
  • Each sub-frame image displayed of a given color is displayed at the same intensity for half as long a time period as the prior sub-frame image, thereby implementing a binary pulse width weighting scheme for the sub-frame images.
  • The display of an image frame in timing diagram 1400 begins upon the detection of a vsync pulse.
  • The bitplane R 4, stored beginning at memory location M 0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT 0.
  • After the controller 704 outputs the last row of data of a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command. After waiting the actuation time, the controller causes the red lamp to be illuminated. Then, similar to the addressing process described above,
  • the controller 704 begins loading the subsequent bitplane R 3 , which is stored beginning at memory location M 1 , into the array of light modulators 702 .
  • Lamp extinguishing event times occur at times stored in the schedule table store 726 .
  • the times may be stored in terms of clock cycles following the detection of a vsync pulse, or they may be stored in terms of clock cycles following the beginning of the loading of the previous bitplane into the array of light modulators 702 .
  • the lamp extinguishing times are set in the schedule table to coincide with the completion of an addressing event corresponding to the subsequent sub-frame image. For example, LT 0 is set to occur at a time after AT 0 which coincides with the completion of the loading of bitplane R 3 .
  • LT 1 is set to occur at a time after AT 1 which coincides with the completion of the loading of bitplane R 2 .
  • Some bitplanes, such as R 1 and R 0, G 1 and G 0, and B 1 and B 0, are intended to be illuminated for a period of time that is less than the amount of time it takes to load a bitplane into the array.
  • Thus, their corresponding lamp extinguishing times occur in the middle of subsequent addressing events. Because the lamp extinguishing times depend on whether the corresponding illumination times are less than or greater than the time required for addressing, the corresponding schedule table includes lamp times, e.g., LT 0, LT 1, LT 2, etc.
  • FIG. 14B is a timing diagram 1450 that corresponds to another implementation of the display process 1100 that utilizes the parameters stored in Table 4 as a schedule table.
  • The timing diagram 1450 corresponds to a coded-time division and intensity grayscale addressing process similar to that of the timing diagram 1400, except that the weightings of the least significant sub-image and the second least significant sub-image are achieved by varying lamp intensity in addition to lamp illumination time.
  • sub-frame images corresponding to the least significant bitplane and the second least significant bitplane are illuminated for the same length of time as the sub-frame images corresponding to the third least significant bitplane, but at one quarter and one half the intensity, respectively.
  • all the bitplanes may be illuminated for a period of time equal to or longer than the time it takes to load a bitplane into the array of light modulators 702 . This eliminates the need for lamp extinguishing times to be stored in the schedule table store 726 .
  • the display of an image frame in timing diagram 1450 begins upon the detection of a vsync pulse.
  • The bitplane R 4, stored beginning at memory location M 0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT 0.
  • After the controller 704 outputs the last row of data of a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command.
  • After waiting the actuation time, the controller causes the red lamp to be illuminated at a lamp intensity IL 0 stored in the schedule table store 726. Then, similar to the addressing process described above,
  • the controller 704 begins loading the subsequent bitplane R 3 , which, according to the schedule table, is stored beginning at memory location M 1 , into the array of light modulators 702 .
  • The sub-frame image corresponding to bitplane R 3 is illuminated at an intensity level IL 1, as indicated in Table 4, which is equal to the intensity level IL 0.
  • the intensity level IL 2 for bitplane R 2 is equal to the intensity level IL 1 .
  • The intensity level IL 3 for bitplane R 1 is half that of the intensity level IL 2.
  • The intensity level IL 4 for bitplane R 0 is half that of the intensity level IL 3.
  • the controller 704 may extinguish the illuminating lamp at the completion of an addressing event corresponding to the next sub-frame image. As such, no corresponding time value needs to be stored in the schedule table store 726 to determine this time.
  • the timing diagram 1450 corresponds to a display process in which perceived brightness of sub-images of an output sequence are controlled in a hybrid fashion. For some sub-frame images in the output sequence, brightness is controlled by modifying the period of illumination of the sub-frame image. For other sub-frame images in the output sequence, brightness is controlled by modifying illumination intensity. It is useful in a direct view display to provide the capability for controlling both pulse widths and intensities independently.
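  • The hybrid weighting of timing diagram 1450 can be sketched as follows, with the illumination value (duration multiplied by intensity) of each bitplane preserving the binary significance weighting while the two least significant bitplanes reuse the duration of the third least significant bitplane at reduced intensity. The concrete numbers are illustrative assumptions.

```python
# Sketch of the hybrid weighting of timing diagram 1450: each bitplane's
# illumination value (duration x intensity) follows binary weighting, but the
# two least significant bitplanes reuse the duration of the third least
# significant bitplane at 1/2 and 1/4 intensity.  The numbers are assumptions.
FULL_INTENSITY = 1.0
MSB_DURATION = 16.0  # arbitrary time units for the most significant bitplane

def hybrid_schedule(n_bits=5):
    """Return (duration, intensity) per bitplane, most significant first."""
    plan = []
    for k in range(n_bits):
        weight = 2 ** (n_bits - 1 - k)                         # 16, 8, 4, 2, 1
        duration = MSB_DURATION * weight / (2 ** (n_bits - 1))
        intensity = FULL_INTENSITY
        if k >= n_bits - 2:                                    # two least significant bits
            duration = MSB_DURATION * 4 / (2 ** (n_bits - 1))  # same duration as bit 2
            intensity = FULL_INTENSITY * weight / 4            # 1/2 then 1/4 intensity
        plan.append((duration, intensity))
    return plan

for duration, intensity in hybrid_schedule():
    print(duration, intensity, duration * intensity)  # products keep the 16:8:4:2:1 ratio
```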
  • the lamp drivers 714 are responsive to variable intensity commands issued from the timing control module 724 as well as to timing or trigger signals from the timing control module 724 for the illumination and extinguishing of the lamps.
  • the schedule table store 726 stores parameters that describe the required intensity of lamps in addition to the timing values associated with their illumination.
  • An illumination value is defined as the product (or the integral) of an illumination period (or pulse width) with the intensity of that illumination.
  • Three such alternate pulse profiles for lamps appropriate to this invention are compared in FIG. 14C.
  • the time markers 1482 and 1484 determine time limits within which a lamp pulse must express its illumination value.
  • the time marker 1482 might represent the end of one global actuation cycle, wherein the modulator states are set for a bitplane previously loaded, while the time marker 1484 can represent the beginning of a subsequent global actuation cycle, for setting the modulator states appropriate to the subsequent bitplane.
  • the time interval between the markers 1482 and 1484 can be constrained by the time necessary to load data subsets, e.g. bitplanes, into the array of modulators.
  • The available time interval in these cases is substantially longer than the time required for illumination of the bitplane, assuming a simple scaling from the pulse widths assigned to bits of larger significance.
  • the lamp pulse 1486 is a pulse appropriate to the expression of a particular illumination value.
  • The width of the lamp pulse 1486 completely fills the time available between the markers 1482 and 1484.
  • the intensity or amplitude of lamp pulse 1486 is adjusted, however, to achieve a required illumination value.
  • An amplitude modulation scheme according to lamp pulse 1486 is useful, particularly in cases where lamp efficiencies are not linear and power efficiencies can be improved by reducing the peak intensities required of the lamps.
  • the lamp pulse 1488 is a pulse appropriate to the expression of the same illumination value as in lamp pulse 1486 .
  • the illumination value of pulse 1488 is expressed by means of pulse width modulation instead of by amplitude modulation. As shown in the timing diagram 1400 , for many bitplanes the appropriate pulse width will be less than the time available as determined by the addressing of the bitplanes.
  • the series of lamp pulses 1490 represent another method of expressing the same illumination value as in lamp pulse 1486 .
  • a series of pulses can express an illumination value through control of both the pulse width and the frequency of the pulses.
  • the illumination value can be considered as the product of the pulse amplitude, the available time period between markers 1482 and 1484 , and the pulse duty cycle.
  • the lamp driver circuitry can be programmed to produce any of the above alternate lamp pulses 1486 , 1488 , or 1490 .
  • the lamp driver circuitry can be programmed to accept a coded word for lamp intensity from the timing control module 724 and build a sequence of pulses appropriate to intensity. The intensity can be varied as a function of either pulse amplitude or pulse duty cycle.
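  • The three pulse profiles of FIG. 14C can be sketched as follows, each expressing the same illumination value within the window between the markers 1482 and 1484. The window length, target value, and peak intensity are assumed example numbers.

```python
# Sketch of the three lamp pulse profiles of FIG. 14C, each expressing the same
# illumination value inside the window between markers 1482 and 1484.  The
# window length, target value, and peak intensity are assumed example numbers.
WINDOW = 10.0        # time available between global actuation events
TARGET_VALUE = 4.0   # required illumination value (intensity x time)

def amplitude_modulated():
    """Pulse 1486: fill the whole window and scale the intensity down."""
    return {"width": WINDOW, "intensity": TARGET_VALUE / WINDOW}

def pulse_width_modulated(peak=1.0):
    """Pulse 1488: keep full intensity and shorten the pulse width."""
    return {"width": TARGET_VALUE / peak, "intensity": peak}

def pulse_train(peak=1.0):
    """Pulses 1490: keep full intensity and fill the window at a reduced duty cycle."""
    return {"window": WINDOW, "intensity": peak, "duty_cycle": TARGET_VALUE / (peak * WINDOW)}

print(amplitude_modulated())
print(pulse_width_modulated())
print(pulse_train())
```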
  • FIG. 15 is a timing diagram 1500 that corresponds to another implementation of the display process 1100 that utilizes a schedule table similar to Table 5.
  • the timing diagram 1500 corresponds to a coded-time division grayscale addressing process similar to that described with respect to FIG. 12 , except that restrictions have been placed on illumination periods for the most significant bits and rules have been established for the ordering of the bitplanes in the display sequence.
  • the sequencing rules illustrated for timing diagram 1500 are established to help reduce two visual artifacts which detract from image quality in field sequential displays, i.e. color breakup and flicker.
  • Color breakup is reduced by increasing the frequency of color changes, that is by alternating between sub-images of different colors at a frequency preferably in excess of 180 Hz.
  • Flicker is reduced in its simplest manifestation by ensuring that frame rates are substantially greater than 30 Hz, that is, by ensuring that bitplanes of similar significance which appear in subsequent image frames are separated by time periods of less than 25 milliseconds.
  • Sequencing rules associated with color breakup and flicker can be implemented by the technique of bit splitting.
  • In the display sequence of timing diagram 1500, the most significant bits, e.g. R 3, G 3, and B 3, are split; that is, each is loaded into the array and illuminated twice within a given image frame.
  • The red bitplane R 3, for instance, is first loaded to the modulation array at time event AT 0 and is then loaded for the second time at the time event AT 9.
  • the illumination period associated with the most significant bitplane R 3 loaded at time event AT 9 , is equal to the illumination period associated with bitplane R 2 , which is loaded at the time event AT 12 . Because the most significant bitplane R 3 appears twice within the image frame, however, the illumination value associated with the information contained within bitplane R 3 is still twice that allotted to the next most significant bitplane R 2 .
  • the timing diagram 1500 displays sub-frame images corresponding to a given color interspersed among sub-frame images corresponding to other colors. For example, to display an image according to the timing diagram 1500 , a display first loads and displays the first occurrence of the most significant bitplane for red, R 3 , followed immediately by the most significant green bitplane, G 3 , followed immediately by the most significant blue bitplane B 3 . Since the most significant bitplanes have been split, these color changes occur fairly rapidly, with the longest time periods between color changes about equal to the illumination time of the next most significant bitplane, R 2 .
  • the time periods between illumination of sub-frame images of different colors are preferably held to less than 4 milliseconds, more preferably less than 2.8 milliseconds.
  • the smaller bitplanes, R 1 and R 0 , G 1 and G 0 , and B 1 and B 0 can still be grouped together, since the total of their illumination times is still less than 4 milliseconds.
  • Although bitplane B 3 is the third of the bitplanes to be output by the controller (at addressing event AT 2), the appearance of the blue bitplane B 3 does not imply the end of all possible appearances of red bitplanes within the frame time. Indeed, the bitplane R 1 for the color red immediately follows B 3 in the sequence of timing diagram 1500. It is preferable to alternate between bitplanes of different color with the highest frequency possible within an image frame.
  • time periods K and L represent the separation in time between events in which the most significant bitplane in red, i.e. the most significant bitplane R 3 is output to the display. Similar time periods K and L exist between successive occurrences of the other most significant bitplanes G 3 and B 3 .
  • the time period K represents the maximum time between output of most significant bitplanes within a given image frame.
  • the time period L represents the maximum time between output of most significant bitplanes in two consecutive image frames.
  • In timing diagram 1500, the sum of K+L is equal to the frame time, and for this embodiment, the frame time may be as long as 33 milliseconds (corresponding to a 30 Hz frame rate). Flicker may still be reduced in displays where bit-splitting is employed if both time intervals K and L are held to less than 25 milliseconds, preferably less than 17 milliseconds.
  • Flicker may arise from a variety of factors wherein characteristics of a display are repeated at frequencies as low as 30 Hz.
  • The lesser significance bitplanes R 1 and R 0 are illuminated only once per frame, and the frame rate may be as low as 30 Hz. Therefore, images associated with these lesser bitplanes may contribute to the perception of flicker.
  • the bank-wise addressing method described with respect to FIG. 19 will provide another mechanism by which even lesser bitplanes can be repeated at frequencies substantially greater than the frame rate.
  • Flicker may also be generated by the characteristic of bitplane jitter. Jitter appears when the spacing between similar bitplanes is not equal in the sequence of displayed bitplanes. Flicker would ensue, for instance, if the time periods K and L between MSB red bitplanes were not equal. Flicker can be reduced by ensuring that time periods K and L are equal to within 10%.
  • the length of time between a first time the bitplane corresponding to the most significant sub-frame image of a color component of the image frame is output and a second time the bitplane corresponding to the most significant sub-frame image of the color component is output is within 10% of the length of time between the second time the bitplane corresponding to the most significant sub-frame image of the color component is output and a subsequent time at which a sub-frame image corresponding to the most significant sub-frame image of the color component is output.
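  • The sequencing constraints discussed above can be checked programmatically, as in the following sketch. The helper name and the candidate sequence are hypothetical; only the numeric limits (4 millisecond color changes, 25 or 17 millisecond repeat spacing, 10% jitter) come from the text.

```python
# Sketch: checking a candidate output sequence against the color breakup,
# flicker, and jitter limits quoted above.  The function and its inputs are
# hypothetical; only the numeric limits come from the text.
def check_sequence(color_change_gaps_ms, msb_gaps_ms):
    """Return warnings for color breakup, flicker, and bitplane jitter."""
    warnings = []
    if max(color_change_gaps_ms) > 4.0:
        warnings.append("color breakup risk: more than 4 ms between color changes")
    if max(msb_gaps_ms) > 25.0:
        warnings.append("flicker risk: MSB repeats separated by more than 25 ms")
    k, l = msb_gaps_ms[0], msb_gaps_ms[1]
    if abs(k - l) > 0.1 * max(k, l):
        warnings.append("jitter: time periods K and L differ by more than 10%")
    return warnings

# K = spacing of the split MSB within a frame, L = its spacing across frames.
print(check_sequence(color_change_gaps_ms=[2.8, 3.1], msb_gaps_ms=[16.0, 17.0]))
```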
  • FIG. 16 is a timing diagram 1600 that corresponds to another implementation of the display process 1100 that utilizes the parameters listed in Table 6.
  • the timing diagram 1600 corresponds to a coded-time division grayscale addressing process in which image frames are displayed by displaying four sub-frame images for each color component of the image frame. Each sub-frame image displayed of a given color is displayed at the same intensity for half as long a time period as the prior sub-frame image, thereby implementing a binary weighting scheme for the sub-frame images.
  • the timing diagram 1600 is similar to the timing diagram 1200 of FIG. 12 , but has sub-frame images corresponding to the color white, in addition to the colors red, green and blue, that are illuminated using a white lamp.
  • a white lamp allows the display to display brighter images or operate its lamps at lower power levels while maintaining the same brightness level. As brightness and power consumption are not linearly related, the lower illumination level operating mode, while providing equivalent image brightness, consumes less energy. In addition, white lamps are often more efficient, i.e. they consume less power than lamps of other colors to achieve the same brightness.
  • the display of an image frame in timing diagram 1600 begins upon the detection of a vsync pulse.
  • The bitplane R 3, stored beginning at memory location M 0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT 0.
  • After the controller 704 outputs the last row of data of a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command. After waiting the actuation time, the controller causes the red lamp to be illuminated, and the remaining red bitplanes are addressed and illuminated in a manner similar to the addressing process described above.
  • the controller 704 begins loading the first of the green bitplanes, G 3 , which, according to the schedule table, is stored beginning at memory location M 4 .
  • the controller 704 begins loading the first of the blue bitplanes, B 3 , which, according to the schedule table, is stored beginning at memory location M 8 .
  • the controller 704 begins loading the first of the white bitplanes, W 3 , which, according to the schedule table, is stored beginning at memory location M 12 .
  • the controller causes the white lamp to be illuminated for the first time.
  • the controller 704 extinguishes the lamp illuminating a sub-frame image upon completion of an addressing event corresponding to the subsequent sub-frame image.
  • LT 0 is set to occur at a time after AT 0 which coincides with the completion of the loading of bitplane R 2 .
  • LT 1 is set to occur at a time after AT 1 which coincides with the completion of the loading of bitplane R 1 .
  • the time period between vsync pulses in the timing diagram is indicated by the symbol FT, indicating a frame time.
  • the addressing times AT 0 , AT 1 , etc. as well as the lamp times LT 0 , LT 1 , etc. are designed to accomplish 4 sub-frame images for each of the 4 colors within a frame time FT of 16.6 milliseconds, i.e. according to a frame rate of 60 Hz.
  • the time values stored in schedule table store 726 can be altered to accomplish 4 sub-frame images per color within a frame time FT of 33.3 milliseconds, i.e. according to a frame rate of 30 Hz.
  • frame rates as low as 24 Hz may be employed or frame rates in excess of 100 Hz may be employed.
  • the use of white lamps can improve the efficiency of the display.
  • the use of four distinct colors in the sub-frame images requires changes to the data processing in the input processing module 718 . Instead of deriving bitplanes for each of 3 different colors, a display process according to timing diagram 1600 requires bitplanes to be stored corresponding to each of 4 different colors.
  • the input processing module 718 may therefore convert the incoming pixel data, encoded for colors in a 3-color space, into color coordinates appropriate to a 4-color space before converting the data structure into bitplanes.
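  • One common way to perform such a conversion, shown below for illustration only, is to move the gray component shared by the red, green, and blue channels onto the white lamp; the disclosure itself does not specify a particular conversion rule.

```python
# Illustration only: one common RGB-to-RGBW conversion, in which the gray
# component shared by the three channels is moved onto the white lamp.  The
# disclosure does not prescribe this particular rule.
def rgb_to_rgbw(r, g, b):
    """Return (R', G', B', W) with the common gray level carried by white."""
    w = min(r, g, b)
    return r - w, g - w, b - w, w

print(rgb_to_rgbw(200, 180, 120))  # -> (80, 60, 0, 120)
```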
  • a useful 4-color lamp combination with expanded color gamut is red, blue, true green (about 520 nm) plus parrot green (about 550 nm).
  • Another 5-color combination which expands the color gamut is red, green, blue, cyan, and yellow.
  • a 5-color analogue to the well known YIQ color space can be established with the lamps white, orange, blue, purple, and green.
  • a 5-color analog to the well known YUV color space can be established with the lamps white, blue, yellow, red, and cyan.
  • a useful 6-color space can be established with the lamp colors red, green, blue, cyan, magenta, and yellow.
  • a 6-color space can also be established with the colors white, cyan, magenta, yellow, orange, and green.
  • a large number of other 4-color and 5-color combinations can be derived from amongst the colors already listed above.
  • Further combinations of 6, 7, 8 or 9 lamps with different colors can be produced from the colors listed above. Additional colors may be employed using lamps with spectra which lie in between the colors listed above.
  • FIG. 17 is a timing diagram 1700 that corresponds to another implementation of the display process 1100 that utilizes the parameters listed in the schedule table of Table 7.
  • The timing diagram 1700 corresponds to a hybrid coded-time division and intensity grayscale display process in which lamps of different colors may be illuminated simultaneously. Though each sub-frame image is illuminated by lamps of all colors, sub-frame images for a specific color are illuminated predominantly by the lamp of that color. For example, during illumination periods for red sub-frame images, the red lamp is illuminated at a higher intensity than the green lamp and the blue lamp. As brightness and power consumption are not linearly related, using multiple lamps, each at a lower illumination level operating mode, may require less power than achieving that same brightness using one lamp at a higher illumination level.
  • each sub-frame image is displayed at the same intensity for half as long a time period as the prior sub-frame image, except for the sub-frame images corresponding to the least significant bitplanes which are instead each illuminated for the same length of time as the prior sub-frame image, but at half the intensity.
  • the sub-frame images corresponding to the least significant bitplanes are illuminated for a period of time equal to or longer than that required to load a bitplane into the array.
  • the display of an image frame in timing diagram 1700 begins upon the detection of a vsync pulse.
  • The bitplane R 3, stored beginning at memory location M 0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT 0.
  • After the controller 704 outputs the last row of data of a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command.
  • the controller causes the red, green and blue lamps to be illuminated at the intensity levels indicated by the Table 7 schedule, namely RI 0 , GI 0 and BI 0 , respectively.
  • the controller 704 begins loading the subsequent bitplane R 2 , which, according to the schedule table, is stored beginning at memory location M 1 , into the array of light modulators 702 .
  • The sub-frame image corresponding to bitplane R 2, and later the one corresponding to bitplane R 1, are each illuminated at the same set of intensity levels as for bitplane R 3, as indicated by the Table 7 schedule.
  • the sub-frame image corresponding to the least significant bitplane R 0 stored beginning at memory location M 3 , is illuminated at half the intensity level for each lamp. That is, intensity levels RI 3 , GI 3 and BI 3 are equal to half that of intensity levels RI 0 , GI 0 and BI 0 , respectively.
  • the process continues starting at time AT 4 , at which time bitplanes in which the green intensity predominates are displayed.
  • the controller 704 begins loading bitplanes in which the blue intensity dominates.
  • the controller 704 extinguishes the lamp illuminating a sub-frame image upon completion of an addressing event corresponding to the subsequent sub-frame image.
  • LT 0 is set to occur at a time after AT 0 which coincides with the completion of the loading of bitplane R 2 .
  • LT 1 is set to occur at a time after AT 1 which coincides with the completion of the loading of bitplane R 1 .
  • The mixing of color lamps within sub-frame images in timing diagram 1700 can lead to improvements in power efficiency in the display. Color mixing can be particularly useful when images do not include highly saturated colors.
  • FIG. 18 is a more detailed flow chart of an illustrative display process 1800 suitable for use as part of the display method 800 for displaying images on the direct-view display 700 .
  • the display process 1800 utilizes bitplanes for sub-frame data sets.
  • Display process 1800 also includes a global actuation functionality similar to that used in display process 1100 .
  • Display process 1800 adds a bankwise addressing functionality as a tool for improving the illumination efficiency in the display.
  • Timing diagram 1400 illustrates a 5-bit sequence per color with illumination values assigned to the bitplanes according to a binary significance sequence 16:8:4:2:1.
  • the illumination periods associated with the bitplanes R 1 and R 0 are considerably shorter than the time required for loading data sets into the array appropriate to the next bitplane.
  • Bankwise addressing is a functionality by which duty cycles for lamps can be increased by reducing the times required for addressing. This is accomplished by dividing the display into multiple independently actuatable banks of rows such that only a portion of the display needs to be addressed and actuated at any one time. Shorter addressing cycles increase the efficiency of the display for those bitplanes that require only the shortest of illumination times.
  • bank-wise addressing involves segregating the rows of the display into two segments.
  • In one implementation, the rows in the top half of the display are controlled separately from rows in the bottom half of the display.
  • In another implementation, the display is segregated on an every-other row basis, such that even-numbered rows belong to one bank or segment and the odd-numbered rows belong to the other bank.
  • Separate bitplanes are stored for each segment at distinct addresses in the buffer memory 722.
  • the input processing module 718 is programmed to not only derive bitplane information from the incoming video stream, but also to identify, and in some cases store, portions of bitplanes separately according to their assignment to different banks.
  • bitplanes are labeled by color, bank, and significance value.
  • For example, bitplane RE 3, in a five bit per color component gray scale process, refers to the second most significant red bitplane for the even numbered rows of the display apparatus.
  • Bitplane BO 0 corresponds to the least significant blue bitplane for the odd numbered rows.
  • independent global actuation voltage drivers and independent global actuation interconnects are provided for each bank. For instance the odd-numbered rows are connected to one set of global actuation drivers and global actuation interconnects, while the even numbered rows are connected to an independent set of global actuation drivers and interconnects.
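  • A minimal sketch of the even/odd bank segregation described above, with bitplanes modeled as lists of row words and the labels following the RE/RO naming convention; the helper name is hypothetical.

```python
# Sketch of even/odd bank segregation.  Bitplanes are modeled as lists of row
# words; the RE/RO labels follow the naming convention used above.  The helper
# name is hypothetical.
def split_into_banks(bitplane_rows):
    """Return the even-row and odd-row halves of one bitplane."""
    return {
        "even": bitplane_rows[0::2],  # rows 0, 2, 4, ... (e.g., RE 3 for red bit 3)
        "odd": bitplane_rows[1::2],   # rows 1, 3, 5, ... (e.g., RO 3 for red bit 3)
    }

rows = [0b1010, 0b0110, 0b1111, 0b0001]  # a toy 4-row bitplane
banks = split_into_banks(rows)
print(banks["even"], banks["odd"])
```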
  • Display process 1800 begins with the initiation of the display of a new image frame (step 1802 ). Such an initiation may be triggered by the detection of a vsync voltage pulse in the image signal 717 . Then, at a time identified in the schedule table store 726 after the initiation of the display process for the image frame, the controller 704 begins loading the first bitplane into the light modulators of the array of light modulators 702 (step 1804 ). In contrast to step 1104 of FIG. 11 , at step 1804 , bitplanes for either one or both of the banks of the display are loaded into the corresponding rows of the array of light modulators 702 .
  • the timing control module 724 analyzes its output sequence to see how many banks need to be addressed in a given addressing event and then addresses each bank needed to be addressed in sequence.
  • In some implementations, for one bank, bitplanes are loaded into corresponding light modulator rows in order of increasing significance, while for the other bank, bitplanes are loaded into the corresponding light modulator rows in order of decreasing significance.
  • Next, any lamp currently illuminated is extinguished (step 1806).
  • Step 1806 may occur at or before the loading of a particular bitplane (step 1804 ) is completed, depending on the significance of the bitplane. For example, in some embodiments, to maintain the binary weighting of bitplanes with respect to one another, some bitplanes may need to be illuminated for a time period that is less than the amount of time it takes to load the next bitplane into the array of light modulators 702 . Thus, a lamp illuminating such a bitplane is extinguished while the next bitplane is being loaded into the array of light modulators (step 1804 ). To ensure that lamps are extinguished at the appropriate time, a timing value is stored in the schedule table to indicate the appropriate light extinguishing time.
  • When the controller 704 has completed loading either or both of the bitplane data into either or both of the banks in the array of light modulators 702 (step 1804) and when the controller has extinguished any illuminated lamps (step 1806), the controller 704 issues a global actuation command (step 1808) to either or both of the global actuation drivers, depending on where it is in its output sequence, thereby causing either only one of the banks of addressable modulators or both banks in the array of light modulators 702 to actuate at substantially the same time.
  • the timing of the global actuation is determined by logic in the timing control module based on whether the schedule indicates that one or both of the banks requires addressing.
  • If the schedule table store 726 indicates that only one bank requires addressing, the timing control module 724 waits a first amount of time before causing the controller 704 to issue the global actuation command. If the schedule table store 726 indicates both banks require addressing, the timing control module 724 waits about twice that amount of time before triggering global actuation. As only two possible time values are needed for timing global actuation (i.e., a single bank time or a dual bank time), these values can be stored permanently in the timing control module 724 in hardware, firmware, or software.
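  • This actuation timing rule can be sketched as follows; SINGLE_BANK_LOAD is an assumed constant representing the time to address one bank.

```python
# Sketch of the actuation timing rule of display process 1800.  SINGLE_BANK_LOAD
# is an assumed constant for the time needed to address one bank of rows.
SINGLE_BANK_LOAD = 100  # clock cycles

def actuation_delay(banks_to_address):
    """Only two delays are ever needed, so neither is kept in the schedule table."""
    if banks_to_address == 1:
        return SINGLE_BANK_LOAD
    return 2 * SINGLE_BANK_LOAD  # both banks: wait roughly twice as long

print(actuation_delay(1), actuation_delay(2))
```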
  • After waiting the actuation time of the light modulators, the controller 704 issues an illumination command (step 1810) to the lamp drivers to turn on the lamp corresponding to the recently loaded bitplane.
  • the actuation time is measured from the time a global actuation command is issued (step 1808 ), and thus is the same for each bitplane loaded. Therefore, it need not be stored in a schedule table. It can be permanently stored in the timing control module 724 in hardware, firmware, or software.
  • the controller 704 determines, based on the schedule table store 726 , whether the currently loaded bitplane is the last bitplane for the image frame to be displayed. If so, the controller 704 awaits initiation of the display of a subsequent image frame (step 1802 ). Otherwise, at the time of the next addressing event listed in the schedule table store 726 , the controller 704 begins loading the corresponding bitplane or bitplanes into the array of light modulators 702 (step 1804 ).
  • FIG. 19 is a timing diagram 1900 that corresponds to an implementation of the display process 1800 through utilization of the parameters listed in the schedule table of Table 8.
  • the timing diagram 1900 corresponds to a coded-time division grayscale display process in which image frames are displayed by displaying 5 sub-frame images for each of three color components (red, green, and blue) of the image frame. Each sub-frame image displayed of a given color is displayed at the same intensity for half as long a time period as the prior sub-frame image, thereby implementing a binary weighting scheme for the sub-frame images.
  • the timing diagram 1900 incorporates the global actuation functionality described in the display process 1100 and the bankwise addressing functionality described in the display process 1800 .
  • the display can therefore either display brighter images, or it can operate its lamps at lower power levels while maintaining the same brightness level.
  • the lower illumination level operating mode while providing equivalent image brightness, consumes less energy.
  • the display of an image frame in timing diagram 1900 begins upon the detection of a vsync pulse.
  • The bitplane RO 4, stored beginning at memory location MO 0, is loaded into only the odd rows of the array of light modulators 702 in an addressing event that begins at time AT 0.
  • the bitplane RE 1 is loaded into only the even rows of the array of light modulators, using data stored in the location ME 0 .
  • the controller 704 outputs the last of the even rows of data of a bitplane to the array of light modulators 702 .
  • the controller 704 outputs a global actuation command to both of the independently addressable global actuation drivers connected to the banks of even and odd rows.
  • After waiting the actuation time following the issuance of the global actuation command, the controller 704 causes the red lamp to be illuminated.
  • Because the actuation time is a constant for all sub-frame images and is based on the issuance of the global actuation command, no corresponding time value needs to be stored in the schedule table store 726 to determine this time.
  • the controller 704 begins loading the subsequent bitplane RE 0 , stored beginning at memory location ME 1 , into the even rows of the array of light modulators 702 .
  • In this addressing event, the timing control module 724 skips any process related to loading of the data into the odd rows. This may be accomplished by storage of a coded parameter in the schedule table store 726 associated with the timing value AT 1, for instance, the numeral zero. In this fashion, the amount of time to complete the addressing event initiated at time AT 1 is only one-half of the time required for addressing both banks of rows at time AT 0. Note that the least significant red bitplane for the odd rows is not loaded into the array of light modulators 702 until much later, at time AT 5.
  • Lamp extinguishing event times LT 0 through LT n-1 occur at times stored in the schedule table store 726.
  • the times may be stored in terms of clock cycles following the detection of a vsync pulse, or they may be stored in terms of clock cycles following the beginning of the loading of the previous bitplane into the array of light modulators 702 .
  • the lamp extinguishing times are set in the schedule table to coincide with the completion of a corresponding addressing event. For example, LT 0 is set to occur at a time after AT 0 which coincides with the completion of the loading of the even-numbered rows.
  • LT 1 is set to occur at a time after AT 1 which coincides with the completion of the loading of bitplane RE 0 into the even-numbered rows.
  • LT 3 is set to occur at a time after AT 4 , which coincides with the completion of the loading of bitplane RO 1 into the odd-numbered rows.
  • bank-wise addressing by timing diagram 1900 provides for only two independently addressable and actuatable banks.
  • arrays of MEMS modulators and their drive circuits can be interconnected so as to provide 3, 4, 5, 6, 7, 8 or more independently addressable banks
  • a display with 6 independently addressable banks would require only 1 ⁇ 6 the time for addressing the rows within one bank, in comparison to time needed for addressing of the whole display.
  • With 6 banks, 6 different bitplanes attributed to the same color of lamp can be interleaved and illuminated simultaneously.
  • the rows associated with each bank may be assigned to every 6 th row of the display.
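  • Assigning rows to banks on an every-nth-row basis amounts to a simple modulo rule, sketched here for six banks; the function name is illustrative.

```python
# Sketch: assigning rows to n independently addressable banks on an
# every-nth-row basis, shown here for six banks.
def bank_of_row(row_index, n_banks=6):
    """Row r belongs to bank r mod n, so each bank holds every nth row."""
    return row_index % n_banks

print([bank_of_row(r) for r in range(12)])  # 0..5, repeating
```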
  • the bank-wise addressing scheme provides additional opportunities for reducing flicker in a MEMS-based field sequential display.
  • The red bitplane R 1 for the even rows, introduced at addressing event AT 0, is displayed within the same grouping of red sub-images as the red bitplane R 1 for the odd rows, introduced at timing event AT 4.
  • Each of these bitplanes is displayed only once per frame. If the frame rate in timing diagram 1900 were as low as 30 Hz, then the display of these lesser bitplanes would be separated by substantially more than 25 milliseconds between frames, contributing to the perception of flicker. However, this situation can be improved if the bitplanes in timing diagram 1900 are further re-arranged such that the display of R 1 bitplanes in adjacent frames is never separated by more than 25 milliseconds, preferably less than 17 milliseconds.
  • The display of the most significant bitplane in red, i.e. the most significant bitplane R 4, can be split, for instance at some point between the addressing events AT 3 and AT 4.
  • the two groupings of red sub-images can then be re-arranged amongst similar sub-groupings in the green and blue sub-images.
  • the red, green, and blue sub-groupings can be interspersed, as in the timing diagram 1500 .
  • The result is that the display of, e.g., the R 1, G 1, and B 1 sub-frame data sets can be arranged to appear at roughly equal time intervals, both within and between successive image frames.
  • Even with such a re-arrangement, the R 1 bitplane for the even rows would still appear only once per image frame.
  • Flicker can be reduced, however, if the display of the R 1 bitplane alternates between odd and even rows, and if the time separation between display of the odd or even portions of the bitplane is never more than 25 milliseconds, preferably less than 17 milliseconds.
  • FIG. 20 is a block diagram of a controller 2000 for use in a direct-view display, according to an illustrative embodiment of the invention.
  • the controller 2000 can replace the controller 704 of the direct-view MEMS display 700 of FIG. 7 .
  • the controller 2000 receives an image signal 2017 from an external source and outputs both data and control signals for controlling light modulators and lamps of the display into which it is incorporated
  • The controller 2000 includes an input processing module 2018, a memory control module 2020, a frame buffer 2022, a timing control module 2024, and four unique schedule table stores 2026, 2027, 2028, and 2029.
  • Instead of a programming link which allows alteration of the parameters in a schedule table store, the controller 2000 provides a switch control module 2040 which determines which of the 4 schedule table stores will be active at any given time.
  • the components 2018 - 2040 may be provided as distinct chips or circuits which are connected together by means of circuit boards and/or cables. In other implementations several of these components can be designed together into a single semiconductor chip such that their boundaries are nearly indistinguishable except by function.
  • the input processing module 2018 receives the image signal 2017 and processes the data encoded therein, similar to input processing module 718 , into a format suitable for displaying via the array of light modulators.
  • the input processing module 2018 takes the data encoding each image frame and converts it into a series of sub-frame data sets. While in various embodiments, the input processing module 2018 may convert the image signal 2017 into non-coded sub-frame data sets, ternary coded sub-frame data sets, or other form of coded sub-frame data set, preferably, the input processing module 2018 converts the image signal 2017 into bitplanes, as described above in relation to FIGS. 6A-6C .
  • the input processing module 2018 outputs the sub-frame data sets to the memory control module 2020 .
  • the memory control module 2020 then stores the sub-frame data sets in the frame buffer 2022 .
  • the frame buffer is preferably a random access memory, although other types of serial memory can be used without departing from the scope of the invention.
  • the memory control module 2020 in one implementation stores the sub-frame data set in a predetermined memory location based on the color and significance in a coding scheme of the sub-frame data set. In other implementations, the memory control module 2020 stores the sub-frame data set in a dynamically determined memory location and stores that location in a lookup table for later identification.
  • the frame buffer 2022 is configured for the storage of bitplanes.
  • the memory control module 2020 is also responsible for, upon instruction from the timing control module 2024 , retrieving sub-image data sets from the frame buffer 2022 and outputting them to the data drivers.
  • the data drivers load the data output by the memory control module 2020 into the light modulators of the array of light modulators.
  • the memory control module 2020 outputs the data in the sub-image data sets one row at a time.
  • The frame buffer 2022 includes two buffers, whose roles alternate. While the memory control module 2020 stores newly generated bitplanes corresponding to a new image frame in one buffer, it extracts bitplanes corresponding to the previously received image frame from the other buffer for output to the array of light modulators. Both buffer memories can reside within the same circuit, distinguished only by address.
  • The order in which the sub-image data sets are output, referred to as the “sub-frame data set output sequence,” and the time at which the memory control module 2020 begins outputting each sub-image data set is controlled, at least in part, by data stored in one of the alternate schedule table stores 2026, 2027, 2028, and 2029.
  • Each of the schedule table stores 2026 - 2029 stores at least one timing value associated with each sub-frame data set, an identifier indicating where the sub-image data set is stored in the frame buffer 2022 , and illumination data indicating the color or colors associated with the sub-image data set.
  • the schedule table stores 2026 - 2029 also store intensity values indicating the intensity with which the corresponding lamp or lamps should be illuminated for a particular sub-frame data set.
  • the timing values stored in the schedule table stores 2026 - 2029 determine when to begin addressing the array of light modulators with the sub-frame data set.
  • the timing value is used to determine when a lamp or lamps associated with the sub-frame data set should be illuminated and/or extinguished.
  • the timing value is a number of clock cycles, which for example, have passed since the initiation of the display of an image frame, or since the last addressing or lamp event was triggered.
  • the timing value may be an actual time value, stored in microseconds or milliseconds.
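  • One entry of a schedule table store such as 2026-2029 can be sketched as a small record, shown below with hypothetical field names; the fields themselves (timing value, frame buffer address, lamp selection, and an optional intensity) are those enumerated above.

```python
# Sketch of a single entry in a schedule table store such as 2026-2029.  Field
# names are hypothetical; the fields themselves are those enumerated above.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ScheduleTableEntry:
    timing_value: int                  # clock cycles since vsync or the prior event
    buffer_address: int                # where the sub-frame data set begins in the frame buffer
    lamps: Tuple[str, ...]             # color or colors illuminating this sub-frame image
    intensity: Optional[float] = None  # present only for intensity-coded schedules

entry = ScheduleTableEntry(timing_value=2400, buffer_address=0x0400,
                           lamps=("red",), intensity=0.5)
print(entry)
```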
  • the distinct timing values stored in the various schedule table stores 2026 - 2029 provide a choice between distinct imaging algorithms, for instance between display modes which differ in the properties of frame rate, lamp brightness, achievable grayscale precision, or in the saturation of displayed colors.
  • the storage of multiple schedule tables therefore, provides for flexibility in the method of displaying images, a flexibility which is especially advantageous when it provides a method for saving power for use in portable electronics.
  • The controller 2000 includes 4 unique schedule tables stored in memory. In other implementations, the number of distinct schedules that are stored may be 2, 3, or any other number. For instance, it may be advantageous to store schedule parameters for as many as 100 unique schedule table stores.
  • the multiple schedule tables stored in controller 2000 allow for the exploitation of trade-offs between image quality and power consumption. For some images, which do not require a display of deeper, saturated colors, it is possible to rely on white lamps or mixed colors to provide brightness, especially as these color schemes can be more power efficient. Similarly, not all images or applications require the display of 16 million colors. A palette of 250,000 colors may be sufficient (6 bits per color) for some images or applications. For other images or applications, a color range limited to only 4,000 colors (4 bits per color) or 500 colors (3 bits per color) may be sufficient. It is advantageous to include electronics in a direct view MEMS display controller so as to provide display flexibility to take advantage of power saving opportunities.
  • Variations among these display processes are made possible by the timing and bitplane parameters which are stored in the schedule table stores 2026-2029. Together with the sequencing commands stored within the timing control module 2024, these parameters allow the controller 2000 to output variations on lamp intensities, frame rates, different palettes of colors (based on the mixing of lamp colors within a subfield), or different grey scale bit depths (based on the number of bitplanes employed to display an image frame).
  • each schedule table corresponds to a different display process.
  • schedule table 2026 corresponds to a display process capable of generating approximately 16 million colors (8 bits per color) with high color saturation.
  • Schedule table 2027 corresponds to a display process appropriate only for black and white (e.g. text) images with a frame rate, or refresh rate, that is very low, e.g. less than 20 frames per second.
  • Schedule table 2028 corresponds to a display process suited for outdoor viewing of color or video images where brightness is at a premium but where battery power must nevertheless be conserved.
  • Schedule table 2029 corresponds to a display process providing a restricted choice of colors (e.g. 4,000) which would provide an easy to read and low-power display appropriate for most icon or text-type information with the exception of video.
  • the display process represented by schedule table 2026 requires the most power, whereas the display process represented by schedule table 2027 requires the least.
  • the display processes corresponding to schedule tables 2028 and 2029 require power usage somewhere in between that required by the other display processes.
  • the timing control module 2024 derives its display process parameters or constants from only one of the four possible sequence tables.
  • A switch control module 2040 governs which of the sequence tables is referenced by the timing control module 2024.
  • This switch control module 2040 could be a user controlled switch, or it could be responsive to commands from an external processor, contained either within the same housing as the MEMS display device or external to it (referred to as an “external module”).
  • the external module for instance, can decide whether the information to be displayed is text or video, or whether the information displayed should be colored or strictly black and white.
  • the switch commands can originate from the input processing module 2018 . Whether in response to an instruction from the user or an external module, the switch control module 2040 selects a schedule table store that corresponds to the desired display process or display parameters.
  • FIG. 21 is a flow chart of a process of displaying images 2100 (the “display process 2100”) suitable for use by a direct-view display incorporating the controller 2000 of FIG. 20, according to an illustrative embodiment of the invention.
  • The display process 2100 begins with the selection of an appropriate schedule table for use in displaying an image frame (step 2102). For example, a selection is made between schedule table stores 2026-2029. This selection can be made by the input processing module 2018, a module in another part of the device in which the direct-view MEMS display is incorporated, or it can be made directly by the user of the device.
  • If the selection amongst schedule tables is made by the input processing module or an external module, it can be made in response to the type of image to be displayed (for instance, video or still images require finer levels of gray scale contrast versus an image which needs only a limited number of contrast levels, such as a text image).
  • Another factor that might influence the selection of an imaging mode or schedule table, whether selected directly by a user or automatically by the external module, might be the ambient lighting environment of the device. For example, one might prefer one brightness for the display when viewed indoors or in an office environment versus outdoors where the display must compete in an environment of bright sunlight. Brighter displays are more likely to be viewable in an ambient of direct sunlight, but brighter displays consume greater amounts of power.
  • The external module, when selecting schedule tables on the basis of ambient light, can make that decision in response to signals it receives through an incorporated photodetector.
  • the selection step 2102 can be accomplished by means of a mechanical relay, which changes the reference within the timing control module 2024 to only one of the four schedule table stores 2026 - 2029 .
  • the selection step 2102 can be accomplished by the receipt of an address code which indicates the location of one of the schedule table stores 2026 - 2029 .
  • the timing control module 2024 then utilizes the selection address, as received through the switch control module 2040 , to indicate the correct memory source for its schedule parameters.
  • the timing control module 2024 can make reference to a schedule table stored in memory by means of a multiplexer circuit, similar to a memory control circuit.
  • the process 2100 then continues with the receipt of the data for an image frame.
  • the data is received by the input processing module 2018 by means of the input line 2017 at step 2104 .
  • the input processing module then derives a plurality of sub-frame data sets, for instance bitplanes, and stores them in the frame buffer 2022 (step 2106 ).
  • the timing control module 2024 proceeds to display each of the sub-frame data sets, at step 2108 , in their proper order and according to timing values stored in the selected schedule table.
  • the process 2100 then continues iteratively with receipt of subsequent frames of image data.
  • the sequence of receiving image data at step 2104 through the display of the sub-frame data sets at step 2108 can be repeated many times, where each image frame to be displayed is governed by the same selected schedule table. This process can continue until the selection of a new schedule table is made at a later time, e.g. by repeating the step 2102 .
  • the input processing module 2018 may select a schedule table for each image frame received, or it may periodically examine the incoming image data to determine if a change in schedule table is appropriate.
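  • The overall flow of display process 2100 can be sketched as follows; the controller object, its methods, and the selection callback are illustrative stand-ins for the modules of FIG. 20.

```python
# Sketch of display process 2100.  The controller object, its methods, and the
# `select` callback are illustrative stand-ins for the modules of FIG. 20.
def run_display(controller, schedule_tables, select, n_frames=3):
    """Display n_frames image frames, re-checking the schedule selection each frame."""
    active = schedule_tables[select()]                   # step 2102: choose a schedule table
    for _ in range(n_frames):
        frame = controller.receive_frame()               # step 2104: receive image data
        bitplanes = controller.derive_bitplanes(frame)   # step 2106: derive sub-frame data sets
        controller.store(bitplanes)                      #            and store them
        controller.display(bitplanes, active)            # step 2108: display per the schedule
        active = schedule_tables[select()]               # e.g., user choice or ambient light sensor
```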
  • FIG. 22 is a block diagram of a controller 2200 , suitable for inclusion in a MEMS direct-view display, according to an illustrative embodiment of the invention.
  • the controller 2200 may replace the controller 704 of the MEMS direct-view display 700 .
  • the controller 2200 receives an image signal 2217 from an external source and outputs both data and control signals for controlling the drivers, light modulators, and lamps of the display in which the controller is included.
  • The controller 2200 includes an input processing module 2218, a memory control module 2220, a frame buffer 2222, a timing control module 2224, and a schedule table store 2226.
  • In addition, the controller 2200 includes a sequence parameter calculation module 2228.
  • the sequence parameter calculation module receives monitoring data from the input processing module 2218 and outputs changes to the sequencing parameters stored within the schedule table store 2226 , and in some implementations, changes to the bitplanes stored for a given image frame.
  • the components 2218 , 2220 , 2222 , 2224 , 2226 , and 2228 may be provided as distinct chips or circuits which are connected together by means of circuit boards and/or cables. In other implementations, several of these components can be designed together into a single semiconductor chip such that their boundaries are nearly indistinguishable except by function.
  • the input processing module 2218 receives the image signal 2217 and processes the data encoded therein into a format suitable for displaying via the array of light modulators.
  • the input processing module 2218 takes the data encoding each image frame and converts it into a series of sub-frame data sets.
  • a sub-frame data set includes information about the desired states of modulators in multiple rows and multiple columns of the array of light modulators.
  • the number and content of the sub-frame data sets used to display an image frame depend on the grayscale technique employed by the controller 2200. For example, the number and content of the sub-frame data sets needed to form an image frame using a coded time-division gray scale technique differ from those used to display an image frame using a non-coded time-division gray scale technique.
  • the input processing module 2218 may convert the image signal 2217 into non-coded sub-frame data sets, ternary coded sub-frame data sets, or another form of coded sub-frame data set. Preferably, the input processing module 2218 converts the image signal 2217 into bitplanes, as described above in relation to FIGS. 6A-6C.
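As one concrete illustration of this conversion, the sketch below derives binary-coded bitplanes from per-pixel, per-color intensity values. It is a minimal, assumption-laden example: the frame format, bit depth, and function names are illustrative only and do not come from the patent.

```python
# Derive binary-coded sub-frame data sets (bitplanes) from an image frame,
# assuming each pixel carries an 8-bit value per color component.

def derive_bitplanes(frame, bits=8):
    """frame: dict color -> 2D list of intensities (0..2**bits - 1).
    Returns dict (color, bit) -> 2D list of 0/1 modulator states."""
    bitplanes = {}
    for color, rows in frame.items():
        for b in range(bits):
            bitplanes[(color, b)] = [[(value >> b) & 1 for value in row]
                                     for row in rows]
    return bitplanes

# Example: a 2x2 frame with red values only.
frame = {"red": [[200, 15], [0, 255]]}
planes = derive_bitplanes(frame)
print(planes[("red", 7)])   # most significant bitplane -> [[1, 0], [0, 1]]
print(planes[("red", 0)])   # least significant bitplane -> [[0, 1], [0, 1]]
```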
  • the input processing module 2218 outputs the sub-frame data sets to the memory control module 2220 .
  • the memory control module 2220 then stores the sub-frame data sets in the frame buffer 2222 .
  • the memory control module 2220 in one implementation stores the sub-frame data set in a predetermined memory location based on the color and significance in a coding scheme of the sub-frame data set. In other implementations, the memory control module 2220 stores the sub-frame data set in a dynamically determined memory location and stores that location in a lookup table for later identification.
  • the frame buffer 2222 is configured for the storage of bitplanes.
  • the memory control module 2220 is also responsible for, upon instruction from the timing control module 2224 , retrieving bitplanes from the frame buffer 2222 and outputting them to the data drivers 2208 .
  • the data drivers 2208 load the data output by the memory control module 2220 into the light modulators of the array of light modulators.
  • the memory control module 2220 outputs the data in the sub-image data sets one row at a time.
  • the frame buffer 2222 includes two buffers, whose roles alternate. While the memory control module 2220 stores newly generated bitplanes corresponding to a new image frame in one buffer, it extracts bitplanes corresponding to the previously received image frame from the other buffer for output to the array of light modulators. Both buffer memories can reside within the same circuit, distinguished only by address.
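A minimal sketch of such an alternating ("ping-pong") frame buffer follows. The class and method names are illustrative assumptions; in the controller the two buffers would simply be two address ranges within the same memory.

```python
# Ping-pong frame buffer: new bitplanes are written into one half while the
# previously stored frame is read out for display from the other half.

class DoubleFrameBuffer:
    def __init__(self):
        self.buffers = [{}, {}]   # two buffer regions, e.g. distinguished by address
        self.write_index = 0      # buffer currently receiving new bitplanes

    def store(self, bitplanes):
        """Store the bitplanes of the newly received frame."""
        self.buffers[self.write_index] = dict(bitplanes)

    def read_display_buffer(self):
        """Return the bitplanes of the previously received frame."""
        return self.buffers[1 - self.write_index]

    def swap(self):
        """Swap roles once the new frame is complete and the old one displayed."""
        self.write_index = 1 - self.write_index

fb = DoubleFrameBuffer()
fb.store({"R6": "...frame 1 data..."})
fb.swap()
fb.store({"R6": "...frame 2 data..."})     # frame 2 being written...
print(fb.read_display_buffer())            # ...while frame 1 is read for display
```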
  • the order in which the sub-image data sets are output, referred to as the “sub-frame data set output sequence,” and the time at which the memory control module 2220 begins outputting each sub-image data set is controlled, at least in part, by data stored in the schedule table store 2226 .
  • the schedule table store 2226 stores at least one timing value associated with each sub-frame data set, an identifier indicating where the sub-image data set is stored in the frame buffer 2222 , and illumination data indicating the color or colors associated with the sub-image data set.
  • the schedule table store 2226 also stores intensity values indicating the intensity with which the corresponding lamp or lamps should be illuminated for a particular sub-frame data set.
  • the timing values stored in the schedule table store 2226 determine when to begin addressing the array of light modulators with each sub-frame data set.
  • the timing value is used to determine when a lamp or lamps associated with the sub-frame data set should be illuminated and/or extinguished.
  • the timing value is, for example, a number of clock cycles that have passed since the initiation of the display of an image frame, or since the last addressing or lamp event was triggered.
  • the timing value may be an actual time value, stored in microseconds or milliseconds.
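Collecting the fields listed above, a single schedule table entry might be modeled as follows. The field names and the choice of microseconds for the timing value are assumptions for illustration; as noted, the timing value could equally be a count of clock cycles.

```python
# One entry of a hypothetical schedule table: when to address the array, where
# the sub-frame data set lives in the frame buffer, and how to drive the lamps.

from dataclasses import dataclass
from typing import Tuple

@dataclass
class ScheduleEntry:
    time_us: int              # when to begin addressing the array with this bitplane
    buffer_address: int       # where the sub-frame data set is stored in the frame buffer
    colors: Tuple[str, ...]   # lamp or lamps associated with this sub-frame data set
    intensity: float          # relative lamp intensity, 0.0 - 1.0

# A fragment of a hypothetical table: red MSB first, then a lesser red bitplane.
schedule_table = [
    ScheduleEntry(time_us=0,    buffer_address=0x0000, colors=("red",), intensity=1.0),
    ScheduleEntry(time_us=4000, buffer_address=0x2000, colors=("red",), intensity=1.0),
]
print(schedule_table[0])
```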
  • Controller 2200 includes a re-configurable schedule table store 2226 .
  • the schedule table store 2226 provides a flexible or programmable component to the controller.
  • a programming link, such as the interface 730, allowed the schedule table store 726 within controller 704 to be altered or reprogrammed according to different lamp intensities, frame rates, color schemes, or gray scale bit depths. Similar alterations to the display process are possible for the schedule table store 2226 within controller 2200, except that these variations now occur automatically in response to the requirements of individual image frames, based on characteristics of those image frames detected by the input processing module 2218.
  • Controller 2200 is configured to sense the display requirements for an image frame based on the data within the image frame and to adapt the display algorithm by means of changes to the schedule table store 2226 .
  • Display method 2300 is suitable for use by a MEMS direct-view display incorporating the controller 2200 of FIG. 22 , according to an illustrative embodiment of the invention.
  • the display method 2300 begins with the receipt of the data for an image frame at step 2302 .
  • the data is received by the input processing module 2218 by means of the input line 2217 .
  • the input processing module 2218 derives a plurality of sub-frame data sets, for instance bitplanes, from the data and stores the bitplanes in the frame buffer 2222 .
  • the input processing module monitors and analyzes the content of the incoming image to look for characteristics which might affect the display of that image. For instance, at step 2304 the input processing module might make note of the pixel or pixels with the most saturated colors in the image frame, i.e., pixels that call for significant brightness from one color without being balanced, diluted, or desaturated by illumination from the other color lamps in the same image frame. In another example of input data monitoring, the input processing module 2218 might make note of the pixel or pixels with the brightest values required of each of the lamps, regardless of color saturation.
  • At step 2308, the sequence parameter calculation module 2228 assesses the data collected at step 2304 and identifies changes to the display process that can be implemented by adjusting values in the sequence table 2226 . The changes to the sequence table 2226 are then effected at step 2310 by re-writing certain of the parameters stored within table 2226 .
  • At step 2312, the method 2300 proceeds to the display of sub-images according to the ordering parameters and timing values that have been re-programmed within the schedule table 2226 .
  • the method 2300 then continues iteratively with receipt of subsequent frames of image data.
  • the processes of receiving (step 2302 ) and displaying image data (step 2312 ) may run in parallel, with one image being displayed from the data of one buffer memory according to the re-programmed schedule table at the same time that new sub-frame data sets are being analyzed and stored into a parallel buffer memory.
  • the sequence of receiving image data at step 2302 through the display of the sub-frame data sets at step 2312 can be repeated indefinitely, where each image frame to be displayed is governed by a schedule table which is re-programmed in response to the incoming data.
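The overall structure of method 2300 can be sketched as a short loop: each frame is analyzed, the schedule table is re-programmed, and the frame is then displayed from the updated table. The helper behaviors below are trivial placeholders (the real analyses are the saturation, content-type, and peak-brightness checks discussed in the following paragraphs), and all names are illustrative assumptions.

```python
# Skeleton of the adaptive loop of method 2300. In a real controller the
# display of one frame and the analysis of the next would run in parallel
# out of the two halves of the frame buffer.

def method_2300(frames, schedule_table, analyze, rewrite, display):
    for frame in frames:                    # step 2302: receive frame data
        findings = analyze(frame)           # step 2304: monitor image content
        rewrite(schedule_table, findings)   # steps 2308-2310: update table parameters
        display(frame, schedule_table)      # step 2312: show sub-images per updated table

# Trivial placeholder behaviors so the sketch runs end to end.
frames = [{"red": [[100, 200]]}]
table = {"intensity": 1.0}
method_2300(
    frames, table,
    analyze=lambda f: {"peak": max(max(row) for row in f["red"]) / 255.0},
    rewrite=lambda t, m: t.update(intensity=m["peak"]),
    display=lambda f, t: print(f"display frame at lamp intensity {t['intensity']:.2f}"),
)
```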
  • the data monitoring at step 2304 detects the pixels in each frame with the most saturated colors. If it is determined that the most saturated color required for a frame is only 82% of the saturation available from the colored lamps, then it is possible to remix the colors that are provided to the bitplanes so that power can be saved while still providing the 82% saturation level required by the image. By adding, for instance, subordinate red, green, or blue colors to the primary color in each frame, power can be saved in the display.
  • the sequence parameter calculation module 2228 would receive a signal from the input processing module 2218 indicating the degree of color mixing which is allowed. Before the frame is displayed, the sequence parameter calculation module re-writes the intensity parameters in the sequence table 2226 which determine color mixing at each bitplane, so that colors are correspondingly desaturated and power is saved.
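One possible, simplified reading of this desaturation step is sketched below: the frame's maximum per-pixel saturation is measured, and each color field in the schedule table is then given a subordinate share of the other lamps. The saturation metric and the mixing rule are assumptions for illustration, not the patent's exact method; the power saving would come from correspondingly reducing the primary lamp drive.

```python
# Measure the most saturated pixel, then rewrite per-field lamp mixing weights.

def max_saturation(frame_rgb):
    """Largest per-pixel saturation, with saturation = (max - min) / max."""
    s_max = 0.0
    for r, g, b in frame_rgb:
        hi, lo = max(r, g, b), min(r, g, b)
        if hi > 0:
            s_max = max(s_max, (hi - lo) / hi)
    return s_max

def remix_intensities(schedule_table, s_max):
    """Give each color field a subordinate share of the other lamps.
    The less saturated the frame, the more the lamps can be mixed
    (and the more the primary lamp drive can then be reduced)."""
    subordinate = 1.0 - s_max          # e.g. 0.18 when s_max is 82%
    for entry in schedule_table:
        primary = entry["primary_color"]
        entry["lamp_mix"] = {c: (1.0 if c == primary else subordinate)
                             for c in ("red", "green", "blue")}

frame = [(230, 120, 105), (40, 42, 38)]      # maximum saturation here is about 0.54
table = [{"primary_color": "red"}, {"primary_color": "green"}]
remix_intensities(table, max_saturation(frame))
print(table[0]["lamp_mix"])
```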
  • a process is provided within the sequence parameter calculation module 2228 which determines whether the image consists solely of text, or text plus symbols, as opposed to video or a photographic image.
  • the sequence parameter calculation module 2228 then re-writes the parameters in the sequence table accordingly.
  • Text images, especially black and white text images, do not need to be refreshed as often as video images and typically require only a limited number of different colors or gray shades.
  • the sequence parameter calculator 2228 can therefore adjust both the frame rate and the number of sub-images to be displayed for each image frame. Text images require fewer sub-images in the display process than photographic images.
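A minimal sketch of this content-type adaptation follows. The text detector (counting distinct gray levels) and the specific frame rates and bit depths chosen are illustrative assumptions only.

```python
# Choose a lower refresh rate and fewer bitplanes for text-like frames.

def looks_like_text(frame_gray, max_levels=4):
    """Treat a frame with very few distinct gray levels as text/symbols."""
    return len({value for row in frame_gray for value in row}) <= max_levels

def choose_sequence_parameters(frame_gray):
    if looks_like_text(frame_gray):
        return {"frame_rate_hz": 15, "bits_per_color": 2}   # fewer sub-images, slower refresh
    return {"frame_rate_hz": 60, "bits_per_color": 8}

text_frame = [[0, 255, 255, 0], [255, 0, 0, 255]]
print(choose_sequence_parameters(text_frame))   # -> {'frame_rate_hz': 15, 'bits_per_color': 2}
```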
  • the monitoring function at step 2304 analyzes or searches for the maximum intensity attributed to each color in each pixel. If an image is to be displayed that requires no more than 65% of the brightness from any of the lamps for any of the pixels, then in some cases it is possible to display that image correctly by reducing the average intensity of the lamps accordingly.
  • the lamp intensity values within the schedule table store 2226 can be reduced by a set of commands within the sequence parameter calculation module 2228 .
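The peak-brightness adaptation can be sketched as follows. The rescaling of pixel code values to compensate for the dimmer lamps is an assumption added for illustration; the threshold and names are likewise hypothetical.

```python
# If no pixel needs more than a fraction of any lamp's brightness, reduce the
# lamp intensity values in the schedule table and stretch the pixel values.

def peak_fraction(frame_rgb, full_scale=255):
    """Largest requested fraction of full brightness over all pixels and colors."""
    return max(max(pixel) for pixel in frame_rgb) / full_scale

def scale_lamp_intensities(schedule_table, frame_rgb):
    scale = peak_fraction(frame_rgb)             # e.g. ~0.65
    for entry in schedule_table:
        entry["intensity"] = scale               # dimmer lamps for this frame
    # Pixel code values are stretched so the displayed image appearance is preserved.
    return [tuple(round(v / scale) for v in px) for px in frame_rgb]

table = [{"color": "red", "intensity": 1.0}]
frame = [(166, 100, 40), (120, 90, 30)]          # peak is 166/255, roughly 0.65
rescaled = scale_lamp_intensities(table, frame)
print(table[0]["intensity"], rescaled[0])        # ~0.65 (255, 154, 61)
```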
  • Appendix 1 presents a timing sequence 3000, expressed by means of schedule table 9, which illustrates an embodiment of the timing sequences of this invention.
  • the timing sequence 3000 of Appendix 1 is appropriate to the display of image information at a 30 Hz frame rate (e.g. 33.3 milliseconds between vsync pulses); it includes the display of 7 bits for each of the colors red, green, and blue.
  • the timing sequence 3000 is constrained by the following parameters related to setting of modulator states in the array:
  • the schedule table for timing sequence 3000 includes the following information, required by the timing control module 724 for display of the sub-images
  • the timing sequence 3000 includes the following features, as described previously. Similar to timing sequence 1200 , the display that employs timing sequence 3000 includes the capability of global actuation. The display that employs timing sequence 3000 includes two independent global actuation circuits, one for each of the odd and even banks, respectively. The timing sequence 3000 includes a scheme for control of the lamps, similar to that described in timing sequence 1450 , in which both pulse periods and pulse intensities are used to express illumination value. The timing sequence 3000 is capable of mixing colors, as in timing sequence 1700 , although in this embodiment only one lamp is illuminated at a time.
  • the timing sequence 3000 includes bank-wise addressing.
  • the lesser bitplanes, e.g., R0, R1, R2, and R3, are always displayed successively within a given bank, e.g., the odd rows, and this sequence of lesser bitplanes is illuminated at the same time that the most significant bit (e.g., R6) is illuminated in the other bank (e.g., in the even rows).
  • the timing sequence 3000 splits the most significant bits (e.g., R6, G6, and B6) into four separate but equally timed sub-images.
  • the timing sequence alternates colors frequently, with a maximum period between colors of 1.38 milliseconds.
  • the time between the expression of most significant bits is not always equal between successive pairs of most significant bits, but in no case is that period between most significant bits greater than 4.16 milliseconds.
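The two spacing properties just stated can be expressed as a simple check over a list of scheduled events, as sketched below. The event format and the example event times are illustrative assumptions, not the schedule table of Appendix 1.

```python
# Verify that consecutive illumination events are closely spaced and that the
# most significant bit of each color repeats within the stated bound.

def check_sequence(events, max_color_gap_ms=1.38, max_msb_gap_ms=4.16):
    """events: list of (time_ms, color, bit) tuples, sorted by time."""
    ok = True
    for (t0, _, _), (t1, _, _) in zip(events, events[1:]):
        if t1 - t0 > max_color_gap_ms:
            ok = False
    msb_times = {}
    for t, color, bit in events:
        if bit == 6:                                   # most significant bit
            if color in msb_times and t - msb_times[color] > max_msb_gap_ms:
                ok = False
            msb_times[color] = t
    return ok

events = [(0.0, "red", 6), (1.2, "green", 6), (2.4, "blue", 6),
          (3.6, "red", 6), (4.8, "green", 6), (6.0, "blue", 6)]
print(check_sequence(events))   # True for this illustrative spacing
```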

Abstract

A direct-view display includes an array of MEMS light modulators and a control matrix formed on a transparent substrate, where each light modulator can be driven into at least two states, and a controller for controlling the states of each light modulator in the array. The control matrix transmits data and actuation voltages to the array. The controller includes an input, a processor, a memory, and an output. The input receives image data encoding an image frame for display. The processor derives a plurality of sub-frame data sets from the image data, where each sub-frame data set indicates desired states of light modulators in multiple rows and multiple columns of the array. The memory stores the plurality of sub-frame data sets. The output outputs the plurality of sub-frame data sets according to an output sequence to drive light modulators into the states indicated in the sub-frame data sets.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 11/643,042, filed Dec. 19, 2006, which claims priority to and the benefit of U.S. Patent Application No. 60/751,909, filed Dec. 19, 2005, now abandoned, and U.S. Patent Application No. 60/776,367, filed Feb. 24, 2006, now abandoned, and is also a continuation-in-part of U.S. patent application Ser. No. 11/361,294, filed Feb. 23, 2006, now abandoned, and which claims priority to and the benefit of U.S. Patent Application No. 60/676,053, filed Apr. 29, 2005, now abandoned, and U.S. Patent Application No. 60/655,827, filed Feb. 23, 2005, now abandoned. The disclosures of all of the foregoing are incorporated herein by reference in their entirety.
  • FIELD OF THE INVENTION
  • In general, the invention relates to the field of imaging displays; in particular, the invention relates to controller circuits and processes for controlling light modulators incorporated into imaging displays.
  • BACKGROUND OF THE INVENTION
  • Displays built from mechanical light modulators are an attractive alternative to displays based on liquid crystal technology. Mechanical light modulators are fast enough to display video content with good viewing angles and with a wide range of color and grey scale. Mechanical light modulators have been successful in projection display applications. Direct-view displays using mechanical light modulators have not yet demonstrated sufficiently attractive combinations of brightness and low power.
  • In contrast to projection displays in which switching circuitry and light modulators can be built on relatively small die cut from silicon substrates, most direct-view displays require the fabrication of light modulators on much larger substrates. In addition, in many cases, particularly for backlit direct view displays, both the control circuitry and the light modulators are preferably formed on transparent substrates. As a result, many typical semiconductor manufacturing processes are inapplicable, and switching circuits often need to be re-designed accordingly. A need remains for MEMS direct-view displays that incorporate display processes in conjunction with switching circuitry that yield detailed images along with rich levels of grayscale and contrast.
  • SUMMARY
  • There is a need in the art for fast, bright, low-powered mechanically actuated direct-view displays. Specifically there is a need for direct-view displays built on transparent substrates that can be driven at high speeds and at low voltages for improved image quality and reduced power consumption.
  • In one aspect of the invention, a direct-view display includes an array of MEMS light modulators and a control matrix both formed on a transparent substrate, where each of the light modulators can be driven into at least two states. The control matrix transmits data and actuation voltages to the array and may include, for each light modulator, a transistor and a capacitor. The direct-view display also includes a controller for controlling the states of each of the light modulators in the array. The controller includes an input, a processor, a memory, and an output. The input receives image data encoding an image frame for display on the direct-view display. The processor derives a plurality of sub-frame data sets from the image data. Each sub-frame data set indicates desired states of light modulators in multiple rows and multiple columns of the array. The memory stores the plurality of sub-frame data sets. The output outputs the plurality of sub-frame data sets according to an output sequence to drive light modulators into the states indicated in the sub-frame data sets. The plurality of sub-frame data sets may include distinct sub-frame data sets for at least two of at least three color components of the image frame or for four color components of the image frame, where the four color components may consist of red, green, blue, and white.
  • In one embodiment, the output sequence includes a plurality of events corresponding to the sub-frame data sets. The controller stores different time values associated with events corresponding to at least two sub-frame data sets. The time values may be selected to prevent illumination of the array while the modulators change states and may correlate to a brightness of a sub-frame image resulting from an outputting of a sub-frame data set of the plurality of sub-frame data sets. The direct-view display may include a plurality of lamps, in which case the controller may store time values associated with lamp illumination events and/or lamp extinguishing events included in the output sequence. The output sequence may include addressing events, where the controller stores time values associated with the addressing events.
  • In another embodiment, the output sequence is stored at least in part in memory. The direct-view display may include a data link to an external processor for receiving changes to the output sequence. The direct-view display may include a plurality of lamps, where the output sequence includes a lamp illumination sequence. The lamp illumination sequence may include data corresponding to the length of time and/or intensity with which lamps are illuminated in association with sub-frame data sets output in the output sequence. The length of time that a lamp is illuminated for each sub-frame data set in the lamp illumination sequence is preferably less than or equal to 4 milliseconds.
  • In another embodiment, the processor derives the plurality of sub-frame data sets by decomposing the image frame into a plurality of sub-frame images and assigning a weight to each sub-frame image of the plurality of sub-frame images. The controller may cause a sub-frame image to be illuminated for a length of time and/or with an illumination intensity proportional to the weight assigned to the sub-frame image. The processor may assign the weight according to a coding scheme. In one implementation, the coding scheme is a binary coding scheme, the sub-frame data sets are bitplanes, and each color component of the image frame is decomposed into at least a most significant sub-frame image and a next most significant sub-frame image. The most-significant sub-frame image may contribute to a displayed image frame twice as much as the next most significant sub-frame image. According to the output sequence, the bitplane corresponding to the most significant sub-image of at least one color component of the image frame may be output at two distinct times which may be separated by no more than 25 milliseconds. The length of time between a first time the bitplane corresponding to the most significant sub-frame image of a color component of the image frame is output and a second time the bitplane corresponding to the most significant sub-frame image of the color component is output is preferably within 10% of the length of time between the second time the bitplane corresponding to the most significant sub-frame image of the color component is output and a subsequent time at which a sub-frame image corresponding to a most significant sub-frame image of the color component is output.
  • In another embodiment, at least one sub-frame data set corresponding to a first color component of the image frame is output before at least one sub-frame data set corresponding to a second color component of the image frame, and at least one sub-frame data set corresponding to the first color component of the image frame is output after at least one sub-frame data set corresponding to the second color component of the image frame. Lamps of at least two different colors may be illuminated to display a single sub-frame image corresponding to a single sub-frame data set, where a lamp of one of the colors may be illuminated with a substantially greater intensity than lamps of the other colors.
  • In another embodiment, the direct-view display includes a memory for storing a plurality of alternative output sequences and may include an output sequence switching module for switching between the output sequence and the plurality of alternative output sequences. The output sequence switching module may respond to the processor, to a user interface included in the direct-view display, and/or to instructions received from a second processor, external to the controller, included in the device in which the direct-view display is incorporated. The user interface may be a manual switch.
  • In another embodiment, the direct-view display includes a sequence parameter calculation module for deriving changes to the output sequence. Based on characteristics of a received image frame, the sequence parameter calculation module may derive changes to the output sequence, to timing values stored in relation to events included in the output sequence, and/or to sub-frame data sets. The direct-view display may include a plurality of lamps, in which case the sequence parameter calculation module may derive changes to lamp intensity values stored in relation to lamp illumination events included in the output sequence.
  • In another embodiment, the array of light modulators includes a plurality of independently actuatable banks of light modulators. The control matrix may include a plurality of global actuation interconnects, where each global actuation interconnect corresponds to a respective bank of light modulators. The plurality of banks may be located adjacent one another in the array. Alternatively, each bank of light modulators may include a plurality of rows in the array, where the banks are interwoven with one another in the array. In one implementation, the display of a sub-frame image corresponding to a particular significance and color component in one of the banks is no more than 25 ms from a subsequent display of a sub-frame image corresponding to the significance value and color component, and is no more than 25 ms after a prior display of a sub-frame image corresponding to the significance and color component in the other of the banks.
  • In another embodiment, the light modulators include shutters. The shutters may selectively reflect light and/or selectively allow light to pass through corresponding apertures to form the image frame. The shutters may be driven transverse to the substrate. In another embodiment, the light modulators are reflective light modulators. In another embodiment, the light modulators selectively allow the passage of light towards a viewer. In another embodiment, a light guide is positioned proximate the array of light modulators.
  • In another embodiment, the output sequence includes a plurality of global actuation events. The direct-view display may include a global actuation interconnect coupled to the array of light modulators for causing light modulators in multiple rows and multiple columns of the array of light modulators to actuate substantially simultaneously.
  • In another aspect of the invention, a direct-view display includes an array of MEMS light modulators and a control matrix both formed on a transparent substrate, where each of the light modulators can be driven into at least two states, and lamps of at least three colors. The control matrix transmits data and actuation voltages to the array. The direct-view display also includes a controller for controlling the states of each of the light modulators in the array. The controller also controls the illumination of lamps to illuminate the array of light modulators with lamps of at least two colors at the same time to form a portion of an image. At least one of the colors illuminating the array of light modulators may be of greater intensity than the other colors.
  • Another aspect of the invention includes a method for displaying an image frame on a direct-view display. The method includes the steps of receiving image data encoding the image frame; deriving a plurality of sub-frame data sets from the image data; storing the plurality of sub-frame data sets in a memory; and outputting the plurality of sub-frame data sets according to an output sequence. Each sub-frame data set indicates desired states of MEMS light modulators in multiple rows and multiple columns of a light modulator array formed on a transparent substrate. The step of outputting the plurality of sub-frame data sets drives the MEMS light modulators into the desired states indicated in each sub-frame data set and includes transmitting data and actuation voltages to the light modulator array via a control matrix formed on the transparent substrate.
  • In another aspect of the invention, a direct-view display includes an array of MEMS light modulators and a control matrix both formed on a transparent substrate, wherein each of the light modulators can be driven into at least two states. The control matrix transmits data and actuation voltages to the array. The direct-view display also includes a controller for controlling the states of each of the light modulators in the array. The controller also controls the illumination of lamps of at least four colors to display an image. The lamps may include at least a red lamp, a green lamp, a blue lamp, and a white lamp. The lamps may include at least a red lamp, a green lamp, a blue lamp, and a yellow lamp. The direct-view display may include a processor for translating three color image data into four color image data.
  • Another aspect of the invention includes a method for displaying an image on a direct-view display. The method includes the steps of controlling states of MEMS light modulators in a light modulator array formed on a transparent substrate, where each of the MEMS light modulators can be driven into at least two states; transmitting data and actuation voltages to the light modulator array via a control matrix formed on the transparent substrate; and controlling the illumination of lamps of at least four colors to display the image.
  • BRIEF DESCRIPTION
  • In the detailed description which follows, reference will be made to the attached drawings, in which:
  • FIG. 1 is a schematic diagram of a direct-view MEMS-based display according to an illustrative embodiment of the invention;
  • FIG. 2A is a perspective view of an illustrative shutter-based light modulator suitable for incorporation into the direct-view MEMS-based display of FIG. 1, according to an illustrative embodiment of the invention;
  • FIG. 2B is a cross-sectional view of a rollershade-based light modulator suitable for incorporation into the direct-view MEMS-based display of FIG. 1, according to an illustrative embodiment of the invention;
  • FIG. 2C is a cross sectional view of a light-tap-based light modulator suitable for incorporation into the direct-view MEMS-based display of FIG. 1, according to an illustrative embodiment of the invention;
  • FIG. 2D is a cross sectional view of an electrowetting-based light modulator suitable for incorporation into the direct-view MEMS-based display of FIG. 1, according to an illustrative embodiment of the invention;
  • FIG. 3A is a schematic diagram of a control matrix suitable for controlling the light modulators incorporated into the direct-view MEMS-based display of FIG. 1, according to an illustrative embodiment of the invention;
  • FIG. 3B is a perspective view of an array of shutter-based light modulators connected to the control matrix of FIG. 3A, according to an illustrative embodiment of the invention;
  • FIG. 3C illustrates a portion of a direct view display that includes the array of light modulators depicted in FIG. 3B disposed on top of a backlight, according to an illustrative embodiment of the invention;
  • FIG. 3D is a schematic diagram of another suitable control matrix for inclusion in the direct-view MEMS-based display of FIG. 1, according to an illustrative embodiment of the invention;
  • FIG. 4 is a timing diagram for a method of displaying an image on a display using a field sequential color technique;
  • FIG. 5 is a timing diagram for a method of displaying an image on a display using a time-division gray scale technique;
  • FIG. 6A is a schematic diagram of a digital image signal received by a display device, according to an illustrative embodiment of the invention;
  • FIG. 6B is a schematic diagram of a memory buffer useful for converting a received image signal into a bitplane, according to an illustrative embodiment of the invention;
  • FIG. 6C is a schematic diagram of portions of two bitplanes, according to an illustrative embodiment of the invention;
  • FIG. 7 is a block diagram of a display apparatus, according to an illustrative embodiment of the invention;
  • FIG. 8 is a flow chart of a method of displaying images suitable for use by the display apparatus of FIG. 7, according to an illustrative embodiment of the invention;
  • FIG. 9 is a more detailed flow chart of a portion of a first implementation of the method of FIG. 8, according to an illustrative embodiment of the invention;
  • FIG. 10 is a timing diagram illustrating the timing of various image formation events in the method of FIG. 9, according to an illustrative embodiment of the invention;
  • FIG. 11 is a more detailed flow chart of a portion of a second implementation of the method of FIG. 8, according to an illustrative embodiment of the invention;
  • FIG. 12 is a timing diagram illustrating the timing of various image formation events in a first implementation of the method of FIG. 11, according to an illustrative embodiment of the invention;
  • FIG. 13 is a timing diagram illustrating the timing of various image formation events in a second implementation of the method of FIG. 11, according to an illustrative embodiment of the invention;
  • FIG. 14A is a timing diagram illustrating the timing of various image formation events in a third implementation of the method of FIG. 11, according to an illustrative embodiment of the invention;
  • FIG. 14B is a timing diagram illustrating the timing of various image formation events in a fourth implementation of the method of FIG. 11, according to an illustrative embodiment of the invention;
  • FIG. 14C depicts various pulse profiles for lamps, according to an illustrative embodiment of the invention;
  • FIG. 15 is a timing diagram illustrating the timing of various image formation events in a fourth implementation of the method of FIG. 11, according to an illustrative embodiment of the invention;
  • FIG. 16 is a timing diagram illustrating the timing of various image formation events in a fifth implementation of the method of FIG. 11, according to an illustrative embodiment of the invention;
  • FIG. 17 is a timing diagram illustrating the timing of various image formation events in a sixth implementation of the method of FIG. 11, according to an illustrative embodiment of the invention;
  • FIG. 18 is a more detailed flow chart of a portion of a third implementation of the method of FIG. 8, according to an illustrative embodiment of the invention;
  • FIG. 19 is a timing diagram illustrating the timing of various image formation events in an implementation of the method of FIG. 18, according to an illustrative embodiment of the invention;
  • FIG. 20 is a block diagram of a controller suitable for inclusion in the display apparatus of FIG. 1, according to an illustrative embodiment of the invention;
  • FIG. 21 is a flow chart of a method of displaying an image suitable for use by the controller of FIG. 20, according to an illustrative embodiment of the invention;
  • FIG. 22 is a block diagram of a second controller suitable for inclusion in the display apparatus of FIG. 1, according to an illustrative embodiment of the invention; and
  • FIG. 23 is a flow chart of a method of displaying an image suitable for use by the controller of FIG. 22, according to an illustrative embodiment of the invention.
  • DETAILED DESCRIPTION
  • FIG. 1 is a schematic diagram of a direct-view MEMS-based display apparatus 100, according to an illustrative embodiment of the invention. The display apparatus 100 includes a plurality of light modulators 102 a-102 d (generally “light modulators 102”) arranged in rows and columns. In the display apparatus 100, light modulators 102 a and 102 d are in the open state, allowing light to pass. Light modulators 102 b and 102 c are in the closed state, obstructing the passage of light. By selectively setting the states of the light modulators 102 a-102 d, the display apparatus 100 can be utilized to form an image 104 for a backlit display, if illuminated by a lamp or lamps 105. In another implementation, the apparatus 100 may form an image by reflection of ambient light originating from the front of the apparatus. In another implementation, the apparatus 100 may form an image by reflection of light from a lamp or lamps positioned in the front of the display, i.e. by use of a frontlight.
  • In the display apparatus 100, each light modulator 102 corresponds to a pixel 106 in the image 104. In other implementations, the display apparatus 100 may utilize a plurality of light modulators to form a pixel 106 in the image 104. For example, the display apparatus 100 may include three color-specific light modulators 102. By selectively opening one or more of the color-specific light modulators 102 corresponding to a particular pixel 106, the display apparatus 100 can generate a color pixel 106 in the image 104. In another example, the display apparatus 100 includes two or more light modulators 102 per pixel 106 to provide grayscale in an image 104. With respect to an image, a "pixel" corresponds to the smallest picture element defined by the resolution of the image. With respect to structural components of the display apparatus 100, the term "pixel" refers to the combined mechanical and electrical components utilized to modulate the light that forms a single pixel of the image.
  • Display apparatus 100 is a direct-view display in that it does not require imaging optics that are necessary for projection applications. In a projection display, the image formed on the surface of the display apparatus is projected onto a screen or onto a wall. The display apparatus is substantially smaller than the projected image. In a direct view display, the user sees the image by looking directly at the display apparatus, which contains the light modulators and optionally a backlight or front light for enhancing brightness and/or contrast seen on the display.
  • Direct-view displays may operate in either a transmissive or reflective mode. In a transmissive display, the light modulators filter or selectively block light which originates from a lamp or lamps positioned behind the display. The light from the lamps is optionally injected into a lightguide or “backlight” so that each pixel can be uniformly illuminated. Transmissive direct-view displays are often built onto transparent or glass substrates to facilitate a sandwich assembly arrangement where one substrate, containing the light modulators, is positioned directly on top of the backlight.
  • Each light modulator 102 includes a shutter 108 and an aperture 109. To illuminate a pixel 106 in the image 104, the shutter 108 is positioned such that it allows light to pass through the aperture 109 towards a viewer. To keep a pixel 106 unlit, the shutter 108 is positioned such that it obstructs the passage of light through the aperture 109. The aperture 109 is defined by an opening patterned through a reflective or light-absorbing material in each light modulator 102.
  • The display apparatus also includes a control matrix connected to the substrate and to the light modulators for controlling the movement of the shutters. The control matrix includes a series of electrical interconnects (e.g., interconnects 110, 112, and 114), including at least one write-enable interconnect 110 (also referred to as a "scan-line interconnect") per row of pixels, one data interconnect 112 for each column of pixels, and one common interconnect 114 providing a common voltage to all pixels, or at least to pixels from both multiple columns and multiple rows in the display apparatus 100. In response to the application of an appropriate voltage (the "write-enabling voltage, Vwe"), the write-enable interconnect 110 for a given row of pixels prepares the pixels in the row to accept new shutter movement instructions. The data interconnects 112 communicate the new movement instructions in the form of data voltage pulses. The data voltage pulses applied to the data interconnects 112, in some implementations, directly contribute to an electrostatic movement of the shutters. In other implementations, the data voltage pulses control switches, e.g., transistors or other non-linear circuit elements that control the application of separate actuation voltages, which are typically higher in magnitude than the data voltages, to the light modulators 102. The application of these actuation voltages then results in the electrostatically driven movement of the shutters 108.
  • FIG. 2A is a perspective view of an illustrative shutter-based light modulator 200 suitable for incorporation into the direct-view MEMS-based display apparatus 100 of FIG. 1, according to an illustrative embodiment of the invention. The light modulator 200 includes a shutter 202 coupled to an actuator 204. The actuator 204 is formed from two separate compliant electrode beam actuators 205 (the “actuators 205”), as described in U.S. patent application Ser. No. 11/251,035, filed on Oct. 14, 2005. The shutter 202 couples on one side to the actuators 205. The actuators 205 move the shutter 202 transversely over a surface 203 in a plane of motion which is substantially parallel to the surface 203. The opposite side of the shutter 202 couples to a spring 207 which provides a restoring force opposing the forces exerted by the actuator 204.
  • Each actuator 205 includes a compliant load beam 206 connecting the shutter 202 to a load anchor 208. The load anchors 208 along with the compliant load beams 206 serve as mechanical supports, keeping the shutter 202 suspended proximate to the surface 203. The surface includes one or more aperture holes 211 for admitting the passage of light. The load anchors 208 physically connect the compliant load beams 206 and the shutter 202 to the surface 203 and electrically connect the load beams 206 to a bias voltage, in some instances, ground.
  • If the substrate is opaque, such as silicon, then aperture holes 211 are formed in the substrate by etching an array of holes through the substrate 204. If the substrate 204 is transparent, such as glass or plastic, then the first step of the processing sequence involves depositing a light blocking layer onto the substrate and etching the light blocking layer into an array of holes 211. The aperture holes 211 can be generally circular, elliptical, polygonal, serpentine, or irregular in shape.
  • Each actuator 205 also includes a compliant drive beam 216 positioned adjacent to each load beam 206. The drive beams 216 couple at one end to a drive beam anchor 218 shared between the drive beams 216. The other end of each drive beam 216 is free to move. Each drive beam 216 is curved such that it is closest to the load beam 206 near the free end of the drive beam 216 and the anchored end of the load beam 206.
  • In operation, a display apparatus incorporating the light modulator 200 applies an electric potential to the drive beams 216 via the drive beam anchor 218. A second electric potential may be applied to the load beams 206. The resulting potential difference between the drive beams 216 and the load beams 206 pulls the free ends of the drive beams 216 towards the anchored ends of the load beams 206, and pulls the shutter ends of the load beams 206 toward the anchored ends of the drive beams 216, thereby driving the shutter 202 transversely towards the drive anchor 218. The compliant members 206 act as springs, such that when the potential across the beams 206 and 216 is removed, the load beams 206 push the shutter 202 back into its initial position, releasing the stress stored in the load beams 206.
  • A light modulator, such as light modulator 200, incorporates a passive restoring force, such as a spring, for returning a shutter to its rest position after voltages have been removed. Other shutter assemblies, as described in U.S. patent application Ser. Nos. 11/251,035 and 11/326,696, incorporate a dual set of "open" and "closed" actuators and separate sets of "open" and "closed" electrodes for moving the shutter into either an open or a closed state.
  • U.S. patent application Ser. Nos. 11/251,035 and 11/326,696 have described a variety of methods by which an array of shutters and apertures can be controlled via a control matrix to produce images, in many cases moving images, with appropriate gray scale. In some cases control is accomplished by means of a passive matrix array of row and column interconnects connected to driver circuits on the periphery of the display. In other cases it is appropriate to include switching and/or data storage elements within each pixel of the array (the so-called active matrix) to improve either the speed, the gray scale and/or the power dissipation performance of the display.
  • The control matrices described herein are not limited to controlling shutter-based MEMS light modulators, such as the light modulators described above. For example, FIG. 2B is a cross-sectional view of a rolling actuator-based light modulator 220 suitable for incorporation into the direct-view MEMS-based display apparatus 100 of FIG. 1, according to an illustrative embodiment of the invention. As described further in U.S. Pat. No. 5,233,459, entitled “Electric Display Device,” and U.S. Pat. No. 5,784,189, entitled “Spatial Light Modulator,” the entireties of which are incorporated herein by reference, a rolling actuator-based light modulator includes a moveable electrode disposed opposite a fixed electrode and biased to move in a preferred direction to produce a shutter upon application of an electric field. In one embodiment, the light modulator 220 includes a planar electrode 226 disposed between a substrate 228 and an insulating layer 224 and a moveable electrode 222 having a fixed end 230 attached to the insulating layer 224. In the absence of any applied voltage, a moveable end 232 of the moveable electrode 222 is free to roll towards the fixed end 230 to produce a rolled state. Application of a voltage between the electrodes 222 and 226 causes the moveable electrode 222 to unroll and lie flat against the insulating layer 224, whereby it acts as a shutter that blocks light traveling through the substrate 228. The moveable electrode 222 returns to the rolled state after the voltage is removed. The bias towards a rolled state may be achieved by manufacturing the moveable electrode 222 to include an anisotropic stress state.
  • FIG. 2C is a cross-sectional view of a light-tap-based light modulator 250 suitable for incorporation into the direct-view MEMS-based display apparatus 100 of FIG. 1, according to an illustrative embodiment of the invention. As described further in U.S. Pat. No. 5,771,321, entitled “Micromechanical Optical Switch and Flat Panel Display,” the entirety of which is incorporated herein by reference, a light tap works according to a principle of frustrated total internal reflection. That is, light 252 is introduced into a light guide 254, in which, without interference, light 252 is for the most part unable to escape the light guide 254 through its front or rear surfaces due to total internal reflection. The light tap 250 includes a tap element 256 that has a sufficiently high index of refraction that, in response to the tap element 256 contacting the light guide 254, light 252 impinging on the surface of the light guide adjacent the tap element 256 escapes the light guide 254 through the tap element 258 towards a viewer, thereby contributing to the formation of an image.
  • In one embodiment, the tap element 256 is formed as part of beam 258 of flexible, transparent material. Electrodes 260 coat portions of one side of the beam 258. Opposing electrodes 260 are disposed on a cover plate 264 positioned adjacent the layer 258 on the opposite side of the light guide 254. By applying a voltage across the electrodes 260, the position of the tap element 256 relative to the light guide 254 can be controlled to selectively extract light 252 from the light guide 254.
  • FIG. 2D is a cross sectional view of a third illustrative non-shutter-based light modulator suitable for inclusion in various embodiments of the invention. Specifically, FIG. 2D is a cross sectional view of an electrowetting-based light modulation array 270. The light modulation array 270 includes a plurality of electrowetting-based light modulation cells 272 a-272 d (generally "cells 272") formed on an optical cavity 274. The light modulation array 270 also includes a set of color filters 276 corresponding to the cells 272.
  • Each cell 272 includes a layer of water (or other transparent conductive or polar fluid) 278, a layer of light absorbing oil 280, a transparent electrode 282 (made, for example, from indium-tin oxide), and an insulating layer 284 positioned between the layer of light absorbing oil 280 and the transparent electrode 282. Illustrative implementations of such cells are described further in U.S. Patent Application Publication No. 2005/0104804, published May 19, 2005 and entitled "Display Device." In the embodiment described herein, the electrode takes up a portion of a rear surface of a cell 272.
  • The remainder of the rear surface of a cell 272 is formed from a reflective aperture layer 286 that forms the front surface of the optical cavity 274. The reflective aperture layer 286 is formed from a reflective material, such as a reflective metal or a stack of thin films forming a dielectric mirror. For each cell 272, an aperture is formed in the reflective aperture layer 286 to allow light to pass through. The electrode 282 for the cell is deposited in the aperture and over the material forming the reflective aperture layer 286, separated by another dielectric layer.
  • The remainder of the optical cavity 274 includes a light guide 288 positioned proximate the reflective aperture layer 286, and a second reflective layer 290 on a side of the light guide 288 opposite the reflective aperture layer 286. A series of light redirectors 291 are formed on the rear surface of the light guide, proximate the second reflective layer. The light redirectors 291 may be either diffuse or specular reflectors. One or more light sources 292 inject light 294 into the light guide 288.
  • In an alternative implementation, an additional transparent substrate is positioned between the light guide 290 and the light modulation array 270. In this implementation, the reflective aperture layer 286 is formed on the additional transparent substrate instead of on the surface of the light guide 290.
  • In operation, application of a voltage to the electrode 282 of a cell (for example, cell 272 b or 272 c) causes the light absorbing oil 280 in the cell to collect in one portion of the cell 272. As a result, the light absorbing oil 280 no longer obstructs the passage of light through the aperture formed in the reflective aperture layer 286 (see, for example, cells 272 b and 272 c). Light escaping the backlight at the aperture is then able to escape through the cell and through a corresponding color (for example, red, green, or blue) filter in the set of color filters 276 to form a color pixel in an image. When the electrode 282 is grounded, the light absorbing oil 280 covers the aperture in the reflective aperture layer 286, absorbing any light 294 attempting to pass through it.
  • The area under which oil 280 collects when a voltage is applied to the cell 272 constitutes wasted space in relation to forming an image. This area cannot pass light through, whether a voltage is applied or not, and therefore, without the inclusion of the reflective portions of the reflective aperture layer 286, would absorb light that otherwise could be used to contribute to the formation of an image. However, with the inclusion of the reflective aperture layer 286, this light, which otherwise would have been absorbed, is reflected back into the light guide 290 for future escape through a different aperture.
  • The roller-based light modulator 220, light tap 250, and electrowetting-based light modulation array 270 are not the only examples of a non-shutter-based MEMS modulator suitable for control by the control matrices described herein. Other forms of non-shutter-based MEMS modulators could likewise be controlled by various ones of the control matrices described herein without departing from the scope of the invention.
  • FIG. 3A is a schematic diagram of a control matrix 300 suitable for controlling the light modulators incorporated into the direct-view MEMS-based display apparatus 100 of FIG. 1, according to an illustrative embodiment of the invention. FIG. 3B is a perspective view of an array 320 of shutter-based light modulators connected to the control matrix 300 of FIG. 3A, according to an illustrative embodiment of the invention. The control matrix 300 may address an array of pixels 320 (the “array 320”). Each pixel 301 includes an elastic shutter assembly 302, such as the shutter assembly 200 of FIG. 2A, controlled by an actuator 303. Each pixel also includes an aperture layer 322 that includes aperture holes 324. Further electrical and mechanical descriptions of shutter assemblies such as shutter assembly 302, and variations thereon, can be found in U.S. patent application Ser. Nos. 11/251,035 and 11/326,696.
  • The control matrix 300 is fabricated as a diffused or thin-film-deposited electrical circuit on the surface of a substrate 304 on which the shutter assemblies 302 are formed. The control matrix 300 includes a scan-line interconnect 306 for each row of pixels 301 in the control matrix 300 and a data-interconnect 308 for each column of pixels 301 in the control matrix 300. Each scan-line interconnect 306 electrically connects a write-enabling voltage source 307 to the pixels 301 in a corresponding row of pixels 301. Each data interconnect 308 electrically connects a data voltage source, (“Vd source”) 309 to the pixels 301 in a corresponding column of pixels 301. In control matrix 300, the data voltage Vd provides the majority of the energy necessary for actuation of the shutter assemblies 302. Thus, the data voltage source 309 also serves as an actuation voltage source.
  • Referring to FIGS. 3A and 3B, for each pixel 301 or for each shutter assembly in the array of pixels 320, the control matrix 300 includes a transistor 310 and a capacitor 312. The gate of each transistor 310 is electrically connected to the scan-line interconnect 306 of the row in the array 320 in which the pixel 301 is located. The source of each transistor 310 is electrically connected to its corresponding data interconnect 308. The actuators 303 of each shutter assembly include two electrodes. The drain of each transistor 310 is electrically connected in parallel to one electrode of the corresponding capacitor 312 and to one of the electrodes of the corresponding actuator 303. The other electrode of the capacitor 312 and the other electrode of the actuator 303 in shutter assembly 302 are connected to a common or ground potential.
  • In operation, to form an image, the control matrix 300 write-enables each row in the array 320 in sequence by applying Vwe to each scan-line interconnect 306 in turn. For a write-enabled row, the application of Vwe to the gates of the transistors 310 of the pixels 301 in the row allows the flow of current through the data interconnects 308 through the transistors to apply a potential to the actuator 303 of the shutter assembly 302. While the row is write-enabled, data voltages Vd are selectively applied to the data interconnects 308. In implementations providing analog gray scale, the data voltage applied to each data interconnect 308 is varied in relation to the desired brightness of the pixel 301 located at the intersection of the write-enabled scan-line interconnect 306 and the data interconnect 308. In implementations providing digital control schemes, the data voltage is selected to be either a relatively low magnitude voltage (i.e., a voltage near ground) or to meet or exceed Vat (the actuation threshold voltage). In response to the application of Vat to a data interconnect 308, the actuator 303 in the corresponding shutter assembly 302 actuates, opening the shutter in that shutter assembly 302. The voltage applied to the data interconnect 308 remains stored in the capacitor 312 of the pixel 301 even after the control matrix 300 ceases to apply Vwe to a row. It is not necessary, therefore, to wait and hold the voltage Vwe on a row for times long enough for the shutter assembly 302 to actuate; such actuation can proceed after the write-enabling voltage has been removed from the row. The voltages in the capacitors 312 in a row remain substantially stored until an entire video frame is written, and in some implementations until new data is written to the row.
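For illustration only, the row-at-a-time addressing just described can be modeled in software as follows. The voltage values, array sizes, and names are assumptions; the sketch simply records, per pixel, the data voltage that would be held on capacitor 312 after each row is write-enabled in turn.

```python
# Software model of sequential row addressing for a digital control scheme:
# each row is write-enabled, column data voltages are applied, and the value
# remains stored on the pixel capacitor after write-enable is removed.

V_AT = 40.0   # example actuation threshold voltage (an illustrative value)

def address_array(bitplane, capacitor_voltages):
    """bitplane: 2D list of 0/1 desired states; capacitor_voltages: same shape."""
    rows, cols = len(bitplane), len(bitplane[0])
    for i in range(rows):
        # Applying Vwe to scan-line interconnect i lets the row's transistors conduct.
        for j in range(cols):
            data_voltage = V_AT if bitplane[i][j] else 0.0   # column data voltage
            capacitor_voltages[i][j] = data_voltage          # held on capacitor 312
        # Vwe is then removed from row i; the stored voltages persist, and the
        # shutters in that row can continue actuating after write-enable ends.
    return capacitor_voltages

caps = [[0.0] * 3 for _ in range(2)]
print(address_array([[1, 0, 1], [0, 1, 0]], caps))
```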
  • The pixels 301 of the array 320 are formed on a substrate 304. The array includes an aperture layer 322, disposed on the substrate, which includes a set of aperture holes 324 for each pixel 301 in the array 320. The aperture holes 324 are aligned with the shutter assemblies 302 in each pixel. In one implementation the substrate 304 is made of a transparent material, such as glass or plastic. In another implementation the substrate 304 is made of an opaque material, but in which holes are etched to form the aperture holes 324.
  • The shutter assembly 302 together with the actuator 303 can be made bi-stable. That is, the shutters can exist in at least two equilibrium positions (e.g. open or closed) with little or no power required to hold them in either position. More particularly, the shutter assembly 302 can be mechanically bi-stable. Once the shutter of the shutter assembly 302 is set in position, no electrical energy or holding voltage is required to maintain that position. The mechanical stresses on the physical elements of the shutter assembly 302 can hold the shutter in place.
  • The shutter assembly 302 together with the actuator 303 can also be made electrically bi-stable. In an electrically bi-stable shutter assembly, there exists a range of voltages below the actuation voltage of the shutter assembly, which if applied to a closed actuator (with the shutter being either open or closed), hold the actuator closed and the shutter in position, even if an opposing force is exerted on the shutter. The opposing force may be exerted by a spring such as spring 207 in shutter-based light modulator 200, or the opposing force may be exerted by an opposing actuator, such as an “open” or “closed” actuator.
  • The light modulator array 320 is depicted as having a single MEMS light modulator per pixel. Other embodiments are possible in which multiple MEMS light modulators are provided in each pixel, thereby providing the possibility of more than just binary "on" or "off" optical states in each pixel. Certain forms of coded area division gray scale are possible wherein multiple MEMS light modulators are provided in each pixel, and where the aperture holes 324 associated with each of the light modulators have unequal areas.
  • FIG. 3D is a schematic diagram of yet another suitable control matrix 340 for inclusion in the display apparatus 100, according to an illustrative embodiment of the invention. Control matrix 340 controls an array of pixels 342 that include shutter assemblies 344. The control matrix 340 includes a single data interconnect 348 for each column of pixels 342 in the control matrix. The actuators in the shutter assemblies 344 can be made either electrically bi-stable or mechanically bi-stable.
  • The control matrix 340 includes a scan-line interconnect 346 for each row of pixels 342 in the control matrix 340. The control matrix 340 further includes a charge interconnect 350, a global actuation interconnect 354, and a shutter common interconnect 355. These interconnects 350, 354 and 355 are shared among pixels 342 in multiple rows and multiple columns in the array. In one implementation, the interconnects 350, 354, and 355 are shared among all pixels 342 in the control matrix 340. Each pixel 342 in the control matrix includes a shutter charge transistor 356, a shutter discharge transistor 358, a shutter write-enable transistor 357, and a data store capacitor 359. Control matrix 340 also incorporates an optional voltage stabilizing capacitor 352, which is connected in parallel with the source and drain of the discharge switch transistor 358. The gate and drain terminals of the charging transistor 356 are connected directly to the charge interconnect 350. In operation, the charging transistors 356 operate essentially as diodes; they can pass a current in only one direction.
  • At the beginning of each frame addressing cycle the control matrix 340 applies a voltage pulse to the charge interconnect 350, allowing current to flow through charging transistor 356 and into the shutter assemblies 344 of the pixels 342. After this charging pulse, each of the shutter electrodes of shutter assemblies 344 will be in the same voltage state. After the voltage pulse, the potential of charge interconnect 350 is reset to zero, and the charging transistors 356 will prevent the charge stored in the shutter assemblies 344 from being dissipated through charge interconnect 350. The charge interconnect 350, in one implementation, transmits a pulsed voltage equal to or greater than Vat, e.g., 40V. In one implementation the imposition of a voltage in excess of Vat causes all of the shutter assemblies connected to the charging interconnect 350 to actuate or move into the same state, for instance the shutter closed state.
  • Each row is then write-enabled in sequence. The control matrix 340 applies a write-enabling voltage Vwe to the scan-line interconnect 346 corresponding to each row. While a particular row of pixels 342 is write-enabled, the control matrix 340 applies a data voltage to the data interconnect 348 corresponding to each column of pixels 342 in the control matrix 340. The application of Vwe to the scan-line interconnect 346 for the write-enabled row turns on the write-enable transistor 357 of the pixels 342 in the corresponding scan line. The voltages applied to the data interconnects 348 are thereby stored on the data store capacitors 359 of the respective pixels 342.
  • In control matrix 340 the global actuation interconnect 354 is connected to the source of the shutter discharge switch transistor 358. Maintaining the global actuation interconnect 354 at a potential significantly above that of the shutter common interconnect 355 prevents the turn-on of the discharge switch transistor 358, regardless of what charge is stored on the capacitor 359. Global actuation in control matrix 340 is achieved by bringing the potential on the global actuation interconnect 354 to ground or to substantially the same potential as the shutter common interconnect 355, enabling the discharge switch transistor 358 to turn on in accordance with whether a data voltage has been stored on capacitor 359. During the global actuation step, for the pixels wherein a data voltage has been stored on capacitor 359, the discharge transistor turns on, charge drains out of the actuators of shutter assembly 344, and the shutter assembly 344 is allowed to move or actuate into its relaxed state, for instance the shutter open state. For pixels wherein no data voltage was stored on the capacitor 359, the discharge transistor 358 does not turn on and the shutter assembly 344 remains charged. For those pixels a voltage remains across the actuators of shutter assemblies 344 and those pixels remain, for instance, in the shutter closed state. During the global actuation step all pixels connected to the same global actuation interconnect, and with data stored on capacitor 359, move into their new states at substantially the same time. Control matrix 340 does not depend on electrical bi-stability in the shutter assembly 344 in order to achieve global actuation.
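  • The charge, address, and global actuation sequence described above can be summarized in the following sketch. The driver objects, voltage values, and function names are assumptions made for illustration; the actual sequence is carried out by the control matrix and its drive circuitry:

V_AT = 40.0   # charging pulse at or above the actuation threshold (example)
V_WE = 5.0    # write-enable voltage (assumed)

def address_and_globally_actuate(data, charge_drv, scan_drv, data_drv,
                                 global_drv, v_common=0.0):
    # 1. Charge pulse: every shutter assembly is driven to the same state
    #    (e.g., closed) and retains its charge after the pulse ends.
    charge_drv.pulse(V_AT)
    # 2. Load data row by row; each value is stored on the data store
    #    capacitor 359 without actuating any shutter yet.
    for row, row_data in enumerate(data):
        scan_drv.apply(row, V_WE)
        for col, discharge in enumerate(row_data):
            data_drv.apply(col, V_WE if discharge else 0.0)
        scan_drv.apply(row, 0.0)
    # 3. Global actuation: bring the global actuation interconnect to the
    #    shutter common potential so every pixel with a stored data voltage
    #    discharges and relaxes at substantially the same time.
    global_drv.apply(v_common)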
  • Applying partial voltages to the data store capacitor 359 allows partial turn-on of the discharge switch transistor 358 during the time that the global actuation interconnect 354 is brought to its actuation potential. In this fashion, an analog voltage is created on the shutter assembly 344, for providing analog gray scale.
  • In some implementations the global actuation interconnect 354 is connected to every shutter discharge transistor 358 in every row and column in the array of pixels. In other implementations the global actuation interconnect 354 is connected to the shutter discharge transistors within only a sub-group of pixels in multiple rows and columns. As will be discussed with reference to FIGS. 18 and 19, the array of pixels can be arranged in banks, where each bank of pixels is connected by means of a global actuation interconnect to a unique global actuation driver. In this implementation the control circuit can load data into the selected banks and then actuate only the selected bank globally by means of the selected global actuation driver. In one implementation, the display is separated into two banks, with one set of global drivers and global actuation interconnects connected to pixels in the odd-numbered rows while a separate set of global drivers and global actuation interconnects is connected to pixels in the even-numbered rows. In other implementations as many as 6 or 8 separately actuatable addressing banks are employed. Other implementations of circuits for controlling displays are described in U.S. Ser. No. 11/607,715 filed Dec. 1, 2006 and entitled “Circuits for Controlling Display Apparatus,” which is incorporated herein by reference.
  • FIG. 3C illustrates a portion of a direct view display 380 that includes the array of light modulators 320 depicted in FIG. 3B disposed on top of backlight 330. In one implementation the backlight 330 is made of a transparent material, e.g., glass or plastic, and functions as a light guide for evenly distributing light from lamps 382, 384, and 386 throughout the display plane. When assembling the display 380 as a field sequential display, the lamps 382, 384, and 386 can be alternate color lamps, e.g. red, green, and blue lamps respectively.
  • A number of different types of lamps 382-386 can be employed in the displays, including without limitation: incandescent lamps, fluorescent lamps, lasers, or light emitting diodes (LEDs). Further, the lamps 382-386 of direct view display 380 can be combined into a single assembly containing multiple lamps. For instance a combination of red, green, and blue LEDs can be combined with or substituted for a white LED in a small semiconductor chip, or assembled into a small multi-lamp package. Similarly, each lamp can represent an assembly of 4-color LEDs, for instance a combination of red, yellow, green, and blue LEDs.
  • The shutter assemblies 302 function as light modulators. By use of electrical signals from the associated control matrix the shutter assemblies 302 can be set into either an open or a closed state. Only the open shutters allow light from the lightguide 330 to pass through to the viewer, thereby forming a direct view image.
  • In direct view display 380 the light modulators are formed on the surface of substrate 304 that faces away from the light guide 330 and toward the viewer. In other implementations the substrate 304 can be reversed, such that the light modulators are formed on a surface that faces toward the light guide. In these implementations it is sometimes preferable to form an aperture layer, such as aperture layer 322, directly onto the top surface of the light guide 330. In other implementations it is useful to interpose a separate piece of glass or plastic between the light guide and the light modulators, such separate piece of glass or plastic containing an aperture layer, such as aperture layer 322 and associated aperture holes, such as aperture holes 324. It is preferable that the spacing between the plane of the shutter assemblies 302 and the aperture layer 322 be kept as close as possible, preferably less than 10 microns, in some cases as close as 1 micron. Descriptions of other optical assemblies useful for this invention can be found in US Patent Application Publication No. 20060187528A1 filed Sep. 2, 2005 and entitled “Methods and Apparatus for Spatial Light Modulation” and in U.S. Ser. No. 11/528,191 filed Sep. 26, 2006 and entitled “Display Apparatus with Improved Optical Cavities,” which are both incorporated herein by reference.
  • In some displays, color pixels are generated by illuminating groups of light modulators corresponding to different colors, for example, red green and blue. Each light modulator in the group has a corresponding filter to achieve the desired color. The filters, however, absorb a great deal of light, in some cases as much as 60% of the light passing through the filters, thereby limiting the efficiency and brightness of the display. In addition, the use of multiple light modulators per pixel decreases the amount of space on the display that can be used to contribute to a displayed image, further limiting the brightness and efficiency of such a display.
  • The human brain, in response to viewing rapidly changing images, for example, at frequencies of greater than 20 Hz, averages images together to perceive an image which is the combination of the images displayed within a corresponding period. This phenomenon can be utilized to display color images while using only a single light modulator for each pixel of a display, using a technique referred to in the art as field sequential color. The use of field sequential color techniques in displays eliminates the need for color filters and multiple light modulators per pixel. In a field sequential color enabled display, an image frame to be displayed is divided into a number of sub-frame images, each corresponding to a particular color component (for example, red, green, or blue) of the original image frame. For each sub-frame image, the light modulators of a display are set into states corresponding to the color component's contribution to the image. The light modulators then are illuminated by a lamp of the corresponding color. The sub-images are displayed in sequence at a frequency (for example, greater than 60 Hz) sufficient for the brain to perceive the series of sub-frame images as a single image. The data used to generate the sub-frames are often fragmented across various memory components. For example, in some displays, data for a given row of the display are kept in a shift register dedicated to that row. Image data is shifted in and out of each shift register to a light modulator in a corresponding column in that row of the display according to a fixed clock cycle.
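  • The basic field sequential color loop described above can be sketched as follows, assuming a frame stored as rows of (red, green, blue) tuples and hypothetical set_modulators and flash_lamp helpers standing in for the addressing and illumination hardware:

def show_frame_field_sequential(frame, set_modulators, flash_lamp):
    # frame is a list of rows; each pixel is an (r, g, b) tuple.
    for channel, color in enumerate(("red", "green", "blue")):
        # Sub-frame image: this color component's contribution at every pixel.
        sub_frame = [[pixel[channel] for pixel in row] for row in frame]
        set_modulators(sub_frame)   # set modulator states for this color
        flash_lamp(color)           # illuminate with the matching lamp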
  • FIG. 4 is a timing diagram 400 corresponding to a display process for displaying images using field sequential color, which can be implemented according to an illustrative embodiment of the invention, for example, by a MEMS direct-view display described in FIG. 7. The timing diagrams included herein, including the timing diagram 400 of FIG. 4, conform to the following conventions. The top portions of the timing diagrams illustrate light modulator addressing events. The bottom portions illustrate lamp illumination events.
  • The addressing portions depict addressing events by diagonal lines spaced apart in time. Each diagonal line corresponds to a series of individual data loading events during which data is loaded into each row of an array of light modulators, one row at a time. Depending on the control matrix used to address and drive the modulators included in the display, each loading event may require a waiting period to allow the light modulators in a given row to actuate. In some implementations, all rows in the array of light modulators are addressed prior to actuation of any of the light modulators. Upon completion of loading data into the last row of the array of light modulators, all light modulators are actuated substantially simultaneously. One method for such actuation is described further in relation to FIG. 11.
  • Lamp illumination events are illustrated by pulse trains corresponding to each color of lamp included in the display. Each pulse indicates that the lamp of the corresponding color is illuminated, thereby displaying the sub-frame image loaded into the array of light modulators in the immediately preceding addressing event.
  • The time at which the first addressing event in the display of a given image frame begins is labeled on each timing diagram as AT0. In most of the timing diagrams, this time falls shortly after the detection of a voltage pulse vsync, which precedes the beginning of each video frame received by a display. The times at which each subsequent addressing event takes place are labeled as AT1, AT2, . . . AT(n−1), where n is the number of sub-frame images used to display the image frame. In some of the timing diagrams, the diagonal lines are further labeled to indicate the data being loaded into the array of light modulators. For example, in the timing diagrams of FIGS. 4 and 5, D0 represents the first data loaded into the array of light modulators for a frame and D(n−1) represents the last data loaded into the array of light modulators for the frame. In the timing diagrams of FIGS. 10, 12-17 and 19, the data loaded during each addressing event corresponds to a bitplane.
  • As described in further detail in relation to FIGS. 6A-6C, a bitplane is a coherent set of data identifying desired modulator states for modulators in multiple rows and multiple columns of an array of light modulators. Moreover, each bitplane corresponds to one of a series of sub-frame images derived according to a binary coding scheme. That is, each sub-frame image for a color component of an image frame is weighted according to a binary series 1, 2, 4, 8, 16, etc. The bitplane with the lowest weighting is referred to as the least significant bitplane and is labeled in the timing diagrams and referred to herein by the first letter of the corresponding color component followed by the number 0. For each next-most significant bitplane for the color components, the number following the first letter of the color component increases by one. For example, for an image frame broken into 4 bitplanes per color, the least significant red bitplane is labeled and referred to as the R0 bitplane. The next most significant red bitplane is labeled and referred to as R1, and the most significant red bitplane is labeled and referred to as R3.
  • Lamp-related events are labeled as LT0, LT1, LT2 . . . LT(n−1). The lamp-related event times labeled in a timing diagram, depending on the timing diagram, either represent times at which a lamp is illuminated or times at which a lamp is extinguished. The meaning of the lamp times in a particular timing diagram can be determined by comparing their position in time relative to the pulse trains in the illumination portion of the particular timing diagram. Specifically referring back to the timing diagram 400 of FIG. 4, to display an image frame according to the timing diagram 400, a single sub-frame image is used to display each of three color components of an image frame. First, data, D0, indicating modulator states desired for a red sub-frame image are loaded into an array of light modulators beginning at time AT0. After addressing is complete, the red lamp is illuminated at time LT0, thereby displaying the red sub-frame image. Data, D1, indicating modulator states corresponding to a green sub-frame image are loaded into the array of light modulators at time AT1. A green lamp is illuminated at time LT1. Finally, data, D2, indicating modulator states corresponding to a blue sub-frame image are loaded into the array of light modulators and a blue lamp is illuminated at times AT2 and LT2, respectively. The process then repeats for subsequent image frames to be displayed.
  • The level of gray scale achievable by a display that forms images according to the timing diagram of FIG. 4 depends on how finely the state of each light modulator can be controlled. For example, if the light modulators are binary in nature, i.e., they can only be on or off, the display will be limited to generating 8 different colors. The level of gray scale can be increased for such a display by providing light modulators that can be driven into additional intermediate states. In some embodiments related to the field sequential technique of FIG. 4, MEMS light modulators can be provided which exhibit an analog response to applied voltage. The number of grayscale levels achievable in such a display is limited only by the resolution of the digital to analog converters which are supplied in conjunction with data voltage sources, such as voltage source 309.
  • Alternatively, finer grayscale can be generated if the time period used to display each sub-frame image is split into multiple time periods, each having its own corresponding sub-frame image. For example, with binary light modulators, a display that forms two sub-frame images of equal length and light intensity per color component can generate 27 different colors instead of 8. Gray scale techniques that break each color component of an image frame into multiple sub-frame images are referred to, generally, as time division gray scale techniques.
  • FIG. 5 is a timing diagram corresponding to a display process for displaying an image frame by displaying multiple equally weighted sub-frame images per color that can be implemented by various embodiments of the invention. In the timing diagram of FIG. 5, each color component of an image frame is divided into four equally weighted sub-frame images. More particularly, each sub-frame image for a given color component is illuminated for the same amount of time at the same lamp intensity. Thus, the number portion of the data identifier (e.g., R0, R1, or G3) only refers to the order in which the corresponding sub-frame image is displayed, and not to any weighting value. Assuming the light modulators are binary in nature, a display utilizing this grayscale technique can generate 5 gray scale levels per color or 125 distinct colors.
  • More specifically, first, data, R0, indicating modulator states desired for a first red sub-frame image are loaded into an array of light modulators beginning at time AT0. After the light modulators have achieved the states indicated by data R0, the red lamp is illuminated, thereby displaying the first red sub-frame image. The red lamp is extinguished at time AT1, which is when data, R1, indicating modulator states corresponding to the next red sub-frame image are loaded into the array of light modulators. The same steps repeat for each red sub-frame image corresponding to data R1, R2 and R3. The steps as described for the red sub-frame images R0-R3 then repeat for the green sub-frame images G0-G3, and then for the blue sub-frame images B0-B3. The process then repeats for subsequent image frames to be displayed. The addressing times in FIG. 5 can be established through a variety of methods. Since the data is loaded at regular intervals, and since the sub-frame images are illuminated for equal times, a fixed clock cycle running with a frequency 12 times that of the vsync frequency can be sufficient for coordinating the display process.
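  • The gray scale counts quoted above (8 colors with one sub-frame per color, 27 with two, and 125 with four) follow from a short calculation, sketched below under the assumption of binary (on/off) modulators and equally weighted sub-frame images:

def gray_levels_and_colors(subframes_per_color, primaries=3):
    levels = subframes_per_color + 1   # 0 .. N sub-frames illuminated
    return levels, levels ** primaries

print(gray_levels_and_colors(1))   # (2, 8): one sub-frame per color (FIG. 4)
print(gray_levels_and_colors(2))   # (3, 27): two sub-frames per color
print(gray_levels_and_colors(4))   # (5, 125): four sub-frames per color (FIG. 5)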
  • By contrast to the timing diagram shown in FIG. 5, which employs equal weighting for each of 4 sub-frame images per color, other display processes made possible by this invention employ unequal illumination weightings between sub-frame images. Such unequal weightings enable a coded time division gray scale technique wherein much larger numbers of gray scale levels can be displayed with the same number of sub-frame images. Display processes using coded time division gray scale, in some cases, utilize bitplanes to implement a binary weighting scheme of sub-frame images. FIGS. 6A-6C depict a process for generating a bitplane, according to an illustrative embodiment of the invention. FIG. 6A is a schematic diagram of a digital image signal 600 received by a display device. The image signal 600 encodes data corresponding to image frames. For a given image frame encoded in the image signal 600, the image signal 600 includes a series of bits for each pixel included in the image frame. The data is encoded in a pixel-by-pixel fashion. That is, the image signal includes all data for the color of a single pixel in the image frame before it includes data for the next pixel.
  • For example, in FIG. 6A, the data for an image frame begins with a vsync signal indicating the beginning of the image frame. The image signal 600 then includes, for example, 24 bits indicating the color of the pixel in the first row of the first column of the image frame. Of the 24 bits, 8 encode a red component of the pixel, 8 encode a green component, and 8 encode a blue component of the pixel. Each set of eight bits is referred to as a coded word. An eight bit coded word for each color enables a description of 256 unique brightness levels for each color, or 16 million unique combinations of the colors red, green, and blue. Within the coded word, each of the 8 bits represents a particular position or place value (also referred to as a significance value) in the coded word. In FIG. 6A, these place values are indicated by a coding scheme such as R0, R1, R2, R3, etc. R0 represents the least significant bit for the color red. R7 represents the most significant bit for the color red. G7 is the most significant bit for the color green, and B7 is the most significant bit for the color blue. Quantitatively, in binary coding, the place values corresponding to R0, R1, R2, . . . R7 are given by the binary series 2^0, 2^1, 2^2, . . . , 2^7. In other examples, the image signal 600 may include more or fewer bits per color component of an image. For example, the image signal 600 may include 3, 4, 5, 6, 7, 9, 10, 11, 12 or more bits per color component of an image frame.
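  • The decomposition of a single 24-bit pixel into its red, green, and blue coded words and their place values can be illustrated as follows. The 0xRRGGBB packing order used here is an assumption made for the example; the image signal 600 may order the bits differently:

def split_pixel(pixel24):
    red   = (pixel24 >> 16) & 0xFF   # assumed packing: 0xRRGGBB
    green = (pixel24 >> 8) & 0xFF
    blue  = pixel24 & 0xFF
    # Bit i of each coded word carries place value 2**i.
    bits = {f"R{i}": (red >> i) & 1 for i in range(8)}
    bits.update({f"G{i}": (green >> i) & 1 for i in range(8)})
    bits.update({f"B{i}": (blue >> i) & 1 for i in range(8)})
    return red, green, blue, bits

_, _, _, bits = split_pixel(0x80FF01)
assert bits["R7"] == 1 and bits["B0"] == 1   # most/least significant bits set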
  • The data as received in image signal 600 is organized by rows and columns. Generally the image signal provides all of the data for pixels in the first row before proceeding to subsequent rows. Within the first row, all of the data is received for the pixel in the first column before it is received for pixels in succeeding columns of the same row.
  • FIG. 6B is a schematic diagram of a memory buffer 620 useful for converting a received image signal into a bitplane, according to an illustrative embodiment of the invention. As described above, a bitplane includes data for pixels in multiple columns and multiple rows of a display corresponding to a single significance value of a grayscale coded word for a color component of an image frame. To convert a binary coded image signal, such as image signal 600 of FIG. 6A, into bitplanes, bits having the same significance level are grouped together into a single data structure. A small memory buffer 620 is employed to organize incoming image data. The memory buffer 620 is organized in an array of rows and columns, and allows for data to be read in and out by addressing either individual rows or individual columns.
  • Incoming data, which, as described above, is received in a pixel by pixel format, is read into the memory buffer 620 in successive rows. The memory buffer 620 stores data relevant to only a single designated row of the display, i.e. it operates on only a fraction of the incoming data at any given time. Each numbered row within the memory buffer 620 contains the complete pixel data for a given column of the designated display row; that is, each row of the memory buffer 620 contains the complete gray scale data for a single pixel.
  • Once the small memory buffer 620 has been loaded with data for all columns of a given row of the display, the data in the memory buffer 620 can be read out to populate a bitplane data structure. The data is read out column by column. Each column includes a single place value of the gray scale code words of the pixels in the designated row of the display. These values correspond to desired states of light modulators in the display. For example, a 0 may refer to an “open” light modulator state and 1 may refer to a “closed” light modulator state, or vice versa. This process repeats for multiple rows in the display.
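  • The row-in, column-out readout described above can be sketched as follows. The function name and data layout are illustrative only; element i of the returned list is the row of modulator states belonging to the bitplane of significance i:

def row_to_bitplane_rows(row_of_codewords, bits_per_word=8):
    # row_of_codewords holds one color's coded word for each column of the
    # designated display row; element i of the result is that row's
    # contribution to the bitplane of significance i.
    return [[(word >> i) & 1 for word in row_of_codewords]
            for i in range(bits_per_word)]

row = [0, 1, 2, 255]                 # red coded words for four columns
planes = row_to_bitplane_rows(row)
assert planes[0] == [0, 1, 0, 1]     # R0 row (least significant bits)
assert planes[1] == [0, 0, 1, 1]     # R1 row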
  • FIG. 6C is a schematic diagram of portions of two bitplanes 650 and 660, according to an illustrative embodiment of the invention. The first bitplane 650 includes data corresponding to the least significant bits of the gray scale coded words identifying the level of red (i.e., R0 values) for the first 10 columns and 15 rows of pixels of a display. The second bitplane 660 includes data corresponding to the second-least significant bits of the gray scale coded words identifying the level of red (i.e., R1) for the same 10 columns and 15 rows of pixels of the display.
  • Alternative weighting schemes are available within the method of coded time division gray scale. Binary coding is appropriate where the spatial light modulators allow only two states, e.g. open or closed or e.g. on or off. A ternary weighting system is also a possibility. For instance a spatial light modulator and associated control matrix could allow for three unique states of light transmission or reflection. These could be closed, ½ open, and fully open. MEMS-based modulators could, for instance, be constructed where the ½ open and fully open states result from the application of distinct actuation voltages. Alternatively, the ½ open state could be achieved through the actuation of only one out of two equal-area MEMS modulators which are supplied in each pixel.
  • A sub-frame data set will refer herein to the general case of data structures which are not necessarily bitplanes: namely data structures that store information about the desired states of modulators in multiple rows and multiple columns of the array. For the case of ternary coding a single sub-frame data set would include a ternary number value for each of the pixels in multiple rows and columns, e.g. a 0, 1, or 2. Sequential sub-frame images according to a ternary coding scheme would be weighted according to the base-3 numbering system, with weights in the series 1, 3, 9, 27, etc. Compared to a binary coding system, a ternary coding system makes possible even greater numbers of achievable gray scale levels when displayed using an equal number of sub-frame images. By extension, as MEMS pixels or modulators are developed capable of 4 or 5 unique modulation states at each pixel, the use of quaternary or base-5 coding systems becomes advantageous in the control system.
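  • As an illustration of ternary coding, the following sketch decomposes a gray level into base-3 digits weighted 1, 3, 9, 27, with each digit selecting one of three assumed modulator states (closed, ½ open, or fully open) for the corresponding sub-frame image:

def to_ternary_digits(level, num_subframes):
    digits = []
    for _ in range(num_subframes):
        digits.append(level % 3)    # modulator state for this sub-frame
        level //= 3
    return digits                   # least significant digit first

def from_ternary_digits(digits):
    return sum(d * (3 ** i) for i, d in enumerate(digits))

assert to_ternary_digits(17, 4) == [2, 2, 1, 0]   # 2*1 + 2*3 + 1*9 = 17
assert from_ternary_digits([2, 2, 1, 0]) == 17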
  • FIG. 7 is a block diagram of a direct-view display 700, according to an illustrative embodiment of the invention. The direct-view display 700 includes an array of light modulators 702, a controller 704, a set of lamps 706, and driver sets 708, 710, 714, and 716. The array of light modulators 702 includes light modulators arranged in rows and columns. Suitable light modulators include, without limitation, any of the MEMS-based light modulators described above in relation to FIGS. 2A-2D. In one implementation, the array of light modulators 702 takes the form of the array of light modulators 320 depicted in FIG. 3B. The light modulators are controlled by a control matrix, such as the control matrices described in relation to FIGS. 3A and 3D.
  • In general, the controller 704 receives an image signal 717 from an external source and outputs data and control signals to the drivers 708, 710, 714, and 716 to control the light modulators in the array of light modulators 702 and the lamps 706. The order in which the data and control signals are output is referred to herein as an “output sequence,” described further below. More particularly, the controller 704 includes an input processing module 718, a memory control module 720, a frame buffer 722, a timing control module 724, and a schedule table store 726.
  • A module may be implemented as a hardware circuit including application specific integrated circuits, custom VLSI circuits or gate arrays, off-the-shelf semiconductors such as logic chips, memories, transistors, or other discrete components. A module may also be implemented in programmable hardware devices such as field programmable gate arrays, programmable array logic, programmable logic devices or the like.
  • Modules may also be implemented in software for execution by various types of processors. An identified module of executable code may, for instance, include one or more physical or logical blocks of computer instructions which may, for instance, be organized as an object, procedure, or function. Nevertheless, the executables of an identified module need not be physically located together, but may include disparate instructions stored in different locations which, when joined logically together, make up the module and achieve the stated purpose for the module.
  • Indeed, a module of executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices. Similarly, operational data may be identified and illustrated herein within modules, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices, and may exist, at least partially, merely as electronic signals on a system or network.
  • The illustration of direct view display 700 in FIG. 7 portrays the controller 704 and drivers 708, 710, 714, and 716 as separate functional blocks. These blocks are understood to represent distinguishable circuits and/or modules of executable code. In some implementations the blocks 704, 708, 710, 714, and 716 may be provided as distinct chips or circuits which are connected together by means of circuit boards and/or cables. In other implementations, several of these blocks can be designed together into a single semiconductor chip such that their boundaries are nearly indistinguishable except by function. In some implementations the storage area referred to as frame buffer 722 is provided as a functional area within a custom design of the controller circuit 704. In other implementations the frame buffer 722 is represented by a separate off-the-shelf memory chip such as a DRAM or SRAM.
  • The input processing module 718 receives the image signal 717 and processes the data encoded therein into a format suitable for displaying via the array of light modulators 702. The input processing module 718 takes the data encoding each image frame and converts it into a series of sub-frame data sets. A sub-frame data set includes information about the desired states of modulators in multiple rows and multiple columns of the array of light modulators 702 aggregated into a coherent data structure. The number and content of sub-frame data sets used to display an image frame depend on the grayscale technique employed by the controller 704. For example, the number and content of the sub-frame data sets needed to form an image frame using a coded time division gray scale technique differ from those of the sub-frame data sets used to display an image frame using a non-coded time division gray scale technique. While in various embodiments, the input processing module 718 may convert the image signal 717 into non-coded sub-frame data sets, ternary coded sub-frame data sets, or other forms of coded sub-frame data sets, preferably, the input processing module 718 converts the image signal 717 into bitplanes, as described above in relation to FIGS. 6A-6C.
  • In addition to and in many cases prior to deriving the sub-frame data sets, the input processing module can carry out a number of other optional processing tasks. It may re-format or interpolate incoming data. For instance, it may rescale incoming data horizontally, vertically, or both, to fit within the spatial-resolution limits of modulator array 702. It may also convert incoming data from an interlaced format to a progressive scan format. It may also resample the incoming data in time to reduce frame rates while maintaining acceptable flicker within the characteristics of MEMS display 700. It may perform adjustments to contrast gradations of the incoming data, in some cases referred to as gamma corrections, to better match the gamma characteristics and/or contrast precision available in the MEMS display 700. It may alter the grayscale levels assigned between neighboring pixels (spatial dithering) and/or assigned between succeeding image frames (temporal dithering) to enhance the gray scale precision available in the display. And it may perform adjustments to color values expressed in the pixel data. In one instance of color adjustment, the data is transformed to match the color coordinates of the lamps 706 used in display 700. For embodiments where four or more distinct-color lamps are employed, the input processing module will transform the data from an incoming 3-color space and map it to coordinates appropriate to the 4-color space.
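  • As one example of these optional processing tasks, a gamma adjustment can be applied to incoming 8-bit values through a lookup table, as in the sketch below. The gamma value is an assumption chosen for illustration; a practical input processing module would combine such a step with rescaling, dithering, and color transformations:

GAMMA = 2.2   # assumed display gamma

gamma_lut = [round(255 * ((v / 255) ** GAMMA)) for v in range(256)]

def gamma_correct_row(row):
    return [gamma_lut[v] for v in row]

print(gamma_correct_row([0, 64, 128, 255]))   # [0, 12, 56, 255]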
  • The input processing module 718 outputs the sub-frame data sets to the memory control module 720. The memory control module 720 then stores the sub-frame data sets in the frame buffer 722. The frame buffer is preferably a random access memory, although other types of serial memory can be used without departing from the scope of the invention. The memory control module 720, in one implementation, stores the sub-frame data set in a predetermined memory location based on the color and significance in a coding scheme of the sub-frame data set. In other implementations, the memory control module 720 stores the sub-frame data set in a dynamically determined memory location and stores that location in a lookup table for later identification. In one particular implementation, the frame buffer 722 is configured for the storage of bitplanes.
  • The memory control module 720 is also responsible for, upon instruction from the timing control module 724, retrieving sub-image data sets from the frame buffer 722 and outputting them to the data drivers 708. The data drivers 708 load the data output from the memory control module 720 into the light modulators of the array of light modulators 702. The memory control module 720 outputs the data in the sub-image data sets one row at a time. In one implementation, the frame buffer 722 includes two buffers, whose roles alternate. While the memory control module 720 stores newly generated bitplanes corresponding to a new image frame in one buffer, it extracts bitplanes corresponding to the previously received image frame from the other buffer for output to the array of light modulators 702. Both buffer memories can reside within the same circuit, separated only by address.
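  • The alternating use of the two buffers can be sketched as follows. The class and method names are illustrative only; in practice both buffers may simply be address ranges within a single memory circuit:

class DoubleBuffer:
    def __init__(self):
        self.buffers = [{}, {}]   # bitplane identifier -> bitplane data
        self.write_index = 0      # half currently receiving new bitplanes

    def store(self, bitplane_id, data):    # called while a new frame arrives
        self.buffers[self.write_index][bitplane_id] = data

    def retrieve(self, bitplane_id):       # read out the previous frame
        return self.buffers[1 - self.write_index][bitplane_id]

    def swap(self):                        # once per image frame (vsync)
        self.write_index = 1 - self.write_index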
  • The timing control module 724 manages the output by the controller 704 of data and command signals according to an output sequence. The output sequence includes the order and timing with which sub-frame data sets are output to the array of light modulators 702 and the timing and character of illumination events. The output sequence, in some implementations, also includes global actuation events. At least some of the parameters that define the output sequence are stored in volatile memory. This volatile memory is referred to as schedule table store 726. A table including the data stored in the schedule table store 726 is referred to herein as a “schedule table” or alternately as a “sequence table”. The data stored therein need not actually be stored in table format. Conceptually, the data stored in the schedule table store 726 is easier for a human to understand if displayed in table format. The actual data structure used to store output sequence data can be, for example, a series of bit strings. Each string of bits includes a series of coded words corresponding to timing values, memory addresses, and illumination data. An illustrative data structure for storing output sequence parameters is described further in relation to FIG. 24. Other data structures may be employed without departing from the scope of the invention.
  • Some output sequence parameters may be stored as hardwired logic in the timing control module 724. For example, the logic incorporated into the timing control module to wait until a particular event time may be expressed as follows:
  • mycount <= mycount + 1;        -- counter increments at every clock cycle
    if mycount = 1324 then         -- compare against the stored timing value
      triggersignal <= '1';        -- assert the trigger signal
    else
      triggersignal <= '0';
    end if;

    This logic employs a counter which increments at every clock cycle. When the clock counter reaches the timing value 1324 a trigger signal is sent. For example, the trigger signal may be sent to the memory control module 720 to initiate the loading of a bitplane into the modulators. Or, the trigger signal could be sent to the lamp drivers 716 to switch a lamp on or off. In the example above, the logic takes the form of logic circuitry built directly into the timing control module 724. The particular timing parameter 1324 is a scalar value contained within the command sequence. In another implementation of timing control module 724, the logic does not include a specific value for a number of clock pulses to wait, but refers instead to one of a series of timing values which are stored in schedule table store 726.
  • The output sequence parameters stored in the schedule table store 726 vary in different embodiments of the invention. In one embodiment, the schedule table store 726 stores timing values associated with each sub-frame data set. For example, the schedule table store 726 may store timing values associated with the beginning of each addressing event in the output sequence, as well as timing values associated with lamp illumination and/or lamp extinguishing events. In other embodiments, the schedule table store 726 stores lamp intensity values instead of or in addition to timing values associated with addressing events. In various embodiments, the schedule table store 726 stores an identifier indicating where each sub-image data set is stored in the frame buffer 722, and illumination data indicating the color or colors associated with each respective sub-image data set.
  • The nature of the timing values stored in the schedule table store 726 can vary depending on the specific implementation of the controller 704. The timing value, as stored in the schedule table store 726, in one implementation, is a number of clock cycles, which for example, have passed since the initiation of the display of an image frame, or since the last addressing or lamp event was triggered. Alternatively, the timing value may be an actual time value, stored in microseconds or milliseconds.
  • Table 1 is an illustrative schedule table illustrating parameters suitable for storage in the schedule table store 726 for use by the timing control module 724. Several additional illustrative schedule tables are described in further detail in relation to FIGS. 13, 14A-B, 15-17 and 19.
  • TABLE 1
    Schedule Table 1

    Field                                    1     2     3     4     5     6     7    - - -  n − 1       n
    addressing time                          AT0   AT1   AT2   AT3   AT4   AT5   AT6  - - -  AT(n − 1)   ATn
    memory location of sub-frame data set    M0    M1    M2    M3    M4    M5    M6   - - -  M(n − 1)    Mn
    lamp ID                                  R     R     R     R     G     G     G    - - -  B           B
    lamp time                                LT0   LT1   LT2   LT3   LT4   LT5   LT6  - - -  LT(n − 1)   LTn
  • The Table 1 schedule table includes two timing values for each sub-frame data set, an addressing time and a lamp illumination time. The addressing times AT0-AT(n−1) are associated with times at which the memory control module 720 outputs a respective sub-frame data set, in this case a bitplane, to the array of light modulators 702. The lamp illumination times LT0-LT(n−1) are associated with times at which corresponding lamps are illuminated. In fact, each time value in the schedule table may trigger more than one event. For example, in some grayscale techniques, lamp activity is synchronized with the actuation of the light modulators to avoid illuminating the light modulators while they are not in an addressed state. Thus, in some implementations, the addressing times AT not only trigger addressing events, they also trigger lamp extinguishing events. Similarly, in other implementations, lamp extinguishing events also trigger addressing events.
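  • A schedule table of the kind shown in Table 1, and the way a timing control module might step through it, can be sketched as follows. The tuple layout, clock model, and callback names are assumptions made for the example; as noted above, the stored data may instead be packed bit strings:

schedule = [
    # (addressing time, memory location, lamp ID, lamp time)
    ("AT0", "M0", "red", "LT0"),
    ("AT1", "M1", "red", "LT1"),
    # ... one entry per sub-frame data set, through AT(n-1)/LT(n-1)
]

def run_output_sequence(schedule, wait_until, load_bitplane, set_lamp):
    for addr_time, mem_loc, lamp, lamp_time in schedule:
        wait_until(addr_time)
        set_lamp(None)            # extinguish before modulators change state
        load_bitplane(mem_loc)    # addressing event for this sub-frame
        wait_until(lamp_time)
        set_lamp(lamp)            # illuminate the completed sub-frame image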
  • The address data, labeled in the table as “memory location of sub-frame data set,” in the schedule table can be stored in a number of forms. For example, in one implementation, the address is a specific memory location in the frame buffer of the beginning of the corresponding bitplane, referenced by buffer, column, and row numbers. In another implementation, the address stored in the schedule table store 726 is an identifier for use in conjunction with a look up table maintained by the memory control module 720. For example, the identifier may have a simple 6-bit binary “xxxxxx” word structure where the first 2 bits identify the color associated with the bitplane, while the next 4 bits refer to the significance of the bitplane. The actual memory location of the bitplane is then stored in a lookup table maintained by the memory control module 720 when the memory control module 720 stores the bitplane into the frame buffer. In other implementations the memory locations for bitplanes in the output sequence may be stored as hardwired logic within the timing control module 724.
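  • An identifier of the kind described above might be packed and unpacked as in the following sketch, which assumes the 2 color bits occupy the high positions and the 4 significance bits occupy the low positions of the 6-bit word:

COLORS = {"red": 0, "green": 1, "blue": 2}   # assumed 2-bit color codes

def make_identifier(color, significance):
    return (COLORS[color] << 4) | (significance & 0xF)

def parse_identifier(ident):
    color = {v: k for k, v in COLORS.items()}[(ident >> 4) & 0x3]
    return color, ident & 0xF

assert parse_identifier(make_identifier("green", 3)) == ("green", 3)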
  • The timing control module 724 may retrieve schedule table entries using several different methods. In one implementation the order of entries in the schedule table is fixed; the timing control module 724 retrieves each entry in order until reaching a special entry that designates the end of the sequence. Alternatively, a sequence table entry may contain codes that direct the timing control module 724 to retrieve an entry which may be different from the next entry in the table. These additional fields may incorporate the ability to perform jumps, branches, and looping in analogy with the control features of a standard microprocessor instruction set. Such flow control modifications to the operation of the timing control module 724 allow a reduction in the size of the sequence table.
  • The direct-view display 700 also includes a programming link 730. The programming link 730 provides a means by which the schedule table store 726 may be modified by external circuits or computers. In other embodiments the programming link connects directly to a system processor within the same housing as the direct view display 700. The system processor may be programmed to alter the schedule table store in accordance with the type of image or data to be displayed by display 700. The external processor, using the programming link 730, can modify the parameters stored in the schedule table store 726 to alter the output sequence used by the controller 704. For example, the programming link 730 can be used to change the timing parameters stored in the schedule table store 726 to accommodate different frame rates. The timing parameters associated with each bitplane and the number of bitplanes displayed can be modified by the programming link 730 to adjust the number of colors or grayscale levels the display can provide. Average brightness can be adjusted by changing lamp intensity values. Color saturation can be modified by the programming link by altering the percentage of brightness formed using a white color field or by adjusting color mixing (described further in relation to FIG. 17).
  • The direct-view display includes a set of lamps 706 for illuminating the array of light modulators 702. In one implementation, the direct-view display 700 includes a red lamp, a green lamp, and a blue lamp. In another implementation, the direct-view display 700 also includes a white lamp. In still another implementation, the direct-view display 700 includes multiple lamps for each color spaced along a side of the array of light modulators 702.
  • In addition to the red, green, blue, white color combination, other lamp combinations are possible which expand the space or gamut of achievable colors. A useful 4-color lamp combination with expanded color gamut is red, blue, true green (about 520 nm) plus parrot green (about 550 nm). Another 5-color combination which expands the color gamut is red, green, blue, cyan, and yellow. A 5-color analogue to the well known YIQ color space can be established with the lamps white, orange, blue, purple, and green. A 5-color analog to the well known YUV color space can be established with the lamps white, blue, yellow, red, and cyan.
  • Other lamp combinations are possible. For instance, a useful 6-color space can be established with the lamp colors red, green, blue, cyan, magenta, and yellow. A 6-color space can also be established with the colors white, cyan, magenta, yellow, orange, and green. A large number of other 4-color and 5-color combinations can be derived from amongst the colors already listed above. Further combinations of 6, 7, 8 or 9 lamps with different colors can be produced from the colors listed above. Additional colors may be employed using lamps with spectra which lie in between the colors listed above.
  • The direct-view display 700 also includes a number of sets of driver circuits 708, 710, 714, and 716 controlled by, and in electrical communication with, the various components of the controller 704. The direct-view display 700 includes a set of scan drivers 708 for write-enabling each of the rows of the array of light modulators in sequence. The scan drivers 708 are controlled by, and in electrical communication with, the timing control module 724. Data drivers 710 are in electrical communication with the memory control module 720. The direct-view display 700 may include one data driver 710 for each column in the array of light modulators 702, or it may have some smaller number of data drivers 710, each responsible for loading data into multiple columns of the array of light modulators 702.
  • The direct-view display 700 includes a series of common drivers 714, including global actuation drivers, actuation voltage drivers, and, in some embodiments, additional common voltage drivers. Common drivers 714 are in electrical communication with the timing control module 724 and light modulators in multiple rows and multiple columns of the array of light modulators 702.
  • The lamps 706 are driven by lamp drivers 716. The lamps may be in electrical communication with the memory control module 720 and/or the timing control module 724. The timing control module 724 controls the timing of the illumination of the lamps 706. Illumination intensity information may also be supplied by the timing control module 724, or it may be supplied by the memory control module 720.
  • Some electronic devices employing displays according to this invention employ variations on the design of controller 704. For such displays the controller does not include an input processing module or a frame buffer. Instead the system processor attached to the electronic device provides a pre-formatted output sequence of bitplanes for display by the controller, drivers, and the array of MEMS light modulators. In such a display the timing control module coordinates the output of bitplane data for the array of modulators and controls the illumination of the lamps associated with each bitplane. The timing control module may make reference to a schedule table store, within which are stored timing values for addressing and lamp events and/or lamp intensities associated with each of the bitplanes.
  • FIG. 8 is a flow chart of a method of displaying video 800 (the “display method 800”) suitable for use by a direct-view display such as the direct-view display 700 of FIG. 7, according to an illustrative embodiment of the invention. Referring to FIGS. 7 and 8, the display method 800 begins with the provision of an array of light modulators (step 801), such as the array of light modulators 702. Then, the display method 800 proceeds with two interrelated processes, which operate in parallel. The first process is referred to herein as an image processing process 802 of the display method 800. The second process is referred to as a display process 804.
  • The image processing process 802 begins with the receipt of an image signal (step 806) by the input processing module 718. As described above, the image signal encodes one or more image frames for display on the direct-view display 700. In one embodiment, the image signal is received as indicated in FIG. 6A. That is, data for each pixel is received sequentially, pixel-by-pixel, row-by-row. The data for a given pixel includes one or more bits for each color component of the pixel.
  • Upon receipt of data for an image frame (step 806), the controller 704 of the direct-view display 700 derives a plurality of sub-frame data sets for the image frame (step 808). Preferably, the input processing module 718 of the controller 704 derives a plurality of bitplanes based on the data in the image signal 717 as described above in relation to FIGS. 6A-6C. The image processing process continues at step 810, wherein the sub-frame data sets are stored in memory. Preferably the bitplanes are stored in the frame buffer 722, according to address information that allows them to be randomly accessed at later points in the process.
  • The display process 804 begins with the initiation of the display of an image frame (step 812), for example, in response to the detection of a vsync pulse in the input signal 717. Then, the first sub-frame data set corresponding to the image frame is output by the memory control module 720 (step 814) to the array of light modulators 702 in an addressing event. The memory address of this first sub-frame data set is determined based on data in the schedule table store 726. Preferably, the sub-frame data set is a bitplane. After the modulators addressed in the first sub-frame data set achieve the states indicated in the sub-frame data set, the lamp or lamps corresponding to the sub-frame data set loaded into the light modulators are illuminated (step 816). The time at which the lamp is illuminated may be governed by a timing value stored in the schedule table store 726 associated with the sub-frame image. The lamp remains illuminated until the next time the light modulators in the array of light modulators begin to change state, at which time the lamp is extinguished. The extinguishing time, too, may be determined based on a time value stored in the schedule table store 726. Depending on the addressing technique implemented by the controller 704, the extinguishing time may be before or after the next addressing event begins.
  • After the array of light modulators is illuminated, but not necessarily before or at the same time as the lamp is extinguished, the controller 704 determines, based on the output sequence, whether the recently displayed sub-frame image is the last sub-frame image to be displayed for the image frame (decision block 818). If it is not the last sub-frame image, the next sub-frame data set is loaded into the array of light modulators 702 in another addressing event (step 814). If the recently displayed sub-frame image is the last sub-frame image of an image frame, the controller 704 awaits the initiation of the display of a subsequent image frame (step 812).
  • FIG. 9 is a more detailed flow chart of an illustrative display process 900 suitable for use as part of the display method 800 for displaying images on the direct-view display 700. In the display process 900, the sub-frame data sets employed by the direct-view display are bitplanes. The display process 900 begins with the initiation of the display of an image frame (step 902). For example, the display of an image frame may be initiated (step 902) in response to the detection by the controller 704 of a vsync pulse in the image signal 717. Next, the bitplane corresponding to the image frame is output by the controller 704 to the array of light modulators 702 (step 904). Each row of the sub-frame data set is loaded sequentially. As each row is addressed, the controller 704 waits a sufficient amount of time to ensure the light modulators in the respective row actuate before beginning to address the next row in the array of light modulators 702. During this time, as states of the light modulators in the array of light modulators 702 are in flux, the lamps of the direct-view display 700 remain off.
  • After waiting a sufficient amount of time to ensure all rows of the array of light modulators 702 have actuated according to the data in the bitplane, the lamp 706 of the color corresponding to the bitplane is illuminated (step 906), thereby displaying the sub-frame image corresponding to the bitplane loaded into the array of light modulators 702. In one implementation, this waiting time is stored in the schedule table store 726. In other implementations, this waiting time is a fixed value hardwired into the timing control module 724 as a number of clock cycles following the beginning of an addressing event.
  • The controller then waits a time stored in the schedule table store 726 associated with the sub-frame image before extinguishing the lamp (step 908). At decision block 910, the controller 704 determines whether the most recently displayed sub-frame image is the last sub-frame image of the image frame being displayed. If the most recently displayed sub-frame image is the last sub-frame image for the image frame, the controller awaits the initiation of the display of a subsequent image frame (step 902). If it is not the last sub-frame image for the image frame, the controller 704 begins loading the next bitplane (step 904) into the array of light modulators 702. This addressing event may be triggered directly by the extinguishing of the lamp at step 908, or it may begin after a time associated with a timing value stored in the schedule table store 726 passes.
  • FIG. 10 is a timing diagram 1000 that corresponds to an implementation of the display process 900 that utilizes an output sequence having as parameters the values stored in the Table 1 schedule table. The timing diagram 1000 corresponds to a coded-time division grayscale display process in which image frames are displayed by displaying four sub-frame images for each of three color components (red, green, and blue) of the image frame. Each sub-frame image displayed of a given color is displayed at the same intensity for half as long a time period as the prior sub-frame image, thereby implementing a binary weighting scheme for the sub-frame images.
  • The display of an image frame begins upon the detection of a vsync pulse. As indicated on the timing diagram and in the Table 1 schedule table, the first sub-frame data set R3, stored beginning at memory location M0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT0. According to the Table 1 schedule table, the red lamp is then illuminated at time LT0. LT0 is selected such that it occurs after each of the rows in the array of light modulators 702 has been addressed, and the light modulators included therein have actuated. At time AT1, the controller 704 of the direct-view display both extinguishes the red lamp and begins loading the subsequent bitplane, R2, into the array of light modulators 702. According to the Table 1 schedule table, this bitplane is stored beginning at memory location M1. The process repeats until all bitplanes identified in the Table 1 schedule table have been displayed. For example, at time AT4, the controller 704 extinguishes the red lamp and begins loading the most significant green bitplane, G3, into the array of light modulators 702. Similarly, at time LT6, the controller 704 turns on the green lamp until time AT7, at which time it is extinguished again.
  • The time period between vsync pulses in the timing diagram is indicated by the symbol FT, indicating a frame time. In some implementations the addressing times AT0, AT1, etc. as well as the lamp times LT0, LT1, etc. are designed to accomplish 4 sub-frame images per color within a frame time FT of 16.6 milliseconds, i.e. according to a frame rate of 60 Hz. In other implementations the time values stored in schedule table store 726 can be altered to accomplish 4 sub-frame images per color within a frame time FT of 33.3 milliseconds, i.e. according to a frame rate of 30 Hz. In other implementations frame rates as low as 24 Hz may be employed or frame rates in excess of 100 Hz may be employed.
  • In the particular implementation of coded time division gray scale illustrated by timing diagram 1000, the controller outputs 4 sub-frame images to the array 702 of light modulators for each color to be displayed. The illumination of each of the 4 sub-frame images is weighted according to the binary series 1, 2, 4, 8. The display process in timing diagram 1000, therefore, displays a 4-digit binary word for gray scale in each color, that is, it is capable of displaying 16 distinct gray scale levels for each color, despite the loading of only 4 sub-images per color. Through combinations of the colors, the implementation of timing diagram 1000 is capable of displaying more than 4000 distinct colors.
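As a worked example of this coding, the following sketch decomposes a single-color gray level into the four binary-weighted sub-frame images described above; the helper bit_weights is purely illustrative.

```python
# Decompose a 4-bit gray level (0-15) into the binary-weighted sub-frame images
# of the coded time division scheme: illumination periods weighted 8:4:2:1.
def bit_weights(gray_level, n_bits=4):
    """Return (weight, pixel_open) pairs, most significant sub-frame first."""
    return [(1 << b, bool(gray_level & (1 << b))) for b in reversed(range(n_bits))]

plan = bit_weights(11)                       # [(8, True), (4, False), (2, True), (1, True)]
assert sum(w for w, on in plan if on) == 11  # the open sub-frames sum to the gray level

# 16 levels per color over three colors gives 16 ** 3 = 4096 (> 4000) distinct colors.
print(plan, 16 ** 3)
```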
  • In other implementations of display process 800 the sub-frame images in the sequence of sub-frame images need not be weighted according to the binary series 1, 2, 4, 8, etc. As mentioned above, the use of base-3 weighting can be useful as a means of expressing sub-frame data sets derived from a ternary coding scheme. Still other implementations employ a mixed coding scheme. For instance the sub-frame images associated with the least significant bits may be derived and illuminated according to a binary weighting scheme, while the sub-frame images associated with the most significant bits may be derived and illuminated with a more linear weighting scheme. Such a mixed coding helps to reduce the large differences in illumination periods for the most significant bits and is helpful in reducing image artifacts such as dynamic false contouring.
  • FIG. 11 is a more detailed flow chart of an illustrative display process 1100 suitable for use as part of the display method 800 for displaying images on the direct-view display 700. As in the display process 900, the display process 1100 utilizes bitplanes for sub-frame data sets. In contrast to display process 900, however, display process 1100 includes a global actuation functionality. In a display utilizing global actuation, pixels in multiple rows and multiple columns of the display are addressed before any of the actuators actuate. In the display process 1100, all rows of the display are addressed prior to actuation. Thus, while in display process 900, a controller must wait a certain amount of time after loading data into each row of light modulators to allow sufficient time for the light modulators to actuate, in display process 1100, the controller need only wait this “actuation time” once, after all rows have been addressed. One control matrix capable of providing a global actuation functionality is described above in relation to FIG. 3D.
  • Display process 1100 begins with the initiation of the display of a new image frame (step 1102). Such an initiation may be triggered by the detection of a vsync voltage pulse in the image signal 717. Then, at a time stored in the schedule table store 726 after the initiation of the display process for the image frame, the controller 704 begins loading the first bitplane into the light modulators of the array of light modulators 702 (step 1104).
  • At step 1106, any lamp currently illuminated is extinguished. Step 1106 may occur at or before the loading of a particular bitplane (step 1104) is completed, depending on the significance of the bitplane. For example, in some embodiments, to maintain the binary weighting of bitplanes with respect to one another, some bitplanes may need to be illuminated for a time period that is less than the amount of time it takes to load the next bitplane into the array of light modulators 702. Thus, a lamp illuminating such a bitplane is extinguished while the next bitplane is being loaded into the array of light modulators (step 1104). To ensure that lamps are extinguished at the appropriate time, in one embodiment, a timing value is stored in the schedule table store 726 to indicate the appropriate light extinguishing time.
  • When the controller 704 has completed loading a given bitplane into the array of light modulators 702 (step 1104) and extinguished any illuminated lamps (step 1106), the controller 704 issues a global actuation command (step 1108) to a global actuation driver, causing all of the light modulators in the array of light modulators 702 to actuate at substantially the same time. Global actuation drivers represent a type of common driver 714 included as part of display 700. The global actuation drivers may connect to modulators in the array of light modulators, for instance, by means of global actuation interconnects such as interconnect 354 of control matrix 340.
  • In some implementations the step 1108, globally actuate, includes a series of steps or commands issued by the timing control module 724. For instance, in certain control matrices described in co-pending U.S. Ser. No. 11/607,715, the global actuation step may involve a (first) charging of shutter mechanisms by means of a charging interconnect, followed by a (second) driving of a shutter common interconnect toward ground potential (at which point all commonly connected light modulators move into their closed state), followed, after a constant waiting period for shutter actuation, by a (third) grounding of the global actuation interconnect (at which point only selected shutters move into their designated open states). Each of the charging interconnects, shutter common interconnects, and global actuation interconnects is connected to a separate driver circuit, responsive to trigger signals sent at the appropriate times according to timing values stored in the timing control module 724.
  • After waiting the actuation time of the light modulators, the controller 704 issues an illumination command (step 1110) to the lamp drivers to turn on the lamp corresponding to the recently loaded bitplane. The actuation time is the same for each bitplane loaded, and thus need not be stored in the schedule table store 726. It can be permanently stored in the timing control module 724 in hardware, firmware, or software.
  • After the lamp corresponding to the bitplane is illuminated (step 1110), at decision block 1112, the controller 704 determines, based on the output sequence, whether the currently loaded bitplane is the last bitplane for the image frame to be displayed. If so, the controller 704 awaits initiation of the display of the next image frame (step 1102). Otherwise, the controller 704 begins loading the next bitplane into the array of light modulators 702.
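A minimal sketch of the global-actuation loop of display process 1100, under the same caveats as before (hypothetical helpers, and lamp extinguishing simplified to always occur just before global actuation):

```python
import time

ACTUATION_TIME_S = 0.0001  # fixed for display process 1100; not stored in the schedule table

def load_bitplane(memory_location):
    """Stand-in for the addressing event (step 1104); nothing actuates yet."""

def global_actuate():
    """Stand-in for the global actuation command sent to the common driver (step 1108)."""

def set_lamp(color, on):
    """Stand-in for the lamp driver command (steps 1106 and 1110)."""

def display_frame_global(schedule):
    """One frame of display process 1100; schedule entries are (memory_location, color)."""
    lit = None
    for mem, color in schedule:
        load_bitplane(mem)            # step 1104: address all rows
        if lit is not None:
            set_lamp(lit, False)      # step 1106: extinguish the previous lamp
        global_actuate()              # step 1108: all modulators move at substantially the same time
        time.sleep(ACTUATION_TIME_S)  # wait once for actuation, after all rows are addressed
        set_lamp(color, True)         # step 1110: illuminate the newly loaded sub-frame image
        lit = color
    if lit is not None:
        set_lamp(lit, False)          # end of frame; await the next vsync (step 1102)

display_frame_global([("M0", "red"), ("M1", "red")])
```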
  • FIG. 12 is a timing diagram 1200 that corresponds to an implementation of the display process 1100 that utilizes an output sequence having as parameters the values stored in the Table 1 schedule table. While the display processes corresponding to FIGS. 10 and 12 utilize similar stored parameters, their operation is quite different. Similar to the display process corresponding to timing diagram 1000 of FIG. 10, the display process corresponding to timing diagram 1200 uses a coded-time division grayscale addressing process in which image frames are displayed by displaying four sub-frame images for each of three color components (red, green, and blue) of the image frame. Each sub-frame image displayed of a given color is displayed at the same intensity for half as long a time period as the prior sub-frame image, thereby implementing a binary weighting scheme for the sub-frame images. However, the display process corresponding to timing diagram 1200 differs from the timing diagram 1000 in that it incorporates the global actuation functionality described in the display process 1100. As such, the lamps in the display are illuminated for a significantly greater portion of the frame time. The display can therefore either display brighter images, or it can operate its lamps at lower power levels while maintaining the same brightness level. As brightness and power consumption are not linearly related, the lower illumination level operating mode, while providing equivalent image brightness, consumes less energy.
  • More specifically, the display of an image frame in timing diagram 1200 begins upon the detection of a vsync pulse. As indicated on the timing diagram and in the Table 1 schedule table, the bitplane R3, stored beginning at memory location M0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT0. Once the controller 704 outputs the last row of data for a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command. After waiting the actuation time, the controller 704 causes the red lamp to be illuminated. As indicated above, since the actuation time is a constant for all sub-frame images, no corresponding time value needs to be stored in the schedule table store 726 to determine this time. At time AT1, the controller 704 begins loading the subsequent bitplane R2, which, according to the schedule table, is stored beginning at memory location M1, into the array of light modulators 702.
  • Lamp extinguishing event times LT0-LT11 occur at times stored in the schedule table store 726. The times may be stored in terms of clock cycles following the detection of a vsync pulse, or they may be stored in terms of clock cycles following the beginning of the loading of the previous bitplane into the array of light modulators 702. For bitplanes which are to be illuminated for a period longer than the time it takes to load a bitplane into the array of light modulators 702, the lamp extinguishing times are set in the schedule table to coincide with the completion of an addressing event corresponding to the subsequent sub-frame image. For example, LT0 is set to occur at a time after AT0 which coincides with the completion of the loading of bitplane R2. LT1 is set to occur at a time after AT1 which coincides with the completion of the loading of bitplane R1.
  • Some bitplanes, such as R0, G0, and B0, however, are intended to be illuminated for a period of time that is less than the amount of time it takes to load a bitplane into the array. Thus, LT3, LT7, and LT11 occur in the middle of subsequent addressing events.
  • In alternate implementations the sequence of lamp illumination and data addressing can be reversed. For instance the addressing of bitplanes corresponding to the subsequent sub-frame image can follow immediately upon the completion of a global actuation event, while the illumination of a lamp can be delayed until a lamp illumination event at some point after the addressing has begun.
  • FIG. 13 is a timing diagram 1300 that corresponds to another implementation of the display process 1100 that utilizes a table similar to Table 2 as a schedule table. The timing diagram 1300 corresponds to a coded-intensity grayscale addressing process similar to that described with respect to FIG. 5 in that each sub-frame image for a given color component (red, green, and blue) is illuminated for the same amount of time. However, in contrast to the display process depicted in FIG. 5, in the display process corresponding to timing diagram 1300, each sub-frame image of a particular color component is illuminated at half the intensity as the prior sub-frame image of the color component, thereby implementing a binary weighting scheme without varying lamp illumination times.
  • TABLE 2
    Schedule Table 2
                                            Field 1   Field 2   Field 3   Field 4   Field 5   Field 6   Field 7   - - -   Field n − 1   Field n
    addressing time                         AT0       AT1       AT2       AT3       AT4       AT5       AT6       - - -   AT(n − 1)     ATn
    memory location of sub-frame data set   M0        M1        M2        M3        M4        M5        M6        - - -   M(n − 1)      Mn
    lamp ID                                 R         R         R         R         G         G         G         - - -   B             B
    lamp intensity                          IL0       IL1       IL2       IL3       IL4       IL5       IL6       - - -   IL(n − 1)     ILn
  • More specifically, the display of an image frame in timing diagram 1300 begins upon the detection of a vsync pulse. As indicated on the timing diagram and in the Table 2 schedule table, the bitplane R3, stored beginning at memory location M0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT0. Once the controller 704 outputs the last row of data of a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command. After waiting the actuation time, the controller causes the red lamp to be illuminated at a lamp intensity IL0 stored in the Table 2 schedule table. Similar to the addressing process described with respect to FIG. 12, since the actuation time is a constant for all sub-frame images, no corresponding time value needs to be stored in the schedule table store 726 to determine this time. At time AT1, the controller 704 begins loading the subsequent bitplane R2, which, according to the schedule table, is stored beginning at memory location M1, into the array of light modulators 702. The sub-frame image corresponding to bitplane R2 is illuminated at an intensity level IL1, as indicated in Table 2, which is equal to half of the intensity level IL0. Similarly, the intensity level IL2 for bitplane R1 is equal to half of the intensity level IL1, and the intensity level IL3 for bitplane R0 is equal to half of the intensity level IL2.
  • For each sub-frame image, the controller 704 may extinguish the illuminating lamp at the completion of an addressing event corresponding to the next sub-frame image. As such, no corresponding time value needs to be stored in the schedule table store 726 corresponding to lamp illumination times.
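A short sketch of the intensity weighting implied by Table 2, assuming an arbitrary starting intensity IL0 of 1.0 and four bitplanes per color:

```python
# Intensity levels for a color whose sub-frame images are all shown for the same
# time but at halving intensities (IL0, IL0/2, IL0/4, ...), as in Table 2.
def intensity_levels(il0=1.0, n_bitplanes=4):
    return [il0 / (2 ** k) for k in range(n_bitplanes)]

print(intensity_levels())  # [1.0, 0.5, 0.25, 0.125] for bitplanes R3, R2, R1, R0
```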
  • FIG. 14A is a timing diagram 1400 that corresponds to another implementation of the display process 1100 that utilizes a table similar to Table 3 as a schedule table. The timing diagram 1400 corresponds to a coded-time division grayscale addressing process in which image frames are displayed by displaying five sub-frame images for each of three color components (red, green, and blue) of the image frame. By including an extra sub-frame image per color component, the display process corresponding to timing diagram 1400 can display twice the number of gray scale levels at each color as the display process that corresponds to timing diagram 1200. Each sub-frame image displayed of a given color is displayed at the same intensity for half as long a time period as the prior sub-frame image, thereby implementing a binary pulse width weighting scheme for the sub-frame images.
  • TABLE 3
    Schedule Table 3
                                            Field 1   Field 2   Field 3   Field 4   Field 5   Field 6   Field 7   - - -   Field n − 1   Field n
    addressing time                         AT0       AT1       AT2       AT3       AT4       AT5       AT6       - - -   AT(n − 1)     ATn
    memory location of sub-frame data set   M0        M1        M2        M3        M4        M5        M6        - - -   M(n − 1)      Mn
    lamp ID                                 R         R         R         R         R         G         G         - - -   B             B
    lamp time                               LT0       LT1       LT2       LT3       LT4       LT5       LT6       - - -   LT(n − 1)     LTn
  • More specifically, the display of an image frame in timing diagram 1400 begins upon the detection of a vsync pulse. As indicated on the timing diagram, the bitplane R4, stored beginning at memory location M0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT0. Once the controller 704 outputs the last row of data of a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command. After waiting the actuation time, the controller causes the red lamp to be illuminated. Similar to the addressing process described with respect to FIG. 12, since the actuation time is a constant for all sub-frame images, no corresponding time value needs to be stored in the schedule table store 726 to determine this time. At time AT1, the controller 704 begins loading the subsequent bitplane R3, which is stored beginning at memory location M1, into the array of light modulators 702.
  • Lamp extinguishing event times occur at times stored in the schedule table store 726. The times may be stored in terms of clock cycles following the detection of a vsync pulse, or they may be stored in terms of clock cycles following the beginning of the loading of the previous bitplane into the array of light modulators 702. For bitplanes which are to be illuminated for a period longer than the time it takes to load a bitplane into the array of light modulators 702, the lamp extinguishing times are set in the schedule table to coincide with the completion of an addressing event corresponding to the subsequent sub-frame image. For example, LT0 is set to occur at a time after AT0 which coincides with the completion of the loading of bitplane R3. LT1 is set to occur at a time after AT1 which coincides with the completion of the loading of bitplane R2.
  • Similar to the addressing process corresponding to the timing diagram 1200 of FIG. 12, some bitplanes, such as R1 and R0, G1 and G0, and B1 and B0 are intended to be illuminated for a period of time that is less than the amount of time it takes to load a bitplane into the array. Thus, their corresponding lamp extinguishing times occur in the middle of subsequent addressing events. Because the lamp extinguishing times depend on whether the corresponding illumination times are less than or greater than the time required for addressing, the corresponding schedule table includes lamp times, e.g., LT0, LT1, LT2, etc.
  • FIG. 14B is a timing diagram 1450 that corresponds to another implementation of the display process 1100 that utilizes the parameters stored in Table 4 as a schedule table. The timing diagram 1450 corresponds to a coded-time division and intensity grayscale addressing process similar to that of the timing diagram 1400, except that the weighting of the least significant sub-image and the second least significant sub-image are achieved by varying lamp intensity in addition to lamp illumination time. In particular, sub-frame images corresponding to the least significant bitplane and the second least significant bitplane are illuminated for the same length of time as the sub-frame images corresponding to the third least significant bitplane, but at one quarter and one half the intensity, respectively. By combining intensity grayscale with time division grayscale, all the bitplanes may be illuminated for a period of time equal to or longer than the time it takes to load a bitplane into the array of light modulators 702. This eliminates the need for lamp extinguishing times to be stored in the schedule table store 726.
  • TABLE 4
    Schedule Table 4
                                            Field 1   Field 2   Field 3   Field 4   Field 5   Field 6   Field 7   - - -   Field n − 1   Field n
    addressing time                         AT0       AT1       AT2       AT3       AT4       AT5       AT6       - - -   AT(n − 1)     ATn
    memory location of sub-frame data set   M0        M1        M2        M3        M4        M5        M6        - - -   M(n − 1)      Mn
    lamp ID                                 R         R         R         R         R         G         G         - - -   B             B
    lamp intensity                          IL0       IL1       IL2       IT3       IT4       IT5       IT6       - - -   IT(n − 1)     ITn
  • More specifically, the display of an image frame in timing diagram 1450 begins upon the detection of a vsync pulse. As indicated on the timing diagram and schedule table 4, the bitplane R4, stored beginning at memory location M0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT0. Once the controller 704 outputs the last row of data of a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command. After waiting the actuation time, the controller causes the red lamp to be illuminated at a lamp intensity IL0 stored in the schedule table store 726. Similar to the addressing process described with respect to FIG. 12, since the actuation time is a constant for all sub-frame images, no corresponding time value needs to be stored in the schedule table store 726 to determine this time. At time AT1, the controller 704 begins loading the subsequent bitplane R3, which, according to the schedule table, is stored beginning at memory location M1, into the array of light modulators 702. The sub-frame image corresponding to bitplane R3 is illuminated at an intensity level IL1, as indicated in Table 4, which is equal to the intensity level IL0. Similarly, the intensity level IL2 for bitplane R2 is equal to the intensity level IL1. However, the intensity level IT3 for bitplane R1 is half that of the intensity level IL2, and the intensity level IT4 for bitplane R0 is half that of the intensity level IT3. Similar to the display process described with respect to FIG. 13, for each sub-frame image, the controller 704 may extinguish the illuminating lamp at the completion of an addressing event corresponding to the next sub-frame image. As such, no corresponding time value needs to be stored in the schedule table store 726 to determine this time.
  • The timing diagram 1450 corresponds to a display process in which perceived brightness of sub-images of an output sequence are controlled in a hybrid fashion. For some sub-frame images in the output sequence, brightness is controlled by modifying the period of illumination of the sub-frame image. For other sub-frame images in the output sequence, brightness is controlled by modifying illumination intensity. It is useful in a direct view display to provide the capability for controlling both pulse widths and intensities independently. In one implementation of such independent control, the lamp drivers 714 are responsive to variable intensity commands issued from the timing control module 724 as well as to timing or trigger signals from the timing control module 724 for the illumination and extinguishing of the lamps. For independent pulse width and intensity control, the schedule table store 726 stores parameters that describe the required intensity of lamps in addition to the timing values associated with their illumination.
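The hybrid weighting of timing diagram 1450 can be sketched as follows; the numeric values (a most-significant pulse width of 16 units and a load time of 4 units) are illustrative assumptions chosen so the illumination values come out in the binary series.

```python
# Hybrid weighting: halve the pulse width while it stays at or above the bitplane
# load time, then hold the width constant and halve the intensity instead.
def hybrid_schedule(n_bits=5, msb_time=16.0, load_time=4.0, full_intensity=1.0):
    plan, t, i = [], msb_time, full_intensity
    for bit in reversed(range(n_bits)):   # most significant bitplane first
        plan.append((bit, t, i))
        if t / 2 >= load_time:
            t /= 2                        # time-division weighting (e.g. R4 -> R3 -> R2)
        else:
            i /= 2                        # intensity weighting (e.g. R1, R0)
    return plan

for bit, t, i in hybrid_schedule():
    print(f"bit {bit}: time={t:g} intensity={i:g} illumination value={t * i:g}")
# illumination values come out 16, 8, 4, 2, 1 with no pulse shorter than the load time
```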
  • It is useful to define an illumination value as the product (or the integral) of an illumination period (or pulse width) with the intensity of that illumination. For a given time interval assigned in an output sequence for the illumination of a bitplane there are numerous alternative methods for controlling the lamps to achieve any required illumination value. Three such alternate pulse profiles for lamps appropriate to this invention are compared in FIG. 14C. In FIG. 14C the time markers 1482 and 1484 determine time limits within which a lamp pulse must express its illumination value. In a global actuation scheme for driving MEMS-based displays, the time marker 1482 might represent the end of one global actuation cycle, wherein the modulator states are set for a bitplane previously loaded, while the time marker 1484 can represent the beginning of a subsequent global actuation cycle, for setting the modulator states appropriate to the subsequent bitplane. For bitplanes with smaller significance, the time interval between the markers 1482 and 1484 can be constrained by the time necessary to load data subsets, e.g. bitplanes, into the array of modulators. The available time interval, in these cases, is substantially longer than the time required for illumination of the bitplane, assuming a simple scaling from the pulse widths assigned to bits of larger significance.
  • The lamp pulse 1486 is a pulse appropriate to the expression of a particular illumination value. The width of lamp pulse 1486 completely fills the time available between the markers 1482 and 1484. The intensity or amplitude of lamp pulse 1486 is adjusted, however, to achieve a required illumination value. An amplitude modulation scheme according to lamp pulse 1486 is useful, particularly in cases where lamp efficiencies are not linear and power efficiencies can be improved by reducing the peak intensities required of the lamps.
  • The lamp pulse 1488 is a pulse appropriate to the expression of the same illumination value as in lamp pulse 1486. The illumination value of pulse 1488 is expressed by means of pulse width modulation instead of by amplitude modulation. As shown in the timing diagram 1400, for many bitplanes the appropriate pulse width will be less than the time available as determined by the addressing of the bitplanes.
  • The series of lamp pulses 1490 represent another method of expressing the same illumination value as in lamp pulse 1486. A series of pulses can express an illumination value through control of both the pulse width and the frequency of the pulses. The illumination value can be considered as the product of the pulse amplitude, the available time period between markers 1482 and 1484, and the pulse duty cycle.
  • The lamp driver circuitry can be programmed to produce any of the above alternate lamp pulses 1486, 1488, or 1490. For example, the lamp driver circuitry can be programmed to accept a coded word for lamp intensity from the timing control module 724 and build a sequence of pulses appropriate to intensity. The intensity can be varied as a function of either pulse amplitude or pulse duty cycle.
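For illustration, the three pulse profiles of FIG. 14C can be compared numerically; the helper below is an assumption about how a driver might parameterize them, not the lamp driver circuitry itself.

```python
# Three ways to express the same illumination value (intensity x time) within the
# window between markers 1482 and 1484, corresponding to pulses 1486, 1488, 1490.
def pulse_options(illumination_value, window_s, peak_intensity=1.0):
    amplitude_mod = {"width_s": window_s,                         # 1486: fill the window,
                     "intensity": illumination_value / window_s}  # reduce the intensity
    pwm = {"width_s": illumination_value / peak_intensity,        # 1488: full intensity,
           "intensity": peak_intensity}                           # shorten the pulse
    pulse_train = {"duty_cycle": illumination_value / (peak_intensity * window_s),
                   "intensity": peak_intensity}                   # 1490: vary the duty cycle
    return amplitude_mod, pwm, pulse_train

print(pulse_options(illumination_value=0.5, window_s=2.0))
```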
  • FIG. 15 is a timing diagram 1500 that corresponds to another implementation of the display process 1100 that utilizes a schedule table similar to Table 5. The timing diagram 1500 corresponds to a coded-time division grayscale addressing process similar to that described with respect to FIG. 12, except that restrictions have been placed on illumination periods for the most significant bits and rules have been established for the ordering of the bitplanes in the display sequence. The sequencing rules illustrated for timing diagram 1500 are established to help reduce two visual artifacts which detract from image quality in field sequential displays, i.e. color breakup and flicker. Color breakup is reduced by increasing the frequency of color changes, that is by alternating between sub-images of different colors at a frequency preferably in excess of 180 Hz. Flicker is reduced in its simplest manifestation by ensuring that frame rates are substantially greater than 30 Hz, that is, by ensuring that bitplanes of similar significance which appear in subsequent image frames are separated by time periods of less than 25 milliseconds.
  • TABLE 5
    Schedule Table 5
                                            Field 1   Field 2   Field 3   Field 4   Field 5   Field 6   Field 7   - - -   Field n − 1   Field n
    addressing time                         AT0       AT1       AT2       AT3       AT4       AT5       AT6       - - -   AT(n − 1)     ATn
    memory location of sub-frame data set   M0        M1        M2        M3        M4        M5        M6        - - -   M(n − 1)      Mn
    lamp ID                                 R         G         B         R         R         G         G         - - -   G             B
  • Sequencing rules associated with color breakup and flicker can be implemented by the technique of bit splitting. In particular, in timing diagram 1500, the most significant bits, e.g. R3, G3, and B3, are split in two, that is: reduced to half of their nominal illumination period and then repeated or displayed twice within the time of any given image frame. The red bitplane R3 for instance, is first loaded to the modulation array at time event AT0 and is then loaded for the second time at the time event AT9. The illumination period associated with the most significant bitplane R3, loaded at time event AT9, is equal to the illumination period associated with bitplane R2, which is loaded at the time event AT12. Because the most significant bitplane R3 appears twice within the image frame, however, the illumination value associated with the information contained within bitplane R3 is still twice that allotted to the next most significant bitplane R2.
  • In addition, instead of displaying sub-frame images of an image grouped by color, as shown in the timing diagrams 1000, 1200, 1300, 1400 and 1450, the timing diagram 1500 displays sub-frame images corresponding to a given color interspersed among sub-frame images corresponding to other colors. For example, to display an image according to the timing diagram 1500, a display first loads and displays the first occurrence of the most significant bitplane for red, R3, followed immediately by the most significant green bitplane, G3, followed immediately by the most significant blue bitplane B3. Since the most significant bitplanes have been split, these color changes occur fairly rapidly, with the longest time periods between color changes about equal to the illumination time of the next most significant bitplane, R2. The time periods between illumination of sub-frame images of different colors, illustrated as the time period J in timing diagram 1500, are preferably held to less than 4 milliseconds, more preferably less than 2.8 milliseconds. The smaller bitplanes, R1 and R0, G1 and G0, and B1 and B0, can still be grouped together, since the total of their illumination times is still less than 4 milliseconds.
  • Interspersing or alternating between bitplanes of different colors helps to reduce the imaging artifact of color breakup. It is preferable to avoid grouping the output of bitplanes by color. For instance, although the bitplane B3 is the third of the bitplanes to be output by the controller (at addressing event AT2), the appearance of the blue bitplane B3 does not imply the end of all possible appearances of red bitplanes within the frame time. Indeed, the bitplane R1 for the color red immediately follows B3 in the sequence of timing diagram 1500. It is preferable to alternate between bitplanes of different colors with the highest frequency possible within an image frame.
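One simplified way to construct such a bit-split, color-interleaved output sequence is sketched below; the exact ordering used in timing diagram 1500 differs, so this is only an illustrative arrangement.

```python
# Build an output sequence whose most significant bitplanes are split in two and
# whose colors are interleaved. Entries are (color, significance, weight fraction).
def build_sequence(colors=("R", "G", "B"), n_bits=4):
    msb = n_bits - 1
    seq = [(c, msb, 0.5) for c in colors]                 # first half of each split MSB
    for c in colors:                                      # lesser bitplanes, grouped per color
        seq += [(c, b, 1.0) for b in range(msb - 1, -1, -1)]
    seq += [(c, msb, 0.5) for c in colors]                # second half of each split MSB
    return seq

print(build_sequence())
```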
  • To reduce the power associated with refreshing a display it is not always possible to establish a frame rate in excess of 30 Hz. Alternate rules related to the ordering of the bitplanes may still be applied, however, to minimize flicker in the perceived image. In timing diagram 1500 the time periods K and L represent the separation in time between events in which the most significant bitplane in red, i.e. the most significant bitplane R3 is output to the display. Similar time periods K and L exist between successive occurrences of the other most significant bitplanes G3 and B3. The time period K represents the maximum time between output of most significant bitplanes within a given image frame. The time period L represents the maximum time between output of most significant bitplanes in two consecutive image frames. In timing diagram 1500 the sum of K+L is equal to the frame time, and for this embodiment, the frame time may be as long as 33 milliseconds (corresponding to a 30 Hz frame rate). Flicker may still be reduced in displays where bit-splitting is employed, if both time intervals K and L are held to less than 25 milliseconds, preferably less than 17 milliseconds.
  • Flicker may arise from a variety of factors wherein characteristics of a display are repeated at frequencies as low as 30 Hz. In timing diagram 1500, for instance, the lesser significance bitplanes R1 and R0 are illuminated only once per frame, and the frame rate may be as low as 30 Hz. Therefore images associated with these lesser bitplanes may contribute to the perception of flicker. The bank-wise addressing method described with respect to FIG. 19, however, will provide another mechanism by which even lesser bitplanes can be repeated at frequencies substantially greater than the frame rate.
  • Flicker may also be generated by the characteristic of bitplane jitter. Jitter appears when the spacing between similar bitplanes is not equal in the sequence of displayed bitplanes. Flicker would ensue, for instance, if the time periods K and L between MSB red bitplanes were not equal. Flicker can be reduced by ensuring that time periods K and L are equal to within 10%. That is, the length of time between a first time the bitplane corresponding to the most significant sub-frame image of a color component of the image frame is output and a second time the bitplane corresponding to the most significant sub-frame image of the color component is output is within 10% of the length of time between the second time the bitplane corresponding to the most significant sub-frame image of the color component is output and a subsequent time at which a sub-frame image corresponding to the most significant sub-frame image of the color component is output.
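A small checker for these flicker rules might look like the following; the event-list format is an illustrative assumption, with the 25 millisecond and 10% figures taken from the text above.

```python
# Check successive outputs of a color's most significant bitplane for flicker:
# gaps must stay under max_gap_ms and be equal to within the jitter tolerance.
def check_flicker(events, color, msb, max_gap_ms=25.0, jitter_tol=0.10):
    times = [t for t, c, s in events if c == color and s == msb]
    gaps = [b - a for a, b in zip(times, times[1:])]
    if any(g > max_gap_ms for g in gaps):
        return False
    return all(abs(g2 - g1) <= jitter_tol * g1 for g1, g2 in zip(gaps, gaps[1:]))

events = [(0, "R", 3), (14, "R", 3), (28, "R", 3), (43, "R", 3)]  # (time_ms, color, bit)
print(check_flicker(events, "R", 3))  # gaps of 14, 14, 15 ms pass both tests
```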
  • FIG. 16 is a timing diagram 1600 that corresponds to another implementation of the display process 1100 that utilizes the parameters listed in Table 6. The timing diagram 1600 corresponds to a coded-time division grayscale addressing process in which image frames are displayed by displaying four sub-frame images for each color component of the image frame. Each sub-frame image displayed of a given color is displayed at the same intensity for half as long a time period as the prior sub-frame image, thereby implementing a binary weighting scheme for the sub-frame images. The timing diagram 1600 is similar to the timing diagram 1200 of FIG. 12, but has sub-frame images corresponding to the color white, in addition to the colors red, green and blue, that are illuminated using a white lamp. The addition of a white lamp allows the display to display brighter images or operate its lamps at lower power levels while maintaining the same brightness level. As brightness and power consumption are not linearly related, the lower illumination level operating mode, while providing equivalent image brightness, consumes less energy. In addition, white lamps are often more efficient, i.e. they consume less power than lamps of other colors to achieve the same brightness.
  • More specifically, the display of an image frame in timing diagram 1600 begins upon the detection of a vsync pulse. As indicated on the timing diagram and in the Table 6 schedule table, the bitplane R3, stored beginning at memory location M0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT0. Once the controller 704 outputs the last row of data of a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command. After waiting the actuation time, the controller causes the red lamp to be illuminated. Similar to the addressing process described with respect to FIG. 12, since the actuation time is a constant for all sub-frame images, no corresponding time value needs to be stored in the schedule table store 726 to determine this time. At time AT4, the controller 704 begins loading the first of the green bitplanes, G3, which, according to the schedule table, is stored beginning at memory location M4. At time AT8, the controller 704 begins loading the first of the blue bitplanes, B3, which, according to the schedule table, is stored beginning at memory location M8. At time AT12, the controller 704 begins loading the first of the white bitplanes, W3, which, according to the schedule table, is stored beginning at memory location M12. After completing the addressing corresponding to the first of the white bitplanes, W3, and after waiting the actuation time, the controller causes the white lamp to be illuminated for the first time.
  • Because all the bitplanes are to be illuminated for a period longer than the time it takes to load a bitplane into the array of light modulators 702, the controller 704 extinguishes the lamp illuminating a sub-frame image upon completion of an addressing event corresponding to the subsequent sub-frame image. For example, LT0 is set to occur at a time after AT0 which coincides with the completion of the loading of bitplane R2. LT1 is set to occur at a time after AT1 which coincides with the completion of the loading of bitplane R1.
  • The time period between vsync pulses in the timing diagram is indicated by the symbol FT, indicating a frame time. In some implementations the addressing times AT0, AT1, etc. as well as the lamp times LT0, LT1, etc. are designed to accomplish 4 sub-frame images for each of the 4 colors within a frame time FT of 16.6 milliseconds, i.e. according to a frame rate of 60 Hz. In other implementations the time values stored in schedule table store 726 can be altered to accomplish 4 sub-frame images per color within a frame time FT of 33.3 milliseconds, i.e. according to a frame rate of 30 Hz. In other implementations frame rates as low as 24 Hz may be employed or frame rates in excess of 100 Hz may be employed.
  • TABLE 6
    Schedule Table 6
                                            Field 1   Field 2   Field 3   Field 4   Field 5   Field 6   Field 7   - - -   Field n − 1   Field n
    addressing time                         AT0       AT1       AT2       AT3       AT4       AT5       AT6       - - -   AT(n − 1)     ATn
    memory location of sub-frame data set   M0        M1        M2        M3        M4        M5        M6        - - -   M(n − 1)      Mn
    lamp ID                                 R         R         R         R         G         G         G         - - -   W             W
  • The use of white lamps can improve the efficiency of the display. The use of four distinct colors in the sub-frame images requires changes to the data processing in the input processing module 718. Instead of deriving bitplanes for each of 3 different colors, a display process according to timing diagram 1600 requires bitplanes to be stored corresponding to each of 4 different colors. The input processing module 718 may therefore convert the incoming pixel data, encoded for colors in a 3-color space, into color coordinates appropriate to a 4-color space before converting the data structure into bitplanes.
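One common way to perform such a 3-color to 4-color conversion is to pull the common component of R, G, and B into the white channel; this is shown below as an illustrative assumption rather than the specific conversion performed by the input processing module 718.

```python
# Derive a white channel by pulling out the common component of R, G, and B.
def rgb_to_rgbw(r, g, b):
    w = min(r, g, b)
    return r - w, g - w, b - w, w

print(rgb_to_rgbw(200, 180, 120))  # (80, 60, 0, 120)
```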
  • In addition to the red, green, blue, and white lamp combination, shown in timing diagram 1600, other lamp combinations are possible which expand the space or gamut of achievable colors. A useful 4-color lamp combination with expanded color gamut is red, blue, true green (about 520 nm) plus parrot green (about 550 nm). Another 5-color combination which expands the color gamut is red, green, blue, cyan, and yellow. A 5-color analogue to the well known YIQ color space can be established with the lamps white, orange, blue, purple, and green. A 5-color analog to the well known YUV color space can be established with the lamps white, blue, yellow, red, and cyan.
  • Other lamp combinations are possible. For instance, a useful 6-color space can be established with the lamp colors red, green, blue, cyan, magenta, and yellow. A 6-color space can also be established with the colors white, cyan, magenta, yellow, orange, and green. A large number of other 4-color and 5-color combinations can be derived from amongst the colors already listed above. Further combinations of 6, 7, 8 or 9 lamps with different colors can be produced from the colors listed above. Additional colors may be employed using lamps with spectra which lie in between the colors listed above.
  • FIG. 17 is a timing diagram 1700 that corresponds to another implementation of the display process 1100 that utilizes the parameters listed in the schedule table of Table 7. The timing diagram 1700 corresponds to a hybrid coded-time division and intensity grayscale display process in which lamps of different colors may be illuminated simultaneously. Though each sub-frame image is illuminated by lamps of all colors, sub-frame images for a specific color are illuminated predominantly by the lamp of that color. For example, during illumination periods for red sub-frame images, the red lamp is illuminated at a higher intensity than the green lamp and the blue lamp. As brightness and power consumption are not linearly related, using multiple lamps each at a lower illumination level operating mode may require less power than achieving that same brightness using one lamp at a higher illumination level.
  • The addressing timing is similar to that described in FIG. 12 in that each sub-frame image is displayed at the same intensity for half as long a time period as the prior sub-frame image, except for the sub-frame images corresponding to the least significant bitplanes which are instead each illuminated for the same length of time as the prior sub-frame image, but at half the intensity. As such, the sub-frame images corresponding to the least significant bitplanes are illuminated for a period of time equal to or longer than that required to load a bitplane into the array.
  • TABLE 7
    Schedule Table 7
                                            Field 1   Field 2   Field 3   Field 4   Field 5   Field 6   Field 7   - - -   Field n − 1   Field n
    data time                               AT0       AT1       AT2       AT3       AT4       AT5       AT6       - - -   AT(n − 1)     ATn
    memory location of sub-frame data set   M0        M1        M2        M3        M4        M5        M6        - - -   M(n − 1)      Mn
    red average intensity                   RI0       RI1       RI2       RI3       RI4       RI5       RI6       - - -   RI(n − 1)     RIn
    green average intensity                 GI0       GI1       GI2       GI3       GI4       GI5       GI6       - - -   GI(n − 1)     GIn
    blue average intensity                  BI0       BI1       BI2       BI3       BI4       BI5       BI6       - - -   BI(n − 1)     BIn
  • More specifically, the display of an image frame in timing diagram 1700 begins upon the detection of a vsync pulse. As indicated on the timing diagram and in the Table 7 schedule table, the bitplane R3, stored beginning at memory location M0, is loaded into the array of light modulators 702 in an addressing event that begins at time AT0. Once the controller 704 outputs the last row of data of a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command. After waiting the actuation time, the controller causes the red, green and blue lamps to be illuminated at the intensity levels indicated by the Table 7 schedule, namely RI0, GI0 and BI0, respectively. Similar to the addressing process described with respect to FIG. 12, since the actuation time is a constant for all sub-frame images, no corresponding time value needs to be stored in the schedule table store 726 to determine this time. At time AT1, the controller 704 begins loading the subsequent bitplane R2, which, according to the schedule table, is stored beginning at memory location M1, into the array of light modulators 702. The sub-frame image corresponding to bitplane R2, and later the one corresponding to bitplane R1, are each illuminated at the same set of intensity levels as for bitplane R3, as indicated by the Table 7 schedule. In comparison, the sub-frame image corresponding to the least significant bitplane R0, stored beginning at memory location M3, is illuminated at half the intensity level for each lamp. That is, intensity levels RI3, GI3 and BI3 are equal to half that of intensity levels RI0, GI0 and BI0, respectively. The process continues starting at time AT4, at which time bitplanes in which the green intensity predominates are displayed. Then, at time AT8, the controller 704 begins loading bitplanes in which the blue intensity dominates.
  • Because all the bitplanes are to be illuminated for a period longer than the time it takes to load a bitplane into the array of light modulators 702, the controller 704 extinguishes the lamp illuminating a sub-frame image upon completion of an addressing event corresponding to the subsequent sub-frame image. For example, LT0 is set to occur at a time after AT0 which coincides with the completion of the loading of bitplane R2. LT1 is set to occur at a time after AT1 which coincides with the completion of the loading of bitplane R1.
  • The mixing of color lamps within sub-frame images in timing diagram 1700 can lead to improvements in power efficiency in the display. Color mixing can be particularly useful when images do not include highly saturated colors.
  • FIG. 18 is a more detailed flow chart of an illustrative display process 1800 suitable for use as part of the display method 800 for displaying images on the direct-view display 700. As in the display process 1100, the display process 1800 utilizes bitplanes for sub-frame data sets. Display process 1800 also includes a global actuation functionality similar to that used in display process 1100. Display process 1800, however, adds a bankwise addressing functionality as a tool for improving the illumination efficiency in the display.
  • For many display processes, especially where a display loads and displays large numbers of bitplanes (for example, greater than 5) for each color component of an image, proportionately more time must be dedicated to the addressing of the display at the expense of illumination of corresponding sub-images. This is true even when global actuation techniques are employed as in display process 1100. The situation is illustrated by the timing diagram 1400 of FIG. 14A. Timing diagram 1400 illustrates a 5-bit sequence per color with illumination values assigned to the bitplanes according to a binary significance sequence 16:8:4:2:1. The illumination periods associated with the bitplanes R1 and R0, however, are considerably shorter than the time required for loading data sets into the array appropriate to the next bitplane. As a result, a considerable amount of time passes between the times the lamps illuminating the R1 and R0 bitplanes are extinguished and the times the lamps illuminating the R0 and G4 bitplanes, respectively, are turned on. This situation results in a reduced duty cycle and therefore reduced efficiency for lamp illumination.
  • Bankwise addressing is a functionality by which duty cycles for lamps can be increased by reducing the times required for addressing. This is accomplished by dividing the display into multiple independently actuatable banks of rows such that only a portion of the display needs to be addressed and actuated at any one time. Shorter addressing cycles increase the efficiency of the display for those bitplanes that require only the shortest of illumination times.
  • In one particular implementation, bank-wise addressing involves segregating the rows of the display into two segments. In one embodiment, the rows in the top half of the display are controlled separately from rows in the bottom half of the display. In another embodiment the display is segregated on an every-other row basis, such that even-numbered rows belong to one bank or segment and the odd-numbered rows belong to the other bank. Separate bitplanes are stored for each segment at distinct addresses in the buffer memory 722. For bank-wise addressing, the input processing module 718 is programmed to not only derive bitplane information from the incoming video stream, but also to identify, and in some cases store, portions of bitplanes separately according to their assignment to different banks. In the following description bitplanes are labeled by color, bank, and significance value. For example, bitplane RE3 in a five bit per color component gray scale process refers to the second most significant bitplane for the even numbered rows of the display apparatus. Bitplane BO0 corresponds to the least significant blue bitplane for the odd numbered rows.
  • When the bankwise addressing scheme employs a global actuation functionality, then independent global actuation voltage drivers and independent global actuation interconnects are provided for each bank. For instance the odd-numbered rows are connected to one set of global actuation drivers and global actuation interconnects, while the even numbered rows are connected to an independent set of global actuation drivers and interconnects.
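The even/odd segregation of bitplane data can be sketched as follows; NumPy and 0-based row indices are used for brevity, and the bank labels are illustrative.

```python
import numpy as np

# Split one bitplane (a 2-D array of 0/1 modulator states, one row per display
# row) into two bank-sized sub-frame data sets on an every-other-row basis.
def split_into_banks(bitplane):
    bank_even = bitplane[0::2, :]  # rows 0, 2, 4, ... (one bank)
    bank_odd = bitplane[1::2, :]   # rows 1, 3, 5, ... (the other bank)
    return bank_even, bank_odd

bitplane = (np.arange(6 * 4).reshape(6, 4) % 2).astype(np.uint8)
bank_even, bank_odd = split_into_banks(bitplane)
print(bank_even.shape, bank_odd.shape)  # (3, 4) (3, 4): each bank needs half the addressing time
```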
  • Display process 1800 begins with the initiation of the display of a new image frame (step 1802). Such an initiation may be triggered by the detection of a vsync voltage pulse in the image signal 717. Then, at a time identified in the schedule table store 726 after the initiation of the display process for the image frame, the controller 704 begins loading the first bitplane into the light modulators of the array of light modulators 702 (step 1804). In contrast to step 1104 of FIG. 11, at step 1804, bitplanes for either one or both of the banks of the display are loaded into the corresponding rows of the array of light modulators 702. In one embodiment, at step 1804, the timing control module 724 analyzes its output sequence to see how many banks need to be addressed in a given addressing event and then addresses each bank needed to be addressed in sequence. In one implementation, for one bank, bitplanes are loaded into corresponding light modulator rows in order of increasing significance while for the other bank, bitplanes are loaded into the corresponding light modulator rows in order of decreasing significance.
  • At step 1806, any lamp currently illuminated is extinguished. Step 1806 may occur at or before the loading of a particular bitplane (step 1804) is completed, depending on the significance of the bitplane. For example, in some embodiments, to maintain the binary weighting of bitplanes with respect to one another, some bitplanes may need to be illuminated for a time period that is less than the amount of time it takes to load the next bitplane into the array of light modulators 702. Thus, a lamp illuminating such a bitplane is extinguished while the next bitplane is being loaded into the array of light modulators (step 1804). To ensure that lamps are extinguished at the appropriate time, a timing value is stored in the schedule table to indicate the appropriate light extinguishing time.
  • When the controller 704 has completed loading bitplane data into either or both of the banks in the array of light modulators 702 (step 1804) and when the controller has extinguished any illuminated lamps (step 1806), the controller 704 issues a global actuation command (step 1808) to either or both of the global actuation drivers, depending on where it is in its output sequence, thereby causing either only one of the banks of addressable modulators or both banks in the array of light modulators 702 to actuate at substantially the same time. The timing of the global actuation is determined by logic in the timing control module based on whether the schedule indicates that one or both of the banks requires addressing. That is, if a single bank needs addressing according to the schedule table store 726, the timing control module 724 waits a first amount of time before causing the controller 704 to issue the global actuation command. If the schedule table store 726 indicates both banks require addressing, the timing control module 724 waits about twice that amount of time before triggering global actuation. As only two possible time values are needed for timing global actuation (i.e., a single bank time, or a dual bank time), these values can be stored permanently in the timing control module 724 in hardware, firmware, or software.
  • After waiting the actuation time of the light modulators, the controller 704 issues an illumination command (step 1810) to the lamp drivers to turn on the lamp corresponding to the recently loaded bitplane. The actuation time is measured from the time a global actuation command is issued (step 1808), and thus is the same for each bitplane loaded. Therefore, it need not be stored in a schedule table. It can be permanently stored in the timing control module 724 in hardware, firmware, or software.
  • After the lamp corresponding to the bitplane is illuminated (step 1810), at decision block 1812, the controller 704 determines, based on the schedule table store 726, whether the currently loaded bitplane is the last bitplane for the image frame to be displayed. If so, the controller 704 awaits initiation of the display of a subsequent image frame (step 1802). Otherwise, at the time of the next addressing event listed in the schedule table store 726, the controller 704 begins loading the corresponding bitplane or bitplanes into the array of light modulators 702 (step 1804).
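A sketch of the global actuation timing decision in display process 1800: the controller waits roughly one bank-addressing time when a single bank was loaded and roughly twice that when both were, then actuates and illuminates. The time constants and helper signature are assumptions.

```python
import time

SINGLE_BANK_ADDRESS_TIME_S = 0.0005  # assumed time to address one bank of rows
ACTUATION_TIME_S = 0.0001            # assumed fixed modulator actuation time

def actuate_and_illuminate(banks_loaded, global_actuate, set_lamp, color):
    """banks_loaded: 1 if only one bank was addressed in this event, 2 if both."""
    time.sleep(banks_loaded * SINGLE_BANK_ADDRESS_TIME_S)  # wait for addressing (step 1804)
    global_actuate()                                       # step 1808: actuate the loaded bank(s)
    time.sleep(ACTUATION_TIME_S)                           # fixed wait after global actuation
    set_lamp(color, True)                                  # step 1810: illuminate

actuate_and_illuminate(2, global_actuate=lambda: None,
                       set_lamp=lambda c, on: None, color="red")
```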
  • FIG. 19 is a timing diagram 1900 that corresponds to an implementation of the display process 1800 through utilization of the parameters listed in the schedule table of Table 8. The timing diagram 1900 corresponds to a coded-time division grayscale display process in which image frames are displayed by displaying 5 sub-frame images for each of three color components (red, green, and blue) of the image frame. Each sub-frame image displayed of a given color is displayed at the same intensity for half as long a time period as the prior sub-frame image, thereby implementing a binary weighting scheme for the sub-frame images. In addition, the timing diagram 1900 incorporates the global actuation functionality described in the display process 1100 and the bankwise addressing functionality described in the display process 1800. By reducing the times required for addressing, the display can therefore either display brighter images, or it can operate its lamps at lower power levels while maintaining the same brightness level. As brightness and power consumption are not linearly related, the lower illumination level operating mode, while providing equivalent image brightness, consumes less energy.
  • TABLE 8
    Schedule Table 8
                                                Field 1   Field 2   Field 3   Field 4   Field 5   Field 6   Field 7   - - -   Field n − 1   Field n
    data time                                   AT0       AT1       AT2       AT3       AT4       AT5       AT6       - - -   AT(n − 1)     ATn
    memory location for Bank 1 ("odd rows")     MO0       0         0         0         0         MO5       MO6       - - -   MO(n − 1)     MOn
    memory location for Bank 2 ("even rows")    ME0       ME1       ME2       ME3       ME4       0         0         - - -   ME(n − 1)     MEn
    lamp ID                                     R         R         R         R         R         R         R         - - -   B             B
    lamp time                                   LT0       LT1       LT2       LT3       LT4       LT5       LT6       - - -   LT(n − 1)     LTn
  • More specifically, the display of an image frame in timing diagram 1900 begins upon the detection of a vsync pulse. As indicated on the timing diagram and in the Table 8 schedule table, the bitplane RO4, stored beginning at memory location MO0, is loaded into only the odd rows of the array of light modulators 702 in an addressing event that begins at time AT0. Immediately thereafter, the bitplane RE1 is loaded into only the even rows of the array of light modulators, using data stored in the location ME0. Once the controller 704 outputs the last of the even rows of data of a bitplane to the array of light modulators 702, the controller 704 outputs a global actuation command to both of the independently addressable global actuation drivers connected to the banks of even and odd rows. After waiting the actuation time following the issuance of the global actuation command, the controller 704 causes the red lamp to be illuminated. As indicated above, since the actuation time is a constant for all sub-frame images and is based on the issuance of the global actuation command, no corresponding time value needs to be stored in the schedule table store 726 to determine this time.
  • At time AT1, the controller 704 begins loading the subsequent bitplane RE0, stored beginning at memory location ME1, into the even rows of the array of light modulators 702. During the addressing event beginning at AT1, the timing control module 724 skips any process related to loading of the data into the odd rows. This may be accomplished by storage of a coded parameter in the schedule table store 726 associated with the timing value AT1, for instance, the numeral zero. In this fashion the amount of time to complete the addressing event initiated at time AT1 is only ½ of the time required for addressing both banks of rows at time AT0. Note that the least significant red bitplane for the odd rows is not loaded into the array of light modulators 702 until much later, at time AT5.
  • Lamp extinguishing event times LT0-LTn−1 occur at times stored in the schedule table store 726. The times may be stored in terms of clock cycles following the detection of a vsync pulse, or they may be stored in terms of clock cycles following the beginning of the loading of the previous bitplane into the array of light modulators 702. For bitplanes which are to be illuminated for a period longer than the time it takes to load a bitplane into the array of light modulators 702, the lamp extinguishing times are set in the schedule table to coincide with the completion of a corresponding addressing event. For example, LT0 is set to occur at a time after AT0 which coincides with the completion of the loading of the even-numbered rows. LT1 is set to occur at a time after AT1 which coincides with the completion of the loading of bitplane RE0 into the even-numbered rows. LT3 is set to occur at a time after AT4, which coincides with the completion of the loading of bitplane RO1 into the odd-numbered rows. After all red bitplanes for each bank are loaded and illuminated for the appropriate amounts of time, the process begins again with the green bitplanes.
  • The example of bank-wise addressing by timing diagram 1900 provides for only two independently addressable and actuatable banks. In other embodiments, arrays of MEMS modulators and their drive circuits can be interconnected so as to provide 3, 4, 5, 6, 7, 8 or more independently addressable banks. A display with 6 independently addressable banks would require only ⅙ the time for addressing the rows within one bank, in comparison to the time needed for addressing of the whole display. With the use of 6 banks, 6 different bitplanes attributed to the same color of lamp can be interleaved and illuminated simultaneously. For the 6-bank example, the rows associated with each bank may be assigned to every 6th row of the display.
  • In some embodiments that employ bank-wise addressing, it is not necessary to turn off the lamps while switching a given bank of rows from states indicated by one bitplane to states indicated in the next, as long as the states of the rows in the other contemporaneous banks are associated with the same color.
  • Referring again to the sequencing rules introduced with respect to timing diagram 1500, the bank-wise addressing scheme provides additional opportunities for reducing flicker in a MEMS-based field sequential display. In particular the red bitplane R1 for the even rows, introduced at addressing event AT0, is displayed within the same grouping of red sub-images as the red bitplane R1 for the odd rows, introduced at timing event AT4. Each of these bitplanes is displayed only once per frame. If the frame rate in timing diagram 1900 were as low as 30 Hz, then the display of these lesser bitplanes would be separated by substantially more than 25 milliseconds between frames, contributing to the perception of flicker. However, this situation can be improved if the bitplanes in timing diagram 1900 are further re-arranged such that displays of the R1 bitplanes in adjacent frames are never separated by more than 25 milliseconds, preferably less than 17 milliseconds.
  • In particular, the display of the most significant bitplane in red, i.e. the most significant bit (R4), can be split, for instance at some point between the addressing events AT3 and AT4. The two groupings of red sub-images can then be re-arranged amongst similar sub-groupings in the green and blue sub-images. The red, green, and blue sub-groupings can be interspersed, as in the timing diagram 1500. The result is that the display of, e.g., the R1, G1, and B1 sub-frame data sets can be arranged to appear at roughly equal time intervals, both within and between successive image frames. In this example, the R1 bitplane for the even rows would still appear only once per image frame. Flicker can be reduced, however, if the display of the R1 bitplane alternates between odd and even rows, and if the time separation between displays of the odd or even portions of the bitplane is never more than 25 milliseconds, preferably less than 17 milliseconds.
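
As a hedged illustration of the separation rule just described, the sketch below checks whether successive displays of a given low-order bitplane ever exceed a flicker threshold. The event times and the helper names are made-up assumptions; only the 25-millisecond limit comes from the text above.

```python
# Illustrative sketch: verify that repeated displays of a given bitplane
# (e.g. R1 on even rows, then R1 on odd rows) are never separated by more
# than a flicker threshold. Times are in milliseconds and are invented.

def max_separation_ms(display_times_ms):
    """Largest gap between consecutive displays of the same bitplane."""
    times = sorted(display_times_ms)
    return max(b - a for a, b in zip(times, times[1:]))

def flicker_ok(display_times_ms, limit_ms=25.0):
    return max_separation_ms(display_times_ms) <= limit_ms

if __name__ == "__main__":
    # Hypothetical display times of the R1 bitplane across two 30 Hz frames.
    r1_times = [2.0, 18.0, 35.5, 51.5]
    print(max_separation_ms(r1_times))  # 17.5 ms largest gap
    print(flicker_ok(r1_times))         # True (within the 25 ms limit)
```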
  • FIG. 20 is a block diagram of a controller 2000 for use in a direct-view display, according to an illustrative embodiment of the invention. For example, the controller 2000 can replace the controller 704 of the direct-view MEMS display 700 of FIG. 7. The controller 2000 receives an image signal 2017 from an external source and outputs both data and control signals for controlling the light modulators and lamps of the display into which it is incorporated.
  • The controller 2000 includes an input processing module 2018, a memory control module 2020, a frame buffer 2022, a timing control module 2024, and four unique schedule table stores 2026, 2027, 2028, and 2029. Instead of a programming link that allows alteration of the parameters in a schedule table store, the controller 2000 provides a switch control module 2040 which determines which of the four schedule table stores will be active at any given time. In some implementations the components 2018-2040 may be provided as distinct chips or circuits which are connected together by means of circuit boards and/or cables. In other implementations several of these components can be designed together into a single semiconductor chip such that their boundaries are nearly indistinguishable except by function.
  • The input processing module 2018 receives the image signal 2017 and processes the data encoded therein, similar to input processing module 718, into a format suitable for displaying via the array of light modulators. The input processing module 2018 takes the data encoding each image frame and converts it into a series of sub-frame data sets. While in various embodiments, the input processing module 2018 may convert the image signal 2017 into non-coded sub-frame data sets, ternary coded sub-frame data sets, or other form of coded sub-frame data set, preferably, the input processing module 2018 converts the image signal 2017 into bitplanes, as described above in relation to FIGS. 6A-6C.
  • The input processing module 2018 outputs the sub-frame data sets to the memory control module 2020. The memory control module 2020 then stores the sub-frame data sets in the frame buffer 2022. The frame buffer is preferably a random access memory, although other types of serial memory can be used without departing from the scope of the invention. The memory control module 2020, in one implementation, stores the sub-frame data set in a predetermined memory location based on the color and significance in a coding scheme of the sub-frame data set. In other implementations, the memory control module 2020 stores the sub-frame data set in a dynamically determined memory location and stores that location in a lookup table for later identification. In one particular implementation, the frame buffer 2022 is configured for the storage of bitplanes.
  • The memory control module 2020 is also responsible for, upon instruction from the timing control module 2024, retrieving sub-image data sets from the frame buffer 2022 and outputting them to the data drivers. The data drivers load the data output by the memory control module 2020 into the light modulators of the array of light modulators. The memory control module 2020 outputs the data in the sub-image data sets one row at a time. In one implementation, the frame buffer 2022 includes two buffers, whose roles alternate. While the memory control module 2020 stores newly generated bitplanes corresponding to a new image frame in one buffer, it extracts bitplanes corresponding to the previously received image frame from the other buffer for output to the array of light modulators. Both buffer memories can reside within the same circuit, distinguished only by address.
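
The following is a minimal sketch, under stated assumptions, of the ping-pong frame buffer arrangement described above: bitplanes for the incoming frame are written into one buffer while bitplanes for the previous frame are read out of the other. The class and method names are hypothetical, not the patent's interfaces.

```python
# Illustrative sketch of the two-buffer ("ping-pong") frame buffer.
# Names are hypothetical; each buffer simply maps a bitplane id to row data.

class DoubleFrameBuffer:
    def __init__(self):
        self._buffers = [{}, {}]   # each maps a bitplane id to its row data
        self._write = 0            # index of the buffer currently being filled

    def store_bitplane(self, bitplane_id, rows):
        self._buffers[self._write][bitplane_id] = rows

    def read_bitplane(self, bitplane_id):
        return self._buffers[1 - self._write][bitplane_id]

    def swap(self):
        """Called at a frame boundary: the roles of the two buffers alternate."""
        self._write = 1 - self._write

if __name__ == "__main__":
    fb = DoubleFrameBuffer()
    fb.store_bitplane("R0", [0b1010, 0b0110])  # frame N being loaded
    fb.swap()                                  # frame N becomes the display frame
    print(fb.read_bitplane("R0"))              # -> [10, 6]
```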
  • The order in which the sub-image data sets are output, referred to as the "sub-frame data set output sequence," and the time at which the memory control module 2020 begins outputting each sub-image data set is controlled, at least in part, by data stored in one of the alternate schedule table stores 2026, 2027, 2028, and 2029. Each of the schedule table stores 2026-2029 stores at least one timing value associated with each sub-frame data set, an identifier indicating where the sub-image data set is stored in the frame buffer 2022, and illumination data indicating the color or colors associated with the sub-image data set. In some implementations, the schedule table stores 2026-2029 also store intensity values indicating the intensity with which the corresponding lamp or lamps should be illuminated for a particular sub-frame data set.
  • In one implementation, the timing values stored in the schedule table stores 2026-2029 determine when to begin addressing the array of light modulators with the sub-frame data set. In another implementation, the timing value is used to determine when a lamp or lamps associated with the sub-frame data set should be illuminated and/or extinguished. In one implementation, the timing value is a number of clock cycles that have passed, for example, since the initiation of the display of an image frame, or since the last addressing or lamp event was triggered. Alternatively, the timing value may be an actual time value, stored in microseconds or milliseconds.
  • The distinct timing values stored in the various schedule table stores 2026-2029 provide a choice between distinct imaging algorithms, for instance between display modes which differ in frame rate, lamp brightness, achievable grayscale precision, or the saturation of displayed colors. The storage of multiple schedule tables therefore provides flexibility in the method of displaying images, a flexibility which is especially advantageous when it provides a method for saving power in portable electronics. The controller 2000 includes 4 unique schedule tables stored in memory. In other implementations the number of distinct schedules that are stored may be 2, 3, or any other number. For instance, it may be advantageous to store schedule parameters for as many as 100 unique schedule table stores.
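
As a hedged sketch of what one entry of a schedule table store might contain and how a switch control module might select among several stored tables, the example below uses assumed field names (timing value, bitplane location, color, intensity) and invented table names; it is not the patent's data format.

```python
# Illustrative sketch of schedule table entries and table selection.
# Field names, table names, and example values are assumptions.
from dataclasses import dataclass

@dataclass
class ScheduleEntry:
    timing_value: int        # e.g. clock cycles after the vsync pulse
    bitplane_location: str   # identifier of the bitplane in the frame buffer
    color: str               # lamp color(s) associated with this sub-frame
    intensity: float         # lamp intensity for this sub-frame

# Alternative tables, e.g. full-color, low-power text, outdoor, reduced color.
schedule_tables = {
    "full_color_8bit": [ScheduleEntry(0, "R7", "red", 1.0),
                        ScheduleEntry(2400, "R6", "red", 1.0)],
    "low_power_text":  [ScheduleEntry(0, "K0", "white", 0.3)],
}

def select_table(mode: str):
    """Switch control: pick the schedule table the timing control will follow."""
    return schedule_tables[mode]

if __name__ == "__main__":
    active = select_table("low_power_text")
    print(active[0])   # the single low-power entry in this toy example
```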
  • The multiple schedule tables stored in controller 2000 allow for the exploitation of trade-offs between image quality and power consumption. For some images, which do not require the display of deep, saturated colors, it is possible to rely on white lamps or mixed colors to provide brightness, especially as these color schemes can be more power efficient. Similarly, not all images or applications require the display of 16 million colors. A palette of 250,000 colors (6 bits per color) may be sufficient for some images or applications. For other images or applications, a color range limited to only 4,000 colors (4 bits per color) or 500 colors (3 bits per color) may be sufficient. It is advantageous to include electronics in a direct-view MEMS display controller so as to provide display flexibility to take advantage of power saving opportunities.
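
The palette sizes quoted above follow from the per-channel bit depth: with b bits per color over three color channels, the number of displayable colors is 2^(3b). The snippet below simply performs that arithmetic.

```python
# Palette size for b bits per color channel over three channels: 2**(3*b).
for bits in (8, 6, 4, 3):
    print(bits, "bits per color ->", 2 ** (3 * bits), "colors")
# 8 -> 16,777,216 (~16 million); 6 -> 262,144 (~250,000);
# 4 -> 4,096 (~4,000); 3 -> 512 (~500)
```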
  • Many of the variables affecting both image quality and power consumption in a MEMS direct-view display are governed by the timing and bitplane parameters which are stored in the schedule table stores 2026-2029. Together with the sequencing commands stored within the timing control module 2024, these parameters allow the controller 2000 to output variations on lamp intensities, frame rates, different palettes of colors (based on the mixing of lamp colors within a subfield), or different grey scale bit depths (based on the number of bitplanes employed to display an image frame).
  • In one implementation, each schedule table corresponds to a different display process. For example, schedule table 2026 corresponds to a display process capable of generating approximately 16 million colors (8 bits per color) with high color saturation. Schedule table 2027 corresponds to a display process appropriate only for black and white (e.g. text) images with a frame rate, or refresh rate, that is very low, e.g. less than 20 frames per second. Schedule table 2028 corresponds to a display process suited for outdoor viewing of color or video images where brightness is at a premium but where battery power must nevertheless be conserved. Schedule table 2029 corresponds to a display process providing a restricted choice of colors (e.g. 4,000) which would provide an easy to read and low-power display appropriate for most icon or text-type information with the exception of video. Of the display processes represented by the schedule table stores 2026-2029, the display process represented by schedule table 2026 requires the most power, whereas the display process represented by schedule table 2027 requires the least. The display processes corresponding to schedule tables 2028 and 2029 require power usage somewhere in between that required by the other display processes.
  • In the controller 2000, for any given image frame, the timing control module 2024 derives its display process parameters or constants from only one of the four possible sequence tables. A switch control module 2040 governs which of the sequence tables is referenced by the timing control module 2024. This switch control module 2040 could be a user-controlled switch, or it could be responsive to commands from an external processor, contained either within the same housing as the MEMS display device or external to it (referred to as an "external module"). The external module, for instance, can decide whether the information to be displayed is text or video, or whether the information displayed should be colored or strictly black and white. In some embodiments the switch commands can originate from the input processing module 2018. Whether in response to an instruction from the user or an external module, the switch control module 2040 selects a schedule table store that corresponds to the desired display process or display parameters.
  • FIG. 21 is a flow chart of a process of displaying images 2100 (the "display process 2100") suitable for use by a direct-view display incorporating a controller such as the controller 2000 of FIG. 20, according to an illustrative embodiment of the invention. Referring to FIGS. 20 and 21, the display process 2100 begins with the selection of an appropriate schedule table for use in displaying an image frame (step 2102). For example, a selection is made between schedule table stores 2026-2029. This selection can be made by the input processing module 2018, by a module in another part of the device in which the direct-view MEMS display is incorporated, or directly by the user of the device. When the selection amongst schedule tables is made by the input processing module or an external module, it can be made in response to the type of image to be displayed (for instance, video or still images require finer levels of gray scale contrast than an image, such as a text image, which needs only a limited number of contrast levels). Another factor that might influence the selection of an imaging mode or schedule table, whether selected directly by a user or automatically by the external module, is the ambient lighting around the device. For example, one brightness may be preferred when the display is viewed indoors or in an office environment, and another when it is viewed outdoors, where the display must compete with bright sunlight. Brighter displays are more likely to be viewable in direct sunlight, but brighter displays consume greater amounts of power. The external module, when selecting schedule tables on the basis of ambient light, can make that decision in response to signals it receives through an incorporated photodetector. Another factor that might influence the selection of an imaging mode or schedule table, whether selected directly by a user or automatically by the external module, is the level of stored energy in a battery powering the device in which the display is incorporated. As a battery nears the end of its stored charge, it may be preferable to switch to an imaging mode which consumes less power, to extend the life of the battery.
  • The selection step 2102 can be accomplished by means of a mechanical relay, which changes the reference within the timing control module 2024 to only one of the four schedule table stores 2026-2029. Alternately, the selection step 2102 can be accomplished by the receipt of an address code which indicates the location of one of the schedule table stores 2026-2029. The timing control module 2024 then utilizes the selection address, as received through the switch control module 2040, to indicate the correct memory source for its schedule parameters. Alternately the timing control module 2024 can make reference to a schedule table stored in memory by means of a multiplexer circuit, similar to a memory control circuit. When a selection code is entered into the controller 2000 by means of the switch control module 2040, the multiplexer is reset so that schedule table parameters requested by the timing control module 2024 are routed to the correct address in memory.
  • The process 2100 then continues with the receipt of the data for an image frame. The data is received by the input processing module 2018 by means of the input line 2017 at step 2104. The input processing module then derives a plurality of sub-frame data sets, for instance bitplanes, and stores them in the frame buffer 2022 (step 2106). After storage of the sub-frame data sets the timing control module 2024 proceeds to display each of the sub-frame data sets, at step 2108, in their proper order and according to timing values stored in the selected schedule table.
  • The process 2100 then continues iteratively with receipt of subsequent frames of image data. The sequence of receiving image data at step 2104 through the display of the sub-frame data sets at step 2108 can be repeated many times, where each image frame to be displayed is governed by the same selected schedule table. This process can continue until the selection of a new schedule table is made at a later time, e.g. by repeating the step 2102. Alternatively, the input processing module 2018 may select a schedule table for each image frame received, or it may periodically examine the incoming image data to determine if a change in schedule table is appropriate.
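
As a hedged sketch of the overall flow of the display process 2100 (select a schedule table, then repeatedly receive a frame, derive bitplanes, and display them according to the selected table), the code below uses placeholder functions standing in for the controller modules; the bodies and names are assumptions, not the patent's implementation.

```python
# Illustrative sketch of display process 2100 (FIG. 21). Helper functions are
# simplified placeholders for the controller modules described above.

def select_schedule_table(mode):                 # step 2102
    return {"mode": mode}

def receive_image_frame(source):                 # step 2104
    return next(source, None)

def derive_bitplanes(frame):                     # step 2106
    return [f"bitplane_{i}" for i in range(8)]   # e.g. 8 bitplanes per color

def display_subframes(bitplanes, table):         # step 2108
    for bp in bitplanes:
        pass  # address the array and fire lamps per the table's timing values

def run(source, mode="full_color"):
    table = select_schedule_table(mode)          # governs every following frame
    while (frame := receive_image_frame(source)) is not None:
        display_subframes(derive_bitplanes(frame), table)

if __name__ == "__main__":
    run(iter(["frame0", "frame1"]))
```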
  • FIG. 22 is a block diagram of a controller 2200, suitable for inclusion in a MEMS direct-view display, according to an illustrative embodiment of the invention. For example, the controller 2200 may replace the controller 704 of the MEMS direct-view display 700. The controller 2200 receives an image signal 2217 from an external source and outputs both data and control signals for controlling the drivers, light modulators, and lamps of the display in which the controller is included.
  • The controller 2200 includes an input processing module 2218, a memory control module 2220, a frame buffer 2222, a timing control module 2224, and a schedule table store 2226. In contrast to controllers 704 and 2000, the controller 2200 includes a sequence parameter calculation module 2228. The sequence parameter calculation module receives monitoring data from the input processing module 2218 and outputs changes to the sequencing parameters stored within the schedule table store 2226, and in some implementations, changes to the bitplanes stored for a given image frame. In some implementations, the components 2218, 2220, 2222, 2224, 2226, and 2228 may be provided as distinct chips or circuits which are connected together by means of circuit boards and/or cables. In other implementations, several of these components can be designed together into a single semiconductor chip such that their boundaries are nearly indistinguishable except by function.
  • The input processing module 2218 receives the image signal 2217 and processes the data encoded therein into a format suitable for displaying via the array of light modulators. The input processing module 2218 takes the data encoding each image frame and converts it into a series of sub-frame data sets. A sub-frame data set includes information about the desired states of modulators in multiple rows and multiple columns of the array of light modulators. The number and content of sub-frame data sets used to display an image frame depend on the grayscale technique employed by the controller 2200. For example, the number and content of sub-frame data sets needed to form an image frame using a coded time-division gray scale technique differ from those used to display an image frame using a non-coded time-division gray scale technique. While in various embodiments the input processing module 2218 may convert the image signal 2217 into non-coded sub-frame data sets, ternary coded sub-frame data sets, or another form of coded sub-frame data set, preferably, the input processing module 2218 converts the image signal 2217 into bitplanes, as described above in relation to FIGS. 6A-6C.
  • The input processing module 2218 outputs the sub-frame data sets to the memory control module 2220. The memory control module 2220 then stores the sub-frame data sets in the frame buffer 2222. The memory control module 2220, in one implementation, stores the sub-frame data set in a predetermined memory location based on the color and significance in a coding scheme of the sub-frame data set. In other implementations, the memory control module 2220 stores the sub-frame data set in a dynamically determined memory location and stores that location in a lookup table for later identification. In one particular implementation, the frame buffer 2222 is configured for the storage of bitplanes.
  • The memory control module 2220 is also responsible for, upon instruction from the timing control module 2224, retrieving bitplanes from the frame buffer 2222 and outputting them to the data drivers 2208. The data drivers 2208 load the data output by the memory control module 2220 into the light modulators of the array of light modulators. The memory control module 2220 outputs the data in the sub-image data sets one row at a time. In one implementation, the frame buffer 2222 includes two buffers, whose roles alternate. While the memory control module 2220 stores newly generated bitplanes corresponding to a new image frame in one buffer, it extracts bitplanes corresponding to the previously received image frame from the other buffer for output to the array of light modulators. Both buffer memories can reside within the same circuit, distinguished only by address.
  • The order in which the sub-image data sets are output, referred to as the “sub-frame data set output sequence,” and the time at which the memory control module 2220 begins outputting each sub-image data set is controlled, at least in part, by data stored in the schedule table store 2226. The schedule table store 2226 stores at least one timing value associated with each sub-frame data set, an identifier indicating where the sub-image data set is stored in the frame buffer 2222, and illumination data indicating the color or colors associated with the sub-image data set. In some implementations, the schedule table store 2226 also stores intensity values indicating the intensity with which the corresponding lamp or lamps should be illuminated for a particular sub-frame data set.
  • In one implementation, the timing values stored in the schedule table store 2226 determine when to begin addressing the array of light modulators with each sub-frame data set. In another implementation, the timing value is used to determine when a lamp or lamps associated with the sub-frame data set should be illuminated and/or extinguished. In one implementation, the timing value is a number of clock cycles that have passed, for example, since the initiation of the display of an image frame, or since the last addressing or lamp event was triggered. Alternatively, the timing value may be an actual time value, stored in microseconds or milliseconds.
  • Controller 2200 includes a re-configurable schedule table store 2226. As described above with respect to controllers 704 and 2000, the schedule table store 2226 provides a flexible or programmable component to the controller. A programming link, such as the interface 730, allowed the schedule table store 726 within controller 704 to be altered or reprogrammed according to different lamp intensities, frame rates, color schemes, or grey scale bit depths. Similar alterations to the display process are possible for the schedule table store 2226 within controller 2200, except that these variations now occur automatically in response to the requirements of individual image frames, based on characteristics of those image frames detected by the input processing module 2218.
  • Based on the data contained within the image frame, it is often possible to reduce power consumption in the display by controlling variables such as lamp brightness, color saturation, and bit depth without any change or distortion perceptible in the image. This is because many images do not require full brightness from the lamps, or they do not require the deepest or most saturated of colors, or they require only a limited number of gray scale levels. Controller 2200 is configured to sense the display requirements for an image frame based on the data within the image frame and to adapt the display algorithm by means of changes to the schedule table store 2226.
  • A method by which the controller 2200 can adapt the display characteristics based on the content of incoming image data is shown in FIG. 23 as display method 2300. Display method 2300 is suitable for use by a MEMS direct-view display such as one incorporating the controller 2200 of FIG. 22, according to an illustrative embodiment of the invention. Referring to FIGS. 22 and 23, the display method 2300 begins with the receipt of the data for an image frame at step 2302. The data is received by the input processing module 2218 by means of the input line 2217. The input processing module 2218 derives a plurality of sub-frame data sets, for instance bitplanes, from the data and stores the bitplanes in the frame buffer 2222. Additionally, however, at step 2304, prior to the storage of the bitplanes in step 2306, the input processing module monitors and analyzes the content of the incoming image to look for characteristics which might affect the display of that image. For instance, at step 2304 the input processing module might make note of the pixel or pixels with the most saturated colors in the image frame, i.e., pixels that call for significant brightness from one color without being balanced, diluted, or desaturated by illumination from the other color lamps in the same image frame. In another example of input data monitoring, the input processing module 2218 might make note of the pixel or pixels with the brightest values required of each of the lamps, regardless of color saturation.
  • After a complete image frame has been received and stored in the frame buffer 2222, the method 2300 proceeds to step 2308. At step 2308 the sequence parameter calculation module 2228 assesses the data collected at step 2304 and identifies changes to the display process that can be implemented by adjusting values in the sequence table 2226. The changes to the sequence table 2226 are then effected at step 2310 by re-writing certain of the parameters stored within table 2226. Finally, at step 2312, the method 2300 proceeds to the display of sub-images according to the ordering parameters and timing values that have been re-programmed within the schedule table 2226.
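
The sketch below walks through the per-frame flow of method 2300 under simplifying assumptions: monitor the incoming frame (step 2304), derive adjustments (step 2308), rewrite the schedule parameters (step 2310), then display (step 2312). The statistics gathered, the parameter names, and the scaling rule are all illustrative stand-ins, not the patent's algorithm.

```python
# Illustrative sketch of adaptive display method 2300 (FIG. 23).
# Frame statistics and schedule parameters are simplified stand-ins.

def monitor_frame(frame):                               # step 2304
    """Collect per-frame statistics, e.g. peak per-channel brightness."""
    return {"max_brightness": max(max(ch) for ch in frame)}

def calc_sequence_parameters(stats):                    # step 2308
    """Derive schedule changes, e.g. scale lamp intensity to the peak need."""
    return {"lamp_intensity": stats["max_brightness"] / 255.0}

def rewrite_schedule(schedule, changes):                # step 2310
    schedule.update(changes)

def display_frame(frame, schedule):                     # step 2312
    pass  # output bitplanes according to the (re-programmed) schedule

if __name__ == "__main__":
    schedule = {"lamp_intensity": 1.0}
    frame = ([10, 200, 90], [5, 150, 60], [0, 30, 20])  # toy R, G, B channels
    stats = monitor_frame(frame)
    rewrite_schedule(schedule, calc_sequence_parameters(stats))
    display_frame(frame, schedule)
    print(schedule)  # lamp intensity reduced to ~0.78 of maximum
```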
  • The method 2300 then continues iteratively with receipt of subsequent frames of image data. As indicated in the method 800, the processes of receiving (step 2302) and displaying image data (step 2312) may run in parallel, with one image being displayed from the data of one buffer memory according to the re-programmed schedule table at the same time that new sub-frame data sets are being analyzed and stored into a parallel buffer memory. The sequence of receiving image data at step 2302 through the display of the sub-frame data sets at step 2312 can be repeated indefinitely, where each image frame to be displayed is governed by a schedule table which is re-programmed in response to the incoming data.
  • It is instructive to consider some examples of how the method 2300 can reduce power consumption by adjusting the display characteristics in the schedule table store 2226 in response to data collected at step 2304. These examples are referred to as adaptive power schemes.
  • In one scheme for adaptive power that is responsive to the incoming image data, the data monitoring at step 2304 detects the pixels in each frame with the most saturated colors. If it is determined that the most saturated color required for a frame is only 82% of the saturation available from the colored lamps, then it is possible to remix the colors that are provided to the bitplanes so that power can be saved while still providing the 82% saturation level required by the image. By adding, for instance, subordinate red, green, or blue light to the primary color in each frame, power can be saved in the display. In this example the sequence parameter calculation module 2228 would receive a signal from the input processing module 2218 indicating the degree of color mixing which is allowed. Before the frame is displayed, the sequence parameter calculation module re-writes the intensity parameters in the sequence table 2226 which determine color mixing at each bitplane, so that colors are correspondingly desaturated and power is saved.
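
As a hedged numeric sketch of the saturation-based remixing idea, the code below measures an HSV-style saturation per pixel, finds the frame maximum, and derives an allowed mixing fraction. The specific saturation metric and the simple "1 minus maximum saturation" mixing rule are illustrative simplifications chosen for this example, not the patent's algorithm.

```python
# Illustrative sketch of the saturation-based remixing idea. Saturation is the
# HSV-style (max - min) / max per pixel; the mix-fraction rule is a
# simplification for illustration only.

def pixel_saturation(r, g, b):
    hi, lo = max(r, g, b), min(r, g, b)
    return 0.0 if hi == 0 else (hi - lo) / hi

def frame_max_saturation(pixels):
    return max(pixel_saturation(*p) for p in pixels)

def allowed_mix_fraction(max_saturation):
    """If no pixel needs full saturation, some complementary-lamp light can be
    mixed in (e.g. from a more efficient white lamp) without visible loss."""
    return 1.0 - max_saturation

if __name__ == "__main__":
    pixels = [(255, 46, 46), (120, 120, 120), (30, 200, 180)]
    s = frame_max_saturation(pixels)
    print(round(s, 2))                        # 0.85 for this toy frame
    print(round(allowed_mix_fraction(s), 2))  # 0.15 of mixed light allowed
```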
  • In another adaptive power scheme, a process is provided within the sequence parameter calculation module 2228 which determines whether the image consists solely of text, or text plus symbols, as opposed to video or a photographic image. The sequence parameter calculation module 2228 then re-writes the parameters in the sequence table accordingly. Text images, especially black and white text images, do not need to be refreshed as often as video images and typically require only a limited number of different colors or gray shades. The sequence parameter calculator 2228 can therefore adjust both the frame rate as well as the number of sub-images to be displayed for each image frame. Text images require fewer sub-images in the display process than photographic images.
  • In still another adaptive power scheme, the monitoring function at step 2304 analyzes or searches for the maximum intensity attributed to each color in each pixel. If an image is to be displayed that requires no more than 65% of the brightness from any of the lamps for any of the pixels, then in some cases it is possible to display that image correctly by reducing the average intensity of the lamps accordingly. The lamp intensity values within the schedule table store 2226 can be reduced by a set of commands within the sequence parameter calculation module 2228.
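
To make the peak-brightness scheme concrete, the sketch below finds the maximum value required of each lamp color in a frame, scales the stored lamp intensities down to those peaks, and rescales pixel codes so that the displayed levels are preserved. The helper names and the compensation rule are assumptions for illustration.

```python
# Illustrative sketch of the peak-brightness adaptive scheme: lamp intensity
# in the schedule table is lowered to the largest value any pixel actually
# needs, and pixel codes are rescaled to keep displayed levels unchanged.

def channel_peaks(frame_rgb):
    """Maximum 8-bit value required of each lamp color in the frame."""
    return tuple(max(p[i] for p in frame_rgb) for i in range(3))

def scaled_lamp_intensities(frame_rgb, full_scale=255):
    return tuple(peak / full_scale for peak in channel_peaks(frame_rgb))

def rescale_pixel(pixel, intensities, full_scale=255):
    """Compensating code so that code * lamp_intensity matches the original."""
    return tuple(
        0 if s == 0 else min(full_scale, round(v / s))
        for v, s in zip(pixel, intensities)
    )

if __name__ == "__main__":
    frame = [(100, 160, 40), (166, 80, 20)]
    lamps = scaled_lamp_intensities(frame)       # approx (0.65, 0.63, 0.16)
    print(lamps)
    print(rescale_pixel((100, 160, 40), lamps))  # codes stretched toward 255
```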
  • APPENDIX 1
  • Appendix 1 presents a timing sequence 3000, expressed by means of Schedule Table 9, which illustrates one embodiment of the timing sequences of this invention.
  • The timing sequence 3000 of Appendix 1 is appropriate to the display of image information at a 30 Hz frame rate (e.g. 33.3 milliseconds between vsync pulses); it includes the display of 7 bits for each of the colors red, green, and blue. The timing sequence 3000 is constrained by the following parameters related to setting of modulator states in the array:
      • 240 microseconds required for loading a complete bitplane to the array
      • 120 microseconds required for loading bitplanes to only a single bank (odd or even) of rows
      • 100 microseconds required for global actuation
  • The schedule table for timing sequence 3000 includes the following information, required by the timing control module 724 for display of the sub-images:
      • Subfield number
      • Bitplane interval (elapsed time between global actuation pulses)
      • Alphanumeric code for memory locations of the bitplanes, separated by their assigned banks (e.g. R0, R1, R2, . . . R6)
      • Illumination intensity
        The schedule table for timing sequence 3000 does not distinguish between addressing times and illumination times. Instead the logic within the timing control module 724 assumes that each bitplane interval begins immediately after completion of a global actuation event. In the first action of the sequence after global actuation the lamps are illuminated according to the intensity values listed in Table 9.
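
As a hedged sketch of how the timing control logic might interpret one subfield row of a schedule like Table 9 below, the code assumes each row names the bitplanes to load into the odd and even banks ("*" meaning no load), the lamp intensity, and the interval until the next global actuation. The parsing, field names, and example rows are assumptions based on the table's columns.

```python
# Illustrative sketch of interpreting a Table-9-style schedule row: after each
# global actuation the lamps fire at the listed intensity for the listed
# interval, and the named bitplanes are loaded into the odd/even banks.
# Field names and example rows are assumptions.

def run_subfield(field):
    number, width, intensity, interval_ms, odd_bp, even_bp = field
    print(f"subfield {number}: illuminate at intensity {intensity} "
          f"for {interval_ms} ms (width {width})")
    for bank, bitplane in (("odd", odd_bp), ("even", even_bp)):
        if bitplane != "*":
            print(f"  load bitplane {bitplane} into {bank} rows")
    # ...then issue the global actuation pulse that starts the next subfield.

if __name__ == "__main__":
    example_rows = [
        (1, 1, 1.0, 0.1301, "R2", "*"),
        (4, 1, 0.5, 0.1301, "G1", "G6"),
    ]
    for row in example_rows:
        run_subfield(row)
```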
  • The timing sequence 3000 includes the following features as described previously. Similar to timing sequence 1200, the display that employs timing sequence 3000 includes the capability of global actuation. The display that employs timing sequence 3000 includes two independent global actuation circuits, one for each of the odd and even banks respectively. The timing sequence 3000 includes a scheme for control of the lamps, similar to that described in timing sequence 1450, in which both pulse period and pulse intensity are used to express the illumination value. The timing sequence 3000 is capable of mixing colors, as in timing sequence 1700, although in this embodiment only one lamp is illuminated at one time.
  • The timing sequence 3000 includes bank-wise addressing. The lesser bitplanes, e.g. R0, R1, R2, and R3 are always displayed successively within a given bank, e.g. the odd rows, and this sequence of lesser bitplanes is illuminated at the same time that the most significant bit (e.g. R6) is illuminated in the other bank (e.g. in the even rows).
  • The timing sequence 3000 splits the most significant bits (e.g. R6, G6, and B6) into four separate but equally timed sub-images. The timing sequence alternates colors frequently, with a maximum period between colors of 1.38 milliseconds. The time between the expression of most significant bits is not always equal between successive pairs of most significant bits, but in no case is that period between most significant bits greater than 4.16 milliseconds.
  • TABLE 9
    Schedule Table 9
    Field Number | Illumination Width | Illumination Intensity | Interval Time Width (ms) | Odd Load bitplane | Even Load bitplane
    0 R1 R6
    1 1 1 0.1301 R2 *
    2 2 1 0.2602 R3 *
    3 4 1 0.5203 R0 *
    4 1 0.5 0.1301 G1 G6
    5 1 1 0.1301 G2 *
    6 2 1 0.2602 G3 *
    7 4 1 0.5203 G0 *
    8 1 0.5 0.1301 B1 B6
    9 1 1 0.1301 B2 *
    10 2 1 0.2602 B3 *
    11 4 1 0.5203 B0 *
    12 1 0.5 0.1301 R6 R6
    13 8 1 1.0406 G6 G6
    14 8 1 1.0406 B6 B6
    15 8 1 1.0406 R5 R5
    16 8 1 1.0406 G5 G5
    17 8 1 1.0406 B5 B5
    18 8 1 1.0406 R4 R6
    19 8 1 1.0406 G4 G6
    20 8 1 1.0406 B4 B6
    21 8 1 1.0406 R6 R1
    22 1 1 0.1301 * R2
    23 2 1 0.2602 * R3
    24 4 1 0.5203 * R0
    25 1 0.5 0.1301 G6 G1
    26 1 1 0.1301 * G2
    27 2 1 0.2602 * G3
    28 4 1 0.5203 * G0
    29 1 0.5 0.1301 B6 B1
    30 1 1 0.1301 * B2
    31 2 1 0.2602 * B3
    32 4 1 0.5203 * B0
    33 1 0.5 0.1301 R6 R6
    34 8 1 1.0406 G6 G6
    35 8 1 1.0406 B6 B6
    36 8 1 1.0406 R5 R5
    37 8 1 1.0406 G5 G5
    38 8 1 1.0406 B5 B5
    39 8 1 1.0406 R6 R4
    40 8 1 1.0406 G6 G4
    41 8 1 1.0406 B6 B4
    42 8 1 1.0406 R1 R6

Claims (15)

1. An electromechanical device comprising:
an array of light blocking elements having at least two positions;
at least one light source;
a controller for controlling the position of the light blocking elements, and for controlling illumination of the at least one light source, and for
obtaining from a memory a plurality of sub-frame data sets corresponding to an image frame; and
processing the sub-frame data sets according to an output sequence to drive the light blocking elements into the positions indicated in the sub-frame data sets, and illuminate the at least one light source according to the sub-frame data sets to form an image; and
a sequence parameter calculation module for determining changes to the output sequence to alter the image.
2. The electromechanical device of claim 1, wherein the controller identifies data within a respective image frame and the sequence parameter calculation module determines changes to the output sequence based on the identified data.
3. The electromechanical device of claim 2, wherein the sequence parameter calculation module adjusts a frame rate.
4. The electromechanical device of claim 2, wherein the sequence parameter calculation module adjusts a number of sub-images to be displayed for the image frame.
5. The electromechanical device of claim 1, wherein the sequence parameter calculation module changes a timing value associated with a light blocking element movement event.
6. The electromechanical device of claim 1, wherein the sequence parameter calculation module changes a timing value associated with a light source illumination event.
7. The electromechanical device of claim 1, wherein the sequence parameter calculation module changes an intensity value for the at least one light source.
8. The electromechanical device of claim 1, wherein the controller identifies a characteristic of a respective image frame and the sequence parameter calculator changes at least one of the plurality of sub-frame data sets based on the identified characteristic.
9. The electromechanical device of claim 1, comprising a memory for storing the output sequence.
10. The electromechanical device of claim 1, wherein the light blocking elements are electromechanical shutters formed on a transparent substrate, the electromechanical shutters moving transverse to the transparent substrate to modulate light.
11. A method for displaying an image, comprising:
obtaining from memory a plurality of sub-frame data sets corresponding to an image frame, the respective sub-frame data sets indicating positions of light blocking elements in an array;
processing the plurality of sub-frame data sets according to an output sequence to drive the light blocking elements into the positions indicated in the respective sub-frame data sets; and
changing the output sequence to alter the displayed image.
12. The method of claim 11, comprising identifying data within the image frame and changing the output sequence based on the identified data.
13. The method of claim 11, comprising changing timing values stored in relation to at least one light blocking element movement event included in the output sequence.
14. The method of claim 11, comprising changing timing values stored in relation to at least one light source illumination event included in the output sequence.
15. The method of claim 11, comprising changing at least one light source intensity value stored in relation to at least one light source illumination event included in the output sequence.
US13/597,983 2005-02-23 2012-08-29 Direct-view mems display devices and methods for generating images thereon Abandoned US20120320113A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/597,983 US20120320113A1 (en) 2005-02-23 2012-08-29 Direct-view mems display devices and methods for generating images thereon

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US65582705P 2005-02-23 2005-02-23
US67605305P 2005-04-29 2005-04-29
US75190905P 2005-12-19 2005-12-19
US11/361,294 US20060209012A1 (en) 2005-02-23 2006-02-23 Devices having MEMS displays
US77636706P 2006-02-24 2006-02-24
US11/643,042 US20070205969A1 (en) 2005-02-23 2006-12-19 Direct-view MEMS display devices and methods for generating images thereon
US13/597,983 US20120320113A1 (en) 2005-02-23 2012-08-29 Direct-view mems display devices and methods for generating images thereon

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/643,042 Continuation US20070205969A1 (en) 2005-02-23 2006-12-19 Direct-view MEMS display devices and methods for generating images thereon

Publications (1)

Publication Number Publication Date
US20120320113A1 true US20120320113A1 (en) 2012-12-20

Family

ID=38471024

Family Applications (5)

Application Number Title Priority Date Filing Date
US11/643,042 Abandoned US20070205969A1 (en) 2005-02-23 2006-12-19 Direct-view MEMS display devices and methods for generating images thereon
US13/597,983 Abandoned US20120320113A1 (en) 2005-02-23 2012-08-29 Direct-view mems display devices and methods for generating images thereon
US13/597,968 Abandoned US20120320112A1 (en) 2005-02-23 2012-08-29 Direct-view mems display devices and methods for generating images thereon
US13/597,943 Expired - Fee Related US9135868B2 (en) 2005-02-23 2012-08-29 Direct-view MEMS display devices and methods for generating images thereon
US15/078,725 Abandoned US20160275876A1 (en) 2005-02-23 2016-03-23 Direct-view mems display devices and methods for generating images thereon

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/643,042 Abandoned US20070205969A1 (en) 2005-02-23 2006-12-19 Direct-view MEMS display devices and methods for generating images thereon

Family Applications After (3)

Application Number Title Priority Date Filing Date
US13/597,968 Abandoned US20120320112A1 (en) 2005-02-23 2012-08-29 Direct-view mems display devices and methods for generating images thereon
US13/597,943 Expired - Fee Related US9135868B2 (en) 2005-02-23 2012-08-29 Direct-view MEMS display devices and methods for generating images thereon
US15/078,725 Abandoned US20160275876A1 (en) 2005-02-23 2016-03-23 Direct-view mems display devices and methods for generating images thereon

Country Status (1)

Country Link
US (5) US20070205969A1 (en)

Family Cites Families (779)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3864582A (en) 1973-01-22 1975-02-04 Timex Corp Mosfet dynamic circuit
US4074253A (en) * 1975-11-19 1978-02-14 Kenneth E. Macklin Novel bistable light modulators and display element and arrays therefrom
US4067043A (en) * 1976-01-21 1978-01-03 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Optical conversion method
CH633902A5 (en) 1980-03-11 1982-12-31 Centre Electron Horloger LIGHT MODULATION DEVICE.
US4421381A (en) 1980-04-04 1983-12-20 Yokogawa Hokushin Electric Corp. Mechanical vibrating element
JPS5762028A (en) 1980-10-01 1982-04-14 Hitachi Ltd Manufacture of liquid crystal display element
JPS5828291Y2 (en) 1981-02-02 1983-06-20 株式会社 小林記録紙製造所 optically readable form
US4563836A (en) * 1981-04-06 1986-01-14 American Cyanamid Co. Insect feeding station
CH641315B (en) * 1981-07-02 Centre Electron Horloger MINIATURE SHUTTER DISPLAY DEVICE.
JPS5774730A (en) 1981-08-24 1982-05-11 Copal Co Ltd Curtain operating device for focal plane shutter
US4559535A (en) 1982-07-12 1985-12-17 Sigmatron Nova, Inc. System for displaying information with multiple shades of a color on a thin-film EL matrix display panel
JPS5933077U (en) 1982-08-23 1984-02-29 株式会社小糸製作所 LCD display
US4582396A (en) 1983-05-09 1986-04-15 Tektronix, Inc. Field sequential color display system using optical retardation
JPS6079331A (en) 1983-10-07 1985-05-07 Citizen Watch Co Ltd Manufacture of color liquid crystal display device
US5061049A (en) 1984-08-31 1991-10-29 Texas Instruments Incorporated Spatial light modulator and method
US5096279A (en) 1984-08-31 1992-03-17 Texas Instruments Incorporated Spatial light modulator and method
US4744640A (en) 1985-08-29 1988-05-17 Motorola Inc. PLZT multi-shutter color electrode pattern
US4728936A (en) 1986-04-11 1988-03-01 Adt, Inc. Control and display system
US5835255A (en) 1986-04-23 1998-11-10 Etalon, Inc. Visible spectrum modulator arrays
JPS62275230A (en) 1986-05-23 1987-11-30 Nippon Telegr & Teleph Corp <Ntt> Optical gate matrix switch
GB8728433D0 (en) 1987-12-04 1988-01-13 Emi Plc Thorn Display device
US4907132A (en) 1988-03-22 1990-03-06 Lumitex, Inc. Light emitting panel assemblies and method of making same
US4991941A (en) * 1988-06-13 1991-02-12 Kaiser Aerospace & Electronics Corporation Method and apparatus for multi-color display
US5042900A (en) 1988-09-12 1991-08-27 Lumitex, Inc. Connector assemblies for optical fiber light cables
CA1335889C (en) 1988-10-07 1995-06-13 Mahmoud A. Gawad Small profile luminaire having adjustable photometric distribution
US4958911A (en) 1988-10-19 1990-09-25 Jonand, Inc. Liquid crystal display module having housing of C-shaped cross section
US5986828A (en) 1988-11-01 1999-11-16 The United States Of America As Represented By The Secretary Of The Army Optical power limiter utilizing nonlinear refraction
EP0366847A3 (en) 1988-11-02 1991-01-09 Sportsoft Systems, Inc. Graphics display using biomorphs
US4889603A (en) 1988-12-09 1989-12-26 Copytele, Inc. Method of eliminating gas bubbles in an electrophoretic display
DE3842900C1 (en) 1988-12-16 1990-05-10 Krone Ag, 1000 Berlin, De
US5005108A (en) 1989-02-10 1991-04-02 Lumitex, Inc. Thin panel illuminator
US5025346A (en) 1989-02-17 1991-06-18 Regents Of The University Of California Laterally driven resonant microstructures
US5446479A (en) 1989-02-27 1995-08-29 Texas Instruments Incorporated Multi-dimensional array video processor system
EP0438614A1 (en) 1990-01-23 1991-07-31 Alternative Energy Research Center Inc. Information display apparatus and method
CH682523A5 (en) * 1990-04-20 1993-09-30 Suisse Electronique Microtech A modulation matrix addressed light.
US5142405A (en) 1990-06-29 1992-08-25 Texas Instruments Incorporated Bistable dmd addressing circuit and method
EP0467048B1 (en) * 1990-06-29 1995-09-20 Texas Instruments Incorporated Field-updated deformable mirror device
US5184248A (en) * 1990-07-16 1993-02-02 U.S. Philips Corporation Image projection apparatus
US5990990A (en) 1990-08-03 1999-11-23 Crabtree; Allen F. Three-dimensional display techniques, device, systems and method of presenting data in a volumetric format
US5319491A (en) 1990-08-10 1994-06-07 Continental Typographics, Inc. Optical display
US5062689A (en) 1990-08-21 1991-11-05 Koehler Dale R Electrostatically actuatable light modulating device
US5202950A (en) 1990-09-27 1993-04-13 Compaq Computer Corporation Backlighting system with faceted light pipes
US5050946A (en) 1990-09-27 1991-09-24 Compaq Computer Corporation Faceted light pipe
US5128787A (en) 1990-12-07 1992-07-07 At&T Bell Laboratories Lcd display with multifaceted back reflector
DE69122075T2 (en) 1991-01-16 1997-04-03 Lumitex Inc Thin plate lamp
US5233459A (en) 1991-03-06 1993-08-03 Massachusetts Institute Of Technology Electric display device
DE4110209C2 (en) * 1991-03-28 1993-11-18 Roland Man Druckmasch Device for adjusting a CNC-controlled grinding machine
CA2063744C (en) 1991-04-01 2002-10-08 Paul M. Urbanus Digital micromirror device architecture and timing for use in a pulse-width modulated display system
US5136751A (en) 1991-04-30 1992-08-11 Master Manufacturing Co. Wheel assembly
US5579035A (en) 1991-07-05 1996-11-26 Technomarket, L.P. Liquid crystal display module
JP3158667B2 (en) 1991-08-01 2001-04-23 セイコーエプソン株式会社 Method of manufacturing liquid crystal display element and method of reproducing liquid crystal display element
US5233385A (en) 1991-12-18 1993-08-03 Texas Instruments Incorporated White light enhanced color field sequential projection
US5245454A (en) 1991-12-31 1993-09-14 At&T Bell Laboratories Lcd display with microtextured back reflector and method for making same
JPH05188337A (en) 1992-01-09 1993-07-30 Minolta Camera Co Ltd Optical shutter array
KR960010845B1 (en) 1992-01-18 1996-08-09 제일모직 주식회사 Epoxy resin composition for encapsulating semiconductor element
US5198730A (en) 1992-01-29 1993-03-30 Vancil Bernard K Color display tube
JPH0579530U (en) * 1992-03-24 1993-10-29 日本ビクター株式会社 Display system optics
US5655832A (en) 1992-04-16 1997-08-12 Tir Technologies, Inc. Multiple wavelength light processor
JPH06174929A (en) 1992-05-15 1994-06-24 Fujitsu Ltd Backlight device and condenser
US5231559A (en) 1992-05-22 1993-07-27 Kalt Charles G Full color light modulating capacitor
US5499127A (en) 1992-05-25 1996-03-12 Sharp Kabushiki Kaisha Liquid crystal display device having a larger gap between the substrates in the display area than in the sealant area
DK69492D0 (en) 1992-05-26 1992-05-26 Purup Prepress As DEVICE FOR EXPOSURE OF A MEDIUM, DEVICE FOR POINT EXPOSURE OF A MEDIA, AND A DEVICE FOR HOLDING A MEDIA
NL194848C (en) 1992-06-01 2003-04-03 Samsung Electronics Co Ltd Liquid crystal indicator device.
US5568964A (en) 1992-07-10 1996-10-29 Lumitex, Inc. Fiber optic light emitting panel assemblies and methods of making such panel assemblies
US5359345A (en) 1992-08-05 1994-10-25 Cree Research, Inc. Shuttered and cycled light emitting diode display and method of producing the same
US5724062A (en) 1992-08-05 1998-03-03 Cree Research, Inc. High resolution, high brightness light emitting diode display and method of producing the same
US5319061A (en) 1992-08-07 1994-06-07 The Humphrey Chemical Co., Inc. Imparting moisture resistance to epoxies
US5493439A (en) * 1992-09-29 1996-02-20 Engle; Craig D. Enhanced surface deformation light modulator
US5339179A (en) 1992-10-01 1994-08-16 International Business Machines Corp. Edge-lit transflective non-emissive display with angled interface means on both sides of light conducting panel
US6008781A (en) 1992-10-22 1999-12-28 Board Of Regents Of The University Of Washington Virtual retinal display
US5596339A (en) * 1992-10-22 1997-01-21 University Of Washington Virtual retinal display with fiber optic point source
US5467104A (en) 1992-10-22 1995-11-14 Board Of Regents Of The University Of Washington Virtual retinal display
KR950010659B1 (en) 1992-11-10 1995-09-21 재단법인한국전자통신연구소 Micro light shutter and manufacturing method thereof
KR960001941B1 (en) 1992-11-10 1996-02-08 재단법인한국전자통신연구소 Plate display device
GB2272555A (en) 1992-11-11 1994-05-18 Sharp Kk Stereoscopic display using a light modulator
DE69330425T2 (en) 1992-11-27 2001-10-25 Yasuhiro Koike DEVICE FOR GUIDING SCATTERED LIGHT
JP3547160B2 (en) 1993-01-11 2004-07-28 テキサス インスツルメンツ インコーポレイテツド Spatial light modulator
US5528262A (en) 1993-01-21 1996-06-18 Fakespace, Inc. Method for line field-sequential color video display
JP2555922B2 (en) 1993-02-26 1996-11-20 日本電気株式会社 Electrostatically driven micro shutters and shutter arrays
US6674562B1 (en) * 1994-05-05 2004-01-06 Iridigm Display Corporation Interferometric modulation of radiation
US5810469A (en) 1993-03-26 1998-09-22 Weinreich; Steve Combination light concentrating and collimating device and light fixture and display screen employing the same
US5461411A (en) 1993-03-29 1995-10-24 Texas Instruments Incorporated Process and architecture for digital micromirror printer
US5477086A (en) 1993-04-30 1995-12-19 Lsi Logic Corporation Shaped, self-aligning micro-bump structures
GB2278480A (en) 1993-05-25 1994-11-30 Sharp Kk Optical apparatus
US5884872A (en) 1993-05-26 1999-03-23 The United States Of America As Represented By The Secretary Of The Navy Oscillating flap lift enhancement device
US5622612A (en) 1993-06-02 1997-04-22 Duracell Inc. Method of preparing current collectors for electrochemical cells
US5510824A (en) 1993-07-26 1996-04-23 Texas Instruments, Inc. Spatial light modulator array
US5552925A (en) 1993-09-07 1996-09-03 John M. Baker Electro-micro-mechanical shutters on transparent substrates
FR2709854B1 (en) 1993-09-07 1995-10-06 Sextant Avionique Visualization device with optimized colors.
US5559389A (en) 1993-09-08 1996-09-24 Silicon Video Corporation Electron-emitting devices having variously constituted electron-emissive elements, including cones or pedestals
US5564959A (en) 1993-09-08 1996-10-15 Silicon Video Corporation Use of charged-particle tracks in fabricating gated electron-emitting devices
US5440197A (en) 1993-10-05 1995-08-08 Tir Technologies, Inc. Backlighting apparatus for uniformly illuminating a display panel
US5526051A (en) 1993-10-27 1996-06-11 Texas Instruments Incorporated Digital television system
US5452024A (en) 1993-11-01 1995-09-19 Texas Instruments Incorporated DMD display system
US5894686A (en) * 1993-11-04 1999-04-20 Lumitex, Inc. Light distribution/information display systems
US5396350A (en) 1993-11-05 1995-03-07 Alliedsignal Inc. Backlighting apparatus employing an array of microprisms
US5517347A (en) 1993-12-01 1996-05-14 Texas Instruments Incorporated Direct view deformable mirror device
US5798746A (en) 1993-12-27 1998-08-25 Semiconductor Energy Laboratory Co., Ltd. Liquid crystal display device
JPH07212639A (en) 1994-01-25 1995-08-11 Sony Corp Electronic shutter device for television cameras
US5504389A (en) 1994-03-08 1996-04-02 Planar Systems, Inc. Black electrode TFEL display
US5729038A (en) 1995-12-15 1998-03-17 Harris Corporation Silicon-glass bonded wafers
US5465054A (en) 1994-04-08 1995-11-07 Vivid Semiconductor, Inc. High voltage CMOS logic using low voltage CMOS process
US5629784A (en) 1994-04-12 1997-05-13 Ois Optical Imaging Systems, Inc. Liquid crystal display with holographic diffuser and prism sheet on viewer side
JP3102259B2 (en) * 1994-04-21 2000-10-23 株式会社村田製作所 High voltage connector
US5491347A (en) * 1994-04-28 1996-02-13 Xerox Corporation Thin-film structure with dense array of binary control units for presenting images
US6040937A (en) 1994-05-05 2000-03-21 Etalon, Inc. Interferometric modulation
US6710908B2 (en) 1994-05-05 2004-03-23 Iridigm Display Corporation Controlling micro-electro-mechanical cavities
US6680792B2 (en) 1994-05-05 2004-01-20 Iridigm Display Corporation Interferometric modulation of radiation
US7460291B2 (en) 1994-05-05 2008-12-02 Idc, Llc Separable modulator
US7550794B2 (en) 2002-09-20 2009-06-23 Idc, Llc Micromechanical systems device comprising a displaceable electrode and a charge-trapping layer
US7123216B1 (en) * 1994-05-05 2006-10-17 Idc, Llc Photonic MEMS and structures
US20010003487A1 (en) 1996-11-05 2001-06-14 Mark W. Miles Visible spectrum modulator arrays
US5815134A (en) 1994-05-16 1998-09-29 Semiconductor Energy Laboratory Co., Ltd. Liquid crystal electro-optical device and driving method thereof
JP3708583B2 (en) 1994-05-16 2005-10-19 株式会社半導体エネルギー研究所 Driving method of liquid crystal electro-optical device
US5497258A (en) 1994-05-27 1996-03-05 The Regents Of The University Of Colorado Spatial light modulator including a VLSI chip and using solder for horizontal and vertical component positioning
US5497172A (en) 1994-06-13 1996-03-05 Texas Instruments Incorporated Pulse width modulation for spatial light modulator with split reset addressing
GB9411160D0 (en) 1994-06-03 1994-07-27 Land Infrared Ltd Improvements relating to radiation thermometers
US5694227A (en) 1994-07-15 1997-12-02 Apple Computer, Inc. Method and apparatus for calibrating and adjusting a color imaging system
JP3184069B2 (en) 1994-09-02 2001-07-09 シャープ株式会社 Image display device
JPH0895526A (en) 1994-09-22 1996-04-12 Casio Comput Co Ltd Color liquid crystal display device for rgb field sequential display system
EP0952466A3 (en) 1994-10-18 2000-05-03 Mitsubishi Rayon Co., Ltd. Lens sheet
FR2726135B1 (en) 1994-10-25 1997-01-17 Suisse Electronique Microtech SWITCHING DEVICE
JP3755911B2 (en) 1994-11-15 2006-03-15 富士通株式会社 Semiconductor circuit
US5808800A (en) 1994-12-22 1998-09-15 Displaytech, Inc. Optics arrangements including light source arrangements for an active matrix liquid crystal image generator
US5596369A (en) * 1995-01-24 1997-01-21 Lsi Logic Corporation Statistically derived method and system for decoding MPEG motion compensation and transform coded video data
US5504614A (en) 1995-01-31 1996-04-02 Texas Instruments Incorporated Method for fabricating a DMD spatial light modulator with a hardened hinge
WO1996033483A1 (en) 1995-04-18 1996-10-24 Cambridge Display Technology Limited A display
US6424388B1 (en) 1995-04-28 2002-07-23 International Business Machines Corporation Reflective spatial light modulator array
KR19990022626A (en) 1995-06-07 1999-03-25 야스카와 히데아키 Computer system with video display controller with power saving mode
JP3533759B2 (en) 1995-06-08 2004-05-31 凸版印刷株式会社 Color liquid crystal display using hologram
US6969635B2 (en) 2000-12-07 2005-11-29 Reflectivity, Inc. Methods for depositing, releasing and packaging micro-electromechanical devices on wafer substrates
US5835256A (en) 1995-06-19 1998-11-10 Reflectivity, Inc. Reflective spatial light modulator with encapsulated micro-mechanical elements
US6046840A (en) * 1995-06-19 2000-04-04 Reflectivity, Inc. Double substrate reflective spatial light modulator with self-limiting micro-mechanical elements
US6952301B2 (en) 1995-06-19 2005-10-04 Reflectivity, Inc Spatial light modulators with light blocking and absorbing areas
US5975711A (en) 1995-06-27 1999-11-02 Lumitex, Inc. Integrated display panel assemblies
US7108414B2 (en) 1995-06-27 2006-09-19 Solid State Opto Limited Light emitting panel assemblies
US6185356B1 (en) * 1995-06-27 2001-02-06 Lumitex, Inc. Protective cover for a lighting device
US5613751A (en) 1995-06-27 1997-03-25 Lumitex, Inc. Light emitting panel assemblies
US20040135273A1 (en) 1995-06-27 2004-07-15 Parker Jeffery R. Methods of making a pattern of optical element shapes on a roll for use in making optical elements on or in substrates
US6712481B2 (en) * 1995-06-27 2004-03-30 Solid State Opto Limited Light emitting panel assemblies
US20020058931A1 (en) 1995-06-27 2002-05-16 Jeffrey R. Parker Light delivery system and applications thereof
US5959598A (en) 1995-07-20 1999-09-28 The Regents Of The University Of Colorado Pixel buffer circuits for implementing improved methods of displaying grey-scale or color images
EP0757958A1 (en) 1995-08-07 1997-02-12 Societe Des Produits Nestle S.A. Multi-pack of units held together by means of an adhesive sticker
DE19530121A1 (en) 1995-08-16 1997-02-20 Fev Motorentech Gmbh & Co Kg Method for reducing the impact velocity of an armature impacting on an electromagnetic actuator
US5801792A (en) 1995-12-13 1998-09-01 Swz Engineering Ltd. High resolution, high intensity video projection cathode ray tube provided with a cooled reflective phosphor screen support
JP3799092B2 (en) 1995-12-29 2006-07-19 アジレント・テクノロジーズ・インク Light modulation device and display device
US5771321A (en) 1996-01-04 1998-06-23 Massachusetts Institute Of Technology Micromechanical optical switch and flat panel display
US5895115A (en) 1996-01-16 1999-04-20 Lumitex, Inc. Light emitting panel assemblies for use in automotive applications and the like
JPH09218360A (en) 1996-02-08 1997-08-19 Ricoh Co Ltd Mechanical optical shutter
WO1997029538A1 (en) * 1996-02-10 1997-08-14 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Bistable microactuator with coupled membranes
US5745284A (en) 1996-02-23 1998-04-28 President And Fellows Of Harvard College Solid-state laser source of tunable narrow-bandwidth ultraviolet radiation
TW395121B (en) * 1996-02-26 2000-06-21 Seiko Epson Corp Personal wearing information display device and the display method using such device
US5745203A (en) 1996-03-28 1998-04-28 Motorola, Inc. Liquid crystal display device including multiple ambient light illumination modes with switchable holographic optical element
CN1095057C (en) 1996-04-17 2002-11-27 迪科公司 Method and apparatus for controlling light
FR2747802B1 (en) 1996-04-18 1998-05-15 Commissariat Energie Atomique OPTOMECHANICAL MICRODEVICE, AND APPLICATION TO AN OPTOMECHANICAL MICRODEFLECTOR
US5731802A (en) 1996-04-22 1998-03-24 Silicon Light Machines Time-interleaved bit-plane, pulse-width-modulation digital display system
JPH09292576A (en) 1996-04-25 1997-11-11 Casio Comput Co Ltd Optical control element and display device using the same
FR2748578B1 (en) 1996-05-10 1998-05-29 Commissariat Energie Atomique OPTOMECHANICAL DEVICE AND APPLICATION TO SENSORS IN INTEGRATED OPTICS
US5691695A (en) 1996-07-24 1997-11-25 United Technologies Automotive Systems, Inc. Vehicle information display on steering wheel surface
WO1998004950A1 (en) 1996-07-25 1998-02-05 Anvik Corporation Seamless, maskless lithography system using spatial light modulator
JP4050802B2 (en) 1996-08-02 2008-02-20 シチズン電子株式会社 Color display device
JP3442581B2 (en) 1996-08-06 2003-09-02 株式会社ヒューネット Driving method of nematic liquid crystal
US5781333A (en) 1996-08-20 1998-07-14 Lanzillotta; John Piezoelectric light shutter
US5884083A (en) * 1996-09-20 1999-03-16 Royce; Robert Computer system to compile non-incremental computer source code to execute within an incremental type computer system
US5854872A (en) 1996-10-08 1998-12-29 Clio Technologies, Inc. Divergent angle rotator system and method for collimating light beams
US6028656A (en) * 1996-10-09 2000-02-22 Cambridge Research & Instrumentation Inc. Optical polarization switch and method of using same
AU5156198A (en) 1996-10-29 1998-05-22 Xeotron Corporation Optical device utilizing optical waveguides and mechanical light-switches
US6677936B2 (en) * 1996-10-31 2004-01-13 Kopin Corporation Color display system for a camera
DE19730715C1 (en) 1996-11-12 1998-11-26 Fraunhofer Ges Forschung Method of manufacturing a micromechanical relay
US6046716A (en) 1996-12-19 2000-04-04 Colorado Microdisplay, Inc. Display system having electrode modulation to alter a state of an electro-optic layer
US7471444B2 (en) 1996-12-19 2008-12-30 Idc, Llc Interferometric modulation of radiation
US5781331A (en) 1997-01-24 1998-07-14 Roxburgh Ltd. Optical microshutter array
JP3726441B2 (en) 1997-03-18 2005-12-14 株式会社デンソー Radar equipment
JPH10282521A (en) 1997-04-04 1998-10-23 Sony Corp Reflection type liquid crystal display device
JP2877136B2 (en) 1997-04-11 1999-03-31 日本電気株式会社 Reflective color liquid crystal display
DE69841370D1 (en) 1997-04-14 2010-01-21 Dicon As EXPOSURE UNIT AND METHOD FOR POINT-WISE EXPOSURE OF A CARRIER
US5973727A (en) 1997-05-13 1999-10-26 New Light Industries, Ltd. Video image viewing device and method
US5986628A (en) 1997-05-14 1999-11-16 Planar Systems, Inc. Field sequential color AMEL display
US5889625A (en) 1997-05-21 1999-03-30 Raytheon Company Chromatic aberration correction for display systems
US6529250B1 (en) 1997-05-22 2003-03-04 Seiko Epson Corporation Projector
JPH10333145A (en) 1997-05-30 1998-12-18 Sanyo Electric Co Ltd Liquid crystal display device equipped with lighting mechanism
TW482921B (en) 1997-06-16 2002-04-11 Matsushita Electric Ind Co Ltd Reflective liquid crystal display device
US6137313A (en) 1997-06-20 2000-10-24 Altera Corporation Resistive pull-up device for I/O pin
GB9713658D0 (en) 1997-06-28 1997-09-03 Travis Adrian R L View-sequential holographic display
JP3840746B2 (en) 1997-07-02 2006-11-01 ソニー株式会社 Image display device and image display method
US6591049B2 (en) * 1997-07-02 2003-07-08 Lumitex, Inc. Light delivery systems and applications thereof
US20050171408A1 (en) 1997-07-02 2005-08-04 Parker Jeffery R. Light delivery systems and applications thereof
US6852095B1 (en) * 1997-07-09 2005-02-08 Charles D. Ray Interbody device and method for treatment of osteoporotic vertebral collapse
US6239777B1 (en) 1997-07-22 2001-05-29 Kabushiki Kaisha Toshiba Display device
JPH1195693A (en) 1997-07-22 1999-04-09 Toshiba Corp Display device
US5867302A (en) * 1997-08-07 1999-02-02 Sandia Corporation Bistable microelectromechanical actuator
US6214633B1 (en) 1997-08-28 2001-04-10 Mems Optical Inc. System for controlling light including a micromachined foucault shutter array and a method of manufacturing the same
JPH1184419A (en) 1997-09-09 1999-03-26 Hitachi Ltd Liquid crystal light valve and projection type display device
GB9719824D0 (en) 1997-09-18 1997-11-19 A P Valves Self-contained breathing apparatus
US5963367A (en) 1997-09-23 1999-10-05 Lucent Technologies, Inc. Micromechanical xyz stage for use with optical elements
JP3371200B2 (en) 1997-10-14 2003-01-27 富士通株式会社 Display control method of liquid crystal display device and liquid crystal display device
CA2306384A1 (en) 1997-10-14 1999-04-22 Patterning Technologies Limited Method of forming an electronic device
US5943223A (en) 1997-10-15 1999-08-24 Reliance Electric Industrial Company Electric switches for reducing on-state power loss
JP4550175B2 (en) 1997-10-23 2010-09-22 株式会社東芝 Electronic device, backlight control method, and recording medium
US6486997B1 (en) 1997-10-28 2002-11-26 3M Innovative Properties Company Reflective LCD projection system using wide-angle Cartesian polarizing beam splitter
US6127908A (en) 1997-11-17 2000-10-03 Massachusetts Institute Of Technology Microelectro-mechanical system actuator device and reconfigurable circuits utilizing same
EP0958571B1 (en) 1997-11-29 2003-07-23 Koninklijke Philips Electronics N.V. Display device comprising a light guide
GB9727148D0 (en) 1997-12-22 1998-02-25 Fki Plc Improvements in and relating to electromagnetic actuators
JP4118389B2 (en) * 1997-12-29 2008-07-16 日本ライツ株式会社 Light guide plate and flat illumination device
JPH11202325A (en) 1998-01-08 1999-07-30 Seiko Instruments Inc Reflection type liquid crystal display device and production therefor
US6473220B1 (en) 1998-01-22 2002-10-29 Trivium Technologies, Inc. Film having transmissive and reflective properties
CA2260679C (en) 1998-02-03 2009-04-21 Thomas H. Loga Fluid flow system, casing, and method
AUPP176898A0 (en) 1998-02-12 1998-03-05 Moldflow Pty Ltd Automated machine technology for thermoplastic injection molding
IL123579A0 (en) 1998-03-06 1998-10-30 Heines Amihai Apparatus for producing high contrast imagery
US6046836A (en) 1998-03-06 2000-04-04 Electro-Optical Products Corporation Low frequency optical shutter
JP3824290B2 (en) 1998-05-07 2006-09-20 富士写真フイルム株式会社 Array type light modulation element, array type exposure element, flat display, and method for driving array type light modulation element
US6211521B1 (en) 1998-03-13 2001-04-03 Intel Corporation Infrared pixel sensor and infrared signal correction
US6195196B1 (en) * 1998-03-13 2001-02-27 Fuji Photo Film Co., Ltd. Array-type exposing device and flat type display incorporating light modulator and driving method thereof
JP3376308B2 (en) 1998-03-16 2003-02-10 株式会社東芝 Reflector and liquid crystal display
JPH11271744A (en) 1998-03-24 1999-10-08 Minolta Co Ltd Color liquid crystal display device
US6710920B1 (en) 1998-03-27 2004-03-23 Sanyo Electric Co., Ltd Stereoscopic display
WO1999052006A2 (en) 1998-04-08 1999-10-14 Etalon, Inc. Interferometric modulation of radiation
JPH11296150A (en) 1998-04-10 1999-10-29 Masaya Okita High-speed driving method for liquid crystal
US20020163482A1 (en) 1998-04-20 2002-11-07 Alan Sullivan Multi-planar volumetric display system including optical elements made from liquid crystal having polymer stabilized cholesteric textures
WO1999057485A1 (en) * 1998-04-30 1999-11-11 Casio Computer Co., Ltd. Display device using ambient light and a lighting panel
US6329974B1 (en) 1998-04-30 2001-12-11 Agilent Technologies, Inc. Electro-optical material-based display device having analog pixel drivers
US6249269B1 (en) 1998-04-30 2001-06-19 Agilent Technologies, Inc. Analog pixel drive circuit for an electro-optical material-based display device
US20010040538A1 (en) 1999-05-13 2001-11-15 William A. Quanrud Display system with multiplexed pixels
US6459467B1 (en) 1998-05-15 2002-10-01 Minolta Co., Ltd. Liquid crystal light modulating device, and a manufacturing method and a manufacturing apparatus thereof
JP3954198B2 (en) 1998-06-01 2007-08-08 富士通株式会社 Output circuit, level converter circuit, logic circuit, and operational amplifier circuit
US5995688A (en) 1998-06-01 1999-11-30 Lucent Technologies, Inc. Micro-opto-electromechanical devices and method therefor
ES2160433T3 (en) 1998-06-02 2001-11-01 Rainer Glatzer FLAT SCREEN.
JP3428446B2 (en) * 1998-07-09 2003-07-22 富士通株式会社 Plasma display panel and method of manufacturing the same
JP2000028938A (en) 1998-07-13 2000-01-28 Fuji Photo Film Co Ltd Array type optical modulation element, array type exposure element, and method for driving plane display device
JP2000057832A (en) 1998-07-31 2000-02-25 Hitachi Ltd Lighting system and liquid crystal display device using same
GB2340281A (en) 1998-08-04 2000-02-16 Sharp Kk A reflective liquid crystal display device
US6710538B1 (en) 1998-08-26 2004-03-23 Micron Technology, Inc. Field emission display having reduced power requirements and method
IT1302170B1 (en) 1998-08-31 2000-07-31 St Microelectronics Srl VOLTAGE REGULATOR WITH SOFT VARIATION OF THE ABSORBED CURRENT.
US6249370B1 (en) 1998-09-18 2001-06-19 Ngk Insulators, Ltd. Display device
US6962419B2 (en) 1998-09-24 2005-11-08 Reflectivity, Inc Micromirror elements, package for the micromirror elements, and projection system therefor
US6523961B2 (en) * 2000-08-30 2003-02-25 Reflectivity, Inc. Projection system and mirror elements for improved contrast ratio in spatial light modulators
DE69834847T2 (en) 1998-09-24 2007-02-15 Reflectivity Inc., Santa Clara REFLECTIVE SPATIAL LIGHT MODULATOR WITH DOUBLE SUBSTRATE AND SELF-LIMITING MICROMECHANICAL ELEMENTS
JP2000105547A (en) 1998-09-29 2000-04-11 Casio Comput Co Ltd Information processor
JP2000214393A (en) 1999-01-20 2000-08-04 Fuji Photo Film Co Ltd Optical modulator, array optical modulator and flat- panel display device
US6288829B1 (en) 1998-10-05 2001-09-11 Fuji Photo Film, Co., Ltd. Light modulation element, array-type light modulation element, and flat-panel display unit
JP2000111813A (en) 1998-10-05 2000-04-21 Fuji Photo Film Co Ltd Optical modulation element and array type optical modulation element as well as plane display device
JP2000131627A (en) 1998-10-27 2000-05-12 Fuji Photo Film Co Ltd Optical modulation element and array type optical modulation element as well as plane display device
JP3934269B2 (en) 1999-01-20 2007-06-20 富士フイルム株式会社 Flat panel display
JP3919954B2 (en) 1998-10-16 2007-05-30 富士フイルム株式会社 Array type light modulation element and flat display driving method
US6404942B1 (en) 1998-10-23 2002-06-11 Corning Incorporated Fluid-encapsulated MEMS optical switch
US6034807A (en) 1998-10-28 2000-03-07 Memsolutions, Inc. Bistable paper white direct view display
US6639572B1 (en) 1998-10-28 2003-10-28 Intel Corporation Paper white direct view display
IL126866A (en) 1998-11-02 2003-02-12 Orbotech Ltd Apparatus and method for fabricating flat workpieces
US6288824B1 (en) 1998-11-03 2001-09-11 Alex Kastalsky Display device based on grating electromechanical shutter
US6201664B1 (en) 1998-11-16 2001-03-13 International Business Machines Corporation Polymer bumps for trace and shock protection
US6300294B1 (en) 1998-11-16 2001-10-09 Texas Instruments Incorporated Lubricant delivery for micromechanical devices
GB2343980A (en) 1998-11-18 2000-05-24 Sharp Kk Spatial light modulator and display
JP4434359B2 (en) 1999-05-19 2010-03-17 東芝モバイルディスプレイ株式会社 Flat display device and manufacturing method thereof
JP2000172219A (en) 1998-12-08 2000-06-23 Canon Inc Display controller, display control method and record medium
GB9828074D0 (en) 1998-12-18 1999-02-17 Glaxo Group Ltd Therapeutically useful compounds
US6154586A (en) 1998-12-24 2000-11-28 Jds Fitel Inc. Optical switch mechanism
US6498685B1 (en) 1999-01-11 2002-12-24 Kenneth C. Johnson Maskless, microlens EUV lithography system
JP3912760B2 (en) 1999-01-20 2007-05-09 富士フイルム株式会社 Driving method of array type light modulation element and flat display device
JP2000214397A (en) 1999-01-22 2000-08-04 Canon Inc Optical polarizer
JP2000214831A (en) 1999-01-27 2000-08-04 Hitachi Ltd Display processor and information processor
US6266240B1 (en) 1999-02-04 2001-07-24 Palm, Inc. Encasement for a handheld computer
JP2000235152A (en) 1999-02-12 2000-08-29 Victor Co Of Japan Ltd Light deflector
US6476886B2 (en) * 1999-02-15 2002-11-05 Rainbow Displays, Inc. Method for assembling a tiled, flat-panel microdisplay array
US6567138B1 (en) 1999-02-15 2003-05-20 Rainbow Displays, Inc. Method for assembling a tiled, flat-panel microdisplay array having imperceptible seams
US20050024849A1 (en) * 1999-02-23 2005-02-03 Parker Jeffery R. Methods of cutting or forming cavities in a substrate for use in making optical films, components or wave guides
US6752505B2 (en) 1999-02-23 2004-06-22 Solid State Opto Limited Light redirecting films and film systems
US7364341B2 (en) 1999-02-23 2008-04-29 Solid State Opto Limited Light redirecting films including non-interlockable optical elements
US6827456B2 (en) 1999-02-23 2004-12-07 Solid State Opto Limited Transreflectors, transreflector systems and displays and methods of making transreflectors
US7167156B1 (en) 1999-02-26 2007-01-23 Micron Technology, Inc. Electrowetting display
CA2365547A1 (en) 1999-03-04 2000-09-08 Flixel Ltd. Micro-mechanical flat panel display with touch sensitive input and vibration source
JP2000259116A (en) 1999-03-09 2000-09-22 Nec Corp Driving method and device for multi-level display plasma display
US6316278B1 (en) 1999-03-16 2001-11-13 Alien Technology Corporation Methods for fabricating a multiple modular assembly
JP2000275604A (en) 1999-03-23 2000-10-06 Hitachi Ltd Liquid crystal display device
US6428173B1 (en) 1999-05-03 2002-08-06 Jds Uniphase, Inc. Moveable microelectromechanical mirror structures and associated methods
JP2000321566A (en) 1999-05-11 2000-11-24 Ricoh Microelectronics Co Ltd Liquid crystal display device
US6633301B1 (en) 1999-05-17 2003-10-14 Displaytech, Inc. RGB illuminator with calibration via single detector servo
JP2000338523A (en) 1999-05-25 2000-12-08 Nec Corp Liquid crystal display device
US6201633B1 (en) 1999-06-07 2001-03-13 Xerox Corporation Micro-electromechanical based bistable color display sheets
JP4508505B2 (en) * 1999-06-23 2010-07-21 シチズンホールディングス株式会社 Liquid crystal display
US6507138B1 (en) * 1999-06-24 2003-01-14 Sandia Corporation Very compact, high-stability electrostatic actuator featuring contact-free self-limiting displacement
WO2001006295A1 (en) 1999-07-14 2001-01-25 Eiki Matsuo Image-forming optical system
JP2001035222A (en) 1999-07-23 2001-02-09 Minebea Co Ltd Surface lighting system
US6248509B1 (en) 1999-07-27 2001-06-19 James E. Sanford Maskless photoresist exposure system using mems devices
JP2001042340A (en) 1999-08-03 2001-02-16 Minolta Co Ltd Production of liquid crystal display device
US6229640B1 (en) 1999-08-11 2001-05-08 Adc Telecommunications, Inc. Microelectromechanical optical switch and method of manufacture thereof
JP3926948B2 (en) 1999-08-19 2007-06-06 株式会社小糸製作所 Vehicle headlamp
JP3665515B2 (en) 1999-08-26 2005-06-29 セイコーエプソン株式会社 Image display device
US6322712B1 (en) 1999-09-01 2001-11-27 Micron Technology, Inc. Buffer layer in flat panel display
JP2001075534A (en) 1999-09-01 2001-03-23 Victor Co Of Japan Ltd Liquid crystal display device
JP4198281B2 (en) 1999-09-13 2008-12-17 日本ライツ株式会社 Light guide plate and flat illumination device
CN100360496C (en) * 1999-09-20 2008-01-09 因维斯塔技术有限公司 Multidentate phosphite ligands, catalytic compositions containing such ligands and catalytic processes utilizing such catalytic compositions
US6275320B1 (en) 1999-09-27 2001-08-14 Jds Uniphase, Inc. MEMS variable optical attenuator
JP3643508B2 (en) 1999-09-28 2005-04-27 株式会社東芝 Movable film type display device
US6441829B1 (en) 1999-09-30 2002-08-27 Agilent Technologies, Inc. Pixel driver that generates, in response to a digital input value, a pixel drive signal having a duty cycle that determines the apparent brightness of the pixel
JP2001175216A (en) 1999-10-04 2001-06-29 Matsushita Electric Ind Co Ltd High gradation display technology
KR20010050623A (en) 1999-10-04 2001-06-15 모리시타 요이찌 Display technique for high gradation degree
WO2003007049A1 (en) 1999-10-05 2003-01-23 Iridigm Display Corporation Photonic mems and structures
US6583915B1 (en) 1999-10-08 2003-06-24 Lg. Philips Lcd Co., Ltd. Display device using a micro light modulator and fabricating method thereof
US7046905B1 (en) 1999-10-08 2006-05-16 3M Innovative Properties Company Backlight with structured surfaces
CA2323189A1 (en) 1999-10-15 2001-04-15 Cristian A. Bolle Dual motion electrostatic actuator design for mems micro-relay
JP3618066B2 (en) 1999-10-25 2005-02-09 株式会社日立製作所 Liquid crystal display
US7041224B2 (en) 1999-10-26 2006-05-09 Reflectivity, Inc. Method for vapor phase etching of silicon
US7071520B2 (en) 2000-08-23 2006-07-04 Reflectivity, Inc MEMS with flexible portions made of novel materials
US6690422B1 (en) * 1999-11-03 2004-02-10 Sharp Laboratories Of America, Inc. Method and system for field sequential color image capture using color filter array
KR100312432B1 (en) 1999-11-25 2001-11-05 오길록 Optical Switch using Micro Structures
EP1240708A2 (en) 1999-11-29 2002-09-18 Iolon, Inc. Balanced microdevice and rotary electrostatic microactuator for use therewith
JP2001154642A (en) 1999-11-30 2001-06-08 Toshiba Corp Information processor
JP3639482B2 (en) 1999-12-01 2005-04-20 理想科学工業株式会社 Screen printing apparatus and stencil sheet assembly
US6700554B2 (en) 1999-12-04 2004-03-02 Lg. Philips Lcd Co., Ltd. Transmissive display device using micro light modulator
US6535311B1 (en) 1999-12-09 2003-03-18 Corning Incorporated Wavelength selective cross-connect switch using a MEMS shutter array
KR100679095B1 (en) 1999-12-10 2007-02-05 엘지.필립스 엘시디 주식회사 Transparent Type Display Device Using Micro Light Modulator
JP2001249287A (en) * 1999-12-30 2001-09-14 Texas Instr Inc <Ti> Method for operating bistable micro mirror array
JP2001201698A (en) 2000-01-19 2001-07-27 Seiko Epson Corp Image display device, optical modulation unit suitable for the same and drive unit
JP3884207B2 (en) 2000-01-20 2007-02-21 インターナショナル・ビジネス・マシーンズ・コーポレーション Liquid crystal display
EP1118901A1 (en) * 2000-01-21 2001-07-25 Dicon A/S A rear-projecting device
US6407851B1 (en) 2000-08-01 2002-06-18 Mohammed N. Islam Micromechanical optical switch
JP2002262551A (en) 2000-02-07 2002-09-13 Fiderikkusu:Kk Voltage step-down dc-dc converter
WO2001061383A1 (en) 2000-02-16 2001-08-23 Matsushita Electric Industrial Co., Ltd. Irregular-shape body, reflection sheet and reflection-type liquid crystal display element, and production method and production device therefor
EP1128201A1 (en) 2000-02-25 2001-08-29 C.S.E.M. Centre Suisse D'electronique Et De Microtechnique Sa Switching device, particularly for optical switching
JP4006918B2 (en) 2000-02-28 2007-11-14 オムロン株式会社 Surface light source device and manufacturing method thereof
JP2001242826A (en) 2000-03-02 2001-09-07 Fujitsu Hitachi Plasma Display Ltd Plasma display device and its driving method
EP1202244A4 (en) 2000-03-14 2005-08-31 Mitsubishi Electric Corp Image display and image displaying method
EP1143744B1 (en) 2000-03-17 2008-09-24 Hitachi, Ltd. Image display device
US6747784B2 (en) 2000-03-20 2004-06-08 Np Photonics, Inc. Compliant mechanism and method of forming same
US6593677B2 (en) 2000-03-24 2003-07-15 Onix Microsystems, Inc. Biased rotatable combdrive devices and methods
US6296838B1 (en) 2000-03-24 2001-10-02 Council Of Scientific And Industrial Research Anti-fungal herbal formulation for treatment of human nail fungus and process thereof
JP3558332B2 (en) 2000-03-30 2004-08-25 株式会社東芝 Movable film display
US6697035B2 (en) * 2000-03-30 2004-02-24 Kabushiki Kaisha Toshiba Display device and moving-film display device
US6545385B2 (en) 2000-04-11 2003-04-08 Sandia Corporation Microelectromechanical apparatus for elevating and tilting a platform
US20010043177A1 (en) 2000-04-14 2001-11-22 Huston James R. System and method for color and grayscale drive methods for graphical displays utilizing analog controlled waveforms
US20020034418A1 (en) 2000-04-19 2002-03-21 Koch Earl D. Temporary ramp
US6227677B1 (en) 2000-04-21 2001-05-08 Mary M. Willis Portable light
JP2003532146A (en) 2000-04-25 2003-10-28 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Method for reducing errors in displays using double line subfield addressing
US6388661B1 (en) 2000-05-03 2002-05-14 Reflectivity, Inc. Monochrome and color digital display systems and methods
JP2001320310A (en) * 2000-05-09 2001-11-16 Nec Corp Diversity radio transmitter/receiver
JP4403633B2 (en) 2000-05-10 2010-01-27 ソニー株式会社 Liquid crystal display device and manufacturing method thereof
US6578436B1 (en) 2000-05-16 2003-06-17 Fidelica Microsystems, Inc. Method and apparatus for pressure sensing
JP2001331144A (en) 2000-05-18 2001-11-30 Canon Inc Video signal processing device, display device, projector, display method, and information storage medium
JP2001331142A (en) 2000-05-18 2001-11-30 Canon Inc Picture display device and method therefor
AU2001262065A1 (en) 2000-05-26 2001-12-03 Chaker Khalfaoui A stiction-free electrostatically driven microstructure device
JP2001337649A (en) 2000-05-29 2001-12-07 Mitsubishi Electric Corp Plasma display equipment
AU2001265012B2 (en) 2000-06-01 2006-07-13 Georgetown University Systems and methods for monitoring health and delivering drugs transdermally
AU2001265426A1 (en) 2000-06-06 2001-12-17 Iolon, Inc. Damped micromechanical device and method for making same
JP4439084B2 (en) 2000-06-14 2010-03-24 日東電工株式会社 Liquid crystal display
JP2001356281A (en) 2000-06-14 2001-12-26 Sharp Corp Display element and display device
US7555333B2 (en) 2000-06-19 2009-06-30 University Of Washington Integrated optical scanning image acquisition and display
JP2002006325A (en) 2000-06-20 2002-01-09 Nec Corp Method for manufacturing liquid crystal display panel
TW594218B (en) 2000-07-03 2004-06-21 Alps Electric Co Ltd Reflector and reflective liquid crystal display device
DE60142452D1 (en) 2000-07-03 2010-08-05 Sony Corp Optical multilayer structure, optical switching device and image display device
US6781742B2 (en) 2000-07-11 2004-08-24 Semiconductor Energy Laboratory Co., Ltd. Digital micromirror device and method of driving digital micromirror device
JP4801289B2 (en) 2000-07-11 2011-10-26 株式会社半導体エネルギー研究所 Micromirror devices, projectors, printers, and copiers
US6677709B1 (en) 2000-07-18 2004-01-13 General Electric Company Micro electromechanical system controlled organic led and pixel arrays and method of using and of manufacturing same
US6532044B1 (en) 2000-07-21 2003-03-11 Corning Precision Lens, Incorporated Electronic projector with equal-length color component paths
JP2002040336A (en) 2000-07-21 2002-02-06 Fuji Photo Film Co Ltd Optical modulation element and exposure device and flat display device using the same
JP4460732B2 (en) * 2000-07-21 2010-05-12 富士フイルム株式会社 Flat display device and exposure apparatus
JP4066620B2 (en) 2000-07-21 2008-03-26 日亜化学工業株式会社 LIGHT EMITTING ELEMENT, DISPLAY DEVICE HAVING LIGHT EMITTING ELEMENT AND METHOD FOR MANUFACTURING DISPLAY DEVICE
JP2002040337A (en) 2000-07-24 2002-02-06 Fuji Photo Film Co Ltd Optical modulation element and exposure device and flat display device using the same
JP4136334B2 (en) 2000-07-27 2008-08-20 日本ビクター株式会社 Information collection system
JP4542243B2 (en) * 2000-07-28 2010-09-08 エーユー オプトロニクス コーポレイション Liquid crystal cell, display device, and method of manufacturing liquid crystal cell
IT1318679B1 (en) 2000-08-11 2003-08-27 Enichem Spa PROCESS FOR THE PRODUCTION OF HYDROGEN PEROXIDE.
US6559827B1 (en) 2000-08-16 2003-05-06 Gateway, Inc. Display assembly
US7057246B2 (en) 2000-08-23 2006-06-06 Reflectivity, Inc Transition metal dielectric alloy materials for MEMS
US7167297B2 (en) 2000-08-30 2007-01-23 Reflectivity, Inc Micromirror array
US6733354B1 (en) 2000-08-31 2004-05-11 Micron Technology, Inc. Spacers for field emission displays
US6738177B1 (en) 2000-09-05 2004-05-18 Siwave, Inc. Soft snap-down optical element using kinematic supports
US6531947B1 (en) 2000-09-12 2003-03-11 3M Innovative Properties Company Direct acting vertical thermal actuator with controlled bending
US8157654B2 (en) 2000-11-28 2012-04-17 Nintendo Co., Ltd. Hand-held video game platform emulation
CN100487304C (en) 2000-09-25 2009-05-13 三菱丽阳株式会社 Light source device
GB0024804D0 (en) 2000-10-10 2000-11-22 Microemissive Displays Ltd An optoelectronic device
JP4594510B2 (en) 2000-11-02 2010-12-08 三菱電機株式会社 Transmission type image display device and driving method of transmission type image display device
US6760505B1 (en) 2000-11-08 2004-07-06 Xerox Corporation Method of aligning mirrors in an optical cross switch
US6664779B2 (en) 2000-11-16 2003-12-16 Texas Instruments Incorporated Package with environmental control material carrier
US6762868B2 (en) 2000-11-16 2004-07-13 Texas Instruments Incorporated Electro-optical package with drop-in aperture
DE10057783A1 (en) 2000-11-22 2002-06-06 Siemens Ag Method for controlling a matrix converter
JP2004524550A (en) 2000-11-22 2004-08-12 フリクセル リミテッド Micro electro mechanical display device
AU2002230520A1 (en) 2000-11-29 2002-06-11 E-Ink Corporation Addressing circuitry for large electronic displays
JP2002229532A (en) 2000-11-30 2002-08-16 Toshiba Corp Liquid crystal display and its driving method
US6414316B1 (en) 2000-11-30 2002-07-02 Fyodor I. Maydanich Protective cover and attachment method for moisture sensitive devices
US6992375B2 (en) * 2000-11-30 2006-01-31 Texas Instruments Incorporated Anchor for device package
US6504641B2 (en) 2000-12-01 2003-01-07 Agere Systems Inc. Driver and method of operating a micro-electromechanical system device
US7307775B2 (en) 2000-12-07 2007-12-11 Texas Instruments Incorporated Methods for depositing, releasing and packaging micro-electromechanical devices on wafer substrates
US6906847B2 (en) 2000-12-07 2005-06-14 Reflectivity, Inc Spatial light modulators with light blocking/absorbing areas
US20020086456A1 (en) 2000-12-19 2002-07-04 Cunningham Shawn Jay Bulk micromachining process for fabricating an optical MEMS device with integrated optical aperture
JP4446591B2 (en) 2000-12-20 2010-04-07 京セラ株式会社 Optical waveguide and optical circuit board
JP4560958B2 (en) 2000-12-21 2010-10-13 日本テキサス・インスツルメンツ株式会社 Micro electro mechanical system
JP3649145B2 (en) 2000-12-28 2005-05-18 オムロン株式会社 REFLECTIVE DISPLAY DEVICE, ITS MANUFACTURING METHOD, AND DEVICE USING THE SAME
JP2002207182A (en) 2001-01-10 2002-07-26 Sony Corp Optical multilayered structure and method for manufacturing the same, optical switching element, and image display device
US6947195B2 (en) 2001-01-18 2005-09-20 Ricoh Company, Ltd. Optical modulator, optical modulator manufacturing method, light information processing apparatus including optical modulator, image formation apparatus including optical modulator, and image projection and display apparatus including optical modulator
JP2002287718A (en) 2001-01-18 2002-10-04 Sharp Corp Display device, portable appliance and substrate
AU2002248440A1 (en) * 2001-01-19 2002-07-30 Massachusetts Institute Of Technology Characterization of compliant structure force-displacement behaviour
WO2002058089A1 (en) * 2001-01-19 2002-07-25 Massachusetts Institute Of Technology Bistable actuation techniques, mechanisms, and applications
TW548689B (en) 2001-01-25 2003-08-21 Fujitsu Display Tech Reflection type liquid crystal display device and manufacturing method thereof
JP4724924B2 (en) 2001-02-08 2011-07-13 ソニー株式会社 Manufacturing method of display device
US20030058543A1 (en) 2001-02-21 2003-03-27 Sheedy James B. Optically corrective lenses for a head-mounted computer display
US6746886B2 (en) 2001-03-19 2004-06-08 Texas Instruments Incorporated MEMS device with controlled gas space chemistry
JP2002279812A (en) 2001-03-19 2002-09-27 Casio Comput Co Ltd Surface light source
JP4619565B2 (en) * 2001-03-29 2011-01-26 株式会社リコー Image forming apparatus
JP2002297085A (en) 2001-03-30 2002-10-09 Ricoh Co Ltd Gradation display method and gradation display device
TW583299B (en) 2001-04-13 2004-04-11 Fuji Photo Film Co Ltd Liquid crystal composition, color filter and liquid crystal display device
JP3912999B2 (en) 2001-04-20 2007-05-09 富士通株式会社 Display device
US6756317B2 (en) 2001-04-23 2004-06-29 Memx, Inc. Method for making a microstructure by surface micromachining
US6965375B1 (en) 2001-04-27 2005-11-15 Palm, Inc. Compact integrated touch panel display for a handheld device
AU2002309629A1 (en) 2001-05-04 2002-11-18 L3 Optics, Inc. Method and apparatus for detecting and latching the position of a mems moving member
JP2002333619A (en) 2001-05-07 2002-11-22 Nec Corp Liquid crystal display element and manufacturing method therefor
JP3475940B2 (en) 2001-05-14 2003-12-10 ソニー株式会社 Projector device
JP2002341343A (en) 2001-05-14 2002-11-27 Nitto Denko Corp Lighting device and liquid crystal display device
US6429625B1 (en) 2001-05-18 2002-08-06 Palm, Inc. Method and apparatus for indicating battery charge status
US6671078B2 (en) 2001-05-23 2003-12-30 Axsun Technologies, Inc. Electrostatic zipper actuator optical beam switching system and method of operation
JP2002351431A (en) 2001-05-30 2002-12-06 Sony Corp Display driving method
JP3548136B2 (en) 2001-06-01 2004-07-28 三洋電機株式会社 Image processing device
JP2004521389A (en) 2001-06-05 2004-07-15 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Display device based on attenuated total reflection
JP2002365650A (en) 2001-06-05 2002-12-18 Fujitsu Ltd Method for manufacturing liquid crystal display panel
US6710008B2 (en) 2002-01-17 2004-03-23 Exxonmobil Chemical Patents Inc. Method of making molecular sieve catalyst
US6998219B2 (en) 2001-06-27 2006-02-14 University Of South Florida Maskless photolithography for etching and deposition
US6764796B2 (en) 2001-06-27 2004-07-20 University Of South Florida Maskless photolithography using plasma displays
US7119786B2 (en) * 2001-06-28 2006-10-10 Intel Corporation Method and apparatus for enabling power management of a flat panel display
EP1271187A3 (en) 2001-06-28 2004-09-22 Alps Electric Co., Ltd. Reflector and reflective liquid crystal display
JP4181495B2 (en) 2001-06-29 2008-11-12 シチズンホールディングス株式会社 LCD panel
US7291363B2 (en) 2001-06-30 2007-11-06 Texas Instruments Incorporated Lubricating micro-machined devices using fluorosurfactants
FR2826691B1 (en) 2001-07-02 2003-09-26 Solvay CIRCUIT FOR VENTING THE CRANKCASE GASES OF AN INTERNAL COMBUSTION ENGINE
WO2003005733A1 (en) 2001-07-06 2003-01-16 Explay Ltd. An image projecting device and method
US7535624B2 (en) 2001-07-09 2009-05-19 E Ink Corporation Electro-optic display and materials for use therein
JP4945059B2 (en) 2001-07-10 2012-06-06 クアルコム メムス テクノロジーズ インコーポレイテッド Photonic MEMS and structure
JP2003029295A (en) 2001-07-11 2003-01-29 Sony Corp Liquid crystal display device
JP2003091002A (en) * 2001-07-12 2003-03-28 Alps Electric Co Ltd Liquid crystal display device
US6897843B2 (en) 2001-07-14 2005-05-24 Koninklijke Philips Electronics N.V. Active matrix display devices
JP2003029720A (en) 2001-07-16 2003-01-31 Fujitsu Ltd Display device
JP3909812B2 (en) 2001-07-19 2007-04-25 富士フイルム株式会社 Display element and exposure element
US7057251B2 (en) * 2001-07-20 2006-06-06 Reflectivity, Inc MEMS device made of transition metal-dielectric oxide materials
JP2003036057A (en) 2001-07-23 2003-02-07 Toshiba Corp Display device
JP2003036713A (en) 2001-07-25 2003-02-07 International Manufacturing & Engineering Services Co Ltd Surface light source device
EP1279994A3 (en) 2001-07-27 2003-10-01 Alps Electric Co., Ltd. Semitransparent reflective liquid-crystal display device
US6702759B2 (en) 2001-07-31 2004-03-09 Private Concepts, Inc. Intra-vaginal self-administered cell collecting device and method
US6589625B1 (en) 2001-08-01 2003-07-08 Iridigm Display Corporation Hermetic seal and method to create the same
US7023606B2 (en) 2001-08-03 2006-04-04 Reflectivity, Inc Micromirror array for projection TV
US6980177B2 (en) 2001-08-03 2005-12-27 Waterstrike Incorporated Sequential inverse encoding apparatus and method for providing confidential viewing of a fundamental display image
US6576887B2 (en) 2001-08-15 2003-06-10 3M Innovative Properties Company Light guide for use with backlit display
US6863219B1 (en) 2001-08-17 2005-03-08 Alien Technology Corporation Apparatuses and methods for forming electronic assemblies
US6781208B2 (en) 2001-08-17 2004-08-24 Nec Corporation Functional device, method of manufacturing therefor and driver circuit
US6755534B2 (en) 2001-08-24 2004-06-29 Brookhaven Science Associates Prismatic optical display
US20030042157A1 (en) 2001-08-30 2003-03-06 Mays Joe N. Baseball bat and accessory bag
US6784500B2 (en) 2001-08-31 2004-08-31 Analog Devices, Inc. High voltage integrated circuit amplifier
US20030048036A1 (en) 2001-08-31 2003-03-13 Lemkin Mark Alan MEMS comb-finger actuator
JP4880838B2 (en) 2001-09-05 2012-02-22 株式会社東芝 Method and apparatus for assembling liquid crystal display device
JP2003086233A (en) 2001-09-07 2003-03-20 Mitsubishi Electric Corp Flat plate type battery
JP4785300B2 (en) 2001-09-07 2011-10-05 株式会社半導体エネルギー研究所 Electrophoretic display device, display device, and electronic device
US6731492B2 (en) 2001-09-07 2004-05-04 Mcnc Research And Development Institute Overdrive structures for flexible electrostatic switch
JP3928395B2 (en) 2001-09-21 2007-06-13 オムロン株式会社 Surface light source device
JP2003098984A (en) 2001-09-25 2003-04-04 Rohm Co Ltd Image display device
US6794793B2 (en) 2001-09-27 2004-09-21 Memx, Inc. Microelectromechnical system for tilting a platform
JP2005504359A (en) 2001-09-28 2005-02-10 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Equipment with flat display
US6701039B2 (en) 2001-10-04 2004-03-02 Colibrys S.A. Switching device, in particular for optical applications
KR20030029251A (en) 2001-10-05 2003-04-14 삼성전자주식회사 Liquid crystal display device
US7046221B1 (en) 2001-10-09 2006-05-16 Displaytech, Inc. Increasing brightness in field-sequential color displays
US20090065429A9 (en) 2001-10-22 2009-03-12 Dickensheets David L Stiffened surface micromachined structures and process for fabricating the same
JP4032696B2 (en) 2001-10-23 2008-01-16 日本電気株式会社 Liquid crystal display
US6809851B1 (en) 2001-10-24 2004-10-26 Decicon, Inc. MEMS driver
JP2003140561A (en) 2001-10-30 2003-05-16 Seiko Epson Corp Optoelectronic device and its manufacturing method and electronic equipment
KR100764592B1 (en) 2001-10-30 2007-10-08 엘지.필립스 엘시디 주식회사 backlight for liquid crystal display devices
US20030085849A1 (en) 2001-11-06 2003-05-08 Michael Grabert Apparatus for image projection
US6936968B2 (en) 2001-11-30 2005-08-30 Mule Lighting, Inc. Retrofit light emitting diode tube
CN1608221A (en) 2001-12-03 2005-04-20 弗利克赛尔公司 Display devices
KR20050044695A (en) 2001-12-05 2005-05-12 솔리드 스테이트 옵토 리미티드 Transreflectors, transreflector systems and displays and methods of making transreflectors
US7185542B2 (en) 2001-12-06 2007-03-06 Microfabrica Inc. Complex microdevices and apparatus and methods for fabricating such devices
JP2003177723A (en) 2001-12-11 2003-06-27 Seiko Epson Corp Method for driving electro-optical device, driving circuit therefor, electro-optical device, and electronic equipment
EP2420873A3 (en) 2001-12-14 2013-01-16 QUALCOMM MEMS Technologies, Inc. Uniform illumination system
KR100685948B1 (en) 2001-12-14 2007-02-23 엘지.필립스 엘시디 주식회사 A Liquid Crystal Display Device And The Method For Manufacturing The Same
GB2383886B (en) 2001-12-20 2005-07-20 Corning Inc Spatial light modulators with improved inter-pixel performance
GB2383641A (en) 2001-12-21 2003-07-02 Nokia Corp Reflective displays
JP3755460B2 (en) 2001-12-26 2006-03-15 ソニー株式会社 Electrostatically driven MEMS element and manufacturing method thereof, optical MEMS element, light modulation element, GLV device, laser display, and MEMS apparatus
JP2003202519A (en) 2001-12-28 2003-07-18 Canon Inc Stereoscopic image display device
US6785436B2 (en) 2001-12-28 2004-08-31 Axiowave Networks, Inc. Method of and operating architectural enhancement for combining optical (photonic) and data packet-based electrical switch fabric networks with a common software control plane while providing increased utilization of such combined networks
WO2003060920A1 (en) 2002-01-11 2003-07-24 Reflectivity, Inc. Spatial light modulator with charge-pump pixel cell
AU2002367045A1 (en) 2002-01-15 2003-07-30 Koninklijke Philips Electronics N.V. Light emitting display device with mechanical pixel switch
US7253845B2 (en) * 2002-01-22 2007-08-07 Thomson Licensing Color non-uniformity correction for LCOS
JP4013562B2 (en) 2002-01-25 2007-11-28 豊田合成株式会社 Lighting device
AU2003215117A1 (en) 2002-02-09 2003-09-04 Display Science, Inc. Flexible video displays and their manufacture
US6794119B2 (en) 2002-02-12 2004-09-21 Iridigm Display Corporation Method for fabricating a structure for a microelectromechanical systems (MEMS) device
US6897164B2 (en) 2002-02-14 2005-05-24 3M Innovative Properties Company Aperture masks for circuit fabrication
EP1478972A1 (en) 2002-02-19 2004-11-24 Koninklijke Philips Electronics N.V. Subtractive display device
EP1478974B1 (en) 2002-02-19 2012-03-07 Samsung LCD Netherlands R&D Center B.V. Display device
EP1478964B1 (en) 2002-02-20 2013-07-17 Koninklijke Philips Electronics N.V. Display apparatus
JP2003248463A (en) 2002-02-25 2003-09-05 Matsushita Electric Ind Co Ltd Liquid crystal display device
MXPA04008313A (en) 2002-02-26 2005-07-05 Uni Pixel Displays Inc Enhancements to optical flat panel displays.
JP2003254115A (en) 2002-02-26 2003-09-10 Yamaha Motor Co Ltd Throttle opening sensor
US6574033B1 (en) 2002-02-27 2003-06-03 Iridigm Display Corporation Microelectromechanical systems device and method for fabricating same
US7283112B2 (en) 2002-03-01 2007-10-16 Microsoft Corporation Reflective microelectrical mechanical structure (MEMS) optical modulator and optical display system
JP2003262734A (en) 2002-03-08 2003-09-19 Citizen Electronics Co Ltd Light guide plate
WO2003079384A2 (en) 2002-03-11 2003-09-25 Uni-Pixel Displays, Inc. Double-electret mems actuator
US7055975B2 (en) 2002-03-12 2006-06-06 Memx, Inc. Microelectromechanical system with non-collinear force compensation
US6650806B2 (en) 2002-03-14 2003-11-18 Memx, Inc. Compliant push/pull connector microstructure
US6707176B1 (en) 2002-03-14 2004-03-16 Memx, Inc. Non-linear actuator suspension for microelectromechanical systems
US6831390B2 (en) 2002-03-14 2004-12-14 Memx, Inc. Microelectromechanical system with stiff coupling
JP2005521085A (en) 2002-03-20 2005-07-14 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Foil display screen drive method and apparatus having the display screen
JP3895202B2 (en) 2002-03-22 2007-03-22 富士通株式会社 Method and apparatus for forming coating film on inner surface of thin tube
JP4247987B2 (en) 2002-03-26 2009-04-02 ディーコン エーエス Fine light modulator array
US7345824B2 (en) 2002-03-26 2008-03-18 Trivium Technologies, Inc. Light collimating device
JP3875130B2 (en) 2002-03-26 2007-01-31 株式会社東芝 Display device and manufacturing method thereof
US7053519B2 (en) 2002-03-29 2006-05-30 Microsoft Corporation Electrostatic bimorph actuator
JP2005522733A (en) 2002-04-09 2005-07-28 ディーコン エーエス Light modulation engine
JP2003313299A (en) 2002-04-22 2003-11-06 Seiko Epson Corp Higher order silane composition and process for forming silicon film using the same
US7217588B2 (en) 2005-01-05 2007-05-15 Sharp Laboratories Of America, Inc. Integrated MEMS packaging
US7125451B2 (en) 2002-04-23 2006-10-24 Sharp Laboratories Of America, Inc. Crystal-structure-processed mechanical devices and methods and systems for making
MXPA04010999A (en) 2002-05-06 2005-06-08 Uni Pixel Displays Inc Field sequential color efficiency.
US7362889B2 (en) 2002-05-10 2008-04-22 Massachusetts Institute Of Technology Elastomeric actuator devices for magnetic resonance imaging
US6879307B1 (en) 2002-05-15 2005-04-12 Ernest Stern Method and apparatus for reducing driver count and power consumption in micromechanical flat panel displays
JP4140816B2 (en) 2002-05-24 2008-08-27 富士通株式会社 Micro mirror element
JP2004004216A (en) 2002-05-31 2004-01-08 Victor Co Of Japan Ltd Liquid crystal display device
JP3871615B2 (en) 2002-06-13 2007-01-24 富士通株式会社 Display device
US6972889B2 (en) * 2002-06-27 2005-12-06 Research Triangle Institute Mems electrostatically actuated optical display device and associated arrays
US6777946B2 (en) 2002-07-01 2004-08-17 Honeywell International Inc. Cell buffer with built-in test
US6741377B2 (en) * 2002-07-02 2004-05-25 Iridigm Display Corporation Device having a light-absorbing mask and a method for fabricating same
WO2004005983A1 (en) * 2002-07-08 2004-01-15 Koninklijke Philips Electronics N.V. Foil display with two light guides
US20040013204A1 (en) 2002-07-16 2004-01-22 Nati Dinur Method and apparatus to compensate imbalance of demodulator
JP2004053839A (en) 2002-07-18 2004-02-19 Murata Mfg Co Ltd Light switching device
KR20040010026A (en) 2002-07-25 2004-01-31 가부시키가이샤 히타치세이사쿠쇼 Field emission display
JP3882709B2 (en) 2002-08-01 2007-02-21 日本ビクター株式会社 Driving method of liquid crystal display device
US7317465B2 (en) 2002-08-07 2008-01-08 Hewlett-Packard Development Company, L.P. Image display system and method
KR100484953B1 (en) * 2002-08-12 2005-04-22 엘지.필립스 엘시디 주식회사 reflective electrode of reflection or transflective type LCD and fabrication method thereof
TWM251142U (en) 2002-08-14 2004-11-21 Hannstar Display Corp Liquid crystal display panel
US6700173B1 (en) 2002-08-20 2004-03-02 Memx, Inc. Electrically isolated support for overlying MEM structure
US7154458B2 (en) * 2002-08-21 2006-12-26 Nec Viewtechnology, Ltd. Video display device with spatial light modulator
JP3781743B2 (en) 2002-08-21 2006-05-31 Necビューテクノロジー株式会社 Video display device
JP2006500606A (en) 2002-08-21 2006-01-05 ノキア コーポレイション Switchable lens display
JP2004093760A (en) 2002-08-30 2004-03-25 Fujitsu Display Technologies Corp Method of manufacturing liquid crystal display
MXPA05003106A (en) 2002-09-20 2005-06-22 Honeywell Int Inc High efficiency viewing screen.
JP2004117833A (en) 2002-09-26 2004-04-15 Seiko Epson Corp Optical attenuator, electronic equipment, and method for driving optical attenuator
AU2003283965A1 (en) * 2002-09-27 2004-04-19 Professional Tool Manufacturing Llc Drill sharpener
EP1563480A4 (en) 2002-09-30 2010-03-03 Nanosys Inc Integrated displays using nanowire transistors
US6908202B2 (en) 2002-10-03 2005-06-21 General Electric Company Bulk diffuser for flat panel display
US6967986B2 (en) 2002-10-16 2005-11-22 Eastman Kodak Company Light modulation apparatus using a VCSEL array with an electromechanical grating device
JP3774715B2 (en) 2002-10-21 2006-05-17 キヤノン株式会社 Projection display
US7113165B2 (en) 2002-10-25 2006-09-26 Hewlett-Packard Development Company, L.P. Molecular light valve display having sequenced color illumination
US6666561B1 (en) 2002-10-28 2003-12-23 Hewlett-Packard Development Company, L.P. Continuously variable analog micro-mirror device
US7370185B2 (en) 2003-04-30 2008-05-06 Hewlett-Packard Development Company, L.P. Self-packaged optical interference display device having anti-stiction bumps, integral micro-lens, and reflection-absorbing layers
US6747773B2 (en) 2002-10-31 2004-06-08 Agilent Technologies, Inc. Method and structure for stub tunable resonant cavity for photonic crystals
US7474180B2 (en) 2002-11-01 2009-01-06 Georgia Tech Research Corp. Single substrate electromagnetic actuator
US6911964B2 (en) 2002-11-07 2005-06-28 Duke University Frame buffer pixel circuit for liquid crystal display
KR100513723B1 (en) 2002-11-18 2005-09-08 삼성전자주식회사 MicroElectro Mechanical system switch
US7405860B2 (en) 2002-11-26 2008-07-29 Texas Instruments Incorporated Spatial light modulators with light blocking/absorbing areas
US6844959B2 (en) * 2002-11-26 2005-01-18 Reflectivity, Inc Spatial light modulators with light absorbing areas
JP4150250B2 (en) 2002-12-02 2008-09-17 富士フイルム株式会社 Drawing head, drawing apparatus and drawing method
WO2004086098A2 (en) 2002-12-03 2004-10-07 Flixel Ltd. Display devices
JP3873149B2 (en) 2002-12-11 2007-01-24 株式会社日立製作所 Display device
US6698348B1 (en) 2002-12-11 2004-03-02 Edgetec Group Pty. Ltd. Stencil clip for a curb
JP2004191736A (en) 2002-12-12 2004-07-08 Ngk Insulators Ltd Display device
KR20050086917A (en) 2002-12-16 2005-08-30 이 잉크 코포레이션 Backplanes for electro-optic displays
US6857751B2 (en) * 2002-12-20 2005-02-22 Texas Instruments Incorporated Adaptive illumination modulator
JP2004205973A (en) 2002-12-26 2004-07-22 Fuji Photo Film Co Ltd Flat plane display element and method of driving the same
JP2004212673A (en) 2002-12-27 2004-07-29 Fuji Photo Film Co Ltd Planar display device and its driving method
JP2004212444A (en) 2002-12-27 2004-07-29 Internatl Business Mach Corp <Ibm> Method for manufacturing liquid crystal display device and device for bonding substrate
JP4238124B2 (en) 2003-01-07 2009-03-11 積水化学工業株式会社 Curable resin composition, adhesive epoxy resin paste, adhesive epoxy resin sheet, conductive connection paste, conductive connection sheet, and electronic component assembly
US20040136680A1 (en) 2003-01-09 2004-07-15 Teraop Ltd. Single layer MEMS based variable optical attenuator with transparent shutter
TWI234041B (en) 2003-01-14 2005-06-11 Benq Corp Low power backlight module
EP1584114A1 (en) 2003-01-17 2005-10-12 Diode Solutions, Inc. Display employing organic material
JP2004246324A (en) 2003-01-24 2004-09-02 Murata Mfg Co Ltd Electrostatic type actuator
JP2006516755A (en) 2003-01-27 2006-07-06 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Display device
US6888657B2 (en) 2003-01-28 2005-05-03 Hewlett-Packard Development Company, L.P. Multiple-bit storage element for binary optical display element
US20040145580A1 (en) 2003-01-29 2004-07-29 Perlman Stephen G. Apparatus and method for reflective display of images on a card
JP4493274B2 (en) 2003-01-29 2010-06-30 富士通株式会社 Display device and display method
US7180677B2 (en) 2003-01-31 2007-02-20 Fuji Photo Film Co., Ltd. Display device
US7042622B2 (en) 2003-10-30 2006-05-09 Reflectivity, Inc Micromirror and post arrangements on substrates
US7417782B2 (en) 2005-02-23 2008-08-26 Pixtronix, Incorporated Methods and apparatus for spatial light modulation
JP3669363B2 (en) 2003-03-06 2005-07-06 ソニー株式会社 Electrodeposition type display panel manufacturing method, electrodeposition type display panel, and electrodeposition type display device
US6967763B2 (en) 2003-03-11 2005-11-22 Fuji Photo Film Co., Ltd. Display device
JP4476649B2 (en) 2003-03-19 2010-06-09 ゼロックス コーポレイション Optical multi-state latch switch for microelectromechanical systems
US6947624B2 (en) 2003-03-19 2005-09-20 Xerox Corporation MEMS optical latching switch
JP4505189B2 (en) 2003-03-24 2010-07-21 富士フイルム株式会社 Transmission type light modulation device and mounting method thereof
JP4138672B2 (en) 2003-03-27 2008-08-27 セイコーエプソン株式会社 Manufacturing method of electro-optical device
JP4413515B2 (en) 2003-03-31 2010-02-10 シャープ株式会社 Image processing method and liquid crystal display device using the same
EP1614095A1 (en) 2003-04-02 2006-01-11 Koninklijke Philips Electronics N.V. Foil display
TW591287B (en) 2003-04-10 2004-06-11 Au Optronics Corp Liquid crystal display with an uniform common voltage and method thereof
JP4396124B2 (en) 2003-04-11 2010-01-13 セイコーエプソン株式会社 Display device, projector, and driving method thereof
US20040207768A1 (en) 2003-04-15 2004-10-21 Yin Liu Electron-beam controlled micromirror (ECM) projection display system
JP2004317785A (en) 2003-04-16 2004-11-11 Seiko Epson Corp Method for driving electrooptical device, electrooptical device, and electronic device
US7095546B2 (en) 2003-04-24 2006-08-22 Metconnex Canada Inc. Micro-electro-mechanical-system two dimensional mirror with articulated suspension structures for high fill factor arrays
US7129925B2 (en) 2003-04-24 2006-10-31 Hewlett-Packard Development Company, L.P. Dynamic self-refresh display memory
EP1649445A4 (en) 2003-04-24 2009-03-25 Displaytech Inc Microdisplay and interface on a single chip
JP4149305B2 (en) 2003-04-25 2008-09-10 富士フイルム株式会社 Optical shutter and image display device using the same
US6741384B1 (en) 2003-04-30 2004-05-25 Hewlett-Packard Development Company, L.P. Control of MEMS and light modulator arrays
US7218499B2 (en) 2003-05-14 2007-05-15 Hewlett-Packard Development Company, L.P. Charge control circuit
US6846089B2 (en) * 2003-05-16 2005-01-25 3M Innovative Properties Company Method for stacking surface structured optical films
WO2004104670A1 (en) 2003-05-22 2004-12-02 Koninklijke Philips Electronics N.V. Display device
KR20060014407A (en) * 2003-05-22 2006-02-15 코닌클리케 필립스 일렉트로닉스 엔.브이. Dynamic foil display having low resistivity electrodes
JP4338442B2 (en) 2003-05-23 2009-10-07 富士フイルム株式会社 Manufacturing method of transmissive light modulation element
US20050018322A1 (en) 2003-05-28 2005-01-27 Terraop Ltd. Magnetically actuated fast MEMS mirrors and microscanners
JP4039314B2 (en) 2003-05-29 2008-01-30 セイコーエプソン株式会社 Image reading apparatus having power saving mode
JP2004354763A (en) 2003-05-29 2004-12-16 Seiko Epson Corp Screen, image display device, and rear projector
US7292235B2 (en) 2003-06-03 2007-11-06 Nec Electronics Corporation Controller driver and display apparatus using the same
CN1234237C (en) 2003-06-12 2005-12-28 浙江大学 Tricolour convergent method for matrix pixel device projector scope
EP1489449A1 (en) 2003-06-20 2004-12-22 ASML Netherlands B.V. Spatial light modulator
US7221495B2 (en) 2003-06-24 2007-05-22 Idc Llc Thin film precursor stack for MEMS manufacturing
US20050012197A1 (en) * 2003-07-15 2005-01-20 Smith Mark A. Fluidic MEMS device
DE10332647A1 (en) 2003-07-18 2005-02-17 Monty Knopp Method for generating images with microelectromechanical system (MEMS) switch filters, in which the red, green and blue colour information of plasma monitors reproduces the full information per pixel
JP2005043674A (en) 2003-07-22 2005-02-17 Moritex Corp Comb type electrostatic actuator and optical controller using the same
JP2005043726A (en) 2003-07-23 2005-02-17 Fuji Photo Film Co Ltd Display element and portable equipment using it
JP4178327B2 (en) 2003-08-11 2008-11-12 株式会社村田製作所 Buckling actuator
TWI294976B (en) 2003-08-18 2008-03-21 Seiko Epson Corp Method for controlling optical control device, optical control device, spatial light modulation device, and projector
US6996306B2 (en) 2003-08-25 2006-02-07 Asia Pacific Microsystems, Inc. Electrostatically operated micro-optical devices and method for manufacturing thereof
US7315294B2 (en) * 2003-08-25 2008-01-01 Texas Instruments Incorporated Deinterleaving transpose circuits in digital display systems
JP4131218B2 (en) 2003-09-17 2008-08-13 セイコーエプソン株式会社 Display panel and display device
JP4530632B2 (en) 2003-09-19 2010-08-25 富士通株式会社 Liquid crystal display
TW200523503A (en) 2003-09-29 2005-07-16 Sony Corp Backlight, light guiding plate, method for manufacturing diffusion plate and light guiding plate, and liquid crystal display device
US20050073471A1 (en) 2003-10-03 2005-04-07 Uni-Pixel Displays, Inc. Z-axis redundant display/multilayer display
US7003193B2 (en) 2003-10-10 2006-02-21 Japan Aviation Electronics Industry Limited Miniature movable device
JP2005134896A (en) 2003-10-10 2005-05-26 Japan Aviation Electronics Industry Ltd Fine movable device
US7012726B1 (en) 2003-11-03 2006-03-14 Idc, Llc MEMS devices with unreleased thin film components
JP2007513365A (en) 2003-11-14 2007-05-24 ユニ−ピクセル ディスプレイズ, インコーポレイテッド Simple matrix addressing on the display
JP2005158665A (en) 2003-11-24 2005-06-16 Toyota Industries Corp Lighting system
KR20050055203A (en) 2003-12-05 2005-06-13 한국전자통신연구원 Structure for manufacturing optical module
US7430355B2 (en) 2003-12-08 2008-09-30 University Of Cincinnati Light emissive signage devices based on lightwave coupling
US7123796B2 (en) 2003-12-08 2006-10-17 University Of Cincinnati Light emissive display based on lightwave coupling
US7161728B2 (en) 2003-12-09 2007-01-09 Idc, Llc Area array modulation and lead reduction in interferometric modulators
US7142346B2 (en) 2003-12-09 2006-11-28 Idc, Llc System and method for addressing a MEMS display
KR100531796B1 (en) 2003-12-10 2005-12-02 엘지전자 주식회사 Optical shutter for plasma display panel and driving method thereof
US7182463B2 (en) 2003-12-23 2007-02-27 3M Innovative Properties Company Pixel-shifting projection lens assembly to provide optical interlacing for increased addressability
FR2864526B1 (en) 2003-12-26 2006-10-13 Commissariat Energie Atomique ELECTROSTATIC ACTUATING DEVICE
DE10361915B4 (en) 2003-12-29 2009-03-05 Bausenwein, Bernhard, Dr. 2-channel stereo image display device with microelectromechanical systems
JP2005195734A (en) 2004-01-05 2005-07-21 Fujitsu Ltd Light-emitting control apparatus, display apparatus, display control apparatus and display control program
JP4267465B2 (en) 2004-01-07 2009-05-27 富士フイルム株式会社 REFLECTIVE COLOR DISPLAY ELEMENT, ITS MANUFACTURING METHOD, AND INFORMATION DISPLAY DEVICE PROVIDED WITH THE DISPLAY ELEMENT
US7532194B2 (en) 2004-02-03 2009-05-12 Idc, Llc Driver voltage adjuster
US7342705B2 (en) 2004-02-03 2008-03-11 Idc, Llc Spatial light modulator with integrated optical compensation structure
ITVA20040004A1 (en) * 2004-02-06 2004-05-06 St Microelectronics Srl OPEN RING VOLTAGE DRIVING METHOD AND CIRCUIT OF A DC MOTOR
TW200536536A (en) 2004-02-25 2005-11-16 Schering Corp Pyrazolotriazines as kinase inhibitors
US7119945B2 (en) 2004-03-03 2006-10-10 Idc, Llc Altering temporal response of microelectromechanical elements
US7706050B2 (en) 2004-03-05 2010-04-27 Qualcomm Mems Technologies, Inc. Integrated modulator illumination
US7855824B2 (en) 2004-03-06 2010-12-21 Qualcomm Mems Technologies, Inc. Method and system for color optimization in a display
JP2005257981A (en) 2004-03-11 2005-09-22 Fuji Photo Film Co Ltd Method of driving optical modulation element array, optical modulation apparatus, and image forming apparatus
US6912082B1 (en) 2004-03-11 2005-06-28 Palo Alto Research Center Incorporated Integrated driver electronics for MEMS device using high voltage thin film transistors
US20050244099A1 (en) 2004-03-24 2005-11-03 Pasch Nicholas F Cantilevered micro-electromechanical switch array
TWI244535B (en) 2004-03-24 2005-12-01 Yuan Lin A full color and flexible illuminating strap device
JP4639104B2 (en) 2004-03-24 2011-02-23 富士フイルム株式会社 Light modulation element array driving method, light modulation element array, and image forming apparatus
US7304782B2 (en) 2004-03-24 2007-12-04 Fujifilm Corporation Driving method of spatial light modulator array, spatial light modulator array, and image forming apparatus
US20050225501 (en) 2004-03-30 2005-10-13 Balakrishnan Srinivasan Self-aligned microlens array for transmissive MEMS image array
US8267780B2 (en) 2004-03-31 2012-09-18 Nintendo Co., Ltd. Game console and memory card
CN1957471A (en) 2004-04-06 2007-05-02 彩光公司 Color filter integrated with sensor array for flat panel display
CN1981318A (en) 2004-04-12 2007-06-13 彩光公司 Low power circuits for active matrix emissive displays and methods of operating the same
US7158278B2 (en) 2004-04-12 2007-01-02 Alexander Kastalsky Display device based on bistable electrostatic shutter
CA2562606A1 (en) * 2004-04-13 2005-10-27 Cambridge Biostability Limited Liquids containing suspended glass particles
US7026821B2 (en) 2004-04-17 2006-04-11 Hewlett-Packard Development Company, L.P. Testing MEM device array
EP1591824B1 (en) 2004-04-26 2012-05-09 Panasonic Corporation Microactuator
TWI330282B (en) 2004-04-30 2010-09-11 Chimei Innolux Corp Light guide plate and backlight module using same
JP2005317439A (en) 2004-04-30 2005-11-10 Seiko Epson Corp Display panel and display device
US7476327B2 (en) 2004-05-04 2009-01-13 Idc, Llc Method of manufacture for microelectromechanical devices
US7060895B2 (en) 2004-05-04 2006-06-13 Idc, Llc Modifying the electro-mechanical behavior of devices
US7164520B2 (en) 2004-05-12 2007-01-16 Idc, Llc Packaging for an interferometric modulator
US8025831B2 (en) 2004-05-24 2011-09-27 Agency For Science, Technology And Research Imprinting of supported and free-standing 3-D micro- or nano-structures
US7067355B2 (en) 2004-05-26 2006-06-27 Hewlett-Packard Development Company, L.P. Package having bond-sealed underbump
US7952189B2 (en) 2004-05-27 2011-05-31 Chang-Feng Wan Hermetic packaging and method of manufacture and use therefore
US7997771B2 (en) 2004-06-01 2011-08-16 3M Innovative Properties Company LED array systems
JP4211689B2 (en) 2004-06-14 2009-01-21 オムロン株式会社 Diffuser and surface light source device
US7787170B2 (en) 2004-06-15 2010-08-31 Texas Instruments Incorporated Micromirror array assembly with in-array pillars
EP1771763A1 (en) * 2004-06-24 2007-04-11 Cornell Research Foundation, Inc. Fibrous-composite-material-based mems optical scanner
US7636795B2 (en) 2004-06-30 2009-12-22 Intel Corporation Configurable feature selection mechanism
US7256922B2 (en) * 2004-07-02 2007-08-14 Idc, Llc Interferometric modulators with thin film transistors
WO2006017129A2 (en) 2004-07-09 2006-02-16 University Of Cincinnati Display capable electrowetting light valve
US20060012781A1 (en) * 2004-07-14 2006-01-19 Negevtech Ltd. Programmable spatial filter for wafer inspection
US7187487B2 (en) * 2004-07-30 2007-03-06 Hewlett-Packard Development Company, L.P. Light modulator with a light-absorbing layer
US20060028811A1 (en) * 2004-08-05 2006-02-09 Ross Charles A Jr Digital video recording flashlight
US20060033676A1 (en) * 2004-08-10 2006-02-16 Kenneth Faase Display device
US7453445B2 (en) 2004-08-13 2008-11-18 E Ink Corporation Methods for driving electro-optic displays
JP2006058770A (en) 2004-08-23 2006-03-02 Toshiba Matsushita Display Technology Co Ltd Driving circuit for display apparatus
US7215459B2 (en) 2004-08-25 2007-05-08 Reflectivity, Inc. Micromirror devices with in-plane deformable hinge
US6980349B1 (en) 2004-08-25 2005-12-27 Reflectivity, Inc Micromirrors with novel mirror plates
US7119944B2 (en) 2004-08-25 2006-10-10 Reflectivity, Inc. Micromirror device and method for making the same
US7515147B2 (en) 2004-08-27 2009-04-07 Idc, Llc Staggered column drive circuit systems and methods
US7551159B2 (en) 2004-08-27 2009-06-23 Idc, Llc System and method of sensing actuation and release voltages of an interferometric modulator
US7889163B2 (en) 2004-08-27 2011-02-15 Qualcomm Mems Technologies, Inc. Drive method for MEMS devices
US7505108B2 (en) 2004-09-02 2009-03-17 Nano Loa, Inc. Liquid crystal material filling method and liquid crystal material filling apparatus
US7564874B2 (en) 2004-09-17 2009-07-21 Uni-Pixel Displays, Inc. Enhanced bandwidth data encoding method
US7417735B2 (en) * 2004-09-27 2008-08-26 Idc, Llc Systems and methods for measuring color and contrast in specular reflective devices
US7843410B2 (en) 2004-09-27 2010-11-30 Qualcomm Mems Technologies, Inc. Method and device for electrically programmable display
US7573547B2 (en) 2004-09-27 2009-08-11 Idc, Llc System and method for protecting micro-structure of display array using spacers in gap within display device
US7535466B2 (en) 2004-09-27 2009-05-19 Idc, Llc System with server based control of client device display features
US7184202B2 (en) 2004-09-27 2007-02-27 Idc, Llc Method and system for packaging a MEMS device
US8004504B2 (en) 2004-09-27 2011-08-23 Qualcomm Mems Technologies, Inc. Reduced capacitance display element
US7446927B2 (en) 2004-09-27 2008-11-04 Idc, Llc MEMS switch with set and latch electrodes
US7545550B2 (en) 2004-09-27 2009-06-09 Idc, Llc Systems and methods of actuating MEMS display elements
US7420725B2 (en) 2004-09-27 2008-09-02 Idc, Llc Device having a conductive light absorbing mask and method for fabricating same
US7345805B2 (en) 2004-09-27 2008-03-18 Idc, Llc Interferometric modulator array with integrated MEMS electrical switches
US20060132383A1 (en) 2004-09-27 2006-06-22 Idc, Llc System and method for illuminating interferometric modulator display
US20060066540A1 (en) * 2004-09-27 2006-03-30 Texas Instruments Incorporated Spatial light modulation display system
US7525730B2 (en) 2004-09-27 2009-04-28 Idc, Llc Method and device for generating white in an interferometric modulator display
JP4414855B2 (en) 2004-09-30 2010-02-10 富士フイルム株式会社 Manufacturing method of transmissive light modulation element
JP5102623B2 (en) 2004-11-04 2012-12-19 ランバス・インターナショナル・リミテッド Long curved wedges in optical films
US20060104061A1 (en) 2004-11-16 2006-05-18 Scott Lerner Display with planar light source
US7199916B2 (en) 2004-12-07 2007-04-03 Hewlett-Packard Development Company, L.P. Light modulator device
JP4546266B2 (en) 2005-01-13 2010-09-15 シャープ株式会社 Sheet image display device
US7627330B2 (en) 2005-01-31 2009-12-01 Research In Motion Limited Mobile electronic device having a geographical position dependent light and method and system for achieving the same
CN101487948B (en) 2005-01-31 2011-04-13 凸版印刷株式会社 Optical sheet, and backlight unit and display using the same
JP4534787B2 (en) 2005-02-21 2010-09-01 ブラザー工業株式会社 Image forming apparatus
US7675665B2 (en) 2005-02-23 2010-03-09 Pixtronix, Incorporated Methods and apparatus for actuating displays
US8482496B2 (en) 2006-01-06 2013-07-09 Pixtronix, Inc. Circuits for controlling MEMS display apparatus on a transparent substrate
US7502159B2 (en) 2005-02-23 2009-03-10 Pixtronix, Inc. Methods and apparatus for actuating displays
US7746529B2 (en) 2005-02-23 2010-06-29 Pixtronix, Inc. MEMS display apparatus
US9229222B2 (en) 2005-02-23 2016-01-05 Pixtronix, Inc. Alignment methods in fluid-filled MEMS displays
US7616368B2 (en) 2005-02-23 2009-11-10 Pixtronix, Inc. Light concentrating reflective display methods and apparatus
US9261694B2 (en) 2005-02-23 2016-02-16 Pixtronix, Inc. Display apparatus and methods for manufacture thereof
US8310442B2 (en) 2005-02-23 2012-11-13 Pixtronix, Inc. Circuits for controlling display apparatus
US8519945B2 (en) 2006-01-06 2013-08-27 Pixtronix, Inc. Circuits for controlling display apparatus
US9082353B2 (en) 2010-01-05 2015-07-14 Pixtronix, Inc. Circuits for controlling display apparatus
US20080158635A1 (en) 2005-02-23 2008-07-03 Pixtronix, Inc. Display apparatus and methods for manufacture thereof
US7271945B2 (en) 2005-02-23 2007-09-18 Pixtronix, Inc. Methods and apparatus for actuating displays
DE602006003737D1 (en) 2005-02-23 2009-01-02 Pixtronix Inc LIGHT MODULATOR AND METHOD FOR THE PRODUCTION THEREOF
US20070205969A1 (en) 2005-02-23 2007-09-06 Pixtronix, Incorporated Direct-view MEMS display devices and methods for generating images thereon
US7304785B2 (en) 2005-02-23 2007-12-04 Pixtronix, Inc. Display methods and apparatus
EP2085807B1 (en) 2005-02-23 2014-10-29 Pixtronix, Inc. Display methods and apparatus
US7999994B2 (en) 2005-02-23 2011-08-16 Pixtronix, Inc. Display apparatus and methods for manufacture thereof
US9158106B2 (en) 2005-02-23 2015-10-13 Pixtronix, Inc. Display methods and apparatus
US20060209012A1 (en) 2005-02-23 2006-09-21 Pixtronix, Incorporated Devices having MEMS displays
US7405852B2 (en) * 2005-02-23 2008-07-29 Pixtronix, Inc. Display apparatus and methods for manufacture thereof
US8159428B2 (en) 2005-02-23 2012-04-17 Pixtronix, Inc. Display methods and apparatus
US7304786B2 (en) 2005-02-23 2007-12-04 Pixtronix, Inc. Methods and apparatus for bi-stable actuation of displays
EP1858796B1 (en) 2005-02-23 2011-01-19 Pixtronix Inc. Methods and apparatus for actuating displays
US7755582B2 (en) 2005-02-23 2010-07-13 Pixtronix, Incorporated Display methods and apparatus
US7742016B2 (en) 2005-02-23 2010-06-22 Pixtronix, Incorporated Display methods and apparatus
US20080279727A1 (en) 2005-03-01 2008-11-13 Haushalter Robert C Polymeric Fluid Transfer and Printing Devices
US9042461B2 (en) 2005-03-10 2015-05-26 Qualcomm Incorporated Efficient employment of digital upsampling using IFFT in OFDM systems for simpler analog filtering
US7982777B2 (en) 2005-04-07 2011-07-19 Axis Engineering Technologies, Inc. Stereoscopic wide field of view imaging system
US7349140B2 (en) 2005-05-31 2008-03-25 Miradia Inc. Triple alignment substrate method and structure for packaging devices
US20060280319A1 (en) 2005-06-08 2006-12-14 General Mems Corporation Micromachined Capacitive Microphone
EP1734502A1 (en) 2005-06-13 2006-12-20 Sony Ericsson Mobile Communications AB Illumination in a portable communication device
US7826125B2 (en) * 2005-06-14 2010-11-02 California Institute Of Technology Light conductive controlled shape droplet display device
US7773733B2 (en) * 2005-06-23 2010-08-10 Agere Systems Inc. Single-transformer digital isolation barrier
WO2007002452A2 (en) 2005-06-23 2007-01-04 E Ink Corporation Edge seals and processes for electro-optic displays
US7684660B2 (en) 2005-06-24 2010-03-23 Intel Corporation Methods and apparatus to mount a waveguide to a substrate
US20070052660A1 (en) 2005-08-23 2007-03-08 Eastman Kodak Company Forming display color image
US7449759B2 (en) 2005-08-30 2008-11-11 Uni-Pixel Displays, Inc. Electromechanical dynamic force profile articulating mechanism
US8509582B2 (en) 2005-08-30 2013-08-13 Rambus Delaware Llc Reducing light leakage and improving contrast ratio performance in FTIR display devices
US7355779B2 (en) 2005-09-02 2008-04-08 Idc, Llc Method and system for driving MEMS display elements
WO2007029407A1 (en) 2005-09-05 2007-03-15 Sharp Kabushiki Kaisha Backlight device and display device
KR100668498B1 (en) 2005-11-09 2007-01-12 주식회사 하이닉스반도체 Apparatus and method for outputting data of semiconductor memory
JP2007155983A (en) 2005-12-02 2007-06-21 Hitachi Displays Ltd Liquid crystal display apparatus
JP2007163647A (en) 2005-12-12 2007-06-28 Mitsubishi Electric Corp Image display apparatus
US7486854B2 (en) 2006-01-24 2009-02-03 Uni-Pixel Displays, Inc. Optical microstructures for light extraction and control
US8526096B2 (en) 2006-02-23 2013-09-03 Pixtronix, Inc. Mechanical light modulators with stressed beams
JP4789662B2 (en) 2006-03-17 2011-10-12 富士通セミコンダクター株式会社 Power supply device control circuit, power supply device and control method therefor
WO2007120887A2 (en) 2006-04-13 2007-10-25 Qualcomm Mems Technologies, Inc Packaging a mems device using a frame
TW200745680A (en) 2006-04-19 2007-12-16 Omron Tateisi Electronics Co Diffuser plate and surface light source device
US7711239B2 (en) 2006-04-19 2010-05-04 Qualcomm Mems Technologies, Inc. Microelectromechanical device and method utilizing nanoparticles
US7876489B2 (en) 2006-06-05 2011-01-25 Pixtronix, Inc. Display apparatus with optical cavities
GB0611125D0 (en) 2006-06-06 2006-07-19 Liquavista Bv Transflective electrowetting display device
WO2007149475A2 (en) 2006-06-21 2007-12-27 Qualcomm Mems Technologies, Inc. Method for packaging an optical mems device
US7843637B2 (en) 2006-06-22 2010-11-30 3M Innovative Properties Company Birefringent structured film for LED color mixing in a backlight
JP5125005B2 (en) 2006-07-04 2013-01-23 セイコーエプソン株式会社 Display device and display system using the same
DE102006033312A1 (en) * 2006-07-17 2008-01-31 Heraeus Kulzer Gmbh Dental implant system part with a coating
US20080043726A1 (en) 2006-08-21 2008-02-21 Telefonaktiebolaget L M Ericsson (Publ) Selective Control of User Equipment Capabilities
US8872753B2 (en) 2006-08-31 2014-10-28 Ati Technologies Ulc Adjusting brightness of a display image in a display having an adjustable intensity light source
JP2008098984A (en) 2006-10-12 2008-04-24 Fukushin Techno Research Co Ltd Portable folding antenna
EP2080045A1 (en) 2006-10-20 2009-07-22 Pixtronix Inc. Light guides and backlight systems incorporating light redirectors at varying densities
US9176318B2 (en) 2007-05-18 2015-11-03 Pixtronix, Inc. Methods for manufacturing fluid-filled MEMS displays
US7852546B2 (en) 2007-10-19 2010-12-14 Pixtronix, Inc. Spacers for maintaining display apparatus alignment
EP2074464A2 (en) 2007-01-19 2009-07-01 Pixtronix Inc. Mems display apparatus
JP2013061658A (en) 2007-01-19 2013-04-04 Pixtronix Inc Mems display apparatus
GB0701641D0 (en) 2007-01-29 2007-03-07 Iti Scotland Ltd Analyte manipulation and detection
WO2008102842A1 (en) 2007-02-23 2008-08-28 Ngk Spark Plug Co., Ltd. Spark plug and internal combustion engine with spark plug
US7903104B2 (en) 2007-03-21 2011-03-08 Spatial Photonics, Inc. Spatial modulator display system using two memories and display time slices having differing times
JP5125378B2 (en) 2007-10-03 2013-01-23 セイコーエプソン株式会社 Control method, control device, display body, and information display device
JP2009111813A (en) 2007-10-31 2009-05-21 Seiko Epson Corp Projector, image data acquisition method for projector, and imaging device
WO2009102471A1 (en) 2008-02-12 2009-08-20 Pixtronix, Inc. Mechanical light modulators with stressed beams
JP2009207590A (en) 2008-03-03 2009-09-17 Topcon Corp Stereomicroscope
US7920317B2 (en) 2008-08-04 2011-04-05 Pixtronix, Inc. Display with controlled formation of bubbles
US8169679B2 (en) 2008-10-27 2012-05-01 Pixtronix, Inc. MEMS anchors
US20110205259A1 (en) 2008-10-28 2011-08-25 Pixtronix, Inc. System and method for selecting display modes
FR2948689B1 (en) 2009-07-29 2011-07-29 Alcan Int Ltd GROOVED ANODE OF ELECTROLYTIC TANK
FI20095988A0 (en) 2009-09-28 2009-09-28 Valtion Teknillinen Micromechanical resonator and method of manufacture thereof
BR112012019383A2 (en) 2010-02-02 2017-09-12 Pixtronix Inc CIRCUITS TO CONTROL DISPLAY APPARATUS
US20120133006A1 (en) 2010-11-29 2012-05-31 International Business Machines Corporation Oxide mems beam
JP5870558B2 (en) 2011-02-17 2016-03-01 株式会社リコー Transmission management system, transmission management method, and program
JP2012230079A (en) 2011-04-27 2012-11-22 Hitachi-Ge Nuclear Energy Ltd Nuclear power plant, fuel pool water cooling apparatus, and fuel pool water cooling method
US9809445B2 (en) 2011-08-26 2017-11-07 Qualcomm Incorporated Electromechanical system structures with ribs having gaps
US8698980B2 (en) 2011-11-14 2014-04-15 Planck Co., Ltd. Color regulating device for illumination and apparatus using the same, and method of regulating color
US20140078154A1 (en) 2012-09-14 2014-03-20 Pixtronix, Inc. Display apparatus with multi-height spacers
US9201236B2 (en) 2012-11-27 2015-12-01 Pixtronix, Inc. Display apparatus with stiction reduction features
US20140184573A1 (en) 2012-12-28 2014-07-03 Pixtronix, Inc. Electromechanical Systems Color Transflective Display Apparatus
US20140184621A1 (en) 2012-12-28 2014-07-03 Pixtronix, Inc. Display apparatus including dual actuation axis electromechanical systems light modulators
US9176317B2 (en) 2013-03-13 2015-11-03 Pixtronix, Inc. Display apparatus incorporating dual-level shutters
US9134552B2 (en) 2013-03-13 2015-09-15 Pixtronix, Inc. Display apparatus with narrow gap electrostatic actuators
US9134530B2 (en) 2013-03-13 2015-09-15 Pixtronix, Inc. Display apparatus incorporating dual-level shutters

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020000959A1 (en) * 1998-10-08 2002-01-03 International Business Machines Corporation Micromechanical displays and fabrication method
US20010050661A1 (en) * 2000-06-12 2001-12-13 Matsushita Electric Industrial Co., Ltd. Color image display device and projection-type image display apparatus
US20040233498A1 (en) * 2000-10-31 2004-11-25 Microsoft Corporation Microelectrical mechanical structure (MEMS) optical modulator and optical display system
US20040156246A1 (en) * 2002-09-18 2004-08-12 Seiko Epson Corporation Optoelectronic-device substrate, method for driving same, digitally-driven liquid-crystal-display, electronic apparatus, and projector
US20050140636A1 (en) * 2003-12-29 2005-06-30 Chung In J. Method and apparatus for driving liquid crystal display

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9087486B2 (en) 2005-02-23 2015-07-21 Pixtronix, Inc. Circuits for controlling display apparatus
US9261694B2 (en) 2005-02-23 2016-02-16 Pixtronix, Inc. Display apparatus and methods for manufacture thereof
US8519923B2 (en) 2005-02-23 2013-08-27 Pixtronix, Inc. Display methods and apparatus
US9135868B2 (en) 2005-02-23 2015-09-15 Pixtronix, Inc. Direct-view MEMS display devices and methods for generating images thereon
US9177523B2 (en) 2005-02-23 2015-11-03 Pixtronix, Inc. Circuits for controlling display apparatus
US9500853B2 (en) 2005-02-23 2016-11-22 Snaptrack, Inc. MEMS-based display apparatus
US9274333B2 (en) 2005-02-23 2016-03-01 Pixtronix, Inc. Alignment methods in fluid-filled MEMS displays
US9336732B2 (en) 2005-02-23 2016-05-10 Pixtronix, Inc. Circuits for controlling display apparatus
US9229222B2 (en) 2005-02-23 2016-01-05 Pixtronix, Inc. Alignment methods in fluid-filled MEMS displays
US9158106B2 (en) 2005-02-23 2015-10-13 Pixtronix, Inc. Display methods and apparatus
US8482496B2 (en) 2006-01-06 2013-07-09 Pixtronix, Inc. Circuits for controlling MEMS display apparatus on a transparent substrate
US8519945B2 (en) 2006-01-06 2013-08-27 Pixtronix, Inc. Circuits for controlling display apparatus
US9128277B2 (en) 2006-02-23 2015-09-08 Pixtronix, Inc. Mechanical light modulators with stressed beams
US8526096B2 (en) 2006-02-23 2013-09-03 Pixtronix, Inc. Mechanical light modulators with stressed beams
US9176318B2 (en) 2007-05-18 2015-11-03 Pixtronix, Inc. Methods for manufacturing fluid-filled MEMS displays
US9116344B2 (en) 2008-10-27 2015-08-25 Pixtronix, Inc. MEMS anchors
US8599463B2 (en) 2008-10-27 2013-12-03 Pixtronix, Inc. MEMS anchors
US9182587B2 (en) 2008-10-27 2015-11-10 Pixtronix, Inc. Manufacturing structure and process for compliant mechanisms
US9082353B2 (en) 2010-01-05 2015-07-14 Pixtronix, Inc. Circuits for controlling display apparatus
US8873004B2 (en) * 2010-02-04 2014-10-28 Samsung Electronics Co., Ltd. 2D/3D switchable image display device
US9207459B2 (en) * 2010-02-04 2015-12-08 Samsung Electronics Co., Ltd. 2D/3D switchable image display device
US20150015816A1 (en) * 2010-02-04 2015-01-15 Samsung Electronics Co., Ltd. 2d/3d switchable image display device
US20110188106A1 (en) * 2010-02-04 2011-08-04 Samsung Electronics Co., Ltd. 2d/3d switchable image display device
US9857628B2 (en) 2011-01-07 2018-01-02 Semiconductor Energy Laboratory Co., Ltd. Display device
US9134552B2 (en) 2013-03-13 2015-09-15 Pixtronix, Inc. Display apparatus with narrow gap electrostatic actuators
WO2015156938A1 (en) * 2014-04-09 2015-10-15 Pixtronix, Inc. Field sequential color (fsc) display apparatus and method employing different subframe temporal spreading
US9583035B2 (en) 2014-10-22 2017-02-28 Snaptrack, Inc. Display incorporating lossy dynamic saturation compensating gamut mapping
US10685598B2 (en) 2016-03-25 2020-06-16 Sharp Kabushiki Kaisha Display panel, display apparatus, and method for manufacturing display panel
US11024656B2 (en) 2016-06-28 2021-06-01 Sharp Kabushiki Kaisha Active matrix substrate, optical shutter substrate, display device, and method for manufacturing active matrix substrate
US20190333460A1 (en) * 2018-04-27 2019-10-31 Japan Display Inc. Display device and image determination device
US10885859B2 (en) * 2018-04-27 2021-01-05 Japan Display Inc. Display device and image determination device
TWI749628B (en) * 2020-07-09 2021-12-11 瑞昱半導體股份有限公司 Scalar, display device and associated data processing method

Also Published As

Publication number Publication date
US20160275876A1 (en) 2016-09-22
US9135868B2 (en) 2015-09-15
US20120320112A1 (en) 2012-12-20
US20120320111A1 (en) 2012-12-20
US20070205969A1 (en) 2007-09-06

Similar Documents

Publication Publication Date Title
US9135868B2 (en) Direct-view MEMS display devices and methods for generating images thereon
EP1966788B1 (en) Direct-view mems display devices and methods for generating images thereon
JP5989848B2 (en) Field sequential color display using composite colors
US9530344B2 (en) Circuits for controlling display apparatus
US9398666B2 (en) Reflective and transflective operation modes for a display device
US20130321477A1 (en) Display devices and methods for generating images thereon according to a variable composite color replacement policy
EP2402934A2 (en) A direct-view display

Legal Events

Date Code Title Description
AS Assignment

Owner name: PIXTRONIX, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HAGOOD, NESBITT W., IV;GANDHI, JIGNESH;MCALLISTER, ABRAHAM R.;AND OTHERS;SIGNING DATES FROM 20090320 TO 20090331;REEL/FRAME:028871/0488

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: SNAPTRACK, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PIXTRONIX, INC.;REEL/FRAME:039905/0188

Effective date: 20160901